CN111772639B - Motion pattern recognition method and device for wearable equipment - Google Patents
- Publication number
- CN111772639B (application CN202010659319.0A)
- Authority
- CN
- China
- Prior art keywords
- motion
- variation
- type
- output
- types
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1123—Discriminating type of movement, e.g. walking or running
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/725—Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Abstract
The invention discloses a motion pattern recognition method and device for a wearable device. The method comprises: signal acquisition, in which motion information is collected by an acceleration sensor; feature extraction, in which the motion information is processed and motion features are extracted from it; and recognition output, in which the motion type is judged from the motion features, statistically classified, and the corresponding motion type is recognized. Through statistical classification, the motion types output over a plurality of time windows are statistically integrated, and a motion type whose score exceeds a set threshold is output as the final motion type; the recognition accuracy or recognition efficiency can be configured according to the requirements of different scenarios.
Description
Technical Field
The invention relates to the field of monitoring motion patterns, in particular to a motion pattern recognition method and device for wearable equipment.
Background
The increased computing power of wearable devices and their integrated sensors make it possible to distinguish some simple types of motion. Existing motion recognition techniques distinguish simple motions such as walking, running and cycling merely by differences in vibration amplitude and direction; because portable sensors are simple in structure and their perception information is imprecise, the recognition results are inaccurate.
Disclosure of Invention
The invention mainly solves the technical problem of providing a motion pattern recognition method and a motion pattern recognition device for wearable equipment, and solves the problem of inaccurate motion pattern recognition monitoring.
In order to solve the above technical problem, one technical solution adopted by the present invention is to provide a motion pattern recognition method for a wearable device, where an acceleration sensor is provided on the wearable device, and the method includes:
signal acquisition: acquiring motion information through the acceleration sensor;
feature extraction: processing the motion information and extracting motion features from it;
recognition output: judging the motion type according to the motion features and recognizing the corresponding motion type.
Preferably, the signal acquisition step includes acquiring, in real time, acceleration values from a plurality of acceleration sensors arranged at different positions, and obtaining the variation of the acceleration values within a set time; the variation of the acceleration values is then divided into motion levels.
Preferably, in the feature extraction step, the variation of the acceleration value is filtered to obtain a moving average value.
Preferably, in the feature extraction step, a time window is further set, and motion feature extraction is performed on the moving average value in the time window to obtain a motion feature value.
Preferably, the step of recognizing and outputting includes calculating and classifying the motion characteristic values, and calculating and recognizing the motion type.
Preferably, the recognition output step further includes statistically classifying the motion types produced by calculation and recognition, and finally outputting the statistically recognized motion type.
Preferably, in the statistical classification, the motion types output by calculation and recognition are statistically integrated over time: each time a time window elapses, one motion type is output and its score is incremented once; when, after a plurality of time windows, the accumulated score of the same motion type exceeds a preset threshold, that motion type is statistically recognized and output as final.
Preferably, the statistical classification further includes, each time a time window elapses, synchronously decrementing once the scores of the other motion types that differ from the motion type output by calculation and recognition.
A motion pattern recognition apparatus for a wearable device, comprising:
the signal acquisition module acquires motion information through the acceleration sensor;
the characteristic extraction module is used for processing the motion information and extracting motion characteristics from the motion information;
and the identification output module is used for judging the motion type according to the motion characteristics and identifying the corresponding motion type.
Preferably, the identification output module comprises an identification output sub-module and a statistical integration sub-module.
The invention has the advantage that, through statistical classification and recognition output, statistical scores can be kept for several motion types; under the rule that their scores increase and decrease against each other, the moment a motion type's score exceeds the threshold is identified quickly, the output motion type is obtained quickly, and recognition accuracy or recognition efficiency is greatly improved.
Drawings
Fig. 1 is a flowchart of a first embodiment of a motion pattern recognition method for a wearable device according to the present invention;
FIG. 2 is a flow chart of a second embodiment of the motion pattern recognition method for a wearable device of the present invention;
FIG. 3 is a diagram of acceleration characteristics of a ride vehicle for a motion pattern recognition method of a wearable device of the present invention;
FIG. 4 is an acceleration profile of a walking motion for the motion pattern recognition method of the wearable device of the present invention;
FIG. 5 is an acceleration profile of a fast-walk motion of the motion pattern recognition method for a wearable device of the present invention;
FIG. 6 is an acceleration profile of a running exercise for the motion pattern recognition method of the wearable device of the present invention;
FIG. 7 is another acceleration profile of a walking motion for the motion pattern recognition method of the wearable device of the present invention;
FIG. 8 is another acceleration profile of a running exercise for the motion pattern recognition method of the wearable device of the present invention;
FIG. 9 is another acceleration profile of a walking motion for the motion pattern recognition method of the wearable device of the present invention;
FIG. 10 is an acceleration profile of a cycling motion for the motion pattern recognition method of the wearable device of the present invention;
fig. 11 is a schematic structural diagram of a first embodiment of the motion pattern recognition apparatus for a wearable device according to the present invention;
fig. 12 is a schematic structural diagram of a signal acquisition module of a first embodiment of the motion pattern recognition device for a wearable device according to the present invention;
fig. 13 is a schematic structural diagram of a feature extraction module of a motion pattern recognition apparatus for a wearable device according to a first embodiment of the present invention;
fig. 14 is a schematic structural diagram of a second embodiment of the motion pattern recognition apparatus for a wearable device according to the present invention.
Detailed Description
In order to facilitate an understanding of the invention, reference will now be made in detail to the present embodiments of the invention, examples of which are illustrated in the accompanying drawings. Preferred embodiments of the present invention are shown in the drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
It is to be noted that, unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
With reference to fig. 1, a motion pattern recognition method for a wearable device, in which an acceleration sensor is disposed on the wearable device, includes the steps of, as shown in fig. 1:
a signal acquisition step S110, which acquires motion information through an acceleration sensor;
a feature extraction step S120, which processes the motion information and extracts motion features from it;
a recognition output step S130, which judges the motion type according to the motion features and recognizes the corresponding motion type.
The wearable device may be any wearable product with an acceleration sensor, such as smart glasses, smart earphones, a smart watch, a smart wristband, a smart necklace, smart shoes, leg-worn products, smart clothing, a smart schoolbag, or a smart walking stick. It may also be a product carried on the person, such as a mobile phone.
Referring to fig. 2, the signal acquisition step includes acquiring, in real time, acceleration values from a plurality of acceleration sensors disposed at different positions (S1101), and obtaining the variation of the acceleration values within a set time.
Referring to fig. 2, S1102: in the signal acquisition step, further performing motion grade division on the variation of the acceleration value.
The variation of the acceleration value is divided into a plurality of motion levels, each corresponding to one or more motion types; the larger the variation of the acceleration value, the more intense the motion. Preferably, the variation of the acceleration value is divided into six motion levels, as shown in the motion level classification of Table 1:
table 1: motion class classification
The motion levels are divided into six levels. When the motion level of the variation of the acceleration value is 0, the acceleration sensor is in a static state, corresponding to sleeping or sitting. When the motion level is 1, the variation lies in a small range corresponding to riding a vehicle, such as taking a bus, taking a taxi, or riding an electric vehicle. When the motion level is 2, the variation lies in a larger range corresponding to cycling, such as riding a bicycle, an exercise bicycle, or a spinning bicycle. When the motion level is 3, the variation corresponds to walking motion, such as walking, climbing, or descending a mountain. When the motion level is 4, the variation corresponds to fast-walking motion, such as race walking or jogging. When the motion level is 5, the variation corresponds to running sports, such as running, football, or basketball.
When the motion level is 0, the variation of the acceleration value is extremely small; sleeping or sitting can be assumed by default without analysis or calculation. When the motion level is greater than 0, the variation of the acceleration value can be used as motion information for feature extraction.
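As an illustrative sketch only, the level division described above can be expressed as a simple threshold lookup. The numeric boundaries below are hypothetical placeholders; the patent defines its own per-level ranges in Table 1:

```python
# Hypothetical upper bounds for levels 0..4 (the patent's exact boundary
# expressions are defined in Table 1 and are not reproduced here).
LEVEL_UPPER_BOUNDS = [2.0, 10.0, 25.0, 45.0, 70.0]

def motion_level(delta_a):
    """Map the variation of the acceleration value to a motion level 0-5.

    Level 0 = static (sleeping/sitting), 1 = riding a vehicle, 2 = cycling,
    3 = walking, 4 = fast walking, 5 = running sports.
    """
    for level, upper in enumerate(LEVEL_UPPER_BOUNDS):
        if delta_a < upper:
            return level
    return 5
```

Variations landing at level 0 are skipped entirely, which is what saves computation for sleeping or sitting states.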
In the feature extraction step, the motion information is processed and motion features are extracted from it. Referring to fig. 2, this step further includes filtering the variation of the acceleration value to obtain a moving average.
Preferably, the acceleration value is filtered using first-order lag filtering:

f[t] = α·s[t] + (1 − α)·f[t − 1]

where t denotes time, f[t] is the moving average at time t, s[t] is the sample value of the acceleration sensor signal measured at time t, and α is the weight. α can be set within the interval (0, 1) according to usage and performance, and adjusting α determines whether more weight is applied to the immediately preceding moving average or to the new detection signal. Those skilled in the art can tune α using machine learning methods so that the signal has suitable sensitivity and stability, facilitating feature extraction.
Referring to fig. 2, S1202: in the feature extraction step, a time window is further set, preferably with a duration between 1 and 3 seconds, e.g. 1.5, 2 or 2.5 seconds, so that the window balances the sensitivity of feature extraction against algorithm performance. Motion features are extracted from the moving average within the time window; the feature values obtained in the window include the minimum acceleration value, the acceleration amplitude, and the acceleration high-frequency signal.
Taking a smart watch as an example, a three-axis acceleration sensor is arranged in the watch; for the feature values obtained from it, refer to figs. 3, 4, 5 and 6. In these figures the abscissa (count) is the number of acceleration signals collected within the time window, and the ordinate is the acceleration value detected by the sensor. The signals include acc_x, acc_y, acc_z, x_0 and x_range. When the watch is worn on the wrist with the screen facing upward, acc_x is the acceleration detected along the arm direction, acc_y the acceleration detected perpendicular to the arm, and acc_z the acceleration detected perpendicular to the watch screen. x_0 is the minimum acceleration value in the x direction (i.e., the shortest distance from the x-direction acceleration to the 0 axis within the time window), and x_range is the acceleration amplitude in the x direction (i.e., the maximum minus the minimum of the x-direction amplitude within the time window). acc_x, acc_y and acc_z are the raw signals obtained from the three-axis acceleration sensor, and riding a vehicle, walking, fast walking and running can be distinguished through them.
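The two per-window features x_0 and x_range, as defined above, can be sketched directly (the function name is illustrative):

```python
def window_features(acc_x):
    """Per-window features on one axis.

    x_0: shortest distance from the samples to the 0 axis (min absolute value).
    x_range: acceleration amplitude, i.e. maximum minus minimum in the window.
    """
    x_0 = min(abs(v) for v in acc_x)
    x_range = max(acc_x) - min(acc_x)
    return x_0, x_range
```

For a window `[5, -3, 8]` this gives `x_0 = 3` and `x_range = 11`, collapsing the whole window into two numbers and so reducing the data to be classified.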
Taking acc_x as an example: when riding a vehicle, as shown in fig. 3, most acc_x values are 5 to 20 and the fluctuation range is small; when the vehicle brakes suddenly or shakes, acc_x fluctuates severely, then returns to a state with a small fluctuation range.
During walking, as shown in fig. 4, acc_x is 20 to 60 and fluctuates regularly.
During fast walking, as shown in fig. 5, acc_x is between −20 and 40 and fluctuates regularly.
During running, as shown in fig. 6, acc_x is −25 to 25, fluctuates regularly, and the maximum and minimum of the acceleration fluctuation are symmetrical.
Riding a vehicle, walking, fast walking and running can thus be distinguished from the acceleration values. However, the value ranges partially overlap, for example between fast walking and running, so judging from raw values alone easily causes motion pattern recognition errors. Moreover, the number of acc_x samples is large, so recognizing the motion pattern this way involves heavy computation and low efficiency, making it inconvenient to distinguish riding a vehicle, walking, fast walking and running.
Referring to figs. 3, 4, 5 and 6, x_0 and x_range are the obtained feature values.
When riding a vehicle, as shown in fig. 3, x_0 is 27 ± 2 and x_range is 5 to 10.
During walking, as shown in fig. 4, x_0 is 22 ± 2 and x_range is 25 to 35.
During fast walking, as shown in fig. 5, x_0 is 35 ± 2 and x_range is 38 to 45.
During running, as shown in fig. 6, x_0 is 5 ± 2 and x_range is 80 to 110.
Riding a vehicle, walking, fast walking and running can be clearly distinguished from the magnitudes of the feature values in figs. 3 to 6; x_0 or x_range can each be used alone. To further improve the accuracy of motion pattern recognition, one of the two values can be used to distinguish the pattern and the other to verify the recognized pattern, further ensuring recognition accuracy.
Recognizing the motion pattern through feature values greatly reduces the amount of data and makes calculation convenient; the feature values differ markedly across motion patterns, so they are easy to distinguish and the accuracy is high.
Referring to figs. 7 and 8, walking and running can be further distinguished by the trend of the acceleration on the y axis. acc_y is the acceleration detected perpendicular to the arm, and y_slope is the slope of a linear regression line fitted to the acceleration values, calculated roughly once per time window to fit the trend of the y-axis acceleration. y_slope_max is the maximum of the slope y_slope within a time window t: y_slope_max is 5 ± 1 for running and 1 ± 1 for walking, so y_slope_max can accurately distinguish whether the motion pattern is walking or running.
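The slope of the regression line and its per-window maximum can be sketched with ordinary least squares over unit time steps (function names and the sub-window scheme are illustrative assumptions):

```python
def regression_slope(y):
    """Least-squares slope of a line fitted to samples y at times 0..n-1."""
    n = len(y)
    mean_x = (n - 1) / 2.0
    mean_y = sum(y) / float(n)
    num = sum((i - mean_x) * (v - mean_y) for i, v in enumerate(y))
    den = sum((i - mean_x) ** 2 for i in range(n))
    return num / den

def y_slope_max(acc_y, sub_len):
    """Maximum regression slope over consecutive sub-windows of acc_y,
    approximating y_slope_max within a time window."""
    return max(regression_slope(acc_y[i:i + sub_len])
               for i in range(0, len(acc_y) - sub_len + 1, sub_len))
```

A strictly linear ramp `[0, 2, 4, 6]` has slope exactly 2, which is a quick sanity check on the fit.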
Referring to figs. 9 and 10, during cycling the smart watch is nearly stationary relative to the human body and the variation of the acceleration is small, so cycling cannot be accurately distinguished from riding a vehicle, walking, fast walking and running using only acc_x, acc_y, acc_z, x_0 and x_range; the acceleration high-frequency signal is therefore needed to distinguish cycling.
Taking cycling and walking as examples, referring to figs. 9 and 10: acc_xyz is the resultant acceleration in the x, y and z directions, and acc_xyz_high is its high-frequency component.
wave_cnt is the number of small undulations of the high-frequency signal within a time window, i.e., the number of maxima or minima occurring.
acc_xyz_high_in_reigon_cnt is the number of points of the high-frequency resultant acceleration falling within a certain small interval (e.g., [−20, 20]) in the time window t.
During walking, as shown in fig. 9, acc_xyz is 50 to 420, acc_xyz_high is −150 to 230, wave_cnt is 10 ± 2, and acc_xyz_high_in_reigon_cnt is 20 ± 2.
During cycling, as shown in fig. 10, acc_xyz is 80 to 175, acc_xyz_high is −50 to 40, wave_cnt is 30 ± 2, and acc_xyz_high_in_reigon_cnt is 70 ± 2.
Walking and cycling can be distinguished to some extent by the resultant acceleration, the high-frequency signal, the number of small undulations of the high-frequency signal, and the number of resultant-acceleration points within a certain region of the time window. When the resultant acceleration alone is used, walking acc_xyz is 50 to 420 while cycling acc_xyz is 80 to 175: the ranges clearly overlap, errors easily occur during recognition, and the larger amount of data increases the computation. It is therefore preferable to use the high-frequency signal, its number of small undulations, or the number of resultant-acceleration points within a region of the time window; cycling and walking differ markedly in these values, with well-separated, non-overlapping ranges, so they can be distinguished accurately and quickly.
Likewise, cycling can be distinguished from riding a vehicle, fast walking or running by the high-frequency signal, its number of small undulations, or the number of resultant-acceleration points within a certain region of the time window.
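The three high-frequency features above can be sketched as follows (function names are illustrative; the identifier `acc_xyz_high_in_reigon_cnt` is kept as named in the figures):

```python
import math

def resultant(ax, ay, az):
    """Resultant acceleration sqrt(x^2 + y^2 + z^2) for each sample."""
    return [math.sqrt(x * x + y * y + z * z) for x, y, z in zip(ax, ay, az)]

def wave_cnt(high):
    """Number of small undulations (local maxima or minima) of the
    high-frequency signal within the window."""
    cnt = 0
    for i in range(1, len(high) - 1):
        local_max = high[i] > high[i - 1] and high[i] > high[i + 1]
        local_min = high[i] < high[i - 1] and high[i] < high[i + 1]
        if local_max or local_min:
            cnt += 1
    return cnt

def in_region_cnt(high, lo=-20.0, hi=20.0):
    """Number of high-frequency samples inside a small interval, e.g. [-20, 20],
    corresponding to acc_xyz_high_in_reigon_cnt."""
    return sum(1 for v in high if lo <= v <= hi)
```

Cycling produces many small undulations (high wave_cnt) and many near-zero samples (high in-region count), which is exactly what separates it from walking in figs. 9 and 10.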
Referring to fig. 2, the recognition output step includes calculating and classifying the motion feature values and outputting the computed motion type (S1301).
In consideration of algorithm performance, a logistic regression algorithm is preferably used for classification; its classification effect is good and its operation speed high. The scores are computed as

z = θ·x

where θ is a coefficient matrix (θ can be calculated in advance from a large amount of data, i.e., solved by a common descent algorithm using machine learning methods) and x is the vector of motion feature values. The probabilities of belonging to the different motion types are then obtained:

p_i = e^{z_i} / Σ_j e^{z_j}

where p_i is the probability of the i-th motion type, e is the exponential base, and the sum runs over the number of motion types.

{p_1, …, p_5} is the set of probabilities of the motion patterns: the probability of riding a vehicle, of cycling, of walking, of fast walking, and of running. The motion type with the largest probability in the set is selected as the output. For example, when the largest value in the probability set is 0.8 and the motion type at that position is fast walking, the motion type is judged to be fast walking and output. The motion type output at this point is the type within one time window, i.e., 2 seconds; calculation and classification are performed again in the next time window to obtain a new motion type.
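A minimal sketch of this score-then-softmax classification (the type labels, `theta` layout and function name are illustrative assumptions):

```python
import math

MOTION_TYPES = ["ride vehicle", "cycling", "walking", "fast walking", "running"]

def classify(theta, x):
    """Score each motion type as the dot product of its coefficient row with
    the feature vector x, convert scores to probabilities with softmax, and
    return the most probable motion type together with the probability set."""
    scores = [sum(w * f for w, f in zip(row, x)) for row in theta]
    m = max(scores)                      # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    probs = [e / total for e in exps]
    return MOTION_TYPES[probs.index(max(probs))], probs
```

With a coefficient matrix that scores the fourth row highest for a given feature vector, the returned label is "fast walking", matching the worked example above.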
Referring to fig. 2, S1302: the recognition output step further includes statistically classifying the motion types output by calculation and recognition, and finally outputting the statistically recognized motion type.
Preferably, in the statistical classification, the motion types output by calculation and recognition are statistically integrated over time: each time window outputs one motion type, whose score is incremented once; when the accumulated score of the same motion type over a plurality of time windows reaches or exceeds a preset threshold, that type is recognized and output. The statistical classification further includes, each time a time window elapses, decrementing once the scores of the other motion types differing from the type output by calculation and recognition.
Referring to fig. 2, S1303: in the recognition output step, each motion type is assigned the same score interval [x, y] and the same threshold z; preferably the interval is [−100, 100] and the threshold is 90. Each type's score starts at 0; for every motion type output, that type's score is increased by 1 and every other type's score is decreased by 1. For example, when the motion type output in the first time window is cycling, the cycling score becomes 1 and the scores of riding a vehicle, walking, fast walking and running become −1. If the motion type output in 90 consecutive time windows is always cycling, the cycling score reaches 90 and the others reach −90, and cycling can then be output as the final motion type. 90 time windows take 180 s, so an accurate motion type output is obtained after 180 s.
When other motion types are output along the way, for example when the cycling score is 50 and the scores of riding a vehicle, walking, fast walking and running are −50, then if the next time window outputs running, the running score increases by 1 to −49, the cycling score drops to 49, and the scores of riding a vehicle, walking and fast walking drop to −51. This continues until some motion type's score is greater than or equal to the threshold 90, at which point that type is output.
During statistical integration, the increment can also be adjusted as needed. For example, if each output motion type's score is increased by 2, a score equal to the threshold can be reached after as few as 45 time windows, i.e., a motion type whose accumulated score reaches the threshold can be output within 90 s. The more time windows, i.e., the longer the statistical integration, the more accurate the motion type; the fewer time windows, i.e., the shorter the statistical integration, the more efficient the output. Similarly, the threshold can be adjusted: the larger the threshold, the more time windows are needed, the longer the statistics take, and the more accurate the output motion type; the smaller the threshold, the fewer time windows are needed, the shorter the statistics, and the more efficient the output. The balance between accuracy and efficiency is set according to specific needs. Because the statistical scores of the several motion types increase and decrease against each other, the moment a motion type's score exceeds the threshold is identified quickly, the output motion type is obtained quickly, and recognition accuracy or efficiency is greatly improved.
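The score-keeping scheme described above can be sketched as a small accumulator; the class name, interface and clamping behavior are illustrative, not from the patent:

```python
class MotionVoter:
    """Per-window vote accumulation: the output type's score is incremented
    and every other type's score decremented, both clamped to [lo, hi];
    a type is emitted as final once its score reaches the threshold."""

    def __init__(self, types, threshold=90, lo=-100, hi=100):
        self.scores = {t: 0 for t in types}
        self.threshold, self.lo, self.hi = threshold, lo, hi

    def update(self, winner):
        """Record one time window's output; return the final type or None."""
        for t in self.scores:
            step = 1 if t == winner else -1
            self.scores[t] = max(self.lo, min(self.hi, self.scores[t] + step))
        return winner if self.scores[winner] >= self.threshold else None
```

Raising `threshold` trades efficiency for accuracy, exactly as described: with `threshold=90` and 2-second windows, a consistent type is confirmed after 180 s.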
Referring to fig. 2, in S1304, after statistical classification, the motion type is finally statistically identified and output.
Referring to fig. 11, based on the same general inventive concept, the present application further provides a motion pattern recognition apparatus 20 for a wearable device. Fig. 11 is a schematic structural diagram of a first embodiment of the motion pattern recognition apparatus 20, which includes:
referring to fig. 12, the signal collection module 210 collects motion information through an acceleration sensor; acceleration sensor can set up on intelligent eyes, intelligent earphone, intelligent wrist-watch, intelligent wrist strap, intelligent necklace, intelligent shoes, the product of wearing on the leg, intelligent clothing, intelligent schoolbag, intelligent walking stick isotructure, and this application does not do the injecing to this. The signal acquisition module comprises an acquisition sub-module 2101 and a grading sub-module 2102. The acquisition submodule 2101 acquires motion information detected by the acceleration sensor, and the ranking submodule 2102 ranks the motion information.
Referring to fig. 13, the feature extraction module 220 processes the motion information to extract motion features from it. The feature extraction module comprises a filtering sub-module 2201 and an extraction sub-module 2202: the filtering sub-module 2201 filters the motion information whose motion grade is greater than level 0 to obtain a moving average, and the extraction sub-module 2202 extracts features from the moving average within a certain time window to obtain motion characteristic values.
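The filtering and extraction sub-modules can be sketched as below. The moving-average width, the window length, and the particular features (mean, variance, peak) are assumptions for illustration; the patent does not specify them here.

```python
# Sketch of sub-module 2201 (trailing moving average over the raw variation
# signal) and sub-module 2202 (simple features over the latest time window).
def moving_average(signal, k=5):
    """Smooth a sequence with a trailing moving average of width k."""
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - k + 1):i + 1]
        out.append(sum(window) / len(window))
    return out

def window_features(smoothed, window=50):
    """Compute assumed motion characteristic values over the latest window."""
    w = smoothed[-window:]
    mean = sum(w) / len(w)
    variance = sum((v - mean) ** 2 for v in w) / len(w)
    return {"mean": mean, "variance": variance, "peak": max(w)}
```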
Referring to fig. 14, the identification output module 230 determines the motion type according to the motion features and identifies the corresponding motion type. The identification output module 230 comprises an identification output sub-module 2301 and a statistical integration sub-module 2302: the identification output sub-module 2301 calculates and classifies the motion characteristic values to obtain the outputtable motion types, and the statistical integration sub-module 2302 statistically integrates the output motion types and outputs the motion type whose score exceeds the threshold.
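The claims describe the classification in sub-module 2301 as logistic regression with a softmax over z = theta * x. A minimal sketch of that step; the coefficient matrix and feature values used below are made up for illustration, not taken from the patent.

```python
# Softmax classification over per-type scores z = theta * x, as described in
# the claims: theta is a coefficient matrix with one row per motion type,
# x is the motion characteristic value vector.
import math

def softmax_classify(theta, x, labels):
    """Return the most probable label and the full probability vector."""
    z = [sum(c * v for c, v in zip(row, x)) for row in theta]
    m = max(z)                                  # stabilise the exponentials
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    probs = [e / total for e in exps]
    return labels[probs.index(max(probs))], probs
```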
The above description is only one embodiment of the present invention and is not intended to limit its scope; all equivalent structural changes made using the contents of this specification and the drawings, whether applied directly or indirectly in other related technical fields, are included within the scope of the present invention.
Claims (5)
1. A motion pattern recognition method for a wearable device on which an acceleration sensor is provided, comprising the steps of:
acquiring signals, namely acquiring motion information through the acceleration sensor;
extracting characteristics, namely processing the motion information and extracting motion characteristics from the motion information;
identifying and outputting, judging the motion type according to the motion characteristics, and identifying the corresponding motion type;
in the signal acquisition step, the acceleration values of a plurality of acceleration sensors arranged at different positions are acquired in real time, and the variation of the acceleration values within a set time is acquired;
in the identification output step, the variation of the acceleration value is divided into motion grades, each motion grade corresponding to one or more motion types; the motion grades comprise six levels, 0 to 5, each level being defined by a corresponding range of the variation of the acceleration value: when the motion grade of the variation of the acceleration value is level 0, the acceleration sensor is in a static state and the corresponding motion type is a sleeping or sitting state; when the motion grade is level 1, the corresponding motion type is riding a vehicle; when the motion grade is level 2, the corresponding motion type is cycling motion; when the motion grade is level 3, the corresponding motion types are walking motions, including walking, slow walking, mountain climbing and mountain descending; when the motion grade is level 4, the corresponding motion type is fast-walking motion, including heel-and-toe walking and jogging; when the motion grade is level 5, the corresponding motion type is running motion;
in the identification output step, the motion characteristic values are calculated and classified to identify the output motion types; specifically, the classification is performed with a logistic regression algorithm according to the formula:

z = theta * x

in the formula: theta is a coefficient matrix and x is the motion characteristic value vector; the probabilities of the different motion types are:

p_i = e^(z_i) / sum_{k=1..K} e^(z_k)

in the formula: p_i is the probability of the current motion type, e is the exponential base, and K is the number of motion types;

P = (p_1, p_2, p_3, p_4, p_5)

in the formula: P is the set of probabilities of the various motion patterns, p_1 indicates the probability of riding a vehicle, p_2 the probability of cycling motion, p_3 the probability of walking motion, p_4 the probability of fast-walking motion, and p_5 the probability of running motion;
in the identification output step, the motion types output by calculation and identification are further statistically classified, and the finally identified motion type is output; in the statistical classification, time-based statistical integration is performed on the motion types output by calculation and identification: one motion type is output per time window, the score of that motion type is incremented once, and the scores of the other motion types are synchronously decremented once per time window; when, after a number of time windows, the accumulated score of one motion type exceeds a preset threshold, that motion type is finally statistically identified and output;
wherein each motion type is assigned the same score interval [x, y] and the same threshold z, the score interval being [-100, 100] and the threshold being 90; the score of each motion type starts from 0, and each time one motion type is output, the score of that motion type is incremented by 1 and the scores of the remaining motion types are decremented by 1; the motion type whose score is greater than the threshold 90 is output as the final motion type.
2. The motion pattern recognition method for the wearable device according to claim 1, wherein the feature extraction step includes performing filtering processing on the variation of the acceleration value to obtain a moving average.
3. The motion pattern recognition method for the wearable device according to claim 2, wherein in the feature extraction step, a time window is set, and the moving average is subjected to motion feature extraction within the time window to obtain a motion feature value.
4. A motion pattern recognition apparatus for a wearable device, comprising:
the signal acquisition module acquires motion information through the acceleration sensor;
the characteristic extraction module is used for processing the motion information and extracting motion characteristics from the motion information;
the identification output module is used for judging the motion type according to the motion characteristics and identifying the corresponding motion type;
in the signal acquisition module, acquiring acceleration values of a plurality of acceleration sensors arranged at different positions in real time, and acquiring the variation of the acceleration values within a set time;
in the identification output module, the variation of the acceleration value is divided into motion grades, each motion grade corresponding to one or more motion types; the motion grades comprise six levels, 0 to 5, each level being defined by a corresponding range of the variation of the acceleration value: when the motion grade of the variation of the acceleration value is level 0, the acceleration sensor is in a static state and the corresponding motion type is a sleeping or sitting state; when the motion grade is level 1, the corresponding motion type is riding a vehicle; when the motion grade is level 2, the corresponding motion type is cycling motion; when the motion grade is level 3, the corresponding motion types are walking motions, including walking, climbing and descending; when the motion grade is level 4, the corresponding motion type is fast-walking motion, including heel-and-toe walking and jogging; when the motion grade is level 5, the corresponding motion type is running motion;
in the identification output module, the motion characteristic values are calculated and classified to identify the output motion types; the classification is performed with a logistic regression algorithm according to the formula:

z = theta * x

in the formula: theta is a coefficient matrix and x is the motion characteristic value vector; the probabilities of the different motion types are:

p_i = e^(z_i) / sum_{k=1..K} e^(z_k)

in the formula: p_i is the probability of the current motion type, e is the exponential base, and K is the number of motion types;

P = (p_1, p_2, p_3, p_4, p_5)

in the formula: P is the set of probabilities of the various motion patterns, p_1 indicates the probability of riding a vehicle, p_2 the probability of cycling motion, p_3 the probability of walking motion, p_4 the probability of fast-walking motion, and p_5 the probability of running motion;
the identification output module further statistically classifies the motion types output by calculation and identification, and finally statistically identifies and outputs the motion type; in the statistical classification, time-based statistical integration is performed on the motion types output by calculation and identification: one motion type is output per time window, the score of that motion type is incremented once, and the scores of the other motion types are synchronously decremented once per time window; when, after a number of time windows, the accumulated score of one motion type exceeds a preset threshold, that motion type is finally statistically identified and output;
wherein each motion type is assigned the same score interval [x, y] and the same threshold z, the score interval being [-100, 100] and the threshold being 90; the score of each motion type starts from 0, and each time one motion type is output, the score of that motion type is incremented by 1 and the scores of the remaining motion types are decremented by 1; the motion type whose score is greater than the threshold 90 is output as the final motion type.
5. The motion pattern recognition apparatus for the wearable device according to claim 4, wherein the identification output module comprises an identification output sub-module and a statistical integration sub-module; the identification output sub-module calculates and classifies the motion characteristic values to obtain the outputtable motion types, and the statistical integration sub-module statistically integrates the output motion types and outputs the motion type whose score is greater than the threshold.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010659319.0A CN111772639B (en) | 2020-07-09 | 2020-07-09 | Motion pattern recognition method and device for wearable equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111772639A CN111772639A (en) | 2020-10-16 |
CN111772639B true CN111772639B (en) | 2023-04-07 |
Family
ID=72759483
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010659319.0A Active CN111772639B (en) | 2020-07-09 | 2020-07-09 | Motion pattern recognition method and device for wearable equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111772639B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113723987A (en) * | 2021-07-16 | 2021-11-30 | 北京马蹄铁科技有限责任公司 | Method and system for identifying touch type, computer equipment and storage medium |
CN114234963A (en) * | 2021-12-20 | 2022-03-25 | 北京华如科技股份有限公司 | Device and method for recognizing individual posture in individual training confrontation device |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4875010B2 (en) * | 2008-02-27 | 2012-02-15 | 株式会社東芝 | Exercise class classification device and tracking processing device |
TWI490011B (en) * | 2010-07-01 | 2015-07-01 | Ind Tech Res Inst | System and method for analyzing |
US20150164377A1 (en) * | 2013-03-13 | 2015-06-18 | Vaidhi Nathan | System and method of body motion analytics recognition and alerting |
US20150153380A1 (en) * | 2013-10-30 | 2015-06-04 | Invensense, Inc. | Method and system for estimating multiple modes of motion |
US20170188895A1 (en) * | 2014-03-12 | 2017-07-06 | Smart Monitor Corp | System and method of body motion analytics recognition and alerting |
CN105589977B (en) * | 2014-10-23 | 2019-01-25 | 安徽华米信息科技有限公司 | A kind of times of exercise monitoring method and device |
WO2016061668A1 (en) * | 2014-10-23 | 2016-04-28 | 2352409 Ontario Inc. | Device and method for identifying subject's activity profile |
CN206026334U (en) * | 2016-05-03 | 2017-03-22 | 广东乐心医疗电子股份有限公司 | Motion amount detection device and intelligent wearable equipment comprising same |
CN109002189B (en) * | 2017-06-07 | 2021-09-07 | 斑马智行网络(香港)有限公司 | Motion recognition method, device, equipment and computer storage medium |
CN107669278B (en) * | 2017-09-22 | 2020-11-13 | 广州杰赛科技股份有限公司 | Motion state recognition method and system and animal behavior recognition system |
US11099208B2 (en) * | 2018-10-30 | 2021-08-24 | Stmicroelectronics S.R.L. | System and method for determining whether an electronic device is located on a stationary or stable surface |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||