CN111288986B - Motion recognition method and motion recognition device - Google Patents


Publication number
CN111288986B
Authority
CN
China
Prior art keywords: motion, user, data, characteristic data, state
Legal status: Active
Application number
CN201911409891.5A
Other languages
Chinese (zh)
Other versions
CN111288986A
Inventor
董春娇
商春恒
Current Assignee
China Science Pengzhou Intelligent Industry Innovation Center Co ltd
Original Assignee
China Science Pengzhou Intelligent Industry Innovation Center Co ltd
Priority date
Filing date
Publication date
Application filed by China Science Pengzhou Intelligent Industry Innovation Center Co ltd
Priority to CN201911409891.5A
Publication of CN111288986A
Application granted
Publication of CN111288986B


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration

Abstract

The invention provides a motion recognition method and a motion recognition device. Motion data of a user is collected, first characteristic data representing the motion state is calculated from it, and whether the user is in a static state or a motion state is judged from the first characteristic data. In response to the user being in a motion state, second characteristic data representing the motion type is calculated from the motion data, and the motion type is judged from the second characteristic data. According to the determined motion type, third characteristic data representing the motion action is calculated, and the motion action of the user is judged from the third characteristic data. In the second classification, habits of human motion can be exploited for dynamic motion-type judgment: low-frequency judgment is maintained while the judgment results are consistent, and high-frequency judgment is started when they scatter. The method does not need to calculate all feature data simultaneously and automatically adjusts the type-judgment frequency, which effectively reduces the amount of computation and ensures the real-time performance of the method without affecting classification accuracy.

Description

Motion recognition method and motion recognition device
Technical Field
The invention belongs to the technical field of motion recognition, and particularly relates to a motion recognition method and a motion recognition device.
Background
On the market, motion data is mainly collected with accelerometers and gyroscopes. These sensors have the advantages of low cost, small size, and low power consumption and are widely applied; many wristbands use them as pedometers. For further processing of the collected motion data, such as acceleration or angular velocity, the traditional multi-class recognition method first selects a number of feature data, applies normalization and dimension-reduction transforms, then feeds the feature data into a classifier, which directly outputs a classification result. Because many feature data must be extracted during processing, and the richer the selected feature data the higher the model accuracy, extracting feature data in real time still incurs a large computational overhead even with dimension-reduction selection; moreover, the computation does not stop when no motion is taking place, which wastes power.
Therefore, to solve the above problems, it is necessary to develop an intelligent and simple motion recognition approach that classifies and extracts different feature data stage by stage, thereby reducing the amount of computation.
Disclosure of Invention
The present invention is directed to solving at least one of the technical problems in the prior art, and provides a motion recognition method, a motion recognition apparatus, an electronic device, and a computer-readable storage medium.
In a first aspect of the present invention, a motion recognition method is provided, which specifically includes the following steps:
acquiring motion data of a user, wherein the motion data comprises triaxial acceleration data and triaxial angular velocity data;
calculating first characteristic data representing the motion state of the user according to the motion data, and judging whether the user is in a static state or a motion state according to the first characteristic data;
responding to the motion state of the user, calculating second characteristic data representing the motion type of the user according to the motion data, and judging the motion type of the user according to the second characteristic data;
and according to the determined motion type, calculating third characteristic data representing motion actions according to the motion data, and judging the motion actions of the user according to the third characteristic data.
Optionally, the acquiring motion data of the user includes:
and acquiring triaxial acceleration data and triaxial angular velocity data of the user by utilizing a six-axis sensor.
Optionally, the determining, according to the first feature data, whether the user is in a stationary state or a moving state includes:
calculating the maximum value of the actual resultant acceleration once every preset first time period, comparing it with a first preset value, and recording the number of times the first preset value is exceeded;
and determining whether, within a preset second time period, the number of times the first preset value is exceeded exceeds a second preset value; if so, the user is judged to be in a motion state, and if not, in a static state, wherein the second time period comprises a plurality of first time periods.
Optionally, the determining the motion type of the user according to the second feature data includes:
setting a first detection frequency, detecting the motion state of the user for multiple times according to the first detection frequency, respectively calculating second characteristic data corresponding to the motion state every time, and judging the motion type of the user according to the second characteristic data;
in response to the multiple judgment results being consistent, outputting the motion type so determined and switching to a second detection frequency to detect the motion state of the user multiple times;
in response to the multiple judgment results being scattered, continuing to maintain the first detection frequency;
wherein the first detection frequency is greater than the second detection frequency.
Optionally, the switching to the second detection frequency to detect the motion state of the user for multiple times includes:
respectively calculating second characteristic data corresponding to the motion state each time, and judging the motion type of the user according to the second characteristic data;
in response to the multiple judgment results being consistent, outputting the motion type so determined and continuing to maintain the second detection frequency;
and in response to the multiple judgment results being scattered, switching to the first detection frequency.
Optionally, a random forest is used as the classifier: judgment is performed on the second feature data, and the classification result of the random forest is output as the motion type of the user;
if the output motion type is the same as the last output motion type, the results are consistent; if not, the results are scattered.
Optionally, the determining the motion action of the user according to the third feature data includes:
selecting a classification model matched with the current motion type from a pre-stored classification model set;
and inputting the third feature data into the selected classification model, and judging the motion action of the user according to the classification result output by the classification model.
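As a non-limiting illustration, the selection of a classification model matching the current motion type from a pre-stored set can be sketched as follows. The dictionary representation, the model and action names, and the toy decision rules are all assumptions made for the example; the invention does not fix them.

```python
# Sketch: a pre-stored classification model set, represented as a mapping
# from motion type to a classify function over third feature data.
def make_model_set():
    def basketball_model(features):
        # Toy rule (assumed): a large z-axis ratio is read as a shooting action.
        return "shoot" if features.get("z_ratio", 0.0) > 0.6 else "dribble"

    def running_model(features):
        # Toy rule (assumed): a large peak resultant acceleration means sprinting.
        return "sprint" if features.get("max", 0.0) > 20.0 else "jog"

    return {"basketball": basketball_model, "running": running_model}

def judge_action(motion_type, third_features, model_set):
    model = model_set[motion_type]   # select the model matching the motion type
    return model(third_features)     # its classification result is the motion action
```

Because the model is chosen after the motion type is known, each model only ever sees the feature data relevant to its own motion type.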
In a second aspect of the present invention, there is provided a motion recognition apparatus comprising: an acquisition module for acquiring motion data of a user, the motion data comprising triaxial acceleration data and triaxial angular velocity data;
the motion state judgment module is used for calculating first characteristic data representing the motion state of the user according to the motion data and judging whether the user is in a static state or a motion state according to the first characteristic data;
responding to the motion state of the user, and calculating second characteristic data representing the motion type of the user according to the motion data;
the motion type judging module is used for judging the motion type of the user according to the second characteristic data;
and the motion action judging module is used for calculating third characteristic data representing motion actions according to the determined motion type and the motion data and judging the motion actions of the user according to the third characteristic data.
Optionally, the acquisition module adopts a six-axis sensor.
In a third aspect of the present invention, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, implements the motion recognition method described above.
Compared with the prior art, the motion recognition method and motion recognition device provided by the invention have the following beneficial effects. The invention provides a three-stage classification method: the first classification distinguishes the static state from the motion state; once a motion state is determined, the second classification distinguishes different motion types; then, according to the motion type, different feature data and classification models are selected for the third classification, namely action classification. The method determines the motion state, motion type, and motion action step by step, using only the feature data required at each stage, so as to recognize motion accurately. In the second classification, a dynamic motion-type judgment can be adopted that exploits habits of human motion: low-frequency judgment is maintained while the judgment results are consistent, and high-frequency judgment is started when they scatter. The method does not need to calculate all feature data simultaneously and automatically adjusts the type-judgment frequency, which effectively reduces the amount of computation and ensures the real-time performance of the method without affecting classification accuracy.
Drawings
Fig. 1 is a schematic flow chart of a motion recognition method according to a first embodiment of the present invention;
FIG. 2 is a block diagram illustrating a motion recognition method according to a first embodiment of the present invention;
FIG. 3 is a schematic block diagram of a first classification in the motion recognition method according to the first embodiment of the present invention;
FIG. 4 is a schematic block diagram of a second classification in the motion recognition method according to the first embodiment of the present invention;
fig. 5 is a block diagram schematically illustrating a motion recognition apparatus according to a second embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
As shown in fig. 1 and fig. 2, a first aspect of the present invention provides a motion recognition method. Motion data of a user is collected, the motion data comprising three-axis acceleration data and three-axis angular velocity data. First characteristic data representing the motion state of the user is calculated from the motion data, and whether the user is in a static state or a motion state is judged from the first characteristic data. In response to the user being in a motion state, second characteristic data representing the motion type is calculated from the motion data, and the motion type of the user is judged from the second characteristic data. According to the determined motion type, third characteristic data representing the motion action is calculated from the motion data, and the motion action of the user is judged from the third characteristic data.
It should be noted that, in the motion recognition method of the present invention, three-axis acceleration, three-axis angular velocity, three-axis spatial angle, and three-axis actual acceleration are obtained through a six-axis sensor, and feature data are then extracted for each axis. The feature data include, for each of the three axes, the actual acceleration maximum, minimum, average, median, peak, trough, and rise/fall time, as well as the corresponding feature values of the resultant acceleration and the angle between each axis direction and the horizontal. The richer the selected feature data, the higher the model accuracy; but even with dimension-reduction selection, extracting feature data in real time incurs a large computational overhead, and continuing to compute when no motion is taking place wastes power. Different motion types require different feature data for action classification, so judging the motion type in advance saves much work. The invention therefore provides a motion recognition method based on three-stage classification: the first classification distinguishes the static state from the motion state; after a motion state is determined, the second classification distinguishes different motion types; then, according to the motion type, different feature data and classification models are selected for the third classification, namely action classification.
The motion recognition method provided by the invention can be used for determining the motion state, the motion type and the motion action step by step according to the required characteristic data, and can realize accurate recognition of motion. In addition, all feature data do not need to be calculated at the same time in the process, so that the operation amount can be effectively reduced, the real-time performance of the identification method is ensured, and the classification precision is not influenced.
Specifically, as shown in fig. 1, in a first step, a six-axis sensor is used to acquire motion data of a user, where the motion data includes three-axis acceleration data and three-axis angular velocity data.
As shown in fig. 1 and fig. 2, the second step determines whether the user is in a moving state or a stationary state. Specifically, first feature data representing the motion state of the user is calculated from the collected motion data, and whether the user is stationary or moving is then judged from the first feature data, corresponding to the first classification in fig. 2. The first feature data here is the actual resultant acceleration.
Specifically, the maximum value of the actual resultant acceleration is calculated once every preset first time period and compared with a first preset value, and the number of times the first preset value is exceeded is recorded. Whether this count exceeds a second preset value within a preset second time period is then determined: if so, the user is judged to be in a motion state; otherwise, the user is judged to be in a static state. The second time period comprises a plurality of first time periods. That is, the present embodiment determines the motion state of the user by counting how often the actual resultant acceleration exceeds a threshold within a period of time.
In one example, the second time period comprises a plurality of first time periods, and a detection is performed in each first time period. In this embodiment, the motion state of the user is judged once per minute, and the maximum value of the actual resultant acceleration is calculated multiple times within that minute. For example, as shown in fig. 3, the maximum is calculated every 5 seconds; 1 is output if it exceeds the first preset value and 0 otherwise. The accumulated value of all outputs within the minute is then computed: if it exceeds the second preset value, the user is judged to be in a motion state; otherwise the user is judged to be stationary.
It should be noted that the static state of the present embodiment does not refer to absolute stillness; rather, the motion amplitude is small and does not exceed the first preset value. The first preset value and the second preset value are not particularly limited and can be set according to the specific motion.
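As a non-limiting illustration, the first classification described above can be sketched as follows. The sampling rate, the 5-second and 60-second window lengths, and both preset values are assumptions chosen for the example, not values fixed by the invention.

```python
import math

def resultant_acceleration(ax, ay, az):
    """Actual resultant acceleration from the three axis components."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def classify_motion_state(samples, sample_rate=50,
                          first_period_s=5,
                          first_preset=11.0, second_preset=6):
    """Sketch of the first classification (static vs. motion state).

    samples: (ax, ay, az) tuples covering one second time period (here 60 s).
    Every first time period (5 s) the maximum resultant acceleration is
    compared with the first preset value; if the number of exceedances
    within the second time period exceeds the second preset value, the
    user is judged to be in a motion state.
    """
    per_window = sample_rate * first_period_s
    exceed_count = 0
    for start in range(0, len(samples), per_window):
        window = samples[start:start + per_window]
        if not window:
            break
        max_resultant = max(resultant_acceleration(*s) for s in window)
        if max_resultant > first_preset:
            exceed_count += 1   # output 1 for this first time period
    return "motion" if exceed_count > second_preset else "static"
```

A wrist at rest reads roughly the gravity magnitude on the resultant, so it never crosses an 11 m/s^2 threshold, while vigorous movement repeatedly does.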
As shown in fig. 1 and fig. 2, in the third step, when the user is determined to be in a motion state, the motion type is further determined, for example whether the motion is walking, running, swimming, basketball, badminton, table tennis, or the like. The motion recognition method can distinguish the motion type in real time, corresponding to the second classification in fig. 2.
Specifically, judging the motion type of the user according to the second feature data includes: setting a first detection frequency, detecting the motion state of the user multiple times at the first detection frequency, calculating the second feature data corresponding to each detection, and judging the motion type from the second feature data. In response to the multiple judgment results being consistent, the motion type so determined is output and detection switches to a second detection frequency; in response to the results being scattered, the first detection frequency is maintained. The first detection frequency is greater than the second detection frequency. After switching to the second detection frequency, the motion state of the user is again detected multiple times: the second feature data corresponding to each detection is calculated and the motion type judged from it. In response to the multiple judgment results being consistent, the determined motion type is output and the second detection frequency maintained; in response to the results being scattered, detection switches back to the first detection frequency.
Further, the judgment of the motion type from the second feature data can use a random forest as the classifier. Specifically, judgment is performed on the second feature data of a plurality of detection periods, and the classification result of the random forest is output as the motion type of the user. If the output motion type is the same as the last output motion type, the results are consistent; if not, the results are scattered.
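As a non-limiting illustration, a random forest judgment with the consistency test can be sketched as follows using scikit-learn. The feature vectors (here max/min/mean of the resultant acceleration), class labels, and training data are synthetic assumptions made purely for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic second-feature vectors for two assumed motion types.
rng = np.random.default_rng(0)
walk = rng.normal(loc=[10.5, 9.0, 9.8], scale=0.3, size=(40, 3))
run = rng.normal(loc=[18.0, 6.0, 11.5], scale=0.5, size=(40, 3))
X = np.vstack([walk, run])
y = np.array(["walk"] * 40 + ["run"] * 40)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

def judge(second_feature_vec, last_type, clf=clf):
    """Classify one second-feature vector and apply the consistency test:
    the result is 'consistent' when it matches the last output type."""
    motion_type = clf.predict([second_feature_vec])[0]
    return motion_type, bool(motion_type == last_type)
```

The consistency flag is what drives the switch between high-frequency and low-frequency detection described above.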
It should be noted that the second feature data in the second classification include feature data of the actual resultant acceleration, such as the maximum, minimum, average, median, peaks, troughs, and rise/fall time. That is, the motion type can be judged in the second classification by calculating these features of the actual resultant acceleration.
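As a non-limiting illustration, the second feature data over a window of resultant-acceleration values can be computed as follows. The simple local-extremum rule for counting peaks and troughs is an assumption, since the text does not fix a method; rise/fall time is omitted for brevity.

```python
import statistics

def second_features(resultant):
    """Extract second feature data from a window of resultant accelerations."""
    peaks = sum(1 for i in range(1, len(resultant) - 1)
                if resultant[i - 1] < resultant[i] > resultant[i + 1])
    troughs = sum(1 for i in range(1, len(resultant) - 1)
                  if resultant[i - 1] > resultant[i] < resultant[i + 1])
    return {
        "max": max(resultant),
        "min": min(resultant),
        "mean": statistics.fmean(resultant),
        "median": statistics.median(resultant),
        "peaks": peaks,
        "troughs": troughs,
    }
```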
In one example, actual data collection shows that after entering a motion state a user typically continues the same exercise for about half an hour to an hour, and generally rests for at least one minute when switching between different exercises; the motion type can therefore be judged dynamically. Specifically, as shown in fig. 4, the second classification is divided into four processes. Process A represents the user just starting to move: high-frequency detection and classification are performed, i.e., the user is detected multiple times and the second feature data calculated at the first detection frequency, for example once per minute; a random forest judges the second feature data of the latest five one-minute periods, and if the results are consistent the result is output and detection switches to the second detection frequency, i.e., low-frequency detection. Process B represents the middle of the exercise: low-frequency detection and classification are performed at the second detection frequency, for example one minute of data is taken every five minutes; a random forest judges the second feature data of the latest five such periods, and if the results are consistent, low-frequency detection is maintained and the next five-minute period is judged, while if the results are inconsistent, detection switches back to high frequency. Process C represents scattered judgment results: detection switches back to the first detection frequency, i.e., high-frequency detection once per minute for five detections. Process D represents consistent judgment results: detection switches to the second detection frequency, i.e., low-frequency detection once every five minutes.
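As a non-limiting illustration, the dynamic switching between detection frequencies in processes A to D above can be sketched as a small state machine. The one-minute and five-minute periods follow the example in the text, while the class and method names are assumptions made for the sketch.

```python
HIGH_PERIOD_S = 60    # high frequency: one judgment per minute (processes A, C)
LOW_PERIOD_S = 300    # low frequency: one judgment per five minutes (processes B, D)

class TypeDetector:
    """Tracks the last output motion type and adjusts the detection period."""

    def __init__(self):
        self.period_s = HIGH_PERIOD_S   # process A: start with high frequency
        self.last_type = None

    def update(self, judged_type):
        """Feed one motion-type judgment; return the next detection period.

        Consistent results keep or switch to low-frequency detection;
        scattered results keep or switch back to high-frequency detection.
        """
        consistent = judged_type == self.last_type
        self.last_type = judged_type
        self.period_s = LOW_PERIOD_S if consistent else HIGH_PERIOD_S
        return self.period_s
```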
As shown in fig. 1 and fig. 2, the fourth step further determines the motion action according to the determined motion type. Specifically, third feature data representing the motion action is calculated from the motion data, and judging the motion action of the user from the third feature data includes: selecting, from a pre-stored set of classification models, the classification model matching the current motion type; inputting the third feature data into the selected model; and judging the motion action of the user from the classification result it outputs, corresponding to the third classification in fig. 2.
That is, the determined motion type is further subdivided: different feature data and classification models are selected and the third classification, namely action classification, is performed. For example, after the motion type is identified as basketball in the third step, it is necessary to further distinguish whether the action is shooting, dribbling, passing, or the like, and to output detailed data such as the angle and acceleration of each action. Accordingly, more feature data can be calculated for classification depending on the motion type; for a basketball action, for example, the ratios of the x-axis, y-axis, and z-axis accelerations to the resultant acceleration can be calculated to judge the force of the arm in the different directions.
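As a non-limiting illustration, the axis-ratio features mentioned for basketball actions can be computed as follows. The function name is an assumption, and a nonzero resultant acceleration is assumed for the division.

```python
import math

def axis_ratios(ax, ay, az):
    """Ratio of each axis acceleration to the resultant acceleration.

    Used to judge the force of the arm in different directions; assumes
    the resultant acceleration is nonzero.
    """
    resultant = math.sqrt(ax * ax + ay * ay + az * az)
    return {
        "x_ratio": ax / resultant,
        "y_ratio": ay / resultant,
        "z_ratio": az / resultant,
    }
```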
A second aspect of the present invention, as shown in fig. 5, provides a motion recognition apparatus 100 comprising: an acquisition module 101 configured to acquire motion data of a user, the motion data comprising three-axis acceleration data and three-axis angular velocity data, the acquisition module employing a six-axis sensor; a motion state judgment module 102 configured to calculate first feature data representing the motion state of the user from the motion data, to judge from the first feature data whether the user is stationary or in motion, and, in response to the user being in a motion state, to calculate from the motion data second feature data representing the motion type of the user; a motion type judgment module 103 configured to judge the motion type of the user from the second feature data; and a motion action judgment module 104 configured to calculate third feature data representing the motion action according to the determined motion type and the motion data, and to judge the motion action of the user from the third feature data.
In a third aspect of the present invention, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, implements the motion recognition method described above. The computer readable medium may be included in the apparatus, device, or system described above, or may exist separately.
The computer readable storage medium may be any tangible medium that can contain or store a program, and may be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, more specific examples of which include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, an optical fiber, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination thereof.
The computer readable storage medium may also include a propagated data signal with computer readable program code embodied therein, for example as part of a carrier wave, the carrier wave being any suitable form for carrying the program code.
It will be understood that the above embodiments are merely exemplary embodiments taken to illustrate the principles of the present invention, which is not limited thereto. It will be apparent to those skilled in the art that various modifications and improvements can be made without departing from the spirit and substance of the invention, and these modifications and improvements are also considered to be within the scope of the invention.

Claims (8)

1. A motion recognition method is characterized by comprising the following steps:
acquiring motion data of a user, wherein the motion data comprises triaxial acceleration data and triaxial angular velocity data;
calculating first characteristic data representing the motion state of the user according to the motion data, and judging whether the user is in a static state or a motion state according to the first characteristic data; responding to the motion state of the user, calculating second characteristic data representing the motion type of the user according to the motion data, and judging the motion type of the user according to the second characteristic data;
wherein the determining the motion type of the user according to the second feature data includes: setting a first detection frequency, detecting the motion state of the user for multiple times according to the first detection frequency, respectively calculating second characteristic data corresponding to the motion state every time, and judging the motion type of the user according to the second characteristic data;
in response to the multiple judgment results being consistent, outputting the motion type so determined and switching to a second detection frequency to detect the motion state of the user multiple times;
in response to the multiple judgment results being scattered, continuing to maintain the first detection frequency;
the first detection frequency is greater than the second detection frequency;
wherein the switching to the second detection frequency to detect the motion state of the user a plurality of times comprises: respectively calculating second characteristic data corresponding to the motion state each time, and judging the motion type of the user according to the second characteristic data;
in response to the multiple judgment results being consistent, outputting the motion type so determined and continuing to maintain the second detection frequency;
in response to the multiple judgment results being scattered, switching to the first detection frequency;
and according to the determined motion type, calculating third characteristic data representing motion actions according to the motion data, and judging the motion actions of the user according to the third characteristic data.
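The adaptive two-frequency scheme of claim 1 can be sketched as follows; the concrete frequency values, the window size, and the function name are illustrative assumptions, not part of the claim.

```python
FAST_HZ = 50   # first (higher) detection frequency, assumed value
SLOW_HZ = 10   # second (lower) detection frequency, assumed value

def next_frequency(recent_types, current_hz, window=3):
    """Pick the detection frequency after the latest motion-type judgments.

    recent_types: the last few motion-type determinations.
    Consistent results -> drop to (or stay at) the low frequency;
    scattered results -> rise to (or stay at) the high frequency.
    """
    if len(recent_types) < window:
        return current_hz               # not enough judgments yet
    if len(set(recent_types)) == 1:     # results consistent
        return SLOW_HZ
    return FAST_HZ                      # results scattered
```

For example, three consecutive identical judgments made at the high frequency would switch the detector to the low frequency, saving power until the results scatter again.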
2. The motion recognition method of claim 1, wherein the acquiring motion data of the user comprises:
acquiring the triaxial acceleration data and the triaxial angular velocity data of the user using a six-axis sensor.
3. The motion recognition method according to claim 2, wherein determining whether the user is in a stationary state or a moving state according to the first characteristic data comprises:
calculating a maximum value of the actual combined acceleration once per preset first time period, comparing that maximum with a first preset value, and counting the number of times the first preset value is exceeded;
and determining whether, within a preset second time period, the count of times the first preset value was exceeded exceeds a second preset value; if so, the user is determined to be in a moving state, and if not, in a stationary state, wherein the second time period comprises a plurality of the first time periods.
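A minimal sketch of the windowed thresholding in claim 3; the numeric thresholds and the window layout are assumptions chosen for illustration, not values taken from the patent.

```python
import math

def is_moving(accel_windows, first_preset=11.0, second_preset=3):
    """Claim-3-style static/motion decision (illustrative thresholds).

    accel_windows: one list of (ax, ay, az) samples per first time
    period; the whole list spans the second time period.
    """
    exceed_count = 0
    for window in accel_windows:
        # maximum actual combined (resultant) acceleration in this period
        peak = max(math.sqrt(ax * ax + ay * ay + az * az)
                   for ax, ay, az in window)
        if peak > first_preset:
            exceed_count += 1
    # moving if the count of exceedances itself exceeds the second preset
    return exceed_count > second_preset
```

A device at rest reads roughly 1 g of combined acceleration from gravity alone, so the first preset value must sit above that baseline.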
4. The motion recognition method according to claim 1, wherein a random forest is used as the classifier, classification is performed on the second characteristic data, and the classification result of the random forest is output as the motion type of the user;
if the output motion type is the same as the previously output motion type, the results are deemed consistent; otherwise, the results are deemed scattered.
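As a concrete illustration of claim 4's classifier step, the sketch below trains a random forest on toy second-characteristic vectors; scikit-learn, the choice of features (mean combined acceleration, its standard deviation, dominant frequency), and the class labels are all assumptions, not part of the patent.

```python
from sklearn.ensemble import RandomForestClassifier

# Toy second characteristic data: [mean combined accel (m/s^2),
# accel std dev, dominant frequency (Hz)] -- illustrative values only.
X_train = [
    [9.8, 0.1, 0.0],    # still-like samples
    [11.0, 2.5, 1.8],   # walking-like samples
    [14.0, 5.0, 2.9],   # running-like samples
] * 10
y_train = ["still", "walk", "run"] * 10

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

# The classification result of the random forest is the user's motion type.
motion_type = clf.predict([[13.5, 4.8, 3.0]])[0]
```

Comparing `motion_type` against the previous output then yields the consistent/scattered signal that drives the frequency switching of claim 1.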
5. The motion recognition method according to any one of claims 1 to 3, wherein determining the motion action of the user according to the third characteristic data comprises:
selecting, from a pre-stored set of classification models, a classification model matched to the current motion type;
and inputting the third characteristic data into the selected classification model, and determining the motion action of the user according to the classification result output by the model.
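Claim 5's per-type model lookup can be sketched as below; the `ThresholdModel` stand-in, the motion types, and the action labels are hypothetical placeholders for whatever classifiers the pre-stored model set actually contains.

```python
class ThresholdModel:
    """Toy stand-in for a per-motion-type action classifier."""
    def __init__(self, boundary, low_action, high_action):
        self.boundary = boundary
        self.low_action = low_action
        self.high_action = high_action

    def classify(self, third_features):
        # e.g. third_features[0] = swing/stroke intensity from the motion data
        if third_features[0] > self.boundary:
            return self.high_action
        return self.low_action

# Pre-stored classification model set, keyed by motion type (assumed labels).
MODEL_SET = {
    "badminton": ThresholdModel(8.0, "clear", "smash"),
    "swim":      ThresholdModel(5.0, "breaststroke", "freestyle"),
}

def recognize_action(motion_type, third_features):
    model = MODEL_SET[motion_type]        # model matched to the current type
    return model.classify(third_features)
```

Keying the model set by motion type keeps each action classifier small: it only has to distinguish actions that can occur within one sport.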
6. A motion recognition device, comprising: an acquisition module, configured to acquire motion data of a user, the motion data comprising triaxial acceleration data and triaxial angular velocity data;
a motion state determining module, configured to calculate first characteristic data representing the motion state of the user from the motion data, and to determine, according to the first characteristic data, whether the user is in a stationary state or a moving state;
and, in response to the user being in a moving state, to calculate second characteristic data representing the motion type of the user from the motion data;
a motion type determining module, configured to determine the motion type of the user according to the second characteristic data, which comprises: setting a first detection frequency, detecting the motion state of the user multiple times at the first detection frequency, calculating the second characteristic data corresponding to each detection, and determining the motion type of the user from the second characteristic data each time;
in response to the multiple determination results being consistent, outputting the motion type so determined and switching to a second detection frequency to detect the motion state of the user multiple times;
in response to the multiple determination results being scattered, maintaining the first detection frequency;
wherein the first detection frequency is greater than the second detection frequency;
wherein switching to the second detection frequency to detect the motion state of the user multiple times comprises: calculating the second characteristic data corresponding to each detection, and determining the motion type of the user from the second characteristic data each time;
in response to the multiple determination results being consistent, outputting the motion type so determined and maintaining the second detection frequency;
in response to the multiple determination results being scattered, switching back to the first detection frequency;
and a motion action determining module, configured to calculate, according to the determined motion type, third characteristic data representing a motion action from the motion data, and to determine the motion action of the user according to the third characteristic data.
7. The motion recognition device of claim 6, wherein the acquisition module employs a six-axis sensor.
8. A computer-readable storage medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the motion recognition method according to any one of claims 1 to 5.
CN201911409891.5A 2019-12-31 2019-12-31 Motion recognition method and motion recognition device Active CN111288986B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911409891.5A CN111288986B (en) 2019-12-31 2019-12-31 Motion recognition method and motion recognition device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911409891.5A CN111288986B (en) 2019-12-31 2019-12-31 Motion recognition method and motion recognition device

Publications (2)

Publication Number Publication Date
CN111288986A CN111288986A (en) 2020-06-16
CN111288986B true CN111288986B (en) 2022-04-12

Family

ID=71022218

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911409891.5A Active CN111288986B (en) 2019-12-31 2019-12-31 Motion recognition method and motion recognition device

Country Status (1)

Country Link
CN (1) CN111288986B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111773651A (en) * 2020-07-06 2020-10-16 湖南理工学院 Badminton training monitoring and evaluating system and method based on big data
CN112304316B (en) * 2020-10-23 2021-11-26 重庆越致科技有限公司 Method and device for automatically detecting state and track of pedestrian taking elevator
CN114440884A (en) * 2022-04-11 2022-05-06 天津果实科技有限公司 Intelligent analysis method for human body posture for intelligent posture correction equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108076227A (en) * 2017-12-27 2018-05-25 上海传英信息技术有限公司 A kind of control method and mobile terminal for mobile terminal
US10019078B2 (en) * 2013-06-17 2018-07-10 Samsung Electronics Co., Ltd. Device, method, and system to recognize motion using gripped object
CN109260673A (en) * 2018-11-27 2019-01-25 北京羽扇智信息科技有限公司 A kind of movement method of counting, device, equipment and storage medium
CN109447128A (en) * 2018-09-29 2019-03-08 中国科学院自动化研究所 Walking based on micro- inertial technology and the classification of motions method and system that remains where one is
CN110135246A (en) * 2019-04-03 2019-08-16 平安科技(深圳)有限公司 A kind of recognition methods and equipment of human action


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Human Activity Recognition Using Smartphone Sensors With Two-Stage Continuous Hidden Markov Models; Charissa Ann Ronao; 2014 10th International Conference on Natural Computation; 2014-12-08; pp. 681-686 *
Basketball Sport Posture Recognition Based on Multi-Feature Fusion and Machine Learning; Wang Gaoxuan; Journal of Gansu Sciences; 2019-06-30; pp. 1-4 *

Also Published As

Publication number Publication date
CN111288986A (en) 2020-06-16

Similar Documents

Publication Publication Date Title
CN111288986B (en) Motion recognition method and motion recognition device
JP6064280B2 (en) System and method for recognizing gestures
CN104731307B (en) A kind of body-sensing action identification method and human-computer interaction device
CN105617638A (en) Badminton racket swinging movement recognizing method and device
CN108109336B (en) Human body falling identification method based on acceleration sensor
Jensen et al. Classification of kinematic swimming data with emphasis on resource consumption
CN106648078B (en) Multi-mode interaction method and system applied to intelligent robot
CN107469326A (en) A kind of swimming monitoring method for wearable device and device and wearable device
CN108245869B (en) Swimming information detection method and device and electronic equipment
KR20180020123A (en) Asynchronous signal processing method
CN110811578A (en) Step counting device and step counting method thereof, controller and readable storage medium
CN108021888B (en) Fall detection method
GB2529082A (en) Device and method for monitoring swimming performance
CN108195395A (en) Mobile terminal and its step-recording method, storage device
CN109758154B (en) Motion state determination method, device, equipment and storage medium
KR101870542B1 (en) Method and apparatus of recognizing a motion
CN106799025B (en) Ball hitting detection method, device, equipment and intelligent terminal
WO2018014432A1 (en) Voice application triggering control method, device and terminal
CN109793497A (en) A kind of sleep state recognition methods and device
CN114533010A (en) Heart rate detection method and device
CN111167105B (en) Shooting detection method, device, equipment, system and storage medium
US10569135B2 (en) Analysis device, recording medium, and analysis method
CN111803902B (en) Swimming stroke identification method and device, wearable device and storage medium
KR101958334B1 (en) Method and apparatus for recognizing motion to be considered noise
CN108009504A (en) A kind of recognition methods of moving sphere, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant