CN116631046A - Motion trail judging method and device, wearable device and storage medium - Google Patents

Motion trail judging method and device, wearable device and storage medium

Info

Publication number
CN116631046A
CN116631046A
Authority
CN
China
Prior art keywords
motion
action
information
sensing data
track
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210125751.0A
Other languages
Chinese (zh)
Inventor
姚丽峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202210125751.0A
Publication of CN116631046A
Legal status: Pending

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/08: Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The disclosure provides a motion trail judging method and device, a wearable device and a storage medium, wherein the method includes: acquiring action information identified by a sensor; generating a first motion track according to the action information; performing data analysis on the first motion track to determine motion action information; and generating motion quality information based on the motion action information.

Description

Motion trail judging method and device, wearable device and storage medium
Technical Field
The disclosure relates to the technical field of data processing, and in particular relates to a motion trail judging method and device, a wearable device and a storage medium.
Background
With the improvement of society and living standards, people pay more attention to health and to exercising scientifically. An emerging exercise mode has thus arisen in which wearable devices are integrated into exercise: the wearable device is used to judge the user's motion trail, and the judgment result can provide scientific reference opinions for the user's exercise.
In the related art, the motion trail of the user is judged along a single dimension, so the judgment result of the user's motion trail has poor reference value.
Disclosure of Invention
The present disclosure aims to solve, at least to some extent, one of the technical problems in the related art.
Therefore, an object of the present disclosure is to provide a motion trail judging method and device, a wearable device, a storage medium, and a computer program product that perform data analysis on a first motion trail generated from action information identified by sensors, so as to judge the motion trail of the user. This effectively improves the comprehensiveness, accuracy, and overall effect of motion trail judgment, and the diversified motion quality information obtained from the judgment offers better reference and guidance value.
The motion trail judgment method provided by the embodiment of the first aspect of the present disclosure includes:
acquiring action information identified by a sensor;
generating a first motion trail according to the motion information;
performing data analysis on the first motion trail to determine motion action information; and
generating motion quality information based on the motion action information.
In some embodiments of the present disclosure, obtaining motion information identified by a sensor includes:
acquiring motion sensing data of a sensor;
generating space track information of the movement part according to the motion sensing data; and
the spatial trajectory information is used as the motion information.
In some embodiments of the present disclosure, generating spatial trajectory information of a motion site from motion sensing data includes:
generating a space motion track of the motion part according to the motion sensing data;
generating the height of the motion track of the motion part according to the motion sensing data; and
taking the space motion track and the motion track height together as the space track information.
In some embodiments of the present disclosure, there are a plurality of pieces of motion sensing data, the plurality of pieces of motion sensing data respectively corresponding to a plurality of marking times;
wherein generating the space motion track of the motion part according to the motion sensing data includes:
processing the plurality of pieces of motion sensing data respectively to obtain a plurality of single action track features respectively corresponding to the plurality of marking times;
generating a multi-action track feature associated with at least part of the marking times according to the plurality of single action track features; and
generating the space motion track of the motion part according to the plurality of single action track features and the multi-action track feature.
In some embodiments of the present disclosure, the motion sensing data includes: acceleration data, rotation motion data and magnetic force sensing data corresponding to the user action at the corresponding marking time;
wherein processing the plurality of pieces of motion sensing data respectively to obtain the plurality of single action track features respectively corresponding to the plurality of marking times includes:
determining action gesture information of the user action at the corresponding marking time according to the acceleration data, the rotation motion data and the magnetic force sensing data; and
determining the single action track feature of the user action at the corresponding marking time according to the action gesture information.
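The embodiments above do not fix a particular algorithm for turning acceleration, rotation, and magnetic data into action gesture information. As a hedged sketch only (not the patent's method): roll and pitch can be estimated from the accelerometer's gravity reading and a tilt-compensated heading from the magnetometer; a real device would typically also blend in the rotation (gyroscope) data with a complementary or Kalman filter, which is omitted here:

```python
import math

def action_gesture(accel, mag):
    """Roughly estimate the action gesture (roll, pitch, yaw in radians) at
    one marking time from accelerometer and magnetometer readings.
    Inputs are (x, y, z) tuples; gyroscope fusion is omitted."""
    ax, ay, az = accel
    # Gravity direction gives roll and pitch when the device is quasi-static.
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    mx, my, mz = mag
    # Tilt-compensate the magnetic vector, then take the horizontal heading.
    mx2 = mx * math.cos(pitch) + mz * math.sin(pitch)
    my2 = (mx * math.sin(roll) * math.sin(pitch)
           + my * math.cos(roll)
           - mz * math.sin(roll) * math.cos(pitch))
    yaw = math.atan2(-my2, mx2)
    return roll, pitch, yaw
```

A sequence of such gesture estimates, one per marking time, would then form the single action track features described above.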
In some embodiments of the present disclosure, obtaining motion information identified by a sensor further includes:
acquiring relevant sensing data of the action;
determining, according to the related sensing data, whether the action is a rope skipping motion action; and
if the action is a rope skipping motion action, acquiring the action information identified by the sensor.
In some embodiments of the present disclosure, acquiring relevant sensed data of an action includes:
acquiring sensing data to be identified of the action, wherein the sensing data to be identified is used as the related sensing data; and/or
acquiring image sensing data of the action scene, wherein the image sensing data is used as the related sensing data; and/or
acquiring sound sensing data of the action scene, wherein the sound sensing data is used as the related sensing data.
In some embodiments of the present disclosure, the relevant sensed data is sensed data to be identified;
wherein determining whether the action is a rope-skipping action based on the relevant sensed data comprises:
determining rope skipping action characteristics of rope skipping action;
analyzing action characteristics to be matched of sensing data to be identified;
and if the action characteristic to be matched is matched with the rope skipping action characteristic, determining that the action is a rope skipping action.
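The claims leave the matching criterion open. A minimal illustration (not the patent's method) is a cosine-similarity comparison between the feature to be matched and a reference rope skipping action feature; the reference vector and threshold below are purely hypothetical:

```python
import math

# Hypothetical reference feature vector for a rope skipping action; in
# practice this would be learned or measured, not hard-coded.
ROPE_SKIP_FEATURE = [1.0, 0.8, 0.2]
MATCH_THRESHOLD = 0.9

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def is_rope_skipping(feature_to_match):
    """Match the feature extracted from the to-be-identified sensing data
    against the rope skipping reference feature."""
    return cosine_similarity(feature_to_match, ROPE_SKIP_FEATURE) >= MATCH_THRESHOLD
```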
In some embodiments of the present disclosure, the related sensing data is image sensing data;
wherein determining whether the action is a rope-skipping action based on the relevant sensed data comprises:
determining sensed image characteristics of the image sensing data;
identifying, according to the sensed image characteristics, whether the image sensing data includes rope skipping body data;
if the image sensing data includes rope skipping body data, acquiring interaction data of the action acting on the rope skipping body; and
if the interaction data satisfies a motion condition, determining that the action is a rope skipping motion action.
In some embodiments of the present disclosure, generating a first motion trajectory from motion information includes:
and generating a first motion track according to the space track information of the motion part.
In some embodiments of the present disclosure, generating a first motion trajectory from spatial trajectory information of a motion portion includes:
determining candidate motion trajectories matched with the space trajectory information from a plurality of candidate motion trajectories;
and taking the matched candidate motion trail as a first motion trail.
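One hedged way to realize this matching, assuming the spatial trajectory information and the candidate motion trajectories are sampled as equal-length numeric sequences, is nearest-template selection by squared distance:

```python
def match_candidate_trajectory(spatial_info, candidates):
    """Pick the candidate motion trajectory closest to the observed spatial
    trajectory information; both are equal-length point sequences."""
    def distance(traj):
        # Sum of squared pointwise differences to the observed trajectory.
        return sum((p - q) ** 2 for p, q in zip(spatial_info, traj))
    return min(candidates, key=distance)
```

The matched candidate would then serve as the first motion trail.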
In some embodiments of the present disclosure, performing data analysis on the first motion trail to determine motion action information includes:
performing action feature decomposition on the first motion trail to obtain action features to be analyzed;
determining the relevant rope skipping times of the rope skipping motion according to the action features to be analyzed;
determining the relevant rope skipping height of the rope skipping motion according to the action features to be analyzed; and
taking the relevant rope skipping times and the relevant rope skipping height together as the motion action information.
In some embodiments of the present disclosure, generating motion quality information based on the motion information includes:
determining analysis and evaluation information of rope skipping motion according to the relevant rope skipping height and motion characteristics to be analyzed;
taking the relevant rope skipping times and the analysis and evaluation information together as the motion quality information.
In some embodiments of the present disclosure, determining analysis and evaluation information of a rope-skipping motion based on a relevant rope-skipping height and a motion feature to be analyzed includes:
acquiring reference track height and reference action characteristics;
matching the relevant rope skipping height with the reference track height to obtain a first matching result;
matching the action feature to be analyzed with the reference action feature to obtain a second matching result;
and determining analysis and evaluation information of the rope skipping motion according to the first matching result and the second matching result.
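As an illustrative sketch of combining the two matching results into analysis and evaluation information (the scoring formula, weights, and the "standard"/"needs correction" labels are assumptions, not taken from the patent):

```python
def evaluate_rope_skipping(skip_height, reference_height,
                           action_feature, reference_feature,
                           tolerance=0.05):
    """Combine a height match (first matching result) and a feature match
    (second matching result) into analysis/evaluation information."""
    # First matching result: how close the skip height is to the reference.
    height_score = max(0.0, 1.0 - abs(skip_height - reference_height) / reference_height)
    # Second matching result: mean absolute deviation from the reference feature.
    feature_score = max(0.0, 1.0 - sum(abs(a - b) for a, b in
                                       zip(action_feature, reference_feature))
                        / len(action_feature))
    overall = 0.5 * height_score + 0.5 * feature_score
    return {
        "height_match": height_score,
        "feature_match": feature_score,
        "evaluation": "standard" if overall >= 1.0 - tolerance else "needs correction",
    }
```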
In some embodiments of the present disclosure, the method further comprises:
correcting the first motion trail by adopting a deep learning and/or machine learning method to obtain a target motion trail;
wherein performing data analysis on the first motion trail to determine motion action information comprises:
performing data analysis on the target motion trail to determine the motion action information.
In some embodiments of the present disclosure, the motion quality information comprises any one or a combination of the following:
action effect evaluation information;
action standard evaluation information;
action grade evaluation information;
action scoring index information;
physical stamina consumption information;
action error cause information;
action correction instruction information;
action consistency degree information.
According to the motion trail judging method provided by the embodiment of the first aspect of the disclosure, action information identified by a sensor is acquired, a first motion trail is generated according to the action information, data analysis is then performed on the first motion trail to determine motion action information, and motion quality information is generated based on the motion action information.
The motion trail determination device provided in the embodiment of the second aspect of the present disclosure includes:
the acquisition module is used for acquiring action information identified by the sensor;
the first generation module is used for generating a first motion trail according to the motion information;
the analysis module is used for carrying out data analysis on the first motion trail and determining motion action information;
and the second generation module is used for generating motion quality information based on the motion action information.
In some embodiments of the present disclosure, the acquiring module includes:
the first acquisition sub-module is used for acquiring motion sensing data of the sensor;
the generation sub-module is used for generating space track information of the movement part according to the motion sensing data; and
the first processing sub-module is used for taking the space track information as action information.
In some embodiments of the present disclosure, the generating sub-module is specifically configured to:
generating a space motion track of the motion part according to the motion sensing data;
generating the height of the motion track of the motion part according to the motion sensing data; and
and the space motion track and the motion track height are used as space track information together.
In some embodiments of the present disclosure, there are a plurality of pieces of motion sensing data, the plurality of pieces of motion sensing data respectively corresponding to a plurality of marking times;
wherein the generation sub-module is further used for:
processing the plurality of pieces of motion sensing data respectively to obtain a plurality of single action track features respectively corresponding to the plurality of marking times;
generating a multi-action track feature associated with at least part of the marking times according to the plurality of single action track features; and
generating the space motion track of the motion part according to the plurality of single action track features and the multi-action track feature.
In some embodiments of the present disclosure, the motion sensing data comprises: acceleration data, rotation motion data and magnetic force sensing data corresponding to the user action at the corresponding marking time;
wherein, the generation submodule is further used for:
determining action gesture information of the user action at the corresponding mark time according to the acceleration data, the rotation movement data and the magnetic force sensing data;
and determining the single action track characteristics of the user action at the corresponding marking time according to the action gesture information.
In some embodiments of the present disclosure, the acquiring module further includes:
the second acquisition sub-module is used for acquiring relevant sensing data of the action;
the first determining submodule is used for determining whether the action is a rope skipping motion action or not according to the related sensing data;
the third acquisition sub-module is used for acquiring the action information identified by the sensor when the action is a rope skipping motion action.
In some embodiments of the disclosure, the second obtaining submodule is specifically configured to:
acquiring sensing data to be identified of the action, wherein the sensing data to be identified is used as the related sensing data; and/or
acquiring image sensing data of the action scene, wherein the image sensing data is used as the related sensing data; and/or
acquiring sound sensing data of the action scene, wherein the sound sensing data is used as the related sensing data.
In some embodiments of the present disclosure, the relevant sensed data is sensed data to be identified;
the first determining sub-module is specifically configured to:
determining rope skipping action characteristics of rope skipping action;
analyzing action characteristics to be matched of sensing data to be identified;
and if the action characteristic to be matched is matched with the rope skipping action characteristic, determining that the action is a rope skipping action.
In some embodiments of the present disclosure, the related sensing data is image sensing data;
wherein the first determining sub-module is further used for:
determining sensed image characteristics of the image sensing data;
identifying, according to the sensed image characteristics, whether the image sensing data includes rope skipping body data;
if the image sensing data includes rope skipping body data, acquiring interaction data of the action acting on the rope skipping body; and
if the interaction data satisfies a motion condition, determining that the action is a rope skipping motion action.
In some embodiments of the disclosure, the first generating module is specifically configured to:
and generating a first motion track according to the space track information of the motion part.
In some embodiments of the disclosure, the first generation module is further configured to:
determining candidate motion trajectories matched with the space trajectory information from a plurality of candidate motion trajectories;
and taking the matched candidate motion trail as a first motion trail.
In some embodiments of the disclosure, the analysis module is specifically configured to:
performing action feature decomposition on the first motion trail to obtain action features to be analyzed;
determining the related rope skipping times of rope skipping motion according to the motion characteristics to be analyzed;
determining the relevant rope skipping height of the rope skipping motion according to the motion characteristics to be analyzed;
taking the relevant rope skipping times and the relevant rope skipping height together as the motion action information.
In some embodiments of the present disclosure, the second generating module includes:
the second determining submodule is used for determining analysis and evaluation information of rope skipping motion according to the relevant rope skipping height and motion characteristics to be analyzed;
and the second processing sub-module is used for jointly using the related rope skipping times and the analysis and evaluation information as motion quality information.
In some embodiments of the present disclosure, the second determining sub-module is specifically configured to:
acquiring reference track height and reference action characteristics;
matching the relevant rope skipping height with the reference track height to obtain a first matching result;
matching the action feature to be analyzed with the reference action feature to obtain a second matching result;
and determining analysis and evaluation information of the rope skipping motion according to the first matching result and the second matching result.
In some embodiments of the present disclosure, the device further comprises:
a correction module, used for correcting the first motion trail by adopting a deep learning and/or machine learning method to obtain a target motion trail;
the analysis module is specifically configured to:
performing data analysis on the target motion trail to determine the motion action information.
In some embodiments of the present disclosure, the motion quality information comprises any one or a combination of the following:
action effect evaluation information;
action standard evaluation information;
action grade evaluation information;
action scoring index information;
physical stamina consumption information;
action error cause information;
action correction instruction information;
action consistency degree information.
According to the motion trail judging device provided by the embodiment of the second aspect of the disclosure, action information identified by a sensor is acquired, a first motion trail is generated according to the action information, data analysis is then performed on the first motion trail to determine motion action information, and motion quality information is generated based on the motion action information.
An embodiment of a third aspect of the present disclosure provides a wearable device, including a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the processor executes the program to implement a motion trail determination method as set forth in the embodiment of the first aspect of the present disclosure.
An embodiment of a fourth aspect of the present disclosure proposes a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a motion trajectory judgment method as proposed by an embodiment of the first aspect of the present disclosure.
An embodiment of a fifth aspect of the present disclosure proposes a computer program product which, when instructions in the computer program product are executed by a processor, performs a motion trajectory judgment method as proposed by an embodiment of the first aspect of the present disclosure.
Additional aspects and advantages of the disclosure will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the disclosure.
Drawings
The foregoing and/or additional aspects and advantages of the present disclosure will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
fig. 1 is a flow chart of a motion trail determination method according to an embodiment of the disclosure;
fig. 2 is a flowchart of a motion trajectory judgment method according to another embodiment of the present disclosure;
fig. 3 is a flowchart of a motion trajectory judgment method according to another embodiment of the present disclosure;
fig. 4 is a flowchart of a motion trajectory judgment method according to another embodiment of the present disclosure;
Fig. 5 is a schematic structural diagram of a motion trajectory judgment device according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a motion trajectory judgment device according to another embodiment of the present disclosure;
fig. 7 illustrates a block diagram of an exemplary wearable device suitable for use in implementing embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for explaining the present disclosure and are not to be construed as limiting the present disclosure. On the contrary, the embodiments of the disclosure include all alternatives, modifications, and equivalents as may be included within the spirit and scope of the appended claims.
Fig. 1 is a flowchart of a motion trajectory determination method according to an embodiment of the present disclosure.
It should be noted that, the execution body of the motion trajectory determination method in this embodiment is a motion trajectory determination device, and the device may be implemented in software and/or hardware, and the device may be configured in a wearable device, for example, a smart watch, a smart bracelet, or the like, which is not limited thereto.
The wearable device is a portable device that is worn directly on the body or integrated into the clothing or accessories of the user. The wearable device is not only a hardware device, but also realizes corresponding functions through software support, data interaction and cloud interaction.
As shown in fig. 1, the motion trail determination method includes:
s101: and acquiring action information identified by the sensor.
The action information may be used to describe a motion action of the user, where the action may be, for example, jumping, walking, or running; accordingly, the action information may be, for example, information on the number of times the user jumps or the time the user walks, which is not limited.
In some embodiments, the action information identified by the sensor may be obtained as follows: the user wears a corresponding wearable device in advance before exercising, a corresponding sensor (which may be, for example, an accelerometer, a gyroscope, or the like, without limitation) is arranged in the wearable device in advance, and the corresponding action information is identified via the sensor in the wearable device. For example, before performing a rope skipping motion, the user may wear a wearable device equipped with an accelerometer and/or a gyroscope in advance; the number of the user's rope skipping actions is then collected via the accelerometer and/or the gyroscope in the wearable device, and the collected number of rope skipping actions is used as the corresponding action information, which is not limited.
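As a hedged illustration of how such a sensor reading could be turned into a rope skipping count (the threshold value is illustrative; a real device would calibrate it per user and filter noise):

```python
def count_skips(accel_magnitude, threshold=15.0):
    """Count rope skipping jumps as upward threshold crossings of the
    acceleration magnitude samples (m/s^2)."""
    count = 0
    above = False
    for a in accel_magnitude:
        if a >= threshold and not above:
            # Rising edge across the threshold: one jump.
            count += 1
            above = True
        elif a < threshold:
            above = False
    return count
```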
The action information recognized by the sensor may be action information acquired from various dimensions of the user's motion action; that is, the action information may be, for example, action image information acquired from the image dimension, action sound information acquired from the sound dimension, or action information acquired from any other dimension, which is not limited.
That is, in the embodiment of the present disclosure, sensors may be used to identify the user's motion action from multiple dimensions to obtain sensing data of the user's action, thereby obtaining multidimensional action information identified by the sensors.
For example, an image sensor and a sound sensor may be employed to collect, from the image dimension and the sound dimension respectively, action image information and action sound information of the user's action, and the collected action image information and action sound information may be used together as the action information identified by the sensors, which is not limited.
S102: generating a first motion trail according to the action information.
The first motion track may refer to a spatial track formed by one or more parts of the user's body during the motion, or it may refer to the spatial track of the swing of a sports apparatus held by the user during the motion. It may be, for example, the spatial track of the hand's arm swing while the user runs, or the height track of the body while the user jumps, which is not limited.
In some embodiments, the first motion trail may be generated according to the action information as follows: after the action information identified by the sensor is obtained, the action information is analyzed to obtain a corresponding analysis result, and the corresponding first motion trail is then generated according to the analysis result, which is not limited.
In other embodiments, after the action information of the user's motion is identified by an accelerometer and a gyroscope, the acquired action information may be fused by a sensor fusion algorithm to obtain the first motion track; alternatively, any other possible manner may be adopted to generate the first motion track from the action information. Further, the motion data may also be fused with data recorded by the user, such as height, weight, and arm length, to obtain whole-body motion information of the user, which is not limited.
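A naive one-dimensional sketch of what such a fusion step might produce (a real trajectory generator would fuse gyroscope and magnetometer data and correct integration drift, which this deliberately omits):

```python
def integrate_trajectory(accels, dt=0.01):
    """Naive dead reckoning: double-integrate gravity-removed acceleration
    samples into a 1-D position track sampled at interval dt (seconds)."""
    velocity, position = 0.0, 0.0
    track = [0.0]
    for a in accels:
        velocity += a * dt       # integrate acceleration -> velocity
        position += velocity * dt  # integrate velocity -> position
        track.append(position)
    return track
```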
S103: performing data analysis on the first motion trail to determine motion action information.
After the first motion trail is generated according to the action information, the embodiment of the disclosure may perform data analysis processing on the first motion trail to determine motion action information of the user's motion action.
The information describing the user's motion action may be referred to as motion action information, and the motion action information may be, for example, motion time information of the user's action, spatial trajectory information of the user's action, or the like.
In the embodiment of the disclosure, a pre-trained data analysis model may be adopted to perform data analysis on the first motion track and determine the motion action information; that is, the first motion track may be input into the pre-trained data analysis model, and the data analysis model analyzes the first motion track and outputs the motion action information of the user's motion action, which is not limited.
In some embodiments, the data analysis may also be performed by combining the first motion track with other criteria (for example, national standards or industry standards, without limitation on the criteria) to determine whether the user's motion action meets the motion criteria and whether the user's motion gesture is correct, and the obtained determination result is used as the motion action information, which is not limited.
S104: generating motion quality information based on the motion action information.
The information describing the motion quality of the user may be referred to as motion quality information, and the motion quality information may be information for evaluating the motion of the user, motion correction guidance information formed for the motion of the user, or the like, and is not limited thereto.
In an embodiment of the present disclosure, the motion quality information may include: action effect evaluation information, action standard evaluation information, action grade evaluation information, action scoring index information, physical energy consumption information, action error cause information, action correction instruction information, action consistency degree information, and the like, which is not limited.
The information for evaluating the action effect of the movement action of the user may be referred to as action effect evaluation information, and the action effect evaluation information may be, for example, evaluation content information of the action effect, and is not limited thereto.
The information for evaluating the motion standard of the exercise motion may be referred to as motion standard evaluation information, and the motion standard evaluation information may be, for example, standard degree information of the exercise motion, and is not limited thereto.
The exercise action may have a number of related scoring indexes, which may be, for example, a scoring index based on a national standard, a scoring index based on an industry standard, a ranking-based scoring index among users of the same type, and the like. Accordingly, the information describing these scoring indexes may be referred to as action scoring index information, which may be, for example, ranking information derived from the national-standard-based scoring index, ranking information derived from the industry-standard-based scoring index, and the like, which is not limited.
The user may have corresponding physical energy consumption when performing the exercise action, and the information describing the physical energy consumption may be referred to as physical energy consumption information, which may be, for example, calorie information consumed by the user, without limitation.
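As a hedged illustration of how such calorie information could be computed (the MET value and the formula are assumptions taken from standard exercise-physiology tables, not from this disclosure), a sketch of the estimate is:

```python
# Hypothetical sketch: estimating consumed calories from exercise duration using
# the common MET (Metabolic Equivalent of Task) formula. The MET value 12.3 for
# fast rope skipping is an assumption from published compendium tables.

def estimate_calories(met: float, weight_kg: float, duration_min: float) -> float:
    """kcal ~= MET * 3.5 * weight_kg / 200 * minutes (standard MET formula)."""
    return met * 3.5 * weight_kg / 200.0 * duration_min

# A 70 kg user skipping rope quickly for 10 minutes:
kcal = estimate_calories(met=12.3, weight_kg=70.0, duration_min=10.0)
print(round(kcal, 1))  # 150.7
```

In practice a wearable device would substitute duration and intensity inferred from the sensing data rather than fixed inputs.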
When the user performs the exercise action, an erroneous action may occur, which may be, for example, an action gesture error. Accordingly, the information describing the cause of the erroneous action made by the user may be referred to as action error cause information, which may be, for example, content information on the cause of the action error, which is not limited.
When the user performs the exercise action, corrective guidance may be provided for the user's exercise action. Accordingly, the information used to provide corrective guidance for the user's exercise action may be referred to as action correction instruction information, which may be, for example, corrective guidance opinion information, which is not limited.
The action consistency degree may be used to quantitatively describe the consistency (for example, the action completion condition, the action posture, etc.) between two actions made by the user. Accordingly, the information describing the action consistency degree may be referred to as action consistency degree information, which may be, for example, action consistency degree level information, which is not limited.
In some embodiments, generating the motion quality information may be analyzing the motion information to obtain a corresponding analysis result and generating the corresponding motion quality information according to the analysis result, or scoring the motion information of the user in combination with a corresponding motion scoring index to obtain corresponding scoring information and using the scoring information as the motion quality information. Of course, any other possible manner may be adopted to generate the motion quality information based on the motion information, which is not limited.
In this embodiment, the action information identified by the sensor is acquired, a first motion track is generated according to the action information, data analysis is performed on the first motion track to determine the motion information, and the motion quality information is generated based on the motion information. Since the motion track of the user is judged by performing data analysis on the first motion track generated from the action information identified by the sensor, the comprehensiveness of the motion track judgment, as well as its accuracy and judging effect, can be effectively improved, and the diversified motion quality information obtained by the motion track judgment can have better reference and guidance significance.
Fig. 2 is a flowchart of a motion trajectory judgment method according to another embodiment of the present disclosure.
As shown in fig. 2, the motion trajectory judging method includes:
s201: motion sensing data of the sensor is acquired.
The sensing data directly acquired by the sensor for the motion of the user during the motion of the user may be referred to as motion sensing data, and the motion sensing data may be, for example, time data of motion, sound data of motion, or the like.
In the embodiment of the disclosure, a corresponding sensor may be configured in advance for the user's exercise. The sensor may be worn on a body part of the user during exercise, may be configured in a sports apparatus held by the user during exercise, or may be integrated in a wearable device worn by the user during exercise, which is not limited. The motion sensing data of the sensor may then be acquired, and the subsequent motion trail determination method may be performed based on the motion sensing data, which is not limited.
S202: and generating space track information of the movement part according to the motion sensing data.
The exercise part refers to a body part of a user that initiates an exercise action during exercise, and may be, for example, a hand, a foot, etc. of the user, which is not limited.
The motion portion of the user may form a corresponding spatial track when the user moves, where the spatial track may be a spatial track of hand swing, a spatial track of a running route, and the like, and corresponding information for describing the spatial track may be referred to as spatial track information, and the spatial track information may specifically be, for example, position information of each track point in the track, acquisition time information corresponding to the corresponding track point, and the like, which is not limited.
In some embodiments, the spatial track information of the motion part is generated according to the motion sensing data, which may be that when the user moves, a sensor is used to continuously collect a plurality of motion sensing data of the user, and the spatial position point information of the hand of the user is obtained by analyzing the plurality of motion sensing data, and then the plurality of extracted spatial position point information of the hand of the user is used as the spatial track information of the motion part, which is not limited.
In other embodiments, the spatial track information of the motion part is generated according to the motion sensing data, or after the motion sensing data of the user motion is acquired, the acquired motion sensing data of the user motion is fused by using a sensor fusion algorithm to obtain the spatial track information of the user motion, or any other possible manner may be adopted to generate the spatial track information of the motion part according to the motion sensing data, which is not limited.
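As a minimal sketch of turning a stream of sensing samples into spatial track points (this is an assumption for illustration, not the disclosed fusion algorithm), acceleration samples can be dead-reckoned into positions by integrating twice over the sampling interval:

```python
# Assumed sketch: dead reckoning a spatial track from acceleration samples by
# double integration. Real devices fuse additional sensors to limit drift.

def dead_reckon(accels, dt):
    """accels: list of (ax, ay, az) in m/s^2; returns list of (x, y, z) points."""
    vel = [0.0, 0.0, 0.0]
    pos = [0.0, 0.0, 0.0]
    track = [tuple(pos)]
    for a in accels:
        for i in range(3):
            vel[i] += a[i] * dt   # v = v0 + a*dt
            pos[i] += vel[i] * dt # x = x0 + v*dt
        track.append(tuple(pos))
    return track

# Constant 1 m/s^2 along x for 1 s at 10 Hz moves the point ~0.55 m along x.
points = dead_reckon([(1.0, 0.0, 0.0)] * 10, dt=0.1)
```

The resulting list of position points corresponds to the "spatial position point information" described above.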
S203: the spatial trajectory information is used as the motion information.
In the embodiment of the disclosure, the motion sensing data of the sensor are acquired, so that spatial track information that accurately characterizes the user's motion part can be generated based on the motion sensing data. Therefore, when the spatial track information is used as the action information, the action information can accurately characterize the spatial track of the user's motion part, which effectively improves the referenceability of the action information and effectively assists, based on the action information, the smooth execution of the subsequent track determination method.
S204: and generating a first motion trail according to the motion information.
S205: and carrying out data analysis on the first motion trail and determining motion information.
S206: based on the motion information, motion quality information is generated.
For the descriptions of S204-S206, reference may be made to the above embodiments; details are not repeated here.
In this embodiment, the motion sensing data of the sensor are acquired, and spatial track information that accurately characterizes the user's motion part is generated based on the motion sensing data, so that when the spatial track information is used as the action information, the action information can accurately characterize the spatial track of the user's motion part. This effectively improves the referenceability of the action information and assists the smooth execution of the subsequent track judging method. A first motion track is then generated according to the action information, data analysis is performed on the first motion track to determine the motion information, and the motion quality information is generated based on the motion information, so that the comprehensiveness of the motion track judgment, as well as its accuracy and judging effect, can be effectively improved, and the diversified motion quality information obtained by the motion track judgment can have better reference and guidance significance.
Fig. 3 is a flowchart of a motion trajectory judgment method according to another embodiment of the present disclosure.
As shown in fig. 3, the motion trajectory judging method includes:
s301: motion sensing data of the sensor is acquired.
For the description of S301, reference may be made to the above embodiments; details are not repeated here.
S302: and generating a spatial motion track of the motion part according to the motion sensing data.
The spatial movement track may be a spatial track generated by the movement of a certain part and/or certain parts of the body when the user performs an action, for example, the spatial track generated by the user's arm swinging when running, or the spatial track of the foot from leaving the ground to landing when jumping. It may also be a spatial track generated by the movement of a sports apparatus used by the user when performing the action, for example, the spatial track generated by the swinging rope when the user skips rope, or the spatial track of the basketball from release to landing when the user shoots, which is not limited.
In some embodiments, generating the spatial motion track of the motion part according to the motion sensing data may be processing the acquired motion sensing data with a sensor data fusion algorithm to generate the spatial motion track of the motion part, or performing data fusion on the motion sensing data and data collected by an inertial measurement unit (Inertial Measurement Unit, IMU) and/or a geomagnetic meter to generate the spatial motion track of the motion part. Of course, any other possible manner may be adopted to generate the spatial motion track of the motion part according to the motion sensing data, which is not limited.
In the embodiment of the disclosure, the number of the motion sensing data may be plural, and the plural motion sensing data may correspond to the plural marking times, respectively.
In the embodiment of the disclosure, the user's action does not change at every moment and may be a regular, repeated action. Therefore, after the motion sensing data of the user are obtained, it is not necessary to process all of the motion sensing data; instead, the motion sensing data of the user may be marked with a plurality of time points such as T1, T2, T3, ..., Tn. These time points may be referred to as marking times, and the plurality of marking times may be consecutive time points or time points separated by a certain time interval, which is not limited.
In the embodiment of the disclosure, after the marking processing is performed on the plurality of motion sensing data, analysis processing may be performed on the motion sensing data corresponding to a certain marking time and/or certain marking times, or on the motion sensing data between any two marking times, so that the calculation amount of data processing can be effectively reduced and the data processing efficiency effectively improved.
Alternatively, in some embodiments, the spatial motion track of the motion part is generated according to the motion sensing data, which may be that the motion sensing data are processed separately to obtain a plurality of single motion track features corresponding to the marking times respectively, and a multi-motion track feature associated with at least part of the marking times is generated according to the single motion track features, and the spatial motion track of the motion part is generated according to the single motion track features and the multi-motion track features.
The action track made by the user and corresponding to an individual marking time may be referred to as a single-action track. Accordingly, a feature describing the single-action track may be referred to as a single-action track feature, which may be, for example, the spatial track of a limb part of the user at the marking time point, which is not limited.
In some embodiments, processing the plurality of motion sensing data respectively to obtain the plurality of single-action track features corresponding to the plurality of marking times may be processing the plurality of motion sensing data respectively to determine the motion sensing data of the user corresponding to each marking time, then analyzing the motion sensing data corresponding to the plurality of marking times to determine the spatial tracks of the user's limb parts corresponding to the plurality of marking times, and using these spatial tracks together as the plurality of single-action track features corresponding to the plurality of marking times.
In an embodiment of the present disclosure, the motion sensing data may include: acceleration data, rotational motion data and magnetic force sensing data corresponding to user actions at corresponding marking times.
The acceleration data refers to motion sensing data of a user acquired by an accelerometer in the execution process of the motion trail judging method.
The rotation motion data refers to motion sensing data of a user acquired by a gyroscope in the execution process of the motion trail judging method.
The magnetic force sensing data refer to motion sensing data of a user acquired by a magnetometer in the execution process of the motion trail judging method.
Alternatively, in other embodiments, the plurality of motion sensing data are processed respectively to obtain a plurality of single motion track features corresponding to the plurality of marking times respectively, which may be that motion gesture information of a user motion at the corresponding marking time is determined according to the acceleration data, the rotation motion data and the magnetic force sensing data, and then the single motion track features of the user motion at the corresponding marking time are determined according to the motion gesture information.
The information describing the action gesture of the user may be referred to as action gesture information, which may be the gesture information of a certain part and/or certain parts of the user's body, for example, the spatial pose of a certain part and/or certain parts of the user's body, which is not limited.
That is, in the embodiments of the present disclosure, the acceleration data, the rotational motion data, and the magnetic force sensing data may be processed with a data fusion algorithm to determine the spatial pose of a certain part and/or certain parts of the user's body at the corresponding marking time, and the determined spatial poses may be used together as the single-action track feature of the user's action at the corresponding marking time.
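As a simplified sketch of such fusion (an assumption for illustration, not the patent's algorithm), a static attitude can be derived from the acceleration data (gravity direction) and a heading from the magnetic force sensing data; the rotational motion data from the gyroscope would normally be fused in, e.g. with a complementary or Kalman filter, to track the pose over time:

```python
import math

# Assumed sketch: static roll/pitch from the accelerometer's gravity vector and
# a heading from the magnetometer (device assumed roughly level for the yaw).

def attitude_from_sensors(accel, mag):
    ax, ay, az = accel
    roll = math.atan2(ay, az)                    # rotation about the x axis
    pitch = math.atan2(-ax, math.hypot(ay, az))  # rotation about the y axis
    mx, my, _ = mag
    yaw = math.atan2(-my, mx)                    # heading from magnetic field
    return roll, pitch, yaw

# Device lying flat (gravity on +z), magnetic north along +x:
r, p, y = attitude_from_sensors((0.0, 0.0, 9.81), (30.0, 0.0, -20.0))
# roll, pitch, and yaw are all numerically zero in this pose
```

A sequence of such poses, one per marking time, corresponds to the single-action track features described above.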
The associated at least part of the marking times refers to marking times that are temporally associated with one another, for example, a time period formed by a plurality of other marking times between two marking times, which is not limited.
The motion trajectory corresponding to the associated at least part of the marking time may be referred to as a multi-motion trajectory, and accordingly, a feature for describing the multi-motion trajectory may be referred to as a multi-motion trajectory feature, and the multi-motion trajectory may specifically be, for example, a motion trajectory corresponding to a time period formed by a plurality of marking times between two marking times, which is not limited.
In some embodiments, the generating the multi-action track feature associated with at least part of the marking time according to the plurality of single-action track features may be determining a plurality of single-action track features associated with at least part of the marking time, and performing feature stitching processing on the obtained plurality of single-action track features to obtain corresponding multi-action track features, which is not limited.
According to the embodiment of the disclosure, after the plurality of single-action track features and the plurality of multi-action track features of the user action are determined, the spatial movement track of the user action can be generated according to the plurality of single-action track features and the plurality of multi-action track features, and the corresponding single-action track features are obtained by performing single-action segmentation on the user action, so that the spatial track of the user action can be more accurately represented based on the single-action track features.
For example, the determined single-action trajectory features and multi-action trajectory features may be input into a convolutional neural network (Convolutional Neural Networks, CNN) trained in advance, the CNN performs corresponding processing on the single-action trajectory features and the multi-action trajectory features, and outputs a spatial motion trajectory of the corresponding motion part, which is not limited.
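A trained CNN is named above; as a toy stand-in (not the disclosed network), the sliding-window operation a convolutional layer applies when processing a sequence of single-action track features can be shown with a one-dimensional convolution in pure Python:

```python
# Toy illustration (not the disclosed CNN): a single 1-D convolution pass over a
# sequence of per-marking-time track features, the same sliding-window operation
# a convolutional layer performs when fusing single- and multi-action features.

def conv1d(sequence, kernel):
    k = len(kernel)
    return [sum(sequence[i + j] * kernel[j] for j in range(k))
            for i in range(len(sequence) - k + 1)]

# A moving-average kernel smooths illustrative per-mark heights T1..Tn:
heights = [0.0, 0.1, 0.4, 0.5, 0.4, 0.1, 0.0]
smoothed = conv1d(heights, [1 / 3, 1 / 3, 1 / 3])
```

A real CNN stacks many such learned kernels with nonlinearities; this only illustrates the windowed processing of the feature sequence.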
S303: and generating the height of the motion track of the motion part according to the motion sensing data.
The motion track height may be the height of a track generated by the movement of a certain part and/or certain parts of the body when the user performs an action, for example, the maximum height of the user's raised arm when running, or the maximum height of the airborne foot when jumping. It may also be the corresponding height of a sports apparatus used by the user when performing the action, for example, the maximum height of the swinging rope when the user skips rope, or the maximum height of the basketball when the user shoots, which is not limited.
In some embodiments, generating the motion track height of the user's action according to the action sensing data may be performing data fusion processing on the action sensing data and data acquired by the Global Navigation Satellite System (GNSS) and the barometer, so as to obtain the motion track height of the user's action. Of course, any other possible manner may be adopted to generate the motion track height of the user's action according to the action sensing data, which is not limited.
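As a hedged sketch of the barometer's contribution (the constants are assumptions from the standard international barometric formula, not from this patent), a relative track height can be derived from pressure readings alone:

```python
# Assumed sketch: relative track height from barometric pressure using the
# international barometric formula (ISA standard-atmosphere constants).

def pressure_to_altitude_m(pressure_hpa: float, sea_level_hpa: float = 1013.25) -> float:
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

def jump_height(p_ground_hpa: float, p_apex_hpa: float) -> float:
    """Relative height between the lowest and highest point of the action."""
    return pressure_to_altitude_m(p_apex_hpa) - pressure_to_altitude_m(p_ground_hpa)
```

In the disclosed method this barometric estimate would be fused with GNSS and action sensing data rather than used on its own.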
S304: and the space motion track and the motion track height are used as space track information together.
In the embodiment of the disclosure, the spatial motion track of the user motion is generated according to the motion sensing data, and the motion track height of the user motion is generated according to the motion sensing data, so that the motion of the user can be represented based on two dimensions of the motion track and the track height, the comprehensiveness and the referenceability of the spatial track information can be effectively improved, and the execution of a subsequent motion track judging method can be effectively assisted based on the spatial track information.
S305: and generating a first motion track according to the space track information of the motion part.
In the embodiment of the disclosure, after the spatial trajectory information of the motion part is determined, the spatial trajectory information may be analyzed to obtain the corresponding first motion trajectory, or any other possible manner may be adopted to generate the first motion trajectory according to the spatial trajectory information of the motion part, for example, a model analysis manner, a machine learning manner, and the like, which are not limited.
Optionally, in some embodiments, the first motion track is generated according to the spatial track information of the motion part, which may be a candidate motion track matched with the spatial track information determined from a plurality of candidate motion tracks, and the matched candidate motion track is used as the first motion track.
The candidate motion trail may be, for example, a motion trail of a plurality of motion parts of the body, or may be a motion trail generated by motion of other objects in the motion scene, for example, a motion trail formed by a motion apparatus, which is not limited.
That is, in the embodiment of the present disclosure, after the spatial trajectory information of the motion portion is determined, the spatial trajectory information may be respectively matched with a plurality of candidate motion trajectories, and when the spatial trajectory information and the candidate motion trajectories are matched, the candidate motion trajectories are used as the first motion trajectories, and then a subsequent motion trajectory determination method may be performed in combination with the first motion trajectories, which is not limited thereto.
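The matching criterion is not fixed by the disclosure; as one assumed sketch, the candidate whose track lies closest to the observed spatial trajectory on average can be selected:

```python
import math

# Assumed matching step: pick the candidate motion track with the smallest mean
# point-wise distance to the observed spatial trajectory.

def mean_distance(track_a, track_b):
    pairs = list(zip(track_a, track_b))
    return sum(math.dist(p, q) for p, q in pairs) / len(pairs)

def best_candidate(observed, candidates):
    return min(candidates, key=lambda name: mean_distance(observed, candidates[name]))

# Hypothetical candidate tracks and an observed (noisy) trajectory:
candidates = {
    "arm_swing": [(0, 0), (1, 1), (2, 0)],
    "rope_loop": [(0, 0), (0, 2), (0, 4)],
}
observed = [(0.1, 0.0), (1.0, 0.9), (2.1, 0.1)]
print(best_candidate(observed, candidates))  # arm_swing
```

The selected candidate track then plays the role of the first motion track in the subsequent steps.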
S306: and carrying out data analysis on the first motion trail and determining motion information.
Optionally, in some embodiments, performing data analysis on the first motion track to determine the motion information may be performing action feature decomposition on the first motion track to obtain the action features to be analyzed, determining the relevant rope skipping times of the rope skipping motion according to the action features to be analyzed, determining the relevant rope skipping height of the rope skipping motion according to the action features to be analyzed, and using the relevant rope skipping times and the relevant rope skipping height together as the motion information.
The feature for describing the motion of the user may be referred to as a motion feature to be analyzed, and the motion feature to be analyzed may be, for example, a gesture feature of the motion of the user, an amplitude feature of the motion of the user, or the like, which is not limited.
That is, after determining the first motion track, the embodiment of the disclosure may perform motion feature decomposition on the first motion track, for example, a peak detection algorithm, a peak-to-valley detection algorithm, and a spectrum calculation algorithm may be used to perform feature recognition on the user motion, so as to obtain the motion feature to be analyzed corresponding to the user motion, which is not limited.
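The peak detection named above can be sketched in a few lines (the threshold and sample series are illustrative assumptions, not values from the disclosure):

```python
# Minimal peak-counting sketch: each local maximum of the vertical track above a
# threshold is treated as one rope skip. Real peak/peak-valley detectors add
# debouncing and adaptive thresholds.

def count_skips(vertical_track, threshold):
    count = 0
    for prev, cur, nxt in zip(vertical_track, vertical_track[1:], vertical_track[2:]):
        if cur > threshold and cur > prev and cur >= nxt:
            count += 1
    return count

# Two clear peaks in an illustrative height series:
samples = [0.0, 0.05, 0.30, 0.05, 0.0, 0.28, 0.04, 0.0]
print(count_skips(samples, threshold=0.1))  # 2
```

The detected peaks correspond to the action features to be analyzed from which the relevant rope skipping times are derived.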
The number of relevant rope skipping may be the number of oscillations of a certain part of the user's body when the user performs the rope skipping motion, specifically may be the number of hops when the user performs the rope skipping motion, or the number of oscillations of a sports apparatus held by the user when the user performs the rope skipping motion, specifically may be the number of rope skipping rotations when the user performs the rope skipping motion, which is not limited.
In some embodiments, the determining the relevant rope skipping times of the rope skipping motion according to the motion feature to be analyzed may be determining the motion feature corresponding to the rope skipping motion in advance, comparing the motion feature to be analyzed with the motion feature of the rope skipping motion, and performing accumulated counting processing on the motion feature to be analyzed when the motion feature to be analyzed is matched with the motion feature of the rope skipping motion, so as to obtain the relevant rope skipping times of the rope skipping motion, which is not limited.
In the embodiment of the disclosure, the relevant rope skipping times may also be, for example, the number of occurrences of the rope skipping action among the user's actions, or the number of interruptions of the rope skipping action among the user's actions, which is not limited.
In other embodiments, the related rope skipping times of the rope skipping motion are determined according to the motion characteristics to be analyzed, the occurrence times of the rope skipping motion in the motion of the user in the set time range can be determined according to the motion characteristics to be analyzed, and/or the interruption times of the rope skipping motion in the motion of the user in the set time range can be determined according to the motion characteristics to be analyzed, and the occurrence times and/or the interruption times are used as the related rope skipping times of the rope skipping motion.
In the embodiment of the disclosure, the number of rope skipping times related to the rope skipping motion may be determined by presetting a corresponding time range, then determining the number of times of interruption and occurrence of the rope skipping motion in the user motion within the time range, and taking the number of times as the number of times of rope skipping related to the rope skipping motion, wherein the preset time range may be referred to as a set time range.
The value used to quantitatively describe the interruption of the user's actions when performing the rope skipping action may be called the number of interruptions, which may be, for example, the number of times the rope skipping action is suspended, which is not limited.
The value used to quantitatively describe the occurrence of the rope skipping action when the user performs the rope skipping motion may be called the number of occurrences, which may be, for example, the number of completed rope skips, which is not limited.
That is, in the embodiment of the present disclosure, the number of occurrences of the rope-skipping motion in the set time range among the user motions may be determined according to the motion feature to be analyzed, and/or the number of interruptions of the rope-skipping motion in the set time range among the user motions may be determined according to the motion feature to be analyzed, and the number of occurrences and/or the number of interruptions may be used as the number of rope-skipping times related to the rope-skipping motion, which is not limited.
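A sketch of counting occurrences and interruptions inside a set time range follows (the gap threshold is an assumption; the disclosure does not specify how an interruption is detected):

```python
# Assumed sketch: jump timestamps inside the set time range are occurrences; a
# gap much longer than the typical jump interval counts as one interruption.

def occurrences_and_interruptions(jump_times, t_start, t_end, max_gap=1.5):
    in_window = [t for t in jump_times if t_start <= t <= t_end]
    occurrences = len(in_window)
    interruptions = sum(
        1 for a, b in zip(in_window, in_window[1:]) if b - a > max_gap
    )
    return occurrences, interruptions

# 5 jumps in the window; the 4 s pause between t=2.0 and t=6.0 is 1 interruption.
occ, brk = occurrences_and_interruptions([0.5, 1.2, 2.0, 6.0, 6.7], 0.0, 10.0)
print(occ, brk)  # 5 1
```

Either count, or both together, can then serve as the relevant rope skipping times described above.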
S307: and determining analysis and evaluation information of the rope skipping motion according to the relevant rope skipping height and the motion characteristics to be analyzed.
In the embodiment of the disclosure, according to the relevant rope skipping height and the action characteristics to be analyzed, the rope skipping motion can be analyzed and evaluated to obtain corresponding evaluation information, and the evaluation information can be called as analysis and evaluation information of the rope skipping motion, and the analysis and evaluation information can be specifically, for example, scoring information of the rope skipping motion, effect evaluation information of the rope skipping motion, and the like, which are not limited.
Optionally, in some embodiments, determining the analysis and evaluation information of the rope skipping motion according to the relevant rope skipping height and the action features to be analyzed may be: obtaining a reference track height and reference action features, matching the relevant rope skipping height with the reference track height to obtain a first matching result, matching the action features to be analyzed with the reference action features to obtain a second matching result, and determining the analysis and evaluation information of the rope skipping motion according to the first matching result and the second matching result. Since the first matching result combines the relevant rope skipping height with the reference track height, and the second matching result combines the action features to be analyzed with the reference action features, the accuracy and referenceability of the analysis and evaluation information can be effectively improved when it is determined based on the first matching result and the second matching result.
In the execution process of the motion trail determination method, the height used as a reference for the motion track height may be called the reference track height, and the reference track height may be determined according to a national standard, which is not limited.
In the execution process of the motion trail determination method, the feature used as a reference for the motion feature to be analyzed may be referred to as a reference motion feature, and the reference motion feature may be, for example, a feature of a standard motion gesture, which is not limited.
After the reference track height is obtained, the embodiment of the disclosure can perform matching processing on the relevant jump rope height and the reference track height to obtain a corresponding matching result, and the matching result can be called as a first matching result.
After the reference action feature is obtained, the embodiment of the disclosure can perform matching processing on the reference action feature and the action feature to be analyzed to obtain a corresponding matching result, and the matching result can be called a second matching result.
According to the embodiment of the disclosure, after the first matching result and the second matching result are obtained, analysis evaluation information of the rope skipping motion can be generated according to the first matching result and the second matching result.
For example, a dynamic time warping (Dynamic Time Warping, DTW) algorithm may be used to perform matching processing on the relevant rope skipping height and the reference track height to obtain a first matching result, perform matching processing on the reference motion feature and the motion feature to be analyzed to obtain a second matching result, and perform analysis and integration processing on the first matching result and the second matching result to obtain analysis and evaluation information of the rope skipping motion, which is not limited.
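The DTW algorithm named above can be written in its classic textbook form; this O(n*m) pure-Python formulation is a sketch of the technique, not the exact matching procedure used by the device:

```python
# Classic dynamic time warping (DTW) distance between two 1-D sequences, the
# matching algorithm named in the disclosure for comparing heights/features
# against their references.

def dtw_distance(a, b):
    n, m = len(a), len(b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# Identical sequences match perfectly; a time-shifted copy still matches exactly,
# which is why DTW suits tempo-varying exercise signals:
print(dtw_distance([0, 1, 2, 1, 0], [0, 1, 2, 1, 0]))     # 0.0
print(dtw_distance([0, 1, 2, 1, 0], [0, 0, 1, 2, 1, 0]))  # 0.0
```

Applying this once to the height sequence and once to the feature sequence yields the first and second matching results described above.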
S308: the relevant rope skipping times and the analysis and evaluation information are used as the movement quality information together.
In the embodiment of the disclosure, after the relevant rope skipping times and the analysis and evaluation information of the rope skipping motion are determined, they may be used together as the motion quality information. Since the relevant rope skipping times are determined according to the action features to be analyzed, their accuracy can be effectively improved; and since the analysis and evaluation information is determined based on the relevant rope skipping height and the action features to be analyzed, its referenceability can be effectively improved. Therefore, when the relevant rope skipping times and the analysis and evaluation information are used together as the motion quality information, the motion quality information can provide the user with more diversified references in terms of both the rope skipping times and the analysis and evaluation information, and can have greater reference and guidance significance.
In this embodiment, the motion sensing data of the sensor is obtained, the spatial motion track of the motion part is generated according to the motion sensing data, the motion track height of the motion part is also generated according to the motion sensing data, and the spatial motion track and the motion track height are used together as the spatial track information, so that the comprehensiveness and referenceability of the spatial track information can be effectively improved and the execution of the subsequent motion track judging method can be effectively assisted. A first motion track is then generated according to the spatial track information of the motion part, data analysis is performed on the first motion track to determine the motion action information, the relevant rope skipping times are determined according to the action features to be analyzed, and the analysis and evaluation information of the rope skipping motion is determined according to the relevant rope skipping height and the action features to be analyzed, so that the resulting motion quality information has greater reference and guidance significance.
Fig. 4 is a flowchart of a motion trajectory judgment method according to another embodiment of the present disclosure.
As shown in fig. 4, the motion trajectory judging method includes:
S401: Relevant sensing data of the action is acquired.
Data that is related to the motion sensing data and used for determining whether the user action is a rope skipping motion action may be referred to as relevant sensing data. The relevant sensing data may be the same data as the motion sensing data, or may be data having a certain association with the motion sensing data; for example, it may be data associated with the motion sensing data in the time dimension, which is not limited.
Optionally, in some embodiments, obtaining the relevant sensing data of the action may include: obtaining sensing data to be identified of the action, where the sensing data to be identified is used as the relevant sensing data; and/or obtaining image sensing data of the action scene, where the image sensing data is used as the relevant sensing data; and/or obtaining sound sensing data of the action scene, where the sound sensing data is used as the relevant sensing data. In this way, the relevant sensing data can be characterized based on three dimensions, namely the user action, the image of the action scene, and the sound of the action scene, so that the comprehensiveness and referenceability of the relevant sensing data can be effectively improved, and whether the user action is a rope skipping motion action can be accurately determined based on the relevant sensing data.
The sensing data to be identified may be used to describe the limb action condition of the user, and the sensing data to be identified may be, for example, amplitude data of limb swing of the user, speed data of arm rotation of the user, and the like, which are not limited.
Image data collected for the action scene in the image dimension when the user performs the motion action may be referred to as image sensing data. The image sensing data may be image data collected for the action scene; specifically, for example, it may be image data that is collected for the user and that captures the interaction between the user and the motion device in the scene where the user is located, which is not limited.
Sound data collected for the action scene in the sound dimension when the user performs the motion action may be referred to as sound sensing data. The sound sensing data may be sound data of the scene in which the user acts; specifically, for example, it may be sound data that is collected for the user and that is generated by impacts with the ground in that scene, which is not limited.
That is, in the embodiment of the present disclosure, the sensing data to be recognized of the user action, the image sensing data of the action scene, and the sound sensing data of the action scene may be obtained respectively, and the obtained data may be used as the relevant sensing data of the user action, and then, based on the relevant sensing data, whether the user action is a rope skipping motion action may be determined.
S402: based on the relevant sensed data, it is determined whether the action is a rope skipping motion action.
After acquiring the relevant sensing data of the user action, the embodiment of the disclosure can determine whether the user action is a rope skipping action according to the relevant sensing data.
In some embodiments, determining whether the user action is a rope skipping motion action according to the relevant sensing data may be performed by analyzing the relevant sensing data, or any other possible manner may be adopted, for example, a multi-layer perceptron (Multilayer Perceptron, MLP), a support vector machine (Support Vector Machines, SVM), or a convolutional neural network (Convolutional Neural Networks, CNN), which is not limited thereto.
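The disclosure fixes no classifier architecture or parameters; purely as an illustration of the MLP option, a minimal single-hidden-layer forward pass could look like the following, where every weight is a hypothetical stand-in for parameters learned from labelled sensing data:

```python
import math

def mlp_is_rope_skipping(features, w_hidden, b_hidden, w_out, b_out):
    """Single-hidden-layer perceptron: features -> tanh hidden -> sigmoid score."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, features)) + b)
              for row, b in zip(w_hidden, b_hidden)]
    z = sum(w * h for w, h in zip(w_out, hidden)) + b_out
    score = 1.0 / (1.0 + math.exp(-z))
    return score >= 0.5  # classify as rope skipping when the score crosses 0.5

# Hypothetical trained weights for a 2-feature, 2-hidden-unit network.
W_H = [[1.2, -0.7], [0.4, 0.9]]
B_H = [0.1, -0.2]
W_O = [1.5, 1.1]
B_O = -0.3
```

In practice the input features would be derived from the relevant sensing data (for example, swing amplitude and rotation speed), and the weights would come from training rather than being written by hand.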
Optionally, in some embodiments, determining whether the action is a rope skipping motion action according to the relevant sensing data may include: determining rope skipping action features of the rope skipping motion action, analyzing action features to be matched of the sensing data to be identified, and determining that the action is a rope skipping motion action when the action features to be matched match the rope skipping action features. Because whether the user action is a rope skipping motion action is determined by combining the rope skipping action features with the action features to be matched of the sensing data to be identified, the introduction of other interference factors during the matching of the two sets of features is avoided and the complexity of the action feature matching is effectively reduced, so that the accuracy and the efficiency of judging the rope skipping motion action are effectively improved.
The characteristic used for describing the rope skipping motion can be called as rope skipping motion characteristic, and correspondingly, the characteristic used for describing the motion corresponding to the sensing data to be identified can be called as motion characteristic to be matched.
In the embodiment of the disclosure, the rope skipping action characteristics of the rope skipping action are determined, and the to-be-matched action characteristics of the to-be-identified sensing data are analyzed, which may be that the rope skipping action and the to-be-identified sensing data are respectively subjected to characteristic extraction to obtain the rope skipping action characteristics corresponding to the rope skipping action and the to-be-matched action characteristics corresponding to the to-be-identified sensing data, and the method is not limited.
According to the embodiment of the disclosure, after rope skipping action characteristics of rope skipping action are determined and action characteristics to be matched with sensing data to be identified are determined, the rope skipping action characteristics and the action characteristics to be matched can be subjected to matching processing, when the action characteristics to be matched are matched with the rope skipping action characteristics, the user action is determined to be the rope skipping action, and when the action characteristics to be matched are not matched with the rope skipping action characteristics, the user action is determined not to be the rope skipping action.
Optionally, the matching processing performed on the rope skipping action features and the action features to be matched may be determining the similarity between the two. For example, a dynamic time warping (Dynamic Time Warping, DTW) algorithm, a longest common subsequence (Longest Common Subsequence, LCSS) algorithm, or a Fréchet distance algorithm may be adopted to determine the similarity between the rope skipping action features and the action features to be matched. When the similarity is greater than or equal to a similarity threshold (the similarity threshold may be adaptively configured according to the actual analysis scene, which is not limited), it is determined that the rope skipping action features and the action features to be matched match; otherwise, when the similarity is less than the similarity threshold, it is determined that they do not match, which is not limited.
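Of the listed similarity measures, the discrete Fréchet distance can be sketched as follows; the mapping from distance to similarity and the 0.9 threshold are assumptions for illustration, not values fixed by the disclosure:

```python
def frechet_distance(p, q):
    """Discrete Frechet distance between two 1-D feature sequences."""
    memo = {}

    def c(i, j):
        if (i, j) in memo:
            return memo[(i, j)]
        d = abs(p[i] - q[j])
        if i == 0 and j == 0:
            r = d
        elif i == 0:
            r = max(c(0, j - 1), d)
        elif j == 0:
            r = max(c(i - 1, 0), d)
        else:
            # keep the largest leash length along the cheapest coupling
            r = max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)
        memo[(i, j)] = r
        return r

    return c(len(p) - 1, len(q) - 1)

def features_match(p, q, similarity_threshold=0.9):
    # Map distance to a (0, 1] similarity score; mapping and threshold are
    # illustrative assumptions, not values given in the disclosure.
    similarity = 1.0 / (1.0 + frechet_distance(p, q))
    return similarity >= similarity_threshold
```

Identical feature sequences give a similarity of 1.0 and therefore match; widely separated sequences fall below the threshold and are rejected.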
Alternatively, in some embodiments, determining whether the user action is a rope skipping motion action according to the relevant sensing data may include: determining sensed image features of the image sensing data; identifying, according to the sensed image features, whether rope skipping body data is included in the image sensing data; when the rope skipping body data is included in the image sensing data, obtaining interaction data of the action acting between the user and the rope skipping body; and when the interaction data satisfies the motion condition, determining that the action is a rope skipping motion action.
Among them, the feature for describing the image sensing data may be referred to as a sensed image feature, and the sensed image feature may specifically be, for example, an action feature of a motion action corresponding to the sensed image, which is not limited.
The rope skipping body may be equipment, instruments, articles and the like for rope skipping motion, and may be, for example, a rope skipping held by a user during rope skipping motion, which is not limited.
It can be understood that when the user interacts with the rope skipping body, the user is not necessarily performing a rope skipping motion action. That is, the user holding the skipping rope does not mean the user is executing a rope skipping motion action at that moment; the user may be holding the rope without skipping, or the rope may be resting at the user's feet. Such situations can be accurately characterized by the sensed image features of the image sensing data, so that the user action in these situations can be accurately judged, the action judgment requirements in such situations can be effectively met, and the reliability of action judgment is effectively ensured.
The data describing the mutual movement between the user and the jump rope body, which may be referred to as interaction data, may be, for example, data describing the movement and/or the standstill between the user and the jump rope body, without limitation.
A condition preset for the interaction data may be called a motion condition. The motion condition may specifically be, for example, that the user jumps once per full rotation of the rope, or may be configured as any other possible condition, which is not limited.
Accordingly, whether the interaction data satisfies the motion condition may be judged, for example, whether the user and the skipping rope satisfy the condition that the user jumps once per full rotation of the rope. When the interaction data satisfies the motion condition, it is determined that the user action is a rope skipping motion action; when the interaction data does not satisfy the motion condition, it is determined that the user action is not a rope skipping motion action.
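The example condition, one jump per full rope rotation, reduces to a simple count comparison over a sampling window; the one-event tolerance below is an assumption for illustration rather than a value from the disclosure:

```python
def satisfies_motion_condition(jump_count, rope_rotations, tolerance=1):
    """Check the example condition 'one jump per full rope rotation'.

    The tolerance of one event absorbs boundary effects at the start or end
    of a sampling window; it is an illustrative assumption.
    """
    return abs(jump_count - rope_rotations) <= tolerance

assert satisfies_motion_condition(50, 50)
assert satisfies_motion_condition(50, 51)      # off-by-one at a window edge
assert not satisfies_motion_condition(50, 0)   # holding the rope without jumping
```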
In some embodiments, determining whether the action is a rope skipping motion action according to the relevant sensing data may include: determining the sound features of the rope skipping motion, analyzing the sound features to be matched of the sound sensing data, determining that the action is a rope skipping motion action when the sound features to be matched match the sound features of the rope skipping motion, and determining that the action is not a rope skipping motion action when they do not match. Because whether the user action is a rope skipping motion action is judged by combining the sound features of the rope skipping motion with the sound features to be matched of the sound sensing data, the introduction of other interference factors during feature matching is avoided, the complexity of the matching processing of the sound sensing data can be effectively reduced, and the accuracy and the efficiency of judging the rope skipping motion action can be effectively improved.
The features used for describing the sound sensing data may be referred to as the sound features to be matched; these may specifically be, for example, the frequency or volume of the sound corresponding to the sound sensing data, which is not limited.
The features used for describing the sound of the rope skipping motion may be referred to as the sound features of the rope skipping motion; these may be, for example, the frequency or volume of the sound produced by the rope contacting the ground while the user skips, which is not limited.
That is, in the embodiment of the present disclosure, the sound feature of the rope skipping motion may be extracted in advance, the sound feature to be matched of the sound sensing data may be extracted, and then the sound feature to be matched and the sound feature of the rope skipping motion may be matched by adopting a feature matching algorithm, and when the sound feature to be matched and the sound feature of the rope skipping motion are matched, it is determined that the user motion is the rope skipping motion, and when the sound feature to be matched and the sound feature of the rope skipping motion are not matched, it is determined that the user motion is not the rope skipping motion, which is not limited.
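The sound-dimension judgment above can be sketched with two very small features; the feature set, frequency range, and volume floor are illustrative assumptions rather than values from the disclosure:

```python
def sound_features(samples, sample_rate):
    """Tiny feature vector: RMS volume plus a frequency estimate from the
    zero-crossing rate (a pure tone crosses zero twice per period)."""
    rms = (sum(s * s for s in samples) / len(samples)) ** 0.5
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0))
    freq = crossings * sample_rate / (2.0 * len(samples))
    return rms, freq

def matches_skipping_sound(samples, sample_rate,
                           freq_range=(1.5, 4.0), min_rms=0.05):
    # Hypothetical signature: rope impacts at roughly 1.5-4 Hz with audible
    # volume; both ranges are assumptions for illustration only.
    rms, freq = sound_features(samples, sample_rate)
    return min_rms <= rms and freq_range[0] <= freq <= freq_range[1]

# Hypothetical rope-impact signal: 2.5 Hz square wave sampled at 100 Hz.
impacts = ([1.0] * 20 + [-1.0] * 20) * 10
```

A real implementation would more likely use spectral features and a learned matcher; this sketch only shows the shape of the feature-then-match pipeline.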
S403: and if the action is a rope skipping action, acquiring action information identified by the sensor.
After determining that the user action is a rope skipping motion action, the embodiment of the disclosure can acquire the action information identified by the sensor, and then execute the subsequent motion track judging method in combination with that action information; for details, reference may be made to the following description.
S404: and generating a first motion trail according to the motion information.
The description of S404 may be specifically referred to the above embodiments, and will not be repeated here.
S405: and correcting the first motion track by adopting a deep learning and/or machine learning method so as to obtain a target motion track.
In the embodiment of the disclosure, after the first motion track is generated according to the motion information, the first motion track may be corrected by adopting a deep learning and/or machine learning method, so as to obtain a more accurate motion track, which may be called a target motion track.
In the embodiment of the disclosure, the first motion track can be generated according to the motion information by combining a pre-trained deep learning and/or machine learning model, i.e. the motion information can be input into the pre-trained deep learning and/or machine learning model to obtain the first motion track output by the deep learning and/or machine learning model.
Accordingly, the first motion trajectory may be corrected by using a deep learning and/or machine learning method, for example, a corresponding loss function may be introduced, the pre-trained deep learning and/or machine learning model may be corrected according to an output result of the loss function, and a motion trajectory output by the deep learning and/or machine learning model obtained by correction may be used as the target motion trajectory, which is not limited.
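One way to picture loss-driven correction (the disclosure leaves both the loss function and the model form open) is a plain gradient-descent nudge of the track toward a reference; everything here, including the MSE loss, learning rate, and step count, is an illustrative stand-in for retraining the model itself:

```python
def mse_loss(predicted, reference):
    """Mean-squared error between a predicted track and a reference track."""
    return sum((p - r) ** 2 for p, r in zip(predicted, reference)) / len(predicted)

def correct_trajectory(first_track, reference_track, lr=0.5, steps=20):
    """Move each track point down the MSE gradient: d/dp = 2 * (p - r) / n.

    A hypothetical stand-in for correcting the trained model and re-emitting
    the track, which is what the disclosure actually describes.
    """
    track = list(first_track)
    n = len(track)
    for _ in range(steps):
        track = [p - lr * 2.0 * (p - r) / n
                 for p, r in zip(track, reference_track)]
    return track
```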
S406: and carrying out data analysis on the target motion trail and determining motion action information.
In the embodiments of the present disclosure, for a specific explanation of determining motion information by analyzing data of a target motion trajectory, reference may be made to the above embodiments, which are not described herein.
In the embodiment of the disclosure, the first motion track is corrected by the deep learning and/or machine learning method to obtain a more accurate target motion track, so that when the target motion track is subjected to data analysis and motion information is determined, the data analysis effect of the target motion track is effectively improved, and the accuracy of the motion information is effectively improved.
S407: based on the motion information, motion quality information is generated.
The description of S407 may be specifically referred to the above embodiments, and will not be repeated here.
In this embodiment, by acquiring the relevant sensing data of the action and then determining, according to the relevant sensing data, whether the action is a rope skipping motion action, the action judgment requirements under different conditions can be effectively met and the reliability of action judgment can be effectively ensured. When the action is a rope skipping motion action, the action information identified by the sensor is acquired, the first motion track is generated according to the action information, and the first motion track is corrected by adopting a deep learning and/or machine learning method to obtain the target motion track. Because this correction yields a more accurate target motion track, when data analysis is performed on the target motion track to determine the motion action information, the data analysis effect is effectively improved and the accuracy of the motion action information is effectively improved; motion quality information generated based on this motion action information therefore has better reference and guidance significance.
Fig. 5 is a schematic structural diagram of a motion trajectory determining device according to an embodiment of the present disclosure.
As shown in fig. 5, the motion trajectory judgment device 50 includes:
an obtaining module 501, configured to obtain motion information identified by a sensor;
the first generation module 502 is configured to generate a first motion trail according to the motion information;
the analysis module 503 is configured to perform data analysis on the first motion trail, and determine motion action information;
the second generation module 504 is configured to generate motion quality information based on the motion information.
In some embodiments of the present disclosure, as shown in fig. 6, fig. 6 is a schematic structural diagram of a motion trajectory determining device according to another embodiment of the present disclosure, and an obtaining module 501 includes:
a first obtaining submodule 5011 for obtaining motion sensing data of the sensor;
a generating submodule 5012 for generating space track information of the movement part according to the motion sensing data; and
the first processing submodule 5013 is configured to use the spatial trajectory information as motion information.
In some embodiments of the present disclosure, the generation submodule 5012 is specifically configured to:
generating a space motion track of the motion part according to the motion sensing data;
Generating the height of the motion track of the motion part according to the motion sensing data; and
and the space motion track and the motion track height are used as space track information together.
In some embodiments of the present disclosure,
the number of the motion sensing data is multiple, and the multiple motion sensing data respectively correspond to multiple marking times;
wherein, generate submodule 5012 is further configured to:
processing the plurality of motion sensing data respectively to obtain a plurality of single motion track characteristics corresponding to the plurality of marking times respectively;
generating a multi-action track feature associated with at least part of the marking time according to the plurality of single-action track features; and
and generating the spatial motion track of the motion part according to the plurality of single-action track features and the multi-action track features.
In some embodiments of the present disclosure, the motion sensing data comprises: corresponding acceleration data, rotation motion data and magnetic force sensing data corresponding to user actions at corresponding marking time;
wherein, generate submodule 5012 is further configured to:
determining action gesture information of the user action at the corresponding mark time according to the acceleration data, the rotation movement data and the magnetic force sensing data;
and determining the single action track characteristics of the user action at the corresponding marking time according to the action gesture information.
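A minimal sketch of deriving action gesture (attitude) information from the listed sensors, using one common convention: tilt (pitch and roll) from the gravity component of the accelerometer and heading from the magnetometer, with gyroscope integration omitted. This is an assumption-laden illustration, not the disclosure's method:

```python
import math

def action_posture(accel, mag):
    """Estimate (pitch, roll, heading) in radians from one sensor sample.

    accel and mag are (x, y, z) tuples in the sensor frame. The heading
    formula assumes a near-level sensor; a full solution would tilt-compensate
    the magnetometer and fuse in the rotation (gyroscope) data as well.
    """
    ax, ay, az = accel
    pitch = math.atan2(-ax, math.hypot(ay, az))  # rotation about the y axis
    roll = math.atan2(ay, az)                    # rotation about the x axis
    mx, my, _ = mag
    heading = math.atan2(-my, mx)                # simplified magnetic heading
    return pitch, roll, heading
```

A sequence of such posture estimates, one per marking time, is what the single-action track features would be built from.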
In some embodiments of the present disclosure, the acquiring module 501 further includes:
a second obtaining submodule 5014 for obtaining relevant sensing data of the action;
a first determination submodule 5015 for determining whether the action is a rope-jump motion action according to the relevant sensed data;
the third obtaining submodule 5016 is used for obtaining the motion information identified by the sensor when the motion is a rope skipping motion.
In some embodiments of the present disclosure, the second obtaining submodule 5014 is specifically configured to:
acquiring sensing data to be identified of the action, wherein the sensing data to be identified is used as related sensing data; and/or
Acquiring image sensing data of an action scene, wherein the image sensing data is used as related sensing data; and/or
Sound sensing data of the action scene is acquired, wherein the sound sensing data is used as related sensing data.
In some embodiments of the present disclosure, the relevant sensed data is sensed data to be identified;
the first determining submodule 5015 is specifically configured to:
determining rope skipping action characteristics of rope skipping action;
analyzing action characteristics to be matched of sensing data to be identified;
And if the action characteristic to be matched is matched with the rope skipping action characteristic, determining that the action is a rope skipping action.
In some embodiments of the present disclosure, the related sensing data is image sensing data;
wherein, the first determining submodule 5015 is further configured to:
determining sensed image characteristics of the image sensing data;
identifying whether the image sensing data comprises rope skipping body data according to the sensed image characteristics;
if the image sensing data comprise rope skipping body data, interaction data of actions acting on the rope skipping bodies are obtained;
if the interaction data satisfies the motion condition, the action is determined to be a rope skipping motion action.
In some embodiments of the present disclosure, the first generating module 502 is specifically configured to:
and generating a first motion track according to the space track information of the motion part.
In some embodiments of the present disclosure, the first generating module 502 is specifically configured to:
determining candidate motion trajectories matched with the space trajectory information from a plurality of candidate motion trajectories;
and taking the matched candidate motion trail as a first motion trail.
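The candidate-matching step can be pictured as a nearest-track selection; the candidate tracks and the mean-absolute-gap distance below are illustrative assumptions, since the disclosure does not fix a distance measure:

```python
def track_distance(a, b):
    """Mean absolute point-wise gap between two equally sampled tracks."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def pick_first_track(candidates, spatial_track):
    """Return the candidate motion track closest to the observed track."""
    return min(candidates, key=lambda c: track_distance(c, spatial_track))
```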
In some embodiments of the present disclosure, the analysis module 503 is specifically configured to:
Performing action feature decomposition on the first motion trail to obtain action features to be analyzed;
determining the related rope skipping times of rope skipping motion according to the motion characteristics to be analyzed;
determining the relevant rope skipping height of the rope skipping motion according to the motion characteristics to be analyzed;
the relevant rope skipping times and the relevant rope skipping heights are used as the movement action information together.
In some embodiments of the present disclosure, the second generating module 504 includes:
a second determining submodule 5041, configured to determine analysis and evaluation information of rope-skipping motion according to the relevant rope-skipping height and motion characteristics to be analyzed;
the second processing sub-module 5042 is configured to use the relevant rope skipping number and the analysis and evaluation information together as motion quality information.
In some embodiments of the present disclosure, the second determining submodule 5041 is specifically configured to:
acquiring reference track height and reference action characteristics;
matching the relevant rope skipping height with the reference track height to obtain a first matching result;
matching the action feature to be analyzed with the reference action feature to obtain a second matching result;
and determining analysis and evaluation information of the rope skipping motion according to the first matching result and the second matching result.
In some embodiments of the present disclosure, the motion trajectory judging device further includes:
the correction module 505 is configured to correct the first motion trajectory by using a deep learning and/or machine learning method to obtain a target motion trajectory;
the analysis module 503 is specifically configured to:
and carrying out data analysis on the target motion trail and determining motion action information.
In some embodiments of the present disclosure, the motion quality information includes any one or a combination of the following:
action effect evaluation information;
action standard evaluation information;
action grade evaluation information;
action scoring index information;
physical stamina consumption information;
action error cause information;
action correction instruction information;
action consistency degree information.
Corresponding to the motion trajectory determination method provided in the embodiments of fig. 1 to 4, the present disclosure further provides a motion trajectory determination device, and since the motion trajectory determination device provided in the embodiments of the present disclosure corresponds to the motion trajectory determination method provided in the embodiments of fig. 1 to 4, implementation of the motion trajectory determination method is also applicable to the motion trajectory determination device provided in the embodiments of the present disclosure, which is not described in detail in the embodiments of the present disclosure.
In this embodiment, the action information identified by the sensor is acquired, the first motion track is generated according to the action information, data analysis is performed on the first motion track to determine the motion action information, and motion quality information is generated based on the motion action information. Because the motion track of the user is judged by performing data analysis on the first motion track generated from the action information identified by the sensor, the comprehensiveness of motion track judgment can be effectively improved, the accuracy and effect of motion track judgment can be effectively improved, and the diversified motion quality information obtained through motion track judgment has better reference and guidance significance.
To achieve the above embodiments, the present disclosure further proposes a wearable device, including: a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor, when executing the program, implements the motion trajectory judgment method provided by the previous embodiments of the disclosure.
In order to implement the above-described embodiments, the present disclosure also proposes a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a motion trajectory determination method as proposed in the foregoing embodiments of the present disclosure.
In order to implement the above-mentioned embodiments, the present disclosure also proposes a computer program product; when instructions in the computer program product are executed by a processor, the motion trajectory judgment method proposed in the foregoing embodiments of the present disclosure is performed.
Fig. 7 illustrates a block diagram of an exemplary wearable device suitable for use in implementing embodiments of the present disclosure. The wearable device 12 shown in fig. 7 is only one example and should not be construed as limiting the functionality and scope of use of the disclosed embodiments.
As shown in fig. 7, wearable device 12 is in the form of a general purpose computing device. Components of wearable device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, a bus 18 that connects the various system components, including the system memory 28 and the processing units 16.
Bus 18 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include the Industry Standard Architecture (hereinafter: ISA) bus, the Micro Channel Architecture (hereinafter: MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (hereinafter: VESA) local bus, and the Peripheral Component Interconnect (hereinafter: PCI) bus.
Wearable device 12 typically includes a variety of computer system readable media. Such media can be any available media that can be accessed by wearable device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
Memory 28 may include computer system readable media in the form of volatile memory, such as random access memory (Random Access Memory; hereinafter: RAM) 30 and/or cache memory 32. Wearable device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from or write to non-removable, nonvolatile magnetic media (not shown in FIG. 7, commonly referred to as a "hard disk drive").
Although not shown in fig. 7, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk (e.g., a compact disk read only memory (Compact Disc Read Only Memory; hereinafter CD-ROM), digital versatile read only optical disk (Digital Video Disc Read Only Memory; hereinafter DVD-ROM), or other optical media) may be provided. In such cases, each drive may be coupled to bus 18 through one or more data medium interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of the various embodiments of the disclosure.
A program/utility 40 having a set (at least one) of program modules 42 may be stored in, for example, memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules 42 generally perform the functions and/or methods in the embodiments described in this disclosure.
The wearable device 12 may also communicate with one or more external devices 14 (e.g., a keyboard, a pointing device, a display 24, etc.), with one or more devices that enable a user to interact with the wearable device 12, and/or with any device (e.g., a network card, a modem, etc.) that enables the wearable device 12 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 22. Also, the wearable device 12 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through the network adapter 20. As shown, the network adapter 20 communicates with the other modules of the wearable device 12 via bus 18. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in connection with the wearable device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processing unit 16 executes various functional applications and performs data processing by running programs stored in the system memory 28, for example, implementing the motion trail judgment method described in the foregoing embodiments.
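As a non-limiting sketch of how the four steps of the judgment method (acquiring action information, generating a first motion trail, analyzing it, and generating motion quality information) could be chained, consider the following Python fragment; every function name and the peak-counting heuristic are illustrative assumptions, not part of the patent:

```python
def acquire_action_info(sensor_samples):
    """Step 1: treat raw per-sample displacements as the action information."""
    return list(sensor_samples)

def generate_first_trail(action_info):
    """Step 2: a trivial 'first motion trail' as cumulative displacement."""
    trail, pos = [], 0.0
    for delta in action_info:
        pos += delta
        trail.append(pos)
    return trail

def analyze_trail(trail):
    """Step 3: count local peaks as repetitions; report the track height."""
    reps = sum(
        1
        for a, b, c in zip(trail, trail[1:], trail[2:])
        if b - a > 0 >= c - b  # rising then non-rising: a local peak
    )
    height = (max(trail) - min(trail)) if trail else 0.0
    return {"reps": reps, "height": height}

def quality_info(motion_action_info):
    """Step 4: derive simple motion quality information."""
    return {
        "count": motion_action_info["reps"],
        "evaluation": "ok" if motion_action_info["height"] > 0 else "n/a",
    }

samples = [1, 1, -1, -1, 1, 1, -1, -1]  # two synthetic up-down cycles
info = quality_info(analyze_trail(generate_first_trail(acquire_action_info(samples))))
```

A real implementation would of course operate on IMU samples and learned models rather than synthetic displacements; the point here is only the data flow between the four claimed steps.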
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following its general principles and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
It should be noted that in the description of the present disclosure, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Furthermore, in the description of the present disclosure, unless otherwise indicated, the meaning of "a plurality" is two or more.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present disclosure further includes implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order depending on the functionality involved, as would be understood by those skilled in the art of the embodiments of the present disclosure.
It should be understood that portions of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, the steps or methods may be implemented using any one of, or a combination of, the following techniques well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and the like.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above-described method embodiments may be implemented by a program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, performs one of or a combination of the steps of the method embodiments.
Furthermore, each functional unit in the embodiments of the present disclosure may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product.
The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like.
In the description of this specification, reference to the terms "one embodiment," "some embodiments," "example," "specific example," or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present disclosure have been shown and described above, it will be understood that the above embodiments are illustrative and are not to be construed as limiting the present disclosure, and that changes, modifications, substitutions, and variations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present disclosure.

Claims (19)

1. A motion trail judging method, characterized by comprising:
acquiring action information identified by a sensor;
generating a first motion trail according to the action information;
performing data analysis on the first motion trail to determine motion action information; and
generating motion quality information based on the motion action information.
2. The method of claim 1, wherein the acquiring action information identified by a sensor comprises:
acquiring motion sensing data of the sensor;
generating spatial track information of a motion part according to the motion sensing data; and
taking the spatial track information as the action information.
3. The method of claim 2, wherein the generating spatial track information of a motion part according to the motion sensing data comprises:
generating a spatial motion track of the motion part according to the motion sensing data;
generating a motion track height of the motion part according to the motion sensing data; and
taking the spatial motion track and the motion track height together as the spatial track information.
4. The method of claim 3, wherein the motion sensing data comprises a plurality of pieces of motion sensing data respectively corresponding to a plurality of marking times;
wherein the generating a spatial motion track of the motion part according to the motion sensing data comprises:
processing the plurality of pieces of motion sensing data respectively to obtain a plurality of single-action track features respectively corresponding to the marking times;
generating a multi-action track feature associated with at least part of the marking times according to the plurality of single-action track features; and
generating the spatial motion track of the motion part according to the single-action track features and the multi-action track feature.
5. The method of claim 4, wherein the motion sensing data comprises acceleration data, rotation motion data, and magnetic force sensing data corresponding to a user action at the corresponding marking time;
wherein the processing the plurality of pieces of motion sensing data to obtain the plurality of single-action track features respectively corresponding to the marking times comprises:
determining action gesture information of the user action at the corresponding marking time according to the acceleration data, the rotation motion data, and the magnetic force sensing data; and
determining, according to the action gesture information, the single-action track feature of the user action at the corresponding marking time.
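Purely as an illustrative sketch (not the claimed implementation), the gesture of the motion part at one marking time might be estimated from the acceleration and magnetic data as below; the rotation (gyroscope) fusion step that a production filter would add is omitted for brevity, and all names and sign conventions are assumptions:

```python
import math

def attitude_from_sensors(accel, mag):
    """Static attitude estimate: roll/pitch from the gravity direction in
    the accelerometer, tilt-compensated yaw heading from the magnetometer.
    A real system would fuse gyroscope data (e.g., a complementary filter)."""
    ax, ay, az = accel
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    mx, my, mz = mag
    # Rotate the magnetic vector into the horizontal plane before taking heading.
    mx_h = mx * math.cos(pitch) + mz * math.sin(pitch)
    my_h = (mx * math.sin(roll) * math.sin(pitch)
            + my * math.cos(roll)
            - mz * math.sin(roll) * math.cos(pitch))
    yaw = math.atan2(-my_h, mx_h)
    return roll, pitch, yaw
```

With the device lying flat (gravity on +z) and the magnetic field along +x, all three angles come out as zero, which is a convenient sanity check for the chosen axis conventions.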
6. The method of claim 1, wherein the acquiring action information identified by a sensor further comprises:
acquiring relevant sensing data of an action;
determining, according to the relevant sensing data, whether the action is a rope skipping motion action; and
if the action is the rope skipping motion action, acquiring the action information identified by the sensor.
7. The method of claim 6, wherein the acquiring relevant sensing data of an action comprises:
acquiring sensing data to be identified of the action, wherein the sensing data to be identified serves as the relevant sensing data; and/or
acquiring image sensing data of an action scene, wherein the image sensing data serves as the relevant sensing data; and/or
acquiring sound sensing data of the action scene, wherein the sound sensing data serves as the relevant sensing data.
8. The method of claim 7, wherein the relevant sensing data is the sensing data to be identified;
wherein the determining, according to the relevant sensing data, whether the action is a rope skipping motion action comprises:
determining a rope skipping action feature of the rope skipping motion action;
analyzing the sensing data to be identified to obtain an action feature to be matched; and
if the action feature to be matched matches the rope skipping action feature, determining that the action is the rope skipping motion action.
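As a hedged illustration of this feature-matching step (not the patented implementation), one common choice is a cosine-similarity check between the action feature vector to be matched and a rope-skipping template; the threshold value and all names below are assumptions:

```python
def matches_rope_skip(candidate, template, threshold=0.9):
    """Return True when the cosine similarity between a candidate feature
    vector and the rope-skipping template feature reaches the threshold."""
    dot = sum(a * b for a, b in zip(candidate, template))
    norm_c = sum(a * a for a in candidate) ** 0.5
    norm_t = sum(b * b for b in template) ** 0.5
    return norm_c > 0 and norm_t > 0 and dot / (norm_c * norm_t) >= threshold
```

In practice the feature vectors would come from the sensing data to be identified (e.g., frequency-domain statistics of vertical acceleration), and the template and threshold would be tuned on labeled rope-skipping recordings.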
9. The method of claim 7, wherein the relevant sensing data is the image sensing data;
wherein the determining, according to the relevant sensing data, whether the action is a rope skipping motion action comprises:
determining sensed image features of the image sensing data;
identifying, according to the sensed image features, whether the image sensing data includes rope skipping body data;
if the image sensing data includes rope skipping body data, acquiring interaction data of the action acting on the rope skipping body; and
if the interaction data meets a motion condition, determining that the action is the rope skipping motion action.
10. The method of claim 3, wherein the generating a first motion trail according to the action information comprises:
generating the first motion trail according to the spatial track information of the motion part.
11. The method of claim 10, wherein the generating the first motion trail according to the spatial track information of the motion part comprises:
determining, from a plurality of candidate motion trails, a candidate motion trail matched with the spatial track information; and
taking the matched candidate motion trail as the first motion trail.
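One simple way such candidate-trail matching could work (purely an assumption, not the claimed algorithm) is nearest-neighbor selection by mean squared distance between the observed spatial track and each candidate:

```python
def best_candidate(spatial_track, candidates):
    """Pick the candidate motion trail with the smallest mean squared
    distance to the observed spatial track (equal-length tracks assumed)."""
    def mse(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    return min(candidates, key=lambda cand: mse(spatial_track, cand))
```

A production matcher would likely resample tracks to a common length or use an elastic measure such as dynamic time warping instead of point-wise distance.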
12. The method of claim 11, wherein the performing data analysis on the first motion trail to determine motion action information comprises:
performing action feature decomposition on the first motion trail to obtain action features to be analyzed;
determining the related rope skipping times of the rope skipping motion action according to the action features to be analyzed;
determining the related rope skipping height of the rope skipping motion action according to the action features to be analyzed; and
taking the related rope skipping times and the related rope skipping height as the motion action information.
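A minimal sketch of how rope-skipping times and height might be derived from a vertical track by peak detection; the threshold, valley-reset heuristic, and all names are illustrative assumptions, not the patented analysis:

```python
def jump_stats(heights, min_rise=0.02):
    """Count local maxima rising at least `min_rise` above the preceding
    valley, and report the largest such rise (non-empty input assumed)."""
    jumps, max_height = 0, 0.0
    valley = heights[0]
    for prev, cur, nxt in zip(heights, heights[1:], heights[2:]):
        valley = min(valley, prev)          # track the lowest point so far
        if cur > prev and cur >= nxt and cur - valley >= min_rise:
            jumps += 1
            max_height = max(max_height, cur - valley)
            valley = cur                    # reset after counting this peak
    return {"count": jumps, "height": max_height}
```

Feeding in a height track with two clear up-down cycles yields a count of two, matching the intuition that each jump is one prominent peak.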
13. The method of claim 12, wherein the generating motion quality information based on the motion action information comprises:
determining analysis and evaluation information of the rope skipping motion action according to the related rope skipping height and the action features to be analyzed; and
taking the related rope skipping times and the analysis and evaluation information together as the motion quality information.
14. The method of claim 13, wherein said determining analysis and evaluation information of said rope-jump motion based on said associated rope-jump height and said motion characteristics to be analyzed comprises:
acquiring reference track height and reference action characteristics;
matching the relevant jump rope height with the reference track height to obtain a first matching result;
matching the action feature to be analyzed with the reference action feature to obtain a second matching result;
and determining analysis and evaluation information of the rope skipping motion according to the first matching result and the second matching result.
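The two matching results could be combined into coarse evaluation information as in this hypothetical sketch; the tolerance, threshold, and grade labels are assumptions rather than values from the patent:

```python
def evaluate_jump(measured_height, ref_height, feature_score,
                  height_tol=0.05, feature_threshold=0.8):
    """Combine a height match (first result) and a feature-similarity
    match (second result) into a coarse evaluation grade."""
    height_ok = abs(measured_height - ref_height) <= height_tol
    feature_ok = feature_score >= feature_threshold
    if height_ok and feature_ok:
        return "standard"
    if height_ok or feature_ok:
        return "acceptable"
    return "needs correction"
```

This mirrors the claim's structure: each matching result is computed independently, and only their combination determines the evaluation output.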
15. The method as recited in claim 1, further comprising:
correcting the first motion trail by adopting a deep learning and/or machine learning method to obtain a target motion trail;
wherein the performing data analysis on the first motion trail to determine motion action information comprises:
performing data analysis on the target motion trail to determine the motion action information.
16. The method of any one of claims 1-15, wherein the motion quality information comprises any one or a combination of:
action effect evaluation information;
action standard evaluation information;
action grade evaluation information;
action scoring index information;
physical stamina consumption information;
action error cause information;
action correction instruction information;
action consistency degree information.
17. A motion trajectory judgment device, characterized by comprising:
the acquisition module is used for acquiring action information identified by the sensor;
the first generation module is used for generating a first motion trail according to the motion information;
the analysis module is used for carrying out data analysis on the first motion trail and determining motion action information; and
and the second generation module is used for generating motion quality information based on the motion action information.
18. A wearable device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
The memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-16.
19. A non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method of any one of claims 1-16.
CN202210125751.0A 2022-02-10 2022-02-10 Motion trail judging method and device, wearable device and storage medium Pending CN116631046A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210125751.0A CN116631046A (en) 2022-02-10 2022-02-10 Motion trail judging method and device, wearable device and storage medium

Publications (1)

Publication Number Publication Date
CN116631046A true CN116631046A (en) 2023-08-22

Family

ID=87590702



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination