CN108921127A - Motion detection method and apparatus, storage medium, and terminal - Google Patents

Motion detection method and apparatus, storage medium, and terminal Download PDF

Info

Publication number
CN108921127A
CN108921127A
Authority
CN
China
Prior art keywords
user
user action
human body
standard action
preset human body part
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810798077.6A
Other languages
Chinese (zh)
Inventor
孙鑫 (Sun Xin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Xiaoyi Technology Co Ltd
Original Assignee
Shanghai Xiaoyi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Xiaoyi Technology Co Ltd filed Critical Shanghai Xiaoyi Technology Co Ltd
Priority to CN201810798077.6A priority Critical patent/CN108921127A/en
Publication of CN108921127A publication Critical patent/CN108921127A/en
Pending legal-status Critical Current

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20: Movements or behaviour, e.g. gesture recognition
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/22: Matching criteria, e.g. proximity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

A motion detection method and apparatus, a storage medium, and a terminal. The motion detection method includes: acquiring a user image captured while the user is exercising, the user image containing the user's figure; performing feature point recognition on preset human body parts in the figure in the user image to obtain a user action; retrieving the corresponding standard action from a database according to the user action; and comparing the user action with the standard action to obtain a comparison result, the comparison result being used to measure the motion quality of the user. The technical solution of the present invention can measure the user's motion quality during exercise.

Description

Motion detection method and apparatus, storage medium, and terminal
Technical field
The present invention relates to the technical field of moving image processing, and in particular to a motion detection method and apparatus, a storage medium, and a terminal.
Background art
During exercise, users want to know motion-related parameters, such as the amount of exercise and the exercise duration, so that they can adjust how they exercise.
In the prior art, the user's step count and exercise duration can be recorded during exercise, and the calories consumed by the exercise can be calculated. Specifically, the user needs to carry a relevant device, which detects the direction and magnitude of gravity changes to determine the user's step count.
However, a method that can measure the quality of the user's movements is needed, so as to provide a valuable reference for the user's exercise.
Summary of the invention
The technical problem solved by the present invention is how to measure the motion quality of a user during exercise.
To solve the above technical problem, an embodiment of the present invention provides a motion detection method, including: acquiring a user image captured while the user is exercising, the user image containing the user's figure; performing feature point recognition on preset human body parts in the figure in the user image to obtain a user action; retrieving the corresponding standard action from a database according to the user action; and comparing the user action with the standard action to obtain a comparison result, the comparison result being used to measure the motion quality of the user.
Optionally, performing feature point recognition on the preset human body parts in the figure in the user image includes: performing image recognition on the user image to obtain at least one preset human body part of the figure; and performing feature point recognition on the at least one preset human body part to obtain the shape and position of the at least one preset human body part, the user action including the shape and position of the at least one preset human body part.
Optionally, the motion detection method further includes: determining the movement amplitude and movement angle of the at least one preset human body part according to the shape and position of the at least one preset human body part, the user action including the movement amplitude and the movement angle.
Optionally, the user action includes a movement amplitude and a movement angle, and comparing the user action with the standard action includes: comparing the movement amplitude of the user action with the movement amplitude of the standard action to obtain an amplitude difference; and comparing the movement angle of the user action with the movement angle of the standard action to obtain an angle difference, the comparison result including the amplitude difference and/or the angle difference.
Optionally, after comparing the user action with the standard action, the method further includes: scoring the user action according to the magnitude of the amplitude difference and/or the angle difference in the comparison result to obtain a score for the user action, the score representing the motion quality of the user.
Optionally, the motion detection method further includes: determining whether the user action is standard according to the comparison result.
Optionally, the user image is a three-dimensional image.
Optionally, the preset human body part is selected from the head, shoulders, arms, hands, legs, feet, and waist.
To solve the above technical problem, an embodiment of the present invention also discloses a motion detection apparatus, including: a user image acquisition module adapted to acquire a user image captured while the user is exercising, the user image containing the user's figure; a recognition module adapted to perform feature point recognition on preset human body parts in the figure in the user image to obtain a user action; a standard action retrieval module adapted to retrieve the corresponding standard action from a database according to the user action; and a comparison module adapted to compare the user action with the standard action to obtain a comparison result, the comparison result being used to measure the motion quality of the user.
Optionally, the recognition module includes: an image recognition unit adapted to perform image recognition on the user image to obtain at least one preset human body part of the figure; and a feature point recognition unit adapted to perform feature point recognition on the at least one preset human body part to obtain the shape and position of the at least one preset human body part, the user action including the shape and position of the at least one preset human body part.
Optionally, the recognition module further includes: a user action determination unit adapted to determine the movement amplitude and movement angle of the at least one preset human body part according to the shape and position of the at least one preset human body part, the user action including the movement amplitude and the movement angle.
Optionally, the comparison module includes: a movement amplitude comparison unit adapted to compare the movement amplitude of the user action with the movement amplitude of the standard action to obtain an amplitude difference; and a movement angle comparison unit adapted to compare the movement angle of the user action with the movement angle of the standard action to obtain an angle difference, the comparison result including the amplitude difference and/or the angle difference.
Optionally, the motion detection apparatus further includes: a scoring module adapted to score the user action according to the magnitude of the amplitude difference and/or the angle difference in the comparison result to obtain a score for the user action, the score representing the motion quality of the user.
Optionally, the motion detection apparatus further includes: a determination module adapted to determine whether the user action is standard according to the comparison result.
Optionally, the user image is a three-dimensional image.
Optionally, the preset human body part is selected from the head, shoulders, arms, hands, legs, feet, and waist.
An embodiment of the present invention also discloses a storage medium on which computer instructions are stored, the computer instructions performing the steps of the motion detection method when executed.
An embodiment of the present invention also discloses a terminal, including a memory and a processor, the memory storing computer instructions executable on the processor, and the processor performing the steps of the motion detection method when executing the computer instructions.
Compared with the prior art, the technical solutions of the embodiments of the present invention have the following beneficial effects:
The technical solution of the present invention acquires a user image captured while the user is exercising, the user image containing the user's figure; performs feature point recognition on preset human body parts in the figure in the user image to obtain a user action; retrieves the corresponding standard action from a database according to the user action; and compares the user action with the standard action to obtain a comparison result used to measure the motion quality of the user. By applying feature point recognition to the user image captured during exercise, the solution obtains the user's action during exercise; the result of comparing it with the corresponding standard action in the database enables detection of the user's motion quality and measurement of the exercise effect, provides a reference for the user's subsequent exercise or physical condition, and improves the user experience.
Further, image recognition is performed on the user image to obtain at least one preset human body part of the figure; feature point recognition is performed on the at least one preset human body part to obtain its shape and position, and the user action includes the shape and position of the at least one preset human body part. When determining the user action, the technical solution of the present invention first determines at least one preset human body part of the figure; the choice of preset human body parts is related to the specific sport type. The user action can then be determined from the shape and position of the preset human body parts, thereby achieving accurate recognition of the user action.
Further, the movement amplitude of the user action is compared with that of the standard action to obtain an amplitude difference; the movement angle of the user action is compared with that of the standard action to obtain an angle difference, and the comparison result includes the amplitude difference and/or the angle difference. By comparing the amplitude and angle differences between the user action and the standard action, the technical solution of the present invention accurately measures the quality of the user action and improves the accuracy of motion quality measurement.
Brief description of the drawings
Fig. 1 is a flowchart of a motion detection method according to an embodiment of the present invention;
Fig. 2 is a flowchart of a specific embodiment of step S102 shown in Fig. 1;
Fig. 3 is a schematic structural diagram of a motion detection apparatus according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a specific embodiment of the recognition module 302 shown in Fig. 3.
Detailed description of the embodiments
As described in the background art, a method that can measure the effect of the user's exercise is needed in the prior art, so as to provide a valuable reference for the user's exercise.
Using the user image captured during exercise, the technical solution of the present invention obtains the user's action during exercise through feature point recognition; the result of comparing it with the corresponding standard action in the database enables detection of the user's motion quality and measurement of the exercise effect, provides a reference for the user's subsequent exercise or physical condition, and improves the user experience.
To make the above objects, features, and advantages of the present invention more apparent and understandable, specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a motion detection method according to an embodiment of the present invention.
The motion detection method shown in Fig. 1 may include the following steps:
Step S101: acquiring a user image captured while the user is exercising, the user image containing the user's figure;
Step S102: performing feature point recognition on preset human body parts in the figure in the user image to obtain a user action;
Step S103: retrieving the corresponding standard action from a database according to the user action;
Step S104: comparing the user action with the standard action to obtain a comparison result, the comparison result being used to measure the motion quality of the user.
In a specific implementation, a moving image of the user during exercise may be acquired. The moving image may be captured in real time by a camera device, or may be obtained from an image database in which user images are stored in advance. Specifically, the camera device may be placed in advance at the sports venue where the user exercises, such as a gymnasium, a training room, or a playground.
Since the user image needs to contain the complete figure, when placing the camera device it should be ensured that the images captured by the camera device can contain the user's complete figure.
In a specific implementation of step S102, a preset human body part refers to a part of the human body, such as the head or shoulders. Because user actions differ across sport types, and different user actions involve different body parts, the user action can be obtained by performing feature point recognition on predetermined parts of the figure.
It can be understood that the preset human body parts may be chosen by the user according to the sport type and set in advance before exercising.
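As an illustration of choosing preset body parts per sport type, the sketch below filters a set of recognized keypoints down to the parts preset for one sport. The part names, coordinates, and the `PRESET_PARTS` table are hypothetical assumptions made for illustration; the patent does not define a concrete data format.

```python
# Hypothetical sketch: keep only the keypoints belonging to the body parts
# preset for a given sport type. All names and values here are illustrative.

PRESET_PARTS = {
    "running": ["leg", "arm", "foot"],  # parts the description names for running
    "yoga": ["head", "shoulder", "arm", "hand", "leg", "foot", "waist"],
}

def select_keypoints(pose, sport):
    """Filter recognized keypoints down to the preset parts for this sport."""
    parts = PRESET_PARTS[sport]
    return {name: xy for name, xy in pose.items()
            if any(name.startswith(p) for p in parts)}

pose = {"head": (0.50, 0.10), "arm_l": (0.30, 0.40), "waist": (0.50, 0.50),
        "leg_r": (0.55, 0.80), "foot_r": (0.56, 0.95)}
action = select_keypoints(pose, "running")
print(sorted(action))  # -> ['arm_l', 'foot_r', 'leg_r']
```

For running only the arm, leg, and foot keypoints survive, while a yoga configuration would keep all of them.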
In this embodiment, standard actions may be stored in a database. A standard action represents the correct way to perform the action. Specifically, a standard image may be captured while a professional athlete performs the action, the standard image containing the athlete's figure; feature point recognition is performed on the preset human body parts of the athlete's figure in the standard image to obtain the standard action.
In a specific implementation of step S103, after the user action during exercise is determined, the standard action to be compared may be determined according to the user action. Specifically, the user action may be matched for similarity against all standard actions in the database to determine the corresponding standard action. Alternatively, the category of the user action may be determined first, and the user action may be matched for similarity only against the standard actions of that category in the database. For example, if the user action is a stretching action, the corresponding standard action is also a stretching action.
Further, standard actions may be stored in the database in the form of pictures. However, because the build of the user differs from that of a professional athlete, when comparing the user action with the standard action, the picture containing the standard action is not compared directly with the user image; instead, the standard action and the user action are each parsed and then compared.
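The retrieval step can be sketched as below, assuming actions are represented as dictionaries of named 2-D keypoints and using negative mean Euclidean distance as the similarity measure; both the representation and the measure are illustrative assumptions, not something the patent specifies.

```python
import math

def similarity(a, b):
    """Negative mean Euclidean distance over shared keypoints (higher is closer)."""
    shared = a.keys() & b.keys()
    return -sum(math.dist(a[k], b[k]) for k in shared) / len(shared)

def retrieve_standard(user_action, database, category=None):
    """Return the name of the most similar standard action, optionally
    restricting the match to one action category first."""
    candidates = [(name, act) for name, (cat, act) in database.items()
                  if category is None or cat == category]
    return max(candidates, key=lambda nc: similarity(user_action, nc[1]))[0]

# Toy database: each entry maps a name to (category, parsed standard action).
db = {
    "stretch_std": ("stretch", {"arm_l": (0.20, 0.30), "arm_r": (0.80, 0.30)}),
    "squat_std":   ("squat",   {"arm_l": (0.30, 0.50), "arm_r": (0.70, 0.50)}),
}
user = {"arm_l": (0.21, 0.31), "arm_r": (0.79, 0.29)}
print(retrieve_standard(user, db))  # -> stretch_std
```

Passing `category` mirrors the variant in which the action's category is determined first and only same-category standard actions are matched.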
Then, in a specific implementation of step S104, the user action is compared with the standard action to obtain a comparison result, which represents the difference between the user action and the standard action. Since the standard action represents the correct way to perform the action, the comparison result can measure whether the user's movement is correct, and can also measure the motion quality of the user.
Specifically, the more similar the user action is to the standard action, the higher the user's motion quality.
In specific application scenarios, the motion detection method can be used in fitness scenarios to detect whether the user action is standard, and thereby whether the user achieves the exercise effect, for example detecting whether a yoga pose is standard so as to avoid injury to the user's body. The motion detection method can be used in physical examination scenarios: by detecting whether the user action is consistent with the standard action, the user's physical condition can be detected, for example detecting whether the user's walking gait is in-toed or out-toed. The motion detection method can also be used in examination and grading scenarios: by detecting whether the user action is consistent with the standard action, the user's movements can be evaluated, for example in events such as dance, gymnastics, and yoga.
Using the user image captured during exercise, the embodiment of the present invention obtains the user's action during exercise through feature point recognition; the result of comparing it with the corresponding standard action in the database enables detection of the user's motion quality and measurement of the exercise effect, provides a reference for the user's subsequent exercise or physical condition, and improves the user experience.
In a specific embodiment of the present invention, referring to Fig. 2, step S102 shown in Fig. 1 may include the following steps:
Step S201: performing image recognition on the user image to obtain at least one preset human body part of the figure;
Step S202: performing feature point recognition on the at least one preset human body part to obtain the shape and position of the at least one preset human body part, the user action including the shape and position of the at least one preset human body part.
Specifically, a user action is formed by at least one human body part. Therefore, once the shape and position of the at least one preset human body part are determined, the user action is determined.
For example, in a running scenario, after the shapes and positions of the arms and legs are determined, the user's action can be determined.
Furthermore, the preset human body part is selected from the head, shoulders, arms, hands, legs, feet, and waist.
It can be understood that different types of movements involve different human body parts, and the preset human body parts may be set in advance according to the type of sport the user performs. For example, the preset human body parts for running are the legs, arms, and feet; yoga corresponds to all of the above preset human body parts.
In this embodiment, when determining the user action, at least one preset human body part of the figure is determined first; the choice of preset human body parts is related to the specific sport type. The user action can then be determined from the shape and position of the preset human body parts, thereby achieving accurate recognition of the user action.
Further, when the user action includes the shape and position of at least one preset human body part, comparing the user action with the standard action may mean: comparing the shape of the at least one preset human body part in the user action with the shape of the corresponding preset human body part in the standard action, and/or comparing the position of the at least one preset human body part in the user action with the position of the corresponding preset human body part in the standard action.
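The per-part position comparison just described could look like the sketch below; the tolerance value and the normalised-coordinate keypoint representation are illustrative assumptions.

```python
import math

def compare_parts(user_action, standard_action, tol=0.05):
    """Compare keypoint positions part by part and return the parts whose
    deviation from the standard exceeds `tol` (normalised coordinates)."""
    deviating = []
    for part in user_action.keys() & standard_action.keys():
        if math.dist(user_action[part], standard_action[part]) > tol:
            deviating.append(part)
    return sorted(deviating)

user = {"arm_l": (0.30, 0.40), "leg_r": (0.55, 0.80)}
std  = {"arm_l": (0.30, 0.41), "leg_r": (0.70, 0.80)}
print(compare_parts(user, std))  # -> ['leg_r']
```

Only parts present in both actions are compared, which matches comparing each preset part against its counterpart in the standard action.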
Further, with continued reference to Fig. 2, step S102 shown in Fig. 1 may also include the following step:
Step S203: determining the movement amplitude and movement angle of the at least one preset human body part according to the shape and position of the at least one preset human body part, the user action including the movement amplitude and the movement angle.
In a specific implementation, in order to compare the user action with the standard action more accurately, after the shape and position of the at least one preset human body part are determined, the movement amplitude and movement angle of the at least one preset human body part may also be determined on that basis. That is, once the user action is quantified as a movement amplitude and a movement angle, actions can be compared more easily, so that a more accurate comparison result can be obtained, which in turn measures the user's motion quality and exercise effect more accurately.
When the user action includes a movement amplitude and a movement angle, step S104 shown in Fig. 1 may include the following steps: comparing the movement amplitude of the user action with the movement amplitude of the standard action to obtain an amplitude difference; and comparing the movement angle of the user action with the movement angle of the standard action to obtain an angle difference, the comparison result including the amplitude difference and/or the angle difference.
In this embodiment, the user action can be represented by a quantifiable movement amplitude and movement angle. Comparing the user action with the standard action thus amounts to comparing the movement amplitudes and/or movement angles of the two. The amplitude difference and/or the angle difference can characterize how standard the user action is, so that the motion quality can be measured.
In other words, the smaller the amplitude difference and/or angle difference between the user action and the standard action, the higher the user's motion quality.
In a specific application, taking a stretching action as an example, the movement angle of the user action may be the angle between the user's arm and the horizontal plane, and the movement amplitude may be the straight-line distance between the two feet. The movement angle of the standard action is 0 degrees, and its movement amplitude is shoulder width.
It should be noted that the definitions of movement angle and movement amplitude vary with the type of action; the embodiments of the present invention will not enumerate them one by one.
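For the stretching example, the two quantities could be computed from keypoints as below. The keypoint coordinates are hypothetical, and the angle is measured against the horizontal as the example describes.

```python
import math

def movement_angle(shoulder, wrist):
    """Angle in degrees between the shoulder-to-wrist segment and the horizontal."""
    dx = abs(wrist[0] - shoulder[0])
    dy = abs(wrist[1] - shoulder[1])
    return math.degrees(math.atan2(dy, dx))

def movement_amplitude(foot_l, foot_r):
    """Straight-line distance between the two feet."""
    return math.dist(foot_l, foot_r)

# An arm level with the shoulder gives an angle of 0 degrees, matching the
# standard stretching action; the foot distance gives the movement amplitude.
angle = movement_angle(shoulder=(0.40, 0.30), wrist=(0.20, 0.30))
width = movement_amplitude(foot_l=(0.45, 0.90), foot_r=(0.55, 0.90))
print(angle)  # -> 0.0
```

A raised arm would yield a positive angle (45 degrees for an arm at a diagonal), which would then be differenced against the standard's 0 degrees.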
By comparing the amplitude and angle differences between the user action and the standard action, the technical solution of the present invention accurately measures the quality of the user action and improves the accuracy of motion quality measurement.
Further, after step S104 shown in Fig. 1, the method may also include the following step: scoring the user action according to the magnitude of the amplitude difference and/or the angle difference in the comparison result to obtain a score for the user action, the score representing the motion quality of the user.
In this embodiment, in order to measure the user's motion quality more intuitively, the score of the user action may be determined according to the magnitude of the amplitude difference and/or the angle difference. Specifically, the smaller the amplitude difference and/or the angle difference, the higher the score of the user action and the higher the motion quality, and vice versa.
The specific numerical value of the score can be configured according to the actual application scenario; the embodiments of the present invention do not limit it.
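One possible scoring rule consistent with the description (smaller differences score higher) is sketched below; the 0-100 scale and the tolerance values are illustrative configuration choices, not values given by the patent.

```python
def score_action(amplitude_diff, angle_diff, amp_tol=0.2, ang_tol=30.0):
    """Map amplitude/angle differences to a 0-100 score; each difference is
    capped at its tolerance and costs up to half of the total score."""
    amp_penalty = min(amplitude_diff / amp_tol, 1.0) * 50
    ang_penalty = min(angle_diff / ang_tol, 1.0) * 50
    return round(100 - amp_penalty - ang_penalty)

print(score_action(0.0, 0.0))   # -> 100 (matches the standard exactly)
print(score_action(0.1, 15.0))  # -> 50  (halfway to each tolerance)
print(score_action(1.0, 90.0))  # -> 0   (both differences at or past tolerance)
```

Capping each penalty keeps wildly off actions from producing negative scores while preserving the monotonic relation between difference and score.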
In another embodiment of the present invention, after step S104 shown in Fig. 1, the method may also include the following step: determining whether the user action is standard according to the comparison result.
In this embodiment, in order to let the user intuitively understand his or her own motion quality, after the comparison result is obtained, whether the user action is standard, that is, whether the user action is consistent with the standard action, may be further judged on that basis.
Consistency between the user action and the standard action may mean that the user action is identical to the standard action, or that the user action matches the standard action within a preset error range; the embodiments of the present invention do not limit this.
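The judgment within a preset error range could be expressed as below; the tolerance values are illustrative assumptions and would be configured per action type.

```python
def is_standard(amplitude_diff, angle_diff, amp_tol=0.05, ang_tol=10.0):
    """Judge the user action standard when both differences fall inside
    the preset error range."""
    return amplitude_diff <= amp_tol and angle_diff <= ang_tol

print(is_standard(0.02, 5.0))   # -> True  (within both tolerances)
print(is_standard(0.02, 20.0))  # -> False (angle outside its tolerance)
```

Setting both tolerances to zero recovers the stricter reading in which only an identical action counts as standard.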
In a preferred embodiment of the present invention, the user image is a three-dimensional image.
Since a three-dimensional image can represent the user's figure more comprehensively, from multiple dimensions, a more accurate judgment result can be obtained when the user action is judged from a three-dimensional image. For example, for a back-bend action, if only a two-dimensional frontal image is captured, the user's upper body and lower body overlap in the image and the judgment result is inaccurate; if a three-dimensional image of the user is captured, it can show the user's figure comprehensively from the front, side, and back, so that the back-bend action can be determined, along with more specific information such as angle and amplitude.
Referring to Fig. 3, a motion detection apparatus 30 according to an embodiment of the present invention may include a user image acquisition module 301, a recognition module 302, a standard action retrieval module 303, and a comparison module 304.
The user image acquisition module 301 is adapted to acquire a user image captured while the user is exercising, the user image containing the user's figure; the recognition module 302 is adapted to perform feature point recognition on preset human body parts in the figure in the user image to obtain a user action; the standard action retrieval module 303 is adapted to retrieve the corresponding standard action from a database according to the user action; and the comparison module 304 is adapted to compare the user action with the standard action to obtain a comparison result, the comparison result being used to measure the motion quality of the user.
Using the user image captured during exercise, the embodiment of the present invention obtains the user's action during exercise through feature point recognition; the result of comparing it with the corresponding standard action in the database enables detection of the user's motion quality and measurement of the exercise effect, provides a reference for the user's subsequent exercise or physical condition, and improves the user experience.
In a specific embodiment of the present invention, referring to Fig. 4, the recognition module 302 shown in Fig. 3 may include: an image recognition unit 401 adapted to perform image recognition on the user image to obtain at least one preset human body part of the figure; and a feature point recognition unit 402 adapted to perform feature point recognition on the at least one preset human body part to obtain the shape and position of the at least one preset human body part, the user action including the shape and position of the at least one preset human body part.
With continued reference to Fig. 4, the recognition module 302 shown in Fig. 3 may also include a user action determination unit 403 adapted to determine the movement amplitude and movement angle of the at least one preset human body part according to the shape and position of the at least one preset human body part, the user action including the movement amplitude and the movement angle.
In a specific embodiment of the present invention, the comparison module 304 shown in Fig. 3 may include: a movement amplitude comparison unit (not shown) adapted to compare the movement amplitude of the user action with the movement amplitude of the standard action to obtain an amplitude difference; and a movement angle comparison unit (not shown) adapted to compare the movement angle of the user action with the movement angle of the standard action to obtain an angle difference, the comparison result including the amplitude difference and/or the angle difference.
In a preferred embodiment of the present invention, the motion detection apparatus 30 shown in Fig. 3 may also include a scoring module (not shown) adapted to score the user action according to the magnitude of the amplitude difference and/or the angle difference in the comparison result to obtain a score for the user action, the score representing the motion quality of the user.
Optionally, the motion detection apparatus 30 shown in Fig. 3 may also include a determination module adapted to determine whether the user action is standard according to the comparison result.
For more details on the working principle and working method of the motion detection apparatus 30, reference may be made to the related descriptions of Fig. 1 and Fig. 2, which are not repeated here.
An embodiment of the present invention also discloses a storage medium on which computer instructions are stored; when executed, the computer instructions can perform the steps of the motion detection method shown in Fig. 1 or Fig. 2. The storage medium may include a ROM, a RAM, a magnetic disk, an optical disc, or the like. The storage medium may also include a non-volatile memory or a non-transitory memory, or the like.
An embodiment of the present invention also discloses a terminal, which may include a memory and a processor, the memory storing computer instructions executable on the processor. When executing the computer instructions, the processor can perform the steps of the motion detection method shown in Fig. 1 or Fig. 2. The terminal includes, but is not limited to, terminal devices such as mobile phones, computers, and tablet computers.
Although the present invention is disclosed as above, the present invention is not limited thereto. Any person skilled in the art can make various changes and modifications without departing from the spirit and scope of the present invention, and the protection scope of the present invention shall therefore be subject to the scope defined by the claims.

Claims (18)

1. A motion detection method, characterized by comprising:
acquiring a user image captured while the user is exercising, the user image containing the user's figure;
performing feature point recognition on preset human body parts in the figure in the user image to obtain a user action;
retrieving the corresponding standard action from a database according to the user action;
comparing the user action with the standard action to obtain a comparison result, the comparison result being used to measure the motion quality of the user.
2. method for testing motion according to claim 1, which is characterized in that described to humanoid pre- in the user images If human body carries out Feature point recognition:
Image recognition is carried out to the user images to obtain described at least one humanoid default human body;
Feature point recognition is carried out at least one described default human body, obtains the shape of at least one default human body Shape and position, the user action include shape and the position of at least one default human body.
3. method for testing motion according to claim 2, which is characterized in that further include:
The dynamic of at least one default human body is determined according to the shape of at least one default human body and position Make amplitude and operating angle, the user action includes the movement range and the operating angle.
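One plausible way to realise claim 3 is to derive the action angle from three feature point coordinates and the action amplitude from a feature point's displacement. A minimal sketch, assuming 2D keypoints; the function names and keypoint layout are illustrative, not from the patent:

```python
import math

def action_angle(a, b, c):
    """Angle in degrees at joint b formed by points a-b-c, e.g. the
    elbow angle from shoulder, elbow and wrist feature points."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

def action_amplitude(start, end):
    """Displacement of a feature point between the start and end of the
    action, used here as a simple stand-in for the action amplitude."""
    return math.hypot(end[0] - start[0], end[1] - start[1])
```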
4. The motion detection method according to claim 1, characterized in that the user action includes an action amplitude and an action angle, and the comparing the user action with the standard action comprises: comparing the action amplitude of the user action with the action amplitude of the standard action to obtain an amplitude difference;
comparing the action angle of the user action with the action angle of the standard action to obtain an angle difference, the comparison result including the amplitude difference and/or the angle difference.
5. The motion detection method according to claim 4, characterized in that, after the comparing the user action with the standard action, the method further comprises:
scoring the user action according to the magnitude of the amplitude difference and/or the angle difference in the comparison result to obtain a score of the user action, the score representing the movement quality of the user.
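Claims 4 and 5 together suggest a simple scoring rule: penalise the score in proportion to the amplitude and angle differences. A minimal sketch, with a linear penalty and normalisation bounds that are purely illustrative assumptions:

```python
def score_action(amp_diff, angle_diff, max_amp=100.0, max_angle=180.0):
    """Map the comparison result (amplitude and angle differences) to a
    0-100 score: the larger the differences from the standard action,
    the lower the score. The linear penalty is an assumption."""
    amp_penalty = min(abs(amp_diff) / max_amp, 1.0)
    angle_penalty = min(abs(angle_diff) / max_angle, 1.0)
    return round(100.0 * (1.0 - 0.5 * (amp_penalty + angle_penalty)), 1)
```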
6. The motion detection method according to claim 1, characterized by further comprising:
determining whether the user action is standard according to the comparison result.
7. The motion detection method according to any one of claims 1 to 6, characterized in that the user image is a three-dimensional image.
8. The motion detection method according to any one of claims 1 to 6, characterized in that the preset human body parts are selected from a head, a shoulder, an arm, a hand, a leg, a foot and a waist.
9. A motion detection apparatus, characterized by comprising:
a user image obtaining module, adapted to obtain a user image captured during user movement, the user image including a figure of the user;
a recognition module, adapted to perform feature point recognition on preset human body parts of the figure in the user image to obtain a user action;
a standard action retrieval module, adapted to retrieve a corresponding standard action from a database according to the user action; and a comparison module, adapted to compare the user action with the standard action to obtain a comparison result, the comparison result being used to measure the movement quality of the user.
10. The motion detection apparatus according to claim 9, characterized in that the recognition module comprises: an image recognition unit, adapted to perform image recognition on the user image to obtain at least one preset human body part of the figure;
a feature point recognition unit, adapted to perform feature point recognition on the at least one preset human body part to obtain a shape and a position of the at least one preset human body part, the user action including the shape and the position of the at least one preset human body part.
11. The motion detection apparatus according to claim 10, characterized in that the recognition module further comprises: a user action determination unit, adapted to determine an action amplitude and an action angle of the at least one preset human body part according to the shape and the position of the at least one preset human body part, the user action including the action amplitude and the action angle.
12. The motion detection apparatus according to claim 9, characterized in that the comparison module comprises: an action amplitude comparing unit, adapted to compare the action amplitude of the user action with the action amplitude of the standard action to obtain an amplitude difference;
an action angle comparing unit, adapted to compare the action angle of the user action with the action angle of the standard action to obtain an angle difference, the comparison result including the amplitude difference and/or the angle difference.
13. The motion detection apparatus according to claim 12, characterized by further comprising:
a scoring module, adapted to score the user action according to the magnitude of the amplitude difference and/or the angle difference in the comparison result to obtain a score of the user action, the score representing the movement quality of the user.
14. The motion detection apparatus according to claim 9, characterized by further comprising:
a determination module, adapted to determine whether the user action is standard according to the comparison result.
15. The motion detection apparatus according to any one of claims 9 to 14, characterized in that the user image is a three-dimensional image.
16. The motion detection apparatus according to any one of claims 9 to 14, characterized in that the preset human body parts are selected from a head, a shoulder, an arm, a hand, a leg, a foot and a waist.
17. A storage medium on which computer instructions are stored, characterized in that the computer instructions, when run, perform the steps of the motion detection method according to any one of claims 1 to 8.
18. A terminal, comprising a memory and a processor, the memory storing computer instructions runnable on the processor, characterized in that the processor, when running the computer instructions, performs the steps of the motion detection method according to any one of claims 1 to 8.
CN201810798077.6A 2018-07-19 2018-07-19 Method for testing motion and device, storage medium, terminal Pending CN108921127A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810798077.6A CN108921127A (en) 2018-07-19 2018-07-19 Method for testing motion and device, storage medium, terminal


Publications (1)

Publication Number Publication Date
CN108921127A true CN108921127A (en) 2018-11-30

Family

ID=64415269

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810798077.6A Pending CN108921127A (en) 2018-07-19 2018-07-19 Method for testing motion and device, storage medium, terminal

Country Status (1)

Country Link
CN (1) CN108921127A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106611153A (en) * 2016-05-12 2017-05-03 简极科技有限公司 Intelligent ball training action recognition system and method
US20170224255A1 (en) * 2012-01-19 2017-08-10 Nike, Inc. Action Detection and Activity Classification
CN107748619A (en) * 2017-10-30 2018-03-02 南京布塔信息科技有限公司 A kind of motion analysis system and method based on motion capture technology
CN107832736A (en) * 2017-11-24 2018-03-23 南京华捷艾米软件科技有限公司 The recognition methods of real-time body's action and the identification device of real-time body's action
CN107909060A (en) * 2017-12-05 2018-04-13 前海健匠智能科技(深圳)有限公司 Gymnasium body-building action identification method and device based on deep learning
CN107924463A (en) * 2015-08-24 2018-04-17 夫斯特21有限公司 System and method for moving identification


Similar Documents

Publication Publication Date Title
US10799779B2 (en) Fitting system for golf equipment using camera image for measurement of individual and related methods
US20180177450A1 (en) Method and system for delivering biomechanical feedback to human and object motion
US11798318B2 (en) Detection of kinetic events and mechanical variables from uncalibrated video
Saponara Wearable biometric performance measurement system for combat sports
US20200372245A1 (en) Scoring metric for physical activity performance and tracking
KR20160046082A (en) Method of proving service of assessment of physical fitness and exercise prescrition by system for proving service of assessment of physical fitness and exercise prescrition
CN112364785B (en) Exercise training guiding method, device, equipment and computer storage medium
US20200406098A1 (en) Techniques for golf swing measurement and optimization
CN113597614B (en) Image processing method and device, electronic equipment and storage medium
CN113409651B (en) Live broadcast body building method, system, electronic equipment and storage medium
CN108921127A (en) Method for testing motion and device, storage medium, terminal
CN111353345B (en) Method, apparatus, system, electronic device, and storage medium for providing training feedback
TWM581492U (en) Weight training intelligence system
CN114708541A (en) Physical fitness test method and device, computer equipment and storage medium
CN113850150A (en) Motion scoring method and device based on deep learning 3D posture analysis
JP6939939B2 (en) Information processing equipment, information processing methods, and programs
WO2017056357A1 (en) Information processing apparatus, information processing method, and program
JP6944144B2 (en) Swing analyzer, method and program
Trinh et al. Design and Analysis of an FPGA-based CNN for Exercise Recognition
Virkud et al. A Cost-Efficient and Time Saving Exercise Posture Monitoring System
Vafadar et al. Evaluation of CNN-based human pose estimation for body segment lengths assessment
Hung et al. A HRNet-based Rehabilitation Monitoring System
Ahmadov Evaluating pose estimation and object detection models for the application in the minisoccerbal project
JP7447956B2 (en) Processing device, attitude analysis system, and program
Perini Feasibility of Mobile Phone-Based 2D Human Pose Estimation for Golf: An analysis of the golf swing focusing on selected joint angles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20181130