CN115105820A - Action evaluation system - Google Patents

Action evaluation system

Info

Publication number
CN115105820A
CN115105820A
Authority
CN
China
Prior art keywords
interface
action
motion
detection
testee
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110317857.6A
Other languages
Chinese (zh)
Inventor
薛雅馨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202110317857.6A
Publication of CN115105820A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116 Determining posture transitions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1128 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/05 Image processing for measuring physical parameters
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/806 Video cameras
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00 Measuring physiological parameters of the user
    • A63B2230/62 Measuring physiological parameters of the user posture

Abstract

A motion evaluation system in which cameras are erected in front of and to the side of a detection area. Evaluation software on a host receives front and side dynamic images captured by the two cameras while a subject, wearing a plurality of mark points on the joints, performs motions. The evaluation software captures two or more analysis images from the dynamic images, calculates one or more limb parameters from the mark points in each analysis image, and generates an evaluation report containing the analysis images and limb parameters of each motion, for reference by the subject and their instructor.

Description

Action evaluation system
Technical Field
The present invention relates to an action evaluation system, and more particularly to an action evaluation system that obtains action data through image processing and can generate a report.
Background
Existing action detection identifies functional limitations and asymmetries in a subject's body by evaluating the performance data of specific actions, which serves as a basis for assessing the risk of sports injury. Seven such actions are: deep squat (Squatting), hurdle step (Stepping), lunge (Lunging), shoulder mobility (Reaching), leg raise (Leg Raising), push-up (Push-up), and rotary stability (Rotary Stability).
Through these seven actions, an instructor can evaluate the subject's balance, motor control, muscle flexibility, proprioception, joint mobility, symmetry and other abilities, and prescribe appropriate exercise training. This reduces the risk of injury during exercise, strengthens the subject's weaknesses, and achieves a good training effect.
Disclosure of Invention
Evaluation of the above actions is currently performed manually. The present invention therefore visually identifies mark points on the subject's joints to easily detect data for designated actions and immediately generate an evaluation report, giving the subject a reference for understanding their own condition or planning exercise training.
To achieve the above object, the present invention provides a motion evaluation system, which includes a camera device and a host electrically connected to the camera device, wherein:
the camera device is provided with a front camera and a side camera, respectively arranged in front of and to the side of a detection area, both shooting toward the detection area;
the host is provided with an image receiving and computing device on which evaluation software is installed. The evaluation software receives front and side dynamic images of a plurality of mark points worn on the subject's joints while the front camera and the side camera shoot the subject performing one or more actions, captures one or more analysis images from the dynamic images of each action, calculates one or more limb parameters from the mark points in each analysis image, and then generates an evaluation report containing each analysis image and each limb parameter of each action.
Furthermore, when the image receiving and computing device starts to execute the evaluation software, a password verification interface and a service life verification interface are displayed in sequence; the evaluation software continues to execute only after a correct password has been entered at the password verification interface and the service life verification interface has confirmed that the running time is within the service life.
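The two-stage gate described above can be sketched in Python. The hashing scheme, stored password and expiry date below are illustrative assumptions, since the patent does not specify how the password or service life are stored or checked.

```python
import hashlib
from datetime import date

# Hypothetical stored values for illustration only.
STORED_HASH = hashlib.sha256(b"demo-password").hexdigest()
LICENSE_EXPIRY = date(2030, 1, 1)

def password_ok(entered: str) -> bool:
    """Compare a hash of the entered password with the stored hash."""
    return hashlib.sha256(entered.encode()).hexdigest() == STORED_HASH

def within_lifetime(today: date) -> bool:
    """Confirm the running time falls within the licensed service life."""
    return today <= LICENSE_EXPIRY

def may_run(entered: str, today: date) -> bool:
    """Both checks must pass before the evaluation software continues."""
    return password_ok(entered) and within_lifetime(today)
```

In a leasing arrangement such as the one mentioned later in the description, the expiry date would be refreshed with each lease renewal.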
Furthermore, the host's image receiving and computing device may be connected to a display module; the evaluation software displays an action detection interface on the display module, and the dynamic image is shown on the action detection interface in real time.
Preferably, a mark point is attached to each of the subject's two ankles, two knees and two wrists, as well as the thighs, shoulders and elbows. The evaluation software displays an action selection interface on the display module, showing one or more detection buttons, each corresponding to an action, together with a report generation button for generating the evaluation report.
Preferably, the evaluation software displays an input interface on the display module, providing three fields for entering the subject's number, name and height. The evaluation report includes a subject basic data area, in which the subject's number and name are displayed.
The input interface may be a touch interface, or the display module may only display the screen while the related information is entered through other human-machine interfaces (e.g., a keyboard and mouse).
Further, the present invention provides a system for evaluating actions, comprising:
starting and executing evaluation software in an input interface;
inputting subject data, which comprises the subject's number, name, height or weight;
the evaluation software displays an action selection interface on the input interface, and the action selection interface comprises at least one detection button;
selecting one of the detection buttons through the input interface, and displaying an action detection interface corresponding to the detection button on the input interface by the evaluation software;
the testee performs an action corresponding to the action detection interface in a detection area, wherein the testee wears at least one mark point on at least one joint, and a photographic device shoots the action, generates a dynamic image and transmits it back to the evaluation software; and
the evaluation software analyzes the dynamic image and the mark point and generates at least one limb parameter in the action detection interface.
In use, the photographic device shoots a dynamic image of each of the subject's actions; at least one analysis image is captured from the dynamic image; the mark points worn on the subject's joints in each analysis image are computed to obtain the required limb parameters; and an evaluation report containing each analysis image and each limb parameter is generated for the subject or an instructor, as a reference for the subject's exercise or training.
Compared with the prior art, the invention has the beneficial effects that:
the main effect of the invention is that the camera and the host can be used with evaluation software only by the video lens and the notebook/family computer or the system circuit board, and the hardware is easy to obtain; in addition, because the images of a plurality of marking points of the joint of the testee are captured to calculate the limb parameters during image processing, the complexity of the image processing is reduced, and the requirement for allocation can be greatly reduced; when the action is finished, the system can generate an evaluation report, so that the testee can improve himself by referring to the report content data, and the risk during the exercise is reduced.
The invention has the further effect that when the host computer starts to execute the evaluation software, the password verification interface and the service life verification interface are sequentially displayed, so that the system can be used only when the input password is correct and the time is in the time limit, and a user can use the system in a leasing mode.
Drawings
FIG. 1 is a schematic view of an apparatus according to a preferred embodiment of the present invention;
FIG. 2 is a diagram of a password verification interface and a lifetime verification interface according to a preferred embodiment of the present invention;
FIG. 3 is a diagram of an input interface and an action selection interface according to a preferred embodiment of the present invention;
FIG. 4 is a diagram of a motion detection interface according to a preferred embodiment of the present invention;
FIG. 5 shows a step of capturing an analysis image according to a preferred embodiment of the present invention;
FIG. 6 is a diagram of an evaluation report according to a preferred embodiment of the present invention.
Description of the symbols:
10 camera device
11 front camera
12 side camera
20 host
21 image receiving and computing device
22 display module
30 evaluation software
31 dynamic image
311 analysis image
32 password verification interface
33 lifetime verification interface
34 input interface
341 field
342 storage location
35 action selection interface
351 detection button
352 generate report button
36 action detection interface
361 limb parameter
362 stop button
A detection area
B testee
C mark point
X evaluation report
X1 subject basic data area
X2 project area
X3 analysis results area
X4 manual filling area
Detailed Description
So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings.
Referring to fig. 1 to 6, a preferred embodiment of the present invention provides a motion evaluation system, which includes a camera device 10 and a host 20 electrically connected to the camera device 10, wherein:
the camera device 10 is provided with a front camera 11 and a side camera 12, respectively installed in front of and to the side of a detection area A. Both shoot toward the detection area A, and the included angle between their shooting directions is preferably a right angle.
The host 20 is provided with an image receiving and computing device 21 and a display module 22 electrically connected to the image receiving and computing device 21. The front camera 11 and the side camera 12 are electrically connected to the image receiving and computing device 21, on which an evaluation software 30 for motion recording and evaluation is installed. The evaluation software 30 receives the front and side dynamic images 31 of a plurality of mark points C worn on the joints of a testee B while the front camera 11 and the side camera 12 shoot the testee B performing one or more motions, such as a deep squat. In the preferred embodiment, one mark point C is combined with each of the two ankles, two knees and two wrists of the testee B, as well as the thighs (at the position of the greater trochanter), the shoulders and the elbows. Each mark point C in the preferred embodiment is an orange cloth piece attached to the clothing by velcro; alternatively, each mark point C may be a pin, a brightly coloured cloth piece that is adhered or otherwise fixed, any other fixable marker, or a garment bearing marks.
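Since each mark point C is described as an orange cloth piece, locating it in a frame reduces to colour thresholding followed by centroid averaging. A minimal pure-Python sketch follows; the RGB threshold values are assumptions for illustration, not values taken from the patent.

```python
def find_marker(frame):
    """Locate one orange marker in an RGB frame.

    frame: list of rows, each row a list of (r, g, b) tuples.
    Returns the (x, y) centroid of orange-ish pixels, or None.
    """
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            # Crude "orange" test: strong red, medium green, little blue.
            if r > 180 and 60 < g < 160 and b < 90:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

A production system would more likely threshold in HSV space and track multiple markers per frame, but the centroid idea is the same: the marker's image position is the average of the pixels that pass the colour test.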
Referring to fig. 2, when the image receiving and computing device 21 starts to execute the evaluation software 30, a password verification interface 32 and a lifetime verification interface 33 are displayed in sequence; only after a correct password is entered at the password verification interface 32 and the lifetime verification interface 33 confirms that the running time is within the lifetime does the evaluation software 30 continue to execute. As shown in fig. 3, an input interface 34 is then displayed on the display module 22. The input interface 34 provides one or more fields 341 for entering the subject's serial number, name, height or weight, together with a storage location 342 for the evaluation report of the testee B. After the input is confirmed, the evaluation software 30 displays an action selection interface 35 on the display module 22, showing seven detection buttons 351 corresponding respectively to the seven actions of deep squat (Squatting), hurdle step (Stepping), lunge (Lunging), shoulder mobility (Reaching), leg raise (Leg Raising), push-up (Push-up) and rotary stability (Rotary Stability), as well as a generate report button 352 for producing an evaluation report. The seven actions are only exemplary and are not the only actions the invention can detect.
The steps of the present invention include: starting and executing the evaluation software 30 in the input interface 34; inputting subject data, which may include the subject's number, name, height or weight; the evaluation software 30 displaying the action selection interface 35, comprising at least one detection button 351, on the input interface; selecting one of the detection buttons 351 through the input interface 34, whereupon the evaluation software 30 displays an action detection interface 36 corresponding to that detection button 351 on the input interface 34; the testee B performing an action corresponding to the action detection interface 36 in a detection area A, while the photographing device 10 shoots the action, generates the dynamic image 31 and transmits it back to the evaluation software 30; and the evaluation software 30 analyzing the dynamic image 31 and the mark points C to generate at least one limb parameter 361 in the action detection interface 36.
Further, in an embodiment, when one of the detection buttons 351 is pressed, the evaluation software 30 displays the action detection interface 36 corresponding to that motion on the display module 22. Referring to fig. 4, the display module 22 displays the action detection interface 36 for the deep squat. The testee B stands within the detection area A facing the front camera 11 and performs the deep squat, holding a cross bar overhead with both hands throughout the movement. The evaluation software 30 receives the front and side dynamic images 31 captured by the front camera 11 and the side camera 12 and displays them on the action detection interface 36 in real time, so the testee B can watch the real-time dynamic image 31 and adjust toward a nearly correct posture.
Then, the evaluation software 30 executes the analysis-image capture step shown in fig. 5. It automatically uses the mark point C on the left or right hand of the testee B to determine whether the hand has reached the lowest point of the deep squat (the final position of the mark point C identifies the action end point). If so, frames of the dynamic image 31 are captured as the front and side analysis images 311; if not, the determination is repeated. The plurality of mark points C in each analysis image 311 are then used to calculate one or more limb parameters 361 through image processing: for example, the two-hand level parameter is computed from the mark points C of the two wrists in the front analysis image 311, while the shoulder flexion angle (shoulder extension), hip flexion angle (body-thigh), knee flexion angle (knee flexion) and ankle dorsiflexion angle (ankle dorsiflexion) are computed from the mark points C of the elbow, shoulder, thigh, knee and ankle in the side analysis image 311.
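The angle-type limb parameters named above (shoulder flexion, hip flexion, knee flexion, ankle dorsiflexion) all reduce to the angle at a vertex marker formed with two neighbouring markers, e.g. the knee flexion angle at the knee marker between the thigh and ankle markers. The three-point formulation below is an assumed implementation detail; the patent states only that limb parameters are computed from the marker positions.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at vertex b, formed by 2D points a-b-c."""
    v1 = (a[0] - b[0], a[1] - b[1])   # vector b -> a
    v2 = (c[0] - b[0], c[1] - b[1])   # vector b -> c
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))
```

With side-view marker centroids for the thigh, knee and ankle, `joint_angle(thigh, knee, ankle)` would yield the knee flexion parameter; the same function applies to the other joints by substituting the relevant marker triple.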
After the deep squat has been detected and the required analysis images 311 and limb parameters 361 obtained, pressing a stop button 362 on the action detection interface 36 returns to the action selection interface 35. The generate report button 352 can then be pressed directly, so the evaluation software 30 produces an evaluation report X covering only the deep squat, as shown in fig. 6; or the remaining detection buttons 351 can be pressed to enter the action detection interfaces 36 of the other actions. Each action is detected in a manner similar to the deep squat: the camera device 10 shoots the dynamic image 31 of the testee B performing the action, and the analysis image 311 is captured automatically when the mark point C reaches its extreme position, indicating that the action is complete. For example, for shoulder mobility, the analysis image 311 is captured when the two mark points C on the wrists of the testee B, reaching backward, come to rest. Image processing then computes the angles between lines joining the mark points C, or the distances between mark points, in each analysis image 311 to obtain the required limb parameters 361. Once the other actions have been detected and their analysis images 311 and limb parameters 361 obtained, returning to the action selection interface 35 and pressing the generate report button 352 produces an evaluation report X covering the complete set of action items.
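The end-point rule described here, capturing the analysis image when the tracked mark point C reaches its extreme position, can be sketched as a search over the marker's per-frame trajectory. Treating the extreme as the largest image-coordinate y (the lowest physical point) is an assumption matching the deep-squat case; other actions would track a different coordinate or direction.

```python
def lowest_point_frame(y_positions):
    """Index of the frame where a marker is lowest on screen.

    y_positions: the wrist marker's y coordinate per frame.
    In image coordinates y grows downward, so the lowest physical
    point is the maximum y.
    """
    best_frame, best_y = 0, y_positions[0]
    for i, y in enumerate(y_positions):
        if y > best_y:
            best_frame, best_y = i, y
    return best_frame
```

The frame at the returned index would then be cut from the dynamic image as the analysis image for that action.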
When the generate report button 352 is pressed on the action selection interface 35, the evaluation software 30 generates the evaluation report X in the previously selected storage location 342, in Excel spreadsheet format in the preferred embodiment, though web page, WORD, PDF and other formats are also possible. In the evaluation report X, the data for each action occupy separate pages; for example, the deep squat data are on the page of worksheet 1, and so on. Each page of the evaluation report X has a subject basic data area X1, a project area X2, an analysis results area X3 and a manual filling area X4. The page of worksheet 1, for instance, shows the subject number, subject name and measurement time in the subject basic data area X1; the action item, such as the deep squat, in the project area X2; the analysis images 311 and limb parameters 361 in the analysis results area X3; and fields for pain, item score and suggested training in the manual filling area X4, for an instructor or therapist to complete. Switching between worksheets in the evaluation report X allows the results of the different action tests (if any) to be inspected.
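The per-action report layout can be sketched with the standard library. Since Python's standard library has no Excel writer, this illustration emits CSV text with one section per action instead of the worksheet-per-action Excel workbook described above, and the field names are paraphrases of the report areas rather than the patent's exact labels.

```python
import csv
import io

def write_report(subject, actions):
    """Render a simple text report.

    subject: dict with 'number' and 'name' (subject basic data area).
    actions: list of dicts with 'action' (project area) and
             'parameters' mapping parameter name to value
             (analysis results area).
    """
    buf = io.StringIO()
    w = csv.writer(buf)
    w.writerow(["subject number", subject["number"]])
    w.writerow(["subject name", subject["name"]])
    for a in actions:
        w.writerow([])                       # blank row between actions
        w.writerow(["action", a["action"]])
        for name, value in a["parameters"].items():
            w.writerow([name, value])
    return buf.getvalue()
```

A real implementation targeting the Excel format would use a spreadsheet library and add the analysis images and the manual filling fields for pain, item score and suggested training.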
Furthermore, the present invention may further include one or more electromyographic signal detection elements (not shown) that continuously and precisely detect an electromyographic signal from the muscle groups of the testee B performing the above actions and input it, wired or wirelessly, to the host 20; the host 20 analyzes the detected signal and outputs the result to the display module 22, for example on the action detection interface. This embodiment therefore gives an instructor or therapist a better understanding of the actual condition of the limbs or muscles of the testee B.
Further, the host 20 may compile statistics from the data in each evaluation report of a testee B. For example, the detection results of different testees B at different times are recorded, and the limb movement state of a testee B is plotted against time as a statistical chart, helping the testee B track training results. For instance, the host 20 tallies the knee flexion angle of the testee B at each session and, in combination with the electromyographic signal, determines and marks changes in the muscle strength of the testee B.
Alternatively, by combining parameters such as the limb proportions of the testee B, the measured angles and the heights of the mark points, the system can compare against reference data for people with the same limb proportions and display on the interface a training scheme and adjustment directions given directly to the testee B.
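The statistics step can be sketched as a least-squares trend over sessions. The fitting method is an assumption for illustration; the text says only that a statistical chart of the limb movement state is drawn against time.

```python
def trend_slope(values):
    """Least-squares slope of values against session index 0..n-1.

    A positive slope on, say, knee flexion angle per session would
    indicate an improving range of motion over time.
    """
    n = len(values)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(values) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, values))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

The host could flag a testee whose slope falls below zero for follow-up, or annotate the chart with the fitted line.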
Further, the host 20 is connected to the internet and selectively and continuously connects data with a cloud host (not shown), which continuously accesses the detection results of each of the testees B, so that the testees B can access the detection results after appropriate authentication procedures. For example, the subject B can log into the cloud host through a mobile device, a computer, or the like, and display or download the evaluation report belonging to the subject B.
The above is only a preferred embodiment of the present invention, and is not intended to limit the present invention, and various modifications and changes will occur to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A motion estimation system, comprising a camera and a host electrically connected to the camera, wherein:
the camera device is provided with a front camera and a side camera respectively in front of and in side direction of a detection area to shoot towards the detection area;
the host is provided with an image receiving and operating device, an evaluation software is installed on the image receiving and operating device, the evaluation software receives dynamic images of the front and the side of a plurality of mark points worn on the joint of a testee when the front camera and the side camera shoot the testee to perform more than one action, the evaluation software captures the dynamic images of the testee during each action as more than one analysis image, and after the mark points in each analysis image are calculated as more than one limb parameter, an evaluation report is generated, and the evaluation report comprises each analysis image and each limb parameter of each action.
2. The system of claim 1, wherein the image receiving computing device sequentially displays a password verification interface and a lifetime verification interface when the evaluation software is started, and continues to execute the evaluation software when the password verification interface inputs a correct password and the lifetime verification interface confirms that the running time is within the lifetime.
3. The motion estimation system according to claim 1 or 2, wherein the host has a display module electrically connected to the image receiving and computing device, the estimation software displays a motion detection interface on the display module, and the dynamic image is displayed on the motion detection interface in real time.
4. The system of claim 3, wherein the ankle, knee, wrist, thigh, shoulder and elbow of the user are respectively associated with a marker, the evaluation software displays a motion selection interface on the display module, and displays more than one detection button corresponding to each motion and a report generation button for generating the evaluation report on the motion selection interface.
5. The system of claim 4, wherein the evaluation software displays an input interface on the display module, the input interface provides three fields for the input of the number of the subject, the name of the subject, and the height of the subject, and the evaluation report includes a subject data area, and the number of the subject and the name of the subject are displayed in the subject data area.
6. The motion estimation system of claim 1, wherein the host receives real-time myoelectric signals corresponding to the subject from one or more myoelectric signal detection elements, and compares and marks the correlation between the myoelectric signals and the limb parameters.
7. The motion estimation system of claim 6, wherein the host computer has a display module electrically connected to the image receiving and computing device, the estimation software displays a motion detection interface on the display module, the dynamic image is displayed on the motion detection interface in real time, and the real-time electromyographic signals are displayed on the display module.
8. The motion estimation system of claim 7, wherein the ankle, knee, wrist, thigh, shoulder and elbow of the user are respectively associated with a marker, the estimation software displays a motion selection interface on the display module, and displays more than one detection button corresponding to each motion and a report generation button for generating the estimation report on the motion selection interface.
9. The action-assessment system according to claim 7, wherein the assessment report is output to a mobile device via a network.
10. A motion estimation system, comprising:
starting and executing an evaluation software in an input interface;
inputting data of a testee, wherein the data comprises the serial number of the testee, the name of the testee, the height of the testee or the weight of the testee;
the evaluation software displays an action selection interface on the input interface, and the action selection interface comprises at least one detection button;
selecting one of the detection buttons through the input interface, and displaying an action detection interface corresponding to the detection button on the input interface by the evaluation software;
the testee performs an action corresponding to the action detection interface in a detection area, wherein the testee wears at least one mark point on at least one joint, and a photographic device shoots the action, generates a dynamic image and transmits it back to the evaluation software; and
the evaluation software analyzes the dynamic image and the mark point and generates at least one limb parameter in the action detection interface.
CN202110317857.6A 2021-03-23 2021-03-23 Action evaluation system Pending CN115105820A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110317857.6A CN115105820A (en) 2021-03-23 2021-03-23 Action evaluation system


Publications (1)

Publication Number Publication Date
CN115105820A true CN115105820A (en) 2022-09-27

Family

ID=83323786

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110317857.6A Pending CN115105820A (en) 2021-03-23 2021-03-23 Action evaluation system

Country Status (1)

Country Link
CN (1) CN115105820A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170000386A1 (en) * 2015-07-01 2017-01-05 BaziFIT, Inc. Method and system for monitoring and analyzing position, motion, and equilibrium of body parts
KR20190041239A (en) * 2017-10-12 2019-04-22 한국과학기술연구원 System for the assessment of lower limb activity and the personalized electrical stimulation using surface electromyography and motion signals
CN109840478A (en) * 2019-01-04 2019-06-04 广东智媒云图科技股份有限公司 A kind of movement appraisal procedure, device, mobile terminal and readable storage medium storing program for executing
KR102000763B1 (en) * 2018-02-07 2019-07-16 주식회사 인프라웨어테크놀러지 Smart mirror device for providing personal training service
WO2021002522A1 (en) * 2019-07-04 2021-01-07 주식회사 펀웨이브 Dance motion analysis evaluation device and method


Similar Documents

Publication Publication Date Title
KR101959079B1 (en) Method for measuring and evaluating body performance of user
WO2019232899A1 (en) Comprehensive evaluation system and method for physical fitness and muscular strength
KR101416282B1 (en) Functional measurement and evaluation system for exercising Health and Rehabilitation based on Natural Interaction
CN104274182A (en) Motion information processing apparatus and method
JP2016140591A (en) Motion analysis and evaluation device, motion analysis and evaluation method, and program
JP2017086184A (en) Muscular activity visualization system and muscular activity visualization method
CN107115102A (en) A kind of osteoarticular function appraisal procedure and device
KR101446653B1 (en) Health and rehabilitation game apparatus, system and game method
JP2020174910A (en) Exercise support system
Gauthier et al. Human movement quantification using Kinect for in-home physical exercise monitoring
CN106821387A (en) Using the lower limb rehabilitation degree quantitative evaluating system and appraisal procedure of motion capture sensor
KR20130099323A (en) Functional measurement and evaluation method for exercising health and rehabilitation based on natural interaction
Alahmari et al. Concurrent validity of two-dimensional video analysis of lower-extremity frontal plane of movement during multidirectional single-leg landing
Larsen et al. Effects of stance width and barbell placement on kinematics, kinetics, and myoelectric activity in back squats
KR20140082449A (en) Health and rehabilitation apparatus based on natural interaction
Choi et al. The development and evaluation of a program for leg-strengthening exercises and balance assessment using Kinect
McAllister et al. Evaluating movement performance: What you see isn’t necessarily what you get
CN115105820A (en) Action evaluation system
KR101276734B1 (en) System and method for testing flexibility of the body
WO2021261529A1 (en) Physical exercise assistance system
JP2017038812A (en) Body misalignment checker, body misalignment check method, and program
TWI432178B (en) Physical fitness movement system
TWM615470U (en) Motion evaluation system
KR20230005693A (en) Exercise management healthcare service system using tag
Söchting et al. Development of tests to evaluate the sensory abilities of children with autism spectrum disorder

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination