CN113569743B - Body building assessment method and system based on limb identification technology - Google Patents

Body building assessment method and system based on limb identification technology

Info

Publication number
CN113569743B
Authority
CN
China
Prior art keywords
limb
joint
action
data
action data
Legal status
Active
Application number
CN202110862266.7A
Other languages
Chinese (zh)
Other versions
CN113569743A
Inventor
周微 (Zhou Wei)
王海龙 (Wang Hailong)
Current Assignee
Shanghai Jianzhishu Health Management Co ltd
Original Assignee
Shanghai Jianzhishu Health Management Co ltd
Application filed by Shanghai Jianzhishu Health Management Co., Ltd.
Priority to CN202110862266.7A
Publication of CN113569743A
Application granted
Publication of CN113569743B
Legal status: Active


Classifications

    • G — PHYSICS
    • G16 — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H — HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 — ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/30 — ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Engineering & Computer Science (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a body-building evaluation method and a body-building evaluation system based on a limb identification technology. A first terminal collects a first action image, extracts limb joint positions from the first action image based on the limb identification technology, and obtains first action data according to the limb joint position changes across the first action images in a first video stream. A limb model of the target joint containing a mark is displayed in the case that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value. With the body-building evaluation method provided by the invention, the user's action data can be extracted based on the limb identification technology and compared with the coach-side action data collected by a second terminal, so that remote body-building guidance is realized; by displaying a limb model marking the actions that are not in place, automatic, accurate, and efficient evaluation of body-building actions is realized, thereby further realizing remote body-building evaluation.

Description

Body building assessment method and system based on limb identification technology
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a fitness evaluation method, apparatus, system, device, and storage medium based on a limb identification technology.
Background
At present, body-building exercise is increasingly popular. A body-building user can be guided by a dedicated coach at a body-building venue, or can imitate body-building actions from videos. However, beyond providing manual guidance of the user's actions, the prior art cannot accurately evaluate body-building actions.
Disclosure of Invention
The invention provides a body-building assessment method, device, system, equipment, and storage medium based on limb identification technology, which are used to overcome the low accuracy of manual body-building assessment in the prior art and to realize a more accurate body-building assessment scheme.
The invention provides a body building evaluation method based on a limb identification technology, which is applied to a first terminal at a user side and comprises the following steps:
Collecting a first action image and forming a first video stream containing the first action image;
extracting limb joint positions from the first action images based on a limb identification technology, and obtaining first action data according to limb joint position changes in each first action image in the first video stream;
The first action data is sent to a second terminal on the coaching side through a server;
and displaying the limb model of the target joint containing the mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds the target value, wherein the second action data is received from the second terminal through the server.
According to the fitness evaluation method based on the limb identification technology provided by the invention, the method further comprises the following steps:
Receiving a second video stream containing a second action image, which is acquired by a second terminal at the coaching side;
displaying the second video stream;
the second motion data is obtained by extracting the position of a limb joint from the second motion image based on a limb identification technology and according to the position change of the limb joint in each second motion image in the second video stream.
According to the fitness evaluation method based on the limb identification technology provided by the invention, before displaying the limb model of the target joint containing the mark, under the condition that the comparison value of the included angle at the corresponding target joint in the first motion data and the corresponding second motion data exceeds the target value, the method further comprises:
Calculating a comparison value of the included angle at each corresponding joint in the first action data and the corresponding second action data based on the following first formula:

S_x = 1 - |AS_x - AC_x| / 180

wherein AS_x is the degree of the included angle at joint x in the first action data and AC_x is the degree of the included angle at joint x in the second action data; and calculating the similarity between the first action data and the corresponding second action data based on the following second formula:

similarity = ( Σ_{x=a}^{a+j-1} S_x ) / j

wherein a is the number of the first joint of the limb and j is the total number of joints of the limb;
And under the condition that the similarity is smaller than the target similarity, determining a target joint corresponding to S_x exceeding a target value, and marking the target joint to obtain a limb model.
The invention also provides a body building evaluation method based on limb identification technology, which is applied to a second terminal on a coach side and comprises the following steps:
receiving first action data sent by a first terminal at a user side through a server;
Displaying a limb model of the target joint including a marker when a comparison value of the included angle at the corresponding target joint in the first motion data and the corresponding second motion data exceeds a target value, wherein the second motion data is received from a second terminal through a server;
the first motion data is obtained by extracting the position of a limb joint from first motion images acquired by the first terminal based on a limb identification technology and according to the position change of the limb joint in each first motion image in a first video stream containing the first motion images.
According to the fitness evaluation method based on the limb identification technology provided by the invention, before displaying the limb model of the target joint containing the mark, under the condition that the comparison value of the included angle at the corresponding target joint in the first motion data and the corresponding second motion data exceeds the target value, the method further comprises:
Collecting the second action image and forming the second video stream containing the second action image;
Extracting limb joint positions from the second action images based on a limb identification technology, and obtaining second action data according to limb joint position changes in each second action image in the second video stream;
and sending the second video stream to the first terminal through the server.
According to the fitness evaluation method based on the limb identification technology provided by the invention, before displaying the limb model of the target joint containing the mark, under the condition that the comparison value of the included angle at the corresponding target joint in the first motion data and the corresponding second motion data exceeds the target value, the method further comprises:
Calculating a comparison value of the included angle at each corresponding joint in the first action data and the corresponding second action data based on the following first formula:

S_x = 1 - |AS_x - AC_x| / 180

wherein AS_x is the degree of the included angle at joint x in the first action data and AC_x is the degree of the included angle at joint x in the second action data; and calculating the similarity between the first action data and the corresponding second action data based on the following second formula:

similarity = ( Σ_{x=a}^{a+j-1} S_x ) / j

wherein a is the number of the first joint of the limb and j is the total number of joints of the limb;
And under the condition that the similarity is smaller than the target similarity, determining a target joint corresponding to S_x exceeding a target value, and marking the target joint to obtain a limb model.
The invention also provides a body-building evaluation system based on limb identification technology, which comprises: a first terminal at the user side, a server and a second terminal at the coach side;
The first terminal is used for acquiring a first action image, forming a first video stream containing the first action image, extracting limb joint positions from the first action image based on a limb identification technology, acquiring first action data according to limb joint position changes in each first action image in the first video stream, sending the first action data to a server, and displaying a limb model of a corresponding target joint containing a mark under the condition that a comparison value of an included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value, wherein the second action data is received from the second terminal through the server;
the server is used for receiving the first action data from the first terminal and sending the first action data to the second terminal;
The second terminal is used for receiving the first action data from the server, and displaying a limb model of the target joint containing the mark when the comparison value of the included angle between the first action data and the corresponding second action data at the corresponding target joint exceeds a target value.
The invention also provides a body building assessment device based on limb identification technology, which is applied to a first terminal at a user side and comprises:
The first image acquisition module acquires a first action image and forms a first video stream containing the first action image;
The first data processing module is used for extracting limb joint positions from the first action images based on a limb identification technology and obtaining first action data according to limb joint position changes in each first action image in the first video stream;
The first data transmission module is used for transmitting the first action data to a second terminal at the coaching side through a server;
And the first image display module is used for displaying a limb model of the target joint containing the mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value, wherein the second action data is received from the second terminal through the server.
The invention also provides a body building assessment device based on limb identification technology, which is applied to a second terminal at the coaching side and comprises:
The second receiving module is used for receiving first action data sent by the first terminal at the user side through the server;
A second image display module for displaying a limb model of the target joint including a marker when a comparison value of the included angle at the corresponding target joint in the first motion data and the corresponding second motion data exceeds a target value, wherein the second motion data is received from the second terminal through the server;
the first motion data is obtained by extracting the position of a limb joint from first motion images acquired by the first terminal based on a limb identification technology and according to the position change of the limb joint in each first motion image in a first video stream containing the first motion images.
The invention also provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the fitness assessment method based on limb identification technology as described in any one of the above when the program is executed.
The present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of a fitness assessment method based on limb identification techniques as described in any one of the above.
According to the fitness evaluation method, device, system, equipment and storage medium based on the limb identification technology, the first action image is collected by the first terminal, the limb joint positions are extracted from the first action image based on the limb identification technology, and the first action data is obtained according to the limb joint position changes in the first video stream, so that automatic identification and extraction of the user's actions are realized. A limb model of the target joint containing a mark is displayed in the case that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value. With the fitness evaluation method provided by the invention, the user's action data can be extracted based on the limb identification technology and compared with the coach-side action data collected by the second terminal, so that remote fitness guidance is realized; by displaying a limb model marking the actions that are not in place, automatic, accurate, and efficient evaluation of fitness actions is realized, thereby further realizing remote fitness evaluation.
Drawings
In order to more clearly illustrate the invention or the technical solutions of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a fitness evaluation system based on limb identification technology according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a fitness evaluation method based on a limb identification technology according to an embodiment of the present invention;
FIG. 3 is a second flow chart of a fitness evaluation method based on a limb identification technique according to an embodiment of the present invention;
Fig. 4 is an interface schematic diagram of a first terminal according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of a distribution of joints of limbs of a human body according to an embodiment of the present invention;
FIG. 6 is a schematic illustration of a limb model provided in an embodiment of the present invention;
FIG. 7 is a third flow chart of a fitness evaluation method based on a limb identification technique according to an embodiment of the present invention;
fig. 8 is an interface schematic diagram of a second terminal according to an embodiment of the present invention;
FIG. 9 is a fourth schematic flow chart of a body-building assessment method based on a limb identification technique according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a body-building assessment device based on limb identification technology according to an embodiment of the present invention;
FIG. 11 is a second schematic diagram of a fitness evaluation device according to the present invention;
FIG. 12 is a third schematic diagram of a fitness evaluation device according to an embodiment of the present invention;
FIG. 13 is a fourth schematic diagram of a body-building assessment device based on a limb identification technique according to an embodiment of the present invention;
FIG. 14 is a fifth schematic diagram of a body-building assessment device based on limb identification technology according to an embodiment of the present invention;
fig. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, the body-building assessment system based on limb identification technology provided by the embodiment of the invention includes: a first terminal 11 on the user side, a server 12, and a second terminal 13 on the coach side.
The first terminal 11 is configured to: collecting a first action image and forming a first video stream containing the first action image;
extracting limb joint positions from the first action images based on a limb identification technology, and obtaining first action data according to limb joint position changes in each first action image in the first video stream;
Transmitting the first action data to the server 12;
And displaying a limb model of the target joint containing a mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value, wherein the second action data is received from the second terminal through the server.
The server 12 is configured to receive the first action data from the first terminal 11 and send the first action data to the second terminal 13;
the second terminal 13 is configured to receive the first motion data from the server 12, and display a limb model of the target joint including a marker when a comparison value of an included angle at the corresponding target joint in the first motion data and the corresponding second motion data exceeds a target value.
In a specific application, the first terminal 11 executes the fitness evaluation method based on the limb identification technology by installing an application program on the user side and running the application program.
And the second terminal 13 performs the fitness evaluation method based on the limb identification technology by installing an application on the coach side and running the application.
In a specific application, the first terminal 11 faces the user, and may collect the first action image of the user in real time and form the first video stream. The first terminal 11 captures a first video stream containing first action images by means of a camera, such that the first video stream contains a plurality of consecutive first action images.
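For illustration only, the patent does not prescribe any particular capture API. The minimal sketch below, assuming a Python client using OpenCV (the camera index and the fixed frame count are illustrative assumptions), shows how the first terminal could collect consecutive first action images into a first video stream:

```python
import cv2

cap = cv2.VideoCapture(0)   # assumed: the first terminal's default camera
first_video_stream = []     # consecutive first action images form the first video stream

while len(first_video_stream) < 300:  # illustrative cap: about 10 s at 30 fps
    ok, frame = cap.read()
    if not ok:
        break
    first_video_stream.append(frame)

cap.release()
```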
The second action data may be action data recorded and stored in advance, so as to serve as a reference basis for evaluating the first action data of the user. Alternatively, the second action data may be received from the coach-side second terminal, in which case the second terminal collects the coach's demonstration actions and forms the second action data.
In the embodiment of the invention, a joint is a movable connection between two or more bones of the human body, and the user's body-building action is reflected by joint action information. Therefore, the user's action can be further determined by recognizing the user's joint position information through the limb recognition technique.
In the embodiment of the invention, the limb recognition technology may specifically be a limb recognition technology adopting a deep learning model, and the deep learning model may be obtained by training a neural network model by using motion sample data.
In alternative embodiments, the limb identification technique may also identify the user's joint actions by key point marking and identification of the joints, without using a deep learning model.
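The patent thus leaves the concrete limb identification model open: a trained deep learning model or key-point marking of the joints. As one hedged illustration, the sketch below uses the open-source MediaPipe Pose estimator as a stand-in key-point detector for extracting limb joint positions from an action image; the patent does not name this (or any) library, and any comparable pose-estimation model would fill the same role:

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def extract_joint_positions(frame_bgr):
    """Return a list of (x, y) limb joint positions, normalized to [0, 1], or None."""
    with mp_pose.Pose(static_image_mode=True) as pose:
        results = pose.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if results.pose_landmarks is None:
        return None  # no person detected in this action image
    return [(lm.x, lm.y) for lm in results.pose_landmarks.landmark]
```

First action data would then be derived from how these positions change across the first action images of the first video stream.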
The body-building evaluation system provided by the embodiment of the invention can automatically extract the action data of the user by applying the limb identification technology, marks the actions which are not in place by the user in the form of the limb model, realizes the intellectualization and high precision of body-building guidance and realizes the intelligent body-building private teaching service.
The fitness evaluation method based on the limb identification technique of the present invention is described below with reference to fig. 2 to 9. The execution subject of the method may be the first terminal located at the user side in the fitness evaluation system; specifically, the first terminal is installed with a corresponding application program that serves as the client executing the fitness evaluation method, which is not specifically limited herein.
The first terminal may be an intelligent device such as a smart phone, a tablet computer, a smart television, and the like, which is not particularly limited herein.
Referring to fig. 2, the fitness evaluation method based on the limb identification technology provided by the embodiment of the invention specifically may include the following steps:
step 210: collecting a first action image and forming a first video stream containing the first action image;
step 220: extracting limb joint positions from the first action images based on a limb identification technology, and obtaining first action data according to limb joint position changes in each first action image in the first video stream;
step 230: the first action data is sent to a second terminal on the coaching side through a server;
step 240: displaying a limb model of the target joint containing a mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value, wherein the second action data is received from the second terminal through the server.
In the embodiment of the invention, the first terminal collects the first action image of the user and extracts the first action data based on the limb joint positions using the limb identification technology, realizing automatic identification and extraction of the user's actions. Meanwhile, the first terminal can display a limb model marking a target joint; the mark of the target joint represents a position where the user's action is not in place, so the first terminal can show the user's action flaws clearly at a glance, realizing display output of the user's flawed actions. In addition, by sending the first action data to the second terminal, the second terminal on the coaching side can automatically recognize the user's actions based on the first action data and judge whether they are in place.
The embodiment of the invention provides a feasible fitness evaluation scheme, and can accurately and efficiently evaluate and display the fitness actions of the user.
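The patent likewise does not define a wire format for the first action data relayed through the server. The layout below is a purely hypothetical example of what the first terminal might send; every field name in it is an assumption, not taken from the patent:

```python
import json
import time

# Hypothetical payload; field names are illustrative assumptions.
first_action_data = {
    "user_id": "user-001",
    "timestamp": time.time(),
    "joint_angles": {"1": 171.8, "4": 88.4, "7": 92.1},  # degrees, keyed by joint number
}
payload = json.dumps(first_action_data)  # relayed via the server to the coach-side second terminal
```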
Referring to fig. 3, an exercise assessment method based on a limb identification technology according to an alternative embodiment of the present invention includes:
Step 310: receiving a second video stream containing a second action image, which is acquired by a second terminal at the coaching side;
Step 320: displaying the second video stream;
Step 330: collecting a first action image and forming a first video stream containing the first action image;
Step 340: extracting limb joint positions from the first action images based on a limb identification technology, and obtaining first action data according to limb joint position changes in each first action image in the first video stream;
Step 350: the first action data is sent to a second terminal on the coaching side through a server;
Step 360: displaying a limb model of the target joint containing the mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value;
the second motion data is obtained by extracting the position of a limb joint from the second motion image based on a limb identification technology and according to the position change of the limb joint in each second motion image in the second video stream.
Specifically, the first terminal receives a second video stream acquired by the second terminal through the server.
In this embodiment, the second video stream provides an action guiding function, and through the displayed second video stream, the user can learn a standard exercise action.
In a specific application scene, a second action image of the coach collected by the second terminal is transmitted to the first terminal and displayed there; the first terminal collects the body-building action the user performs against the second action image, compares it with the second action data corresponding to the second action image, and thereby obtains a limb model marking where the user's action is not in place.
Referring to fig. 4, a second video stream of the user's limb model and the coach is simultaneously displayed at the first terminal, and the user can correct his own actions against the limb model.
In an actual scene, the coach and the user can continuously interact using the fitness evaluation system of this embodiment: the coach feeds back second action data or a second video stream according to the received first action data or first video stream, and the user feeds back first action data or a first video stream according to the received second action data or second video stream. In this process, the user's actions are automatically and quantitatively evaluated based on the limb identification technology, and an accurate quantitative evaluation result is output.
In the embodiment of the invention, step 310 precedes step 330, which is exemplary. In an alternative embodiment, step 330 may precede step 310.
In an alternative embodiment, the second motion data or the second video stream may also be recorded in advance and stored in the first terminal or the server, and the first terminal may extract the second motion data or the second video stream in real time for limb identification.
Optionally, before displaying the limb model of the target joint containing the mark in the case that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds the target value, the method further comprises:
Calculating a comparison value of the included angle at each corresponding joint in the first action data and the corresponding second action data based on the following first formula:

S_x = 1 - |AS_x - AC_x| / 180

wherein AS_x is the degree of the included angle at joint x in the first action data and AC_x is the degree of the included angle at joint x in the second action data; and calculating the similarity between the first action data and the corresponding second action data based on the following second formula:

similarity = ( Σ_{x=a}^{a+j-1} S_x ) / j

wherein a is the number of the first joint of the limb and j is the total number of joints of the limb;
And under the condition that the similarity is smaller than the target similarity, determining a target joint corresponding to S_x exceeding a target value, and marking the target joint to obtain a limb model.
In this embodiment, the comparison value for a single joint angle can be calculated by the above formula for S_x. For example, when the degree of the included angle at a joint in the second action data is 180 and the degree of the included angle at the same joint in the first action data is 0, the comparison value is 0 (indicating 0% similarity); when both degrees are 180, the comparison value is 1 (indicating 100% similarity); and when the degree in the second action data is 180 while the degree at the same joint in the first action data is 90, the comparison value is 0.5 (indicating 50% similarity).
Therefore, the comparison value is quantized by adopting the formula, and the accuracy of comparison of the action data is improved.
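Given the first and second formulas above, this quantization can be sketched directly; the short Python illustration below assumes per-joint angles are given in degrees, as in the worked example:

```python
def comparison_value(as_x, ac_x):
    """First formula: S_x = 1 - |AS_x - AC_x| / 180."""
    return 1.0 - abs(as_x - ac_x) / 180.0

def similarity(first_angles, second_angles):
    """Second formula: mean of S_x over the j joints of the limb."""
    s = [comparison_value(a, c) for a, c in zip(first_angles, second_angles)]
    return sum(s) / len(s)

# Reproducing the worked example: 0 vs 180 -> 0.0; 180 vs 180 -> 1.0; 90 vs 180 -> 0.5
assert comparison_value(0, 180) == 0.0
assert comparison_value(180, 180) == 1.0
assert comparison_value(90, 180) == 0.5
```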
Specifically, with reference to the schematic diagram of the joint distribution of the human limb shown in fig. 5, the included angles of the joints are calculated:
a) An included angle A between the neck and the shoulder, namely an included angle between a connecting line of two joint points 1 and 2 and a connecting line of two joint points 3 and 6 in the figure;
b) The included angle B between the left upper arm and the shoulder is the included angle between the connecting line of the two joint points 3 and 4 and the connecting line of the two joint points 3 and 6 in the figure;
c) The included angle C between the right upper arm and the shoulder, namely the included angle between the connecting line of the two joint points 6 and 7 and the connecting line of the two joint points 3 and 6 in the figure;
d) An included angle D between the left lower arm and the left upper arm, namely an included angle between a connecting line of two joint points 3 and 4 and a connecting line of two joint points 4 and 5 in the figure;
e) An included angle E between the right lower arm and the right upper arm, namely an included angle between a connecting line of the two joint points 6 and 7 and a connecting line of the two joint points 7 and 8 in the figure;
f) The included angle F between the lumbar vertebra and the hip, namely the included angle between the connecting line of two articulation points 2 and 9 and the connecting line of two articulation points 10 and 13 in the figure;
g) The included angle G between the hip and the left thigh is the included angle between the connecting line of the joint points 10 and 13 and the connecting line of the joint points 10 and 11 in the figure;
h) The included angle H between the hip and the right thigh is the included angle between the connecting line of the two joint points 10 and 13 and the connecting line of the two joint points 13 and 14 in the figure;
i) An included angle I between the left thigh and the left shank, namely an included angle between a connecting line of two joint points 10 and 11 and a connecting line of two joint points 11 and 12 in the figure;
j) The included angle J between the right thigh and the right shank, namely the included angle between the connecting line of the two joint points 13 and 14 and the connecting line of the two joint points 14 and 15 in the figure.
The first motion data and the second motion data can be characterized by the limb joint model shown in fig. 4, and the included angle of each joint position is calculated by using the formula.
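Each of the angles A-J above is the included angle between two connecting lines of joint points. A minimal sketch of that computation, assuming 2D joint coordinates as produced by the limb identification step, could look like this:

```python
import numpy as np

def segment_angle(a1, a2, b1, b2):
    """Included angle, in degrees, between the line a1-a2 and the line b1-b2."""
    v1 = np.asarray(a2, dtype=float) - np.asarray(a1, dtype=float)
    v2 = np.asarray(b2, dtype=float) - np.asarray(b1, dtype=float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

# e.g. angle D, between the left upper arm (joints 3-4) and the left lower arm (joints 4-5):
# angle_d = segment_angle(joints[3], joints[4], joints[4], joints[5])
```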
Referring to fig. 6, when the angle comparison value at the position of joint 4 exceeds the target value, joint 4 is determined as the target joint and is marked in the limb model shown at the first terminal. In particular, the mark may be highlighting, a color mark, etc., which is not limited here.
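How the mark is rendered is left open by the patent. One possible sketch, assuming OpenCV drawing on pixel coordinates, highlights in red every target joint whose angle comparison exceeded the target value:

```python
import cv2

def draw_limb_model(canvas, joints, bones, target_joints):
    """Draw the limb model; joints in target_joints get an enlarged red mark."""
    for i, j in bones:                    # bones: pairs of joint numbers, e.g. (3, 4)
        cv2.line(canvas, joints[i], joints[j], (200, 200, 200), 2)
    for idx, pt in joints.items():        # joints: {joint number: (x, y) in pixels}
        marked = idx in target_joints
        color = (0, 0, 255) if marked else (0, 255, 0)  # BGR: red mark vs. green
        cv2.circle(canvas, pt, 10 if marked else 5, color, -1)
```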
The above calculation process may be performed at the first terminal or may be performed at the server.
Referring to fig. 7, an embodiment of the present invention provides a body-building assessment method based on the limb recognition technology applied to a second terminal; the execution subject of the method is the second terminal, and specifically a coach-side application program installed on the second terminal implements the method. The method specifically comprises the following steps:
Step 710: receiving first action data sent by a first terminal at a user side through a server;
Step 720: displaying a limb model of the target joint including a marker in case a comparison value of the first motion data and an included angle at the corresponding target joint in corresponding second motion data exceeds a target value, wherein the second motion data is received from the second terminal through the server;
the first motion data is obtained by extracting the position of a limb joint from first motion images acquired by the first terminal based on a limb identification technology and according to the position change of the limb joint in each first motion image in a first video stream containing the first motion images.
Through the displayed limb model, the coach can confirm in real time where the user's actions are not in place.
Referring to fig. 8, the second terminal can simultaneously display the coach's limb model and the limb models of twelve trainees A-L, where joint angles that differ greatly between a trainee and the coach can be highlighted, giving the coach a basis for correcting that trainee's actions.
Optionally, referring to fig. 9, the fitness evaluation method according to an alternative embodiment of the present invention includes the steps of:
step 910: collecting the second action image and forming the second video stream containing the second action image;
Step 920: extracting limb joint positions from the second action images based on a limb identification technology, and obtaining second action data according to limb joint position changes in each second action image in the second video stream;
step 930: receiving first action data sent by a first terminal at a user side through a server;
Step 940: displaying a limb model of the target joint containing the mark under the condition that the comparison value of the included angle between the first action data and the corresponding target joint in the corresponding second action data exceeds a target value;
Step 950: and sending the second video stream to the first terminal through the server.
With this embodiment, the body building system can collect the first video stream and the second video stream in real time, extract the corresponding action data based on the limb identification technology, and obtain the limb model.
The timing relationship between steps 910 and 920 and 930 is not limited by fig. 9, and the timing relationship between steps 950 and 930 and 940 is not limited by fig. 9.
Optionally, before displaying the limb model of the target joint containing the marker in the case that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value, the method further includes:
Calculating a comparison value of the included angle at each corresponding joint in the first action data and the corresponding second action data based on the following first formula:

S_x = 1 - |AS_x - AC_x| / 180

wherein AS_x is the degree of the included angle at joint x in the first action data and AC_x is the degree of the included angle at joint x in the second action data; and calculating the similarity between the first action data and the corresponding second action data based on the following second formula:

similarity = ( Σ_{x=a}^{a+j-1} S_x ) / j

wherein a is the number of the first joint of the limb and j is the total number of joints of the limb;
And under the condition that the similarity is smaller than the target similarity, determining a target joint corresponding to S_x exceeding a target value, and marking the target joint to obtain a limb model.
The body-building evaluation device based on the limb identification technology provided by the invention is described below, and the body-building evaluation device based on the limb identification technology described below and the body-building evaluation method based on the limb identification technology described above can be referred to correspondingly.
Referring to fig. 10, the fitness evaluation device based on the limb identification technique includes:
a first image acquisition module 1001 for acquiring a first motion image and forming a first video stream containing the first motion image;
The first data processing module 1002 extracts a limb joint position from the first motion image based on a limb identification technology, and obtains first motion data according to a limb joint position change in each first motion image in the first video stream;
a first data transmission module 1003 for transmitting the first motion data to a second terminal on the coach side through a server;
The first image display module 1004 displays a limb model of the target joint including a marker when a comparison value of an included angle at the corresponding target joint in the first motion data and the corresponding second motion data exceeds a target value, wherein the second motion data is received from the second terminal through the server.
Optionally, referring to fig. 11, compared to fig. 10, the fitness evaluation device of the present embodiment further includes:
the first receiving module 1101 receives a second video stream including a second motion image collected by a second terminal on the coach side;
The first image display module 1102 is further configured to display the second video stream.
In this case, the second motion data is obtained by extracting a limb joint position from the second motion image based on a limb recognition technique, and according to a limb joint position change in each second motion image in the second video stream.
Optionally, referring to fig. 12, compared to fig. 11, the fitness evaluation device of the present embodiment further includes:
The computing module 1210 is specifically configured to:
before displaying a limb model of the target joint containing a mark in the case that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value, calculating the comparison value of the included angle at each corresponding joint in the first action data and the corresponding second action data based on the following first formula:

S_x = 1 - |AS_x - AC_x| / 180

wherein AS_x is the degree of the included angle at joint x in the first action data and AC_x is the degree of the included angle at joint x in the second action data; and calculating the similarity between the first action data and the corresponding second action data based on the following second formula:

similarity = ( Σ_{x=a}^{a+j-1} S_x ) / j

wherein a is the number of the first joint of the limb and j is the total number of joints of the limb;
And the marking module 1220 is configured to determine a target joint corresponding to S_x exceeding a target value if the similarity is smaller than the target similarity, and mark the target joint to obtain a limb model.
Referring to fig. 13, an embodiment of the present invention provides a fitness evaluation device based on a second terminal on a trainer side, the device comprising:
a second receiving module 1310 for receiving, by the server, the first action data sent by the first terminal on the user side;
A second image display module 1320 for displaying a limb model of the target joint including a marker in case that a comparison value of the included angle at the corresponding target joint in the first motion data and the corresponding second motion data received from the second terminal through the server exceeds a target value;
the first motion data is obtained by extracting the position of a limb joint from first motion images acquired by the first terminal based on a limb identification technology and according to the position change of the limb joint in each first motion image in a first video stream containing the first motion images.
Optionally, referring to fig. 14, compared to fig. 13, the fitness evaluation device provided by the alternative embodiment of the present invention further includes:
a second image acquisition module 1410, configured to collect a second action image and form the second video stream containing the second action image before displaying the limb model of the target joint containing the mark in the case that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value;
a second data processing module 1420 that extracts a limb joint position from the second motion image based on a limb identification technique, and obtains the second motion data according to a limb joint position change in each second motion image in the second video stream;
and a second data transmission module 1430 for transmitting the second video stream to the first terminal through the server.
Fig. 15 illustrates a physical structure diagram of an electronic device, as shown in fig. 15, which may include: processor 1510, communication interface (Communications Interface) 1520, memory 1530, and communication bus 1540, wherein processor 1510, communication interface 1520, memory 1530 communicate with each other via communication bus 1540. Processor 1510 may invoke logic instructions in memory 1530 to perform a fitness assessment method based on limb identification techniques, the method comprising:
Collecting a first action image and forming a first video stream containing the first action image;
extracting limb joint positions from the first action images based on a limb identification technology, and obtaining first action data according to limb joint position changes in each first action image in the first video stream;
The first action data is sent to a second terminal on the coaching side through a server;
displaying a limb model of the target joint containing the mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value, wherein the second action data is received from a second terminal through a server; or alternatively
Receiving first action data sent by a first terminal at a user side through a server;
Displaying a limb model of the target joint including a marker when a comparison value of the included angle at the corresponding target joint in the first motion data and the corresponding second motion data exceeds a target value, wherein the second motion data is received from a second terminal through a server;
the first motion data is obtained by extracting the position of a limb joint from first motion images acquired by the first terminal based on a limb identification technology and according to the position change of the limb joint in each first motion image in a first video stream containing the first motion images.
Further, the logic instructions in the memory 1530 described above may be implemented in the form of software functional units and may be stored on a computer readable storage medium when sold or used as a stand alone product. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a usb disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
In another aspect, the present invention also provides a computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform a method of assessing fitness based on limb identification techniques provided by the methods described above, the method comprising:
Collecting a first action image and forming a first video stream containing the first action image;
extracting limb joint positions from the first action images based on a limb identification technology, and obtaining first action data according to limb joint position changes in each first action image in the first video stream;
The first action data is sent to a second terminal on the coaching side through a server;
displaying a limb model of the target joint containing the mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value, wherein the second action data is received from a second terminal through a server; or alternatively
Receiving first action data sent by a first terminal at a user side through a server;
Displaying a limb model of the target joint including a marker when a comparison value of the included angle at the corresponding target joint in the first motion data and the corresponding second motion data exceeds a target value, wherein the second motion data is received from a second terminal through a server;
the first motion data is obtained by extracting the position of a limb joint from first motion images acquired by the first terminal based on a limb identification technology and according to the position change of the limb joint in each first motion image in a first video stream containing the first motion images.
In yet another aspect, the present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, is implemented to perform the above provided fitness assessment method based on limb identification techniques, the method comprising:
Collecting a first action image and forming a first video stream containing the first action image;
extracting limb joint positions from the first action images based on a limb identification technology, and obtaining first action data according to limb joint position changes in each first action image in the first video stream;
The first action data is sent to a second terminal on the coaching side through a server;
displaying a limb model of the target joint containing the mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value, wherein the second action data is received from a second terminal through a server; or alternatively
Receiving first action data sent by a first terminal at a user side through a server;
Displaying a limb model of the target joint including a marker when a comparison value of the included angle at the corresponding target joint in the first motion data and the corresponding second motion data exceeds a target value, wherein the second motion data is received from a second terminal through a server;
the first motion data is obtained by extracting the position of a limb joint from first motion images acquired by the first terminal based on a limb identification technology and according to the position change of the limb joint in each first motion image in a first video stream containing the first motion images.
The apparatus embodiments described above are merely illustrative: the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement this without inventive effort.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course may be implemented by means of hardware. Based on this understanding, the foregoing technical solution may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (9)

1. A fitness evaluation method based on a limb identification technology, which is characterized in that the fitness evaluation method is applied to a first terminal at a user side and comprises the following steps:
Collecting a first action image and forming a first video stream containing the first action image;
Extracting limb joint positions from the first action images based on a limb recognition technology, and obtaining first action data according to limb joint position changes in each first action image in the first video stream, wherein the limb recognition technology is a limb recognition technology adopting a deep learning model, the deep learning model is obtained by training a neural network model by using action sample data, or the limb recognition technology recognizes joint actions of a user by marking and recognizing key points of joints;
the first action data is sent to a second terminal on the coaching side through a server, so that the second terminal calculates a comparison value of the included angle at each corresponding joint in the first action data and the corresponding second action data based on the following first formula:

S_x = 1 - |AS_x - AC_x| / 180

wherein AS_x is the degree of the included angle at joint x in the first action data and AC_x is the degree of the included angle at joint x in the second action data, and calculates the similarity between the first action data and the corresponding second action data based on the following second formula:

similarity = ( Σ_{x=a}^{a+j-1} S_x ) / j

wherein a is the number of the first joint of the limb and j is the total number of joints of the limb;

and so that the second terminal, in the case that the similarity is smaller than the target similarity, determines the target joint corresponding to the S_x exceeding the target value and marks the target joint to obtain a limb model, and displays the limb model of the target joint containing the mark in the case that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds the target value;
The first terminal displays a limb model of a corresponding target joint containing a mark under the condition that a comparison value of an included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value, wherein the second action data is received from the second terminal through the server, and the mark of the target joint represents a position where a user action is not in place;
before the first terminal displays the limb model of the target joint containing the mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds the target value, the method further comprises:
calculating a comparison value of the included angle at each corresponding joint in the first action data and the corresponding second action data based on the following first formula:

S_x = 1 - |AS_x - AC_x| / 180

wherein AS_x is the degree of the included angle at joint x in the first action data and AC_x is the degree of the included angle at joint x in the second action data; and calculating the similarity between the first action data and the corresponding second action data based on the following second formula:

similarity = ( Σ_{x=a}^{a+j-1} S_x ) / j

wherein a is the number of the first joint of the limb and j is the total number of joints of the limb;

and, in the case that the similarity is smaller than the target similarity, determining the target joint corresponding to the S_x exceeding the target value and marking the target joint to obtain the limb model.
2. The method of fitness assessment based on limb identification technology according to claim 1, wherein the method further comprises:
Receiving a second video stream containing a second action image, which is acquired by a second terminal at the coaching side;
displaying the second video stream;
the second motion data is obtained by extracting the position of a limb joint from the second motion image based on a limb identification technology and according to the position change of the limb joint in each second motion image in the second video stream.
3. A fitness evaluation method based on a limb identification technology, which is characterized in that the method is applied to a second terminal on a coach side and comprises the following steps:
receiving first action data sent by a first terminal at a user side through a server;
The second terminal displays a limb model of the target joint containing a mark when the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value, wherein the mark of the target joint represents a position where the action of the user is not in place;
the first action data is obtained by extracting limb joint positions from first action images collected by the first terminal based on a limb identification technology and according to the limb joint position changes in each first action image in a first video stream containing the first action images, and the first action data is used by the first terminal to calculate a comparison value of the included angle at each corresponding joint in the first action data and the corresponding second action data based on the following first formula:

S_x = 1 - |AS_x - AC_x| / 180

wherein AS_x is the degree of the included angle at joint x in the first action data and AC_x is the degree of the included angle at joint x in the second action data, and to calculate the similarity between the first action data and the corresponding second action data based on the following second formula:

similarity = ( Σ_{x=a}^{a+j-1} S_x ) / j

wherein a is the number of the first joint of the limb and j is the total number of joints of the limb;

and is used by the first terminal, in the case that the similarity is smaller than the target similarity, to determine the target joint corresponding to the S_x exceeding the target value and mark the target joint to obtain a limb model, and to display the limb model of the target joint containing the mark in the case that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds the target value, wherein the second action data is received from the second terminal through the server; the limb identification technology is a limb identification technology adopting a deep learning model, the deep learning model being obtained by training a neural network model with action sample data, or the limb identification technology identifies the joint actions of the user by marking and identifying key points of the joints;
before the second terminal displays the limb model of the target joint containing the mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds the target value, the method further comprises:
calculating a comparison value of the included angle at each corresponding joint in the first action data and the corresponding second action data based on the following first formula:

$D_i = \left|\alpha_i - \beta_i\right|$

wherein $\alpha_i$ is the degree of the included angle at joint $i$ in the first action data, and $\beta_i$ is the degree of the included angle at joint $i$ in the second action data; and calculating the similarity between the first action data and the corresponding second action data based on the following second formula:

$S = 1 - \frac{1}{j}\sum_{i=a}^{j}\frac{\left|\alpha_i - \beta_i\right|}{180^{\circ}}$

wherein $a$ is the number of the first joint of the limb and $j$ represents the total number of joints of the limb; and

in the case that the similarity is smaller than the target similarity, marking the corresponding target joint at which the comparison value exceeds the target value to obtain the limb model.
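For illustration only: a sketch of how a terminal might display the limb model of claim 3 with the marked target joints highlighted. The skeleton topology, colors, and coordinates are assumptions, and OpenCV merely stands in for whatever display stack the terminals use.

```python
import cv2
import numpy as np

# A hedged sketch of rendering a limb model with out-of-position joints
# marked. Topology, colors, and coordinates are illustrative assumptions.

SKELETON = [("shoulder", "elbow"), ("elbow", "wrist")]

def draw_limb_model(joints, marked, size=(480, 640)):
    """Draw bones as lines and joints as circles; marked joints in red."""
    canvas = np.zeros((size[0], size[1], 3), dtype=np.uint8)
    for a, b in SKELETON:
        cv2.line(canvas, joints[a], joints[b], (200, 200, 200), 2)
    for name, (x, y) in joints.items():
        color = (0, 0, 255) if name in marked else (0, 255, 0)  # BGR
        cv2.circle(canvas, (x, y), 8, color, -1)
    return canvas

frame = draw_limb_model(
    {"shoulder": (200, 120), "elbow": (230, 240), "wrist": (320, 300)},
    marked={"elbow"},
)
cv2.imwrite("limb_model.png", frame)
```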
4. The fitness evaluation method based on the limb identification technology according to claim 3, wherein before the second terminal displays the limb model of the target joint containing the mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds the target value, the method further comprises:
Collecting a second action image and forming a second video stream containing the second action image;
Extracting limb joint positions from the second action images based on a limb identification technology, and obtaining second action data according to limb joint position changes in each second action image in the second video stream;
and sending the second video stream to the first terminal through the server.
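For illustration only: how a terminal might send action data through the server, as claims 1 to 4 require. The endpoint URL and the JSON message shape are assumptions; the claims only require that the server forwards data between the two terminals.

```python
import json
import urllib.request

# A hedged sketch of relaying action data through the server. The URL and
# payload layout are illustrative assumptions, not part of the patent.

def send_action_data(angles, frame_index,
                     server_url="http://server.example/relay"):
    """POST one frame's joint angles to the relay server."""
    payload = json.dumps({
        "frame": frame_index,
        "angles": angles,  # {joint_name: degrees}
    }).encode("utf-8")
    req = urllib.request.Request(
        server_url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example call (requires a reachable server):
# send_action_data({"elbow": 85.0, "knee": 173.0}, frame_index=42)
```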
5. A fitness evaluation system based on limb identification technology, comprising: a first terminal at the user side, a server and a second terminal at the coach side;
The first terminal is used for acquiring first action images, forming a first video stream containing the first action images, extracting limb joint positions from the first action images based on a limb identification technology, obtaining first action data according to the change of the limb joint positions in each first action image in the first video stream, sending the first action data to the server, and displaying a limb model of the target joint containing a mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value, wherein the second action data is received from the second terminal through the server; the limb identification technology is a limb identification technology adopting a deep learning model, wherein the deep learning model is obtained by training a neural network model using action sample data, or the limb identification technology identifies the joint actions of the user by marking and identifying key points of the joints;
the server is used for receiving the first action data from the first terminal and sending the first action data to the second terminal;
The second terminal is used for receiving the first action data from the server, and for displaying a limb model of the target joint containing a mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds the target value, wherein the mark of the target joint represents a position where the action of the user is not in place;
the first terminal and the second terminal are further configured to calculate a comparison value of the included angle at each corresponding joint in the first action data and the corresponding second action data based on the following first formula:

$D_i = \left|\alpha_i - \beta_i\right|$

wherein $\alpha_i$ is the degree of the included angle at joint $i$ in the first action data, and $\beta_i$ is the degree of the included angle at joint $i$ in the second action data; to calculate the similarity between the first action data and the corresponding second action data based on the following second formula:

$S = 1 - \frac{1}{j}\sum_{i=a}^{j}\frac{\left|\alpha_i - \beta_i\right|}{180^{\circ}}$

wherein $a$ is the number of the first joint of the limb and $j$ represents the total number of joints of the limb;

and, in the case that the similarity is smaller than the target similarity, to mark the corresponding target joint at which the comparison value exceeds the target value to obtain the limb model.
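For illustration only: a minimal sketch of the forwarding role the server plays in the claim-5 system, with the user terminal pushing first action data and the coach terminal polling for it. The in-memory queue and single-route layout are assumptions; the claims require only that the server receives the first action data from the first terminal and sends it to the second terminal.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# A hedged sketch of the relay server between the two terminals. Buffering
# in memory and polling via GET are illustrative assumptions.

PENDING = []  # action-data messages awaiting pickup by the coach terminal

class RelayHandler(BaseHTTPRequestHandler):
    def do_POST(self):  # user terminal pushes first action data
        length = int(self.headers.get("Content-Length", 0))
        PENDING.append(self.rfile.read(length))
        self.send_response(204)
        self.end_headers()

    def do_GET(self):  # coach terminal polls for forwarded data
        body = PENDING.pop(0) if PENDING else b"{}"
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To run the sketch:
# HTTPServer(("0.0.0.0", 8000), RelayHandler).serve_forever()
```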
6. A fitness evaluation device based on limb identification technology, characterized in that it is applied to a first terminal on a user side, comprising:
The first image acquisition module is used for acquiring a first action image and forming a first video stream containing the first action image;
The first data processing module is used for extracting limb joint positions from the first action images based on a limb identification technology, and obtaining first action data according to limb joint position changes in each first action image in the first video stream, wherein the limb identification technology is a limb identification technology adopting a deep learning model, the deep learning model is obtained by training a neural network model by using action sample data, or the limb identification technology is used for identifying joint actions of a user by marking and identifying key points of joints;
the first data sending module is used for sending the first action data to a second terminal on the coach side through a server, so that the second terminal calculates a comparison value of the included angle at each corresponding joint in the first action data and the corresponding second action data based on the following first formula:

$D_i = \left|\alpha_i - \beta_i\right|$

wherein $\alpha_i$ is the degree of the included angle at joint $i$ in the first action data, and $\beta_i$ is the degree of the included angle at joint $i$ in the second action data; and calculates the similarity between the first action data and the corresponding second action data based on the following second formula:

$S = 1 - \frac{1}{j}\sum_{i=a}^{j}\frac{\left|\alpha_i - \beta_i\right|}{180^{\circ}}$

wherein $a$ is the number of the first joint of the limb and $j$ represents the total number of joints of the limb;

and so that the second terminal, in the case that the similarity is smaller than the target similarity, marks the corresponding target joint at which the target value is exceeded so as to obtain a limb model, and displays the limb model of the target joint containing the mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds the target value;
The first image display module is used for displaying a limb model of the target joint containing a mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value, wherein the second action data is received from the second terminal through the server, and the mark of the target joint represents a position where the action of a user is not in place;
The first calculation module is used for, before the limb model of the target joint containing the mark is displayed under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds the target value, calculating a comparison value of the included angle at each corresponding joint in the first action data and the corresponding second action data based on the following first formula:

$D_i = \left|\alpha_i - \beta_i\right|$

wherein $\alpha_i$ is the degree of the included angle at joint $i$ in the first action data, and $\beta_i$ is the degree of the included angle at joint $i$ in the second action data; and calculating the similarity between the first action data and the corresponding second action data based on the following second formula:

$S = 1 - \frac{1}{j}\sum_{i=a}^{j}\frac{\left|\alpha_i - \beta_i\right|}{180^{\circ}}$

wherein $a$ is the number of the first joint of the limb and $j$ represents the total number of joints of the limb;

and for, in the case that the similarity is smaller than the target similarity, marking the corresponding target joint at which the comparison value exceeds the target value to obtain the limb model.
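For illustration only: what the first data processing module's limb identification step could look like with an off-the-shelf keypoint model. MediaPipe Pose stands in here for the patent's deep learning model; the claims require only a neural network trained on action sample data, or key-point marking and identification, and the three-joint subset is an assumption.

```python
import cv2
import mediapipe as mp

# A hedged sketch of limb-joint extraction with an off-the-shelf keypoint
# model. The joint subset and naming are illustrative assumptions.

mp_pose = mp.solutions.pose

def joint_positions(video_path):
    """Yield {joint_name: (x, y)} per frame of the first video stream."""
    cap = cv2.VideoCapture(video_path)
    with mp_pose.Pose(static_image_mode=False) as pose:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.pose_landmarks is None:
                continue  # no person detected in this frame
            lm = result.pose_landmarks.landmark
            yield {
                name: (lm[idx].x, lm[idx].y)
                for name, idx in (
                    ("shoulder", mp_pose.PoseLandmark.LEFT_SHOULDER),
                    ("elbow", mp_pose.PoseLandmark.LEFT_ELBOW),
                    ("wrist", mp_pose.PoseLandmark.LEFT_WRIST),
                )
            }
    cap.release()
```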
7. A fitness evaluation device based on a limb identification technology, characterized in that it is applied to a second terminal on the coach side, comprising:
The second receiving module is used for receiving first action data sent by the first terminal at the user side through the server;
the second image display module is used for displaying a limb model of the target joint containing the mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds a target value;
wherein the first action data is obtained by extracting limb joint positions from first action images acquired by the first terminal based on a limb identification technology, according to the change of the limb joint positions in each first action image in a first video stream containing the first action images, and the mark of the target joint represents a position where the action of the user is not in place; the first action data is used by the first terminal to calculate a comparison value of the included angle at each corresponding joint in the first action data and the corresponding second action data based on the following first formula:

$D_i = \left|\alpha_i - \beta_i\right|$

wherein $\alpha_i$ is the degree of the included angle at joint $i$ in the first action data, and $\beta_i$ is the degree of the included angle at joint $i$ in the second action data; and to calculate the similarity between the first action data and the corresponding second action data based on the following second formula:

$S = 1 - \frac{1}{j}\sum_{i=a}^{j}\frac{\left|\alpha_i - \beta_i\right|}{180^{\circ}}$

wherein $a$ is the number of the first joint of the limb and $j$ represents the total number of joints of the limb;

and the first action data is further used by the first terminal, in the case that the similarity is smaller than the target similarity, to mark the corresponding target joint at which the target value is exceeded so as to obtain a limb model, and to display the limb model of the target joint containing the mark under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds the target value, wherein the second action data is received from the second terminal through the server; the limb identification technology is a limb identification technology adopting a deep learning model, wherein the deep learning model is obtained by training a neural network model using action sample data, or the limb identification technology identifies the joint actions of the user by marking and identifying key points of the joints;
The second calculation module is used for, before the limb model of the target joint containing the mark is displayed under the condition that the comparison value of the included angle at the corresponding target joint in the first action data and the corresponding second action data exceeds the target value, calculating a comparison value of the included angle at each corresponding joint in the first action data and the corresponding second action data based on the following first formula:

$D_i = \left|\alpha_i - \beta_i\right|$

wherein $\alpha_i$ is the degree of the included angle at joint $i$ in the first action data, and $\beta_i$ is the degree of the included angle at joint $i$ in the second action data; and calculating the similarity between the first action data and the corresponding second action data based on the following second formula:

$S = 1 - \frac{1}{j}\sum_{i=a}^{j}\frac{\left|\alpha_i - \beta_i\right|}{180^{\circ}}$

wherein $a$ is the number of the first joint of the limb and $j$ represents the total number of joints of the limb;

and for, in the case that the similarity is smaller than the target similarity, marking the corresponding target joint at which the comparison value exceeds the target value to obtain the limb model.
8. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the program, implements the steps of the fitness evaluation method based on the limb identification technology according to any one of claims 1 to 4.
9. A non-transitory computer-readable storage medium having stored thereon a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the fitness evaluation method based on the limb identification technology according to any one of claims 1 to 4.
CN202110862266.7A 2021-07-29 2021-07-29 Body building assessment method and system based on limb identification technology Active CN113569743B (en)

Priority Applications (1)

Application Number: CN202110862266.7A (CN113569743B); Priority Date: 2021-07-29; Filing Date: 2021-07-29; Title: Body building assessment method and system based on limb identification technology


Publications (2)

Publication Number Publication Date
CN113569743A CN113569743A (en) 2021-10-29
CN113569743B (en) 2024-06-18

Family

ID=78168928

Family Applications (1)

Application Number: CN202110862266.7A (CN113569743B, Active); Priority Date: 2021-07-29; Filing Date: 2021-07-29; Title: Body building assessment method and system based on limb identification technology

Country Status (1)

Country Link
CN (1) CN113569743B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109102857A (en) * 2018-05-31 2018-12-28 杭州同绘科技有限公司 A kind of intelligence limb rehabilitation training system and method
KR20200074609A (en) * 2018-12-17 2020-06-25 이화여자대학교 산학협력단 Supporting method and system for home fitness

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101711488B1 (en) * 2015-01-28 2017-03-03 한국전자통신연구원 Method and System for Motion Based Interactive Service
KR20180058139A (en) * 2016-11-23 2018-05-31 한국기술교육대학교 산학협력단 Smart health service system and smart health service method
KR101970687B1 (en) * 2018-04-11 2019-04-19 주식회사 큐랩 Fitness coaching system using personalized augmented reality technology
KR20200081629A (en) * 2018-12-27 2020-07-08 이진욱 Dance evaluation device using joint angle comparison and the method thereof
CN109840478B (en) * 2019-01-04 2021-07-02 广东智媒云图科技股份有限公司 Action evaluation method and device, mobile terminal and readable storage medium
CA3146658A1 (en) * 2019-07-11 2021-01-14 Elo Labs, Inc. Interactive personal training system
CN110718280A (en) * 2019-09-26 2020-01-21 北京金山安全软件有限公司 Fitness action accuracy determining method and device, electronic equipment and storage medium
CN112933579B (en) * 2019-12-11 2023-03-31 中移(苏州)软件技术有限公司 Motion quality evaluation method and device and storage medium
CN112364818A (en) * 2020-11-27 2021-02-12 Oppo广东移动通信有限公司 Action correcting method and device, electronic equipment and storage medium



Similar Documents

Publication Publication Date Title
CN110448870B (en) Human body posture training method
CN110945522B (en) Learning state judging method and device and intelligent robot
CN112184705A (en) Human body acupuncture point identification, positioning and application system based on computer vision technology
CN109840478B (en) Action evaluation method and device, mobile terminal and readable storage medium
CN112862639B (en) Education method of online education platform based on big data analysis
CN113516064A (en) Method, device, equipment and storage medium for judging sports motion
CN113569743B (en) Body building assessment method and system based on limb identification technology
CN116168346B (en) Remote accompanying-reading monitoring system based on student behavior capturing
KR20220058790A (en) Exercise posture analysis method using dual thermal imaging camera, guide method for posture correction and computer program implementing the same method
CN116805433A (en) Human motion trail data analysis system
CN114639168B (en) Method and system for recognizing running gesture
CN115761873A (en) Shoulder rehabilitation movement duration evaluation method based on gesture and posture comprehensive visual recognition
CN113392744A (en) Dance motion aesthetic feeling confirmation method and device, electronic equipment and storage medium
CN115568823A (en) Method, system and device for evaluating human body balance ability
CN113657337A (en) Method, device, equipment and medium for detecting expression of crew member
CN114360052A (en) Intelligent somatosensory coach system based on AlphaPose and joint point angle matching algorithm
CN117423166B (en) Motion recognition method and system according to human body posture image data
CN114419711B (en) Concentration degree identification method based on AI (artificial intelligence) education system
CN116012938B (en) Construction method and system of CPR automatic feedback detection model based on AlphaPose algorithm
CN110472476B (en) Gesture matching degree acquisition method, device, computer and storage medium
CN116958205A (en) Dynamic tracking method for human body dance
CN116510271A (en) Intelligent auxiliary training and assessment system for Taiji boxing
CN116824459B (en) Intelligent monitoring and evaluating method, system and storage medium for real-time examination
US20230306616A1 (en) Device and method for capturing and analyzing a motion of a user
CN116726458A (en) Data processing method, device, enhancement implementation equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant