CN113808446B - Fitness course interaction method and related device


Info

Publication number
CN113808446B
Authority
CN
China
Prior art keywords
user
action
sub
electronic device
fitness
Prior art date
Legal status
Active
Application number
CN202010531077.7A
Other languages
Chinese (zh)
Other versions
CN113808446A
Inventor
郁心迪
姜永航
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202010531077.7A
Publication of CN113808446A
Application granted
Publication of CN113808446B
Legal status: Active

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00: Electrically-operated educational appliances
    • G09B5/08: Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations
    • G09B5/14: Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations with provision for individual teacher-student communication
    • G09B5/06: Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • G09B5/065: Combinations of audio and video presentations, e.g. videotapes, videodiscs, television systems

Abstract

Disclosed is a fitness course interaction method, comprising: a first electronic device displays a first interface, where the first interface includes a first area and a second area, the first area is used for displaying a fitness course, and the second area is used for displaying an image containing a first user; the first interface includes a first identifier for indicating the exercise effect of the first user in the fitness course, where the exercise effect of the first user in the fitness course is determined by the first electronic device according to the motion data of the first user in the fitness course. The embodiments of the present application can feed back the exercise effect achieved by the user in real time, improving the user experience.

Description

Fitness course interaction method and related device
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a fitness course interaction method and a related device.
Background
As living standards improve, people pay increasing attention to physical fitness, and exercise has become part of everyday life. According to personal needs, a user can select from a variety of fitness courses in fitness software and exercise along with them on a mobile phone, computer, or television. A fitness course typically consists of one or more sets of actions.
At present, while a user exercises along with a fitness course, fitness software cannot feed back the exercise effect the user has already achieved in real time, so the user cannot adjust the training state accordingly, for example by improving action completion or increasing the training frequency of dynamic aerobic actions when the achieved effect is below the standard level. Existing fitness software generally generates and feeds back the user's exercise effect only after the fitness course ends, based on the motion data collected during the course. As a result, the user may finish the current fitness course without achieving the desired exercise effect.
In summary, the existing fitness methods cannot feed back the exercise effect achieved by the user in real time.
Disclosure of Invention
The embodiment of the application provides a fitness course interaction method and a related device, which can feed back the exercise effect achieved by a user in real time and improve the user experience.
In a first aspect, the present application provides a fitness course interaction method, where the method is applied to a first electronic device, and includes:
the first electronic device displays a first interface, where the first interface includes a first area and a second area, the first area is used for displaying the fitness course, and the second area is used for displaying an image containing a first user; the first interface includes a first identifier for indicating the exercise effect of the first user in the fitness course, where the exercise effect of the first user in the fitness course is determined by the first electronic device according to the motion data of the first user in the fitness course.
According to the embodiments of the application, the first electronic device can reflect, in real time through the first identifier, the exercise effect the user has already achieved in the current fitness course, so the user can adjust his or her exercise state according to the achieved effect and thereby obtain a better exercise effect. In addition, the first electronic device displays the user's fitness image in real time in the second area of the display screen, so that the user can clearly observe his or her body posture in real time, which makes it convenient for the user to adjust actions and brings a better visual experience.
In this application, the first electronic device may display the first area and the second area side by side on the display screen, with no overlap between them. The first electronic device may also display the first area (or the second area) full screen and display the second area (or the first area) floating over the display screen. Besides the above display modes, the first area and the second area may be displayed in other ways, which are not specifically limited herein.
In one implementation, the fitness course comprises one or more actions, and each of the one or more actions comprises one or more identical sub-actions.
In one implementation, at a first time point, the first identifier is specifically used to indicate the exercise effect of the first user performing a first action set; the first action set comprises part or all of a second action set, and the second action set consists of the one or more sub-actions from the starting time point of the fitness course to the first time point; the exercise effect of the first user performing the first action set is determined by the first electronic device based on the motion data of the first user performing the first action set. The first time point may be any time point in the playing process of the fitness course.
In one implementation, the first identifier is specifically used to indicate an exercise effect of the sub-action currently performed by the first user. In this way, the user can observe the exercise effect of each sub-action performed by the user.
In this application, the first identifier may be displayed in one or more of voice, text, picture, and animation forms, which are not specifically limited herein. For example, the first identifier may show the exercise effect the user has achieved through a progress bar, a percentage, a score, or a picture color.
In one implementation, the first identifier is a first progress bar, the length of a first portion of the first progress bar is used to indicate the exercise effect of the first user performing the first set of actions, and the total length of the first progress bar is used to indicate the exercise effect of the person in the fitness course performing all actions of the fitness course.
In one implementation, the first interface further includes a second identifier, and the second identifier is used for indicating the exercise effect of the person in the fitness course performing the second set of actions.
In one implementation, the second identifier may be a second progress bar, the length of a second portion of the second progress bar indicating the exercise effect of the person in the fitness course performing the second set of actions, and the total length of the second progress bar indicating the exercise effect of the person performing all actions of the fitness course.
In this way, through the difference between the first identifier and the second identifier, the user can more clearly know the exercise effect currently achieved.
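As an illustrative sketch only (the patent does not prescribe any concrete data model), the two identifiers could be rendered as progress bars driven by cumulative effect scores; the 0 to 100 score scale, the function names, and the text rendering below are assumptions for illustration.

```python
# Illustrative sketch, not the patented implementation: rendering the first
# identifier (user) and second identifier (person in the fitness course) as
# progress bars. The 0-100 score scale and all names are assumptions.

def progress_fraction(achieved_score: float, full_course_score: float) -> float:
    """Fraction of the bar to fill: achieved effect versus the effect of
    performing all actions of the fitness course to standard."""
    if full_course_score <= 0:
        return 0.0
    return min(max(achieved_score / full_course_score, 0.0), 1.0)

def render_bar(fraction: float, width: int = 30) -> str:
    filled = round(fraction * width)
    return "[" + "#" * filled + "-" * (width - filled) + "]"

# First identifier: the first user's effect for the first action set so far.
# Second identifier: the trainer's effect for the second action set so far.
print("user :", render_bar(progress_fraction(42.0, 100.0)))
print("coach:", render_bar(progress_fraction(55.0, 100.0)))
```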
In one implementation, the image of the first user is captured by the first electronic device through a camera.
In one implementation, the image of the first user may also be a virtual portrait generated by the first electronic device through motion data of the first user acquired by the wearable device in real time.
In one implementation, the exercise data of the first user during the workout is determined based on images of the first user.
In one implementation, the motion data of the first user in the fitness course may include: position information of each joint of the first user when the first user performs the actions in the fitness course. In one implementation, the motion data of the first user in the fitness course may further include one or more of: the moving direction of each joint point of the user at a specific position in a specified action of the fitness course, the moving distance of a specific joint point at a specific position, and the relative position relationship of a plurality of specific joint points.
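For illustration only, a per-frame motion-data record of the kind described above might be represented as follows; the joint names, coordinate convention, and timestamp field are assumptions, not a structure defined by this application.

```python
# Hedged sketch of a per-frame motion-data record. Joint names, the (x, y, z)
# convention, and the timestamp field are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class MotionSample:
    timestamp_ms: int
    # Joint point name -> position relative to a reference node (e.g. head point).
    joints: dict[str, tuple[float, float, float]] = field(default_factory=dict)

sample = MotionSample(
    timestamp_ms=5000,
    joints={"head": (0.0, 0.0, 0.0), "left_knee": (-0.2, -0.9, 0.1)},
)
print(sample.joints["left_knee"])
```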
In one implementation, the exercise effect of the first user performing the first set of actions is determined by the first electronic device based on the motion data of the first user performing the first set of actions and the motion data of the person performing the second set of actions during the workout session.
In one implementation, the first action set includes a first sub-action, the exercise effect indicated by the first identifier is a first value before the first sub-action is performed by the first user, and the exercise effect indicated by the first identifier is a second value after the first sub-action is performed by the first user.
In one implementation, the first sub-action is a sub-action of a first action of a fitness class. The exercise effect of the first user performing the first sub-action is determined based on the motion data of the first user performing the first sub-action and the motion data of the character performing the first sub-action in the workout session. The exercise effect of the first sub-action may be used to adjust the first indicator from a first value to a second value.
In one implementation, the exercise effect of the first sub-action is determined according to the standard degree of the first sub-action, and the standard degree of the first sub-action is determined according to the motion data of the first user performing the first sub-action and the motion data of the person performing the first sub-action in the fitness class.
In one implementation, the exercise effect of the first sub-action is determined according to the standard degree of the first sub-action and the completion time of the first sub-action.
In one implementation manner, when the completion time of the first sub-action is outside the preset time range of the first sub-action, the second value is smaller than the first value, and the farther the completion time of the first sub-action is away from the preset time range, the larger the difference between the second value and the first value is; or when the completion time of the first sub-action is out of the preset time range, the second value is equal to the first value; or when the completion time of the first sub-action is out of the preset time range, the second value is larger than the first value, and the farther the completion time of the first sub-action is away from the preset time range, the smaller the difference between the second value and the first value is.
In one implementation, the second value is greater than or equal to the first value, and the higher the standard degree of the first sub-action is, the greater the difference between the second value and the first value is; or when the standard degree of the first sub-action is higher than or equal to a first preset standard degree, the second value is larger than the first value, and the higher the standard degree of the first sub-action is, the larger the difference between the second value and the first value is; when the standard degree of the first sub-action is lower than a first preset standard degree, the second value is equal to the first value; or when the standard degree of the first sub-action is higher than or equal to a second preset standard degree, the second value is larger than or equal to the first value, and the higher the standard degree of the first sub-action is, the larger the difference between the second value and the first value is; when the standard degree of the first sub-action is lower than the second preset standard degree, the second value is smaller than the first value, and the lower the standard degree of the first sub-action is, the larger the difference value between the second value and the first value is.
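A minimal sketch of the second of the variants above (the identifier value increases only when the standard degree reaches a first preset standard degree); the threshold, the gain, and the 0-to-1 standard-degree scale are assumed for illustration.

```python
# Hedged sketch of one scoring variant described above. Thresholds, the gain,
# and the 0-1 standard-degree scale are illustrative assumptions.

def updated_value(first_value: float,
                  standard_degree: float,
                  preset_degree: float = 0.6,
                  max_gain: float = 5.0) -> float:
    """Return the second value shown by the first identifier after a sub-action."""
    if standard_degree < preset_degree:
        return first_value  # below the preset standard degree: unchanged
    # The higher the standard degree, the larger the increase.
    gain = max_gain * (standard_degree - preset_degree) / (1.0 - preset_degree)
    return first_value + gain

print(updated_value(40.0, 0.9))  # well-performed sub-action -> larger increase
print(updated_value(40.0, 0.5))  # below the preset degree -> value unchanged
```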
In one implementation, the exercise effect of the first sub-action is determined according to the energy consumption of the first sub-action; the second value is greater than or equal to the first value, the greater the energy consumption of the first sub-action, the greater the difference between the second value and the first value; the energy consumption of the first sub-action is determined based on the height of the first user, the weight of the first user, the time of completion of the first sub-action, the motion data of the first user performing the first sub-action, and the motion data of the person in the fitness class performing the first sub-action.
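The application lists the inputs of the energy estimate (height, weight, completion time, and the user's and trainer's motion data) without giving a formula; the toy estimate below, including all constants, is purely an assumed placeholder showing how those inputs could combine.

```python
# Toy sketch only: the patent names the inputs but no formula. The constants
# and the combination below are assumptions for illustration.

def sub_action_energy_kcal(weight_kg: float,
                           height_m: float,
                           completion_s: float,
                           user_travel_m: float,
                           standard_travel_m: float,
                           base_kcal_per_kg_s: float = 0.0008) -> float:
    # How completely the user covered the standard joint travel distance.
    completeness = min(user_travel_m / standard_travel_m, 1.0)
    # Taller and heavier users expend more energy for the same sub-action.
    body_factor = weight_kg * (height_m / 1.7)
    return base_kcal_per_kg_s * body_factor * completion_s * completeness

print(round(sub_action_energy_kcal(65, 1.75, 2.0, 0.48, 0.60), 3))
```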
In one implementation, the fitness course comprises a plurality of sub-courses, each of the plurality of sub-courses comprises one or more consecutive actions of the fitness course, the first action in the ith sub-course of the plurality of sub-courses is the action following the last action in the (i-1)th sub-course, and i is a positive integer greater than or equal to 2; after the person in the fitness course displayed in the first area performs the last action in the (i-1)th sub-course, the first progress bar further comprises a completion identifier of the (i-1)th sub-course for the first user, where the completion identifier is used for indicating the exercise effect of the first user performing a third action set, and the third action set comprises all or part of the sub-actions of the first i-1 sub-courses of the fitness course.
Thus, by comparing the exercise effect of the first user in each sub-course with the standard exercise effect of that sub-course, the user can clearly understand the exercise effect achieved in each sub-course. Furthermore, the first user can identify the sub-course in which his or her performance is weak and later perform targeted training for that sub-course.
In one implementation, the first interface further includes a third area, where the third area is used to display an image including a second user; the first interface further comprises a third identifier, wherein the third identifier is used for indicating the exercise effect of the second user in the fitness course, and the exercise effect indicated by the third identifier is determined according to the movement data of the second user in the fitness course.
In one implementation, the second area and the third area are the same area, that is, the images containing the first user and the second user are displayed in the same area.
In one implementation, at the first point in time, the third identifier is specifically for instructing the second user to perform an exercise effect of the fourth set of actions; wherein the fourth action set includes some or all of the second action set.
In one implementation, the image of the second user is acquired by the first electronic device through a camera; or the image of the second user is acquired by the second electronic device through the camera and sent to the first electronic device.
In one implementation, the exercise data of the second user in the fitness course is determined by the first electronic device through an image of the second user; or the motion data of the second user in the fitness course is determined by the second electronic device through the image of the second user and is sent to the first electronic device.
In a second aspect, an embodiment of the present application provides an electronic device, including: a display screen, one or more processors, and one or more memories; the one or more memories are for storing computer program code, the computer program code including computer instructions; the computer instructions, when executed on the processor, cause the electronic device to perform the method of fitness course interaction in any one of the possible implementations of the first aspect.
In a third aspect, an embodiment of the present application provides a computer storage medium, including computer instructions, which, when run on an electronic device, cause the electronic device to perform the fitness course interaction method in any one of the possible implementations of the first aspect.
In a fourth aspect, the present application provides a computer program product, which when run on a computer, causes the computer to execute the fitness course interaction method in any one of the possible implementations of the first aspect.
Drawings
FIG. 1 is a schematic diagram of joint points according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a moving direction provided by an embodiment of the present application;
FIGS. 3 and 4 are schematic diagrams of a user interface provided by an embodiment of the present application;
FIGS. 5A to 5D are schematic diagrams of a training interface provided by an embodiment of the present application;
FIGS. 6A to 6D are schematic diagrams of a training interface provided by an embodiment of the present application;
FIGS. 7A to 7C are schematic diagrams of a training interface for action guidance provided by an embodiment of the present application;
FIGS. 8A to 8E are schematic diagrams of a training interface with a completion-degree progress bar provided by an embodiment of the present application;
FIG. 8F is a schematic diagram of a motion trajectory provided by an embodiment of the present application;
FIGS. 9A to 9E are schematic diagrams of a training interface with an effect progress bar provided by an embodiment of the present application;
FIGS. 10A to 10C are schematic diagrams of a plurality of sub-courses provided by an embodiment of the present application;
FIGS. 11A to 11H are schematic diagrams of muscle exercise intensity provided by an embodiment of the present application;
FIGS. 12A to 12E are schematic diagrams related to user identification for multi-user shared fitness provided by an embodiment of the present application;
FIGS. 13A to 13E are schematic diagrams of a training interface for multi-user shared fitness provided by an embodiment of the present application;
FIGS. 14A to 14C are schematic diagrams of a fitness report provided by an embodiment of the present application;
FIGS. 15A and 15B are schematic diagrams of time matching provided by an embodiment of the present application;
FIG. 16 is a flowchart of a fitness course interaction method provided by an embodiment of the present application;
FIG. 17 is a schematic structural diagram of an electronic device provided by an embodiment of the present application;
FIG. 18 is a schematic diagram of a software structure provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate three cases: A alone, both A and B, and B alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
Hereinafter, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of this application, "a plurality" means two or more unless indicated otherwise.
At present, people pay more and more attention to physical exercise, and exercising at home along with fitness courses through electronic devices such as mobile phones or televisions has gradually become a popular way to keep fit.
The electronic device 100 according to the embodiment of the present application may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a dedicated media player, an AR (augmented reality)/VR (virtual reality) device, or other types of electronic devices. In the embodiment of the present application, the electronic device 100 may also be other large-screen devices including a display screen, such as a television. The large-screen device can also be other devices in a human-computer interaction scene, such as a motion sensing game machine and the like. The embodiment of the present application does not limit the specific category of the large-screen device.
Existing fitness software usually generates and feeds back the user's exercise effect, for example the completion percentage of the whole course, only after the user finishes the fitness course, based on the user's motion data collected during the course. Since the achieved exercise effect cannot be known in real time during exercise, the user may not achieve the desired exercise effect even after completing the current fitness course.
To solve the above problem, an embodiment of the present application provides a fitness course interaction method. The electronic device 100 may evaluate the exercise effect of each action of the user according to the user's motion data collected in real time and the standard motion data of the fitness course, and update the exercise effect the user has achieved in the fitness course according to the exercise effect of each action. The electronic device 100 may present the exercise effect the user has achieved in the current fitness course through an effect progress bar. The user can clearly know the currently achieved exercise effect and then adjust his or her exercise state accordingly. For example, when the achieved exercise effect is below the standard level, the user may improve the completion of the actions and speed up the training frequency of aerobic actions.
The following describes in detail concepts related to the fitness course, the standard exercise data of the fitness course, the exercise data of the user, and the like according to the embodiment of the present application.
(1) Fitness course
A fitness course typically includes a plurality of actions; there may be a preset rest time between two consecutive actions, and any two actions may be the same or different. A fitness course can be recommended by the electronic device according to the user's historical fitness data, or selected by the user according to actual needs. A fitness course can be played locally or online, which is not specifically limited herein.
In some embodiments, each action of the above fitness course may consist of a sub-action that is repeated one or more times. When an action in a fitness course includes only one sub-action, the action is identical to its sub-action. For example, the ith action of the fitness course is "knee lift", and the ith action includes 5 knee lifts; that is, the ith action includes a sub-action repeated 5 times, and the sub-action is one knee lift. As another example, if the ith action includes only 1 knee lift, the ith action includes only 1 sub-action.
In some embodiments, a fitness course may also include a plurality of sub-courses, and each sub-course may include one or more consecutive actions of the fitness course. The plurality of sub-courses may be divided according to exercise type, exercise purpose, exercised body part, and the like, which are not specifically limited herein.
For example, a fitness session includes three sub-sessions. Wherein the first sub-course is a warm-up exercise, the second sub-course is a formal exercise, the third sub-course is a stretching exercise, and any one of the three sub-courses includes one or more continuous actions.
In the embodiment of the present application, the fitness class may include one or more types of contents in the forms of video, animation, voice, text, and the like, which are not limited in this embodiment.
(2) Standard motion data
In some embodiments of the present application, the standard motion data of the fitness course may include standard position information of the trainer's joint points in each action, corresponding to the playing progress of the fitness course.
Illustratively, as shown in FIG. 1, the human joint points may include: a head point, a neck point, a right shoulder point, a left shoulder point, a right elbow point, a left elbow point, a right hand point, a left hand point, a right hip point, a left hip point, a right knee point, a left knee point, a right foot point, and a left foot point. The embodiments of the present application are not limited to the above joint points and may also include other joint points, which are not specifically limited herein.
In the embodiments of the present application, the position information of each joint point may be expressed relative to one of the joint points used as a reference node. For example, with the head point as the reference node, the position information of the head point may be the coordinates (0, 0), and the position information of the other joint points is then determined according to their positions relative to the head point.
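A brief sketch of this reference-node convention (the joint names and raw coordinates are assumed for illustration):

```python
# Hedged sketch: expressing joint-point positions relative to a reference
# node, here the head point. Joint names and raw coordinates are assumptions.

def to_reference_frame(joints: dict[str, tuple[float, float]],
                       reference: str = "head") -> dict[str, tuple[float, float]]:
    rx, ry = joints[reference]
    return {name: (x - rx, y - ry) for name, (x, y) in joints.items()}

raw = {"head": (320.0, 80.0), "left_knee": (280.0, 400.0)}
rel = to_reference_frame(raw)
print(rel["head"])       # (0.0, 0.0): the reference node
print(rel["left_knee"])  # position relative to the head point
```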
In some embodiments, the standard movement trace of the joints of the trainer in each motion corresponding to the fitness course playing process can be obtained according to the change of the standard position information of the joints of the trainer in each motion.
It is understood that the motion data may include position information of the three-dimensional space of each joint point, and may also include time information corresponding to the position information.
In some embodiments of the present application, the standard motion data of the fitness course may further include a moving direction of each joint point at a specific position in each motion corresponding to the fitness course playing process, and may further include a moving distance of the specific joint point at the specific position in each motion corresponding to the fitness course playing process.
The moving direction of the joint point can be represented by an included angle between the motion track of the joint point and the reference line. The reference line may be a predetermined reference line such as a gravity vertical line, a horizontal line, etc. The moving distance of the specific joint point at the specific position may refer to a distance that the specific joint point moves in a specific moving direction with the specific position as a starting point in one motion.
Illustratively, the user shown in FIG. 2 is performing a "hold head and raise knee left" action, and the motion trajectory of the user's left knee point includes the trajectory shown in FIG. 2. The tangent line shown in FIG. 2 is the tangent of the left knee point's trajectory at the position of the left knee point when the user starts to lift the knee, and the reference line may be a horizontal line. The moving direction of the left knee point when the user begins to lift the knee can be characterized by the angle between the tangent and the reference line shown in FIG. 2.
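As a sketch of this characterization, the moving direction can be approximated from two consecutive trajectory samples as the angle between the local tangent and a horizontal reference line; the sample coordinates below are assumptions.

```python
# Hedged sketch: approximating a joint point's moving direction as the angle
# between the local trajectory tangent and a horizontal reference line,
# using two consecutive samples. Coordinates (y increasing upward) are assumed.
import math

def direction_deg(p0: tuple[float, float], p1: tuple[float, float]) -> float:
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    return math.degrees(math.atan2(dy, dx))  # angle versus the horizontal line

# Left knee point at the start of a knee raise: moving up and to the left.
print(round(direction_deg((0.10, 0.00), (0.07, 0.06)), 1))  # about 116.6
```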
In some embodiments, the moving direction of each joint point at the specific position and the moving distance of the specific joint point at the specific position may be indirectly analyzed by the electronic device according to the position information (or the motion track) of each joint point in the standard motion data.
In some embodiments of the present application, the standard motion data of each action in the fitness course may also include the relative positions of a plurality of joint points. For example, one action requires the user to raise both arms flat. After the two arms are raised flat, the heights of any two of the right shoulder point, the left shoulder point, the right elbow point, the left elbow point, the right hand point, and the left hand point are consistent, that is, their longitudinal positions are consistent.
In the embodiments of the present application, saying that the longitudinal positions (or horizontal positions) of two joint points are consistent may mean that the difference between the longitudinal positions (or horizontal positions) of the two joint points is smaller than a preset threshold.
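This consistency test reduces to a simple threshold comparison, sketched below with an assumed threshold value:

```python
# Hedged sketch of the consistency test above: two joint points are treated
# as level when their longitudinal positions differ by less than a preset
# threshold. The threshold value is an assumption.

def positions_consistent(y_a: float, y_b: float, threshold: float = 0.05) -> bool:
    return abs(y_a - y_b) < threshold

# Arms raised flat: shoulder, elbow, and hand points should all be level.
print(positions_consistent(-0.31, -0.29))  # True: within the threshold
```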
In some embodiments of the present application, the standard motion data actually used may be generated according to the standard motion data of the trainer in the fitness course and the position information of each joint of the user. It can be understood that, because the user's body type may differ from the trainer's, the user's motion data cannot be directly compared with the trainer's standard motion data. The electronic device 100 can process each item of data of each joint in the trainer's standard motion data according to a comparison between the user's body type and the trainer's body type, so as to generate standard motion data suitable for the user.
In some embodiments of the present application, the electronic device 100 determines the length ratio between each body part of the user and the corresponding body part of the trainer according to the position information of each joint of the user and the trainer, and scales the standard motion data of the joints related to a body part according to the length ratio of that body part, so as to generate standard motion data suitable for the user.
For example, based on the position information of the joints of the user and the trainer, the electronic device 100 determines that the thigh length ratio of the user to the trainer is 4:5. In a leg-raising action, the lift distance of the left knee point or the right knee point in the trainer's standard motion data is 60 centimeters (cm). The electronic device 100 therefore determines that the standard lift distance of the user's left knee point or right knee point in the leg-raising action is four-fifths of 60 cm, that is, 48 cm.
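The scaling step can be sketched as below; the function name and the sample thigh lengths (40 cm and 50 cm, reproducing the 4:5 ratio) are illustrative assumptions.

```python
# Hedged sketch of the body-proportion scaling described above: a trainer-side
# standard distance is scaled by the user-to-trainer length ratio of the
# related body part. Names and sample lengths are assumptions.

def scale_standard_distance(trainer_distance_cm: float,
                            user_part_length_cm: float,
                            trainer_part_length_cm: float) -> float:
    ratio = user_part_length_cm / trainer_part_length_cm
    return trainer_distance_cm * ratio

# Thigh length ratio 4:5 -> a 60 cm standard lift distance becomes 48 cm.
print(scale_standard_distance(60.0, 40.0, 50.0))  # 48.0
```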
The standard motion data may be generated in advance based on the specific content of each of the trainer's actions in the fitness course. In some embodiments, the electronic device receives a play operation of the user; in response to the play operation, the electronic device 100 plays the fitness course and may directly obtain the standard motion data corresponding to the fitness course locally or online. Alternatively, the standard motion data may be generated according to the specific content of the fitness course after the electronic device 100 determines the fitness course selected by the user. In some embodiments, the electronic device receives a play operation of the user; in response to the play operation, the electronic device 100 plays the fitness course and generates the standard motion data corresponding to each action in the fitness course through image processing. This is not specifically limited in the embodiments of the present application.
(3) User's motion data
In some embodiments of the present application, the motion data of the user may include position information of the respective joint points in each action of the user. According to the change of the position information of each joint point, the motion trail of each joint point in each action of the user can be acquired.
In some embodiments of the present application, the motion data of the user may further include a moving direction of each joint point in each motion of the user at a specific position, a moving distance of the specific joint point in each motion of the user at the specific position, and a relative position relationship of a plurality of specific joint points in each motion of the user, corresponding to the standard motion data.
In this embodiment, the electronic device 100 may acquire the motion data of the user through a camera of the electronic device 100, and may also acquire the motion data of the user through a wearable device of the user. These two implementations are specifically described below.
(1) The electronic device 100 obtains the motion data of the user through the camera.
Specifically, the electronic device 100 acquires a fitness video of the user through a camera, periodically performs image recognition on frame images in the fitness video, and recognizes position information of each joint of the user in the frame images, thereby acquiring a motion track of each joint in real time.
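A self-contained sketch of this periodic recognition loop is given below. OpenCV is used only as a convenient frame source, and estimate_joints is a hypothetical stand-in for whatever pose-estimation model the device actually runs; neither is named by this application.

```python
# Hedged sketch of periodic image recognition on frames of the fitness video.
# `estimate_joints` is a hypothetical stand-in for the device's pose model,
# stubbed here so the sketch stays self-contained and runnable.
import cv2  # pip install opencv-python

def estimate_joints(frame) -> dict[str, tuple[float, float]]:
    """Hypothetical pose estimator: frame -> {joint point name: (x, y)}."""
    return {"head": (320.0, 80.0), "left_knee": (280.0, 400.0)}  # stub

def sample_trajectories(video_path: str, every_n_frames: int = 5):
    trajectories: dict[str, list[tuple[float, float]]] = {}
    cap = cv2.VideoCapture(video_path)
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % every_n_frames == 0:  # periodic recognition, not every frame
            for joint, pos in estimate_joints(frame).items():
                trajectories.setdefault(joint, []).append(pos)
        index += 1
    cap.release()
    return trajectories  # per-joint motion tracks accumulated in real time
```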
In some embodiments of the present application, the electronic device 100 obtains the motion data of the target user through a camera.
In some embodiments, the electronic device stores a facial image of the target user, and the target user inputs personal information before playing the fitness course. When the electronic device 100 needs to collect the user's motion data, it acquires the facial image of the target user corresponding to the personal information. The electronic device 100 then detects, within the shot picture, the user matching the facial image of the target user and collects that user's motion data.
For example, the personal information may be a user account. Before playing the fitness course, the user logs in to a personal account of the fitness app; after the login succeeds, the electronic device 100 determines that the user corresponding to the account is the target user and obtains the facial image of the target user locally or online.
It is understood that, when the electronic device 100 detects that a frame image captured by the camera includes one or more faces, the electronic device 100 may match the one or more faces in the frame image with the facial image of the target user. When the matching degree between a first face among the one or more faces and the facial image of the target user exceeds a preset threshold, the user corresponding to the first face is determined to be the target user.
In this way, the electronic device 100 can accurately acquire the motion data of the target user, avoiding inaccurate motion data when users other than the target user appear within the camera's shooting range.
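The target-user selection can be sketched as a best-match search over detected faces; face_similarity stands in for an unspecified face matcher, and the 0-to-1 similarity scale and threshold are assumptions.

```python
# Hedged sketch of target-user matching. `face_similarity` is a hypothetical
# stand-in for the device's face matcher; the similarity scale and the
# preset threshold are illustrative assumptions.

def face_similarity(face_a, face_b) -> float:
    """Hypothetical matcher returning a similarity in [0, 1]."""
    return 0.0  # stub

def find_target_user(detected_faces: list, target_face, threshold: float = 0.8):
    best_index, best_score = None, threshold
    for i, face in enumerate(detected_faces):
        score = face_similarity(face, target_face)
        if score >= best_score:  # matching degree exceeds the preset threshold
            best_index, best_score = i, score
    return best_index  # None if no detected face matches the target user
```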
(2) The electronic device 100 obtains the motion data of the user through the wearable device.
Specifically, the wearable device has one or more built-in sensors, which can be fixed at specified body positions and used to collect position information of each joint point of the user, so that the motion track of each joint point can be obtained in real time, and even a three-dimensional model of the user can be built. For example, the wearable device may be smart sportswear. The one or more sensors may include one or more of a gravity sensor, an acceleration sensor, a gyroscope sensor, and the like.
In some embodiments, the moving direction and the moving distance of the joint points in the user motion data may be directly obtained by the electronic device using an external or internal sensor, or may be indirectly analyzed according to the position information (or the motion trajectory) of each joint point in the user motion data.
It should be noted that the data collected by the wearable device may be sent to the electronic device 100 wirelessly, such as over Wi-Fi or Bluetooth, or over a wired connection. The wearable device or the electronic device 100 may also upload the user's motion data to a server of the fitness app. The server can store the user's historical motion data so that the user can access it on different devices, and the fitness app can also formulate a more accurate fitness plan for the user based on this historical motion data.
Some exemplary graphical user interfaces provided by embodiments of the present application are described below.
Fig. 3 illustrates an exemplary user interface 10 on the electronic device 100 for exposing applications installed by the electronic device 100.
The user interface 10 may include: status bar 101, application icon 102. Wherein:
The status bar 101 may include: one or more of a wireless fidelity (Wi-Fi) signal strength indicator 101C, a battery status indicator 101D, and a time indicator 101E.
In some embodiments, the status bar 101 may further include: one or more signal strength indicators of mobile communication signals (which may also be referred to as cellular signals) and an operator name (e.g., "China Mobile").
The application icons 102 may include, for example: a fitness icon 102A, a gallery icon 102B, a music icon 102C, an applications icon 102D, a contacts icon 102E, a mailbox icon 102F, a cloud sharing icon 102G, a memo icon 102H, a settings icon 102I, and a camera icon 102J. The user interface 10 may also include a page indicator 103. Other application icons may be distributed across multiple pages, and the page indicator 103 may be used to indicate which page of applications the user is currently browsing. The user may slide the area of the other application icons from side to side to browse application icons on other pages.
In some embodiments, the user interface 10 may further include: a navigation bar 104.
The navigation bar 104 may include: a return key 104A, a home screen key 104B, a multitasking key 104C, and other system navigation keys. When detecting that the user clicks the return key 104A, the electronic device 100 may display the page previous to the current page. When detecting that the user clicks the home screen key 104B, the electronic device 100 may display the user interface 10. When detecting that the user clicks the multitasking key 104C, the electronic device 100 may display the tasks recently opened by the user. The navigation keys may also have other names, which is not limited in this application. Not limited to virtual keys, each navigation key in the navigation bar 104 may also be implemented as a physical key.
In some embodiments, the user interface 10 exemplarily illustrated in FIG. 3 may be a Home screen (Home Screen).
In other embodiments, the electronic device 100 may further include a front camera, which may also be referred to as a sub-camera, located mainly above the screen of the electronic device 100, and the front camera may be used for self-timer shooting, video call, and the like.
In other embodiments, electronic device 100 may also include a home screen key. The home screen key may be a physical key or a virtual key. The home screen key may be used to receive a user's instruction to return the currently displayed UI to the home interface, which may facilitate the user to view the home screen at any time. The instruction may be an operation instruction for the user to press the home screen key once, an operation instruction for the user to press the home screen key twice in a short time, or an operation instruction for the user to press the home screen key for a long time. In other embodiments of the present application, the home screen key may also incorporate a fingerprint recognizer for fingerprint acquisition and recognition therewith when the home screen key is pressed.
It is understood that fig. 3 is only an exemplary illustration of the user interface on the electronic device 100, and should not be construed as a limitation on the embodiments of the present application.
Illustratively, as shown in FIG. 4, the user may click the fitness icon 102A on the user interface 10; the electronic device 100 detects the user operation, and in response, the electronic device 100 displays the interface 11 of a fitness app.
The user interface 11 may include: an application title bar 201, a search box 202, settings 203, a function bar 204, and a display area 205. Wherein:
the application title bar 201 may be used to indicate that the current page is used to present a settings interface of the electronic device 100. The presentation form of the application title bar 201 may be the text message "smart body", icon, or other form.
The search box 202 may be used to search for workout sessions that match the character entered by the user.
Settings 203 may receive a user operation (e.g., a touch operation), and in response to detecting the user operation, electronic device 100 may display a settings interface for the smart workout.
The function bar 204 may include: a user center control 204A, a course recommendation control 204B, and a plurality of course classification controls. The plurality of course classification controls may include, but are not limited to: a fat-burning zone control 204C, a shaping zone control 204D, and a shaping zone control 204E. Wherein:
The user center control 204A may receive a user operation (e.g., a touch operation), and in response to detecting the user operation, the electronic device 100 may display the interface content of the user's personal center in the display area 205.
The course recommendation control 204B can receive a user operation (e.g., a touch operation), and in response to detecting the user operation, the electronic device 100 can display one or more recommended workout courses in the display area 205. For example, as shown in fig. 4, the display area 205 displays the course cover of a plurality of recommended courses, and the course classification, the time length, and the name of each recommended course.
Any one of the course classification controls can receive a user operation (e.g., a touch operation), and in response to the detected user operation, the electronic device 100 can display, in the display area 205, the course covers of one or more fitness courses corresponding to that course classification control.
In some embodiments of the present application, the course cover or the name of the fitness course may receive a playing operation (e.g., a touch operation) of the user, and in response to the detected playing operation, the electronic device 100 may display the specific content of the fitness course on the display screen.
The fitness app may be launched in response to a user touch operation on its icon, e.g., a single click, a double click, a long press, etc. In some embodiments of the present application, the display screen is configured with a touch panel for receiving the user's touch operations, where a touch operation refers to an operation in which the user's hand, elbow, stylus, or the like contacts the display screen. In specific implementations, there may be other ways to open the interface 11 of the fitness app, which are not limited herein.
For example, the user may open the interface 11 of the fitness app by pressing a key to start a first control mode; or the interface 11 of the fitness app is opened by detecting the user's voice input; or the interface 11 of the fitness app is opened by drawing a specific shape (e.g., a Z shape, a rectangle, a circle, etc.) with a knuckle. This is not specifically limited in the embodiments of the present application.
In addition, the user can also control the electronic device 100 to display the interface 11 of the fitness app through the remote controller; the user may also control the electronic device 100 to display the interface 11 of the fitness app through a particular gesture. In the embodiment of the present application, the user operation is not specifically limited.
It is understood that fig. 4 only illustrates the fitness app interface on the electronic device 100, and should not be construed as limiting the embodiments of the present application.
In this embodiment, one or more users may exercise along with the fitness course played by the electronic device 100, and the electronic device 100 may provide fitness feedback for the one or more users at the same time.
First, taking an application scenario of single-user training as an example, the fitness course interaction method of this application is described below with reference to the accompanying drawings. In this application scenario, a user plays a fitness course through the electronic device 100 and trains along with it. The electronic device 100 collects the user's motion data in real time and, according to this motion data and the standard motion data of the fitness course, guides the user to better complete the actions and displays the user's exercise effect in real time.
In some embodiments of the present application, the electronic device 100 may receive a play operation of a user, and in response to detecting the play operation, the electronic device 100 may simultaneously display a fitness course window and a user fitness window on the display screen, where the fitness course window is used to display specific contents of a fitness course, and the user fitness window is used to display a body posture of the user in real time.
Illustratively, as shown in FIG. 5A, the user may click on a recommended course displayed in the display area 205; the electronic device 100 detects the user operation, and in response, the electronic device 100 displays the training interface 12. The training interface 12 includes a fitness course window 301 and a user fitness window 302. In the training interface shown in FIG. 5A, the fitness course window 301 and the user fitness window 302 have no overlapping area.
In some embodiments, the electronic device 100 displays the user's body posture full screen and displays the fitness course window floating on the display screen. Alternatively, the electronic device 100 displays the fitness course full screen and displays the user fitness window floating on the display screen.
Illustratively, as shown in FIG. 5B, the electronic device 100 displays the user fitness window 302 full screen on the training interface 12 and displays the fitness course window 301 floating on the display screen.
In some embodiments of the present application, when a video interface is displayed full screen, the video interface may occupy the entire display area of the display screen. In one possible implementation, displaying the video interface full screen means that only the video interface is displayed on the display screen and no other content is displayed. In another possible implementation, the video interface may occupy only part of the display area; for example, when the video interface is displayed in the middle of the display screen with one or both side edges left white or black, it may still be regarded as being displayed full screen.
In other embodiments of the present application, displaying a video interface in a full screen may refer to displaying a video interface in a display screen, and simultaneously displaying interface elements at a system level, such as a status bar, a floating shortcut menu, and the like.
In addition to the window display manners shown in fig. 5A and 5B, the fitness course window 301 and the user fitness window 302 may also be displayed in other display manners, which is not specifically limited in this embodiment of the application.
In some embodiments of the present application, the user fitness window is used to display, in real time, the user's fitness video captured by the electronic device 100 using a camera, as illustrated in FIG. 5C.
The camera may be a front camera of the electronic device 100 or an external camera of the electronic device 100, which is not specifically limited here.
In some embodiments, the user fitness window is used to display a virtual portrait of the user. The virtual portrait is generated by the electronic device 100 from the user's motion data collected by a wearable device in real time, as illustrated in FIG. 5D.
It is understood that the electronic device 100 may obtain real-time position information of various joint points of the user according to the motion data of the user collected by the wearable device. The electronic device 100 can generate a virtual portrait of the user based on the position information of the respective joints of the user, and can control the posture of the virtual portrait in real time based on the change of the position information of the respective joints of the user.
The virtual portrait of the user shown in fig. 5D is merely an exemplary virtual portrait provided by the embodiments of the present application. The virtual portrait of the user may be two-dimensional or three-dimensional, and is not limited in detail here.
By displaying the user's real-time body posture (e.g., the user's real-time fitness video or real-time virtual portrait) on the display screen, the electronic device 100 enables the user to observe his or her own body posture in real time and thus adjust actions more accurately. The electronic device 100 may also display prompt information in conjunction with the body posture displayed in real time on the display screen, to guide the user to better complete the current action.
In addition, displaying the user's fitness video in real time lets the user observe his or her real-time body posture more clearly and brings a better visual experience; however, the electronic device must process the fitness video through image-processing and similar technologies to provide fitness feedback, which consumes considerable energy. Acquiring the user's motion data through a wearable device and generating the user's body posture from it is simpler to implement and consumes less energy on the electronic device, but requires an additional wearable device.
The training interface provided in the embodiment of the present application is further described below by taking the training interface shown in fig. 5C as an example.
The fitness course window 301 shown in FIG. 5C may also include a course name, a timeline of the video, virtual keys for adjusting the volume, virtual keys for playing/pausing the video, and the like.
Illustratively, as shown in FIG. 6A, electronic device 100 plays a first action of a workout, and training interface 12 also includes a name 303 of the first action. For example, the name 303 of the first action may be "holding the head and raising the knee left".
It should be noted that the first action is any one of the plurality of actions of the fitness course, and the first action may include a sub-action that is repeated one or more times.
In some embodiments of the present application, the training interface 12 may also display a time indicator or timeline of the fitness course, which may be used to indicate how long the fitness course has been playing.
Illustratively, as shown in FIG. 6B, the electronic device 100 plays the first action of the fitness course, and the training interface 12 may further include a time indicator 304, where the time indicator 304 is used to indicate the forward timing of the fitness course. The specific content of the time indicator 304 may be, for example, "20".
In some embodiments of the application, when playing the first action of the fitness course, the electronic device 100 may further display a time indicator of the first action on the display screen, where the time indicator may be used to indicate the time for which the first action has been performed and the preset total time of the first action. If the first action includes a plurality of sub-actions, the electronic device 100 may further display a number indicator of the sub-actions in the first action, where the number indicator may be used to indicate the number of sub-actions already completed and the preset total number of sub-actions of the first action.
Illustratively, as shown in FIG. 6C, the electronic device 100 plays the first action of the fitness course, and the training interface 12 may also include a time indicator 305. The specific content of the time indicator 305 may be "5s/10s", indicating that the preset total time of the action is 10s and that the action has been performed for 5s.
If the first action shown in FIG. 6C includes only 1 sub-action, then within the 10s in which the fitness course plays the first action, the coach performs only one "hold head and raise knee left". If the first action shown in FIG. 6C includes 2 sub-actions, then within the 10s in which the fitness course plays the first action, the coach performs "hold head and raise knee left" twice.
Illustratively, as shown in FIG. 6D, the electronic device 100 plays the first action of the fitness course, the first action includes 3 sub-actions, and the training interface 12 may further include a number indicator 305. The specific content of the number indicator 305 may be "1/3", indicating that the preset total number of sub-actions in the action is 3, and that the electronic device has played the first sub-action of the first action and is playing the second sub-action. The first action includes 3 sub-actions and its preset total time is 10s; that is, the coach performs "hold head and raise knee left" 3 times within 10s.
It will be appreciated that, in some embodiments, the sub-action the user is currently performing and the sub-action of the first action being played in the fitness course may not be synchronized. For example, if the user's training speed is relatively fast, the user may already be doing the third sub-action of the first action while the electronic device is playing the 2nd sub-action. The number of sub-actions the user actually performs while the first action is playing may therefore differ from the number of sub-actions the first action includes in the fitness course. Thus, the user can adjust the training speed according to his or her actual situation, for example increasing the training speed, that is, increasing the number of sub-actions performed for a specific action, to improve the exercise effect.
Taking the training interface 12 shown in FIG. 6C as an example, the following describes how the electronic device 100 guides the user to better complete the exercise and shows the user's exercise effect in real time. The first action of the fitness course illustrated in FIG. 6C may include one or more sub-actions, which is not specifically limited herein.
(1) The following describes, for a single sub-action, how to guide the user to complete the sub-action.
Specifically, the electronic device 100 plays a first action of the fitness course, the user performs a first sub-action of the first action, and the electronic device generates first prompt information according to the standard motion data of the first action and the motion data of the first sub-action of the user collected in real time, where the first prompt information is used to guide the user to complete the current first sub-action. The electronic device 100 may present the first prompt information in one or more of voice, text, picture, and animation. The first action may include one or more identical sub-actions, and the standard motion data of the first action is the standard motion data of each sub-action in the first action.
In some embodiments of the present application, the indication information may include two parts: one part indicating the joint point, and one part indicating how the joint point should perform the action.
For example, as shown in FIG. 7A, the first action is "hold head and raise knee left". When the electronic device 100 determines, according to the standard motion data and the user's motion data, that the user's current sub-action requires the knee to continue to be raised up and to the left, the electronic device 100 displays picture indication information 307. The picture indication information 307 includes a dot located at the left knee point and an upward arrow starting from the dot. It can be understood that the dot indicates the left knee point, and the picture indication information 307 indicates raising the knee up and to the left.
Illustratively, as shown in FIG. 7B, the electronic device 100 guides the user to complete the current sub-action through the picture indication information 307 and text indication information 308; the specific content of the text may be "continue to raise the knee up and to the left".
In some embodiments of the present application, the electronic device 100 plays the first action of the fitness course, and when a difference between the user's exercise data and the standard exercise data exceeds a preset threshold, generates indication information according to the difference to guide the user in continuing the current sub-action of the first action.
Specifically, when at least one data difference between the user's motion data and the standard motion data regarding a first joint point exceeds a preset threshold, the electronic device 100 generates indication information for the first joint point according to the difference, to guide the user in continuing the first action. For example, when the electronic device 100 determines, from the user's motion data and the standard motion data, that the difference in at least one of the moving direction, the moving distance, and the distance from the first joint point to a second joint point at a specific position exceeds a preset threshold, the electronic device 100 generates the indication information for the first joint point according to that difference.
The preset threshold may be a default of the fitness course, or may be determined by the electronic device 100 according to a fitness level preset by the user. For example, if the fitness levels are beginner, intermediate, and advanced, then the higher the level, the smaller the preset threshold. The setting of the preset threshold is not specifically limited herein.
The distance between the first joint point and the second joint point may be a longitudinal distance, a lateral distance, or a straight-line distance between the two joint points; this is determined according to the actual requirements of each action and is not specifically limited herein.
For example, as shown in FIG. 7C, the first action played by the electronic device is "hold head and raise knee left". In the standard motion data of the first action, when the left knee point is raised to its highest position, the standard longitudinal distance between the left knee point and the left elbow point is 8 cm, and the preset threshold for this longitudinal distance is 3 cm. When the electronic device 100 determines from the user's motion data that the longitudinal distance between the user's left knee point and left elbow point is 18 cm, i.e., that it differs from the standard longitudinal distance by more than 3 cm, the electronic device 100 displays picture indication information 306 and text indication information 308 for the left knee point to guide the user in continuing the first action. The specific content of the text indication information 308 may be "continue moving up 10cm".
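For illustration, the following Python sketch shows how such threshold-based guidance might be generated. The 2-D joint coordinates, the data layout, and all function and field names are assumptions made for this example rather than details of the embodiment; the numbers reproduce the FIG. 7C example (standard gap 8 cm, threshold 3 cm, user gap 18 cm).

```python
# Minimal sketch of threshold-based guidance generation (illustrative only).
# Joint positions are assumed to be 2-D (x, y) coordinates in centimeters.

def vertical_gap(a, b):
    """Longitudinal (vertical) distance between two joint points."""
    return abs(a[1] - b[1])

def guidance_for_knee_raise(user, standard, threshold_cm=3.0):
    """Compare the user's left-knee/left-elbow gap against the standard pose
    and return a hint when the difference exceeds the preset threshold."""
    user_gap = vertical_gap(user["left_knee"], user["left_elbow"])
    std_gap = vertical_gap(standard["left_knee"], standard["left_elbow"])
    diff = user_gap - std_gap
    if diff > threshold_cm:
        # Joint point to highlight, plus how far it still has to move.
        return {"joint": "left_knee", "hint": f"continue moving up {diff:.0f}cm"}
    return None  # within tolerance, no prompt needed

user_pose = {"left_knee": (30.0, 82.0), "left_elbow": (28.0, 100.0)}
standard_pose = {"left_knee": (30.0, 92.0), "left_elbow": (28.0, 100.0)}
print(guidance_for_knee_raise(user_pose, standard_pose))
# {'joint': 'left_knee', 'hint': 'continue moving up 10cm'}
```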
(2) The following describes, for a single sub-action, how to show the user's completion degree of the sub-action.
In some embodiments of the application, the electronic device 100 plays the first action of the fitness course and evaluates, in real time, the completion degree of the first sub-action performed by the user according to the user's motion data and the standard motion data. The completion degree of the first sub-action represents how standard the user's current first sub-action is relative to the standard first action.
In this embodiment, the electronic device 100 may display the completion degree of the first sub-action performed by the user in real time through one or more of voice, text, picture, and animation.
In some embodiments of the present application, the electronic device 100 plays the first action of the fitness course, and the electronic device 100 displays the completion degree of the current sub-action of the user by displaying and controlling the change of the completion degree progress bar. The total length of the completion progress bar may be used to indicate the highest completion of the sub-actions of the first action.
Illustratively, as shown in FIG. 8A, when the electronic device 100 plays the "hold head and raise knee left" action of the fitness course and the user is doing the first sub-action of the action, the electronic device displays a completion degree progress bar 311 of the first sub-action on the training interface 12. The shaded portion of the completion degree progress bar 311 indicates the current completion degree of the user's first sub-action.
In some embodiments, the electronic device lengthens the completion degree progress bar when it determines that the completion degree of the user's first sub-action has increased.
Illustratively, as shown in FIG. 8B, as the user continues the first sub-action, the electronic device 100 determines that the completion degree of the user's first sub-action has increased and lengthens the completion degree progress bar 311. Lengthening the completion degree progress bar 311 means increasing the length of its shaded portion.
In some embodiments of the present application, the completion degree progress bar includes a lowest completion degree identifier, which indicates the position of the lowest completion degree of the first sub-action on the completion degree progress bar.
The lowest completion degree may be set by default by the electronic device, set by the user, or generated by the electronic device according to data such as a fitness level and/or fitness plan preset by the user. For example, the lowest completion degree is 60%. Setting the lowest completion degree identifier on the completion degree progress bar lets the user clearly observe whether the current sub-action has reached the lowest completion degree, and at the same time motivates the user to raise the completion degree of the current sub-action beyond the preset lowest completion degree.
Illustratively, as shown in FIG. 8C, a lowest completion degree identifier 312 may also be included on the completion degree progress bar 311.
In some embodiments of the present application, the completion degree progress bar further includes one or more other identifiers, and the one or more other identifiers can be used to indicate other completion degrees of the first sub-action.
In some embodiments, the completion degree progress bar further includes an excellent completion degree identifier and a perfect completion degree identifier.
The excellent completion degree identifier indicates the position of the excellent completion degree of the first action on the completion degree progress bar, and the perfect completion degree identifier indicates the position of the perfect completion degree. Both may be set by default by the electronic device, set by the user, or generated by the electronic device according to data such as a fitness level and fitness plan preset by the user. For example, the excellent completion degree may be 80% and the perfect completion degree 95%. Similarly, setting the excellent and/or perfect completion degree identifiers on the completion degree progress bar lets the user clearly observe whether the current sub-action has reached the completion degree corresponding to each preset identifier, and motivates the user to raise the completion degree of the current sub-action to those levels.
For example, as shown in FIG. 8D, the completion degree progress bar 311 may further include an excellent completion degree identifier 313 and a perfect completion degree identifier 314.
In some embodiments of the application, when the completion degree of the user's first sub-action reaches the completion degree corresponding to a preset identifier on the completion degree progress bar, the electronic device generates evaluation information. The evaluation information evaluates the user's current completion degree so as to motivate the user to achieve a higher one, and may be presented in one or more of text, voice, picture, and the like.
For example, the total length of the completion degree progress bar represents 100% completion, with the lowest completion degree at 60%, the excellent completion degree at 80%, and the perfect completion degree at 95%. When the action completion degree reaches 60%, the shaded portion of the bar reaches the position of the lowest completion degree identifier, and the electronic device displays the evaluation information corresponding to the lowest completion degree, e.g., "Good, keep going"; when it reaches 80%, the shaded portion reaches the position of the excellent completion degree identifier, and the electronic device displays the evaluation information corresponding to the excellent completion degree, e.g., "Great, keep it up"; when it reaches 95%, the shaded portion reaches the position of the perfect completion degree identifier, and the electronic device displays the evaluation information corresponding to the perfect completion degree, e.g., "Perfect, excellent!".
For example, as shown in FIG. 8E, when the electronic device 100 determines that the completion degree of the user's first sub-action has reached 60%, the completion degree progress bar grows to the position of the lowest completion degree identifier, and evaluation information 314 is displayed on the training interface 12. The evaluation information 314 may be displayed as text and picture information, and the specific content of the text may be "Come on, keep it up".
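As an illustration of the mapping from completion degree to evaluation information described above, the following sketch uses the 60%/80%/95% marks; the message wording, the function names, and the data structure are hypothetical.

```python
# Illustrative mapping from completion degree to evaluation information,
# using the lowest / excellent / perfect marks described above.

EVALUATION_MARKS = [
    (0.95, "Perfect, excellent!"),   # perfect completion degree identifier
    (0.80, "Great, keep it up!"),    # excellent completion degree identifier
    (0.60, "Good, keep going!"),     # lowest completion degree identifier
]

def evaluation_for(completion: float):
    """Return the message for the highest mark the completion has reached."""
    for mark, message in EVALUATION_MARKS:
        if completion >= mark:
            return message
    return None  # below the lowest completion degree: no evaluation yet

print(evaluation_for(0.62))  # Good, keep going!
```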
The following describes in detail how to evaluate the completion of a single sub-action in real time.
In some embodiments of the present application, the electronic device 100 plays the first action of the fitness course and obtains, in real time, the motion data of the first sub-action of the user's current first action. The motion data includes the motion trajectory of each joint point. The electronic device 100 determines the similarity between the motion trajectory of each of the user's joint points and the trajectory of the corresponding joint point in the standard motion data, and then determines the current completion degree of the user's first sub-action according to the trajectory similarity of each joint point.
For example, the motion trajectory 315 of the user's left knee point, and the motion trajectory 316 of the left knee point in the standard motion trajectory may be as shown in fig. 8F.
There are many algorithms for measuring the similarity between two trajectories; for example, the measure may be based on one or more of Euclidean distance, Minkowski distance, vector-space cosine similarity, and the Pearson correlation coefficient. Embodiments of the present application may also use other methods to measure the similarity between two trajectories, which is not specifically limited herein.
In the embodiment of the present application, the similarity between two trajectories can be expressed as a percentage, a numerical value, or the like, and the completion degree of the first action can likewise be expressed as a percentage, a numerical value, or the like. For example, if the completion degree is expressed as a percentage, the highest completion degree may be 100%; if it is expressed as a score, the highest score may be 100.
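For example, one of the measures named above, vector-space cosine similarity, could be applied to sampled joint trajectories as in the following sketch. The equal-length sampling of (x, y) positions and the flattening scheme are assumptions made for illustration, not requirements of the embodiment.

```python
# Minimal sketch of trajectory similarity, assuming each trajectory is
# sampled as an equal-length sequence of (x, y) joint positions.
import math

def cosine_similarity(traj_a, traj_b):
    """Cosine similarity between two flattened trajectories."""
    a = [c for point in traj_a for c in point]  # flatten to one vector
    b = [c for point in traj_b for c in point]
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

user_knee = [(0, 0), (2, 5), (3, 11)]  # user's left-knee samples (assumed)
std_knee  = [(0, 0), (2, 6), (3, 12)]  # same joint in the standard data
print(f"{cosine_similarity(user_knee, std_knee):.1%}")  # e.g. "99.9%"
```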
In some embodiments of the present application, the electronic device 100 obtains the motion trajectory of a first joint point of the user, together with one or more data items such as the moving direction of the first joint point at a first specific position and the moving distance of the first joint point at a second specific position, and correspondingly obtains the same data items for the first joint point in the standard motion trajectory. It then determines, for each of the one or more data items, the ratio between the value in the user's motion trajectory and the value in the standard motion trajectory; the action completion degree of the first joint point at the first specific position is the sum, over the data items, of each item's ratio multiplied by that item's weight. The action completion degree of the first joint point may then be determined from its action completion degrees at a preset number of positions, e.g., as the average of the action completion degrees of the first joint point at the preset positions. The completion degree of the user's first sub-action may in turn be determined from the action completion degrees of the user's joint points, e.g., as the average of the action completion degrees of the individual joint points.
For example, if the scores of the moving direction and the moving distance of the first joint point at position 1 (each the ratio of the user's value to the standard value) are x1 and y1, respectively, and the preset weights of the moving direction and the moving distance are a1 and b1, respectively, then the action completion degree of the first joint point at position 1 is a1 × x1 + b1 × y1.
The weights of the moving direction and the moving distance are preset by the electronic device.
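The weighted combination just described can be illustrated with the following sketch. The item names, the capping of each ratio at 1.0, and the simple averaging are assumptions chosen for the example.

```python
# Sketch of the weighted per-position completion degree described above.

def position_completion(items, weights):
    """items: {name: ratio of user value to standard value}
    weights: {name: preset weight}, assumed to sum to 1."""
    return sum(weights[k] * min(ratio, 1.0) for k, ratio in items.items())

def joint_completion(position_scores):
    """Average completion over the preset positions of one joint point."""
    return sum(position_scores) / len(position_scores)

def sub_action_completion(joint_scores):
    """Average completion over all joint points, as in the text."""
    return sum(joint_scores) / len(joint_scores)

# Position 1 of the first joint point: direction score x1, distance score
# y1, with weights a1 and b1 (a1 + b1 = 1).
p1 = position_completion({"direction": 0.9, "distance": 0.8},
                         {"direction": 0.5, "distance": 0.5})
print(f"{p1:.2f}")  # 0.85 = a1*x1 + b1*y1
```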
The above evaluation of the completion degree of a single sub-action combines multiple items of motion data for each of the user's joint points, and can therefore evaluate the completion degree of the user's current sub-action more accurately in real time.
In addition to the above-mentioned method for evaluating the completion degree of a single sub-action, the embodiments of the present application may also adopt other evaluation methods, which are not specifically limited herein.
In this embodiment, after the electronic device 100 starts playing the first action of the fitness course and before it starts playing the next action, the electronic device 100 matches the motion data of the user's current sub-action against the standard data of the first action, so as to determine the completion degree of the user's current action in real time. If the first action includes c sub-actions, the user may perform d sub-actions of the first action during that interval, where c and d may be unequal. For specific actions, the user can complete more sub-actions of the first action by speeding up, so as to achieve a higher exercise effect. To let the user know explicitly which actions allow the exercise effect to be improved by speeding up, in some embodiments the electronic device 100 may, when playing each action, display a prompt identifier indicating whether the action is suitable for an increased exercise speed.
(3) The following describes, with respect to the actions the user has completed, how to show the exercise effect the user has achieved.
In some embodiments of the present application, while the fitness course is playing, the electronic device 100 evaluates, in real time, the exercise effect achieved by the user according to the completion degree and/or energy consumption of the sub-actions the user has completed. The actions completed by the user include all sub-actions the user has completed.
In the embodiment of the present application, the electronic device 100 may display the exercise effect of the actions the user has completed through one or more of voice, text, picture, and animation. In some embodiments, the electronic device 100 may present that exercise effect through an effect progress bar: each time the user completes a sub-action, the electronic device 100 may adjust the effect progress bar according to the exercise effect of that sub-action.
It can be appreciated that the electronic device 100 displays the user effect progress bar and the standard effect progress bar of the fitness course while the fitness course is playing. By comparing the difference between the two bars, the user can understand more clearly the exercise effect currently achieved.
Illustratively, as shown in FIG. 9A, while the electronic device 100 plays the fitness course, an effect progress bar 401 and an effect progress bar 402 are displayed on the training interface 12, where the effect progress bar 401 shows the exercise effect the user has achieved, and the effect progress bar 402 shows the standard exercise effect of the actions of the fitness course that have been played. The played actions of the fitness course include all sub-actions the fitness course has finished playing.
It should be noted that the total length of the effect progress bar 401 and the effect progress bar 402 can indicate the standard maximum exercise effect of the fitness course, or M times that standard maximum, e.g., M = 1.3. The user can exceed the standard maximum exercise effect of the fitness course by improving action completion degrees, speeding up the frequency of specific actions, and so on.
In some embodiments of the present application, a lowest effect identifier 403 may further be included on the effect progress bar 401, which identifies the position of the lowest exercise effect on the effect progress bar. The lowest effect identifier 403 may be set by default by the electronic device, preset by the user, or determined by the electronic device according to data such as a fitness level and/or fitness target preset by the user.
For example, the user sets an energy consumption target in advance. The electronic device 100 determines, according to the standard energy consumption of the fitness course, the position on the effect progress bar of the lowest effect identifier 403 corresponding to the user's energy consumption target. Setting the lowest effect identifier on the effect progress bar lets the user clearly observe whether the exercise effect already achieved has reached the lowest exercise effect, and motivates the user to improve action completion degrees and speed up specific actions so that the achieved exercise effect exceeds the lowest exercise effect.
Illustratively, as shown in FIG. 9B, the lowest effect identifier 403 may further be included on the effect progress bar 401.
In some embodiments of the present application, after the electronic device 100 detects that the user has completed the first sub-action, the higher the completion degree of the user's first sub-action, the greater the increase the electronic device determines in the exercise effect the user has achieved.
For example, suppose the exercise effect achieved by the user is F1 before the user performs the first sub-action and increases to F2 after the user completes it. The higher the completion degree of the first sub-action, the greater the increase from F1 to F2.
In some embodiments of the present application, when the electronic device 100 detects that the user has completed the first sub-action and the completion degree of the first sub-action is higher than the lowest completion degree, then the higher the completion degree, the greater the increase the electronic device determines in the user's achieved exercise effect, and the more the electronic device 100 lengthens the user's effect progress bar.
Illustratively, as shown in FIG. 9C, the electronic device 100 plays "hold head and raise knee left" and displays an effect progress bar 401 and an effect progress bar 402. As shown in FIG. 9C, the preset completion time of the action is 10 s. When the electronic device 100 plays the 5th second of the action, the completion degree of the first sub-action the user is performing is lower than the lowest completion degree; when the electronic device 100 plays the 10th second, the user completes the first sub-action and returns to the standing state, and the completion degree of the user's first sub-action is lower than the coach's highest completion degree. The electronic device 100 lengthens the effect progress bar 401 and the effect progress bar 402, the added length of the effect progress bar 401 being smaller than that of the effect progress bar 402. Lengthening an effect progress bar means increasing the length of its shaded portion.
In some embodiments of the present application, when the electronic device 100 detects that the user has completed the first sub-action and the completion degree of the first sub-action is lower than the lowest completion degree, it determines that the exercise effect achieved by the user is unchanged, and the electronic device 100 keeps the length of the user's effect progress bar unchanged.
Illustratively, as shown in FIG. 9D, the user completes the current sub-action and returns to the standing state, and the completion degree of the sub-action is still below the lowest completion degree. The electronic device 100 keeps the effect progress bar 401 unchanged while lengthening the effect progress bar 402.
In some embodiments of the present application, when the electronic device 100 detects that the user has completed the first sub-action and the completion degree of the first sub-action is lower than a preset completion degree, it determines that the exercise effect achieved by the user grows negatively, and the electronic device 100 reduces the length of the user's effect progress bar. Further, the lower the completion degree of the user's first action, the greater the negative growth of the user's achieved exercise effect, and the more the electronic device 100 shortens the user's effect progress bar.
It can be understood that when the user's action completion degree is too low, the action may bring no actual exercise effect. In this case, in order to spur the user on, the exercise effect of the action can be shown as negative growth. For example, the preset completion degree is 30%.
Illustratively, as shown in FIG. 9E, the user completes the current sub-action and returns to the standing state, and the completion degree of the sub-action is lower than the preset completion degree of 30%. The electronic device 100 shortens the effect progress bar 401 while lengthening the effect progress bar 402. Shortening the effect progress bar means reducing the length of the shaded portion in the effect progress bar 401.
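The three cases above (negative growth below the preset completion degree, no change between the preset and lowest completion degrees, and a gain that grows with the completion degree) can be combined into one illustrative update rule; the constants and the linear scaling below are assumptions, not values given by the embodiment.

```python
# Illustrative update rule for the user's effect progress bar.

PRESET_COMPLETION = 0.30   # below this: negative growth
LOWEST_COMPLETION = 0.60   # below this (but >= preset): no change
BASE_EFFECT = 10.0         # standard effect of one sub-action (hypothetical)

def effect_delta(completion: float) -> float:
    """Change in the achieved exercise effect after one sub-action."""
    if completion < PRESET_COMPLETION:
        return -BASE_EFFECT * (PRESET_COMPLETION - completion)  # shorten bar
    if completion < LOWEST_COMPLETION:
        return 0.0                                              # bar unchanged
    return BASE_EFFECT * completion                             # lengthen bar

for c in (0.2, 0.5, 0.7, 0.95):
    print(f"completion {c:.0%}: delta {effect_delta(c):+.1f}")
```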
In some embodiments of the application, the electronic device determines the exercise effect achieved by the user's first sub-action according to both the completion degree and the completion time of the first sub-action, and adjusts the user effect progress bar accordingly.
In some embodiments of the present application, when the completion time of the user's first sub-action falls outside the preset qualified time range of the first sub-action, the farther the completion time is from the qualified time range, the smaller the exercise effect achieved by the user's first sub-action. Alternatively, the farther the completion time is from the qualified time range, the greater the reverse exercise effect of the user's first sub-action, i.e., the greater the negative growth it brings to the exercise effect the user has achieved.
It can be understood that for certain aerobic fat-burning actions, shortening the completion time and increasing the action frequency raise energy consumption and improve the exercise effect, whereas for certain anaerobic actions, the longer the hold, the greater the energy consumption and the better the exercise effect. A qualified time range can therefore be preset for each action.
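As an illustration of the qualified time range, the following sketch applies a penalty that grows with the distance from the range; the linear penalty, its rate, and the example range are assumptions made for this example.

```python
# Sketch of a qualified-time adjustment: the farther the completion time of
# a sub-action falls outside its preset qualified range, the smaller (or
# the more negative) its contribution to the exercise effect.

def time_factor(completion_time: float, lo: float, hi: float,
                penalty_per_second: float = 0.2) -> float:
    """Multiplier applied to a sub-action's exercise effect."""
    if lo <= completion_time <= hi:
        return 1.0  # inside the qualified time range
    distance = (lo - completion_time) if completion_time < lo \
               else (completion_time - hi)
    return 1.0 - penalty_per_second * distance  # may go negative

# Qualified range 8-12 s for a hypothetical anaerobic hold:
print(f"{time_factor(10, 8, 12):.1f}")  # 1.0
print(f"{time_factor(13, 8, 12):.1f}")  # 0.8  -> reduced effect
print(f"{time_factor(20, 8, 12):.1f}")  # -0.6 -> reverse (negative) effect
```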
It should be noted that the electronic device may determine the start and end of a sub-action of the first action according to the standard motion data of the first action and the user's motion data.
In some embodiments of the present application, while the fitness course is playing, the electronic device 100 may further estimate the exercise effect achieved by the user according to the completion degree and energy consumption of each sub-action among the actions the user has completed. The energy consumption of one of the user's sub-actions may be determined from one or more of the user's height, weight, the completion degree of the sub-action, the completion time of the sub-action, and the like.
In some embodiments of the present application, after the electronic device 100 detects that the user has completed the first sub-action, the greater the energy consumption of the user's first sub-action, the greater the increase in the exercise effect the user has achieved.
In some embodiments of the present application, while the electronic device 100 plays the fitness course, the exercise effect achieved by the user may also be evaluated according to the completion degree, the energy consumption, the weight of the completion degree, and the weight of the energy consumption of each sub-action among the actions the user has completed.
For example, if the completion degree of a sub-action is X, its energy consumption is Y, the weight of the completion degree is e, and the weight of the energy consumption is f, then the exercise effect of the sub-action equals e × X + f × Y.
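The formula above can be transcribed directly; the function name and the example values below are illustrative only.

```python
# Direct transcription of the formula above: the exercise effect of one
# sub-action equals e * X + f * Y, where X is the completion degree, Y the
# energy consumption, and e, f their preset weights.

def sub_action_effect(completion_x: float, energy_y: float,
                      weight_e: float, weight_f: float) -> float:
    return weight_e * completion_x + weight_f * energy_y

# e.g. completion 0.8, energy 12 kJ, weights 0.5 each (assumed values):
print(sub_action_effect(0.8, 12.0, 0.5, 0.5))  # 6.4
```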
In general, a fitness course may include a plurality of sub-courses, each of which may include one or more consecutive actions. In the embodiment of the present application, in addition to evaluating the completion degree of each of the user's sub-actions and the exercise effect of the actions the user has completed, the electronic device 100 may also give a stage summary for each sub-course the user completes while exercising along with the fitness course.
For example, one fitness course includes 10 actions divided into three sub-courses: warm-up exercise, fat-burning exercise, and stretching exercise. The warm-up exercise includes the first to third actions of the fitness course, the fat-burning exercise includes the fourth to eighth actions, and the stretching exercise includes the ninth to tenth actions.
In some embodiments of the present application, the electronic device 100 displays the user effect progress bar and the standard effect progress bar while playing the fitness course. When the user completes one sub-course of the fitness course, the electronic device 100 displays the user completion identifier of that sub-course on the user effect progress bar, and displays the standard completion identifier of that sub-course on the standard effect progress bar.
It should be noted that, by comparing the difference in position on the progress bars between the user completion identifier and the standard completion identifier of each sub-course, the user can clearly know the exercise effect achieved in each sub-course. Furthermore, the user can identify the sub-courses in which he or she is weak, and can carry out targeted training for those sub-courses later.
For example, as shown in FIG. 10A, the fitness course includes three sub-courses; after playing the second sub-course, the electronic device will play the third sub-course, the stretching exercise. The electronic device displays a user completion identifier 404 corresponding to the first sub-course and a user completion identifier 405 corresponding to the second sub-course on the effect progress bar 401, and displays a standard completion identifier 406 corresponding to the first sub-course and a standard completion identifier 407 corresponding to the second sub-course on the effect progress bar 402.
In some embodiments of the present application, the electronic device 100 displays the user effect progress bar and the standard effect progress bar when the fitness course is played. When the user completes one sub-course of the fitness course, the electronic device 100 displays the user completion identifier of that sub-course on the user effect progress bar and the standard completion identifier on the standard effect progress bar, and may also display an evaluation of the exercise effect of that sub-course within a preset time.
For example, the evaluation of a sub-course may include: the percentage of the user's exercise effect in the sub-course relative to the standard exercise effect of the sub-course (the standard completion percentage); the percentage of the user's exercise effect in the sub-course relative to the highest exercise effect of the fitness course (the total completion percentage); the percentage of the user's exercise effect in the sub-course relative to a preset lowest exercise effect (the minimum percentage); and the user's energy consumption in the sub-course. This is not specifically limited herein. For example, the standard completion percentage equals 70%, the total completion percentage equals 45%, the minimum percentage equals 60%, and the energy consumption equals 100 kilojoules (KJ).
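The four evaluation items can be computed as in the following sketch, which reproduces the example figures above (70%, 45%, 60%, 100 KJ). The field names, and the standard, maximum, and lowest effect values chosen so as to yield those figures, are assumptions.

```python
# Sketch of the four evaluation items listed above for one sub-course.

def sub_course_evaluation(user_effect, standard_effect,
                          course_max_effect, lowest_effect, energy_kj):
    return {
        "standard_completion": user_effect / standard_effect,   # vs. sub-course standard
        "total_completion":    user_effect / course_max_effect, # vs. whole-course maximum
        "minimum_percentage":  user_effect / lowest_effect,     # vs. preset lowest effect
        "energy_kj":           energy_kj,                       # energy consumed (KJ)
    }

ev = sub_course_evaluation(user_effect=70, standard_effect=100,
                           course_max_effect=155.6, lowest_effect=116.7,
                           energy_kj=100)
print({k: (f"{v:.0%}" if k != "energy_kj" else v) for k, v in ev.items()})
# {'standard_completion': '70%', 'total_completion': '45%',
#  'minimum_percentage': '60%', 'energy_kj': 100}
```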
Illustratively, as shown in FIG. 10B, just after the user completes the second sub-course and before the third sub-course begins, the training interface may further include an evaluation box 408, which may include a standard completion percentage 408A, a total completion percentage 408B, a minimum percentage 408C, and an energy consumption amount 408D.
In some embodiments of the present application, a user may view the evaluation of the exercise effect of a completed sub-course through the user completion identifier of that sub-course. For example, the electronic device 100 receives a user operation on the user completion identifier of the first sub-course, and the electronic device 100 displays the stage evaluation box of the user's first sub-course.
Illustratively, as shown in FIG. 10C, the user controls the electronic device 100 through a voice command to display an evaluation box 409 of the first sub-course; the specific content of the voice command may be "stage-one exercise effect evaluation".
In this embodiment, the user may also control the electronic device 100 to display the evaluation box 409 of a sub-course through other user operations, which is not specifically limited herein.
In the embodiment of the application, the electronic device 100 may also evaluate and present the degree of exercise of the user's muscles while the user exercises along with the fitness course. The degree of muscle exercise can likewise be regarded as an indication of the user's exercise effect.
Illustratively, FIG. 11A shows a distribution of muscles on the front side of a human body according to an embodiment of the present application, and FIG. 11B shows a distribution of muscles on the back side. Each of the muscles shown in FIG. 11A and FIG. 11B may further include one or more muscles. In addition to the muscle types shown in FIG. 11A and FIG. 11B, other muscle types may be included in embodiments of the present application, which are not specifically limited herein.
The muscle types shown in FIG. 11A include: trapezius, deltoid, biceps brachii, longimanus, forearm muscles, rectus femoris, sartorius, vastus lateralis, tibialis anterior, peroneus longus, pectoralis, rectus abdominis, and the abdominal oblique muscles. The muscle types shown in FIG. 11B include: the rhomboids, posterior cephalic muscles, triceps brachii, flexor carpi ulnaris, extensor carpi ulnaris, and so on.
The muscles exercised by different actions in the fitness course may differ. For example, as shown in FIG. 11C, the fitness course includes action 1, action 2, and action 3; the muscles exercised by action 1 are muscle 1, muscle 2, and muscle 3; the muscles exercised by action 2 are muscle 4 and muscle 5; and the muscles exercised by action 3 are muscle 1, muscle 2, muscle 3, and muscle 5. The muscles exercised by any two different actions may or may not coincide.
In some embodiments of the present application, the degree of exercise of a muscle may be divided into a plurality of intervals. When the electronic device 100 plays the fitness course, it displays a human muscle map in which muscles in different exercise degree intervals are distinguished by different colors.
In some embodiments of the present application, when the electronic device 100 plays the fitness course, it displays a human muscle map in which the higher a muscle's degree of exercise, the darker its color.
Illustratively, as shown in FIG. 11D, when the electronic device 100 plays the fitness course, a human muscle map 501 is displayed, and the color depth of each muscle differs according to its degree of exercise. The muscles exercised by the "hold head and raise knee left" action shown in FIG. 11D include the abdominal muscles, the right oblique abdominal muscle, the right trapezius, and the right erector spinae; as can be seen from FIG. 11D, after the user completes one sub-action, the color of the regions of the human muscle map 501 where these muscles are located is deepened.
In some embodiments of the present application, when the electronic device 100 plays a fitness course, the electronic device 100 evaluates the degree of exercise of each type of the user's muscles according to the sub-actions the user has completed. When the degree of exercise of at least one type of muscle reaches a preset limit value, the electronic device 100 generates strain prompt information, which prompts the user that the at least one type of muscle is being over-exercised, so as to prevent muscle damage caused by excessive exercise. The electronic device 100 may present the strain prompt information in one or more of text, voice, picture, animation, and the like.
In some embodiments of the present application, the electronic device 100 displays a human muscle map while playing the fitness course. When the degree of exercise of at least one type of muscle reaches a preset limit value, the strain prompt information may take the form of that muscle being displayed in a specific color in its corresponding area of the human muscle map, or of that area being displayed in a blinking state. For example, the specific color is dark black.
Illustratively, as shown in FIG. 11E, the electronic device 100 displays the human muscle map 501 while playing the fitness course. After the user completes a sub-action, the degree of exercise of the right oblique abdominal muscle reaches the preset limit value. The electronic device 100 displays the right oblique abdominal muscle in its corresponding area of the human muscle map 501 in the darkest color (e.g., black) to indicate that the user's right oblique abdominal muscle is over-exercised.
While displaying the right oblique abdominal muscle in the darkest color (e.g., black) in its corresponding area of the human muscle map 501, the electronic device 100 can also prompt the user through text information that the right oblique abdominal muscle is over-exercised. Illustratively, as shown in FIG. 11F, the training interface 12 also includes prompt information 502. The specific content of the prompt information 502 may be "Please note! The right oblique abdominal muscle is over-exercised".
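The strain detection described above can be sketched as follows; the muscle names, the limit values, and the text-only rendering are assumptions made for illustration.

```python
# Sketch of strain detection: when the accumulated exercise degree of any
# muscle reaches its preset limit value, a strain prompt is generated.

STRAIN_LIMITS = {"right_oblique": 8.0, "rectus_abdominis": 10.0}  # assumed

def strain_prompts(exercise_degree: dict) -> list:
    """Return one prompt per muscle whose degree reached its limit."""
    return [f"Please note! {muscle.replace('_', ' ')} is over-exercised"
            for muscle, degree in exercise_degree.items()
            if degree >= STRAIN_LIMITS.get(muscle, float("inf"))]

print(strain_prompts({"right_oblique": 8.2, "rectus_abdominis": 4.1}))
# ['Please note! right oblique is over-exercised']
```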
How to evaluate the degree of exercise of the muscle is described below.
In some embodiments, the standard motion data of the fitness course may also record the one or more types of muscles that each action of the fitness course mainly exercises, and the respective exercise time of each of those muscles within one sub-action of the action. The user's motion data may also record the completion time of each sub-action. While the user exercises along with the fitness course, the degree of exercise achieved for each muscle may be characterized by the total time that muscle has been exercised during the fitness course.
For example, as shown in FIG. 11G, the muscles exercised by different actions may differ, and the exercise times of different types of muscles within the same sub-action may also differ. The fitness course shown in FIG. 11G includes action 1, action 2, and action 3, where action 1 comprises 2 sub-actions, action 2 comprises 1 sub-action, and action 3 comprises 1 sub-action. The user has currently completed actions 1, 2, and 3, where the exercise time of muscle 1 in each sub-action of action 1 is 3 s, the exercise time of muscle 1 in each sub-action of action 2 is 0 s, and the exercise time of muscle 1 in each sub-action of action 3 is 3 s. The degree of exercise of muscle 1 in the fitness course may thus be characterized by 9 s (3 s + 3 s + 0 s + 3 s).
In some embodiments of the present application, when the electronic device 100 plays the fitness course, the exercise time of a first muscle in each sub-action is determined according to the user's motion data and the standard motion data, and the degree of exercise of the first muscle is evaluated according to that exercise time.
Specifically, when the electronic device 100 plays the first action of the fitness course, the electronic device 100 determines, from the standard motion data, the proportion of the total time of the first sub-action occupied by the exercise time of the first muscle; the exercise time of the first muscle in the user's first sub-action can then be determined from the actual completion time of the user's first sub-action and that proportion.
In some embodiments, the electronic device 100 may instead determine the exercise time of the first muscle in the first action performed by the user directly from the time-varying motion trajectories of the user's joint points.
In some embodiments of the present application, when the electronic device 100 plays the fitness course, the degree of exercise of the first muscle is evaluated according to the completion degree of each sub-action the user has completed and the exercise time of the first muscle in that sub-action. The degree of exercise contributed by each completed sub-action may be characterized by the product of the muscle's exercise time in the sub-action and the completion degree of the sub-action.
Illustratively, the fitness course shown in FIG. 11H includes action 1, action 2, and action 3, where action 1 contains 2 sub-actions, action 2 contains 1 sub-action, and action 3 contains 1 sub-action. The user has currently completed action 1, action 2, and action 3. The completion degrees of sub-action 1 and sub-action 2 of the user's action 1 are 80% and 90%, respectively; the completion degree of sub-action 1 of the user's action 2 is 60%; and the completion degree of sub-action 1 of the user's action 3 is 90%. With the exercise time of muscle 1 being 3 s in each sub-action of actions 1 and 3 and 0 s in action 2, the current degree of exercise of the user's muscle 1 is 3 × 80% + 3 × 90% + 3 × 90%, i.e., 7.8.
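Both characterizations, total exercised time (FIG. 11G) and completion-weighted time (FIG. 11H), can be illustrated as follows; the record layout is an assumption, and the example reproduces the 9 s and 7.8 figures above.

```python
# Sketch of both characterizations of a muscle's degree of exercise.
# Each record is (exercise_time_s, completion) for one completed sub-action.

def exercise_time(records):
    """Total time the muscle has been exercised, in seconds."""
    return sum(t for t, _ in records)

def exercise_degree(records):
    """Exercise time weighted by each sub-action's completion degree."""
    return sum(t * completion for t, completion in records)

# Muscle 1 across the completed sub-actions of actions 1-3:
muscle_1 = [(3, 0.80), (3, 0.90),   # action 1, sub-actions 1 and 2
            (0, 0.60),              # action 2, sub-action 1
            (3, 0.90)]              # action 3, sub-action 1
print(exercise_time(muscle_1))            # 9 (seconds)
print(round(exercise_degree(muscle_1), 1))  # 7.8 = 3*0.8 + 3*0.9 + 3*0.9
```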
In addition to the above-mentioned manner of evaluating the muscle exercise degree, in the embodiment of the present application, the muscle exercise degree may also be evaluated by other manners, which are not specifically limited herein.
In a multi-user fitness scenario, multiple users may exercise simultaneously following the fitness course displayed by the electronic device 100, and the electronic device 100 may provide fitness feedback for the multiple users at the same time. Various multi-user fitness scenarios are introduced below, taking two users as an example.
Multi-user scenario one: user 1 and user 2 exercise following the same fitness course through the same electronic device, and the training interface displayed by the electronic device may include the body posture of user 1, the body posture of user 2, and the specific content of the fitness course.
Multi-user scenario two: user 1 and user 2 follow the same fitness course simultaneously through different electronic devices for remote shared fitness. The electronic devices of user 1 and user 2 may display the same training interface, which may display in real time the body posture of user 1, the body posture of user 2, and the specific content of the fitness course.
In a multi-user fitness scenario, the electronic device 100 may obtain the motion data of multiple users through wearable devices, or through one or more cameras. This is not specifically limited herein.
The following further introduces the multi-user scenario one and the multi-user scenario two, respectively.
(1) Multi-user scenario one
In some embodiments of the present application, before users begin to follow a fitness course, the electronic device 100 may receive an input from a user identifying how many users will exercise simultaneously, so that the electronic device 100 can provide guidance and feedback for each of those users.
Illustratively, as shown in FIG. 12A, the interface 11 of the fitness app may also include a trainee number input box 206. The trainee number input box 206 may receive an input operation from the user, and in response, the electronic device 100 may display the number of people entered at the trainee number input box 206. The default value in the trainee number input box 206 may be 1.
In some embodiments, the electronic device 100 may obtain the motion data of user 1 and user 2 through the same camera; in other embodiments, through different cameras. This is not specifically limited herein.
It is understood that the electronic device 100 may recognize user 1 and user 2 in the captured picture through face recognition, and provide fitness guidance and exercise effect feedback for each user separately. To do so, the electronic device 100 first needs to acquire face images of user 1 and user 2.
In some embodiments of the present application, before the electronic device 100 receives a user's play operation, the electronic device 100 obtains the face image corresponding to each user according to personal information entered by the two users respectively. It can be understood that in this embodiment the electronic device 100 has prestored the face images of the two users and can retrieve the face image corresponding to each user according to the personal information the user enters.
For example, as shown in FIG. 12B, after the electronic device 100 confirms the number of trainees entered by the user, an account input box 601 is displayed. The account input box 601 includes an account input box 601A for user 1, an account input box 601B for user 2, and a confirmation control 601C. Each account input box may receive an account entered by a user, and according to the entered account, the electronic device 100 may acquire the corresponding face image online (or locally).
The confirmation control 601C may receive a user operation (e.g., a touch operation), and in response to the detected user operation, the electronic device 100 may close the display of the account input box 601.
Besides the user's account, the personal information may also be the user's account password or the user's fingerprint, which is not specifically limited herein. It should be noted that the personal information may be used only for the electronic device 100 to obtain the user's face image, so as to protect the privacy of the user's other information.
In some embodiments of the present application, after receiving the number of trainees entered by the user, the electronic device 100 displays a face collection interface for collecting the face images of the two users.
In this way, the users' face images can be collected on the spot, without requiring the electronic device to store them in advance.
For example, as shown in FIG. 12C, after the electronic device 100 confirms the number of trainees entered by the user, the face collection interface 13 of user 1 is displayed, which includes prompt information 602, a face collection box 603, and prompt information 604. The prompt information 602 prompts user 1 to perform face collection, the face collection box 603 is used to capture the face image, and the prompt information 604 guides the user through the face collection.
For example, as shown in FIG. 12D, after the face of user 1 is collected, prompt information 605 is displayed on the collection interface 13, prompting user 1 that face collection is complete. Then, the electronic device 100 displays the face collection interface 14 of user 2, which includes prompt information 606, a face collection box 607, and prompt information 608. In some embodiments, a thumbnail 609 of user 1's face may also be displayed on the face collection interface 14 of user 2.
Illustratively, as shown in FIG. 12E, after the face collection of user 2 is completed, the electronic device 100 displays prompt information 610 on the collection interface 14. Then, the electronic device 100 displays the interface 11 of the fitness app. In some embodiments, a thumbnail 611 of user 1 and a thumbnail 612 of user 2 may also be displayed on the interface 11.
Besides face recognition, other recognition methods, such as posture recognition, are also possible. This is not specifically limited herein.
In some embodiments of the present application, the electronic device 100 receives a user's play operation and, in response, displays a training interface that includes a fitness course window, a user 1 fitness window, and a user 2 fitness window. The fitness course window plays the fitness course, the user 1 fitness window displays the body posture of user 1 in real time, and the user 2 fitness window displays the body posture of user 2 in real time.
Illustratively, as shown in FIG. 13A, the electronic device 100 receives a user's play operation and, in response, displays a training interface 15, where the training interface 15 includes a fitness course window 701, a user 1 fitness window 702, and a user 2 fitness window 703.
It should be noted that, in addition to the display mode shown in FIG. 13A, the fitness course window, the user 1 fitness window, and the user 2 fitness window on the training interface 15 may be displayed in other ways. This is not specifically limited herein.
In some embodiments, the electronic device captures the body postures of both users through the same camera. The training interface then includes a fitness course window and a single user fitness window, where the user fitness window displays the fitness video of user 1 and user 2 in real time.
Illustratively, as shown in FIG. 13B, the electronic device 100 receives the user's play operation and, in response, displays the training interface 15, which includes a fitness course window 701 and a user fitness window 703.
It should be noted that, in addition to the display mode shown in FIG. 13B, the fitness course window and the user fitness window on the training interface 15 may be displayed in other ways. This is not specifically limited herein.
The following takes the training interface 15 shown in fig. 13A as an example to describe how the electronic device provides fitness feedback to the user 1 and the user 2.
In some embodiments of the application, the electronic device 100 plays the first action of the fitness course, guides user 1 to complete the current sub-action according to the motion data of the sub-action of the first action being performed by user 1 and the standard motion data, and simultaneously guides user 2 to complete the current sub-action according to the motion data of the sub-action being performed by user 2 and the standard motion data. The electronic device 100 may guide a user to complete the first action through one or more of voice, text, picture, and animation.
Illustratively, as shown in FIG. 13C, the electronic device plays the first action of the fitness course, the first action being "hold head and raise knee left". The electronic device 100 displays picture indication information 705 and text indication information 706 on the training interface 15 according to the motion data of the current sub-action of user 1 and the standard motion data. The picture indication information 705 indicates that user 1 should raise the knee upwards, and the text indication information 706 indicates that user 1's left knee should continue to move upwards by 10 cm. The electronic device 100 displays picture indication information 707 and text indication information 708 on the training interface 15 according to the motion data of the current sub-action of user 2 and the standard motion data. The picture indication information 707 indicates that user 2 should raise the knee upwards, and the text indication information 708 indicates that user 2's left knee should continue to move upwards by 20 cm.
For the details of how to guide each user to complete the current sub-action, refer to the single-user scenario in the foregoing embodiments, which is not repeated herein.
In some embodiments of the present application, the electronic device 100 obtains the motion data of the two users separately, and evaluates the completion degree of each user's current sub-action according to that user's motion data and the standard motion data.
In some embodiments of the present application, the electronic device 100 plays the first action of the fitness course, presents the completion degree of user 1's current sub-action by displaying and controlling the completion degree progress bar of user 1, and presents the completion degree of user 2's current sub-action by displaying and controlling the completion degree progress bar of user 2. The total length of each completion degree progress bar may indicate the highest completion degree of the first action.
In this way, user 1 and user 2 can both see the completion degrees of the two users' current sub-actions and can thus motivate each other to achieve higher completion degrees. Illustratively, as shown in FIG. 13D, the electronic device 100 plays the first action of the fitness course and displays, on the training interface, a completion degree progress bar 709 for user 1's current sub-action and a completion degree progress bar 710 for user 2's current sub-action. A lowest completion degree identifier 711 for user 1 may also be included on the completion degree progress bar 709, and a lowest completion degree identifier 712 for user 2 may also be included on the completion degree progress bar 710.
It should be noted that the lowest completion degree of each user may be preset by that user or set by default by the electronic device 100 according to the fitness course. The lowest completion degrees of the two users may be the same or different, and the positions of their lowest completion degree identifiers on the progress bars may likewise be the same or different. This is not specifically limited herein.
In addition, the specific implementation manner of the completion degree progress bar of each user may also refer to the completion degree progress bar of a single user in the foregoing embodiment, which is not described herein again.
In some embodiments of the present application, the electronic device 100 obtains the motion data of the two users separately, and evaluates the exercise effect each user has achieved in the current fitness course according to that user's motion data and the standard motion data.
In some embodiments of the present application, the exercise effect currently achieved in the fitness course is characterized by effect progress bars. While playing the fitness course, the electronic device 100 displays the effect progress bar of user 1, the effect progress bar of user 2, and the standard effect progress bar in real time. By comparing the two user effect progress bars with the standard effect progress bar, each user can more clearly know the exercise effect currently achieved.
Illustratively, as shown in fig. 13E, during the playing of the fitness course by the electronic device 100, an effect progress bar 713, an effect progress bar 714, and an effect progress bar 715 are displayed on the training interface 12, where the effect progress bar 713 is used for showing the exercise effect that the user 1 has achieved in real time, the effect progress bar 714 is used for showing the exercise effect that the user 2 has achieved in real time, and the effect progress bar 715 is used for showing the standard exercise effect of the fitness course in real time. A minimum effect mark 716 of the user 1 may be further included on the effect progress bar 713, and a minimum effect mark 717 of the user 2 may be further included on the effect progress bar 714.
It should be noted that the minimum exercise effect of each user may be preset by the user, or may be set by the electronic device 100 by default according to the fitness course. The minimum exercise effects of the two users may be the same or different, and the positions of the two users' minimum effect marks on their progress bars may also be the same or different. This is not specifically limited herein.
In addition, the specific implementation manner of each user effect progress bar may also refer to the effect progress bar of a single user in the foregoing embodiment, which is not described herein again.
For a scenario of training multiple users simultaneously, reference may be made to the specific implementation manner of a single user in the foregoing embodiment for other implementation manners of the body-building feedback of the electronic device 100 for each user, which is not described herein again.
(2) Multi-user scenario two
In some embodiments, the electronic device 100 may provide a remote shared fitness function. The user 1 may send a share request to the electronic device of the user 2 via the electronic device 100, the share request inviting the user 2 to learn a fitness course together. The user 2 may send an acceptance response to the electronic device 100 via the electronic device of the user 2, and the electronic devices of the two users then establish a connection. After the connection is established, the two electronic devices may share the motion data of the two users.
The electronic devices of the user 1 and the user 2 may be connected directly in a peer-to-peer manner, or may be connected through a third-party device, for example a server. This is not specifically limited herein.
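The patent leaves the share-request format open. A minimal sketch of what the share request and acceptance response exchanged before the connection is established might look like, assuming a simple JSON encoding (all field names are illustrative):

```python
import json

def make_share_request(sender: str, receiver: str, course_id: str) -> bytes:
    """Share request inviting another user to learn a fitness course together.
    The field names are illustrative; the patent does not define a format."""
    return json.dumps({
        "type": "share_request",
        "from": sender,
        "to": receiver,
        "course_id": course_id,
    }).encode("utf-8")

def make_accept_response(sender: str, receiver: str) -> bytes:
    """Acceptance response; after it arrives the two devices establish a
    connection (directly peer-to-peer or relayed through a server)."""
    return json.dumps({
        "type": "accept",
        "from": sender,
        "to": receiver,
    }).encode("utf-8")

request = make_share_request("user1", "user2", "course_42")
response = make_accept_response("user2", "user1")
print(json.loads(request), json.loads(response), sep="\n")
```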
In some embodiments, after the electronic devices of the user 1 and the user 2 establish a connection, the electronic device of the user 1 receives a play operation of the user 1, and in response to the play operation, sends a first request to the electronic device of the user 2 and displays the training interface of the user 1. After receiving the first request, the electronic device of the user 2 displays the training interface of the user 2.
For example, the training interface of the user 1 and the training interface of the user 2 may both refer to the training interface 15 shown in FIG. 13A. The fitness video of the user 2 displayed in the user 2 fitness window of the user 1's training interface is sent in real time by the electronic device of the user 2, and the fitness video of the user 1 displayed in the user 1 fitness window of the user 2's training interface is sent in real time by the electronic device of the user 1.
In some embodiments of the present application, the training interfaces of the electronic devices of the user 1 and the user 2 both play the first action of the fitness course. The electronic device of the user 1 displays prompt information 1 and prompt information 2, where the prompt information 1 is used to instruct the user 1 to complete the current sub-action of the first action, and the prompt information 2 is used to instruct the user 2 to complete the current sub-action of the first action. The prompt information 1 is generated by the electronic device of the user 1 according to the motion data of the current sub-action of the user 1 and the standard motion data. The prompt information 2 may be generated by the electronic device of the user 1 according to the motion data of the current sub-action of the user 2 and the standard motion data, where the motion data of the current sub-action of the user 2 is sent by the electronic device of the user 2 to the electronic device of the user 1. Alternatively, the prompt information 2 may be generated by the electronic device of the user 2 according to the motion data of the current sub-action of the user 2 and the standard motion data, and then sent by the electronic device of the user 2 to the electronic device of the user 1. This is not specifically limited herein.
Similarly, the electronic device of the user 2 also displays the above prompt information 1 and prompt information 2. The prompt information 1 may be generated by the electronic device of the user 2 according to the motion data of the current sub-action of the user 1 and the standard motion data, where the motion data of the current sub-action of the user 1 is sent by the electronic device of the user 1 to the electronic device of the user 2. Alternatively, the prompt information 1 may be generated by the electronic device of the user 1 according to the motion data of the current sub-action of the user 1 and the standard motion data, and then sent by the electronic device of the user 1 to the electronic device of the user 2. This is not specifically limited herein.
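The two implementation options described above (generate the remote user's prompt locally from received motion data, or receive the finished prompt) can be sketched as follows; the knee-height heuristic and all names are assumptions, not the patent's method:

```python
def generate_prompt(motion_data: dict, standard_data: dict) -> str:
    """Turn the gap between a user's current sub-action and the standard
    into a textual prompt (illustrative heuristic only)."""
    gap_cm = standard_data["left_knee_height_cm"] - motion_data["left_knee_height_cm"]
    return f"move the left knee upward by {gap_cm:.0f} cm" if gap_cm > 0 else "well done"

# Implementation 1: user 2's device sends raw motion data; user 1's
# device generates prompt information 2 locally against the standard data.
remote_motion = {"left_knee_height_cm": 60}   # received from user 2's device
standard = {"left_knee_height_cm": 80}
prompt_2 = generate_prompt(remote_motion, standard)

# Implementation 2: user 2's device generates prompt information 2 itself
# and sends the finished prompt string; user 1's device only displays it.
received_prompt_2 = "move the left knee upward by 20 cm"

print(prompt_2, received_prompt_2, sep="\n")
```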
For example, the training interfaces displayed by the electronic devices of the user 1 and the user 2 may refer to the training interface 15 shown in fig. 13C; the prompt information 1 may refer to the picture indication information 705 and the text indication information 706 shown in FIG. 13C; the prompt information 2 may refer to the picture indication information 707 and the text indication information 708 shown in FIG. 13C.
Specifically, for how to instruct each user to complete the current sub-action, reference may be made to the single-user scenario in the foregoing embodiment, and details are not described herein again.
In some embodiments of the present application, the electronic devices of user 1 and user 2 play the first action of the fitness class, and the electronic devices of user 1 and user 2 show the completion of the current sub-action of user 1 by displaying and controlling the completion progress bar of user 1. The electronic devices of the user 1 and the user 2 show the completion degree of the current sub-action of the user 2 by displaying and controlling the completion degree progress bar of the user 2.
Similar to the aforementioned prompt information 1 and prompt information 2, the completion degree indicated by the completion degree progress bar of the user 2 displayed by the electronic device of the user 1 may be generated by the electronic device of the user 1 according to the motion data of the current sub-action of the user 2 and the standard motion data, where the motion data of the current sub-action of the user 2 is sent by the electronic device of the user 2 to the electronic device of the user 1. Alternatively, the completion degree may be generated by the electronic device of the user 2 according to the motion data of the current sub-action of the user 2 and the standard motion data, and then sent by the electronic device of the user 2 to the electronic device of the user 1. This is not specifically limited herein. Similarly, the completion degree indicated by the completion degree progress bar of the user 1 displayed by the electronic device of the user 2 may also be implemented in the above two manners.
For example, the completion degree progress bar of the user 1 may refer to the completion degree progress bar 709 shown in FIG. 13D; the completion degree progress bar of the user 2 may refer to the completion degree progress bar 710 shown in fig. 13D.
Specifically, how to determine the completion degree of the current sub-action of the user may refer to a single user scenario in the foregoing embodiment, which is not described herein again.
In some embodiments of the application, the electronic devices of user 1 and user 2 play the first motion of the fitness session and display and control the effect progress bar of user 1, the effect progress bar of user 2, and the standard effect progress bar to show the exercise effect achieved by each of the two users.
Similar to the aforementioned prompt information 1 and prompt information 2, the exercise effect indicated by the effect progress bar of the user 2 displayed by the electronic device of the user 1 may be generated by the electronic device of the user 1 according to the motion data of the current sub-action of the user 2 and the standard motion data, where the motion data of the current sub-action of the user 2 is sent by the electronic device of the user 2 to the electronic device of the user 1. Alternatively, the exercise effect may be generated by the electronic device of the user 2 according to the motion data of the current sub-action of the user 2 and the standard motion data, and then provided by the electronic device of the user 2 to the electronic device of the user 1. This is not specifically limited herein. Similarly, the exercise effect indicated by the effect progress bar of the user 1 displayed by the electronic device of the user 2 may also be implemented in the above two manners.
For example, the effect progress bar of the user 1 may refer to the effect progress bar 713 shown in fig. 13E; the effect progress bar of the user 2 may refer to the effect progress bar 714 shown in fig. 13E.
Specifically, how to determine the exercise effect currently achieved by the user may refer to a single user scenario in the foregoing embodiment, which is not described herein again.
In this embodiment, after the user finishes the exercise session, the electronic device 100 may generate and display an exercise report for the user.
In some embodiments of the present application, the fitness report may include one or more of: the completion degree of each action in the fitness course, an evaluation of the exercise effect of each sub-lesson in the fitness course, the exercise effect of the user over the entire fitness course, a report on the stretching of the user's muscles, the user's actual training time for each action, and the user's overall actual training time.
Wherein the report on the stretching of the user's muscles may include an assessment of the overall degree of exercise of the user's muscles; it may further include a user muscle diagram, through which the exercise degree of each muscle can be shown; and it may also include instructions on the exercise degree of a particular muscle.
Illustratively, as shown in fig. 14A, a fitness reporting interface 15 is provided in the embodiments of the present application.
The fitness report interface 15 may include a course name 901, the user's actual exercise time 902, a score 903 for the exercise effect of the entire fitness course, a muscle exercise report 904, the user's actual exercise time 905 for each action, a replay control 906, a close control 907, and a share control 908. Wherein:
the muscle exercise report 904 may include: general muscle stretch evaluation 904A, human muscle front view 904B, human muscle back view 904C, muscle illustration 904D, and exercise effect 904E for a specific muscle.
The electronic device 100 determines the total stretching degree evaluation 904A according to the overall exercise effect of the user's muscles. The specific content of the total stretching degree evaluation 904A may be, for example, "Very well done; the stretching effect has been achieved".
In the human muscle front view 904B and the human muscle back view 904C, the exercise degree of each muscle can be shown by different colors. The muscle diagram illustration 904D is a legend of the colors used for the different exercise degrees in the muscle diagrams. In some embodiments, the degree of muscle exercise can be divided into 3 levels and presented in 3 different colors, and the muscle diagram illustration 904D includes the colors corresponding to these 3 levels. The exercise effect 904E for a particular muscle may include a description of the exercise effect of the muscle that the fitness course primarily exercises. For example, the exercise effect 904E for a particular muscle includes a description of the muscles surrounding the joint, which may specifically be "stabilizing the knee joint".
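As a hypothetical illustration of the 3-level color presentation described above (the level boundaries and colors are assumptions; the patent only states that 3 levels map to 3 colors):

```python
# Illustrative 3-level mapping from a muscle's exercise degree (0..1)
# to the colors used in the human muscle front/back views.
LEVEL_COLORS = ["lightgray", "orange", "red"]  # low, medium, high (assumed)

def muscle_color(exercise_degree: float) -> str:
    """Bucket an exercise degree into one of 3 levels and return its color."""
    if exercise_degree < 1 / 3:
        return LEVEL_COLORS[0]
    if exercise_degree < 2 / 3:
        return LEVEL_COLORS[1]
    return LEVEL_COLORS[2]

muscles = {"rectus_abdominis": 0.2, "quadriceps": 0.5, "gluteus_maximus": 0.9}
for name, degree in muscles.items():
    print(name, "->", muscle_color(degree))
```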
In some embodiments, the electronic device may also provide corresponding stretching suggestions based on the current exercise degree of the user's respective muscles. For example, the fitness report interface 15 may also include a muscle stretching suggestion, the specific content of which is "it is recommended to enhance the stretching of the rectus abdominis".
The replay control 906 can receive a user operation (e.g., a touch operation), and in response to detecting the user operation, the electronic device 100 can play the fitness course again.
The close control 907 may receive a user operation (e.g., a touch operation), and in response to detecting the user operation, the electronic device 100 may close the workout reporting interface 15 and display the workout app interface 11.
The share control 908 may receive a user operation (e.g., a touch operation), and in response to detecting the user operation, the electronic device 100 may display a sharing interface for the fitness report.
It will be appreciated that for a multi-user fitness scenario, the fitness report for each user may be included in the fitness report for the workout.
Illustratively, taking two users as an example, as shown in fig. 14B, a user 1 control 909 and a user 2 control 910 are also included on the fitness reporting interface 15. Display area 16 of fitness report interface 15 is used to display fitness reports for user 1 or user 2. As shown, user 2 control 910 may receive a user operation (e.g., a touch operation), in response to which electronic device 100 may display a workout report for user 2 in display area 16 of workout report interface 15.
In addition, the user may also view fitness reports for historical workout sessions through the user center of the fitness app.
Illustratively, as shown in fig. 14C, the user clicks the user center control 204A of the fitness app interface 11, and in response to detecting the above user operation, the electronic device 100 displays the user center content in the display area 205. The user center content includes a historical training course title bar 1001, a control 1002, a course item bar 1003, a course item bar 1004, a fitness plan title bar 1005, and a control 1006. Wherein:
control 1002 can receive a user operation (e.g., a touch operation) and, in response to detecting the user operation, electronic device 100 can display more bars of historical workout items.
The course item bar 1003 includes a historical course profile 1003A and a workout report 1003B. Historical course profile 1003A may include the course name, the user's date of exercise. The fitness report 1003B may receive a user operation (e.g., a touch operation), and in response to the detected user operation, the electronic device 100 may display a fitness report corresponding to the historical lesson.
The course item bar 1004 includes a historical course profile 1004A and a workout report 1004B. Historical course profile 1004A may include a course name, a user's date of exercise. The workout report 1004B may receive a user operation (e.g., a touch operation), and in response to detecting the user operation, the electronic device 100 may display a workout report corresponding to the historical workout.
Control 1006 can receive a user operation (e.g., a touch operation), and in response to detecting the user operation, electronic device 100 can display a fitness plan for the user.
In addition, in existing fitness software, the progress bar displayed by the electronic device 100 is only used for indicating the time progress of the fitness course and cannot reflect the user's exercise effect in real time. The time progress may also fail to match the user's actual exercise progress. When following a fitness course, an ordinary user may not be able to keep up with the exercise speed in the course: the course may have begun playing the next action while the user's current action is not yet completed. In this case, the user can only give up the current action and follow the course to the next action, or complete the current action and give up following the next action. If the user wants to complete the two actions in sequence, the user may need to manually pause the playing of the fitness course and resume playing after the current action is completed. Such situations may dampen the user's fitness enthusiasm, increase the user's fitness frustration, and are not conducive to improving the user's fitness effect.
It is understood that the time progress may also be represented by the numerical time 304 shown in FIG. 6B.
An embodiment of the present application further provides a fitness course interaction method. After the electronic device 100 finishes playing the first action, if it is detected that the user is still performing a sub-action of the first action, the electronic device 100 suspends the playing of the fitness course and plays the next action of the first action after the user completes the current sub-action.
In this way, the user's actual training time may differ from the time progress of the fitness course, the playing of the fitness course can match the user's actual exercise progress, and the user experience is effectively improved.
The single-user training interface 12 shown in fig. 9A is taken as an example for description. Illustratively, as shown in fig. 15A, the training interface 12 may also display the user's actual training time 801. The current time progress of the fitness course is 20, while the user's actual training time is 24. As shown in the figure, when the "head-holding left knee-raising" action of the fitness course has finished playing but the user's head-holding left knee-raising action is not yet finished, the electronic device 100 suspends the playing of the fitness course. For example, as shown in fig. 15B, when the user ends the head-holding knee-raising action at the actual training time of 24, the electronic device 100 plays the next action of the fitness course.
In this embodiment of the present application, the electronic device 100 determines, according to the real-time motion trajectory of each joint point of the user and the standard motion trajectory of the first action, whether the user is still performing a sub-action of the first action and whether the current sub-action has ended.
It will be appreciated that the above fitness feedback scheme of matching the user's actual training progress is also applicable to multi-user scenarios. After the electronic device 100 finishes playing the first action, if it is detected that at least one of the multiple users is still performing a sub-action of the first action, the electronic device 100 plays the next action of the first action after the at least one user completes the current sub-action.
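A minimal sketch of the playback logic described in this embodiment: after an action's video ends, playback waits until the tracked user(s) finish the current sub-action before the next action starts. The polling structure and the `user_still_performing` check, which would in practice compare real-time joint trajectories with the standard trajectory, are illustrative assumptions:

```python
import time

def play_course(actions, user_still_performing, poll_interval_s=0.5):
    """Play actions in order; after an action's video ends, wait until no
    tracked user is still performing one of its sub-actions before moving
    on to the next action."""
    for action in actions:
        print("playing:", action)
        # ... video playback of `action` happens here ...
        while user_still_performing(action):
            print("paused: waiting for users to finish", action)
            time.sleep(poll_interval_s)  # re-check the motion data later

# Demo stub: pretend users need two extra polls to finish the first action.
pending = {"raise the left knee while holding the head": 2}
def still_busy(action):
    if pending.get(action, 0) > 0:
        pending[action] -= 1
        return True
    return False

play_course(["raise the left knee while holding the head", "next action"], still_busy)
```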
Based on some of the embodiments shown in fig. 1-15B, the following describes the exercise course interaction method provided by the present application.
Referring to fig. 16, fig. 16 is a schematic flowchart of a fitness course interaction method provided by an embodiment of the present application. As shown in fig. 16, the fitness course interaction method provided in the embodiment of the present application includes, but is not limited to, steps S101 to S108. Possible implementations of embodiments of the method are described further below.
S101, the first electronic device receives a playing operation of a first user.
The first electronic device may refer to the electronic device 100 in the foregoing embodiment.
In this application, the play operation may be a gesture acting on the display screen of the terminal, or may be a voice instruction, an eyeball movement, a shake operation, or the like.
As can be seen from the foregoing related embodiments of fig. 5A and 5B, the play operation may be: the user's finger clicking a preset area of the fitness course on the terminal display screen. The preset area may be the cover of the fitness course shown in fig. 5A. The above clicking operation may be performed with one or more finger joints, finger pads, fingertips, a stylus, and the like, which is not specifically limited herein. The preset area of the fitness course is displayed in the fitness app interface 11 shown in fig. 5A and 5B.
The fitness course may include one or more actions, and each of the one or more actions includes one or more identical sub-actions. The fitness course may also include a plurality of sub-lessons, and each sub-lesson may include one or more consecutive actions of the fitness course.
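As a hypothetical sketch of the course structure just described (a course holds actions, each action repeats one identical sub-action one or more times, and sub-lessons group consecutive actions); the class names and sample data are illustrative:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Action:
    name: str
    sub_action_count: int  # how many identical sub-actions the action repeats

@dataclass
class SubLesson:
    actions: List[Action] = field(default_factory=list)  # consecutive actions

@dataclass
class FitnessCourse:
    name: str
    sub_lessons: List[SubLesson] = field(default_factory=list)

course = FitnessCourse("morning stretch", [
    SubLesson([Action("raise the left knee while holding the head", 2)]),
    SubLesson([Action("side lunge", 3), Action("high knees", 2)]),
])
total_sub_actions = sum(a.sub_action_count
                        for sl in course.sub_lessons for a in sl.actions)
print(total_sub_actions)  # 7
```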
S102, responding to the detected input operation, displaying a first interface by the first electronic equipment, wherein the first interface comprises a first area and a second area, the first area is used for displaying the fitness course, and the second area is used for displaying an image containing a first user; the first interface comprises a first identifier for indicating the exercise effect of the first user in the fitness class, wherein the exercise effect of the first user in the fitness class is determined by the first electronic device according to the movement data of the first user in the fitness class.
Referring to fig. 5A and 5B, the first interface may be the training interface 12. The first area may be the fitness course window 301, and the second area may be the user fitness window 302. As shown in FIG. 5A, the fitness course window 301 and the user fitness window 302 are displayed side by side with no overlapping area. As shown in fig. 5B, the first electronic device displays the user fitness window 302 in full screen and displays the fitness course window 301 floating on the display screen. In addition to the display modes of the first area and the second area shown in fig. 5A and 5B, other display modes are possible, and this embodiment of the present application is not specifically limited thereto.
Referring to fig. 5C, the image of the first user may be the fitness video image of the first user captured by the first electronic device through the camera, or may be a virtual portrait generated by the first electronic device according to the motion data of the first user collected in real time by a wearable device.
In some embodiments, the first interface further includes a second identifier for indicating an exercise effect of the person performing the second set of actions during the workout.
In this application, the first identifier may be presented in one or more of voice, text, picture, and animation, which is not specifically limited herein. For example, the first identifier may show the exercise effect that the user has achieved through a progress bar, a percentage, a score, or a picture color.
In some embodiments, the first indicator may be a first progress bar, a length of a first portion of the first progress bar indicating an exercise effect of the first user performing the first set of actions, and a total length of the first progress bar indicating an exercise effect of the character performing all of the actions of the workout. Similarly, the second mark may be a second progress bar, a length of a second portion of the second progress bar is used to indicate an exercise effect of the person performing the second set of actions in the fitness class, and a total length of the second progress bar is also used to indicate an exercise effect of the person performing all actions in the fitness class.
Referring to the embodiment of fig. 9A to 9E, the first progress bar may refer to an effect progress bar 401 shown in fig. 9A to 9E, and the second progress bar may refer to an effect progress bar 402 shown in fig. 9A to 9E. The first portion of the first progress bar is a shaded portion of the progress bar 401.
In some embodiments of the present application, at a first point in time, the first identifier is specifically for indicating an exercise effect of the first user performing the first set of actions; wherein the first action set comprises part or all of a second action set, and the second action set is one or more sub-actions from a starting time point to a first time point of the fitness course; the exercise effect of the first user performing the first set of actions is determined by the first electronic device from the motion data of the first user performing the first set of actions. Wherein, the first time point can be any time point in the playing process of the fitness course.
Referring to fig. 9A, the first user performs the current sub-action at 20 minutes and 45 seconds of the fitness course being played. At this point in time, where the first time point is 20 minutes and 45 seconds of the fitness course, the first set of actions of the first user may include: all sub-actions that the first user performed after the fitness course began playing and before performing the current sub-action. The second set of actions of the fitness course may include: all actions performed by the coach in the fitness course within 0 seconds to 20 minutes 45 seconds of the playing of the fitness course. At this time, the electronic device displays an effect progress bar 401 for indicating the exercise effect of the first user performing the first set of actions, and an effect progress bar 402 for indicating the exercise effect of the coach in the fitness course performing the second set of actions.
For example, a fitness course includes 10 actions, where the first action includes 2 sub-actions, the second action includes 3 sub-actions, and the third action includes 2 sub-actions. At 10 minutes and 8 seconds of the fitness course, the coach is performing the second sub-action of the third action, and the first user is performing the first sub-action of the third action. At this point in time, where the first time point is 10 minutes and 8 seconds of the fitness course, the first set of actions of the first user may include: all sub-actions performed by the first user after the fitness course began playing and before the first user performs the first sub-action of the third action. The second set of actions of the fitness course may include: the coach's 2 sub-actions of the first action, 3 sub-actions of the second action, and the first sub-action of the third action. It is to be appreciated that the first set of actions can include some or all of the second set of actions.
S103, the first user executes the first sub-action, and the first electronic device acquires motion data of the first user executing the first sub-action in real time.
In an embodiment of the application, the exercise data of the first user in the fitness class is determined according to the image of the first user.
Specifically, the motion data of the first user may be obtained by the first electronic device by capturing the fitness video of the first user through the camera and performing image processing on the frame images of the fitness video. The motion data of the first user may also be acquired by the first electronic device in real time from a wearable device of the first user.
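A minimal, hypothetical sketch of the camera-based path: per-frame joint positions are extracted from the fitness video, and consecutive frames form each joint's motion trajectory. The `estimate_joints` stub stands in for any real pose-estimation model and is purely illustrative:

```python
import numpy as np

def estimate_joints(frame: np.ndarray) -> dict:
    """Stand-in for a pose-estimation model (e.g. a CNN-based skeleton
    detector); returns joint name -> (x, y) pixel coordinates. Purely a
    placeholder so the pipeline below is runnable."""
    h, w = frame.shape[:2]
    return {"left_knee": (w * 0.4, h * 0.7), "left_hip": (w * 0.45, h * 0.5)}

def motion_data_from_video(frames) -> list:
    """Per-frame joint positions; consecutive entries form each joint's
    motion trajectory, which is compared with the standard motion data."""
    return [estimate_joints(f) for f in frames]

frames = [np.zeros((720, 1280, 3), dtype=np.uint8) for _ in range(3)]
trajectory = motion_data_from_video(frames)
print(trajectory[0]["left_knee"])
```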
Referring to the embodiment of fig. 4, when the first user opens the fitness app, the first electronic device starts to acquire the motion data of the user. In this way, the user's motion data can be acquired before the fitness course is played, so that the electronic device can obtain the relative position information of each joint point of the user in advance and acquire the user's motion data more accurately during the fitness course. Referring to the embodiment of fig. 5A and 5B, when the first electronic device receives the play operation of the first user, the first electronic device starts to acquire the motion data of the user, and may stop acquiring the user's motion data when the user exits the current fitness course. In this way, the motion data of the first user is acquired only while the fitness course is playing, which can save energy.
In some embodiments, the first set of actions includes a first sub-action.
S104, the first electronic equipment adjusts the first identification according to the motion data of the first user executing the first sub-action and the standard motion data of the fitness course.
Specifically, the first sub-action is a sub-action of the first action of the fitness course. The first electronic device determines the exercise effect of the first user performing the first sub-action according to the motion data of the first user performing the first sub-action and the motion data of the person in the fitness course performing the first sub-action, and adjusts the first identifier according to that exercise effect. The exercise effect indicated by the first identifier is a first value before the first user performs the first sub-action, and is a second value after the first user performs the first sub-action.
It should be noted that the exercise effect of the first user performing the first set of actions is determined by the first electronic device based on the motion data of the first user performing the first set of actions and the motion data of the person performing the second set of actions during the workout.
In some embodiments, the exercise effect of the first sub-action is determined according to a standard degree of the first sub-action, the standard degree of the first sub-action being determined according to the motion data of the first user performing the first sub-action and the motion data of the person in the workout session performing the first sub-action. The exercise effect of the first sub-action may also be determined according to the standard degree of the first sub-action and the completion time of the first sub-action.
In the embodiment of the present application, the standard degree of the sub-action may also be referred to as a completion degree of the sub-action, and is not specifically limited herein.
Referring to the embodiment of fig. 9C, the first sub-action is "holding the head and raising the left knee". After the user performs the first sub-action, the shaded portion of the progress bar 401 increases; that is, the exercise effect indicated by the progress bar 401 increases from the first value to the second value. In this embodiment, the second value is greater than or equal to the first value, and the higher the standard degree of the first sub-action, the greater the difference between the second value and the first value.
Referring to fig. 9D, the first sub-motion is "holding head and raising knee left", and the first predetermined standard degree may refer to the lowest completion degree in the previous embodiment. After the user performs the first sub-action, if it is determined that the standard degree of the first sub-action is lower than the first preset standard degree, the length of the shaded portion of the progress bar 401 is controlled to be unchanged, that is, the exercise effect indicated by the progress bar 401 is unchanged. In this embodiment, when the standard degree of the first sub-action is lower than the first preset standard degree, the second value is equal to the first value; when the standard degree of the first sub-action is higher than or equal to the first preset standard degree, the second value is larger than the first value, and the higher the standard degree of the first sub-action is, the larger the difference between the second value and the first value is.
For example, the first predetermined standard degree may be 60%.
Referring to the embodiment of fig. 9E, the first sub-action is "holding head and raising knee left", and the second predetermined standard degree may refer to the predetermined completion degree in the foregoing embodiment. After the user performs the first sub-action, if it is determined that the standard degree of the first sub-action is lower than the second preset standard degree, the length of the shaded portion of the progress bar 401 is controlled to be reduced, that is, the exercise effect indicated by the progress bar 401 is reduced. In this embodiment, when the standard degree of the first sub-action is lower than the second preset standard degree, the second value is smaller than the first value, and the lower the standard degree of the first sub-action is, the larger the difference between the second value and the first value is. When the standard degree of the first sub-action is higher than or equal to a second preset standard degree, the second value is larger than or equal to the first value, and the higher the standard degree of the first sub-action is, the larger the difference between the second value and the first value is.
For example, the second preset standard degree may be 30%.
In some embodiments, when the completion time of the first sub-action is outside the preset time range of the first sub-action, the second value is smaller than the first value, and the farther the completion time of the first sub-action is from the preset time range, the larger the difference between the second value and the first value is; or when the completion time of the first sub-action is out of the preset time range, the second value is equal to the first value; or when the completion time of the first sub-action is out of the preset time range, the second value is larger than the first value, and the farther the completion time of the first sub-action is away from the preset time range, the smaller the difference between the second value and the first value is. In the present application, the preset time range may also be referred to as a qualified time range.
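The update rules in the preceding paragraphs can be combined in many ways; the following sketch shows one illustrative combination, where the indicated effect drops below a low standard-degree threshold, stays unchanged between the thresholds or when the completion time falls outside the qualified time range, and otherwise grows with the standard degree. All thresholds, gains, and the time range are assumptions, not values from the patent:

```python
def updated_effect(first_value: float,
                   standard_degree: float,
                   completion_time_s: float,
                   time_range_s: tuple = (3.0, 6.0),
                   penalty_degree: float = 0.30,
                   min_degree: float = 0.60,
                   gain: float = 10.0,
                   penalty: float = 5.0) -> float:
    """Map the first value to the second value indicated by the first
    identifier after one sub-action, per the rules sketched above."""
    low, high = time_range_s
    # Completion time outside the qualified range keeps the effect unchanged
    # (one of the three time-based options described above).
    if not (low <= completion_time_s <= high):
        return first_value
    if standard_degree < penalty_degree:
        # Very non-standard sub-action: the indicated effect decreases.
        return max(0.0, first_value - penalty * (penalty_degree - standard_degree))
    if standard_degree < min_degree:
        # Below the minimum standard degree: the effect stays unchanged.
        return first_value
    # Qualified sub-action: the effect grows with the standard degree.
    return first_value + gain * standard_degree

print(updated_effect(40.0, 0.90, 4.0))  # 49.0: effect increases
print(updated_effect(40.0, 0.50, 4.0))  # 40.0: unchanged
print(updated_effect(40.0, 0.10, 4.0))  # 39.0: effect decreases
print(updated_effect(40.0, 0.90, 9.0))  # 40.0: completion time out of range
```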
In some embodiments, the exercise effect of the first sub-action is determined according to the energy consumption amount of the first sub-action; the second value is greater than or equal to the first value, the greater the energy consumption of the first sub-action, the greater the difference between the second value and the first value; the energy consumption of the first sub-action is determined based on the height of the first user, the weight of the first user, the time of completion of the first sub-action, the motion data of the first user performing the first sub-action, and the motion data of the person in the fitness class performing the first sub-action.
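A hypothetical sketch of the energy estimate described above, using the inputs the patent lists (height, weight, completion time, and the comparison of the user's motion data with the standard data, summarized here as a standard degree); the MET-based formula and scaling factors are assumptions:

```python
def sub_action_energy_kcal(height_m: float,
                           weight_kg: float,
                           completion_time_s: float,
                           standard_degree: float,
                           met: float = 5.0) -> float:
    """Illustrative energy estimate for one sub-action."""
    hours = completion_time_s / 3600.0
    base_kcal = met * weight_kg * hours       # classic MET-based estimate
    intensity = 0.5 + 0.5 * standard_degree   # fuller motion burns more
    reach = height_m / 1.70                   # taller users move limbs farther
    return base_kcal * intensity * reach

print(round(sub_action_energy_kcal(1.75, 70.0, 4.0, 0.8), 3))
```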
In this application, a fitness session may also include a plurality of sub-sessions, each of the plurality of sub-sessions including one or more consecutive movements of the fitness session. It is understood that the first action in the ith sub-lesson of the plurality of sub-lessons is the next action to the last action in the i-1 th sub-lesson of the plurality of sub-lessons, and i is a positive integer greater than or equal to 2.
Referring to the embodiment of FIG. 10A, after the person in the fitness course in the first area performs the last action in the (i-1)th sub-lesson, the first progress bar further includes a completion identifier of the (i-1)th sub-lesson of the first user, the completion identifier being used to indicate the exercise effect of the first user performing a third set of actions, the third set of actions including some or all of the sub-actions of the first i-1 sub-lessons of the fitness course.
In addition, referring to the embodiment of FIG. 10B, after the person in the fitness course in the first area performs the last action in the (i-1)th sub-lesson, the completion identifier of the (i-1)th sub-lesson of the first user is displayed on the first progress bar, and an exercise effect evaluation box of the (i-1)th sub-lesson may also be displayed. Referring to the embodiment of fig. 10C, the displayed completion identifier of a sub-lesson may receive an input operation of the user, and in response to the detected input operation, the first electronic device may display the evaluation box of the sub-lesson again.
In this application, in addition to a single user exercising along with the fitness course, a plurality of users may also exercise together along with the fitness course, and the first electronic device can provide fitness feedback for the plurality of users at the same time.
In some embodiments, the first interface may further include a third area for displaying an image containing the second user; the first interface further comprises a third identifier, wherein the third identifier is used for indicating the exercise effect of the second user in the fitness course, and the exercise effect indicated by the third identifier is determined according to the movement data of the second user in the fitness course. Referring to fig. 13A and 13B, the first interface may be a training interface 15.
In this application, the first user may also be referred to as user 1, and the second user may also be referred to as user 2.
Referring to fig. 13A, the first area may be the fitness course window 701, the second area may be the user fitness window 702, and the third area may be the user fitness window 703. In addition to the display mode shown in fig. 13A, the fitness course window 701, the user fitness window 702, and the user fitness window 703 may have other display modes, which are not limited herein. In a possible implementation manner, the first user and the second user use the same electronic device, and the image of the second user may be acquired by the first electronic device through the camera. In another possible implementation manner, the first user and the second user use different electronic devices to exercise simultaneously along the same fitness course, and the image of the second user is acquired by the second electronic device through the camera and sent to the first electronic device.
When the first user and the second user use different electronic devices, the exercise data of the second user in the fitness course may be determined by the first electronic device through the image of the second user, or determined by the second electronic device through the image of the second user and sent to the first electronic device.
Referring to FIG. 13B, the first area may be the fitness course window 701, and the second area and the third area may be the same area, i.e., the user fitness window 702. It will be appreciated that the images containing the first user and the second user may be displayed in the same area. In a possible implementation manner, the first electronic device acquires images including the first user and the second user through the camera and displays them in the user fitness window 702. In another possible implementation manner, the first electronic device generates a virtual portrait of the first user from user data collected by the wearable device of the first user and displays it in the user fitness window 702; meanwhile, the first electronic device generates a virtual portrait of the second user from user data collected by the wearable device of the second user and displays it in the user fitness window 702.
In some embodiments, the third identification may be specifically for instructing the second user to perform an exercise effect for the fourth set of actions; wherein the fourth action set includes some or all of the second action set. Referring to fig. 13E, a third indicator may be an effect progress bar 714, the third indicator used to indicate to the second user that an exercise effect of the fourth set of actions was performed. The first identification may also be an effects progress bar 713 and the second identification may also be an effects progress bar 715.
Referring to fig. 17, fig. 17 shows a schematic structural diagram of an exemplary electronic device 100 provided in an embodiment of the present application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to finish the control of instruction fetching and instruction execution.
The NPU may perform artificial intelligence operations by means of convolutional neural network (CNN) processing. For example, a CNN model may be used to perform large amounts of information recognition and information screening, enabling training and recognition for intelligent scenarios.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The charging management module 140 is configured to receive charging input from a charger. Wherein the charger may be a wireless or wired charger.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code Division Multiple Access (CDMA), wideband Code Division Multiple Access (WCDMA), time division code division multiple access (time-division multiple access, TD-SCDMA), long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
In some embodiments of the present application, the interface content currently output by the system is displayed in the display screen 194. For example, the interface content is an interface provided by an instant messaging application.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like. The ISP is used to process the data fed back by the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor, which processes input information quickly by referring to a biological neural network structure, for example, by referring to a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals.
The earphone interface 170D is used to connect a wired earphone.
The pressure sensor 180A is used for sensing a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. In some optional embodiments of the present application, the pressure sensor 180A may be configured to capture a pressure value generated when the user's finger contacts the display screen and transmit the pressure value to the processor, so that the processor identifies through which finger portion the user inputs the user operation.
The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions.
In some alternative embodiments of the present application, the pressure sensor 180A may transmit the detected capacitance value to the processor, so that the processor recognizes through which finger portion (knuckle or pad, etc.) the user inputs the user operation. In some alternative embodiments of the present application, the pressure sensor 180A may also calculate the number of touch points according to the detected signal and transmit the calculated value to the processor, so that the processor recognizes the user operation by the single-finger or multi-finger input.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake.
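For the anti-shake calculation just described, the following is a hedged sketch of the underlying geometry: for a shake of angle θ, the image displacement on the sensor is approximately the focal length times tan θ, so the lens is driven the same distance in the opposite direction. The class and its names are hypothetical; real optical image stabilization controllers are considerably more involved.

```java
// Hypothetical sketch of the compensation-distance calculation: move the
// lens by -f * tan(theta) to cancel a shake of angle theta.
public class OisCompensator {
    private final double focalLengthMm; // effective focal length of the lens module

    public OisCompensator(double focalLengthMm) {
        this.focalLengthMm = focalLengthMm;
    }

    /** Returns the lens offset (mm) that cancels a shake of the given angle (radians). */
    public double compensationOffsetMm(double shakeAngleRad) {
        return -focalLengthMm * Math.tan(shakeAngleRad); // negative: opposite to the shake
    }
}
```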
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
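As a concrete illustration, Android's SensorManager already exposes the barometric conversion this paragraph describes; the sketch below, assuming a pressure reading in hPa from the barometric pressure sensor, converts it into an altitude estimate.

```java
// Convert a barometric pressure reading (hPa) into an altitude estimate (m)
// using the framework's built-in barometric formula.
import android.hardware.SensorManager;

public class AltitudeHelper {
    public static float altitudeMeters(float pressureHpa) {
        return SensorManager.getAltitude(
                SensorManager.PRESSURE_STANDARD_ATMOSPHERE, // sea-level reference, 1013.25 hPa
                pressureHpa);
    }
}
```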
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flipping open can then be set according to the detected open or closed state of the holster or of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The sensor can also be used to recognize the posture of the electronic device, and is applied in landscape/portrait switching, pedometers, and other applications. In some alternative embodiments of the present application, the acceleration sensor 180E may be used to capture the acceleration values generated when the user's finger contacts the display screen and transmit them to the processor, so that the processor can identify with which finger part the user entered the user operation.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light to the outside through the light emitting diode. The electronic device 100 detects infrared reflected light from a nearby object using a photodiode.
The ambient light sensor 180L is used to sense ambient light brightness. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect fingerprints. The electronic device 100 can use the collected fingerprint characteristics for fingerprint unlocking, accessing an application lock, taking a photo with a fingerprint press, answering an incoming call with a fingerprint press, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, electronic device 100 implements a temperature processing strategy using the temperature detected by temperature sensor 180J.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form a touch screen. The touch sensor 180K is used to detect a touch operation applied on or near it, that is, an operation in which the user's hand, elbow, stylus, or the like contacts the display screen 194. The touch sensor can pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100 at a position different from that of the display screen 194.
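To illustrate this touch path from the application side, here is a minimal sketch assuming the standard Android view API: touch operations detected by the touch screen reach application code as MotionEvents, whose action encodes the touch event type.

```java
// Minimal sketch: receive touch operations from the touch screen and
// branch on the touch event type delivered by the framework.
import android.view.MotionEvent;
import android.view.View;

public class TouchDemo {
    public static void attach(View view) {
        view.setOnTouchListener((v, event) -> {
            switch (event.getActionMasked()) {
                case MotionEvent.ACTION_DOWN:  // contact touches the screen
                case MotionEvent.ACTION_MOVE:  // contact moves across the screen
                case MotionEvent.ACTION_UP:    // contact is lifted
                    return true;               // event consumed
            }
            return false;
        });
    }
}
```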
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human body pulse to receive the blood pressure pulsation signal.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration prompts as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card.
The software system of the electronic device 100 may employ a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 18 is a software configuration block diagram of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 18, the application packages may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
In this application, a floating window launcher component (floating launcher) may be added to the application layer as the default application displayed in the aforementioned floating window, providing the user with an entry point into another application.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 18, the application framework layer may include a window manager (window manager), a content provider, a view system, a phone manager, a resource manager, a notification manager, an activity manager (activity manager), and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like. In this application, a window type may be extended from the Android native PhoneWindow, dedicated to displaying the aforementioned floating window so as to distinguish it from ordinary windows; such a window has the attribute of floating on the topmost layer of the series of windows. In some alternative embodiments, the window size may be set to a suitable value according to the actual screen size, following an optimal display algorithm. In some possible embodiments, the aspect ratio of the window may default to the screen aspect ratio of a conventional mainstream handset. Meanwhile, to make it easy for the user to close, exit, or hide the floating window, a close key and a minimize key may additionally be drawn at the upper right corner.
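As a hedged illustration of such a floating window, the sketch below uses the public WindowManager overlay API rather than the internal PhoneWindow extension described above; it assumes the SYSTEM_ALERT_WINDOW permission has been granted, and the sizing policy and aspect ratio shown are illustrative assumptions.

```java
// Sketch: add a view as a floating window on the topmost window layer.
import android.content.Context;
import android.graphics.PixelFormat;
import android.view.Gravity;
import android.view.View;
import android.view.WindowManager;

public class FloatingWindowHelper {
    public static void show(Context context, View content, int screenWidthPx) {
        int width = screenWidthPx / 2;   // assumed "suitable value" from a sizing policy
        int height = width * 16 / 9;     // assumed mainstream-phone screen aspect ratio
        WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                width, height,
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY, // floats above other windows
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                PixelFormat.TRANSLUCENT);
        params.gravity = Gravity.TOP | Gravity.END; // close/minimize keys sit at the upper right
        WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        wm.addView(content, params);
    }
}
```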
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls, such as controls for displaying text and controls for displaying pictures. The view system may be used to build applications. A display interface may be composed of one or more views. For example, a display interface including a short-message notification icon may include a view for displaying text and a view for displaying pictures. In this application, button views for operations such as closing and minimizing the floating window may be added accordingly and bound to the floating window in the window manager.
The phone manager is used to provide the communication functions of the electronic device 100, such as management of call status (including connected, disconnected, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager allows applications to display notification information in the status bar 207. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, for example notifications announcing download completion or message alerts. A notification may also appear in the form of a chart or scroll-bar text in the status bar at the top of the system, such as a notification of an application running in the background, or in the form of a dialog window on the display. Notifications may, for example, prompt text information in the status bar, sound a prompt tone, vibrate the electronic device, or flash the indicator light.
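For illustration, the following minimal sketch posts the kind of status-bar notification described above through the public framework API; the channel identifier, icon, and texts are illustrative placeholders.

```java
// Sketch: post a "download complete" notification to the status bar.
import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.content.Context;

public class NotifyHelper {
    public static void notifyDownloadComplete(Context context) {
        NotificationManager nm =
                (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);
        // Channels are required on Android 8.0+; id and name are placeholders.
        NotificationChannel channel = new NotificationChannel(
                "downloads", "Downloads", NotificationManager.IMPORTANCE_DEFAULT);
        nm.createNotificationChannel(channel);
        Notification n = new Notification.Builder(context, "downloads")
                .setSmallIcon(android.R.drawable.stat_sys_download_done)
                .setContentTitle("Download complete")
                .setContentText("The file has finished downloading.")
                .build();
        nm.notify(1, n);
    }
}
```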
The activity manager is used to manage the active services running in the system, including processes, applications, services, task information, and the like. In this application, an Activity task stack dedicated to managing the application Activity displayed in the floating window may be added to the activity manager module, so that the application Activity and task in the floating window do not conflict with the application displayed full-screen.
In this application, a motion detector component may additionally be arranged in the application framework layer, to perform logical judgment on acquired input events and identify the type of each input event. For example, whether an input event is a finger-joint touch event or a finger-pad touch event is determined from information such as the touch coordinates contained in the input event and the timestamp of the touch operation. Meanwhile, the motion detection component can also record the trajectory of an input event, judge which gesture rule the input event matches, and respond with different operations for different gestures.
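The following is a minimal sketch of the kind of logical judgment such a motion detector might perform, assuming the input event is available as an Android MotionEvent; the component itself and all thresholds are hypothetical illustrations of the idea, not the actual implementation of this application.

```java
// Hypothetical sketch: classify an input event as a knuckle or pad touch
// from its timestamps and contact size.
import android.view.MotionEvent;

public class MotionDetectorSketch {
    private static final long TAP_MAX_DURATION_MS = 150; // assumed: knuckle taps are brief
    private static final float PAD_MIN_SIZE = 0.15f;     // assumed: pads leave a larger contact

    public static String judge(MotionEvent event) {
        long duration = event.getEventTime() - event.getDownTime(); // from event timestamps
        float size = event.getSize(); // normalized contact area at the touch coordinates
        if (duration <= TAP_MAX_DURATION_MS && size < PAD_MIN_SIZE) {
            return "finger-joint (knuckle) touch event";
        }
        return "finger-pad touch event";
    }
}
```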
The Android runtime comprises a core library and a virtual machine, and is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part is the functional interfaces that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: an input manager, an input dispatcher, a surface manager, media libraries, three-dimensional graphics processing libraries (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL).
The input manager is responsible for acquiring event data from the underlying input driver, parsing and encapsulating it, and passing it to the input scheduling manager.
The input scheduling manager is used to store window information; after receiving an input event from the input manager, it searches the stored windows for a suitable window and dispatches the event to that window.
The surface manager is used to manage the display subsystem and provide a fusion of the 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media library may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following exemplarily describes the workflow of the software and hardware of the electronic device 100 in connection with a photo-capturing scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation) and stores it at the kernel layer. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a single-tap operation and the corresponding control being the camera application icon as an example: the camera application calls the interface of the application framework layer to start the camera application, which in turn starts the camera driver by calling the kernel layer and captures a still image or video through the camera 193.
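The workflow above runs largely inside the framework and kernel layers; from the application side, the final step can be sketched, under the assumption that the standard camera-capture intent is used, as follows.

```java
// Sketch: launch the camera application via the standard capture intent,
// after which the framework starts the camera driver and captures an image.
import android.app.Activity;
import android.content.Intent;
import android.provider.MediaStore;

public class CameraLauncher {
    private static final int REQUEST_CAPTURE = 1; // arbitrary request code

    public static void launch(Activity activity) {
        Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        if (intent.resolveActivity(activity.getPackageManager()) != null) {
            activity.startActivityForResult(intent, REQUEST_CAPTURE);
        }
    }
}
```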
Through the above description of the embodiments, those skilled in the art will clearly understand that, for convenience and brevity of description, the foregoing division into functional modules is merely an example; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. For the specific working processes of the system, apparatus, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application, in essence, or the part thereof contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a flash memory, a removable hard drive, a read-only memory, a random access memory, a magnetic disk, an optical disc, and the like.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the present application should be covered within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (20)

1. A fitness course interaction method is applied to a first electronic device, and comprises the following steps:
the first electronic device displaying a first interface, the first interface comprising a first area and a second area, the first area for displaying a workout session, the second area for displaying an image comprising a first user;
the first interface further comprises a first identifier, a second identifier and a human body muscle map, wherein the first identifier is used for indicating the exercise effect of the first user in the fitness course, and the second identifier is used for indicating the exercise effect of the character in the fitness course; the human muscle map comprises a plurality of muscles, wherein the color of a first muscle of the plurality of muscles is used for indicating the current exercise degree of the first muscle, and the exercise degree of each muscle of the plurality of muscles is determined according to the motion data of the first user in the fitness class;
at a first time point, the first identifier is specifically configured to instruct the first user to perform an exercise effect of a first set of actions, the second identifier is configured to instruct a person in the fitness class to perform an exercise effect of a second set of actions, the first set of actions includes some or all of the second set of actions, and the second set of actions is one or more sub-actions included from a starting time point of the fitness class to the first time point; the exercise effect of the first user performing the first set of actions is determined by the first electronic device from the motion data of the first user performing the first set of actions.
2. The method of claim 1, wherein the workout comprises one or more actions, each of the one or more actions of the workout comprising one or more identical sub-actions.
3. The method of claim 1, wherein the first identifier is a first progress bar, wherein a length of a first portion of the first progress bar is used to indicate an exercise effect of the first user performing the first set of actions, and wherein a total length of the first progress bar is used to indicate an exercise effect of a person in the workout performing all of the actions of the workout.
4. The method of claim 1, wherein the image of the first user is captured by the first electronic device via a camera.
5. The method of claim 1, wherein the athletic data of the first user during the workout is determined from an image of the first user.
6. The method of claim 1, wherein the exercise effect of the first user performing the first set of actions is determined by the first electronic device based on the motion data of the first user performing the first set of actions and the motion data of the person performing the second set of actions in the workout session.
7. The method of claim 1, wherein the first set of actions includes a first sub-action, wherein the first identifier indicates a first value of exercise effect before the first user performs the first sub-action, and wherein the first identifier indicates a second value of exercise effect after the first user performs the first sub-action.
8. The method of claim 7, wherein the exercise effect of the first sub-action is determined according to a standard degree of the first sub-action, the standard degree of the first sub-action being determined according to the motion data of the first user performing the first sub-action and the motion data of the person in the workout performing the first sub-action.
9. The method of claim 7, wherein the exercise effect of the first sub-action is determined according to a criterion of the first sub-action and a completion time of the first sub-action.
10. The method of claim 9,
when the completion time of the first sub-action is out of the preset time range of the first sub-action, the second value is smaller than the first value, and the farther the completion time of the first sub-action is away from the preset time range, the larger the difference between the second value and the first value is;
or when the completion time of the first sub-action is out of the preset time range, the second value is equal to the first value;
or when the completion time of the first sub-action is outside the preset time range, the second value is greater than the first value, and the farther the completion time of the first sub-action is away from the preset time range, the smaller the difference between the second value and the first value is.
11. The method of claim 8 or 9,
the second value is greater than or equal to the first value, and the higher the standard degree of the first sub-action is, the larger the difference between the second value and the first value is;
or, when the standard degree of the first sub-action is higher than or equal to a first preset standard degree, the second value is greater than the first value, and the higher the standard degree of the first sub-action is, the greater the difference between the second value and the first value is; when the standard degree of the first sub-action is lower than the first preset standard degree, the second value is equal to the first value;
or, when the standard degree of the first sub-action is higher than or equal to a second preset standard degree, the second value is greater than or equal to the first value, and the higher the standard degree of the first sub-action is, the greater the difference between the second value and the first value is; when the standard degree of the first sub-action is lower than the second preset standard degree, the second value is smaller than the first value, and the lower the standard degree of the first sub-action is, the larger the difference between the second value and the first value is.
12. The method of claim 7, wherein the exercise effect of the first sub-action is determined from the energy consumption of the first sub-action;
the second value is greater than or equal to the first value, the greater the energy consumption of the first sub-action, the greater the difference between the second value and the first value; the energy consumption of the first sub-action is determined based on the height of the first user, the weight of the first user, the completion time of the first sub-action, the motion data of the first user performing the first sub-action, and the motion data of a person in the workout session performing the first sub-action.
13. The method of claim 3, wherein the fitness course comprises a plurality of sub-courses, each of the plurality of sub-courses comprising one or more consecutive movements of the fitness course, a first movement of an ith sub-course of the plurality of sub-courses being a next movement to a last movement of an i-1 th sub-course of the plurality of sub-courses, i being a positive integer greater than or equal to 2; after the character in the fitness course in the second area executes the last action in the (i-1) th sub-course, the first progress bar further comprises a completion identifier of the (i-1) th sub-course of the first user, the completion identifier is used for indicating the first user to execute the exercise effect of a third action set, and the third action set comprises part or all of all sub-actions of the first i-1 courses of the fitness course.
14. The method of claim 1, wherein the first interface further comprises a third area for displaying an image containing a second user; the first interface further comprises a third identifier, wherein the third identifier is used for indicating the exercise effect of the second user in the fitness course, and the exercise effect indicated by the third identifier is determined according to the motion data of the second user in the fitness course.
15. The method of claim 14, wherein at the first point in time, the third identification is specifically for instructing the second user to perform an exercise effect for a fourth set of actions; wherein the fourth set of actions includes some or all of the second set of actions.
16. The method of claim 14,
the image of the second user is acquired by the first electronic equipment through a camera;
or the image of the second user is acquired by the second electronic device through the camera and is sent to the first electronic device.
17. The method of claim 14,
the exercise data of the second user in the fitness class is determined by the first electronic device through the image of the second user;
or the motion data of the second user in the fitness course is determined by the second electronic device through the image of the second user and is sent to the first electronic device.
18. An electronic device, comprising: a display screen, one or more processors, and one or more memories for storing computer program code, the computer program code comprising computer instructions; the computer instructions, when executed on the one or more processors, cause the electronic device to perform the method of any of claims 1-17.
19. A computer-readable storage medium comprising computer instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-17.
20. A computer program product, which, when run on a computer, causes the computer to perform the method of any one of claims 1-17.
CN202010531077.7A 2020-06-11 2020-06-11 Fitness course interaction method and related device Active CN113808446B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010531077.7A CN113808446B (en) 2020-06-11 2020-06-11 Fitness course interaction method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010531077.7A CN113808446B (en) 2020-06-11 2020-06-11 Fitness course interaction method and related device

Publications (2)

Publication Number Publication Date
CN113808446A CN113808446A (en) 2021-12-17
CN113808446B true CN113808446B (en) 2022-11-18

Family

ID=78892000

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010531077.7A Active CN113808446B (en) 2020-06-11 2020-06-11 Fitness course interaction method and related device

Country Status (1)

Country Link
CN (1) CN113808446B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114602196B (en) * 2022-02-18 2024-01-30 周昕 Suitcase for stage play
CN116920353A (en) * 2022-04-06 2023-10-24 成都拟合未来科技有限公司 Body-building shaping course display equipment and method
CN115202531A (en) * 2022-05-27 2022-10-18 当趣网络科技(杭州)有限公司 Interface interaction method and system and electronic device
CN117370878B (en) * 2023-12-07 2024-03-05 山东第一医科大学第一附属医院(山东省千佛山医院) Epidermis extraction and positioning method and system based on spine joint vibration information

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110032572A (en) * 2017-12-27 2019-07-19 晶翔机电股份有限公司 Plan the method and system of body-building course
CN111111111A (en) * 2020-01-14 2020-05-08 广东技术师范大学 Real-time fitness monitoring system and method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101884832A (en) * 2009-05-15 2010-11-17 英业达股份有限公司 Virtual character fitness interactive system and method thereof
CN201674596U (en) * 2010-05-25 2010-12-15 深圳创维-Rgb电子有限公司 Television and television network system
KR101810754B1 (en) * 2012-10-30 2017-12-19 나이키 이노베이트 씨.브이. User interface and fitness meters for remote joint workout session
CN204425409U (en) * 2015-03-05 2015-06-24 蒋雪峰 Based on the body-building data supervisory systems of Internet of Things and cloud computing
CN106807056A (en) * 2017-02-15 2017-06-09 四川建筑职业技术学院 A kind of fitness training based on somatic sensation television game instructs system and guidance method
CN108209910A (en) * 2017-05-25 2018-06-29 深圳市未来健身衣科技有限公司 The feedback method and device of body building data

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110032572A (en) * 2017-12-27 2019-07-19 晶翔机电股份有限公司 Plan the method and system of body-building course
CN111111111A (en) * 2020-01-14 2020-05-08 广东技术师范大学 Real-time fitness monitoring system and method

Also Published As

Publication number Publication date
CN113808446A (en) 2021-12-17

Similar Documents

Publication Publication Date Title
CN113808446B (en) Fitness course interaction method and related device
CN110045819B (en) Gesture processing method and device
EP4020491A1 (en) Fitness-assisted method and electronic apparatus
CN109495688A (en) Method for previewing of taking pictures, graphic user interface and the electronic equipment of electronic equipment
CN110362373A (en) A kind of method and relevant device controlling screen wicket
CN109766043A (en) The operating method and electronic equipment of electronic equipment
CN110134316A (en) Model training method, Emotion identification method and relevant apparatus and equipment
CN111544852B (en) Method and related apparatus for correcting body-building posture
CN114466128B (en) Target user focus tracking shooting method, electronic equipment and storage medium
CN110456938A (en) A kind of the false-touch prevention method and electronic equipment of Curved screen
CN111202955A (en) Motion data processing method and electronic equipment
EP4270245A1 (en) User determination method, electronic device, and computer-readable storage medium
EP4310724A1 (en) Method for determining exercise guidance information, electronic device, and exercise guidance system
CN117130469A (en) Space gesture recognition method, electronic equipment and chip system
EP4224485A1 (en) Adaptive action evaluation method, electronic device, and storage medium
WO2022214004A1 (en) Target user determination method, electronic device and computer-readable storage medium
EP4006754A1 (en) Prompting method for fitness training, and electronic device
CN113996046B (en) Warming-up judgment method and device and electronic equipment
CN116391212A (en) Method for preventing gesture from being misidentified and electronic equipment
CN112149483A (en) Blowing detection method and device
CN115445170A (en) Exercise reminding method and related equipment
CN113693556A (en) Method and device for detecting muscle fatigue degree after exercise and electronic equipment
CN116798470A (en) Audio playing method and device and electronic equipment
CN115203524A (en) Fitness recommendation method and electronic equipment
CN117982126A (en) Prompting method, prompting device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant