CN110716773A - Motion information display method and device


Info

Publication number
CN110716773A
CN110716773A
Authority
CN
China
Prior art keywords: user, point, target path, target, motion
Prior art date
Legal status
Pending
Application number
CN201810771584.0A
Other languages
Chinese (zh)
Inventor
曹思真
贾鹏飞
徐勉
庄云辉
陈坚
金鑫
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201810771584.0A
Publication of CN110716773A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces


Abstract

The embodiment of the application discloses a motion information display method and apparatus. The method includes: setting a target path having a starting point and an end point, on which position points corresponding to a plurality of users can be displayed; when the motion parameters of the users are obtained, determining for each user the motion distance to which that user's motion parameter maps on the target path; and displaying the position point corresponding to each user on the target path according to each user's motion distance on the target path. Because the position points corresponding to different users are displayed on the same target path, a user whose motion information is displayed on the target path can visually check, through the position points, both his or her own moving progress and that of other users, so that the interactive experience brought by exercise is better provided for the users.

Description

Motion information display method and device
Technical Field
The present application relates to the field of data processing, and in particular, to a method and an apparatus for displaying motion information.
Background
A user can record the number of steps he or she takes with a portable terminal; for example, the steps generated each day can be recorded with a mobile phone, a wearable device, and the like.
Some sports and social applications (APPs) provide a display list of the step counts of a plurality of users, through which a user can check the difference in step counts between himself or herself and friends and the like, thereby improving the degree of interaction between users.
However, the comparison currently provided is mostly limited to comparing step-count values, and it is difficult to intuitively reflect the difference in motion trends between users through step-count values alone, so the interaction between users is limited.
Disclosure of Invention
In order to solve the above technical problem, the application provides a motion information display method and apparatus. Through a target path, the method can visually represent the difference in motion trends between the users whose motion information is displayed on the target path, so that the interactive experience brought by exercise can be better provided for the users.
The embodiment of the application discloses the following technical scheme:
in a first aspect, an embodiment of the present application provides a method for displaying motion information, where the method includes:
acquiring motion parameters of a plurality of users;
respectively determining the motion distance to which the motion parameter of each user maps on the target path;
and displaying the position point corresponding to each user on the target path according to the movement distance of each user on the target path.
In a second aspect, an embodiment of the present application provides a motion information display apparatus, where the apparatus includes a first obtaining unit, a determining unit, and a display unit:
the first acquisition unit is used for acquiring the motion parameters of a plurality of users;
the determining unit is used for respectively determining the motion distance to which the motion parameter of each user maps on the target path;
and the display unit is used for displaying the position point corresponding to each user on the target path according to the movement distance of each user on the target path.
In a third aspect, an embodiment of the present application provides an apparatus for motion information presentation, where the apparatus includes a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to perform the motion information presentation method of any of the first aspect according to instructions in the program code.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium is used to store program codes, and the program codes are used to execute the motion information presentation method according to any one of the first aspect.
According to the technical scheme, a target path having a starting point and an end point is set, and position points corresponding to a plurality of users can be displayed on the target path. When the motion parameters of the users are obtained, the motion distance to which each user's motion parameter maps on the target path can be determined, and the position point corresponding to each user can then be displayed on the target path according to each user's motion distance on the target path. Because the position points corresponding to different users are displayed on the same target path, a user whose motion information is displayed on the target path can visually check his or her own moving progress through his or her position point, and can clearly see the moving progress of other users by checking their position points. The difference in motion trends between the users whose motion information is displayed on the target path is therefore reflected visually through the target path, so that the interactive experience brought by exercise can be better provided for the users.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without inventive effort.
Fig. 1 is a system architecture diagram of a motion information display method according to an embodiment of the present application;
fig. 2 is a flowchart of a method for displaying exercise information according to an embodiment of the present disclosure;
fig. 3 is an exemplary diagram of a display interface for requesting to join a target path according to an embodiment of the present disclosure;
FIG. 4 is a diagram illustrating an example of a display interface for providing auxiliary athletic parameters, according to an embodiment of the present application;
fig. 5 is an exemplary diagram of a leader board display interface according to an embodiment of the present application;
fig. 6 is a flowchart of a method for creating a target path according to an embodiment of the present disclosure;
FIG. 7 is a diagram illustrating an example of a creation interface of a target path according to an embodiment of the present disclosure;
fig. 8 is an exemplary diagram of a display interface for motion information according to an embodiment of the present application;
FIG. 9 is an exemplary diagram of a completion information display interface according to an embodiment of the present application;
fig. 10 is a flowchart of a method for displaying exercise information according to an embodiment of the present application;
fig. 11a is a structural diagram of a sports information displaying apparatus according to an embodiment of the present application;
FIG. 11b is a block diagram of an apparatus for displaying exercise information according to an embodiment of the present disclosure;
FIG. 11c is a block diagram of an athletic information display device according to an embodiment of the present disclosure;
FIG. 11d is a block diagram of an apparatus for displaying exercise information according to an embodiment of the present disclosure;
fig. 11e is a structural diagram of a sports information displaying apparatus according to an embodiment of the present application;
fig. 12 is a block diagram of an apparatus for displaying motion information according to an embodiment of the present disclosure;
fig. 13 is a block diagram of an apparatus for displaying motion information according to an embodiment of the present disclosure.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present application rather than all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
In the currently provided motion information comparison methods, the step counts of different users within a fixed time period are mainly compared, for example the step counts of each day. It is difficult to intuitively reflect the difference in motion trends between users through step-count values alone, so it is difficult to provide the users with a good interactive experience brought by exercise.
Therefore, the motion information display method provided by the embodiment of the application sets a target path having a starting point and an end point and maps the motion parameter of each user to a motion distance on the target path, so that the position points corresponding to a plurality of users are displayed on the target path. Abstract motion parameters are thus converted into vivid position points on the target path, visually reflecting the difference in motion trends between the users whose motion information is displayed on the target path.
The motion information display method provided by the embodiment of the application can be applied to terminal equipment, such as an intelligent terminal, a computer, a Personal Digital Assistant (PDA for short), a tablet computer and the like. The terminal device can be provided with an APP for displaying motion information, and the motion information displaying method can be integrated on the APP. The motion information display method can also be applied to a server. When the motion information display method is applied to the server, the server can acquire the motion parameters uploaded by the terminal equipment, so that the motion information display method is executed.
Next, for convenience of description, the motion information display method is described by taking its application to a server as an example.
Referring to fig. 1, fig. 1 shows a system architecture diagram for the motion information display method. The system may include a server 101 and a plurality of terminal devices 102. A target path having a starting point and an end point is set in the server 101 and can be used to display the location points of a plurality of users: each user displayed on the target path has a corresponding location point, different location points represent different users, and the location point of each user is determined according to that user's motion parameters. The location point of any user thus embodies the motion parameters the user has accumulated on the target path. The location points may be represented by dots or circles, and in general different users can be distinguished by location points of different colors.
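As a minimal illustration of the kind of data the server 101 might hold, the following Python sketch models a target path with a starting point, an end point, and one position point per user. All names and values (including the path length) are placeholders chosen only for illustration, not part of the described embodiment.

```python
from dataclasses import dataclass, field

@dataclass
class TargetPath:
    """A path with a starting point and an end point; length is in metres."""
    name: str
    length_m: float
    # user_id -> distance already covered along the path, in metres
    positions: dict = field(default_factory=dict)

    def join(self, user_id: str) -> None:
        # A newly joined user is placed at the starting point of the path.
        self.positions.setdefault(user_id, 0.0)

path = TargetPath(name="demo target path", length_m=42_195.0)  # placeholder length
for uid in ("user_1", "user_2", "user_3"):
    path.join(uid)
```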
The plurality of terminal devices 102 may be configured to upload the motion parameters of their users to the server 101; different users upload motion parameters to the server 101 through their corresponding terminal devices 102, so that the server 101 can perform the display on the target path according to the obtained motion parameters. Fig. 1 shows only three terminal devices as an example; the number of terminal devices 102 is not limited in this embodiment of the application.
The server 101 may determine, according to the motion parameters uploaded by the plurality of terminal devices 102, a motion distance at which the motion parameter of each user is mapped onto the target path, and then display, on the target path, a position point corresponding to each user according to the motion distance of each user on the target path. The motion parameters of the multiple users acquired by the server 101 may be uploaded by the multiple terminal devices 102 at the same time, or may be uploaded by the multiple terminal devices 102 sequentially.
The motion information display interface may be as shown in 103 in fig. 1, and taking a plurality of users including user 1, user 2, and user 3 as an example, the motion information display method may be implemented to display a location point of user 1, a location point of user 2, and a location point of user 3 on the target path. The motion information display interface 103 may be displayed on any one or more terminal devices 102 in fig. 1, so that a user corresponding to the terminal device 102 may view the motion situation of the user or other users on the target path.
In this embodiment, the target route may be an actual route corresponding to the road network on the electronic map, or may be a virtual route drawn based on the starting point and the ending point.
If the target path is an actual path, the motion parameters of the users may be the motion parameters generated by the users moving on that path. For example, if a plurality of users participate in a running activity on route A of the Olympic Forest Park and want to intuitively know the difference in motion trends among the participants, the position points corresponding to the users can be displayed, according to the users' motion parameters, on the route of the electronic map corresponding to route A of the Olympic Forest Park. In this case the route on the electronic map corresponding to route A is the target path, and the users actually move on it, so the target path is an actual path. If the target path is a virtual path drawn based on a starting point and an end point, the virtual path is given a notion of distance: a length value on the virtual path represents a certain distance, and according to the correspondence between distance and motion parameters, a user's motion parameters can be converted into a length value on the target path and embodied in the movement of the position point corresponding to the user, so that the target path provides an intuitive comparison of the motion parameters of a plurality of users. In this case the target path is only a virtual path, the users are not required to actually move on it, and the motion parameters of the plurality of users may be generated by the users moving in daily life, at work, and so on.
The drawn virtual path may have many possible forms, and two forms will be described as examples.
In a first form: the virtual path may be a path that is drawn according to a starting point and an ending point and is unrelated to an actual road network, the length values between different positions on the path may represent a certain distance, and the path may be a line segment or a curve.
In a second form: the virtual path may be a path drawn according to the starting point and the end point that corresponds to, or is similar to, an actual road network, so that the motion information of the user can be embodied more vividly. Specifically, after the starting point and the end point are determined, the automatic route-searching function of the electronic map can be used to determine a virtual path between them, or a virtual path spliced together from a plurality of sub-paths between them. In this embodiment of the application, the target path may be determined based on the virtual path in which the user participates, and which virtual path the user participates in may be determined according to the user's actual exercise or interaction requirements; for example, the target path may be the whole virtual path including its sub-paths, or may be one sub-path of the virtual path.
In order to improve user participation, a virtual competition may be set corresponding to the virtual path, with the track of the virtual competition corresponding to the virtual path. If the track is relatively long, it may include a plurality of segments; correspondingly, the virtual competition may include a plurality of segment competitions, with each segment corresponding to one segment competition and to one sub-path.
If the user participates in the complete virtual competition, the target path may be the track, that is, the target path is a virtual path including a sub-path, a starting point of the track may be a starting point of the target path, and an end point of the track may be an end point of the target path; if the user participates in a segment game of a certain segment, the target path may be the segment, the target path may be a part of the track, that is, the target path is a sub-path in the virtual path, the start point of the segment may be the start point of the target path, and the end point of the segment may be the end point of the target path.
For example, a virtual competition called "Beijing-Hong Kong Rally" is set. The track of the Beijing-Hong Kong Rally can be used as the virtual path, with Beijing as the starting point of the track and Hong Kong as its end point. The track can include a plurality of segments, for example a Jiangsu segment, which is a sub-path of the virtual path; the starting point of the Jiangsu segment is the Sunshine Lighthouse and its end point is Zhouzhuang. If the user participates in the complete race of the Beijing-Hong Kong Rally, that is, in the whole virtual path, the target path can be the track, with Beijing as the starting point of the target path and Hong Kong as its end point; if the user only participates in the Jiangsu segment competition, that is, in a certain sub-path, the target path can be the Jiangsu segment, with the Sunshine Lighthouse as the starting point of the target path and Zhouzhuang as its end point.
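The following sketch, under the assumption of this example, shows one possible way to model a track made of segments and to resolve the target path depending on whether the user joins the full race or only one segment. The segment length and the single listed segment are placeholders; a real track would list every segment.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    name: str
    start: str
    end: str
    length_m: float

RACE_START, RACE_END = "Beijing", "Hong Kong"

# Only one segment is listed here for illustration; the length is a placeholder.
track = [
    Segment("Jiangsu segment", "Sunshine Lighthouse", "Zhouzhuang", 450_000.0),
]

def target_path_for(full_race: bool, segment_name: str = ""):
    """Return (start, end, length) of the target path the user participates in."""
    if full_race:
        # The whole virtual path: the complete track from Beijing to Hong Kong.
        return RACE_START, RACE_END, sum(s.length_m for s in track)
    # A single sub-path: one segment of the track.
    seg = next(s for s in track if s.name == segment_name)
    return seg.start, seg.end, seg.length_m

print(target_path_for(False, "Jiangsu segment"))
```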
The motion parameters mentioned in the embodiments of the present application may be parameters used to reflect motion information of the target user. The motion parameter may be, for example, the number of motion steps of the user or the actual moving distance of the user in real life.
Compared with traditional step-counting software, which reflects the difference in motion trends between users by comparing step counts over a fixed time period, the motion information display method provided by the embodiment of the application sets a target path and reflects the difference in motion trends by comparing the time different users need to finish a fixed distance, so that the difference in motion conditions between different users is compared visually and vividly.
Next, a motion information display method provided by an embodiment of the present application will be described with reference to the drawings.
Referring to fig. 2, fig. 2 is a flowchart illustrating a motion information presentation method applied to a target path having a start point and an end point, the target path including location points corresponding to a plurality of users, the method including:
s201, obtaining motion parameters of a plurality of users.
In this embodiment, the relationship between the users on the target path may include various situations, for example, the relationship may be a friend or a stranger.
In this embodiment, the motion parameter may be counted by a recording unit in the terminal device according to a statistical period: when the end of a statistical period is reached, the motion parameter counted in that period is cleared, and counting of the motion parameter for the next statistical period starts anew. The statistical period can be one day, one week, one month, and the like.
The motion parameter obtained in S201 may be an accumulated value of the motion parameters in a statistical period, that is, the motion parameter is a total motion parameter from the start of the statistical period to the uploading of this time; the motion parameter obtained in S201 may also be a difference value between two uploaded motion parameters, that is, the motion parameter is a difference value between the motion parameter uploaded this time and the motion parameter uploaded last time.
Taking the motion parameter as the step count and the statistical period as one day as an example: in one day, the user's step count is 800 steps at 10 o'clock and 1000 steps at 11 o'clock. If the motion parameter for the user is obtained at 11 o'clock, the obtained motion parameter may be 1000 steps, in which case it is the accumulated value of the motion parameter; it may of course also be 200 steps, in which case it is the difference between the user's step count at 11 o'clock and at 10 o'clock.
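A minimal sketch of the two reporting options, using the 800-step and 1000-step numbers from the example above; the function names are illustrative only.

```python
def accumulated_value(step_counter_now: int) -> int:
    # Upload the total counted within the current statistical period (e.g. today).
    return step_counter_now

def difference_value(step_counter_now: int, step_counter_last_upload: int) -> int:
    # Upload only the increment since the previous upload.
    return step_counter_now - step_counter_last_upload

# Example from the description: 800 steps at 10 o'clock, 1000 steps at 11 o'clock.
print(accumulated_value(1000))       # 1000 steps (accumulated value)
print(difference_value(1000, 800))   # 200 steps (difference between two uploads)
```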
It should be noted that the timing for acquiring the motion parameters may be different according to the needs of the user. If a user needs to check his/her own motion information, the motion parameters may be obtained when the user logs in the APP and needs to check his/her own motion information.
Of course, for any one of the plurality of users, even if that user has no need to view his or her own motion information, other users on the target path may need to view it for comparison with their own. Therefore, the motion parameters may also be acquired periodically, so that the user's position on the target path is updated periodically and other users can learn the user's motion information. The period may be preset, and there are multiple ways to set it: in a first way, the period is set according to time, for example the motion parameter is acquired every 5 minutes; in a second way, the period is set according to the motion parameter, for example, if the motion parameter includes the step count, the motion parameter is acquired every 100 steps.
For each user, the correspondence between the acquired motion parameter and the location point at which the user is located when that motion parameter is acquired may be saved, for use in subsequently updating the user's location point. In some cases, different location points may have different location point identifiers, so that the corresponding location point and motion parameter can be obtained according to the location point identifier.
If the motion parameter is obtained when the user needs to check the motion information of the user and logs in the APP, the user may open the APP on the terminal device and log in the APP, thereby triggering execution of S201.
If the motion parameters are periodically acquired, the execution of S201 is triggered every time the period is reached, so that the execution of S201 is performed periodically.
It can be understood that, regardless of whether the target path is a virtual path or an actual path, a user's motion information can be presented on the target path only if the user has joined the target path. Therefore, taking a target user as an example, the target user needs to join the target path before the motion parameters for the target user are obtained. One way for the target user to join the target path is: the target user triggers sending of a join request, and after the join request of the target user is obtained, a position point corresponding to the target user is added at the starting point of the target path.
One way for the target user to trigger sending the join request may be as follows: the target user opens the APP for displaying motion information, on whose display interface a function key for requesting to join the target path is provided; when the user clicks the key, the APP may pop up an interface asking the user whether to join the target path, and when the user clicks the confirmation, the sending of the join request is triggered. Of course, the join request may also be sent directly when the user clicks the key.
Take as an example a target path that is a path of a certain length provided on a map in a competition activity provided in the APP, where the name of the competition activity is "Beijing-Hong Kong Rally". Referring to 301 in fig. 3, a function key for requesting to join the target path, namely a key named "Beijing-Hong Kong Rally", is provided on the display interface of the APP. When the target user clicks the "Beijing-Hong Kong Rally" key, the APP may pop up an interface asking whether to join the target path, and when the user clicks the confirmation, the sending of the join request is triggered.
After the join request of the target user is acquired, a location point corresponding to the target user may be added at the starting point of the target path. For example, as shown at 302 in fig. 3, the starting point of the target path is "Beijing OCS", and after the join request of the target user is acquired, a location point corresponding to the target user is added at "Beijing OCS".
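A minimal sketch of handling a join request on the server side. The bookkeeping of a per-user step baseline at join time is an assumption added for illustration, one of several possible schemes; variable and function names are placeholders.

```python
positions: dict = {}          # user_id -> distance covered on the target path, in metres
baseline_steps: dict = {}     # user_id -> step counter at the moment the user joined

def handle_join_request(user_id: str, step_counter_at_join: int) -> None:
    """Place the joining user's position point at the starting point of the target
    path and remember the current step counter, so that later uploads can be
    interpreted relative to the join moment."""
    if user_id not in positions:
        positions[user_id] = 0.0                  # at the starting point of the target path
        baseline_steps[user_id] = step_counter_at_join

handle_join_request("target_user", step_counter_at_join=350)
```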
It should be noted that the target user may join the target route at any time, for example, the timing when the target user joins the target route may be when other users on the target route are still at the starting point position, or may be when some users on the target route have moved a distance on the target route along the direction from the starting point to the ending point.
S202, respectively determining the motion distance to which the motion parameter of each user maps on the target path. The motion distance on the target path has a correspondence with the motion parameter, so the motion parameter can be converted into a motion distance on the target path. In general, the motion distance to which each user's motion parameter maps may be determined as follows: for each user, the actual moving distance of the user is first determined according to the user's motion parameter, and the motion distance to which it maps on the target path is then determined according to the proportional relation between the actual moving distance and the distance on the target path.
It should be noted that, since the motion parameter may be the number of motion steps of the user or the actual moving distance of the user in real life, the above-mentioned manner of determining the motion parameter of each user to map to the motion distance on the target path may be slightly different according to the difference of the motion parameters.
If the motion parameter is the motion step number of the user, the motion step number needs to be converted into an actual moving distance, and then the motion distance of the user, which is mapped on the target path, is determined according to the proportional relation between the actual moving distance and the motion distance on the target path.
And if the motion parameter is the actual moving distance of the user, determining the moving distance of the user, which is mapped on the target path, according to the proportional relation between the actual moving distance and the moving distance on the target path.
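The two cases above can be sketched as follows. The stride length and the path scale factor are assumed parameters introduced only to make the proportional relation concrete; they are not specified in the description.

```python
ASSUMED_STRIDE_M = 0.7   # assumed average stride length, metres per step
PATH_SCALE = 1.0         # assumed ratio: metres on the target path per actual metre moved

def steps_to_path_distance(steps: int) -> float:
    """Case 1: the motion parameter is a step count; convert it to an actual
    moving distance first, then map that distance onto the target path."""
    actual_distance_m = steps * ASSUMED_STRIDE_M
    return actual_distance_m * PATH_SCALE

def actual_to_path_distance(actual_distance_m: float) -> float:
    """Case 2: the motion parameter is already an actual moving distance."""
    return actual_distance_m * PATH_SCALE

print(steps_to_path_distance(1000))   # e.g. 1000 steps -> 700.0 m on the target path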
S203, displaying the position point corresponding to each user on the target path according to the movement distance of each user on the target path.
It should be noted that, in this embodiment, the position point corresponding to each user may be displayed on the target path according to that user's motion parameters, and the method is similar for every user. To facilitate a detailed understanding of the motion information display method, S202 and S203 are described below by taking a target user among the plurality of users as an example.
For the target user, one implementation of S202 may be: determining the motion distance, relative to a first position point, to which the motion parameter of the target user maps on the target path. One implementation of S203 may be: updating the position point displayed for the target user on the target path from the first position point to a second position point according to the target user's motion distance and the moving direction from the starting point to the end point of the target path.

Taking user 1 in fig. 1 as the target user, whose location point is shown as the gray location point in 103 of fig. 1, the server 101 may obtain the motion parameter for the target user from the terminal device 102 corresponding to the target user. The server 101 then determines, according to this motion parameter, the motion distance relative to the first position point to which the motion parameter maps on the target path. Next, according to the determined motion distance and the moving direction from the starting point to the end point of the target path, the position point displayed for the target user is updated from the first position point to the second position point; the updated position of the target user on the target path may be as shown at 104 in fig. 1. Here, the gray position point in 103 serves as the first position point; the gray position point with a solid frame in 104 serves as the second position point, and the gray position point with a broken frame in 104 corresponds to the first position point in 103, so that the target user can visually check his or her moving progress on the target path through the change of the position point in 104. The distance between the second position point and the first position point on the target path in 104 is the motion distance, relative to the first position point, to which the motion parameter maps.
The first location point may be a location point that is historically updated on the target path by the target user, that is, a location point obtained by updating according to the motion parameter and recorded before the motion parameter of the target user is acquired this time.
For example, the previously recorded location points updated according to the motion parameters are respectively: if the motion parameter is 200 steps corresponding to the position point a, the motion parameter is 400 steps corresponding to the position point B, the motion parameter is 600 steps corresponding to the position point C, and the obtaining of the motion parameter for the target user this time is 800 steps, then any one of the position points A, B or C before the obtaining of the motion parameter for the target user this time may be used as the first position point.
The second location point may be a location point obtained by updating on the target path according to the motion parameter for the target user when the motion parameter for the target user is obtained, and may embody a motion situation of the target user based on motion parameter accumulation.
It is understood that the moving distance of the target path relative to the first position point may be related to the first position point and the second position point, and the moving distance may be a distance from the first position point to the second position point on the target path, and may be determined according to a preset calculation manner. It should be noted that the first position point is selected in relation to the motion parameter acquired in S201. If the motion parameters are acquired when the target user needs to check the motion information of the target user and logs in the APP, the first position point is an updated position point when the target user logs out of the APP last time, and therefore the change situation of the position of the user on the target path during each login can be reflected.
If the motion parameter is obtained periodically and the motion parameter is an accumulated value of the motion parameter, the first location point may be a location point where the target user is located when the motion parameter for the target user is obtained at the beginning of the statistical period. For example, in one statistical cycle, a motion parameter is acquired every 200 steps, the motion parameter is an accumulated value of the number of motion steps, if a position point a corresponding to a motion parameter of 200 steps, a position point B corresponding to a motion parameter of 400 steps, a position point C corresponding to a motion parameter of 600 steps are periodically and sequentially acquired, and a motion parameter acquired this time for the target user is 800 steps, a position point corresponding to the target user on the target path when the number of motion steps started in the statistical cycle is 0 may be taken as the first position point.
If the motion parameter is obtained periodically and the motion parameter is a difference value between two uploaded motion parameters, the first location point may be a location point where the target user was located when the motion parameter was uploaded last time.
Updating based on the first location points basically realizes incremental updating, namely, the location change between the location points of two updates can be reflected in the updating of the location points. In some cases, it may not be necessary to know the position change between the position points updated by the target user twice, but only the moving progress of the target user with respect to the starting position point of the target path needs to be known, at this time, a full amount of updating may be adopted, that is, the starting position point of the target user on the target path may be taken as the first position point, and each time of updating, the target user is updated from the starting position point to the second position point on the target path.
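The two updating strategies described above can be contrasted in a short sketch; `path_length_m` and the clamping to the end point are assumptions added so the example is self-contained.

```python
def incremental_update(current_position_m: float, delta_path_distance_m: float,
                       path_length_m: float) -> float:
    """Advance from the previously updated position point (the first position point)
    by the newly mapped distance, moving in the start-to-end direction."""
    return min(current_position_m + delta_path_distance_m, path_length_m)

def full_update(total_path_distance_m: float, path_length_m: float) -> float:
    """Recompute the position from the starting point using the accumulated
    motion parameter mapped onto the target path."""
    return min(total_path_distance_m, path_length_m)
```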
It can be understood that, according to the motion parameters and the different timings for obtaining the motion parameters, the manner of determining the motion parameters of the target user to map to the motion distance on the target path relative to the first position point may be different.
If the motion parameter is the difference between two uploads, the motion parameter for the target user by itself reflects the change of the target user from the first position point to the second position point, without any other motion parameter being needed. In this case the motion parameter corresponding to the first position point need not be used, and the motion distance, relative to the first position point, to which the target user's motion parameter maps on the target path can be determined from the uploaded motion parameter alone.
For example, in one day the target user's step count is 800 steps at 10 o'clock, and by 11 o'clock the target user has moved 200 more steps relative to 10 o'clock, so the obtained step count is 200 steps. If this 200-step value is the motion parameter obtained in S201, the position point corresponding to the target user on the target path when it is obtained is the second position point, and the position point corresponding to the target user at 10 o'clock is the first position point. The motion parameter is then a changed motion parameter and can represent the change between the first position point and the second position point.
If the motion parameter is an accumulated value, the motion parameter for the target user reflects the motion parameter corresponding to the second position point, so it is difficult to reflect the change of the target user from the first position point to the second position point; without the motion parameter corresponding to the first position point, the motion distance corresponding to the target user on the target path is difficult to determine. In this case, the first position point and the motion parameter corresponding to it must be determined, so that the motion distance can be determined using the motion parameter corresponding to the first position point. That is, determining, according to the motion parameter, the motion distance relative to the first position point to which the target user's motion parameter maps on the target path may include: determining the first position point of the target user on the target path and the motion parameter corresponding to the first position point; and determining the motion distance, relative to the first position point, mapped onto the target path according to the motion parameter for the target user and the motion parameter corresponding to the first position point.
To determine the motion distance relative to the first position point to which the target user's motion parameter maps on the target path, the motion parameter of the target user at the second position point relative to the first position point may be determined first; for example, the motion parameter for the target user and the motion parameter corresponding to the first position point may be subtracted to obtain a motion parameter difference, and the motion distance relative to the first position point is then determined according to this difference.
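A sketch of the accumulated-value case, reusing the A/B/C example above. The dictionary keyed by position point identifier and the 0.7 m stride are assumptions introduced only to make the subtraction-then-mapping step concrete.

```python
# Hypothetical stored correspondence: position point identifier -> (path position in m, steps)
point_records = {
    "A": (140.0, 200),
    "B": (280.0, 400),
    "C": (420.0, 600),
}

def distance_relative_to_first_point(first_point_id: str, accumulated_steps: int,
                                     stride_m: float = 0.7) -> float:
    """Accumulated-value case: the uploaded parameter reflects the second position
    point, so the parameter recorded for the first position point is subtracted
    before mapping the difference onto the target path."""
    _, steps_at_first_point = point_records[first_point_id]
    step_difference = accumulated_steps - steps_at_first_point
    return step_difference * stride_m

print(distance_relative_to_first_point("C", 800))   # (800 - 600) steps -> 140.0 m
```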
It can be understood that the first location point and the motion parameter corresponding to it may be obtained from the previously stored correspondence between the target user's motion parameters and the location points at which the target user was located; for example, the first location point may be obtained according to the first location point identifier, and then the motion parameter corresponding to the first location point is obtained. After the motion distance, relative to the first position point, to which the target user's motion parameter maps on the target path has been obtained, the target user's original first position point on the target path may be advanced by that motion distance along the moving direction on the target path, so that the target user's position on the target path is updated to the second position point.
Updating the location of the target user on the target path to the second location point may be done in different manners. A first updating manner may be to jump the target user directly from the first location point to the second location point, so that when the target user's motion information is viewed, the target user is no longer seen at the first location point but at the second location point.
In some cases the target path may be long, and if the updated location point of the target user is displayed directly, it is difficult for the user to visually perceive the change of the target user's position; or the times at which the target user logs in to the APP may be uncertain, and the target user may log in only after a long interval. For this reason, a second updating manner may be to move the target user gradually from the first location point to the second location point, so that a dynamic process of the target user moving from the first location point to the second location point is observed when viewing the target user's motion information, which helps the target user intuitively and vividly perceive the position change caused by the motion parameter.
In the case of the second updating manner, in order to prevent users on the target path from seeing the target user repeatedly pass through the same position point, and to improve user experience, the first position point may be the position point on the target path at which the target user was located as updated when the motion parameter was last obtained before the motion parameter for the target user is obtained this time.
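A small sketch of the second updating manner: instead of jumping, intermediate positions are produced so the position point is seen moving from the first point to the second. The frame count is an assumption; any interpolation scheme would do.

```python
def gradual_move(first_m: float, second_m: float, frames: int = 30):
    """Yield intermediate positions so the location point moves gradually from the
    first position point to the second position point."""
    for i in range(1, frames + 1):
        yield first_m + (second_m - first_m) * i / frames

for pos in gradual_move(140.0, 280.0, frames=4):
    print(round(pos, 1))    # 175.0, 210.0, 245.0, 280.0
```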
It should be noted that, in order to show the difference between the target user and the other users, in addition to showing the position change of the target user on the target path, the distance between the target user and the other users on the target path or the number of steps corresponding to the distance may be shown.
Based on the method provided by the embodiment corresponding to fig. 2, the position points of a plurality of users can be displayed on the target path based on the motion parameters, so that the influence of the motion parameters on the positions of the users on the target path is embodied, and the difference of the motion parameters among different users is embodied.
It can be understood that, on the target path, although each user has a corresponding position point and different users can even be distinguished by position points of different colors, there may be many users on the target path. With many users it is difficult to distinguish them by the color of the position point, whether from the angle of setting the colors or from the angle of remembering which user corresponds to which color. In this case, an identifier can be set for each user to distinguish different users; the identifier is displayed more prominently on the APP display interface than the position point, making it easy to know which user an identifier corresponds to.
The identifier may be the avatar or nickname the user uses in the APP, the user's real name, another shape that embodies the user's characteristics, and so on. At the location point of the target user, the identifier of the target user, for example the avatar the target user uses in the APP, is also presented, as shown at 302 in fig. 3.
Since the identifier is prominent and easy to distinguish, in this embodiment, when the target user's first location point on the target path is updated to the second location point according to the motion distance and the moving direction from the starting point to the end point, the identifier of the target user can be moved from the first location point to the second location point, so that the target user's motion information can be viewed conveniently. The motion information of the target user on the target path can thus be observed visually from the movement of the target user's identifier, improving the interactive experience between users.
According to the technical scheme, a target path having a starting point and an end point is set, and position points corresponding to a plurality of users can be displayed on the target path. When the motion parameters of the users are obtained, the motion distance to which each user's motion parameter maps on the target path can be determined, and the position point corresponding to each user can then be displayed on the target path according to each user's motion distance on the target path. Because the position points corresponding to different users are displayed on the same target path, a user whose motion information is displayed on the target path can visually check his or her own moving progress through his or her position point, and can clearly see the moving progress of other users by checking their position points. The difference in motion trends between the users whose motion information is displayed on the target path is therefore reflected visually through the target path, so that the interactive experience brought by exercise can be better provided for the users.
It can be understood that, since the difference in motion trends between the users whose motion information is displayed on the target path is reflected visually through the target path, the competitive psychology between users can be strengthened: a user may hope that he or she, or some other user, holds the leading position on the target path, or may hope that a certain user does not hold the leading position. For such a user, the actual motion parameters of any user cannot simply be changed, and it is difficult to change a certain user's position on the target path based on actual motion parameters alone. For this purpose, auxiliary motion parameters that change the motion parameter of any one of the users may be provided.
The changing of the exercise parameter of any one of the users may be increasing or decreasing the exercise parameter of any one of the users. The user with the changed motion parameters may be the user himself who obtains the auxiliary motion parameters, a friend of the user, or a stranger.
For example, if a user looks at his moving progress on the target path and feels that he is behind too much, he may want to move forward to exceed users who are not far away from him. At this time, if the user obtains the auxiliary exercise parameter for increasing the exercise parameter, the user may use the auxiliary exercise parameter by himself or herself, so that the position of the user on the target path may be advanced a little.
For another example, if a user finds that a friend of the user is about to go beyond a stranger after looking up the moving progress of the user and other users on the target path, the user may wish to help the friend go beyond the stranger. At this time, if the user obtains the auxiliary motion parameter for increasing the motion parameter, the user may use the auxiliary motion parameter for the friend so that the position of the friend on the target path may advance a little beyond the stranger.
For another example, after checking the moving progress of himself or herself and other users on the target path, a user may find that user A holds the leading position on the target path; user A may be a rival, and the user may not want user A to stay in the lead. In this case, if the user obtains an auxiliary motion parameter for decreasing the motion parameter, the user may apply it to user A, so that user A's position on the target path is moved back somewhat.
In this embodiment, one way to provide the user with auxiliary motion parameters may be: a function key for extracting a prop card is provided in the APP for displaying motion information; when the user clicks the function key, an interface for extracting a prop card can be entered, and auxiliary motion parameters are provided to the user according to the prop card extracted by the user. The number of times a prop card can be extracted each day, and therefore the number of times auxiliary motion parameters are provided to the user, can be limited.
Referring to fig. 4, a function key such as "prop extraction" is provided on an APP for presenting sports information, as shown at 401 in fig. 4; when the user clicks 'extract prop', an interface for extracting a prop card can be displayed to the user, as shown in 402 in fig. 4; the user may extract one prop card from the provided plurality of prop cards, and if the user extracts one prop card, the prop card may be displayed to the user, as shown in 403 in fig. 4.
The prop cards may include a variety of cards, some of which may provide auxiliary props for increasing the exercise parameters of the user, such as a "bicycle" prop card, a "motorcycle" prop card, a "car" prop card, and so on, and some of which may provide auxiliary props for decreasing the exercise parameters of the user, such as a "backward" prop card, and so on. In fig. 4, 403 shows an example where the prop card is a "bicycle" prop card.
After the prop card is extracted, the user can decide, through the "self use" key and the "other people use" key, which of the plurality of users the auxiliary motion parameter provided by the prop card is applied to, that is, whose motion parameter is changed.
Through the use of the auxiliary motion parameters, the position of any user on the target path can be changed according to the requirements of the user, the interactive pleasure of the user on the target path is increased, and the interactive experience brought by motion is better provided for the user.
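A sketch of applying an extracted prop card to the user or to another user. The card names follow the examples above, but the step bonuses and the clamping to zero are invented placeholders, not values from the description.

```python
# Hypothetical auxiliary motion parameters granted by each prop card (placeholder values).
PROP_CARDS = {
    "bicycle": +500,     # increases the motion parameter
    "motorcycle": +1500,
    "car": +3000,
    "backward": -1000,   # decreases the motion parameter
}

def apply_prop_card(motion_params: dict, card: str, target_user: str) -> None:
    """Apply the auxiliary motion parameter of an extracted prop card to the chosen
    user ('self use' or 'other people use'), never letting the value go below zero."""
    motion_params[target_user] = max(0, motion_params.get(target_user, 0) + PROP_CARDS[card])

params = {"me": 8000, "friend": 7600, "stranger": 7900}
apply_prop_card(params, "bicycle", "friend")     # help the friend overtake the stranger
apply_prop_card(params, "backward", "stranger")  # push a leading rival back
print(params)
```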
It should be noted that, when auxiliary motion parameters are used, the motion parameter of a user may be obtained by modifying that user's actual motion parameter, so the motion parameter used on the target path may differ from the actual motion parameter. The actual motion parameter may have other uses besides displaying the target user's motion information on the target path; for example, when the motion information display method is applied to a terminal device, many APPs on the terminal device may relate to motion parameters and need to call the actual motion parameters.
In this case, in order to prevent the actual motion parameters from being overwritten, so that both the users' motion parameters on the target path and the actual motion parameters can be obtained and the actual motion parameters remain available for other uses, the motion parameters of the plurality of users on the target path may in this embodiment be stored independently in correspondence with the target path.
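A minimal sketch of such independent storage: the copy kept per target path is the one adjusted by auxiliary motion parameters, while the actual motion parameter reported by the terminal device is left untouched. The dictionary layout and key names are assumptions for illustration.

```python
actual_params = {"user_1": 8000}   # as reported by the terminal device; left untouched here
path_params = {"beijing_hong_kong_rally": {"user_1": 8000}}   # stored independently per target path

def add_auxiliary(path_id: str, user_id: str, bonus: int) -> None:
    """Change only the copy stored for the target path, so the actual motion
    parameter remains available for other APPs and other uses."""
    per_path = path_params.setdefault(path_id, {})
    per_path[user_id] = per_path.get(user_id, actual_params.get(user_id, 0)) + bonus

add_auxiliary("beijing_hong_kong_rally", "user_1", 500)
print(actual_params["user_1"], path_params["beijing_hong_kong_rally"]["user_1"])   # 8000 8500
```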
The embodiments corresponding to fig. 1 to 4 mainly describe displaying the motion information of a plurality of users vividly by using the target path. In some cases there may be many users on the target path, their positions may be very close, and the corresponding position points may overlap, making it difficult to distinguish the users' positions on the target path. In this case, this embodiment may also provide the user with a leaderboard carrying detailed motion information, so that the ranking of the user's position on the target path can be viewed from the leaderboard.
Referring to fig. 5, fig. 5 shows a leaderboard display interface. The leaderboards mainly include a course leaderboard and a completion leaderboard, and each may include a friend leaderboard and a national leaderboard. The course leaderboard can be used to represent the order of the positions, on the target path, of users who have not yet completed the target path; the completion leaderboard can be used to represent the order in which users who have completed the target path finished it.
The course leaderboard is shown at 501 in fig. 5 and, taking a friend leaderboard as an example, may include each user's ranking by position on the target path, the position point at which the user is located, and the accumulated motion parameter. The completion leaderboard is shown at 502 and, taking a national leaderboard as an example, may include the ranking information of users who have completed the target path, the time each of those users took, and the like.
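A sketch of how the two leaderboards could be built from per-user records; the tuple layout (user, distance on path, completion time or None) is an assumption for illustration.

```python
users = [
    # (user_id, distance covered on the path in m, completion time in s, or None if unfinished)
    ("user_1", 1200.0, None),
    ("user_2", 2_200_000.0, 86 * 24 * 3600),
    ("user_3", 900.0, None),
]

def course_leaderboard(entries):
    """Users who have not completed the target path, ordered by how far along it they are."""
    unfinished = [e for e in entries if e[2] is None]
    return sorted(unfinished, key=lambda e: e[1], reverse=True)

def completion_leaderboard(entries):
    """Users who have completed the target path, ordered by how long they took."""
    finished = [e for e in entries if e[2] is not None]
    return sorted(finished, key=lambda e: e[2])
```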
The embodiments corresponding to fig. 1 to 5 mainly introduce the motion information display method, which can intuitively embody the difference in motion trends between the users whose motion information is displayed on a set target path, so as to better provide the users with the interactive experience brought by exercise. The target path is the basis for implementing the motion information display method; how to create the target path is described next.
Referring to fig. 6, fig. 6 is a flowchart illustrating a method for creating a target path, the method comprising:
s601, obtaining the starting point position information corresponding to the starting point and the end point position information corresponding to the end point of the target path.
The position information may be information indicating the position of a position point on a plane. The position information may be expressed by longitude and latitude; for example, if the target path is a path created on the electronic map that approximates an actual road network, and the selected starting point is Beijing and the end point is Hong Kong, the starting point position information may be the longitude and latitude of Beijing and the end point position information may be the longitude and latitude of Hong Kong. The position information may also be represented by coordinates in a plane coordinate system; for example, if the target path is a virtual line segment unrelated to the actual road network, the starting point and the end point may be two points in the plane coordinate system, and their position information may be represented by their coordinates in that system.
The creation interface of the target route may be as shown at 701 in fig. 7, in which starting point position information and end point position information can be added. One way to add them is for an operator of the APP for displaying motion information to input them, for example entering the starting point position information in the field corresponding to it in 701 and the end point position information in its field. Another way is to generate them automatically from the starting point and end point selected by the operator: for example, if the operator selects Beijing as the starting point and clicks the position of Beijing on the map, the corresponding position information, that is, the starting point position information, is generated automatically from the clicked position, and the end point position information can be generated in the same way.
After determining the start point position information and the end point position information, the operator can click the 'create target path' button, so that the start point position information and the end point position information are obtained and the target path can be created from them.
S602, the target path is created according to the starting point position information and the end point position information.
After the start point position information and the end point position information are acquired, the target path can be created automatically from them; for example, a feasible path between the start point and the end point can be generated as the target path through the automatic routing capability of the map interface, as shown at 702 in fig. 7, where the target path is shown as 7021.
After the target path is created, if the automatically created path does not meet the requirements, it can be adjusted manually on the map to obtain a suitable target path.
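A minimal sketch of S601/S602 is given below. It assumes a hypothetical route_between() helper supplied by a map SDK with automatic routing capability; the helper's name and signature are assumptions for illustration only.

```python
import math
from typing import Callable, List, Tuple

Coordinate = Tuple[float, float]  # (longitude, latitude) in degrees


def haversine_m(a: Coordinate, b: Coordinate) -> float:
    """Great-circle distance between two coordinates, in metres."""
    lon1, lat1, lon2, lat2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))


def create_target_path(
    start: Coordinate,
    end: Coordinate,
    route_between: Callable[[Coordinate, Coordinate], List[Coordinate]],
) -> Tuple[List[Coordinate], float]:
    """Create the target path as a polyline and return it with its total length."""
    polyline = route_between(start, end)  # automatic routing from start to end
    length = sum(haversine_m(p, q) for p, q in zip(polyline, polyline[1:]))
    return polyline, length
```

The total length is computed here because the later steps need it when mapping each user's movement distance onto the path.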
It should be noted that the target path may be long, and it may take a user a long time to travel from its start point to its end point. If the end point is the only target each user has on the target path, considerable patience is required and the experience may feel monotonous. In this case, staged targets can be set on the target path: when the target path is created, at least one identification point is determined in addition to the start point and the end point, and each time a user reaches an identification point the user is considered to have completed one staged target. This keeps the user more interested in completing the entire target path and improves the user experience.
Referring to 701 in fig. 7, identification points can be added by clicking the add identification point '+' button; one identification point is added each time the '+' button is clicked, and its position information is entered in the corresponding field of 701, so that the identification point position information can be obtained when the target path is created.
When the target path is created on a map, an identification point can be a point of interest, a landmark, and the like; for example, the Great Wall (Wanli Changcheng), Baotu Spring, or a well-known pagoda can be used as an identification point.
If the obtained position information includes identification point position information in addition to the start point position information and the end point position information, then one implementation of S602 is to create the target path according to the start point position information, the end point position information, and the identification point position information, where the identification point position information identifies an identification point on the target path.
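A minimal sketch of a target path carrying identification points is shown below; each identification point is placed at a cumulative distance from the start, and a flag marks the point that closes a course. The structure and field names are assumptions introduced for illustration.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class IdentificationPoint:
    name: str
    distance_from_start: float   # metres along the target path
    closes_course: bool = False  # True if this is the last identification point of a course


@dataclass
class TargetPath:
    total_length: float                                               # metres from start point to end point
    identification_points: List[IdentificationPoint] = field(default_factory=list)
```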
The identification points may divide the target path into a plurality of courses (stages). The target path displays the position points of a plurality of users, and for a given user the motion information can be viewed in two ways. The user's motion information on the whole target path can be viewed, as shown at 801 in fig. 8. The user's motion information on a particular course of the target path can also be viewed; for example, the motion information on the course in Jiangsu may be displayed as shown at 802 in fig. 8, and the motion information of other users on that course can be viewed in the same interface. The two views are switched with button 803: in the whole-path view 801 the button reads 'my course', and clicking it switches to the display interface for the user's motion information on a particular course, that is, the interface shown at 802; in that view the button reads 'course overview', and clicking it switches back to the display interface for the user's motion information on the whole target path, that is, back to 801.
When the target path includes identification point information, the user may reach or exceed an identification point as the user's position on the target path is updated, that is, the second position point may reach or exceed an identification point on the target path. At that moment the user can be considered to have completed a staged target, and the completion information corresponding to the identification point can be provided to the user.
The completion information may include the user's performance at the time the second position point reaches or exceeds the identification point on the target path, as well as introduction information about the identification point. If a course includes a plurality of identification points, the identification point reached is the last one on that course, and the second position point reaches or exceeds it on the target path, the target user can be considered to have completed the course, and the completion information may also include a completion certificate for that course.
The completion information display interface may be as shown in fig. 9. The completion information corresponding to an identification point may be provided to the user through identification point cards, as shown at 901 in fig. 9, where each identification point has its own card; for example, the identification point 'Cangzhou Iron Lion' has a 'Cangzhou Iron Lion' card. When the second position point reaches or exceeds the identification point on the target path, the user obtains the card corresponding to that identification point and thereby obtains the completion information.
For example, if the second position point reaches or exceeds the 'Cangzhou Iron Lion' identification point on the target path, the target user obtains a 'Cangzhou Iron Lion' card. The card may provide performance information to the user, as shown at 902, such as the user's accumulated step count and the distance corresponding to it; the card may also provide the user with introduction information about the Cangzhou Iron Lion, as shown at 903, so that the target user learns about it.
If the Cangzhou Iron Lion is the last identification point of a certain course, the user may also be provided with a completion certificate for that course, as shown at 904. The completion certificate may include the user's departure time and finish time for the course, the length of the course, the time taken to complete it, and the like.
Each user can also share the completion information with other users on the target path, for example the user's friends, so that users can interact with each other, better providing them with the interactive experience brought by exercise.
By providing the completion information, the user can see his or her performance upon reaching or exceeding an identification point and can also learn about the identification point itself, which keeps the user more interested in completing the entire target path and improves the user experience.
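As a sketch (reusing the IdentificationPoint structure assumed above; the card fields are likewise illustrative), the completion information can be generated by checking which identification points lie between the first and the second position point:

```python
from typing import Dict, List


def completion_info(points: List[IdentificationPoint],
                    first_distance: float,
                    second_distance: float,
                    accumulated_steps: int) -> List[Dict[str, object]]:
    """Build one identification point card for every point newly reached or exceeded."""
    cards = []
    for p in points:
        if first_distance < p.distance_from_start <= second_distance:
            card = {
                "identification_point": p.name,
                "accumulated_steps": accumulated_steps,       # performance information
                "introduction": f"Introduction to {p.name}",  # landmark introduction text
            }
            if p.closes_course:
                card["course_completion_certificate"] = True  # e.g. the certificate shown at 904
            cards.append(card)
    return cards
```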
Next, the motion information display method is described with reference to a specific application scenario. In this scenario, the terminal device can obtain the actual number of exercise steps of each user, and a prop can be used for a user to provide an auxiliary exercise parameter (an auxiliary step count). On this basis, this embodiment provides a motion information display method; referring to fig. 10, the method includes:
S1001, acquiring the actual number of exercise steps of a plurality of users.
S1002, acquiring the auxiliary step count for a target user, where the target user is one of the plurality of users.
S1003, determining the motion parameter of each user according to the actual step count and the auxiliary step count, and saving the motion parameters independently in correspondence with the target path.
S1004, converting each motion parameter into a moving distance, and determining, from the moving distance, the motion distance of each user mapped onto the target path.
S1005, displaying the position points corresponding to the users on the target path according to the movement distance of the users on the target path.
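A minimal sketch of S1001 to S1005 follows; it assumes a fixed average stride length for converting step counts into distance and caps the mapped distance at the total length of the target path. The stride value and the data shapes are illustrative assumptions, not taken from the patent.

```python
from typing import Dict

STRIDE_M = 0.7  # assumed average stride length, metres per step


def motion_parameter(actual_steps: int, auxiliary_steps: int) -> int:
    """S1003: combine the actual step count with the auxiliary step count."""
    return actual_steps + auxiliary_steps


def distance_on_path(steps: int, path_length_m: float) -> float:
    """S1004: convert the motion parameter to metres and map it onto the target path."""
    return min(steps * STRIDE_M, path_length_m)


def display_positions(actual: Dict[str, int],
                      auxiliary: Dict[str, int],
                      path_length_m: float) -> Dict[str, float]:
    """S1005: the distance at which each user's position point is displayed."""
    return {
        user: distance_on_path(motion_parameter(steps, auxiliary.get(user, 0)),
                               path_length_m)
        for user, steps in actual.items()
    }


# Example: two users on a 10 km target path; user_b has used a prop granting
# 2,000 auxiliary steps.
positions = display_positions({"user_a": 8000, "user_b": 6500},
                              {"user_b": 2000},
                              path_length_m=10000.0)
# positions == {"user_a": 5600.0, "user_b": 5950.0}
```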
According to the technical solution above, a target path having a start point and an end point is set, and position points corresponding to a plurality of users can be displayed on it. When the motion parameters of the users are obtained, the motion distance to which each user's motion parameter is mapped on the target path can be determined, and the position point corresponding to each user can then be displayed on the target path according to that distance. Because the position points of different users are displayed on the same target path, a user who displays motion information on it can visually check his or her own progress along the path and, by checking the position points of other users, clearly see their progress as well. The target path therefore intuitively reflects the difference in motion progress between the users, better providing them with the interactive experience brought by exercise.
Based on the motion information display method provided by the foregoing embodiments, this embodiment provides a motion information display apparatus 1100. Referring to fig. 11a, the apparatus 1100 includes a first obtaining unit 1101, a determining unit 1102 and a display unit 1103:
the first obtaining unit 1101 is configured to obtain motion parameters of a plurality of users;
the determining unit 1102 is configured to determine, for each user, the movement distance to which that user's motion parameter is mapped on the target path;
the display unit 1103 is configured to display, on the target path, position points corresponding to each user according to the movement distance of each user on the target path.
In one implementation, the determining unit 1102 is configured to determine the movement distance, relative to a first position point, to which the motion parameter of a target user is mapped on the target path, where the first position point is a position point previously updated by the target user on the target path;
the display unit 1103 is configured to update the position point displayed on the target path by the target user from a first position point to a second position point according to the movement distance of the target user and the moving direction from the starting point to the ending point of the target path.
In one implementation manner, the determining unit 1102 is configured to determine the first location point on the target path of the target user and a motion parameter corresponding to the first location point; and determining a movement distance mapped to the target path relative to the first position point according to the movement parameter aiming at the target user and the movement parameter corresponding to the first position point.
In one implementation, the display unit 1103 is configured to update a first location point of the target user on the target path to a second location point according to the movement distance of the target user and a moving direction from a starting point to an end point of the target path, and move the identifier of the target user from the first location point to the second location point.
In one implementation, the first location point is a starting location point of the target user on the target path, or a location point updated when the target user logs out last time.
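A minimal sketch of the delta-based update performed by the determining and display units is given below: the movement distance is derived from the difference between the current motion parameter and the parameter recorded at the first position point, and the resulting second position point never moves past the end of the target path. The names and the conversion factor are assumptions for illustration.

```python
from dataclasses import dataclass


@dataclass
class UserPosition:
    distance_from_start: float  # the first position point, metres along the target path
    motion_parameter: float     # motion parameter recorded when that point was set


def update_position(pos: UserPosition,
                    current_parameter: float,
                    metres_per_unit: float,
                    path_length: float) -> UserPosition:
    """Move the user's position point from the first to the second position point."""
    delta = max(current_parameter - pos.motion_parameter, 0.0)
    new_distance = min(pos.distance_from_start + delta * metres_per_unit, path_length)
    return UserPosition(distance_from_start=new_distance,
                        motion_parameter=current_parameter)
```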
In one implementation, referring to fig. 11b, the apparatus 1100 further includes a second obtaining unit 1104 and an adding unit 1105:
the second obtaining unit 1104 is configured to obtain a join request of the target user;
the adding unit 1105 is configured to add a location point corresponding to the target user at the starting point of the target route.
In one implementation, referring to fig. 11c, the apparatus 1100 further includes a first providing unit 1106:
the first providing unit 1106 is configured to provide auxiliary exercise parameters for the user, where the auxiliary exercise parameters are used to change the exercise parameters of any user in the plurality of users.
In one implementation, the motion parameters of the plurality of users are independently saved corresponding to the target path.
In one implementation, referring to fig. 11d, the apparatus 1100 further includes a third obtaining unit 1107 and a creating unit 1108:
the third acquiring unit 1107 is configured to acquire start point position information corresponding to a start point and end point position information corresponding to an end point of the target path;
the creating unit 1108 is configured to create the target route according to the starting point position information and the ending point position information.
In an implementation manner, the creating unit 1108 is configured to create the target path according to the start position information, the end position information, and identification point position information, where the identification point position information is used to identify an identification point of the target path.
In one implementation, referring to fig. 11e, the apparatus further includes a second providing unit 1109:
the second providing unit 1109 is configured to provide, to the target user, completion information corresponding to the identification point when the second location point reaches or exceeds the identification point on the target path.
As can be seen from the above technical solution, by setting a target path having a start point and an end point, position points corresponding to a plurality of users can be displayed on the target path. When the first obtaining unit 1101 obtains the motion parameters of the users, the determining unit 1102 can determine the motion distance to which each user's motion parameter is mapped on the target path, and the display unit 1103 can then display the position point corresponding to each user on the target path according to that distance. Because the position points of different users are displayed on the same target path, a user who displays motion information on it can visually check his or her own progress along the path and, by checking the position points of other users, clearly see their progress as well. The target path therefore intuitively reflects the difference in motion progress between the users, better providing them with the interactive experience brought by exercise.
The embodiments of the present application further provide a device for displaying motion information, described below with reference to the drawings. Referring to fig. 12, an embodiment of the present application provides a device 1200 for motion information display. The device 1200 may be a server; servers may differ considerably in configuration and performance, and may include one or more central processing units (CPUs) 1222 (e.g., one or more processors), a memory 1232, and one or more storage media 1230 (e.g., one or more mass storage devices) storing application programs 1242 or data 1244. The memory 1232 and the storage media 1230 may be transient storage or persistent storage. The programs stored in a storage medium 1230 may include one or more modules (not shown), each of which may include a series of instruction operations for the server. Further, the central processing unit 1222 may be configured to communicate with the storage medium 1230 and execute, on the device 1200, the series of instruction operations in the storage medium 1230.
The device 1200 for motion information display may also include one or more power supplies 1226, one or more wired or wireless network interfaces 1250, one or more input/output interfaces 1258, and/or one or more operating systems 1241, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
The steps performed by the server in the above embodiment may be based on the server structure shown in fig. 12.
The CPU 1222 is configured to perform the following steps:
acquiring a motion parameter for a target user, wherein the target user is one of the plurality of users;
determining a movement distance corresponding to the target user on the target path according to the movement parameters aiming at the target user;
and updating a first position point of the target user on the target path to a second position point according to the movement distance and the moving direction from the starting point to the end point.
Referring to fig. 13, an embodiment of the present application provides a device 1300 for displaying motion information. The device 1300 may also be a terminal device, which may be any terminal device such as a mobile phone, a tablet computer, a personal digital assistant (PDA), a point of sale (POS) terminal, or a vehicle-mounted computer; the following takes a mobile phone as an example:
fig. 13 is a block diagram illustrating a partial structure of a mobile phone related to a terminal device provided in an embodiment of the present application. Referring to fig. 13, the handset includes: a Radio Frequency (RF) circuit 1310, a memory 1320, an input unit 1330, a display unit 1340, a sensor 1350, an audio circuit 1360, a wireless fidelity (WiFi) module 1370, a processor 1380, and a power supply 1390. Those skilled in the art will appreciate that the handset configuration shown in fig. 13 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 13:
The RF circuit 1310 may be used for receiving and transmitting signals during information transmission or a call; in particular, downlink information received from a base station is passed to the processor 1380 for processing, and uplink data is sent to the base station. In general, the RF circuit 1310 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 1310 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Message Service (SMS), and the like.
The memory 1320 may be used to store software programs and modules, and the processor 1380 executes the various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 1320. The memory 1320 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data or a phonebook), and the like. Further, the memory 1320 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The input unit 1330 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 1330 may include a touch panel 1331 and other input devices 1332. Touch panel 1331, also referred to as a touch screen, can collect touch operations by a user (e.g., operations by a user on or near touch panel 1331 using any suitable object or accessory such as a finger, a stylus, etc.) and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 1331 may include two portions of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 1380, where the touch controller can receive and execute commands sent by the processor 1380. In addition, the touch panel 1331 may be implemented by various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 1330 may include other input devices 1332 in addition to the touch panel 1331. In particular, other input devices 1332 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 1340 may be used to display information input by the user or information provided to the user, as well as various menus of the mobile phone. The display unit 1340 may include a display panel 1341, which may optionally be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. Further, the touch panel 1331 can overlay the display panel 1341; when the touch panel 1331 detects a touch operation on or near it, the operation is transmitted to the processor 1380 to determine the type of touch event, and the processor 1380 then provides a corresponding visual output on the display panel 1341 according to the type of touch event. Although in fig. 13 the touch panel 1331 and the display panel 1341 are two independent components implementing the input and output functions of the mobile phone, in some embodiments they may be integrated to implement the input and output functions.
The handset may also include at least one sensor 1350, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 1341 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 1341 and/or the backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The audio circuit 1360, speaker 1361 and microphone 1362 may provide an audio interface between the user and the mobile phone. The audio circuit 1360 may transmit the electrical signal converted from received audio data to the speaker 1361, which converts it into a sound signal for output; on the other hand, the microphone 1362 converts a collected sound signal into an electrical signal, which is received by the audio circuit 1360 and converted into audio data; the audio data is then processed by the processor 1380 and sent, for example, to another mobile phone via the RF circuit 1310, or output to the memory 1320 for further processing.
WiFi belongs to short-distance wireless transmission technology, and the mobile phone can help a user to receive and send e-mails, browse webpages, access streaming media and the like through the WiFi module 1370, and provides wireless broadband internet access for the user. Although fig. 13 shows the WiFi module 1370, it is understood that it does not belong to the essential constitution of the handset, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 1380 is a control center of the mobile phone, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 1320 and calling data stored in the memory 1320, thereby integrally monitoring the mobile phone. Optionally, processor 1380 may include one or more processing units; preferably, the processor 1380 may integrate an application processor, which handles primarily operating systems, user interfaces, application programs, etc., and a modem processor, which handles primarily wireless communications. It will be appreciated that the modem processor described above may not be integrated within processor 1380.
The mobile phone also includes a power supply 1390 (e.g., a battery) that supplies power to the various components; preferably, the power supply is logically connected to the processor 1380 via a power management system, so that charging, discharging, and power consumption are managed through the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In this embodiment, the processor 1380 included in the terminal device further has the following functions:
acquiring motion parameters of a plurality of users;
respectively determining the movement distance of the movement parameter of each user mapped to the target path;
and displaying the position point corresponding to each user on the target path according to the movement distance of each user on the target path.
The embodiment of the present application further provides a computer-readable storage medium, configured to store a program code, where the program code is configured to execute any one implementation of the motion information presentation method described in the foregoing embodiments.
The terms "first," "second," "third," "fourth," and the like in the description of the application and the above-described figures, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" for describing an association relationship of associated objects, indicating that there may be three relationships, e.g., "a and/or B" may indicate: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of single item(s) or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the present application that in essence contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (15)

1. A method for displaying motion information, the method comprising:
acquiring motion parameters of a plurality of users;
respectively determining the movement distance of the movement parameter of each user mapped to the target path;
and displaying the position point corresponding to each user on the target path according to the movement distance of each user on the target path.
2. The method of claim 1, wherein said determining, for a target user of said plurality of users, a movement distance at which a movement parameter of each of said users is mapped onto a target path comprises:
determining a movement distance of the target user relative to a first position point on the target path, wherein the movement distance is mapped to the movement parameter of the target user, and the first position point is a position point which is updated by the target user on the target path in history;
the displaying, according to the movement distance of each user on the target path, a position point corresponding to each user on the target path includes:
and updating the position point displayed on the target path by the target user from a first position point to a second position point according to the movement distance of the target user and the moving direction from the starting point to the end point of the target path.
3. The method of claim 2, wherein determining the mapping of the motion parameter of the target user to the motion distance on the target path relative to the first location point comprises:
determining the first position point of the target user on the target path and the motion parameter corresponding to the first position point;
and determining a movement distance mapped to the target path relative to the first position point according to the movement parameter aiming at the target user and the movement parameter corresponding to the first position point.
4. The method of claim 2, wherein the updating the position point displayed on the target path by the target user from a first position point to a second position point according to the movement distance of the target user and the moving direction from the starting point to the ending point of the target path comprises:
and updating the first position point of the target user on the target path to the second position point according to the movement distance of the target user and the moving direction from the starting point to the end point of the target path, and moving the identifier of the target user from the first position point to the second position point.
5. The method according to any one of claims 2 to 4, wherein the first location point is a starting location point of the target user on the target path or a location point updated when the target user logs out last time.
6. The method of claim 2, wherein prior to obtaining the motion parameters for the target user, the method further comprises:
acquiring a joining request of the target user;
and adding a position point corresponding to the target user at the starting point of the target path.
7. The method of claim 1, further comprising:
providing the user with an auxiliary exercise parameter for altering an exercise parameter of any of the plurality of users.
8. The method according to claim 1 or 2, characterized in that the method further comprises:
acquiring starting point position information corresponding to a starting point and end point position information corresponding to an end point of the target path;
and creating the target path according to the starting point position information and the end point position information.
9. The method of claim 8, wherein creating the target path based on the start point location information and the end point location information comprises:
and creating the target path according to the starting point position information, the end point position information and the identification point position information, wherein the identification point position information is used for identifying the identification point of the target path.
10. The method of claim 9, wherein if the position point displayed on the target path by the target user is updated from a first position point to a second position point according to the movement distance of the target user and the moving direction from the starting point to the ending point of the target path, the method further comprises:
and when the second position point reaches or exceeds the identification point on the target path, providing completion information corresponding to the identification point for the target user.
11. An exercise information presentation apparatus, characterized in that the apparatus comprises a first acquisition unit, a determination unit and a display unit:
the first acquisition unit is used for acquiring the motion parameters of a plurality of users;
the determining unit is used for respectively determining the movement distance of the movement parameter of each user mapped to the target path;
and the display unit is used for displaying the position point corresponding to each user on the target path according to the movement distance of each user on the target path.
12. The apparatus according to claim 11, wherein the determining unit is configured to determine that the motion parameter of the target user is mapped to a motion distance on the target path relative to a first location point, where the first location point is a location point that is updated by the target user historically on the target path;
and the display unit is used for updating the position point displayed on the target path by the target user from a first position point to a second position point according to the movement distance of the target user and the moving direction from the starting point to the end point of the target path.
13. The apparatus according to claim 12, wherein the determining unit is configured to determine the first location point on the target path of the target user and the motion parameter corresponding to the first location point; and determining a movement distance mapped to the target path relative to the first position point according to the movement parameter aiming at the target user and the movement parameter corresponding to the first position point.
14. An apparatus for athletic information presentation, the apparatus comprising a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to execute the motion information presentation method according to any one of claims 1 to 10 according to instructions in the program code.
15. A computer-readable storage medium for storing a program code for executing the motion information presentation method according to any one of claims 1 to 10.
CN201810771584.0A 2018-07-13 2018-07-13 Motion information display method and device Pending CN110716773A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810771584.0A CN110716773A (en) 2018-07-13 2018-07-13 Motion information display method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810771584.0A CN110716773A (en) 2018-07-13 2018-07-13 Motion information display method and device

Publications (1)

Publication Number Publication Date
CN110716773A true CN110716773A (en) 2020-01-21

Family

ID=69209325

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810771584.0A Pending CN110716773A (en) 2018-07-13 2018-07-13 Motion information display method and device

Country Status (1)

Country Link
CN (1) CN110716773A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106913340A (en) * 2009-04-26 2017-07-04 耐克创新有限合伙公司 GPS features and function in sports watch system
CN105664447A (en) * 2010-06-28 2016-06-15 耐克创新有限合伙公司 Monitoring and tracking athletic activity
CN105381588A (en) * 2014-09-02 2016-03-09 耐克创新有限合伙公司 Monitoring fitness using a mobile device
CN107427713A (en) * 2015-04-27 2017-12-01 欧姆龙健康医疗事业株式会社 Movable information measure device, exercising support method and motion auxiliary program
CN105797349A (en) * 2016-03-17 2016-07-27 深圳市智游人科技有限公司 Live-action running device, method and system
CN107580120A (en) * 2017-08-30 2018-01-12 努比亚技术有限公司 The recording method of running route and mobile terminal

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114488761A (en) * 2022-01-13 2022-05-13 杭州景而腾科技有限公司 Tensile race timing device
CN114488761B (en) * 2022-01-13 2023-08-08 杭州景而腾科技有限公司 Tension race timing device
CN116543013A (en) * 2023-04-19 2023-08-04 北京拙河科技有限公司 Ball movement track analysis method and device


Legal Events

Date Code Title Description
PB01 Publication
REG Reference to a national code (country of ref document: HK; legal event code: DE; document number: 40020306)
SE01 Entry into force of request for substantive examination