CN113838550A - Health data display method and electronic equipment - Google Patents


Info

Publication number
CN113838550A
CN113838550A
Authority
CN
China
Prior art keywords
user
interface
health data
elements
dimension
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010514771.8A
Other languages
Chinese (zh)
Inventor
漆永强
梁萍
黎元昶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202010514771.8A
Priority to PCT/CN2021/092122 (published as WO2021249073A1)
Publication of CN113838550A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 15/00: ICT specially adapted for medical reports, e.g. generation or transmission thereof
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 24/00: Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Physical Education & Sports Medicine (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of this application discloses a health data display method and an electronic device, relates to the field of electronic devices, and solves the problem that current health data cannot be displayed to the user intuitively. The specific scheme is as follows: a first operation of a user is received. In response to the first operation, a first interface is displayed. A first area of the first interface includes M first elements, the M first elements correspond to M dimensions, and each of the M first elements is used to display the completion degree of its corresponding dimension. The completion degree of a dimension is the completion degree of the tasks included in that dimension, the completion degrees of the tasks are determined according to the corresponding health data, and M is an integer greater than 1. The figure formed by the M first elements is rotationally symmetric, and any two of the M first elements have equal area.

Description

Health data display method and electronic equipment
Technical Field
Embodiments of this application relate to the field of electronic devices, and in particular to a health data display method and an electronic device.
Background
As living standards improve, the demand for health management in daily life keeps growing. At present, a health-management application (APP) may be installed on an electronic device (such as a smartphone), and collected health data, such as the user's exercise activity, is displayed to the user through the APP. The user can thus check his or her current health condition through the APP on the electronic device and thereby manage his or her own health.
The health data interface that the electronic device displays to the user through the APP therefore plays a very important role in helping the user manage his or her health efficiently.
Disclosure of Invention
Embodiments of this application provide a health data display method and an electronic device, and solve the problem that current health data cannot be displayed to the user intuitively.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
In a first aspect, a health data display method is provided. The method includes: receiving a first operation of a user; and in response to the first operation, displaying a first interface. A first area of the first interface includes M first elements, the M first elements correspond to M dimensions, and each of the M first elements is used to display the completion degree of its corresponding dimension. The completion degree of a dimension is the completion degree of the tasks included in that dimension, the completion degrees of the tasks are determined according to the corresponding health data, and M is an integer greater than 1. The figure formed by the M first elements is rotationally symmetric, and any two of the M first elements have equal area.
Based on this scheme, in the embodiments of this application the health data can be divided into different dimensions, each dimension can include one or more tasks, and the task completion in the different dimensions is displayed as a whole, so that the user can clearly and intuitively learn the current completion status of the health data. As an example, the interface may display the health data as a multi-leaf (e.g., clover) model, where each leaf is a first element and is used to display the completion degree of the dimension corresponding to that first element.
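The dimension-and-task structure described above can be sketched as a minimal data model (the `Task` and `Dimension` names and fields are illustrative, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    target: float   # task goal, e.g. 10000 steps
    current: float  # progress derived from collected health data

    @property
    def done(self) -> bool:
        return self.current >= self.target

@dataclass
class Dimension:
    name: str
    tasks: list

    @property
    def completion(self) -> float:
        """Fraction of this dimension's tasks that are completed."""
        if not self.tasks:
            return 0.0
        return sum(t.done for t in self.tasks) / len(self.tasks)

# With M = 3 such dimensions, the interface would draw a three-leaf clover,
# one leaf (first element) per dimension.
exercise = Dimension("exercise", [Task("steps", 10000, 6500),
                                  Task("active minutes", 30, 45)])
print(exercise.completion)  # 0.5
```

Each leaf would then be rendered according to its dimension's `completion` value, as the later implementations (area, color, transparency) describe.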
In a possible implementation, the first area further includes N second elements, each of the N second elements corresponds to one first element, and N is an integer less than or equal to M and greater than or equal to 0. A second element is contained inside its corresponding first element and is used to indicate the completion degree of the dimension corresponding to that first element. Based on this scheme, more elements can be set so that the user learns the current health data more clearly from the first interface. Illustratively, taking M = 3, a first element corresponds to one leaf in the clover model. Two layers may be arranged on the first interface: the lower layer may be a clover of fixed size and color that serves as a reference, and an upper layer may be arranged on top of it, for example a small clover leaf set inside a clover leaf as the second element. When the completion degree of the dimension corresponding to a leaf in the lower layer changes, that completion degree can be shown to the user intuitively through differences in the color, area (i.e., size), and/or transparency of the small leaf.
In a possible implementation, in the first interface the areas of the M first elements are fixed, and the area of a second element is determined according to the completion degree of the dimension corresponding to the first element that contains it. The area of the second element is positively correlated with the completion degree of the dimension. This provides an implementation in which the second element changes with the task completion in its dimension: as task completion rises, the area of the second element (such as the small leaf in the example above) grows with it, so the user can intuitively learn the completion progress of the tasks in the corresponding dimension from the change in the small leaf's size.
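As a sketch, the positive correlation between the second element's area and the dimension's completion could be a direct proportion (the function name and the choice of a simple proportion are assumptions; the patent only requires a positive correlation):

```python
def second_element_area(first_element_area: float, completion: float) -> float:
    """Area of the inner (second) element within its fixed-area first element.

    first_element_area: fixed area of the enclosing leaf (the first element);
    completion: fraction of completed tasks in the dimension, clamped to [0, 1].
    """
    completion = max(0.0, min(1.0, completion))
    return first_element_area * completion

print(second_element_area(120.0, 0.25))  # 30.0
```

At zero completion the small leaf vanishes, and at full completion it fills the reference leaf, matching the described visual progression.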
In one possible implementation, the first element and the second element have different colors. Based on this scheme, the user can use color to distinguish the first element, which serves as the background, from the second element, which represents the current task completion, and thereby perceive their size difference.
In a possible implementation, while the first element is displayed on the first interface, taking the two-dimensional coordinate system of the electronic device's display as a reference, the center line of the first element forms a first included angle with a coordinate axis at a first time and a second included angle with the coordinate axis at a second time, where the second time is later than the first time and the second included angle differs from the first included angle. Based on this scheme, when the first element is displayed on the first interface it can be presented with an animation effect, for example a rotation about the root position of the clover leaf as the origin. It can be understood that while the clover leaf is shown rotating, the angle between its center line and the coordinate axis varies over time. The two-dimensional coordinate system of the display may take the center of the electronic device's display screen as the origin; when the user holds the device facing the screen, the positive x-axis points right along the device's horizontal direction and the positive y-axis points up along its vertical direction.
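The changing included angle can be sketched as a time-based interpolation (all names, the angle values, and the linear easing are illustrative assumptions; the patent only requires that the angle at a later time differ from the angle at an earlier time):

```python
def leaf_angle(t: float, duration: float,
               start_deg: float = 0.0, end_deg: float = 90.0) -> float:
    """Angle between a leaf's center line and the x-axis at time t of the
    entry animation, rotating about the leaf's root position."""
    if duration <= 0:
        return end_deg
    progress = max(0.0, min(1.0, t / duration))  # clamp to the animation window
    return start_deg + (end_deg - start_deg) * progress

print(leaf_angle(0.5, 2.0))  # 22.5
```

A real animation framework would call this once per frame and redraw the leaf at the returned angle.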
In a possible implementation, while the second element is displayed on the first interface, taking the two-dimensional coordinate system of the electronic device's display as a reference, the center line of the second element forms a third included angle with a coordinate axis at a third time and a fourth included angle with the coordinate axis at a fourth time, where the fourth time is later than the third time and the fourth included angle differs from the third included angle. This provides an animated display mode for the second element, analogous to the dynamic display of the first element in the preceding implementation.
In one possible implementation, when the tasks in the dimensions corresponding to the M first elements are not all completed, the colors of the M first elements differ from one another; when the tasks in all the corresponding dimensions are completed, the colors of the M first elements are the same. Based on this scheme, a display method is provided: while any task remains incomplete, each first element (e.g., each leaf of the M-leaf clover) may have a different color, and once all tasks are completed the first elements are shown in one unified color, for example green, so the user can tell at a glance whether all current tasks are done.
In one possible implementation, the method further includes: receiving a second operation of the user on a first element; and in response to the second operation, displaying a second interface that includes the detailed completion status of the tasks in the dimension corresponding to that first element. Based on this scheme, when the user wants detailed information about the tasks in the dimension corresponding to a first element, the user can input a second operation, such as a click or touch, on that element and see the specifics on the second interface: which tasks the dimension includes, the current completion of each task, the goal of each task, and so on.
In one possible implementation, the first interface further includes a second area that contains first historical health data within a preset period, displayed by any of the possible health data display methods described above. Based on this scheme, the first interface can provide an area, namely the second area, where the user views the completion of historical tasks within a preset period. The preset period may be a predetermined length of time such as one week or one month.
In one possible implementation, the first historical health data is displayed through thumbnails of the images formed by the M first elements on different dates within the preset period. Since the display area of the first interface is limited, this example provides a compact implementation: at each date in the period, a thumbnail of that day's M-leaf clover is shown, presenting more information in a smaller screen area.
In one possible implementation, the method further includes: receiving a third operation of the user on a thumbnail in the second area; and in response to the third operation, presenting the M first elements corresponding to the thumbnail in the first area. Based on this scheme, when the user wants to view the specific task completion of a certain day in the history, the user can input a third operation, such as a click or touch, on the corresponding thumbnail; the thumbnail is then enlarged into the first area so that day's task completion can be viewed clearly.
In one possible implementation, the method further includes: receiving a fourth operation of the user on a thumbnail in the second area; and in response to the fourth operation, presenting the M first elements and the N second elements corresponding to the thumbnail in the first area. Based on this scheme, when the user views the task completion of a certain day in the history, the first elements can be displayed in the first area together with the second elements, so the user clearly learns that day's task completion through the second elements.
In one possible implementation, the method further includes: receiving a fifth operation of the user on the second area; and in response to the fifth operation, displaying second historical health data in the second area, where the second historical health data is displayed through thumbnails of the images formed by the M first elements on different dates within the preset period. The time corresponding to the second historical health data differs from the time corresponding to the first historical health data. Based on this scheme, when the user wants to view earlier or later history, the user can input a fifth operation, such as sliding left or right, on the second area so that the history of the corresponding preset period is displayed on the first interface.
In one possible implementation, the first interface further includes a third area that contains a motivational phrase, and the motivational phrase is determined according to the completion degrees of the dimensions corresponding to the M first elements shown in the first area. By setting motivational phrases, the user receives health management suggestions given by the electronic device according to the current exercise situation, helping the user manage his or her health more scientifically. For example, the motivational phrase may be displayed in the first interface for the current date, or for a historical date.
In one possible implementation, the first interface further includes a first control, and the method includes: receiving a sixth operation of the user on the first control; and in response to the sixth operation, displaying a third interface that includes an image corresponding to the M first elements displayed in the first area, so that the image can be shared. Based on this scheme, the first interface provides the user with an entry, namely the first control, for sharing the current M-leaf clover model. By inputting a sixth operation such as a click or touch, the user obtains a picture containing the image of the current M-leaf clover model and can then share the health data through that picture.
In one possible implementation, the first interface further includes a second control, and the method further includes: receiving a seventh operation of the user on the second control; and in response to the seventh operation, displaying a configuration menu on the first interface so that the user can set tasks in different dimensions through the configuration menu, or generate a weekly/monthly/annual report of the health data through it. The second control lets the user actively manage the dimensions corresponding to the different leaves of the M-leaf clover and the task entries within those dimensions. By inputting a seventh operation such as a click or touch, the user sees the configuration menu on the first interface and then operates the controls of the target management function in the menu to manage the dimensions and their tasks. In addition, the configuration menu may include function controls for generating a weekly, monthly, quarterly, and/or annual report, so the user can operate them to obtain the corresponding aggregated data.
In one possible implementation, the health data is collected by the electronic device itself, and/or collected by other electronic devices, and/or entered by the user. This explains how the health data may be acquired. For example, some or all of it may be collected by a sensor module in the electronic device. As another example, some or all of it may be collected by a wearable device communicating with the electronic device. In yet another example, the user may enter it manually (for example, through a check-in operation).
In one possible implementation, the method further includes: receiving an eighth operation of the user, where the eighth operation includes a double-click, long-press, or zoom operation on the first area; and in response to the eighth operation, adding a first element in the first area, where the dimension corresponding to the added first element differs from the dimensions corresponding to the first elements currently displayed in the first area. This provides a convenient way for the user to manage the different dimensions and their tasks: inputting the eighth operation on a leaf of the M-leaf clover and/or on a blank part of the first area adds or removes a leaf, and the user can then modify the dimension information corresponding to the newly added leaf through the interface. It can be understood that when the user adds or deletes a leaf through the eighth operation, the corresponding configuration menu is adjusted automatically, ensuring consistency between the M-leaf clover displayed in the first interface and the information in the configuration menu.
In a possible implementation, the dimension corresponding to each first element includes one or more tasks, and first elements with different task completions differ in color and/or transparency. This provides an easy-to-implement display method: only one layer is arranged on the first interface, and the color and/or transparency of that layer corresponds to the completion degree of the associated dimension. Task completion in a dimension can then be shown quickly by adjusting the color or transparency of the different elements. Of course, in other embodiments the task completion degree may also be shown by adjusting the size of the first element on the first interface.
In one possible implementation, for each of the M first elements, when none of the tasks in the dimension corresponding to the first element is completed, the color of the first element is a first color; when all tasks in the dimension are completed, the color of the first element is a second color. Different colors of the first element represent different completion degrees of the tasks in the corresponding dimension, and the deeper the color of the first element, the higher the completion degree. This provides a specific method of showing a dimension's task completion through the first element's color: the darker the color, the higher the completion, and conversely, the lighter the color, the lower the completion.
In one possible implementation, the color of the first element is determined according to the following formulas:
R = S_R + (E_R − S_R) × (i / N_i)
G = S_G + (E_G − S_G) × (i / N_i)
B = S_B + (E_B − S_B) × (i / N_i)
where S_R, S_G, and S_B are the R, G, and B values of the first color; E_R, E_G, and E_B are the R, G, and B values of the second color; N_i is the number of tasks in the i-th dimension; and i is the number of completed tasks in the i-th dimension. This provides an example correspondence between color and task completion: through these formulas, the color to be displayed at the current task completion degree can be obtained.
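A minimal sketch of this color calculation, assuming the formula is a per-channel linear interpolation from the first color to the second color as the completed-task fraction i/N_i grows (the function name and the example colors are illustrative):

```python
def element_color(start_rgb, end_rgb, completed, total):
    """Interpolate each RGB channel between the first color (no tasks
    completed) and the second color (all tasks completed)."""
    frac = (completed / total) if total else 0.0
    return tuple(round(s + (e - s) * frac) for s, e in zip(start_rgb, end_rgb))

light_green = (204, 255, 204)  # illustrative first color
dark_green = (0, 128, 0)       # illustrative second color
print(element_color(light_green, dark_green, 1, 2))  # (102, 192, 102)
```

At i = 0 the element shows the first color, at i = N_i the second, and intermediate counts give progressively deeper shades, matching the "darker means more complete" rule above.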
In a possible implementation, for each of the M first elements, when none of the tasks in the dimension corresponding to the first element is completed, the transparency of the first element is a first transparency; when all tasks in the dimension are completed, the transparency of the first element is a second transparency. Different transparencies of the first element represent different completion degrees of the tasks in the corresponding dimension, and the greater the transparency of the first element, the higher the completion degree. This is another single-layer scheme for displaying task completion: the completion of the tasks is shown through the transparency of the first element. For example, the greater the transparency, the higher the completion, and conversely, the smaller the transparency, the lower the completion; alternatively, the smaller the transparency, the higher the completion, and the greater the transparency, the lower the completion.
In one possible implementation, the transparency of the first element is determined according to the following formula:
T = T_0 + (T_N − T_0) × (i / N_i)
where T_0 is the first transparency, T_N is the end transparency, N_i is the number of tasks in the i-th dimension, and i is the number of completed tasks in the i-th dimension. This provides the method of determining the transparency to display according to the task completion degree.
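A minimal sketch of this transparency calculation under the same notation, assuming a linear interpolation from T0 (zero completed tasks) to TN (all tasks completed); the function name is illustrative:

```python
def element_transparency(t0: float, tn: float, completed: int, total: int) -> float:
    """Transparency interpolated between t0 (no tasks completed) and tn
    (all tasks completed) as the completed fraction grows."""
    frac = (completed / total) if total else 0.0
    return t0 + (tn - t0) * frac

print(element_transparency(0.0, 1.0, 3, 4))  # 0.75
```

Whether higher transparency means higher or lower completion is just a matter of choosing t0 < tn or t0 > tn, covering both orderings mentioned above.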
In a second aspect, a health data display apparatus is provided. The apparatus includes a receiving unit and a display unit. The receiving unit is configured to receive a first operation of a user. The display unit is configured to display, in response to the first operation, a first interface, where a first area of the first interface includes M first elements, the M first elements correspond to M dimensions, and each of the M first elements is used to display the completion degree of its corresponding dimension. The completion degree of a dimension is the completion degree of the tasks included in that dimension, the completion degrees of the tasks are determined according to the corresponding health data, and M is an integer greater than 1. The figure formed by the M first elements is rotationally symmetric, and any two of the M first elements have equal area.
In a possible implementation manner, the first region further includes N second elements, each of the N second elements corresponds to one first element, and N is an integer smaller than or equal to M and greater than or equal to 0. The second element is included in the first element corresponding to the second element and used for indicating the completion degree of the dimension corresponding to the first element.
In a possible implementation manner, in the first interface, the areas of the M first elements are fixed, and the area of the second element is determined according to the degree of completion of the dimension corresponding to the first element corresponding to the second element. The area of the second element is in positive correlation with the completion degree of the dimension.
In one possible implementation, the first element and the second element are different colors.
In a possible implementation manner, in the process of displaying the first element on the first interface, the two-dimensional coordinate of the display image of the electronic device is used as a reference, the center line of the first element forms a first included angle with the coordinate axis at a first time, the center line of the first element forms a second included angle with the coordinate axis at a second time, the second time is later than the first time, and the second included angle is different from the first included angle.
In a possible implementation manner, in the process of displaying the second element on the first interface, the two-dimensional coordinate of the display image of the electronic device is used as a reference, the center line of the second element forms a third included angle with the coordinate axis at a third time, the center line of the second element forms a fourth included angle with the coordinate axis at a fourth time, the fourth time is later than the third time, and the fourth included angle is different from the third included angle.
In one possible implementation manner, when the tasks in the dimensions corresponding to the M first elements are not completely completed, the color of each first element in the M first elements is different from each other. When the tasks under the corresponding dimensions of the M first elements are all completed, the colors of the M first elements are the same.
In a possible implementation manner, the receiving unit is further configured to receive a second operation of the first element by the user. The display unit is further used for responding to the second operation and displaying a second interface, and the second interface comprises detailed completion conditions of tasks under the corresponding dimension of the first element.
In one possible implementation, the first interface further includes a second area that contains first historical health data within a preset period, displayed by any of the possible health data display apparatuses described above.
In one possible implementation manner, the first historical health data is displayed through thumbnails of images composed of M first elements corresponding to different dates in the preset period.
In a possible implementation manner, the receiving unit is further configured to receive a third operation of the thumbnail in the second area by the user. The presentation unit is further configured to present M first elements corresponding to the thumbnail in the first area in response to the third operation.
In a possible implementation manner, the receiving unit is further configured to receive a fourth operation of the thumbnail in the second area by the user. The presentation unit is further configured to present, in response to the fourth operation, M first elements and N second elements corresponding to the thumbnail in the first area.
In a possible implementation manner, the receiving unit is further configured to receive a fifth operation of the second area by the user. The display unit is further used for responding to the fifth operation, displaying second historical health data in the second area, wherein the second historical health data are displayed through thumbnails of images formed by M first elements corresponding to different dates in the preset period. The time corresponding to the second historical health data is different from the time corresponding to the first historical health data.
In one possible implementation, the first interface further includes a third area, where the third area includes a motivational phrase, and the motivational phrase is determined according to the completion status of the dimensions corresponding to the M first elements displayed in the first area.
In a possible implementation manner, the first interface further includes a first control, and the receiving unit is further configured to receive a sixth operation of the first control by the user. The display unit is further configured to display, in response to the sixth operation, a third interface, where the third interface includes an image corresponding to the M first elements displayed in the first area, so as to perform a sharing operation on the image.
In a possible implementation manner, the first interface further includes a second control, and the receiving unit is further configured to receive a seventh operation performed by the user on the second control. The display unit is further configured to display, in response to the seventh operation, a configuration menu on the first interface, so that the user can set tasks in different dimensions through the configuration menu, or generate a weekly/monthly/annual report corresponding to the health data through the configuration menu.
In one possible implementation, the health data is acquired by the electronic device itself, and/or acquired by other electronic devices, and/or input by the user itself.
In a possible implementation manner, the receiving unit is further configured to receive an eighth operation of the user, where the eighth operation includes a double-tap operation, a long-press operation, or a zoom operation on the first area. The display unit is further configured to add, in response to the eighth operation, a first element in the first area, where the dimension corresponding to the added first element is different from the dimensions corresponding to the first elements currently displayed in the first area.
In a possible implementation manner, the dimension corresponding to each first element includes one or more tasks; first elements with different task completion levels differ in color and/or transparency.
In one possible implementation manner, for each of the M first elements, when none of the tasks in the dimension corresponding to the first element is completed, the color of the first element is a first color; when all tasks in the dimension are completed, the color of the first element is a second color. Different colors of the first element represent different completion levels of the tasks in the corresponding dimension: the deeper the color of the first element, the higher the completion level of the tasks in the corresponding dimension.
In one possible implementation, the color of the first element is determined by linearly interpolating each RGB channel according to the following formulas:

R = S_R + (E_R − S_R) × n_i / N_i
G = S_G + (E_G − S_G) × n_i / N_i
B = S_B + (E_B − S_B) × n_i / N_i

where S_R, S_G, and S_B are the R, G, and B values of the first color; E_R, E_G, and E_B are the R, G, and B values of the second color; N_i is the number of tasks in the ith dimension; and n_i is the number of completed tasks in the ith dimension.
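The original publication renders this formula as an image, so the exact expression is not reproduced here; assuming the linear per-channel interpolation implied by the variable definitions above, a minimal sketch (function name and example colors are illustrative):

```python
def element_color(start_rgb, end_rgb, completed, total):
    """Interpolate each RGB channel from the first color (no tasks in the
    dimension completed) toward the second color (all tasks completed),
    proportionally to the completion ratio n_i / N_i."""
    if total <= 0:
        return start_rgb
    ratio = min(max(completed / total, 0.0), 1.0)
    return tuple(round(s + (e - s) * ratio) for s, e in zip(start_rgb, end_rgb))

# Example: 2 of 4 tasks done moves a light green halfway toward a deep green.
print(element_color((200, 240, 200), (0, 128, 0), 2, 4))  # (100, 184, 100)
```

Because the ratio is clamped to [0, 1], the result equals the first color when no task is done and the second color when every task is done, matching the two endpoints described above.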
In a possible implementation manner, for each of the M first elements, when none of the tasks in the dimension corresponding to the first element is completed, the transparency of the first element is a first transparency; when all tasks in the dimension are completed, the transparency of the first element is a second transparency. Different transparencies of the first element represent different completion levels of the tasks in the corresponding dimension: the greater the transparency of the first element, the higher the completion level of the tasks in the corresponding dimension.
In one possible implementation, the transparency of the first element is determined according to the following formula:

T = T_0 + (T_N − T_0) × n_i / N_i

where T_0 is the first transparency, T_N is the second (end) transparency, N_i is the number of tasks in the ith dimension, and n_i is the number of completed tasks in the ith dimension.
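As with the color formula, the transparency formula appears only as an image in the original; assuming the same linear interpolation between the endpoints defined above, a minimal sketch:

```python
def element_alpha(t0, tn, completed, total):
    """Interpolate the element's transparency from the first transparency t0
    (no tasks completed) to the second transparency tn (all completed)."""
    if total <= 0:
        return t0
    ratio = min(max(completed / total, 0.0), 1.0)
    return t0 + (tn - t0) * ratio
```

For example, with t0 = 0.2, tn = 1.0, and 2 of 4 tasks completed, the transparency evaluates to 0.6, halfway between the two endpoints.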
In a third aspect, an electronic device is provided that includes one or more processors and one or more memories. The one or more memories are coupled to the one or more processors and store computer instructions. The computer instructions, when executed by the one or more processors, cause the electronic device to perform the health data display method according to the first aspect and any of its possible implementations.
In a fourth aspect, a computer-readable storage medium is provided that includes computer instructions which, when executed, perform the health data display method according to the first aspect and any of its possible implementations.
In a fifth aspect, a chip system is provided that includes a processing circuit and an interface. The processing circuit is configured to call and run, from a storage medium, a computer program stored in the storage medium, so as to perform the health data display method according to the first aspect and any of its possible implementations.
It can be understood that any design of the second to fifth aspects corresponds to the first aspect and any of its possible designs, and therefore can bring similar technical effects; details are not repeated here.
Drawings
FIG. 1 is a schematic illustration of a health data presentation;
fig. 2 is a schematic composition diagram of an electronic device according to an embodiment of the present disclosure;
FIG. 3 is a schematic drawing of leaves in a clover model according to an embodiment of the present application;
FIG. 4 is a schematic view of a color change of a leaf according to an embodiment of the present application;
FIG. 5 is a schematic illustration of a display of a clover model according to an embodiment of the present application;
FIG. 6 is a schematic diagram illustrating a process of displaying a clover model according to an embodiment of the present application;
fig. 7 is a schematic flowchart of a health data display method according to an embodiment of the present application;
fig. 8 is a schematic view of an interface of an electronic device according to an embodiment of the present disclosure;
fig. 9A is a schematic view of an interface of another electronic device according to an embodiment of the present disclosure;
fig. 9B is a schematic view of an interface of another electronic device according to an embodiment of the present disclosure;
fig. 9C is a schematic view of an interface of another electronic device according to an embodiment of the present application;
fig. 10A is a schematic view of an interface of another electronic device according to an embodiment of the present disclosure;
fig. 10B is a schematic view of an interface of another electronic device according to an embodiment of the present application;
fig. 11A is a schematic view of an interface of another electronic device according to an embodiment of the present disclosure;
fig. 11B is a schematic view of an interface of another electronic device according to an embodiment of the present disclosure;
fig. 11C is a schematic view of an interface of another electronic device according to an embodiment of the present application;
fig. 12A is a schematic view of an interface of another electronic device according to an embodiment of the present disclosure;
fig. 12B is a schematic view of an interface of another electronic device according to an embodiment of the present application;
fig. 12C is a schematic view of an interface of another electronic device according to an embodiment of the present application;
fig. 12D is a schematic diagram of an interface of another electronic device provided in the embodiment of the present application;
fig. 13A is a schematic view of an interface of another electronic device according to an embodiment of the present disclosure;
fig. 13B is a schematic view of an interface of another electronic device according to an embodiment of the present application;
fig. 14A is a schematic view of an interface of another electronic device according to an embodiment of the present disclosure;
fig. 14B is a schematic view of an interface of another electronic device according to an embodiment of the present disclosure;
fig. 14C is a schematic view of an interface of another electronic device according to an embodiment of the present application;
fig. 15A is a schematic view of an interface of another electronic device according to an embodiment of the present application;
fig. 15B is a schematic view of an interface of another electronic device according to an embodiment of the present application;
fig. 16A is a schematic view of an interface of another electronic device according to an embodiment of the present application;
fig. 16B is a schematic view of an interface of another electronic device according to an embodiment of the present application;
fig. 16C is a schematic view of an interface of another electronic device according to an embodiment of the present application;
fig. 16D is a schematic diagram of an interface of another electronic device provided in the embodiment of the present application;
fig. 16E is a schematic diagram of an interface of another electronic device provided in the embodiment of the present application;
FIG. 17 is a schematic diagram of a health data display device according to an embodiment of the present application;
fig. 18 is a schematic composition diagram of a chip system according to an embodiment of the present disclosure;
fig. 19 is a schematic composition diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Generally, a user can manage his or her health through a health management APP installed on an electronic device (e.g., a smartphone or a smart watch). It should be noted that, in the present application, health data may be data provided to the user so that the user can actively perform health management according to the data. For example, the health data may include exercise data (e.g., the number of steps, the number of moderate-to-high-intensity exercise sessions, etc.). The health data may also include sleep data (e.g., whether sleep is regular, etc.). The health data may also include mood data (e.g., whether breathing exercises have been performed, etc.).
For example, the user may input an operation to start the health management APP on the electronic device, and in response to the operation, the electronic device may run the APP and display its home page. Referring to fig. 1, the electronic device is a smartphone (or simply a mobile phone), and APP1 is a health management APP. As shown in fig. 1 (a), the user may touch the icon of APP1 on the display screen of the mobile phone to trigger the mobile phone to start APP1. After APP1 starts, the interface shown in fig. 1 (b) may be displayed on the mobile phone, which may include the user's current exercise status. As shown in fig. 1 (b), the interface may show the user's current number of steps (e.g., 4590 steps) and whether the user has performed a fitness exercise (e.g., not yet). Based on such health data, the user can effectively manage his or her health. For example, suppose the user's target is 6000 steps; when the user sees that the current count (e.g., 4590 steps) is below the target, the user can reach it by walking a little more. As another example, the user's target may be to perform fitness exercises twice; when the user sees that no exercise has been performed yet, the exercise can be scheduled accordingly to accomplish the goal.
At present, a health management APP can display health data such as the user's current activity status on a corresponding interface, so as to present the corresponding information to the user. However, this display manner is rather monotonous; the user cannot intuitively grasp the current health data from the interface, and therefore cannot be effectively encouraged to actively perform health management.
In order to solve the above problem, an embodiment of the present application provides a method for displaying health data, which can more intuitively display the health data to a user, so as to encourage the user to perform health management accordingly.
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
It should be noted that the health data display method provided in the embodiments of the present application may be applied to an electronic device of a user. The electronic device may be a portable mobile device provided with a display screen, such as a mobile phone, a tablet computer, a personal digital assistant (PDA), an augmented reality (AR)/virtual reality (VR) device, or a media player, and may also be a smart watch, a smart band, or the like. The specific form of the device is not specially limited in the embodiments of the present application.
Referring to fig. 2, a structural diagram of an electronic device 200 according to an embodiment of the present disclosure is shown in fig. 2, where the electronic device 200 may include a processor 210, an external memory interface 220, an internal memory 230, a Universal Serial Bus (USB) interface 240, a charging management module 250, a power management module 251, a battery 252, an antenna 1, an antenna 2, a mobile communication module 260, a wireless communication module 270, an audio module 280, a speaker 280A, a receiver 280B, a microphone 280C, an earphone interface 280D, a sensor module 290, a key 291, a motor 292, an indicator 293, a camera 294, a display screen 295, and a Subscriber Identity Module (SIM) card interface 296. Wherein, the sensor module may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
It is to be understood that the illustrated structure of the present embodiment does not constitute a specific limitation to the electronic apparatus 200. In other embodiments, electronic device 200 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units, such as: the processor 210 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be a neural center and a command center of the electronic device 200. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache memory. The memory may hold instructions or data that have just been used or recycled by processor 210. If the processor 210 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 210, thereby increasing the efficiency of the system.
The electronic device 200 may implement the display function via the GPU, the display screen 295, the application processor 210, and the like. The GPU is a microprocessor for image processing, and connects the display screen 295 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or alter presentation information.
The display screen 295 is used to display images, videos, and the like. The display screen 295 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 200 may include 1 or N display screens 295, where N is a positive integer greater than 1.
The wireless communication function of the electronic device 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 260, the wireless communication module 270, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 200 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 260 may provide a solution including 2G/3G/4G/5G wireless communication applied on the electronic device 200. The mobile communication module 260 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 260 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the filtered electromagnetic wave to the modem processor for demodulation. The mobile communication module 260 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 260 may be disposed in the processor 210. In some embodiments, at least some of the functional modules of the mobile communication module 260 may be disposed in the same device as at least some of the modules of the processor 210.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium- or high-frequency signal. The demodulator is used for demodulating a received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 280A, the receiver 280B, etc.) or displays images or video through the display screen 295. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 210 and disposed in the same device as the mobile communication module 260 or other functional modules.
The wireless communication module 270 may provide a solution for wireless communication applied to the electronic device 200, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 270 may be one or more devices that integrate at least one communication processing module. The wireless communication module 270 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 270 may also receive a signal to be transmitted from the processor 210, frequency-modulate and amplify the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the electronic device 200 is coupled to the mobile communication module 260 and antenna 2 is coupled to the wireless communication module 270, such that the electronic device 200 may communicate with networks and other devices via wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
As an example, take the electronic device 200 as a mobile phone. The mobile phone can establish a communication connection with a wearable electronic device through the antenna 1 and/or antenna 2 and the wireless communication module 270 disposed therein, so that the mobile phone can acquire the health data of the user through the wearable electronic device.
In this example, a plurality of different sensors may be included in the sensor module 290 that may be disposed in the electronic device 200, so as to collect data required for the operation of each item of the electronic device 200.
Among them, the touch sensor is also called a "touch panel". The touch sensor may be disposed on the display screen 295, and the touch sensor and the display screen 295 form a touch screen, which is also called a "touch screen". The touch sensor is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. In some embodiments, visual output related to touch operations may be provided through the display screen 295. In other embodiments, the touch sensor can be disposed on a surface of the electronic device 200 at a different location than the display screen 295.
The pressure sensor is used for sensing a pressure signal and converting the pressure signal into an electric signal. In some embodiments, a pressure sensor may be provided on display screen 295. There are many types of pressure sensors, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor, the capacitance between the electrodes changes. The electronic device 200 determines the intensity of the pressure from the change in capacitance. When a touch operation is applied to the display screen 295, the electronic apparatus 200 detects the intensity of the touch operation according to the pressure sensor. The electronic apparatus 200 may also calculate the touched position based on the detection signal of the pressure sensor. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
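The two short-message instructions just described amount to a threshold dispatch on touch intensity; a minimal sketch of that dispatch, where the threshold value and action names are hypothetical:

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized intensity threshold

def sms_icon_action(touch_intensity):
    """Dispatch a touch on the short message application icon by its pressure
    intensity: a light press views messages, a firm press composes a new one."""
    if touch_intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_messages"
    return "new_message"

print(sms_icon_action(0.3))  # light press  -> view_messages
print(sms_icon_action(0.8))  # firm press   -> new_message
```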
The gyro sensor may be used to determine the motion pose of the electronic device 200. In some embodiments, the angular velocity of the electronic device 200 about three axes (i.e., the x, y, and z axes) may be determined by a gyroscope sensor. The gyro sensor may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyroscope sensor detects the shake angle of the electronic device 200, calculates the distance to be compensated for by the lens module according to the shake angle, and enables the lens to counteract the shake of the electronic device 200 through reverse movement, thereby achieving anti-shake. The gyroscope sensor can also be used for navigation and body feeling game scenes.
The air pressure sensor is used for measuring air pressure. In some embodiments, the electronic device 200 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by a barometric pressure sensor.
The magnetic sensor includes a hall sensor. The electronic device 200 may detect the opening and closing of the flip holster using a magnetic sensor. In some embodiments, when the electronic device 200 is a flip phone, the electronic device 200 may detect the opening and closing of the flip according to the magnetic sensor. And then according to the opening and closing state of the leather sheath or the opening and closing state of the flip cover, the automatic unlocking of the flip cover is set.
The acceleration sensor may detect the magnitude of acceleration of the electronic device 200 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 200 is stationary. The method can also be used for identifying the posture of the electronic equipment 200, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor for measuring a distance. The electronic device 200 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, the electronic device 200 may utilize a range sensor to range to achieve fast focus.
The proximity light sensor may include, for example, a light emitting diode (LED) and a light detector such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 200 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 200; when insufficient reflected light is detected, the electronic device 200 may determine that there is no object nearby. The electronic device 200 can use the proximity light sensor to detect that the user is holding the electronic device 200 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor is used for sensing the ambient light brightness. The electronic device 200 may adaptively adjust the brightness of the display screen 295 in accordance with the perceived ambient light level. The ambient light sensor can also be used to automatically adjust the white balance when taking a picture. The ambient light sensor may also cooperate with the proximity light sensor to detect whether the electronic device 200 is in a pocket to prevent inadvertent contact.
The fingerprint sensor is used for collecting fingerprints. The electronic device 200 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and the like.
The temperature sensor is used for detecting temperature. In some embodiments, the electronic device 200 implements a temperature processing strategy using the temperature detected by the temperature sensor. For example, when the temperature reported by the temperature sensor exceeds the threshold, the electronic device 200 performs a performance reduction on the processor 210 located near the temperature sensor, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 200 heats the battery 252 when the temperature is below another threshold to avoid the low temperature causing the electronic device 200 to shut down abnormally. In other embodiments, the electronic device 200 performs boosting of the output voltage of the battery 252 when the temperature is below a further threshold to avoid abnormal shutdown due to low temperature.
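The temperature-handling strategy above is a ladder of thresholds; a sketch with illustrative threshold values (the text gives no concrete numbers, so the defaults here are assumptions):

```python
def thermal_policy(temp_c, throttle_above=45.0, heat_battery_below=0.0,
                   boost_voltage_below=-10.0):
    """Map a reported temperature to the mitigation described in the text:
    throttle the nearby processor when hot, heat the battery when cold, and
    boost the battery output voltage when colder still."""
    if temp_c > throttle_above:
        return "reduce_processor_performance"
    if temp_c < boost_voltage_below:
        return "boost_battery_output_voltage"
    if temp_c < heat_battery_below:
        return "heat_battery"
    return "normal"

print(thermal_policy(50.0))   # reduce_processor_performance
print(thermal_policy(-5.0))   # heat_battery
print(thermal_policy(-15.0))  # boost_battery_output_voltage
```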
The bone conduction sensor may acquire a vibration signal. In some embodiments, the bone conduction sensor may acquire a vibration signal of a human voice vibrating a bone mass. The bone conduction sensor can also contact the pulse of the human body to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor may also be disposed in a headset, integrated into a bone conduction headset. The audio module 280 may analyze a voice signal based on the vibration signal of the bone block vibrated by the sound part obtained by the bone conduction sensor, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signals acquired by the bone conduction sensor, and a heart rate detection function is realized.
In this embodiment of the application, the processor 210 may be configured to start the corresponding APP according to an operation, input by the user, of starting the health management APP, and display a corresponding interface on the display screen 295. For example, when the user wants to open the health management APP "Sports Health", the user touches the icon of the "Sports Health" APP on the display screen 295. In response to the touch operation, the processor 210 may launch the "Sports Health" APP and display its home interface on the display screen 295.
The processor 210 may also be used to acquire some or all of the health data of the user acquired by the sensor module 290 in cooperation with the sensor module 290. Of course, the partial health data of the user may also be provided by a wearable electronic device in communication with the electronic device 200. That is, the health data of the user acquired by the electronic device 200 may include the health data acquired by the sensor module 290 of the electronic device 200, may also include the health data acquired by the wearable electronic device of the user, and may also include both the health data acquired by the sensor module 290 and the health data acquired by the wearable electronic device.
In embodiments of the present application, the health data may include data in a plurality of different dimensions, for example, the health data may include data in a sleep dimension, data in an activity dimension, and data in an emotion dimension. The data of the sleep dimension may include data related to the sleep of the user, such as the sleep quality of the user, the time of falling asleep of the user, the time of getting up of the user, and the like. The data for the activity dimension may include data related to the user's activity, such as the number of steps of the user, the time the user is engaged in moderate to high intensity exercise, and the number of times the user is performing fitness exercises, etc. The data of the mood dimension may comprise data related to the mood of the user, such as the number of breathing exercises performed by the user, the heart rate condition of the user, etc.
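One hypothetical way to organize the multi-dimensional health data described above (field names and values are illustrative, not from the patent):

```python
# Each dimension groups its measurements ("metrics") and its daily tasks.
health_data = {
    "sleep": {
        "metrics": {"sleep_quality": "good", "fall_asleep": "23:10", "get_up": "07:05"},
        "tasks": {"regular_sleep": True},
    },
    "activity": {
        "metrics": {"steps": 4590, "moderate_to_high_minutes": 25},
        "tasks": {"reach_step_goal": False, "fitness_exercise": False},
    },
    "emotion": {
        "metrics": {"resting_heart_rate": 62},
        "tasks": {"breathing_exercise": True},
    },
}

def completion(dimension):
    """Fraction of completed tasks in one dimension of the health data."""
    tasks = health_data[dimension]["tasks"]
    return sum(tasks.values()) / len(tasks)

print(completion("activity"))  # 0.0 -- neither activity task is done yet
```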
It should be noted that, in other embodiments, the user's health data may also be input to the processor 210 in the electronic device 200 by the user touching a button, such as a "punch card" button, on an interface displayed by the "sports health" APP. For example, a "punch card" button for fitness training may be displayed on the interface of the "sports health" APP, and after completing a fitness training session, the user may record its completion by touching the "punch card" button.
The processor 210 may be further configured to, after acquiring the user's health data, display the current health data through a visualized image on an interface of the "sports health" APP. For example, the health data may be presented through a clover model, in which each leaf is used to identify the health data of one dimension. The processor 210 may indicate the relationship between the current health data and the preset target by controlling characteristics such as the size, color, and brightness of each leaf. The specific method for displaying the health data is described in detail below and is not repeated here.
The electronic device 200 may implement a shooting function through the ISP, the camera 294, the video codec, the GPU, the display screen 295, and the application processor.
The ISP is used to process the data fed back by the camera 294. For example, when taking a picture, the shutter is opened, light is transmitted to the photosensitive element of the camera 294 through the lens, the optical signal is converted into an electrical signal, and the photosensitive element of the camera 294 transmits the electrical signal to the ISP for processing and converting into an image visible to the naked eye. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 294.
The camera 294 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 200 may include 1 or N cameras 294, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 200 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 200 may support one or more video codecs. In this way, the electronic device 200 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by borrowing the structure of biological neural networks, for example, the transfer pattern between neurons in the human brain, and can also learn continuously by itself. The NPU can implement applications such as intelligent cognition of the electronic device 200, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The charging management module 250 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 250 may receive charging input from a wired charger via the USB interface 240. In some wireless charging embodiments, the charging management module 250 may receive a wireless charging input through a wireless charging coil of the electronic device 200. The charging management module 250 may also supply power to the electronic device 200 through the power management module 251 while charging the battery 252.
The power management module 251 is used to connect the battery 252, the charging management module 250 and the processor 210. The power management module 251 receives input from the battery 252 and/or the charging management module 250, and provides power to the processor 210, the internal memory 230, the external memory, the display screen 295, the camera 294, and the wireless communication module 270. The power management module 251 may also be used to monitor parameters such as the capacity of the battery 252, the cycle count of the battery 252, and the state of health (leakage, impedance) of the battery 252. In some other embodiments, the power management module 251 may also be disposed in the processor 210. In other embodiments, the power management module 251 and the charging management module 250 may be disposed in the same device.
The external memory interface 220 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 200. The external memory card communicates with the processor 210 through the external memory interface 220 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 230 may be used to store computer-executable program code, which includes instructions. The processor 210 executes various functional applications of the electronic device 200 and data processing by executing instructions stored in the internal memory 230. The internal memory 230 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (e.g., audio data, a phone book, etc.) created during use of the electronic device 200, and the like. In addition, the internal memory 230 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 200 may implement audio functions via the audio module 280, the speaker 280A, the receiver 280B, the microphone 280C, the headset interface 280D, and the application processor. Such as music playing, recording, etc.
The audio module 280 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 280 may also be used to encode and decode audio signals. In some embodiments, the audio module 280 may be disposed in the processor 210, or some functional modules of the audio module 280 may be disposed in the processor 210.
The speaker 280A, also referred to as a "horn", is used to convert audio electrical signals into sound signals. The electronic apparatus 200 can listen to music through the speaker 280A or listen to a hands-free call.
The receiver 280B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 200 receives a call or voice information, it is possible to receive a voice by placing the receiver 280B close to the human ear.
The microphone 280C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When a call is placed or a voice message is sent or it is desired to trigger the electronic device 200 to perform some function by the voice assistant, the user may speak via his/her mouth near the microphone 280C and input a voice signal into the microphone 280C. The electronic device 200 may be provided with at least one microphone 280C. In other embodiments, the electronic device 200 may be provided with two microphones 280C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 200 may further include three, four or more microphones 280C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 280D is used to connect wired headphones. The headset interface 280D may be the USB interface 240, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The keys 291 include a power key, a volume key, and the like. The keys 291 may be mechanical keys or touch keys. The electronic device 200 may receive a key input and generate a key signal input related to user settings and function control of the electronic device 200.
The motor 292 may generate a vibration cue. The motor 292 may be used for both an incoming vibratory alert and for touch vibratory feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 292 may also respond to different vibration feedback effects for touch operations on different areas of the display screen 295. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 293 may be an indicator light, and may be used to indicate a charging status, a power change, or a message, a missed call, a notification, or the like.
The SIM card interface 296 is for connecting a SIM card. The SIM card can be brought into and out of contact with the electronic apparatus 200 by being inserted into the SIM card interface 296 or being pulled out of the SIM card interface 296. The electronic device 200 may support 1 or N SIM card interfaces 296, with N being a positive integer greater than 1. The SIM card interface 296 may support a Nano SIM card, a Micro SIM card, a SIM card, or the like. The same SIM card interface 296 can be inserted with multiple cards at the same time. The types of the plurality of cards may be the same or different. The SIM card interface 296 may also be compatible with different types of SIM cards. The SIM card interface 296 may also be compatible with an external memory card. The electronic device 200 interacts with the network through the SIM card to implement functions such as communication and data communication. In some embodiments, the electronic device 200 employs esims, namely: an embedded SIM card. The eSIM card can be embedded in the electronic device 200 and cannot be separated from the electronic device 200.
The health data display method provided by the embodiment of the application can be implemented in the electronic device 200. The method is described in detail below with reference to examples.
It is understood that, at present, health data may be displayed through tabs as shown in fig. 1, or through linear or circular progress bars. However, when the health data is presented to the user in these ways, the user cannot intuitively obtain an overall picture of the health data.
In general, a user may set a corresponding task and a goal corresponding to the task for different health data. When the health data reaches the goal of the task, it indicates that the task is completed. By completing one or more tasks, the user can actively manage the health state of the user according to the health data.
According to the health data display method provided by the embodiment of the application, the completion of different tasks can be displayed through a multi-leaf grass model (such as a three-leaf clover, a four-leaf clover, and the like). For example, different tasks may be divided into different dimensions according to the type of health data they correspond to, and each dimension may correspond to one leaf in the multi-leaf grass model. The user's health data is then displayed as a whole by displaying the multi-leaf grass model having this correspondence. The user can thus learn the completion of the different tasks in each dimension more intuitively and comprehensively, and perform active health management accordingly.
The tasks and health data corresponding to the different dimensions displayed in the multi-leaf grass model may be selected by the user or preset in the electronic device. Similarly, the tasks in each dimension and the targets of those tasks may be set by the user, or may be preset recommended values in the electronic device. For example, a three-leaf clover model is used to show tasks in three different dimensions: an exercise dimension including a step-count task, a moderate-to-high exercise intensity task, and a fitness training task; a sleep dimension including a regular wake-up task; and an emotion dimension including, for example, breathing training. A task in the exercise dimension may be selected from a task library by the user through an input operation such as a click, and its target may be input by the user through a virtual keyboard, a physical keyboard, or another input device. When the user has not configured a task or a task target in a dimension, the electronic device may determine the target of the corresponding task according to a recommended value preset in the electronic device. The recommended value may be obtained from big data, or may be flexibly determined by the electronic device according to the user's historical task completion. Similarly, for the other two dimensions (such as the sleep dimension and the emotion dimension), the tasks and their targets may be determined with reference to the above method for the exercise dimension, which is not described herein again.
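The target-selection logic described above (use the user-configured target when present, otherwise fall back to a preset recommended value) can be sketched as follows; the task names, values, and function name are illustrative assumptions, not an API defined by this application:

```python
# Hypothetical sketch of the task-target resolution described above.
RECOMMENDED_TARGETS = {
    "steps": 6000,                        # e.g. a recommendation derived from big data
    "moderate_to_high_intensity_min": 30,
    "fitness_training_sessions": 1,
}

def resolve_target(task_name, user_targets):
    """Use the user-configured target if set; otherwise fall back to the preset value."""
    if task_name in user_targets:
        return user_targets[task_name]
    return RECOMMENDED_TARGETS[task_name]

user_targets = {"steps": 8000}            # the user only customized the step goal
assert resolve_target("steps", user_targets) == 8000
assert resolve_target("fitness_training_sessions", user_targets) == 1
```

In a real implementation the recommended values could instead be refined from the user's historical completion, as the text notes.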
For example, tasks corresponding to the user's health data may be divided into M dimensions, denoted X_1 to X_M, where M is an integer greater than or equal to 1. Each dimension may include N_i tasks, where i = 1, 2, 3, ......, M, and N_i identifies the number of tasks included in dimension X_i. For example, the X_1 dimension includes N_1 tasks, the X_2 dimension includes N_2 tasks, and by analogy, the X_M dimension includes N_M tasks.
In this example, each of the M dimensions may correspond to one leaf in the multi-leaf grass model. For example, the X_1 dimension corresponds to leaf 1, the X_2 dimension corresponds to leaf 2, and the X_M dimension corresponds to leaf M. That is, when presenting health data comprising M dimensions, an M-leaf grass model having M leaves may be employed.
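The one-to-one correspondence between dimensions and leaves described above can be sketched as a simple data structure (all names are illustrative, not from the patent):

```python
# Illustrative sketch: M dimensions, each with N_i tasks, mapped one-to-one
# onto the M leaves of an M-leaf grass model.
dimensions = {
    "X1_exercise": ["steps", "moderate_to_high_intensity", "fitness_training"],
    "X2_sleep": ["regular_wakeup"],
    "X3_emotion": ["breathing_exercise"],
}

# Leaf i corresponds to dimension X_i; num_tasks is N_i.
leaves = [
    {"leaf": i + 1, "dimension": name, "num_tasks": len(tasks)}
    for i, (name, tasks) in enumerate(dimensions.items())
]

assert len(leaves) == len(dimensions)  # M leaves for M dimensions
assert leaves[0]["num_tasks"] == 3     # N_1 = 3 in this example
```

Adding or removing an entry in `dimensions` changes the number of leaves, matching the behavior described for the goal-management menu below.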
It should be understood that each leaf in the M-leaf grass model has a correspondence with the dimensions and tasks added in a setting menu such as goal management in the electronic device. That is, when dimensions and/or tasks are added or removed through user settings or other means in a setting menu such as goal management, the corresponding M-leaf grass model changes accordingly. For example, when the user increases the number of dimensions of the health data through the goal management menu (e.g., from 3 dimensions to 4 dimensions), the number of leaves in the M-leaf grass also changes accordingly (e.g., from 3 leaves to 4 leaves).
It should be noted that, in some implementations, the user may directly adjust the number and types of the selected dimensions and tasks through a setting menu such as goal management on an interface of the electronic device. In other implementations, the user may also add or remove a leaf by operating on any one of the M leaves. For example, the user may double-click, press for a certain time, drag, or zoom to add or remove leaves of the M-leaf grass model. Correspondingly, the dimensions and tasks in the setting menu such as goal management change accordingly as leaves are added or removed.
As an example, the processor may draw and lay out each leaf of the M-leaf grass model according to the following method.
Please refer to fig. 3, which is a schematic drawing of a clover leaf according to an embodiment of the present application. As shown in (a) of fig. 3, the processor may obtain the outline of the clover leaf by drawing a circle (or an ellipse) and a triangle, respectively, and perform beautification processing (e.g., rounding the sharp corners) on the basis of the outline to obtain the image of the clover leaf. The triangle and the circle or ellipse may have different relative positions. For example, in some implementations, the left and right sides of the triangle may be circumscribed to the circle as shown in the figure. For another example, in other implementations, the left and right vertices of the triangle may be located on the outer edge of the circle.
It should be understood that the processor may draw the other leaves of the M-leaf grass in a similar way and display them in the same layer to display the M-leaf grass. Of course, after acquiring the image of one leaf as shown in (b) of fig. 3, the processor may also rotate or mirror that image to obtain the images of the other leaves.
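Deriving the other leaves by rotation can be illustrated with a plain 2-D rotation about the model's center; this is a sketch under assumed coordinates, not the actual rendering code:

```python
import math

def rotate_point(x, y, angle_rad):
    """Rotate a 2-D point about the origin. Used here to sketch deriving the
    other M-1 leaves from one drawn leaf, as described above."""
    return (x * math.cos(angle_rad) - y * math.sin(angle_rad),
            x * math.sin(angle_rad) + y * math.cos(angle_rad))

# For a three-leaf clover (M = 3), the leaves sit 120 degrees apart.
M = 3
tip = (0.0, 1.0)  # tip of the first leaf, pointing up the y-axis
other_tips = [rotate_point(*tip, 2 * math.pi * k / M) for k in range(1, M)]
# other_tips now holds the tips of the second and third leaves.
```

Applying the same rotation to every outline point of the first leaf yields the complete leaf images; mirroring is the analogous transform with the x-coordinate negated.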
In this application, to distinguish the tasks of the different dimensions corresponding to different leaves, in some implementations the colors of different leaves may differ. For each leaf in the M-leaf grass model, task completion in the corresponding dimension can be indicated by adjusting visualization parameters such as the depth of the leaf's color, its transparency, and/or its area.
For example, in some embodiments, task completion in the corresponding dimension may be indicated by adjusting the depth of the leaf's color. Take setting the starting color of the leaf to (S_R, S_G, S_B) and the end color of the leaf to (E_R, E_G, E_B) as an example. It will be appreciated that different color depths correspond to different color values (i.e., RGB values) in the RGB standard color space; therefore, in this example, different colors and color depths may be identified by different RGB color values. For example, the starting color (S_R, S_G, S_B) may be FCA077. As another example, the end color (E_R, E_G, E_B) may be FB6522.
In this example, as task completion in the leaf's corresponding dimension increases, the leaf color at the corresponding completion level can be determined according to the following formula (1), where (R_i, G_i, B_i) identifies the color value of the leaf at some intermediate state.

R_i = S_R + (E_R - S_R) × i/N_i
G_i = S_G + (E_G - S_G) × i/N_i    (1)
B_i = S_B + (E_B - S_B) × i/N_i

where S_R is the R value of the starting color and E_R is the R value of the end color; S_G and E_G are the G values of the starting and end colors; S_B and E_B are the B values of the starting and end colors; N_i is the number of tasks in the i-th dimension; and i is the number of completed tasks in the i-th dimension.
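Formula (1) is a per-channel linear interpolation between the starting and end colors. A minimal Python sketch (the function name and 8-bit integer channels are illustrative assumptions):

```python
def leaf_color(start_rgb, end_rgb, completed, total):
    """Interpolate the leaf color per formula (1): C = S + (E - S) * completed/total."""
    frac = completed / total
    return tuple(round(s + (e - s) * frac) for s, e in zip(start_rgb, end_rgb))

# Starting color FCA077 and end color FB6522, as in the example above.
start = (0xFC, 0xA0, 0x77)
end = (0xFB, 0x65, 0x22)

assert leaf_color(start, end, 0, 3) == start   # no tasks completed yet
assert leaf_color(start, end, 3, 3) == end     # all tasks completed
mid = leaf_color(start, end, 1, 3)             # 1 of 3 tasks completed
```

The same helper covers the progress-based variant described below by passing a fractional completion (e.g. `completed=0.25, total=1`).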
For example, dimension 1 includes 3 tasks, of which 1 has been completed, i.e., N_i = 3 and i = 1. The color of the corresponding leaf may be set to (R_1, G_1, B_1), where:

R_1 = S_R + (E_R - S_R) × 1/3
G_1 = S_G + (E_G - S_G) × 1/3
B_1 = S_B + (E_B - S_B) × 1/3
It should be noted that, in other implementations of the present application, the color of the leaf may also be adjusted according to the degree of completion of the tasks. In that case, i/N_i in formula (1) represents the completion progress of the tasks in the dimension. Illustratively, the dimension includes walking, moderate-to-high intensity exercise, and fitness training, with the walking target set at 6000 steps. When the walking task is 50% complete (i.e., 3000 steps have been taken), i/N_i may be the completion degree of the entire dimension converted from the walking task's completion degree (i.e., 50%), for example 25%. The processor may then adjust the color of the corresponding leaf to (R_1', G_1', B_1'), where R_1' = S_R + (E_R - S_R) × 25%, G_1' = S_G + (E_G - S_G) × 25%, and B_1' = S_B + (E_B - S_B) × 25%.
Therefore, the user can intuitively learn the completion of the tasks in the dimension from the color of the leaf. For example, referring to fig. 4, the lighter color of leaf 401 may indicate that the tasks in the corresponding dimension have low completion or have not been started. As task completion increases, the color of the leaf may change to the darker color shown by leaf 402. It should be noted that, when the color change of the leaf is displayed on the interface, the change may be shown dynamically or switched directly. For example, the dynamic effect may be the leaf's color slowly and gradually changing from the starting color toward the intermediate-state color, stopping when that color is reached. The direct-switch effect may be the leaf's color directly showing the corresponding intermediate-state color.
It should be noted that, when the M-leaf grass is used to display health data, multiple leaf images may appear on the interface to identify health data of different dimensions. Therefore, to distinguish the dimensions corresponding to different leaves, in this embodiment of the present application an image identifier identifying the corresponding dimension may also be set in each leaf. For example, 403 in fig. 4 may be used to identify the leaf as the one corresponding to the exercise dimension. In this example, the image identifier corresponding to each dimension may be set according to the user's custom selection, or may be automatically matched by the electronic device according to the type of the tasks. The image identifier, whether custom-selected by the user or automatically matched by the electronic device, may come from an image identifier database stored locally on the electronic device or from a network database, which is not limited in the embodiments of the present application.
In other embodiments, task completion in the corresponding dimension may be indicated by setting the transparency of the leaf. Take setting the starting transparency of the leaf to a first transparency (e.g., 20%) and the end transparency of the leaf to a second transparency (e.g., 80%) as an example.
As task completion in the leaf's corresponding dimension increases, the transparency of the leaf at the corresponding completion level can be determined according to the following formula (2), where T identifies the transparency of the leaf at some intermediate state.

T = T_0 + (T_N - T_0) × i/N_i    (2)

where T_0 is the first (starting) transparency, T_N is the end transparency, N_i is the number of tasks in the i-th dimension, and i is the number of completed tasks in the i-th dimension.
For example, dimension 2 includes 3 tasks, of which 2 have been completed, i.e., N_i = 3 and i = 2. The transparency of the corresponding leaf may be set to T = T_0 + (T_N - T_0) × 2/3.
Similar to the above description of the color change, in other implementations of the present application the transparency of the leaf may also be adjusted according to the degree of completion of the tasks. In that case, i/N_i in formula (2) represents the completion progress of the tasks in the dimension. Illustratively, the dimension includes walking, moderate-to-high intensity exercise, and fitness training, with the walking target set at 6000 steps. When the walking task is 50% complete (i.e., 3000 steps have been taken), i/N_i may be the completion degree of the entire dimension converted from the walking task's completion degree (i.e., 50%), for example 25%. The processor may then adjust the transparency of the corresponding leaf to T_i' = T_0 + (T_N - T_0) × 25%.
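Formula (2) is the same linear interpolation applied to a single transparency value. A minimal sketch (function name illustrative; transparencies expressed as fractions of 1):

```python
def leaf_transparency(t_start, t_end, completed, total):
    """Interpolate the leaf transparency per formula (2): T = T0 + (TN - T0) * completed/total."""
    return t_start + (t_end - t_start) * completed / total

# Starting transparency 20% and end transparency 80%, as in the example above.
assert leaf_transparency(0.20, 0.80, 0, 3) == 0.20                 # nothing completed
assert abs(leaf_transparency(0.20, 0.80, 2, 3) - 0.60) < 1e-9      # 2 of 3 tasks done
assert abs(leaf_transparency(0.20, 0.80, 3, 3) - 0.80) < 1e-9      # all tasks done
```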
Therefore, the user can intuitively acquire the completion condition of the task in the dimension through the transparency of the blade.
In other embodiments, task completion in the corresponding dimension may also be identified by adjusting the area of the blade. Therefore, the user can intuitively acquire the completion condition of the task in the dimension by observing the area of the blade.
It should be noted that, in practical implementations, the display of health data in the corresponding dimension may be implemented in any one of the three manners above (i.e., adjusting any one of the color depth, the transparency, or the area of the corresponding leaf according to task completion), or in any combination of two or three of them.
It can be understood that, in the three ways above, the processor displays on the interface an M-leaf grass image comprising one layer, and the user can intuitively and quickly learn the task completion in the different dimensions from that image, thereby grasping the overall health data and acting accordingly. For example, if task completion in a certain dimension is low, the user can strengthen the activities related to the tasks in that dimension. In this way, the goal of the user actively managing health according to the health data displayed by the M-leaf grass image is achieved.
In other implementations, to enable the user to learn the task completion in different dimensions even more clearly, the present application further provides a health data display method in which the completion of the tasks in the current dimension is displayed by adding a new leaf image on top of the M-leaf grass. In the following description, M = 3 is taken as an example; that is, the health data is divided into 3 dimensions, and a three-leaf clover model is correspondingly used to present it.
Please refer to fig. 5. When the tasks in all three dimensions are incomplete, the leaves corresponding to the three dimensions may all be light in color, as shown in (a) of fig. 5. In some implementations, to identify the leaves corresponding to different dimensions, the three leaves may be set to colors of different color families. For example, 501 may be set to light orange, 502 to light purple, and 503 to light blue. As task completion increases in each dimension, the color of the corresponding leaf may change accordingly. For example, the dimensions corresponding to 502 and 503 each include only one task. When the tasks in those dimensions are completed, the colors of 502 and 503 may be adjusted to the corresponding dark colors. For example, with reference to (b) of fig. 5, the light purple of 502 may be adjusted to the dark purple of 505, prompting the user that the task in that dimension has been completed. Similarly, the light blue of 503 may be adjusted to the dark blue of 506 to prompt the user that the task in that dimension has been completed.
Take 501 as an example, where the corresponding dimension includes 3 tasks. When 1 of the 3 tasks is completed, the color of 501 remains light orange because not all tasks are completed. In some implementations, to enable the user to obtain more specific task completion through the clover model, a smaller leaf may be set within 501 to identify the progress of the current tasks. As shown in (b) of fig. 5, a layer 504 may be disposed above the layer of 507 (i.e., 501 as shown in (a) of fig. 5), and layer 504 may be used to identify the number of tasks that have been completed. In some implementations, 504 may be set to a darker color, such as dark orange. When 1 of the 3 tasks in the dimension corresponding to 507 is completed, the processor may display the small leaf in layer 504 on the display screen. That is, the user sees a small leaf 504 appear within leaf 507. As the progress of the tasks continues to increase, 504 may grow larger until, when all tasks are completed, it expands to fully overlap 507. In this example, since the layer of 504 is above 507, when all tasks are completed, 501 may be displayed as the dark orange corresponding to 508 as shown in (c) of fig. 5.
Based on the above description, 504 is related to the progress of the tasks in the dimension; therefore, in this example, the area of 504 may be adjusted continuously as the tasks progress. That is, the area of 504 corresponds to task completion. For example, taking the two-dimensional coordinate system in which the display screen of the electronic device normally displays an image, the longitudinal dimension of 507 (i.e., the dimension coinciding with or parallel to the y-axis of the coordinate system) may be divided equally according to the number of tasks in the corresponding dimension, for example into 3 equal parts. When 1 of the tasks is completed, the longitudinally highest point of 504 is at 1/3 of the longitudinal dimension of 507, and the rest of 504 can be scaled down proportionally according to the longitudinal dimension to obtain the dimensions of 504. As another example, when 1 task is completed, the area of 504 may be 1/3 of the area of 507, i.e., the dimensions of 507 are scaled down so that the resulting area of 504 is 1/3 of that of 507. Of course, the dimensions of 504 may also be calculated in other manners, which are not described in this embodiment of the application.
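The two sizing rules described above for the small progress leaf 504 can be sketched as scale factors relative to the big leaf 507. This is a sketch under one possible reading of the text (the patent leaves the exact scaling open), and the function name and mode labels are illustrative:

```python
import math

def small_leaf_scale(completed, total, mode="height"):
    """Scale factor of the small progress leaf (504) relative to the big leaf (507).

    mode="height": the small leaf's vertical extent is completed/total of the
                   big leaf's, with the rest scaled proportionally.
    mode="area":   the small leaf's area is completed/total of the big leaf's,
                   so each linear dimension scales by sqrt(completed/total).
    """
    frac = completed / total
    return frac if mode == "height" else math.sqrt(frac)

# 1 of 3 tasks complete, as in the example above:
height_scale = small_leaf_scale(1, 3, "height")  # 1/3 of the vertical extent
area_scale = small_leaf_scale(1, 3, "area")      # sqrt(1/3) per linear dimension
```

Under the "area" mode the linear dimensions shrink by the square root, which is what keeps the drawn area proportional to task completion.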
It should be noted that, in order to make the display of the clover model more vivid so as to obtain a better display effect, in some implementations of the present application, when the user completes a task in all dimensions, the overall change of the clover model (such as overall adjustment of color, lighting, shaking, etc.) may be triggered so as to encourage the user to insist on completing the task in all dimensions.
Illustratively, take the case where, when the user has completed the tasks in all dimensions, the clover model shown in (c) of fig. 5 is displayed on the interface. As described above, when the tasks are not yet complete, the leaves corresponding to different dimensions may be set to different colors, for example, 501 to orange, 502 to purple, and 503 to blue. In this example, when the user has completed the tasks in all three dimensions, the processor may adjust the color of each leaf of the clover model to green, visually presenting the user with a complete green clover. In a specific implementation, to make the color adjustment less abrupt to the user, the processor can control the clover model with the different colors to gradually shrink to the origin and then gradually expand back to its previous size and position, setting the color of each leaf to green during the expansion.
It should be noted that, in the present application, the clover model may be shown on the interface directly, i.e., included on the page when the processor renders it on the display screen, or it may be presented to the user through an animation effect. Take the clover model shown in fig. 5 as an example. With reference to (a) of fig. 6, when the processor presents the clover model interface on the display screen, the initial position of 601 (i.e., 501 in (a) of fig. 5) may be as shown in (a) of fig. 6; that is, the central axis of 601 forms an angle of, for example, -60° with the positive y-axis direction of the two-dimensional coordinate system of the display screen. In addition, the area of 601 in the initial position may be set smaller than its area when the presentation is complete (the state shown in (b) of fig. 6). The processor may set a preset duration within which 601 rotates dynamically from the initial position to the position at which the presentation is complete; the size of 601 may also increase gradually over this preset duration. Similarly, 502 and 503 shown in (a) of fig. 5 may be presented in the same way as 601, so that the clover image shown in (b) of fig. 6 is obtained at the end of the preset duration, completing the dynamic presentation of the clover model. It should be understood that such a dynamic presentation can effectively increase the user's interest in reviewing health data, and can therefore increase the probability that the user actively performs health management according to the health data.
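The rotating entry described above amounts to interpolating a leaf's angle and scale over a preset duration. The -60° start angle comes from the description; the duration, end angle, and start scale below are hypothetical values for illustration:

```python
def leaf_pose(t, duration=0.6, start_angle=-60.0, end_angle=0.0,
              start_scale=0.4, end_scale=1.0):
    """Linearly interpolate a leaf's angle (degrees from the positive
    y-axis) and scale from its initial pose to its final pose over a
    preset animation duration (seconds)."""
    p = max(0.0, min(1.0, t / duration))  # clamp progress to [0, 1]
    return (start_angle + (end_angle - start_angle) * p,
            start_scale + (end_scale - start_scale) * p)
```

In practice a renderer would call such a function once per frame and draw the leaf rotated and scaled accordingly, typically replacing the linear ramp with an easing curve for a smoother feel.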
In other exemplary implementations, the dynamic presentation shown in (a) and (b) of fig. 6 may also be applied to the small leaf 504 shown in fig. 5. The beneficial effects are similar and are not repeated here. It should be noted that, in different implementations of the present application, the dynamic display of the small leaf 504 may be identical to, or different from, that of the large leaf 601 shown in (a) of fig. 6. For example, with reference to (c) of fig. 6, the angle between the central axis of the small leaf 603 in its initial position and the positive y-axis direction of the two-dimensional coordinate system of the display screen may differ from that of the large leaf 601, for example being set to -90° as shown in (c) of fig. 6.
It should be appreciated that, for the user, since the presentation of the small leaf 603 builds on that of the large leaf 601, it is logically more reasonable for the small leaf 603 to appear after the large leaf 601. Thus, in some implementation scenarios of the present application, the leaves 501, 502, and 503 shown in (a) of fig. 5 may rotate in at the same time, while the small leaf 603 begins its rotating entry a period of time (e.g., 150 milliseconds) after the entry of 601 starts, thereby providing the user with a better viewing experience.
Of course, in other examples, the clover model may be presented to the user in other ways, such as flying in or flashing in. The embodiments of the present application do not limit this.
In other implementations of the present application, after the processor displays the fully green clover model on the interface, additional layers may be placed outside the image corresponding to the clover model to further increase the recognizability of the green clover. For example, referring to (d) in fig. 5, 510 and 511 may be provided in addition to 509. The images 510 and 511 may be similar in shape to the outline formed by the edge of 509, and their colors may be white, gold, or the like, so as to show the user a lighting effect around the leaf 509. The lighting effects shown at 510 and 511 may or may not be synchronized. Similarly, after the green leaves corresponding to 502 and 503 shown in (a) of fig. 5 have grown in, layers like 510 and/or 511 may also be arranged at their outer edges, making the animation of the green clover more vivid and improving the user's viewing experience. It should be understood that the above description takes a layer arranged at the outer edge of a leaf to provide a lighting effect only as an example; in different implementations, the added layer may also provide other dynamic effects to enhance the viewing experience. In addition, this lighting effect may be shown for a preset time once the green clover has grown to its final position, which improves the user's viewing experience while reducing the rendering load that the effect places on the processor and the display screen.
Based on the above description, the health data display method provided by the embodiment of the application can visually display the user's health data by showing the task completion status in different dimensions through a leafy-grass (e.g., M-leaf grass) model, so that the user can grasp the overall distribution of the health data more quickly and conveniently.
It should be noted that, when the electronic device displays the health data according to the M-leaf grass model, the rendering of the M-leaf grass may be adapted to the current display mode of the device to obtain a better display effect. For example, the electronic device may obtain the current display mode, such as day mode or night mode, and adjust the content displayed on the leaves accordingly. As another example, the electronic device may obtain local time, geographic state, or location information provided by the sensor module, and adjust the content displayed on the leaves appropriately; for instance, when the local time, and/or geographic state, and/or location information differ, the electronic device may adjust the color, transparency, or size of the leaves to match. In other implementations of the present application, corresponding adjustments may also be made for the Always On Display (AOD) mode, so that the image of the M-leaf grass model can be shown in an AOD scenario, enabling the user to obtain the overall status of the health data displayed by the M-leaf grass model more conveniently.
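One way such mode-dependent adjustment could look is sketched below. The mode names, hours, and palette values are purely illustrative assumptions, not taken from the embodiment:

```python
def leaf_style(display_mode, local_hour):
    """Choose a leaf color and transparency for the current display
    mode; AOD rendering favors a dim, low-power appearance, and
    late local hours fall back to the night palette."""
    if display_mode == "aod":
        return {"color": "#1B5E20", "alpha": 0.6}
    if display_mode == "night" or not 6 <= local_hour < 20:
        return {"color": "#2E7D32", "alpha": 0.8}
    return {"color": "#4CAF50", "alpha": 1.0}
```

The same dispatch point could equally consult location or activity state, as the description suggests, by adding further parameters.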
In order that those skilled in the art may understand more clearly how the health data presentation method provided by the embodiment of the present application is implemented, the following takes as an example the electronic device providing the user with a leafy-grass-model-based health data presentation through the "sports health" APP. As shown in fig. 7, the method may include S701-S704.
S701, the electronic device receives an operation, input by the user, of starting the "sports health" APP.
S702, in response to the operation, the electronic device displays the home interface of the "sports health" APP.
When a user wants to review his or her health data through the "sports health" APP on the electronic device, the user may input an operation to start the APP on the display screen of the electronic device. The user may input this operation by touching the APP icon, clicking the APP icon, or in other manners (such as through a key or an external input device); the embodiment of the present application does not limit this. For ease of explanation, the following takes as an example the user inputting the start operation by touching the APP icon.
In response to the touch operation, the electronic device may start the "sports health" APP and display its home interface. For example, the home interface of the "sports health" APP may be the interface shown in (a) of fig. 8. As shown in (a) of fig. 8, the home interface may include tabs showing different health data of the user. For example, tab 801 shows the user's exercise record, in which the exercise type may be preset by the user (or generated automatically by the system), such as the "plank" shown in the figure. In some embodiments, the exercise record may also show a completion goal, preset by the user (or generated automatically by the system), for the corresponding exercise type. For example, as shown in (a) of fig. 8, the plank goal may be set to 23, and the progress toward the current goal may be shown in tab 801; for example, 5/23 indicates that the current goal is 23 and 5 have been completed. To help the user understand more clearly how exercise affects his or her health data, the calories consumed by the completed exercise may also be displayed in tab 801; for example, the 5 completed planks consume 5 kilocalories (5 kcal). Similar to tab 801, tabs corresponding to other health data may also be displayed on the home interface, for example tab 802 for the user's heart-rate-related data, tab 804 for the user's sleep-related data, and tab 803 for the user's weight-related data.
It can be understood that, since the user's health data may change in real time, after the user enters the home interface, the user may instruct the electronic device to update the current data through a slide-down operation as shown in (a) of fig. 8. In response to the slide-down operation, the electronic device may update the current data while showing, on the display screen, prompt information corresponding to 805 as shown in (b) of fig. 8; for example, the prompt information may read "data is being synchronized". When the electronic device finishes updating the current data, the updated data are displayed in the corresponding tabs on the home interface.
It can be seen that, although the home interface shown in fig. 8 can present part of the health data to the user, showing the status of each item in the form of tabs is not intuitive: it is difficult for the user to grasp his or her health data as a whole, and the user is therefore not motivated to make corresponding adjustments according to the health data.
Currently, there are other ways to display health data; for example, each item of the current health data may be displayed as a linear or circular progress bar. This form of presentation suffers from problems similar to those of the tabs described above.
Therefore, in this example, a "healthy life" tab is added to the home interface, through which the user can access the leafy-grass-model-based health data presentation interface described above.
S703, the electronic device receives an operation, input by the user, of entering the health data presentation interface.
When the user wants to know the overall status of his or her health data, or the overall completion of the various set goals, the user may input an operation to enter the corresponding interface by touching the "healthy life" tab on the home interface. Similar to the operation of opening the "sports health" APP described above, in some embodiments the user may enter the corresponding interface by touching the "healthy life" tab; in other embodiments, by clicking or double-clicking it; and in still other embodiments, through other manners, such as an external input device.
It should be noted that the home interface may include many elements to be displayed, so when the user does not find the "healthy life" tab on the currently displayed interface, the user may input a slide-up operation as shown in fig. 9A, or click the button 901 as shown in fig. 9B, to trigger the electronic device to display the portion of the home interface not yet shown. In response to the operation input by the user, the electronic device may present the interface shown in fig. 9C. It can be seen that, after the home interface has scrolled up, the "healthy life" tab appears on the display screen.
S704, in response to the operation, the electronic device displays the health data presentation interface.
In response to the user's operation of entering the "healthy life" tab, the electronic device may switch from the current home interface to the "healthy life" interface.
As an example, the "healthy life" interface may be the interface shown in fig. 10A. As shown in fig. 10A, the interface may include a health data presentation element 1001 with a clover structure. In some embodiments, 1001 may also show the dimension information corresponding to each leaf of the clover structure, such as the sleep, activity, and emotion dimensions shown in fig. 10A. It can be understood that, in order to let the user establish an intuitive correspondence between each leaf of the clover structure and the different dimensions, an identifier indicating the corresponding dimension may be provided in each leaf. For example, referring to fig. 10B, the identifier 1005 may be set on the leaf corresponding to the activity dimension, the identifier 1006 on the leaf corresponding to the sleep dimension, and the identifier 1007 on the leaf corresponding to the emotion dimension. In some possible implementations, 1005, 1006, and/or 1007 may be dynamic images, so as to make the identifiers easier to recognize and help the user quickly establish the correspondence between the leaves and the health data dimensions.
In the example of fig. 10A, in combination with the descriptions of fig. 4 and fig. 5, since the tasks in each dimension have not been completed, each leaf of the clover model is in a light-colored state, and no small leaf corresponding to the number of completed tasks has grown.
In addition, in other embodiments of the present application, the "healthy life" interface may include elements other than those described above, so that the user can obtain more information from the interface and perform further operations accordingly.
For example, in some possible implementations, as shown in fig. 10A, the interface may also include 1002 for showing the current date information, as well as a thumbnail area 1003 showing the task completion within the week.
The current date information presented at 1002 may correspond to the health data of the clover model shown at 1001. As shown in fig. 10A, the health data displayed by the clover model are those of Wednesday, June 3, 2020. From 1003 shown in fig. 10A, the user can quickly obtain the historical health data of the week. In some embodiments, 1003 also provides the function of presenting more historical data to the user. For example, referring to fig. 11A, the user may slide 1101 (to the right in the figure) to trigger the electronic device to present earlier health data in 1101, for example the health data of the week from Monday, May 25, 2020 to May 31, 2020 as shown in fig. 11B. The user may adjust the clover model displayed at 1102 by clicking the thumbnail of the clover model for a given day in 1101. For example, when the electronic device receives the user touching the thumbnail corresponding to Wednesday, May 27, 2020, the health data for that day may be presented in 1102; it can be seen that the user completed all tasks of the exercise dimension on that day, so the corresponding leaf is shown in a dark color. As another example, when the electronic device receives the user touching the thumbnail corresponding to Tuesday, May 26, 2020, the health data for that day may be presented in 1102 (as shown in fig. 11C); it can be seen that the user completed the tasks of all three dimensions on that day, so all the corresponding leaves can be shown in a dark color.
Fig. 10A shows an example in which 1003 presents thumbnails of the clover model for each day of a week. In other implementations of the present application, the interface shown in fig. 10A may instead present thumbnails of the clover model for each day of a month. In that case, 1003 in fig. 10A may include a thumbnail for each day from May 1, 2020 to May 31, 2020, so that the user can intuitively see the task completion for the current month.
It should be understood that, since the month-based thumbnail presentation described above contains more content (e.g., a clover-model thumbnail for each day of the month), it may be shown only when it is determined that the user wants to view it. Thus, in other implementations of the present application, the user may trigger the electronic device to present the thumbnails for each day of the month by clicking certain elements of the interface shown in fig. 10A. For example, the user may click (or double-click, long-press, etc.) 1002 to trigger the presentation. In response to this operation, the electronic device may display a floating window, or a new interface, over the interface shown in fig. 10A, and that window or interface may include elements such as the month-based thumbnails.
In this example, as described with reference to fig. 6, after the user clicks one of the thumbnails in 1003 of fig. 10A, the clover model for the date corresponding to that thumbnail may be displayed in 1001. The clover model may be displayed on the screen directly, or in the animated form shown in fig. 6. It should be noted that, in some embodiments, when some or all of the tasks in the dimension corresponding to a certain leaf have been completed, another leaf may be displayed within that leaf in the form of an animation effect to indicate the task completion. For example, as shown in fig. 12A, the user clicks the thumbnail corresponding to May 25, 2020 in 1201. Referring to fig. 12B, 12C, and 12D, in response to the operation input by the user, the electronic device may present, in 1202 on the interface, the corresponding clover model in the form of an animation effect. Since the user completed all tasks of the exercise dimension on that day, as the leaf corresponding to the exercise dimension rotates in and grows larger, another, darker leaf image may appear in a fan-shaped rotation. When the leaf corresponding to the exercise dimension has finished displaying, the darker leaf image is also fully displayed, covering the corresponding area of the exercise-dimension leaf to show the degree of task completion in that dimension. For example, as shown in fig. 12D, since the exercise dimension includes only one task and the user has completed it, the area of the darker leaf corresponding to the exercise dimension can equal the size of that dimension's leaf when fully displayed; that is, the lighter-colored image is covered by the darker-colored image.
It should be noted that, in a specific implementation, the electronic device may place another, darker leaf image on a layer above the layer where the image of the leaf corresponding to the exercise dimension is located, so as to cover that image. Alternatively, the electronic device may directly compute and adjust the color of the image of the leaf corresponding to the exercise dimension, achieving the same covering effect visually.
It should be noted that, in some implementations of the present application, in combination with the interface shown in fig. 10A, the user may input an operation on the clover image displayed on the interface to view the specific task completion information of the corresponding dimension. For example, the user may input a first operation on one of the leaves of 1001 shown in fig. 10A. In response to the first operation, the electronic device may pop up a new window on the interface, and the window may include elements such as a progress bar or a new M-leaf grass for showing the completion of the different tasks in that dimension. Correspondingly, when the user inputs a second operation on an element of the window, the detailed completion of the corresponding task may be shown near that element, such as "running 4/5" at present, or "swimming 0/5", and so on. The first operation and the second operation may each be a click, slide, double-click, touch, or other operation; of course, the first operation may differ from the second operation. The embodiments of the present application do not limit this.
With continued reference to fig. 10A, the interface may also include 1004 for encouraging the user to conduct active health management based on the current health data. As shown in fig. 10A, it may read: "Feeling tense? Do some breathing exercises to keep yourself in the best condition."
In the present application, the prompt presented at 1004 is variable and may correspond to the user's current health data. For example, a prompt phrase library may be configured in the electronic device, and the electronic device may select, based on the collected health data of the user, a corresponding prompt phrase from the library for display, so as to prompt the user to perform corresponding health activities (such as exercising more or getting more rest) and thereby encourage active health management. As another example, the electronic device may scroll through the prompt phrases of the library in 1004, and adjust the probability with which different phrases appear according to the collected health data; for instance, it may increase the probability of phrases corresponding to health activities that the current data show to be lacking, and decrease the probability of phrases corresponding to health activities that the current data show to be sufficient.
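The probability adjustment described above amounts to weighted random selection. A minimal sketch follows; the phrase texts, the `1 + deficit` weighting formula, and the `deficits` input are all hypothetical:

```python
import random

def pick_prompt(prompts, deficits):
    """Weighted selection of a prompt phrase: an activity that the
    current health data show to be lacking (large deficit) appears
    more often, while an activity that is already sufficient
    (deficit 0) keeps only a baseline weight of 1."""
    activities = list(prompts)
    weights = [1 + deficits.get(a, 0) for a in activities]
    return prompts[random.choices(activities, weights=weights, k=1)[0]]
```

For example, a large exercise deficit makes the exercise-related phrase dominate the rotation without ever silencing the others entirely.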
It should be noted that 1002, 1003, and 1004 in the foregoing example are only illustrative; in other implementations, the electronic device may flexibly display part of this content, or more content, according to user settings or scene requirements, which is not limited in this embodiment of the present application.
It should be appreciated that, through the interface shown in fig. 10A, the user can understand more clearly the completion of the tasks in the different dimensions corresponding to the current health activities. In the embodiment of the present application, in order to give the user access to more detailed data, the interface shown in fig. 10A may include further elements. Illustratively, the interface may also present the completion of the specific tasks included in each dimension, for example as shown at 1008 in fig. 10A. As shown in fig. 10A, below 1004, tabs corresponding to tasks such as "fitness training" and "breathing training" may also be displayed on the interface. It can be understood that, owing to the size limit of the display screen of the electronic device, the elements displayed on one interface are limited; therefore, when the user wants to view more task tabs, the user may input a corresponding operation to display the corresponding elements. For example, as shown in fig. 13A, the user may input a slide-up operation on the interface, and in response, the electronic device may scroll the screen content up to present the content shown in fig. 13B. In this example, in response to the slide-up input by the user, the electronic device may display the complete tabs corresponding to "fitness training", "breathing training", and "regular wake-up" on the display screen. From these tabs, the user can see the preset goals and completion status of the corresponding tasks.
In combination with the above description, the data involved in the embodiments of the present application may be acquired by the electronic device itself, for example through its sensor module, or received from a wearable device worn by the user, and the like; the data may also be entered actively by the user. As an example, the user may actively input data by checking in.
As an example, when the user wants to check in for a certain preset task, the user may input a corresponding operation. In response to the operation, the electronic device may present a corresponding window or interface to receive the user's check-in.
For example, with reference to fig. 14A, when the user wants to check in for the "regular wake-up" task, the user may click the "regular wake-up" tab on the screen as in fig. 14A, and in response the electronic device may show a window as in fig. 14B. A "check in" button is provided on the window. When the user clicks or touches the "check in" button, the electronic device may consider the user to have checked in. If the check-in satisfies the preset time constraint, the task is considered completed once. In one possible implementation, after the user completes the task once, as shown in fig. 14C, the electronic device may adjust the color of the corresponding image on the tab and display the words "checked in", so that the user can confirm that the check-in succeeded.
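The "preset time constraint" on a check-in can be sketched as a simple window test. The 06:00-09:00 window for the "regular wake-up" task below is an assumed example, not specified in the embodiment:

```python
from datetime import time

def checkin_completes_task(checkin, start=time(6, 0), end=time(9, 0)):
    """A check-in counts as one task completion only when it falls
    inside the preset time window (assumed here not to cross
    midnight; a night-spanning window would need two comparisons)."""
    return start <= checkin <= end
```

A check-in at 07:30 would count toward "regular wake-up", while one at 11:00 would be recorded but not treated as a completion.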
It should be noted that, in this example, in addition to the elements described above, the "healthy life" interface may provide other content to further improve its interaction with the user.
Illustratively, with reference to fig. 15A, in one possible implementation the interface may also include a share button such as 1501 in the figure. The user can share the current health data by clicking the share button. After receiving the click on the share button, the electronic device may display a share preview interface as shown in fig. 15B. The share interface may show the exercise data of the shared day and the number of complete clovers (e.g., healthy clovers) obtained, as determined from the historical data. In addition, the interface may include buttons for selecting the sharing target, for example a share-to-friend button 1503, a share-to-friend-circle button 1504, a share-to-blog button 1505, and a button 1506 for saving the image 1502 locally.
In other possible implementations, with reference to fig. 16A, an extended-function button 1601 may also be displayed on the interface. The user may click the extended-function button to invoke a drop-down menu providing more function buttons. Illustratively, as shown in fig. 16A, the user may click 1601 (a first click operation) to invoke buttons including the "goal management", "weekly report", and "settings" functions, and then select the desired function by inputting another click operation.
In some scenarios, as shown in fig. 16B, the user may input a second click operation on the "goal management" button in order to manage the tasks in the various dimensions. In response to this operation, the electronic device may present the interface shown in fig. 16C on the screen. It can be seen that this interface shows a number of tasks that may be included in the different dimensions. For example, the activity dimension may include tasks such as walking, moderate-to-high-intensity exercise, and fitness training; the emotion dimension may include tasks such as breathing training; and the sleep dimension may include tasks such as regular wake-up and going to bed early. The user may select, as needed, the tasks included in the different dimensions displayed in the clover model. For example, when the user wants the clover model to show the fitness-training task in the exercise dimension, the user may check that task as shown in fig. 16C. Similarly, the user may add the "breathing training" task to the emotion dimension, or the "regular wake-up" task to the sleep dimension, by checking it. It should be noted that, in the goal-management function shown in fig. 16C, the completion goals corresponding to the different tasks may be set by the user, generated automatically by the electronic device from historical records, or preset by the electronic device, which is not limited in this embodiment of the present application.
In other scenarios, as shown in fig. 16D, the user may input a second click operation on the "weekly report" button to view, through the clover model, the weekly summary formed from the historical records. In response to the user's click, the electronic device may present the interface shown in fig. 16E on the screen. As shown in fig. 16E, the interface may include the cumulative number of healthy clovers the user has obtained, the number of small goals achieved, and the weekly health data. From this interface, the user can obtain the historical health data clearly and intuitively, and analyze or share the data accordingly.
The solution provided by the embodiment of the present application has mainly been introduced from the perspective of the electronic device. It can be understood that the electronic device may include a health data presentation apparatus to implement the health data presentation method. To realize the above functions, the health data presentation apparatus includes corresponding hardware structures and/or software modules for performing each function. Those skilled in the art will readily appreciate that the units of the various examples described in connection with the embodiments disclosed herein may be implemented as hardware, or as a combination of hardware and computer software. Whether a function is performed by hardware, or by computer software driving hardware, depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation should not be considered to go beyond the scope of the present application.
In the embodiment of the present application, the health data presentation apparatus may be divided into functional modules according to the above method examples; for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware, or as a software functional module. Optionally, the division of modules in the embodiment of the present application is schematic and is only a division by logical function; there may be other division manners in actual implementation.
Fig. 17 shows a schematic diagram of a health data presentation apparatus 1700, which may be a chip or a system-on-chip in the electronic device. As one implementation, the health data presentation apparatus 1700 shown in fig. 17 includes a receiving unit 1701 and a presentation unit 1702.
The receiving unit 1701 is configured to receive a first operation of a user. The display unit 1702 is configured to display a first interface in response to the first operation, where a first region of the first interface includes M first elements, the M first elements correspond to M dimensions, each of the M first elements is used to display the completion degree of the corresponding dimension, the completion degree of a dimension is the completion degree of the one or more tasks in that dimension, the completion degrees of the tasks are determined according to the corresponding health data, and M is an integer greater than 1. The figure formed by the M first elements is rotationally symmetric, and any two of the M first elements have the same area.
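By way of illustration only, one way to obtain a rotationally symmetric figure of M equal-area first elements is to give each element an identical angular span around a common center. The function name and parameters below are assumptions for the sketch, not part of the disclosure:

```python
def sector_layout(m, start_angle=0.0):
    """Return (start, end) angles in degrees for m equal sectors.

    Each sector spans 360/m degrees, so the figure formed by the m
    elements is rotationally symmetric and all sectors have equal area.
    """
    if m <= 1:
        raise ValueError("m must be an integer greater than 1")
    span = 360.0 / m
    return [(start_angle + i * span, start_angle + (i + 1) * span)
            for i in range(m)]
```

For example, `sector_layout(4)` divides the circle into four 90-degree sectors, one per dimension.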
In a possible implementation, the first region further includes N second elements, each of the N second elements corresponds to one first element, and N is an integer less than or equal to M and greater than or equal to 0. Each second element is contained within its corresponding first element and is used to indicate the completion degree of the dimension corresponding to that first element.
In a possible implementation, in the first interface the areas of the M first elements are fixed, and the area of each second element is determined according to the completion degree of the dimension corresponding to the first element to which that second element corresponds. The area of the second element is positively correlated with the completion degree of the dimension.
In one possible implementation, the first element and the second element have different colors.
In a possible implementation, while the first element is displayed on the first interface, taking the two-dimensional coordinate system of the display of the electronic device as a reference, the center line of the first element forms a first included angle with a coordinate axis at a first time and a second included angle with the coordinate axis at a second time, where the second time is later than the first time and the second included angle is different from the first included angle.
In a possible implementation, while the second element is displayed on the first interface, taking the two-dimensional coordinate system of the display of the electronic device as a reference, the center line of the second element forms a third included angle with the coordinate axis at a third time and a fourth included angle with the coordinate axis at a fourth time, where the fourth time is later than the third time and the fourth included angle is different from the third included angle.
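The time-varying included angles described above amount to rotating the elements over time. A minimal sketch, assuming uniform rotation at a hypothetical angular speed (the disclosure does not specify the rotation law; the function name is an assumption):

```python
def centerline_angle(initial_angle, angular_speed, t0, t):
    """Angle in degrees between an element's center line and the
    coordinate axis at time t, assuming uniform rotation since t0.

    angular_speed is in degrees per unit time; the result is wrapped
    into [0, 360).
    """
    return (initial_angle + angular_speed * (t - t0)) % 360.0
```

Evaluating the function at two different times yields two different included angles, matching the first/second included-angle behavior described above.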
In one possible implementation, when the tasks in the dimensions corresponding to the M first elements are not all completed, the colors of the M first elements differ from one another. When the tasks in the dimensions corresponding to the M first elements are all completed, the colors of the M first elements are the same.
In one possible implementation, the receiving unit 1701 is further configured to receive a second operation performed by the user on the first element. The display unit 1702 is further configured to display a second interface in response to the second operation, where the second interface includes the detailed completion status of the tasks in the dimension corresponding to the first element.
In one possible implementation, the first interface further includes a second area, and the second area includes first historical health data within a preset period displayed by any one of the possible health data display apparatuses described above.
In one possible implementation, the first historical health data is displayed through thumbnails of the images formed by the M first elements corresponding to different dates within the preset period.
In one possible implementation, the receiving unit 1701 is further configured to receive a third operation performed by the user on a thumbnail in the second area. The display unit 1702 is further configured to display, in response to the third operation, the M first elements corresponding to the thumbnail in the first area.
In one possible implementation, the receiving unit 1701 is further configured to receive a fourth operation performed by the user on a thumbnail in the second area. The display unit 1702 is further configured to display, in response to the fourth operation, the M first elements and the N second elements corresponding to the thumbnail in the first area.
In one possible implementation, the receiving unit 1701 is further configured to receive a fifth operation performed by the user on the second area. The display unit 1702 is further configured to display, in response to the fifth operation, second historical health data in the second area, where the second historical health data is displayed through thumbnails of the images formed by the M first elements corresponding to different dates within the preset period. The time corresponding to the second historical health data is different from the time corresponding to the first historical health data.
In one possible implementation, the first interface further includes a third area, where the third area includes a motivational phrase, and the motivational phrase is determined according to the completion degrees of the dimensions corresponding to the M first elements displayed in the first area.
In a possible implementation, the first interface further includes a first control, and the receiving unit 1701 is further configured to receive a sixth operation performed by the user on the first control. The display unit 1702 is further configured to display, in response to the sixth operation, a third interface, where the third interface includes an image corresponding to the M first elements displayed in the first area, so that a sharing operation can be performed on the image.
In a possible implementation, the first interface further includes a second control, and the receiving unit 1701 is further configured to receive a seventh operation performed by the user on the second control. The display unit 1702 is further configured to display a configuration menu on the first interface in response to the seventh operation, so that the user can set tasks in different dimensions through the configuration menu, or generate a weekly/monthly/annual report corresponding to the health data through the configuration menu.
In one possible implementation, the health data is collected by the electronic device itself, and/or collected by another electronic device, and/or input by the user.
In one possible implementation, the receiving unit 1701 is further configured to receive an eighth operation of the user, where the eighth operation includes a double-click operation, a long-press operation, or a zoom operation on the first area. In response to the eighth operation, the display unit 1702 adds a first element to the first region, where the dimension corresponding to the added first element is different from the dimensions corresponding to the first elements currently displayed in the first region.
In a possible implementation, the dimension corresponding to each first element includes one or more tasks; first elements with different task completion degrees differ in color and/or transparency.
In one possible implementation, for each of the M first elements, when none of the tasks in the dimension corresponding to the first element has been completed, the color of the first element is a first color; when all tasks in the dimension have been completed, the color of the first element is a second color. Different colors of the first element represent different completion degrees of the tasks in the corresponding dimension: the darker the color of the first element, the higher the completion degree of the tasks in the corresponding dimension.
In one possible implementation, the color of the first element is determined according to the following formulas:

R = S_R + (E_R - S_R) × n_i / N_i
G = S_G + (E_G - S_G) × n_i / N_i
B = S_B + (E_B - S_B) × n_i / N_i

where S_R, S_G and S_B are the R, G and B values of the RGB representation of the first color; E_R, E_G and E_B are the R, G and B values of the RGB representation of the second color; N_i is the number of tasks in the i-th dimension; and n_i is the number of completed tasks in the i-th dimension.
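A sketch of this per-channel linear interpolation (the function name is an assumption, and channel values are assumed to be 0-255 integers):

```python
def element_color(first_rgb, second_rgb, completed, total):
    """Interpolate each RGB channel between the first color (no task
    completed) and the second color (all tasks completed)."""
    if total <= 0:
        raise ValueError("a dimension must contain at least one task")
    ratio = completed / total
    return tuple(round(s + (e - s) * ratio)
                 for s, e in zip(first_rgb, second_rgb))
```

With a black first color and white second color, completing 1 of 5 tasks yields (51, 51, 51), and completing all 5 yields (255, 255, 255).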
In a possible implementation, for each of the M first elements, when none of the tasks in the dimension corresponding to the first element has been completed, the transparency of the first element is a first transparency; when all tasks in the dimension have been completed, the transparency of the first element is a second transparency. Different transparencies of the first element represent different completion degrees of the tasks in the corresponding dimension: the greater the transparency of the first element, the higher the completion degree of the tasks in the corresponding dimension.
In one possible implementation, the transparency of the first element is determined according to the following formula:

T = T_0 + (T_N - T_0) × n_i / N_i

where T_0 is the first transparency, T_N is the end (second) transparency, N_i is the number of tasks in the i-th dimension, and n_i is the number of completed tasks in the i-th dimension.
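A sketch of the transparency interpolation described above (the function name is an assumption):

```python
def element_transparency(t_first, t_end, completed, total):
    """Interpolate transparency from the first transparency T0 (no task
    completed) to the end transparency TN (all tasks completed)."""
    if total <= 0:
        raise ValueError("a dimension must contain at least one task")
    return t_first + (t_end - t_first) * completed / total
```

As the number of completed tasks grows from 0 to N_i, the transparency moves linearly from T_0 to T_N.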
It should be noted that, for all relevant content of the steps in the foregoing method embodiments, reference may be made to the functional descriptions of the corresponding functional modules, and details are not repeated here. The health data display apparatus 1700 provided in the embodiments of the present application is configured to perform the functions described in the foregoing embodiments, and can therefore achieve the same effects as the foregoing method. Optionally, the health data display apparatus 1700 may further include a processing module or a control module for supporting the receiving unit 1701 and/or the display unit 1702 in performing the corresponding functions, if necessary.
Fig. 18 shows a schematic diagram of the components of a chip system 1800. The chip system 1800 may include a processor 1801 and a communication interface 1802 for enabling the electronic device to implement the functions described in the foregoing embodiments. In one possible design, the chip system 1800 may further include a memory that stores the program instructions and data necessary for the terminal. The chip system may consist of a chip, or may include a chip and other discrete devices.
It should be noted that, for all relevant content of the steps in the foregoing method embodiments, reference may be made to the functional descriptions of the corresponding functional modules, and details are not repeated here.
Fig. 19 shows a schematic block diagram of an electronic device 1900. The electronic device 1900 may include a processor 1901 and a memory 1902. The memory 1902 is configured to store computer-executable instructions. For example, in some embodiments, when the processor 1901 executes the instructions stored in the memory 1902, the electronic device 1900 is caused to perform operations S701-S704 shown in Fig. 7, as well as other operations that the electronic device needs to perform.
It should be noted that, for all relevant content of the steps in the foregoing method embodiments, reference may be made to the functional descriptions of the corresponding functional modules, and details are not repeated here.
The functions, actions, operations, or steps in the foregoing embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented using a software program, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions described in the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk (SSD)), among others.
Although the present application has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations can be made thereto without departing from the spirit and scope of the application. Accordingly, the specification and figures are merely exemplary of the present application as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the present application. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is also intended to include such modifications and variations.

Claims (21)

1. A method for displaying health data, the method comprising:
receiving a first operation of a user;
in response to the first operation, displaying a first interface, where a first region of the first interface includes M first elements, the M first elements correspond to M dimensions, each of the M first elements is used to display the completion degree of the corresponding dimension, the completion degree of a dimension is the completion degree of one or more tasks in that dimension, the completion degrees of the tasks are determined according to corresponding health data, and M is an integer greater than 1;
wherein the figure formed by the M first elements is rotationally symmetric, and any two of the M first elements have the same area.
2. The method according to claim 1, wherein the first region further comprises N second elements, each of the N second elements corresponds to one first element, and N is an integer less than or equal to M and greater than or equal to 0;
each second element is contained within its corresponding first element and is used to indicate the completion degree of the dimension corresponding to that first element.
3. The method according to claim 2, wherein in the first interface the areas of the M first elements are fixed, and the area of the second element is determined according to the completion degree of the dimension corresponding to the first element to which the second element corresponds; the area of the second element is positively correlated with the completion degree of the dimension.
4. The method according to claim 2 or 3, wherein the first element and the second element have different colors.
5. The method according to any one of claims 2-4, wherein the first element is displayed on the first interface with reference to the two-dimensional coordinate system of the display of the electronic device, the center line of the first element forms a first included angle with a coordinate axis at a first time and a second included angle with the coordinate axis at a second time, the second time is later than the first time, and the second included angle is different from the first included angle.
6. The method according to any one of claims 2 to 4, wherein the second element is displayed on the first interface with reference to the two-dimensional coordinate system of the display of the electronic device, the center line of the second element forms a third included angle with the coordinate axis at a third time and a fourth included angle with the coordinate axis at a fourth time, the fourth time is later than the third time, and the fourth included angle is different from the third included angle.
7. The method according to any one of claims 2 to 5,
when the tasks in the dimensions corresponding to the M first elements are not all completed, the colors of the M first elements differ from one another;
and when the tasks in the dimensions corresponding to the M first elements are all completed, the colors of the M first elements are the same.
8. The method according to any one of claims 1-7, further comprising:
receiving a second operation of the first element by the user;
and in response to the second operation, displaying a second interface, wherein the second interface comprises the detailed completion status of the tasks in the dimension corresponding to the first element.
9. The method of any one of claims 1-8, wherein the first interface further comprises: a second area, the second area comprising first historical health data within a preset period displayed by the health data display method according to any one of claims 1 to 8.
10. The method of claim 9, wherein the first historical health data is displayed through thumbnails of the images formed by the M first elements corresponding to different dates within the preset period.
11. The method according to claim 9 or 10, characterized in that the method further comprises:
receiving a third operation of the user on the thumbnail in the second area;
in response to the third operation, displaying, in the first area, the M first elements corresponding to the thumbnail.
12. The method according to claim 9 or 10, characterized in that the method further comprises:
receiving a fourth operation of the user on the thumbnail in the second area;
in response to the fourth operation, displaying, in the first area, the M first elements and the N second elements corresponding to the thumbnail.
13. The method according to any one of claims 9-12, further comprising:
receiving a fifth operation of the user on the second area;
in response to the fifth operation, displaying second historical health data in the second area, wherein the second historical health data is displayed through thumbnails of the images formed by the M first elements corresponding to different dates within the preset period;
the time corresponding to the second historical health data is different from the time corresponding to the first historical health data.
14. The method of any one of claims 1-13, wherein the first interface further comprises a third region, the third region comprising a motivational phrase, the motivational phrase being determined according to the completion degrees of the dimensions corresponding to the M first elements displayed in the first region.
15. The method of any of claims 1-14, wherein the first interface further comprises a first control, the method comprising:
receiving a sixth operation of the first control by the user;
and in response to the sixth operation, displaying a third interface, wherein the third interface comprises an image corresponding to the M first elements displayed in the first area, so that a sharing operation can be performed on the image.
16. The method of any of claims 1-15, wherein the first interface further comprises a second control, the method further comprising:
receiving a seventh operation of the second control by the user;
and in response to the seventh operation, displaying a configuration menu on the first interface, so that the user can set tasks in different dimensions through the configuration menu, or generate a weekly/monthly/annual report corresponding to the health data through the configuration menu.
17. The method according to any one of claims 1 to 16, wherein the health data is collected by the electronic device itself, and/or collected by another electronic device, and/or input by the user.
18. The method according to any one of claims 1-17, further comprising:
receiving an eighth operation of a user, wherein the eighth operation comprises a double-click operation, a long-press operation or a zooming operation on the first area;
and in response to the eighth operation, adding a first element to the first area, wherein the dimension corresponding to the added first element is different from the dimensions corresponding to the first elements currently displayed in the first area.
19. An electronic device, comprising one or more processors, one or more memories, and a display screen; the one or more memories are coupled with the one or more processors and store computer instructions; when the computer instructions are executed by the one or more processors, the electronic device is caused to display health data on the display screen according to the health data display method of any one of claims 1-18.
20. A computer-readable storage medium comprising computer instructions which, when executed, perform a method of health data presentation as claimed in any one of claims 1 to 18.
21. A chip system, wherein the chip system comprises a processing circuit and an interface; the processing circuit is configured to call and run a computer program stored in a storage medium, to perform the health data display method according to any one of claims 1 to 18.
CN202010514771.8A 2020-06-08 2020-06-08 Health data display method and electronic equipment Pending CN113838550A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010514771.8A CN113838550A (en) 2020-06-08 2020-06-08 Health data display method and electronic equipment
PCT/CN2021/092122 WO2021249073A1 (en) 2020-06-08 2021-05-07 Health data display method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010514771.8A CN113838550A (en) 2020-06-08 2020-06-08 Health data display method and electronic equipment

Publications (1)

Publication Number Publication Date
CN113838550A true CN113838550A (en) 2021-12-24

Family

ID=78845267

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010514771.8A Pending CN113838550A (en) 2020-06-08 2020-06-08 Health data display method and electronic equipment

Country Status (2)

Country Link
CN (1) CN113838550A (en)
WO (1) WO2021249073A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114428659A (en) * 2022-01-26 2022-05-03 惠州Tcl移动通信有限公司 Information display method and device, electronic equipment and computer readable storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115620866B (en) * 2022-06-17 2023-10-24 荣耀终端有限公司 Motion information prompting method and device

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1875863A1 (en) * 2005-04-28 2008-01-09 Shiseido Company, Limited Skin state analyzing method, skin state analyzing device, and recording medium on which skin state analyzing program is recorded
CN101197628A (en) * 2007-12-29 2008-06-11 中国移动通信集团湖北有限公司 Net element information presenting method for wireless system
US20110054269A1 (en) * 2009-08-26 2011-03-03 Samsung Electronics Co., Ltd. Method and apparatus for providing and sharing comprehensive health information
CN105260078A (en) * 2014-05-30 2016-01-20 苹果公司 Wellness data aggregator
CN106250153A (en) * 2016-08-01 2016-12-21 乐视控股(北京)有限公司 A kind of user interface control method and equipment
CN106997574A (en) * 2017-02-15 2017-08-01 王梅 A kind of health data interactive approach and system based on cloud service
CN107533580A (en) * 2015-04-16 2018-01-02 索尼公司 The multiple parameters for the biological information that part as live plant is shown over the display
CN108133757A (en) * 2016-12-01 2018-06-08 三星电子株式会社 For providing the device and method thereof of health management service
CN109685593A (en) * 2017-10-19 2019-04-26 阿里巴巴集团控股有限公司 Data object information providing method, device and electronic equipment
CN109829107A (en) * 2019-01-23 2019-05-31 华为技术有限公司 A kind of recommended method and electronic equipment based on user movement state
WO2019135808A1 (en) * 2018-01-05 2019-07-11 Byton North America Corporation In-vehicle user health platform systems and methods
CN110403585A (en) * 2019-08-30 2019-11-05 周国明 A kind of armband with sport health monitoring function
CN110569095A (en) * 2019-08-09 2019-12-13 华为技术有限公司 Method and electronic equipment for displaying page elements

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9940682B2 (en) * 2010-08-11 2018-04-10 Nike, Inc. Athletic activity user experience and environment
US10678890B2 (en) * 2015-08-06 2020-06-09 Microsoft Technology Licensing, Llc Client computing device health-related suggestions
US11224782B2 (en) * 2017-06-04 2022-01-18 Apple Inc. Physical activity monitoring and motivating with an electronic device
CN109446033B (en) * 2018-09-30 2022-06-14 维沃移动通信有限公司 Method and device for displaying downloading progress



Also Published As

Publication number Publication date
WO2021249073A1 (en) 2021-12-16

Similar Documents

Publication Publication Date Title
CN112130742B (en) Full screen display method and device of mobile terminal
CN110244893B (en) Operation method for split screen display and electronic equipment
CN110543289B (en) Method for controlling volume and electronic equipment
EP3893129A1 (en) Recommendation method based on user exercise state, and electronic device
CN111650840B (en) Intelligent household scene arranging method and terminal
CN109445572A (en) The method, graphical user interface and terminal of wicket are quickly recalled in full screen display video
CN110825469A (en) Voice assistant display method and device
CN109274828B (en) Method for generating screenshot, control method and electronic equipment
CN109766043A (en) The operating method and electronic equipment of electronic equipment
CN110633043A (en) Split screen processing method and terminal equipment
CN111566606B (en) Interface display method and electronic equipment
CN113824834B (en) Control method for screen-off display, electronic equipment, chip and computer readable storage medium
CN115437541A (en) Electronic equipment and operation method thereof
CN110059211B (en) Method and related device for recording emotion of user
CN114115512B (en) Information display method, terminal device, and computer-readable storage medium
CN110727380A (en) Message reminding method and electronic equipment
WO2020118490A1 (en) Automatic screen-splitting method, graphical user interface, and electronic device
CN112583957A (en) Display method of electronic device, electronic device and computer-readable storage medium
WO2021249073A1 (en) Health data display method and electronic device
CN113010076A (en) Display element display method and electronic equipment
CN114077365A (en) Split screen display method and electronic equipment
WO2022242217A1 (en) Linear motor control method and apparatus, device, and readable storage medium
CN115421619A (en) Window display method and electronic equipment
CN115268735A (en) Display method and apparatus thereof
CN113539487A (en) Data processing method and device and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination