CN115705117A - Method for setting moving target and related electronic equipment - Google Patents

Method for setting moving target and related electronic equipment

Info

Publication number
CN115705117A
Authority
CN
China
Prior art keywords
user
food
interface
target
exercise
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110903392.2A
Other languages
Chinese (zh)
Inventor
张兆铭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202110903392.2A priority Critical patent/CN115705117A/en
Priority to PCT/CN2022/109725 priority patent/WO2023011477A1/en
Publication of CN115705117A publication Critical patent/CN115705117A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9535 Search customisation based on user profiles and personalisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9537 Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/955 Retrieval from the web using information identifiers, e.g. uniform resource locators [URL]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Abstract

The application discloses a method for setting an exercise target and related electronic equipment. The method enables smart exercise equipment to offer users varied and engaging targets, to convert an exercise target into a visualized target, and even to break the visualized target down into visualized staged targets. For example, a user may set the consumption of a food as an exercise target, or set a movement route through a scenic area as an exercise target. Implementing the method improves human-computer interaction: it makes exercise targets more engaging and more visible when the user sets them on exercise equipment, helps the user define targets more clearly, feeds back a more intuitive view of the exercise effect, keeps the user from becoming bored while exercising on the equipment, and so raises the user's motivation to exercise.

Description

Method for setting moving target and related electronic equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a method for setting an exercise target and a related electronic device.
Background
As consumer spending upgrades and health awareness grows, the market for sports equipment is expanding rapidly. Consumers increasingly prefer dedicated venues such as gyms, and professional equipment such as treadmills, spinning bikes and elliptical machines, to assist their exercise and monitor its effect.
A user can set targets on the sports equipment, such as a target distance, target time or target calories, which helps the user define goals clearly and reach the intended training effect by completing them.
However, such conventional targets (distance, time, calories and the like) are tedious for the user. For example, a user running 10 km on a treadmill sees only the distance counter ticking up, which is monotonous; the user easily loses enthusiasm and abandons the target.
Disclosure of Invention
The application provides a method for setting an exercise target and related electronic equipment. The method enables smart exercise equipment to offer users varied and engaging targets, to convert an exercise target into a visualized target, and even to break the visualized target down into visualized staged targets. For example, if the user sets a food-consumption target as the exercise target, then while the user exercises the interface shows the visual progress of "consuming" that food according to the user's amount of exercise. Alternatively, a movement route through a region can be set as the exercise target: as the user exercises, the interface shows progress along the route and can display the scenery of the sights along it, using each sight as a staged target that encourages the user to keep exercising in order to unlock the next one.
The above and other objects are achieved by the features of the independent claims. Further implementations are presented in the dependent claims, the description and the drawings.
In a first aspect, an embodiment of the present application provides a method for setting an exercise target, the method including: the first device learns that the user has set the exercise target in a first application to be the consumption of a first food. The first device displays a first image of the first food. The first device acquires a first amount of exercise of the user. The first device displays a second image showing a consumption proportion of the first food; the consumption proportion is associated with the first amount of exercise, and the larger the first amount of exercise, the greater the consumption proportion.
The method of the first aspect improves human-computer interaction. It measures the user's training target in terms of familiar foods, so that the user has a clear goal while exercising, namely burning off the calories of the selected food, and at the same time displays the exercise effect visually on the user interface. The user is therefore less bored while using the exercise equipment, and the user's motivation to exercise increases.
With reference to the first aspect, in some embodiments, the second image includes a first portion indicating the part of the first food that has been consumed and a second portion indicating the part that remains unconsumed; the area of the first portion grows as the first amount of exercise increases. That is, during exercise the first device may acquire the user's amount of exercise in real time and display, in real time, an image of the first food being progressively consumed. Referring to the embodiment shown in fig. 25, the first device may display a food consumption diagram of the first food, a visualization of consumption progress corresponding to the calories the user has currently burned. The consumption proportion of the first food is determined by the user's amount of exercise: the larger the amount of exercise, the larger the consumption proportion. As the user keeps exercising, the consumed portion shown on the interface grows and the remaining unconsumed portion shrinks, until the first food is fully consumed and the exercise target is achieved.
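The real-time consumption proportion described above reduces to a simple calculation. The sketch below is illustrative only; the function name and the clamping to [0, 1] are assumptions, not part of the patent:

```python
def consumption_ratio(calories_burned: float, food_calories: float) -> float:
    """Fraction of the selected food shown as consumed, clamped to [0, 1]."""
    if food_calories <= 0:
        raise ValueError("food_calories must be positive")
    return min(calories_burned / food_calories, 1.0)

# E.g. 450 kcal burned against a 900 kcal food shows the food half consumed.
```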
With reference to the first aspect, in some embodiments, the first portion is a blurred image, or alternatively a transparent image. The embodiments do not limit how the food consumption diagram is drawn; it is enough that the consumed and unconsumed parts are rendered differently, so that they can be clearly distinguished and the consumption progress is clearly indicated.
With reference to the first aspect, in some embodiments, the first food may include one or more foods, and may be obtained in one or more of the following ways: the user selects it in a food list of the first application; or the user finds it by entering text and searching; or the user specifies it by voice input; or the user specifies it by inputting a picture; or it is derived from the user's meal information in a second application, such as a take-away or food-ordering application. The second application may grant the first application permission to read the order information. Quickly importing ordering information retrieves what the user has eaten at once and spares the user the trouble of entering each food manually.
With reference to the first aspect, in some embodiments, the first application includes a food library, so the user can conveniently look up the calories of a food. The food library can be understood as one or more data tables storing the correspondence between foods and the calories they contain per reference quantity, for example 50 kcal per 100 ml of cola.
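The food library described above can be sketched as a lookup table keyed by food name. The `FOOD_LIBRARY` structure and `calories_for` helper below are hypothetical illustrations of the kcal-per-reference-quantity idea; the entries mirror the cola and hamburger examples in the text:

```python
# Hypothetical food library: name -> (reference quantity, unit, kcal per reference quantity)
FOOD_LIBRARY = {
    "cola": (100, "ml", 50),        # 50 kcal per 100 ml, as in the example above
    "hamburger": (1, "piece", 900), # assumed value from the hamburger example
}

def calories_for(food: str, amount: float) -> float:
    """Scale the library's per-reference-quantity calories to the amount eaten."""
    ref_qty, _unit, kcal = FOOD_LIBRARY[food]
    return kcal * amount / ref_qty
```

If a food is missing from the table, the text above suggests falling back to a network search and updating the library.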
The user can select a single food or a set meal on the food list interface, or type characters, letters or numbers into the food search box. For example, if the user had hand-pulled noodles for lunch, the user can enter "hand-pulled noodles" in the search box; the first device or the second device searches the food library and shows all results, and the user adds the matching food and quantity as needed. If the food is not in the library, the device can search the network for the food and its calories and update the library.
Voice search lets the user search for food by speaking. After tapping the voice search control, the user says the food to search for. The first device recognizes the food and quantity in the voice message and looks them up in the food library; if the food is not there, it searches the network for the food and its calories and updates the library.
Code-scan search lets the user search by scanning a code. Some food packaging carries a food code, such as a two-dimensional code or bar code, that encodes food information. After tapping the code-scan control, the user can start the camera and scan the code to recognize the corresponding information, including the food's name, quantity and calories.
Photo search lets the user search for food through pictures. The user photographs the food at mealtime; the first or second device recognizes the foods and quantities in the picture and sums the calories of everything recognized, helping the user add food information faster. Reference can be made to the embodiments described with reference to figs. 12 and 13.
With reference to the first aspect, in some embodiments, the first application provides an option to choose the number of diners: if the first food is shared by N people, selecting N yields the intake of a single person. That is, to compute the calories a single person takes in when several people share a meal, the user can enter the number of diners when entering the food. In one example, the user photographs all the food at a gathering and then selects the number of people sharing it. After recognizing the total calories of all food in the picture, the first or second device processes the data according to the number of diners and roughly estimates how many calories each person takes in, solving the problem of computing a single person's intake at a shared meal. In some embodiments, the total calories of all foods are divided by the number of diners to obtain an average per-person intake. In other embodiments, other estimates may be used; for example, the user can enter the ratio of different diners' intakes, or images of all the diners can be obtained and intake apportioned according to each diner's build.
Alternatively, the first application may obtain user profiles of the diners and apportion intake based on the profiles or on the diners' historical data. For example, if a father, mother and child share a meal, the father's intake might be about one third more than the mother's and the child's about half the mother's, and each person's intake is estimated accordingly.
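The even split and the weighted apportionment described above can be sketched together. The `per_person_calories` helper and the example weights below are illustrative assumptions:

```python
def per_person_calories(total_kcal: float, n_people: int, weights=None) -> list:
    """Split a shared meal's calories among diners.

    Equal split by default; optional per-diner weights (e.g. from user
    profiles or builds) apportion intake unevenly.
    """
    if weights is None:
        weights = [1.0] * n_people
    total_w = sum(weights)
    return [total_kcal * w / total_w for w in weights]

# Equal split: three diners sharing 1800 kcal get 600 kcal each.
# Weighted split: weights [2, 1, 1] give the first diner half the total.
```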
With reference to the first aspect, in some embodiments, to count the calories taken in more accurately, the user may photograph the food before eating and again after eating. The first application identifies the total amount of food before the meal and the remainder after it, and takes the difference between the two calorie values as the calories the user actually took in.
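The before/after photo estimate reduces to a subtraction. This sketch uses a hypothetical function name and clamps the result at zero in case recognition overestimates the leftovers:

```python
def intake_from_photos(kcal_before: float, kcal_after: float) -> float:
    """Calories eaten = calories recognized before the meal minus leftovers after."""
    return max(kcal_before - kcal_after, 0)
```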
With reference to the first aspect, in some embodiments, the actual calories required to consume the first portion of the first food and the calories of the first amount of exercise stand in a first ratio, which may be less than 1 or greater than 1. In one example, the first ratio is less than 1, because burning calories is much slower than taking them in: eating a hamburger may take 5 minutes and supply 900 kcal, while burning 900 kcal by running takes about 90 minutes. Asking the user to run 90 minutes to work off one hamburger is so long that it easily dents the user's enthusiasm. So if the user chooses to burn off the calories of one hamburger, the target can actually be set to some proportion of the hamburger's calories, for example 50%. While the user exercises, consumption progress against the hamburger is displayed as if it were 450 kcal, and when the user has burned 450 kcal the user is told that one hamburger's calories have been consumed. The exercise target thus becomes easier to reach, which raises the user's motivation.
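The first-ratio conversion can be sketched as follows. The helper names are illustrative, and the default ratio of 50% mirrors the hamburger example above:

```python
def scaled_target(food_kcal: float, first_ratio: float = 0.5) -> float:
    """Exercise target after scaling the food's calories by the first ratio."""
    return food_kcal * first_ratio

def target_reached(burned_kcal: float, food_kcal: float, first_ratio: float = 0.5) -> bool:
    """True once the user's burned calories cover the scaled target."""
    return burned_kcal >= scaled_target(food_kcal, first_ratio)

# A 900 kcal hamburger at a 50% ratio becomes a 450 kcal exercise target.
```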
With reference to the first aspect, in some embodiments, the user may generate a picture of the first amount of exercise and the consumption of the first food, share it to a social platform, or save it to a local device. When the first device detects the user's sharing operation, it displays a sharing interface containing an image and/or a text description of the first food consumed. This makes the exercise feel more rewarding and gives the user a greater sense of achievement; see the embodiment of fig. 27.
With reference to the first aspect, in some embodiments, the shared picture generated by the first application further displays a portrait of the user.
With reference to the first aspect, in some embodiments, the first device further displays a map of a first region and the user's movement route in that region; the route is associated with the first amount of exercise, and its length grows as the first amount of exercise increases. That is, the user interface can display the food consumption diagram and the regional movement route map at the same time, so food consumption can be shown side by side with route progress, making the exercise target clearer and more intuitive. For instance, a user who eats a hamburger can see directly on the map how far they would actually have to travel to burn it off. As the user keeps exercising, the displayed route progress grows until the exercise target is achieved.
With reference to the first aspect, in some embodiments, the first region may be a first scenic area, which may be a popular scenic area recommended by the first application or a nearby one recommended from the user's current geographic location. In other embodiments, the user may select any area on the map as the first region; the first application may help the user plan a movement route, or the user may specify the route personally.
With reference to the first aspect, in some embodiments, the image of the first food and the map of the first region are displayed in the same interface. Referring to fig. 42, the food consumption diagram is displayed in a first area of the user interface, e.g. the left half, and the scenic-area route map in a second area, e.g. the right half, so that while exercising the user can see food consumption progress and/or route progress fed back in real time. The first area and the second area may overlap. In still other embodiments, the food consumption image of the first food and/or the route map of the first region may be rendered with a certain transparency.
With reference to the first aspect, in some embodiments, the display may alternate between the food consumption image of the first food and the route map of the first region, for example showing the food consumption image for one minute, switching to the route map for one minute, and then switching back.
With reference to the first aspect, in some embodiments, the actual distance of the movement route in the first region is the same as the movement distance indicated by the first amount of exercise. In other embodiments, the two distances may differ.
With reference to the first aspect, in some embodiments, to boost the user's enthusiasm, the route progress shown to the user need not correspond exactly to the calories actually burned; it can be converted by a certain proportion. If the actual calories required to traverse the movement route and the calories of the exercise amount stand in a second ratio, this embodiment does not limit its value; the second ratio may be greater than 1 or less than 1. In one example, it is less than 1. Exercising on equipment such as a treadmill is relatively tedious, and scenic routes are often long: a lap of the West Lake scenic area exceeds 10 km, which takes an ordinary user well over an hour to run, long enough to dent the user's enthusiasm. The user's actual distance, or calories burned, on the treadmill can therefore be multiplied by a coefficient greater than 1, say 1.5, so that 1 km on the treadmill corresponds to 1.5 km on the movement route diagram 3303. The exercise target thus becomes easier to reach, raising the user's motivation.
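The coefficient conversion above can be sketched in both directions. The function names are illustrative, and the default coefficient of 1.5 comes from the example in the text:

```python
def displayed_route_km(treadmill_km: float, coefficient: float = 1.5) -> float:
    """Distance credited on the route map for actual treadmill distance."""
    return treadmill_km * coefficient

def treadmill_km_needed(route_km: float, coefficient: float = 1.5) -> float:
    """Actual treadmill distance required to complete a route of route_km."""
    return route_km / coefficient

# 1 km on the treadmill is credited as 1.5 km on the route map, so a
# 10.5 km scenic loop needs only 7 km of actual running.
```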
With reference to the first aspect, in some embodiments, when the user's exercise progress corresponds to reaching certain sights in the scenic area, the first device may present a scene view of the sight to the user and introduce it; see the embodiment illustrated in fig. 34. The user may also switch between the scenic-area map interface and the scene display interface. On arriving at a sight, the first device can play voice, video, an animation, or one or more pictures, show the scenery of the current sight, introduce its features and related stories, and may also preview the next sight. Previewing the next sight effectively splits a long route into several small destinations, giving the user staged sub-targets: each previewed sight becomes a new small target, the distance between adjacent sights is relatively short, the target is easy to reach, and the user is well motivated to keep moving.
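The staged sub-targets can be sketched as a lookup of the first sight not yet reached along the route. The sight names and distances below are invented for illustration:

```python
def next_sight(progress_km: float, sights: list):
    """Return the next unreached sight; sights = [(name, km_from_start), ...],
    sorted by distance. Returns None when the whole route is finished."""
    for name, km in sights:
        if progress_km < km:
            return name
    return None

# Hypothetical sights along a lakeside loop:
SIGHTS = [("Broken Bridge", 2.0), ("Su Causeway", 5.5), ("Leifeng Pagoda", 8.0)]
```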
With reference to the first aspect, in some embodiments, an image and/or a text description of the movement route of the user is displayed in the sharing interface.
In a second aspect, an embodiment of the present application provides a method for setting an exercise target, the method including: the first device learns that the user has set the exercise target in a first application to be a movement route in a first region. The first device acquires a second amount of exercise of the user. The first device displays a map of the first region and the user's movement route in it. The route is associated with the second amount of exercise, and its length grows as the second amount of exercise increases.
The method of the second aspect improves human-computer interaction. It takes a regional movement route as the user's training target, so the user's goal while exercising is clear, and a concrete regional route is more intuitive and engaging than a bare time or calorie target. At the same time, progress along the route and the exercise effect are displayed visually on the user interface, so the user is less bored while using the exercise equipment and the user's motivation to exercise increases.
With reference to the second aspect, in some embodiments, as the user's amount of exercise increases, the first device displays the growing progress of the movement route until the exercise target is achieved. The first device acquires the user's amount of exercise in real time and displays the user's progress along the route in real time.
In conjunction with the second aspect, in some embodiments, the first application may provide a plurality of movement routes for the first scenic spot.
With reference to the second aspect, in some embodiments, the first region may be a first scenic area, which may be a popular scenic area recommended by the first application or a nearby one recommended from the user's current geographic location. In other embodiments, the user may select any area on the map as the first region; the first application may help the user plan a movement route, or the user may specify the route personally.
With reference to the second aspect, in some embodiments, the first region may be obtained in one or more of the following ways: the user selects it in a region list of the first application; or the user finds it by entering text and searching; or the user specifies it by voice input; or the user specifies it by inputting a picture; or it is obtained from region information in a second application, such as a travel or scenic-area recommendation application, a map application, or a navigation application. The second application may grant the first application permission to read the region information. Quickly importing region information retrieves it at once and spares the user the trouble of entering it manually.
With reference to the second aspect, in some embodiments, on the map, the route the user has covered, starting from the set starting point, may be marked with a line in a highlight color.
With reference to the second aspect, in some embodiments, the actual distance of the movement route in the first region is the same as the movement distance indicated by the second amount of exercise. In other embodiments, the two distances may differ.
With reference to the second aspect, in some embodiments, to boost the user's enthusiasm, the route progress shown to the user need not correspond exactly to the calories actually burned; it can be converted by a certain proportion. If the actual calories required to traverse the movement route and the calories of the second amount of exercise stand in a second ratio, this embodiment does not limit its value; the second ratio may be greater than 1 or less than 1. In one example, it is less than 1. Exercising on equipment such as a treadmill is relatively tedious, and scenic routes are often long: a lap of the West Lake scenic area exceeds 10 km, which takes an ordinary user well over an hour to run, long enough to dent the user's enthusiasm. The user's actual distance, or calories burned, on the treadmill can therefore be multiplied by a coefficient greater than 1, say 1.5, so that 1 km on the treadmill corresponds to 1.5 km on the movement route diagram 3303. The exercise target thus becomes easier to reach, raising the user's motivation.
With reference to the second aspect, in some embodiments, the first application may recommend different movement routes depending on the device type of the first device. For example, if the first device is a treadmill, a scenic route suitable for running is recommended first; if it is a climbing machine, a mountain route suitable for climbing is preferred; if it is a spinning bike, a scenic road route suitable for riding is preferred; and so on.
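The device-type-based recommendation reduces to a simple mapping. The table entries and the "general" fallback below are illustrative assumptions:

```python
# Hypothetical mapping from equipment type to preferred route style.
ROUTE_TYPE_BY_DEVICE = {
    "treadmill": "running",
    "climbing machine": "mountain",
    "spinning bike": "cycling",
}

def recommend_route_type(device: str) -> str:
    """Preferred route style for a device type, with a generic fallback."""
    return ROUTE_TYPE_BY_DEVICE.get(device, "general")
```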
With reference to the second aspect, in some embodiments, when the user's exercise progress corresponds to reaching a first sight in the first scenic area, the first device may present a scene view of the first sight to the user and introduce it; see the embodiment shown in fig. 34. The user may also switch between the first scene map interface and the first scene display interface. On arriving at a sight, the first device can play voice, video, an animation, or one or more pictures, show the scenery of the current sight, and introduce its features and related classic stories.
With reference to the second aspect, in some embodiments, the next sight may also be previewed. Previewing the next sight effectively splits a long route into several small destinations, giving the user staged sub-targets: each previewed sight becomes a new small target, the distance between adjacent sights is relatively short, the target is easy to reach, and the user is well motivated to keep moving.
In some embodiments, the first scene map interface and the scenic spot scene display interface may be displayed alternately: for example, the first scene map interface is displayed for one minute, then the display switches for one minute to the scene of the scenic spot closest to the user's current position, and then switches back to the first scene map interface.
In combination with the second aspect, in some embodiments, the user may generate a picture of the second amount of motion and the completion of the movement route, share the picture to a social platform, or save it to the local device. When the first device detects the user's sharing operation, it displays a sharing interface containing images and/or text describing how the user completed the movement route. This makes exercise more fashionable and gives the user a greater sense of accomplishment, as can be seen in the embodiment of figure 37.
With reference to the second aspect, in some embodiments, the shared picture generated by the first application may display a composite image of the scenic spot scene display image and the user's portrait, with reference to the embodiment shown in fig. 35.
In conjunction with the second aspect, in some embodiments, the map of the first area may display the exercise situation of a plurality of users, such as a ranking of the users' exercise on the same day or within a certain period (e.g., within a week), or the farthest point reached by each user together with an icon, for example prompting the user: "Only 1 km more to overtake the second user." This encourages users to overtake others and break records, improving their enthusiasm for exercise.
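The ranking and overtake prompt described above can be sketched as follows; the user names, distances, and message wording are illustrative assumptions.

```python
# Sketch: rank multiple users by distance and prompt how far the current
# user is from overtaking the next-ranked user. All data is made up.

def leaderboard(distances: dict) -> list:
    """Return [(name, km)] sorted by distance, farthest first."""
    return sorted(distances.items(), key=lambda kv: kv[1], reverse=True)

def overtake_prompt(distances: dict, me: str) -> str:
    ranking = leaderboard(distances)
    rank = [name for name, _ in ranking].index(me)
    if rank == 0:
        return "You are in first place!"
    ahead_name, ahead_km = ranking[rank - 1]
    gap = ahead_km - distances[me]
    return f"Only {gap:.1f} km to overtake {ahead_name}!"

users = {"A": 12.0, "B": 8.5, "C": 7.5}
print(overtake_prompt(users, "C"))  # Only 1.0 km to overtake B!
```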
In a third aspect, an embodiment of the present application provides an electronic device, where the electronic device may include: a communication device, a display device, a memory, and a processor coupled to the memory, a plurality of application programs, and one or more programs. The communication device is used for communication, the display device is used for displaying an interface, and the memory stores computer-executable instructions which, when executed by the processor, enable the electronic device to implement any one of the functions of the first device in the first aspect or the first device in the second aspect.
In a fourth aspect, the present application provides a computer storage medium, in which a computer program is stored, where the computer program includes executable instructions, and when the executable instructions are executed by a processor, the processor is caused to perform operations corresponding to the method provided in the first aspect or the second aspect.
In a fifth aspect, embodiments of the present application provide a computer program product, which, when run on an electronic device, causes the electronic device to perform any one of the possible implementation manners as in the first aspect or the second aspect.
In a sixth aspect, an embodiment of the present application provides a chip system, where the chip system may be applied to an electronic device, and the chip includes one or more processors, where the processors are configured to invoke computer instructions to enable the electronic device to implement any implementation manner of the first aspect or the second aspect.
By implementing the aspects provided by this application, human-computer interaction performance can be improved. When the user sets an exercise target on exercise equipment, the target becomes more interesting and more visual, the user's target is clearer, a more intuitive exercise effect is fed back to the user, the user no longer feels bored while exercising on the equipment, and the user's enthusiasm for exercise is improved.
Drawings
Fig. 1 is a schematic view of a motion scene according to an embodiment of the present application;
fig. 2 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application;
fig. 3A is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 3B is a schematic diagram of a software architecture of an electronic device according to an embodiment of the present application;
fig. 4 is a schematic diagram of a communication system according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 6 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 9 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 10 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 11 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 12 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 13 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 14 is a schematic diagram of an image processing process provided in an embodiment of the present application;
FIG. 15 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 16 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 17 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 18 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 19 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 20 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 21 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 22 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 23 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 24 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 25 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 26 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 27 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 28 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 29 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 30 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 31 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 32 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 33 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 34 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 35 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 36 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 37 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 38 is a schematic illustration of a user interface provided by an embodiment of the present application;
FIG. 39 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 40 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 41 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 42 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 43 is a flowchart of a method for setting a moving object according to an embodiment of the present disclosure;
Fig. 44 is a flowchart of a method for setting a moving object according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone.
In the following, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In addition, in the description of the embodiments of the present application, "plurality" means two or more.
The term "user interface (UI)" in the following embodiments of the present application refers to a medium interface for interaction and information exchange between an application (APP) or an operating system (OS) and a user; it converts between the internal form of information and a form acceptable to the user. A user interface is source code written in a specific computer language such as Java or extensible markup language (XML); the interface source code is parsed and rendered on the electronic device and finally presented as content the user can recognize. A common presentation form of the user interface is a graphical user interface (GUI), which refers to a user interface displayed in a graphical manner and related to computer operations. It may include visual interface elements such as text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets displayed on a display of the electronic device.
With the upgrading of consumer spending, public health awareness has gradually improved and the consumer market for sports equipment has grown steadily. Consumers increasingly choose more professional exercise venues such as gyms, and use more professional equipment such as treadmills, spinning bikes, and elliptical machines to assist exercise and monitor its effect.
Indoor exercise equipment can quantify exercise, occupies little space, provides professional exercise assistance, and is practical and multi-functional, so it is favored by many consumers. The user can set targets on the equipment, such as a target distance, target time, or target calories, which helps the user define goals more clearly and achieve the expected training effect by completing those targets.
However, targets on these conventional sports apparatuses, such as a target distance, target time, or target calories, are tedious for the user. For example, when running 10 km on a treadmill, the user can only watch the running-distance number change on the treadmill, which is very tedious; the user has no intuitive sense of how many calories are being consumed, easily loses enthusiasm, and abandons the exercise target.
The method for setting an exercise target provided in the embodiments of the present application can be applied to intelligent exercise equipment, which aims to explore a brand-new human-computer interaction mode, provide a more visual and interesting motion feedback mode, and carry the user's exercise requirements. The intelligent exercise equipment may be a treadmill, an elliptical machine, a spinning bike, a rowing machine, a climbing machine, or the like, which is not limited in this application. The intelligent exercise equipment can cooperate with a mobile phone or another terminal to provide consumers with exclusive, multi-functional, personalized, more interesting, and more convenient services. The method can support the intelligent exercise equipment in providing not only common distance, time, and calorie targets for the user, but also various interesting targets, converting an exercise target into a visual target, and even decomposing it into visual staged targets. For example, a food consumption target can be set according to the calories of food the user has eaten; as the user exercises, the food target shows a visual consumption progress of the food according to the user's amount of exercise. Alternatively, a scenic-area route can be provided; as the user exercises, the traveling progress along the route is displayed according to the user's amount of exercise, together with the scenery of a plurality of attractions along the route, each attraction serving as a staged target that encourages the user to keep exercising actively to unlock the next one.
By implementing the method provided by this application, human-computer interaction performance can be improved. When the user sets an exercise target on exercise equipment, the target becomes more interesting and visual, the user's target is clearer, a more intuitive exercise effect is fed back to the user, the user no longer feels bored while exercising on the equipment, and the user's enthusiasm for exercise is improved.
The intelligent exercise equipment in the embodiments of the present application may be, but is not limited to, a treadmill, an elliptical machine, a spinning bike, a rowing machine, a mountain climbing machine, or the like, which is not limited in this application. Besides the exercise-assisting and exercise-counting functions of ordinary sports equipment, the intelligent exercise equipment also has a communication function: it can communicate with a mobile phone or another terminal and cooperate with it to provide consumers with exclusive, multi-functional, personalized, more convenient, and more interesting sports and health services.
In the embodiments of the present application, when detecting a user operation such as a touch, a press, a gesture, or voice, the intelligent exercise equipment can trigger a corresponding instruction, start an exercise mode, and count exercise amounts such as exercise time, exercise distance, and calories consumed.
In the embodiment of the present application, the intelligent sports apparatus mainly uses a treadmill as an example to describe various embodiments.
Referring to fig. 1, fig. 1 shows a schematic view of a motion scene. Fig. 1 illustrates a scenario in which a user performs a running exercise on a treadmill 100.
As shown in fig. 1, the treadmill 100 may include, but is not limited to, a display device 101, an armrest frame 102, a support frame 103, a tread belt 104, a motor (not shown), a processor (not shown), a communication device (not shown), and the like. It is to be understood that the structure of the treadmill 100 described herein is merely exemplary and not limiting with respect to other embodiments of the present application.
The display device 101 may be a display screen, a projection device, a dashboard, or the like, and may generate an optical signal and map it to the user's eyes. The display device 101 is used to display a user interface, and may display exercise information such as the current running speed, running distance, calories consumed, food consumption progress, and scenic route progress to the user. The display device 101 may also sense user operations to generate corresponding instructions; for example, the display device 101 may include a touch sensing component, and the user may select start/end running, increase/decrease the running speed, increase/decrease the treadmill gradient, select a food consumption target, or select a running route on the touch screen.
The armrest frame 102 is mounted on the top end of the support frame 103 for the user to hold when tired or in an emergency.
The support frame 103 is used to connect and support the various components of the treadmill 100, such as the display device 101, the armrest frame 102, the tread belt 104, the motor (not shown), and the like.
The tread belt 104 is the area where the user runs, and the treadmill 100 may determine the running distance of the user based on the distance traveled by the tread belt 104.
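As a rough sketch of how belt travel can yield running distance (the roller diameter and the derivation from roller revolutions are assumptions for illustration, not the patent's stated method):

```python
import math

# Sketch: belt travel equals roller circumference times roller revolutions,
# so the user's running distance can be estimated from revolution counts.

ROLLER_DIAMETER_M = 0.05  # assumed front-roller diameter in metres

def running_distance_m(revolutions: float) -> float:
    """Estimate the distance travelled by the tread belt in metres."""
    return math.pi * ROLLER_DIAMETER_M * revolutions

print(round(running_distance_m(1000), 1))  # roughly 157.1 m
```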
A motor (not shown) is generally located at the front end of the tread belt 104 for powering the drive of the tread belt 104.
A processor (not shown) is used to interpret signals or generate instructions, as well as process data, coordinate scheduling processes, and the like.
The communication device (not shown) is used for transmitting communication data or instructions, including receiving and sending communication signals, such as voice information, control signaling, image data stream, video data stream, and the like. In some embodiments, the treadmill 100 can establish a communication connection with other electronic devices, such as a mobile phone, a computer, etc., through the communication device.
The communication connection provided by the communication device may be, but is not limited to, a wired or wireless connection. For example, the wireless connection may be a wireless fidelity (Wi-Fi) connection, a Bluetooth connection, a near field communication (NFC) connection, a ZigBee connection, or another short-range transmission technology. The wired connection may be a universal serial bus (USB) connection, a high definition multimedia interface (HDMI) connection, a DisplayPort (DP) connection, or the like. This embodiment does not limit the type of the communication connection.
It should be understood that, in the embodiment of the present application, the intelligent sports device is mainly illustrated by using a treadmill as an example, and does not limit other embodiments of the present application, and the intelligent sports device may also be other types of sports devices, such as an elliptical machine, a spinning bike, a rowing machine, a mountain climbing machine, and the like, and functions and inventions described later may be correspondingly migrated.
Fig. 2 is a schematic hardware structure diagram of a first device according to an embodiment of the present application. The first device is an intelligent sports device. The embodiment of the present application does not set any limit to the specific type of the first device. The intelligent sports equipment can be a treadmill, an elliptical machine, a spinning bike, a rowing machine, a climbing machine and the like. It is understood that when the first device corresponds to different types of sports devices, a part of hardware structure can be increased or reduced.
The first device may include a processor 201, a display device 202, a communication device 203, a motor 204, a transmission device 205, a power management module 206, a charging interface 207, keys 208, an indicator 209, a camera 210, an audio module 211, and a sensor module 212. The audio module 211 may include a speaker 211A, a microphone 211B, and an audio interface 211C; the sensor module 212 may include a pressure sensor 212A, a touch sensor 212B, and the like.
It is to be understood that the structure illustrated in the present embodiment does not constitute a specific limitation of the first device. In other embodiments of the present application, the first device may include more or fewer components than illustrated, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 201 is typically used to control the overall operation of the first device and may comprise one or more processing units. For example: the processor 201 may include a Central Processing Unit (CPU), an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a Video Processing Unit (VPU), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), and the like. Wherein, the different processing units may be independent devices or may be integrated in one or more processors. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
The first device may implement display functions via the GPU, the display apparatus 202, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display device 202 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 201 may include one or more GPUs that execute program instructions to generate or change display information.
The display device 202 may be a display screen, a projection device, or a dashboard, and may generate an optical signal and map it into the user's eye. The display device 202 is used to display a user interface such as images or videos, and to display exercise information such as the current running speed, running distance, calories consumed, food consumption progress, and scenic route progress to the user. The display device 202 may include a display screen, and the display screen may include a display panel. The display panel may be used to display physical objects and/or virtual objects. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro OLED, a quantum dot light-emitting diode (QLED), or the like. The display device 202 may also sense user operations to generate corresponding instructions; for example, the display device 202 may include a touch sensor component, and the user may select start/end running, increase/decrease the running speed, increase/decrease the treadmill gradient, select a food consumption target, or select a running route through the touch screen.
The first apparatus may comprise 1 or N display devices 202, N being a positive integer larger than 1.
The communication device 203 may be used for communication between the first device and other devices, and may include a mobile communication module and a wireless communication module. The mobile communication module may provide solutions for wireless communication applied to the first device, including second generation (2G), third generation (3G), fourth generation (4G), and fifth generation (5G) networks. The wireless communication module may provide solutions for wireless communication applied to the first device, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module may be one or more devices integrating at least one communication processing module. The wireless communication module receives electromagnetic waves via the antenna, frequency-modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 201. The wireless communication module may also receive a signal to be transmitted from the processor 201, frequency-modulate and amplify it, and radiate it as electromagnetic waves via the antenna.
The motor 204 may convert electrical power into mechanical power to drive the transmission device 205. The motor may be a direct current motor or an alternating current motor, and is typically coupled to the transmission device 205 to move it.

The transmission device 205 directly assists the user's movement. For example, on a treadmill, the transmission device 205 may include a roller that transmits power from the motor to the tread belt, which is conveyed continuously so that the user can run on it. The user can adjust the conveying speed of the tread belt and thereby the running speed.
The power management module 206 is used to supply power to the first device, and the power management module 206 may receive a current input and supply power to the processor 201, the display device 202, the communication device 203, the motor 204, the transmission device 205, and the like.
The charging interface 207 is an interface that receives a charging input from a charger.
The first device may include one or more keys 208. These keys 208 may control the first device to provide the user with access to functions on the first device. The keys 208 may be in the form of mechanical keys such as buttons, switches, dials, etc., or may be touch or near touch sensing devices (e.g., touch sensors). The first device may receive a key input, and generate a key signal input related to a user setting and a function control of the first device. The keys 208 may include a start/stop key, a raise/lower key, a previous/next key, a confirm key, and the like.
The indicator 209 may be an indicator light, used to indicate the status of the exercise equipment, a change in exercise, or the user's status; it may also be used to indicate a message, a notification, or the like.
A camera 210 for capturing still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, the first device may include 1 or N cameras 210, N being a positive integer greater than 1. The camera 210 may include, but is not limited to, a conventional color camera (RGB camera), a depth camera (RGB depth camera), a Dynamic Vision Sensor (DVS) camera, and the like. In some embodiments, camera 210 may be a depth camera. The depth camera can acquire spatial information of a real environment.
In some embodiments, the first device may capture an image of a user through the camera 210, identify different users through faces, correspondingly enable different user accounts, and store movement information of different users through the accounts of the different users, so as to ensure that the accounts of the different users are not confused, and further protect data privacy of the users.
In some embodiments, the first device may also intelligently recognize the sweat amount or the state of facial muscles (relaxed or tense) of the user through the camera 210 to determine the current exercise state of the user, so as to prompt the user to take a rest appropriately in time, and prevent the body health from being negatively affected due to excessive exercise.
In some embodiments, the camera 210 may capture an image of a user's hand or body, and the processor 201 may be configured to analyze the image captured by the camera 210 to identify a hand or body motion input by the user. For example, the hand motion of the user can be recognized through the camera 210 to realize the gesture control of the user, the body motion of the user can be recognized through the camera 210, whether the motion of the user is standard or not is judged, and a prompt is given to the user.
The audio module 211, the audio module 211 may include a speaker 211A, a microphone 211B, and an audio interface 211C, and the first device may implement an audio function through the audio module 211, the speaker 211A, the microphone 211B, the audio interface 211C, and the application processor. Such as sound playback, recording, etc.
The audio module 211 is used for converting digital audio information into an analog audio signal for output, and also for converting an analog audio input into a digital audio signal. The audio module 211 may also be used to encode and decode audio signals.
The speaker 211A, also called a "horn", is used to convert an audio electrical signal into a sound signal. The user can listen to music through the speaker 211A or listen to a handsfree call.
The microphone 211B, also called "microphone", is used to convert the sound signal into an electrical signal. When collecting voice information, the user can input a voice signal to the microphone 211B by speaking near the microphone 211B through the mouth. The first device may be provided with at least one microphone 211B. In other embodiments, the electronic device may be provided with two microphones 211B, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device may further include three, four, or more microphones 211B to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
The audio interface 211C is used for connecting external audio devices, such as earphones and speakers. The audio interface 211C may be a universal serial bus (USB) interface, a 3.5 mm open mobile terminal platform (OMTP) standard interface, a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface, or the like.
The first device may also comprise other input and output interfaces, and other means may be connected to the first device via suitable input and output interfaces. The components may include, for example, video jacks, data connectors, and the like.
The sensor module 212 is used for detecting an external signal, and the first device is equipped with one or more sensors, including but not limited to a pressure sensor 212A, a touch sensor 212B, and the like.
The pressure sensor 212A is used to sense a pressure signal and can convert it into an electrical signal. Pressure sensors 212A come in many types, such as resistive, inductive, and capacitive pressure sensors. When a touch operation acts on the electronic device, the electronic device detects the intensity of the touch operation via the pressure sensor 212A, and may also calculate the touch position from the detection signal of the pressure sensor 212A. In some embodiments, touch operations acting on the same touch position but with different intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is smaller than a first pressure threshold acts on the pressure sensor 212A, an instruction to pause the audio is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the pressure sensor 212A, an instruction to turn off the audio is executed. In some embodiments, touch operations acting on the same touch position but with different durations may correspond to different operation instructions. For example: when a touch operation whose duration is shorter than a first time threshold acts on the pressure sensor 212A, a confirmation instruction is executed; when a touch operation whose duration is greater than or equal to the first time threshold acts on the pressure sensor 212A, a power on/off instruction is executed.
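The threshold-based dispatch above can be sketched as follows; the concrete threshold values and instruction strings are assumptions for illustration, as the patent does not fix them.

```python
# Sketch: map touch pressure and touch duration to different operation
# instructions, following the two threshold rules described above.

PRESSURE_THRESHOLD = 0.5   # assumed normalized first pressure threshold
TIME_THRESHOLD_S = 1.0     # assumed first time threshold, in seconds

def audio_instruction(pressure: float) -> str:
    """Different pressures at the same position yield different instructions."""
    return "pause audio" if pressure < PRESSURE_THRESHOLD else "turn off audio"

def touch_instruction(duration_s: float) -> str:
    """Different durations at the same position yield different instructions."""
    return "confirm" if duration_s < TIME_THRESHOLD_S else "power on/off"

print(audio_instruction(0.3), "|", touch_instruction(1.5))
```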
The touch sensor 212B is also referred to as a "touch device". The touch sensor 212B is used to detect a touch operation applied thereto or thereabout. Touch sensor 212B can communicate the detected touch operation to an application processor to determine the touch event type. The first device may provide visual output related to the touch operation through the display apparatus 202. The first device may also send an instruction corresponding to the touch operation to another electronic device that establishes the communication connection.
The first device may establish a communication connection with the second device, through which data or instructions are transmitted. The user may enter information on the second device and synchronize it to the first device. The user may also issue control instructions to the first device through user operations performed on the second device.
The first device or the second device may be installed with an operating system such as the HarmonyOS (HOS) system or another type of operating system, which is not limited in this application.
Exemplary second devices provided in embodiments of the present application may include, but are not limited to, a mobile phone, a notebook, a desktop, a tablet computer (PAD), a personal computer (PC), or other types of electronic devices, such as a desktop computer, a laptop computer, a handheld computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a smart screen, a vehicle-mounted device, a game console, a smart watch, a smart bracelet, or another smart wearable device, and may further include an internet of things (IoT) device and/or a smart scale, such as a smart body scale, and the like. The embodiment of the present application does not set any limit on the specific type of the second device. In this embodiment, a terminal device may also be referred to as a terminal for short; a terminal device is generally an intelligent electronic device that can provide a user interface, interact with a user, and provide service functions for the user.
Fig. 3A is a schematic hardware structure diagram of a second device according to an embodiment of the present application.
The second device may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation to the second apparatus. In other embodiments of the present application, the second device may include more or fewer components than illustrated, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 is generally used to control the overall operation of the second device and may include one or more processing units. For example: the processor 110 may include a Central Processing Unit (CPU), an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a Video Processing Unit (VPU), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the second device selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The second device may support one or more video codecs. In this way, the second device may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor, which processes input information quickly by referring to a biological neural network structure, for example, by referring to a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize applications such as intelligent cognition of electronic equipment, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, a Serial Peripheral Interface (SPI) interface, and the like.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the electronic device.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, audio module 170 may communicate audio signals to wireless communication module 160 through an I2S interface.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, audio module 170 may also communicate audio signals to wireless communication module 160 through a PCM interface. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
In some embodiments, a UART interface is generally used to connect the processor 110 and the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through the UART interface to realize the function of playing the audio.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display screen 194, the camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device, and may also be used to transmit data between the electronic device and a peripheral device. The interface can also be used to connect other electronic equipment, such as a mobile phone, a PC, or a smart television. The USB interface may be USB 3.0, configured to be compatible with DisplayPort (DP) signaling, and may transmit high-speed video and audio data.
It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not constitute a limitation to the structure of the second device. In other embodiments of the present application, the second device may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the second device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the second device may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the second device. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the second device, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the second device is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the second device can communicate with the network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code Division Multiple Access (CDMA), wideband Code Division Multiple Access (WCDMA), time division code division multiple access (time-division multiple access, TD-SCDMA), long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou satellite navigation system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The second device may implement display functionality via the GPU, the display screen 194, and the application processor, among other things. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the second device may include 1 or N display screens 194, N being a positive integer greater than 1.
The second device may implement a shooting function through the ISP, camera 193, video codec, GPU, display screen 194, application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the second device may include 1 or N cameras 193, N being a positive integer greater than 1. The camera 193 may include, but is not limited to, a conventional color camera (RGB camera), a depth camera (RGB depth camera), a Dynamic Vision Sensor (DVS) camera, and the like. In some embodiments, camera 193 may be a depth camera. The depth camera can acquire spatial information of a real environment.
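The DSP step above converts the digital image signal into standard formats such as RGB and YUV. As one hedged illustration, a full-range BT.601 conversion is a common option for such an RGB-to-YUV step; the application does not specify which matrix the device actually uses.

```python
def rgb_to_yuv(r, g, b):
    """Convert one full-range RGB pixel (0-255 per channel) to YUV
    using the BT.601 full-range matrix -- one standard option for the
    RGB-to-YUV conversion a DSP might perform."""
    y = 0.299 * r + 0.587 * g + 0.114 * b          # luma
    u = -0.169 * r - 0.331 * g + 0.5 * b + 128     # blue-difference chroma
    v = 0.5 * r - 0.419 * g - 0.081 * b + 128      # red-difference chroma
    return (round(y), round(u), round(v))
```

For achromatic pixels (white, black, grays), both chroma channels sit at the 128 midpoint, which is a quick sanity check on the matrix.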
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area can store data (such as audio data, phone book and the like) created in the using process of the electronic device.
In some embodiments of the present application, internal memory 121 may be used to store application programs, including instructions, for one or more applications. The application program, when executed by the processor 110, causes the second device to generate content for presentation to the user. Illustratively, the application may include an application for managing the second device, such as a gaming application, a conferencing application, a video application, a desktop application, or other application, among others.
The internal memory 121 may include one or more Random Access Memories (RAMs) and one or more non-volatile memories (NVMs).
The random access memory has the characteristics of a high read/write speed and volatility. Volatility means that upon power-down, the data stored in the RAM disappears. In general, the RAM has very low static power consumption and relatively high operating power consumption. The data in the RAM is memory data that can be read at any time and disappears when the power is off.
The nonvolatile memory has nonvolatile and stable data storage characteristics. The nonvolatile property means that after power is off, the stored data can not disappear, and the data can be stored for a long time after power is off. Data in the NVM includes application data and can be stably stored in the NVM for a long time. The application data refers to content written in the running process of an application program or a service process, such as photos or videos acquired by a photographing application, text edited by a user in a document application, and the like.
The random access memory may include static random-access memory (SRAM), dynamic random-access memory (DRAM), synchronous dynamic random-access memory (SDRAM), double data rate synchronous dynamic random-access memory (DDR SDRAM), such as fifth generation DDR SDRAM generally referred to as DDR5 SDRAM, and the like.
The nonvolatile memory may include a magnetic disk storage (magnetic disk storage), a flash memory (flash memory), and the like.
The magnetic disk storage device is a storage device using a magnetic disk as a storage medium, and has the characteristics of large storage capacity, high data transmission rate, long-term storage of stored data and the like.
The FLASH memory may include NOR FLASH, NAND FLASH, 3D NAND FLASH, etc. according to the operation principle, may include single-level cells (SLC), multi-level cells (MLC), three-level cells (TLC), four-level cells (QLC), etc. according to the level order of the memory cells, and may include universal FLASH memory (UFS), embedded multimedia memory cards (eMMC), etc. according to the storage specification.
The random access memory may be read directly by the processor 110, may be used to store executable programs (e.g., machine instructions) for an operating system or other programs that are running, and may also be used to store data for user and application programs, etc.
The nonvolatile memory may also store executable programs, data of users and application programs, and the like, and may be loaded into the random access memory in advance for the processor 110 to directly read and write.
The external memory interface 120 may be used to connect an external nonvolatile memory, so as to expand the storage capability of the electronic device. The external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are saved in an external nonvolatile memory.
The second device may implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into analog audio signals for output, and also used to convert analog audio inputs into digital audio signals. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called the "earpiece", is used to convert the electrical audio signal into a sound signal. When the electronic device answers a call or receives voice information, the user can listen to the voice by placing the receiver 170B close to the ear.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking close to it. The electronic device may be provided with at least one microphone 170C. In other embodiments, the electronic device may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and the like.
The earphone interface 170D is used to connect a wired earphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The second device may include one or more keys 190, and these keys 190 may control the second device to provide the user with access to functions on the second device. The keys 190 may be in the form of mechanical buttons, switches, dials, etc., and may also be touch or near touch sensing devices (e.g., touch sensors). The second device may receive a key input, and generate a key signal input related to user setting and function control of the second device. The keys 190 may include a power-on key, a volume key, and the like.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration prompts as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the second device. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or may be used to indicate a message, notification, or the like.
The second device may also include other input and output interfaces, and other apparatuses may be connected to the second device through suitable input and output interfaces. These interfaces may include, for example, audio/video jacks, data connectors, and the like.
The second device is equipped with one or more sensors including, but not limited to, a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, or the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The second device determines the intensity of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the second device detects the intensity of the touch operation based on the pressure sensor 180A. The second device may also calculate the position of the touch from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that are applied to the same touch position but have different touch operation intensities may correspond to different operation instructions. For example: when a touch operation with a touch operation intensity smaller than the first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed. In some embodiments, touch operations that are applied to the same touch position but have different touch operation time lengths may correspond to different operation instructions. For example: when a touch operation with a touch operation time length smaller than the first time threshold acts on the pressure sensor 180A, an instruction to confirm is executed; when a touch operation with a touch operation time length greater than or equal to the first time threshold acts on the pressure sensor 180A, a power-on/power-off instruction is executed.
The gyro sensor 180B may be used to determine the motion pose of the second device. In some embodiments, the angular velocity of the electronic device about three axes (i.e., x, y, and z axes) may be determined by the gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyroscope sensor 180B detects a shake angle of the electronic device, calculates a distance to be compensated for the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device through a reverse motion, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the second device calculates altitude from barometric pressure values measured by barometric pressure sensor 180C to assist in positioning and navigation.
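The altitude calculation mentioned above is conventionally done with the international barometric formula. The sketch below is a common approximation, not necessarily the exact computation the second device performs; the sea-level reference pressure is the standard-atmosphere assumption.

```python
def altitude_from_pressure(pressure_pa, sea_level_pa=101325.0):
    """Estimate altitude (meters) from measured air pressure (pascals)
    using the international barometric formula -- a standard approximation,
    not the device's specific algorithm."""
    return 44330.0 * (1.0 - (pressure_pa / sea_level_pa) ** (1.0 / 5.255))
```

At the standard sea-level pressure the formula returns 0 m, and a reading around 89.9 kPa corresponds to roughly 1000 m, which matches the typical pressure lapse near the ground.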
The magnetic sensor 180D includes a Hall sensor. The second device may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the second device is a flip phone, the opening and closing of the flip cover may be detected according to the magnetic sensor 180D. Features such as automatic unlocking upon opening the flip cover can then be set according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E can detect the magnitude of acceleration of the second device in various directions (typically along three axes), and can detect the magnitude and direction of gravity when the second device is stationary. It can also be used to recognize the orientation of the electronic device, and is applied in landscape/portrait switching, pedometers, and similar applications.
The distance sensor 180F is used to measure distance. The second device may measure distance by infrared or laser. In some embodiments, in a photographing scenario, the second device may use the distance sensor 180F to measure distance to achieve fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The second device emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the second device; when insufficient reflected light is detected, the second device may determine that there is no object nearby. The second device can use the proximity light sensor 180G to detect that the user is holding the second device close to the ear while talking, so as to automatically turn off the screen and save power. The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The second device may adaptively adjust the brightness of the display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture, and may cooperate with the proximity light sensor 180G to detect whether the second device is in a pocket, so as to prevent accidental touches.
The fingerprint sensor 180H is used to collect fingerprints. The second device can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint photographing, fingerprint-based call answering, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the second device implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the second device heats the battery 142 when the temperature is below another threshold, to avoid abnormal shutdown of the second device due to low temperature. In still other embodiments, the second device boosts the output voltage of the battery 142 when the temperature is below a further threshold, likewise to avoid abnormal shutdown due to low temperature.
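The three-tier temperature processing strategy above can be sketched as a simple policy function. The threshold values below are assumptions for illustration; the patent does not specify them.

```python
# Illustrative: the temperature processing strategy described for
# temperature sensor 180J. Thresholds are assumed values, not from the
# patent.

HIGH_TEMP_C = 45.0       # assumed: throttle the processor above this
LOW_TEMP_C = 0.0         # assumed: heat the battery below this
VERY_LOW_TEMP_C = -10.0  # assumed: boost battery output voltage below this

def thermal_policy(temp_c: float) -> list[str]:
    """Map a reported temperature to the actions described in the text."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("reduce_processor_performance")
    if temp_c < LOW_TEMP_C:
        actions.append("heat_battery")
    if temp_c < VERY_LOW_TEMP_C:
        actions.append("boost_battery_output_voltage")
    return actions
```
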
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form what is also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting on or near it, and may pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the second device at a position different from that of the display screen 194.
The bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of a vibrating bone mass of the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive blood pressure beating signals. In some embodiments, the bone conduction sensor 180M may also be provided in a headset, integrated as a bone conduction headset. The audio module 170 may parse out a voice signal based on the vocal-part bone-mass vibration signal acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be attached to or detached from the second device by inserting it into or pulling it out of the SIM card interface 195. The second device may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a standard SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time, and the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The second device interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the second device employs an eSIM, i.e., an embedded SIM card, which can be embedded in the second device and cannot be separated from it.
The software system of the second device may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present invention takes the Android system with a layered architecture as an example to illustrate the software structure of the second device.
Fig. 3B is a block diagram of a software configuration of the second device of the embodiment of the present invention.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system may be divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 3B, the application packages may include applications such as Camera, Gallery, Calendar, Phone, Map, Navigation, WLAN, Bluetooth, Music, Video, and Settings. The size and thickness of the font, among other things, can be set in the Settings application.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3B, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
Content providers are used to store and retrieve data and make it accessible to applications. Such data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which may disappear automatically after a short stay without user interaction; for example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. Examples include prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, and flashing an indicator light.
The Android runtime comprises a core library and a virtual machine, and is responsible for the scheduling and management of the Android system.
The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording in a variety of commonly used audio and video formats, as well as still image files. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The above description of the software architecture of the second device is only an example; it should be understood that the software architecture illustrated in the embodiment of the present invention does not constitute a specific limitation on the present application. In other embodiments of the present application, the software architecture of the second device may include more or fewer modules than shown, or combine some modules, or split some modules, or have a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The following describes a communication system 10 provided in an embodiment of the present application.
Fig. 4 illustrates a communication system 10 according to an embodiment of the present application.
As shown in fig. 4, the communication system 10 includes a first device and a second device. A first connection is established between the first device and the second device, over which they may communicate. The first connection may be a wired connection or a wireless connection, which is not limited in this embodiment. The first device may exchange data or instructions with the second device over the first connection. A user may enter information on the second device and synchronize it to the first device, and may issue control instructions to the first device through user operations performed on the second device. The first connection may include a wireless connection, such as a Bluetooth connection, a wireless fidelity (Wi-Fi) connection, or a hotspot connection, and enables communication between the first device and the second device whether they use the same account, no account, or different accounts. A wireless connection is not constrained by a cable, giving the user greater freedom of movement. The first connection may also be an Internet connection; in some embodiments, the first device and the second device may log in to the same account so as to connect and communicate via the Internet. Of course, multiple terminals may also log in to different accounts but connect in a binding manner: for example, the first device and the second device may log in to different accounts, and the second device sets, in a device management application, a binding between the first device and itself, and then connects through the device management application. The first connection may also include a wired connection, such as a USB connection, a high definition multimedia interface (HDMI) connection, or a display port (DP) connection.
In fig. 4, the first device is exemplified as a treadmill and the second device as a mobile phone. Without limitation, the first device may also be another intelligent sports device such as an elliptical machine, a rowing machine, or a climbing machine, and the second device may also be a device with stronger processing capability, such as a PC, a tablet computer, a notebook computer, a cloud host/cloud server or other desktop computer, a laptop computer, a handheld computer, an artificial intelligence (AI) device, a smart television, a vehicle-mounted device, or a game console, which is not limited in this embodiment.
In some embodiments, when the first device and the second device are trusted devices, for example when the second device has previously been paired with or connected to the first device, the first device automatically establishes a communication connection with the second device and then performs data interaction, without requiring the user to manually perform the connection or pairing operation again, which saves time and effort.
An Android system, a HarmonyOS (harmony OS, HOS) system, or another type of operating system may be installed on the first device or the second device, and the operating systems of the first device and the second device may be the same or different, which is not limited in this application.
In some embodiments, when multiple terminals in the communication system 10 are each equipped with a HarmonyOS system, the communication system formed by these terminals may be referred to as a HarmonyOS super virtual device, also called a HarmonyOS super terminal. The super terminal integrates the capabilities of multiple terminals through distributed technology into a virtual hardware resource pool, and uniformly manages, schedules, and integrates terminal capabilities according to business needs to provide services externally, so that quick connection, capability mutual assistance, and resource sharing are realized among different terminals.
The embodiment of the present application does not limit the type of the first connection; data transmission and interaction between terminals in the communication system 10 may be performed through multiple types of communication connection. In addition, the terminals may also connect and communicate in any of the above manners, which is not limited in this embodiment of the application.
For example, the first connection may be a combination of multiple connections: the first device or the second device may access the network by establishing a connection with a router through Wi-Fi, or with a base station through cellular signals, and the first device and the second device may then communicate through the network. For instance, the second device sends information to a cloud server through the network, and the cloud server sends the information to the first device through the network.
Accordingly, a mobile communication module and a wireless communication module may be configured in the first device or the second device for communication. The mobile communication module can provide solutions for wireless communication applied to the terminal, including 2G/3G/4G/5G. The wireless communication module may include a Bluetooth module and/or a wireless local area network (WLAN) module, and so on. The Bluetooth module may provide a solution including one or more of classic Bluetooth (Bluetooth 2.1) or Bluetooth low energy (BLE), and the WLAN module may provide a solution including one or more of wireless fidelity peer-to-peer (Wi-Fi P2P), wireless fidelity local area networks (Wi-Fi LANs), or a wireless fidelity software access point (Wi-Fi softAP). In some embodiments, Wi-Fi P2P refers to a technology that allows devices in a wireless network to connect to each other in a point-to-point fashion without going through a wireless router; in the Android system it may also be referred to as wireless fidelity direct (Wi-Fi direct). Devices that have established a Wi-Fi P2P connection can exchange data directly through Wi-Fi (they must be in the same frequency band) without connecting to a network or a hotspot, realizing point-to-point communication such as transferring files, pictures, and videos. Compared with Bluetooth, Wi-Fi P2P has advantages such as faster search speed, faster transmission speed, and longer transmission distance.
In combination with the communication system 10, the technical solution provided by this application can integrate the software and hardware capabilities of different devices and provide users with a multifunctional, visual, and more interesting exercise experience.
In some embodiments, the second device has an exercise health application installed thereon, which may obtain user exercise data detected by the second device or the first device. The user may also record athletic data, dietary data, or personal signs data via the athletic health application. The first device may also have the exercise health application installed thereon.
In some embodiments, the same user may be logged on to the exercise health application of the first device and the second device, facilitating the first device and the second device to synchronize other data such as exercise data or dietary data.
In some embodiments, when the second device detects certain user operations acting on it, such as pressing, touch, voice, or gestures, it may pass the user operation collected by a sensor to the processor to generate a control instruction, and transmit the control instruction to the first device over the first connection through the communication device, so that the user can conveniently control the first device through the second device. For example, by operating the second device, the user can control functions of the first device such as starting or stopping operation, adjusting speed, and setting a moving target.
In the embodiment of the present application, the first device or the second device can support not only common moving targets such as distance targets, time targets, and calorie targets, but also diverse and interesting targets for the user, converting a moving target into a visual target, or even breaking it down into visual stage targets. For example, a food consumption target can be set according to the calories of food the user has eaten; when the user exercises, the food target can show a visual consumption progress of the food according to the user's amount of exercise. Alternatively, a scenic-area route can be provided; when the user exercises, the progress along the route can be displayed according to the user's amount of exercise, and the scenery of several scenic spots along the route can be displayed, with each spot serving as a stage target that encourages the user to keep exercising actively to unlock the next scenic spot.
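The food consumption target described above amounts to mapping the user's exercise calorie consumption onto the calorie content of a food item as a visual progress value. A minimal sketch, assuming calories are the linking quantity (function name and values are illustrative, not from the patent):

```python
# Illustrative: visual consumption progress of a food target, as described
# above - the user's burned calories are shown as a fraction of the food
# item's calories.

def food_target_progress(food_kcal: float, burned_kcal: float) -> float:
    """Return the consumption progress of a food target, clamped to [0, 1]."""
    if food_kcal <= 0:
        return 1.0  # nothing to burn off; target trivially complete
    return min(burned_kcal / food_kcal, 1.0)
```

For example, after burning 350 kcal against a 700 kcal food target, the displayed progress would be 50%.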
It should be noted that the communication system 10 shown in fig. 4 is only used to assist in describing the technical solutions provided in the embodiments of the present application, and does not limit other embodiments of the present application. In an actual service scenario, the communication system 10 may include more or fewer terminal devices; for example, it may include other smart wearable devices, such as a smart band or a smart watch, and the first device and/or the second device may be used together with them. Smart wearable devices may be configured with a variety of sensors, such as acceleration sensors, gyroscope sensors, and magnetic sensors, which may be used to detect and track user motion. The smart wearable device can communicate with the first device or the second device through short-range transmission technologies such as Wi-Fi, Bluetooth, NFC, and ZigBee, and can also communicate through wired connections such as a USB interface or a custom interface. The present application does not limit the types of terminals, the number of terminals, or the connection methods in the communication system 10.
For example, the smart watch may detect the user's athletic data, such as number of steps taken, length of time run, length of time swim, etc., and synchronize the user's athletic data to the treadmill and/or cell phone. Likewise, the treadmill and/or the cell phone may synchronize the detected user data to the smart watch. The multiple terminals are matched for use, so that more accurate detection of the motion data of the user can be realized.
In addition, when multiple terminals total up the user's exercise data, duplicate data detected in the same time period is eliminated. For example, a smart watch and a treadmill both detect the user's running data in the same time period, and each can time-stamp the exercise data when recording it. When they synchronize exercise data with each other, a single reasonable value can be taken from the exercise data bearing the same time stamp. The reasonable value may be the maximum of the multiple values (considering that some devices may have detection errors and miss some exercise data), or the average (to balance out errors); or, considering that the exercise data detected by the treadmill is more accurate, the treadmill's data may be taken directly as the final statistical exercise data, overriding the data detected by the smart watch in the same time period. Other calculation modes are also possible, and this application is not limited in this respect.
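The time-stamp-based reconciliation above can be sketched as follows. This is an illustrative sketch; the record format and strategy names are assumptions, and the three strategies correspond to the maximum, average, and preferred-device options mentioned in the text.

```python
# Illustrative: merge exercise data recorded by multiple devices,
# eliminating duplicates that share the same time stamp.

def merge_motion_data(records, strategy="max", preferred_device=None):
    """records: list of (timestamp, device, value) tuples.
    Returns {timestamp: reconciled_value}."""
    by_ts = {}
    for ts, device, value in records:
        by_ts.setdefault(ts, []).append((device, value))

    merged = {}
    for ts, entries in by_ts.items():
        values = [v for _, v in entries]
        if strategy == "max":            # trust the largest reading
            merged[ts] = max(values)
        elif strategy == "avg":          # balance out device errors
            merged[ts] = sum(values) / len(values)
        elif strategy == "prefer":       # trust one device (e.g. treadmill)
            preferred = [v for d, v in entries if d == preferred_device]
            merged[ts] = preferred[0] if preferred else max(values)
    return merged
```

For instance, with a watch reporting 95 kcal and a treadmill reporting 100 kcal for the same time stamp, the "prefer" strategy with the treadmill as the preferred device keeps 100 kcal.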
In conjunction with the embodiments illustrated in fig. 1-4 and described above, exemplary user interfaces associated with some embodiments of the present application are described below.
After a first connection is established between a first device (a smart sports device such as a treadmill) and a second device (a cell phone), the first device and the second device may transmit instructions, data, etc. to each other through the first connection. The first connection may be a bluetooth connection, a Wi-Fi P2P connection, etc., and the embodiment is not limited.
In order to enable a user to have a better exercise experience, related user interfaces can be added to the first device and the second device, so that the user can feel intuitive, and the user can conveniently perform functional management on the first device or the second device.
It should be understood that the user interfaces described in the following embodiments of the present application are only exemplary interfaces and are not intended to limit the present application. In other embodiments, different interface layouts can be used in the user interface, more or fewer controls can be included, and other functional options can be added or subtracted, which are within the scope of the present application based on the same inventive concept provided in the present application.
It is understood that, in the following example interfaces provided in the embodiments of the present application, the first device is a treadmill, and the second device is a mobile phone to display the relevant user interface, but this does not limit the present application. The following example interfaces may also be migrated for use on other types of devices, all within the scope of the present application, based on the same inventive concepts provided herein.
Referring to fig. 5, fig. 5 illustrates a user interface 500 displayed by the first device in some embodiments. In the user interface 500, the user can see the day's motion record and can set some motion-related functions.
As shown in fig. 5, a user avatar and name 501, a title bar 502, a history option 503, and a number of exercise-related function options are displayed in the user interface 500, such as an exercise time statistics option 504, an exercise calorie consumption statistics option 505, a set target option 506, a select exercise item option 507, a synchronize exercise data option 508, and a quick start/pause option 509.
The user avatar and name 501 show the avatar and name of the currently logged-in user account and are used to distinguish different users. Multiple users can use the first device, with different users corresponding to different accounts. This makes it convenient to count each user's amount of exercise separately, gives users a personalized, exclusive experience, better isolates the exercise data of different users, and protects users' data privacy. In the user interface 500, the current user is Alice.
The title of the current page is displayed in the title bar 502; here the title bar 502 reads "Today's sports record".
The history option 503 is an entry control for viewing history records. When the user clicks the history option 503, the user can view the exercise records counted today and earlier, such as exercise time and exercise calorie consumption, and can also have the exercise records analyzed by different periods, such as monthly, weekly, or yearly, for example counting the average amount of exercise, summarizing the exercise data, and giving analysis suggestions.
The exercise time statistics option 504 may display the length of today's exercise, which may be the total time of today's exercise, such as the 60 minutes shown in the user interface 500, or each exercise session may be given a counted time followed by a total duration. By clicking the exercise time statistics option 504, the user may manually modify the exercise time.
The exercise calorie consumption statistics option 505 may display the calories consumed by today's exercise, which may be the total calories consumed today, such as the 700 kcal shown in the user interface 500, or each exercise session may be given a consumed-calorie figure followed by a total. The user may manually modify the exercise calorie consumption by clicking the exercise calorie consumption statistics option 505.
The set target option 506 is an entry control for the user to set a moving target. The user can set a time target, a distance target, a calorie consumption target, a food consumption target, a movement route target, and the like. The interfaces and descriptions for setting moving targets are introduced below.
The select exercise item option 507 is an entry control for the user to select different exercise items. The first device may support multi-mode, multi-functional exercise assistance. For example, the treadmill may include modes such as flat running, climbing, variable-speed running, fast running, and jogging, and the user may freely select an exercise mode according to personal needs.
The synchronize exercise data option 508 is for the user to trigger the synchronization of exercise data between multiple devices. Synchronizing exercise data means summarizing the exercise data recorded by multiple devices (not limited to the first device and the second device) to generate unified, more complete exercise data. For example, the smart watch may detect the user's exercise data, such as the number of steps taken, running duration, and swimming duration, and synchronize that data to the treadmill and/or mobile phone; likewise, the treadmill and/or the mobile phone may synchronize the detected user data to the smart watch. Using multiple terminals together allows more accurate detection of the user's exercise data. In addition, when multiple terminals total up the user's exercise data, duplicate data detected in the same time period is eliminated, for example by taking, from the exercise data bearing the same time stamp, the maximum, the average, or the value from the device considered most accurate (such as the treadmill), as described above; other calculation modes are also possible, and this application is not limited in this respect.
The quick start/pause option 509 is a shortcut control for starting or stopping the operation of the equipment. For example, when the treadmill is not running, pressing the quick start/pause option 509 quickly starts the running belt and enters running mode; pressing the quick start/pause option 509 while the treadmill is running quickly stops the run.
Fig. 6 shows an illustrative user interface 600 on an exercise device while the user is exercising. If the exercise device is a treadmill, the user interface 600 illustrates a running-related interface. While the user is running on the treadmill, the treadmill may display an interface as shown in user interface 600.
As shown in fig. 6, a user avatar and name 601, a title 602, a return control 603, a synchronized motion data control 604, a motion progress icon 605, a motion time statistic option 606, a motion heat consumption statistic option 607, a speed option 608, a gradient option 609, adjustment controls 610a and 610b, a start control 611, a stop control 612, and the like are displayed in the user interface 600.
The user avatar and name 601 shows the avatar and name of the currently logged-in user account and is used to distinguish different users. In user interface 600 and subsequent interfaces, the current user is Alice.
The title of the current page is shown in title 602 as "running".
The return control 603 is used to indicate a return to the previous interface.
The synchronized motion data control 604 facilitates a user to quickly synchronize motion data.
The motion progress graphic 605 displays the current exercise progress. In the user interface 600, the motion progress graphic 605 shows the progress of the run by simulating a circular track, visualizing the distance run as progress around the track. As shown in fig. 6, one lap of the circular track is displayed, the position on the track corresponding to the user's current running distance is updated in real time, and every 400 meters the user runs counts as one completed lap.
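The lap arithmetic behind graphic 605 can be illustrated with a small sketch (the function name and return convention are assumptions for this illustration):

```python
def track_progress(distance_m: float, lap_m: float = 400.0):
    """Return (completed laps, fractional position on the ring, 0..1)."""
    laps = int(distance_m // lap_m)          # each 400 m is one full lap
    fraction = (distance_m % lap_m) / lap_m  # marker position on the ring
    return laps, fraction

print(track_progress(1000))  # (2, 0.5): two full laps, halfway round the third
```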
The exercise time statistics option 606 may display the length of the current exercise, such as a running length of 30 minutes as shown in the user interface 600.
The exercise caloric consumption statistic 607 may display the current exercise caloric consumption, such as 400 kcal for running as shown in user interface 600.
Speed option 608 is for the user to adjust the running belt speed; the current running speed shown in user interface 600 is 10.0 kilometers per hour.
Grade option 609 is for the user to adjust the gradient of the running belt; the current gradient shown in user interface 600 is 0.0 degrees.
Adjustment controls 610a and 610b are used for the user to increase or decrease options such as speed and gradient. In user interface 600, the user may decrease the speed by clicking the down control 610a and increase the speed by clicking the up control 610b.
The start control 611 is a control that the user turns on to run the treadmill. For example, when the treadmill is not running, the start control 611 is pressed to turn on the running belt to enter the running mode.
Stop control 612 is a control for the user to stop the treadmill. For example, pressing the stop control 612 while the treadmill is running stops the treadmill.
A second device such as a mobile phone may have the exercise health application installed, so that the user can conveniently check exercise-related information on the mobile phone and manage and set exercise-related functions. The following examples illustrate the relevant user interfaces.
Referring to fig. 7, fig. 7 illustrates a cell phone desktop interface, i.e., a user interface 700.
As shown in fig. 7, a top status bar 701, a search box 702, a desktop interface 703, and a bottom tray 705 are displayed in the user interface 700.
Among other things, the top status bar 701 may include a mobile signal indicator, a wireless network indicator, a power indicator, a time indicator, etc.
The search box 702 may be used for a user to search for applications or files, etc.
The desktop interface 703 may include a date component, a time component, a weather component, an application icon, and the like. In the user interface 700, a setting application icon, a photo application icon, a clock application icon, and an exercise health application 704 icon are displayed in the desktop interface 703. Clicking on the exercise health application 704 icon may enter the exercise health application, an example interface of which is described below.
The bottom tray 705 may hold icons of commonly used applications so that the user can open them quickly. In the user interface 700, the bottom tray 705 includes a phone application, a short message application, a browser application, and a music application.
After the user clicks on the athletic health application 704 icon, the cell phone may open the athletic health application, displaying an athletic health application home page, such as user interface 800 shown in fig. 8.
Referring to FIG. 8, the user interface 800 is the home page of the exercise health application and may include an exercise health heading 801, a today's exercise record column 802, a set goals option 803, a diet record option 804, a history option 805, an exercise suggestion option 806, a connected device column 807, a one-touch synchronized motion data control 808, a more health records option 810, and the like.
The today's exercise record column 802 shows statistics for the user's exercise that day, and may include exercise time, calories burned, and sports items. In the user interface 800, the record for user Alice today shows 30 minutes of exercise time, 300 kcal burned, and running as the sports item. The mobile phone may detect some data itself, such as the number of steps the user walks; it may also acquire user data from other sports equipment, such as the user's running data on the treadmill or step count on a sports bracelet, and aggregate the motion data acquired by each device into the final motion data.
The set goals option 803 is an entry control for the user to set an exercise target; clicking the set goals option 803 may display a user interface 1900 as shown in fig. 19, described later.
The diet record option 804 is an entry control for recording food data and counting calories ingested from food; clicking the diet record option 804 may display a user interface 1000 as shown in fig. 10, described later.
The history option 805 is an entry control for the user to view historical exercise data; clicking the history option 805 may display exercise statistics for the current day and earlier, and may also display average exercise data over a period (e.g., weekly, monthly, yearly).
The exercise suggestion option 806 is an entry control for exercise suggestions; the mobile phone can give corresponding suggestions based on the user's exercise data, diet records, and other information, helping the user make an exercise plan scientifically. Clicking the exercise suggestion option 806 may display a user interface 4000 as shown in fig. 40, described later.
The connected device column 807 may display a list of sports devices that have established a communication connection with the mobile phone and indicate their connection status. The list may show devices currently connected, historically connected devices, or detected but unconnected devices, such as the treadmill, sports bracelet, and spinning bike shown in fig. 8. The treadmill icon is currently displayed in a darker color, indicating that the treadmill and the mobile phone are connected; the sports bracelet icon is displayed in a lighter color, indicating that the bracelet is not currently connected to the mobile phone, and clicking the sports bracelet icon can quickly establish a connection. If the desired device is not in the connected device column 807, clicking the top-right control 809 can display more scanned sports devices.
The one-touch synchronized motion data control 808 may be used to quickly synchronize motion data between the cell phone and other motion devices.
The more health records option 810 displays other health function options, such as the sleep, weight, heart rate, and blood oxygen saturation options shown in fig. 8. Clicking a different option enters a more detailed function page. The mobile phone can acquire sleep data detected by the sports bracelet and remind the user at bedtime, helping the user form better sleep habits. The weight option records the user's weight and helps the user manage their figure; in some cases, the mobile phone can acquire weight or body fat data from a body fat scale or weighing scale without manual input by the user. The heart rate option records the user's heart rate; in some cases, the mobile phone can acquire heart rate data detected by the sports bracelet and, when the heart rate is abnormal, remind the user to rest or seek medical attention in time. The blood oxygen saturation option records the user's blood oxygen saturation; in some cases, the mobile phone can acquire blood oxygen saturation data detected by the sports bracelet and remind the user to seek medical attention in time when the saturation is too low.
Referring to fig. 9, fig. 9 is a schematic diagram of a user interface for a cell phone to initiate a pairing invitation to a treadmill.
Before the mobile phone has established a communication connection with the treadmill, after the user clicks the treadmill icon 811 in the connected device column 807 of the user interface 800, the mobile phone can quickly send a pairing invitation request to the treadmill.
When the treadmill receives the pairing invitation from the mobile phone, it may display a message reminder for the invitation, such as the interface shown in user interface 900. On the user interface 900, a pairing invitation prompt box 901 may be displayed containing a pairing invitation message, as shown in fig. 9: "Huawei P50 has sent you a pairing invitation. If pairing succeeds, exercise information can be quickly synchronized. Agree?". The pairing invitation prompt box 901 further comprises an "agree" control 902 and a "cancel" control 903: if the user clicks the "agree" control 902, the pairing invitation is accepted; if the user clicks the "cancel" control 903, the pairing invitation is rejected; in either case, a corresponding accept or reject message is returned to the requesting device.
After the treadmill and the mobile phone successfully establish a communication connection, they can synchronize motion data in real time, or the user can click the one-touch synchronized motion data control to trigger synchronization. The communication connection may be a Bluetooth connection, a Wi-Fi connection, etc., which is not limited in this embodiment.
A successfully connected sports device can be displayed with a distinguishing mark in the mobile phone's exercise health application; the connection status may be marked with different colors or icons, for example a highlighted icon indicating the device is connected and a gray icon indicating it is not.
The user may record calories ingested from food through the exercise health application. When the user clicks the diet record option 804 in the user interface 800, the user interface 1000 shown in fig. 10 may be displayed.
Referring to the user interface 1000 related to the food record shown in fig. 10, in the user interface 1000, a title bar 1001, a date selection control 1002, a calorie count window 1003, and a breakfast, lunch, dinner, and snack food list, a quick import order information control 1014, and the like may be displayed.
The title bar 1001 includes the title of the current page: a diet record, and a return control for instructing a return to the previous interface.
Date selection control 1002 may be used by the user to view diet records on different dates, such as today or another date; the user can quickly switch to the previous/next day's diet record by clicking the previous/next control.
The calorie statistics window 1003 displays calorie statistics, such as dietary intake calorie data 1004, exercise consumption calorie data 1005, basal metabolic calorie data 1006, remaining consumption calorie data 1007, and a calorie statistics graph 1008. The remaining calories to consume are obtained by subtracting the basal metabolic calories and the exercise consumption calories from the dietary intake calories. The user's basal metabolic calorie data may be calculated by the mobile phone from the user's height and weight, or entered by the user after professional measurement. The dietary intake calorie data is aggregated from the diet data the user enters in the breakfast, lunch, dinner, and snack food lists. The exercise consumption data is aggregated from motion data acquired by the mobile phone and motion data acquired from other exercise equipment.
In the user interface 1000, the current user's dietary intake is 3000 kcal, exercise consumption is 300 kcal, basal metabolism is 2000 kcal, and remaining consumption is 700 kcal. In some implementations, the calorie statistics graph 1008 is a circular graph for the user's convenience: the entire circle represents the basal metabolic calories, and the dark area represents the proportion of calories still to be consumed.
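The arithmetic behind window 1003 can be stated in one line; this sketch simply reproduces the subtraction described above, with the example figures from the interface (the function name is an assumption for illustration):

```python
def remaining_calories(intake_kcal: int, basal_kcal: int, exercise_kcal: int) -> int:
    """Remaining calories to consume = dietary intake - basal metabolism - exercise burn."""
    return intake_kcal - basal_kcal - exercise_kcal

# Figures shown in user interface 1000:
print(remaining_calories(3000, 2000, 300))  # 700
```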
The food intake and the corresponding calorie of the user are respectively displayed in the breakfast food list, the lunch food list, the dinner food list and the snack food list. A lunch title 1009, a food add control 1010, a single food description 1011 (including information such as a food graphic, a food name, a food quantity, etc.), a single food corresponding calorie 1013, a lunch total calorie 1012, etc., may be included in the displayed lunch food listing column as in user interface 1000.
As shown in the user interface 1000, breakfast includes 200 grams of bread, totaling 500 kcal; the lunch comprises 150 g of rice, 200 kcal of heat, 1 plate of fish-flavored shredded pork, 750 kcal of heat, and 950 kcal of total heat of the lunch; the supper comprises 1 hamburger, 900 kcal, 500 ml cola, 250 kcal, 1150 kcal in total; the snack includes 300 grams of grilled food, which has a total caloric value of 400 kcal. The user had a total food intake calorie of 3000 kcal for breakfast, lunch, dinner, and snack, and the dietary intake calorie data 1004 displayed the total dietary intake calorie of 3000 kcal.
The user may click on the food add control 1010 to add food, and the user interface for adding food may refer to the user interface 1100 shown in fig. 11.
When the user clicks the quick import order information control 1014, the user can choose to obtain ordering information from other applications (such as a third-party takeout application or an ordering application), so that the user's food information can be obtained quickly without manual one-by-one input.
The user interface 1100 shown in FIG. 11 is a schematic view of a user interface for adding lunch.
Referring to the add lunch related user interface 1100 shown in fig. 11, in the user interface 1100, a title bar 1101, a food search input box 1102, a voice search control 1103, a code scanning search control 1104, a photo search control 1105, a single item food list 1106, a package list 1112, a completion confirmation control 1113, and the like may be displayed.
The title bar 1101 includes the title of the current page: lunch is added, and a return control is used to instruct return to the last interface.
The food search input box 1102 may be used by the user to enter text, letters, or numbers to search for food. For example, if the user ate hand-pulled noodles for lunch, the user can enter "hand-pulled noodles" in the food search input box 1102 and search; the mobile phone searches the application food library for hand-pulled noodles, displays all results to the user, and the user can add the corresponding food and quantity as required. If the food is not in the application food library, the mobile phone can search the network for the food and its corresponding calories and update the library. The food library can be understood as a data table in the exercise health application that stores the correspondence between a plurality of foods and the calories they contain at a certain quantity, such as 50 kcal per 100 ml of cola.
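The library-then-network lookup described above can be sketched as follows. This is an illustrative assumption: the table entries, the `network_lookup` stub, and the function names stand in for the real library and online search, which this application does not specify.

```python
# (food name) -> (reference amount, kcal at that amount); illustrative entries.
FOOD_LIBRARY = {
    "cola": (100, 50),   # 50 kcal per 100 ml, per the example above
    "rice": (100, 130),  # approximate, for illustration only
}

def network_lookup(food: str):
    """Placeholder for the online search step; not a real API."""
    known = {"bread": (100, 250)}
    return known.get(food)

def calories_for(food: str, amount: float):
    entry = FOOD_LIBRARY.get(food)
    if entry is None:
        entry = network_lookup(food)     # fall back to the network
        if entry is None:
            return None                  # not found anywhere
        FOOD_LIBRARY[food] = entry       # update the library for next time
    ref_amount, ref_kcal = entry
    return amount / ref_amount * ref_kcal

print(calories_for("cola", 500))   # 250.0 kcal (500 ml at 50 kcal/100 ml)
print(calories_for("bread", 200))  # 500.0 kcal, found via the network fallback
```

A miss that the network resolves is cached, so the next query for the same food is answered locally.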
The voice search control 1103 can facilitate a user to search for food by voice. The user can speak the food he wants to search for after clicking on voice search control 1103. After the mobile phone acquires the voice message, identifying food and quantity in the voice message, searching corresponding food in the application food library, and if the food does not exist in the application food library, searching the food and corresponding heat in a network and updating the application food library.
The code scanning search control 1104 can facilitate a user searching for food in a code scanning manner. Some food packages are pasted with food codes such as two-dimensional codes or bar codes for identifying food information, and a user can start a camera to scan the food codes after clicking the code scanning search control 1104 to identify corresponding food information including information such as food names, quantities and calories.
The photo search control 1105 may facilitate a user searching for food via pictures. The user can take a picture of food when having a meal, and the mobile phone can intelligently identify the food and the quantity in the picture, so that the user can be helped to add food information more quickly. After the user clicks the photo search control 1105, the user interface 1200 shown in FIG. 12 may be displayed, as described below.
Food options common or frequently used by the user may be listed in the single item food list 1106. The single item food list 1106 includes a plurality of rows of food options; each row may include a food graphic and name 1108, a food quantity column 1109, an increase control 1110, a decrease control 1111, and the like. The user can enter the quantity of ingested food in the food quantity column 1109 corresponding to each food name; the unit may be an accurate measuring unit such as grams or milliliters, or a rough unit such as piece, bowl, or plate, selected by the user as appropriate. The user can also adjust the quantity by clicking the increase control 1110 and the decrease control 1111. If the food to be added is not found in the single item food list 1106, the more control 1107 can be clicked and the mobile phone will display more single food options. In the user interface 1100, the single food options shown in the single item food list 1106 are bread, rice, green pepper fried meat, hamburger, cola, milk tea, barbecue, and the like.
Package options common or familiar to the user may be listed in package list 1112. The package list 1112 includes rows of package options; likewise, each row may include a package illustration and name, a package quantity column, an increase control, a decrease control, and the like. The user can enter the number of packages ingested, in units of portions, in the package quantity column corresponding to each package name. The user can also adjust the quantity by clicking the increase and decrease controls. If the package to be added is not found in package list 1112, the more control 1114 can be clicked and the mobile phone will display more package options. In user interface 1100, the package options shown in package list 1112 are a pizza package (pizza, french fries, milk tea), a hamburger package (hamburger, french fries, cola), a fish-flavored shredded pork package (fish-flavored shredded pork, rice), a salad package, and the like.
After selecting each single food or package eaten for lunch, the user can click the completion confirmation control 1113 to notify the mobile phone to confirm the selection.
When having a meal, the user can click the photo search control 1105 to photograph the food, and the mobile phone can intelligently identify the food and its quantity in the picture, helping the user add food information more quickly. This function lets the user search for food quickly by inputting a picture, saving the trouble of text input.
Referring to the user interface 1200 relating to picture recognition shown in fig. 12, in the user interface 1200, a title bar 1201, a picture recognition box 1202, a food description section 1203, a party number selection section 1204, a confirmation control 1205, and the like may be displayed.
The title bar 1201 includes the title of the current page: picture recognition and a return control for indicating to return to the last interface.
The picture recognition box 1202 is used for a user to input pictures, the picture recognition box 1202 can display pictures taken by the user, and the mobile phone can intelligently detect and recognize food in the picture recognition box 1202.
The food description section 1203 lists the types and quantities of food in the picture recognition box 1202 as identified by the mobile phone, along with the estimated total calories. The user can check whether the identified food types and quantities are correct and, if not, click the "modification" control to correct them. As in user interface 1200, the food description section 1203 lists that the mobile phone recognized a packet of french fries, a hamburger, and a cup of cola in picture recognition box 1202, with total calories estimated at about 1500 kcal.
The diner number selection section 1204 allows the user to select the number of diners, which facilitates dividing calories among users sharing multiple foods. For example, when multiple people dine together, each person does not eat a separate portion but shares some of the foods, making individual calorie calculation difficult. In this case, the user can photograph all the foods at the gathering and then select the number of people sharing them in the diner number selection section 1204. After identifying the total calories of all foods in the picture, the mobile phone can process the data according to the number of diners and approximately estimate the calories ingested by each person, solving the difficulty of calculating an individual's calorie intake when several people dine together.
In some embodiments, the total calories for all food items may be divided by the number of people sharing a meal to obtain an average to estimate the amount of food ingested by each person.
In other embodiments, other ways of estimating each person's food intake may be used. For example, the user can enter the intake ratios of different diners manually. Alternatively, images of all diners can be acquired and food intake distributed according to each diner's build. Or, the first application may obtain user portraits of the diners and assign intake based on the portraits or on historical data of the different diners; for example, if a family of father, mother, and child dine together, the father's intake is the largest, about 1/3 more than the mother's, and the child's intake is about 1/2 of the mother's, and each person's intake is estimated accordingly.
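The two allocation schemes above (even split versus weighted split) can be sketched briefly. The function names and ratio values are assumptions for illustration; the ratios echo the father/mother/child example.

```python
def split_evenly(total_kcal: float, diners: int) -> float:
    """Even split: divide the recognized total by the number of diners."""
    return total_kcal / diners

def split_by_ratio(total_kcal: float, ratios):
    """Weighted split: allocate the total in proportion to per-diner ratios
    (which might come from user portraits or historical data)."""
    s = sum(ratios)
    return [total_kcal * r / s for r in ratios]

print(split_evenly(1500, 3))  # 500.0 kcal each
# Father eats ~1/3 more than mother, child ~1/2 of mother:
print(split_by_ratio(1500, [4, 3, 1.5]))
```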
After confirming that the picture recognition food is correct, the user can click the confirmation control 1205 to notify the mobile phone to confirm that the picture recognition of the food is completed.
In another embodiment, to count the calories of ingested food more accurately, the user can photograph the food before eating and again after eating; the mobile phone identifies the total amount of food before the meal and the remaining amount after the meal, and calculates the difference between the pre-meal and post-meal food calories as the calories ingested by the user.
Referring to the user interface 1300 shown in fig. 13, in the user interface 1300, a title bar 1301, a before-meal picture recognition box 1302, a after-meal picture recognition box 1303, a food description part 1304, a diner number selection part 1305, a confirmation control 1306, and the like may be displayed.
The title bar 1301 includes the title of the current page: picture recognition and a return control for indicating to return to the last interface.
The picture before meal recognition box 1302 is used for inputting a picture of food before meal by a user, the picture before meal recognition box 1302 can display a picture photographed by the user, and the mobile phone can intelligently detect and recognize the food in the picture before meal recognition box 1302.
The post-meal picture recognition box 1303 is used for inputting a picture of food after the user finishes eating, the post-meal picture recognition box 1303 can display a picture photographed by the user, and the mobile phone can intelligently detect and recognize the food in the post-meal picture recognition box 1303.
The food description part 1304 lists the types and quantities of foods identified by the mobile phone in the pre-meal picture recognition box 1302 and the post-meal picture recognition box 1303, along with the estimated pre-meal/post-meal calorie difference. The user can check whether the identified food types and quantities are correct and, if not, click the "modification" control to correct them. As in user interface 1300, food description part 1304 lists that the mobile phone recognized a packet of french fries, a hamburger, and a cup of cola in the pre-meal picture recognition box 1302, and 1/3 of a hamburger in the post-meal picture recognition box 1303, with an estimated pre-meal/post-meal calorie difference of about 1200 kcal.
The diner number selection section 1305 allows the user to select the number of diners, which facilitates dividing calories among users sharing multiple foods. For example, when multiple people gather for a meal, the user can photograph all the food at the table before the meal and the food remaining after the meal, and then select the number of people sharing the food in the diner number selection section 1305. After identifying the calorie difference between the pre-meal and post-meal pictures, the mobile phone can average over the number of diners and approximately estimate the calories ingested by each person, solving the difficulty of calculating an individual's calorie intake when several people dine together.
After confirming that the picture recognition food is correct, the user can click the confirmation control 1306 to notify the mobile phone to confirm that the picture recognition of the food is completed.
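The before/after estimate above reduces to a subtraction plus the optional per-diner division; the sketch below uses the figures from user interface 1300 (a packet of fries at 300 kcal, a hamburger at 900 kcal, a cup of cola at 300 kcal, and a leftover 1/3 hamburger). The function name and per-item calories are illustrative assumptions.

```python
def ingested_calories(pre_meal_kcal: float, post_meal_kcal: float,
                      diners: int = 1) -> float:
    """Calories actually ingested: recognized pre-meal total minus the
    recognized leftover, optionally averaged over the number of diners."""
    return (pre_meal_kcal - post_meal_kcal) / diners

pre_meal = 300 + 900 + 300   # fries + hamburger + cola, as recognized
post_meal = 900 / 3          # 1/3 of a hamburger left over
print(ingested_calories(pre_meal, post_meal))     # 1200.0, as in FIG. 13
print(ingested_calories(pre_meal, post_meal, 3))  # 400.0 per diner for three
```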
Regarding the method for calculating the food calorie in the picture by the user, an embodiment provided by the present application is described herein with reference to fig. 14.
Step (1): the mobile phone acquires the picture.
The picture acquired by the mobile phone may be a photograph taken by the user of the food whose calories are to be calculated, or a picture the user selects from the gallery, which is not limited in this embodiment. The picture may include one or more items of food. If the mobile phone detects that the picture does not contain any food target, it can remind the user to input the picture again.
In order to improve the accuracy of food detection, a user can take food pictures from multiple angles, such as from a view above the food, from a view to the side of the food, from a view to the front of the food, and so on. The mobile phone acquires the food pictures with multiple visual angles, so that the food can be identified more accurately.
Step (2): the mobile phone carries out image recognition processing on the picture, and divides a plurality of foods in the picture into a plurality of single food targets.
For example, the mobile phone can use image edge recognition to trace the edges of the various foods, thereby separating them into multiple single food targets.
And (3): the handset identifies the type and number of each individual food object.
The mobile phone can identify the type and quantity of each single food target. For example, the mobile phone can compare the picture of each single food with the pictures in the application food library one by one and select the food with the highest similarity. As another example, a food recognition model, i.e., a trained neural network model, may be preset in the mobile phone, and the mobile phone obtains a recognition result by inputting the picture of the single food target into the model.
The mobile phone can also detect the quantity of each single food target in the picture, including estimated weight, volume, number of portions, and the like. For example, the mobile phone can estimate the actual volume of an apple from the distance between the camera and the food when shooting and the apple's image size in the picture, estimate the apple's weight from its volume, and further calculate its calories.
In fig. 14, after performing image recognition on the picture, the mobile phone recognizes that the picture includes a portion of french fries, a hamburger, and a cup of cola. In a fast food restaurant, for example, french fries are usually sold in small, medium, and large portions; the mobile phone can estimate from the quantity or volume of fries in the picture that the portion is medium, and then query the calories of a medium portion in the application food library. Similarly, cola generally comes in small, medium, and large cups; the mobile phone can estimate from the cup's volume in the picture that it is a medium cup and query its calories in the application food library.
And (4): the mobile phone calculates the calorie of each single food target and then sums the total calories of the multiple foods in the picture.
After the mobile phone identifies each single food target, it can query the corresponding calories in the application food library. If the query fails, the mobile phone can take the picture of the single food to the Internet for an intelligent search; after the search completes, the result can be added to the application food library to facilitate the next query.
After obtaining the calories of each single food target, the mobile phone sums them to obtain the total calories of the multiple foods in the picture. As shown in fig. 14, if a packet of french fries is found to be 300 kcal, a hamburger 900 kcal, and a cup of cola 300 kcal, the total calories in the picture are about 1500 kcal.
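Steps (1) to (4) can be strung together in a minimal end-to-end sketch. The segmentation and classification stages are stubbed with hard-coded stand-ins for what an edge-detection or neural-network stage would return; the table entries, names, and portion labels are all illustrative assumptions, not part of this application.

```python
# Illustrative library entries: (food, recognized portion) -> kcal.
FOOD_CALORIES = {
    ("french fries", "medium"): 300,
    ("hamburger", "1"): 900,
    ("cola", "medium cup"): 300,
}

def segment(picture):
    """Step (2) stand-in: edge recognition would split the picture into
    single food targets; here the targets are given directly."""
    return picture["targets"]

def classify(target):
    """Step (3) stand-in: library matching or a recognition model would
    return the food type and portion estimate."""
    return target["name"], target["portion"]

def total_calories(picture) -> int:
    """Step (4): look up each target's calories and sum them."""
    total = 0
    for target in segment(picture):
        food, portion = classify(target)
        total += FOOD_CALORIES[(food, portion)]
    return total

picture = {"targets": [
    {"name": "french fries", "portion": "medium"},
    {"name": "hamburger", "portion": "1"},
    {"name": "cola", "portion": "medium cup"},
]}
print(total_calories(picture))  # 1500, matching the example in FIG. 14
```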
By clicking the quick import ordering information control 1014 in the user interface 1000, the user interface 1500 in fig. 15 may be displayed, and the user may choose to obtain ordering information from another application (e.g., a third-party takeaway application, an ordering application, etc.). The quick-import function can rapidly acquire the information of the food the user has eaten, saving the user the trouble of entering each food item manually. These third-party takeaway applications, ordering applications, and the like may grant the sports health application permission to obtain their ordering information.
Referring to fig. 15, a floating window 1501 is displayed in the user interface 1500. A plurality of other application icons 1502 can be displayed in the floating window 1501 for the user to select an application from which to acquire ordering information, such as a takeaway application, an ordering application, a group-purchase application, or any other application that contains the user's meal information. If the application the user wants is not displayed in the floating window 1501, the more control 1503 can be clicked and the mobile phone will list more applications. If the user clicks the cancel control 1504, the floating window 1501 is dismissed and the home page is displayed again.
The first device such as a treadmill can also be provided with an exercise health application, and the user can also add diet records and set exercise targets on the first device.
Fig. 16 shows an example user interface 1600 for adding a diet record on the first device.
Referring to the user interface 1600 related to the diet record shown in fig. 16, in the user interface 1600, a user avatar and name 1601, a title bar 1602, a date selection control 1603, breakfast, lunch, dinner, and snack food lists, a quick import ordering information control 1609, and the like may be displayed.
The title column 1602 includes the title of the current page: add a diet record, and a return control for instructing a return to the previous interface.
Date selection control 1603 may be used for the user to view diet records on different dates, such as today or other dates; the user can quickly switch to the previous/next day's diet record by clicking the previous/next control.
The breakfast, lunch, dinner, and snack food lists respectively display the food the user has eaten and the corresponding calories. As displayed in the user interface 1600, the lunch food list column 1604 may include a lunch title 1605, a food addition control 1606, individual food descriptions 1607 including information such as a food illustration, a food name, a food quantity, and the calories of the individual food, a lunch total calories 1608, and the like.
As shown in the user interface 1600, breakfast includes 200 grams of bread, totaling 500 kcal; lunch includes 150 grams of rice at 200 kcal and 1 plate of fish-flavored shredded pork at 750 kcal, for a lunch total of 950 kcal; dinner includes 1 hamburger at 900 kcal and 500 ml of cola at 250 kcal, for a dinner total of 1150 kcal; the snack includes 300 grams of grilled food, totaling 400 kcal. The user's total food intake for breakfast, lunch, dinner, and snacks is therefore 3000 kcal.
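The per-meal tally shown in user interface 1600 amounts to a nested sum. A minimal sketch, using the calorie figures from the example above:

```python
# Per-meal and daily calorie totals for the user interface 1600 example.
meals = {
    "breakfast": {"bread": 500},
    "lunch": {"rice": 200, "fish-flavored shredded pork": 750},
    "dinner": {"hamburger": 900, "cola": 250},
    "snack": {"grilled food": 400},
}

meal_totals = {name: sum(foods.values()) for name, foods in meals.items()}
daily_total = sum(meal_totals.values())
# lunch -> 950 kcal, dinner -> 1150 kcal, daily total -> 3000 kcal
```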
The user may click on the food add control 1606 to add food, and the user interface for adding food may refer to the user interface 1700 shown in fig. 17.
When the user clicks the quick import ordering information control 1609, the user can choose to obtain ordering information from other applications (such as a third-party takeaway application, an ordering application, etc.), so that the information of the food the user has eaten can be acquired quickly without manual entry item by item.
User interface 1700 shown in fig. 17 is a schematic view of a user interface for adding lunch on the first device.
Referring to the add lunch related user interface 1700 shown in fig. 17, in the user interface 1700, a user avatar and name 1701, a title bar 1702, a food search input box 1703, an individual food list 1704, a package list 1705, a completion confirmation control 1712, and the like may be displayed.
The title bar 1702 includes the title of the current page: lunch is added, and a return control is used to instruct return to the last interface.
The food search input box 1703 may be used for the user to input text, letters, or numbers to search for food. For example, if the user ate hand-pulled noodles for lunch, the user can type the dish into the food search input box 1703; the first device searches for it in the application food library and displays all search results, and the user can add the corresponding food and quantity as required. If the food is not in the application food library, the device can search the network for the food and its calories and update the application food library.
The right side of the food search input box 1703 also includes a voice search control, a code scanning search control, a photo search control, and the like.
The voice search control allows the user to search for food by voice. After clicking the voice search control, the user can speak the food to be searched. After the first device acquires the voice message, it recognizes the food and quantity in the message and searches for the corresponding food in the application food library; if the food is not in the application food library, it searches the network for the food and its calories and updates the application food library.
The code-scanning search control allows the user to search for food by scanning a code. In view of the inconvenience of scanning a code with a smart sports device, in some embodiments the first device may invoke the camera of a second device, such as a mobile phone, to assist with scanning. For example, after the user clicks the code-scanning search control, the mobile phone can start its camera to scan the food's code and identify the corresponding food information, including the food name, quantity, and calories, and then send this information to the first device.
The photo search control allows the user to search for food through pictures. In view of the inconvenience of photographing food with a smart sports device, in some embodiments the first device may invoke the camera of a second device, such as a mobile phone, to take the picture, or, while connected to the second device, select a picture from the second device's gallery. For example, the user can photograph and store a picture of the food at mealtime using the mobile phone; after the user clicks the photo search control, the mobile phone's gallery can be displayed in the user interface of the first device, the user selects the pictures containing the food, and the first device then intelligently identifies the food and quantities in the pictures, helping the user add food information more quickly.
Food options that are common or frequently eaten by the user may be listed in the single-item food list 1704. The single-item food list 1704 includes multiple rows of food options, and each row may include a food illustration and name 1706, a food quantity column 1707, an increase control 1708, a decrease control 1709, and the like. The user can input the quantity of the food eaten in the food quantity column 1707 corresponding to each food name; the quantity unit can be a precise unit of measure such as grams or milliliters, or a rough unit such as piece, bowl, or plate, and the user can choose the unit as appropriate. The user can also adjust the quantity by clicking the increase control 1708 and the decrease control 1709. If the food to be added is not found in the single-item food list 1704, the more control 1710 can be clicked and the first device will display more single-item food options. In the user interface 1700, the single-item food options shown in the single-item food list 1704 are bread, rice, green pepper fried pork, hamburger, cola, milk tea, pizza, french fries, ice cream, and the like.
Package options that are common or frequently eaten by the user may be listed in the package list 1705. The package list 1705 includes multiple rows of package options, and likewise each row may include a package illustration and name, a package quantity column, an increase control, a decrease control, and the like. The user can input the number of packages eaten, in portions, in the package quantity column corresponding to each package name, and can also adjust the quantity by clicking the increase and decrease controls. If the package to be added is not found in the package list 1705, the more control 1711 can be clicked and the first device will display more package options. In the user interface 1700, the package options shown in the package list 1705 are a pizza package (including pizza, french fries, and milk tea), a hamburger package (including a hamburger, french fries, and cola), a fish-flavored shredded pork set (including fish-flavored shredded pork and rice), a salad package, and the like.
After finishing selecting each food item or package eaten for lunch, the user may click the completion confirmation control 1712 to confirm the selection to the first device.
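The single-item and package entries above could be modeled as below. The dataclass, its field names, and the calorie figures are illustrative assumptions; a package is simply the sum of its component items:

```python
# Sketch of a food entry with a quantity, a unit (precise or rough), and a
# per-unit calorie figure; a package's calories are the sum of its items.
from dataclasses import dataclass

@dataclass
class FoodEntry:
    name: str
    quantity: float        # in the chosen unit
    unit: str              # precise ("g", "ml") or rough ("piece", "bowl", "plate")
    kcal_per_unit: float   # calories for one unit of this food

    @property
    def kcal(self) -> float:
        return self.quantity * self.kcal_per_unit

# A hamburger package as in package list 1705: hamburger + fries + cola.
package = [
    FoodEntry("hamburger", 1, "piece", 900),
    FoodEntry("french fries", 1, "portion", 300),
    FoodEntry("cola", 500, "ml", 0.5),   # 0.5 kcal/ml -> 250 kcal
]
package_kcal = sum(item.kcal for item in package)
```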
When the user clicks the quick import ordering information control 1609 in the user interface 1600 on the display screen of the first device, the user interface 1800 in fig. 18 may be displayed, and the user may choose to obtain ordering information from other applications (e.g., a third-party takeaway application, an ordering application, etc.). In some embodiments, the applications containing the ordering information are applications on a second device, such as a mobile phone, and the ordering information is likewise stored on the second device. The first device then needs to request the ordering information from the second device, and the user can select meal information on the second device and send it to the first device. The quick-import function can rapidly acquire the information of the food the user has eaten, saving the user the trouble of entering each food item manually. These third-party takeaway applications, ordering applications, and the like may grant the sports health application permission to obtain their ordering information.
Referring to fig. 18, a floating window 1801 is displayed in the user interface 1800, and a plurality of other application icons 1802 may be displayed in the floating window 1801 so that the user may select an application from which to acquire ordering information, such as a takeaway application, an ordering application, a group-purchase application, or any other application that contains the user's meal information. If the application the user wants is not displayed in the floating window 1801, the more control 1803 may be clicked and the first device will list more applications. If the user clicks the cancel control 1804, the floating window 1801 is dismissed and the home page is displayed again.
Fig. 19 to 28 show the user interfaces related to setting food consumption as an exercise target.
When the user clicks the set target control 803 in the user interface 800 shown in fig. 8, the mobile phone may display a user interface related to setting a target, such as the user interface 1900 shown in fig. 19. Fig. 19, 20, and 21 illustrate example user interfaces on a second device, such as a mobile phone, related to setting an exercise target.
Referring to the setting target related user interface 1900 shown in fig. 19, in the user interface 1900, a title bar 1901, a date selection control 1902, a calorie target setting bar 1903, a time length target setting bar 1904, a food consumption target setting bar 1907, a movement route target setting bar 1908, a confirmation control 1913, and the like may be displayed.
The title bar 1901 includes the title of the current page: a target is set, and a return control is used for indicating to return to the previous interface.
Date selection control 1902 may be used for the user to view or set targets for different dates, such as today, tomorrow, or other dates; the user can quickly switch to the previous/next day's target page by clicking the previous/next control.
The target-setting interface offers various target options for the user to select. The user may take the calories consumed or the exercise duration as the target.
The user may select a calorie consumption value as the exercise target in the calorie target setting column 1903, such as the 1000 kcal shown in the user interface 1900. If the user selects consuming 1000 kcal as the exercise target, the exercise device counts whether the calories consumed by the user's exercise reach the set target value. If the user's consumed calories have not reached the set target, the mobile phone or the exercise device can remind the user of the remaining progress and display motivational messages to encourage the user to finish the target; if the user's consumed calories reach the set target, the mobile phone or exercise device may display words of praise to give the user positive feedback. The user may click the switch control 1905 to toggle its on or off state, thereby selecting whether to set the consumed calories as the exercise target; in the user interface 1900, the switch control 1905 is off, indicating that the user has not currently selected the consumed calories as the exercise target.
The user may also select a duration value as the exercise target in the duration target setting column 1904, such as the 1.0 hour shown in the user interface 1900. If the user selects exercising for 1.0 hour as the target, the exercise device counts whether the user's exercise duration reaches the set target value. If the user's exercise duration has not reached the set target, the mobile phone or the exercise device can remind the user of the remaining duration and display motivational messages to encourage the user to complete the target; if the user's exercise duration reaches the set target, the mobile phone or exercise device may display words of praise to give the user positive feedback. The user may click the switch control 1906 to toggle its on or off state, thereby selecting whether to set the exercise duration as the target; in the user interface 1900, the switch control 1906 is off, indicating that the user has not currently selected the exercise duration as the exercise target.
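The remind-then-praise feedback described for both target types can be sketched as a single comparison against the set target. The message strings are illustrative, not this embodiment's wording:

```python
# Sketch of target progress feedback: remind of the remaining amount until
# the target is met, then give positive feedback.
def progress_message(consumed_kcal: float, target_kcal: float) -> str:
    if consumed_kcal < target_kcal:
        remaining = target_kcal - consumed_kcal
        return f"{remaining:.0f} kcal to go - keep it up!"
    return "Target reached - well done!"
```

The same shape applies to a duration target, with hours in place of kilocalories.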
In addition to the common targets of calories consumed or exercise duration, this embodiment also provides fun targets, which relieve the user's boredom during exercise by converting the amount of exercise into visual, concrete targets, such as food consumption progress or scenic routes, thereby improving the user's enthusiasm for exercise.
As shown in the user interface 1900, the fun targets may include a set food consumption target option and a cloud scenic-spot check-in option.
The food consumption target setting column 1907 displays the set food consumption target option and a switch control 1909. The user may click the switch control 1909 to toggle its on or off state, thereby selecting whether to set a food consumption target as the exercise target; in the user interface 1900, the switch control 1909 is on, indicating that the user has currently selected a food consumption target as the exercise target. At this time, sub-options may be displayed under the food consumption target setting column 1907: a set today's food intake as the consumption target option 1911, a set other food as the consumption target option 1912, and the like.
The user may choose the set today's food intake as the consumption target option 1911: the mobile phone then shows the user's diet records, and the user may choose to burn the calories of breakfast, lunch, or dinner as the exercise target, referring to the user interface 2000 shown in fig. 20. Alternatively, the user may choose the set other food as the consumption target option 1912, for example choosing to burn the calories of just one hamburger as the exercise target, as shown in the user interface 2100 in fig. 21.
The exercise route target setting column 1908 displays the cloud scenic-spot check-in option and a switch control 1910. The user may click the switch control 1910 to toggle its on or off state, thereby selecting whether to set an exercise route as the target; in the user interface 1900, the switch control 1910 is off, indicating that the user has not selected an exercise route as the target.
In some embodiments, the switch control 1905 for setting a calorie consumption target, the switch control 1906 for setting an exercise duration, the switch control 1909 for setting a food consumption target, and the switch control 1910 for setting an exercise route target may not be on at the same time; the user may select only one of them as the exercise target.
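The mutually exclusive switches behave like a radio-button group: turning one target on turns the others off. A minimal sketch, where the option names mirror the interface but the class itself is an illustrative assumption:

```python
# Sketch of mutually exclusive target switches: at most one may be on.
class TargetSwitches:
    OPTIONS = ("calories", "duration", "food_consumption", "route")

    def __init__(self):
        self.state = {option: False for option in self.OPTIONS}

    def toggle(self, option: str) -> None:
        if self.state[option]:
            self.state[option] = False       # switching off is always allowed
        else:
            # Switching on one target switches every other target off.
            self.state = {o: (o == option) for o in self.OPTIONS}

switches = TargetSwitches()
switches.toggle("calories")
switches.toggle("food_consumption")          # turns "calories" off
```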
The confirmation control 1913 is used for the user to click after selecting the exercise target, to confirm the selection result.
Fig. 20 shows an example user interface 2000 for setting today's food intake as the consumption target. In the user interface 2000, a title bar 2001, a meal list 2002, a selection control 2003, a confirmation control 2004, and the like can be displayed.
The title bar 2001 includes the title of the current page: set today's food intake as the consumption target, and a return control for instructing a return to the previous interface.
Also shown in the user interface 2000 are the user's diet records, such as the breakfast, lunch, and dinner meal lists, including meal food names and quantities, individual food calories, total calories per meal, and the like. For example, dinner contains 1 hamburger at 900 kcal and 500 ml of cola at 250 kcal, for a dinner total of 1150 kcal. On the right side of each meal list, a selection control 2003 is displayed; the user indicates the selection status by clicking the selection control 2003, for example highlighted for selected and not highlighted for not selected. In the user interface 2000, the selection control 2003 corresponding to the dinner column is selected, indicating that the user has selected burning the dinner intake as the consumption target. In some embodiments, the user may select more targets, for example selecting breakfast and dinner together as the consumption target.
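Combining several selected meals into one consumption target reduces to summing their totals. A short sketch, using the calorie figures from the diet-record example:

```python
# Sketch: the consumption target is the sum of the selected meals' calories.
meal_totals = {"breakfast": 500, "lunch": 950, "dinner": 1150}
selected = {"breakfast", "dinner"}            # e.g. both meals selected together
consumption_target = sum(meal_totals[m] for m in selected)   # 1650 kcal
```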
A confirmation control 2004 is used for the user to confirm the selection result.
Fig. 21 shows an example user interface 2100 for setting other food as a consumption target. In the user interface 2100, a title bar 2101, a food search input box 2102, a single-item food list 2103, a package list 2104, a confirmation control 2105, and the like may be displayed.
The title field 2101 includes the title of the current page: setting other food as consumption target, and a return control for indicating to return to the last interface.
The food search input box 2102 may be used for the user to input text, letters, or numbers to search for food. For example, the user may type corn into the food search input box 2102 and search; the mobile phone searches for corn in the application food library and displays all search results, and the user may add the corresponding food and quantity as needed. If the food is not in the application food library, the mobile phone can search the network for the food and its calories and update the application food library.
The right side of the food search input box 2102 also includes a voice search control, a code-scanning search control, and a photo search control. The voice search control allows the user to search for food by voice: after clicking it, the user can speak the food to be searched. The code-scanning search control allows the user to search for food by scanning a code. The photo search control allows the user to search for food through pictures, helping the user add food information more quickly.
Food options that are common or frequently eaten by the user may be listed in the single-item food list 2103. The single-item food list 2103 includes multiple rows of food options, and each row may include a food illustration and name, a food quantity column, an increase control, a decrease control, and the like. The user can input the quantity of the food eaten in the food quantity column corresponding to each food name; the quantity unit can be a precise unit of measure such as grams or milliliters, or a rough unit such as piece, bowl, or plate, and the user can choose the unit as appropriate. The user can also adjust the quantity by clicking the increase and decrease controls. If the food to be added is not found in the single-item list, the user can click the more control on the right, and the mobile phone will display more single-item food options.
Package options that are common or frequently eaten by the user may be listed in the package list 2104. The package list 2104 includes multiple rows of package options, and likewise each row may include a package illustration and name, a package quantity column, an increase control, a decrease control, and the like. The user may enter the number of packages eaten, in portions, in the package quantity column corresponding to each package name, and can also adjust the quantity by clicking the increase and decrease controls. If the package to be added is not found in the package list 2104, the more control on the right can be clicked and the mobile phone will display more package options. In the user interface 2100, the user has selected a hamburger as the consumption target.
The confirmation control 2105 is used for the user to confirm the selection result.
The user may also set the exercise target on a user interface of a first device, such as a treadmill.
Fig. 22, 23, and 24 illustrate example user interfaces on a first device, such as a treadmill, related to setting an exercise target. If the user clicks the set target option 506 in the user interface 500 shown in fig. 5, the first device may display the user interface 2200 shown in fig. 22.
Referring to the setting target related user interface 2200 shown in fig. 22, in the user interface 2200, a title bar 2201, a date selection control 2202, a calorie target setting bar 2203, a duration target setting bar 2204, a food consumption target setting bar 2205, a movement route target setting bar 2206, a confirmation control 2209, and the like may be displayed.
The title column 2201 includes the title of the current page: a target is set, and a return control is used for indicating to return to the previous interface.
Date selection control 2202 may be used for the user to view or set targets for different dates, such as today, tomorrow, or other dates; the user can quickly switch to the previous/next day's target page by clicking the previous/next control. The user can also set an exercise target for a future date in advance, for example setting tomorrow's exercise target, and the sports health application will remind the user in time to complete it tomorrow.
The target-setting interface offers various target options for the user to select. The user may take the calories consumed or the exercise duration as the target. In addition, the user can set a fun target that converts the amount of exercise into a visual, concrete target, such as food consumption progress or a scenic route, relieving the user's boredom during exercise and improving the user's enthusiasm for exercise.
The user may select a calorie consumption value as the exercise target in the calorie target setting column 2203, such as the 1000 kcal shown in the user interface 2200. If the user selects consuming 1000 kcal as the exercise target, the exercise device counts whether the calories consumed by the user's exercise reach the set target value. If the user's consumed calories have not reached the set target, the mobile phone or the exercise device can remind the user of the remaining progress and display motivational messages to encourage the user to finish the target; if the user's consumed calories reach the set target, the mobile phone or exercise device may display words of praise to give the user positive feedback. The user may click the switch control on the right to toggle its on or off state, thereby selecting whether to set the consumed calories as the exercise target; in the user interface 2200, this switch control is off, indicating that the user has not selected the consumed calories as the exercise target.
The user may also select a duration value as the exercise target in the duration target setting column 2204, such as the 1.0 hour shown in the user interface 2200. If the user selects exercising for 1.0 hour as the target, the exercise device counts whether the user's exercise duration reaches the set target value. If the user's exercise duration has not reached the set target, the mobile phone or the exercise device can remind the user of the remaining duration and display motivational messages to encourage the user to complete the target; if the user's exercise duration reaches the set target, the mobile phone or exercise device may display words of praise to give the user positive feedback. The user may click the switch control on the right to toggle its on or off state, thereby selecting whether to set the exercise duration as the target; in the user interface 2200, this switch control is off, indicating that the user has not selected the exercise duration as the exercise target.
As shown in the user interface 2200, the fun targets may include a set food consumption target option and a cloud scenic-spot check-in option.
The food consumption target setting column 2205 displays the set food consumption target option and a switch control. The user may click the switch control to toggle its on or off state, thereby selecting whether to set a food consumption target as the exercise target; in the user interface 2200, this switch control is on, indicating that the user has currently selected a food consumption target as the exercise target. At this time, sub-options may be displayed under the food consumption target setting column 2205: a set today's food intake as the consumption target option 2207, a set other food as the consumption target option 2208, and the like.
The user may choose the set today's food intake as the consumption target option 2207: the first device then shows the user's diet records, and the user may choose to burn the calories of breakfast, lunch, or dinner as the exercise target, referring to the user interface 2300 shown in fig. 23. Alternatively, the user may choose the set other food as the consumption target option 2208, for example choosing to burn the calories of just one hamburger as the exercise target, referring to the user interface 2400 shown in fig. 24.
The exercise route target setting column 2206 displays the cloud scenic-spot check-in option and a switch control. The user can click the switch control to toggle its on or off state, thereby selecting whether to set an exercise route as the target; in the user interface 2200, the cloud scenic-spot check-in switch control is off, indicating that the user has not selected an exercise route as the target.
In some embodiments, the switch control for setting a calorie consumption target, the switch control for setting an exercise duration, the switch control for setting a food consumption target, and the switch control for setting an exercise route target may not be on at the same time; the user may select only one of them as the exercise target.
The confirmation control 2209 is used for the user to click after selecting the exercise target, to confirm the selection result.
Fig. 23 shows an example user interface 2300 for setting today's food intake as the consumption target. In the user interface 2300, a title bar 2301, a meal list 2302, a selection control 2303, a confirmation control 2304, an add diet record control 2305, and the like can be displayed.
The title bar 2301 includes the title of the current page: set today's food intake as the consumption target, and a return control for instructing a return to the previous interface.
The user's diet records are shown in the user interface 2300, such as the breakfast, lunch, dinner, and snack lists, including meal food names and quantities, individual food calories, total calories per meal, and the like. For example, dinner contains 1 hamburger at 900 kcal and 500 ml of cola at 250 kcal, for a dinner total of 1150 kcal. On the right side of each meal list, a selection control 2303 is displayed; the user indicates the selection status by clicking the selection control 2303, for example highlighted for selected and not highlighted for not selected. In the user interface 2300, the selection control 2303 corresponding to the dinner column is selected, indicating that the user has selected burning the dinner intake as the consumption target. In some embodiments, the user may select more targets, for example selecting breakfast and dinner together as the consumption target.
A confirmation control 2304 is used for user confirmation of the selection result.
The user can click the add diet record control 2305 in the user interface 2300 to quickly jump to the diet record interface, i.e., the user interface 1000.
Fig. 24 shows an example user interface 2400 for setting other food as a consumption target. In the user interface 2400, a title bar 2401, a food search input box 2402, an individual food list 2403, a package list 2404, a confirmation control 2405, and the like may be displayed.
The title bar 2401 includes the title of the current page: setting other food as consumption target, and a return control for indicating to return to the last interface.
The food search input box 2402 may be used for the user to input text, letters, or numbers to search for food. For example, the user may type corn into the food search input box 2402 and search; the first device searches for corn in the application food library and displays all search results, and the user may add the corresponding food and quantity as desired. If the food is not in the application food library, the device can search the network for the food and its calories and update the application food library.
The right side of the food search input box 2402 also includes a voice search control, a code-scanning search control, and a photo search control. The voice search control allows the user to search for food by voice: after clicking it, the user can speak the food to be searched. The code-scanning search control allows the user to search for food by scanning a code. The photo search control allows the user to search for food through pictures, helping the user add food information more quickly. Considering that it is inconvenient for the user to take pictures with the first device, the first device can request to invoke the camera of the mobile phone to take a picture while connected to the mobile phone, or the first device can acquire pictures from the mobile phone's gallery.
Food options that are popular or frequently used by the user may be listed in the single-item food list 2403. The single-item food list 2403 includes multiple columns of food options; each column may include a food graphic and name, a food quantity column, an increase control, a decrease control, and so on. The user can input the quantity of the ingested food in the food quantity column corresponding to the food name shown in each column. The quantity unit can be a precise unit of measure such as grams or milliliters, or a rough unit such as piece, bowl, or plate, and the user can select the unit according to their own situation. The user can also adjust the quantity by clicking the increase and decrease controls. If the food to be added is not found in the single-item food list, the user can click the More control on the right, and the first device will display more single-item food options.
Package options that are popular or frequently used by the user may be listed in the package list 2404. The package list 2404 includes multiple columns of package options; similarly, each column may include a package graphic and name, a package quantity column, an increase control, a decrease control, and so on. The user can input the number of packages to be taken, in servings, in the package quantity column corresponding to the package name shown in each column. The user can also adjust the quantity by clicking the increase and decrease controls. If the package to be added is not found in the package list 2404, the user can click the More control on the right, and the first device will display more package options. In the user interface 2400, the user has selected a hamburger as the consumption target.
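Although the text does not spell out the arithmetic, the overall consumption target presumably sums calories over the selected single foods and packages, weighted by quantity. A minimal sketch under that assumption (the function name and data shape are invented):

```python
# Illustrative sketch: the consumption target is assumed to be the sum of
# calories over all selected foods and packages, weighted by quantity.

def consumption_target(selections):
    """selections: list of (calories_per_unit, quantity) pairs.

    Returns the total calorie target in kcal.
    """
    return sum(kcal * qty for kcal, qty in selections)
```

For example, one 900 kcal hamburger yields a 900 kcal target, while two 450 kcal items plus three 100 kcal items yield 1200 kcal.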
The confirmation control 2405 is used for the user to confirm the selection result.
After the user has selected a hamburger as the consumption target, the first device may display a user interface 2500 as shown in fig. 25.
Referring to fig. 25, a title bar 2501, a food consumption icon 2502, a food description 2503, a food consumption schedule 2504, a motion schedule 2505, a synchronized motion data control 2506, a fast start/pause control 2507, and the like may be displayed in the user interface 2500.
The title bar 2501 includes the title of the current page: a food consumption target, and a return control for instructing a return to a previous interface.
The food consumption schematic picture 2502 is a visualized illustration of the food consumption progress corresponding to the calories the user has currently consumed through exercise. In some embodiments, the portion of food that has been consumed may be displayed as a blurred image. In other embodiments, the portion of the image corresponding to the food that has been consumed may disappear. This embodiment does not limit the representation of the food consumption diagram: as long as the consumed portion is depicted differently from the unconsumed portion, the two can be clearly distinguished and the consumed portion clearly indicated. The food consumption schematic picture 2502 displays a diagram of the food consumption target selected by the user, which may be one or more single foods or packages, or a user-selected meal such as breakfast, lunch, or dinner that includes one or more foods.
In the user interface 2500, the food consumption schematic picture 2502 shows a diagram in which the upper half of a hamburger is blurred and the lower half remains the original image, indicating that the user's current amount of exercise has consumed 45% of the hamburger's calories.
In some embodiments, in order to promote the user's enthusiasm for exercise, the food consumption progress shown to the user may be set not to correspond to the food's actual calories, but to calories converted according to a certain ratio. This embodiment does not limit the value of the conversion ratio; the ratio may be greater than 1 or less than 1.
In one example, the ratio of the calories the user must actually burn to the food's actual calories may be less than 1. Exercise burns calories far more slowly than eating adds them: eating a hamburger may take 5 minutes and add 900 kcal, while burning 900 kcal by running may take 90 minutes. Requiring a user to run for 90 minutes to consume one hamburger takes too long and easily dampens the user's enthusiasm for exercise. If the user selects consuming the calories of one hamburger as the target, the calories corresponding to one hamburger can be reduced by a certain ratio, for example converted at 50 percent, and the result set as the actual exercise target. As the user exercises, the consumption progress of one hamburger's calories is displayed against the 450 kcal target. When the user's exercise consumption reaches 450 kcal, the user is informed that a hamburger's calories have been consumed. The exercise target is thus easier to achieve, which can improve the user's enthusiasm for exercise.
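The 50 percent conversion example above can be sketched as follows; the function name and signature are illustrative assumptions:

```python
# Sketch of the scaled-target example: the displayed progress treats the
# food's calories multiplied by a conversion ratio (0.5 here) as the
# actual exercise target, so the goal is reached sooner.

def displayed_progress(food_kcal, burned_kcal, ratio=0.5):
    """Fraction of the food shown as consumed, given calories actually burned.

    ratio < 1 makes the target easier: a 900 kcal hamburger counts as a
    900 * 0.5 = 450 kcal exercise target.
    """
    target = food_kcal * ratio
    return min(burned_kcal / target, 1.0)
```

With the section's numbers, burning 450 kcal completes the 900 kcal hamburger, and 202.5 kcal burned displays as 45% consumed.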
The food description 2503 describes the food and may include the food name, quantity, calories, and the like. As in the user interface 2500, the food description 2503 displays the text: one hamburger, total calories 900 kcal.
The food consumption schedule specification 2504 describes the food consumption progress and may include a food consumption percentage and the like. As in the user interface 2500, the food consumption schedule specification 2504 displays the text: 45% of the calories consumed! Keep it up!
The exercise schedule specification 2505 describes the user's exercise data, which may include exercise time, calories burned, and the like. As in the user interface 2500, the exercise schedule specification 2505 displays the text: accumulated exercise time 45 minutes, 450 kcal consumed.
The synchronized motion data control 2506 may facilitate a user quickly synchronizing motion data in the consume food target user interface while exercising or after exercising. Clicking the synchronized motion data control 2506 quickly synchronizes the current user's motion data.
The quick start/pause control 2507 is used for the user to quickly start or pause the operation of the first device. When the first device is not operating, the user can click the quick start/pause control 2507 to quickly start it. When the first device is operating, the user can click the quick start/pause control 2507 to quickly stop it.
Fig. 26 shows user interface 2600 as the goal for consuming food is completed.
Referring to fig. 26, a title bar 2601, a sharing control 2602, a food consumption icon 2603, a food consumption schedule 2604, a sports schedule 2605, a synchronized sports data control 2606, a quick start/pause control 2607, and the like may be displayed in the user interface 2600.
The title bar 2601 includes the title of the current page: a consume food target, and a return control for instructing to return to a previous interface.
The sharing control 2602 is used for the user to share the exercise situation to other social platforms or save it as a picture on a local device.
The food consumption schematic picture 2603 is a visualized illustration of the food consumption progress corresponding to the calories the user has currently consumed through exercise. In some embodiments, the portion of food that has been consumed may be displayed as a blurred image. In other embodiments, the portion of the image corresponding to the food that has been consumed may disappear. This embodiment does not limit the representation of the food consumption diagram: as long as the consumed portion is depicted differently from the unconsumed portion, the two can be clearly distinguished and the consumed portion clearly indicated.
In the user interface 2600, the food consumption schematic picture 2603 shows the full image of a hamburger blurred, indicating that the user's current amount of exercise has consumed the entire hamburger's calories.
The food consumption schedule specification 2604 describes the food consumption progress and may include a food name, quantity, calories, and the like. As in the user interface 2600, the food consumption schedule specification 2604 displays the text: Congratulations! You have consumed an entire hamburger's 900 kcal!
The exercise schedule specification 2605 describes the user's exercise data, which may include exercise time, calories burned, and the like. As in the user interface 2600, the exercise schedule specification 2605 displays the text: accumulated exercise time 90 minutes, 900 kcal consumed.
The synchronized motion data control 2606 may facilitate a user in quickly synchronizing motion data in the consume food goal user interface while or after exercising.
A quick start/pause control 2607 is used for a user to quickly start/pause the operation of the first device.
The user clicks on the share control 2602, and the first device can display the user interface 2700 shown in fig. 27.
User interface 2700 is a punch-card (check-in) interface shown after the user completes a food consumption goal. A punch-card title 2701, a completed food consumption goal schematic picture 2702, a completed food consumption goal caption 2703, sharing platform application icons 2704, and the like can be displayed in the user interface 2700.
The punch-card title 2701 may include a date and a title, as shown in the user interface 2700: Saturday, July 24, 2021, My Exercise Today.
The completed food consumption goal schematic picture 2702 displays an image of the food consumption goal the user has completed. In the user interface 2700, it shows that the food consumed by the user includes a packet of french fries, a hamburger, and a glass of cola.
The completed food consumption goal caption 2703 describes the completed goal and may include the food's name, quantity, calories consumed by exercise, exercise time, and the like. As in the user interface 2700, the caption 2703 displays the text: Today I exercised to consume a hamburger package, accumulated exercise time 150 minutes, 1500 kcal consumed!
A plurality of sharing platform application icons 2704 are also displayed in the user interface 2700; the user can click a sharing platform application icon 2704 to share the punch-card picture to a selected social platform or friend, such as Weibo, WeChat, QQ, or Facebook. The user can also click the save control to save the punch-card picture on a local device, such as the first device, or send it to a second device for saving through the first connection between the first device and the second device. The user may also share the punch-card picture to a social platform on the second device. For example, after the user finishes the exercise goal on the treadmill, a punch-card user interface for the achieved exercise goal may also be displayed on the mobile phone, and the user may operate the mobile phone to share the punch-card picture to a social platform or store it in the mobile phone gallery, with reference to the user interface 2700, which is not described again here.
FIG. 28 shows an example user interface 2800 in which the first device reminds the user to continue when it detects that a food consumption target set by the user is not yet complete. The reminder function can prompt the user to finish the exercise target in time.
A reminder window 2801 may be displayed in the user interface 2800, and a reminder description may be displayed in the reminder window 2801, such as: Dear user, we detected that you still have an unfinished food consumption goal. Continue?
A continue control 2802, a set new target control 2803, and the like are also displayed in the reminder window 2801. If the user clicks the continue control 2802, the interface jumps to the consume food target user interface, and the first device continues to detect the user's motion data. If the user clicks the set new target control 2803, the interface jumps to the set target user interface, and the user may reset the target.
Fig. 29 to 38 show the user interfaces related to setting a scenic spot route as the exercise target.
Fig. 29 and 30 show example user interfaces related to setting a movement route target on a second device, such as a mobile phone.
Referring to fig. 29, in the user interface 2900, the user may click the switch control 2902 corresponding to the punch-card cloud attraction option 2901 in the movement route target setting bar. The switch control 2902 being in the on state indicates that the user currently chooses to set a movement route as the exercise target. A sub-option is displayed under the punch-card cloud attraction option 2901: the select attractions option 2903. The user clicks the next control 2904 to the right of the select attractions option 2903, and the mobile phone can display the user interface 3000, shown in fig. 30, related to selecting attractions.
Referring to FIG. 30, a title bar 3001, a sight search input box 3002, a hot sight list 3003, a nearby sight list 3007, a confirmation control 3008, and the like may be displayed in the user interface 3000.
Among them, the title bar 3001 includes the title of the current page: a selection of a sight, and a return control for indicating a return to the previous interface.
The sight search input box 3002 may be used for a user to input text, letters, or numbers to search for sights. For example, the user may enter "West Lake" in the sight search input box 3002 and search; the mobile phone may search for West Lake scenic spots in the application sight library and display all search results to the user, for example a plurality of running routes containing the words "West Lake", and the user may select a movement route as desired. If the application sight library does not contain the sight input by the user, the sight and its corresponding sight map and movement route can be searched for on the network, and the application sight library updated. The application sight library can be understood as a data table in the sports health application that stores a plurality of sights and the movement route maps corresponding to them.
The right side of the sight search input box 3002 also includes a voice search control, a photographing search control, and the like. The voice search control can facilitate a user searching for a sight by voice: the user can speak the sight to be searched for after clicking the voice search control. The photographing search control can facilitate searching for a sight through a picture, helping the user find a sight from the features shown in the picture when the sight's name is unknown.
The hot sight list 3003 lists the sight names 3004 of a plurality of popular domestic and foreign sights. A selection control 3005 is displayed to the right of each sight name 3004, and the user indicates the selection state of a sight by clicking its selection control 3005: highlighted indicates selected, and not highlighted indicates not selected. In the user interface 3000, the hot sights displayed include Hangzhou West Lake, Xiamen Gulangyu, Changsha Orange Isle, Nanjing Xuanwu Lake Park, the Paris Seine riverside, and New York Central Park in the United States, and the selection control 3005 in the Hangzhou West Lake row is selected, indicating that the user selects Hangzhou West Lake as the movement route target. If the sight to be added is not found in the hot sight list 3003, the user can click the More control 3006 on the right, and the mobile phone will display more hot sight options.
The nearby sight list 3007 lists the sight names of multiple nearby sights, and the user may also select a nearby sight as the movement route. In the user interface 3000, the nearby sights displayed include Shenzhen Bay Seaside Park and Lotus Hill Park. If the sight to be added is not found in the nearby sight list 3007, the user can click the More control on the right, and the mobile phone will display more nearby sight options.
The confirmation control 3008 is used for the user to confirm the selection result.
After the user clicks the confirmation control 3008 to confirm selecting the Hangzhou West Lake scenic spot as the movement route, the user interface 3300 corresponding to the Hangzhou West Lake scenic route, shown in fig. 33, may be displayed on the first device.
The user may also set the movement route target on a user interface of a first device, such as a treadmill.
Fig. 31, 32 illustrate example user interfaces associated with setting an athletic route goal on a first device, such as a treadmill.
Referring to fig. 31, in the user interface 3100, the user may click the switch control 3102 corresponding to the punch-card cloud attraction option 3101 in the movement route target setting bar. The switch control 3102 being in the on state indicates that the current user chooses to set a movement route as the exercise target. A sub-option is displayed under the punch-card cloud attraction option 3101: the select attractions option 3103. By clicking the next control 3104 to the right of the select attractions option 3103, the first device may display the user interface 3200, shown in fig. 32, related to selecting attractions.
Referring to FIG. 32, a title bar 3201, a sights search input box 3202, a popular sights list 3203, a nearby sights list 3204, a confirmation control 3206, etc. may be displayed in the user interface 3200.
The title bar 3201 includes the title of the current page: the attraction is selected, and a return control is used to indicate a return to the previous interface.
The sight search input box 3202 may be used for a user to input text, letters, or numbers to search for sights. For example, the user may enter "West Lake" in the sight search input box 3202 and search; the first device may search for West Lake scenic spots in the application sight library and display all search results to the user, such as a plurality of running routes containing the words "West Lake", and the user may select a movement route as desired. If the application sight library does not contain the sight input by the user, the first device can connect to the network to search for the sight and its corresponding sight map and movement route, and update the application sight library. The application sight library can be understood as a data table in the sports health application that stores a plurality of sights and the movement route maps corresponding to them.
The right side of the sight search input box 3202 also includes a voice search control, a photographing search control, and the like. The voice search control can facilitate a user searching for a sight by voice: the user can speak the sight to be searched for after clicking the voice search control. The photographing search control can facilitate searching for a sight through a picture, helping the user find a sight from the features shown in the picture when the sight's name is unknown. Since it is inconvenient to take a picture with the first device, when the user clicks the photographing search control, the first device can call the camera of the mobile phone to take a picture, and the mobile phone sends the picture to the first device after taking it. Alternatively, when the user clicks the photographing search control, the first device can acquire and display a plurality of pictures from the mobile phone gallery, and the user can select the required picture on the first device.
The popular sight list 3203 lists the sight names 3205 of a plurality of popular domestic and foreign sights. A selection control is displayed to the right of each sight name 3205, and the user indicates the selection state of the corresponding sight by clicking its selection control: highlighted indicates selected, and not highlighted indicates not selected. In the user interface 3200, the popular sights displayed include Hangzhou West Lake, Xiamen Gulangyu, Changsha Orange Isle, Nanjing Xuanwu Lake Park, Beijing Olympic Forest Park, the Qingdao Trestle Bridge, the Paris Seine riverside, New York Central Park, and Roman Holiday, and the selection control in the Hangzhou West Lake row is selected, indicating that the user selects Hangzhou West Lake as the movement route target. If the sight to be added is not found in the popular sight list 3203, the user can click the More control on the right, and the first device will display more popular sight options.
The nearby sight list 3204 lists the sight names of a plurality of nearby sights, and the user may also select a nearby sight as the movement route. In the user interface 3200, the nearby sights displayed include Shenzhen Bay Seaside Park, Lotus Hill Park, and Dashahe Park. If the sight to be added is not found in the nearby sight list 3204, the user can click the More control on the right, and the first device will display more nearby sight options.
Confirmation control 3206 is used for user confirmation of the selection result.
After the user clicks the confirmation control 3206 to confirm that the Hangzhou West lake scenic spot is selected as the movement route, the user interface 3300 corresponding to the Hangzhou West lake scenic spot movement route shown in fig. 33 may be displayed on the first device.
It is understood that the schematic diagrams in this embodiment do not limit other embodiments. In other embodiments, the application may recommend movement routes based on the device type of the first device. For example, if the first device is a treadmill, scenic routes suitable for running are preferentially recommended; if the first device is a climbing machine, mountain scenic routes suitable for climbing can be preferentially recommended; if the first device is a spinning bike, road scenic routes suitable for riding can be preferentially recommended; and so on.
After the user selects the Hangzhou West lake scenic spot as the movement route target, the first device may display a user interface 3300 as shown in FIG. 33.
Referring to fig. 33, a title bar 3301, a switch view control 3302, a movement route schematic picture 3303, a route description 3304, a movement route progress description 3305, a synchronous movement data control 3306, a quick start/pause control 3307, a label description 3308, a movement data description 3309, and the like may be displayed in the user interface 3300.
The title bar 3301 includes the title of the current scenic spot page: hangzhou West lake and a return control for indicating a return to the previous interface.
The switch view control 3302 is used for the user to switch between the scenic map interface and the scene presentation interface. The user interface 3300 shows the scenic map interface; the sight map and the user's movement route can be seen in the user interface 3300. If the user clicks to switch to the scene presentation in the user interface 3300, the first device may display the user interface 3400 shown in fig. 34, which displays the scene presentation of a particular sight.
The movement route schematic picture 3303 is a visualized illustration of travel progress along the scenic route corresponding to the user's current amount of exercise. In some embodiments, starting from the starting point, the route already traveled is marked with a highlighted colored line. The movement route schematic picture 3303 in fig. 33 is only an example; this embodiment does not limit its representation.
The first device may mark the user's exercise distance on the movement route in the corresponding movement route schematic picture 3303. For example, if the user runs 5 km on the treadmill, the route segment corresponding to 5 km on the West Lake scenic route map is marked with a highlighted line.
In other embodiments, the first device may also convert the calories consumed by the user into distance along the movement route in the movement route schematic picture 3303. For example, if the user consumed 500 kcal on the treadmill, the 500 kcal may be converted into a distance of 5 km, and the route segment corresponding to 5 km on the West Lake scenic route map is marked with a highlighted line.
In other embodiments, in order to promote the user's enthusiasm for exercise, the route travel progress shown to the user can be set not to correspond directly to the calories consumed in actual exercise, but to be converted according to a certain ratio. This embodiment does not limit the value of the conversion ratio; the ratio may be greater than 1 or less than 1.
In one example, the ratio of the actual calories required to complete the illustrated movement route to the calories the user actually burns may be less than 1. Exercising on equipment such as a treadmill is relatively tedious, and scenic routes are often long (the loop around West Lake, for example, exceeds 10 km, and an ordinary user needs more than an hour to run it), so the duration is too long and easily dampens the user's enthusiasm for exercise. The user's actual movement distance, or actual calories consumed, on the treadmill can therefore be multiplied by a coefficient greater than 1, such as 1.5: when the user travels 1 km on the treadmill, this corresponds to a travel distance of 1.5 km in the movement route schematic picture 3303. The exercise target is thus easier to achieve, which can improve the user's enthusiasm for exercise.
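The coefficient-based conversion above can be sketched as follows, using the 1.5 coefficient and the roughly 11.5 km West Lake route from this section; the function name is an assumption:

```python
# Sketch of the route-progress conversion: actual treadmill distance
# multiplied by a coefficient (> 1 here) gives the distance shown on the
# scenic route map, capped at the route's total length.

def route_progress_km(actual_km, coefficient=1.5, route_km=11.5):
    """Distance shown as traveled on the route map."""
    return min(actual_km * coefficient, route_km)
```

So 1 km on the treadmill displays as 1.5 km of route progress, and once the scaled distance exceeds the route length, the route is shown as complete.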
In the user interface 3300, the movement route schematic picture 3303 shows a map of the Hangzhou West Lake scenic area with the famous sights marked on it. The route the user has traveled is marked with a highlighted line, and the user's current location is marked with a person icon. In the movement route schematic picture 3303, the user's movement route starts from the Broken Bridge and Melting Snow sight, passes Autumn Moon over the Calm Lake, Spring Dawn at Su Causeway, Three Pools Mirroring the Moon, Viewing Fish at Flower Pond, and Evening Bell at Nanping Hill, and arrives at the Leifeng Pagoda at Sunset sight. Each time the user arrives at a sight, the first device may play a voice, a video, a motion picture, or one or more pictures, show the user the scenery of the current sight, introduce related allusions, and announce the next sight to encourage the user to continue exercising.
The route specification 3304 describes the scenic movement route and may include the sight name, distance, expected calorie consumption, and the like. As in the user interface 3300, the route specification 3304 displays the text: the whole route is about 11.5 kilometers and is expected to consume 800 kcal.
The movement route progress specification 3305 describes the progress along the route and may include a traveled-distance percentage and the like. As in the user interface 3300, the movement route progress specification 3305 displays the text: 63% of the route completed! Victory is in sight!
The synchronized motion data control 3306 may facilitate a user in quickly synchronizing motion data in the user interface while or after exercising. The motion data of the current user can be quickly synchronized by clicking the synchronized motion data control 3306.
The quick start/pause control 3307 is used for the user to quickly start or pause the operation of the first device. When the first device is not operating, the user can click the quick start/pause control 3307 to quickly start it. When the first device is operating, the user can click the quick start/pause control 3307 to quickly stop it.
The callout specification 3308 explains some of the marker patterns in the movement route schematic picture 3303: for example, the small person icon represents the user's current location on the map, and the pin icon indicates that the marked location is a sight.
The exercise data specification 3309 describes the user's current exercise data, which may include exercise time, calories burned, and the like. As in the user interface 3300, the exercise data specification 3309 displays the text: accumulated exercise time 50 minutes, 500 kcal consumed.
When the user clicks the switch view control 3302 in the user interface 3300, the route map is switched to the scene display, the first device may display the user interface 3400 shown in fig. 34, and the user interface 3400 may display the scene display of the specific scenery spot.
Referring to fig. 34, a title bar 3401, a view switching control 3402, scene schematic pictures of a scene 3403, a current scene description 3404, a motion route progress description 3405, a next scene preview description 3406, a sharing control 3407, a group photo control 3408, a motion data description 3409, a synchronous motion data control 3410, a quick start/pause control 3411, and the like may be displayed in a user interface 3400.
The title column 3401 includes the title of the current scenic spot page: hangzhou West lake and a return control for indicating a return to the previous interface.
The switch view control 3402 is used for a user to select to switch between a scenic map interface and a scenic presentation interface. The user interface 3400 shows a scenery spot scenery presentation interface, and scenery presentation pictures and scenery introductions of scenery spots reached by a user can be seen on the user interface 3400. If the user clicks to switch to a roadmap in user interface 3400, the first device may switch back to display user interface 3300 shown in FIG. 33.
The sight scene schematic picture 3403 is a scene display picture of the sight the current user has reached on the travel route. This part is not limited to pictures; videos, motion pictures, and other forms may also be displayed, and audio can accompany them to explain the features, classic allusions, and the like of the current sight.
Displaying sight scenes to the user can raise the user's interest, relax the mood, and relieve the boredom of exercise. Exercising in combination with a virtual scenic route makes the user feel that the exercise is not merely tedious time on exercise equipment, but a scenic route completed in person, giving the user a more tangible sense of achievement.
In the user interface 3400, the sight scene schematic picture 3403 shows the scenery of the Leifeng Pagoda at Sunset sight; the Leifeng Pagoda, the sunset, and lush woods can be seen in the sight scene schematic picture 3403.
The current sight description 3404 describes the current sight; as in the user interface 3400, it shows that the sight currently reached is Leifeng Pagoda at Sunset. The sight description 3404 may also include a brief introduction of the sight, such as its features, classic events, and the like.
The movement route progress description 3405 describes the progress along the route and may include the percentage of distance traveled and the like. In the user interface 3400, the movement route progress description 3405 displays the text: You have run 63% of the route! Victory is in sight!
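A progress caption like the one above can be derived from the distance traveled and the total route length. The sketch below is illustrative only; the function name, the 50% threshold for the encouragement suffix, and the wording are assumptions, not part of the application:

```python
def route_progress_text(distance_done_km: float, total_km: float) -> str:
    """Format a progress caption like progress description 3405.

    Hypothetical helper; threshold and wording are illustrative.
    """
    pct = min(100, round(100 * distance_done_km / total_km))
    text = f"You have run {pct}% of the route!"
    if pct >= 50:
        text += " Victory is in sight!"
    return text

# 7.2 km done on an 11.5 km route rounds to the 63% shown in the text.
print(route_progress_text(7.2, 11.5))
```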
The next-attraction forecast 3406 forecasts the next attraction for the user and may include the name of the next attraction, the remaining distance, the expected calorie consumption, and the like. Announcing the next attraction effectively divides a long route into several small destinations, setting staged mini-goals for the user: when the next attraction is announced in advance, it becomes a new mini-goal, and since the distance between adjacent attractions is relatively short, the user can reach each goal easily and is strongly motivated to keep moving. In the user interface 3400, the next-attraction forecast 3406 displays the text: 3.8 kilometers to the next attraction, Orioles Singing in the Willows.
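One way to realize the staged mini-goals described above is to store the attractions with cumulative distance marks along the route and report the first one not yet reached. This is a hypothetical sketch; the attraction names, distance marks, and the kcal-per-km figure are illustrative assumptions:

```python
def next_attraction_forecast(attractions, distance_done_km, kcal_per_km=60):
    """Return the next attraction and the distance/calories still needed.

    `attractions` is a list of (name, cumulative_km) pairs along the route.
    All names and the kcal-per-km rate are illustrative assumptions.
    """
    for name, km_mark in attractions:
        if km_mark > distance_done_km:
            remaining = km_mark - distance_done_km
            return name, round(remaining, 1), round(remaining * kcal_per_km)
    return None  # route finished

route = [("Leifeng Pagoda", 7.2), ("Liulang Wenying", 11.0)]
print(next_attraction_forecast(route, 7.2))
```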
The sharing control 3407 may be used by the user to share the exercise results to other social platforms or save them on a local device.
The group-photo control 3408 may be used by the user to combine an image of the user with the current attraction scenery picture to generate a composite picture, and then share the composite picture to other social platforms or save it on a local device. For the user interface for sharing the exercise results, reference may be made to the user interface 3500 shown in fig. 35.
The exercise data description 3409 describes the user's current exercise data, which may include the user's exercise time, calories burned, and the like. In the user interface 3400, the exercise data description 3409 displays the text: 50 minutes of accumulated exercise time, 500 kcal burned.
The synchronize-motion-data control 3410 allows the user to quickly synchronize motion data in the user interface during or after exercise. Clicking the synchronize-motion-data control 3410 quickly synchronizes the current user's motion data.
The quick start/pause control 3411 is used by the user to quickly start or pause the operation of the first device. When the first device is not running, the user can quickly start it by clicking the quick start/pause control 3411. When the first device is running, the user can quickly stop it by clicking the quick start/pause control 3411.
Fig. 35 shows a user interface 3500 for the user to share exercise results. When the user clicks the group-photo control 3408, the user's image is combined with the current attraction scenery picture to generate a composite picture, which the user can then share to other social platforms or save on a local device.
As shown in fig. 35, the user interface 3500 may display a card title 3501, a person-and-scenery composite image 3502, an attraction description 3503, a finished-target description 3504, sharing-platform application icons 3505, and the like.
The card title 3501 may include a date and a title, as shown in the user interface 3500: Saturday, July 24, 2021, my exercise today.
The person-and-scenery composite image 3502 displays a composite of the person image provided by the user and the scenery image of the current attraction. The user's image may be captured with a camera of the first device or the second device after exercising, or selected from the user's gallery. After obtaining a picture containing the user's image, the first device can detect the edges of the user's figure, crop it out, and blend it into the scenery picture. In the user interface 3500, the person-and-scenery composite image 3502 displays a composite of the user's upper-body image and the Leifeng Sunset scenery photo.
The attraction description 3503 describes the displayed attraction information; for example, in the user interface 3500, the attraction description 3503 shows Leifeng Pagoda.
The finished-target description 3504 describes the completed exercise route, and may include the name of the checked-in attraction, the calories burned, the exercise time, and the like. In the user interface 3500, the finished-target description 3504 displays the text: Checked in today at Leifeng Pagoda in the Hangzhou West Lake scenic area, 50 minutes of accumulated exercise time, 500 kcal burned! Come exercise with me!
Several sharing-platform application icons 3505 are also displayed in the user interface 3500. The user can click a sharing-platform application icon 3505 to share the check-in picture to a selected social platform or friend, such as Weibo, WeChat, QQ, or Facebook. The user can also click the save control to save the check-in picture on a local device, such as the first device, or send it through the first connection between the first device and the second device to the second device for saving.
Similar to the user interface 3500, the user may also share the check-in picture to a social platform on the second device. For example, after the user completes the exercise target on the treadmill, a check-in user interface for the achieved target can be displayed synchronously on the mobile phone; the user can take and upload a portrait on the phone, the phone composites the scenery picture with the portrait to generate a check-in picture, and the user can then share the check-in picture to a social platform or save it in the phone's gallery. Reference may be made to the user interface 3500, and details are not repeated here.
Fig. 36 shows a user interface 3600 displayed when the exercise route target is completed.
Referring to fig. 36, a title bar 3601, a movement route schematic picture 3602, a route description 3603, a movement route progress description 3604, a sharing control 3605, an exercise data description 3606, and the like can be displayed in the user interface 3600.
The title bar 3601 includes the title of the current page, Hangzhou West Lake, and a return control for returning to the previous interface.
The movement route schematic picture 3602 is a visualization of travel progress along the scenic route corresponding to the user's current amount of exercise. In the movement route schematic picture 3602, the distance the user has traveled is marked with a highlighted line on the map, and the user's current position is marked with a figure icon; here, the user has completed a full circuit of the West Lake scenic area.
The route description 3603 describes the scenic exercise route, and may include the attraction names, the distance, the estimated calorie consumption, and the like. In the user interface 3600, the route description 3603 displays the text: about 11.5 kilometers in total, expected to burn 800 kcal.
The movement route progress description 3604 describes the progress along the route. In the user interface 3600, the movement route progress description 3604 displays the text: You have checked in at all the attractions! 900 kcal burned in total! Awesome!
The sharing control 3605 is used by the user to share the exercise results, in the form of a picture, to other social platforms or save them on a local device.
The exercise data description 3606 describes the user's exercise data, which may include the user's exercise time, calories burned, and the like. In the user interface 3600, the exercise data description 3606 displays the text: 80 minutes of accumulated exercise time, 900 kcal burned.
When the user clicks the sharing control 3605, the first device may display the user interface 3700 shown in fig. 37.
The user interface 3700 is the user's check-in interface after completing the exercise route target. A check-in title 3701, a movement route schematic picture 3702, an exercise data description 3703, sharing-platform application icons 3704, and the like can be displayed in the user interface 3700.
The check-in title 3701 may include a date and a title, as shown in the user interface 3700: Saturday, July 24, 2021, my exercise today.
The movement route schematic picture 3702 displays the exercise route the user has completed. In the user interface 3700, the highlighted line in the movement route schematic picture 3702 circles the West Lake once, indicating that the user has run a full circuit of the West Lake scenic area.
The exercise data description 3703 describes the user's completed exercise, and may include the name of the checked-in attraction, the calories burned, and the exercise time. In the user interface 3700, the exercise data description 3703 displays the text: I ran a circuit of the West Lake scenic area today, 80 minutes of accumulated exercise time, 900 kcal burned!
Several sharing-platform application icons 3704 are also displayed in the user interface 3700. The user can click a sharing-platform application icon 3704 to share the check-in picture to a selected social platform or friend, such as Weibo, WeChat, QQ, or Facebook. The user can also click the save control to save the check-in picture on a local device, such as the first device, or send it through the first connection between the first device and the second device to the second device for saving.
Similar to the user interface 3700, the user may also share the check-in picture to a social platform on the second device. For example, after the user completes the exercise target on the treadmill, a check-in user interface for the achieved target can be displayed synchronously on the mobile phone; the user can operate the phone to generate a check-in picture and then share it to a social platform or save it in the phone's gallery. Reference may be made to the user interface 3700, and details are not repeated here.
Fig. 38 shows an illustrative user interface 3800 in which the first device prompts the user to continue when it detects that the user has not completed the set exercise route target. This reminder function can prompt the user to finish the exercise target in time.
A reminder window 3801 may be displayed in the user interface 3800, and a reminder message may be displayed in the reminder window 3801, such as: Hi, we noticed your Hangzhou West Lake circuit run is not finished yet. Continue?
A continue control 3802, a set-new-target control 3803, and the like are also displayed in the reminder window 3801. If the user clicks the continue control 3802, the interface may jump to the exercise route user interface, and the first device may continue to detect the user's exercise data. If the user clicks the set-new-target control 3803, the interface may jump to the set-target user interface, and the user may reset the target.
Fig. 39 shows a user interface 3900 that reminds the user to complete the remaining targets.
The mobile phone can remind the user that the remaining exercise targets are unfinished and can give exercise suggestions to encourage the user to move. The phone can give the user clear, useful suggestions, such as recommending an exercise mode and an estimated duration based on the user's unfinished calorie target.
The user interface 3900 may display a reminder window 3901, and the reminder window 3901 may display a reminder message, such as: Friendly reminder: today's exercise target is not finished, with 400 kcal still to burn. We suggest running for 30 minutes, walking for 60 minutes, or rope skipping for 40 minutes.
A control 3902 is also displayed in the reminder window 3901. After clicking the control 3902, the user can exit the current reminder window 3901, return to the home page, or jump to an interface related to the exercise target.
The mobile phone can also display exercise suggestions for the user on an exercise suggestion page; for example, it can recommend an exercise mode and an estimated duration based on data such as the user's daily food intake calories, basal metabolic value, and exercise calorie consumption. Reference may be made to the user interface 4000 shown in fig. 40.
When the user clicks the exercise health control 806 on the exercise health home page, i.e., the user interface 800, the mobile phone may display the user interface 4000 shown in fig. 40. An exercise suggestion title 4001, a calorie statistics picture 4002, an exercise suggestion description 4003, and the like may be displayed in the user interface 4000.
The calorie statistics picture 4002 displays calorie statistics, which may include dietary intake calories, exercise calorie consumption, basal metabolic calories, remaining calories to burn, and a calorie statistics chart. The remaining calories to burn are obtained by subtracting the basal metabolic calories and the exercise calorie consumption from the dietary intake calories. The user's basal metabolic data can be calculated by the mobile phone from the user's height and weight, or entered by the user after professional measurement. The dietary intake data can be summarized from the diet records the user enters in the breakfast, lunch, dinner, and snack food lists. The exercise calorie consumption data can be summarized from the exercise data collected by the mobile phone and the exercise data obtained from other exercise devices.
In the calorie statistics picture 4002, the user's current dietary intake is 3000 kcal, exercise consumption is 300 kcal, basal metabolic calories are 2000 kcal, and 700 kcal remain to be burned. The calorie statistics chart is a circular chart: the full circle represents the basal metabolic calories, and the dark area represents the proportion of the calories still to be burned.
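The arithmetic behind the remaining-calorie figure can be sketched as follows. This is a minimal illustration of the subtraction described above; clamping the result at zero is an added assumption, not stated in the application:

```python
def remaining_kcal(intake, basal, exercise):
    """Remaining calories to burn = dietary intake - basal metabolism - exercise.

    Clamped at zero (an assumption): once intake is fully offset,
    there is nothing left to burn.
    """
    return max(0, intake - basal - exercise)

# Figures from the example in the text: 3000 kcal eaten, 2000 kcal basal
# metabolism, 300 kcal already burned through exercise.
print(remaining_kcal(3000, 2000, 300))  # → 700
```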
The exercise suggestion description 4003 shows exercise suggestions obtained by the mobile phone from analyzing the user's calorie data; a suggestion may be a single exercise mode or a combination of modes. In the user interface 4000, knowing that the user still has 700 kcal to burn, the suggestions given are running for 60 minutes, swimming for 40 minutes, rope skipping for 90 minutes, or spinning for 40 minutes, and the like.
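The suggested durations are consistent with dividing the remaining calories by a per-mode burn rate. The rates below are back-computed from the example figures and are purely illustrative; real values would depend on the user's weight and intensity:

```python
# Hypothetical per-minute burn rates (kcal/min), back-computed from the
# example so that 700 kcal maps to the durations quoted in the text.
BURN_RATES = {"running": 11.7, "swimming": 17.5,
              "rope skipping": 7.8, "spinning": 17.5}

def suggest_durations(remaining):
    """Convert a remaining-calorie target into minutes per exercise mode."""
    return {mode: round(remaining / rate) for mode, rate in BURN_RATES.items()}

print(suggest_durations(700))
```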
In some embodiments, if the first device or the second device detects that the user starts exercising shortly after a meal, a health prompt may be given, such as the user interface 4100 shown in fig. 41. The health prompt helps the user avoid, as far as possible, the physical discomfort of exercising right after a meal. The first device or the second device may take the timestamp of a food picture taken by the user as the meal time, may estimate the time at which the user added a diet record as the meal time, or may use a typical meal slot as the meal time; for example, lunch is usually between 11:30 and 12:30.
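The post-meal check described above reduces to comparing the exercise start time against an estimated meal time. A minimal sketch, assuming a one-hour window (the helper name and window length are illustrative):

```python
from datetime import datetime, timedelta

def should_warn_post_meal(meal_time: datetime, start_time: datetime,
                          window: timedelta = timedelta(hours=1)) -> bool:
    """Return True when exercise starts within `window` after the meal.

    `meal_time` may come from a food photo timestamp, the time a diet
    record was added, or a typical meal slot (e.g. lunch 11:30-12:30).
    """
    return timedelta(0) <= start_time - meal_time < window

meal = datetime(2021, 7, 24, 18, 30)
print(should_warn_post_meal(meal, datetime(2021, 7, 24, 19, 0)))  # → True
```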
Referring to fig. 41, a prompt window 4101 may be displayed in the user interface 4100, showing, for example: Friendly reminder: we noticed you finished dinner less than an hour ago. For your health, we suggest letting your food digest before exercising.
In other embodiments, both the food consumption picture and the scenic exercise route map may be displayed in the user interface. The food consumption status can thus be shown side by side with the scenic route status, making the exercise target clearer and more intuitive. For example, the user can see at a glance how far on the map he or she needs to move to burn off a hamburger.
The embodiment of the present application does not limit the layout of the user interface. In some embodiments, the food consumption picture and the scenic exercise route map are displayed in the same interface. In the user interface 4200 shown in fig. 42, a food consumption schematic picture 4201 (see the food consumption schematic picture 2502 in the user interface 2500) is displayed in a first area of the user interface, e.g., the left half; a scenic area exercise route schematic picture 4202 (see the movement route schematic picture 3303 in the user interface 3300) is displayed in a second area, e.g., the right half; and an exercise data description 4203 describing the user's current exercise may also be displayed, for example displaying the text: 50 minutes of accumulated exercise time, 500 kcal burned. In this way, the user can simultaneously and intuitively see the food consumption progress and/or the route progress fed back in real time during exercise. The first area and the second area may overlap. In still other embodiments, the food consumption picture and/or the scenic exercise route map may be given a certain transparency.
In other embodiments, the food consumption picture and the scenic exercise route map may be displayed alternately, for example showing the food consumption picture for one minute, then the scenic exercise route map for one minute, and then switching back to the food consumption picture.
In some embodiments, the mobile phone may analyze the user's exercise pattern from the user's historical exercise data and recommend exercise items and calorie-consumption targets to the user. For example, if statistics show that the user's usual exercise mode is running, averaging 40 minutes per day over the previous week, the exercise suggestion and target the phone recommends for the day may be running for 40 minutes.
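Recommending from history can be as simple as picking the most frequent exercise mode and averaging its recent durations. A hypothetical sketch, with the record shape as an assumption:

```python
from statistics import mean

def recommend_from_history(history):
    """Recommend the user's most frequent mode at its recent average duration.

    `history` is a list of (mode, minutes) records, e.g. from the past week.
    """
    modes = [mode for mode, _ in history]
    favourite = max(set(modes), key=modes.count)
    minutes = [m for mode, m in history if mode == favourite]
    return favourite, round(mean(minutes))

# The user mostly ran last week, averaging 40 minutes per session.
week = [("running", 35), ("running", 45), ("walking", 30), ("running", 40)]
print(recommend_from_history(week))  # → ('running', 40)
```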
In some embodiments, the user interface displayed by the first device may be a screen-cast interface from the second device. The mobile phone shown in fig. 19 displays a user interface 1900 related to setting a target; through the first connection, the phone casts the application interface onto the display screen of the treadmill, and the treadmill's display screen shows the user interface 2200 shown in fig. 22. The interface elements in the user interface 2200 shown on the treadmill correspond one to one to the interface elements in the user interface 1900 shown on the phone, though the layouts may differ. The user can interact with the application on the phone or on the treadmill's display screen. After acquiring the user's input, the treadmill's display screen can send it to the phone, and the phone refreshes the interface according to that input.
In some embodiments, a privacy statement-related user interface may be added.
In some embodiments, a novice tutorial-related user interface may also be added.
It is to be understood that the above example interfaces are described with either the first device or the second device as the display subject, which does not limit other embodiments. Some user interfaces shown on the first device may also have a corresponding user interface on the second device, and some user interfaces shown on the second device may also have a corresponding user interface on the first device. For example, regarding the interface for adding a food record, a user interface 1000 as shown in fig. 10 may be shown on the mobile phone and a user interface 1600 as shown in fig. 16 may be shown on the treadmill, so that the user may add the food record on either the phone or the treadmill.
It should be understood that the user interfaces described in fig. 5 to 41 are only exemplary interfaces for helping the reader understand the technical solutions of the present application, and do not limit the user interfaces of other embodiments of the present application. In other embodiments, different user interfaces may be used to help the user set engaging exercise targets; user interfaces or controls may be added or removed, or different human-machine interactions may be designed, so that the user interface better suits the user experience.
In conjunction with the embodiments shown in fig. 1 to 42, a method for setting an exercise target according to an embodiment of the present application is described below.
This method embodiment is described taking the first device as the executing body. The first device is a smart exercise device, which may be a treadmill, an elliptical machine, a spinning bike, a rowing machine, a climbing machine, or the like; this application does not limit this.
In some embodiments, a first communication system is composed of the first device and the second device; the first communication system is, for example, the communication system 10 shown in fig. 4. Working together, the first device and the second device can provide consumers with dedicated, multifunctional, personalized, more engaging, and more convenient services.
A first connection is established between the first device and the second device, over which they may communicate. The first connection may be wired or wireless; this embodiment does not limit it. The first device may exchange data or instructions with the second device over the first connection. The user may enter information on the second device and synchronize it to the first device, and may issue control instructions to the first device through user operations on the second device. The second device may be a mobile phone, a tablet computer, a notebook computer, a cloud host/cloud server, a desktop computer, a laptop computer, a handheld computer, an artificial intelligence device, a smart television, a vehicle-mounted device, a game console, or the like; this embodiment does not limit it.
The first device or the second device may be equipped with any of various operating systems. The operating system of each terminal device in the first communication system may be the same or different, and this application does not limit this. In some embodiments, when both the first device and the second device in the first communication system run the same operating system, the system composed of the plurality of terminals may be called a super virtual terminal, also called a super terminal.
An exercise health application, which may also be called the first application, can be installed on the first device or the second device. The exercise health application can acquire the user's exercise data collected by the first device and/or the second device, such as the user's running time, distance, and calorie consumption collected by the treadmill, and the user's step count collected by the mobile phone, and can aggregate the exercise data collected by each device to generate the final exercise data. In some embodiments, the same user account may be logged in to the exercise health application on both the first device and the second device, which makes it convenient for the two devices to synchronize exercise data or other data, such as diet data.
In other embodiments, the first communication system may further include a third device, which may be, for example, a smart band or a smart watch. For example, the smart watch may detect the user's exercise data, such as step count, running duration, and swimming duration, and synchronize it to the treadmill and/or the mobile phone. Likewise, the treadmill and/or the phone may synchronize the user data they detect to the smart watch. Using multiple terminals together enables more accurate detection of the user's exercise data. When the multiple terminals aggregate the user's exercise data, duplicate data detected in the same time period can be eliminated. For example, when the smart watch and the treadmill both detect the user's running data in a time period, each can timestamp the exercise data it records. When they synchronize exercise data with each other, one reasonable value can be selected from the exercise data with the same timestamp. The reasonable value may be the maximum of the multiple readings (considering that some devices may miss some exercise in detection), or the average (to balance out errors); or, considering that the exercise data detected by the treadmill is more accurate, the treadmill's data may be taken directly as the final statistic, overriding the smart watch's data for the same time period; other calculation methods are also possible, and this application does not limit this.
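The de-duplication strategy that prefers one device's readings over another's for the same time period could be sketched as follows. The device names, record shape, and priority order are assumptions for illustration; the text equally allows taking the maximum or the mean instead:

```python
def merge_motion_data(records, priority=("treadmill", "watch", "phone")):
    """Merge per-interval calorie records from several devices.

    `records` maps device -> {interval_id: kcal}. For intervals reported
    by more than one device, keep the reading from the highest-priority
    device (here the treadmill, assumed most accurate).
    """
    merged = {}
    for device in reversed(priority):           # low priority first ...
        merged.update(records.get(device, {}))  # ... so high priority overwrites
    return sum(merged.values())

records = {
    "watch": {"09:00": 95, "09:10": 100},
    "treadmill": {"09:00": 105},  # overlaps the watch's 09:00 interval
}
print(merge_motion_data(records))  # → 205
```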
The user may also record and manage exercise data, diet data, or personal sign data through the exercise health application; for example, the user may record food intake calories through the exercise health application.
It will be appreciated that the user may interact on any device on which the first application is installed. Data such as diet records and set targets entered by the user in the first application can be synchronized to other devices.
In some embodiments, when the second device detects certain user operations acting on it, such as presses, touches, voice, or gestures, it may pass the user operations collected by its sensors to the processor to generate a control instruction, and transmit the control instruction to the first device through the first connection via its communication module, so that the user can conveniently control the first device through the second device. For example, by operating the second device, the user can start or stop the first device, adjust its speed, set the exercise target, and so on.
The embodiments provided by this application can integrate the software and hardware capabilities of different devices to provide users with a multifunctional, visual, and more engaging exercise experience.
Embodiment one
In embodiment one, the user may select a food consumption target as the exercise target. The food consumption target may be embodied as a visualized food image. The user can set a food consumption target according to food intake, and as the user exercises, the target can present a visual image of the food consumption progress according to the user's amount of exercise.
Fig. 43 is a flowchart of the method for setting an exercise target according to embodiment one, which specifically includes the following steps:
S101, the first device acquires a first food entered by the user in the first application.
The user may record food intake in the first application, recording the type, amount, and so on of the food consumed for breakfast, lunch, dinner, and snacks; the first application can calculate the user's food intake calories according to the type and amount of food entered.
In some embodiments, the user may add food information on the display screen of the first device (e.g., a treadmill); see the embodiments described with reference to figs. 16 and 17. In other embodiments, the user may add food information on the display screen of the second device (e.g., a mobile phone), and the information the user adds in the first application on the second device may then be synchronized to the first application on the first device, as in the embodiments described with reference to figs. 10 and 11. Details are not repeated here.
The user can add food in various ways, and this embodiment does not limit them. For example, the user can select a single food item or a food package on the food list interface, enter characters, letters, or numbers in the food search input box to search for food, or add food by voice search, code-scanning search, photo search, and the like.
The first application includes an application food library, which makes it convenient for the user to look up the calories corresponding to a food. The application food library can be understood as one or more data tables storing the correspondence between foods and the calories they contain at a given quantity, such as 50 kcal per 100 ml of cola.
For example, if the user eats hand-pulled noodles for lunch, the user can enter them in the food search input box and search; the first device or the second device can look them up in the application food library and display all search results to the user, who can add the corresponding food and quantity as needed. If the food is not in the application food library, the device can search the network for the food and its calories and update the application food library.
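The look-up-then-fall-back-to-network flow could be modeled as below. The table contents and the `fetch` callback are stand-ins for the application food library and the network search; all figures except the cola example are illustrative:

```python
# A minimal sketch of the "application food library": a table mapping food
# names to kcal per 100 g/ml, with a stubbed network fallback for misses.
FOOD_LIBRARY = {"cola": 50, "hand-pulled noodles": 110}  # kcal per 100 ml/g

def lookup_kcal(food: str, amount: float, fetch=None):
    """Return calories for `amount` (g or ml), updating the library on a miss."""
    if food not in FOOD_LIBRARY and fetch is not None:
        FOOD_LIBRARY[food] = fetch(food)  # search the network, then cache
    return FOOD_LIBRARY[food] * amount / 100

print(lookup_kcal("cola", 500))                        # → 250.0
print(lookup_kcal("apple", 150, fetch=lambda f: 52))   # → 78.0
```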
Voice search makes it convenient for the user to find food by speaking. After clicking the voice search control, the user can say the food to search for. After acquiring the voice message, the first device recognizes the food and quantity in it and looks up the corresponding food in the application food library; if the food is not in the library, it searches the network for the food and its calories and updates the application food library.
Code-scanning search lets the user find food by scanning a code. Food codes marking food information, such as two-dimensional codes or barcodes, appear on the packaging of some foods; after clicking the code-scanning search control, the user can start the camera to scan the food code and identify the corresponding food information, including the name, quantity, calories, and the like.
Photo search lets the user find food through pictures. The user can photograph the food at mealtime, and the first device or the second device can intelligently recognize the food and quantity in the picture, helping the user add food information more quickly. See the embodiments described with reference to figs. 12 and 13.
In some embodiments, to facilitate calculating the calorie intake of food by a single person when multiple persons have meals, the user may enter the number of people having meals at the same time as entering food. For example, the user can take a picture of all the food items at a party and then select the number of people sharing the food items at the party number selection section. The first device or the second device can perform data processing according to the number of people having meals after the total calorie of all foods in the picture is identified, and can approximately estimate the calorie taken by each person, so that the problem that the calculation of the calorie taken by a single person when a plurality of persons have a dinner can be solved. That is, the first application may provide the option of selecting the number of people at a meal, and if the first food is shared by N people, then selecting N among the options of the number of people at a meal will result in the food intake for a single person to eat.
In some embodiments, the total calories of all food items may be divided by the number of diners to obtain an average estimate of the amount of food ingested by each person.
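As an illustrative sketch of the even split described above (the function name and figures are assumptions for illustration, not taken from the embodiment):

```python
def calories_per_person(total_kcal: float, num_diners: int) -> float:
    """Evenly split the recognized total calories among the diners."""
    if num_diners < 1:
        raise ValueError("at least one diner is required")
    return total_kcal / num_diners

# e.g. a party meal recognized as 2400 kcal shared by 4 people
share = calories_per_person(2400, 4)  # 600.0 kcal per person
```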
In other embodiments, other ways of estimating each person's food intake may be used. For example, the user can manually enter the intake ratios of the different diners. Alternatively, images of all the diners can be obtained, and the food intake can be distributed according to each diner's build. Alternatively, the first application may obtain user profiles of the diners and assign intake based on those profiles or on the diners' historical data. For example, for a meal shared by a father, a mother, and a child, the father eats the most, about 1/3 more than the mother, and the child eats about 1/2 as much as the mother; the intake of each person is then estimated from these profiles.
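One possible sketch of the profile-based allocation, using the father/mother/child proportions from the example above as relative weights (the function name, the weight values, and the 1700 kcal total are illustrative assumptions):

```python
def split_by_weights(total_kcal: float, weights: dict[str, float]) -> dict[str, float]:
    """Distribute the total calories in proportion to per-person weights
    derived from user profiles or historical meal data."""
    total_weight = sum(weights.values())
    if total_weight <= 0:
        raise ValueError("weights must sum to a positive value")
    return {name: total_kcal * w / total_weight for name, w in weights.items()}

# Weights mirroring the example: mother = 1, father = 1 + 1/3, child = 1/2
shares = split_by_weights(1700, {"father": 4 / 3, "mother": 1.0, "child": 0.5})
# father ~800 kcal, mother ~600 kcal, child ~300 kcal
```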
In other embodiments, to count the calories taken in by the user more accurately, the user may take one picture of the food before eating and another after eating. The first device or the second device can identify the total amount of food before the meal and the amount remaining afterwards, and take the difference between the two calorie values as the calories taken in by the user.
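The before/after difference described above reduces to a single subtraction; a minimal sketch (names and figures are illustrative assumptions):

```python
def calories_ingested(kcal_before: float, kcal_after: float) -> float:
    """Calories actually eaten: calories recognized in the pre-meal picture
    minus calories recognized in the leftovers picture."""
    if kcal_after > kcal_before:
        raise ValueError("leftovers cannot exceed the original amount")
    return kcal_before - kcal_after

eaten = calories_ingested(900.0, 250.0)  # 650.0 kcal ingested
```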
In some embodiments, the first application may obtain ordering information from other applications; refer to the user interfaces shown in fig. 15 and 18. The user may choose to obtain order information from other applications (e.g., third-party take-away or ordering applications), which may grant the first application permission to access that information. This quick import of ordering information retrieves the user's meal information at once and saves the user the trouble of entering food items one by one.
S102, the first device acquires that the exercise target set by the user in the first application is consuming the first food.
The user may set an exercise target in the first application; in this embodiment, the exercise target is consuming the first food. The first food may be a single food or may include multiple foods; this embodiment is not limited in this regard. For example, the user can select a single hamburger as the calorie-consumption exercise target, or select a lunch (containing multiple foods) as the exercise target, and so on.
In some embodiments, the user may set the food consumption target on a display screen of the first device (e.g., a treadmill); refer to the embodiments described with reference to fig. 22, 23, and 24. In other embodiments, the user may set the food consumption target on a display screen of a second device (e.g., a mobile phone), and the information added by the user in the first application on the second device may then be synchronized to the first application on the first device, as described with reference to the embodiments of fig. 19, 20, and 21. Details are not repeated here.
In some embodiments, the user may directly set consuming the first food as the exercise target, where the first food is preset in the application food library and is not food the user actually ate. In that case, step S101 may be omitted, and the user does not need to enter food in the first application before setting the exercise target of consuming the first food.
S103, the first device displays a first user interface, and a first image of the first food is displayed in the first user interface.
After the user selects consuming the first food as the exercise target, the first device may display the first image of the first food.
Before the first device has detected any motion data of the user, it may display a complete schematic image of the first food.
S104, the first device acquires a first exercise amount of the user.
In some embodiments, the user exercises with the first device, and the first device may directly acquire the user's exercise data, which may include exercise time, exercise distance, calories burned, and the like, collectively referred to as the exercise amount. For example, a treadmill can acquire data such as the user's running time and running distance and roughly estimate the corresponding calories burned.
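The embodiment does not specify how the device estimates calories from running data; a common rough approximation (assumed here purely for illustration) is the MET formula kcal = MET × weight(kg) × hours:

```python
def estimate_run_kcal(weight_kg: float, minutes: float, met: float = 9.8) -> float:
    """Rough calorie estimate for treadmill running using the widely used
    MET formula. A MET value of ~9.8 (roughly 9.7 km/h running) is an
    assumption; the patent does not prescribe a formula."""
    return met * weight_kg * (minutes / 60.0)

# A 70 kg user running for 30 minutes burns roughly 343 kcal
kcal = estimate_run_kcal(70, 30)
```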
Since different exercise amounts may be associated with the amount of food consumed, the exercise amount is not limited herein and may be indicated by exercise time, exercise distance, calories burned, and the like.
S105, the first device displays a second user interface in which a second image of the consumption proportion of the first food is displayed. The consumption proportion of the first food is determined by the first exercise amount of the user. The second image includes a first portion indicating the consumed part of the first food and a second portion indicating the remaining unconsumed part of the first food, and the area of the first portion increases as the user's exercise amount increases.
Referring to the embodiment shown in fig. 25, the first device may display a food consumption diagram of the first food, which visualizes the food consumption progress corresponding to the calories the user has currently burned. The consumption proportion of the first food is determined by the user's exercise amount: the larger the exercise amount, the larger the consumption proportion of the first food.
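A minimal sketch of mapping the calories burned so far to the displayed consumption proportion (the function name and the clamping to [0, 1] are assumptions):

```python
def consumed_fraction(burned_kcal: float, target_kcal: float) -> float:
    """Fraction of the first food drawn as consumed, clamped to [0, 1]
    so the image never shows more than the whole food eaten."""
    if target_kcal <= 0:
        raise ValueError("target calories must be positive")
    return min(max(burned_kcal / target_kcal, 0.0), 1.0)

# Burning 225 kcal against a 900 kcal hamburger shows a quarter consumed
frac = consumed_fraction(225, 900)
```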
In some embodiments, the consumed portion of the food may be displayed as a blurred image. In other embodiments, the part of the image corresponding to the consumed portion may disappear or be rendered transparent. This embodiment does not limit how the food consumption diagram is represented, as long as the consumed and unconsumed portions are displayed differently so that the two can be clearly distinguished and the consumed portion is clearly indicated.
During the user's exercise, the first device may continuously display, in real time, an image of the first food being consumed.
In some embodiments, to boost the user's motivation to exercise, the food consumption progress shown to the user may be set not to correspond to the actual calories of the food, but converted in proportion. That is, the ratio of the actual caloric value needed to consume the first portion of the first food to the caloric value of the first exercise amount is a first ratio. The value of the first ratio is not limited in this embodiment; it may be greater than 1 or less than 1.
In one example, the value of the first ratio may be less than 1, because eating adds calories far faster than exercise burns them: eating a hamburger may take 5 minutes and add 900 kcal, while burning 900 kcal by running may take 90 minutes. Requiring the user to run 90 minutes to consume one hamburger takes too long and easily dampens the user's motivation. If the user chooses to burn the calories of one hamburger, the actual exercise target may be set to the hamburger's calories reduced by a certain proportion, for example converted at 50%. As the user exercises, the exercise consumption progress is displayed against 450 kcal for one hamburger, and when the user has burned 450 kcal, the user is informed that a hamburger's worth of calories has been consumed. The exercise target is thus easier to reach, which can improve the user's motivation to exercise.
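The 50% conversion in the example above can be sketched as a simple scaling (the function and parameter names are illustrative assumptions):

```python
def scaled_target_kcal(food_kcal: float, conversion: float) -> float:
    """Exercise target after the proportional conversion described above.
    A 900 kcal hamburger converted at 50% yields a 450 kcal target."""
    if conversion <= 0:
        raise ValueError("conversion ratio must be positive")
    return food_kcal * conversion

target = scaled_target_kcal(900, 0.5)  # 450.0 kcal to burn
```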
As the user continues to exercise, the consumed portion of the first food displayed in the user interface increases and the remaining unconsumed portion decreases, until the first food is fully consumed and the user's exercise target is achieved.
In some embodiments, the user may generate a picture of the exercise amount and the consumption of the first food, share it to social platforms, or save it to the local device. When the first device detects a sharing operation by the user, it may display a sharing interface in which an image and/or a text description of the consumed first food is displayed. Reference may be made to the embodiment shown in fig. 27.
In the first embodiment, the user's training target is measured in terms of food familiar to the user, so the exercise target is clearer during exercise, namely burning the calories of the selected food. At the same time, the exercise effect is displayed visually on the user interface, so the user does not feel bored while exercising on the equipment, which improves the user's motivation to exercise.
Embodiment two
In the second embodiment, the user can select a scenic-area exercise route as the exercise target. The first application can provide a plurality of scenic-area routes. As the user exercises, the user's travel progress along the route can be drawn according to the exercise amount, and when the user reaches scenic spots along the route, scenery pictures or introductions of those spots can be displayed as stage targets to encourage the user to keep exercising to unlock the next spot.
Fig. 44 is a flowchart of a method for setting an exercise target according to the second embodiment, which specifically includes the following steps:
S201, the first device acquires that the exercise target set by the user in the first application is a first movement route of a first region.
The user may set an exercise target in the first application; in this embodiment, the exercise target is a first movement route of the first region. The first region may be a scenic area, a park, or another user-selected region. The first application may provide exercise routes for famous scenic areas nationwide or worldwide, or may acquire the user's current geographic location and recommend local or nearby scenic routes; this embodiment is not limited in this regard. The user can select a scenic route as the exercise target in the first application. Of course, a scenic area is only an example; any other region may be used. For example, the user may select an arbitrary area on the map and the first application may help the user plan the movement route, or the user may specify the route personally; this embodiment is not limited.
In some embodiments, the user may select a scenic movement route on a display screen of the first device (e.g., a treadmill); refer to the embodiments described with reference to fig. 31 and 32. In other embodiments, the user may select the route on the display screen of the second device (e.g., a mobile phone), and the information added by the user in the first application on the second device may then be synchronized to the first application on the first device, as described with reference to fig. 29 and 30. Details are not repeated here.
The user may search for scenic areas in the first application, and each scenic area may provide a plurality of movement routes. For example, the user can enter "West Lake" in the scenic-area search input box; the first device or the second device then searches the application scenic-area library for West Lake and displays all the results to the user, such as several running routes containing the words "West Lake", from which the user can select a route as needed. If the library does not contain the scenic area the user entered, the scenic area and its corresponding map and movement routes can be searched on the network and the library updated. The application scenic-area library can be understood as a data table in the sports-health application that stores a plurality of scenic areas and the movement route maps corresponding to them.
The first application can also let the user search for scenic areas by voice or by picture. For example, after tapping a voice search control, the user may speak the scenic area to be searched; the first application performs speech recognition on the voice and then searches the application scenic-area library. The photo search function lets the user search for scenic areas through pictures, helping the user find an area from the features shown in a picture when its name is unknown. For example, if the first application recognizes that the image input by the user contains Leifeng Pagoda, it can identify the scenic area shown in the image as the West Lake scenic area.
In some embodiments, the first region is derived from region information in a second application, such as a travel/scenic-area recommendation application, a map application, or a navigation application. The second application may grant the first application permission to obtain the region information. This quick import of region information retrieves it at once and saves the user the trouble of entering it manually.
In some embodiments, the first application may recommend movement routes based on the device type of the first device. For example, if the first device is a treadmill, scenic routes suitable for running are recommended preferentially; if the first device is a climbing machine, mountain routes suitable for climbing can be recommended preferentially; if the first device is a spinning bike, scenic road routes suitable for riding can be recommended preferentially, and so on.
S202, the first device displays a third user interface, and a map of the first region is displayed in the third user interface.
After the user selects the first movement route of the first region as the exercise target, the first device may display a map image of the first region.
Before the first device has detected any motion data of the user, it may display a schematic map of the first region without any user motion markings.
S203, the first device acquires a second amount of motion of the user.
In some embodiments, the user exercises with the first device, and the first device may directly acquire the user's exercise data, which may include exercise time, exercise distance, calories burned, and the like, collectively referred to as the exercise amount. For example, a treadmill can acquire data such as the user's running time and running distance and roughly estimate the corresponding calories burned.
Since different exercise amounts can be associated with the length of the user's movement route, the exercise amount is not limited herein and may be indicated by exercise time, exercise distance, calories burned, and the like.
The second amount of motion corresponds to the user's progress along the first movement route, and may be a movement distance, a movement time, calories burned, and the like. For example, for each kilometer the user runs on the treadmill, a route section with an actual distance of one kilometer is marked on the scenic-area map as part of the route the user has traveled.
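One way to sketch the mapping from treadmill distance to marked route sections (representing the route as a list of precomputed segment lengths is an assumption; a real implementation would work on the route polyline):

```python
def route_progress_km(segment_lengths_km: list[float], run_km: float) -> int:
    """Number of whole route segments marked as traveled, given the
    distance run so far on the treadmill."""
    covered = 0.0
    marked = 0
    for length in segment_lengths_km:
        if covered + length > run_km:
            break  # next segment not yet fully traveled
        covered += length
        marked += 1
    return marked

# Route split into ten 1 km segments; after a 5 km run, 5 are highlighted
n = route_progress_km([1.0] * 10, 5.0)
```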
S204, the first device displays a fourth user interface in which a map of the first region and the user's movement route in the first region are displayed. The movement route is associated with the second amount of motion, and the length of the movement route increases as the second amount of motion increases.
Referring to the embodiment shown in fig. 33, the first device may display a movement route diagram of the first region, which visualizes the travel progress along the region's route corresponding to the user's current amount of motion. In some embodiments, starting from the starting point, the route already traveled may be marked with a highlighted line.
During the movement of the user, the first device can display the travel progress of the user on the first movement route in real time.
In some embodiments, the first device may advance the corresponding movement route in the route diagram according to the distance the user has moved. For example, if the user runs 5 kilometers on the treadmill, the route segment corresponding to 5 kilometers on the West Lake scenic route map is marked as a highlighted line. In this case, the actual distance of the route traveled in the first region is the same as the movement distance indicated by the user's amount of motion.
In other embodiments, the first device may also advance the route in the diagram according to the calories the user has burned. For example, if the user burns 500 kcal on the treadmill, the first device may convert the 500 kcal into a distance of 5 kilometers, and the route segment corresponding to 5 kilometers on the West Lake scenic route map is marked as a highlighted line.
In other embodiments, to boost the user's motivation, the route progress shown to the user may be set not to correspond to the calories actually burned, but converted at a certain ratio. If the ratio of the actual caloric value required to travel the movement route to the caloric value of the second amount of motion is a second ratio, the value of the second ratio is not limited in this embodiment; it may be greater than 1 or less than 1.
In one example, the value of the second ratio may be less than 1. Moving on exercise equipment such as a treadmill is relatively boring, and scenic routes are often long: a full circle of the West Lake scenic area exceeds 10 kilometers, so an ordinary user would need more than an hour of running, which is too long and easily dampens the user's motivation. The user's actual distance or actual calories burned on the treadmill can therefore be multiplied by a coefficient greater than 1, for example 1.5: the user travels 1 km on the treadmill, and the travel distance shown by the movement route graphic 3303 is 1.5 km. The exercise target is thus easier to reach, which can improve the user's motivation to exercise.
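The two conversions above — 500 kcal shown as 5 km, and a real kilometer shown as 1.5 map kilometers — can be sketched together (the 100 kcal/km rate and all parameter names are assumptions consistent with the example figures, not prescribed by the patent):

```python
def map_distance_km(actual_km: float = 0.0, burned_kcal: float = 0.0,
                    kcal_per_km: float = 100.0, boost: float = 1.0) -> float:
    """Distance advanced on the scenic map. Either the real treadmill
    distance or a calories-to-distance conversion drives progress;
    boost > 1 makes the map target easier to reach (e.g. 1.5)."""
    converted = burned_kcal / kcal_per_km if burned_kcal else actual_km
    return converted * boost

map_distance_km(burned_kcal=500)           # 5.0 km, as in the 500 kcal example
map_distance_km(actual_km=1.0, boost=1.5)  # 1.5 map km per real km
```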
As the user continues to move, the progress of the movement route displayed on the user interface increases until the user's exercise target is achieved; refer to the embodiment shown in fig. 36.
In some embodiments, the user may generate pictures of the amount of motion and the route progress, share them to other social platforms, or save them to the local device. When the first device detects a sharing operation by the user, it may display a sharing interface in which an image and/or a text description of the user's movement route is displayed. Reference may be made to the embodiments shown in fig. 35 and 37.
In some embodiments, when the user's motion progress corresponds to reaching certain scenic spots in the scenic region, the first device may present a scenery picture of the spot to the user and introduce it; refer to the embodiment shown in fig. 34. The user may also switch between the scenic map interface and the scenery presentation interface. When the user reaches a scenic spot, the first device can play voice, video, an animation, or one or more pictures, show the scenery of the current spot, introduce its features, related events, and the like, and may also announce the next spot. Announcing the next scenic spot effectively divides a long route into several small destinations, setting stage targets for the user: when the next spot is announced in advance, it becomes a new small target, and since the distance between adjacent spots is relatively short, the target is easy to reach, which fully motivates the user to keep moving.
In some embodiments, the map of the first region may display the exercise situations of a plurality of users, such as a ranking of the users' exercise for the same day or within a certain period (e.g., a week), or the farthest distance each user has reached together with an icon, for example prompting the user: you are still 1 km from overtaking the second user. This encourages users to surpass others and set new records, improving their motivation to exercise.
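A sketch of the "distance to the next user" prompt described above (the data layout and names are illustrative assumptions):

```python
from typing import Optional

def distance_to_next(rankings: dict[str, float], user: str) -> Optional[float]:
    """Gap in km between `user` and the nearest user ahead on the
    leaderboard, or None if the user already leads."""
    my_km = rankings[user]
    ahead = [km for km in rankings.values() if km > my_km]
    return min(ahead) - my_km if ahead else None

# "You are still 1 km from overtaking the second user"
gap = distance_to_next({"me": 9.0, "second": 10.0, "first": 12.0}, "me")
```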
The second embodiment uses the movement route of a region as the user's training target, so the exercise target is clearer during exercise: compared with plain time or calorie targets, a concrete regional route is more intuitive and interesting. At the same time, the route progress and exercise effect are displayed visually on the user interface, so the user does not feel bored while exercising on the equipment, which improves the user's motivation to exercise.
In other embodiments, both the food consumption diagram and the regional movement route diagram may be displayed in the user interface. The food consumption can then be shown alongside the regional route progress, making the exercise target clearer and more intuitive. For example, the user can see at a glance how far on the map the user must actually move to burn off one hamburger.
The embodiments of the application do not limit the layout of the user interface. In some embodiments, the food consumption image of the first food and the movement route diagram of the first region are displayed in the same interface. As shown in fig. 42, the food consumption diagram is displayed in a first area of the user interface, e.g., the left half, and the scenic movement route diagram in a second area, e.g., the right half, so that while exercising the user can simultaneously and intuitively see the food consumption progress and/or the movement route progress fed back in real time. The first area and the second area may overlap. In still other embodiments, the food consumption image of the first food and/or the movement route diagram of the first region may be set to a certain transparency.
In other embodiments, the food consumption image of the first food and the movement route diagram of the first region may be displayed alternately, for example the food consumption image for one minute, then the route diagram for one minute, and then back to the food consumption image.
In some embodiments, the user may generate a check-in picture during exercise or after the exercise target is reached and share it to a social platform or store it on the local device, making exercise feel more ceremonial and giving the user a greater sense of achievement; reference may be made to the embodiments shown in fig. 27, fig. 35, and fig. 37.
By implementing the methods provided by this application, the interest and visualization of exercise targets can be increased, the user's exercise target becomes clearer, a more intuitive exercise effect is fed back to the user, the user does not feel bored while exercising on the equipment, and the user's motivation to exercise is improved.
The implementations described in the above embodiments are only examples and do not limit other embodiments of this application. Specific internal implementations may differ according to the type of electronic device, the operating system loaded, the programs used, and the interfaces called; the embodiments of this application are not limited in this regard, as long as the feature functions described in the embodiments of this application can be implemented.
As used in the above embodiments, the term "when …" may be interpreted to mean "if …" or "after …" or "in response to determining …" or "in response to detecting …", depending on the context. Similarly, the phrase "upon determining …" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined …" or "in response to determining …" or "upon detecting (the stated condition or event)" or "in response to detecting (the stated condition or event)", depending on the context.
In the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of this application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium can be any available medium accessible by a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), a semiconductor medium (e.g., solid-state drive), or the like.
Those of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as ROM, RAM, magnetic disks, or optical disks.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

1. A method of setting a moving object, the method comprising:
the method comprises the steps that first equipment acquires that a moving target set by a user in a first application is a first food consumed;
the first device displaying a first image of the first food;
The first equipment acquires the quantity of motion of the user;
the first device displays a second image of a consumption fraction of the first food, the consumption fraction being associated with the amount of exercise, wherein the greater the amount of exercise, the greater the consumption fraction.
2. The method according to claim 1, wherein the second image includes a first portion indicating a portion where the first food is consumed and a second portion indicating a portion where the first food remains unconsumed, an area of the first portion becoming larger as the amount of motion increases.
3. The method of claim 2, wherein the first portion is a blurred image or the first portion is a transparent image.
4. The method of any one of claims 1-3, wherein the first food comprises one or more foods, the first food being obtained by one or more of:
the first food is selected by the user in a food list of the first application;
or, the first food is selected by the user by inputting characters and searching;
Alternatively, the first food is specified by the user by inputting speech;
or, the first food is specified by the user by inputting a picture;
alternatively, the first food is obtained from meal information of the user in a second application, the second application being a take-away application or an ordering application.
5. The method of any of claims 1-4, wherein the first application provides an option for selecting the number of people sharing a meal, and wherein if the first food is shared by N people, selecting N in the option yields the food intake of a single person.
6. The method as claimed in any one of claims 2-5, characterized in that the ratio of the actual caloric value needed for consuming said first portion of said first food to the caloric value of the quantity of exercise is less than 1.
7. The method according to any one of claims 1-6, further comprising:
the first device detects sharing operation of the user;
the first equipment displays a sharing interface, and images and/or text descriptions of the first food which is consumed are displayed in the sharing interface.
8. The method according to any one of claims 1-7, further comprising:
The first device also displays a map of a first region, and a movement route of the user in the first region, the movement route being associated with the amount of movement, the length of the movement route becoming longer as the amount of movement increases.
9. The method of claim 8, wherein the image of the first food and the map of the first region are displayed in the same interface.
10. The method according to claim 8 or 9, wherein an actual distance of the movement route in the first area is the same as a movement distance of the movement amount indication.
11. The method according to any one of claims 8 to 10, wherein an image and/or a text description of the user's route to the movement route is also displayed in the sharing interface.
12. An electronic device, characterized in that the electronic device comprises: a communication device, a display device, a memory, and a processor coupled to the memory, and one or more programs; the communication device is used for communication, the display device is used for displaying an interface, and the memory stores computer-executable instructions which, when executed by the processor, cause the electronic equipment to implement the method according to any one of claims 1 to 11.
13. A computer-readable storage medium comprising instructions that, when executed on an electronic device, cause the electronic device to perform the method according to any one of claims 1-11.
CN202110903392.2A 2021-08-06 2021-08-06 Method for setting moving target and related electronic equipment Pending CN115705117A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110903392.2A CN115705117A (en) 2021-08-06 2021-08-06 Method for setting moving target and related electronic equipment
PCT/CN2022/109725 WO2023011477A1 (en) 2021-08-06 2022-08-02 Method for setting exercise target and related electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110903392.2A CN115705117A (en) 2021-08-06 2021-08-06 Method for setting moving target and related electronic equipment

Publications (1)

Publication Number Publication Date
CN115705117A true CN115705117A (en) 2023-02-17

Family

ID=85154817

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110903392.2A Pending CN115705117A (en) 2021-08-06 2021-08-06 Method for setting moving target and related electronic equipment

Country Status (2)

Country Link
CN (1) CN115705117A (en)
WO (1) WO2023011477A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101147831A (en) * 2007-11-06 2008-03-26 金毅电子(深圳)有限公司 Heat control system for sports equipment instrument
JP5695052B2 (en) * 2009-09-04 2015-04-01 ナイキ イノベイト セー. フェー. How to monitor and track athletic activity
CN106621190B (en) * 2016-12-30 2023-05-12 江西中阳电器有限公司 Method, device and system for simulating running on map data based on running machine
CN107330242A (en) * 2017-05-26 2017-11-07 北京卡路里信息技术有限公司 Moving target method to set up, device, storage medium and processor

Also Published As

Publication number Publication date
WO2023011477A1 (en) 2023-02-09

Similar Documents

Publication Publication Date Title
US20220080261A1 (en) Recommendation Method Based on Exercise Status of User and Electronic Device
CN109814766B (en) Application display method and electronic equipment
WO2020238356A1 (en) Interface display method and apparatus, terminal, and storage medium
US10803315B2 (en) Electronic device and method for processing information associated with food
US11567985B2 (en) Mood determination of a collection of media content items
CN111161035B (en) Dish recommendation method and device, server, electronic equipment and storage medium
EP2919142A1 (en) Electronic apparatus and method for providing health status information
CN110609903A (en) Information presentation method and device
WO2021244457A1 (en) Video generation method and related apparatus
US10922354B2 (en) Reduction of unverified entity identities in a media library
US11663261B2 (en) Defining a collection of media content items for a relevant interest
CN112214636A (en) Audio file recommendation method and device, electronic equipment and readable storage medium
CN110377204A (en) A kind of method and electronic equipment generating user's head portrait
CN112529645A (en) Picture layout method and electronic equipment
CN111222569A (en) Method, device, electronic equipment and medium for identifying food
CN115699130A (en) Augmented reality cosmetic product tutorial
CN110209316B (en) Category label display method, device, terminal and storage medium
WO2022037479A1 (en) Photographing method and photographing system
WO2021249073A1 (en) Health data display method and electronic device
WO2022222761A1 (en) Map display method and related device
CN115705117A (en) Method for setting moving target and related electronic equipment
CN116861066A (en) Application recommendation method and electronic equipment
CN115525783A (en) Picture display method and electronic equipment
CN114077713A (en) Content recommendation method, electronic device and server
CN116649951B (en) Exercise data processing method, wearable device, terminal, body-building device and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination