CN112908482A - Health data recording method, device, equipment and storage medium - Google Patents

Health data recording method, device, equipment and storage medium

Info

Publication number
CN112908482A
CN112908482A
Authority
CN
China
Prior art keywords
user
behavior
health data
tag
behavior tag
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110335652.0A
Other languages
Chinese (zh)
Inventor
黄汪
严纪年
张霄
董佳君
邵彦娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Huami Information Technology Co Ltd
Original Assignee
Anhui Huami Healthcare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Huami Healthcare Co Ltd
Priority to CN202110335652.0A
Publication of CN112908482A


Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/30 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/23 - Clustering techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 - Handling natural language data
    • G06F 40/30 - Semantic analysis
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 10/00 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H 10/60 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y - INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 10/00 - Economic sectors
    • G16Y 10/60 - Healthcare; Welfare
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y - INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 40/00 - IoT characterised by the purpose of the information processing
    • G16Y 40/10 - Detection; Monitoring
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y - INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 40/00 - IoT characterised by the purpose of the information processing
    • G16Y 40/20 - Analytics; Diagnosis

Abstract

The present disclosure provides a health data recording method, apparatus, device, and storage medium. The method comprises: acquiring one or more behavior tags and displaying them on an interactive interface; in response to a first operation by the user on a behavior tag, displaying a first mark in the area where the behavior tag is located and generating health data about the user; or, in response to a second operation by the user on the behavior tag, displaying a second mark in the area where the behavior tag is located and generating health data about the user; wherein the first operation and the second operation indicate different execution behaviors of the user with respect to the event indicated by the behavior tag. The embodiment makes it possible to record different execution behaviors of the user for the same event, so that more comprehensive health data about the user can be acquired.

Description

Health data recording method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for recording health data.
Background
With the recent rise in health consciousness, more and more people have begun to pay attention to their health, and at the same time more devices and applications are being developed to record information such as users' lifestyle habits, physical conditions, and daily behaviors. However, current methods for health recording on electronic devices are outdated, cumbersome, and inefficient. For example, some existing methods typically require the user to enter their daily activities on a user interface in order to make a record, which is tedious to operate and reduces the user's motivation to keep health records.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a health data recording method, apparatus, device, and storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided a health data recording method, the method including:
acquiring one or more behavior labels and displaying the behavior labels on an interactive interface;
responding to a first operation of the user aiming at the behavior tag, displaying a first mark in an area where the behavior tag is located, and generating health data about the user; or responding to a second operation of the user for the behavior tag, displaying a second mark in the area where the behavior tag is located, and generating health data about the user;
wherein the first operation and the second operation indicate different execution behaviors of the user with respect to the event indicated by the behavior tag. In an embodiment, the first operation and the second operation are different, and the first mark and the second mark are different.
In an embodiment, one of the first operation and the second operation is a click operation for the left side of the area where the behavior tag is located, and the other one is a click operation for the right side of the area where the behavior tag is located;
or one of the first operation and the second operation is a sliding operation in a first direction in an area where the behavior tag is located, and the other is a sliding operation in a direction opposite to the first direction in the area where the behavior tag is located.
In one embodiment, the first mark and the second mark comprise one or more of: a color mark, a font size mark, or a dynamic effect mark.
In one embodiment, the method further comprises: generating reminding information according to the difference between the health data and a preset health plan and displaying the reminding information on the interactive interface; the preset health plan indicates predetermined execution information for the event indicated by the behavior tag.
In an embodiment, the one or more behavior tags are generated based on user base information, historical health data, information about the environment in which the user is currently located, and/or motion information detected by the wearable device.
In one embodiment, the obtaining one or more behavior tags includes:
determining a behavior category according to historical health data of a user and/or basic information of the user, wherein the behavior category comprises a plurality of behavior tags;
and acquiring one or more behavior tags from the plurality of behavior tags.
In an embodiment, determining the behavior category according to the user's historical health data and/or basic information includes: inputting the user's historical health data and/or basic information into a preset recognition model, and performing recognition analysis on them through the recognition model to output the behavior category.
In an embodiment, the method is performed by an electronic device, and the one or more behavior tags are obtained by the electronic device from a cloud.
According to a second aspect of embodiments of the present disclosure, there is provided a health data recording apparatus, the apparatus comprising:
the label display module is used for acquiring one or more behavior labels and displaying the behavior labels on the interactive interface;
the mark display and health data generation module is used for responding to a first operation of the user aiming at the behavior tag, displaying a first mark in the area where the behavior tag is located and generating health data related to the user; or responding to a second operation of the user for the behavior tag, displaying a second mark in the area where the behavior tag is located, and generating health data about the user; wherein the first operation and the second operation indicate different execution behaviors of a user with respect to an event indicated by the behavior tag.
In an embodiment, one of the first operation and the second operation is a click operation for the left side of the area where the behavior tag is located, and the other one is a click operation for the right side of the area where the behavior tag is located;
or one of the first operation and the second operation is a sliding operation in a first direction in an area where the behavior tag is located, and the other is a sliding operation in a direction opposite to the first direction in the area where the behavior tag is located.
In one embodiment, the first mark and the second mark comprise one or more of: a color mark, a font size mark, or a dynamic effect mark.
In an embodiment, the one or more behavior tags are generated based on user base information, historical health data, information about the environment in which the user is currently located, and/or motion information detected by the wearable device.
In one embodiment, the apparatus further comprises: a reminder generating module configured to generate reminder information according to the difference between the health data and a preset health plan and display the reminder information on the interactive interface; the preset health plan indicates predetermined execution information for the event indicated by the behavior tag.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor, when executing the executable instructions, is configured to implement the method of the first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium, on which a computer program is stored, which when executed by a processor, implements the steps of any of the methods described above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
in the embodiments of the present disclosure, one or more behavior tags are acquired and displayed on an interactive interface; then, in response to a first operation by the user on a behavior tag, a first mark is displayed in the area where the behavior tag is located and health data about the user is generated; or, in response to a second operation by the user on the behavior tag, a second mark is displayed in the area where the behavior tag is located and health data about the user is generated; wherein the first operation and the second operation indicate different execution behaviors of the user with respect to the event indicated by the behavior tag. In this way, the corresponding health data can be recorded by operating on the behavior tag, without the user manually entering information, which reduces the user's operation steps and allows the user to perform health management conveniently and intuitively.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a schematic diagram of a behavior tag shown in accordance with an exemplary embodiment of the present disclosure.
FIG. 2 is a schematic diagram illustrating displaying a first marker in accordance with an exemplary embodiment of the present disclosure.
FIG. 3 is a schematic diagram illustrating displaying a second marker in accordance with an exemplary embodiment of the present disclosure.
FIG. 4 is a flow chart diagram illustrating a method of health data recording according to an exemplary embodiment of the present disclosure.
FIG. 5 is a schematic diagram illustrating a configuration of a health data record device according to an exemplary embodiment of the present disclosure.
Fig. 6 is a schematic structural diagram of an electronic device shown in accordance with an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "upon", "when", or "in response to a determination", depending on the context.
With the recent rise in health consciousness, more and more people have begun to pay attention to their health, and at the same time more devices and applications are being developed to record information such as users' lifestyle habits, physical conditions, and daily behaviors. However, current methods for health recording on electronic devices are outdated, cumbersome, and inefficient. For example, some existing methods typically require the user to manually enter their daily activities on a user interface for recording, which is tedious to operate and reduces the user's motivation to record health.
Based on this, embodiments of the present disclosure provide a health data recording method that includes acquiring one or more behavior tags and displaying them on an interactive interface; then, in response to a first operation by the user on a behavior tag, displaying a first mark in the area where the behavior tag is located and generating health data about the user; or, in response to a second operation by the user on the behavior tag, displaying a second mark in the area where the behavior tag is located and generating health data about the user; wherein the first operation and the second operation indicate different execution behaviors of the user with respect to the event indicated by the behavior tag. In this embodiment, the corresponding health data can be recorded by operating on the behavior tag without the user manually typing information, which reduces the user's operation steps and allows health management to be performed simply and intuitively. Moreover, the first mark or second mark provides timely feedback in response to the user's operation on the behavior tag, which increases the interactivity of the health recording process and effectively improves the user's enthusiasm for health management. Furthermore, embodiments of the present disclosure can record different execution behaviors of the user for the same event, so that more comprehensive health data about the user can be acquired.
The health data recording methods provided by embodiments of the present disclosure may be performed by an electronic device with an interactive interface, including, but not limited to, a smartphone/cell phone, a tablet computer, a Personal Digital Assistant (PDA), a laptop computer, a desktop computer, a media content player, a video game station/system, a virtual reality system, an augmented reality system, a wearable device (e.g., a watch, glasses, gloves, headwear (e.g., a hat, a helmet, a virtual reality headset, an augmented reality headset, a Head Mounted Device (HMD), a headband), a pendant, an armband, a leg ring, a shoe, a vest), a remote controller, or any other type of device.
In an exemplary embodiment, referring to fig. 1, a plurality of behavior tags are displayed on the interactive interface of the electronic device. The user can operate on the behavior tags according to their actual behavior, without manually entering behavior information, which reduces the user's operation steps. For example, for a behavior tag "exercise today", referring to fig. 2, if the user did exercise today, a first operation (for example, a rightward sliding operation) may be performed on the behavior tag; in response to the first operation, the electronic device displays a first mark, such as a color mark, a font size mark, and/or a dynamic effect mark, in the area where the behavior tag is located, and generates health data about the user. Alternatively, referring to fig. 3, if the user did not exercise today, a second operation (for example, a leftward sliding operation) may be performed on the behavior tag; in response to the second operation, the electronic device displays a second mark, such as a color mark, a font size mark, and/or a dynamic effect mark, in the area where the behavior tag is located, and generates health data about the user. This embodiment provides timely feedback on the user's operation through the first mark or the second mark, which enhances the interactivity of the health recording process and improves the user's enthusiasm for health recording. The electronic device can generate health data about the user according to the first operation or the second operation, and can record different execution behaviors of the user for the same event, so that more comprehensive health data about the user can be acquired and the user's health can be managed simply and intuitively.
Referring to fig. 4, fig. 4 is a schematic flow chart of a health data recording method provided by an embodiment of the present disclosure, where the method is executed by an electronic device, and the method includes:
in step S101, one or more behavior tags are obtained and displayed on the interactive interface.
In step S102, responding to a first operation of a user aiming at the behavior tag, displaying a first mark in an area where the behavior tag is located, and generating health data about the user; or responding to a second operation of the user for the behavior tag, displaying a second mark in the area where the behavior tag is located, and generating health data about the user; wherein the first operation and the second operation indicate different execution behaviors of a user with respect to an event indicated by the behavior tag. In some embodiments, the electronic device may obtain one or more behavior tags and display the behavior tags on an interactive interface of the electronic device in response to an operation of a user or for a specified period of time. For example, a designated control is displayed on the interactive interface, and the electronic device responds to an operation of a user on the designated control, acquires one or more behavior tags, and displays the behavior tags on the interactive interface of the electronic device; the operation of the user on the specified control includes but is not limited to click operation, long-time press operation or sliding operation and the like; the specified time period can be specifically set according to the actual application scene.
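The two trigger paths for step S101 described above (an operation on a designated control, or the elapse of a specified period) can be sketched as follows. All names (`fetch_tags`, `REFRESH_SECONDS`, `TagBoard`) and the daily refresh period are illustrative assumptions, not from the patent:

```python
import time

REFRESH_SECONDS = 24 * 60 * 60  # assumed "specified period": once a day

def fetch_tags():
    # Stand-in for obtaining behavior tags locally or from the cloud.
    return ["exercised today", "slept before 23:00", "drank 8 glasses of water"]

class TagBoard:
    """Holds the behavior tags currently shown on the interactive interface."""

    def __init__(self):
        self.displayed = []
        self.last_refresh = float("-inf")

    def on_control_tapped(self, now=None):
        # Trigger path 1: the user operates the designated control
        # (click, long press, slide, etc.): always refresh.
        self.displayed = fetch_tags()
        self.last_refresh = time.time() if now is None else now

    def on_tick(self, now):
        # Trigger path 2: refresh only when the specified period has elapsed.
        if now - self.last_refresh >= REFRESH_SECONDS:
            self.displayed = fetch_tags()
            self.last_refresh = now
```

In a real application the refresh period would be configured for the actual scenario, as the paragraph above notes.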
In a first implementation, the behavior tag is generated according to the user's basic information, historical health data, information about the user's current environment, and/or motion information detected by a wearable device; that is, the behavior tag may be determined based on any combination of these four types of information. Since all four types of information relate to the user, the accuracy of the determined behavior tag is ensured.
The user's basic information includes, but is not limited to, personal information (such as name, address, sex, and age), physical characteristics (such as height, weight, body circumferences (waist, chest, etc.), and heart rate), lifestyle (such as smoking, drinking, exercise, sleep, and eating habits), and hobbies (such as reading, exercise, traveling, and music). The historical health data refers to the user's past health data. The information about the user's current environment includes, but is not limited to, location information, weather information, and traffic conditions. The motion information detected by the wearable device can be obtained by the electronic device from a wearable device worn by the user; the motion information is detected by sensors configured on the wearable device (such as an inertial measurement unit or a heart rate sensor) and includes, but is not limited to, the motion category (e.g. running or swimming) and the motion duration.
Considering the computing resources of the electronic device: when its computing resources are sufficient, the electronic device may itself generate the behavior tag according to the user's basic information, historical health data, information about the user's current environment, and/or motion information detected by the wearable device. When its computing resources are insufficient, the electronic device may upload this information to the cloud, so that the cloud generates the behavior tag from it and returns the behavior tag to the electronic device.
As an example, the electronic device may identify, among all historical behavior tags in the historical health data, the target behavior tags whose occurrence frequency exceeds a preset threshold, and use these target behavior tags as the behavior tags displayed on the interactive interface this time. As another example, the electronic device may generate a behavior tag from the motion information detected by the wearable device; for instance, if the wearable device detects that the motion category is running and the motion duration is 30 minutes, the electronic device may generate the behavior tag "ran for 30 minutes today". As yet another example, the electronic device may generate a behavior tag from information about the user's current environment; for instance, if the user's current geographic location is home and the current weather is rainy, the behavior tag "take a rest at home" may be generated.
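The three examples above (frequency-filtered historical tags, motion-derived tags, environment-derived tags) could be expressed as simple rules. This is a hedged sketch: the threshold value, tag strings, and function names are all invented for illustration:

```python
from collections import Counter

PRESET_THRESHOLD = 3  # assumed frequency threshold

def frequent_historical_tags(historical_tags):
    """Keep historical tags whose occurrence count exceeds the threshold."""
    counts = Counter(historical_tags)
    return sorted(t for t, c in counts.items() if c > PRESET_THRESHOLD)

def tag_from_motion(category, minutes):
    """Build a tag from wearable-detected motion, e.g. running for 30 minutes."""
    return f"{category} for {minutes} minutes today"

def tag_from_environment(location, weather):
    """Build a tag from the current environment, or None if no rule fires."""
    if location == "home" and weather == "rainy":
        return "take a rest at home"
    return None
```

In practice such rules could run either on the device or in the cloud, following the resource-based split described above.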
In a second implementation, a behavior category may be determined according to the user's historical health data and/or basic information, where the behavior category includes a plurality of behavior tags, and one or more behavior tags are then obtained from the plurality of behavior tags. The user's basic information and historical health data here are as described above.
The behavior categories are determined by clustering or statistical analysis of the health data and/or basic information of a plurality of users. Each behavior category indicates a certain group of users, and the behavior tags included in that category reflect the characteristics of the group. Determining a behavior category from the user's historical health data and/or basic information is therefore a process of determining which group the user belongs to; one or more behavior tags are then obtained from the behavior tags included in the category corresponding to the user's group.
In one example, behavior categories a, b, and c are determined after analysis (such as clustering) of the health data and/or basic information of a plurality of users: behavior category a includes the behavior tags {workaholic, staying up late, irregular work and rest, dark circles, sedentary, lumbar discomfort, restlessness after getting off work at 8 o'clock, ...}; behavior category b includes the behavior tags {homebody, cartoons, takeout, programmer, lying down, games, soap operas, reading, ...}; and behavior category c includes the behavior tags {regular meals, early sleep, running, health preservation, subway travel, public transport, low-carbon life, ...}. For example, it may be determined from user A's health data and/or basic information that user A belongs to a group with the "staying up late" behavior tag, which corresponds to behavior category a, and one or more behavior tags, such as "workaholic" and "lumbar discomfort", may then be obtained from the behavior tags included in behavior category a. In this embodiment, by determining the behavior category to which the user belongs, more varied and complete behavior tags can be recommended to the user, making it convenient for the user to record health simply and intuitively.
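One way to realize the grouping just described is to assign a user to the behavior category whose tag set overlaps most with the user's own tags, then recommend tags from that category. The category contents and the similarity measure (Jaccard overlap) below are illustrative stand-ins for whatever clustering produced:

```python
# Illustrative clustering result: each behavior category is a set of tags.
CATEGORIES = {
    "a": {"workaholic", "staying up late", "irregular work and rest", "sedentary"},
    "b": {"homebody", "takeout", "games", "reading"},
    "c": {"regular meals", "early sleep", "running", "public transport"},
}

def jaccard(s1, s2):
    """Set-overlap similarity in [0, 1]."""
    return len(s1 & s2) / len(s1 | s2) if (s1 | s2) else 0.0

def assign_category(user_tags):
    """Pick the behavior category most similar to the user's tags."""
    tags = set(user_tags)
    return max(CATEGORIES, key=lambda k: jaccard(tags, CATEGORIES[k]))

def recommend_tags(user_tags, n=2):
    """Recommend up to n tags the user does not have yet, from their category."""
    category = assign_category(user_tags)
    return sorted(CATEGORIES[category] - set(user_tags))[:n]
```

A user tagged "staying up late" would be assigned to category a and could be offered its other tags, mirroring the user A example above.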
As an example, a recognition model may be established in advance to determine the corresponding behavior category from a user's health data and/or basic information. The recognition model is generated based on a plurality of training samples, each comprising the health data and/or basic information of a user together with a corresponding category label. During training, the health data and/or basic information of a number of users is input into a preset model so that the model outputs a category prediction result, and the model parameters are then adjusted according to the difference between the category prediction result and the category label, yielding the recognition model. During application, the user's historical health data and/or basic information can be input into the preset recognition model, which performs recognition analysis on them and outputs the behavior category; one or more behavior tags are then obtained from the behavior tags included in that category.
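The train-then-apply loop above can be sketched with a minimal classifier. A nearest-centroid model stands in here for whatever recognition model is actually used, and the features (average sleep hours, weekly exercise sessions) and category labels are synthetic; the patent does not specify a model architecture:

```python
# Hypothetical stand-in for the recognition model: a nearest-centroid
# classifier trained on (feature vector, behavior-category label) samples.

def centroid(rows):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def train(samples):
    """samples: list of (feature_vector, category_label) pairs."""
    by_label = {}
    for x, y in samples:
        by_label.setdefault(y, []).append(x)
    return {y: centroid(rows) for y, rows in by_label.items()}

def predict(model, x):
    """Return the category whose centroid is closest to x."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(model, key=lambda y: dist2(model[y], x))
```

The "adjust parameters against the labels" step of a learned model reduces here to computing per-category centroids; a gradient-trained classifier would fill the same role.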
Considering the computing resources of the electronic device: when they are sufficient, the pre-established recognition model may be installed on the electronic device itself, so that the device can input the user's historical health data and/or basic information into the model and perform the recognition analysis locally to output the behavior category. When the computing resources are insufficient, the pre-established recognition model may instead be deployed in the cloud; the electronic device uploads the user's historical health data and/or basic information to the cloud, the cloud inputs them into the recognition model to output the behavior category, obtains one or more behavior tags from the behavior tags included in that category, and returns them to the electronic device.
Of course, in one implementation, the user may also add behavior tags on the electronic device according to their actual needs, which is not limited in this embodiment.
Referring to fig. 1, after one or more behavior tags are obtained and displayed on the interactive interface, the electronic device may, in response to a first operation by the user on a behavior tag, display a first mark in the area where the behavior tag is located and generate health data about the user; or, in response to a second operation by the user on the behavior tag, display a second mark in the area where the behavior tag is located and generate health data about the user. The first operation and the second operation indicate different execution behaviors of the user for the event indicated by the behavior tag; the first operation and the second operation are different, and the first mark and the second mark are different. By displaying different marks for different execution behaviors of the event indicated by the behavior tag, this embodiment provides timely feedback on user operations through the marks, enhances the interactivity of the health recording process, and improves the user's enthusiasm for health recording. At the same time, different execution behaviors of the user for the same event can be recorded, so that more comprehensive health data about the user can be acquired and health management can be performed conveniently and intuitively.
The first mark and the second mark each comprise one or more of: a color mark, a font size mark, or a motion effect mark. Color marks include, but are not limited to, font color marks and background color marks; font size marks include, but are not limited to, changes to the font size or the thickness of the font strokes. In this embodiment, the first mark or the second mark gives timely feedback on the user's operation, enhancing interactivity during health recording and increasing the user's motivation to keep health records.
For example, one of the first operation and the second operation is a click operation on the left side of the area where the behavior tag is located, and the other is a click operation on the right side of that area; or one of the first operation and the second operation is a sliding operation in a first direction in the area where the behavior tag is located, and the other is a sliding operation in the opposite direction in that area.
For example, consider the behavior tag "exercise today". Referring to fig. 2, if the user did exercise today, the user may perform a first operation on the tag (for example, a rightward sliding operation, or a click on the right side of the area where the tag is located), and the electronic device responds by displaying a first mark, such as a color mark, a font size mark, and/or a motion effect mark, in the area where the tag is located. Alternatively, referring to fig. 3, if the user did not exercise today, the user may perform a second operation on the tag (for example, a leftward sliding operation, or a click on the left side of the area where the tag is located), and the electronic device responds by displaying a second mark, such as a color mark, a font size mark, and/or a motion effect mark, in that area.
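The interaction just described, where a first or second operation on a tag produces a mark and a health-data record, can be sketched as follows. The gesture strings, mark names, and record format are illustrative assumptions, not the patented implementation; a rightward swipe is assumed to be the first operation and a leftward swipe the second.

```python
def handle_tag_gesture(gesture, records, tag):
    """Map a swipe on a behavior tag to a mark and a health-data record.

    'swipe_right' (the assumed first operation) means the behavior was
    performed; 'swipe_left' (the assumed second operation) means it was
    not. Either way a record is appended, so both outcomes are kept.
    """
    performed = (gesture == "swipe_right")
    records.append({"tag": tag, "performed": performed})
    return "first_mark" if performed else "second_mark"
```

Recording both outcomes is what makes the resulting health data more comprehensive than logging only completed behaviors.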
In an embodiment, semantic analysis may be performed on a behavior tag to determine its tag attribute, and the mark displayed in the area where the tag is located may then be chosen according to both the tag attribute and the user's operation on the tag. As an example, tag attributes include a positive attribute and a negative attribute: the behavior tag "exercise today" has a positive attribute, while the behavior tag "joint pain" has a negative attribute. A first operation on a behavior tag can be defined to represent that the user has performed the behavior of the event indicated by the tag, and a second operation to represent that the user has not performed it. If one of the first mark and the second mark is a positive mark and the other a negative mark, for example the first mark positive and the second mark negative, the situations shown in Table 1 are obtained.
Table 1
Tag attribute                  First operation (performed)   Second operation (not performed)
Positive (e.g. exercise today) First mark (positive)         Second mark (negative)
Negative (e.g. joint pain)     Second mark (negative)        First mark (positive)
Specifically, the electronic device displays the first mark in the area where a behavior tag is located in response to a first operation on a tag whose attribute is positive; displays the second mark in response to a second operation on a tag whose attribute is positive; displays the second mark in response to a first operation on a tag whose attribute is negative; and displays the first mark in response to a second operation on a tag whose attribute is negative. In this embodiment, the first mark or the second mark gives timely feedback on the user's operation, enhancing interactivity during health recording and increasing the user's motivation to keep health records.
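The four-case mapping above reduces to a single comparison, assuming the positive (first) mark is shown whenever the user's action is good for health: performing a positive-attribute behavior, or skipping a negative-attribute one. The attribute strings and mark names below are illustrative.

```python
def choose_mark(tag_attribute, performed):
    """Choose a mark from the tag attribute and the user's operation.

    The first (positive) mark is shown when the attribute and the
    action agree: a positive behavior performed, or a negative
    behavior not performed. Otherwise the second (negative) mark.
    """
    is_positive = (tag_attribute == "positive")
    return "first_mark" if is_positive == performed else "second_mark"
```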
Illustratively, when a user performs a first operation on a behavior tag that is beneficial to health (i.e., one with a positive attribute), such as an exercise-related tag, a positive motion effect is generated; when the user performs a first operation on a behavior tag that is harmful to health (i.e., one with a negative attribute), such as a tag related to smoking or drinking, a negative motion effect is generated. The user perceives this feedback in real time, which enhances interactivity during health recording and increases the user's motivation to keep health records.
Furthermore, the number of times the user operates on a behavior tag within a preset time period can be counted and displayed in the area where the behavior tag is located.
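Counting a tag's operations within a trailing time window could be sketched as follows; the log format (a list of timestamp/tag pairs) and the function name are assumptions made for illustration.

```python
import time

def count_operations(op_log, tag, window_seconds, now=None):
    """Count operations on `tag` within the trailing time window.

    `op_log` is a list of (timestamp, tag) pairs, timestamps in
    epoch seconds. `now` defaults to the current time.
    """
    now = time.time() if now is None else now
    cutoff = now - window_seconds
    return sum(1 for t, g in op_log if g == tag and t >= cutoff)
```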
In an embodiment, the electronic device may further generate a reminder message according to the difference between the health data and a preset health plan and display the message on the interactive interface; the preset health plan indicates predetermined execution information for the event indicated by the behavior tag. As an example, if the preset health plan is to run for 10 minutes and the health data indicates that the user ran for 15 minutes, a reminder such as "You ran 5 minutes more than planned today, well done!" may be generated from the difference. As another example, if the preset health plan is not to smoke today and the health data indicates that the user smoked 2 cigarettes, a reminder such as "Smoking is harmful to health" may be generated. In this way, the user's behavior can be effectively monitored from the health data: for behaviors beneficial to a healthy life, the user is encouraged to do more; for harmful behaviors, the user is reminded to pay attention, helping the user manage daily behaviors effectively.
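The plan-versus-actual reminder could be sketched as below. The message wording, units, and the `positive` flag distinguishing beneficial from harmful behaviors are illustrative assumptions, not the patented text.

```python
def build_reminder(planned, actual, positive=True):
    """Generate reminder text from the gap between plan and actual data.

    For positive behaviors, planned/actual are amounts (e.g. minutes);
    for negative behaviors, planned is assumed to be 0 (abstain) and
    actual is the number of occurrences recorded.
    """
    if positive:
        diff = actual - planned
        if diff >= 0:
            return f"Exceeded today's plan by {diff} minutes, well done!"
        return f"{-diff} minutes short of today's plan, keep going!"
    if actual > 0:
        return f"Recorded {actual} today; this behavior is harmful to health."
    return "Plan kept today, nice work!"
```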
The technical features of the above embodiments may be combined arbitrarily, provided there is no conflict or contradiction between them; for reasons of space the combinations are not described one by one, but any such combination also falls within the scope disclosed in this specification.
Corresponding to the embodiment of the health data recording method, the disclosure also provides an embodiment of a health data recording device, an electronic device applied by the device and a storage medium.
Correspondingly, referring to fig. 5, an embodiment of the present disclosure further provides a health data recording device, where the health data recording device includes:
The label display module 201 is configured to obtain one or more behavior tags and display them on the interactive interface.
A mark display and health data generation module 202, configured to respond to a first operation of the user on the behavior tag, display a first mark in an area where the behavior tag is located, and generate health data about the user; or responding to a second operation of the user for the behavior tag, displaying a second mark in the area where the behavior tag is located, and generating health data about the user; wherein the first operation and the second operation indicate different execution behaviors of a user with respect to an event indicated by the behavior tag.
In an embodiment, the first operation and the second operation are different; and the first indicia and the second indicia are different.
In an embodiment, one of the first operation and the second operation is a click operation for the left side of the area where the behavior tag is located, and the other one is a click operation for the right side of the area where the behavior tag is located;
or one of the first operation and the second operation is a sliding operation in a first direction in an area where the behavior tag is located, and the other is a sliding operation in a direction opposite to the first direction in the area where the behavior tag is located.
In one embodiment, the first indicia and the second indicia comprise one or more of: color marking, font size marking, or motion effect marking.
In one embodiment, the apparatus further comprises: a reminder message generating module, configured to generate reminder information according to the difference between the health data and a preset health plan and display the reminder information on the interactive interface; the preset health plan indicates predetermined execution information for the event indicated by the behavior tag.
In an embodiment, the one or more behavior tags are generated based on user base information, historical health data, information about the environment in which the user is currently located, and/or motion information detected by the wearable device.
In one embodiment, the tag display module 201 includes:
the category determination unit is used for determining a behavior category according to historical health data and/or user basic information of a user, wherein the behavior category comprises a plurality of behavior tags;
and the label acquiring unit is used for acquiring one or more behavior tags from the plurality of behavior tags.
In an embodiment, the category determination unit is configured to: input the historical health data and/or the user basic information of the user into a pre-established recognition model, and perform recognition analysis on the historical health data and/or the user basic information through the recognition model to output the behavior category.
In an embodiment, the method is performed by an electronic device, and the one or more tags are obtained by the electronic device from a cloud.
The implementation process of the functions and actions of each module in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the disclosed solution. One of ordinary skill in the art can understand and implement it without inventive effort.
Correspondingly, the present disclosure also provides an electronic device, comprising: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to:
acquiring one or more behavior labels and displaying the behavior labels on an interactive interface;
responding to a first operation of the user aiming at the behavior tag, displaying a first mark in an area where the behavior tag is located, and generating health data about the user; or responding to a second operation of the user for the behavior tag, displaying a second mark in the area where the behavior tag is located, and generating health data about the user;
wherein the first operation and the second operation indicate different execution behaviors of a user with respect to an event indicated by the behavior tag.
Accordingly, the present disclosure also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of any of the methods described above.
The present disclosure may take the form of a computer program product embodied on one or more storage media including, but not limited to, disk storage, CD-ROM, optical storage, and the like, having program code embodied therein. Computer-usable storage media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of the storage medium of the computer include, but are not limited to: phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technologies, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium, may be used to store information that may be accessed by a computing device.
As shown in fig. 6, fig. 6 is a block diagram of an electronic device shown in accordance with an exemplary embodiment of the present disclosure. The device 300 may be a smartphone/cell phone, a tablet computer, a Personal Digital Assistant (PDA), a laptop computer, a desktop computer, a media content player, a video game station/system, a virtual reality system, an augmented reality system, a wearable device (e.g., a watch, glasses, gloves, headwear (e.g., a hat, a helmet, a virtual reality headset, an augmented reality headset, a Head Mounted Device (HMD), a headband), a pendant, an armband, a leg loop, a shoe, a vest), a remote control, or any other type of device.
Referring to fig. 6, device 300 may include one or more of the following components: processing component 302, memory 304, power component 306, multimedia component 308, audio component 310, input/output (I/O) interface 312, sensor component 314, and communication component 316.
The processing component 302 generally controls overall operation of the device 300, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 302 may include one or more processors 320 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 302 can include one or more modules that facilitate interaction between the processing component 302 and other components. For example, the processing component 302 may include a multimedia module to facilitate interaction between the multimedia component 308 and the processing component 302.
The memory 304 is configured to store various types of data to support operations at the device 300. Examples of such data include instructions for any application or method operating on device 300, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 304 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 306 provides power to the various components of the device 300. The power components 306 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 300.
The multimedia component 308 comprises a screen providing an output interface between the device 300 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 308 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 300 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 310 is configured to output and/or input audio signals. For example, audio component 310 may include a Microphone (MIC) configured to receive external audio signals when device 300 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 304 or transmitted via the communication component 316. In some embodiments, audio component 310 also includes a speaker for outputting audio signals.
The I/O interface 312 provides an interface between the processing component 302 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
Sensor assembly 314 includes one or more sensors for providing status assessment of various aspects of device 300. For example, sensor assembly 314 may detect an open/closed state of device 300, the relative positioning of components, such as a display and keypad of device 300, the change in position of device 300 or one of the components of device 300, the presence or absence of user contact with device 300, the orientation or acceleration/deceleration of device 300, and the change in temperature of device 300. Sensor assembly 314 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 314 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 314 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 316 is configured to facilitate wired or wireless communication between the device 300 and other devices. The device 300 may access a wireless network based on a communication standard, such as WiFi, 2G, 3G, or 4G, or a combination thereof. In an exemplary embodiment, the communication component 316 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 316 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 300 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 304, that are executable by the processor 320 of the device 300 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
The above description is only exemplary of the present disclosure and should not be taken as limiting the disclosure, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (16)

1. A method of health data recording, the method comprising:
acquiring one or more behavior labels and displaying the behavior labels on an interactive interface;
responding to a first operation of the user aiming at the behavior tag, displaying a first mark in an area where the behavior tag is located, and generating health data about the user; or responding to a second operation of the user for the behavior tag, displaying a second mark in the area where the behavior tag is located, and generating health data about the user;
wherein the first operation and the second operation indicate different execution behaviors of a user with respect to an event indicated by the behavior tag.
2. The method of claim 1, wherein the first operation and the second operation are different; and the first indicia and the second indicia are different.
3. The method according to claim 2, wherein one of the first operation and the second operation is a click operation for the left side of the area where the behavior tag is located, and the other is a click operation for the right side of the area where the behavior tag is located;
or one of the first operation and the second operation is a sliding operation in a first direction in an area where the behavior tag is located, and the other is a sliding operation in a direction opposite to the first direction in the area where the behavior tag is located.
4. The method of claim 1, wherein the first indicia and the second indicia comprise one or more of: color marking, font size marking, or motion effect marking.
5. The method of claim 1, further comprising:
generating reminding information according to the difference between the health data and a preset health plan and displaying the reminding information on the interactive interface; the preset health plan indicates predetermined execution information for the event indicated by the behavior tag.
6. The method of claim 1, wherein the one or more behavior tags are generated based on user basic information, historical health data, information about the environment in which the user is currently located, and/or motion information detected by the wearable device.
7. The method of claim 1, wherein obtaining one or more behavior tags comprises:
determining a behavior category according to historical health data of a user and/or basic information of the user, wherein the behavior category comprises a plurality of behavior tags;
and acquiring one or more behavior tags from the plurality of behavior tags.
8. The method of claim 7, wherein determining the behavior category according to the historical health data and/or the user basic information of the user comprises:
inputting the historical health data and/or the user basic information of the user into a preset identification model, and performing identification analysis on the historical health data and/or the user basic information of the user through the identification model to output the behavior category.
9. The method of any one of claims 6 to 8, wherein the method is performed by an electronic device, and wherein the one or more tags are obtained by the electronic device from a cloud.
10. A health data recording device, characterized in that the device comprises:
the label display module is used for acquiring one or more behavior labels and displaying the behavior labels on the interactive interface;
the mark display and health data generation module is used for responding to a first operation of the user aiming at the behavior tag, displaying a first mark in the area where the behavior tag is located and generating health data related to the user; or responding to a second operation of the user for the behavior tag, displaying a second mark in the area where the behavior tag is located, and generating health data about the user; wherein the first operation and the second operation indicate different execution behaviors of a user with respect to an event indicated by the behavior tag.
11. The apparatus according to claim 10, wherein one of the first operation and the second operation is a click operation for a left side of an area where the behavior tag is located, and the other is a click operation for a right side of the area where the behavior tag is located;
or one of the first operation and the second operation is a sliding operation in a first direction in an area where the behavior tag is located, and the other is a sliding operation in a direction opposite to the first direction in the area where the behavior tag is located.
12. The apparatus of claim 10, wherein the first indicia and the second indicia comprise one or more of: color marking, font size marking, or motion effect marking.
13. The apparatus of claim 10, wherein the one or more behavior tags are generated based on user basic information, historical health data, information about the environment in which the user is currently located, and/or motion information detected by the wearable device.
14. The apparatus of claim 10, further comprising:
the reminding message generating module is used for generating reminding information according to the difference between the health data and a preset health plan and displaying the reminding information on the interactive interface; the preset health plan indicates predetermined execution information for the event indicated by the behavior tag.
15. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor, when executing the executable instructions, is configured to implement the method of any of claims 1 to 9.
16. A computer-readable storage medium, on which a computer program is stored, which program, when being executed by a processor, is adapted to carry out the method of any one of claims 1 to 9.
CN202110335652.0A 2021-03-29 2021-03-29 Health data recording method, device, equipment and storage medium Pending CN112908482A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110335652.0A CN112908482A (en) 2021-03-29 2021-03-29 Health data recording method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN112908482A true CN112908482A (en) 2021-06-04

Family

ID=76109355

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110335652.0A Pending CN112908482A (en) 2021-03-29 2021-03-29 Health data recording method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112908482A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130038555A1 (en) * 2010-03-18 2013-02-14 Nippon Telegraph And Telephone Corporation Information Input Device, Information Input Method, and Information Input Program
CN103336665A (en) * 2013-07-15 2013-10-02 北京小米科技有限责任公司 Display method, display device and terminal equipment
CN106982283A (en) * 2016-01-19 2017-07-25 中兴通讯股份有限公司 Contact person's packet processing method, device and mobile terminal
CN108986876A (en) * 2018-06-14 2018-12-11 北京奇伦天佑创业投资有限公司 The carry-on management system of personal health
CN112331339A (en) * 2020-10-10 2021-02-05 于军 Automatic and personalized health management system and method for large population



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230419

Address after: 230088 China (Anhui) pilot Free Trade Zone, Hefei City, Anhui Province, 7/F, building B2, huami Global Innovation Center, No. 900, Wangjiang West Road, high tech Zone, Hefei City

Applicant after: Anhui Huami Information Technology Co.,Ltd.

Address before: 230088 No. 01, 7th floor, building B2, Zhongan chuanggu Science Park, No. 900, Wangjiang West Road, high tech Zone, Hefei City, Anhui Province

Applicant before: Anhui huami health care Co.,Ltd.