CN118230372A - Monitoring method and related device of wearable equipment - Google Patents


Info

Publication number
CN118230372A
Authority
CN
China
Prior art keywords
user
data
wearable device
identity
preset threshold
Prior art date
Legal status
Pending
Application number
CN202211633641.1A
Other languages
Chinese (zh)
Inventor
张炜
靳俊叶
赵梦龙
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Application filed by Huawei Technologies Co Ltd
Priority to CN202211633641.1A

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

Provided are a method for monitoring a wearable device and a related device. The method includes: identifying the user wearing the wearable device to determine the user's identity; and monitoring the user to obtain a first data set, which includes data obtained by monitoring one or more physiological parameters in the user's physiological parameter set and a user tag identifying the user. Because the wearable device can identify the user and attach a user tag to the monitored data, if multiple users use the wearable device and the data of these users are monitored, the data of different users can be distinguished based on the user tags, avoiding confusion between the data of different users.

Description

Monitoring method and related device of wearable equipment
Technical Field
This application relates to the field of terminal technologies, and in particular to a monitoring method and related device for a wearable device.
Background
As people pay increasing attention to health, the demand for real-time monitoring of physiological parameters of the human body, such as heart rate and blood oxygen, is also growing. At present, such physiological parameters can be monitored in real time by wearable devices. Common wearable devices include, but are not limited to, wristbands, watches, and gloves.
In some scenarios, a wearable device may be unable to match the recorded physiological data to a specific user. For example, when user A and user B both use the same wearable device, even if they are logged into the same account, the device may be unable to distinguish user A's data from user B's data.
Disclosure of Invention
This application provides a monitoring method and related device for a wearable device, aiming to solve the problem that data of different users cannot be distinguished and to avoid confusing the data of different users.
In a first aspect, the present application provides a method for monitoring a wearable device, where the method may be performed by the wearable device, or may also be performed by a component (such as a chip, a chip system, etc.) configured in the wearable device, or may also be implemented by a logic module or software capable of implementing all or part of the functions of the wearable device, which is not limited in this aspect of the present application.
Illustratively, the method includes: identifying the user wearing the wearable device to determine the user's identity; and monitoring the user to obtain a first data set, where the first data set includes data obtained by monitoring one or more physiological parameters in the user's physiological parameter set and a user tag, and the user tag is used to identify the user.
In this technical solution, the wearable device can identify the user wearing it and attach a user tag to the data obtained by monitoring that user. Therefore, even if multiple users use the wearable device, and even if they log in to the same account, the data of different users can be distinguished based on the user tags, avoiding confusion between the data of different users.
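The idea of a tagged first data set can be sketched as follows. This is a minimal illustration only: the patent does not specify data structures, and all type and field names here are assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class MonitoredDataSet:
    """A first data set: monitored physiological data plus a user tag."""
    user_tag: str                                   # identifies the monitored user
    readings: Dict[str, List[float]] = field(default_factory=dict)

    def add(self, parameter: str, value: float) -> None:
        self.readings.setdefault(parameter, []).append(value)

def group_by_user(data_sets: List[MonitoredDataSet]) -> Dict[str, List[MonitoredDataSet]]:
    """Separate data sets by user tag so different users' data never mix."""
    grouped: Dict[str, List[MonitoredDataSet]] = {}
    for ds in data_sets:
        grouped.setdefault(ds.user_tag, []).append(ds)
    return grouped

# Two users sharing one device, possibly logged into the same account
ds_a = MonitoredDataSet("user_a")
ds_a.add("heart_rate", 72.0)
ds_b = MonitoredDataSet("user_b")
ds_b.add("heart_rate", 88.0)
print(sorted(group_by_user([ds_a, ds_b])))  # ['user_a', 'user_b']
```

Because every reading is stored under a tag, the shared-account case reduces to a simple grouping by tag rather than an unresolvable mixture of readings.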
With reference to the first aspect, in some possible implementation manners of the first aspect, the identifying a user wearing the device includes: the identity of the user wearing the device is identified by one of the following means: face recognition, fingerprint recognition, or voiceprint recognition.
In this application, the wearable device may identify the user by one of the following means: face recognition, fingerprint recognition, or voiceprint recognition. It can be understood that a face, a fingerprint, or a voiceprint each uniquely identifies a particular user, so the data of different users will not be confused when multiple users use the wearable device.
It should be understood that face recognition, fingerprint recognition, and voiceprint recognition are merely examples and do not limit this application in any way; any recognition method that uniquely identifies the user may be used.
With reference to the first aspect, in certain possible implementation manners of the first aspect, in a case that an identity of the user is not identified, the method further includes: prompting the user to input identity information of the user, wherein the identity information is information for identifying the user; and responding to user operation, and collecting the identity information of the user.
It will be appreciated that the wearable device may enter identity information for a plurality of different users. The wearable device may also enter multiple identity information of the same user, e.g., the wearable device enters a face image, fingerprint, voiceprint, etc. of the user. The application is not limited in this regard.
If the user uses the wearable device for the first time (that is, the user is a stranger to the wearable device), the wearable device may fail to recognize the user's identity. In this case, the user may be prompted to enter identity information so that the user can be recognized the next time the wearable device is used.
It will be appreciated that in some cases, for example if the user is not a regular user of the device, the user may also use the wearable device as a guest; that is, the wearable device does not collect the user's identity information. For example, the wearable device may prompt the user to use it in guest mode.
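The recognize-or-enroll-or-guest decision described above can be sketched as follows. This is an assumption-laden illustration: the patent fixes no data model, and the dictionary of templates, the tag naming, and the `agrees_to_enroll` flag are all hypothetical.

```python
def resolve_wearer(known_users, captured_feature, agrees_to_enroll):
    """Tag the wearer after identity recognition (illustrative names only).

    known_users maps a captured identity feature (e.g. a face, fingerprint,
    or voiceprint template) to a user tag. If the feature is unknown, the
    device prompts the wearer either to enter identity information (enroll)
    or to continue as a guest without any identity information collected.
    """
    if captured_feature in known_users:
        return known_users[captured_feature]      # recognized user
    if agrees_to_enroll:
        tag = f"user_{len(known_users) + 1}"
        known_users[captured_feature] = tag       # recognized next time
        return tag
    return "guest"                                # guest mode, nothing stored

known = {"face_template_a": "user_1"}
print(resolve_wearer(known, "face_template_a", False))   # user_1
print(resolve_wearer(known, "face_template_b", True))    # user_2 (enrolled)
print(resolve_wearer(known, "voice_template_c", False))  # guest
```

Note that enrolling mutates `known_users`, which models the point made above: a first-time user who enrolls will be recognized on the next use, while a guest leaves no identity information behind.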
With reference to the first aspect, in certain possible implementation manners of the first aspect, the method further includes: and in response to user operation, displaying data of the user, wherein the data of the user comprises data obtained by monitoring one or more physiological parameters in the physiological parameter set of the user.
In response to a user operation, the wearable device may display the user's data, specifically including data obtained by monitoring one or more physiological parameters in the user's physiological parameter set. For example, in response to a user operation on a setting option in the application corresponding to a certain physiological parameter, the device displays the user's data for that physiological parameter; as another example, in response to a user operation on a setting option of the wearable device, the device may directly display the user's data for multiple physiological parameters.
With reference to the first aspect, in some possible implementation manners of the first aspect, the displaying, in response to a user operation, data of the user includes: in response to a first user operation, displaying at least one user tag corresponding to at least one user including the user, each user tag for identifying one user, the at least one user corresponding to at least one data set, each data set including data monitored for each physiological parameter in the physiological data set for the corresponding user; and displaying the data of the user in response to a second user operation.
Wherein the second user operation may be used to select a user tag of the user from among the at least one user tag.
For example, in response to a user operation on a setting option in an application corresponding to a certain physiological parameter, the wearable device displays a user tag of at least one user including the user, and further, in response to a user operation of clicking a user tag of one of the users, displays data of the user.
It will be appreciated that if the wearable device stores data of one user, the wearable device may display a user tag of the user in response to a first user operation, and further display the data of the user in response to a second user operation; the wearable device may also display data of the user directly in response to the first user operation. The application is not limited in this regard.
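The two-step display flow above (first show user tags, then show the selected user's data) can be sketched as follows. This is a minimal illustration under assumed names; the patent does not prescribe any particular storage layout.

```python
def user_tags(stored_sets):
    """First user operation: display a tag for every user with stored data."""
    return sorted({s["user_tag"] for s in stored_sets})

def data_for(stored_sets, tag):
    """Second user operation: display only the selected user's data."""
    return [s for s in stored_sets if s["user_tag"] == tag]

stored = [
    {"user_tag": "user_a", "heart_rate": [72, 75]},
    {"user_tag": "user_b", "heart_rate": [88]},
]
print(user_tags(stored))           # ['user_a', 'user_b']
print(data_for(stored, "user_b"))  # [{'user_tag': 'user_b', 'heart_rate': [88]}]
```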
With reference to the first aspect, in certain possible implementations of the first aspect, the physiological parameters in the set of physiological parameters include one or more of: heart rate, blood oxygen, blood pressure, temperature, heart rate variability, or respiratory rate.
With reference to the first aspect, in certain possible implementation manners of the first aspect, the method further includes: acquiring a current facial image of the user; based on the first data set and the facial image, obtaining a current comprehensive state index of the user, wherein the comprehensive state index is used for evaluating the current physical state of the user; and prompting the user for the comprehensive state index.
The wearable device can obtain a comprehensive state index for evaluating the user's current physical state based on the data of the user's physiological parameters and the facial image, and prompt the user with this index, so that the user learns about his or her physical state in time, improving the user experience.
With reference to the first aspect, in certain possible implementation manners of the first aspect, the method further includes: determining a reference of the comprehensive state index based on the preset data of each physiological parameter in the physiological parameter set of the user and the standard certificate photograph of the user; and suggesting to the user based on the reference and the current comprehensive state index of the user.
The wearable device can determine a reference for the comprehensive state index based on the preset data of each physiological parameter and the user's standard certificate photograph, compare the user's current comprehensive state index with this reference, and then give the user a corresponding suggestion, such as advising the user to rest or to exercise.
One possible design is that the wearable device obtains the current comprehensive state index from the current facial image and the current data obtained by monitoring the user's physiological parameters, compares this index with the reference, and advises the user accordingly. For example, when the user's current comprehensive state index is greater than or equal to the reference, the user is advised to rest more; when the current comprehensive state index is smaller than the reference, the user is advised to exercise more. This application does not limit the specific suggestions given to the user.
Another possible design is that the wearable device obtains the current comprehensive state index from the current facial image and the current monitored physiological data, takes the sum of the current index and the reference as the final comprehensive state index, compares it with a preset threshold, and then advises the user.
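The first design above can be sketched as follows. The patent does not fix a formula for combining physiological data with the facial image, so the equal-weight average, the normalized scores, and the two example suggestions are all assumptions.

```python
def composite_state_index(physio_scores, face_score):
    """Combine normalized physiological scores with a facial-image score.

    physio_scores maps a parameter name to a score in [0, 1]; face_score is
    a score derived from the current facial image. An equal-weight average
    is an assumption, not the patent's formula.
    """
    values = list(physio_scores.values()) + [face_score]
    return sum(values) / len(values)

def advise(current_index, reference):
    """Compare the current index with the user's reference and suggest."""
    if current_index >= reference:
        return "rest more"        # example suggestions only
    return "exercise more"

current = composite_state_index({"heart_rate": 0.8, "blood_oxygen": 0.9}, 0.7)
print(round(current, 2))                # 0.8
print(advise(current, reference=0.75))  # rest more
```

The second design would differ only in the last step: add the reference to the current index and compare the sum against a preset threshold instead of comparing index and reference directly.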
With reference to the first aspect, in certain possible implementation manners of the first aspect, the method further includes: and guiding the user to exercise until the blood oxygen concentration of the user is not higher than a first preset threshold.
In the application, the wearable device can monitor the blood oxygen concentration of the user and guide the user to exercise until the blood oxygen concentration of the user is kept in a normal range.
With reference to the first aspect, in certain possible implementation manners of the first aspect, the method further includes: judging whether the confidence level of a photoplethysmography (PPG) signal reaches a second preset threshold, where the PPG signal is used for detecting the blood oxygen concentration, and the confidence level of the PPG signal reflects how reliable the PPG signal is; and the guiding the user to exercise until the blood oxygen concentration of the user is not higher than a first preset threshold includes: in a case that the confidence level of the PPG signal reaches the second preset threshold, guiding the user to exercise until the blood oxygen concentration of the user is not higher than the first preset threshold.
The wearable device can judge whether the confidence coefficient of the PPG signal reaches a second preset threshold or not, and instruct the user to exercise until the blood oxygen concentration of the user is not higher than the first preset threshold under the condition that the confidence coefficient of the PPG signal reaches the second preset threshold. Therefore, the problem of inaccurate blood oxygen concentration caused by inaccurate PPG signals can be avoided, and the accuracy of the detected blood oxygen concentration is improved.
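The confidence-gated guidance loop can be sketched as follows. The two threshold values and the return convention are illustrative assumptions; the patent leaves the preset values open.

```python
def guide_exercise(ppg_confidence, spo2_stream,
                   second_preset_threshold=0.9, first_preset_threshold=0.95):
    """Guide the user to exercise only when the PPG signal is reliable.

    Returns the number of guidance steps issued before the blood oxygen
    concentration was no higher than the first preset threshold, or None
    when the PPG confidence did not reach the second preset threshold
    (an unreliable signal should not drive guidance at all).
    """
    if ppg_confidence < second_preset_threshold:
        return None                         # unreliable PPG: do not act on it
    steps = 0
    for spo2 in spo2_stream:
        if spo2 <= first_preset_threshold:  # "not higher than" the threshold
            return steps
        steps += 1                          # keep guiding the user to exercise
    return steps

print(guide_exercise(0.95, [0.98, 0.97, 0.94]))  # 2
print(guide_exercise(0.50, [0.98]))              # None
```

Gating on confidence first mirrors the stated goal: avoid acting on an inaccurate blood oxygen concentration derived from an inaccurate PPG signal.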
In a second aspect, the present application provides a wearable device, capable of implementing the method of any one of the first aspect and the possible implementation manner of the first aspect. The device comprises corresponding units for performing the above-described method. The units comprised by the device may be implemented in software and/or in hardware.
In a third aspect, the present application provides a wearable device comprising a processor. The processor is coupled to the memory and operable to execute a computer program in the memory to implement the method of the first aspect and any one of the possible implementations of the first aspect.
Optionally, the apparatus further comprises a memory for storing computer readable instructions which are read by the processor so that the apparatus can implement the method as described in the first aspect and any possible implementation of the first aspect.
Optionally, the device may further comprise a communication interface for the device to communicate with other devices, which may be, for example, a transceiver, a circuit, a bus, a module or other type of communication interface.
In a fourth aspect, the present application provides a computer readable storage medium having stored therein a computer program or instructions which, when executed, implement the method of the first aspect and any one of the possible implementations of the first aspect.
In a fifth aspect, the present application provides a computer program product comprising instructions which, when executed, implement the method of the first aspect and any one of the possible implementations of the first aspect.
In a sixth aspect, the present application provides a chip system comprising at least one processor for supporting the implementation of the functions involved in any of the first aspect and any of the possible implementations of the first aspect, e.g. for receiving or processing data involved in the method described above, etc.
In one possible design, the system on a chip further includes a memory to hold program instructions and data, the memory being located either within the processor or external to the processor.
The chip system may be formed of a chip or may include a chip and other discrete devices.
It should be understood that, the second aspect to the sixth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the advantages obtained by each aspect and the corresponding possible embodiments are similar, and are not repeated.
Drawings
Fig. 1 is a schematic structural diagram of a wearable device according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of a method for monitoring a wearable device provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of data of different users provided by an embodiment of the present application;
FIG. 4 is a schematic flow chart of evaluating a user comprehensive state index according to an embodiment of the present application;
FIG. 5 is a schematic flow chart of guiding a user to exercise provided by an embodiment of the present application;
FIG. 6 is a schematic block diagram of a wearable device provided by an embodiment of the present application;
fig. 7 is yet another schematic block diagram of a wearable device provided by an embodiment of the present application.
Detailed Description
The technical scheme of the application will be described below with reference to the accompanying drawings.
In order to clearly describe the technical solution of the embodiments of the present application, the following description is first made:
First, in the embodiments of the present application, the words "first" and "second" are used to distinguish between identical or similar items that have substantially the same function and effect. For example, the first preset threshold and the second preset threshold merely distinguish different preset thresholds without limiting their order. Those skilled in the art will appreciate that the words "first" and "second" do not limit the number or the execution order, nor do they imply that the items must be different.
Second, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
Third, in embodiments of the present application, "a plurality" means two or more. "one or more of the following" or similar expressions thereof, refers to any combination of these items, including any combination of single or plural items. For example, one or more of a, b, or c may represent: a, b, c; a and b; a and c; b and c; or a and b and c.
Fourth, in the embodiments of the present application, a wearable device may also be referred to as a wearable smart device, a general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as smart watches, bracelets, and gloves. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not merely a hardware device; it can also implement powerful functions through software support, data interaction, and cloud interaction. Broadly, wearable smart devices include full-featured, large-sized devices that can implement complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on a certain type of application function and need to be used together with other devices such as smartphones, for example various smart bracelets and smart jewelry for physical sign monitoring.
Illustratively, fig. 1 shows a schematic structural diagram of a wearable device 100. As shown in fig. 1, the device 100 may include a processor 110, a radio frequency unit 120, a power supply 130, a memory 140, an input unit 150, a display unit 160, a sensor 170, an audio circuit 180, a wireless fidelity (Wi-Fi) module 190, and a bluetooth module 1100.
Processor 110 is a control center of device 100 that utilizes various interfaces and lines to connect various portions of the overall device, and performs various functions of the device and processes the data by running or executing software programs and/or modules stored in memory 140, and invoking data stored in memory 140, thereby performing overall monitoring of the device. The processor 110 may include one or more processing units. Preferably, the processor 110 may integrate an application processor and a modem processor. Optionally, the application processor primarily handles operating systems, user interfaces, applications, etc., and the modem processor primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The radio frequency unit 120 may be used for receiving and transmitting signals during information reception or a call: specifically, it receives downlink information from the base station and delivers it to the processor 110 for processing, and it transmits uplink data to the base station. Typically, the radio frequency unit 120 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 120 may also communicate with other devices via a wireless communication network. The wireless communication may use any communication standard or protocol, including but not limited to global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access 2000 (CDMA 2000), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), frequency division duplex long term evolution (FDD-LTE), and time division duplex long term evolution (TDD-LTE), among others.
The power supply 130 (e.g., a battery) is used to power the various components, and preferably the power supply 130 may be logically connected to the processor 110 through a power management system, so as to perform functions of managing charging, discharging, and power consumption management through the power management system.
The memory 140 may be used to store software programs and modules; the processor 110 performs the device's various functional applications and data processing by running the software programs and modules stored in the memory 140. The memory 140 may mainly include a program storage area and a data storage area: the program storage area may store an operating system, application programs required for at least one function (such as a sound playing function and an image playing function), a boot loader, and so on; the data storage area may store data created according to the use of the device (such as audio data and phonebooks). In addition, the memory 140 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The input unit 150 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the device. Specifically, the input unit 150 may include a touch panel 1501 and other input devices 1502. The touch panel 1501, also referred to as a touch screen, can collect touch operations by a user on or near it (for example, operations performed on or near the touch panel 1501 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection apparatus according to a preset program. Optionally, the touch panel 1501 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection apparatus, converts it into touch point coordinates, sends the coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1501, the input unit 150 may include other input devices 1502, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and a switch key), a trackball, a mouse, and a joystick.
The display unit 160 may be used to display information input by the user or provided to the user, as well as the device's various menus. The display unit 160 may include a display panel 1601, which may optionally be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like. Further, the touch panel 1501 may cover the display panel 1601; when the touch panel 1501 detects a touch operation on or near it, the operation is transferred to the processor 110 to determine the type of touch event, and the processor 110 then provides a corresponding visual output on the display panel 1601 according to the type of touch event. Although in fig. 1 the touch panel 1501 and the display panel 1601 are two separate components implementing the input and output functions of the device, in some embodiments the touch panel 1501 may be integrated with the display panel 1601 to implement the input and output functions of the device.
The device 100 may also include at least one sensor 170, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 1601 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 1601 or the backlight when the device is moved to the ear.
On one hand, the audio circuit 180 may convert received audio data into an electrical signal and transmit it to the speaker 1801, which converts the electrical signal into a sound signal for output; on the other hand, the microphone 1802 converts collected sound signals into electrical signals, which the audio circuit 180 receives and converts into audio data. The audio data is then processed by the processor 110 and sent, for example, to another electronic device via the radio frequency unit 120, or output to the memory 140 for further processing.
Wi-Fi is a short-range wireless transmission technology. Through the Wi-Fi module 190, the device can help users send and receive e-mail, browse web pages, access streaming media, and so on, providing users with wireless broadband internet access. Although fig. 1 shows the Wi-Fi module 190, it is not an essential component of the device and may be omitted as needed without changing the essence of the invention.
Bluetooth technology is also a short-range wireless transmission technology, and the device 100 can establish bluetooth connection with other devices with bluetooth modules through the bluetooth module 1100, so as to perform data transmission based on a bluetooth communication link. The bluetooth module 1100 may be a bluetooth low energy (bluetooth low energy, BLE) module according to the actual needs.
In an embodiment of the present application, the device 100 may further include a camera, which may be used to identify a user wearing the wearable device, so as to determine the identity of the user.
It is to be understood that the illustrated construction of the present application does not constitute a particular limitation of the apparatus 100. In other embodiments, the apparatus 100 may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
As people pay increasing attention to health, the demand for real-time monitoring of physiological parameters of the human body, such as heart rate and blood oxygen, is also growing. At present, such physiological parameters can be monitored in real time by wearable devices. Common wearable devices include, but are not limited to, wristbands, watches, and gloves.
In some scenarios, a wearable device may be unable to match the recorded physiological data to a specific user. For example, when user A and user B both use the same wearable device, even if they are logged into the same account, the device may be unable to distinguish user A's data from user B's data.
Therefore, the embodiment of the application provides a monitoring method of a wearable device, the wearable device can identify the identity of a user wearing the wearable device, monitor each physiological parameter in a physiological parameter set, and obtain data of each physiological parameter, wherein the data carries a user tag for identifying that the data belongs to the user, so that even if a plurality of users use the wearable device, the data of different users can be distinguished based on the user tag, and confusion of the data of different users is avoided.
It should be noted that the user identity, identity information, and the like involved in this application are data authorized by the user or fully authorized by all parties. The collection, use, and processing of such data must comply with the relevant laws, regulations, and standards of the relevant countries and regions, and a corresponding operation entrance is provided for the user to choose to authorize or refuse.
The following will describe in detail a method for monitoring a wearable device according to an embodiment of the present application with reference to the accompanying drawings.
It should be understood that the embodiments below describe the method from the perspective of the wearable device, but this should not constitute any limitation on the execution body of the method. The method provided by the embodiment of the present application can be executed by any entity capable of running a program that records the code of the method. For example, the wearable device may be replaced with a component configured in the wearable device (e.g., a chip or a system-on-chip) or another functional module capable of invoking and executing a program. The embodiment of the present application is not limited thereto.
Fig. 2 is a schematic flowchart of a method 200 for monitoring a wearable device according to an embodiment of the present application. The method 200 shown in fig. 2 may include steps 210 and 220. The various steps in method 200 are described in detail below.
Step 210, identify a user wearing the wearable device to determine an identity of the user.
The wearable device may identify, i.e., determine, the identity of the user wearing the device.
One possible implementation is that, in response to a user operation, the wearable device identifies the user wearing it. The user operations include, but are not limited to, any operation that triggers the wearable device to identify the user; the application is not limited in this respect.
For example, the wearable device may automatically identify the user after detecting that the user wears the wearable device.
For another example, after the wearable device detects that the user has worn the wearable device, the user is identified in response to an operation of lifting the wrist by the user, such as face recognition of the user by the camera.
For another example, after the wearable device detects that the user has worn it, it performs identity recognition on the user in response to an operation of the user entering an identity recognition interface, for example, face recognition through a camera. The identity recognition interface may be, for example, an interface for collecting a face image: the user places his or her face in a specified area so that the wearable device can recognize the face and determine the user's identity.
Performing face recognition on the user through the camera includes: collecting an image or video stream containing a face through the camera on the wearable device, and automatically detecting and tracking the face in the image, so that face recognition (also referred to as portrait recognition or facial recognition) is performed on the detected face. For a more detailed face recognition method, reference may be made to known techniques, which are not described in detail herein.
It will be appreciated that, in the case where the user operation is lifting the wrist, misrecognition or recognition failure may occur. For example, when the user lifts the wrist and the wearable device performs identity recognition (e.g., face recognition), a face image of another person may be captured one or more times, causing misrecognition. Suppose user A wears the wearable device and is being monitored, but in response to one wrist-lift operation by user A, the wearable device captures a face image of user B. The wearable device may then determine the identity of user A according to a plurality of recognition results within a preset period. For example, the wearable device may identify the user multiple times within the preset period, and if user A is recognized more times than user B, the monitored data carries the user tag of user A.
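The majority-vote idea above can be sketched as follows. This is a minimal Python illustration, not the patent's actual implementation: recognition attempts within the preset period are collected, failed attempts are ignored, and the identity recognized most often wins.

```python
from collections import Counter

def resolve_identity(recognition_results):
    """Pick the identity recognized most often within the preset period.

    recognition_results: list of user IDs, one per recognition attempt;
    failed attempts may be recorded as None and are ignored.
    """
    votes = Counter(r for r in recognition_results if r is not None)
    if not votes:
        return None  # no successful recognition in the period
    return votes.most_common(1)[0][0]

# User A wears the device; one frame mistakenly captures user B's face,
# but A still has the majority of recognition results.
print(resolve_identity(["A", "A", "B", "A", None]))
```

In practice the device would weight results by recognition confidence rather than counting them equally, but the counting rule matches the "more recognition results" criterion described above.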
In yet another example, the user lifts the wrist but the wearable device fails to recognize the user (e.g., does not capture a face image). In that case, after the next successful recognition, the wearable device may attach the user's tag both to the previously monitored data and to the data monitored after the identity was recognized. For example, after the user puts on the wearable device, the device fails to recognize the user on the first wrist lift but succeeds on the second. The wearable device may then attach the user's tag to the data monitored both before and after the second recognition; that is, all data monitored from when the device was put on until it is removed carries that user's tag.
It will be appreciated that, in the case where the wearable device has not identified the user, one possible design is that the wearable device stores the acquired data as data of an undetermined user and prompts the user the next time the device is worn; the user can then manually classify or delete the data. If the user does not act after multiple reminders, the wearable device can automatically delete the data after a certain number of reminders. For example, a validity period can be preset in the wearable device, that is, the length of time after which the data is automatically deleted.
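One way to realize this "undetermined user" design is sketched below. The class name, record shape, and seven-day validity period are all assumptions made for illustration; the patent leaves the validity period configurable.

```python
import time

VALIDITY_PERIOD = 7 * 24 * 3600  # assumed 7-day validity period, in seconds

class PendingStore:
    """Holds monitored data whose owner has not yet been identified."""

    def __init__(self):
        self.records = []  # list of (timestamp, sample)

    def add(self, sample, now=None):
        self.records.append((now if now is not None else time.time(), sample))

    def purge_expired(self, now=None):
        """Drop records older than the validity period (automatic deletion)."""
        now = now if now is not None else time.time()
        self.records = [(t, s) for t, s in self.records
                        if now - t < VALIDITY_PERIOD]

    def assign_to(self, user_tag):
        """User chose a classification: tag all pending data and clear the store."""
        tagged = [{"tag": user_tag, "sample": s} for _, s in self.records]
        self.records = []
        return tagged
```

The `purge_expired` step would typically run when the device is next worn, alongside the reminder prompt described above.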
Yet another possible design is that the wearable device may discard the monitored data directly, or store it for a period of time; the application is not limited in this regard.
It can be appreciated that the above face recognition can be based on a first camera module on the wearable device. The wearable device may further include a second camera module, which may include a plurality of camera units and a photosensitive unit and can be used to identify and confirm the user's identity more accurately. For example, if the wearable device has a folding-screen form, the first camera module can be used to identify the user when the folding screen is closed, and the second camera module can be used when the folding screen is not closed (i.e., when it is open). Identifying the user with the second camera module can greatly improve recognition accuracy, and thus the accuracy of classifying and grouping user data.
Optionally, the wearable device may identify the user wearing the device in one of the following ways: face recognition, fingerprint recognition, or voiceprint recognition.
In an example, the wearable device may obtain a facial image of the user through the camera to perform face recognition on the user and determine the user's identity.
In yet another example, the wearable device may determine the identity of the user by obtaining a fingerprint of the user to fingerprint the user.
As yet another example, the wearable device may determine the identity of the user by obtaining the user's voice to voiceprint identify the user.
It should be understood that the foregoing manner of identifying a user is merely an example, and should not be construed as limiting the embodiments of the present application in any way. In other embodiments, the wearable device may identify the user in other manners, so long as the identity of the user can be uniquely identified, and the manner of identifying is not specifically limited in the present application.
Optionally, before identifying the user wearing the device in response to the user operation, the method 200 further includes: whether the user wears the wearable device is detected, i.e., a wearing state of the wearable device is determined, wherein the wearing state includes unworn or worn.
For example, the wearable device may detect its wearing state through a sensor; a smart watch, for instance, may use a capacitive sensor for wearing detection. The sensing end of the capacitive sensor is arranged on the back of the watch, and after the user puts on the watch, the sensor detects the sensed capacitance and thereby determines the current state of the watch. For the manner in which the wearable device detects the wearing state, reference may be made to known techniques, which are not described in detail herein.
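A toy version of this capacitance-based wearing detection might look like the following; the threshold value is purely assumed and would in practice be calibrated per device.

```python
WEAR_CAPACITANCE_THRESHOLD = 5.0  # assumed value; device-specific in practice

def wearing_state(sensed_capacitance):
    """Map the capacitance sensed on the back of the watch to a wearing state."""
    if sensed_capacitance >= WEAR_CAPACITANCE_THRESHOLD:
        return "worn"    # skin contact raises the sensed capacitance
    return "unworn"
```

Real implementations additionally debounce the reading over time so that a brief touch is not mistaken for wearing.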
Optionally, in the case that the identity of the user is not identified, the method 200 further includes: prompting the user to input the identity information of the user, wherein the identity information is information for identifying the user, and acquiring the identity information of the user in response to the user operation.
The identity information may be used, for example, to identify the user next time.
It can be appreciated that, when the wearable device does not recognize the user's identity and no identity information of the user is stored in the device, the wearable device may prompt the user to enter identity information for future identity recognition and, in response to the user operation, acquire the identity information of the user.
The identity information may be a face image, a fingerprint, or a voiceprint of the user, which is not limited in the embodiment of the present application.
In addition, the wearable device may enter identity information for a plurality of different users. The wearable device may also enter multiple identity information of the same user, e.g., the wearable device enters a face image, fingerprint, voiceprint, etc. of the user. The application is not limited in this regard.
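Storing several users, each with several kinds of identity information, can be as simple as a nested mapping; the function name and template strings below are illustrative only.

```python
def enroll(store, user, kind, template):
    """Record one piece of identity information (face, fingerprint, voiceprint) for a user."""
    store.setdefault(user, {})[kind] = template

identity_store = {}
enroll(identity_store, "user_a", "face", "face_template_a")
enroll(identity_store, "user_a", "fingerprint", "fp_template_a")  # same user, second credential
enroll(identity_store, "user_b", "voiceprint", "vp_template_b")   # a different user
```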
Step 220, monitoring the user to obtain a first data set.
The first data set comprises data obtained by monitoring one or more physiological parameters in a physiological parameter set of a user and a user tag, wherein the user tag is used for identifying the user.
It can be seen that, for each physiological parameter in the physiological parameter set, the data obtained by monitoring the user by the wearable device can carry the user tag of the user, so that the data can be determined to belong to the user.
For example, the data of user a carries the user tag of user a, and the data of user B carries the user tag of user B, so that the data of user a and the data of user B can be distinguished.
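Concretely, a monitored data point carrying a user tag might be modeled like this; the field names are illustrative, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    """One monitored value of one physiological parameter, carrying a user tag."""
    user_tag: str    # identifies the user the data belongs to
    parameter: str   # e.g. "heart_rate", "blood_oxygen"
    value: float
    timestamp: float

first_data_set = [
    Sample("user_a", "heart_rate", 72.0, 0.0),
    Sample("user_b", "heart_rate", 80.0, 60.0),
]
# The tag alone is enough to tell user A's data from user B's.
```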
It should be understood that the present application does not limit the order of steps 210 and 220. For example, the wearable device may monitor the user's physiological parameters before the identity is recognized; the user-tag portion of the data may be left blank at first, and after the identity is recognized, the user's tag is attached to the data. For another example, the wearable device may recognize the user's identity first and then monitor the physiological parameters, in which case the data directly carries the user tag.
Optionally, the physiological parameters in the set of physiological parameters include one or more of: heart rate, blood oxygen, blood pressure, temperature, heart rate variability, or respiratory rate, etc. The embodiment of the application does not limit the physiological parameters in the physiological parameter set.
Optionally, the method 200 further includes: and displaying the data of the user in response to user operation, wherein the data of the user comprises data obtained by monitoring one or more physiological parameters in the physiological parameter set of the user.
For example, in response to a user's operation of a setting option in an application corresponding to a certain physiological parameter, the wearable device may display data of the physiological parameter of the user, and for example, in response to a user's operation of a setting option of the wearable device, the wearable device may also directly display data of a plurality of physiological parameters of the user.
Optionally, in response to a user operation, displaying data of the user includes: in response to a first user operation, displaying at least one user tag corresponding to at least one user including the user, each user tag being used to identify one user, the at least one user corresponding to at least one data set, each data set including data obtained by monitoring the corresponding user for each physiological parameter in the physiological data set; and displaying the data of the user in response to a second user operation.
In an example, a user enters an application corresponding to a certain physiological parameter, such as a heart rate, and then the wearable device displays at least one user tag of the user, such as a user tag of the user a and a user tag of the user B, in response to a user clicking on the user tag of the user a, and displays data of the heart rate of the user a.
It will be appreciated that if the wearable device stores data of one user, the wearable device may display a user tag of the user in response to a first user operation, and further display the data of the user in response to a second user operation; the wearable device may also display data of the user directly in response to the first user operation. The application is not limited in this regard.
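The two-step display flow described above (first show the user tags, then show one user's data set) can be sketched with plain dictionaries; the sample values are made up.

```python
samples = [
    {"tag": "user_a", "param": "heart_rate", "value": 72},
    {"tag": "user_b", "param": "heart_rate", "value": 80},
    {"tag": "user_a", "param": "blood_oxygen", "value": 98},
]

def user_tags(samples):
    """First user operation: list the user tags that have data."""
    return sorted({s["tag"] for s in samples})

def data_for_tag(samples, tag):
    """Second user operation: show the data set belonging to one user tag."""
    return [s for s in samples if s["tag"] == tag]
```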
In yet another example, the user wishes to view the data of all physiological parameters of user a, and the wearable device displays at least one user's user tag, such as user a's user tag and user B's user tag, in response to a user operation, such as a user operation of a setting option of the wearable device, and displays the data of all physiological parameters of user a in response to a user clicking on the user tag of user a.
Fig. 3 is a schematic diagram of data of different users according to an embodiment of the present application.
As shown in fig. 3, two users, user A and user B, wear and use the wearable device. The abscissa represents time, and the time unit may be hours, minutes, seconds, and the like, which is not limited by the present application. The ordinate represents the user's heart rate. Black circles represent data of user A, and white circles represent data of user B. For example, if user A uses the device during an earlier period and user B during a later period, the wearable device records data of both users. The data of user A and user B can be displayed on a device connected to the wearable device, such as a mobile phone, or on the display screen of the wearable device itself, which is not limited in this application.
It should be understood that the user may choose to display only the data of user a, only the data of user B, or both the data of user a and the data of user B, for comparison, which is not limited by the present application.
It should also be appreciated that the wearable device may also give more analysis information based on the monitored data, such as average heart rate, highest heart rate, lowest heart rate, heart rate variability, etc. over a period of time. The application is not limited in this regard.
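The extra analysis information mentioned above (average, highest, and lowest heart rate over a period) is straightforward to derive from one user's tagged data; a minimal sketch:

```python
def heart_rate_summary(values):
    """Summarize a period of heart-rate readings for one user."""
    return {
        "average": sum(values) / len(values),
        "highest": max(values),
        "lowest": min(values),
    }

print(heart_rate_summary([60, 70, 80]))  # {'average': 70.0, 'highest': 80, 'lowest': 60}
```

Heart rate variability would need beat-to-beat intervals rather than these aggregate readings, so it is omitted here.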
Optionally, the method 200 further includes: acquiring a current facial image of a user; based on the monitored data of the physiological parameters of the user and the facial image, obtaining a current comprehensive state index of the user, wherein the comprehensive state index is used for evaluating the current physical state of the user; the user is prompted with the composite state index.
The wearable device may obtain a comprehensive state index from the monitored data of the physiological parameter and the facial image, where the comprehensive state index is derived by combining the data of the physiological parameter of the user and the facial image and is used for evaluating the physical state of the user, and the facial expression, the facial state, and the like of the user can be seen from the facial image.
For example, the wearable device may comprehensively analyze the data of the monitored physiological parameters and the data of the facial image through a neural network model to obtain a comprehensive state index of the user.
Optionally, the method 200 further includes: determining a reference of the comprehensive state index based on data of various physiological parameters in a preset physiological parameter set of a user and a standard certificate photograph of the user; based on the benchmark and the user's current composite state index, a suggestion is made to the user.
The wearable device may obtain the reference of the composite state index according to the preset data of each physiological parameter and a standard ID photo of the user, and make a suggestion to the user according to the reference and the user's current composite state index. For example, when the user's current composite state index is greater than or equal to the reference, the user is advised to take a good rest; when it is smaller than the reference, the user is advised to do more exercise, and so on.
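The comparison against the reference amounts to a trivial rule; the threshold mapping follows the description above, while the advice strings themselves are illustrative.

```python
def advise(current_index, reference):
    """Suggest rest or exercise by comparing the current composite state index
    to the user's reference."""
    if current_index >= reference:
        return "take a good rest"
    return "do more exercise"
```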
It should be understood that the wearable device may further obtain the current composite state index according to the current facial image and the current data obtained by monitoring the user's physiological parameters, combine the current index with the reference (for example, by summation) to obtain a final composite state index, and compare that index with a preset threshold to give the user a suggestion. The specific suggestions are not limited in the present application.
The preset data of the physiological parameters include, for example, a resting heart rate, a standard blood oxygen range, and the like. The standard ID photo is entered in advance by the user; for the method by which the wearable device recognizes it, reference may be made to known techniques, which are not described in detail herein.
It will be appreciated that the user may evaluate the composite state index via an application, e.g., in response to a user entering an application, the wearable device evaluates the composite state index of the user, and in more detail, the wearable device may obtain data of various physiological parameters of the user and facial images to derive the composite state index.
The application may be used to evaluate the comprehensive state index of the user, and the application may be an application deployed separately, or may be an application corresponding to a physiological parameter, such as an application corresponding to blood oxygen, an application corresponding to heart rate, etc., which is not limited in the present application.
Optionally, the method 200 further includes: guiding the user to exercise until the blood oxygen concentration of the user is not higher than a first preset threshold.
The wearable device may guide the user to exercise, for example, guide the user to take deep breaths, hold his or her breath, and the like, and monitor the user's blood oxygen in real time until the blood oxygen concentration is not higher than the first preset threshold.
Optionally, the method 200 further includes: judging whether the confidence coefficient of the PPG signal reaches a second preset threshold, wherein the PPG signal is used for detecting the blood oxygen concentration, and the confidence coefficient of the PPG signal is used for reflecting the reliability degree of the PPG signal; and guiding the user to exercise until the blood oxygen concentration of the user is not higher than a first preset threshold, comprising: and under the condition that the confidence coefficient of the PPG signal reaches a second preset threshold, guiding the user to exercise until the blood oxygen concentration of the user is not higher than the first preset threshold.
It can be appreciated that the monitored blood oxygen concentration may be inaccurate because the PPG signal is inaccurate. The wearable device may therefore first determine the confidence of the PPG signal and, when the confidence reaches (i.e., is greater than or equal to) the second preset threshold, guide the user to exercise until the blood oxygen concentration is not higher than the first preset threshold. When the confidence of the PPG signal is smaller than the second preset threshold, the user is guided to adjust his or her posture until the signal confidence reaches the second preset threshold.
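The gating described in this paragraph can be sketched as a small decision function. Both threshold values below are assumed for illustration, and "blood oxygen not higher than the first preset threshold" is modeled as a simple comparison.

```python
FIRST_PRESET_THRESHOLD = 0.95   # assumed target blood-oxygen level
SECOND_PRESET_THRESHOLD = 0.8   # assumed minimum PPG-signal confidence

def next_guidance(ppg_confidence, blood_oxygen):
    """Decide what to guide the user to do next, per the confidence gating above."""
    if ppg_confidence < SECOND_PRESET_THRESHOLD:
        return "adjust posture"   # signal unreliable: fix the measurement first
    if blood_oxygen > FIRST_PRESET_THRESHOLD:
        return "keep exercising"  # e.g. deep breathing, breath holding
    return "stop"                 # blood oxygen no longer above the threshold
```

Checking the signal confidence before acting on the reading avoids guiding the user based on a measurement that is itself untrustworthy.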
It should be understood that the present application is not particularly limited to the values of the first preset threshold and the second preset threshold.
The process of evaluating the user's integrated state index will be described in detail with reference to fig. 4.
Fig. 4 is a schematic flow chart of evaluating a comprehensive state index of a user according to an embodiment of the present application.
Step 410, launch the application.
The application may be used to evaluate the comprehensive state index of the user, and the application may be an application deployed separately, or may be an application corresponding to a physiological parameter, such as an application corresponding to blood oxygen, an application corresponding to heart rate, etc., which is not limited in the present application.
In response to a user operation (e.g., clicking on an application icon), the wearable device launches the application.
Step 420, user identity is confirmed.
After the wearable device starts the application, the identity of the user can be further confirmed, such as through face recognition, fingerprint recognition, voiceprint recognition, or the like.
At step 430, data of the physiological parameter is obtained.
For example, the wearable device may obtain data of physiological parameters, such as heart rate, blood oxygen, heart rate variability, blood pressure, etc., through the PPG signal.
Step 440, a facial image is acquired.
Illustratively, the wearable device may acquire a facial image of the user through a camera.
It should be understood that step 430 and step 440 may be performed simultaneously or separately, and when they are performed separately, the present application is not limited in their sequence.
It should also be appreciated that, to facilitate distinguishing the confidence levels of the PPG signal and the camera described above, the confidence level corresponding to the PPG signal is denoted as confidence level 1, and the confidence level corresponding to the camera is denoted as confidence level 2.
Step 450, determine whether the confidence level 1 reaches the preset threshold 1.
The preset threshold 1 may be a threshold value of confidence of the PPG signal configured in advance by the user. For example, the preset threshold 1 may be an example of a second preset threshold.
Step 460, determining whether the confidence level 2 reaches the preset threshold 2.
The preset threshold 2 may be a threshold value of confidence of the camera configured in advance by the user.
The wearable device may determine whether the confidence level 1 and the confidence level 2 reach (i.e., are greater than or equal to) the corresponding preset thresholds, respectively, and if the confidence level 1 reaches the preset threshold 1 and the confidence level 2 reaches the preset threshold 2, the wearable device continues to execute step 470. If the confidence coefficient 1 is smaller than the preset threshold 1 or the confidence coefficient 2 is smaller than the preset threshold 2, ending.
It should be understood that step 450 and step 460 may be performed simultaneously or separately, and when they are performed separately, the present application is not limited in their sequence.
It should also be appreciated that steps 450 and 460 are merely examples intended to improve the accuracy of the acquired physiological-parameter data and facial images, and are not intended to limit the embodiments of the present application in any way. For example, in other embodiments, the wearable device may skip steps 450 and 460, that is, confidence 1 and confidence 2 need not be determined; the embodiment of the present application is not limited in this regard.
Step 470, calculate the user's current composite state index.
The wearable device can calculate the current comprehensive state index of the user based on the data of the physiological parameters and the facial image through the trained neural network model.
Step 480, suggesting to the user based on the reference of the integrated state index and the current integrated state index.
After calculating the user's current composite state index, the wearable device can prompt the user with the index. It may also give the user a suggestion based on the reference of the composite state index and the current composite state index. For example, when the user's current composite state index is greater than or equal to the reference, the user is advised to take a good rest; when it is smaller than the reference, the user is advised to do more exercise, and so on.
The reference of the composite state index is obtained from the preset data of each physiological parameter and the standard ID photo. The preset data of the physiological parameters include, for example, a resting heart rate, a standard blood oxygen range, and the like.
It can be understood that the wearable device may further obtain the current composite state index according to the current facial image and the current data obtained by monitoring the user's physiological parameters, combine the current index with the reference (for example, by summation) to obtain a final composite state index, and compare that index with a preset threshold to give the user a suggestion. The specific suggestions are not limited in the present application.
The process of guiding the user's exercise will be described in detail with reference to fig. 5.
Fig. 5 is a schematic flow chart of guiding a user to exercise according to an embodiment of the present application.
Step 501, an application is started.
The application may be used for guiding the user to exercise, and the application may be an application deployed separately, or may be an application corresponding to a physiological parameter, such as an application corresponding to blood oxygen, an application corresponding to heart rate, etc., which is not limited in the present application.
In response to a user operation (e.g., clicking on an application icon), the wearable device launches the application.
Step 502, reminding a user to wear the wearable device correctly, and aligning the camera to the face.
The wearable device reminds the user to wear it correctly so that the acquired PPG signal is more accurate, and reminds the user to aim the camera at the face so as to acquire the user's facial image. Acquiring the facial image serves two purposes: on one hand, the wearable device can identify the user to determine the user's identity; on the other hand, data such as the user's facial expression and state can be obtained from the facial image.
In step 503, data of physiological parameters is acquired.
For example, the wearable device may obtain data of physiological parameters, such as heart rate, blood oxygen, heart rate variability, blood pressure, etc., through the PPG signal.
In step 504, a facial image is acquired.
Illustratively, the wearable device may acquire the user's facial image through a camera. The facial image is acquired to obtain facial data of the user, so as to recognize the user's current state, confirm that the user remains within the camera's shooting range, and recognize normal states such as blinking and breathing, thereby avoiding accidents such as fainting due to hypoxia.
It should be understood that step 503 and step 504 may be performed simultaneously or separately, and when they are performed separately, the present application is not limited to the order of the steps.
It should also be appreciated that, to facilitate distinguishing the confidence levels of the PPG signal and the camera described above, the confidence level corresponding to the PPG signal is denoted as confidence level 1, and the confidence level corresponding to the camera is denoted as confidence level 2.
Step 505, determine whether the confidence level 1 reaches a preset threshold 1.
The preset threshold 1 may be a threshold value of confidence of the PPG signal configured in advance by the user. For example, the preset threshold 1 may be an example of a second preset threshold.
Step 506, determining whether the confidence level 2 reaches the preset threshold 2.
The preset threshold 2 may be a threshold value of confidence of the camera configured in advance by the user.
The wearable device may determine whether the confidence level 1 and the confidence level 2 reach (i.e. are greater than or equal to) the corresponding preset thresholds, respectively, and if the confidence level 1 reaches the preset threshold 1 and the confidence level 2 reaches the preset threshold 2, the wearable device continues to execute step 509. If the confidence level 1 is less than the preset threshold 1, the wearable device performs step 507; if the confidence level 2 is less than the preset threshold 2, the wearable device performs step 508.
It should be understood that step 505 and step 506 may be performed simultaneously or separately, and when they are performed separately, the present application is not limited in order.
It should also be understood that steps 505 and 506 are merely examples intended to improve the accuracy of the acquired physiological-parameter data and facial images, and are not intended to limit the embodiments of the present application in any way. For example, in other embodiments, the wearable device may skip steps 505 and 506, that is, confidence 1 and confidence 2 need not be determined, or the wearable device may perform only one of the steps (for example, only determining whether confidence 1 reaches the preset threshold 1); the embodiment of the present application is not limited in this regard.
Step 507 directs the user to adjust the wearable device.
If the confidence coefficient 1 is smaller than the preset threshold 1, the wearable device guides the user to adjust the wearable device, such as guiding the user to adjust the tightness degree of the wearable device, adjusting the wearing position of the wearable device, and the like.
Step 508, guiding the user to adjust the facial position.
If the confidence level 2 is less than the preset threshold 2, the wearable device directs the user to adjust the facial position so that the camera is aimed at the user's face.
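Steps 505 through 509 amount to routing on the two confidences; a sketch follows, with both threshold values assumed.

```python
PRESET_THRESHOLD_1 = 0.8  # assumed PPG-confidence threshold
PRESET_THRESHOLD_2 = 0.8  # assumed camera-confidence threshold

def route(confidence_1, confidence_2):
    """Pick the next step of fig. 5 from the two confidence checks."""
    if confidence_1 < PRESET_THRESHOLD_1:
        return "step 507: adjust the wearable device"
    if confidence_2 < PRESET_THRESHOLD_2:
        return "step 508: adjust the facial position"
    return "step 509: inhale and hold breath"
```

If both confidences fall short, this sketch handles the PPG adjustment first; the patent does not fix an order between steps 507 and 508.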
Step 509, guiding the user to inhale and hold breath.
If the confidence coefficient 1 reaches the preset threshold 1 and the confidence coefficient 2 reaches the preset threshold 2, the wearable device guides the user to inhale and hold breath until the blood oxygen concentration reaches the preset threshold 3.
Step 510, breath holding timing, and acquiring data of physiological parameters and facial images in real time.
The wearable device starts timing from when the user begins holding his or her breath, and acquires data of each physiological parameter and the facial image in real time. When the blood oxygen concentration reaches the preset threshold 3, the user breathes normally.
In step 511, it is determined whether the blood oxygen concentration reaches a preset threshold 3.
If the blood oxygen concentration reaches the preset threshold 3, the wearable device performs step 512; if the blood oxygen concentration is less than the preset threshold 3, the wearable device continues to perform step 509, i.e. continues to instruct the user to inhale and hold breath until the blood oxygen concentration reaches the preset threshold 3.
Step 512, stopping breath holding and breathing normally.
If the blood oxygen concentration reaches the preset threshold 3, the wearable device instructs the user to stop holding his or her breath and to breathe normally.
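Steps 509-512 amount to a simple timing loop: keep timing the breath hold and sampling blood oxygen until preset threshold 3 is reached. A minimal sketch, assuming a `read_spo2` callback that returns the current blood oxygen concentration (all names, values, and the sample cap are illustrative assumptions, not from the patent):

```python
def breath_hold_session(read_spo2, threshold_3, max_samples=1000):
    """Time a breath hold (steps 509-512): sample blood oxygen each tick
    and stop as soon as it reaches preset threshold 3 (per step 511)."""
    elapsed = 0
    while elapsed < max_samples:
        elapsed += 1                       # step 510: breath-hold timing
        if read_spo2() >= threshold_3:     # step 511: threshold reached?
            # Step 512: stop holding breath, breathe normally.
            return elapsed, "stop holding breath and breathe normally"
    # Safety cap added for this sketch; the patent simply loops back to
    # step 509 until the threshold is reached.
    return elapsed, "continue to inhale and hold breath"
```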
It can be seen that performing such breath-holding training helps the user improve endurance, vital capacity, and the like.
Based on the above technical solution, the wearable device can identify the identity of the user wearing it and attach a user tag to the data obtained by monitoring that user. Therefore, even if multiple users use the wearable device, and even if those users log in to the same account, the data of different users can be distinguished based on the user tags, avoiding confusion between the data of different users.
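The tagging scheme described above can be illustrated with a small sketch; the record fields and helper function below are hypothetical, chosen only to show how a user tag keeps the data of several wearers apart:

```python
from dataclasses import dataclass, field

@dataclass
class MonitoringRecord:
    """One entry of the 'first data set': physiological readings plus the
    user tag that identifies which wearer the readings belong to."""
    user_tag: str
    readings: dict = field(default_factory=dict)  # parameter name -> value

def group_by_user(records):
    """Separate the records of different users sharing one device/account."""
    grouped = {}
    for record in records:
        grouped.setdefault(record.user_tag, []).append(record)
    return grouped
```

Even if two wearers log in to the same account, grouping by tag keeps their heart-rate or blood-oxygen records from being confused with each other.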
Fig. 6 is a schematic block diagram of a wearable device 600 provided by an embodiment of the application. As shown in fig. 6, the device 600 may include an identification unit 610 and a monitoring unit 620. The units in the device 600 may be used to implement the methods described in the embodiments shown in fig. 2, 4, or 5.
When the device 600 is used to implement the method described in the embodiment shown in fig. 2, the identifying unit 610 may be configured to identify a user wearing the wearable device in response to a user operation, so as to determine the identity of the user; the monitoring unit 620 may be configured to monitor the user to obtain a first data set, where the first data set includes data obtained by monitoring one or more physiological parameters of the user and a user tag, where the user tag is used to identify the user. Reference is made specifically to the detailed description of the method embodiments, and details are not described here.
The specific process of each unit executing the corresponding steps is described in detail in the above method embodiments, and for brevity, will not be described in detail herein.
The division of units in the embodiments of the present application is schematic and is merely a division by logical function; other division manners may be adopted in actual implementation. In addition, the functional units in the embodiments of the present application may be integrated in one processor, may exist alone physically, or two or more units may be integrated in one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
Fig. 7 is yet another schematic block diagram of a wearable device 700 suitable for use with embodiments of the present application.
As shown in fig. 7, the device 700 may include at least one processor 710 for implementing the functions of the wearable device in the methods provided by the embodiments of the present application. The device 700 may be a system-on-chip; in the embodiments of the present application, the system-on-chip may be formed of a chip, or may include the chip and other discrete devices.
Illustratively, when the device 700 is used to implement the method shown in fig. 2, the processor 710 may be configured to identify a user wearing the wearable device in response to a user operation to determine the identity of the user; the user is monitored to obtain a first data set, wherein the first data set comprises data obtained by monitoring one or more physiological parameters in the physiological parameter set of the user and a user tag, and the user tag is used for identifying the user. Reference is made specifically to the detailed description in the method examples, and details are not described here.
The device 700 may also include at least one memory 720 for storing program instructions and/or data, the memory 720 being coupled to the processor 710. The coupling in the embodiments of the present application is an indirect coupling or communication connection between devices, units, or modules, which may be in electrical, mechanical, or other forms and serves for information interaction between them. The processor 710 may operate in conjunction with the memory 720 and may execute the program instructions stored in it. At least one of the at least one memory may be integrated in the processor.
The device 700 may also include a communication interface 730 for communicating with other devices over a transmission medium. The communication interface 730 may be, for example, a transceiver, an interface, a bus, a circuit, or a device capable of implementing a transceiver function. The processor 710 may receive and transmit data and/or information through the communication interface 730 to implement the methods described in fig. 2, 4, or 5.
The specific connection medium among the processor 710, the memory 720, and the communication interface 730 is not limited in the embodiments of the present application. In fig. 7, the processor 710, the memory 720, and the communication interface 730 are connected by a bus 740, which is shown as a thick line; the manner in which the other components are connected is merely illustrative and not limiting. The bus may be classified into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 7, but this does not mean that there is only one bus or only one type of bus.
The application also provides a chip system comprising at least one processor for implementing the functions involved in the method described in fig. 2, 4 or 5.
In one possible design, the system on a chip further includes a memory to hold program instructions and data, the memory being located either within the processor or external to the processor.
The chip system may be formed of a chip or may include a chip and other discrete devices.
Embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program which, when executed by a computer, causes the computer to perform the method described in fig. 2, 4, or 5.
Embodiments of the present application also provide a computer program product comprising a computer program which, when run, causes a computer to perform the method described in fig. 2, 4, or 5.
It should be appreciated that the processor in the embodiments of the present application may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method embodiments may be completed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logical blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or any conventional processor. The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
It should also be appreciated that the memory in the embodiments of the present application may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
The terms "unit," "module," and the like as used in this specification may be used to refer to a computer-related entity, either hardware, firmware, a combination of hardware and software, or software in execution.
Those of ordinary skill in the art will appreciate that the various illustrative logical blocks and steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, or as a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application. In the several embodiments provided by the present application, it should be understood that the disclosed apparatus, device, and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; the division of the modules is merely a division by logical function, and there may be other divisions in actual implementation, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection via some interfaces, devices, or modules, and may be in electrical, mechanical, or other forms.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in the embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more units may be integrated into one module.
In the above embodiments, the functions of the respective functional modules may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, terminal device, or data center to another by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a terminal device or a data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a digital versatile disc (DVD)), a semiconductor medium (e.g., a solid-state drive (SSD)), or the like.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a terminal device, a network device, or the like) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disc, and the like.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. A method of monitoring a wearable device, applied to a wearable device, the method comprising:
identifying the identity of a user wearing the wearable device, to determine the identity of the user; and
monitoring the user to obtain a first data set, wherein the first data set comprises data obtained by monitoring one or more physiological parameters in the physiological parameter set of the user and a user tag, and the user tag is used for identifying the user.
2. The method of claim 1, wherein the identifying the user wearing the device comprises:
The identity of the user wearing the device is identified by one of the following means: face recognition, fingerprint recognition, or voiceprint recognition.
3. The method of claim 1 or 2, wherein in the event that the identity of the user is not identified, the method further comprises:
prompting the user to input identity information of the user, wherein the identity information is information for identifying the user;
and, in response to a user operation, collecting the identity information of the user.
4. A method according to any one of claims 1 to 3, wherein the method further comprises:
displaying, in response to a user operation, data of the user, wherein the data of the user comprises data obtained by monitoring one or more physiological parameters in the physiological parameter set of the user.
5. The method of claim 4, wherein displaying the user's data in response to a user operation comprises:
in response to a first user operation, displaying at least one user tag corresponding to at least one user including the user, each user tag being used for identifying one user, the at least one user corresponding to at least one data set, and each data set including data obtained by monitoring each physiological parameter in the physiological parameter set of the corresponding user; and
displaying the data of the user in response to a second user operation.
6. The method of any one of claims 1 to 5, wherein the physiological parameters in the set of physiological parameters comprise one or more of: heart rate, blood oxygen, blood pressure, temperature, heart rate variability, or respiratory rate.
7. The method of any one of claims 1 to 6, wherein the method further comprises:
Acquiring a current facial image of the user;
Based on the first data set and the facial image, obtaining a current comprehensive state index of the user, wherein the comprehensive state index is used for evaluating the current physical state of the user;
and prompting the user for the comprehensive state index.
8. The method of claim 7, wherein the method further comprises:
determining a reference of the comprehensive state index based on the preset data of each physiological parameter in the physiological parameter set of the user and a standard ID photo of the user; and
and suggesting to the user based on the reference and the current comprehensive state index of the user.
9. The method of any one of claims 1 to 8, wherein the method further comprises:
and guiding the user to exercise until the blood oxygen concentration of the user is not higher than a first preset threshold.
10. The method of claim 9, wherein the method further comprises:
judging whether the confidence level of a photoplethysmography (PPG) signal reaches a second preset threshold, wherein the PPG signal is used for detecting blood oxygen concentration, and the confidence level of the PPG signal is used for reflecting the reliability of the PPG signal; and
the guiding the user to exercise until the blood oxygen concentration of the user is not higher than the first preset threshold comprises:
under the condition that the confidence level of the PPG signal reaches the second preset threshold, guiding the user to exercise until the blood oxygen concentration of the user is not higher than the first preset threshold.
11. A wearable device, characterized by comprising means for implementing the method of any of claims 1 to 10.
12. A wearable device comprising a processor and a memory, wherein,
The memory is used for storing a computer program;
The processor is configured to invoke the computer program to cause the wearable device to perform the method of any of claims 1 to 10.
13. A computer readable storage medium, characterized in that the storage medium has stored therein a computer program or instructions which, when executed by a computer, implement the method of any of claims 1 to 10.
14. A computer program product comprising instructions which, when executed by a computer, implement the method of any one of claims 1 to 10.
CN202211633641.1A 2022-12-19 2022-12-19 Monitoring method and related device of wearable equipment Pending CN118230372A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211633641.1A CN118230372A (en) 2022-12-19 2022-12-19 Monitoring method and related device of wearable equipment


Publications (1)

Publication Number Publication Date
CN118230372A 2024-06-21

Family

ID=91503335



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination