CN113552937A - Display control method and wearable device - Google Patents

Display control method and wearable device

Info

Publication number
CN113552937A
Authority
CN
China
Prior art keywords
screen
user
wearable device
instruction
sleep state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010335463.9A
Other languages
Chinese (zh)
Inventor
张慧
李靖
周林峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202010335463.9A
Priority to PCT/CN2021/084004 (published as WO2021213151A1)
Publication of CN113552937A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/4401: Bootstrapping
    • G06F 9/4418: Suspend and resume; Hibernate and awake

Abstract

Embodiments of this application provide a display control method and a wearable device. The method includes: after receiving a first screen wake-up instruction, the wearable device determines the current sleep state of the user; and if the current sleep state of the user meets a preset condition, the wearable device lights up its screen only after receiving a second screen wake-up instruction. The first screen wake-up instruction is an instruction generated by any user operation preset for waking up the screen, and the second screen wake-up instruction is an instruction generated by any user operation preset for waking up the screen other than the touch operation corresponding to the first screen wake-up instruction. This effectively reduces the extra power consumption caused by the screen being repeatedly lit through accidental user operation, while still allowing the user to light the screen quickly with the second screen wake-up instruction when the user needs to view the display content of the wearable device, thereby improving the user experience.

Description

Display control method and wearable device
Technical Field
This application relates to the field of electronic technology, and in particular to a display control method and a wearable device.
Background
Wearable devices are a general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as smart glasses, smart gloves, smart watches, smart clothing, and smart shoes. Among them, smart watches and smart bands are the most prominent. As people's health awareness grows, the health-monitoring functions of wearable devices have become increasingly popular. Some health-monitoring functions, such as sleep monitoring and sleep apnea monitoring, require the user to wear the wearable device at night to collect the relevant sleep data.
However, while the user sleeps at night, and especially when the user is about to fall asleep, an accidental touch on the screen of the wearable device will wake the screen, and the change in screen brightness may disturb the user's sleep. For users with poor sleep quality or difficulty falling asleep, the experience is particularly poor. Repeatedly lighting the screen also increases the power consumption of the wearable device and shortens its standby time.
Disclosure of Invention
This application discloses a display control method and a wearable device, which can solve the problem of increased power consumption of the wearable device caused by the screen being repeatedly lit through accidental user operation.
In a first aspect, an embodiment of this application provides a display control method applied to a wearable device. The method includes: after receiving a first screen wake-up instruction, the wearable device determines the current sleep state of the user; and if the current sleep state of the user meets a preset condition, the wearable device lights up its screen only after receiving a second screen wake-up instruction. The first screen wake-up instruction is an instruction generated by any user operation preset for waking up the screen of the wearable device, and the second screen wake-up instruction is an instruction generated by any user operation preset for waking up the screen of the wearable device other than the user operation corresponding to the first screen wake-up instruction. This effectively reduces the extra power consumption caused by the screen being repeatedly lit through accidental user operation, prevents such accidental lighting from disturbing the user's sleep, and still lets the user light the screen quickly with the second screen wake-up instruction when the user needs to view the display content, thereby improving the user experience.
The screen wake-up instruction referred to in the embodiments of this application is an instruction, generated in response to a user operation, that instructs the wearable device to light up its screen. When the screen of the wearable device is off and the user touches (for example, taps) any area or a designated area of the wearable device, or the user raises or turns the wrist, the wearable device generates a screen wake-up instruction to light up the screen, where the designated area is any screen area preset to respond to the user's touch operation. That is, the screen wake-up instruction may be generated when the user touches any screen area of the wearable device, when the user touches a designated screen area of the wearable device, or when the user raises or turns the wrist.
Illustratively, the user operation includes, but is not limited to, a click operation, a double click operation, a continuous click operation, a press operation, a multi-press operation, a slide operation, a wrist-turning operation, or a combination of at least two of the above user operations, such as a combination of a press operation and a wrist-turning operation.
For example, the first screen wake-up instruction may be a screen wake-up instruction generated by a user operation preset for the normal mode; the user operations corresponding to the first screen wake-up instruction include, for example, a wrist-raising operation, a wrist-turning operation, a clicking operation, a tapping operation, and the like. The second screen wake-up instruction may be a screen wake-up instruction generated by a user operation preset for the anti-false-touch mode that differs from those of the normal mode; the user operations corresponding to the second screen wake-up instruction include, for example, a double-click operation, a continuous-click operation, a multi-press operation, a sliding operation, a combined wrist-raising and pressing operation, and the like. A minimal illustrative sketch of this two-stage flow follows.
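For illustration only, the following Python sketch outlines the two-stage wake-up flow just described. It is not part of the claimed method, and all names in it (FIRST_WAKE_OPS, get_sleep_state, light_screen, and so on) are hypothetical placeholders rather than APIs defined in this application.

```python
# Minimal sketch of the two-stage screen wake-up flow described above.
# All names are hypothetical placeholders, not APIs defined by this application.

FIRST_WAKE_OPS = {"raise_wrist", "turn_wrist", "click", "tap"}        # normal-mode operations
SECOND_WAKE_OPS = {"double_click", "continuous_click", "multi_press",
                   "slide", "raise_wrist_and_press"}                   # anti-false-touch operations


def handle_wake_operation(op: str, get_sleep_state, light_screen) -> bool:
    """Return True if the screen is lit in response to the operation `op`."""
    if op in SECOND_WAKE_OPS:
        light_screen()                    # a second wake-up instruction always lights the screen
        return True
    if op in FIRST_WAKE_OPS:
        state = get_sleep_state()         # "not_asleep", "ready_to_sleep" or "asleep"
        if state in ("asleep", "ready_to_sleep"):
            return False                  # preset condition met: wait for a second instruction instead
        light_screen()
        return True
    return False                          # not a wake-up operation at all
```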
With reference to the first aspect, in a first possible implementation of the first aspect, after the wearable device receives the first screen wake-up instruction, the method further includes: detecting the current screen state of the wearable device; and if the current screen state of the wearable device is the screen-off state, determining the current sleep state of the user.
The sleep state is one of three states: not asleep, already asleep, and about to fall asleep. That the current sleep state of the user meets the preset condition specifically means that the user is either already asleep or about to fall asleep.
In this embodiment, the wearable device first detects its current screen state and determines the user's current sleep state only after confirming that the screen is off. When the screen is already on, other user operations can therefore be responded to immediately, improving the responsiveness of the wearable device. The sleep state needs to be determined only when the screen is off, which reduces the extra power consumption caused by the screen being repeatedly lit through accidental user operation and prevents such repeated lighting from disturbing the user's sleep.
With reference to the first aspect, in a second possible implementation of the first aspect, after the wearable device receives the first screen wake-up instruction, the method further includes: the wearable device determines whether the anti-false-touch mode is enabled; if the anti-false-touch mode is enabled, the wearable device determines whether the user operation corresponding to the first screen wake-up instruction is a preset user operation; if it is the preset user operation, the wearable device executes the first screen wake-up instruction and lights up its screen; otherwise, the wearable device treats the user operation corresponding to the first screen wake-up instruction as an accidental operation, discards the first screen wake-up instruction, and does not light up the screen. If the anti-false-touch mode is not enabled, the wearable device determines the current sleep state of the user.
In this embodiment, the wearable device first checks whether the anti-false-touch mode is enabled in order to decide whether the automatic anti-false-touch function needs to be triggered, that is, whether to determine if the user's current sleep state meets the preset condition and, if so, to light up the screen only after receiving the second screen wake-up instruction; see the sketch after this paragraph. This increases the intelligence of the wearable device and accommodates different user needs.
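The following sketch, again for illustration only, shows how a manually enabled anti-false-touch mode can take priority over the automatic sleep-state check described above. The helper names (is_preset_operation, await_second_wake_instruction, and so on) are hypothetical placeholders.

```python
# Illustrative sketch of the second possible implementation: a manually enabled
# anti-false-touch mode takes priority over the automatic sleep-state check.

def on_first_wake_instruction(op: str,
                              anti_false_touch_enabled: bool,
                              is_preset_operation,
                              determine_sleep_state,
                              await_second_wake_instruction,
                              light_screen) -> None:
    if anti_false_touch_enabled:
        if is_preset_operation(op):
            light_screen()               # preset operation: execute the first wake-up instruction
        # otherwise: the operation is treated as accidental and the instruction is discarded
        return
    # Anti-false-touch mode not enabled: fall back to the automatic sleep-state check.
    if determine_sleep_state() in ("asleep", "ready_to_sleep"):
        await_second_wake_instruction()  # light the screen only on a second wake-up instruction
    else:
        light_screen()
```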
With reference to the first aspect, in a third possible implementation of the first aspect, lighting up the screen of the wearable device after receiving the second screen wake-up instruction if the current sleep state of the user meets the preset condition includes: if the current sleep state of the user meets the preset condition, detecting whether the second screen wake-up instruction is received; if the second screen wake-up instruction is received, executing the second screen wake-up instruction and lighting up the screen of the wearable device; and if the second screen wake-up instruction is not received within a preset time, treating the user operation corresponding to the first screen wake-up instruction as an accidental operation, discarding the first screen wake-up instruction, and not lighting up the screen of the wearable device.
By deciding whether to light up the screen based on whether the second screen wake-up instruction is received, this embodiment reduces the number of times the screen is repeatedly lit through accidental user operation. This in turn reduces the discomfort such lighting causes, effectively reduces the resulting power consumption, and prevents repeated lighting from disturbing the user's sleep. An illustrative sketch of this timeout-based check follows.
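A minimal sketch of the timeout behaviour is given below. The helper poll_second_wake_instruction() and the timeout value are assumptions for the example; the application itself only requires that some preset time be used.

```python
import time

# Illustrative sketch of the timeout behaviour described above. The helper
# poll_second_wake_instruction() is a hypothetical placeholder that is assumed
# to return True once a second screen wake-up instruction has been generated.

def await_second_wake_instruction(poll_second_wake_instruction,
                                  light_screen,
                                  timeout_s: float = 5.0,
                                  poll_interval_s: float = 0.1) -> bool:
    """Light the screen only if a second wake-up instruction arrives in time."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if poll_second_wake_instruction():
            light_screen()               # second instruction received: execute it
            return True
        time.sleep(poll_interval_s)
    # Timed out: the first instruction is treated as accidental and discarded.
    return False
```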
With reference to the first aspect, in a fourth possible implementation of the first aspect, the determining, by the wearable device after receiving the first screen wake-up instruction, of the current sleep state of the user includes: the wearable device acquires the user's sleep detection data and determines the current sleep state of the user from that sleep detection data.
The sleep detection data is detection data used to determine the current sleep state of the user and includes, but is not limited to, at least one of the user's physiological characteristic data, the user's motion posture data, and current environment data. The physiological characteristic data includes, but is not limited to, the user's heart rate, pulse, respiratory rate, and brain wave signals; the motion posture data includes, but is not limited to, the user's wrist posture data; and the current environment data includes, but is not limited to, the ambient light level of the environment in which the wearable device is currently located.
In this embodiment, the sleep detection data is used to judge whether the user's current sleep state is not asleep, already asleep, or about to fall asleep, and hence whether the user operation corresponding to the first screen wake-up instruction needs to be treated as accidental. If it is not accidental, that is, the user's current sleep state does not meet the preset condition, the wearable device executes the first screen wake-up instruction and lights up its screen; otherwise, another screen wake-up instruction generated by a user operation, namely the second screen wake-up instruction, is required to light up the screen. This reduces the extra power consumption caused by the screen being repeatedly lit through accidental operation, reduces the chance that repeated lighting degrades the user's sleep quality, and improves the user experience.
It should be noted that the user's physiological characteristic data or motion posture data can be used to quickly determine whether the current sleep state is the not-asleep state or the asleep state. By also accurately judging whether the user is about to fall asleep, the wearable device improves the accuracy with which it decides whether the current user operation is accidental. A minimal sketch of such a mapping follows.
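As an illustration of how such data might be mapped to the three sleep states, the sketch below uses heart rate, a wrist action level, and ambient light. The field names and all threshold values are assumptions made for the example and are not specified by this application.

```python
from dataclasses import dataclass

# Illustrative sketch only: one way to map sleep detection data to the three
# sleep states named above. Thresholds and field names are assumed values.

@dataclass
class SleepDetectionData:
    heart_rate_bpm: float        # physiological characteristic data
    wrist_action_level: int      # motion posture data (e.g. 0 = still ... 3 = active)
    ambient_light_lux: float     # current environment data


def determine_sleep_state(d: SleepDetectionData) -> str:
    """Return 'asleep', 'ready_to_sleep' or 'not_asleep'."""
    if d.heart_rate_bpm < 55 and d.wrist_action_level == 0:
        return "asleep"
    if d.wrist_action_level <= 1 and d.ambient_light_lux < 10:
        return "ready_to_sleep"  # low wrist activity in a dark environment
    return "not_asleep"
```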
For example, determining the current sleep state of the user according to the user's sleep detection data includes:
determining, from the user's wrist posture data, the action level corresponding to that wrist posture data; and if the determined action level is a preset action level, determining that the user's wrist posture data meets a first condition.
Correspondingly, determining that the current sleep state of the user meets the preset condition if the acquired wrist posture data of the user meets the first condition and/or the ambient light level of the environment in which the wearable device is currently located is below a preset brightness threshold includes: if the action level corresponding to the acquired wrist posture data is the preset action level and/or the ambient light level of the current environment of the wearable device is below the preset brightness threshold, determining that the current sleep state of the user meets the preset condition.
In this embodiment, whether the user's wrist posture data meets the first condition is determined by judging whether the action level corresponding to that data is the preset action level; in other words, when the first condition is that the action level of the user's wrist is the preset action level, the wrist posture data is considered to meet the first condition. This improves the accuracy of judging the user's current sleep state. A sketch of this check follows.
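The following sketch illustrates this check. The preset action level, the brightness threshold, and the inclusive-or reading of "and/or" are assumptions made for the example.

```python
# Illustrative sketch of the preset-condition check described above.
# The preset action level and the brightness threshold are assumed values.

PRESET_ACTION_LEVEL = 0          # assumed: "wrist essentially still"
BRIGHTNESS_THRESHOLD_LUX = 10.0  # assumed preset brightness threshold


def sleep_state_meets_preset_condition(wrist_action_level: int,
                                       ambient_light_lux: float) -> bool:
    wrist_condition_met = (wrist_action_level == PRESET_ACTION_LEVEL)  # first condition
    environment_dark = (ambient_light_lux < BRIGHTNESS_THRESHOLD_LUX)
    # The claim allows "and/or"; this sketch uses the inclusive-or reading.
    return wrist_condition_met or environment_dark
```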
With reference to the first aspect, in a fifth possible implementation of the first aspect, determining that the action level corresponding to the acquired wrist posture data of the user is the preset action level includes: determining at least two corresponding action levels from the acquired wrist posture data; comparing the at least two corresponding action levels; and if at least two of them are of the same level, setting the level that is shared by the largest number of them as the action level corresponding to the user's wrist posture data.
In this embodiment, several action levels are determined from the acquired wrist posture data, and the level that occurs most often among them is set as the action level corresponding to that data. This improves the accuracy of the determined action level, which in turn improves the accuracy of judging whether the user is about to fall asleep and, ultimately, of judging whether the user operation corresponding to a given screen wake-up instruction is accidental, thereby achieving the anti-false-touch purpose. A majority-vote sketch follows.
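A possible reading of this majority-vote rule is sketched below; the function name and the use of Python's Counter are illustrative choices rather than part of the claim.

```python
from collections import Counter
from typing import Sequence

# Illustrative sketch of the majority-vote rule in the fifth implementation:
# among several candidate action levels derived from the wrist posture data,
# the level that occurs most often is taken as the result.

def resolve_action_level(candidate_levels: Sequence[int]) -> int:
    """Return the most frequent action level among the candidates."""
    if not candidate_levels:
        raise ValueError("at least one candidate action level is required")
    counts = Counter(candidate_levels)
    level, _ = counts.most_common(1)[0]
    return level

# Example: levels derived from three windows of wrist posture data.
# resolve_action_level([0, 0, 1]) -> 0
```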
In a second aspect, this application provides a wearable device, including one or more processors, a memory, and a display screen. The memory and the display screen are coupled to the one or more processors; the memory is configured to store computer program code, and the computer program code includes computer instructions which, when executed by the one or more processors, cause the wearable device to perform the method according to any possible implementation of the first aspect.
In a third aspect, the present application provides a computer storage medium comprising computer instructions that, when executed on a wearable device, cause the wearable device to perform the method as provided in any one of the possible embodiments of the first aspect.
In a fourth aspect, embodiments of this application provide a computer program product which, when run on a computer, causes the computer to perform the method described above, so that the wearable device described above performs the method provided in any possible implementation of the first aspect.
In a fifth aspect, embodiments of the present application provide a chip system, which includes a processor, where the processor is coupled with a memory, and when the processor executes a computer program stored in the memory, the wearable device executes the method according to any one of the possible embodiments of the first aspect. The chip system can be a single chip or a chip module consisting of a plurality of chips.
It is to be understood that the wearable device of the second aspect, the computer storage medium of the third aspect, the computer program product of the fourth aspect, and the chip system of the fifth aspect are all configured to perform the method of the first aspect. Therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding method; details are not repeated here.
Drawings
The drawings used in the embodiments of the present application are described below.
Fig. 1 is a schematic structural diagram of a wearable device 100 provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of a display control method according to an embodiment of the present application;
fig. 3 is a schematic flowchart of another display control method provided in an embodiment of the present application;
fig. 4 is a schematic flowchart of another display control method provided in the embodiment of the present application;
FIG. 5 is a schematic flow chart diagram illustrating another display control method according to an embodiment of the present disclosure;
fig. 6 is a flowchart illustrating a method for determining a current sleep state of a user according to an embodiment of the present application.
Detailed Description
The embodiments of the present application will be described below with reference to the drawings. The terminology used in the description of the embodiments herein is for the purpose of describing particular embodiments herein only and is not intended to be limiting of the application.
A wearable device according to an embodiment of the present application will be described first. Referring to fig. 1, fig. 1 is a schematic structural diagram of a wearable device 100 according to an embodiment of the present disclosure.
The wearable device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the wearable device 100. In other embodiments of this application, the wearable device 100 may include more or fewer components than shown, or combine some components, or split some components, or have a different component arrangement. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
Among other things, the controller may be a neural center and a command center of the wearable device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 and the touch sensor 180K communicate through an I2C bus interface, enabling touch functionality of the wearable device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the capture functionality of the wearable device 100. Processor 110 and display screen 194 communicate through the DSI interface to implement the display functionality of wearable device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the wearable device 100, and may also be used to transmit data between the wearable device 100 and a peripheral device. It may also be used to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices.
It is to be understood that the interfacing relationship between the modules according to the embodiment of the present invention is only illustrative, and does not form a structural limitation for the wearable device 100. In other embodiments of the present application, the wearable device 100 may also adopt different interface connection manners or a combination of interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive the wireless charging input through a wireless charging coil of the wearable device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the wearable device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in wearable device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the wearable device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the wearable device 100, including Wireless Local Area Networks (WLANs) (such as wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of wearable device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that wearable device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The wearable device 100 implements display functions via the GPU, the display screen 194, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
The wearable device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, wearable device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the wearable device 100 is in frequency point selection, the digital signal processor is used for performing fourier transform and the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. Wearable device 100 may support one or more video codecs. As such, wearable device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as smart recognition of the wearable device 100 can be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the wearable device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the wearable device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (such as audio data, phone book, etc.) created during use of the wearable device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The wearable device 100 may implement audio functions via the audio module 170, speaker 170A, microphone 170C, headphone interface 170D, and application processor, among others. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The wearable device 100 can listen to music through the speaker 170A, or listen to a hands-free conversation.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the wearable device 100 answers a phone call or voice information, voice can be answered by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can input a sound signal into the microphone 170C by speaking with the mouth close to the microphone 170C. The wearable device 100 may be provided with at least one microphone 170C. In other embodiments, the wearable device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the wearable device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and so on.
The headphone interface 170D is used to connect a wired headset. The headset interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the wearable device 100 determines the intensity of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the wearable device 100 detects the intensity of the touch operation by using the pressure sensor 180A, and can also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the SMS application icon, an instruction to view the SMS message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the SMS application icon, an instruction to create a new SMS message is executed. A brief illustrative sketch follows.
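For illustration, the sketch below shows this kind of intensity-dependent dispatch; the threshold value and the handler names are assumptions made for the example.

```python
# Illustrative sketch of intensity-dependent dispatch for touches on the same
# icon, as described above. Threshold and handler names are assumed.

FIRST_PRESSURE_THRESHOLD = 0.5   # assumed, normalized pressure intensity


def on_message_icon_touch(intensity: float, view_message, create_message) -> None:
    if intensity < FIRST_PRESSURE_THRESHOLD:
        view_message()       # light touch: view the SMS message
    else:
        create_message()     # firm press: create a new SMS message
```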
The gyro sensor 180B may be used to determine the motion gesture of the wearable device 100. In some embodiments, the angular velocity of wearable device 100 about three axes (i.e., x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyroscope sensor 180B detects a shaking angle of the wearable device 100, calculates a distance to be compensated for by the lens module according to the shaking angle, and allows the lens to counteract shaking of the wearable device 100 through a reverse motion, thereby achieving anti-shaking. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, wearable device 100 calculates altitude, aiding in positioning and navigation from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall effect sensor. The wearable device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip holster. In some embodiments, when the wearable device 100 is a clamshell device, the wearable device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D, and then set features such as automatic unlocking upon opening based on the detected open or closed state of the holster or of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the wearable device 100 in various directions (typically three axes).
A distance sensor 180F for measuring a distance. The wearable device 100 may measure distance by infrared or laser. In some embodiments, taking a picture of a scene, wearable device 100 may utilize range sensor 180F to range for fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The wearable device 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the wearable device 100 may determine that there is an object nearby; when insufficient reflected light is detected, it may determine that there is no object nearby. The wearable device 100 can use the proximity light sensor 180G to detect that the user is holding the wearable device 100 close to the ear during a call, and then automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode or pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Wearable device 100 may adaptively adjust display screen 194 brightness based on perceived ambient light levels. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the wearable device 100 is in a pocket to prevent inadvertent contact.
The fingerprint sensor 180H is used to collect a fingerprint. The wearable device 100 can utilize the collected fingerprint characteristics to achieve fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the wearable device 100 implements a temperature handling strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the wearable device 100 reduces the performance of a processor located near the temperature sensor 180J, in order to reduce power consumption and implement thermal protection. In other embodiments, the wearable device 100 heats the battery 142 when the temperature is below another threshold, to avoid abnormal shutdown of the wearable device 100 caused by low temperature. In other embodiments, the wearable device 100 boosts the output voltage of the battery 142 when the temperature is below yet another threshold, also to avoid abnormal shutdown caused by low temperature. A brief illustrative sketch of such a policy follows.
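For illustration, the sketch below shows one such policy; the threshold values and handler names are assumptions made for the example, not values specified by this application.

```python
# Illustrative sketch of the temperature handling strategy described above.
# Threshold values and handler names are assumed.

HIGH_TEMP_C = 45.0           # assumed: throttle the processor above this
LOW_TEMP_C = 0.0             # assumed: heat the battery below this
CRITICAL_LOW_TEMP_C = -10.0  # assumed: boost battery output voltage below this


def apply_thermal_policy(temp_c: float, throttle_cpu, heat_battery, boost_battery_voltage) -> None:
    if temp_c > HIGH_TEMP_C:
        throttle_cpu()              # reduce processor performance for thermal protection
    elif temp_c < CRITICAL_LOW_TEMP_C:
        boost_battery_voltage()     # avoid abnormal shutdown at very low temperature
    elif temp_c < LOW_TEMP_C:
        heat_battery()              # avoid abnormal shutdown caused by low temperature
```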
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the wearable device 100 at a different location than the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The wearable device 100 may receive key inputs, producing key signal inputs related to user settings and function control of the wearable device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the wearable device 100 by being inserted into or pulled out of the SIM card interface 195. The wearable device 100 may support one or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 195 at the same time, and the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards and with external memory cards. The wearable device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the wearable device 100 employs an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the wearable device 100 and cannot be separated from it.
Next, some terms referred to in the embodiments of the present application are explained so as to be easily understood by those skilled in the art.
It should be noted that the screen wake-up instruction referred to in the following embodiments of this application is an instruction, generated in response to a user operation, that instructs the wearable device 100 to light up its screen. When the wearable device 100 is in the screen-off state and the user touches, for example, any area or a designated area of the wearable device 100, or raises or turns the wrist, the wearable device 100 generates a screen wake-up instruction to light up the screen, where the designated area is any screen area preset to respond to the user's touch operation. The screen wake-up instruction may be generated when the user touches any screen area of the wearable device 100, when the user touches a designated screen area of the wearable device 100, or when the user raises or turns the wrist.
The user operation related to the embodiment of the application includes, but is not limited to, a click operation, a double-click operation, a continuous click operation, a pressing operation, a multi-pressing operation, a sliding operation, a wrist-turning operation, or a combination operation of at least two of the above user operations, such as a combination operation of pressing and wrist-turning operations.
The screen wake-up instruction in the embodiments of the present application includes a first screen wake-up instruction and a second screen wake-up instruction, where the first screen wake-up instruction is an instruction generated based on any user operation preset for waking up the screen of the wearable device 100, and the second screen wake-up instruction is an instruction generated based on any user operation preset for waking up the screen of the wearable device 100 other than the user operation corresponding to the first screen wake-up instruction.
It can be understood that the first screen wake-up instruction may be a screen wake-up instruction generated by a user operation preset for the normal mode, for example a wrist-lift operation, a wrist-turn operation, a click operation, a tap operation, and the like. The second screen wake-up instruction may be a screen wake-up instruction generated, in the anti-false-touch mode, by a user operation different from those of the normal mode, for example a double-click operation, a continuous-click operation, a multi-press operation, a sliding operation, a combined wrist-lift and press operation, and the like.
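To make the distinction concrete, the following short sketch maps user operations to the two instruction types. It is purely illustrative: the operation names, the two sets, and the classify_wake_instruction helper are assumptions, not part of the patented implementation.

```python
# Minimal illustration of the first/second screen wake-up instruction distinction described above.
# The operation names and the sets below are assumptions, not the patented implementation.

FIRST_WAKE_OPS = {"wrist_raise", "wrist_turn", "single_tap", "single_press"}
SECOND_WAKE_OPS = {"double_tap", "repeated_tap", "multi_press", "swipe", "press_and_wrist_raise"}

def classify_wake_instruction(operation: str) -> str | None:
    """Map a detected user operation to the wake-up instruction it would generate."""
    if operation in FIRST_WAKE_OPS:
        return "first_screen_wake"
    if operation in SECOND_WAKE_OPS:
        return "second_screen_wake"
    return None  # the operation does not wake the screen
```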
The following describes in detail a display control method provided in an embodiment of the present application, based on the wearable device 100 shown in fig. 1, with reference to other drawings.
In some application scenarios, a wearable device 100 such as a smart band or a smart watch wakes up the screen mainly by touching or pressing the screen, lifting the wrist, and the like. Although the screen can be woken up quickly in this way, there are certain drawbacks. For example, when the user sleeps at night, the screen of the wearable device 100 is easily pressed or touched by mistake by the user or by a bed partner, and after the user turns on the raise-to-wake function of the wearable device 100, even turning over may wake up the screen. During sleep at night, repeatedly lighting up the screen disturbs the user's sleep and also increases the power consumption of the wearable device 100. How to improve the accuracy of recognizing screen wake-up actions, avoid falsely triggered screen lighting, reduce the power consumption of the wearable device caused by repeated screen lighting, and reduce the impact on the user's sleep quality is a major technical problem to be solved at present.
In the prior art, in order to prevent the user from being disturbed during sleep at night, a do-not-disturb mode is provided on most wearable devices 100. After the user turns on the do-not-disturb mode, incoming call information and notification information such as WeChat and QQ notifications do not trigger vibration of the wearable device 100, and lifting the wrist does not light up the screen. At present, there are two ways to turn on the do-not-disturb mode of the wearable device 100: (1) turning it on during a specified time period, for example, the user sets the do-not-disturb mode to be on during a particular period, or the do-not-disturb mode remains on until the user turns it off; (2) turning it on intelligently, that is, upon recognizing that the user is in the sleep state, the wearable device 100 stops receiving information notifications.
However, both ways of turning on the do-not-disturb mode have drawbacks. For the first way, on the one hand, the do-not-disturb mode is only turned on within the time period set by the user, which is not intelligent enough; on the other hand, after the do-not-disturb mode is turned on, it only blocks interference from external information such as incoming calls, WeChat or QQ notifications, or the raise-to-wake function, and cannot prevent screen-on events caused by accidental touches or presses by the user or a bed partner at night. For the second way, although the do-not-disturb mode can be turned on intelligently according to the sleep state of the user, on the one hand it likewise only blocks interference from external information and cannot prevent screen-on events caused by accidental touches or presses by the user or a bed partner at night; on the other hand, the do-not-disturb mode is turned on only when the user is recognized as already asleep, whereas when the user is actually in the stage of preparing to sleep, that is, not yet asleep, the user is even more easily disturbed by the lit screen of the wearable device 100, which seriously affects the sleep quality of users who have difficulty falling asleep.
The embodiments of the present application provide a display control method, which can effectively reduce the power consumption of the wearable device 100 caused by repeated screen lighting due to user misoperation, prevent the user's sleep from being disturbed by such repeated screen lighting, and improve the user experience.
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating a display control method according to an embodiment of the present disclosure. As shown in fig. 2, the method includes steps S101 to S105.
S101, the wearable device 100 receives a first screen awakening instruction.
In this embodiment of the application, after the user touches any screen area or a designated screen area of the wearable device 100, or the user lifts up or turns over the wrist, the wearable device 100 receives a screen wake-up instruction to wake up the screen, so that the user can perform corresponding operations conveniently.
S102, the wearable device 100 determines the current sleep state of the user.
S103, the wearable device 100 determines whether the current sleep state of the user meets a preset condition.
S104, if the current sleep state of the user does not meet the preset condition, the wearable device 100 executes the first screen wake-up instruction and lights up the screen of the wearable device 100.
The sleep states in the embodiments of the present application include three states: not yet asleep, asleep, and preparing to fall asleep. Determining the current sleep state of the user means determining whether the user is currently not yet asleep, already asleep, or preparing to fall asleep. When the current sleep state of the user is not yet asleep, the first screen wake-up instruction can be executed directly and the screen of the wearable device 100 is lit. When the current sleep state of the user is asleep or preparing to fall asleep, the screen of the wearable device 100 needs to be lit by means of the second screen wake-up instruction, so that in these two sleep states the power consumption caused by repeated screen lighting due to user misoperation is reduced, the user's sleep is not disturbed by such repeated screen lighting, and the user experience is improved.
S105, if the current sleep state of the user meets the preset condition, the wearable device 100 lights up the screen after receiving the second screen wake-up instruction.
In the implementation of the present application, the current sleep state of the user is considered to meet the preset condition specifically when the user is already asleep or preparing to fall asleep.
During sleep at night, if a user whose current sleep state is preparing to fall asleep inadvertently touches any area or a designated screen area of the wearable device 100, or lifts the wrist while turning over, the wearable device 100 executes the screen wake-up instruction generated by these user operations and lights up the screen. In particular, when the lights have been turned off, that is, when the environment in which the wearable device is currently located is very dark, the sudden lighting of the screen can irritate the user's eyes in the darkness, causing eye discomfort and affecting the user's sleep quality. For a user who has difficulty falling asleep, for example one who is sensitive to light, a sudden or repeatedly lit screen may seriously affect sleep quality: the user may notice the lit screen of the wearable device 100, or view or browse its displayed content after the screen lights up, which easily makes it even harder to fall asleep and results in a very poor experience. Moreover, repeated screen lighting caused by user misoperation also increases the power consumption of the wearable device 100 and shortens its standby time.
In some embodiments, after determining that the current sleep state of the user is asleep or preparing to fall asleep, the wearable device 100 recognizes the user operation corresponding to the generated screen wake-up instruction as a misoperation, masks the screen wake-up instruction, and does not light up the screen. This effectively reduces the discomfort caused to the user by a screen lit through misoperation, reduces the power consumption of the wearable device 100 caused by repeated screen lighting, and prevents such repeated screen lighting from disturbing the user's sleep. At this point, if the user still wants to wake up the screen of the wearable device 100, the screen needs to be lit by a screen wake-up instruction generated by another user operation (i.e., the second screen wake-up instruction), which is distinguished from the screen wake-up instruction generated by the previous user operation (i.e., the first screen wake-up instruction). In this way, even after the wearable device 100 has determined that the user is asleep or preparing to fall asleep, the user can still light up the screen through a screen wake-up instruction generated by another user operation, which meets the diversified needs of users and improves the user experience.
In other implementations, the user may still want to light up the screen of the wearable device 100 to check the time or view other content after the wearable device 100 has determined that the user is asleep or preparing to fall asleep, or after the wearable device 100 has turned on the anti-false-touch mode. In that case, if the screen wake-up instruction generated by the user operation were simply treated as invalid, that is, masked without lighting up the screen of the wearable device 100, the user might get the false impression that the device is damaged or malfunctioning, which degrades the user experience. To avoid this situation, in the embodiments of the present application, after determining that the current sleep state of the user is asleep or preparing to fall asleep, the wearable device 100 further detects whether a second screen wake-up instruction is received, that is, an instruction generated by another user operation performed after the user operation corresponding to the first screen wake-up instruction. If the second screen wake-up instruction is detected, the wearable device 100 executes it and lights up the screen, so that the user can conveniently view the displayed content of the wearable device 100, thereby meeting the diversified needs of users and improving the user experience.
According to the embodiments of the present application, after a first screen wake-up instruction is received, the current sleep state of the user is determined; if the current sleep state of the user meets the preset condition, the screen of the wearable device 100 is lit only after a second screen wake-up instruction is received. This effectively reduces the power consumption of the wearable device 100 caused by repeated screen lighting due to user misoperation, prolongs the standby time of the wearable device 100, reduces adverse effects such as eye discomfort and degraded sleep quality caused by repeated screen lighting due to misoperation, and improves the user experience.
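A compact sketch of this flow (steps S101 to S105) might look as follows. The device interface (determine_sleep_state, light_screen, wait_for_second_wake) and the state names are assumed helpers for illustration only, not the patented implementation.

```python
# Hypothetical sketch of steps S101-S105 (fig. 2); helper names are assumptions.

def handle_first_wake_instruction(device) -> bool:
    """Return True if the screen was lit, False if the wake-up was suppressed."""
    state = device.determine_sleep_state()    # S102: "awake", "preparing" or "asleep"
    if state == "awake":                      # S103/S104: preset condition not met
        device.light_screen()                 # execute the first screen wake-up instruction
        return True
    # S105: condition met (asleep or preparing to sleep) -> a second instruction is required
    if device.wait_for_second_wake():
        device.light_screen()
        return True
    return False                              # treat the first instruction as a false touch
```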
Referring to fig. 3, fig. 3 is a schematic flowchart illustrating another display control method according to an embodiment of the present disclosure. As shown in fig. 3, the method includes steps S201 to S204.
S201, the wearable device 100 receives a first screen wake-up instruction.
S201 can be described with reference to step S101 in the embodiment described in fig. 2, and is not described herein again.
S202, the wearable device 100 detects a current screen state.
S203, if the current screen state of the wearable device 100 is the screen-off state, the wearable device 100 determines the current sleep state of the user.
In the embodiments of the present application, the current screen state of the wearable device 100 is either the screen-on state or the screen-off state. When the screen of the wearable device 100 is already on, the first screen wake-up instruction may be executed directly to light the screen of the wearable device 100, so that the user can quickly browse or view the corresponding displayed content. When the screen of the wearable device 100 is off, screen lighting caused by the misoperation corresponding to the first screen wake-up instruction needs to be avoided; by determining the current sleep state of the user, the accuracy of judging repeated screen lighting caused by user misoperation can be improved, thereby reducing the power consumption added by such repeated screen lighting and preventing it from disturbing the user's sleep.
S204, if the current sleep state of the user meets the preset condition, the wearable device 100 lights up the screen after receiving the second screen wake-up instruction.
S204 can be described with reference to step S105 in the embodiment described in fig. 2, and is not described herein again.
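The flow of steps S201 to S204 can be sketched as follows, again with an assumed device interface (screen_on, determine_sleep_state, wait_for_second_wake, light_screen) used purely for illustration.

```python
# Hypothetical sketch of steps S201-S204 (fig. 3): the screen-state check comes first.

def handle_wake_with_screen_check(device) -> None:
    if device.screen_on:                      # S202: screen is already lit
        device.light_screen()                 # execute the first wake-up instruction directly
        return
    state = device.determine_sleep_state()    # S203: screen is off, evaluate the sleep state
    if state in ("asleep", "preparing"):      # S204: preset condition met
        if device.wait_for_second_wake():
            device.light_screen()
    else:
        device.light_screen()
```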
Referring to fig. 4, fig. 4 is a schematic flowchart illustrating another display control method according to an embodiment of the present disclosure. As shown in fig. 4, the method includes steps S301 to S304.
S301, the wearable device 100 receives a first screen wake-up instruction. S301 can be described with reference to step S101 in the embodiment described in fig. 2, and is not described herein again.
S302, the wearable device 100 determines whether the anti-false touch mode is turned on.
The anti-false-touch mode mentioned above is a preset mode for preventing misoperation by the user. In the anti-false-touch mode, a relatively complicated operation such as a double click, continuous clicks, or a slide is required to light the screen, while a simple click, a wrist lift, and the like cannot light the screen.
In this embodiment of the application, the user may set the anti-false-touch mode of the wearable device 100 to be turned on within a specified time period, for example from 22:00 to 7:00 of the next day. Within that time period, when the wearable device 100 is in the screen-off state and the user touches any screen area or a designated screen area of the wearable device 100, or lifts or turns over the wrist, the wearable device 100 detects a screen wake-up instruction generated by the current user operation. The wearable device 100 then needs to determine whether the user operation corresponding to the screen wake-up instruction is a preset user operation; if so, it lights up the screen. Otherwise, the user operation corresponding to the first screen wake-up instruction is determined to be a misoperation, the screen wake-up instruction is masked, and the screen of the wearable device 100 is not lit, thereby preventing the screen from being lit by user misoperation.
The preset user operation is an operation preset for waking up the screen in the anti-false-touch mode, such as a double click, continuous clicks, pressing any area or a designated area of the screen multiple times, a sliding operation performed in any area or a designated area of the screen, a combined press-and-wrist-lift operation, and the like.
S303, if the anti-false-touch mode is not turned on, the wearable device 100 determines the current sleep state of the user.
S304, if the current sleep state of the user meets the preset condition, the wearable device 100 lights up the screen after receiving the second screen wake-up instruction.
S304 can be described with reference to step S105 in the embodiment described in fig. 2, and is not described herein again.
According to the embodiments of the present application, by determining whether the anti-false-touch mode is turned on, the wearable device decides whether it needs to evaluate the user's current sleep stage to trigger the automatic anti-false-touch behavior, which improves the intelligence of the wearable device and meets the different needs of users.
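A sketch of steps S301 to S304 is given below; the anti_false_touch_enabled flag and the is_preset_wake_operation helper are illustrative assumptions, and the fallback branch mirrors the fig. 2 sketch above.

```python
# Hypothetical sketch of steps S301-S304 (fig. 4): the anti-false-touch mode is checked first.

def handle_wake_with_mode_check(device, operation: str) -> None:
    if device.anti_false_touch_enabled:
        # In the anti-false-touch mode only the preset, more deliberate operations light the screen.
        if device.is_preset_wake_operation(operation):
            device.light_screen()
        return                                # otherwise the instruction is masked
    # S303/S304: mode is off -> fall back to the sleep-state based decision of fig. 2
    state = device.determine_sleep_state()
    if state == "awake":
        device.light_screen()
    elif device.wait_for_second_wake():
        device.light_screen()
```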
Referring to fig. 5, fig. 5 is a schematic flowchart illustrating another display control method according to an embodiment of the present disclosure. As shown in fig. 5, the method includes steps S401 to S403.
S401, if the current sleep state of the user meets the preset condition, the wearable device 100 detects whether a second screen wake-up instruction is received.
S402, if the second screen wake-up instruction is received, the wearable device 100 executes the second screen wake-up instruction to light up the screen of the wearable device 100.
S403, if the second screen wake-up instruction is not received within the predetermined time, the wearable device 100 determines that the user operation corresponding to the first screen wake-up instruction is a misoperation, masks the first screen wake-up instruction, and does not light up the screen of the wearable device 100.
In the embodiments of the present application, when the wearable device 100 is in the screen-off state and the user clicks or presses the screen of the wearable device 100 for the first time, the wearable device 100 detects the first screen wake-up instruction generated by this click or press. After determining that the current sleep state of the user is asleep or preparing to fall asleep, the wearable device 100 does not execute the first screen wake-up instruction immediately; instead, it continues to detect, within a predetermined time, whether the user acts on the screen of the wearable device 100 a second time by sliding, double-clicking, or the like, that is, whether a second screen wake-up instruction generated by another user operation such as a slide or double click is received within the predetermined time. If the second screen wake-up instruction is received within the predetermined time, the wearable device 100 may execute the first screen wake-up instruction to light the screen, or may execute the second screen wake-up instruction to light the screen. If the second screen wake-up instruction is not received within the predetermined time, the wearable device 100 determines that the user operation corresponding to the first screen wake-up instruction is a misoperation, masks the first screen wake-up instruction, and does not light the screen of the wearable device 100.
According to the embodiments of the present application, whether to light the screen of the wearable device 100 is decided by judging whether a second screen wake-up instruction is received, which reduces repeated screen-lighting events caused by user misoperation, effectively reduces the discomfort that such screen lighting brings to the user, and effectively reduces the additional power consumption of the wearable device 100 caused by repeatedly lighting its screen.
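The timed wait of steps S401 to S403 could be sketched as follows. The 2-second window, the polling interval, and the poll_operation call are assumptions; a function like this could play the role of the wait_for_second_wake helper assumed in the earlier sketches.

```python
# Hypothetical sketch of steps S401-S403 (fig. 5): wait a predetermined time for a
# second wake-up instruction. Window length, polling interval and poll_operation are assumed.

import time

SECOND_WAKE_OPS = {"double_tap", "repeated_tap", "multi_press", "swipe", "press_and_wrist_raise"}

def wait_for_second_wake(device, window_s: float = 2.0, poll_s: float = 0.05) -> bool:
    """Return True if a second wake-up instruction arrives within the window."""
    deadline = time.monotonic() + window_s
    while time.monotonic() < deadline:
        op = device.poll_operation()          # assumed non-blocking read of the next user operation
        if op in SECOND_WAKE_OPS:
            return True                       # S402: execute the second instruction, light the screen
        time.sleep(poll_s)
    return False                              # S403: mask the first instruction as a misoperation
```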
In the embodiment of the present application, when determining the current sleep state of the user, the wearable device 100 acquires sleep detection data, and determines the current sleep state of the user according to the sleep detection data.
It should be noted that the sleep detection data is related detection data for determining the current sleep state of the user, and includes, but is not limited to, at least one of physiological characteristic data of the user, movement posture data of the user, and current environment data. Wherein the user physiological characteristic data includes but is not limited to heart rate, pulse, respiratory rate, brain wave signal and other data of the user; the user's motion gesture data includes, but is not limited to, the user's wrist gesture data; the current environment data includes, but is not limited to, ambient light level data of the environment in which wearable device 100 is currently located.
From the sleep detection data it can be judged whether the user is currently not yet asleep, already asleep, or preparing to fall asleep, and further whether the user operation corresponding to the first screen wake-up instruction should be judged to be a misoperation. If the user operation corresponding to the first screen wake-up instruction is not a misoperation, the wearable device 100 can execute the first screen wake-up instruction and light its screen; otherwise, another screen wake-up instruction, namely the second screen wake-up instruction, is required to light the screen of the wearable device 100, so as to reduce the extra power consumption of the wearable device 100 caused by repeated screen lighting due to user misoperation, prevent such repeated screen lighting from disturbing the user's sleep, and improve the user experience.
It should be noted that, from the physiological characteristic data of the user or the motion posture data of the user, it can be quickly determined whether the user is not yet asleep or already asleep. Accurately determining whether the user is preparing to fall asleep is the key to improving the accuracy with which the wearable device 100 judges whether the current user operation is a misoperation. Improving the accuracy of recognizing the preparing-to-sleep state therefore effectively improves the accuracy of judging whether the user operation corresponding to a given screen wake-up instruction is a misoperation, and thus achieves the purpose of preventing false touches by the user.
Referring to fig. 6, fig. 6 is a flowchart illustrating a method for determining a current sleep state of a user according to an embodiment of the present application. As shown in fig. 6, the method includes steps S501 to S503.
S501, the wearable device 100 obtains wrist gesture data of the user and an ambient light brightness of an environment where the wearable device 100 is currently located.
In the embodiment of the present application, the wrist gesture data of the user includes, but is not limited to, acceleration data of wrist movement, distance data of wrist movement, and the like, and the wearable device 100 may acquire the wrist gesture data of the user through the acceleration sensor 180E, the gyroscope sensor 180B, the distance sensor 180F, and the like. The ambient light level of the environment in which the wearable device 100 is currently located can be acquired by the ambient light sensor 180L.
S502, the wearable device 100 determines whether the acquired gesture data of the wrist of the user meets a first condition, and whether the ambient light brightness of the current environment of the wearable device 100 is lower than a preset brightness threshold.
In the embodiments of the present application, determining whether the acquired wrist gesture data of the user meets the first condition may specifically be: judging whether the mean, variance, median, or the like of the root mean square computed from the wrist gesture data collected by the wearable device 100 over a period of time falls within a corresponding preset value range; or judging whether the action level corresponding to the wrist gesture data of the user is a preset action level. If the mean, variance, median, or the like of the root mean square computed from the wrist gesture data over the period of time falls within the corresponding preset value range, or the action level corresponding to the wrist gesture data is the preset action level, it is determined that the acquired wrist gesture data of the user meets the first condition.
S503, if the acquired wrist gesture data of the user meet a first condition and/or the environmental light brightness of the current environment where the wearable device is located is lower than a preset brightness threshold, determining that the current sleep state of the user meets a preset condition.
In the embodiment of the application, the action grade corresponding to the wrist gesture data of the user is determined according to the wrist gesture data of the user; and if the action grade corresponding to the determined wrist gesture data of the user is a preset action grade, determining that the wrist gesture data of the user meets a first condition.
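A minimal sketch of this decision is shown below, assuming the action level has already been computed. The 10-lux threshold is a placeholder; the actual preset brightness threshold and the "and/or" combination are as described in the text.

```python
# Hypothetical sketch of the preset condition of steps S501-S503.
# Level 0/1 maps to asleep/preparing to fall asleep per the description; the lux value is a placeholder.

def sleep_state_meets_condition(action_level: int, ambient_lux: float,
                                lux_threshold: float = 10.0) -> bool:
    """Preset condition: the wrist is nearly still (level 0 or 1) and/or the room is dark."""
    wrist_condition = action_level <= 1       # level 0: asleep, level 1: preparing to fall asleep
    dark_condition = ambient_lux < lux_threshold
    return wrist_condition or dark_condition
```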
In some embodiments, the wearable device 100 acquires, through a three-axis acceleration sensor over a period of time, N acceleration samples in the x, y and z directions of the wearable device 100, denoted ACC_xn, ACC_yn, ACC_zn, where N is an integer greater than or equal to 1 and n ∈ [1, N]. The mean, median, or variance of the root mean square of the N samples ACC_xn, ACC_yn, ACC_zn is computed to obtain a corresponding first value, and the action level corresponding to the wrist gesture data of the user is determined according to the value range into which the first value falls.
In some specific embodiments, the action level corresponding to the wrist gesture data of the user may be divided into five levels (e.g., levels 0-4), where level 0 represents stillness, level 1 represents little motion or a small motion amplitude, level 2 represents a medium motion amplitude, level 3 represents more motion or a large motion amplitude, and level 4 represents a large amount of motion.
In some embodiments of the present application, the action levels corresponding to the wrist gesture data of the user may also be divided into more or fewer levels; the more levels there are, the more detailed the analysis of the motion posture data of the user's wrist and the more precise the determined action level.
Specifically, reference may be made to Table 1, which gives the value ranges corresponding to the action levels. Each action level corresponds to a different value range; for example, the value range corresponding to level 0 is a first mean range such as [α1, α2), a first median range such as [β1, β2), or a first variance range such as [γ1, γ2); the value range corresponding to level 1 is a second mean range such as [α2, α3), a second median range such as [β2, β3), or a second variance range such as [γ2, γ3); and so on.
TABLE 1 (provided as an image in the original publication): value ranges of the mean, median, and variance of the root mean square corresponding to each action level
In this embodiment of the application, the action level is determined from the wrist gesture data of the user using the acceleration of the user's wrist movement. That is, the wearable device 100 can acquire, through the three-axis acceleration sensor over a period of time, N acceleration samples of the wearable device 100 (the accelerations ACC_xn, ACC_yn, ACC_zn in the x, y and z directions). Specifically, the action level corresponding to the wrist gesture data of the user may be determined by computing the mean and the variance of the root mean square of the N acquired acceleration samples, or by taking the median of the root mean square of the N acceleration samples.
In some specific embodiments, upon receiving the first screen wake-up instruction, the wearable device 100 acquires the acceleration sample at the receiving time point (the time point at which the first screen wake-up instruction is received) and N-1 acceleration samples within a period of time before the receiving time point (so that N acceleration samples are acquired in total over the time range ending at the receiving time point). It then calculates the mean of the root mean square of the N acquired acceleration samples, matches the calculated mean against the mean values in the value table corresponding to the action levels, and determines the action level corresponding to the wrist gesture data of the user from the matching result.
Within the preset time range, acceleration data of the wearable device 100 in the time period [08:00:00, 08:00:10] is acquired, where the receiving time point is the end of the window and the time point obtained by pushing back a specific period is the start; for example, if the receiving time point is 08:00:10 (ten seconds past eight a.m.), pushing back 10 s gives a start time of 08:00:00 (eight a.m.).
In other specific embodiments, the variance of the root mean square of the acquired N pieces of acceleration data may also be calculated to obtain a variance, the calculated variance is matched with a value corresponding to the variance in the value table corresponding to the motion level, and the motion level corresponding to the wrist posture data of the user is determined according to the matching result.
In other specific embodiments, after acquiring the N pieces of acceleration data, a median of a root mean square of the N pieces of acceleration data may be acquired, the median is used as a first numerical value to be matched with a numerical value corresponding to the median in the numerical value table corresponding to the motion level, and the motion level corresponding to the wrist gesture data of the user is determined according to a matching result.
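The computation described above can be sketched as follows. The level boundaries are placeholders, since the actual ranges of Table 1 are not given in the text and depend on the sensor scale; the structure (root mean square per sample, one statistic, range lookup) follows the description.

```python
# Hypothetical sketch of the action-level computation: the root mean square of each
# three-axis sample is reduced to a single statistic and looked up against Table 1 style ranges.
# The boundary values are placeholders, not the patented values.

import math
import statistics

# Placeholder upper bounds of the mean ranges for levels 0..4 (the upper range bounds of Table 1).
LEVEL_MEAN_BOUNDS = [0.05, 0.15, 0.40, 0.80, float("inf")]

def action_level_from_acc(samples: list[tuple[float, float, float]]) -> int:
    """samples: N (ACC_x, ACC_y, ACC_z) readings collected over the detection window."""
    rms = [math.sqrt((x * x + y * y + z * z) / 3.0) for x, y, z in samples]
    first_value = statistics.mean(rms)        # the variance or median could be used instead
    for level, upper in enumerate(LEVEL_MEAN_BOUNDS):
        if first_value < upper:
            return level
    return len(LEVEL_MEAN_BOUNDS) - 1
```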
To improve the accuracy of judging the action level corresponding to the wrist gesture data of the user, the embodiments of the present application determine at least two candidate action levels from the acquired wrist gesture data and compare them; if at least two of the candidate action levels are identical, the level that occurs identically and most often is taken as the action level corresponding to the wrist gesture data of the user.
In some specific embodiments, after at least two action levels for the user's current wrist are determined from any two or more of the mean, the variance, or the median described above, the determined action levels are compared. If exactly two action levels were determined and they are consistent, that common level is taken as the action level corresponding to the wrist gesture data of the user; otherwise, N acceleration samples are acquired again to re-determine the action level. If more than two action levels were determined, and the proportion of the most frequent identical level among them to the total number of determined levels reaches a preset threshold, that level is taken as the action level corresponding to the wrist gesture data of the user; otherwise, N acceleration samples are acquired again to re-determine the action level. Alternatively, the user's action level may be further determined from the motion amplitude of the user's wrist, the obtained action levels compared, and the action level corresponding to the wrist gesture data determined from the comparison result.
For example, in some embodiments, the motion amplitude of the user's wrist, that is, the distance or angle by which the user's wrist moves relative to a certain reference point, is determined from data acquired by the acceleration sensor 180E or the gyroscope sensor 180B together with the distance sensor 180F. After the motion amplitude of the user's wrist is determined, it is compared with a preset amplitude value: if the determined motion amplitude is smaller than the preset amplitude value, the user's motion amplitude is considered small; if it is greater than or equal to the preset amplitude value, the motion amplitude is considered large. The corresponding action level is then determined according to the recognized motion amplitude of the user.
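A sketch of this comparison step is given below; the min_share agreement threshold is an assumed value standing in for the "preset threshold" mentioned above, and a None result signals that the acceleration data should be re-acquired.

```python
# Hypothetical sketch of combining several action-level estimates (e.g. one each from the
# mean, variance and median): the most frequent level wins if it is dominant enough.

from collections import Counter

def combine_action_levels(levels: list[int], min_share: float = 0.5) -> int | None:
    """Return the agreed action level, or None if the estimates do not agree enough."""
    if not levels:
        return None
    level, count = Counter(levels).most_common(1)[0]
    if len(levels) == 2:
        return level if count == 2 else None  # two estimates must match exactly
    return level if count / len(levels) >= min_share else None
```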
In the embodiments of the present application, the action level corresponding to the wrist gesture data of the user is determined from the wrist gesture data; if the determined action level is a preset action level, such as level 0 or level 1 above, it is determined that the wrist gesture data of the user meets the first condition.
For example, when the action level corresponding to the wrist gesture data of the user is level 0, the user may be considered to be asleep; when the action level is level 1, the user may be considered to be preparing to fall asleep; and when the action level is level 2 or above, the user may be considered not yet asleep.
When the ambient light brightness of the environment in which the wearable device 100 is currently located is not lower than the preset brightness threshold, lighting the screen of the wearable device 100 does not cause eye discomfort, but it still increases the power consumption of the wearable device 100. Therefore, when the ambient light brightness is not lower than the preset brightness threshold but the wrist gesture data of the user meets the first condition, the screen of the wearable device 100 still needs to be lit by means of a second screen wake-up instruction so that the user can view the displayed content when desired, which reduces the extra power consumption caused by repeated screen lighting due to user misoperation and prolongs the standby time of the wearable device 100.
In some embodiments, the current sleep state of the user may also be determined from physiological parameters such as heart rate and respiration, for example by judging whether the user's heart rate is lower than a preset heart rate value, whether the number of breaths within a preset time is lower than a preset number, and the like.
It can be understood that, when the current sleep state of the user is already asleep, in order to reduce the power consumption of the wearable device 100 caused by repeated screen lighting due to user misoperation, the first screen wake-up instruction is generally masked directly once the user is determined to be asleep. If the user suddenly wakes up and wants to light the screen, the screen needs to be lit through the second screen wake-up instruction, so that the wearable device can distinguish the screen wake-up operations of the user in different sleep stages, improving the user experience.
According to the embodiments of the present application, whether the current sleep state of the user meets the preset condition, that is, whether the user is already asleep or preparing to fall asleep, is determined by judging whether the acquired wrist gesture data of the user meets the first condition and whether the ambient light brightness of the environment in which the wearable device 100 is currently located is lower than the preset brightness threshold. This effectively improves the accuracy of judging whether the user operation corresponding to a given screen wake-up instruction is a misoperation, and thus achieves the purpose of preventing false touches by the user.
It should be noted that, due to the limitations of the wearable device 100 itself, when determining the current sleep state of the user it may not be possible to determine quickly and accurately whether the user is currently in the preparing-to-sleep stage or already asleep. In that case, after detecting the first screen wake-up instruction, the wearable device 100 may send a sleep-stage confirmation request instruction to a third-party electronic device, for example a smartphone bound to the wearable device 100. The sleep-stage confirmation request instruction instructs the third-party electronic device to judge the current sleep state of the user, and the third-party electronic device feeds the judgment result back to the wearable device 100. In other words, the execution subject of steps S501 to S503 may be an electronic device other than the wearable device 100.
Embodiments of the present application also provide a computer-readable storage medium having stored therein instructions, which when executed on a computer or processor, cause the computer or processor to perform one or more steps of any one of the methods described above.
The embodiment of the application also provides a computer program product containing instructions. The computer program product, when run on a computer or processor, causes the computer or processor to perform one or more steps of any of the methods described above.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in or transmitted over a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave, etc.) manner. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media. The usable medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media capable of storing program codes, such as ROM or RAM, magnetic or optical disks, etc.
The above description is only a specific implementation of the embodiments of the present application, but the scope of the embodiments of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the embodiments of the present application should be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A display control method, applied to a wearable device, characterized by comprising the following steps:
after receiving a first screen wake-up instruction, determining a current sleep state of a user;
and if the current sleep state of the user meets a preset condition, lighting up a screen of the wearable device after receiving a second screen wake-up instruction.
2. The display control method according to claim 1, further comprising, after receiving the first screen wake-up instruction:
detecting a current screen state of the wearable device;
and if the current screen state of the wearable device is in the screen-off state, determining the current sleep state of the user.
3. The display control method of claim 1, wherein if the current sleep state of the user meets a preset condition, lighting up the screen of the wearable device after receiving a second screen wake-up instruction comprises:
if the current sleep state of the user meets the preset condition, detecting whether a second screen wake-up instruction is received;
if the second screen wake-up instruction is received, executing the second screen wake-up instruction and lighting up the screen of the wearable device;
and if the second screen wake-up instruction is not received within a predetermined time, determining that the user operation corresponding to the first screen wake-up instruction is a misoperation, masking the first screen wake-up instruction, and not lighting up the screen of the wearable device.
4. The display control method of claim 1, further comprising, after determining the current sleep state of the user:
if the current sleep state of the user does not meet the preset condition, executing the first screen wake-up instruction and lighting up the screen of the wearable device.
5. The display control method of any one of claims 1 to 4, wherein the determining the current sleep state of the user comprises:
acquiring wrist gesture data of the user and the ambient light brightness of the current environment where the wearable device is located;
and if the acquired wrist gesture data of the user meets a first condition and/or the ambient light brightness of the environment in which the wearable device is currently located is lower than a preset brightness threshold, determining that the current sleep state of the user meets the preset condition.
6. The display control method according to claim 5, wherein the determining that the current sleep state of the user meets a preset condition if the acquired wrist gesture data of the user meets a first condition and/or the ambient light brightness of the environment where the wearable device is currently located is lower than a preset brightness threshold comprises:
and if the action level corresponding to the acquired wrist gesture data of the user is a preset action level and/or the ambient light brightness of the environment in which the wearable device is currently located is lower than the preset brightness threshold, determining that the current sleep state of the user meets the preset condition.
7. The display control method of claim 6, wherein determining the action level corresponding to the acquired wrist gesture data of the user as a preset action level comprises:
determining at least two corresponding action levels through the acquired wrist gesture data of the user;
comparing the at least two corresponding action levels;
and if at least two action levels with the same level exist among the at least two corresponding action levels, setting the level that occurs identically and most often as the action level corresponding to the acquired wrist gesture data of the user.
8. The display control method according to claim 1, wherein the first screen wake-up instruction is an instruction generated based on any user operation preset for waking up the screen of the wearable device, and the second screen wake-up instruction is an instruction generated based on any user operation preset for waking up the screen of the wearable device except for the user operation corresponding to the first screen wake-up instruction.
9. A wearable device, comprising: one or more processors, memory, and a display screen;
the memory, the display screen, and the one or more processors, the memory to store computer program code, the computer program code comprising computer instructions;
the computer instructions, when executed by the one or more processors, cause the wearable device to perform the display control method of any of claims 1-8.
10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the display control method according to any one of claims 1 to 8.
11. A computer program product, which, when run on the wearable device, causes the wearable device to perform the display control method of any of claims 1 to 8.
12. A chip system, comprising a processor coupled to a memory, the processor executing a computer program stored in the memory to implement a display control method according to any one of claims 1 to 8.
CN202010335463.9A 2020-04-24 2020-04-24 Display control method and wearable device Pending CN113552937A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010335463.9A CN113552937A (en) 2020-04-24 2020-04-24 Display control method and wearable device
PCT/CN2021/084004 WO2021213151A1 (en) 2020-04-24 2021-03-30 Display control method and wearable device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010335463.9A CN113552937A (en) 2020-04-24 2020-04-24 Display control method and wearable device

Publications (1)

Publication Number Publication Date
CN113552937A true CN113552937A (en) 2021-10-26

Family

ID=78101415

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010335463.9A Pending CN113552937A (en) 2020-04-24 2020-04-24 Display control method and wearable device

Country Status (2)

Country Link
CN (1) CN113552937A (en)
WO (1) WO2021213151A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113867168A (en) * 2021-11-02 2021-12-31 珠海格力电器股份有限公司 Control method and device of screen equipment, storage medium and screen equipment
CN115388511A (en) * 2022-08-17 2022-11-25 珠海格力电器股份有限公司 Air conditioner control method and device based on wearable device and electronic device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115079804B (en) * 2021-12-09 2023-11-07 荣耀终端有限公司 Control processing method of electronic equipment
CN114298105B (en) * 2021-12-29 2023-08-22 东莞市猎声电子科技有限公司 Signal processing method for quickly responding to wrist lifting action and brightening screen in running process
CN117008854A (en) * 2022-04-28 2023-11-07 华为技术有限公司 Screen-lighting control method, electronic equipment and computer readable storage medium


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060028429A1 (en) * 2004-08-09 2006-02-09 International Business Machines Corporation Controlling devices' behaviors via changes in their relative locations and positions
CN104158956B (en) * 2014-07-21 2016-03-30 小米科技有限责任公司 Terminal carries out method and the device of sleep awakening
CN104899029A (en) * 2015-05-28 2015-09-09 广东欧珀移动通信有限公司 Screen control method and apparatus
CN107155005A (en) * 2017-04-27 2017-09-12 上海斐讯数据通信技术有限公司 A kind of intelligent wrist wearable device bright screen control method and system

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140058679A1 (en) * 2012-08-23 2014-02-27 Apple Inc. Wake Status Detection for Suppression and Initiation of Notifications
CN105446479A (en) * 2014-09-23 2016-03-30 飞比特公司 Methods, systems, and apparatuses to display visibility changes responsive to user gestures
CN110638422A (en) * 2014-09-23 2020-01-03 飞比特公司 Method, system and device for updating screen content in response to user gesture
CN108293080A (en) * 2015-11-26 2018-07-17 华为技术有限公司 A kind of method of contextual model switching
CN105791545A (en) * 2016-02-24 2016-07-20 宇龙计算机通信科技(深圳)有限公司 Anti-disturbing method and device for terminal equipment
CN107436674A (en) * 2017-08-22 2017-12-05 深圳天珑无线科技有限公司 terminal control method, device and non-transitory computer-readable medium
CN107526603A (en) * 2017-09-20 2017-12-29 深圳天珑无线科技有限公司 One kind applies awakening method and device
CN110850988A (en) * 2019-12-02 2020-02-28 合肥工业大学 System and method for preventing interference and wrist lifting and screen lighting


Also Published As

Publication number Publication date
WO2021213151A1 (en) 2021-10-28

Similar Documents

Publication Publication Date Title
CN112577611B (en) Human body temperature measuring method, electronic equipment and computer readable storage medium
CN110989852B (en) Touch screen, electronic equipment and display control method
CN113552937A (en) Display control method and wearable device
CN113827185B (en) Wearing tightness degree detection method and device for wearing equipment and wearing equipment
CN113395382B (en) Method for data interaction between devices and related devices
WO2022007720A1 (en) Wearing detection method for wearable device, apparatus, and electronic device
CN113691271B (en) Data transmission method and wearable device
WO2022100407A1 (en) Intelligent eye mask, terminal device, and health management method and system
CN112334860B (en) Touch control method of wearable device, wearable device and system
CN115757906A (en) Index display method, electronic device and computer-readable storage medium
CN113467747B (en) Volume adjusting method, electronic device and storage medium
WO2022105830A1 (en) Sleep evaluation method, electronic device, and storage medium
CN115665632A (en) Audio circuit, related device and control method
WO2021204036A1 (en) Sleep risk monitoring method, electronic device and storage medium
CN114221402A (en) Charging method and device of terminal equipment and terminal equipment
CN113467735A (en) Image adjusting method, electronic device and storage medium
CN113918003A (en) Method and device for detecting time length of skin contacting screen and electronic equipment
CN113539487A (en) Data processing method and device and terminal equipment
CN113391735A (en) Display form adjusting method and device, electronic equipment and storage medium
CN113509145B (en) Sleep risk monitoring method, electronic device and storage medium
WO2023237087A1 (en) Method for predicting fertile window, apparatus and electronic device
CN114115513B (en) Key control method and key device
CN111026285B (en) Method for adjusting pressure threshold and electronic equipment
WO2021203941A1 (en) Prompting method and apparatus
CN114610195B (en) Icon display method, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination