CN113552937B - Display control method and wearable device - Google Patents


Info

Publication number
CN113552937B
CN113552937B (application CN202010335463.9A)
Authority
CN
China
Prior art keywords
user
screen
wearable device
instruction
sleep state
Prior art date
Legal status
Active
Application number
CN202010335463.9A
Other languages
Chinese (zh)
Other versions
CN113552937A (en)
Inventor
张慧
李靖
周林峰
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202010335463.9A
Priority to PCT/CN2021/084004 (published as WO2021213151A1)
Publication of CN113552937A
Application granted
Publication of CN113552937B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/4401: Bootstrapping
    • G06F 9/4418: Suspend and resume; Hibernate and awake

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a display control method and a wearable device. In the method, after receiving a first screen wake-up instruction, the wearable device determines the current sleep state of the user; if the current sleep state meets a preset condition, the screen of the wearable device is lit only after a second screen wake-up instruction is received. The first screen wake-up instruction is generated by any preset user operation for waking the screen; the second screen wake-up instruction is generated by any preset screen-waking user operation other than the operation corresponding to the first screen wake-up instruction. This effectively reduces the extra power consumption caused by repeatedly lighting the screen through accidental user operations, while still letting the user view the screen content conveniently: when needed, the second screen wake-up instruction lights the screen quickly, improving the user experience.

Description

Display control method and wearable device
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a display control method and a wearable device.
Background
Wearable devices are a general term for devices that can be worn every day and are intelligently designed using wearing technology, such as smart glasses, smart gloves, smart watches, smart clothing and smart shoes. Among them, smart watches and smart bracelets are the most prominent. As people's health awareness grows, the health-monitoring functions of wearable devices are increasingly popular. Some of these functions, such as sleep monitoring and sleep apnea monitoring, require the user to wear the device at night so that the relevant sleep data can be collected.
However, if the user inadvertently presses or touches the screen of the wearable device during night sleep, especially while falling asleep, the screen wakes up, and the change in screen brightness may disturb the user's sleep. For users whose sleep quality is poor or who have difficulty falling asleep, the experience is particularly bad. Repeatedly lighting the screen also increases the power consumption of the wearable device and shortens its standby time.
Disclosure of Invention
The present application discloses a display control method and a wearable device, which can alleviate the problem that screens repeatedly lit by accidental user operations increase the power consumption of the wearable device.
In a first aspect, an embodiment of the present application provides a display control method applied to a wearable device. The method includes: after receiving a first screen wake-up instruction, the wearable device determines the current sleep state of the user; if the current sleep state of the user meets a preset condition, the screen of the wearable device is lit only after a second screen wake-up instruction is received. The first screen wake-up instruction is generated by any preset user operation for waking the screen of the wearable device; the second screen wake-up instruction is generated by any preset screen-waking user operation other than the user operation corresponding to the first screen wake-up instruction. This effectively reduces the extra power consumption caused by repeatedly lighting the screen through accidental user operations, prevents such repeated lighting from disturbing the user's sleep, and still allows the user to wake the screen quickly via the second screen wake-up instruction when the screen content needs to be viewed, improving the user experience.
A screen wake-up instruction in the embodiments of the application is an instruction, generated in response to a user operation, that instructs the wearable device to turn on its screen. When the screen is off and the user touches (for example, clicks) any area or a designated area of the wearable device, or lifts or turns the wrist, the wearable device generates a screen wake-up instruction to light the screen; the designated area is any screen area preset to respond to the user's touch operation. That is, a screen wake-up instruction may be generated when the user touches any screen area of the wearable device, when the user touches a specified screen area, or when the user lifts or turns the wrist.
Illustratively, the above user operations include, but are not limited to, a click, a double click, continuous clicks, a press, multiple presses, a slide, a wrist-turn, or a combination of at least two of these, such as press + wrist-turn.
For example, the first screen wake-up instruction may be generated by a user operation preset for the normal mode, such as a wrist-lift, wrist-turn, click or tap. The second screen wake-up instruction may be generated by a user operation preset for the anti-false-touch mode and different from those of the normal mode, such as a double click, continuous clicks, multiple presses, a slide, or a combined wrist-lift-and-press.
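To make this split concrete, the two instruction classes can be modeled as disjoint sets of trigger operations. The sketch below is illustrative only: the operation names come from the examples above, and the exact sets would be configurable on a real device.

```python
from typing import Optional

# Hypothetical mapping of user operations to wake-up instruction classes.
# The operation names follow the examples in the description above.
FIRST_WAKE_OPS = {"raise_wrist", "turn_wrist", "single_click", "tap"}
SECOND_WAKE_OPS = {"double_click", "continuous_click", "multi_press",
                   "slide", "raise_wrist_plus_press"}

def classify_wake_operation(op: str) -> Optional[str]:
    """Return which wake-up instruction a user operation generates, if any."""
    if op in FIRST_WAKE_OPS:
        return "first_wake_instruction"
    if op in SECOND_WAKE_OPS:
        return "second_wake_instruction"
    return None  # not a screen wake-up operation at all
```

Because the two sets are disjoint, an operation can never be ambiguous between the two instruction classes, which matches the requirement that the second instruction be generated by an operation other than the one behind the first.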
With reference to the first aspect, in a first possible implementation manner of the first aspect, after the wearable device receives the first screen wake-up instruction, the method further includes: detecting the current screen state of the wearable device; and, if the screen is currently off, determining the current sleep state of the user.
The sleep state is one of three states: not asleep, asleep, and ready to fall asleep. That the current sleep state of the user meets the preset condition specifically means that the user is asleep or ready to fall asleep.
In this embodiment, the wearable device first detects its current screen state and determines the user's current sleep state only after confirming that the screen is off. When the screen is already on, the wearable device can thus respond quickly to other user operations, improving its response efficiency. Only when the screen is off does the user's sleep state need to be determined, which serves the goal of reducing the extra power consumption caused by accidental repeated screen lighting and of preventing such lighting from disturbing the user's sleep.
With reference to the first aspect, in a second possible implementation manner of the first aspect, after the wearable device receives the first screen wake-up instruction, the method further includes: the wearable device determines whether the anti-false-touch mode is on. If it is on, the wearable device determines whether the user operation corresponding to the first screen wake-up instruction is a preset user operation; if so, the first screen wake-up instruction is executed and the screen is lit; otherwise, the operation is treated as accidental, the first screen wake-up instruction is masked, and the screen stays off. If the anti-false-touch mode is not on, the wearable device determines the sleep stage in which the user is currently located.
In this embodiment, determining whether the anti-false-touch mode is on decides whether the automatic false-touch protection is triggered, i.e. whether the user's current stage must be determined so that, when the current sleep state meets the preset condition, the screen is lit only after the second screen wake-up instruction is received. This increases the intelligence of the wearable device and serves different user needs.
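The branching just described can be sketched as follows. `WearableStub` and its fields are hypothetical stand-ins for the device state, not part of the patent; the mode-off branch is simplified to lighting the screen directly, where a real device would continue with the sleep-stage check.

```python
from dataclasses import dataclass, field

@dataclass
class WearableStub:
    """Minimal, illustrative stand-in for the wearable device state."""
    anti_false_touch: bool = True
    preset_ops: set = field(default_factory=lambda: {"double_click"})
    screen_on: bool = False

def handle_first_wake(dev: WearableStub, op: str) -> bool:
    """With anti-false-touch mode on, only a preset operation lights the
    screen; any other operation is masked as a false touch."""
    if dev.anti_false_touch:
        if op in dev.preset_ops:
            dev.screen_on = True   # preset operation: execute the instruction
            return True
        return False               # masked: the screen stays off
    dev.screen_on = True           # mode off: simplified normal-mode path
    return True
```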
With reference to the first aspect, in a third possible implementation manner of the first aspect, lighting the screen of the wearable device after receiving a second screen wake-up instruction if the user's current sleep state meets the preset condition includes: if the current sleep state meets the preset condition, detecting whether the second screen wake-up instruction is received; if it is received, executing it and lighting the screen of the wearable device; if it is not received within a preset time, treating the user operation corresponding to the first screen wake-up instruction as accidental, masking the first screen wake-up instruction, and leaving the screen off.
In this embodiment, whether to light the screen of the wearable device is decided by judging whether the second screen wake-up instruction is received. This reduces repeated screen-lighting events caused by accidental user operations, the discomfort such lighting causes the user, and the extra power consumption it adds, and prevents accidental repeated lighting from disturbing the user's sleep.
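A minimal sketch of the timed wait, assuming a polling interface and a 3-second default window (the patent only says "a preset time", so both the polling style and the window length are assumptions):

```python
import time

def await_second_wake(poll_fn, timeout_s: float = 3.0,
                      interval_s: float = 0.05) -> bool:
    """Wait up to `timeout_s` for a second screen wake-up instruction.

    `poll_fn()` should return True once the second instruction has been
    received. Returns True (light the screen) or False (first instruction
    judged a false touch and masked).
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if poll_fn():
            return True          # execute the second instruction
        time.sleep(interval_s)   # keep polling until the window closes
    return False                 # timed out: mask the first instruction
```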
With reference to the first aspect, in a fourth possible implementation manner of the first aspect, determining the current sleep state of the user after receiving the first screen wake-up instruction includes: the wearable device acquires sleep detection data of the user, and determines the user's current sleep state from that data.
The sleep detection data are the detection data used to judge the user's current sleep state, and include at least one of the user's physiological characteristic data, the user's motion posture data, and current environment data. The physiological characteristic data include, but are not limited to, the user's heart rate, pulse, respiratory rate and brain-wave signals; the motion posture data include, but are not limited to, the user's wrist posture data; the current environment data include, but are not limited to, the ambient light brightness of the environment in which the wearable device is currently located.
From the sleep detection data, this embodiment can judge whether the user is currently not asleep, asleep, or ready to fall asleep, and hence whether the user operation corresponding to the first screen wake-up instruction should be judged accidental. If it is not accidental, i.e. the current sleep state does not meet the preset condition, the wearable device executes the first screen wake-up instruction and lights the screen; otherwise a wake-up instruction generated by another user operation, i.e. the second screen wake-up instruction, is required to light the screen. This reduces the power consumption added by accidental repeated screen lighting, reduces the chance that repeated lighting degrades the user's sleep quality, and improves the user experience.
It should be noted that the states of not being asleep and being asleep can be determined quickly from the user's physiological characteristic data or motion posture data. Accurately judging whether the user is ready to fall asleep is the key to deciding whether the current user operation is accidental; improving that judgment therefore directly improves the accuracy of deciding whether the operation behind a given screen wake-up instruction is accidental, achieving better false-touch protection.
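As a rough illustration, the three-way classification described above might look like the sketch below. All thresholds and input names are invented for the sketch; the patent does not specify them, and a real device would derive them from its sensor calibration.

```python
def classify_sleep_state(heart_rate: float, motion_level: int,
                         ambient_lux: float) -> str:
    """Illustrative three-way sleep-state classification from detection data.
    Thresholds are assumptions made for the example only."""
    if heart_rate < 55 and motion_level == 0:
        return "asleep"
    if heart_rate < 65 and motion_level <= 1 and ambient_lux < 10:
        return "ready_to_sleep"
    return "awake"

def meets_preset_condition(state: str) -> bool:
    """The preset condition holds when the user is asleep or ready to sleep."""
    return state in {"asleep", "ready_to_sleep"}
```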
Illustratively, when the sleep detection data include the user's wrist posture data and the ambient light brightness of the environment in which the wearable device is currently located, determining the user's current sleep state from the sleep detection data includes:
determining the action level corresponding to the user's wrist posture data; and, if that action level is a preset action level, determining that the user's wrist posture data meet a first condition.
Correspondingly, determining that the user's current sleep state meets the preset condition when the acquired wrist posture data meet the first condition and/or the ambient light brightness of the current environment is below a preset brightness threshold includes: if the action level corresponding to the acquired wrist posture data is a preset action level and/or the ambient light brightness of the current environment of the wearable device is below the preset brightness threshold, determining that the user's current sleep state meets the preset condition.
In this embodiment, whether the wrist posture data meet the first condition is determined by judging whether the corresponding action level is a preset action level; that is, when the first condition is that the action level of the user's wrist is a preset action level, the wrist posture data can be considered to meet the first condition, which improves the accuracy of judging the user's current sleep state.
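The "and/or" combination above can be sketched as a single predicate. The preset action levels and the lux threshold below are assumed values, and the OR form is one of the readings the "and/or" wording permits.

```python
# Assumed: low-activity action levels that count as "preset" for sleep.
PRESET_ACTION_LEVELS = {0, 1}

def sleep_condition_met(action_level: int, ambient_lux: float,
                        lux_threshold: float = 10.0) -> bool:
    """Preset condition: the wrist action level is a preset (low-activity)
    level, and/or the ambient light is below the brightness threshold."""
    return action_level in PRESET_ACTION_LEVELS or ambient_lux < lux_threshold
```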
With reference to the first aspect, in a fifth possible implementation manner of the first aspect, determining that the action level corresponding to the acquired wrist posture data is a preset action level includes: determining at least two candidate action levels from the acquired wrist posture data; comparing the candidate action levels; and, if two or more of them are the same, setting the level that occurs the greatest number of times as the action level corresponding to the wrist posture data.
In this embodiment, the candidate action levels are determined from the acquired wrist posture data, and the level that occurs most frequently is set as the action level corresponding to the wrist posture data. This improves the accuracy of that level, and in turn the accuracy of judging whether the user is ready to fall asleep and whether the operation behind a given screen wake-up instruction is accidental, achieving better false-touch protection.
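Interpreted this way, the level-selection rule can be sketched as a small function. Breaking ties toward the higher level is an interpretive assumption; the patent text does not pin down the tie-break.

```python
from collections import Counter

def pick_action_level(candidate_levels: list) -> int:
    """Given at least two candidate action levels derived from wrist posture
    data, pick the level that occurs the greatest number of times.
    Ties are broken toward the higher level (an assumption)."""
    counts = Counter(candidate_levels)
    best_count = max(counts.values())
    return max(level for level, n in counts.items() if n == best_count)
```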
In a second aspect, the present application provides a wearable device comprising: one or more processors, memory, and a display screen; the memory, the display screen, and the one or more processors are coupled, the memory is configured to store computer program code, the computer program code includes computer instructions that, when executed by the one or more processors, cause the wearable device to perform a method as provided by any one of the possible implementations of the first aspect.
In a third aspect, the present application provides a computer storage medium comprising computer instructions which, when run on a wearable device, cause the wearable device to perform a method as provided by any one of the possible implementations of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer program product which, when run on a computer, causes the computer, for example the wearable device described above, to perform the method provided by any one of the possible implementations of the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip system, including a processor, where the processor is coupled to a memory, and where the processor executes a computer program stored in the memory, to cause the wearable device to perform a method as provided in any one of possible implementation manners of the first aspect. The chip system can be a single chip or a chip module formed by a plurality of chips.
It will be appreciated that the wearable device of the second aspect, the computer storage medium of the third aspect, the computer program product of the fourth aspect and the chip system of the fifth aspect provided above are all configured to perform the method provided by the first aspect. Their advantages therefore follow from those of the corresponding method and are not repeated here.
Drawings
The drawings to which embodiments of the present application are applied are described below.
Fig. 1 is a schematic structural diagram of a wearable device 100 provided in an embodiment of the present application;
Fig. 2 is a schematic flow chart of a display control method according to an embodiment of the present application;
FIG. 3 is a flowchart of another display control method according to an embodiment of the present application;
FIG. 4 is a flowchart of another display control method according to an embodiment of the present application;
FIG. 5 is a flowchart of another display control method according to an embodiment of the present application;
fig. 6 is a flowchart of a method for determining a current sleep state of a user according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described below with reference to the accompanying drawings. The terminology used in the description of the embodiments is for describing particular embodiments only and is not intended to limit the application.
First, the wearable device according to the embodiment of the present application is described. Referring to fig. 1, fig. 1 is a schematic structural diagram of a wearable device 100 according to an embodiment of the application.
The wearable device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiments of the present application does not constitute a specific limitation on the wearable device 100. In other embodiments of the application, the wearable device 100 may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be, among other things, a neural hub and a command center of the wearable device 100. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may couple the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the wearable device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bi-directional communication bus, and converts data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example, the processor 110 communicates with a Bluetooth module in the wireless communication module 160 through a UART interface to implement a Bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a Bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as the display screen 194 and the camera 193. The MIPI interfaces include a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the photographing function of the wearable device 100. The processor 110 and the display screen 194 communicate through a DSI interface to implement the display function of the wearable device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal interface or as a data signal interface. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, or the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the wearable device 100, and may also be used to transfer data between the wearable device 100 and a peripheral device. It may also be used to connect a headset to play audio through the headset. The interface may further be used to connect other electronic devices, such as an AR device.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the wearable device 100. In other embodiments of the present application, the wearable device 100 may also use different interfacing manners, or a combination of multiple interfacing manners, in the above embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the wearable device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is configured to connect the battery 142 and the charge management module 140 to the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In other embodiments, the power management module 141 may alternatively be disposed in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the wearable device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the wearable device 100 may be used to cover one or more communication bands. Different antennas may also be multiplexed to improve antenna utilization. For example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, an antenna may be used in combination with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G or the like for use on the wearable device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the wearable device 100, including wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a to-be-transmitted signal from the processor 110, perform frequency modulation and amplification on it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 and the mobile communication module 150 of the wearable device 100 are coupled, and the antenna 2 and the wireless communication module 160 are coupled, so that the wearable device 100 may communicate with a network and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The wearable device 100 implements display functions through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing and is connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini-LED, a micro-LED, a micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
The wearable device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, during photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, where it is converted into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin tone of the image, as well as parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be disposed in the camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the wearable device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals, and can process other digital signals in addition to digital image signals. For example, when the wearable device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy.
Video codecs are used to compress or decompress digital video. The wearable device 100 may support one or more video codecs. In this way, the wearable device 100 may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent cognition of the wearable device 100 can be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the wearable device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes the instructions stored in the internal memory 121 to perform various functional applications and data processing of the wearable device 100. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store the operating system and an application program required by at least one function (such as a sound playing function or an image playing function). The data storage area may store data created during use of the wearable device 100 (such as audio data and a phone book). In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The wearable device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The wearable device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
The receiver 170B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the wearable device 100 is used to answer a call or a voice message, the voice may be heard by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mike" or "mic", is used to convert a sound signal into an electrical signal. When making a call or sending voice information, the user may speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C. The wearable device 100 may be provided with at least one microphone 170C. In other embodiments, the wearable device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the wearable device 100 may alternatively be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording functions, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as a resistive pressure sensor, an inductive pressure sensor, and a capacitive pressure sensor. A capacitive pressure sensor may comprise at least two parallel plates made of a conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the wearable device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the wearable device 100 detects the intensity of the touch operation through the pressure sensor 180A. The wearable device 100 may also calculate the touch location from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
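The pressure-threshold dispatch described above can be sketched as follows; this is an illustrative sketch only, in which the threshold value and the instruction names are assumptions made for illustration and do not come from the embodiment:

```python
# Hypothetical sketch of intensity-dependent instruction dispatch; the
# threshold and instruction names below are illustrative assumptions.

FIRST_PRESSURE_THRESHOLD = 0.5  # normalized pressure intensity (assumed)

def dispatch_touch_on_sms_icon(intensity: float) -> str:
    """Return the instruction executed for a touch on the SMS app icon."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"      # light touch below the threshold
    return "create_new_short_message"    # press at or above the threshold
```

The same location thus maps to different instructions depending only on the detected intensity.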
The gyro sensor 180B may be used to determine a motion posture of the wearable device 100. In some embodiments, the angular velocity of the wearable device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the wearable device 100, calculates the distance the lens module needs to compensate for according to the angle, and causes the lens to counteract the shake of the wearable device 100 through reverse motion, thereby implementing anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, wearable device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The wearable device 100 can detect the opening and closing of a flip cover using the magnetic sensor 180D. In some embodiments, when the wearable device 100 is a clamshell device, the wearable device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flip opening may then be set according to the detected open or closed state of the holster or flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the wearable device 100 in various directions (typically three axes).
The distance sensor 180F is used to measure distance. The wearable device 100 may measure distance by infrared or laser light. In some embodiments, in a shooting scenario, the wearable device 100 may measure distance using the distance sensor 180F to achieve quick focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The wearable device 100 emits infrared light outwards through the light emitting diode and detects infrared light reflected from nearby objects using the photodiode. When sufficient reflected light is detected, it may be determined that there is an object near the wearable device 100; when insufficient reflected light is detected, the wearable device 100 may determine that there is no object nearby. The wearable device 100 can use the proximity light sensor 180G to detect that the user is holding the wearable device 100 close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode or pocket mode to automatically unlock and lock the screen.
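The proximity decision above reduces to a simple threshold check; the following is a hedged sketch in which the threshold value and units are assumed, not taken from the embodiment:

```python
# Illustrative sketch of proximity-based screen-off; threshold is assumed.

REFLECTED_LIGHT_THRESHOLD = 100  # photodiode reading, assumed units

def object_nearby(reflected_light: int) -> bool:
    """True when sufficient infrared reflected light is detected."""
    return reflected_light >= REFLECTED_LIGHT_THRESHOLD

def screen_should_turn_off(in_call: bool, reflected_light: int) -> bool:
    """Turn the screen off to save power when the device is held to the ear."""
    return in_call and object_nearby(reflected_light)
```

Outside a call, a nearby object alone does not turn the screen off in this sketch.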
The ambient light sensor 180L is used to sense ambient light level. The wearable device 100 may adaptively adjust the display screen 194 brightness according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether wearable device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The wearable device 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the wearable device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the wearable device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the wearable device 100 heats the battery 142 to avoid an abnormal shutdown of the wearable device 100 caused by low temperature. In other embodiments, when the temperature is below yet another threshold, the wearable device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
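The threshold-based temperature strategy described above can be sketched as follows; the specific threshold values are illustrative assumptions, not figures from the embodiment:

```python
# Illustrative sketch of the three-threshold temperature policy; the
# threshold values below are assumptions for illustration only.

T_THROTTLE = 45.0   # degrees C: reduce processor performance above this
T_HEAT = 0.0        # degrees C: heat the battery below this
T_BOOST = -10.0     # degrees C: boost battery output voltage below this

def thermal_actions(temp_c: float) -> list:
    """Return the list of actions taken at the reported temperature."""
    actions = []
    if temp_c > T_THROTTLE:
        actions.append("reduce_processor_performance")
    if temp_c < T_HEAT:
        actions.append("heat_battery")
    if temp_c < T_BOOST:
        actions.append("boost_battery_output_voltage")
    return actions
```

Note that below the lowest threshold both low-temperature actions apply, while in the normal range no action is taken.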
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may alternatively be disposed on the surface of the wearable device 100 at a location different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of a vibrating bone block of the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure beating signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset to form a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone block of the vocal part obtained by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The wearable device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the wearable device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, and may be used to indicate a charging state, a change in battery level, a message, a missed call, a notification, and the like.
The SIM card interface 195 is used to connect a SIM card. A SIM card may be brought into contact with or separated from the wearable device 100 by inserting it into, or removing it from, the SIM card interface 195. The wearable device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 195 simultaneously; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and with external memory cards. The wearable device 100 interacts with the network through the SIM card to implement functions such as calling and data communication. In some embodiments, the wearable device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the wearable device 100 and cannot be separated from it.
Next, some terms involved in the embodiments of the present application are explained for easy understanding by those skilled in the art.
It should be noted that the following embodiments of the present application involve a screen wake-up instruction, which refers to an instruction generated in response to a user operation to light up the screen of the wearable device 100. When the wearable device 100 is in the screen-off state and the user touches (for example, clicks) any area or a designated area of the wearable device 100, or the user lifts or turns the wrist, the wearable device 100 generates a screen wake-up instruction to light up the screen, where the designated area is any preset screen area for responding to the user's touch operation. The screen wake-up instruction may be generated when the user touches any screen area of the wearable device 100, when the user touches a designated screen area of the wearable device 100, or when the user lifts or turns the wrist.
The user operation according to the embodiment of the present application includes, but is not limited to, a click operation, a double click operation, a continuous click operation, a press operation, a multiple press operation, a slide operation, a wrist turning operation, or a combination operation of at least two of the above user operations, such as a press+wrist turning operation.
The screen wake-up instruction in the embodiments of the present application includes a first screen wake-up instruction and a second screen wake-up instruction. The first screen wake-up instruction is an instruction generated based on any preset user operation for waking up the screen of the wearable device 100. The second screen wake-up instruction is an instruction generated based on any preset user operation for waking up the screen of the wearable device 100 other than the operation corresponding to the first screen wake-up instruction.
It may be understood that the first screen wake-up instruction may be a screen wake-up instruction generated by a preset user operation in the normal mode; for example, the user operations corresponding to the first screen wake-up instruction include a wrist-lifting operation, a wrist-turning operation, a clicking operation, a tapping operation, and the like. The second screen wake-up instruction may be a screen wake-up instruction generated in the false-touch prevention mode by a user operation different from those of the normal mode; for example, the user operations corresponding to the second screen wake-up instruction include a double-click operation, a continuous-click operation, a multiple-press operation, a sliding operation, a combined wrist-lifting and pressing operation, and the like.
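The two wake-up instruction classes above can be sketched as a simple mapping; the operation names are assumptions made for illustration, not identifiers from the embodiment:

```python
# Hedged sketch of the first/second wake-instruction classification; the
# operation names are illustrative assumptions.
from typing import Optional

NORMAL_MODE_OPS = {"wrist_lift", "wrist_turn", "click", "tap"}
ANTI_FALSE_TOUCH_OPS = {"double_click", "continuous_click",
                        "multi_press", "slide", "wrist_lift+press"}

def classify_wake_instruction(operation: str) -> Optional[str]:
    """Map a user operation to the wake-up instruction it generates, if any."""
    if operation in NORMAL_MODE_OPS:
        return "first_screen_wake_instruction"
    if operation in ANTI_FALSE_TOUCH_OPS:
        return "second_screen_wake_instruction"
    return None  # the operation does not wake the screen
```

The second set is deliberately harder to trigger accidentally, which is what makes it suitable for the false-touch prevention mode.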
The following describes in detail a display control method provided by an embodiment of the present application based on the wearable device 100 shown in fig. 1 with reference to other drawings.
In some application scenarios, the wearable device 100, such as a smart band or a smart watch, mainly lights up the screen through actions such as touching or pressing the screen or lifting the wrist. Although the screen can be quickly awakened in this way, there are certain disadvantages. For example, when the user is sleeping at night, the screen of the wearable device 100 is easily awakened because the user, or a bed partner, may accidentally press or touch the screen; in particular, after the user turns on the raise-to-wake function of the wearable device 100, even turning over can wake up the screen. Repeated screen lighting during night sleep affects the user's sleep and also increases the power consumption of the wearable device 100. How to improve the recognition accuracy of the screen wake-up action, avoid falsely triggered screen lighting, reduce the power consumption added by repeated screen lighting, and reduce the impact on the user's sleep quality is a major technical problem to be solved at present.
In the prior art, in order to prevent the user from being disturbed during night sleep, a do-not-disturb mode is provided on most wearable devices 100. After the user turns on the do-not-disturb mode, incoming call information and notification information such as WeChat notifications and QQ notifications do not trigger vibration of the wearable device 100, and lifting the wrist does not light up the screen. There are currently two methods for turning on the do-not-disturb mode of the wearable device 100: (1) turning it on for a specified time period, for example, the user sets the do-not-disturb mode to be on during a specific time period, or the mode stays on until the user turns it off; and (2) turning it on intelligently, where upon recognizing that the user is in a sleep state, the wearable device 100 stops receiving information notifications.
However, both methods for turning on the do-not-disturb mode have disadvantages. For the first method, on the one hand, turning the mode on for a specified time period relies on a time period set by the user, which is not intelligent enough; on the other hand, after the do-not-disturb mode is turned on, it only blocks interference from external information such as incoming calls, WeChat notifications, and QQ notifications, or from raise-to-wake, and cannot prevent screen-lighting events caused by accidental false touches or false presses by the user or a bed partner at night. For the second method, although the do-not-disturb mode can be turned on intelligently according to the user's sleep state, on the one hand, it likewise only blocks interference from external information and cannot prevent screen-lighting events caused by accidental false touches or false presses by the user or a bed partner at night; on the other hand, the do-not-disturb mode is turned on only when the user is recognized as already asleep, whereas in fact, when the user is in the falling-asleep stage, i.e., half asleep, the user is more easily disturbed by the screen of the wearable device 100 lighting up, and for a user who has difficulty falling asleep, this seriously affects sleep quality.
The embodiment of the application provides a display control method, which can effectively reduce the extra power consumption of the wearable device 100 caused by repeated screen lighting due to user misoperation, prevent such repeated screen lighting from disturbing the user's sleep, and improve the user experience.
Referring to fig. 2, fig. 2 is a flow chart of a display control method according to an embodiment of the application. As shown in fig. 2, the method includes steps S101 to S105.
S101, the wearable device 100 receives a first screen wake instruction.
In the embodiment of the present application, after a user touches any screen area or designated screen area of the wearable device 100, or the user lifts or turns his wrist, the wearable device 100 will receive a screen wake-up instruction to wake up the screen, so that the user can conveniently perform corresponding operations.
S102, the wearable device 100 determines the current sleep state of the user.
S103, the wearable device 100 judges whether the current sleep state of the user meets preset conditions.
S104, if the current sleep state of the user does not meet the preset condition, the wearable device 100 executes the first screen wake-up instruction and lights up the screen of the wearable device 100.
The sleep states in the embodiment of the application comprise three states: not having entered sleep, having entered sleep, and being ready to enter sleep. Determining the current sleep state of the user means determining which of these three states the user is in. By determining the sleep state, the first screen wake-up instruction can be executed directly when the user has not entered sleep, lighting up the screen of the wearable device 100; when the user has entered sleep or is ready to enter sleep, the screen of the wearable device 100 is lit up only through a second screen wake-up instruction. This avoids the increased power consumption of the wearable device 100 caused by repeated screen lighting due to misoperation in these two sleep states, prevents such repeated screen lighting from disturbing the user's sleep, and improves the user experience.
S105, if the current sleep state of the user meets the preset condition, the wearable device 100 lights up the screen after receiving the second screen wake-up instruction.
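The decision logic of steps S101 to S105 can be sketched as follows. This is a minimal illustration assuming a simple three-state sleep model; the enum, function names, and boolean parameters are illustrative and not part of the patent:

```python
from enum import Enum

class SleepState(Enum):
    AWAKE = 0           # has not entered sleep
    FALLING_ASLEEP = 1  # ready to enter the sleep state
    ASLEEP = 2          # has entered the sleep state

def meets_preset_condition(state):
    """Preset condition of S103: asleep or about to fall asleep."""
    return state in (SleepState.ASLEEP, SleepState.FALLING_ASLEEP)

def should_light_screen(state, is_second_instruction):
    """S104/S105: return True if the screen should be lit up."""
    if not meets_preset_condition(state):
        return True                  # S104: execute the first instruction directly
    return is_second_instruction     # S105: only a second instruction lights the screen
```

For example, a first wake-up instruction while the user is awake lights the screen, whereas the same instruction while the user is falling asleep is held back until a second instruction arrives.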
In the embodiment of the application, the current sleep state of the user meets the preset condition specifically when the current sleep state of the user is the entered-sleep state or the ready-to-enter-sleep state.
During night sleep, if the user's current sleep state is ready to enter sleep and the user inadvertently touches any screen area or a designated screen area of the wearable device 100, or the wrist is lifted as the user turns over, the wearable device 100 executes the screen wake-up instruction generated by these user operations and the screen lights up. Especially when the user has already turned off the lights, i.e. the environment where the wearable device is currently located is very dark, the suddenly lit screen stimulates the user's dark-adapted eyes, causing discomfort and possibly affecting sleep quality. In particular, for a user who finds it hard to fall asleep whenever there is light, a suddenly or repeatedly lit screen seriously affects sleep quality: after perceiving that the wearable device 100 has lit up, the user may look at or browse its display content, making it even harder for a user who already struggles to fall asleep to enter the sleep state, which results in a very poor experience. In addition, repeated screen lighting caused by user misoperation also increases the power consumption of the wearable device 100 and reduces its standby time.
In some embodiments, after determining that the user's current sleep state is the entered-sleep state or the ready-to-enter-sleep state, the wearable device 100 regards the user operation corresponding to the generated screen wake-up instruction as a misoperation and masks the instruction so that the screen is not lit. This effectively reduces the discomfort caused to the user by a screen lit through misoperation, reduces the extra power consumption of the wearable device 100 due to repeated screen lighting, and prevents such repeated lighting from disturbing the user's sleep. At this time, if the user does want to wake up the screen of the wearable device 100, the screen must be lit through a screen wake-up instruction generated by another user operation (i.e. a second screen wake-up instruction), distinguished from the screen wake-up instruction generated by the previous user operation (i.e. the first screen wake-up instruction). In this way, even after the wearable device 100 determines that the user has entered or is ready to enter sleep, the user can still light up the screen through another user operation, satisfying diversified user requirements and improving the user experience.
In other implementations, after it is determined that the user's current sleep state is the entered-sleep or ready-to-enter-sleep state, or after the wearable device 100 has turned on the anti-false-touch mode, the user may still want to light up the screen of the wearable device 100 to view the time or other content. If the screen wake-up instruction generated by the user operation were simply treated as invalid, i.e. masked, the screen of the wearable device 100 would not light up, possibly giving the user the impression that the device is broken or malfunctioning and degrading the user experience. To avoid this, after determining that the user has entered or is ready to enter sleep, the wearable device 100 in the embodiment of the application further detects whether a second screen wake-up instruction is received, that is, an instruction generated by another user operation after the one corresponding to the first screen wake-up instruction. If the second screen wake-up instruction is detected, the wearable device 100 executes it and lights up the screen, so that the user can conveniently view the display content of the wearable device 100, satisfying diversified user requirements and improving the user experience.
In the embodiment of the application, after a first screen wake-up instruction is received, the current sleep state of the user is determined; if the current sleep state of the user meets the preset condition, the screen of the wearable device 100 is lit up only after a second screen wake-up instruction is received. This effectively reduces the extra power consumption of the wearable device 100 caused by repeated screen lighting due to user misoperation, prolongs the standby time of the wearable device 100, reduces the adverse effects of such repeated screen lighting, such as eye discomfort and degraded sleep quality, and improves the user experience.
Referring to fig. 3, fig. 3 is a flow chart of another display control method according to an embodiment of the application. As shown in fig. 3, the method includes steps S201 to S204.
S201, the wearable device 100 receives a first screen wake instruction.
S201 may be described with reference to step S101 in the embodiment described in fig. 2, and will not be described herein.
S202, the wearable device 100 detects the current screen state.
S203, if the current screen state of the wearable device 100 is in the off-screen state, determining the current sleep state of the user.
In an embodiment of the present application, the current screen state of the wearable device 100 is either the bright-screen state or the screen-off state. When the screen of the wearable device 100 is in the bright-screen state, the first screen wake-up instruction can be executed directly and the screen lit up, so that the user can quickly browse or view the corresponding display content. When the screen is in the screen-off state, screen lighting caused by a misoperation corresponding to the first screen wake-up instruction must be avoided; the current sleep state of the user is therefore determined in order to improve the accuracy of identifying repeated screen lighting caused by misoperation, thereby reducing the extra power consumption of the wearable device caused by such lighting and preventing it from disturbing the user's sleep.
S204, if the current sleep state of the user meets the preset condition, the wearable device 100 lights up the screen after receiving the second screen wake-up instruction.
S204 may be described with reference to step S105 in the embodiment described in fig. 2, and will not be described herein.
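The screen-state gate of steps S201 to S204 can be sketched as follows. This is a hedged illustration; the function name and string return values are assumptions introduced for clarity:

```python
def handle_first_wake_instruction(screen_on, sleep_meets_condition):
    """Sketch of S201-S204: decide how to react to a first screen
    wake-up instruction based on the current screen state."""
    if screen_on:
        # Bright-screen state: execute the first instruction immediately.
        return "execute_first_instruction"
    if sleep_meets_condition:
        # Screen-off and the user is asleep or falling asleep:
        # wait for a second wake-up instruction before lighting the screen.
        return "await_second_instruction"
    return "execute_first_instruction"
```

Note that the sleep state is only evaluated on the screen-off path, which is exactly the ordering of S202 before S203 in the flow.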
Referring to fig. 4, fig. 4 is a flow chart of another display control method according to an embodiment of the application. As shown in fig. 4, the method includes steps S301 to S304.
S301, the wearable device 100 receives a first screen wake instruction. S301 may be described with reference to step S101 in the embodiment described in fig. 2, and will not be described herein.
S302, the wearable device 100 determines whether the anti-false touch mode is on.
The above-mentioned anti-false-touch mode is a preset mode for guarding against user misoperation. In the anti-false-touch mode, more complex operations such as a double click, continuous clicks, or a slide are required to light up the screen, while a simple click operation, wrist-lift operation, and the like cannot light up the screen.
In the embodiment of the present application, the user may set the anti-false-touch mode of the wearable device 100 to be on during a specified time period, for example, from 22:00 to 7:00 the next day. When the anti-false-touch mode is on and the wearable device 100 is in the screen-off state within that period, if the user touches any screen area or a designated screen area of the wearable device 100, or lifts or turns the wrist, the wearable device 100 detects a screen wake-up instruction generated by the current user operation. At this time, the wearable device 100 determines whether the user operation corresponding to the screen wake-up instruction is a preset user operation, and lights up the screen if so. Otherwise, the user operation corresponding to the first screen wake-up instruction is judged to be a misoperation, the screen wake-up instruction is masked, and the screen of the wearable device 100 is not lit, thereby preventing the screen from being lit by misoperation.
The preset user operation is a preset operation for waking up the screen in the anti-false touch mode, such as a double-click, a continuous-click or a multiple-press operation on any area or a designated area of the screen, a sliding operation performed on any area or designated area of the screen, a pressing and wrist lifting combination operation, and the like.
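The preset-operation check can be sketched as a simple set membership test. The gesture identifiers below are hypothetical; the patent only requires that relatively complex operations (double click, continuous clicks, a slide, or a press-plus-wrist-lift combination) wake the screen in anti-false-touch mode, while a simple click or wrist lift does not:

```python
# Hypothetical gesture identifiers standing in for the preset user operations.
PRESET_WAKE_OPERATIONS = {"double_click", "continuous_click", "slide",
                          "press_and_wrist_lift"}

def wakes_screen_in_anti_false_touch_mode(operation):
    """Return True only for a preset (deliberate) user operation."""
    return operation in PRESET_WAKE_OPERATIONS
```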
S303, if the anti-false touch mode is not turned on, the wearable device 100 determines the current sleep state of the user.
S304, if the current sleep state of the user meets the preset condition, the wearable device 100 lights up the screen after receiving the second screen wake-up instruction.
S304 may be described with reference to step S105 in the embodiment described in fig. 2, and will not be described herein.
By determining whether the anti-false-touch mode is already on, the embodiment of the application decides whether the user's current sleep stage needs to be determined in order to trigger the automatic anti-false-touch function, improving the intelligence of the wearable device and meeting the different requirements of users.
Referring to fig. 5, fig. 5 is a flowchart illustrating another display control method according to an embodiment of the application. As shown in fig. 5, the method includes steps S401 to S403.
S401, if the current sleep state of the user meets the preset condition, the wearable device 100 detects whether a second screen wake-up instruction is received.
S402, if the second screen wake-up instruction is received, the wearable device 100 executes the second screen wake-up instruction, and lights up the screen of the wearable device 100.
S403, if the second screen wake-up instruction is not received within the predetermined time, the wearable device 100 recognizes the user operation corresponding to the first screen wake-up instruction as an error operation, shields the first screen wake-up instruction, and does not light the screen of the wearable device 100.
In the embodiment of the present application, when the wearable device 100 is in the screen-off state and the user clicks or presses the screen of the wearable device 100 for the first time, the wearable device 100 detects a first screen wake-up instruction generated by the clicking or pressing operation. After judging that the user's current sleep state is the entered-sleep or ready-to-enter-sleep state, the wearable device 100 does not execute the first screen wake-up instruction immediately; instead, it continuously detects, within a predetermined time such as 5 seconds, whether the user acts on the screen a second time through an operation such as a slide, a double click, or continuous clicks, i.e. whether a second screen wake-up instruction generated by another user operation is received within the predetermined time. If the second screen wake-up instruction is received within the predetermined time, the wearable device 100 may execute either the first or the second screen wake-up instruction to light up the screen. If the second screen wake-up instruction is not received within the predetermined time, the wearable device 100 judges the user operation corresponding to the first screen wake-up instruction to be a misoperation, masks the first screen wake-up instruction, and does not light up the screen of the wearable device 100.
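The timed confirmation window can be sketched as follows, assuming event timestamps in seconds; the function name and the use of a timestamp list are illustrative, and the 5-second default follows the example given in the text:

```python
def resolve_first_instruction(first_time_s, second_instruction_times_s,
                              window_s=5.0):
    """Light the screen only if some second wake-up instruction arrives
    within `window_s` seconds after the first one; otherwise the first
    instruction is treated as a false touch and masked."""
    return any(0.0 < t - first_time_s <= window_s
               for t in second_instruction_times_s)
```

A double click at 3 s after the first touch confirms the wake-up; a touch arriving at 7 s falls outside the window, so the first instruction is masked.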
By judging whether a second screen wake-up instruction is received, the embodiment of the application determines whether to light up the screen of the wearable device 100, reducing repeated screen-lighting events caused by user misoperation, effectively reducing the discomfort such lighting causes the user, and effectively reducing the extra power consumption of the wearable device 100 caused by repeated screen lighting.
In the embodiment of the present application, when the wearable device 100 determines the current sleep state of the user, sleep detection data is acquired, and the current sleep state of the user is determined according to the sleep detection data.
It should be noted that the sleep detection data is related detection data for determining a current sleep state of the user, and includes, but is not limited to, at least one of physiological characteristic data of the user, movement posture data of the user, and current environment data. Wherein the physiological characteristic data of the user comprises, but is not limited to, heart rate, pulse, respiratory rate, brain wave signals and the like of the user; the motion gesture data of the user includes, but is not limited to, wrist gesture data of the user; the current environmental data includes, but is not limited to, ambient light level data of the environment in which the wearable device 100 is currently located.
The sleep detection data can be used to judge whether the user's current sleep state is the not-entered-sleep, entered-sleep, or ready-to-enter-sleep state, and further whether the user operation corresponding to the first screen wake-up instruction should be judged a misoperation. If it is not a misoperation, the wearable device 100 can execute the first screen wake-up instruction and light up the screen of the wearable device 100; otherwise, another screen wake-up instruction, namely the second screen wake-up instruction, is needed to light up the screen of the wearable device 100, thereby reducing the extra power consumption caused by repeated screen lighting due to misoperation, preventing such lighting from disturbing the user's sleep, and improving the user experience.
It should be noted that the physiological characteristic data or motion gesture data of the user can quickly distinguish the not-entered-sleep state from the entered-sleep state. Accurately determining whether the user is ready to enter sleep is the key to improving the accuracy of judging whether the current user operation is a misoperation: the more accurately the wearable device 100 identifies the ready-to-enter-sleep state, the more accurately it can judge whether the user operation corresponding to a given screen wake-up instruction is a misoperation, thereby better preventing misoperation.
Referring to fig. 6, fig. 6 is a flowchart illustrating a method for determining a current sleep state of a user according to an embodiment of the application. As shown in fig. 6, the method includes steps S501 to S503.
S501, the wearable device 100 acquires wrist gesture data of the user, and the environmental light brightness of the environment where the wearable device 100 is currently located.
In the embodiment of the present application, the wrist gesture data of the user includes, but is not limited to, acceleration data of wrist movement, distance data of wrist movement, and the like, and the wearable device 100 may acquire the wrist gesture data of the user through the acceleration sensor 180E, the gyro sensor 180B, the distance sensor 180F, and the like. The ambient light level of the environment in which the wearable device 100 is currently located may be acquired by the ambient light sensor 180L.
S502, the wearable device 100 judges whether the acquired wrist gesture data of the user meets a first condition and whether the environmental light brightness of the current environment of the wearable device 100 is lower than a preset brightness threshold.
In the embodiment of the present application, whether the obtained wrist gesture data of the user meets the first condition is judged by determining whether the mean, variance, or median of the root mean square values computed from the wrist gesture data collected by the wearable device 100 over a period of time falls within a corresponding preset numerical range, and by determining whether the action level corresponding to the wrist gesture data is a preset action level. If the mean, variance, or median of the root mean square values falls within the corresponding preset numerical range, or the action level corresponding to the wrist gesture data is the preset action level, the obtained wrist gesture data of the user is determined to meet the first condition.
S503, if the acquired wrist gesture data of the user accords with a first condition and/or the environmental light brightness of the current environment of the wearable device is lower than a preset brightness threshold, determining that the current sleep state of the user accords with a preset condition.
According to the embodiment of the application, according to the wrist gesture data of the user, determining the action grade corresponding to the wrist gesture data of the user; and if the action grade corresponding to the determined wrist gesture data of the user is the preset action grade, determining that the wrist gesture data of the user meets the first condition.
In some embodiments, the wearable device 100 obtains, through a three-axis acceleration sensor over a period of time, N acceleration samples ACC_xn, ACC_yn, ACC_zn along the x, y, and z axes of the wearable device 100, where N is an integer greater than or equal to 1 and n ∈ [1, N]. A first value is obtained by computing the mean, median, or variance of the root mean square of the N samples, and the action level corresponding to the user's wrist gesture data is then determined by locating the first value within the numerical range of the corresponding action level.
In some specific embodiments, the motion levels corresponding to the wrist gesture data of the user may be divided into five levels (e.g., 0-4 levels), where level 0 indicates rest, level 1 indicates a small amount of motion or a small motion amplitude, level 2 indicates a medium motion amplitude, level 3 indicates a large amount of motion or a large motion amplitude, and level 4 indicates a large amount of motion.
In some embodiments of the present application, the action levels corresponding to the user's wrist gesture data may be divided into more or fewer levels; the more finely the levels are divided, the more detailed the analysis of the wrist gesture data and the more accurate the determined action level.
Specifically, the numerical table corresponding to the action levels provided in Table 1 may be referred to. Each action level corresponds to a different range of values: level 0 corresponds to a first mean range such as [α1, α2), a first median range such as [β1, β2), or a first variance range such as [γ1, γ2); level 1 corresponds to a second mean range such as [α2, α3), a second median range such as [β2, β3), or a second variance range such as [γ2, γ3); and so on.
TABLE 1

Action level | Mean range  | Median range | Variance range
0            | [α1, α2)    | [β1, β2)     | [γ1, γ2)
1            | [α2, α3)    | [β2, β3)     | [γ2, γ3)
2            | [α3, α4)    | [β3, β4)     | [γ3, γ4)
3            | [α4, α5)    | [β4, β5)     | [γ4, γ5)
4            | [α5, +∞)    | [β5, +∞)     | [γ5, +∞)
In the embodiment of the application, the action level is determined according to the user's wrist gesture data, and can be determined from the acceleration of the user's wrist motion, that is, from the N acceleration samples (ACC_xn, ACC_yn, ACC_zn along the x, y, and z axes) obtained by the three-axis acceleration sensor of the wearable device 100 over a period of time. Specifically, the action level corresponding to the wrist gesture data is determined by computing the mean or variance of the root mean square of the N acceleration samples, or by taking the median of the root mean square of the N samples.
In some specific embodiments, when the wearable device 100 receives the first screen wake-up instruction, it acquires the acceleration sample at the receiving time point (the time point at which the first screen wake-up instruction is received) and N-1 acceleration samples within a period before the receiving time point (so that N samples in total are acquired over the time range ending at the receiving time point). It then computes the mean of the root mean square of the N acquired samples, matches this mean against the corresponding values in the numerical table of action levels, and determines the action level corresponding to the user's wrist gesture data from the matching result.
The preset time range takes the receiving time point as the end time and the time point pushed back by a specific duration as the start time. For example, if the receiving time point is 08:00:10 and the duration is 10 s, pushing back 10 s gives 08:00:00, and the acceleration data of the wearable device 100 within the time period [08:00:00, 08:00:10] is acquired.
In other specific embodiments, the variance of the root mean square of the obtained N acceleration data may be calculated to obtain a variance, the calculated variance is matched with a value corresponding to the variance in a value table corresponding to the action level, and the action level corresponding to the wrist gesture data of the user is determined according to the matching result.
In other specific embodiments, after acquiring the N acceleration data, the median of the root mean square of the N acceleration data may be acquired, the median may be used as the first value to be matched with a value corresponding to the median in the value table corresponding to the action level, and the action level corresponding to the wrist posture data of the user may be determined according to the matching result.
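The statistics above can be sketched as follows. The root mean square is taken per triaxial sample, and the level boundaries are illustrative placeholders for the unspecified ranges of Table 1; the function names are assumptions introduced for clarity:

```python
import math
import statistics

def rms_series(samples):
    """Per-sample root mean square of triaxial readings (ACC_x, ACC_y, ACC_z)."""
    return [math.sqrt((x * x + y * y + z * z) / 3.0) for x, y, z in samples]

def first_value(samples, stat="mean"):
    """The 'first value': mean, median, or variance of the RMS series."""
    series = rms_series(samples)
    return {"mean": statistics.mean,
            "median": statistics.median,
            "variance": statistics.pvariance}[stat](series)

def action_level(value, upper_bounds=(0.1, 0.5, 1.0, 2.0)):
    """Map the first value onto action levels 0-4. The bounds stand in for
    the ranges of Table 1 and are illustrative only."""
    for level, bound in enumerate(upper_bounds):
        if value < bound:
            return level
    return len(upper_bounds)  # level 4: very large amount of motion
```

With near-rest samples the first value falls into the lowest range and level 0 is returned; a large first value maps to level 4.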
To improve the accuracy of judging the action level corresponding to the user's wrist gesture data, the embodiment of the application determines at least two candidate action levels from the acquired wrist gesture data and compares them; if at least two of the candidate levels are identical, the level that occurs most often is taken as the action level corresponding to the user's wrist gesture data.
In some specific embodiments, after at least two action levels of the user's wrist are determined from any two or more of the mean, variance, and median in the above embodiments, the determined levels are compared. If exactly two action levels are determined and they are identical, that level is taken as the action level corresponding to the user's wrist gesture data; otherwise, N acceleration samples are acquired again to redetermine the action level. If more than two action levels are determined, and the proportion of the most frequent identical level among them reaches a preset threshold of the total number of determined levels, that most frequent level is taken as the action level corresponding to the user's wrist gesture data; otherwise, N acceleration samples are acquired again to redetermine the action level. Alternatively, the user's action level may be further determined from the motion amplitude of the user's wrist, the resulting levels compared, and the action level corresponding to the wrist gesture data determined from the comparison result.
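The majority vote described above can be sketched as follows. The ratio threshold is an assumed stand-in for the patent's unspecified preset threshold, and returning None to signal re-acquisition is an illustrative convention:

```python
from collections import Counter

def fuse_action_levels(levels, min_ratio=0.5):
    """Majority vote over action levels obtained from the mean, variance,
    and median statistics. Returns the winning level, or None to signal
    that N acceleration samples should be re-acquired."""
    if not levels:
        return None
    level, count = Counter(levels).most_common(1)[0]
    if count >= 2 and count / len(levels) >= min_ratio:
        return level
    return None
```

Two agreeing levels win outright; two disagreeing levels trigger re-acquisition; with three votes, a level held by two of the three is accepted.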
For example, in some specific embodiments, the motion amplitude of the user's wrist, i.e. the distance or angle through which the wrist moves relative to a certain reference point, is determined from data acquired by the acceleration sensor 180E or the gyro sensor 180B together with the distance sensor 180F. After the motion amplitude of the user's wrist is determined, it is compared with a preset amplitude value: if the motion amplitude is smaller than the preset amplitude value, the user's motion amplitude is judged to be small; if it is greater than or equal to the preset amplitude value, the motion amplitude is judged to be large. The corresponding action level is then determined from the identified motion amplitude.
According to the embodiment of the application, the action grade corresponding to the wrist gesture data of the user is determined according to the wrist gesture data of the user; if the determined action level corresponding to the wrist gesture data of the user is a preset action level, for example, level 0 or level 1, then the wrist gesture data of the user is determined to meet the first condition.
For example, when the action level corresponding to the user's wrist gesture data is level 0, the user's current sleep state can be considered to be the entered-sleep state; when the action level is level 1, the user can be considered ready to enter the sleep state. When the action level is level 2 or higher, the user's current sleep state can be considered to be the not-entered-sleep state.
When the ambient light brightness of the environment where the wearable device 100 is currently located is not lower than the preset brightness threshold, lighting up the screen of the wearable device 100 does not cause eye discomfort, but it still increases the power consumption of the wearable device 100. Therefore, when the ambient light brightness is not lower than the preset brightness threshold but the user's wrist gesture data meets the first condition, the screen of the wearable device 100 is still lit up only through the second screen wake-up instruction, so that the user can view the display content when needed while the extra power consumption caused by repeated screen lighting due to misoperation is avoided and the standby time of the wearable device 100 is prolonged.
In some embodiments, the user's current sleep state may also be determined from physiological characteristic parameters such as heart rate and respiration, for example by judging whether the user's heart rate is lower than a preset heart rate value, or whether the respiratory frequency within a preset time is lower than a preset frequency.
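A minimal sketch of such a physiological check follows. The patent only says the parameters are compared with preset values; the default thresholds and the conjunction of the two tests are illustrative assumptions, not values from the source:

```python
def asleep_by_physiology(heart_rate_bpm, breaths_per_min,
                         hr_threshold=55.0, br_threshold=14.0):
    """Judge sleep from physiological parameters: both the heart rate and
    the respiratory frequency must fall below their (assumed) preset
    thresholds."""
    return heart_rate_bpm < hr_threshold and breaths_per_min < br_threshold
```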
It can be appreciated that, when the user is currently asleep, the first screen wake-up instruction is generally masked directly once the sleep state is determined, in order to avoid the increased power consumption of the wearable device 100 that repeated screen-lighting caused by misoperation would bring. If the user wakes up suddenly and wants to light the screen, the screen is lit via the second screen wake-up instruction. In this way the wearable device can distinguish the user's screen wake-up operations in different sleep stages, improving the user experience.
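The instruction-handling flow described above (execute the first instruction in normal mode, otherwise wait a preset time for the second instruction before masking) can be sketched as follows. The function and parameter names, and the 3-second default timeout, are illustrative assumptions.

```python
from typing import Callable

def handle_first_wake_instruction(sleep_state_ok: bool,
                                  wait_for_second_instruction: Callable[[float], bool],
                                  timeout_s: float = 3.0) -> bool:
    """Return True if the screen should be lit.

    sleep_state_ok: whether the user's sleep state meets the preset
    condition (asleep or ready to enter sleep).
    wait_for_second_instruction: callable that blocks for up to timeout_s
    seconds and returns True if a second wake-up instruction arrived.
    """
    if not sleep_state_ok:
        return True  # normal mode: execute the first wake-up instruction
    # Anti-false-touch mode: only a second wake-up instruction lights the screen.
    if wait_for_second_instruction(timeout_s):
        return True
    return False  # no second instruction in time: mask as a misoperation
```
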
In this embodiment of the application, whether the current sleep state of the user meets the preset condition is determined by judging whether the acquired wrist posture data of the user meets the first condition and whether the ambient light brightness of the environment in which the wearable device 100 is currently located is lower than the preset brightness threshold; that is, by determining whether the user is asleep or ready to enter the sleep state. This effectively improves the accuracy of judging whether the user operation corresponding to a given screen wake-up instruction is a misoperation, and thus better prevents false touches.
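Combining the two signals can be sketched as below. Note that claim 4 joins them with "and/or"; this sketch takes either signal alone as sufficient, and the brightness threshold value is a hypothetical placeholder.

```python
def sleep_state_meets_preset(wrist_meets_first_condition: bool,
                             ambient_lux: float,
                             brightness_threshold_lux: float = 10.0) -> bool:
    """Preset condition holds if the wrist posture data meets the first
    condition or the ambient light is below the preset threshold
    ("and/or" in claim 4; "or" chosen here for illustration)."""
    return wrist_meets_first_condition or ambient_lux < brightness_threshold_lux
```
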
It should be noted that, because of the hardware limitations of the wearable device 100, it may not be possible for the device itself to quickly and accurately determine whether the user is currently in the suspected-sleep stage or the sleep stage. In that case, after detecting the first screen wake-up instruction, the wearable device 100 may send a sleep-stage confirmation request instruction to a third-party electronic device, such as a smartphone bound to the wearable device 100. The instruction directs the third-party electronic device to determine the current sleep state of the user and feed the result back to the wearable device 100. In other words, the execution subject of steps S501 to S503 may be an electronic device other than the wearable device 100.
Embodiments of the present application also provide a computer-readable storage medium having instructions stored therein which, when run on a computer or processor, cause the computer or processor to perform one or more steps of any of the methods described above.
Embodiments of the present application also provide a computer program product comprising instructions. The computer program product, when run on a computer or processor, causes the computer or processor to perform one or more steps of any of the methods described above.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in, or transmitted via, a computer-readable storage medium, and may be transmitted from one website, computer, server, or data center to another by wired means (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), a semiconductor medium (e.g., a solid-state disk (SSD)), or the like.
Those of ordinary skill in the art will appreciate that all or part of the above method embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, may perform the steps of the above method embodiments. The aforementioned storage medium includes a ROM, a random access memory (RAM), a magnetic disk, an optical disk, or the like.
The foregoing is merely a specific implementation of the embodiments of the present application, but the protection scope of the embodiments of the present application is not limited thereto; any change or substitution within the technical scope disclosed in the embodiments of the present application shall be covered by the protection scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A display control method applied to a wearable device, comprising:
after receiving a first screen wake-up instruction, determining the current sleep state of a user; the first screen wake-up instruction is generated by a preset user operation in a normal mode;
if the current sleep state of the user meets a preset condition, lighting up the screen of the wearable device after a second screen wake-up instruction is received, which comprises: if the current sleep state of the user meets the preset condition, detecting whether the second screen wake-up instruction is received; if the second screen wake-up instruction is received, executing the second screen wake-up instruction and lighting up the screen of the wearable device; if the second screen wake-up instruction is not received within a preset time, determining that the user operation corresponding to the first screen wake-up instruction is a misoperation, masking the first screen wake-up instruction, and not lighting up the screen of the wearable device; the second screen wake-up instruction is generated, in a false-touch prevention mode, by a user operation different from that in the normal mode; and the current sleep state of the user meeting the preset condition comprises: the user having entered or being ready to enter a sleep state.
2. The display control method of claim 1, wherein after receiving the first screen wake-up instruction, the method further comprises:
Detecting the current screen state of the wearable device;
and if the current screen state of the wearable device is in the screen-off state, determining the current sleep state of the user.
3. The display control method according to claim 1, further comprising, after said determining a current sleep state of the user:
And if the current sleep state of the user does not meet the preset condition, executing the first screen awakening instruction, and lighting up the screen of the wearable device.
4. A display control method according to any one of claims 1 to 3, wherein the determining the current sleep state of the user comprises:
acquiring wrist gesture data of the user and the environmental light brightness of the current environment of the wearable equipment;
If the acquired wrist gesture data of the user accords with a first condition and/or the ambient light brightness of the current environment of the wearable device is lower than a preset brightness threshold value, determining that the current sleep state of the user accords with a preset condition.
5. The display control method according to claim 4, wherein determining that the current sleep state of the user meets the preset condition if the acquired wrist posture data of the user meets the first condition and/or the environmental light brightness of the environment in which the wearable device is currently located is lower than a preset brightness threshold comprises:
if the action grade corresponding to the acquired wrist gesture data of the user is a preset action grade and/or the ambient light brightness of the current environment of the wearable device is lower than a preset brightness threshold, determining that the current sleep state of the user meets preset conditions.
6. The display control method according to claim 5, wherein determining that the action level corresponding to the acquired wrist posture data of the user is a preset action level includes:
determining at least two corresponding action grades according to the acquired wrist gesture data of the user;
comparing the at least two corresponding action grades;
and if at least two of the corresponding action levels are the same, taking the action level that occurs most frequently as the action level corresponding to the acquired wrist posture data of the user.
7. The display control method according to claim 1, wherein the first screen wake-up instruction is an instruction generated based on any user operation preset to wake up a screen of the wearable device, and the second screen wake-up instruction is an instruction generated based on any user operation preset to wake up a screen of the wearable device other than the user operation corresponding to the first screen wake-up instruction.
8. A wearable device, comprising: one or more processors, memory, and a display screen;
The memory, the display screen, and the one or more processors are coupled, the memory is used for storing computer program code, and the computer program code comprises computer instructions;
The computer instructions, when executed by the one or more processors, cause the wearable device to perform the display control method of any of claims 1-7.
9. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the display control method according to any one of claims 1 to 7.
10. A chip system comprising a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the display control method of any one of claims 1 to 7.
CN202010335463.9A 2020-04-24 2020-04-24 Display control method and wearable device Active CN113552937B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010335463.9A CN113552937B (en) 2020-04-24 2020-04-24 Display control method and wearable device
PCT/CN2021/084004 WO2021213151A1 (en) 2020-04-24 2021-03-30 Display control method and wearable device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010335463.9A CN113552937B (en) 2020-04-24 2020-04-24 Display control method and wearable device

Publications (2)

Publication Number Publication Date
CN113552937A CN113552937A (en) 2021-10-26
CN113552937B true CN113552937B (en) 2024-07-19

Family

ID=78101415

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010335463.9A Active CN113552937B (en) 2020-04-24 2020-04-24 Display control method and wearable device

Country Status (2)

Country Link
CN (1) CN113552937B (en)
WO (1) WO2021213151A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113867168A (en) * 2021-11-02 2021-12-31 珠海格力电器股份有限公司 Control method and device of screen equipment, storage medium and screen equipment
CN115079804B (en) * 2021-12-09 2023-11-07 荣耀终端有限公司 Control processing method of electronic equipment
CN114298105B (en) * 2021-12-29 2023-08-22 东莞市猎声电子科技有限公司 Signal processing method for quickly responding to wrist lifting action and brightening screen in running process
CN117008854A (en) * 2022-04-28 2023-11-07 华为技术有限公司 Screen-lighting control method, electronic equipment and computer readable storage medium
CN116056190B (en) * 2022-05-06 2024-09-13 荣耀终端有限公司 Method for managing terminal device, electronic device and computer readable storage medium
CN115388511B (en) * 2022-08-17 2024-09-06 珠海格力电器股份有限公司 Air conditioner control method and device based on wearable equipment and electronic equipment
CN118626183A (en) * 2023-03-09 2024-09-10 华为技术有限公司 Screen state control method and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108293080A (en) * 2015-11-26 2018-07-17 华为技术有限公司 A kind of method of contextual model switching
CN110638422A (en) * 2014-09-23 2020-01-03 飞比特公司 Method, system and device for updating screen content in response to user gesture
CN110850988A (en) * 2019-12-02 2020-02-28 合肥工业大学 System and method for preventing interference and wrist lifting and screen lighting
CN111596751A (en) * 2020-05-19 2020-08-28 歌尔智能科技有限公司 Display control method and device for wrist-worn device, wrist-worn device and storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060028429A1 (en) * 2004-08-09 2006-02-09 International Business Machines Corporation Controlling devices' behaviors via changes in their relative locations and positions
WO2013093712A1 (en) * 2011-12-22 2013-06-27 Koninklijke Philips Electronics N.V. Wake-up system
US20140058679A1 (en) * 2012-08-23 2014-02-27 Apple Inc. Wake Status Detection for Suppression and Initiation of Notifications
CN104158956B (en) * 2014-07-21 2016-03-30 小米科技有限责任公司 Terminal carries out method and the device of sleep awakening
CN104899029A (en) * 2015-05-28 2015-09-09 广东欧珀移动通信有限公司 Screen control method and apparatus
CN105791545B (en) * 2016-02-24 2019-08-02 宇龙计算机通信科技(深圳)有限公司 A kind of terminal device is anti-to bother method and apparatus
CN107155005A (en) * 2017-04-27 2017-09-12 上海斐讯数据通信技术有限公司 A kind of intelligent wrist wearable device bright screen control method and system
CN107436674A (en) * 2017-08-22 2017-12-05 深圳天珑无线科技有限公司 terminal control method, device and non-transitory computer-readable medium
CN107526603B (en) * 2017-09-20 2021-01-08 深圳天珑无线科技有限公司 Application awakening method and device
CN110362197A (en) * 2019-06-13 2019-10-22 缤刻普达(北京)科技有限责任公司 Screen lights method, apparatus, intelligent wearable device and storage medium

Also Published As

Publication number Publication date
CN113552937A (en) 2021-10-26
WO2021213151A1 (en) 2021-10-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant