WO2021213151A1 - Display control method and wearable device - Google Patents

Display control method and wearable device

Info

Publication number
WO2021213151A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
screen
wearable device
instruction
sleep state
Prior art date
Application number
PCT/CN2021/084004
Other languages
English (en)
Chinese (zh)
Inventor
张慧
李靖
周林峰
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2021213151A1

Classifications

    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using features provided by the input device, e.g. using a touch-screen or digitiser for input of commands through traced gestures
    • G06F9/4401 Bootstrapping
    • G06F9/4418 Suspend and resume; Hibernate and awake

Definitions

  • This application relates to the field of electronic technology, and in particular to a display control method and a wearable device.
  • Wearable device is a general term for everyday wearables that are intelligently designed using wearable technology, such as smart glasses, smart gloves, smart watches, smart clothing, and smart shoes; among them, smart watches and smart bracelets are the most prominent. As people's health awareness improves, the health-monitoring functions of wearable devices are becoming increasingly popular. Some of these functions, such as sleep monitoring and sleep apnea monitoring, require the user to wear the wearable device at night to obtain the related sleep data.
  • The present application discloses a display control method and a wearable device, which can reduce the repeated screen lighting caused by a user's misoperation and the resulting increase in the power consumption of the wearable device.
  • an embodiment of the present application provides a display control method applied to a wearable device.
  • The method includes: the wearable device determines the user's current sleep state after receiving a first screen wake-up instruction; if the user's current sleep state meets a preset condition, the screen of the wearable device is turned on only after a second screen wake-up instruction is received. Here, the first screen wake-up instruction is an instruction generated based on any preset user operation for waking up the screen of the wearable device, and the second screen wake-up instruction is an instruction generated based on any user operation, other than the user operation corresponding to the first screen wake-up instruction, that is preset for waking up the screen of the wearable device.
  • In this way, the wearable device can effectively reduce the extra power consumption caused by the screen being lit repeatedly due to the user's misoperation, and prevent such repeated screen lighting from disturbing the user's sleep; at the same time, when the user actually wants to view the screen content, the screen can still be turned on quickly through the above-mentioned second screen wake-up instruction, which improves the user experience.
  • The screen wake-up instruction involved in the embodiments of the present application is an instruction that is generated in response to a user operation and is used to instruct the wearable device to light up its screen.
  • For example, when the screen of the wearable device is off and the user touches (for example, taps) any area or a designated area of the wearable device, or lifts or turns the wrist, the wearable device generates a screen wake-up instruction to light up its screen, where the designated area is any screen area preset for responding to user touch operations.
  • That is, the screen wake-up instruction may be generated when the user touches any screen area of the wearable device, when the user touches a designated screen area of the wearable device, or when the user raises or flips the wrist; the embodiments of this application do not limit the specific user operation.
  • The aforementioned user operations include, but are not limited to, tap operations, double-tap operations, continuous-tap operations, press operations, multiple-press operations, slide operations, wrist-turning operations, or a combination of at least two of the above user operations, such as a press + wrist-turn combination.
  • the above-mentioned first screen wake-up instruction may be a screen wake-up instruction generated by a preset user operation in the normal mode.
  • For example, the user operation corresponding to the first screen wake-up instruction includes a wrist-raising operation, a wrist-turning operation, a tap operation, a touch operation, and the like.
  • The above-mentioned second screen wake-up instruction may be a screen wake-up instruction generated, in the anti-mistouch mode, by a preset user operation different from that in the normal mode.
  • For example, the user operation corresponding to the second screen wake-up instruction includes a double-tap operation, a continuous-tap operation, a multiple-press operation, a slide operation, a wrist-raise + press combination operation, and the like.
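  • For illustration only, the distinction between the two instruction types could be represented as a mapping from detected user operations to the instruction each one generates. This is a minimal sketch, not part of the disclosure: the operation names and the `WakeInstruction` enum are assumptions.

```python
from enum import Enum, auto
from typing import Optional

class WakeInstruction(Enum):
    """Hypothetical instruction types; the names are illustrative only."""
    FIRST_SCREEN_WAKE = auto()   # generated by preset "normal mode" operations
    SECOND_SCREEN_WAKE = auto()  # generated by preset "anti-mistouch mode" operations

# Example operation sets, taken from the examples given in the embodiment.
FIRST_WAKE_OPERATIONS = {"wrist_raise", "wrist_turn", "tap", "touch"}
SECOND_WAKE_OPERATIONS = {"double_tap", "continuous_tap", "multi_press",
                          "slide", "wrist_raise_plus_press"}

def classify_operation(op: str) -> Optional[WakeInstruction]:
    """Map a detected user operation to the wake-up instruction it would generate."""
    if op in SECOND_WAKE_OPERATIONS:
        return WakeInstruction.SECOND_SCREEN_WAKE
    if op in FIRST_WAKE_OPERATIONS:
        return WakeInstruction.FIRST_SCREEN_WAKE
    return None  # an unrecognized operation generates no wake-up instruction
```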
  • In a possible implementation, after the wearable device receives the first screen wake-up instruction, the method further includes: detecting the current screen state of the wearable device, and determining the user's current sleep state when the screen state is off.
  • the aforementioned sleep states include three states: not entering sleep, entering sleep, and preparing to enter sleep.
  • The user's current sleep state meeting the preset condition specifically means that the user is already in the sleep state or is about to enter the sleep state.
  • In this embodiment of the application, the wearable device detects its current screen state and determines the user's current sleep state only after determining that the screen is off. In this way, when the screen is already on, the wearable device can respond to other user operations quickly, which improves response efficiency; the user's current sleep state needs to be determined only when the screen is off, which reduces the power consumption caused by the screen being lit repeatedly due to the user's misoperation and prevents such repeated screen lighting from affecting the user's sleep.
  • In a possible implementation, after the wearable device receives the first screen wake-up instruction, the method further includes: the wearable device determines whether the anti-mistouch mode is turned on. If the anti-mistouch mode is turned on, the wearable device determines whether the user operation corresponding to the first screen wake-up instruction is a preset user operation; if it is, the wearable device executes the first screen wake-up instruction and lights up the screen; otherwise, the user operation corresponding to the first screen wake-up instruction is determined to be a misoperation, the first screen wake-up instruction is shielded, and the screen is not lit. If the anti-mistouch mode is not turned on, the wearable device determines which sleep stage the user is currently in.
  • By determining whether the anti-mistouch mode is turned on, this embodiment of the application determines whether it is necessary to evaluate the user's current sleep stage and trigger the automatic anti-mistouch function, that is, whether to turn on the screen only after receiving the second screen wake-up instruction when the user's current sleep state meets the preset condition. This improves the intelligence of the wearable device and meets different user needs.
  • In a possible implementation, turning on the screen of the wearable device after receiving the second screen wake-up instruction includes: if the user's current sleep state meets the preset condition, detecting whether the second screen wake-up instruction is received; if the second screen wake-up instruction is received, executing the second screen wake-up instruction to light up the screen of the wearable device; if the second screen wake-up instruction is not received within a predetermined time, determining that the user operation corresponding to the first screen wake-up instruction is a misoperation, shielding the first screen wake-up instruction, and not lighting the screen of the wearable device.
  • By determining whether the second screen wake-up instruction is received, this embodiment of the application decides whether to turn on the screen of the wearable device, which reduces repeated screen-lighting events caused by user misoperation, reduces the discomfort that such screen lighting causes the user, effectively reduces the extra power consumption of the wearable device, and prevents repeated screen lighting caused by the user's misoperation from affecting the user's sleep.
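  • A minimal sketch of this confirmation step follows, assuming a hypothetical `device` object and a hypothetical `wait_for_instruction` helper that blocks until a wake-up instruction arrives or a timeout elapses; the timeout value is a placeholder, since the "predetermined time" is not specified in the text.

```python
import time

CONFIRMATION_TIMEOUT_S = 5.0  # hypothetical "predetermined time"; not specified in the text

def handle_first_wake(device, wait_for_instruction):
    """After a first screen wake-up instruction, require a second one before lighting the screen.

    `device` and `wait_for_instruction` are hypothetical stand-ins for the
    wearable's driver layer; the patent text does not name such interfaces.
    """
    if not device.sleep_state_meets_preset_condition():
        device.turn_on_screen()          # preset condition not met: execute the first instruction
        return

    deadline = time.monotonic() + CONFIRMATION_TIMEOUT_S
    while time.monotonic() < deadline:
        instr = wait_for_instruction(timeout=deadline - time.monotonic())
        if instr == "second_screen_wake":
            device.turn_on_screen()      # second instruction received: light the screen
            return
    # No second instruction within the predetermined time: the first one is
    # treated as a misoperation and shielded; the screen stays off.
```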
  • In a possible implementation, determining the user's current sleep state includes: the wearable device obtains user sleep detection data, and determines the user's current sleep state according to the user sleep detection data.
  • the above-mentioned sleep detection data is related detection data used to determine the current sleep state of the user, including but not limited to at least one of the user's physiological characteristic data, the user's motion posture data, and the current environment data.
  • The user's physiological characteristic data includes, but is not limited to, the user's heart rate, pulse, respiration rate, brain wave signals, and other data;
  • the user's motion posture data includes, but is not limited to, the user's wrist posture data;
  • the current environment data includes, but is not limited to, the ambient light brightness of the environment in which the wearable device is currently located.
  • The sleep detection data can be used to determine whether the user is currently not in the sleep state, has entered the sleep state, or is preparing to enter the sleep state, and further whether it is necessary to judge whether the user operation corresponding to the first screen wake-up instruction is a misoperation. If the user's current sleep state does not meet the preset condition, the wearable device can directly execute the first screen wake-up instruction to light up the screen; otherwise, a screen wake-up instruction generated by another user operation, that is, the second screen wake-up instruction, is required to light up the screen. This reduces the extra power consumption caused by the screen being lit repeatedly due to the user's misoperation, reduces the events in which repeated screen lighting degrades the user's sleep quality, and improves the user experience.
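  • For illustration only, the sleep detection data described above might be grouped in a single structure like the following; every field name is an assumption of this sketch, not terminology from the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SleepDetectionData:
    """Illustrative grouping of the sleep detection data listed above."""
    heart_rate_bpm: Optional[float] = None       # physiological characteristic data
    respiration_rate: Optional[float] = None     # physiological characteristic data
    wrist_posture_samples: List[dict] = field(default_factory=list)  # motion posture data
    ambient_light_lux: Optional[float] = None    # current environment data
```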
  • Generally, whether the user has not entered the sleep state or has already entered the sleep state can be quickly determined from the user's physiological characteristic data or motion posture data.
  • Accurately judging whether the user is preparing to enter the sleep state is therefore the key to improving the accuracy of judging whether the current user operation is a misoperation. Improving the accuracy of judging whether the user is about to enter the sleep state can effectively improve the accuracy of judging whether the user operation corresponding to a screen wake-up instruction is a misoperation, so as to better prevent accidental touches.
  • the user sleep detection data includes the user's wrist posture data and the ambient light brightness data of the environment where the wearable device is currently located.
  • The above determination of the user's current sleep state based on the user sleep detection data includes: if the acquired wrist posture data of the user meets a first condition and/or the ambient light brightness of the environment in which the wearable device is currently located is lower than a preset brightness threshold, determining that the user's current sleep state meets the preset condition.
  • In a possible implementation, this includes: if the action level corresponding to the obtained wrist posture data of the user is a preset action level and/or the ambient light brightness of the environment where the wearable device is currently located is lower than the preset brightness threshold, determining that the user's current sleep state meets the preset condition.
  • This embodiment of the application determines whether the user's wrist posture data meets the first condition by judging whether the action level corresponding to the wrist posture data is the preset action level; that is, when the user's wrist action level is the preset action level, the wrist posture data can be considered to meet the first condition, which improves the accuracy of judging the user's current sleep state.
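  • A hedged sketch of this preset-condition check follows. The numeric values of `PRESET_ACTION_LEVEL` and `BRIGHTNESS_THRESHOLD_LUX` are placeholders, and the inclusive "or" reading of the text's "and/or" is an assumption of this sketch.

```python
PRESET_ACTION_LEVEL = 1          # hypothetical level meaning "wrist nearly still"
BRIGHTNESS_THRESHOLD_LUX = 10.0  # hypothetical preset brightness threshold

def meets_preset_condition(action_level: int, ambient_light_lux: float) -> bool:
    """Judge whether the user's current sleep state meets the preset condition.

    Follows the described rule: the wrist action level equals the preset action
    level and/or the ambient light brightness is below the preset threshold.
    """
    return (action_level == PRESET_ACTION_LEVEL
            or ambient_light_lux < BRIGHTNESS_THRESHOLD_LUX)
```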
  • In a possible implementation, determining that the action level corresponding to the obtained wrist posture data of the user is the preset action level includes: determining at least two corresponding action levels; comparing the at least two corresponding action levels; and, if at least two of the determined action levels are the same, setting the action level that occurs the greatest number of times as the action level corresponding to the user's wrist posture data.
  • By setting the most frequently occurring level among the determined action levels as the action level corresponding to the user's wrist posture data, this embodiment of the application improves the accuracy of judging the action level corresponding to the wrist posture data, thereby further improving the accuracy of judging whether the user is preparing to enter the sleep state, and can effectively improve the accuracy of judging whether the user operation corresponding to a screen wake-up instruction is a misoperation, so as to better prevent accidental touches.
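  • A minimal sketch of this majority-count selection rule follows. How ties between equally frequent levels are broken is not specified in the text, so the tie behavior below is an assumption.

```python
from collections import Counter
from typing import Sequence

def select_action_level(levels: Sequence[int]) -> int:
    """Pick the action level for the wrist posture data by majority count.

    `levels` holds at least two action levels determined from the posture data;
    the level that occurs the greatest number of times is returned.
    """
    if len(levels) < 2:
        raise ValueError("at least two action levels are expected")
    level, _count = Counter(levels).most_common(1)[0]  # tie handling is an assumption
    return level

# Example: three posture windows yield levels 2, 1, 2 -> level 2 is selected.
assert select_action_level([2, 1, 2]) == 2
```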
  • In a second aspect, the present application provides a wearable device, including one or more processors, a memory, and a display screen; the memory and the display screen are coupled to the one or more processors, and the memory is used to store computer program code, where the computer program code includes computer instructions; when the one or more processors execute the computer instructions, the wearable device executes the method provided in any one of the possible implementations of the first aspect.
  • In a third aspect, the present application provides a computer storage medium, including computer instructions, which, when run on a wearable device, cause the wearable device to execute the method provided in any one of the possible implementations of the first aspect.
  • In a fourth aspect, the embodiments of the present application provide a computer program product, which, when run on a computer, causes the computer (for example, the above-mentioned wearable device) to execute the method provided in any one of the possible implementations of the first aspect.
  • In a fifth aspect, an embodiment of the present application provides a chip system, including a processor coupled with a memory; when the processor executes a computer program stored in the memory, the wearable device is enabled to execute the method provided in any one of the possible implementations of the first aspect.
  • the above-mentioned chip system may be a single chip or a chip module composed of multiple chips.
  • It can be understood that the wearable device in the second aspect, the computer storage medium in the third aspect, the computer program product in the fourth aspect, and the chip system in the fifth aspect are all used to execute the method provided in the first aspect. Therefore, for the beneficial effects that can be achieved, reference may be made to the beneficial effects of the corresponding method, which are not repeated here.
  • FIG. 1 is a schematic structural diagram of a wearable device 100 provided by an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of a display control method provided by an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of another display control method provided by an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of another display control method provided by an embodiment of the present application.
  • FIG. 5 is a schematic flowchart of another display control method provided by an embodiment of the present application.
  • FIG. 6 is a schematic flowchart of a method for determining a user's current sleep state provided by an embodiment of the present application.
  • FIG. 1 is a schematic structural diagram of a wearable device 100 provided by an embodiment of the present application.
  • The wearable device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the wearable device 100.
  • the wearable device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), and so on.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the wearable device 100.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching instructions and executing instructions.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory can store instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be directly called from the memory. Repeated accesses are avoided, the waiting time of the processor 110 is reduced, and the efficiency of the system is improved.
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, and so on.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc., respectively through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to realize the touch function of the wearable device 100.
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with the display screen 194, the camera 193 and other peripheral devices.
  • the MIPI interface includes a camera serial interface (camera serial interface, CSI), a display serial interface (display serial interface, DSI), and so on.
  • the processor 110 and the camera 193 communicate through a CSI interface to realize the photographing function of the wearable device 100.
  • the processor 110 and the display screen 194 communicate through a DSI interface to realize the display function of the wearable device 100.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the wearable device 100, and can also be used to transfer data between the wearable device 100 and peripheral devices. It can also be used to connect earphones and play audio through earphones. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is merely a schematic illustration, and does not constitute a structural limitation of the wearable device 100.
  • the wearable device 100 may also adopt different interface connection modes in the above-mentioned embodiments, or a combination of multiple interface connection modes.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive the wireless charging input through the wireless charging coil of the wearable device 100. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the wearable device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the wearable device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the wearable device 100.
  • the mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like.
  • the mobile communication module 150 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied to the wearable device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and other solutions.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • The wireless communication module 160 may also receive a signal to be sent from the processor 110, perform frequency modulation and amplification on the signal, and convert it into an electromagnetic wave for radiation via the antenna 2.
  • the antenna 1 of the wearable device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the wearable device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • the GNSS may include global positioning system (GPS), global navigation satellite system (GLONASS), Beidou navigation satellite system (BDS), quasi-zenith satellite system (quasi -zenith satellite system, QZSS) and/or satellite-based augmentation systems (SBAS).
  • the wearable device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is an image processing microprocessor, which is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations, and is used for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, and the like.
  • the display screen 194 includes a display panel.
  • The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the wearable device 100 can implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • The ISP is used to process the data fed back by the camera 193. For example, when taking a picture, the shutter is opened, light is transmitted to the photosensitive element of the camera through the lens, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, where it is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and is projected to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the wearable device 100 may include 1 or N cameras 193, and N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the wearable device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the wearable device 100 may support one or more video codecs. In this way, the wearable device 100 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • NPU is a neural-network (NN) computing processor.
  • applications such as intelligent cognition of the wearable device 100 can be realized, such as image recognition, face recognition, voice recognition, text understanding, and so on.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the wearable device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the wearable device 100 by running instructions stored in the internal memory 121.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program (such as a sound playback function, an image playback function, etc.) required by at least one function, and the like.
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the wearable device 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the wearable device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • the speaker 170A also called “speaker” is used to convert audio electrical signals into sound signals.
  • the wearable device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B also called “earpiece” is used to convert audio electrical signals into sound signals.
  • the wearable device 100 answers a call or voice message, it can receive the voice by bringing the receiver 170B close to the human ear.
  • The microphone 170C, also called a "mike" or "mic", is used to convert sound signals into electrical signals.
  • The user can make a sound with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the wearable device 100 may be provided with at least one microphone 170C. In some other embodiments, the wearable device 100 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals. In other embodiments, the wearable device 100 may also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
  • The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 194.
  • the capacitive pressure sensor may include at least two parallel plates with conductive materials.
  • the wearable device 100 determines the intensity of the pressure according to the change in capacitance.
  • the wearable device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the wearable device 100 may also calculate the touched position based on the detection signal of the pressure sensor 180A.
  • touch operations that act on the same touch position but have different touch operation strengths may correspond to different operation instructions. For example: when a touch operation whose intensity of the touch operation is less than the first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
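  • For illustration only, the intensity-dependent dispatch described in the example above could look like the following sketch; the threshold value and the instruction names are placeholders, not values from the disclosure.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # placeholder normalized intensity; no value is given in the text

def handle_sms_icon_touch(intensity: float) -> str:
    """Dispatch different instructions for the same touch position by touch intensity."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"       # light touch: view the short message
    return "create_new_short_message"     # press at or above the threshold: create a new message
```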
  • the gyro sensor 180B may be used to determine the movement posture of the wearable device 100.
  • In some embodiments, the angular velocity of the wearable device 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • For example, the gyroscope sensor 180B detects the shake angle of the wearable device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the wearable device 100 through reverse movement, so as to achieve image stabilization.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the wearable device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the wearable device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip holster.
  • The wearable device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D, and then set features such as automatic unlocking of the flip cover based on the detected opening or closing state of the holster.
  • the acceleration sensor 180E can detect the magnitude of acceleration of the wearable device 100 in various directions (generally three-axis).
  • The distance sensor 180F is used to measure distance; the wearable device 100 can measure distance by infrared or laser. In some embodiments, when shooting a scene, the wearable device 100 may use the distance sensor 180F to measure the distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the wearable device 100 emits infrared light to the outside through the light emitting diode.
  • the wearable device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the wearable device 100. When insufficient reflected light is detected, the wearable device 100 may determine that there is no object near the wearable device 100.
  • the wearable device 100 can use the proximity light sensor 180G to detect that the user holds the wearable device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense the brightness of the ambient light.
  • the wearable device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived brightness of the ambient light.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the wearable device 100 is in the pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the wearable device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
  • the temperature sensor 180J is used to detect temperature.
  • the wearable device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the wearable device 100 performs a reduction in the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • In some other embodiments, when the temperature is lower than another threshold, the wearable device 100 heats the battery 142 to prevent an abnormal shutdown caused by low temperature; in some other embodiments, when the temperature is lower than still another threshold, the wearable device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
  • The touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194, and the touch screen is composed of the touch sensor 180K and the display screen 194, which is also called a “touch screen”.
  • the touch sensor 180K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the wearable device 100, which is different from the position of the display screen 194.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can obtain the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the human pulse and receive the blood pressure pulse signal.
  • the bone conduction sensor 180M may also be provided in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can parse the voice signal based on the vibration signal of the vibrating bone block of the voice obtained by the bone conduction sensor 180M, and realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal obtained by the bone conduction sensor 180M, and realize the heart rate detection function.
  • the button 190 includes a power-on button, a volume button, and so on.
  • the button 190 may be a mechanical button. It can also be a touch button.
  • the wearable device 100 may receive key input, and generate key signal input related to user settings and function control of the wearable device 100.
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations applied to different applications can correspond to different vibration feedback effects.
  • Touch operations acting on different areas of the display screen 194 can also correspond to different vibration feedback effects of the motor 191.
  • Different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 195 is used to connect to the SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the wearable device 100.
  • the wearable device 100 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 195 can insert multiple cards at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 may also be compatible with external memory cards.
  • the wearable device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the wearable device 100 uses an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the wearable device 100 and cannot be separated from the wearable device 100.
  • The screen wake-up instruction involved in the following embodiments of the present application is an instruction that is generated in response to a user operation and is used to light up the screen of the wearable device 100.
  • When the wearable device 100 is in the off-screen state and the user touches (for example, taps) any area or a designated area of the wearable device 100, or lifts or turns the wrist, the wearable device 100 generates a screen wake-up instruction to light up the screen, where the designated area is any screen area preset to respond to user touch operations.
  • That is, the screen wake-up instruction may be generated when the user touches any screen area of the wearable device 100, when the user touches a designated screen area of the wearable device 100, or when the user lifts or turns the wrist; this embodiment of the application does not limit the specific user operation.
  • The user operations involved in the embodiments of this application include, but are not limited to, tap operations, double-tap operations, continuous-tap operations, press operations, multiple-press operations, slide operations, wrist-turning operations, or a combination of at least two of the above user operations, for example, a press + wrist-turn combination.
  • The screen wake-up instruction in the embodiments of the present application includes a first screen wake-up instruction and a second screen wake-up instruction, where the first screen wake-up instruction is an instruction generated based on any preset user operation for waking up the screen of the wearable device 100, and the second screen wake-up instruction is an instruction generated based on any user operation, other than the operation corresponding to the first screen wake-up instruction, that is preset for waking up the screen of the wearable device 100.
  • the first screen wake-up command may be a screen wake-up command generated by a preset user operation in the normal mode.
  • For example, the user operation corresponding to the first screen wake-up instruction includes a wrist-raising operation, a wrist-turning operation, a tap operation, a touch operation, and the like.
  • The above-mentioned second screen wake-up instruction may be a screen wake-up instruction generated, in the anti-mistouch mode, by a user operation different from that in the normal mode.
  • For example, the user operation corresponding to the second screen wake-up instruction includes a double-tap operation, a continuous-tap operation, a multiple-press operation, a slide operation, a wrist-raise + press combination operation, and the like.
  • At present, screen wake-up of the wearable device 100 is mainly achieved by touching or pressing the screen, raising the wrist, and the like.
  • In the first existing method, the designated time period means that the Do Not Disturb (DND) mode is enabled within a time period set by the user.
  • However, after the Do Not Disturb mode is turned on, it only prevents interference from external information, such as incoming call notifications, WeChat notifications, QQ notifications, or the screen lighting up when the wrist is raised; it cannot prevent screen-lighting interference caused by accidental touches or presses at night by the user or the person sharing the bed.
  • As for the second method of turning on the Do Not Disturb mode, although it can intelligently turn on the Do Not Disturb mode according to the user's sleep state, on the one hand, turning on the Do Not Disturb mode only prevents interference from external information and cannot prevent screen-lighting events caused by accidental touches or presses at night by the user or the person sharing the bed; on the other hand, the Do Not Disturb mode is activated only when the user's current sleep state is recognized as the sleep state, whereas in fact, when the user is in the stage of preparing to fall asleep, that is, about to fall asleep, the user is more likely to be disturbed by the bright screen of the wearable device 100, and for users who have difficulty falling asleep, their sleep quality is seriously affected.
  • In view of this, the embodiment of the present application provides a display control method, which can effectively reduce the increase in the power consumption of the wearable device 100 caused by repeated screen lighting due to user misoperation, prevent repeated screen lighting caused by user misoperation from affecting the user's sleep, and improve the user experience.
  • FIG. 2 is a schematic flowchart of a display control method provided by an embodiment of the present application. As shown in Figure 2, the method includes steps S101 to S105.
  • the wearable device 100 receives a first screen wake-up instruction.
  • the wearable device 100 will receive a screen wake-up instruction to wake up the screen for the user to perform corresponding operations.
  • the wearable device 100 determines the current sleep state of the user.
  • the wearable device 100 determines whether the current sleep state of the user meets a preset condition.
  • The sleep state in the embodiment of the present application includes three states: not having entered sleep, having entered sleep, and preparing to enter sleep. Determining the user's current sleep state means determining whether the user's current sleep state is the not-entered-sleep state, the entered-sleep state, or the about-to-enter-sleep state. When the user's current sleep state is the not-entered-sleep state, the first screen wake-up instruction is directly executed to light up the screen of the wearable device 100; when the user's current sleep state is the entered-sleep state or the about-to-enter-sleep state, the second screen wake-up instruction is required to light up the screen of the wearable device 100, so as to reduce repeated screen lighting caused by user misoperation in these two sleep states and the resulting increase in the power consumption of the wearable device 100, prevent repeated screen lighting caused by misoperation from affecting the user's sleep, and improve the user experience.
  • the wearable device 100 lights up the screen after receiving the second screen wake-up instruction.
  • That the current sleep state of the user meets the preset condition specifically means that the user's current sleep state is considered to meet the preset condition when the user has entered, or is about to enter, the sleep state.
  • During night sleep, for a user whose current sleep state is about to enter the sleep state, if any screen area or a designated screen area of the wearable device 100 is touched unintentionally, or the user's wrist is raised while turning over, the wearable device 100 would otherwise simply execute the screen wake-up command generated by these user operations and light up the screen. Especially when the user has turned off the light and the wearable device 100 is currently in a very dark environment, a suddenly bright screen irritates the eyes of a user in the dark; besides causing eye discomfort, it may also affect the user's sleep quality.
  • For users who have difficulty falling asleep, sudden or repeated lighting of the screen seriously affects their sleep quality: perceiving that the wearable device 100 has brightened the screen, or viewing or browsing the display content of the wearable device 100 after the screen is turned on, easily makes it even harder for such users to enter the sleep state, which makes the user experience very poor.
  • the repeated lighting of the screen caused by the user's misoperation also increases the power consumption of the wearable device 100 and reduces the standby time of the wearable device 100.
  • Therefore, when the wearable device 100 determines that the user's current sleep state is the entered-sleep state or the about-to-enter-sleep state, the user operation corresponding to the generated screen wake-up instruction is regarded as a misoperation, the screen wake-up command is shielded, and the screen is not lit. This effectively reduces the discomfort caused to the user when the screen is lit by a misoperation, effectively reduces the power consumption added to the wearable device 100 by repeated screen lighting, and prevents repeated screen lighting caused by misoperation from affecting the user's sleep.
  • In addition, a screen wake-up command generated based on another user operation (i.e., a second screen wake-up command) is distinguished from the screen wake-up instruction generated by the previous user operation (i.e., the first screen wake-up instruction), so that even after the wearable device 100 determines that the user's current sleep state is the sleep state or about to enter the sleep state, the user can still light up the screen of the wearable device 100 through the screen wake-up command generated by the other operation, which meets the user's diversified needs and improves the user experience.
  • After the wearable device 100 determines that the user's current sleep state is already the sleep state or about to enter the sleep state, or after the wearable device 100 has turned on the anti-mistouch mode, the user may still try to turn on the screen of the wearable device 100 to check the time or browse other content. At this time, if the screen wake-up command generated by the user operation is determined to be an invalid command, that is, if the screen wake-up command generated by the current user operation is shielded and the screen of the wearable device 100 is not lit, the user may get the illusion that the device is broken or malfunctioning, which reduces the user experience.
  • Therefore, in the embodiment of the present application, after determining that the user's current sleep state is the sleep state or about to enter the sleep state, the wearable device 100 further detects whether a second screen wake-up instruction is received. The second screen wake-up instruction corresponds to an instruction generated by another user operation performed after the first user operation. If a second screen wake-up instruction is detected, the wearable device 100 executes the second screen wake-up instruction to light up the screen of the wearable device 100, which is convenient for the user to view the display content of the wearable device 100, meets the diversified needs of the user, and improves the user experience.
  • In this embodiment, the user's current sleep state is determined after the first screen wake-up instruction is received; if the user's current sleep state meets the preset condition, the screen of the wearable device 100 is lit only after the second screen wake-up instruction is received. This can effectively reduce repeated screen lighting caused by user misoperation and the resulting increase in the power consumption of the wearable device 100, extend the standby time of the wearable device 100, and reduce adverse effects on the user such as eye discomfort and disturbed sleep, thereby improving the user experience.
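  • To make the flow concrete, the following is a minimal, illustrative Python sketch of the FIG. 2 flow (steps S101 to S105). The device object, its helper methods, and the SleepState names are hypothetical placeholders introduced only for illustration and are not interfaces defined by this application.

      from enum import Enum

      class SleepState(Enum):
          NOT_ASLEEP = 0        # has not entered sleep
          ABOUT_TO_SLEEP = 1    # preparing to enter sleep
          ASLEEP = 2            # has entered sleep

      def handle_first_wake_instruction(device):
          # S101: a first screen wake-up instruction has been received.
          state = device.determine_sleep_state()   # assess the user's current sleep state
          meets_preset_condition = state in (SleepState.ABOUT_TO_SLEEP, SleepState.ASLEEP)

          if not meets_preset_condition:
              device.light_screen()   # not asleep: execute the first instruction directly
              return

          # Otherwise the first instruction may be a misoperation; the screen is lit
          # only if a second screen wake-up instruction is subsequently received (S105).
          if device.wait_for_second_wake_instruction():
              device.light_screen()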
  • FIG. 3 is a schematic flowchart of another display control method provided by an embodiment of the present application. As shown in FIG. 3, the method includes steps S201 to S204.
  • the wearable device 100 receives a first screen wake-up instruction.
  • For step S201, refer to the description of step S101 in the embodiment described in FIG. 2, which will not be repeated here.
  • S202 The wearable device 100 detects the current screen state.
  • the current screen state of the wearable device 100 includes the screen on state and the screen off state.
  • If the current screen state of the wearable device 100 is the screen-on state, the first screen wake-up instruction can be directly executed, which is convenient for the user to quickly browse or view the corresponding display content.
  • If the current screen state of the wearable device 100 is the off-screen state, it is necessary to avoid lighting up the screen when the user operation corresponding to the first screen wake-up instruction is a misoperation.
  • For step S204, refer to the description of step S105 in the embodiment described in FIG. 2, which is not repeated here.
  • FIG. 4 is a schematic flowchart of another display control method provided by an embodiment of the present application. As shown in Fig. 4, the method includes steps S301 to S304.
  • the wearable device 100 receives a first screen wake-up instruction.
  • For step S301, refer to the description of step S101 in the embodiment described in FIG. 2, which is not repeated here.
  • the wearable device 100 determines whether the accidental touch prevention mode is turned on.
  • The above-mentioned anti-mistouch mode is a preset mode for preventing user misoperation. In the anti-mistouch mode, the user needs more complicated operations, such as double-clicking or sliding, to light up the screen, while simple operations such as a single click or raising the wrist cannot light up the screen.
  • the user can set the accidental touch prevention mode to be turned on within a specified time period, such as 22:00 to 7:00 the next day.
  • If the user operation corresponding to the first screen wake-up instruction is not a preset user operation, it is determined to be a misoperation; the screen wake-up instruction is shielded and the screen of the wearable device 100 is not lit, so as to prevent the screen from being turned on by the user's misoperation.
  • The preset user operations are operations preset for waking up the screen in the anti-mistouch mode, such as double-clicking or pressing any area or a designated area of the screen multiple times, sliding on any area or a designated area of the screen, or a combined pressing plus wrist-raising operation.
  • the wearable device 100 determines the current sleep state of the user.
  • the wearable device 100 lights up the screen after receiving the second screen wake-up instruction.
  • For step S304, refer to the description of step S105 in the embodiment described in FIG. 2, which is not repeated here.
  • In this way, the embodiment of the present application determines whether the anti-mistouch mode is turned on, and thereby whether it is necessary to determine the user's current sleep stage in order to trigger the automatic anti-mistouch function, which improves the intelligence of the wearable device and meets the different needs of users.
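  • As a rough illustration of the FIG. 4 logic, the sketch below first checks whether the anti-mistouch mode is active (modelled here as a user switch plus an assumed 22:00 to 07:00 window, following the example above) before deciding whether a second screen wake-up instruction is required; all function and attribute names are assumptions for illustration.

      from datetime import datetime, time

      ANTI_MISTOUCH_START = time(22, 0)   # example start of the specified period (22:00)
      ANTI_MISTOUCH_END = time(7, 0)      # example end of the specified period (07:00 the next day)

      def anti_mistouch_mode_on(now: datetime, switch_enabled: bool) -> bool:
          """Anti-mistouch mode is active when the user enabled it and the current
          time falls inside a period that crosses midnight (e.g. 22:00 to 07:00)."""
          if not switch_enabled:
              return False
          t = now.time()
          return t >= ANTI_MISTOUCH_START or t < ANTI_MISTOUCH_END

      def on_first_wake_instruction(device, now: datetime):
          if not anti_mistouch_mode_on(now, device.anti_mistouch_switch):
              device.light_screen()   # anti-mistouch mode off: light the screen directly
              return
          if device.sleep_state_meets_preset_condition():
              # asleep or about to sleep: require a second screen wake-up instruction
              if device.wait_for_second_wake_instruction():
                  device.light_screen()
          else:
              device.light_screen()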
  • FIG. 5 is a schematic flowchart of another display control method provided by an embodiment of the present application. As shown in FIG. 5, the method includes steps S401 to S403.
  • The wearable device 100 determines that the user operation corresponding to the first screen wake-up command is a misoperation, shields the first screen wake-up command, and does not light up the screen of the wearable device 100.
  • For example, when the wearable device 100 is in the off-screen state and the user clicks or presses the screen of the wearable device 100 for the first time, the wearable device 100 detects the first screen wake-up command generated based on this click or press operation. After judging that the user's current sleep state is the sleep state or about to enter the sleep state, the wearable device 100 temporarily does not execute the first screen wake-up command and continues to detect, within a predetermined time such as 5 seconds, whether the user acts on the screen of the wearable device 100 a second time by sliding, double-clicking, or double-tapping; that is, the wearable device 100 detects whether a second screen wake-up instruction generated by another user operation such as sliding or double-tapping is received within the predetermined time. If a second screen wake-up instruction is received within the predetermined time, the wearable device 100 can execute the first screen wake-up instruction to light up the screen, or execute the second screen wake-up instruction to light up the screen.
  • If no second screen wake-up instruction is received within the predetermined time, the wearable device 100 determines that the user operation corresponding to the first screen wake-up instruction is a misoperation, shields the first screen wake-up instruction, and does not light up the screen of the wearable device 100.
  • In this way, the embodiment of the present application determines whether to light up the screen of the wearable device 100 by determining whether a second screen wake-up instruction is received, so as to reduce repeated screen-on events caused by user misoperation, effectively reduce the discomfort brought to the user when the screen is lit by a misoperation, and effectively reduce the additional power consumption of the wearable device 100 caused by repeated screen lighting.
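  • The waiting step described above can be pictured with the small sketch below, which watches for a second screen wake-up instruction within a predetermined window (5 seconds, as in the example given); the event-queue object and its poll() method are hypothetical placeholders rather than an API defined by this application.

      import time

      SECOND_WAKE_WINDOW_S = 5.0   # predetermined time, e.g. 5 seconds

      def await_second_wake_instruction(event_queue, window_s: float = SECOND_WAKE_WINDOW_S) -> bool:
          """Return True if a second screen wake-up instruction (for example one generated
          by a slide, double-click, or double-tap) arrives before the window closes."""
          deadline = time.monotonic() + window_s
          while True:
              remaining = deadline - time.monotonic()
              if remaining <= 0:
                  return False   # timed out: the first instruction is treated as a misoperation
              event = event_queue.poll(timeout=remaining)   # hypothetical blocking poll
              if event is not None and getattr(event, "is_second_wake_instruction", False):
                  return True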
  • When the wearable device 100 determines the user's current sleep state, it obtains sleep detection data and determines the user's current sleep state based on this sleep detection data.
  • the above-mentioned sleep detection data is related detection data used to determine the user's current sleep state, including but not limited to at least one of user physiological characteristic data, user motion posture data, and current environment data.
  • the user's physiological characteristic data includes but not limited to the user's heart rate, pulse, respiration rate, brain wave signal and other data;
  • the user's motion posture data includes but not limited to the user's wrist posture data;
  • the current environment data includes, but is not limited to, data such as the ambient light brightness of the environment where the wearable device 100 is currently located.
  • If the user's current sleep state does not meet the preset condition, the wearable device 100 can execute the first screen wake-up instruction to light up the screen of the wearable device 100; otherwise, another screen wake-up command, namely the second screen wake-up command, is needed to light up the screen of the wearable device 100, so as to reduce repeated screen lighting caused by user misoperation and the resulting increase in the power consumption of the wearable device 100.
  • Generally, whether the user has entered the sleep state or has not entered the sleep state can be quickly determined through the user's physiological characteristic data or motion posture data. Accurately determining whether the user's current sleep state is the about-to-enter-sleep state is therefore the key for the wearable device 100 to improve the accuracy of judging whether the current user operation is a misoperation; improving the accuracy of determining whether the user's sleep state is the about-to-enter-sleep state can effectively improve the accuracy of judging whether the user operation corresponding to a certain screen wake-up instruction is a misoperation, so as to better achieve the purpose of preventing accidental touches.
  • FIG. 6 is a schematic flowchart of a method for determining a user's current sleep state according to an embodiment of the present application. As shown in Fig. 6, the method includes steps S501 to S503.
  • the wearable device 100 obtains the user's wrist posture data and the ambient light brightness of the environment where the wearable device 100 is currently located.
  • the user's wrist posture data includes, but is not limited to, acceleration data of wrist movement, distance data of wrist movement, etc.
  • the wearable device 100 can obtain the user's wrist posture data through the acceleration sensor 180E, the gyroscope sensor 180B, the distance sensor 180F, etc.
  • the ambient light brightness of the environment where the wearable device 100 is currently located can be obtained through the ambient light sensor 180L.
  • the wearable device 100 determines whether the acquired user wrist posture data meets the first condition, and whether the ambient light brightness of the environment where the wearable device 100 is currently located is lower than a preset brightness threshold.
  • Judging whether the obtained user's wrist posture data meets the first condition may specifically be judging whether the mean, variance, median, etc. of the root mean square of the user's wrist posture data obtained by the wearable device 100 over a period of time is within a corresponding preset value range; it may also be judging whether the action level corresponding to the user's wrist posture data is a preset action level. If the mean, variance, median, etc. of the root mean square of the user's wrist posture data over a period of time is within the corresponding preset value range, or the action level corresponding to the user's wrist posture data is the preset action level, it is determined that the obtained user's wrist posture data meets the first condition.
  • S503 If the acquired wrist posture data of the user meets the first condition and/or the ambient light brightness of the environment where the wearable device is currently located is lower than a preset brightness threshold, it is determined that the current sleep state of the user meets the preset condition.
  • Specifically, the action level corresponding to the user's wrist posture data is determined; if the determined action level corresponding to the user's wrist posture data is the preset action level, it is determined that the user's wrist posture data meets the first condition.
  • For example, the wearable device 100 acquires, through a three-axis acceleration sensor, N pieces of acceleration data ACC_xn, ACC_yn, ACC_zn of the wearable device 100 in the three directions of the x, y, and z axes over a period of time, where N is an integer greater than or equal to 1 and n ∈ [1, N]. By calculating the mean, median, variance, etc. of the root mean square of the N accelerations ACC_xn, ACC_yn, ACC_zn of the wearable device 100 in the x, y, and z directions, a corresponding first value is obtained, and the first value is then matched against the value ranges corresponding to the action levels to determine the action level corresponding to the user's wrist posture data.
  • For example, the action level corresponding to the user's wrist posture data can be divided into five levels (for example, levels 0 to 4), where level 0 represents stillness, level 1 represents a small amount of movement or a small movement range, level 2 represents a medium movement range, level 3 represents more movement or a larger movement range, and level 4 represents a very large amount of movement. The action level corresponding to the user's wrist posture data can also be divided into more or fewer action levels; the more action levels are divided, the more detailed the analysis of the user's wrist posture data and the more accurate the determined action level.
  • Each action level corresponds to a different numerical range.
  • For example, the numerical range corresponding to level 0 is a first mean range, a first median range, or a first variance range; the numerical range corresponding to level 1 is a second mean range, a second median range, or a second variance range; and so on for the remaining levels, where each range is a half-open interval whose upper endpoint is the lower endpoint of the range of the next level.
  • That is, when the user's wrist posture data is used to determine the corresponding action level, the acceleration of the user's wrist movement can be used: N pieces of acceleration data of the wearable device 100 over a period of time (accelerations ACC_xn, ACC_yn, ACC_zn in the three directions of the x, y, and z axes) are obtained through the three-axis acceleration sensor, and the mean or variance of the root mean square of the N pieces of acceleration data, or the median of the root mean square of the N pieces of acceleration data, is calculated to determine the action level corresponding to the user's wrist posture data.
  • Specifically, when the wearable device 100 receives the first screen wake-up instruction, it obtains the acceleration data at the receiving time point (the time point when the first screen wake-up instruction is received) and the N-1 pieces of acceleration data in the period of time before the receiving time point (that is, the wearable device has N pieces of acceleration data in the time range from a period of time before the receiving time point up to the receiving time point). It then calculates the mean of the root mean square of the N pieces of acceleration data, matches the calculated mean against the value corresponding to the mean in the numerical table of action levels, and determines, according to the matching result, the action level corresponding to the user's wrist posture data.
  • The above preset time range is the time range whose end time is the above receiving time point and whose start time is the time point a specific length of time before the receiving time point.
  • For example, if the receiving time point is 08:00:10 (ten seconds past eight in the morning) and the time point 10 s earlier is 08:00:00 (eight o'clock in the morning), then the acceleration data of the wearable device 100 in the time period [08:00:00, 08:00:10] is acquired.
  • Alternatively, the variance of the root mean square of the acquired N pieces of acceleration data can be calculated, the calculated variance can be matched against the value corresponding to the variance in the numerical table of action levels, and the action level corresponding to the user's wrist posture data can be determined according to the matching result.
  • Alternatively, the median of the root mean square of the N pieces of acceleration data may be obtained, this median may be used as the first value and matched against the value corresponding to the median in the numerical table of action levels, and the action level corresponding to the user's wrist posture data may be determined according to the matching result.
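  • A minimal sketch of this computation is given below: the root mean square of each three-axis acceleration sample is computed, a statistic (mean, variance, or median) of those values serves as the first value, and the first value is mapped to an action level. The level boundaries are made-up placeholders rather than thresholds taken from this application, and in practice each statistic would have its own calibrated value table.

      import math
      import statistics

      # Placeholder upper bounds for levels 0-3; anything at or above the last bound is level 4.
      LEVEL_UPPER_BOUNDS = [0.05, 0.15, 0.40, 0.80]   # purely illustrative values

      def action_level(samples, statistic: str = "mean") -> int:
          """samples: list of (ACC_x, ACC_y, ACC_z) tuples collected over the preset time range."""
          rms = [math.sqrt((x * x + y * y + z * z) / 3.0) for x, y, z in samples]
          if statistic == "mean":
              first_value = statistics.mean(rms)
          elif statistic == "variance":
              first_value = statistics.pvariance(rms)
          else:
              first_value = statistics.median(rms)
          for level, upper in enumerate(LEVEL_UPPER_BOUNDS):
              if first_value < upper:
                  return level
          return len(LEVEL_UPPER_BOUNDS)   # level 4: a very large amount of movement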
  • To improve the accuracy of the determined action level, the embodiment of the present application may determine at least two corresponding action levels based on the obtained user's wrist posture data and compare the at least two corresponding action levels; if at least two of these action levels are the same, the action level with the same level number is set as the action level corresponding to the user's wrist posture data. For example, when two action levels corresponding to the user's wrist posture data are determined, if the two action levels are the same, that action level is taken as the action level corresponding to the user's wrist posture data; otherwise, N pieces of acceleration data are re-acquired to determine the action level corresponding to the user's wrist posture data. When more than two action levels corresponding to the user's wrist posture data are determined, if the proportion of action levels with the same level number among all the action levels corresponding to the user's current wrist posture data reaches a preset threshold, the action level with the same level number is set as the action level corresponding to the user's wrist posture data; otherwise, N pieces of acceleration data are re-acquired to determine the action level corresponding to the user's wrist posture data. Alternatively, the user's action level is further determined by the movement amplitude of the user's wrist, the obtained action levels are compared, and the action level corresponding to the user's wrist posture data is determined according to the comparison result.
  • Specifically, the movement amplitude of the user's wrist is determined, that is, the distance or angle by which the user's wrist has moved relative to a certain reference point. After the movement amplitude of the user's wrist is determined, it is compared with a preset amplitude value: if the determined movement amplitude of the user's wrist is less than the preset amplitude value, it is determined that the user's movement amplitude is small; if the determined movement amplitude of the user's wrist is greater than or equal to the preset amplitude value, it is determined that the user's movement amplitude is large. The corresponding action level is then determined according to the determined magnitude of the user's movement amplitude.
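  • The consistency check described above can be sketched as follows; the 0.5 ratio threshold is an assumed value used only for illustration, and in a real flow a None result would trigger re-acquiring N pieces of acceleration data or falling back to the movement-amplitude check.

      from collections import Counter
      from typing import List, Optional

      def agreed_action_level(levels: List[int], ratio_threshold: float = 0.5) -> Optional[int]:
          """Return the action level supported by enough of the determinations, or None."""
          if not levels:
              return None
          if len(levels) == 2:
              # two determinations: accept the level only if both agree
              return levels[0] if levels[0] == levels[1] else None
          level, count = Counter(levels).most_common(1)[0]
          if count / len(levels) >= ratio_threshold:
              return level
          return None   # no sufficiently dominant level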
  • That is, the action level corresponding to the user's wrist posture data is determined according to the user's wrist posture data; if the determined action level corresponding to the user's wrist posture data is a preset action level, such as level 0 or level 1 above, it is determined that the user's wrist posture data meets the first condition.
  • For example, if the action level corresponding to the user's wrist posture data is level 0, the user's current sleep state can be considered to have entered the sleep state; if the action level corresponding to the user's wrist posture data is level 1, the user's current sleep state can be considered to be about to enter the sleep state; if the action level corresponding to the user's wrist posture data is level 2 or above, it can be considered that the user's current sleep state has not entered the sleep state.
  • When the ambient light brightness of the environment where the wearable device 100 is currently located is not lower than the preset brightness threshold, lighting up the screen of the wearable device 100 will not cause discomfort to the user's eyes, but it still increases the power consumption of the wearable device 100. Therefore, when the ambient light brightness of the environment where the wearable device is currently located is not lower than the preset brightness threshold but the user's wrist posture data meets the first condition, the second screen wake-up command is still required to light up the screen of the wearable device 100 for the user to view the display content, so as to reduce repeated screen lighting caused by user misoperation and the resulting increase in the power consumption of the wearable device 100, and to extend the standby time of the wearable device 100.
  • The user's current sleep state can also be determined through physiological characteristic parameters such as heart rate and respiration, for example, by judging whether the user's heart rate is lower than a preset heart rate value, whether the number of breaths within a preset time is lower than a preset number, and so on.
  • If it is thus determined that the user's current sleep state is the entered-sleep state, the first screen wake-up command is directly shielded. If the user suddenly wakes up and wants to light up the screen, the second screen wake-up instruction is needed to light up the screen, so that the wearable device can distinguish the screen wake-up operations performed by the user during different sleep stages, which improves the user experience.
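  • As a small illustration of such a physiological check, the sketch below flags the entered-sleep state when both readings fall below their thresholds; the threshold values and the use of a logical AND are assumptions for illustration, not values or rules stated in this application.

      PRESET_HEART_RATE_BPM = 55        # assumed preset heart-rate value
      PRESET_BREATHS_PER_MINUTE = 14    # assumed preset breath count for the preset time (1 minute here)

      def entered_sleep_by_physiology(heart_rate_bpm: float, breaths_last_minute: int) -> bool:
          # Both signals must indicate rest for the entered-sleep state to be reported.
          return (heart_rate_bpm < PRESET_HEART_RATE_BPM
                  and breaths_last_minute < PRESET_BREATHS_PER_MINUTE)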
  • This embodiment of the application determines whether the user's current sleep state meets the preset condition, that is, whether the user's current sleep state is already the sleep state or about to enter the sleep state, by determining whether the acquired user's wrist posture data meets the first condition and whether the ambient light brightness of the environment where the wearable device 100 is currently located is lower than the preset brightness threshold. This can effectively improve the accuracy of judging whether the user operation corresponding to a screen wake-up command is a misoperation, so as to better achieve the purpose of preventing accidental touches.
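  • Putting the two signals together, a sketch of the S501 to S503 decision might look as follows; treating either signal as sufficient follows the "and/or" wording above, and the lux threshold and qualifying action levels are illustrative assumptions.

      PRESET_ACTION_LEVELS = {0, 1}   # action levels taken to satisfy the first condition
      PRESET_BRIGHTNESS_LUX = 10.0    # assumed preset ambient-light brightness threshold

      def sleep_state_meets_preset_condition(action_level: int, ambient_lux: float) -> bool:
          wrist_meets_first_condition = action_level in PRESET_ACTION_LEVELS
          environment_is_dark = ambient_lux < PRESET_BRIGHTNESS_LUX
          # "and/or" in the text: either signal is treated here as sufficient.
          return wrist_meets_first_condition or environment_is_dark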
  • In addition, a sleep stage confirmation request instruction can be sent to a third-party electronic device, such as a smart phone bound to the wearable device 100. The sleep stage confirmation request instruction is used to instruct the third-party electronic device to determine the user's current sleep state, and the third-party electronic device feeds the judgment result back to the wearable device 100; that is, the execution subject of the foregoing steps S501 to S503 may be an electronic device other than the wearable device 100.
  • The embodiments of the present application also provide a computer-readable storage medium that stores instructions; when the instructions run on a computer or a processor, the computer or the processor is caused to execute one or more steps of any of the above methods.
  • the embodiments of the present application also provide a computer program product containing instructions.
  • When the computer program product runs on a computer or a processor, the computer or the processor is caused to execute one or more steps in any of the foregoing methods.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted through the computer-readable storage medium.
  • For example, the computer instructions can be sent from a website, computer, server, or data center to another website, computer, server, or data center.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or a data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).
  • All or part of the processes of the above method embodiments can be completed by a computer program instructing relevant hardware. The program can be stored in a computer-readable storage medium; when the program is executed, it may include the processes of the above-mentioned method embodiments. The aforementioned storage media include: ROM, random access memory (RAM), magnetic disks, optical disks, and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a display control method and a wearable device. The method comprises: after receiving a first screen wake-up instruction, a wearable device determines the current sleep state of a user; and if the current sleep state of the user meets a preset condition, the screen of the wearable device is lit up after a second screen wake-up instruction is received, the first screen wake-up instruction being any instruction generated on the basis of a preset user operation for waking up the screen, and the second screen wake-up instruction being any instruction generated on the basis of a preset user operation, other than a touch operation corresponding to the first screen wake-up instruction, for waking up the screen. The increase in power consumption of a wearable device due to repeated lighting of the screen caused by an erroneous user operation can be effectively reduced, and when content displayed on the screen of the wearable device needs to be viewed, the user can conveniently and quickly light up the screen by means of the second screen wake-up instruction, so that the user experience is improved.
PCT/CN2021/084004 2020-04-24 2021-03-30 Procédé de commande d'affichage et dispositif portable WO2021213151A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010335463.9 2020-04-24
CN202010335463.9A CN113552937B (zh) 2020-04-24 2020-04-24 显示控制方法和可穿戴设备

Publications (1)

Publication Number Publication Date
WO2021213151A1 true WO2021213151A1 (fr) 2021-10-28

Family

ID=78101415

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/084004 WO2021213151A1 (fr) 2020-04-24 2021-03-30 Procédé de commande d'affichage et dispositif portable

Country Status (2)

Country Link
CN (1) CN113552937B (fr)
WO (1) WO2021213151A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113867168A (zh) * 2021-11-02 2021-12-31 珠海格力电器股份有限公司 带屏设备的控制方法、装置、存储介质及带屏设备
CN114298105A (zh) * 2021-12-29 2022-04-08 东莞市猎声电子科技有限公司 一种跑步过程中快速响应抬腕动作并亮屏的信号处理方法
CN115079804A (zh) * 2021-12-09 2022-09-20 荣耀终端有限公司 一种电子设备的控制处理方法
CN116056190A (zh) * 2022-05-06 2023-05-02 荣耀终端有限公司 管理终端设备的方法、电子设备及计算机可读存储介质
WO2023207715A1 (fr) * 2022-04-28 2023-11-02 华为技术有限公司 Procédé de commande d'écran, dispositif électronique et support d'enregistrement lisible par ordinateur
WO2024183503A1 (fr) * 2023-03-09 2024-09-12 华为技术有限公司 Procédé de commande d'état d'écran et dispositif électronique

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115388511B (zh) * 2022-08-17 2024-09-06 珠海格力电器股份有限公司 一种基于穿戴设备的空调控制方法、装置、电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060028429A1 (en) * 2004-08-09 2006-02-09 International Business Machines Corporation Controlling devices' behaviors via changes in their relative locations and positions
CN104158956A (zh) * 2014-07-21 2014-11-19 小米科技有限责任公司 终端进行睡眠唤醒的方法及装置
CN104899029A (zh) * 2015-05-28 2015-09-09 广东欧珀移动通信有限公司 一种屏幕控制方法及装置
CN107155005A (zh) * 2017-04-27 2017-09-12 上海斐讯数据通信技术有限公司 一种智能腕式可穿戴装置亮屏控制方法及系统
CN110638422A (zh) * 2014-09-23 2020-01-03 飞比特公司 响应于用户手势而更新屏幕内容的方法、系统及设备

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013093712A1 (fr) * 2011-12-22 2013-06-27 Koninklijke Philips Electronics N.V. Système de réveil
US20140058679A1 (en) * 2012-08-23 2014-02-27 Apple Inc. Wake Status Detection for Suppression and Initiation of Notifications
WO2017088154A1 (fr) * 2015-11-26 2017-06-01 华为技术有限公司 Procédé de commutation d'un mode de profil
CN105791545B (zh) * 2016-02-24 2019-08-02 宇龙计算机通信科技(深圳)有限公司 一种终端设备防打扰方法与装置
CN107436674A (zh) * 2017-08-22 2017-12-05 深圳天珑无线科技有限公司 终端控制方法、装置及非临时性计算机可读介质
CN107526603B (zh) * 2017-09-20 2021-01-08 深圳天珑无线科技有限公司 一种应用唤醒方法及装置
CN110362197A (zh) * 2019-06-13 2019-10-22 缤刻普达(北京)科技有限责任公司 屏幕点亮方法、装置、智能穿戴设备及存储介质
CN110850988B (zh) * 2019-12-02 2022-03-11 合肥工业大学 一种防干扰抬腕亮屏的方法
CN111596751B (zh) * 2020-05-19 2022-07-29 歌尔智能科技有限公司 腕戴设备的显示控制方法、装置、腕戴设备及存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060028429A1 (en) * 2004-08-09 2006-02-09 International Business Machines Corporation Controlling devices' behaviors via changes in their relative locations and positions
CN104158956A (zh) * 2014-07-21 2014-11-19 小米科技有限责任公司 终端进行睡眠唤醒的方法及装置
CN110638422A (zh) * 2014-09-23 2020-01-03 飞比特公司 响应于用户手势而更新屏幕内容的方法、系统及设备
CN104899029A (zh) * 2015-05-28 2015-09-09 广东欧珀移动通信有限公司 一种屏幕控制方法及装置
CN107155005A (zh) * 2017-04-27 2017-09-12 上海斐讯数据通信技术有限公司 一种智能腕式可穿戴装置亮屏控制方法及系统

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113867168A (zh) * 2021-11-02 2021-12-31 珠海格力电器股份有限公司 带屏设备的控制方法、装置、存储介质及带屏设备
CN115079804A (zh) * 2021-12-09 2022-09-20 荣耀终端有限公司 一种电子设备的控制处理方法
CN115079804B (zh) * 2021-12-09 2023-11-07 荣耀终端有限公司 一种电子设备的控制处理方法
CN114298105A (zh) * 2021-12-29 2022-04-08 东莞市猎声电子科技有限公司 一种跑步过程中快速响应抬腕动作并亮屏的信号处理方法
CN114298105B (zh) * 2021-12-29 2023-08-22 东莞市猎声电子科技有限公司 一种跑步过程中快速响应抬腕动作并亮屏的信号处理方法
WO2023207715A1 (fr) * 2022-04-28 2023-11-02 华为技术有限公司 Procédé de commande d'écran, dispositif électronique et support d'enregistrement lisible par ordinateur
CN116056190A (zh) * 2022-05-06 2023-05-02 荣耀终端有限公司 管理终端设备的方法、电子设备及计算机可读存储介质
WO2024183503A1 (fr) * 2023-03-09 2024-09-12 华为技术有限公司 Procédé de commande d'état d'écran et dispositif électronique

Also Published As

Publication number Publication date
CN113552937A (zh) 2021-10-26
CN113552937B (zh) 2024-07-19

Similar Documents

Publication Publication Date Title
WO2021213151A1 (fr) Procédé de commande d'affichage et dispositif portable
WO2020168965A1 (fr) Procédé de commande d'un dispositif électronique à écran pliant et dispositif électronique
EP4033335A1 (fr) Écran tactile, dispositif électronique et procédé de commande d'affichage
CN111262975B (zh) 亮屏控制方法、电子设备、计算机可读存储介质和程序产品
WO2021036785A1 (fr) Procédé de rappel de message et dispositif électronique
WO2021169515A1 (fr) Procédé d'échange de données entre dispositifs, et dispositif associé
WO2020019355A1 (fr) Procédé de commande tactile pour dispositif vestimentaire, et système et dispositif vestimentaire
WO2021052139A1 (fr) Procédé d'entrée de geste et dispositif électronique
WO2020015144A1 (fr) Procédé de photographie et dispositif électronique
WO2021190314A1 (fr) Procédé et appareil de commande de réponse au glissement d'un écran tactile, et dispositif électronique
WO2020056778A1 (fr) Procédé destiné à blinder un événement tactile et dispositif électronique
WO2023016017A1 (fr) Procédé de commande pour écran d'affichage, et dispositif électronique
WO2022007720A1 (fr) Procédé de détection de port pour un dispositif pouvant être porté, appareil et dispositif électronique
CN113691271B (zh) 数据传输方法及可穿戴设备
WO2020221062A1 (fr) Procédé d'opération de navigation et dispositif électronique
WO2022151887A1 (fr) Procédé de surveillance du sommeil et appareil associé
WO2022105830A1 (fr) Procédé d'évaluation de sommeil, dispositif électronique et support de stockage
CN114089902A (zh) 手势交互方法、装置及终端设备
WO2022237598A1 (fr) Procédé de test d'état de sommeil et dispositif électronique
CN115665632A (zh) 音频电路、相关装置和控制方法
CN113467747B (zh) 音量调节方法、电子设备及存储介质
WO2021204036A1 (fr) Procédé de surveillance du risque de sommeil, dispositif électronique et support de stockage
CN114375027A (zh) 降低功耗的方法和装置
CN111026285B (zh) 一种调节压力阈值的方法及电子设备
CN113918003A (zh) 检测皮肤接触屏幕时长的方法、装置及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21791892

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21791892

Country of ref document: EP

Kind code of ref document: A1