WO2022199352A1 - Off-screen display method and electronic device - Google Patents

Off-screen display method and electronic device

Info

Publication number
WO2022199352A1
WO2022199352A1 · PCT/CN2022/079146 · CN2022079146W
Authority: WIPO (PCT)
Prior art keywords: screen, animation, electronic device, mobile phone, user
Application number
PCT/CN2022/079146
Other languages
English (en)
French (fr)
Inventor
张晓航
黄丽薇
杨诗姝
任杰
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Application filed by 荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority to US 18/027,527 (published as US20230350535A1)
Priority to EP 22774014.9 (published as EP4202621A1)
Published as WO2022199352A1

Classifications

    • G06F3/04815 — Interaction with a metaphor-based environment or with an interaction object displayed as three-dimensional
    • G06F3/04847 — Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F1/1686 — Constructional details or arrangements of portable computers, the integrated I/O peripheral being a camera
    • G06F1/3206 — Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F1/3212 — Monitoring battery levels, e.g. power saving mode being initiated when battery voltage goes below a certain level
    • G06F1/3215 — Monitoring of peripheral devices
    • G06F1/3228 — Monitoring task completion, e.g. by use of idle timers, stop commands or wait commands
    • G06F1/3231 — Monitoring the presence, absence or movement of users
    • G06F1/3265 — Power saving in display device
    • G06F3/0482 — Interaction with lists of selectable items, e.g. menus
    • G06F3/04842 — Selection of displayed objects or displayed text elements
    • G06F3/04883 — Inputting data by handwriting on a touch-screen, e.g. gesture or text
    • G06F3/147 — Digital output to display device using display panels
    • G09G2330/021 — Power management, e.g. power saving
    • G09G2330/022 — Power management in absence of operation, e.g. no data being entered during a predetermined time

Definitions

  • The present application relates to the field of electronic devices, and in particular to an off-screen display (AOD) method and an electronic device.
  • The AOD (always-on display) function, also called the off-screen display or screen-off display function, refers to the ability of an electronic device to display the time, caller information, push messages, and the like while the screen is off.
  • In the prior art, the mobile phone may also display a preset AOD animation in area 102 of the screen.
  • For example, the AOD animation may be a video with a duration of 3 s.
  • After the AOD animation ends, the mobile phone may freeze on the last frame of the animation.
  • The present application provides an AOD method and an electronic device, which can perform the always-on display in combination with the usage state of the electronic device, so that the usage state can be conveyed to the user more vividly and effectively in the screen-off state, enhancing the real-time relevance and appeal of the displayed content.
  • In a first aspect, the present application provides an AOD method, including: an electronic device receives a screen-off event; in response to the screen-off event, the electronic device enters a screen-off state; and in the screen-off state, the electronic device displays, according to its charging status and battery level information, the first AOD animation in a first AOD animation group.
  • The first AOD animation group may include multiple AOD animations, each belonging to the same theme. After the first AOD animation ends, if the electronic device detects a first touch operation input by the user, the electronic device may display, in response to the first touch operation, the second AOD animation in the first AOD animation group.
  • In this way, the multiple AOD animations in the AOD animation group convey the latest charging status and battery level to the user completely and vividly; the device's current power status is presented intuitively, improving the user experience.
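As an illustrative sketch only (the group names and the 20% threshold below are assumptions, not taken from the application), selecting an AOD animation group from the charging status and battery level described above might look like:

```python
def select_aod_group(is_charging: bool, battery_pct: int) -> str:
    """Pick an AOD animation group from the device's power status.

    The group names and the 20% low-battery threshold are illustrative
    assumptions; the application only requires that the choice depend on
    the charging status and battery level.
    """
    if is_charging:
        return "charging-theme"
    if battery_pct <= 20:
        return "low-battery-theme"
    return "normal-theme"
```

Each returned group would then contain several same-theme animations played in sequence, as described above.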
  • In a possible implementation, after the electronic device displays the first AOD animation in the first AOD animation group, the above method further includes: the electronic device freezes on and displays the last frame of the first AOD animation; similarly, after the electronic device displays the second AOD animation in the group, the method further includes: the electronic device freezes on and displays the last frame of the second AOD animation.
  • In a possible implementation, after the second AOD animation ends, the method further includes: the electronic device detects a second touch operation input by the user; in response to the second touch operation, the electronic device obtains the latest charging status and battery level information; if the charging status or the battery level of the electronic device has changed, the electronic device plays the third AOD animation in the corresponding second AOD animation group; if neither the charging status nor the battery level has changed, the electronic device replays the first AOD animation. In this way, after all AOD animations in the first AOD animation group have finished playing, the electronic device can still respond to a touch operation input by the user and perform the always-on display in combination with the latest power status.
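The decision just described — play a new group's animation if the power status changed, otherwise replay the first animation — can be sketched as follows (the return labels and the `(is_charging, battery_pct)` tuple shape are illustrative assumptions):

```python
def on_touch_after_group(prev_state, query_power_state):
    """Handle a touch after the current AOD animation group has finished.

    prev_state:        (is_charging, battery_pct) captured earlier.
    query_power_state: callable returning the current (is_charging, battery_pct).
    """
    current = query_power_state()
    if current != prev_state:
        # Charging status or battery level changed: switch to the
        # corresponding new group and play its animation.
        return "second-group/third-animation"
    # Nothing changed: replay the first AOD animation.
    return "first-aod-animation"
```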
  • In a possible implementation, that the electronic device detects the first touch operation input by the user includes: the electronic device detects the first touch operation input by the user within a preset time (for example, 10 s) after the first AOD animation ends. Otherwise, the electronic device may enter a black screen state, reducing the power consumption of the electronic device.
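The preset-time window can be expressed as a simple function of the elapsed time since the animation ended (the state names are illustrative; only the 10 s example value comes from the text):

```python
def display_state(elapsed_since_end_s: float, preset_s: float = 10.0) -> str:
    """Within the preset window the device keeps showing the frozen last
    frame and accepts a touch to continue; afterwards it goes black to
    save power."""
    return "frozen-last-frame" if elapsed_since_end_s <= preset_s else "black-screen"
```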
  • In a possible implementation, the method further includes: if a third touch operation input by the user is detected within a preset time after the second AOD animation ends, the electronic device obtains its charging status and battery level information; if the charging status or the battery level has changed, the electronic device plays the third AOD animation in the corresponding second AOD animation group; if neither has changed, the electronic device replays the first AOD animation.
  • Otherwise, the electronic device enters a black screen state, thereby reducing its power consumption.
  • In a possible implementation, after the electronic device enters the black screen state, the method further includes: if a fourth touch operation input by the user is detected within a preset time, the electronic device obtains its charging status and battery level information; if the charging status or the battery level has changed, the electronic device plays the third AOD animation in the corresponding second AOD animation group; if neither has changed, the electronic device replays the first AOD animation. That is to say, even when the screen is black, the electronic device can respond to a touch operation input by the user and perform the always-on display with the latest power status.
  • In a possible implementation, after the electronic device enters the screen-off state and before it displays the first AOD animation in the first AOD animation group according to its charging status and battery level information, the method further includes: the electronic device displays a preset entry animation, the last frame of which is the same as the first frame of the first AOD animation.
  • In a possible implementation, the last frame of the first AOD animation is the same as the first frame of the second AOD animation; or the animation subject in the first AOD animation is the same as the animation subject in the second AOD animation; or the event performed by the animation subject in the first AOD animation is the same as the event performed by the animation subject in the second AOD animation.
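The first continuity condition above — the last frame of one animation equals the first frame of the next — can be checked mechanically. In this sketch each animation is modelled as a list of frame identifiers, which is an assumption made for illustration:

```python
def group_is_continuous(animations) -> bool:
    """True if every animation's last frame matches the next animation's
    first frame, so playback appears seamless across the group."""
    return all(prev[-1] == nxt[0]
               for prev, nxt in zip(animations, animations[1:]))
```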
  • In a possible implementation, after the electronic device displays the first AOD animation in the first AOD animation group, the method further includes: when the electronic device captures an image of the user's face or eyes, the electronic device plays a first interactive AOD animation whose first frame is the same as the last frame of the first AOD animation. Similarly, after the electronic device displays the second AOD animation in the group, the method further includes: when the electronic device captures an image of the user's face or eyes, the electronic device plays a second interactive AOD animation whose first frame is the same as the last frame of the second AOD animation. In both interactive AOD animations, the animation subject presents a state of interacting with the user.
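A hypothetical dispatcher for the camera trigger described above (the animation names are invented for illustration; the application only requires that each interactive animation start from the frozen last frame of the animation it follows):

```python
from typing import Optional

def on_camera_frame(face_or_eye_detected: bool, last_played: str) -> Optional[str]:
    """When a face or eye image is captured while an AOD animation's last
    frame is frozen, return the matching interactive AOD animation to play;
    otherwise return None."""
    if not face_or_eye_detected:
        return None
    interactive = {
        "first-aod": "first-interactive-aod",
        "second-aod": "second-interactive-aod",
    }
    return interactive.get(last_played)
```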
  • In a possible implementation, the first AOD animation group may further include a third AOD animation; after the electronic device displays the second AOD animation in the group, the method may further include: after the second AOD animation ends, if the electronic device detects a fifth touch operation input by the user, the electronic device displays, in response to the fifth touch operation, the third AOD animation in the first AOD animation group.
  • In this way, the multiple AOD animations in the group show more completely and vividly the whole process of the animation subject performing a certain event.
  • In a possible implementation, while playing the first or second AOD animation, the electronic device may not respond to touch operations input by the user, so as to avoid interrupting the AOD animation being played and affecting the user's viewing experience.
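The touch-suppression rule above, together with the touch-to-advance behaviour, can be combined in a minimal playback controller (an illustrative sketch; the class, method, and state names are assumptions):

```python
class AodPlayer:
    """Plays an AOD animation group: touches during playback are ignored;
    a touch after an animation ends advances to the next one."""

    def __init__(self, group):
        self.group = list(group)
        self.index = 0
        self.playing = True  # the first animation starts immediately

    def animation_ended(self):
        self.playing = False  # freeze on the last frame

    def on_touch(self):
        if self.playing:
            return None  # suppressed: do not interrupt the animation
        if self.index + 1 < len(self.group):
            self.index += 1
            self.playing = True
            return self.group[self.index]
        return None  # group finished; the power-status check applies instead
```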
  • In a second aspect, the present application provides an electronic device, including: a touch screen, where the touch screen includes a touch sensor and a display; one or more processors; and a memory. The memory stores one or more computer programs including instructions that, when executed by the electronic device, cause the electronic device to perform the following steps: receiving a screen-off event; entering a screen-off state in response to the screen-off event; displaying, according to the charging status and battery level information of the electronic device, the first AOD animation in a first AOD animation group, where the first AOD animation group may include multiple AOD animations; and, after the first AOD animation ends, if a first touch operation input by the user is detected, displaying the second AOD animation in the first AOD animation group in response to the first touch operation.
  • In a possible implementation, after displaying the first AOD animation in the first AOD animation group, the electronic device is further configured to freeze on and display the last frame of the first AOD animation; after displaying the second AOD animation in the group, the electronic device is further configured to freeze on and display the last frame of the second AOD animation.
  • In a possible implementation, after the second AOD animation ends, the electronic device is further configured to: detect a second touch operation input by the user; obtain the charging status and battery level information of the electronic device in response to the second touch operation; play the third AOD animation in the corresponding second AOD animation group if the charging status or the battery level has changed; and replay the first AOD animation if neither the charging status nor the battery level has changed.
  • In a possible implementation, that the electronic device detects the first touch operation input by the user includes: the electronic device detects the first touch operation input by the user within a preset time after the first AOD animation ends.
  • In a possible implementation, the electronic device is further configured to: obtain the charging status and battery level information of the electronic device if a third touch operation input by the user is detected within a preset time after the second AOD animation ends; play the third AOD animation in the corresponding second AOD animation group if the charging status or the battery level has changed; and replay the first AOD animation if neither has changed.
  • In a possible implementation, if the first touch operation input by the user is not detected within the preset time, the electronic device enters a black screen state; likewise, if the third touch operation input by the user is not detected within the preset time, the electronic device enters a black screen state.
  • In a possible implementation, after entering the black screen state, the electronic device is further configured to: obtain the charging status and battery level information of the electronic device if a fourth touch operation input by the user is detected within a preset time; play the third AOD animation in the corresponding second AOD animation group if the charging status or the battery level has changed; and replay the first AOD animation if neither has changed.
  • In a possible implementation, after the electronic device enters the screen-off state and before it displays the first AOD animation in the first AOD animation group according to its charging status and battery level information, the electronic device is further configured to display a preset entry animation whose last frame is the same as the first frame of the first AOD animation.
  • In a possible implementation, the last frame of the first AOD animation is the same as the first frame of the second AOD animation; or the animation subject in the first AOD animation is the same as the animation subject in the second AOD animation; or the event performed by the animation subject in the first AOD animation is the same as the event performed by the animation subject in the second AOD animation.
  • In a possible implementation, after displaying the first AOD animation in the first AOD animation group, the electronic device is further configured to: play a first interactive AOD animation when it captures an image of the user's face or eyes, where the first frame of the first interactive AOD animation is the same as the last frame of the first AOD animation; and after displaying the second AOD animation in the group, the electronic device is further configured to: play a second interactive AOD animation when it captures an image of the user's face or eyes, where the first frame of the second interactive AOD animation is the same as the last frame of the second AOD animation. In both interactive AOD animations, the animation subject presents a state of interacting with the user.
  • In a possible implementation, the first AOD animation group further includes a third AOD animation; after displaying the second AOD animation in the group, the electronic device is further configured to: detect a fifth touch operation input by the user after the second AOD animation ends; and display the third AOD animation in the first AOD animation group in response to the fifth touch operation.
  • In a possible implementation, while playing the first or second AOD animation, the electronic device may not respond to touch operations input by the user.
  • In a third aspect, the present application provides an electronic device, including a memory, a display, and one or more processors, where the memory and the display are coupled to the processor.
  • The memory is used to store computer program code, and the computer program code includes computer instructions; when the electronic device runs, the processor executes the computer instructions stored in the memory, so that the electronic device performs the AOD method according to any one of the first aspect.
  • In a fourth aspect, the present application provides a computer storage medium, including computer instructions, which, when executed on an electronic device, cause the electronic device to perform the AOD method according to any one of the first aspect.
  • In a fifth aspect, the present application provides a computer program product which, when run on an electronic device, causes the electronic device to perform the AOD method according to any one of the first aspect.
  • FIG. 1 is a schematic diagram of an application scenario of an AOD function in the prior art;
  • FIG. 2 is a first schematic structural diagram of an electronic device according to an embodiment of the present application;
  • FIG. 3 is a first schematic flowchart of an AOD method provided by an embodiment of the present application;
  • FIG. 4 is a second schematic flowchart of an AOD method provided by an embodiment of the present application;
  • FIG. 5 is a third schematic flowchart of an AOD method provided by an embodiment of the present application;
  • FIG. 6 is a first schematic diagram of an application scenario of an AOD method provided by an embodiment of the present application;
  • FIG. 7 is a second schematic diagram of an application scenario of an AOD method provided by an embodiment of the present application;
  • FIG. 8 is a third schematic diagram of an application scenario of an AOD method provided by an embodiment of the present application;
  • FIG. 9 is a fourth schematic diagram of an application scenario of an AOD method provided by an embodiment of the present application;
  • FIG. 10 is a fifth schematic diagram of an application scenario of an AOD method provided by an embodiment of the present application;
  • FIG. 11 is a fourth schematic flowchart of an AOD method provided by an embodiment of the present application;
  • FIG. 12 is a fifth schematic flowchart of an AOD method provided by an embodiment of the present application;
  • FIG. 13 is a sixth schematic diagram of an application scenario of an AOD method provided by an embodiment of the present application;
  • FIG. 14 is a sixth schematic flowchart of an AOD method provided by an embodiment of the present application;
  • FIG. 15 is a seventh schematic flowchart of an AOD method provided by an embodiment of the present application;
  • FIG. 16 is an eighth schematic flowchart of an AOD method provided by an embodiment of the present application;
  • FIG. 17 is a ninth schematic flowchart of an AOD method provided by an embodiment of the present application;
  • FIG. 18 is a tenth schematic flowchart of an AOD method provided by an embodiment of the present application;
  • FIG. 19 is a seventh schematic diagram of an application scenario of an AOD method provided by an embodiment of the present application;
  • FIG. 20 is an eleventh schematic flowchart of an AOD method provided by an embodiment of the present application;
  • FIG. 21 is a second schematic structural diagram of an electronic device according to an embodiment of the present application.
  • The terms "first" and "second" are used for descriptive purposes only, and should not be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
  • A feature defined as "first" or "second" may expressly or implicitly include one or more of that feature.
  • "Plural" means two or more.
  • The off-screen display method provided in the embodiments of the present application can be applied to a mobile phone, a vehicle-mounted device, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a personal digital assistant (PDA), a wearable electronic device, a virtual reality device, and other electronic devices with an off-screen display function, which is not limited in the embodiments of the present application.
  • FIG. 2 is a schematic structural diagram of a mobile phone.
  • the mobile phone may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, and an audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone jack 170D, sensor module 180, etc.
  • The structures illustrated in the embodiments of the present application do not constitute a specific limitation on the mobile phone.
  • The mobile phone may include more or fewer components than shown, or some components may be combined or split, or the components may be arranged differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units may be independent devices, or may be integrated in one or more processors.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • The memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instructions or data again, it can call them directly from the memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the mobile phone can be used to cover one or more communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the mobile phone.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • The wireless communication module 160 can provide wireless communication solutions applied on the mobile phone, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • The wireless communication module 160 can also receive a signal to be sent from the processor 110, perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation.
  • the antenna 1 of the mobile phone is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the mobile phone can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the mobile phone realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the handset may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • an OLED display screen may include a plurality of OLED pixel units arranged in an array. As shown in FIG. 2 , each OLED pixel unit includes a cathode, an anode, and an electron transport layer, a hole transport layer and a light-emitting layer located between the cathode and the anode.
  • the cathode may be a metal electrode
  • The anode may be an indium tin oxide (ITO) transparent electrode.
  • After the driving voltage V is input to the cathode and the anode, under the action of the driving voltage V, electrons are transported from the cathode to the electron transport layer, and holes are injected from the anode into the hole transport layer.
  • the light-emitting molecules in the light-emitting layer are excited and radiated to generate a light source.
  • When the driving voltage V differs, the corresponding OLED pixel unit can be excited to present a different color and brightness. In this way, each OLED pixel unit in the OLED display screen can display a corresponding picture under a different driving voltage.
  • the organic materials in the electron transport layer, the hole transport layer and the light-emitting layer will gradually age during use.
  • The afterimage phenomenon in an OLED display screen actually occurs when OLED pixel units at fixed positions display the same still image for a long time, which causes the organic material in those pixel units to be depleted more than at other positions and their luminous efficiency to decay faster, leaving an afterimage on the OLED display.
  • the mobile phone can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used to process the data fed back by the camera 193 .
  • When the shutter is opened, light is transmitted to the camera photosensitive element through the lens, where the light signal is converted into an electrical signal; the camera photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
  • the mobile phone may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the mobile phone selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, etc.
  • Video codecs are used to compress or decompress digital video.
  • a phone can support one or more video codecs.
  • The mobile phone can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone.
  • The external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, saving files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the mobile phone can implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, and an application processor. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • The speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the mobile phone can listen to music through the speaker 170A, or listen to hands-free calls.
  • The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be received by placing the receiver 170B close to the human ear.
  • The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
  • The user can speak close to the microphone 170C to input a sound signal into the microphone 170C.
  • the mobile phone may be provided with at least one microphone 170C.
  • the mobile phone may be provided with two microphones 170C, which in addition to collecting sound signals, may also implement a noise reduction function.
  • the mobile phone can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the sensor module 180 may include a pressure sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
  • the mobile phone may also include a charging management module, a power management module, a battery, a button, an indicator, and one or more SIM card interfaces, which are not limited in this embodiment of the present application.
  • the mobile phone can use each sensor in the above sensor module to collect corresponding sensor data.
  • a cell phone may use a GPS device to obtain the user's location.
  • the mobile phone can use the acceleration sensor and the gravity sensor to obtain the user's step count.
  • the mobile phone may use a temperature sensor to obtain the temperature of the environment where the user is located, and the like.
  • applications such as weather, calendar, music or health can also be installed in the mobile phone.
  • the mobile phone can also obtain corresponding application data from these applications.
  • The mobile phone can obtain its battery information through the battery function in the Settings APP.
  • the mobile phone can obtain whether the mobile phone is playing music through the music APP.
  • the mobile phone can obtain health data such as the number of steps and heart rate of the user through the health APP.
  • the above sensor data and/or application data may be referred to as usage data of the mobile phone, and the usage data may be used to reflect the usage status of the mobile phone.
  • When the screen of the mobile phone is off (also known as screen-off, screen-locked, or black-screen), the mobile phone can render the off-screen display according to the latest usage data, so that the content displayed in the off-screen state is associated with the latest usage status of the phone.
  • For example, when the latest usage data of the mobile phone indicates that the mobile phone is playing music, the mobile phone can display off-screen animation 1 of a dancing bear after the screen is off. For another example, when the latest usage data indicates that the user's step count is greater than a preset value, the mobile phone can display off-screen animation 2 of a bear doing fitness after the screen is off. For another example, when the latest usage data indicates that the battery level of the mobile phone is less than a preset value, the mobile phone may display off-screen animation 3 of a bear lying down after the screen is off.
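The usage-data-to-animation mapping described above can be sketched as follows. This is an illustrative assumption only: the animation names, the threshold values, and the function itself are placeholders, not part of the application's actual implementation.

```python
# Illustrative sketch: names and thresholds below are assumed,
# not taken from this application.
STEP_THRESHOLD = 10000      # assumed preset step-count value
LOW_BATTERY_THRESHOLD = 20  # assumed preset battery percentage

def pick_off_screen_animation(playing_music: bool, steps: int, battery: int) -> str:
    """Map the phone's latest usage data to an off-screen animation."""
    if playing_music:
        return "animation_1_dancing_bear"
    if steps > STEP_THRESHOLD:
        return "animation_2_bear_fitness"
    if battery < LOW_BATTERY_THRESHOLD:
        return "animation_3_bear_lying_down"
    return "default_animation"
```

A real implementation would also need a priority rule when several conditions hold at once; the order above is one arbitrary choice.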
  • In this way, the mobile phone can convey its latest usage status to the user in the off-screen state, which not only makes the interaction between the mobile phone and the user more interesting, but also intuitively and vividly displays the latest usage status (such as the battery level or whether music is playing) to the user, improving the user experience.
  • the mobile phone can also transmit the usage status of the mobile phone to the user in other forms such as displaying text or pictures associated with the usage data of the mobile phone, which is not limited in this embodiment of the present application.
  • A mobile phone is used as an example of an electronic device having an off-screen display function.
  • a power management unit may be provided in the mobile phone, and the power management unit may monitor the charging state of the mobile phone in real time.
  • the above-mentioned power management unit may be a power management service (PowerManagerService, PMS) in the mobile phone.
  • When charging starts, a charging start event can be reported to the PMS.
  • When charging ends, a charging end event can be reported to the PMS.
  • If the PMS has acquired a charging start event but has not acquired the corresponding charging end event, it can determine that the mobile phone is in a charging state; otherwise, the PMS can determine that the mobile phone is not charging.
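The start/end event logic above can be sketched minimally as follows. This is an illustration of the described behavior only, not the actual PMS implementation, and the event names are assumed.

```python
class ChargingStateTracker:
    """Infers the charging state from charging start/end events,
    as the power management unit is described to do above."""

    def __init__(self) -> None:
        self._charging = False

    def on_event(self, event: str) -> None:
        # Event names are illustrative assumptions.
        if event == "charging_start":
            self._charging = True
        elif event == "charging_end":
            self._charging = False

    def is_charging(self) -> bool:
        return self._charging
```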
  • the above-mentioned power management unit can also monitor the power information of the mobile phone in real time, and then determine which gear the power of the mobile phone is in according to the power information of the mobile phone.
  • For example, the battery level of the mobile phone can be divided into three gears in advance: the first gear is 0% ≤ battery level < 20%, that is, the low battery state; the second gear is 20% ≤ battery level < 60%, that is, the medium battery state; and the third gear is 60% ≤ battery level ≤ 100%, that is, the high battery state.
  • If the power management unit determines that the mobile phone is in a non-charging state, it can also determine whether the battery level of the mobile phone is in the first, second, or third gear according to the newly acquired battery information.
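The three gears defined above amount to a simple mapping from a battery percentage to a gear number; the following function is a sketch of those thresholds, not code from this application.

```python
def battery_gear(level: int) -> int:
    """Map a battery percentage to the gear described above:
    gear 1 (low):    0% <= level < 20%
    gear 2 (medium): 20% <= level < 60%
    gear 3 (high):   60% <= level <= 100%
    """
    if not 0 <= level <= 100:
        raise ValueError("battery level must be between 0 and 100")
    if level < 20:
        return 1
    if level < 60:
        return 2
    return 3
```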
  • The latest charging status and battery information of the mobile phone can be obtained according to the above method, and then different off-screen animations (or off-screen animation groups) are displayed in the screen-off state according to that charging status and battery information, so that the current battery status of the mobile phone can be vividly and intuitively conveyed to the user through the different off-screen animations (or off-screen animation groups). In this way, the user can intuitively and vividly know the battery status of the mobile phone when its screen is off, and the off-screen display becomes more interesting.
  • The off-screen animation group refers to a group of off-screen animations.
  • The off-screen animation group may include multiple off-screen animations, and each off-screen animation is a video of a certain duration.
  • For example, the duration of each off-screen animation can be 3 s.
  • The multiple off-screen animations in the above off-screen animation group may belong to the same theme.
  • For example, the three off-screen animations in off-screen animation group A are all animations on the theme of a dancing bear. That is to say, the animation target (for example, a character or an animal) of each off-screen animation in the same off-screen animation group may be the same, and the event performed by the animation target may be the same.
  • In this way, the whole process of the animation target performing a certain event (for example, the bear's dance) can be represented more completely and vividly by the multiple off-screen animations in the off-screen animation group.
  • The multiple off-screen animations in the above off-screen animation group may be continuous.
  • For example, off-screen animation group A includes off-screen animation 1 and off-screen animation 2.
  • The last frame of off-screen animation 1 may be the same as or similar to the first frame of off-screen animation 2.
  • Similarly, the last frame of off-screen animation 2 may be the same as or similar to the first frame of off-screen animation 1.
  • In this way, the off-screen animations can be connected seamlessly, without abrupt changes on the screen that would affect the user's experience.
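The continuity property described above (each animation's last frame matching the next animation's first frame, wrapping back to the first animation) can be checked with a sketch like the following, where frames are modeled as plain comparable values purely for illustration.

```python
def group_is_seamless(animations: list) -> bool:
    """Return True if every animation's last frame equals the next
    animation's first frame, wrapping around the group, which is the
    seamless-connection property described above. Each animation is a
    non-empty list of frames (abstract values here)."""
    n = len(animations)
    return all(
        animations[i][-1] == animations[(i + 1) % n][0]
        for i in range(n)
    )
```

In practice "the same or similar" would be a perceptual similarity test rather than strict equality; equality is used here only to keep the sketch minimal.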
  • The mobile phone can receive a screen-off event input by the user, and the screen-off event can be used to trigger the mobile phone to enter the screen-off state.
  • For example, the screen-off event may be an event in which the user taps the power button while the screen of the mobile phone is on.
  • Alternatively, the screen-off event may be an event in which no user operation is detected within a certain period of time while the screen of the mobile phone is on.
  • The mobile phone can obtain the latest charging status and power information 1 according to the above method. If the mobile phone is in the charging state, then regardless of its battery level, the mobile phone can display the first off-screen animation group associated with the charging state after the screen is off. The multiple off-screen animations in the first off-screen animation group can convey the charging state of the mobile phone to the user.
  • For example, the first off-screen animation group can be used to show the process of a polar bear swimming in lake water.
  • The first off-screen animation group may include off-screen animation A1 and off-screen animation A2.
  • Off-screen animation A1 and off-screen animation A2 can show two different stages of the polar bear swimming in the lake water.
  • Alternatively, the first off-screen animation group can also show a cat sleeping on the grass. In this way, the charging state of the mobile phone can be conveyed to the user through relatively leisurely animations such as swimming and sleeping, so that the user can intuitively and vividly know the current charging state of the mobile phone.
  • If the mobile phone is not charging, it can determine which gear its battery level is currently in according to the newly acquired power information 1. For example, if 0% ≤ power information 1 < 20%, the mobile phone can determine that its battery level is in the first gear, that is, the low battery state, and after the screen is off, the mobile phone can display the second off-screen animation group associated with the first-gear power information.
  • For example, the second off-screen animation group can be used to show the process of a polar bear lying on the ice and looking up at the sky.
  • The second off-screen animation group may include off-screen animation B1 and off-screen animation B2.
  • Off-screen animation B1 and off-screen animation B2 can show two different stages of the polar bear lying on the ice and looking up at the sky.
  • In these animations, the polar bear's movement speed (for example, V1) can be slow, or the amplitude of the polar bear's movement (for example, A1) can be small, so that the polar bear appears rather tired, thereby intuitively and vividly conveying to the user the information that the mobile phone is in a low battery state.
  • If 20% ≤ power information 1 < 60%, the mobile phone can determine that its battery level is in the second gear, that is, the medium battery state, and after the screen is off, the mobile phone can display the third off-screen animation group associated with the second-gear power information.
  • For example, the third off-screen animation group can be used to show the process of the polar bear interacting with a meteor or other objects on the ice surface.
  • The third off-screen animation group may include off-screen animation C1 and off-screen animation C2.
  • Off-screen animation C1 and off-screen animation C2 can show two different stages of the polar bear interacting with the meteor on the ice.
  • In these animations, the polar bear's movement speed (for example, V2) may be faster than in the second off-screen animation group, that is, V2 > V1.
  • Alternatively, the amplitude of the polar bear's movement (for example, A2) may be greater than in the second off-screen animation group, that is, A2 > A1.
  • In this way, the polar bear in the third off-screen animation group can present a more dynamic state, thereby intuitively and vividly conveying to the user the information that the mobile phone is in a medium battery state.
  • If 60% ≤ power information 1 ≤ 100%, the mobile phone can determine that its battery level is in the third gear, that is, the high battery state, and after the screen is off, the mobile phone can display the fourth off-screen animation group associated with the third-gear power information.
  • For example, the fourth off-screen animation group can also be used to show the process of the polar bear interacting with a meteor or other objects on the ice.
  • The fourth off-screen animation group may include off-screen animation D1 and off-screen animation D2.
  • Off-screen animation D1 and off-screen animation D2 can show two different stages of the polar bear interacting with the meteor on the ice surface.
  • In these animations, the polar bear's movement speed (for example, V3) may be faster than in the third off-screen animation group, that is, V3 > V2.
  • Alternatively, the amplitude of the polar bear's movement (for example, A3) may be greater than in the third off-screen animation group, that is, A3 > A2.
  • In this way, the polar bear in the fourth off-screen animation group can present a more dynamic state, thereby intuitively and vividly conveying to the user the information that the mobile phone is in a high battery state.
  • In this way, the corresponding off-screen animation group can be displayed after the screen is off according to the charging status and battery information of the mobile phone, and the off-screen animations in that group can be used to convey the current battery status of the mobile phone to the user.
  • In the screen-off state, the user can intuitively and effectively know the current battery status of the mobile phone through the animation content in the off-screen animation group, which enhances the timeliness and interest of the content displayed on the off screen.
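The selection logic walked through above (the charging state takes priority, then the battery gear chooses among the remaining groups) can be summarized in one sketch. The group names are illustrative placeholders; only the priority order and the gear thresholds come from the text.

```python
def select_off_screen_animation_group(charging: bool, battery: int) -> str:
    """Choose an off-screen animation group from the charging state and
    the battery level, following the walkthrough above."""
    if charging:
        return "group_1_polar_bear_swimming"   # charging, any battery level
    if battery < 20:
        return "group_2_polar_bear_lying"      # gear 1: low battery
    if battery < 60:
        return "group_3_polar_bear_meteor"     # gear 2: medium battery
    return "group_4_polar_bear_meteor_lively"  # gear 3: high battery
```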
  • The above description takes an off-screen animation group including two off-screen animations as an example. It can be understood that any off-screen animation group may include two or more off-screen animations, which is not limited in the embodiments of the present application.
  • After the mobile phone enters the screen-off state, the mobile phone can display the corresponding off-screen animation group in the first area 1001 of the screen.
  • After the mobile phone enters the screen-off state, the mobile phone can also display the current time, the date, or an icon of a notification message in the second area 1002.
  • Of course, the mobile phone can also display a fingerprint identification area and the like when the screen is off, which is not limited in this embodiment of the present application.
  • The user can set the off-screen display function of the mobile phone to various display modes, such as all-day display, timed display, or touch display.
  • All-day display means that whenever the mobile phone enters the screen-off state throughout the day, the above off-screen animation and other content are displayed.
  • Timed display means that if the mobile phone enters the screen-off state within the time range set by the user, the above off-screen animation and other content are displayed.
  • Touch display means that after the mobile phone enters the screen-off state, the time for displaying the above-mentioned screen-on-screen animation and other content is fixed each time.
  • the mobile phone can stop displaying the above-mentioned screen-off animation and other contents and enter the black screen state; Correspondingly, if the user's touch operation is detected in the black screen state, the mobile phone can display the above-mentioned content such as the screen-on-screen animation again.
  • the mobile phone in a black screen state, the mobile phone usually does not display any content on the display screen, so that the entire display area of the display screen appears black.
  • when the display mode of the off-screen display function is all-day display, as shown in Figure 11, after the mobile phone enters the screen-off state it can play the first off-screen animation in the off-screen animation group corresponding to the current charging status and battery information (for example, off-screen animation group A), that is, off-screen animation 1.
  • when off-screen animation 1 ends, the mobile phone can freeze on its last frame.
  • freezing on the last frame of off-screen animation 1 means continuously displaying that last frame.
  • for example, off-screen animation 1 is displayed in display area 1 of the display screen. The mobile phone can refresh the display content at a certain frame rate, and on each refresh the content of display area 1 is the last frame of off-screen animation 1. Subsequently, the mobile phone can monitor whether the user inputs a touch operation; if no touch operation is detected, the mobile phone can stay on the last frame of off-screen animation 1.
  • if a touch operation input by the user is detected, the mobile phone can play the second off-screen animation in off-screen animation group A, that is, off-screen animation 2.
  • when off-screen animation 2 ends, the mobile phone can freeze on its last frame.
  • note that off-screen animation group A includes only off-screen animation 1 and off-screen animation 2.
  • if, during this process, the mobile phone detects that its charging status has changed, or that its battery level has moved to a different gear, the phone can still play off-screen animation 2 or stay on the last frame of off-screen animation 1, instead of jumping to the off-screen animation group corresponding to the new battery gear (or charging state), thereby avoiding the jarring effect of a jump to a new off-screen animation group whose frames cannot be smoothly connected.
  • after the mobile phone freezes on the last frame of off-screen animation 2, still as shown in Figure 11, if no touch operation input by the user is detected, the mobile phone can stay on the last frame of off-screen animation 2. Correspondingly, if a touch operation input by the user is detected, the mobile phone can obtain the latest charging status and battery information and then determine whether its current charging status or battery gear has changed. If neither the current charging status nor the battery gear has changed, the mobile phone can replay off-screen animation 1 in off-screen animation group A. Correspondingly, if the current charging status or battery gear has changed, the mobile phone can play the first off-screen animation in the corresponding off-screen animation group (e.g., off-screen animation group B).
  • the first frame of the first off-screen animation in off-screen animation group B may be the same as or similar to the last frame of the last off-screen animation in off-screen animation group A.
  • in this way, a seamless visual experience can be provided to the user.
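The touch-driven flow just described can be modeled as a small state machine: each touch advances to the next animation in the current group, and once the group is exhausted, the next touch re-samples the (charging status, battery gear) state and either replays the group or jumps to the group for the new state. This is a minimal sketch under assumed names, not the patent's implementation; note that, as in the text, the state is only re-checked once the group has finished.

```python
# Minimal sketch of the Figure-11 style flow. All group and animation
# names are illustrative assumptions.

class OffScreenPlayer:
    def __init__(self, groups, state):
        self.groups = groups           # state -> list of animation names
        self.state = state             # e.g. ("discharging", "low")
        self.index = 0                 # position within the current group
        self.playing = groups[state][0]

    def on_touch(self, current_state):
        group = self.groups[self.state]
        if self.index + 1 < len(group):
            # Group not exhausted: play the next animation in order,
            # ignoring any state change until the group finishes.
            self.index += 1
        elif current_state == self.state:
            # Group exhausted, state unchanged: replay from the start.
            self.index = 0
        else:
            # Group exhausted, state changed: switch to the new group.
            self.state = current_state
            self.index = 0
        self.playing = self.groups[self.state][self.index]
        return self.playing

groups = {
    ("discharging", "low"): ["anim1", "anim2"],
    ("charging", "high"): ["anim3", "anim4"],
}
player = OffScreenPlayer(groups, ("discharging", "low"))
print(player.on_touch(("discharging", "low")))  # anim2
print(player.on_touch(("charging", "high")))    # anim3
```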
  • in some embodiments, the mobile phone may also play a corresponding transition off-screen animation between two off-screen animation groups.
  • the first frame of the transition off-screen animation can be the same as or similar to the last frame of the last off-screen animation in off-screen animation group A, and the last frame of the transition off-screen animation can be the same as or similar to the first frame of the first off-screen animation in off-screen animation group B.
  • in other embodiments, after the mobile phone enters the screen-off state, it can first display a preset entrance off-screen animation and then display the off-screen animation group corresponding to the current charging status and battery gear.
  • as shown in Figure 12, different from the off-screen display process shown in Figure 11, after the mobile phone enters the screen-off state it can first display the entrance off-screen animation corresponding to the above off-screen animation group A.
  • for example, the entrance off-screen animation 1301 may show a polar bear walking on the ice.
  • the last frame of the entrance off-screen animation 1301 may be the same as or similar to the first frame of off-screen animation 1 in off-screen animation group A, which is displayed subsequently.
  • if the mobile phone detects a touch operation input by the user, the mobile phone can display the first off-screen animation (that is, off-screen animation 1) of the off-screen animation group A corresponding to the current charging status and battery information.
  • the subsequent off-screen display process on the mobile phone is similar to the off-screen display process in Figure 11 that has no entrance off-screen animation, so it is not repeated here.
  • when the display mode of the off-screen display function is timed display, the method for displaying the off-screen animation group after the phone's screen goes off is similar to the method above.
  • the difference is that, as shown in Figure 14, when the mobile phone enters the screen-off state it can determine whether the current time is within the time range set by the user. If it is within the time range set by the user, the mobile phone can perform the off-screen display according to the related method shown in Figure 11.
  • alternatively, as shown in Figure 15, when the mobile phone enters the screen-off state it can determine whether the current time is within the time range set by the user; if it is, the mobile phone can first display the corresponding entrance off-screen animation according to the method shown in FIG.
  • if the current time is not within the time range set by the user, the mobile phone may enter the black-screen state after the screen goes off, according to the existing method.
  • when the display mode of the off-screen display function is touch display, as shown in Figure 16, after the mobile phone enters the screen-off state it can play off-screen animation 1 in off-screen animation group A according to the current charging status and battery information.
  • when off-screen animation 1 ends, the mobile phone can freeze on its last frame. Further, the mobile phone can detect whether a touch operation input by the user is received within a preset time (for example, 10 s).
  • if a touch operation input by the user is detected, the mobile phone can play off-screen animation 2 in off-screen animation group A.
  • when off-screen animation 2 ends, the mobile phone can freeze on its last frame.
  • the mobile phone can then detect whether a touch operation input by the user arrives within a preset time (for example, 10 s). If no touch operation is detected, the mobile phone may enter the black-screen state after the screen goes off, according to the existing method.
  • if a touch operation input by the user is detected, the mobile phone can obtain the latest charging status and battery information and then determine whether its current charging status or battery gear has changed. If neither the current charging status nor the battery gear has changed, the mobile phone can replay off-screen animation 1 in off-screen animation group A. Correspondingly, if the current charging status or battery gear has changed, the mobile phone can play the first off-screen animation in the corresponding off-screen animation group (e.g., off-screen animation group B).
  • similarly, if no touch operation is detected within the preset time, the mobile phone can enter the black-screen state after the screen goes off, according to the existing method. Subsequently, if the mobile phone detects a touch operation input by the user in the black-screen state, then, similar to the above method, the mobile phone can obtain the latest charging status and battery information and determine whether its current charging status or battery gear has changed. If neither has changed, the mobile phone can replay off-screen animation 1 in off-screen animation group A; correspondingly, if the current charging status or battery gear has changed, the mobile phone can play the first off-screen animation in the corresponding off-screen animation group (e.g., off-screen animation group B).
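In touch-display mode the extra ingredient is a timeout: after an animation freezes on its last frame, a touch within the preset window keeps the off-screen display alive, while silence drops the phone into the black-screen state. A sketch of that timeout decision follows; the 10 s window mirrors the "for example, 10 s" in the text, and the function and state names are assumptions.

```python
# Sketch of the touch-display timeout decision: after freezing on an
# animation's last frame, a touch within the window advances playback;
# otherwise the phone goes black. Names are illustrative assumptions.
from typing import Optional

TIMEOUT_S = 10.0  # "for example, 10s" in the text

def next_display_state(frozen_at: float, touch_at: Optional[float]) -> str:
    """Return 'advance' if a touch arrived within the window, else 'black-screen'."""
    if touch_at is not None and touch_at - frozen_at <= TIMEOUT_S:
        return "advance"
    return "black-screen"

print(next_display_state(0.0, 4.2))   # advance
print(next_display_state(0.0, None))  # black-screen
```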
  • in other embodiments, after the mobile phone enters the screen-off state, it can first display the entrance off-screen animation and then display the off-screen animation group corresponding to the current charging status and battery information.
  • the entrance off-screen animation corresponding to off-screen animation group A can be displayed first; for example, the entrance off-screen animation 1301 shown in Figure 13 is displayed.
  • when the entrance off-screen animation ends, the mobile phone can stay on its last frame.
  • the mobile phone can then detect whether a touch operation input by the user is received within a preset time.
  • if a touch operation is detected, the mobile phone can play off-screen animation 1 in the off-screen animation group A corresponding to the current charging status and battery information, and freeze on the last frame of off-screen animation 1.
  • if a further touch operation is detected, the mobile phone can play off-screen animation 2 in off-screen animation group A.
  • when off-screen animation 2 ends, the mobile phone can freeze on its last frame.
  • the mobile phone can then detect whether a touch operation input by the user arrives within a preset time.
  • if no touch operation input by the user is detected, the mobile phone may enter the black-screen state after the screen goes off, according to the existing method. If a touch operation input by the user is detected, the mobile phone can obtain the latest charging status and battery information and then determine whether its current charging status or battery gear has changed. If neither the current charging status nor the battery gear has changed, the mobile phone can replay off-screen animation 1 in off-screen animation group A. Correspondingly, if the current charging status or battery gear has changed, the mobile phone can play the first off-screen animation in the corresponding off-screen animation group (e.g., off-screen animation group B).
  • similarly, when the entrance off-screen animation ends, if no touch operation input by the user is detected within a preset time, the mobile phone can enter the black-screen state.
  • likewise, if no touch operation is detected within a preset time at the other stages above, the mobile phone may also enter the black-screen state.
  • after the mobile phone enters the black-screen state, as shown in Figure 17, if a touch operation input by the user is detected, the mobile phone can obtain the latest charging status and battery information and then play the corresponding entrance off-screen animation according to the current charging status or battery gear. For example, if neither the current charging status nor the battery gear has changed, the mobile phone can replay the entrance off-screen animation corresponding to off-screen animation group A (for example, entrance off-screen animation 1). Correspondingly, if the current charging status or battery gear has changed, for example, if they now correspond to off-screen animation group B, the mobile phone can play the entrance off-screen animation corresponding to off-screen animation group B (for example, entrance off-screen animation 2).
  • in this way, in the screen-off state the mobile phone can select the corresponding off-screen animation group to display based on the latest charging status and battery information.
  • the latest charging status and battery information can thus be conveyed to the user more intuitively and vividly, enhancing the real-time relevance and appeal of the off-screen display.
  • the above description takes an off-screen animation group that includes two off-screen animations as an example; when an off-screen animation group includes more off-screen animations (for example, off-screen animation 3), the mobile phone can also display them according to the above method.
  • the touch operation involved in the above embodiments may be a click operation or a double-click operation, or an operation such as a knuckle tap or a stylus tap, which is not limited in the embodiments of this application.
  • in other embodiments, after the mobile phone enters the screen-off state, it can not only respond to touch operations input by the user but also capture the user's face image or eye image. For example, after entering the screen-off state, the mobile phone can invoke the camera to start capturing images. When the user's face image or eye image is captured, it indicates that the user's attention may be on the phone's screen, and the mobile phone can play the corresponding interactive off-screen animation. The characters or animals in an interactive off-screen animation appear to be interacting with the user in front of the screen.
  • for example, after the mobile phone enters the screen-off state, it can play off-screen animation 1 in off-screen animation group A and then freeze on the last frame of off-screen animation 1. Subsequently, if the mobile phone captures the user's face image or eye image, it can play the interactive off-screen animation 1 corresponding to off-screen animation 1.
  • the first frame of interactive off-screen animation 1 can be the same as or similar to the last frame of off-screen animation 1, and the last frame of interactive off-screen animation 1 can also be the same as or similar to the last frame of off-screen animation 1.
  • for example, interactive off-screen animation 1 can show the polar bear turning its head to look at the user, then turning back and standing on the ice.
  • similarly, the mobile phone can play off-screen animation 2 in off-screen animation group A according to the method described in the above embodiment and then freeze on the last frame of off-screen animation 2. Subsequently, similar to the above embodiment, if the mobile phone captures the user's face image or eye image, it can play the interactive off-screen animation 2 corresponding to off-screen animation 2.
  • the first frame of interactive off-screen animation 2 can be the same as or similar to the last frame of off-screen animation 2, and the last frame of interactive off-screen animation 2 can also be the same as or similar to the last frame of off-screen animation 2.
  • that is, the mobile phone can set a corresponding interactive off-screen animation for each off-screen animation in an off-screen animation group. After playing any off-screen animation in the group, the mobile phone can play the corresponding interactive off-screen animation in response to the user's face image or eye image, making the interaction between the user and the mobile phone richer and more engaging.
  • in some embodiments, after playing the corresponding interactive off-screen animation (e.g., interactive off-screen animation 1 or interactive off-screen animation 2), the mobile phone can freeze on the last frame of that interactive off-screen animation. If the mobile phone still detects the user's face image or eye image while the interactive off-screen animation is playing, or when it ends, the mobile phone can play the interactive off-screen animation again.
  • alternatively, the mobile phone may stop responding to the user's face image or eye image and not play the interactive off-screen animation again, so as to reduce power consumption. Subsequently, when the mobile phone detects the user's face image or eye image anew, it can play the interactive off-screen animation again.
  • for example, after playing the interactive off-screen animation, the mobile phone may start a timer of a certain duration (for example, 5 minutes). Before the timer expires, if the mobile phone detects the user's face image or eye image, it does not need to play the interactive off-screen animation again. Correspondingly, after the timer expires, if the mobile phone detects the user's face image or eye image, it can play the interactive off-screen animation again.
  • in some embodiments, the mobile phone can also play the corresponding interactive off-screen animation dynamically according to how long the user's face image or eye image remains in view. For example, when the user's face image or eye image is first detected, the mobile phone can play the part of interactive off-screen animation 1 in which the polar bear turns to look at the user; while the user continues to be detected, the mobile phone can display the part in which the polar bear keeps looking at the user; and when the mobile phone detects that the user's face or eyes have left, it can display the part of interactive off-screen animation 1 in which the polar bear turns back and stands on the ice.
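The 5-minute timer described above is essentially a throttle on the gaze-triggered replay: within the window, further face or eye detections are ignored; after it expires, the next detection plays the interactive animation again. A minimal sketch, with the 300 s window mirroring the "for example, 5 minutes" in the text and all other names being illustrative assumptions:

```python
# Sketch of throttling the interactive off-screen animation: a face/eye
# detection only triggers a replay if the previous trigger is at least
# `window` seconds old. Names and defaults are illustrative assumptions.

class GazeThrottle:
    def __init__(self, window: float = 300.0):
        self.window = window
        self.last_played = None   # timestamp of the last replay

    def on_face_detected(self, now: float) -> bool:
        """Return True if the interactive animation should play now."""
        if self.last_played is None or now - self.last_played >= self.window:
            self.last_played = now
            return True
        return False   # timer still running: ignore the detection

throttle = GazeThrottle()
print(throttle.on_face_detected(0.0))    # True  (first detection plays)
print(throttle.on_face_detected(120.0))  # False (timer still running)
print(throttle.on_face_detected(301.0))  # True  (timer expired)
```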
  • that is, while an off-screen animation is playing, or while the mobile phone is frozen on a frame of an off-screen animation, the mobile phone can respond to the event of detecting the user's face image (or eye image) and play the corresponding interactive off-screen animation.
  • at this time, the mobile phone does not need to respond to touch operations input by the user. That is, the event of detecting the user's face image (or eye image) may have a higher priority than the event of detecting a touch operation.
  • of course, the priority of the event of detecting a touch operation may also be set higher than the priority of the event of detecting the user's face image (or eye image), which is not limited in the embodiments of this application.
  • in addition, while playing an off-screen animation, the mobile phone may not respond to such events and may continue playing the current off-screen animation, so that interrupting the off-screen animation does not degrade the user's viewing experience.
  • the above description takes an off-screen animation group that includes two off-screen animations as an example; it can be understood that any off-screen animation group may include three or more off-screen animations.
  • for example, off-screen animation group A may include off-screen animation 1, off-screen animation 2, and off-screen animation 3.
  • after the mobile phone enters the screen-off state, off-screen animation 1 in off-screen animation group A can be played according to the current charging status and battery information.
  • when off-screen animation 1 ends, the mobile phone can freeze on its last frame.
  • subsequently, if a touch operation input by the user is detected, the mobile phone can play off-screen animation 2 in off-screen animation group A and then freeze on the last frame of off-screen animation 2.
  • subsequently, if a touch operation input by the user is detected, the mobile phone can play off-screen animation 3 in off-screen animation group A and then freeze on the last frame of off-screen animation 3.
  • subsequently, if a touch operation input by the user is detected, the mobile phone can obtain its latest charging status and battery information. If neither the current charging status nor the battery gear has changed, the mobile phone can replay off-screen animation 1 in off-screen animation group A. Correspondingly, if the current charging status or battery gear has changed, the mobile phone can play the first off-screen animation in the corresponding off-screen animation group (e.g., off-screen animation 1 in off-screen animation group B). That is to say, once all three off-screen animations in off-screen animation group A have been played, the mobile phone can display off-screen animations in other off-screen animation groups based on the latest charging status and battery information.
  • of course, in other display modes, the mobile phone can also display off-screen animation 1, off-screen animation 2, and off-screen animation 3 in off-screen animation group A with reference to the above method, which is not limited in the embodiments of this application.
  • the above embodiments take the charging status or battery information of the mobile phone as an example of the phone's usage data and explain how the off-screen animation group associated with that usage data is displayed when the phone is in the screen-off state.
  • it can be understood that the mobile phone can also dynamically display different off-screen animation groups in the screen-off state according to other usage data (such as the user's step count, whether music is playing, the weather, or the time).
  • for example, if the mobile phone is running a music APP when the screen goes off, it can display the off-screen animation group associated with the music-playing state.
  • correspondingly, if the mobile phone is not currently running the music APP, it can display the off-screen animation group associated with the not-playing state after the screen goes off. In this way, the user can vividly and intuitively learn whether the mobile phone is currently playing music from the multiple off-screen animations in the off-screen animation group.
  • alternatively, the mobile phone can combine several kinds of usage data to dynamically display different off-screen animation groups in the screen-off state.
  • for example, the mobile phone can display different off-screen animation groups based on both the current battery information and whether the music APP is running, so as to convey to the user the phone's current usage status, such as its battery level and whether music is playing.
  • in some embodiments, the display position of an off-screen animation or off-screen animation group can also be moved.
  • for example, the mobile phone can periodically move the display position of the off-screen animation or off-screen animation group along a preset motion trajectory with a period of 3 s, so as to avoid the screen burn-in caused by displaying the same or similar content at one position for a long time.
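Periodic repositioning to prevent burn-in can be as simple as cycling the animation's origin through a small set of offsets on a fixed period. The 3 s period comes from the text; the square trajectory, base position, and function names below are assumptions for illustration only.

```python
# Sketch of burn-in avoidance: every PERIOD_S seconds the display
# position of the off-screen animation moves to the next point on a
# preset trajectory. The trajectory and base position are illustrative.

PERIOD_S = 3.0
TRAJECTORY = [(0, 0), (2, 0), (2, 2), (0, 2)]   # pixel offsets

def animation_position(base_xy, elapsed_s):
    """Position of the animation after `elapsed_s` seconds on screen."""
    step = int(elapsed_s // PERIOD_S) % len(TRAJECTORY)
    dx, dy = TRAJECTORY[step]
    return (base_xy[0] + dx, base_xy[1] + dy)

print(animation_position((100, 400), 0.0))   # (100, 400)
print(animation_position((100, 400), 7.0))   # (102, 402)
```

Because the trajectory wraps around, the content never rests at one position for more than one period, which is the property the text relies on to avoid burn-in.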
  • the above embodiments use a mobile phone as an example of the device performing the off-screen display.
  • it can be understood that the above off-screen display method can also be applied to electronic devices such as vehicle-mounted devices, tablet computers, and watches that have the off-screen display function.
  • the embodiments of this application do not impose any limitation on this.
  • an embodiment of this application discloses an electronic device, and the electronic device may be the above-mentioned mobile phone.
  • the electronic device may specifically include: a touchscreen 2101 that includes a touch sensor 2106 and a display screen 2107; one or more processors 2102; a memory 2103; one or more application programs (not shown); and one or more computer programs 2104. The above devices may be connected through one or more communication buses 2105.
  • the one or more computer programs 2104 are stored in the memory 2103 and configured to be executed by the one or more processors 2102. The one or more computer programs 2104 include instructions, which can be used to execute the relevant steps in the above embodiments.
  • each functional unit in the embodiments of this application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software functional unit.
  • if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium.
  • the computer-readable storage medium includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or some of the steps of the methods described in the embodiments of this application.
  • the aforementioned storage medium includes media that can store program code, such as a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.

Abstract

An off-screen display method and an electronic device, relating to the field of terminal technologies. Off-screen display can be performed in combination with the usage status of the electronic device, so that in the screen-off state the battery status of the electronic device is conveyed to the user more vividly and effectively, enhancing the real-time relevance and appeal of the off-screen display content. The method includes: the electronic device receives a screen-off event; in response to the screen-off event, the electronic device can enter the screen-off state; in the screen-off state, the electronic device can display a first off-screen animation in a first off-screen animation group according to the charging status and battery information of the electronic device; and when the first off-screen animation ends, if the electronic device detects a first touch operation input by the user, the electronic device can display a second off-screen animation in the first off-screen animation group in response to the first touch operation.

Description

Off-screen display method and electronic device
This application claims priority to Chinese Patent Application No. 202110326182.1, filed with the China National Intellectual Property Administration on March 26, 2021 and entitled "Off-screen Display Method and Electronic Device", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of electronic devices, and in particular to an off-screen display method and an electronic device.
Background
The AOD (Always On Display) function, also called the off-screen, screen-off, or dimmed-screen display function, refers to the capability of an electronic device to display content such as the time, incoming-call information, and push messages in part of the screen without lighting up the entire screen.
Taking a mobile phone as an example of the electronic device: after the off-screen display function is enabled and the phone's screen goes off, as shown in Figure 1, the phone can display the time, date, and other content in area 101 of the screen. In this way, the user can obtain information such as the time and date even when the phone's screen is off.
In some scenarios, after the phone's screen goes off, still as shown in Figure 1, the phone can also display a preset off-screen animation in area 102 of the screen. For example, the off-screen animation may be a 3-second video. When the off-screen animation finishes, the phone can freeze on its last frame. Although this off-screen display method improves the viewing experience to some extent, the display manner is still rather monotonous, so the off-screen display content has low real-time relevance and little appeal.
Summary
This application provides an off-screen display method and an electronic device, which can perform off-screen display in combination with the usage status of the electronic device, thereby conveying the usage status of the electronic device to the user more vividly and effectively in the screen-off state and enhancing the real-time relevance and appeal of the off-screen display content.
To achieve the above objective, this application adopts the following technical solutions:
According to a first aspect, this application provides an off-screen display method, including: an electronic device receives a screen-off event; in response to the screen-off event, the electronic device can enter the screen-off state; in the screen-off state, the electronic device can display a first off-screen animation in a first off-screen animation group according to its charging status and battery information, where the first off-screen animation group can include multiple off-screen animations, and the off-screen animations can belong to the same theme; when the first off-screen animation ends, if the electronic device detects a first touch operation input by the user, the electronic device can display a second off-screen animation in the first off-screen animation group in response to the first touch operation.
In this way, through the multiple off-screen animations in an off-screen animation group, the electronic device can convey the latest charging status and battery information to the user completely and vividly. This not only makes the interaction between the phone and the user more engaging, but also effectively presents the latest battery status of the electronic device to the user in an intuitive and vivid manner, improving the user experience.
In a possible implementation, when the first off-screen animation ends, the method further includes: the electronic device can freeze on the last frame of the first off-screen animation; similarly, after the electronic device displays the second off-screen animation in the first off-screen animation group, the method further includes: the electronic device freezes on the last frame of the second off-screen animation.
In a possible implementation, after the electronic device freezes on the last frame of the second off-screen animation, the method further includes: the electronic device detects a second touch operation input by the user; in response to the second touch operation, the electronic device can obtain the latest charging status and battery information; when the charging status of the electronic device has changed, or when the battery gear of the electronic device has changed, the electronic device can play a third off-screen animation in the corresponding second off-screen animation group; when neither the charging status nor the battery gear of the electronic device has changed, the electronic device can replay the first off-screen animation. In this way, after all the off-screen animations in the first off-screen animation group have been played, the electronic device can respond to a touch operation input by the user and perform off-screen display based on the latest battery status.
In a possible implementation, the electronic device detecting the first touch operation input by the user includes: the electronic device detects the first touch operation within a preset time (for example, 10 s) after the first off-screen animation ends. Otherwise, the electronic device can enter the black-screen state to reduce its power consumption.
In a possible implementation, after the electronic device displays the second off-screen animation in the first off-screen animation group, the method further includes: if a third touch operation input by the user is detected within a preset time after the second off-screen animation ends, the electronic device can obtain its charging status and battery information; when the charging status of the electronic device has changed, or when the battery gear of the electronic device has changed, the electronic device can play a third off-screen animation in the corresponding second off-screen animation group; when neither the charging status nor the battery gear has changed, the electronic device can replay the first off-screen animation.
Correspondingly, if no third touch operation input by the user is detected within the preset time, the electronic device enters the black-screen state to reduce its power consumption.
In a possible implementation, after the electronic device enters the black-screen state, the method further includes: if a fourth touch operation input by the user is detected within a preset time, the electronic device can obtain its charging status and battery information; when the charging status has changed, or when the battery gear has changed, the electronic device can play a third off-screen animation in the corresponding second off-screen animation group; when neither has changed, the electronic device can replay the first off-screen animation. That is, even after the screen goes black, the electronic device can respond to a touch operation input by the user and perform off-screen display based on the latest battery status.
In a possible implementation, after the electronic device enters the screen-off state and before it displays the first off-screen animation in the first off-screen animation group according to its charging status and battery information, the method further includes: the electronic device displays a preset entrance off-screen animation, where the last frame of the entrance off-screen animation is the same as the first frame of the first off-screen animation.
In a possible implementation, the last frame of the first off-screen animation is the same as the first frame of the second off-screen animation; or the animation subject in the first off-screen animation is the same as the animation subject in the second off-screen animation; or the event performed by the animation subject in the first off-screen animation is the same as the event performed by the animation subject in the second off-screen animation. In this way, when the first off-screen animation finishes, it can transition smoothly to the second off-screen animation, improving the user's viewing experience.
In a possible implementation, after the electronic device displays the first off-screen animation in the first off-screen animation group, the method further includes: when the electronic device captures the user's face image or eye image, it can play a first interactive off-screen animation, whose first frame is the same as the last frame of the first off-screen animation; similarly, after the electronic device displays the second off-screen animation in the first off-screen animation group, the method further includes: when the electronic device captures the user's face image or eye image, it can play a second interactive off-screen animation, whose first frame is the same as the last frame of the second off-screen animation; the animation subjects in both the first and second interactive off-screen animations appear to be interacting with the user.
In a possible implementation, the first off-screen animation group may further include a third off-screen animation; after the electronic device displays the second off-screen animation in the first off-screen animation group, the method further includes: when the second off-screen animation ends, if the electronic device detects a fifth touch operation input by the user, the electronic device can display the third off-screen animation in the first off-screen animation group in response to the fifth touch operation, so that the multiple off-screen animations in the group depict more completely and vividly the whole process of the animation subject performing a certain event.
In a possible implementation, while playing the first or second off-screen animation, the electronic device may not respond to touch operations input by the user, so that interrupting the playing off-screen animation does not degrade the user's viewing experience.
According to a second aspect, this application provides an electronic device, including: a touchscreen that includes a touch sensor and a display; one or more processors; and a memory, where the memory stores one or more computer programs including instructions that, when executed by the electronic device, cause the electronic device to perform the following steps: receiving a screen-off event; entering the screen-off state in response to the screen-off event; displaying a first off-screen animation in a first off-screen animation group according to the charging status and battery information of the electronic device, where the first off-screen animation group can include multiple off-screen animations; and, after the first off-screen animation ends, if a first touch operation input by the user is detected, displaying a second off-screen animation in the first off-screen animation group in response to the first touch operation.
In a possible implementation, after the first off-screen animation ends, the electronic device is further configured to: freeze on the last frame of the first off-screen animation; after the electronic device displays the second off-screen animation in the first off-screen animation group, the electronic device is further configured to: freeze on the last frame of the second off-screen animation.
In a possible implementation, after the electronic device freezes on the last frame of the second off-screen animation, the electronic device is further configured to: detect a second touch operation input by the user; in response to the second touch operation, obtain the charging status and battery information of the electronic device; when the charging status has changed, or when the battery gear has changed, play a third off-screen animation in the corresponding second off-screen animation group; and when neither the charging status nor the battery gear has changed, replay the first off-screen animation.
In a possible implementation, the electronic device detecting the first touch operation input by the user includes: the electronic device detects the first touch operation within a preset time after the first off-screen animation ends.
In a possible implementation, after the electronic device displays the second off-screen animation in the first off-screen animation group, the electronic device is further configured to: if a third touch operation input by the user is detected within a preset time after the second off-screen animation ends, obtain the charging status and battery information of the electronic device; when the charging status has changed, or when the battery gear has changed, play a third off-screen animation in the corresponding second off-screen animation group; and when neither has changed, replay the first off-screen animation.
In a possible implementation, if no first touch operation input by the user is detected within the preset time, the electronic device enters the black-screen state; if no third touch operation input by the user is detected within the preset time, the electronic device enters the black-screen state.
In a possible implementation, after the electronic device enters the black-screen state, the electronic device is further configured to: if a fourth touch operation input by the user is detected within a preset time, obtain the charging status and battery information of the electronic device; when the charging status has changed, or when the battery gear has changed, play a third off-screen animation in the corresponding second off-screen animation group; and when neither the charging status nor the battery gear has changed, replay the first off-screen animation.
In a possible implementation, after the electronic device enters the screen-off state and before it displays the first off-screen animation in the first off-screen animation group according to its charging status and battery information, the electronic device is further configured to: display a preset entrance off-screen animation, where the last frame of the entrance off-screen animation is the same as the first frame of the first off-screen animation.
In a possible implementation, the last frame of the first off-screen animation is the same as the first frame of the second off-screen animation; or the animation subject in the first off-screen animation is the same as the animation subject in the second off-screen animation; or the event performed by the animation subject in the first off-screen animation is the same as the event performed by the animation subject in the second off-screen animation.
In a possible implementation, after the electronic device displays the first off-screen animation in the first off-screen animation group, the electronic device is further configured to: when capturing the user's face image or eye image, play a first interactive off-screen animation whose first frame is the same as the last frame of the first off-screen animation; after the electronic device displays the second off-screen animation in the first off-screen animation group, the electronic device is further configured to: when capturing the user's face image or eye image, play a second interactive off-screen animation whose first frame is the same as the last frame of the second off-screen animation; the animation subjects in both the first and second interactive off-screen animations appear to be interacting with the user.
In a possible implementation, the first off-screen animation group further includes a third off-screen animation; after the electronic device displays the second off-screen animation in the first off-screen animation group, the electronic device is further configured to: after the first off-screen animation ends, detect a fifth touch operation input by the user; and display the third off-screen animation in the first off-screen animation group in response to the fifth touch operation.
In a possible implementation, while playing the first or second off-screen animation, the electronic device may not respond to touch operations input by the user.
According to a third aspect, this application provides an electronic device, including a memory, a display, and one or more processors, where the memory and the display are coupled to the processors. The memory is configured to store computer program code, and the computer program code includes computer instructions; when the electronic device runs, the processors execute the one or more computer instructions stored in the memory, so that the electronic device performs the off-screen display method according to any one of the implementations of the first aspect.
According to a fourth aspect, this application provides a computer storage medium including computer instructions that, when run on an electronic device, cause the electronic device to perform the off-screen display method according to any one of the implementations of the first aspect.
According to a fifth aspect, this application provides a computer program product that, when run on an electronic device, causes the electronic device to perform the off-screen display method according to any one of the implementations of the first aspect.
It can be understood that the electronic devices of the second and third aspects, the computer storage medium of the fourth aspect, and the computer program product of the fifth aspect are all used to perform the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, and details are not repeated here.
Brief Description of Drawings
Figure 1 is a schematic diagram of an application scenario of an off-screen display function in the prior art;
Figure 2 is a first schematic structural diagram of an electronic device according to an embodiment of this application;
Figure 3 is a first schematic flowchart of an off-screen display method according to an embodiment of this application;
Figure 4 is a second schematic flowchart of an off-screen display method according to an embodiment of this application;
Figure 5 is a third schematic flowchart of an off-screen display method according to an embodiment of this application;
Figure 6 is a first schematic diagram of an application scenario of an off-screen display method according to an embodiment of this application;
Figure 7 is a second schematic diagram of an application scenario of an off-screen display method according to an embodiment of this application;
Figure 8 is a third schematic diagram of an application scenario of an off-screen display method according to an embodiment of this application;
Figure 9 is a fourth schematic diagram of an application scenario of an off-screen display method according to an embodiment of this application;
Figure 10 is a fifth schematic diagram of an application scenario of an off-screen display method according to an embodiment of this application;
Figure 11 is a fourth schematic flowchart of an off-screen display method according to an embodiment of this application;
Figure 12 is a fifth schematic flowchart of an off-screen display method according to an embodiment of this application;
Figure 13 is a sixth schematic diagram of an application scenario of an off-screen display method according to an embodiment of this application;
Figure 14 is a sixth schematic flowchart of an off-screen display method according to an embodiment of this application;
Figure 15 is a seventh schematic flowchart of an off-screen display method according to an embodiment of this application;
Figure 16 is an eighth schematic flowchart of an off-screen display method according to an embodiment of this application;
Figure 17 is a ninth schematic flowchart of an off-screen display method according to an embodiment of this application;
Figure 18 is a tenth schematic flowchart of an off-screen display method according to an embodiment of this application;
Figure 19 is a seventh schematic diagram of an application scenario of an off-screen display method according to an embodiment of this application;
Figure 20 is an eleventh schematic flowchart of an off-screen display method according to an embodiment of this application;
Figure 21 is a second schematic structural diagram of an electronic device according to an embodiment of this application.
具体实施方式
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
下面将结合附图对本实施例的实施方式进行详细描述。
示例性的,本申请实施例提供的一种息屏显示方法可应用于手机、车载设备(也可称为车机)、平板电脑、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、手持计算机、上网本、个人数字助理(personal digital assistant,PDA)、可穿戴电子设备、虚拟现实设备等具有息屏显示功能的电子设备,本申请实施例对此不做任何限制。
以手机为上述电子设备举例,如图2所示,为手机的一种结构示意图。
其中,手机可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180等。
可以理解的是,本发明实施例示意的结构并不构成对手机的具体限定。在本申请另一些实施例中,手机可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit, GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
手机的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。手机中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在手机上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
无线通信模块160可以提供应用在手机上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中，手机的天线1和移动通信模块150耦合，天线2和无线通信模块160耦合，使得手机可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM)，通用分组无线服务(general packet radio service,GPRS)，码分多址接入(code division multiple access,CDMA)，宽带码分多址(wideband code division multiple access,WCDMA)，时分码分多址(time-division code division multiple access,TD-SCDMA)，长期演进(long term evolution,LTE)，BT，GNSS，WLAN，NFC，FM，和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS)，全球导航卫星系统(global navigation satellite system,GLONASS)，北斗卫星导航系统(beidou navigation satellite system,BDS)，准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
手机通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像，视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD)，有机发光二极管(organic light-emitting diode,OLED)，有源矩阵有机发光二极管或主动矩阵有机发光二极管(active-matrix organic light emitting diode,AMOLED)，柔性发光二极管(flex light-emitting diode,FLED)，MiniLED，MicroLED，Micro-OLED，量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中，手机可以包括1个或N个显示屏194，N为大于1的正整数。
以OLED显示屏举例,一块OLED显示屏中可以包含阵列排布的多个OLED像素单元。如图2所示,每个OLED像素单元中包含阴极,阳极,以及位于阴极和阳极之间的电子传输层、空穴传输层以及发光层。其中,阴极可以为金属电极,阳极可以为ITO(indium tin oxide,氧化铟锡)透明电极。
向阴极和阳极输入驱动电压V后,在驱动电压V的作用下,电子从阴极传输到电子传输层,空穴从阳极注入到空穴传输层,二者在发光层相遇后产生激子,使得发光层中的发光分子激发,经过辐射后产生光源。当驱动电压V不同时,可激发对应的OLED像素单元呈现不同的颜色和亮度。这样,OLED显示屏中的各个OLED像素单元可在不同的驱动电压下显示出对应的画面。
其中,电子传输层、空穴传输层以及发光层中的有机材料在使用过程中会逐渐老化。而OLED显示屏中出现的残影现象,其实就是某个固定位置的OLED像素单元长时间总显示相同且静止的图像,导致这部分像素单元中的有机材料比其他位置更加耗损,发光效率衰减得更快,从而在OLED显示屏上留下了残影。
手机可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,手机可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当手机在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。手机可以支持一种或多种视频编解码器。这样,手机可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展手机的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行手机的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储手机使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
手机可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。手机可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当手机接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C，也称“话筒”，“传声器”，用于将声音信号转换为电信号。当拨打电话或发送语音信息时，用户可以通过人嘴靠近麦克风170C发声，将声音信号输入到麦克风170C。手机可以设置至少一个麦克风170C。在另一些实施例中，手机可以设置两个麦克风170C，除了采集声音信号，还可以实现降噪功能。在另一些实施例中，手机还可以设置三个，四个或更多麦克风170C，实现采集声音信号，降噪，还可以识别声音来源，实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
传感器模块180中可以包括压力传感器,陀螺仪传感器,气压传感器,磁传感器,加速度传感器,距离传感器,接近光传感器,指纹传感器,温度传感器,触摸传感器,环境光传感器,骨传导传感器等。
当然,手机还可以包括充电管理模块、电源管理模块、电池、按键、指示器以及1个或多个SIM卡接口等,本申请实施例对此不做任何限制。
仍以手机为上述电子设备举例,如图3所示,手机可使用上述传感器模块中的各个传感器采集对应的传感器数据。例如,手机可使用GPS装置获取用户的位置。又例如,手机可使用加速度传感器和重力传感器获取用户的步数。又例如,手机可使用温度传感器获取用户所处环境的温度等。
仍如图3所示,手机中还可以安装天气、日历、音乐或健康等应用(application,APP)。手机也可以从这些应用中获取对应的应用数据。例如,手机通过设置APP中的电池功能可以获取手机的电量信息。又例如,手机通过音乐APP可以获取到手机是否正在播放音乐。又例如,手机通过健康APP可以获取用户的步数、心率等健康数据。
在本申请实施例中,可将上述传感器数据和/或应用数据称为手机的使用数据,使用数据可用于反映手机的使用状态。仍如图3所示,当手机息屏(也可称为熄屏、灭屏、锁屏或黑屏)时,手机可根据最新的使用数据进行息屏显示,使手机在息屏状态下显示出的息屏内容与手机最新的使用状态相关联。
例如,当手机最新的使用数据指示手机正在播放音乐时,手机可在息屏后显示小熊跳舞的息屏动画1。又例如,当手机最新的使用数据指示用户的步数大于预设值时,手机可在息屏后显示小熊健身的息屏动画2。又例如,当手机最新的使用数据指示手机的电量小于预设值时,手机可在息屏后显示小熊趴下的息屏动画3。这样一来,通过显示与手机的使用数据关联的息屏动画,手机在息屏状态下也能够向用户传递手机最新的使用状态,不仅增加了手机与用户互动的趣味,还可以有效将手机最新的使用情况(例如电量高低、是否播放音乐等)直观、生动的展示给用户,提高用户的使用体验。
当然,在息屏状态下,手机也可以通过显示与手机的使用数据关联的文字或图片等其他形式向用户传递手机的使用状态,本申请实施例对此不做任何限制。
以下,将结合附图对本申请实施例提供的一种息屏显示方法进行具体介绍。以下实施例中均以手机作为具有息屏显示功能的电子设备举例说明。
示例性的，手机中可以设置电源管理单元，电源管理单元可实时监控手机的充电状态。例如，上述电源管理单元可以为手机中的电源管理服务(PowerManagerService,PMS)。当手机接入USB或无线充电装置后，可向PMS上报充电开始事件。相应的，当手机与USB或无线充电装置断开时，可向PMS上报充电结束事件。那么，如图4所示，当PMS获取到充电开始事件，且未获取到充电结束事件时，可确定手机处于充电状态。否则，PMS可确定手机处于非充电状态。
在一些实施例中，上述电源管理单元(例如PMS)还可以实时监控手机的电量信息，进而根据手机的电量信息确定手机电量具体处于哪个档位。例如，可预先将手机电量划分为三个档位，其中，第一档位为：0%<电量≤20%，即低电量状态，第二档位为：20%<电量≤60%，即中等电量状态，第三档位为：60%<电量≤100%，即高电量状态。当然，本领域技术人员也可以根据实际需要或实际应用场景将手机电量划分为更多或更少的档位。仍如图4所示，电源管理单元确定手机处于非充电状态后，还可以根据最新获取到的电量信息确定手机电量此时处于第一档位、第二档位或第三档位。
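上述档位划分逻辑可以用一段示意性的 Python 草图表示（档位边界取自上文示例；函数名为本文假设，并非手机系统中的真实接口）：

```python
def battery_level_gear(percent):
    """根据电量百分比返回电量档位：1=低电量，2=中等电量，3=高电量。

    档位边界与上文示例一致：0%<电量≤20% 为第一档位，
    20%<电量≤60% 为第二档位，60%<电量≤100% 为第三档位。
    """
    if not 0 < percent <= 100:
        raise ValueError("电量百分比应在 (0, 100] 区间内")
    if percent <= 20:
        return 1
    if percent <= 60:
        return 2
    return 3
```

例如，电量为 55% 时该草图返回第二档位，即中等电量状态。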
在本申请实施例中,手机在进入息屏状态时可以按照上述方法获取手机最新的充电状态和电量信息,进而根据手机最新的充电状态和电量信息,在息屏状态下显示不同的息屏动画(或息屏动画组),从而通过不同的息屏动画(或息屏动画组)生动、直观的向用户传递手机当前的电量状态,使得用户在手机息屏时也能够直观、生动的获知手机的电量状态,并且增强息屏显示时的趣味性。
其中,息屏动画组是指一组息屏动画。即息屏动画组中可以包括多个息屏动画,每个息屏动画为一定时长的视频。例如,每个息屏动画的时长可以为3s。
示例性的,上述息屏动画组中的多个息屏动画可属于同一主题,例如,息屏动画组A中的三个息屏动画均为小熊跳舞这一主题的动画。也就是说,同一息屏动画组中各个息屏动画的动画目标(例如人物或动物)可以相同,动画目标执行的事件可以相同。通过息屏动画组中的多个息屏动画可以更加完整、生动的表现出动画目标执行某一事件(例如小熊跳舞)的全过程。
示例性的,上述息屏动画组中的多个息屏动画之间可以是连续的,例如,息屏动画组A中包括息屏动画1和息屏动画2,息屏动画1的最后一帧画面可与息屏动画2的第一帧画面相同或相似。又例如,息屏动画2的最后一帧画面可与息屏动画1的第一帧画面相同或相似,这样,在循环播放息屏动画组时息屏动画1和息屏动画2之间可以无缝衔接,不会产生画面突变等现象影响用户的使用体验。
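“末帧与首帧相同”这一无缝衔接约束可以用下面的示意性校验函数表达（此处用可比较的对象代表一帧画面，属于为说明而作的简化假设）：

```python
def is_seamless_loop(group):
    """检查息屏动画组内相邻动画（含末到首的回环）能否无缝衔接。

    group 为动画列表，每个动画是帧序列；要求前一动画的末帧
    等于后一动画的首帧，从而循环播放整组动画时不产生画面突变。
    """
    n = len(group)
    return all(group[i][-1] == group[(i + 1) % n][0] for i in range(n))
```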
在一些实施例中,如图5所示,手机可接收用户输入的息屏事件,该息屏事件可用于触发手机进入息屏状态。例如,该息屏事件可以是用户在手机亮屏时点击电源键的事件。又例如,该息屏事件可以是手机亮屏时在一定时间内没有检测到用户操作的事件。
仍如图5所示,如果手机检测到息屏事件,则手机可按照上述方法获取最新的充电状态和电量信息1。如果手机处于充电状态,则无论手机的电量是多少,手机息屏后可显示与充电状态关联的第一息屏动画组。第一息屏动画组中的多个息屏动画可向用户传递手机正在充电的状态。
例如,如图6所示,第一息屏动画组可用于展示北极熊在湖水中游泳的过程。其中,第一息屏动画组可以包括息屏动画A1和息屏动画A2。息屏动画A1和息屏动画A2可以为北极熊在湖水中游泳时的两个不同阶段。当然,第一息屏动画组也可以为猫咪在草地上睡觉的过程等。这样,通过游泳、睡觉等较为悠闲的动画可以向用户传递手机正在充电的状态,使用户可以直观、生动的获知当前手机处于充电状态。
仍如图5所示,如果手机处于非充电状态,则手机可根据最新获取到的电量信息1确定手机的电量当前处于哪一档位。例如,如果0%<电量信息1≤20%,则手机可确定手机的电量处于第一档位,即低电量状态,则手机息屏后可显示与第一档位的电量信息关联的第二息屏动画组。
例如,如图7所示,第二息屏动画组可用于展示北极熊趴在冰面上抬头仰望天空的过程。其中,第二息屏动画组可以包括息屏动画B1和息屏动画B2。息屏动画B1和息屏动画B2可以为北极熊趴在冰面上抬头仰望天空时的两个不同阶段。在第二息屏动画组中,北极熊运动的速度(例如V1)可以较慢,或者,北极熊运动的幅度(A1)可以较小,使得北极熊呈现较为疲惫的状态,从而向用户直观、生动的传递手机处于低电量状态的信息。
仍如图5所示,如果20%<电量信息1≤60%,则手机可确定手机的电量处于第二档位,即中等电量状态,则手机息屏后可显示与第二档位的电量信息关联的第三息屏动画组。
类似的,如图8所示,第三息屏动画组可用于展示北极熊在冰面上与流星或其他物体互动的过程。其中,第三息屏动画组可以包括息屏动画C1和息屏动画C2。息屏动画C1和息屏动画C2可以为北极熊在冰面上与流星互动时的两个不同阶段。在第三息屏动画组中,北极熊运动的速度(例如V2)可以快于上述第二息屏动画组中北极熊运动的速度,即V2>V1。或者,在第三息屏动画组中,北极熊运动的幅度(例如A2)可以大于上述第二息屏动画组中北极熊运动的幅度,即A2>A1。这样,相较于第二息屏动画组,第三息屏动画组中的北极熊可以呈现出较有活力的状态,从而向用户直观、生动的传递手机处于中等电量状态的信息。
仍如图5所示,如果60%<电量信息1≤100%,则手机可确定手机的电量处于第三档位,即高电量状态,则手机息屏后可显示与第三档位的电量信息关联的第四息屏动画组。
类似的,如图9所示,第四息屏动画组可用于展示北极熊在冰面上与流星或其他物体互动的过程。其中,第四息屏动画组可以包括息屏动画D1和息屏动画D2。息屏动画D1和息屏动画D2可以为北极熊在冰面上与流星互动时的两个不同阶段。在第四息屏动画组中,北极熊运动的速度(例如V3)可以快于上述第三息屏动画组中北极熊运动的速度,即V3>V2。或者,在第四息屏动画组中,北极熊运动的幅度(例如A3)可以大于上述第三息屏动画组中北极熊运动的幅度,即A3>A2。这样,相较于第三息屏动画组,第四息屏动画组中的北极熊可以呈现出较有活力的状态,从而向用户直观、生动的传递手机处于高电量状态的信息。
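图5所示的“充电状态优先、再按电量档位选组”的决策流程，可概括为如下示意性草图（组名与档位边界沿用上文示例，函数本身为本文假设）：

```python
def select_aod_group(charging, percent):
    """根据充电状态与电量信息选择息屏动画组。"""
    if charging:
        return "第一息屏动画组"   # 充电状态：如北极熊游泳等较为悠闲的动画
    if percent <= 20:
        return "第二息屏动画组"   # 低电量：动作速度慢(V1)、幅度小(A1)
    if percent <= 60:
        return "第三息屏动画组"   # 中等电量：速度 V2>V1，幅度 A2>A1
    return "第四息屏动画组"       # 高电量：速度 V3>V2，幅度 A3>A2
```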
可以看出,在手机进入息屏状态时,可以根据手机的充电状态和电量信息在息屏后显示对应的息屏动画组,通过相应息屏动画组中的息屏动画向用户传递当前手机的电量状态。这样,在息屏状态下,用户通过息屏动画组中的动画内容可以直观、有效的获知当前手机的电量状态,增强息屏显示内容的实时性和趣味性。
需要说明的是，上述实施例中是以息屏动画组中包括两个息屏动画举例说明的，可以理解的是，任意息屏动画组中可以包括两个或更多个息屏动画，本申请实施例对此不做任何限制。
示例性的,如图10所示,手机进入息屏状态后,手机可在屏幕的第一区域1001中显示对应的息屏动画组。并且,手机进入息屏状态后,手机还可以在第二区域1002中显示当前的时间、日期或通知消息的图标等。当然,手机在息屏状态下还可以显示指纹识别区域等,本申请实施例对此不做任何限制。
在一些实施例中,用户可以将手机中的息屏显示功能设置为全天显示、定时显示或触摸显示等多种显示方式。其中,全天显示是指:手机在全天内进入息屏状态时,均需要显示上述息屏动画等内容。定时显示是指:在用户设定的时间范围内,如果手机进入息屏状态,则需要显示上述息屏动画等内容。例如,如果用户设置8:00-18:00这一时间范围内开启息屏显示功能,则当手机检测到息屏事件后,如果当前时间在8:00-18:00之间,则手机息屏后可显示上述息屏动画等内容,否则手机可进入黑屏状态。触摸显示是指:手机进入息屏状态后每次显示上述息屏动画等内容的时间是固定的,如果没有检测到用户的触摸操作,则手机可停止显示上述息屏动画等内容进入黑屏状态;相应的,在黑屏状态下如果检测到用户的触摸操作,则手机可再次显示上述息屏动画等内容。其中,在黑屏状态下手机在显示屏中通常不显示任何内容,使显示屏的全部显示区域呈现出黑色。
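以定时显示为例，检测到息屏事件后是否立即显示息屏动画的判断可以写成如下草图（8:00-18:00 的时间范围沿用上文示例；“touch”分支仅表示先显示固定时长、超时无触摸再进入黑屏，细节从略）：

```python
from datetime import time

def should_show_aod(mode, now, start=time(8, 0), end=time(18, 0)):
    """判断检测到息屏事件后是否显示息屏动画等内容。"""
    if mode == "all_day":
        return True                 # 全天显示：任何时刻息屏均显示
    if mode == "scheduled":
        return start <= now <= end  # 定时显示：仅在用户设定的时间范围内显示
    if mode == "touch":
        return True                 # 触摸显示：先显示固定时长，超时则黑屏
    raise ValueError("未知的显示方式: " + mode)
```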
示例性的,当息屏显示功能的显示方式为全天显示时,如图11所示,手机进入息屏状态后可根据当前的充电状态和电量信息播放对应息屏动画组(例如息屏动画组A)中的第一个息屏动画,即息屏动画1。当息屏动画1播放结束后,手机可定格显示息屏动画1的最后一帧画面。其中,定格显示息屏动画1的最后一帧画面是指持续显示息屏动画1的最后一帧画面。例如,息屏动画1显示在显示屏中的显示区域1内。当息屏动画1播放结束后,手机可按照一定的帧率刷新显示屏中的显示内容,每次刷新时显示区域1内的显示内容均为息屏动画1的最后一帧画面。后续,手机可监测用户是否输入触摸操作。如果没有检测到用户输入触摸操作,则手机可停留在息屏动画1的最后一帧画面(即末帧画面)。
如果检测到用户输入触摸操作,仍如图11所示,则手机可播放息屏动画组A中的第二个息屏动画,即息屏动画2。当息屏动画2播放结束后,手机可定格显示息屏动画2的最后一帧画面。以息屏动画组A中仅包括息屏动画1和息屏动画2举例,在未播放完息屏动画2之前,即使手机检测到手机充电状态发生变化,或者,检测到手机的电量信息所处的档位(即电量档位)发生变化,手机仍然可以播放息屏动画2或者停留在息屏动画1的最后一帧画面,而不是播放与新的电量档位(或充电状态)对应的息屏动画组,以避免跳转至新的息屏动画组后画面无法有效衔接的现象。
在手机定格显示息屏动画2的最后一帧画面后,仍如图11所示,如果没有检测到用户输入触摸操作,则手机可停留在息屏动画2的最后一帧画面。相应的,如果检测到用户输入触摸操作,则手机可获取最新的充电状态和电量信息,进而确定当前手机的充电状态或电量档位是否发生变化。如果当前手机的充电状态和电量档位均没有发生变化,则手机可重新播放息屏动画组A中的息屏动画1。相应的,如果当前手机的充电状态或电量档位发生变化,则手机可播放对应的息屏动画组(例如息屏动画组B)中的第一个息屏动画。
例如，息屏动画组B中第一个息屏动画的首帧画面可与息屏动画组A中最后一个息屏动画的末帧画面相同或相似。这样，手机从息屏动画组A切换为息屏动画组B时可向用户提供无缝衔接的视觉体验。
或者，当息屏动画组B中第一个息屏动画的首帧画面与息屏动画组A中最后一个息屏动画的末帧画面不同时，手机还可以在播放息屏动画组B之前先播放相应的过渡息屏动画，过渡息屏动画的首帧画面可与息屏动画组A中最后一个息屏动画的末帧画面相同或相似，过渡息屏动画的末帧画面可与息屏动画组B中第一个息屏动画的首帧画面相同或相似。这样，通过过渡息屏动画可在息屏动画组A和息屏动画组B之间进行柔和过渡，避免息屏动画组B的画面出现过于突兀，提高用户的观感体验。
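是否需要插入过渡息屏动画，可以按首末帧是否衔接来决定，下面给出一个示意性草图（帧以可比较的对象表示，过渡动画列表为假设的数据结构）：

```python
def playlist_for_switch(last_frame_a, group_b, transitions):
    """返回从动画组 A 切换到动画组 B 时应播放的动画序列。

    若 B 的首个动画能与 A 的末帧衔接则直接播放 B；否则查找
    首帧等于 A 末帧、末帧等于 B 首帧的过渡动画插在最前面。
    transitions 为过渡动画列表，每个动画是帧序列。
    """
    first_frame_b = group_b[0][0]
    if first_frame_b == last_frame_a:
        return group_b
    for t in transitions:
        if t[0] == last_frame_a and t[-1] == first_frame_b:
            return [t] + group_b
    return group_b  # 找不到合适的过渡动画时直接切换
```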
仍以息屏显示功能的显示方式为全天显示举例，在另一些实施例中，手机进入息屏状态后，还可以先显示入场息屏动画，再显示与当前充电状态和电量信息对应的息屏动画组。如图12所示，与图11所示的息屏显示过程不同的是，手机进入息屏状态后可先显示与上述息屏动画组A对应的入场息屏动画。以入场息屏动画1301举例，如图13所示，入场息屏动画1301可以是北极熊走上冰面的过程。其中，入场息屏动画1301的末帧画面可与后续待显示的息屏动画组A中息屏动画1的首帧画面相同或相似。后续，仍如图12所示，如果手机检测到用户输入的触摸操作，则手机可显示与当前的充电状态和电量信息对应的息屏动画组A的第一个息屏动画(即息屏动画1)。后续手机进行息屏显示的过程与图11中没有入场息屏动画的息屏显示过程类似，故此处不再赘述。
在另一些实施例中，当息屏显示功能的显示方式为定时显示时，手机息屏后显示息屏动画组的方法与上述方法类似。不同的是，如图14所示，手机进入息屏状态时可判断当前时间是否在用户设定的时间范围内。如果在用户设定的时间范围内，则手机可按照图11所示的相关方法进行息屏显示。或者，如图15所示，手机进入息屏状态时可判断当前时间是否在用户设定的时间范围内。如果在用户设定的时间范围内，则手机可按照图12所示的方法先显示对应的入场息屏动画，再显示与当前的充电状态和电量信息对应的息屏动画组。相应的，如果当前时间不在用户设定的时间范围内，则手机可按照现有的方式息屏后进入黑屏状态。
在另一些实施例中,当息屏显示功能的显示方式为触摸显示时,如图16所示,手机进入息屏状态后可根据当前的充电状态和电量信息显示息屏动画组A中的息屏动画1。当息屏动画1播放结束后,手机可定格显示息屏动画1的最后一帧画面。进而,手机可检测在预设时间(例如10s)内是否接收到用户输入的触摸操作。
仍如图16所示,如果在预设时间内检测到用户输入的触摸操作,则手机可播放息屏动画组A中的息屏动画2。当息屏动画2播放结束后,手机可定格显示息屏动画2的最后一帧画面。类似的,当息屏动画2播放结束后,手机可检测在预设时间(例如10s)内是否检测到用户输入的触摸操作。如果没有检测到用户输入的触摸操作,则手机可按照现有的方式息屏后进入黑屏状态。如果检测到用户输入的触摸操作,则手机可获取最新的充电状态和电量信息,进而确定当前手机的充电状态或电量档位是否发生变化。如果当前手机的充电状态和电量档位均没有发生变化,则手机可重新播放息屏动画组A中的息屏动画1。相应的,如果当前手机的充电状态或电量档位发生变化,则手机可播放对应的息屏动画组(例如息屏动画组B)中的第一个息屏动画。
当息屏动画1播放结束后,如果在预设时间内没有检测到用户输入的触摸操作,仍如图16所示,手机可按照现有的方式息屏后进入黑屏状态。后续,如果手机在黑屏状态下检测到用户输入的触摸操作,则与上述方法类似的,手机可获取最新的充电状态和电量信息,进而确定当前手机的充电状态或电量档位是否发生变化。如果当前手机的充电状态和电量档位均没有发生变化,则手机可重新播放息屏动画组A中的息屏动画1。相应的,如果当前手机的充电状态或电量档位发生变化,则手机可播放对应的息屏动画组(例如息屏动画组B)中的第一个息屏动画。
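图16所描述的“播放—定格—等待触摸—超时黑屏”流程可以抽象成一个小状态机，以下为示意性 Python 草图（动画组内容、预设时间以及“充电状态或电量档位是否变化”的判定均为假设，真实实现还需读取最新的充电状态与电量信息）：

```python
class TouchAodMachine:
    """触摸显示方式下息屏动画播放的状态机示意。"""

    def __init__(self, group, timeout=10):
        self.group = group      # 息屏动画组，例如 ["息屏动画1", "息屏动画2"]
        self.index = 0          # 当前动画在组内的序号
        self.timeout = timeout  # 定格后等待触摸的预设时间（秒）
        self.state = "playing"  # playing / frozen / black

    def on_animation_end(self):
        self.state = "frozen"   # 定格显示末帧画面并开始计时

    def on_timeout(self):
        if self.state == "frozen":
            self.state = "black"  # 预设时间内无触摸则进入黑屏

    def on_touch(self, state_changed=False):
        """定格或黑屏下检测到触摸；state_changed 表示充电状态或电量档位变化。"""
        if self.state not in ("frozen", "black"):
            return None          # 播放中不响应触摸操作
        if self.state == "black" or self.index == len(self.group) - 1:
            # 黑屏唤醒或整组播完：重新获取充电状态和电量信息后选择动画
            self.index = 0
            self.state = "playing"
            return "第二息屏动画组首个动画" if state_changed else self.group[0]
        self.index += 1
        self.state = "playing"
        return self.group[self.index]
```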
仍以息屏显示功能的显示方式为触摸显示举例,在另一些实施例中,手机进入息屏状态后,还可以先显示入场息屏动画,再显示与当前充电状态和电量信息对应的息屏动画组。如图17所示,手机进入息屏状态后可先显示与上述息屏动画组A对应的入场息屏动画,例如,显示如图13所示的入场息屏动画1301。进而,当入场息屏动画播放结束后,手机可停留在入场息屏动画的最后一帧画面。并且,手机可检测在预设时间内是否接收到用户输入的触摸操作。
如果在入场息屏动画结束后的预设时间内检测到用户输入的触摸操作,仍如图17所示,则手机可播放与当前的充电状态和电量信息对应的息屏动画组A中的息屏动画1,并定格在息屏动画1的最后一帧画面。类似的,当息屏动画1播放结束后,如果在预设时间内检测到用户输入的触摸操作,则手机可播放息屏动画组A中的息屏动画2。当息屏动画2播放结束后,手机可定格显示息屏动画2的最后一帧画面。类似的,当息屏动画2播放结束后,手机可检测在预设时间内是否检测到用户输入的触摸操作。如果没有检测到用户输入的触摸操作,则手机可按照现有的方式息屏后进入黑屏状态。如果检测到用户输入的触摸操作,则手机可获取最新的充电状态和电量信息,进而确定当前手机的充电状态或电量档位是否发生变化。如果当前手机的充电状态和电量档位均没有发生变化,则手机可重新播放息屏动画组A中的息屏动画1。相应的,如果当前手机的充电状态或电量档位发生变化,则手机可播放对应的息屏动画组(例如息屏动画组B)中的第一个息屏动画。
相应的,仍如图17所示,当入场息屏动画播放结束后,如果在预设时间内没有检测到用户输入的触摸操作,则手机可进入黑屏状态。或者,当息屏动画1播放结束后,如果在预设时间内没有检测到用户输入的触摸操作,手机也可进入黑屏状态。
在触摸显示这种显示方式中,当手机进入黑屏状态后,仍如图17所示,如果检测到用户输入的触摸操作,则手机可获取最新的充电状态和电量信息,进而根据当前手机的充电状态或电量档位播放对应的入场息屏动画。例如,如果当前手机的充电状态和电量档位均没有发生变化,则手机可重新播放与息屏动画组A对应的入场息屏动画(例如入场息屏动画1)。相应的,如果当前手机的充电状态或电量档位发生变化,例如,当前手机的充电状态或电量档位与息屏动画组B对应,则手机可播放与息屏动画组B对应的入场息屏动画(例如入场息屏动画2)。
可以看出,无论息屏显示功能的显示方式为全天显示、定时显示或触摸显示,手机在息屏状态下都可以基于最新的充电状态和电量信息选择相应的息屏动画组进行显示。这样,通过息屏动画组中的多个息屏动画,可以更加直观、生动的向用户传递最新的充电状态和电量信息,增强息屏显示时的实时性和趣味性。
需要说明的是，上述实施例中是以息屏动画组中包括两个息屏动画举例说明的，可以理解的是，如果上述息屏动画组中还包括其他的息屏动画(例如息屏动画3)，则手机也可以按照上述方法进行息屏显示。
另外,上述实施例中涉及的触摸操作可以是点击操作或双击操作,也可以是指关节敲击、手写笔点击等操作,本申请实施例对此不做任何限制。
在另一些实施例中,当手机进入息屏状态后,不仅可以响应用户输入的触摸操作,还可以捕捉用户的面部图像或眼部图像。例如,当手机进入息屏状态后,手机可以调用摄像头开始采集图像。当采集到用户的面部图像或眼部图像时,说明用户此时的关注点可能在手机屏幕上,则手机可播放对应的互动息屏动画。其中,互动息屏动画中的人物或动物可呈现出与屏幕前的用户进行互动的状态。
示例性的,如图18所示,手机进入息屏状态后可以播放息屏动画组A中的息屏动画1,进而定格显示息屏动画1的最后一帧画面。后续,如果手机采集到用户的面部图像或眼部图像,则手机可播放与息屏动画1对应的互动息屏动画1。其中,互动息屏动画1的第一帧画面可与息屏动画1的最后一帧画面相同或相近,互动息屏动画1的最后一帧画面也可与息屏动画1的最后一帧画面相同或相近。
例如,如图19所示,互动息屏动画1可以为北极熊转头看向用户,再回头站立在冰面上的过程。这样,通过采集用户的面部图像或眼部图像触发手机在息屏时播放互动息屏动画1,可增强用户与手机之间的互动性和趣味性。
仍如图18所示,手机定格显示息屏动画1的最后一帧画面之后,如果检测到用户输入触摸操作,则手机可按照实施例中所述的方法播放息屏动画组A中的息屏动画2,进而定格显示息屏动画2的最后一帧画面。后续,与上述实施例类似的,如果手机采集到用户的面部图像或眼部图像,则手机可播放与息屏动画2对应的互动息屏动画2。其中,互动息屏动画2的第一帧画面可与息屏动画2的最后一帧画面相同或相近,互动息屏动画2的最后一帧画面也可与息屏动画2的最后一帧画面相同或相近。
也就是说,手机可以为息屏动画组中的每个息屏动画设置对应的互动息屏动画,手机播放完息屏动画组中的任意息屏动画后,均可响应用户输入的面部图像或眼部图像播放对应的互动息屏动画,增强用户与手机之间的互动性和趣味性。
在一些实施例中,手机每次播放完对应的互动息屏动画(例如上述互动息屏动画1或互动息屏动画2)后,可定格显示该互动息屏动画的最后一帧画面。如果在播放互动息屏动画的过程中,或在播放互动息屏动画结束时,手机仍然检测到用户的面部图像或眼部图像,手机可再次播放该互动息屏动画。或者,手机也可以不响应用户输入的面部图像或眼部图像再次播放互动息屏动画,以降低手机功耗。后续,当手机重新检测到用户输入的面部图像或眼部图像时,可再次播放该互动息屏动画。
或者,当本次互动息屏动画播放结束后,手机可启动一定时长(例如5分钟)的定时器。在定时器未超时的时间内,如果手机检测到用户输入的面部图像或眼部图像,则手机无需再次播放上述互动息屏动画。相应的,当定时器超时后,如果手机检测到用户输入的面部图像或眼部图像,则手机可再次播放上述互动息屏动画。
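这种“定时器未超时则不重复播放”的节流策略可以写成如下草图（5 分钟的时长与时钟注入方式均为示例性假设）：

```python
import time

class InteractionThrottle:
    """互动息屏动画的播放节流示意。"""

    def __init__(self, cooldown=300.0, clock=time.monotonic):
        self.cooldown = cooldown    # 定时器时长，例如 5 分钟
        self.clock = clock          # 可注入的时钟，便于测试
        self.last_played = None

    def should_play(self):
        """采集到面部或眼部图像时调用，返回是否应播放互动息屏动画。"""
        now = self.clock()
        if self.last_played is not None and now - self.last_played < self.cooldown:
            return False            # 定时器未超时，无需再次播放
        self.last_played = now
        return True
```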
又或者，手机还可以根据用户输入的面部图像或眼部图像的时长动态地播放对应的互动息屏动画。例如，当检测到用户输入的面部图像或眼部图像时，手机可以播放对应的互动息屏动画1中北极熊转头看向用户的过程；当用户持续输入面部图像或眼部图像时，手机可显示互动息屏动画1中北极熊与用户对视的过程；当手机检测到用户的面部图像或眼部图像离开后，可显示互动息屏动画1中北极熊回头站立在冰面上的过程。
示例性的,如果手机在定格显示某一息屏动画的最后一帧画面时,既检测到用户的面部图像(或眼部图像),又检测到用户输入的触摸操作,则手机可响应检测到用户的面部图像(或眼部图像)的事件,播放对应的互动息屏动画。此时,手机无需响应用户输入的触摸操作。即检测到用户的面部图像(或眼部图像)这一事件的优先级可以高于检测到触摸操作这一事件的优先级。当然,也可以设置检测到触摸操作这一事件的优先级高于检测到用户的面部图像(或眼部图像)这一事件的优先级,本申请实施例对此不做任何限制。
另外,如果手机在播放息屏动画(例如上述息屏动画1、息屏动画2或互动息屏动画)的过程中,检测到用户输入触摸操作或者检测到用户的面部图像或眼部图像,则手机可以不做响应,继续播放正在播放的息屏动画,避免中断息屏动画影响用户的观看体验。
上述实施例中是以息屏动画组中包括两个息屏动画举例说明的，可以理解的是，任意息屏动画组中可以包括三个或更多个息屏动画。
以息屏显示的方式为全天显示举例,息屏动画组A中可以包括息屏动画1、息屏动画2以及息屏动画3。如图20所示,手机息屏后可根据当前的充电状态和电量信息播放息屏动画组A中的息屏动画1。当息屏动画1播放结束后,手机可定格显示息屏动画1的最后一帧画面。进而,如果检测到用户输入的触摸操作,则手机可播放息屏动画组A中的息屏动画2,进而定格显示息屏动画2的最后一帧画面。进而,如果检测到用户输入的触摸操作,则手机可播放息屏动画组A中的息屏动画3,进而定格显示息屏动画3的最后一帧画面。
仍如图20所示,手机定格显示息屏动画3的最后一帧画面后,如果检测到用户输入的触摸操作,则手机可获取手机最新的充电状态和电量信息。如果当前手机的充电状态和电量档位均没有发生变化,则手机可重新播放息屏动画组A中的息屏动画1。相应的,如果当前手机的充电状态或电量档位发生变化,则手机可播放对应的息屏动画组中的第一个息屏动画(例如,息屏动画组B中的息屏动画1)。也就是说,当息屏动画组A中的三个息屏动画都播放结束后,手机可基于最新的充电状态和电量信息显示其他息屏动画组中的息屏动画。
当然，如果息屏显示的方式为定时显示或触摸显示，则手机也可参见上述方法显示息屏动画组A中的息屏动画1、息屏动画2以及息屏动画3，本申请实施例对此不做任何限制。
上述实施例中是以手机的充电状态或电量信息为手机的使用数据举例,阐述手机在息屏状态下如何显示与使用数据关联的息屏动画组。在另一些实施例中,手机还可以根据其他的使用数据(例如用户步数、是否播放音乐、天气或时间等),在息屏状态下动态显示不同的息屏动画组。
例如，手机检测到息屏事件后，如果手机当前在运行音乐APP，则手机息屏后可显示与播放音乐这一状态关联的息屏动画组。相应的，如果手机当前没有运行音乐APP，则手机息屏后可显示与没有播放音乐这一状态关联的息屏动画组。这样，用户通过息屏动画组中的多个息屏动画可以生动、直观的获知当前手机是否播放音乐的状态。
又例如,手机还可以结合多种手机的使用数据,在息屏状态下动态显示不同的息屏动画组。例如,手机可结合当前的电量信息和是否运行音乐APP这两种使用数据显示不同的息屏动画组,从而向用户传递手机电量以及是否播放音乐等手机当前的使用状态。
另外,手机在息屏状态下显示息屏动画或息屏动画组时,还可以移动显示息屏动画或息屏动画组的显示位置。例如,手机可以以3s为周期,按照预设的运动轨迹定期移动息屏动画或息屏动画组的显示位置,避免长期在某一位置显示相同或相似内容引起的烧屏现象。
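周期性移动显示位置的做法可以用如下草图表示（3 秒的周期沿用上文示例，轨迹坐标为假设的像素偏移量）：

```python
def aod_offset(elapsed_s, period_s=3, trajectory=((0, 0), (4, 0), (4, 4), (0, 4))):
    """按预设运动轨迹周期性返回息屏动画的显示偏移，降低烧屏风险。

    每经过 period_s 秒移动到轨迹中的下一个坐标，循环往复，
    避免长期在同一位置显示相同或相似内容。
    """
    step = int(elapsed_s // period_s) % len(trajectory)
    return trajectory[step]
```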
需要说明的是,上述实施例中是以手机进行息屏显示举例说明的,可以理解的是,上述息屏显示方法还可以应用于车载设备、平板电脑、手表等电子设备中,这些设备均可用于实现上述实施例中的息屏显示方法,本申请实施例对此不做任何限制。
如图21所示,本申请实施例公开了一种电子设备,该电子设备可以为上述手机。该电子设备具体可以包括:触摸屏2101,所述触摸屏2101包括触摸传感器2106和显示屏2107;一个或多个处理器2102;存储器2103;一个或多个应用程序(未示出);以及一个或多个计算机程序2104,上述各器件可以通过一个或多个通信总线2105连接。其中,上述一个或多个计算机程序2104被存储在上述存储器2103中并被配置为被该一个或多个处理器2102执行,该一个或多个计算机程序2104包括指令,该指令可以用于执行上述实施例中的相关步骤。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。上述描述的系统,装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请实施例各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:快闪存储器、移动硬盘、只读存储器、随机存取存储器、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述，仅为本申请实施例的具体实施方式，但本申请实施例的保护范围并不局限于此，任何在本申请实施例揭露的技术范围内的变化或替换，都应涵盖在本申请实施例的保护范围之内。因此，本申请实施例的保护范围应以所述权利要求的保护范围为准。

Claims (14)

  1. 一种息屏显示方法,其特征在于,包括:
    电子设备接收息屏事件;
    响应于所述息屏事件,所述电子设备进入息屏状态;
    所述电子设备根据所述电子设备的充电状态和电量信息显示第一息屏动画组中的第一息屏动画;
    所述第一息屏动画结束后,
    所述电子设备检测到用户的第一触摸操作;
    响应于所述第一触摸操作,所述电子设备显示所述第一息屏动画组中的第二息屏动画,
    其中,所述第一息屏动画组包括所述第一息屏动画和所述第二息屏动画。
  2. 根据权利要求1所述的方法,其特征在于,所述第一息屏动画结束后,所述方法还包括:
    所述电子设备定格显示所述第一息屏动画的末帧画面;
    在所述电子设备显示所述第一息屏动画组中的第二息屏动画之后,还包括:
    所述电子设备定格显示所述第二息屏动画的末帧画面。
  3. 根据权利要求2所述的方法,其特征在于,在所述电子设备定格显示所述第二息屏动画的末帧画面之后,还包括:
    所述电子设备检测到用户输入的第二触摸操作;
    响应于所述第二触摸操作,所述电子设备获取所述电子设备的充电状态和电量信息;
    当所述电子设备的充电状态发生变化,或者,当所述电子设备的电量档位发生变化时,所述电子设备播放对应的第二息屏动画组中的第三息屏动画;
    当所述电子设备的充电状态没有发生变化,且所述电子设备的电量档位没有发生变化时,所述电子设备播放所述第一息屏动画。
  4. 根据权利要求1或2所述的方法,其特征在于,所述电子设备检测到用户输入的第一触摸操作,包括:
    所述电子设备在所述第一息屏动画结束后的预设时间内检测到用户输入的第一触摸操作。
  5. 根据权利要求4所述的方法,其特征在于,在所述电子设备显示所述第一息屏动画组中的第二息屏动画之后,还包括:
    若在所述第二息屏动画结束后的预设时间内检测到用户输入的第三触摸操作,则所述电子设备获取所述电子设备的充电状态和电量信息;
    当所述电子设备的充电状态发生变化,或者,当所述电子设备的电量档位发生变化时,所述电子设备播放对应的第二息屏动画组中的第三息屏动画;
    当所述电子设备的充电状态没有发生变化,且所述电子设备的电量档位没有发生变化时,所述电子设备播放所述第一息屏动画。
  6. 根据权利要求4或5所述的方法,其特征在于,所述方法还包括:
    若在预设时间内没有检测到用户输入的第一触摸操作，则所述电子设备进入黑屏状态；
    若在预设时间内没有检测到用户输入的第三触摸操作,则所述电子设备进入黑屏状态。
  7. 根据权利要求6所述的方法,其特征在于,在所述电子设备进入黑屏状态之后,还包括:
    若在预设时间内检测到用户输入的第四触摸操作,则所述电子设备获取所述电子设备的充电状态和电量信息;
    当所述电子设备的充电状态发生变化,或者,当所述电子设备的电量档位发生变化时,所述电子设备播放对应的第二息屏动画组中的第三息屏动画;
    当所述电子设备的充电状态没有发生变化,且所述电子设备的电量档位没有发生变化时,所述电子设备播放所述第一息屏动画。
  8. 根据权利要求1-7中任一项所述的方法,其特征在于,在所述电子设备进入息屏状态之后,在所述电子设备根据所述电子设备的充电状态和电量信息显示第一息屏动画组中的第一息屏动画之前,还包括:
    所述电子设备显示预设的入场息屏动画,所述入场息屏动画的末帧画面与所述第一息屏动画的首帧画面相同。
  9. 根据权利要求1-8中任一项所述的方法,其特征在于,
    所述第一息屏动画的末帧画面与所述第二息屏动画的首帧画面相同;或者,
    所述第一息屏动画中的动画目标与所述第二息屏动画中的动画目标相同;或者,
    所述第一息屏动画中动画目标执行的事件与所述第二息屏动画中动画目标执行的事件相同。
  10. 根据权利要求1-9中任一项所述的方法,其特征在于,在所述电子设备显示所述第一息屏动画组中的第一息屏动画之后,还包括:
    当所述电子设备采集到用户的面部图像或眼部图像时,所述电子设备播放第一互动息屏动画,所述第一互动息屏动画的首帧画面与所述第一息屏动画的末帧画面相同;
    在所述电子设备显示所述第一息屏动画组中的第二息屏动画之后,还包括:
    当所述电子设备采集到用户的面部图像或眼部图像时,所述电子设备播放第二互动息屏动画,所述第二互动息屏动画的首帧画面与所述第二息屏动画的末帧画面相同;
    其中,所述第一互动息屏动画和所述第二互动息屏动画中的动画目标均呈现出与用户进行互动的状态。
  11. 根据权利要求1-10中任一项所述的方法,其特征在于,所述第一息屏动画组中还包括第三息屏动画;
    在所述电子设备显示所述第一息屏动画组中的第二息屏动画之后,还包括:
    所述第二息屏动画结束后,所述电子设备检测用户输入的第五触摸操作;
    响应于所述第五触摸操作,所述电子设备显示所述第一息屏动画组中的第三息屏动画。
  12. 根据权利要求1-11中任一项所述的方法,其特征在于,在播放所述第一息屏动画或所述第二息屏动画时,所述电子设备不响应用户输入的触摸操作。
  13. 一种电子设备,其特征在于,包括:
    触摸屏,所述触摸屏包括触摸传感器和显示屏;
    一个或多个处理器;
    存储器;
    其中,所述存储器中存储有一个或多个计算机程序,所述一个或多个计算机程序包括指令,当所述指令被所述电子设备执行时,使得所述电子设备执行如权利要求1-12中任一项所述的一种息屏显示方法。
  14. 一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求1-12中任一项所述的一种息屏显示方法。
PCT/CN2022/079146 2021-03-26 2022-03-03 一种息屏显示方法及电子设备 WO2022199352A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US18/027,527 US20230350535A1 (en) 2021-03-26 2022-03-03 Always on display method and electronic device
EP22774014.9A EP4202621A1 (en) 2021-03-26 2022-03-03 Method for always on displaying, and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110326182.1A CN115129215B (zh) 2021-03-26 2021-03-26 一种息屏显示方法及电子设备
CN202110326182.1 2021-03-26

Publications (1)

Publication Number Publication Date
WO2022199352A1 true WO2022199352A1 (zh) 2022-09-29

Family

ID=83374667

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/079146 WO2022199352A1 (zh) 2021-03-26 2022-03-03 一种息屏显示方法及电子设备

Country Status (4)

Country Link
US (1) US20230350535A1 (zh)
EP (1) EP4202621A1 (zh)
CN (2) CN117666903A (zh)
WO (1) WO2022199352A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109361593A (zh) * 2018-10-19 2019-02-19 北京小米移动软件有限公司 一种消息提醒的方法、装置、移动终端和存储介质
CN110221898A (zh) * 2019-06-19 2019-09-10 北京小米移动软件有限公司 息屏画面的显示方法、装置、设备及存储介质
CN110989882A (zh) * 2019-11-28 2020-04-10 维沃移动通信有限公司 一种控制方法、电子设备和计算机可读存储介质
CN112363785A (zh) * 2020-10-29 2021-02-12 努比亚技术有限公司 一种终端显示方法、终端及计算机可读存储介质

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110285727A1 (en) * 2010-05-24 2011-11-24 Microsoft Corporation Animation transition engine
US9158372B2 (en) * 2012-10-30 2015-10-13 Google Technology Holdings LLC Method and apparatus for user interaction data storage
US9632664B2 (en) * 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
KR102082417B1 (ko) * 2018-03-12 2020-04-23 엘지전자 주식회사 이동 단말기 및 그 제어 방법
KR20210032204A (ko) * 2019-09-16 2021-03-24 삼성전자주식회사 충전 개시 시 충전 상황을 표시하기 위한 방법, 이를 위한 전자 장치 및 저장 매체
CN110994731B (zh) * 2019-12-20 2022-01-07 Oppo广东移动通信有限公司 充电提示方法和装置、电子设备和计算机可读存储介质
CN111431247A (zh) * 2020-05-18 2020-07-17 Oppo(重庆)智能科技有限公司 充电控制方法、装置、电子设备及介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4202621A1

Also Published As

Publication number Publication date
US20230350535A1 (en) 2023-11-02
CN117666903A (zh) 2024-03-08
CN115129215B (zh) 2023-11-03
CN115129215A (zh) 2022-09-30
EP4202621A1 (en) 2023-06-28

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22774014

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022774014

Country of ref document: EP

Effective date: 20230322

NENP Non-entry into the national phase

Ref country code: DE