CN115357317B - Display control method, device, chip and equipment of terminal equipment - Google Patents


Info

Publication number: CN115357317B
Application number: CN202210843720.9A
Authority: CN (China)
Other versions: CN115357317A (Chinese (zh))
Inventor: 张从飞
Assignee (original and current): Honor Device Co Ltd
Legal status: Active (granted)
Prior art keywords: screen, wallpaper, information, target object, display


Classifications

    • G06F9/451 — Execution arrangements for user interfaces (G Physics → G06 Computing; calculating or counting → G06F Electric digital data processing → G06F9/00 Arrangements for program control → G06F9/44 Arrangements for executing specific programs)
    • H04M1/72427 — User interfaces specially adapted for cordless or mobile telephones, with means for local support of applications that increase the functionality, for supporting games or graphical animations (H Electricity → H04 Electric communication technique → H04M Telephonic communication → H04M1/72 Mobile telephones; cordless telephones → H04M1/724 User interfaces specially adapted for cordless or mobile telephones)


Abstract

An embodiment of the present application provides a display control method, apparatus, chip, and device for a terminal device. The method includes: determining whether a preset first operation exists, where the first operation is one of a screen-on trigger operation and a screen-off trigger operation of the terminal device; when the first operation exists, acquiring reference information of the terminal device, the reference information including at least one of a device mode scene of the terminal device, external scene information of the terminal device, and personalized setting information of a user; generating a target object according to the reference information; and displaying the target object while a second operation corresponding to the first operation is executed, where the second operation is one of a screen-on operation and a screen-off operation of the terminal device. By displaying the target object during the screen-on/off operation, the embodiments of the present application enrich the wallpaper display effect and improve the user experience.

Description

Display control method, device, chip and equipment of terminal equipment
Technical Field
The present application relates to the field of display control, and in particular, to a display control method, apparatus, chip, and device for a terminal device.
Background
A terminal device provides a wallpaper function: a user can set the wallpaper displayed by the device as needed, meeting the user's personalized requirements for the wallpaper display effect.
Currently, a terminal device may provide static and dynamic wallpapers for the user to select, and the user may also use a custom picture, such as a photo, as the wallpaper.
However, the wallpaper display of the terminal device remains monotonous, resulting in a poor user experience.
Disclosure of Invention
The embodiments of the present application provide a display control method, apparatus, chip, and device for a terminal device, which enrich the wallpaper display effect and improve the user experience by displaying a target object during the screen-on/off operation.
In a first aspect, an embodiment of the present application provides a display control method for a terminal device, including: determining whether a preset first operation exists, where the first operation is one of a screen-on trigger operation and a screen-off trigger operation of the terminal device; when the first operation exists, acquiring reference information corresponding to the terminal device, the reference information including at least one of a device mode scene of the terminal device, external scene information of the terminal device, and personalized setting information of a user; generating a target object according to the reference information; and displaying the target object while a second operation corresponding to the first operation is executed, where the second operation is one of a screen-on operation and a screen-off operation of the terminal device.
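The first-aspect flow (detect the first operation, gather reference information, generate a target object, and display it during the second operation) can be sketched in Python. All names here (`ReferenceInfo`, `handle_trigger`, the trigger constants, and the placeholder generation rule) are hypothetical illustrations under assumed data shapes, not definitions from the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical trigger identifiers; the patent names the operations but not an API.
SCREEN_ON_TRIGGER = "screen_on_trigger"
SCREEN_OFF_TRIGGER = "screen_off_trigger"

@dataclass
class ReferenceInfo:
    device_mode_scene: Optional[str] = None      # e.g. "driving", "meeting"
    external_scene_info: Optional[str] = None    # e.g. "rainy", "night"
    personalized_settings: dict = field(default_factory=dict)

def generate_target_object(ref: ReferenceInfo) -> str:
    # Placeholder rule: the patent only requires that the target object
    # be derived from the reference information.
    parts = [p for p in (ref.device_mode_scene, ref.external_scene_info) if p]
    return "+".join(parts) or "default"

def handle_trigger(first_operation: str, ref: ReferenceInfo) -> list[str]:
    """Return the display steps produced while the second operation executes."""
    if first_operation not in (SCREEN_ON_TRIGGER, SCREEN_OFF_TRIGGER):
        return []  # no preset first operation detected
    target = generate_target_object(ref)
    second_op = "screen_on" if first_operation == SCREEN_ON_TRIGGER else "screen_off"
    # The target object is shown during execution of the second operation.
    return [f"{second_op}:{target}"]
```

Under this sketch, a screen-on trigger in a "driving" mode scene during rain would display a target object derived from both scenes while the screen lights up.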
Optionally, when the first operation exists, the method further includes: acquiring a first wallpaper corresponding to the first operation. Displaying the target object while the second operation corresponding to the first operation is executed then includes: displaying the target object and the first wallpaper separately while the second operation is executed.
Optionally, when the first operation exists, the method further includes: acquiring a first wallpaper corresponding to the first operation; and adding the target object onto the first wallpaper to obtain a second wallpaper. Displaying the target object then includes: displaying the second wallpaper.
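If a wallpaper is modeled as a stack of layers (an assumption; the patent does not specify a representation), adding the target object onto the first wallpaper to obtain the second wallpaper reduces to a simple overlay:

```python
def compose_second_wallpaper(first_wallpaper: list[str], target_object: str) -> list[str]:
    # Hypothetical layer model: the second wallpaper is the first wallpaper
    # with the target object stacked on top as the last-drawn layer.
    return first_wallpaper + [target_object]
```

Displaying the second wallpaper then shows the first wallpaper and the target object together, matching the "add the target object on the first wallpaper" variant above.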
Optionally, the personalized setting information includes a time period during which the target object is displayed.
Optionally, the personalized setting information further includes an information type corresponding to the time period. Generating the target object according to the reference information includes: generating the target object according to first information in the reference information that matches the information type. Displaying the target object includes: displaying the first wallpaper corresponding to the first operation, and displaying the target object within the time period.
Optionally, the personalized setting information further includes an information type corresponding to the time period, and the method further includes: according to the time period and first information in the reference information that matches the information type, adding a target object corresponding to the first information onto the first wallpaper corresponding to the first operation to obtain a second wallpaper. Displaying the target object includes: displaying the second wallpaper, so that the target object is displayed within the time period.
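The time-period gate described in the two variants above can be sketched as a single predicate. The function name and the handling of a period that spans midnight are illustrative assumptions, not taken from the patent.

```python
from datetime import time

def target_visible(now: time, period: tuple[time, time]) -> bool:
    """True when `now` falls inside the user-configured display period."""
    start, end = period
    if start <= end:
        return start <= now <= end
    # Assumed behavior for a period spanning midnight, e.g. 22:00-06:00.
    return now >= start or now <= end
```

The display path would consult this predicate before drawing the target object on (or into) the wallpaper.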
Optionally, the personalized setting information includes priority setting information specifying at least one of: a priority between the device mode scene and the external scene information, a priority between different device mode scenes, and a priority between different items of external scene information.
Optionally, generating the target object according to the reference information includes: acquiring, according to the priority setting information in the personalized setting information, the target information with the highest priority in the reference information; and generating the target object according to the target information.
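Selecting the highest-priority target information amounts to scanning the user's priority order and taking the first item that is actually present in the reference information. This is a minimal sketch with hypothetical key names:

```python
def select_target_info(reference: dict, priorities: list[str]) -> object:
    # `priorities` is ordered highest-first, per the user's priority settings.
    # Return the highest-priority item actually present in the reference info.
    for key in priorities:
        if reference.get(key) is not None:
            return reference[key]
    return None  # no prioritized information available
```

The target object would then be generated from the returned item alone, rather than from every item in the reference information.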
Optionally, the reference information includes at least one of the device mode scene of the terminal device and the external scene information of the terminal device, and generating the target object according to the reference information includes: generating the target object according to each item of information in the reference information.
Optionally, the first operation includes any one of: pressing a power key, executing a one-key screen lock, lifting the terminal device, a slide-to-unlock operation, and a preset automatic screen-on/off program.
Optionally, when the first operation exists, the method further includes: determining, according to the first operation, the requirement type of the screen-on/off requirement; when the determined requirement type is a screen-on requirement, determining that the first wallpaper corresponding to the first operation is a screen-on wallpaper that sequentially displays an off-screen display animation, a lock-screen animation, and a desktop animation; and when the determined requirement type is a screen-off requirement, determining that the first wallpaper is a screen-off wallpaper that sequentially displays a desktop animation, a lock-screen animation, and an off-screen display animation.
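Note the symmetry in the clause above: the screen-off sequence is the screen-on sequence in reverse. A sketch that exploits this (with hypothetical animation names) is:

```python
# Screen-on order per the clause above: off-screen display -> lock screen -> desktop.
SCREEN_ON_SEQUENCE = ["off_screen_display_animation",
                      "lock_screen_animation",
                      "desktop_animation"]

def wallpaper_sequence(demand: str) -> list[str]:
    """Return the animation order for the determined requirement type."""
    if demand == "screen_on":
        return list(SCREEN_ON_SEQUENCE)
    if demand == "screen_off":
        # Screen-off plays the same three animations in reverse order.
        return list(reversed(SCREEN_ON_SEQUENCE))
    raise ValueError(f"unknown requirement type: {demand}")
```

This keeps one canonical sequence, so the two wallpapers cannot drift out of step if the animation list changes.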
In a second aspect, an embodiment of the present application provides a display control apparatus for a terminal device, including: a determining module, configured to determine whether a preset first operation exists, where the first operation is one of a screen-on trigger operation and a screen-off trigger operation on the terminal device; an obtaining module, configured to obtain, when the first operation exists, reference information corresponding to the terminal device, the reference information including at least one of a device mode scene of the terminal device, external scene information of the terminal device, and personalized setting information of a user; a first processing module, configured to generate a target object according to the reference information; and a second processing module, configured to display the target object while a second operation corresponding to the first operation is executed, where the second operation is one of a screen-on operation and a screen-off operation of the terminal device.
In a third aspect, an embodiment of the present application provides an electronic chip, including: a processor for executing computer program instructions stored on a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the method of any of the first aspects.
In a fourth aspect, an embodiment of the present application provides an electronic device including a communication module, a memory configured to store computer program instructions, and a processor configured to execute the computer program instructions, where the computer program instructions, when executed by the processor, trigger the electronic device to perform the method according to any one of the first aspects.
In a fifth aspect, embodiments of the present application provide a computer-readable storage medium having a computer program stored therein, which when run on a computer, causes the computer to perform the method according to any of the first aspects.
In a sixth aspect, an embodiment of the present application provides a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method according to any one of the first aspects.
According to the embodiments of the present application, displaying the target object during the screen-on/off operation enriches the wallpaper display effect and improves the user experience.
Drawings
To describe the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly introduced below. Evidently, the drawings in the following description show only some embodiments of the present application, and a person skilled in the art may derive other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a screen-on process of a terminal device according to an embodiment of the present application;
Fig. 3 is a schematic diagram of a screen-off process of a terminal device according to an embodiment of the present application;
Fig. 4 is a framework diagram for implementing wallpaper display according to an embodiment of the present application;
Fig. 5 is a flowchart of a display control method of a terminal device for a screen-off requirement according to an embodiment of the present application;
Fig. 6 is a flowchart of a display control method of a terminal device for a screen-on requirement according to an embodiment of the present application;
Fig. 7 is a timing chart of monitoring the screen-on/off state of a terminal according to an embodiment of the present application;
Fig. 8 is a timing chart of a display control method of a terminal device according to an embodiment of the present application;
Fig. 9 is a schematic diagram of an implementation of mobile phone wallpaper according to an embodiment of the present application;
Fig. 10 is a flowchart of a display control method of a terminal device according to an embodiment of the present application;
Fig. 11 is a flowchart of another display control method of a terminal device according to an embodiment of the present application.
Detailed Description
For a better understanding of the technical solution of the present application, the following detailed description of the embodiments of the present application refers to the accompanying drawings.
It should be understood that the described embodiments are merely some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terminology used in the embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that in the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. The term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects before and after it. "At least one of the following" and similar expressions mean any combination of the listed items, including any combination of single or plural items. For example, "at least one of a, b, and c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
It should be understood that although the terms "first", "second", etc. may be used in the embodiments of the present application to describe set thresholds, the set thresholds should not be limited by these terms; the terms are only used to distinguish the set thresholds from one another. For example, without departing from the scope of the embodiments of the present application, a first set threshold may also be referred to as a second set threshold, and similarly, a second set threshold may also be referred to as a first set threshold.
The terminology used in the description of the embodiments of the application herein is for the purpose of describing particular embodiments of the application only and is not intended to be limiting of the application.
Any display control method of a terminal device provided in the embodiments of the present application may be applied to the electronic device 100 shown in Fig. 1. Fig. 1 shows a schematic structural diagram of the electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to instruction operation codes and timing signals, thereby controlling instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can fetch them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum-dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency-point energy, and the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: dynamic picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons in the human brain, it processes input information quickly and can also learn continuously.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material; when a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A, and may also calculate the position of the touch from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is less than a first pressure threshold acts on the short-message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short-message application icon, an instruction to create a new short message is executed.
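The threshold-based dispatch in the short-message example can be sketched as a small decision function; the function and action names are hypothetical illustrations of the rule, not an Android API:

```python
def sms_icon_action(touch_strength: float, first_pressure_threshold: float) -> str:
    """Map touch intensity on the short-message icon to an instruction."""
    # Below the first pressure threshold: view the message;
    # at or above it: create a new message (per the example in the text).
    if touch_strength < first_pressure_threshold:
        return "view_message"
    return "new_message"
```

The same pattern extends to more than two actions by checking successive thresholds in increasing order.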
The gyro sensor 180B may be used to determine the motion posture of the electronic device 100. The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 100 calculates altitude from the air pressure values measured by the air pressure sensor 180C to assist positioning and navigation. The magnetic sensor 180D includes a Hall sensor. The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The distance sensor 180F is used to measure distance; the electronic device 100 may measure distance by infrared or laser, and in some embodiments may use the distance sensor 180F to range for quick focusing. The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The ambient light sensor 180L is used to sense the ambient light level. The fingerprint sensor 180H is used to collect a fingerprint; the electronic device 100 may use the collected fingerprint features to implement fingerprint unlocking, application-lock access, fingerprint-based photographing, fingerprint-based incoming call answering, etc. The temperature sensor 180J is used to detect temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 together form a touch screen. The touch sensor 180K is used to detect a touch operation acting on or near it, and may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from the display screen 194.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195 or removed from it, so as to contact with or separate from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano-SIM cards, Micro-SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 195 simultaneously, and the types of these cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card, which can be embedded in the electronic device 100 and cannot be separated from it. The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
To attract users, enrich the dynamic effects of wallpaper, and improve the user experience, a specific target object may be dynamically added to the wallpaper displayed by the terminal device (the added image effect is what the user perceives; this is not repeated below). For example, a target object with a mask effect, a filter effect, or the like may be added, so that the terminal device in effect displays the wallpaper with the target object added. Alternatively, the target object may be displayed directly on the terminal device without wallpaper.

To achieve this image-adding effect, the wallpaper and the target object may be displayed separately in overlapping layers, so that the user sees the terminal device displaying the wallpaper with the target object added. Alternatively, the wallpaper may be processed according to the target object and the processed wallpaper displayed, with the same result for the user.
The terminal device may be an electronic device such as a smartphone, a tablet, or a personal computer.

The wallpaper displayed by the terminal device may be static or dynamic, and may be planar or stereoscopic.

In one implementation, the terminal device may display 3D (three-dimensional) super wallpaper. Compared with a 2D display effect, 3D technology makes the picture appear stereoscopic, and super wallpaper can bring an ultra-high-definition, visually striking experience to the user.
The target object may be embodied in terms of color, brightness, text content, patterns, etc.
To meet the user's requirements on the information displayed with the wallpaper and improve the user experience, a corresponding target object may be added according to some or all of: the device mode scene of the terminal device, the external scene information of the terminal device (i.e., information about the environment where the terminal device, and hence the user, is located), and the user's personalized setting information.

The device mode scene of the terminal device may be a mode such as flight mode, mute mode, or vibration mode. The external scene information of the terminal device may include season information, time information, weather information, ambient light sensing information, and the like. The user's personalized setting information may include a time period during which the target object needs to be added (so that the target object is displayed at the corresponding time), the addition priority of the various pieces of information in the reference information (the priority decides which information in the reference information the target object is generated from), and so on.
The priority setting information may include priority setting information between the device mode scene and the extrinsic scene information, may include priority setting information between different device mode scenes, and may include priority setting information between different extrinsic scene information.
For example, the priority of weather information in rainy days may be set higher than the priority of any season information.
Based on this priority setting, when the acquired reference information includes both "rainy day" and "spring", the target object may be generated from the weather information "rainy day", which has the relatively highest priority, instead of from the season information "spring".
For another example, a piece of priority setting information for different device mode scenarios may be set, and in one implementation, the piece of priority setting information may be as follows:
first priority: a device mode scene of "dark/light mode";
second priority: three device mode scenarios, namely a mute mode, a vibration mode and a response mode;
third priority: five device mode scenarios, namely a flying mode, an electronic book mode, a simple mode, an eye protection mode and a low battery mode.
Wherein a "dark/light mode" may be used to identify that the electronic device is displaying information based on a dark interface/light interface (e.g., a black/white interface).
Based on this priority setting information, for example, if three device mode scenes of "dark/light mode", "mute mode", and "eye-protection mode" are acquired, the target object is generated in "dark/light mode" having the relatively highest priority.
Based on this priority setting information, for example, if two device mode scenes of "vibration mode" and "low power mode" are acquired, the target object is generated in the "vibration mode" having the relatively highest priority.
For another example, a piece of priority setting information for different external scene information may be set, and in one implementation, it may be as follows:

first priority: three kinds of external scene information, namely weather information, time information (such as holidays, solar terms, etc.), and season information (the four seasons);

second priority: one kind of external scene information, namely ambient light sensing information (e.g., bright/dark light environments, etc.).

Based on this priority setting information, for example, if the two kinds of external scene information "weather information" and "ambient light sensing information" are acquired, the target object is generated from the "weather information", which has the relatively highest priority.

Based on this priority setting information, for example, if the three kinds of external scene information time information, season information, and ambient light sensing information are acquired, the target object is generated from the "time information" and the "season information", which have the relatively highest priority.

Based on this priority setting information, for example, if only the external scene information "ambient light sensing information" is acquired, the target object is generated from the "ambient light sensing information", which then has the relatively highest priority.
In this embodiment, within the same item of priority setting information, the target objects corresponding to different pieces of information of the same priority do not conflict with each other, and these target objects can be displayed simultaneously.
The priority may be set for all the information in the reference information, or may be set for only part of the information in the reference information, and the priority may not be set for the other part of the information. For example, only the device mode scene may be prioritized (for example, the priorities of the mute mode, the eye-protection mode, and the power-saving mode in the device mode scene are sequentially reduced), and other information such as external scene information may not be prioritized. The application is not limited in this regard.
In one implementation, for the portion of information that is prioritized, the highest priority information may be taken to generate the target object, and for the portion of information that is not prioritized, each of the information may be taken to generate the target object.
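The selection rule above (take only the highest-priority information among the ranked part of the reference information, and take every piece of the unranked part) can be sketched as follows. The ranking map and info strings are illustrative assumptions; lower rank numbers mean higher priority.

```java
import java.util.*;

// Sketch of reference-information selection: ranked info is kept only at the
// best (lowest) rank present; unranked info is always kept. Illustrative only.
class PriorityResolver {
    static List<String> resolve(List<String> acquired, Map<String, Integer> rank) {
        // Find the best rank among the acquired, ranked information.
        int best = Integer.MAX_VALUE;
        for (String info : acquired) {
            Integer r = rank.get(info);
            if (r != null && r < best) best = r;
        }
        // Keep unranked info unconditionally; keep ranked info only at the best rank.
        List<String> selected = new ArrayList<>();
        for (String info : acquired) {
            Integer r = rank.get(info);
            if (r == null || r == best) selected.add(info);
        }
        return selected;
    }
}
```

For the earlier examples, resolving {weather, ambient light sensing} against ranks {weather: 1, ambient light sensing: 2} keeps only the weather information, while two pieces of information sharing the top rank (e.g., time and season) are both kept and, per this embodiment, displayed simultaneously.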
As the device mode scene switches and the external scene information changes, the corresponding information displayed on the wallpaper changes accordingly, so that the target object responds in time and the wallpaper display effect fits the user's needs in real time.
Based on the above, the implementation manner of dynamically adding the target object on the wallpaper may be at least any one of the following manners:
Mode 1: dynamically adding a corresponding target object on the wallpaper according to the device mode scene of the terminal device.

For example, while the terminal device is in flight mode, the terminal device displays, during screen-on and screen-off, the 3D super wallpaper with a filter effect corresponding to the flight mode.

Mode 2: dynamically adding a corresponding target object on the wallpaper according to the external scene information of the terminal device.

For example, while the external scene of the terminal device is a rainy day, the terminal device displays, during screen-on and screen-off, the 3D super wallpaper with a filter effect corresponding to the rainy day.

Mode 3: dynamically adding corresponding target objects on the wallpaper according to both the device mode scene and the external scene information of the terminal device.

For example, when the terminal device is in mute mode and its external scene is a sunny day, the terminal device displays, during screen-on and screen-off, the 3D super wallpaper with filter effects corresponding to the mute mode and the sunny day.
The screen of the terminal device can be turned on or off by triggering the power key, by a preset automatic on/off-screen program, and so on. The terminal device displays wallpaper synchronously during the screen-on and screen-off process, and can display the 3D super wallpaper with the target object added.
The wallpaper displayed in the screen-on and screen-off process can be animation wallpaper. Referring to fig. 2, a continuous animation play from an AOD (off-screen display) to a desktop can be displayed during a bright screen process. Referring to fig. 3, a continuous animation play from the desktop to the AOD (off-screen display) can be displayed during the off-screen process.
In AOD, the whole display screen is not lit; instead, the CPU controls locally lit pixels of the screen to display content such as the time, incoming calls, messages, battery information, and push messages. In bright-screen display, the whole display screen is lit, and the CPU controls all lit pixels of the screen to display information.
Possibly, in a bright-screen application scenario of the terminal device, the desktop may be entered by a slide-to-unlock operation. For example, in one feasible triggering scenario, the user may click the power key on the off-screen display interface of the terminal device to invoke the lock-screen program, with lock-screen face unlock set to enter the desktop directly.
As shown in fig. 2, in a bright-screen application scenario, the terminal device may enter the desktop from the off-screen display (AOD). Referring to fig. 2, while the screen is not lit, the AOD image can be displayed continuously. If, during display of the AOD image, the user clicks the power key of the terminal device to request a bright screen, the AOD image is no longer displayed and bright-screen display begins: the lock-screen animation is displayed first, and the desktop is displayed after the user performs an up-slide operation. The lock-screen animation is displayed frame by frame in sequence, i.e., its first frame is displayed first and its last frame last.
To ensure the continuity of the animation and improve the user experience of turning on the screen, referring to fig. 2, an AOD animation may be displayed before the bright-screen display starts. The first frame of the AOD animation may be the AOD image displayed before the user clicked the power key, as shown in fig. 2, and the image display effect of the last frame of the AOD animation is visually consistent or continuous with that of the first frame of the lock-screen animation. After the user performs the up-slide operation, a desktop animation may be displayed, whose first frame is the same as, or continuous with, the last frame of the lock-screen animation, and whose last frame is the same as the desktop image displayed after the up-slide operation shown in fig. 2.
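The frame-continuity rule described above (the last frame of each animation segment matches, or links to, the first frame of the next) can be sketched as a simple check over the sequence AOD animation → lock-screen animation → desktop animation. Frames are modeled as plain strings purely for illustration; this is not the actual animation pipeline.

```java
import java.util.*;

// Illustrative check that consecutive animation segments share their boundary
// frame, so the bright-screen sequence plays as one continuous animation.
class AnimationChain {
    static boolean continuous(List<List<String>> segments) {
        for (int i = 1; i < segments.size(); i++) {
            List<String> prev = segments.get(i - 1);
            List<String> cur = segments.get(i);
            // Last frame of the previous segment must equal the first of the next.
            if (!prev.get(prev.size() - 1).equals(cur.get(0))) return false;
        }
        return true;
    }
}
```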
Fig. 2 shows an example of the terminal device entering the desktop from the off-screen display. In other usage scenarios, the terminal device may also enter any application on the terminal device from the off-screen display; in that case, the terminal device first displays the lock-screen animation and then displays an interface image of the application. The animation display process in such scenarios is the same as the desktop case described above and is not repeated in this embodiment.
In an off-screen application scenario of the terminal device, the power key can be clicked to turn off the screen. Thus, in one feasible triggering scenario, the user may click the power key to request screen-off while the terminal device displays the desktop interface. In other feasible triggering scenarios, the screen may be turned off by means of one-key screen locking, a screen-off timeout, and the like.
As shown in fig. 3, in an off-screen application scenario, the terminal device may return from the desktop to the off-screen display (AOD). Referring to fig. 3, while the terminal device displays the desktop interface, the user may click the power key to request screen-off. After detecting the click on the power key, the terminal device may display the lock-screen animation first and then display the AOD image. The lock-screen animation is displayed frame by frame in sequence, i.e., its first frame is displayed first and its last frame last.
The screen locking animation during screen-off can be regarded as reverse play of the screen locking animation during screen-on because the screen-on and screen-off are two opposite functions.
To ensure the continuity of the animation and improve the user experience of turning off the screen, referring to fig. 3, a desktop animation may be displayed after the user clicks the power key and before the lock-screen animation is displayed, where the first frame of the desktop animation is the same as the desktop image displayed before the user clicked the power key, and its last frame is the same as, or continuous with, the first frame of the lock-screen animation. After the lock-screen animation, an AOD animation may be displayed, where the image display effect of the first frame of the AOD animation is visually consistent or continuous with that of the last frame of the lock-screen animation, and the last frame of the AOD animation may be the AOD image shown in fig. 3.
Because the lock-screen animation is a bright-screen display, whose display mode differs from that of the AOD (off-screen display), referring to fig. 3, after the bright-screen display ends, all pixels of the screen may first be turned off (a black screen), and local lighting is then performed on this black screen to display the AOD image. The duration of this black screen may typically be 260-300 ms.
In other implementations, the black-screen stage may be optimized away, so that the switch between bright-screen display and off-screen display is direct. For example, referring to fig. 2, when going from the off-screen display to the bright-screen display, no black-screen stage need be experienced.
Since the AOD (off-screen display) lights the screen only locally while the bright-screen display lights it fully, even if the image displayed in AOD looks visually consistent to the user with the image displayed on the bright screen, the pixel lighting conditions of the two images differ. For example, for the same black region in the two images, the corresponding pixels are simply not lit in AOD, whereas in bright-screen display they are lit and driven to black.
Thus, even if the image processing effect of the target object added in the AOD is visually consistent with that of the target object added in the bright-screen display, the corresponding image processing operations are different.

Taking a filter effect as an example, referring to fig. 2, the filter effect obtained through image processing operation 1 may be added in the AOD, and the filter effect obtained through image processing operation 2 (which is different from image processing operation 1) may be added to each frame of the lock-screen animation, so that the filter effect seen by the user during the bright-screen animation remains visually consistent throughout.
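A hedged sketch of why operation 1 and operation 2 must differ: in AOD a dark region is produced by leaving pixels undriven, while in bright-screen display the same region is produced by driving pixels to black. The luma values, dim factor, and sentinel constants below are illustrative assumptions, not the actual image processing.

```java
// Illustrative only: two filter operations that yield the same visual result
// (dark areas look black, lit areas are dimmed) under different display modes.
class FilterOps {
    static final int UNLIT = -1;       // AOD: pixel is not driven at all
    static final int BLACK = 0x000000; // bright screen: pixel lit, drawn black

    // Operation 1 (AOD): near-black pixels are simply left unlit.
    static int aodFilter(int luma) {
        return luma < 16 ? UNLIT : luma / 2; // assumed dim factor of 2
    }

    // Operation 2 (bright screen): same visual result, but all pixels are lit.
    static int brightFilter(int luma) {
        return luma < 16 ? BLACK : luma / 2;
    }
}
```

The two operations agree on every visible pixel yet differ in how black is produced, matching the distinction drawn above.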
As shown in fig. 4, this embodiment provides a frame diagram for realizing wallpaper display when a terminal device is turned on or off. Fig. 4 shows an application layer 410 and an application Framework (FWK) layer 420.
The application layer 410 may include, among other things, a theme module 411, a wallpaper module 412, a screen-off display module (i.e., AOD module) 417, a system interface (SystemUI module) 418, and a desktop launcher (launcher) 419; the application framework layer 420 may include a theme switch module (ThemeManager) 421, a theme wallpaper management service (WallpaperManagerService) 422, a screensaver service (DreamService) 423, a Power management service (PowerManagerService) 424, and a Window management service (WindowManagerService) 425.
The theme module 411 may implement presentation of a theme, setting of a theme, preview of a theme, and the like.
Wallpaper modules 412 may include a stereoscopic wallpaper module 413, a wallpaper service (SuperWallpaperService) 414, an Engine 415, and an animation generation service 416. Wherein, stereoscopic wallpaper module 413 may provide a basic wallpaper display service; wallpaper service 414 may manage parameters of the wallpaper, such as the size of the wallpaper, etc., and may send dynamic wallpaper generated by animation generation service 416 to stereoscopic wallpaper module 413 for display; engine 415 may generate each frame image of static wallpaper, dynamic wallpaper according to wallpaper parameters; the animation generation service 416 may generate dynamic wallpaper through Video & OpenGL techniques.
The system UI application is a persistent process, and can provide a set of UI (User Interface) components for displaying and interacting information at a system level for a User.
The theme module 411 may provide various themes for a user to select, and the user may request to change the theme, or may request to change only the theme wallpaper.
When a user requests to change a theme, the theme switching module 421 may provide relevant information of the changed theme to the wallpaper module 412, and the theme wallpaper management service 422 may provide relevant information of the changed theme wallpaper to the wallpaper module 412, so that the wallpaper module 412 may perform corresponding update on the used wallpaper.
When a user updates the theme wallpaper, theme wallpaper management service 422 may provide information regarding the changed theme wallpaper to wallpaper module 412 such that wallpaper module 412 may correspondingly update the wallpaper used.
The screen saver service 423 can automatically turn off the screen of the terminal device through a low-power mode, i.e., the DOZE mode. The screen saver service 423 can also automatically light the screen when, during screen-off, there is an incoming call, an incoming short message, or the like. The screen saver service 423 can provide information on the on/off-screen state of the terminal device to the off-screen display module 417, the system interface module 418, and the desktop launcher 419.

The power management service 424 may provide information on the on/off-screen state of the terminal device to the off-screen display module 417, the system interface module 418, and the desktop launcher 419 according to the user's triggering of the power key. The user's triggering of the power key can reflect whether the terminal device is unlocked.

The window management service 425 may control the hierarchy and display order of windows and provide corresponding status information to the off-screen display module 417, the system interface module 418, and the desktop launcher 419.
The screen-off display module 417 may send a corresponding control instruction to the wallpaper module 412 according to the obtained status information, and the wallpaper module 412 displays a corresponding screen-off effect in a screen-off display stage according to the control instruction.
The system interface module 418 may send a corresponding control instruction to the wallpaper module 412 according to the obtained status information, and the wallpaper module 412 may display a corresponding bright screen effect in a bright screen display stage according to the control instruction.
The desktop launcher 419 can control the starting of the desktop according to the obtained status information.
The transfer of state information between different modules may be based on the same information transfer channel, as applicable.
Optionally, the transmission of control instructions by the off-screen display module 417 and the system interface module 418 may be based on the same information transmission channel.
In one implementation, reference information is first acquired, where the reference information reflects some or all of the device mode scene, the external scene information, and the personalized setting information. The off-screen display module 417 and the system interface module 418 may add the reference information to the control instructions to be sent, so as to pass it to the wallpaper module 412. The wallpaper module 412 may add a corresponding target object to the original on/off-screen wallpaper according to the reference information in the control instruction to obtain a new wallpaper, and display the new wallpaper with the target object added. Alternatively, the target object may be generated according to the reference information, and the original on/off-screen wallpaper and the generated target object displayed separately.

In one implementation, the off-screen display module 417 and the system interface module 418 add the same reference information. In this way, during the screen-on and screen-off process of the terminal device, the image processing effect of the target object added to the wallpaper in off-screen display can be the same as that in bright-screen display.

The two image processing effects may also differ, based on the user's personalized settings. In one implementation, the off-screen display module 417 and the system interface module 418 add different reference information, so that during the screen-on and screen-off process, the image processing effect of the target object added to the wallpaper in off-screen display differs from that in bright-screen display.
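A minimal sketch of a control instruction carrying the reference information to the wallpaper module, which then composes the target object onto the wallpaper. The class and method names are hypothetical and do not correspond to the actual framework interfaces in fig. 4.

```java
import java.util.*;

// Hypothetical value object: a control instruction that carries the display
// phase and the reference information used to generate the target object.
class ControlInstruction {
    final String phase;               // e.g. "AOD" or "BRIGHT"
    final List<String> referenceInfo; // e.g. device mode scene, weather
    ControlInstruction(String phase, List<String> referenceInfo) {
        this.phase = phase;
        this.referenceInfo = referenceInfo;
    }
}

// Hypothetical wallpaper module: adds the target object derived from the
// reference information to the wallpaper and returns a description of the result.
class WallpaperModule {
    String display(String wallpaper, ControlInstruction ci) {
        String target = String.join("+", ci.referenceInfo);
        return wallpaper + " [" + target + " filter, " + ci.phase + "]";
    }
}
```

Sending the same `ControlInstruction` contents from both modules yields the same effect in both phases; sending different reference information yields different effects, as described above.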
In a possible implementation, the off-screen display module 417 may inform the wallpaper module 412 of the start time of the off-screen display and the system interface module 418 may inform the wallpaper module 412 of the start time of the on-screen display.
In another possible implementation, a separate monitoring module may be provided that can send the reference information to wallpaper module 412 when it is detected, without the need for the reference information to be sent by off-screen display module 417 and system interface module 418.
Referring to fig. 5, a display control method of a terminal device for a screen-off requirement may include the following steps 501 to 508:
Step 501, detecting whether any preset on/off-screen triggering operation exists, where the preset on/off-screen triggering operations at least include triggering the power key, performing one-key screen locking, and raising the terminal device; if so, executing step 502, otherwise executing step 501 again.
The key corresponding to one-key screen locking may be a lock-screen control displayed on the screen of the terminal device. When the user holds the terminal device and makes a lifting motion, a data processing result reflecting whether an operation of raising the terminal device exists can be obtained based on the sensing data of a sensor in the terminal device.
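One way such a data processing result might be derived is sketched below: a raise gesture is inferred when accelerometer samples on one axis change by more than a threshold within the observed window. The axis convention, threshold value, and method name are assumptions for illustration only.

```java
// Illustrative only: infer a "raise the terminal device" operation from a
// window of accelerometer samples on a single (assumed z) axis.
class RaiseDetector {
    // Assumed minimum change, in m/s^2, that counts as a lift gesture.
    static final float RAISE_DELTA = 4.0f;

    static boolean raised(float[] zSamples) {
        if (zSamples == null || zSamples.length < 2) return false;
        float min = zSamples[0], max = zSamples[0];
        for (float z : zSamples) {
            min = Math.min(min, z);
            max = Math.max(max, z);
        }
        // A large swing within the window suggests the device was lifted.
        return max - min >= RAISE_DELTA;
    }
}
```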
Step 502, determining a requirement type of the on-off screen requirement according to the detected on-off screen triggering operation.
Step 503, if the determined requirement type is the screen-off requirement, determining a screen-locking and screen-off process, and executing step 504 and step 505.
The life cycle of the screen locking and extinguishing process can comprise the steps of displaying the screen locking wallpaper and then extinguishing the screen.
Step 504, controlling the start of the off-screen display process according to the determined screen-locking and screen-off process.
The off-screen display process is typically started only after a period of time, during which the following steps 505-508 may be performed to determine the off-screen animation effect; based on the determined effect, a display process from the lock-screen animation to the off-screen display animation may be implemented.

In addition, after the off-screen display process is started, it may run for a period of time (for example, 260 ms to 300 ms) during which the off-screen animation effect is determined; the off-screen display is then performed based on the determined effect, implementing the display process from the lock-screen animation, through the black screen, to the off-screen display animation shown in fig. 3.
Step 505, after the off-screen display process is started, determining a screen-off dynamic effect control instruction according to the determined screen-locking and screen-off process, and executing step 506.

In one implementation, the screen-off dynamic effect control instructions may include control instructions generated by the off-screen display module 417 and by the system interface module 418 of fig. 4, thereby implementing a wallpaper off-screen display with dynamic effects.
Step 506, determining whether a filter effect needs to be added according to the acquired reference information, wherein the reference information comprises equipment mode scene and external scene information, if the filter effect needs to be added, executing step 507, otherwise executing step 508.
For example, the acquired device mode scene may be a current mobile phone mode state, and the acquired external scene information may have weather information and time information.
Step 507, adding a filter effect on the wallpaper by performing video processing, and performing step 508.
According to the embodiment, the filter effect can be added on the original screen-off wallpaper, and further the screen-off wallpaper with the added filter effect can be displayed. In other embodiments, wallpaper with the filter effect may be generated, and the wallpaper and the original off-screen wallpaper may be displayed separately.
Step 508, displaying the wallpaper according to the determined screen-off dynamic effect control instruction, so as to display the corresponding screen-off dynamic effect.
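The steps above can be condensed into an illustrative dispatcher that returns the trace of steps taken for the screen-off requirement. The trigger names and trace strings are assumptions; the real flow is distributed across the modules of fig. 4.

```java
import java.util.*;

// Illustrative condensation of steps 501-508 for the screen-off requirement.
class OffScreenFlow {
    // Assumed identifiers for the preset on/off-screen triggering operations.
    static final Set<String> TRIGGERS =
        new HashSet<>(Arrays.asList("POWER_KEY", "ONE_KEY_LOCK", "RAISE_DEVICE"));

    static List<String> run(String trigger, boolean needFilter) {
        List<String> trace = new ArrayList<>();
        if (!TRIGGERS.contains(trigger)) return trace;       // step 501: keep waiting
        trace.add("determine requirement type");             // step 502
        trace.add("determine screen-locking and screen-off process"); // step 503
        trace.add("start off-screen display process");       // step 504
        trace.add("determine screen-off dynamic effect instruction"); // step 505
        if (needFilter) trace.add("add filter via video processing"); // steps 506-507
        trace.add("display wallpaper with screen-off dynamic effect"); // step 508
        return trace;
    }
}
```

An unknown trigger leaves the flow waiting in step 501; with a recognized trigger, step 507 is entered only when the reference information calls for a filter effect.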
Referring to fig. 6, a display control method of a terminal device for a bright screen requirement may include the following steps 601 to 608:
Step 601, detecting whether any preset on/off-screen triggering operation exists, where the preset on/off-screen triggering operations at least include triggering the power key, performing one-key screen locking, and raising the terminal device; if so, executing step 602, otherwise executing step 601 again.

The key corresponding to one-key screen locking may be a lock-screen control displayed on the screen of the terminal device. When the user holds the terminal device and makes a lifting motion, a data processing result reflecting whether an operation of raising the terminal device exists can be obtained based on the sensing data of a sensor in the terminal device.
Step 602, determining a requirement type of the on-off screen requirement according to the detected on-off screen triggering operation.
Step 603, if the determined requirement type is a bright-screen requirement, determining a screen-locking and screen-brightening process, and executing step 604 and step 605.
The life cycle of the screen-locking and screen-brightening process may include displaying the lock-screen wallpaper and brightening the screen.
Step 604, according to the determined screen locking and screen brightening process, executing the screen brightening display process.
The bright-screen display process takes a period of time to complete, during which the following steps 605 to 608 may be performed to determine the bright-screen animation effect; based on the determined effect, the transition from the screen-off display to the bright-screen display shown in fig. 2 can be implemented.
That is, in this embodiment, steps 605 to 608 may be performed in parallel with step 604.
Step 605, determining a bright-screen dynamic-effect control instruction according to the determined screen-locking and screen-brightening process, and executing step 606.
In one implementation, the bright-screen dynamic-effect control instructions may include control instructions generated by the screen-off display module 417 and control instructions generated by the system interface module 418 of fig. 4, thereby enabling a bright-screen wallpaper display with dynamic effects.
Step 606, determining whether a filter effect needs to be added according to the acquired reference information, where the reference information includes a device mode scene and external scene information; if the filter effect needs to be added, executing step 607, otherwise executing step 608.
For example, the acquired device mode scene may be the current mode state of the mobile phone, and the acquired external scene information may include weather information and time information.
Step 607, adding a filter effect to the wallpaper through video processing, and executing step 608.
In this embodiment, the filter effect can be added to the original bright-screen wallpaper, so that the bright-screen wallpaper with the added filter effect is displayed. In other embodiments, an image with the filter effect may be generated and displayed separately from the original bright-screen wallpaper.
Step 608, displaying the wallpaper according to the determined bright-screen dynamic-effect control instruction, so as to display the corresponding bright-screen dynamic effect.
Referring to fig. 7, fig. 7 shows a timing diagram of monitoring the on/off-screen state of the terminal, which involves the operations of a lock-screen service 701, a decoding module 702, and a state machine 703. Next, the process of monitoring the on/off-screen state of the terminal will be described with reference to fig. 7.
The lock-screen service (KeyguardService) 701 may monitor and receive external input (typically from the end user) to the terminal touch screen. For example, it may detect that the user triggers the terminal power key to request starting the terminal and turning on the screen, and send the corresponding input information to the decoding module 702.
The input information sent to the decoding module 702 may include: starting to wake up (e.g., onStartedWakingUp), finishing waking up (e.g., onFinishedWakingUp), the screen turning on (e.g., onScreenTurningOn), the screen turned on (e.g., onScreenTurnedOn), starting to go to sleep (e.g., onStartedGoingToSleep), finishing going to sleep (e.g., onFinishedGoingToSleep), the screen turning off (e.g., onScreenTurningOff), and the screen turned off (e.g., onScreenTurnedOff).
The decoding module (KeyguardViewMediator) 702 can decode and distribute the information related to the lock-screen service 701.
The information sent to the state machine 703 may include an instruction to update the state machine (e.g., updateStateMachine).
The external input monitored by the lock-screen service 701 may be input related to the terminal state or input unrelated to the terminal state. For example, if the user triggers the power key while the terminal is in the screen-off display state, the terminal changes from the lock-screen state to the bright-screen state, and that input is related to the terminal state. If, instead, the user triggers the touch screen (for example, to make a call, record information, or take a picture) while the terminal is in the screen-off display state, the terminal can remain in the lock-screen state, and that input is unrelated to the terminal state.
For external inputs related to the terminal state, the decoding module 702 processes the input information and sends the processing result to the state machine 703, so that the state machine 703 can further determine from the result whether a terminal state change that causes a wallpaper display change has occurred. For external inputs unrelated to the terminal state, the decoding module 702 may process the input information without sending the result to the state machine 703.
The state machine (DynEffectController) 703, or "dynamic-effect controller", can monitor changes of the terminal's on/off-screen state and is mainly used for monitoring the terminal states related to the lock-screen service 701. The state machine 703 may determine whether a state change that causes a wallpaper display change exists in the terminal, by sending the generated dynamic-effect wallpaper command (e.g., sendDynEffectWallpaperCommand) to an internal judging module.
If it is determined that such a state change exists in the terminal, a corresponding wallpaper-sending instruction (e.g., sendWallpaperCommand) can be sent to the wallpaper service. Otherwise, no wallpaper-sending instruction is sent to the wallpaper service.
Further, the wallpaper service may display wallpaper according to the received send wallpaper command (e.g., sendWallpaperCommand).
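The fig. 7 flow, in which the state machine issues a wallpaper command only when the terminal's on/off-screen state actually changes, can be sketched as follows. This is a minimal model: OnOffStateMachine and its string commands are illustrative stand-ins for the dynamic-effect controller and sendWallpaperCommand, not the real implementation.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the fig. 7 flow: a wallpaper command is emitted only when
// the terminal's on/off-screen state actually changes. Names are illustrative.
public class OnOffStateMachine {
    private boolean screenOn;
    private final List<String> sentCommands = new ArrayList<>();

    public OnOffStateMachine(boolean initiallyOn) { this.screenOn = initiallyOn; }

    // updateStateMachine analogue: returns true when a wallpaper command was
    // sent because the state change affects the wallpaper display.
    public boolean update(boolean newScreenOn) {
        if (newScreenOn == screenOn) {
            return false;               // no state change, nothing to display
        }
        screenOn = newScreenOn;
        // sendWallpaperCommand analogue: record the command for the wallpaper service.
        sentCommands.add(screenOn ? "show-bright-screen-wallpaper"
                                  : "show-off-screen-wallpaper");
        return true;
    }

    public List<String> commands() { return sentCommands; }
}
```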
Referring to fig. 8, fig. 8 shows a timing diagram of a display control method of a terminal device, which involves the operations of a state machine 801, a screen saver service 802, a wallpaper service 803, an encoder 804, an animation generation module 805, and a multimedia player 806. Next, the wallpaper display process will be described with reference to fig. 8.
As described above for the timing diagram of fig. 7, the state machine 801 can determine whether the terminal has a state change that causes a wallpaper display change, by sending the generated dynamic-effect wallpaper command (e.g., sendDynEffectWallpaperCommand) to its internal judging module. If such a state change exists, the state machine 801 may send a corresponding wallpaper-sending instruction (e.g., sendWallpaperCommand) to the wallpaper service 803.
In addition to the state machine 801, a screen saver service (DozeService) 802 may also monitor changes of the terminal's on/off-screen state, and may be used to monitor terminal state changes caused by an automatic lock-screen procedure. The screen saver service 802 can likewise determine whether the terminal has a state change that causes a wallpaper display change, by sending the generated dynamic-effect wallpaper command (e.g., sendDynEffectWallpaperCommand) to its internal judging module.
If such a state change is determined to exist, a corresponding wallpaper-sending instruction (e.g., sendWallpaperCommand) is sent to the wallpaper service 803; otherwise, no wallpaper-sending instruction is sent to the wallpaper service 803.
In addition, the screen saver service 802 can periodically (e.g., once every 3 s) report its own state to the wallpaper service 803, so that the wallpaper service 803 can realize the corresponding wallpaper display effect according to the state of the screen saver service 802.
The wallpaper service 803 may obtain operating parameters in response to a wallpaper-sending instruction (e.g., sendWallpaperCommand) from the state machine 801 or the screen saver service 802, and send a receive instruction (e.g., onCommand) to the encoder 804 based on the obtained parameters.
The operating parameters may include registration parameters, weather information, time information, and the like. The wallpaper service 803 may send an instruction to register a broadcast receiver (e.g., registerReceiver), an instruction to acquire weather information (e.g., getWeatherInfo), and an instruction to acquire time information (e.g., getTimeInfo) to the corresponding internal modules for processing.
The encoder (MediaCodecWrapper) 804 may generate the corresponding wallpaper from the operating parameters acquired by the wallpaper service 803, in response to the control instruction sent by the wallpaper service 803. To display dynamic wallpaper, the encoder 804 may generate each frame image of the dynamic wallpaper and issue an instruction to start an animation (e.g., startAnimation) to the animation generation module 805, which generates the dynamic wallpaper accordingly.
The encoder 804 may send the wallpaper to be displayed to the multimedia player 806 for wallpaper display, and may send the video stream returned by the animation generation module 805 to the multimedia player 806 for dynamic wallpaper display. The encoder 804 may also send an instruction to release the output buffer (e.g., releaseOutputBuffer) to the animation generation module 805 during dynamic wallpaper display.
The animation generation module (EffectGLSurfaceView) 805 may package each frame image generated by the encoder 804 into a video stream via OpenGL and return the video stream to the encoder 804.
The animation generation module 805 may issue instructions (e.g., doEffectFrames) to its internal module to add an animation effect to one or more frames when generating the video stream.
A multimedia player (MediaCodec) 806 may display wallpaper or video streams from the encoder 804.
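The fig. 8 data flow (the encoder generates frames, the animation module packages them with per-frame effects into a video stream, and the player consumes the stream) can be modeled minimally as below. The classes only mirror the flow of data; they are not MediaCodec or OpenGL code, and all names and the string "stream" representation are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the fig. 8 pipeline: encoder -> animation module -> player.
// Purely illustrative data flow; not MediaCodec/OpenGL code.
public class WallpaperPipeline {

    // Encoder analogue: generate N frame identifiers for the dynamic wallpaper.
    public static List<String> generateFrames(int count) {
        List<String> frames = new ArrayList<>();
        for (int i = 0; i < count; i++) frames.add("frame-" + i);
        return frames;
    }

    // Animation-module analogue (doEffectFrames): tag each frame with an effect
    // and join the frames into a single "video stream".
    public static String packageStream(List<String> frames, String effect) {
        StringBuilder stream = new StringBuilder();
        for (String f : frames) {
            if (stream.length() > 0) stream.append('|');
            stream.append(f).append('+').append(effect);
        }
        return stream.toString();
    }

    // Player analogue: "display" is simply consuming the stream frame by frame;
    // returns the number of frames played.
    public static int play(String stream) {
        return stream.isEmpty() ? 0 : stream.split("\\|").length;
    }
}
```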
Referring to fig. 9, fig. 9 shows a schematic diagram of a wallpaper implementation on a mobile phone according to this embodiment.
As shown in fig. 9, the theme module 912 may obtain the theme pack of the theme selected by the user from the theme pack module 911 and send the theme wallpaper in that pack to the theme wallpaper management service 910. The theme wallpaper management service (WallpaperManagerService) 910 provides the parameters of the theme wallpaper to the wallpaper service (WallpaperService) 909. The wallpaper module 906 may then perform the wallpaper display operation based on the theme wallpaper parameters managed by the wallpaper service 909.
The mobile phone mode switching module 907 may obtain mode switching information, such as switching the mobile phone to flight mode, mute mode, or dark display mode, and provide this information to the wallpaper module 906, so that the wallpaper module 906 can display wallpaper with the target object corresponding to the mode switching information added.
The mobile phone external scene information acquisition module 908 may acquire external scene information of the mobile phone, such as the weather and time of the environment in which the phone is located, and provide this information to the wallpaper module 906, so that the wallpaper module 906 can display wallpaper with the target object corresponding to the external scene information added.
The power key management service (PowerManagerService, PMS) 901 can provide information related to the on/off-screen state of the terminal device to the screen-off display module 903 and the system interface (SystemUI) module 904 according to the user's triggering of the power key.
The window management service (WindowManagerService, WMS) 902 can control the hierarchy and display order of the windows and provide corresponding status information to the screen display module 903, the system interface module 904, and the desktop launcher 905. Desktop launcher 905 may control the launching of the desktop based on the obtained state information.
The screen-off display module 903 may send a corresponding control instruction to the wallpaper module 906 according to the obtained state information, and the wallpaper module 906 displays the corresponding screen-off effect in the screen-off display stage according to that instruction.
The system interface module 904 may send a corresponding control instruction to the wallpaper module 906 according to the obtained state information, and the wallpaper module 906 displays the corresponding bright-screen effect in the bright-screen display stage according to that instruction.
Referring to fig. 10, a display control method for a terminal device according to an embodiment of the present application may include the following steps 1001 to 1006:
in step 1001, the user triggers the power key to issue an on/off-screen request.
Step 1002, when the power key is triggered, the lock-screen service triggers the screen-off display module to generate a control instruction containing scene information, according to the scene information collected by the scene monitoring module.
The scene monitoring module may include an ambient light sensor, a weather information acquisition module, a time information acquisition module, a season information acquisition module, and the like for acquiring external scene information, as well as a flight mode module, a mute mode module, a vibration mode module, and the like for acquiring the device mode scene.
In step 1003, the screen-off display module sends the generated control instruction containing the scene information to the wallpaper module.
In step 1004, the wallpaper module listens for control instructions.
In step 1005, the wallpaper module performs filter processing corresponding to the scene information on the wallpaper according to the scene information in the control command.
Depending on the scene information and the device mode information, the filter effect added to the wallpaper may include a rainy-day filter, a sunny-day filter, a snowflake filter, a 24-solar-terms filter, a vibration-effect filter, a flight filter, and the like.
In step 1006, the wallpaper module executes the dynamic-effect display operation according to the control instruction, so as to display the wallpaper with the filter processing applied.
In another embodiment, the wallpaper and the filter-effect image may be displayed separately without modifying the wallpaper itself, producing the same visual effect as wallpaper to which the filter-effect image has been added.
Referring to fig. 11, another display control method for a terminal device according to an embodiment of the present application may include the following steps 1101 to 1107:
step 1101, detecting whether the terminal device meets any preset on/off-screen trigger condition, if yes, executing step 1102, otherwise executing step 1101 again.
The preset on/off-screen trigger conditions may include the power key being triggered, the one-key lock-screen control being triggered, the terminal device being raised, an automatic screen-off operation being performed, and the like.
If the terminal device meets any on/off-screen trigger condition, the terminal device can be considered to undergo an on/off-screen state change, specifically a change from the bright-screen state to the screen-off state or from the screen-off state to the bright-screen state. When the on/off-screen state of the terminal device changes, the corresponding bright-screen wallpaper or screen-off wallpaper is displayed.
Step 1102, determining a requirement type of the on-off screen requirement according to the on-off screen triggering condition satisfied by the terminal equipment.
In step 1103, wallpaper corresponding to the demand type is determined according to the determined demand type.
Step 1104, obtaining reference information, where the reference information includes at least one of a device mode scene of the terminal device and extrinsic scene information of the terminal device.
Step 1105, determining whether a target object corresponding to the reference information needs to be added to the wallpaper according to the acquired reference information; if yes, executing step 1106, otherwise displaying the wallpaper determined for the demand type during the screen-on or screen-off processing corresponding to that demand type.
Step 1106, adding the target object corresponding to the reference information to the wallpaper according to the acquired reference information, thereby obtaining wallpaper with the target object added.
Step 1107, displaying the wallpaper added with the target object in the process of executing the on-screen or off-screen processing corresponding to the requirement type.
In another embodiment, the wallpaper and the target object may be displayed separately without modifying the wallpaper itself, producing the same visual effect as displaying wallpaper with the target object added.
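Steps 1101 to 1107 can be summarized as a small decision flow. The trigger names and the trigger-to-demand mapping below are illustrative assumptions (for instance, a power-key press is assumed here to brighten the screen, although it can also turn the screen off), as is the string representation of wallpapers and target objects.

```java
// Minimal sketch of steps 1101-1107: map the satisfied trigger condition to a
// demand type, pick the matching wallpaper, and decide whether a target object
// derived from the reference information is added before display.
// The trigger->demand mapping and all names are illustrative assumptions.
public class OnOffScreenFlow {

    // Step 1102 analogue: determine the demand type from the trigger condition.
    public static String demandType(String trigger) {
        switch (trigger) {
            case "power-key":      // may toggle either way; assumed bright here
            case "raise-device":
            case "slide-unlock":
                return "bright-screen";
            case "one-key-lock":
            case "auto-off":
                return "off-screen";
            default:
                return "none";
        }
    }

    // Step 1103 analogue: wallpaper corresponding to the demand type.
    public static String wallpaperFor(String demandType) {
        return demandType.equals("bright-screen") ? "bright-screen-wallpaper"
                                                  : "off-screen-wallpaper";
    }

    // Steps 1105-1107 analogue: display the wallpaper, with a target object
    // when the reference information calls for one (empty string = no object).
    public static String display(String demandType, String targetObject) {
        String wallpaper = wallpaperFor(demandType);
        return targetObject.isEmpty() ? wallpaper : wallpaper + "+" + targetObject;
    }
}
```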
An embodiment of the present application provides a display control method for a terminal device, including: determining whether a preset first operation exists, where the first operation includes one of a screen-on triggering operation and a screen-off triggering operation of the terminal device; in the presence of the first operation, acquiring reference information of the terminal device, where the reference information includes at least one of a device mode scene of the terminal device, external scene information of the terminal device, and personalized setting information of the user; generating a target object according to the reference information; and displaying the target object during execution of a second operation corresponding to the first operation, where the second operation includes one of a screen-on operation and a screen-off operation of the terminal device.
For example, the device mode scene of the terminal device may include a flight mode, a mute mode, a vibration mode, and the like, and the external scene information of the terminal device may include season information, time information, weather information, ambient light sensing information, and the like.
In one embodiment, the personalized setting information may include: a time period for adding the target object, the addition priority of object content for the various kinds of information in the reference information, and the like.
In another embodiment, the personalized setting information may include specific information types, such as temperature, time, preset text content, and the like. Based on the personalized setting information, when the on-off screen operation is executed, the current temperature value, the current time, the preset text content and the like can be correspondingly displayed.
The target object may be object content generated from the reference information, and its format differs from that of the original wallpaper; that is, the target object need not be a complete image or animation, but may be any element that can be displayed on the screen, such as an icon, picture, pattern, text, mask, control, or button. The content of the target object is not specifically limited in this embodiment and may include other possibilities not listed here.
For example, when the reference information is extrinsic scene information such as snowy weather, the target object corresponding to the reference information may be image content having a snowy masking effect.
For another example, when the reference information is the personalized setting information of a specific character string, the target object corresponding to the reference information may be the specific character string.
In one implementation, when the terminal device brightens or turns off the screen, the target object and the original bright-screen or screen-off wallpaper may be displayed separately.
Optionally, the target object may be displayed on a layer above the original wallpaper. Visually, this gives the user the effect of the target object being added onto the original bright-screen or screen-off wallpaper.
In another embodiment, the target object may be added to the original bright-screen or screen-off wallpaper to synthesize a new wallpaper, which is then displayed during the screen-on or screen-off process so as to display the target object.
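The two display strategies just described, drawing the target object on a layer above the original wallpaper versus compositing it into a new wallpaper first, can be contrasted in a minimal sketch; the string representation of wallpapers and objects is purely illustrative.

```java
import java.util.Arrays;
import java.util.List;

// Minimal sketch of the two display strategies: separate layers vs. a
// composited second wallpaper. Both produce the same visual result for the
// user; only the number of drawn surfaces differs. Names are illustrative.
public class TargetObjectDisplay {

    // Strategy 1: display the first wallpaper and the target object
    // separately, the target object on the upper layer (drawn last).
    public static List<String> layered(String firstWallpaper, String targetObject) {
        return Arrays.asList(firstWallpaper, targetObject);
    }

    // Strategy 2: add the target object to the first wallpaper to obtain a
    // second wallpaper, then display only the second wallpaper.
    public static List<String> composited(String firstWallpaper, String targetObject) {
        String secondWallpaper = firstWallpaper + "+" + targetObject;
        return Arrays.asList(secondWallpaper);
    }
}
```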
In one embodiment of the application, in the presence of the first operation, the method further comprises: acquiring first wallpaper corresponding to a first operation; the displaying the target object in the process of executing the second operation corresponding to the first operation includes: and respectively displaying the target object and the first wallpaper during the process of executing the second operation corresponding to the first operation.
The first wallpaper may be the original bright-screen or screen-off wallpaper. It may be static or dynamic wallpaper, and it does not include a target object corresponding to the reference information.
In one embodiment, the target object may be located at an upper layer of the first wallpaper for display. The user can see the superimposed display effect of the target object and the first wallpaper.
In one embodiment of the application, in the presence of the first operation, the method further comprises: acquiring first wallpaper corresponding to a first operation; adding a target object on the first wallpaper to obtain a second wallpaper; the display target object includes: and displaying the second wallpaper.
By displaying the second wallpaper, the user can also see the superposition display effect of the target object and the first wallpaper.
In one embodiment of the present application, the personalized setting information includes: the time period of the target object is displayed.
In one embodiment, the time period may be a specific time interval, such as 8:00 to 22:00 each day. Within that interval, the target object corresponding to the reference information is displayed in response to the on/off-screen triggering operation; outside that interval, the target object is not displayed in response to the triggering operation.
In another embodiment, the time period may include a time period type. The time period type may be, for example: a time period corresponding to a screen-off display, a time period corresponding to a screen-on display, a time period corresponding to a desktop display, a time period corresponding to a screen-off display and a screen-on display, a time period corresponding to a specific time interval in a screen-on/off display period (for example, the first n milliseconds of the screen-on/off display period, n being a positive number), and the like.
Taking the time period corresponding to the screen-off display as an example: in response to the on/off-screen triggering operation, the target object corresponding to the reference information is displayed during the screen-off display, and is not displayed at other times of the on/off-screen process, such as during the lock-screen display and the desktop display.
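The specific-interval case above (for example, showing the target object only between 8:00 and 22:00) reduces to a containment check on the current hour. The sketch below is a simplification that works in whole hours and, as an assumption beyond the text, also supports intervals that wrap past midnight.

```java
// Minimal sketch of the time-period setting: the target object is shown only
// when the current hour falls inside the user-configured interval.
// Whole-hour granularity is an illustrative simplification.
public class TargetObjectSchedule {

    // Returns true when `hour` (0-23) lies in [startHour, endHour).
    // Intervals that cross midnight (e.g. 22-6) are supported as well.
    public static boolean shouldShow(int startHour, int endHour, int hour) {
        if (startHour <= endHour) {
            return hour >= startHour && hour < endHour;
        }
        return hour >= startHour || hour < endHour; // wraps past midnight
    }
}
```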
In one embodiment of the present application, the personalized setting information further includes: the information types corresponding to the time periods; the generating the target object according to the reference information includes: generating a target object according to first information conforming to the information type in the reference information; the display target object includes: and displaying the first wallpaper corresponding to the first operation, and displaying the target object in the time period.
In this embodiment, the first wallpaper may be displayed during the on-off screen display, and the target object may be displayed during an intersecting time interval of the on-off screen display and the preset time period.
In one embodiment of the present application, the personalized setting information further includes: the information types corresponding to the time periods; the method further comprises the steps of: adding a target object corresponding to the first information on the first wallpaper corresponding to the first operation according to the time period and the first information conforming to the information type in the reference information to obtain second wallpaper; the display target object includes: the second wallpaper is displayed such that the target object is displayed during the time period.
In this embodiment, the target object may be added to a corresponding portion (such as only the off-screen display portion) in the video stream of the first wallpaper, such that the target object is displayed only for a preset period of time during the on-off screen display.
The information type may be at least one of a part or all of the type of the device mode scene, a part or all of the type of the extrinsic scene information, and a part or all of the type of the personalized setting information, for example.
In one embodiment of the present application, the personalized setting information includes: at least one of priority setting information of a priority between the device mode scene and the extrinsic scene information, a priority between different device mode scenes, and a priority between different extrinsic scene information.
In one embodiment of the present application, the generating the target object according to the reference information includes: acquiring target information with highest priority in the reference information according to the priority setting information in the personalized setting information; and generating a target object according to the target information.
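The priority-based selection described above, picking the highest-priority item of the reference information according to the personalized priority settings and generating the target object from it, can be sketched as follows; the map-based representation and the numeric priority scale are assumptions for illustration.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch of priority-based selection: given reference information and
// per-type priorities from the personalized settings, pick the single
// highest-priority entry as the target information. Names are illustrative.
public class PrioritySelector {

    // referenceInfo: info type -> value (e.g. "weather" -> "snow").
    // priorities: info type -> priority (larger value = higher priority;
    // unlisted types default to 0).
    public static String targetInfo(Map<String, String> referenceInfo,
                                    Map<String, Integer> priorities) {
        String best = null;
        int bestPriority = Integer.MIN_VALUE;
        for (Map.Entry<String, String> e : referenceInfo.entrySet()) {
            int p = priorities.getOrDefault(e.getKey(), 0);
            if (p > bestPriority) {
                bestPriority = p;
                best = e.getValue();
            }
        }
        return best;
    }

    // Small demonstration: weather outranks the device mode scene here.
    public static String demo() {
        Map<String, String> info = new LinkedHashMap<>();
        info.put("device-mode", "flight");
        info.put("weather", "snow");
        Map<String, Integer> prio = new LinkedHashMap<>();
        prio.put("device-mode", 1);
        prio.put("weather", 2);
        return targetInfo(info, prio);
    }
}
```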
In one embodiment of the application, the reference information includes: at least one of equipment mode scene of the terminal equipment and external scene information of the terminal equipment; the generating the target object according to the reference information includes: and generating a target object according to each piece of information in the reference information.
In one embodiment of the application, the first operation includes any one of: triggering the power key, performing one-key screen locking, raising the terminal device, sliding to unlock, and a preset automatic on/off-screen procedure.
In one embodiment of the application, in the presence of the first operation, the method further comprises: determining the demand type of the on/off-screen demand according to the first operation; when the determined demand type is a bright-screen demand, determining that the first wallpaper corresponding to the first operation is bright-screen wallpaper that sequentially displays the screen-off display animation, the lock-screen animation, and the desktop animation; and when the determined demand type is a screen-off demand, determining that the first wallpaper is screen-off wallpaper that sequentially displays the desktop animation, the lock-screen animation, and the screen-off display animation.
One embodiment of the present application also provides an electronic chip mounted in an electronic device (UE), the electronic chip including: a processor for executing computer program instructions stored on a memory, wherein the computer program instructions, when executed by the processor, trigger an electronic chip to perform the method steps provided by any of the method embodiments of the present application.
An embodiment of the present application further proposes a terminal device, which includes a communication module, a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the terminal device to execute the method steps provided by any of the method embodiments of the present application.
An embodiment of the application also proposes a server device comprising a communication module, a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the server device to perform the method steps provided by any of the method embodiments of the application.
An embodiment of the present application also provides an electronic device comprising a plurality of antennas, a memory for storing computer program instructions, a processor for executing the computer program instructions and communication means, such as a communication module enabling 5G communication based on the NR protocol, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method steps provided by any of the method embodiments of the present application.
In particular, in one embodiment of the present application, one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the apparatus, cause the apparatus to perform the method steps described in the embodiments of the present application.
Specifically, in an embodiment of the present application, the processor of the electronic device may be a system on chip (SoC), which may include a central processing unit (CPU) and may further include other types of processors. Specifically, in an embodiment of the present application, the processor of the electronic device may be a PWM control chip.
In particular, in an embodiment of the present application, the processor may include, for example, a CPU, a digital signal processor (DSP) or microcontroller, a graphics processing unit (GPU), an embedded neural-network processing unit (NPU), and an image signal processor (ISP); the processor may further include a necessary hardware accelerator or logic processing hardware circuit, such as an ASIC, or one or more integrated circuits for controlling execution of the programs of the present application. Further, the processor may be capable of running one or more software programs, which may be stored in a storage medium.
In particular, in one embodiment of the application, the memory of the electronic device may be a read-only memory (read-only memory, ROM) or other type of static storage device that can store static information and instructions, a random access memory (random access memory, RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (electrically erasable programmable read-only memory, EEPROM), a compact disc read-only memory (compact disc read-only memory, CD-ROM) or other optical disc storage (including compact disc, laser disc, digital versatile disc, Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other computer-readable medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
In particular, in an embodiment of the present application, the processor and the memory may be combined into a single processing device, although more commonly they are separate components, and the processor is configured to execute the program code stored in the memory to implement the methods according to the embodiments of the present application. In a specific implementation, the memory may also be integrated into the processor or may be separate from the processor.
Further, the devices, apparatuses, and modules illustrated in the embodiments of the present application may be implemented by a computer chip or an entity, or by a product having a certain function.
It will be apparent to those skilled in the art that embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media having computer-usable program code embodied therein.
In the several embodiments provided by the present application, any of the functions, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art or as a part of the technical solution, may be embodied in the form of a software product stored in a storage medium and comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods according to the embodiments of the present application.
In particular, in one embodiment of the present application, there is also provided a computer-readable storage medium having a computer program stored therein, which when run on a computer, causes the computer to perform the method steps provided by the embodiments of the present application.
An embodiment of the application also provides a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method steps provided by the embodiments of the application.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (means) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the elements is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices, or units, which may be in electrical, mechanical, or other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units, implemented in the form of software functional units, may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a Processor (Processor) to perform part of the steps of the methods according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
In embodiments of the present application, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments of the present application are described in a progressive manner; identical or similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the other embodiments. In particular, since the device embodiments are substantially similar to the method embodiments, their description is relatively brief, and reference may be made to the corresponding parts of the description of the method embodiments.
Those of ordinary skill in the art will appreciate that the various elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or as a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
The foregoing description is merely of preferred embodiments of the application and is not intended to limit it; any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the application shall fall within the scope of protection of the application.

Claims (12)

1. A display control method of a terminal device, characterized by comprising:
determining whether a preset first operation exists, wherein the first operation comprises one of a screen-on triggering operation and a screen-off triggering operation of terminal equipment;
responding to the screen-on triggering operation or the screen-off triggering operation, and acquiring reference information corresponding to the terminal equipment, wherein the reference information comprises: an equipment mode scene of the terminal equipment and personalized setting information of a user; the personalized setting information comprises priorities among different equipment mode scenes;
generating a target object according to the reference information, wherein the target object has at least one of a filter effect and a mask effect;
and displaying the target object and first wallpaper corresponding to the first operation in the process of executing the second operation corresponding to the first operation, wherein the second operation comprises one of a screen-on operation and a screen-off operation for the terminal equipment, the first wallpaper corresponding to the first operation is determined to be a screen-on wallpaper sequentially displaying a screen-off display animation, a screen-locking animation and a desktop animation under the condition that the first operation is the screen-on triggering operation, and the first wallpaper is determined to be a screen-off wallpaper sequentially displaying the desktop animation, the screen-locking animation and the screen-off display animation under the condition that the first operation is the screen-off triggering operation.
2. The method according to claim 1, wherein the method further comprises:
acquiring a first wallpaper corresponding to the first operation;
adding the target object on the first wallpaper to obtain a second wallpaper;
the displaying the target object and the first wallpaper corresponding to the first operation includes: and displaying the second wallpaper.
3. The method of claim 1, wherein the personalized setting information comprises: a time period for displaying the target object.
4. The method of claim 3, wherein the personalized settings information further comprises: the information types corresponding to the time periods;
the generating a target object according to the reference information comprises the following steps:
generating a target object according to first information conforming to the information type in the reference information;
the displaying the target object and the first wallpaper corresponding to the first operation includes:
displaying a first wallpaper corresponding to the first operation, and displaying the target object in the time period.
5. The method of claim 3, wherein the personalized settings information further comprises: the information types corresponding to the time periods;
The method further comprises the steps of:
adding a target object corresponding to the first information on a first wallpaper corresponding to the first operation according to the time period and first information conforming to the information type in the reference information to obtain a second wallpaper;
the displaying the target object and the first wallpaper corresponding to the first operation includes: and displaying the second wallpaper, so that the target object is displayed in the time period.
6. The method of claim 1, wherein generating the target object from the reference information comprises:
acquiring target information with highest priority in the reference information according to the priority setting information in the personalized setting information;
and generating a target object according to the target information.
7. The method of claim 1, wherein the reference information comprises: external scene information of the terminal equipment;
the generating a target object according to the reference information comprises the following steps:
and generating a target object according to each piece of information in the reference information.
8. The method of claim 1, wherein the first operation comprises any one of: an operation of triggering a power key, an operation of executing one-key screen locking, an operation of lifting the terminal equipment, a slide-to-unlock operation, and a preset automatic screen-on/screen-off program.
9. A display control apparatus of a terminal device, characterized by comprising:
a determining module, configured to determine whether a preset first operation exists, wherein the first operation comprises one of a screen-on triggering operation and a screen-off triggering operation for the terminal equipment;
an acquisition module, configured to respond to the screen-on triggering operation or the screen-off triggering operation and acquire reference information corresponding to the terminal equipment, wherein the reference information comprises: an equipment mode scene of the terminal equipment and personalized setting information of a user; the personalized setting information comprises priorities among different equipment mode scenes;
a first processing module, configured to generate a target object according to the reference information, wherein the target object has at least one of a filter effect and a mask effect;
a second processing module, configured to display the target object and a first wallpaper corresponding to the first operation in the process of executing a second operation corresponding to the first operation, wherein the second operation comprises one of a screen-on operation and a screen-off operation for the terminal device, wherein in the case that the first operation is the screen-on triggering operation, the first wallpaper corresponding to the first operation is determined to be a screen-on wallpaper that sequentially displays a screen-off display animation, a screen-locking animation, and a desktop animation, and in the case that the first operation is the screen-off triggering operation, the first wallpaper is determined to be a screen-off wallpaper that sequentially displays a desktop animation, a screen-locking animation, and a screen-off display animation.
10. An electronic chip, comprising:
a processor for executing computer program instructions stored on a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the method of any of claims 1-8.
11. An electronic device comprising a memory for storing computer program instructions, a processor for executing the computer program instructions, and communication means, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method of any of claims 1-8.
12. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when run on a computer, causes the computer to perform the method according to any of claims 1-8.
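As a non-normative illustration of the claimed flow, the sketch below maps claims 1, 2, and 6 onto plain Python: on a screen-on or screen-off trigger, reference information (the device mode scenes plus the user's personalized priorities) is gathered, the highest-priority scene drives generation of a target object carrying a filter/mask effect, and that object is overlaid on the first wallpaper to obtain the second wallpaper that is actually displayed. All class names, scene labels, wallpaper strings, and effect mappings here are hypothetical and chosen only for illustration; the patent does not prescribe this code.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical reference information: the device mode scenes currently
# active on the terminal, plus the user's personalized priority settings
# among those scenes (higher number = higher priority).
@dataclass
class ReferenceInfo:
    active_scenes: list
    scene_priority: dict

@dataclass
class TargetObject:
    scene: str
    effect: str  # a filter and/or mask effect, per claim 1

def select_target_scene(ref: ReferenceInfo) -> Optional[str]:
    """Claim 6: select the reference information with the highest priority
    according to the priority settings in the personalized setting information."""
    if not ref.active_scenes:
        return None
    return max(ref.active_scenes, key=lambda s: ref.scene_priority.get(s, 0))

def generate_target_object(ref: ReferenceInfo) -> Optional[TargetObject]:
    scene = select_target_scene(ref)
    if scene is None:
        return None
    # The scene-to-effect mapping below is purely illustrative.
    effect = {"night": "dark_mask", "driving": "blur_filter"}.get(scene, "default_filter")
    return TargetObject(scene=scene, effect=effect)

def compose_second_wallpaper(first_wallpaper: str, target: Optional[TargetObject]) -> str:
    """Claim 2: add the target object onto the first wallpaper to obtain
    the second wallpaper that is displayed."""
    if target is None:
        return first_wallpaper
    return f"{first_wallpaper}+{target.effect}"

def handle_first_operation(operation: str, ref: ReferenceInfo) -> str:
    # Claim 1: the first wallpaper depends on the trigger direction —
    # screen-on plays off-screen display -> lock screen -> desktop,
    # screen-off plays desktop -> lock screen -> off-screen display.
    if operation == "screen_on":
        first_wallpaper = "aod->lock->desktop"
    else:
        first_wallpaper = "desktop->lock->aod"
    return compose_second_wallpaper(first_wallpaper, generate_target_object(ref))
```

For instance, with active scenes `["driving", "night"]` and a user priority favoring `"night"`, a screen-on trigger would yield the screen-on wallpaper sequence overlaid with a dark mask, while a device with no active scene displays the first wallpaper unchanged.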
CN202210843720.9A 2022-07-18 2022-07-18 Display control method, device, chip and equipment of terminal equipment Active CN115357317B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210843720.9A CN115357317B (en) 2022-07-18 2022-07-18 Display control method, device, chip and equipment of terminal equipment

Publications (2)

Publication Number Publication Date
CN115357317A CN115357317A (en) 2022-11-18
CN115357317B true CN115357317B (en) 2023-11-21

Family

ID=84031609

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210843720.9A Active CN115357317B (en) 2022-07-18 2022-07-18 Display control method, device, chip and equipment of terminal equipment

Country Status (1)

Country Link
CN (1) CN115357317B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106412234A (en) * 2016-08-29 2017-02-15 乐视控股(北京)有限公司 Wallpaper replacement method and device
CN107621918A (en) * 2017-09-08 2018-01-23 维沃移动通信有限公司 The method to set up and mobile terminal of breath screen display content
CN107957834A (en) * 2017-11-26 2018-04-24 上海爱优威软件开发有限公司 With the associated terminal unlock method of weather
CN107977276A (en) * 2017-12-20 2018-05-01 维沃移动通信有限公司 A kind of based reminding method of Changes in weather, device and mobile terminal
CN111488091A (en) * 2020-04-16 2020-08-04 深圳传音控股股份有限公司 Interface display method of mobile terminal, mobile terminal and storage medium
CN112148410A (en) * 2020-09-29 2020-12-29 维沃移动通信有限公司 Image display method and electronic equipment
CN113824834A (en) * 2021-08-25 2021-12-21 荣耀终端有限公司 Control method for screen-off display and electronic equipment
CN114003319A (en) * 2020-07-28 2022-02-01 华为技术有限公司 Screen-off display method and electronic equipment
WO2022048506A1 (en) * 2020-09-03 2022-03-10 维沃移动通信有限公司 Wallpaper displaying method, device, and electronic device
CN114244953A (en) * 2020-09-07 2022-03-25 华为技术有限公司 Interface display method and electronic equipment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120030575A1 (en) * 2010-07-27 2012-02-02 Cok Ronald S Automated image-selection system
KR20150011577A (en) * 2013-07-23 2015-02-02 삼성전자주식회사 Device, method and computer readable recording medium for displaying a wallpaper on an electronic device
US20150033193A1 (en) * 2013-07-25 2015-01-29 Here Global B.V. Methods for modifying images and related aspects
CN108475204A (en) * 2016-12-30 2018-08-31 华为技术有限公司 Method, terminal device and the graphic user interface of automatic setting wallpaper

Also Published As

Publication number Publication date
CN115357317A (en) 2022-11-18

Similar Documents

Publication Publication Date Title
CN114679537B (en) Shooting method and terminal
CN113362783B (en) Refresh rate switching method and electronic equipment
WO2022002205A1 (en) Display method and electronic device
CN114095666B (en) Photographing method, electronic device, and computer-readable storage medium
EP4280586A1 (en) Point light source image detection method and electronic device
CN113641271B (en) Application window management method, terminal device and computer readable storage medium
CN113625860A (en) Mode switching method and device, electronic equipment and chip system
CN112684969B (en) Always displaying method and mobile device
WO2020233593A1 (en) Method for displaying foreground element, and electronic device
WO2023207667A1 (en) Display method, vehicle, and electronic device
CN116389884B (en) Thumbnail display method and terminal equipment
CN115357317B (en) Display control method, device, chip and equipment of terminal equipment
CN114500732B (en) Interface display method, electronic equipment and storage medium
CN116110351B (en) Backlight control method, device, chip, electronic equipment and medium
CN116052607B (en) Electronic equipment control method, device, chip, electronic equipment and medium
CN116048831B (en) Target signal processing method and electronic equipment
CN116723384B (en) Process control method, electronic device and readable storage medium
CN115513571B (en) Control method of battery temperature and terminal equipment
CN116208705B (en) Equipment abnormality recovery method and electronic equipment
CN116051351B (en) Special effect processing method and electronic equipment
CN116700578B (en) Layer synthesis method, electronic device and storage medium
CN115619628B (en) Image processing method and terminal equipment
CN115495716B (en) Local authentication method and electronic equipment
CN114115772B (en) Method and device for off-screen display
US20240137659A1 (en) Point light source image detection method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant