CN115357317A - Display control method and device of terminal equipment, chip and equipment

Info

Publication number
CN115357317A
Authority
CN
China
Prior art keywords
screen
information
wallpaper
target object
terminal device
Prior art date
Legal status
Granted
Application number
CN202210843720.9A
Other languages
Chinese (zh)
Other versions
CN115357317B (en)
Inventor
张从飞
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202210843720.9A priority Critical patent/CN115357317B/en
Publication of CN115357317A publication Critical patent/CN115357317A/en
Application granted granted Critical
Publication of CN115357317B publication Critical patent/CN115357317B/en
Status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72427User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations

Abstract

Embodiments of the present application provide a display control method and apparatus, a chip, and a device for a terminal device. The method includes: determining whether a preset first operation exists, where the first operation is one of a screen-on trigger operation and a screen-off trigger operation of the terminal device; if the first operation exists, acquiring reference information of the terminal device, the reference information including at least one of a device mode scene of the terminal device, external scene information of the terminal device, and personalized setting information of the user; generating a target object according to the reference information; and displaying the target object while executing a second operation corresponding to the first operation, where the second operation is one of a screen-on operation and a screen-off operation of the terminal device. By displaying the target object during the screen-on/off process, the method and device enrich the wallpaper display effect and improve the user experience.

Description

Display control method and device of terminal equipment, chip and equipment
Technical Field
The present application relates to the field of display control, and in particular, to a display control method, device, chip, and device for a terminal device.
Background
Terminal devices provide a wallpaper function that lets users set the displayed wallpaper as desired, meeting users' personalized requirements for the wallpaper display effect.
At present, terminal devices offer a selection of static and dynamic wallpapers, and users can also use custom pictures, such as photos, as wallpaper.
However, the wallpaper displayed by the terminal device remains monotonous, resulting in a poor user experience.
Disclosure of Invention
Embodiments of the present application provide a display control method and apparatus, a chip, and a device for a terminal device.
In a first aspect, an embodiment of the present application provides a display control method for a terminal device, including: determining whether a preset first operation exists, wherein the first operation comprises one of a screen-on trigger operation and a screen-off trigger operation of the terminal device; if the first operation exists, acquiring reference information corresponding to the terminal device, wherein the reference information comprises at least one of: a device mode scene of the terminal device, external scene information of the terminal device, and personalized setting information of a user; generating a target object according to the reference information; and displaying the target object while executing a second operation corresponding to the first operation, wherein the second operation comprises one of a screen-on operation and a screen-off operation of the terminal device.
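Viewed as control flow, the method of the first aspect is a four-step sequence: detect the trigger, gather context, build the target object, and show it during the screen transition. The Java sketch below illustrates only that flow; every name in it is a hypothetical illustration, not an API from the patent or from Android.

```java
// A minimal sketch of the first-aspect flow, using hypothetical types; none
// of these names come from the patent or the Android SDK.
public class DisplayController {

    enum Trigger { SCREEN_ON, SCREEN_OFF }

    /** Called when a preset first operation is detected. */
    void onFirstOperation(Trigger trigger) {
        ReferenceInfo info = acquireReferenceInfo();      // device mode scene, external scene, user settings
        TargetObject target = generateTargetObject(info); // e.g., a mask or filter overlay
        executeSecondOperation(trigger, target);          // display the target during screen-on/off
    }

    ReferenceInfo acquireReferenceInfo() { return new ReferenceInfo(); }                 // stub
    TargetObject generateTargetObject(ReferenceInfo info) { return new TargetObject(); } // stub
    void executeSecondOperation(Trigger trigger, TargetObject target) { }                // stub

    static class ReferenceInfo { }
    static class TargetObject { }
}
```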
Optionally, in the presence of the first operation, the method further comprises: acquiring first wallpaper corresponding to the first operation; the displaying the target object in the process of executing a second operation corresponding to the first operation comprises: and respectively displaying the target object and the first wallpaper in the process of executing a second operation corresponding to the first operation.
Optionally, in the presence of the first operation, the method further comprises: acquiring first wallpaper corresponding to the first operation; adding the target object on the first wallpaper to obtain a second wallpaper; the displaying the target object comprises: and displaying the second wallpaper.
Optionally, the personalized setting information includes: a time period for displaying the target object.
Optionally, the personalized setting information further includes: the information type corresponding to the time period; the generating of the target object according to the reference information includes: generating a target object according to first information which accords with the information type in the reference information; the displaying the target object comprises: displaying a first wallpaper corresponding to the first operation and displaying the target object in the time period.
Optionally, the personalized setting information further includes: the information type corresponding to the time period; the method further comprises the following steps: according to the time period and first information which accords with the information type in the reference information, adding a target object corresponding to the first information on first wallpaper corresponding to the first operation to obtain second wallpaper; the displaying the target object comprises: displaying the second wallpaper such that the target object is displayed within the time period.
Optionally, the personalized setting information includes: at least one priority setting information of the priority between the device mode scene and the external scene information, the priority between different device mode scenes and the priority between different external scene information.
Optionally, the generating a target object according to the reference information includes: acquiring target information with the highest priority in the reference information according to priority setting information in the personalized setting information; and generating a target object according to the target information.
Optionally, the reference information includes: at least one of a device mode scene of the terminal device and external scene information of the terminal device; the generating of the target object according to the reference information includes: and generating a target object according to each information in the reference information.
Optionally, the first operation comprises: any one of the operations of triggering a power key, executing a one-key screen locking operation, lifting the terminal equipment, sliding unlocking operation and a preset automatic screen on-off program.
Optionally, in the presence of the first operation, the method further comprises: determining a requirement type of the screen-on/off requirement according to the first operation; if the determined requirement type is a screen-on requirement, determining that the first wallpaper corresponding to the first operation is a screen-on wallpaper that sequentially displays an always-on display (AOD) animation, a screen-locking animation, and a desktop animation; and if the determined requirement type is a screen-off requirement, determining that the first wallpaper is a screen-off wallpaper that sequentially displays a desktop animation, a screen-locking animation, and an AOD animation.
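As a sketch of this branch (hypothetical names; the patent prescribes no API), the requirement type simply selects the playback order of the three animation stages:

```java
// Hypothetical sketch: choosing the first wallpaper by requirement type.
public class WallpaperSelector {

    enum Requirement { SCREEN_ON, SCREEN_OFF }

    /** Returns the animation stages of the first wallpaper, in display order. */
    String[] selectFirstWallpaper(Requirement requirement) {
        if (requirement == Requirement.SCREEN_ON) {
            // Screen-on wallpaper: AOD animation -> lock-screen animation -> desktop animation.
            return new String[] { "aod", "lock", "desktop" };
        }
        // Screen-off wallpaper: the reverse order.
        return new String[] { "desktop", "lock", "aod" };
    }
}
```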
In a second aspect, an embodiment of the present application provides a display control apparatus for a terminal device, including: the terminal equipment comprises a determining module, a judging module and a judging module, wherein the determining module is used for determining whether a preset first operation exists or not, and the first operation comprises one of screen-on triggering operation and screen-off triggering operation of the terminal equipment; an obtaining module, configured to obtain reference information corresponding to the terminal device when the first operation exists, where the reference information includes: at least one of a device mode scene of the terminal device, external scene information of the terminal device, and personalized setting information of a user; the first processing module is used for generating a target object according to the reference information; and the second processing module is used for displaying the target object in the process of executing a second operation corresponding to the first operation, wherein the second operation comprises one of screen on operation and screen off operation of the terminal equipment.
In a third aspect, an embodiment of the present application provides an electronic chip, including: a processor for executing computer program instructions stored on a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the method of any of the first aspects.
In a fourth aspect, embodiments of the present application provide an electronic device comprising a memory for storing computer program instructions, a processor for executing the computer program instructions, and a communication apparatus, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method according to any one of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer-readable storage medium having stored therein a computer program, which when run on a computer, causes the computer to perform the method according to any one of the first aspects.
In a sixth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method of any one of the first aspects.
According to the method and the device, the target object is displayed in the process of executing the screen on-off operation, the wallpaper display effect can be enriched, and the use experience of a user is improved.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a screen-on process of a terminal device according to an embodiment of the present application;
Fig. 3 is a schematic diagram of a screen-off process of a terminal device according to an embodiment of the present application;
Fig. 4 is a framework diagram for implementing wallpaper display according to an embodiment of the present application;
Fig. 5 is a schematic flowchart of a display control method of a terminal device for a screen-off requirement according to an embodiment of the present application;
Fig. 6 is a schematic flowchart of a display control method of a terminal device for a screen-on requirement according to an embodiment of the present application;
Fig. 7 is a timing diagram of monitoring the screen on/off state of a terminal device according to an embodiment of the present application;
Fig. 8 is a timing diagram of a display control method of a terminal device according to an embodiment of the present application;
Fig. 9 is a schematic diagram of an implementation of displaying wallpaper on a mobile phone according to an embodiment of the present application;
Fig. 10 is a schematic flowchart of a display control method of a terminal device according to an embodiment of the present application;
Fig. 11 is a schematic flowchart of another display control method of a terminal device according to an embodiment of the present application.
Detailed Description
For better understanding of the technical solutions of the present application, the following detailed descriptions of the embodiments of the present application are provided with reference to the accompanying drawings.
It should be understood that the embodiments described are only a few embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terminology used in the embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the examples of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that in the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. The term "and/or" merely describes an association between associated objects and covers three relationships; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the preceding and following objects. "At least one of the following" and similar expressions refer to any combination of the listed items, including any combination of single or plural items. For example, "at least one of a, b, and c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
It should be understood that although the terms first, second, etc. may be used in the embodiments of the present application to describe set thresholds, these set thresholds should not be limited to these terms. These terms are used only to distinguish the set thresholds from each other. For example, the first set threshold value may also be referred to as a second set threshold value, and similarly, the second set threshold value may also be referred to as a first set threshold value, without departing from the scope of the embodiments of the present application.
The terminology used in the description of the embodiments section of the present application is for the purpose of describing particular embodiments of the present application only and is not intended to be limiting of the present application.
The display control method of a terminal device provided by any embodiment of the present application can be applied to the electronic device 100 shown in Fig. 1. Fig. 1 shows a schematic structural diagram of the electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code Division Multiple Access (CDMA), wideband Code Division Multiple Access (WCDMA), time division code division multiple access (time-division multiple access, TD-SCDMA), long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a variety of types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but have different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C. The magnetic sensor 180D includes a hall sensor. The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, the electronic device 100 may utilize the distance sensor 180F to range to achieve fast focus. The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The ambient light sensor 180L is used to sense ambient light brightness. The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on. The temperature sensor 180J is used to detect temperature.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to or detached from the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the cards may be of the same or different types. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.

The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
To attract users, enrich dynamic wallpaper effects, and improve user experience, a specific target object may be dynamically added to the wallpaper displayed by the terminal device (from the user's visual perspective this appears as an image-processing effect; this is not repeated below). For example, a target object with a mask effect, a filter effect, or the like may be added, so that the terminal device effectively displays the wallpaper with the specific target object added. Alternatively, the specific target object may be displayed directly on the terminal device without displaying any wallpaper.
To achieve this image-adding effect, the wallpaper and the target object can be displayed as separate, overlapped layers, so that the user sees the wallpaper with the target object added. Alternatively, the wallpaper can be processed according to the target object and the processed wallpaper displayed, so that the user likewise sees the wallpaper with the target object added.
The terminal device may be an electronic device such as a smartphone, a tablet, or a personal computer.
The wallpaper displayed by the terminal device may be static or dynamic, and may be flat (2D) or stereoscopic (3D).
In one implementation, the terminal device may display 3D (three-dimensional) super wallpaper. Compared with a 2D picture, 3D technology makes the picture stereoscopic and vivid; super wallpaper can bring the user an ultra-high-definition, visually striking experience.
The target object may be embodied in terms of color, brightness, text content, pattern, and the like.
To meet the user's requirements for the information displayed on the wallpaper and improve the user experience, the corresponding target object can be added according to at least part or all of: the device mode scene of the terminal device, the external scene information of the terminal device (i.e., information about the environment in which the terminal device, and hence the user, is located), and the personalized setting information of the user.
The device mode scene of the terminal device can be flight mode, silent mode, vibration mode, or another mode. The external scene information of the terminal device can be season information, time information, weather information, ambient light perception information, or other external information. The personalized setting information of the user may include the time period in which the target object should be added (so that the target object is displayed at the corresponding time), the addition priority of the various kinds of information in the reference information (the priority decides which information in the reference information is used to generate the target object), and the like.
The priority setting information may include priority settings between the device mode scene and the external scene information, between different device mode scenes, and between different pieces of external scene information.
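A possible data shape for the reference information and the personalized settings described above is sketched below; the patent defines no concrete fields, so every name here is invented for illustration.

```java
// Hypothetical data shapes for the reference information; the patent does
// not define concrete fields, so everything here is illustrative.
import java.time.LocalTime;
import java.util.EnumSet;
import java.util.Map;

public class ReferenceInfo {
    enum DeviceMode { DARK_LIGHT, SILENT, VIBRATION, FLIGHT, EBOOK, SIMPLE, EYE_PROTECTION, LOW_BATTERY }
    enum ExternalScene { WEATHER, TIME, SEASON, AMBIENT_LIGHT }

    EnumSet<DeviceMode> activeModes;          // device mode scenes currently in effect
    Map<ExternalScene, String> sceneValues;   // e.g., WEATHER -> "rainy"
    PersonalSettings settings;

    static class PersonalSettings {
        LocalTime periodStart, periodEnd;          // time period for displaying the target object
        Map<DeviceMode, Integer> modePriority;     // lower number = higher priority
        Map<ExternalScene, Integer> scenePriority;
        Integer modesVsScenesPriority;             // priority between mode scenes and external scenes
    }
}
```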
For example, the priority of "rainy day" weather information may be set higher than the priority of any season information.
Based on this priority setting, when the collected reference information includes both "rainy day" and "spring", the target object may be generated according to the "rainy day" weather information, which has the relatively highest priority, rather than the "spring" season information.
For another example, priority setting information for different device mode scenes may be set. In one implementation, it may be as follows:
First priority: the "dark/light mode" device mode scene;
Second priority: the three device mode scenes "silent mode", "vibration mode", and "response mode";
Third priority: the five device mode scenes "flight mode", "e-book mode", "simple mode", "eye-protection mode", and "low battery mode".
Wherein "dark/light mode" may be used to identify that the electronic device is displaying information based on a dark/light interface (such as a black/white interface).
Based on this priority setting information, for example, if three device mode scenes of "dark/light mode", "silent mode", and "eye-protection mode" are captured, the target object is generated in the "dark/light mode" having a relatively highest priority.
Based on this priority setting information, for example, if two device mode scenes, that is, a "vibration mode" and a "low battery mode", are acquired, the target object is generated in the "vibration mode" having the relatively highest priority.
For another example, priority setting information for different pieces of external scene information may be set. In one implementation, it may be as follows:
First priority: the three kinds of external scene information "weather information", "time information" (e.g., holidays, solar terms), and "season information" (the four seasons);
Second priority: the external scene information "ambient light perception information" (e.g., bright/dark environment).
Based on this priority setting information, if, for example, the two kinds of external scene information "weather information" and "ambient light perception information" are collected, the target object is generated according to "weather information", which has the relatively highest priority.
Based on this priority setting information, if, for example, the three kinds of external scene information "time information", "season information", and "ambient light perception information" are collected, the target object is generated according to "time information" and "season information", which have the relatively highest priority.
Based on this priority setting information, if, for example, only the external scene information "ambient light perception information" is collected, the target object is generated according to "ambient light perception information", which then has the relatively highest priority.
In this embodiment, it is considered that, in the same item of priority setting information, target objects corresponding to various pieces of information of the same priority do not conflict with each other, and these target objects may be displayed at the same time.
It should be noted that priority may be set for all pieces of information in the reference information, or for only some pieces, with no priority set for the rest. For example, only the device mode scenes may be prioritized (e.g., silent mode, eye-protection mode, and power-saving mode in decreasing priority), while other information, such as the external scene information, is not prioritized. This is not limited in this application.
In one implementation, for the information with priorities set, the highest-priority information may be used to generate the target object; for the information without priorities set, each piece of information may be used to generate the target object, as shown in the sketch below.
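A sketch of this selection rule follows: items with a configured priority compete, and only those sharing the highest priority survive; items without a configured priority are always used. The generic helper below is illustrative only.

```java
// Illustrative helper implementing the selection rule described above.
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class TargetSelector {

    /** Keeps items sharing the highest configured priority, plus all items without one. */
    static <T> List<T> selectForTarget(List<T> items, Map<T, Integer> priority) {
        List<T> selected = new ArrayList<>();
        int best = Integer.MAX_VALUE;
        for (T item : items) {
            Integer rank = priority.get(item);
            if (rank == null) {
                selected.add(item);  // no priority configured: always used
            } else if (rank < best) {
                best = rank;         // track the highest (lowest-numbered) priority
            }
        }
        for (T item : items) {
            Integer rank = priority.get(item);
            // Items of equal highest priority do not conflict; all are kept.
            if (rank != null && rank == best) {
                selected.add(item);
            }
        }
        return selected;
    }
}
```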
As the device mode scene switches and the external scene information changes, the corresponding information displayed on the wallpaper changes accordingly, so that the target object responds in a timely manner and the wallpaper display effect matches the user's needs in real time.
Based on the above, the target object can be dynamically added to the wallpaper in at least any one of the following manners (see the sketch after this list):
Mode 1: dynamically add a corresponding target object to the wallpaper according to the device mode scene of the terminal device.
For example, when the terminal device is in flight mode and the screen is being turned on or off, 3D super wallpaper with a filter effect corresponding to flight mode is displayed.
Mode 2: dynamically add a corresponding target object to the wallpaper according to the external scene information of the terminal device.
For example, when the external scene of the terminal device is a rainy day and the screen is being turned on or off, 3D super wallpaper with a filter effect corresponding to the rainy day is displayed.
Mode 3: dynamically add corresponding target objects to the wallpaper according to both the device mode scene and the external scene information of the terminal device.
For example, when the terminal device is in silent mode, the external scene is a sunny day, and the screen is being turned on or off, 3D super wallpaper with filter effects corresponding to both silent mode and the sunny day is displayed.
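The sketch below illustrates Modes 1-3 as a simple mapping from collected scene information to filter effects; the effect names and the String-based weather value are invented for illustration.

```java
// Illustrative sketch of Modes 1-3: mapping scene information to filter effects.
import java.util.ArrayList;
import java.util.List;

public class FilterComposer {

    /** Mode 3: both the device mode scene and the external scene contribute. */
    List<String> composeEffects(boolean silentMode, String weather) {
        List<String> effects = new ArrayList<>();
        if (silentMode) {
            effects.add("silent-mode filter");   // Mode 1: from the device mode scene
        }
        if ("sunny".equals(weather)) {
            effects.add("sunny-day filter");     // Mode 2: from the external scene
        }
        return effects; // all applicable effects are applied to the 3D super wallpaper
    }
}
```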
Optionally, the screen can be turned on or off by triggering the power key, by a preset automatic screen-on/off program, or the like. The terminal device displays wallpaper synchronously during the screen-on/off process; specifically, it can display the 3D super wallpaper with the target object added.
The wallpaper displayed during the screen-on/off process can be animated wallpaper. Referring to Fig. 2, a continuous animation from the AOD (always-on display) to the desktop can be played during the screen-on process. Referring to Fig. 3, a continuous animation from the desktop to the AOD can be played during the screen-off process.
The AOD displays content such as the time, incoming calls, messages, battery information, and push messages by using the CPU to light only part of the screen's pixels, without lighting the entire display. Bright-screen display, by contrast, lights the entire display, with the CPU controlling all lit pixels to display information.
Optionally, in a screen-on application scenario of the terminal device, the user may swipe up to unlock. For example, in one feasible trigger scenario, the user clicks the power key on the always-on display interface of the terminal device to enter the lock screen, and if lock-screen face unlock is set, the device enters the desktop directly.
As shown in Fig. 2, in a screen-on application scenario, the terminal device may enter the desktop from the always-on display (AOD). Referring to Fig. 2, while the screen is off, the AOD image may be displayed continuously. If the user clicks the power key to request a screen-on while the AOD image is displayed, the AOD image is no longer shown and bright-screen display starts: the screen-locking animation is displayed first, and the desktop is displayed after the user performs a swipe-up operation. For the screen-locking animation, each frame is displayed in order: the first frame first and the last frame last.
To ensure animation continuity and improve the user experience during screen-on, referring to Fig. 2, an AOD animation may be displayed before bright-screen display starts. The first frame of the AOD animation may be the AOD image displayed before the user clicks the power key shown in Fig. 2, and the last frame of the AOD animation visually matches or joins the first frame of the screen-locking animation. After the user performs the swipe-up operation, the desktop animation may be displayed; its first frame is the same as or joins the last frame of the screen-locking animation, and its last frame is the same as the desktop image displayed after the swipe-up operation shown in Fig. 2.
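On Android, one way to realize this stage-to-stage continuity is to build the three animation stages with matching end/start frames and play them back to back, for example with AnimatorSet (a real Android API); the sketch below is illustrative only and is not the patent's implementation.

```java
// Illustrative Android sketch: chaining AOD, lock-screen, and desktop
// animations so each stage starts where the previous one ended. The three
// animators are assumed to be built elsewhere with matching end/start frames.
import android.animation.Animator;
import android.animation.AnimatorSet;

public class ScreenOnAnimation {
    void play(Animator aodAnim, Animator lockAnim, Animator desktopAnim) {
        AnimatorSet set = new AnimatorSet();
        // Sequential playback: AOD animation -> lock-screen animation -> desktop animation.
        set.playSequentially(aodAnim, lockAnim, desktopAnim);
        set.start();
    }
}
```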
Fig. 2 shows an example of animation in which the terminal device enters the desktop from the information screen display, and in other usage scenarios, the terminal device may also enter any application on the terminal device from the information screen display, and at this time, the terminal device displays the screen locking animation first and then displays an interface image of the application. For the animation display process in the use scenario, reference is made to the above description of entering the desktop, and this embodiment is not described herein again.
In a screen-off application scenario of the terminal device, the power key can be clicked to turn off the screen. Thus, in one feasible trigger scenario, the user may click the power key to request a screen-off while the terminal device displays the desktop interface. In other feasible trigger scenarios, the screen can be turned off by one-key screen lock, screen timeout, and the like.
As shown in Fig. 3, in the screen-off application scenario, the terminal device may exit from the desktop to the always-on display (AOD). Referring to Fig. 3, when the terminal device displays the desktop interface, the user may click the power key to request a screen-off. After detecting this operation, the terminal device can display the screen-locking animation first and then the AOD image. For the screen-locking animation, each frame is displayed in order: the first frame first and the last frame last.
Because the screen-on and screen-off are two relative functions, the screen-locking animation during screen-off can be regarded as reverse playing of the screen-locking animation during screen-on.
To ensure animation continuity and improve the user experience during screen-off, referring to Fig. 3, the desktop animation can be displayed after the user clicks the power key and before the screen-locking animation. The first frame of the desktop animation is the same as the desktop image displayed before the user clicks the power key, and its last frame is the same as or joins the first frame of the screen-locking animation. After the screen-locking animation, the AOD animation may be displayed; the first frame of the AOD animation visually matches or joins the last frame of the screen-locking animation, and its last frame may be the AOD image shown in Fig. 3.
Since the screen-locking animation uses bright-screen display, which differs from the AOD display mode, referring to Fig. 3, the screen can go black after bright-screen display ends to turn off all pixels, and local lighting is then performed on the black screen to display the AOD image. The duration of this black screen is typically 260-300 ms.
In other implementations, the black-screen transition can be optimized so that bright-screen display and AOD display switch directly. For example, referring to Fig. 2, no black screen need occur when entering bright-screen display from the AOD.
Because the AOD lights only part of the screen while bright-screen display lights all of it, even if the image shown on the AOD looks identical to the image shown during bright-screen display, the pixel lighting conditions of the two differ. For example, for a black area at the same position in both images, the corresponding pixels are unlit (and thus appear black) on the AOD, but are lit and rendered black during bright-screen display.
Consequently, if the image processing effect of the added target object on the AOD is to look identical to that during bright-screen display, the corresponding image processing operations must differ.
Taking a filter effect as an example, referring to Fig. 2, the filter effect obtained through image processing operation 1 may be added on the AOD, and the filter effect obtained through image processing operation 2 (different from image processing operation 1) may be added to each frame of the screen-locking animation, so that the filter effect in the screen-on animation the user sees remains visually consistent throughout.
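As a sketch of why two operations are needed: the AOD stage can compensate for its unlit pixels with a stronger tint than the bright-screen stage, so the two stages look alike. The compensation values below are invented, and this is only one plausible way to realize the effect, not the patent's method.

```java
// Sketch: applying a different tint strength per stage so the same visual
// filter effect appears on the AOD (partially lit) and on the bright screen
// (fully lit). The alpha values are hypothetical compensation factors.
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.PorterDuff;
import android.graphics.PorterDuffColorFilter;

public class FilterRenderer {
    Bitmap applyFilter(Bitmap frame, boolean aodStage, int filterColor) {
        Bitmap out = frame.copy(Bitmap.Config.ARGB_8888, true);
        Paint paint = new Paint();
        // Image processing operation 1 (AOD) uses a stronger tint than
        // operation 2 (bright screen) to compensate for unlit pixels.
        int alpha = aodStage ? 0xB0 : 0x60; // hypothetical values
        paint.setColorFilter(new PorterDuffColorFilter(
                (alpha << 24) | (filterColor & 0x00FFFFFF), PorterDuff.Mode.SRC_OVER));
        new Canvas(out).drawBitmap(frame, 0, 0, paint);
        return out;
    }
}
```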
As shown in Fig. 4, this embodiment provides a framework diagram for displaying wallpaper when the terminal device turns the screen on or off. Fig. 4 shows an application layer 410 and an application framework (FWK) layer 420.
The application layer 410 can include a theme module 411, a wallpaper module 412, an information screen display module (i.e., AOD module) 417, a system interface module (SystemUI) 418, and a desktop launcher (Launcher) 419. The application framework layer 420 may include a theme switching module (ThemeManager) 421, a theme wallpaper management service (WallpaperManagerService) 422, a screen saver service (DreamService) 423, a power key management service (PowerManagerService) 424, and a window management service (WindowManagerService) 425.
The theme module 411 may implement presentation of a theme, setting of a theme, preview of a theme, and the like.
Wallpaper module 412 may include a stereoscopic wallpaper module 413, a wallpaper service (SuperWallpaperService) 414, an engine (Engine) 415, and an animation generation service 416. The stereoscopic wallpaper module 413 provides basic wallpaper display services; the wallpaper service 414 manages wallpaper parameters, such as wallpaper size, and can send dynamic wallpaper generated by the animation generation service 416 to the stereoscopic wallpaper module 413 for display; the engine 415 generates each frame of static or dynamic wallpaper according to the wallpaper parameters; and the animation generation service 416 can generate dynamic wallpaper through Video & OpenGL techniques.
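The service/engine split described here resembles Android's live-wallpaper mechanism. Purely as an illustration of that split (the patent's SuperWallpaperService is vendor-internal), a minimal Android WallpaperService engine looks like this:

```java
// Minimal Android live-wallpaper skeleton, shown only to illustrate the
// service/engine split described above; it is not the patent's code.
import android.graphics.Canvas;
import android.service.wallpaper.WallpaperService;
import android.view.SurfaceHolder;

public class SketchWallpaperService extends WallpaperService {
    @Override
    public Engine onCreateEngine() {
        return new SketchEngine();
    }

    class SketchEngine extends Engine {
        @Override
        public void onVisibilityChanged(boolean visible) {
            if (visible) drawFrame(); // redraw when the wallpaper becomes visible
        }

        private void drawFrame() {
            SurfaceHolder holder = getSurfaceHolder();
            Canvas canvas = holder.lockCanvas();
            if (canvas != null) {
                // A real engine would render one wallpaper frame here (a static
                // image or one frame of the generated animation).
                holder.unlockCanvasAndPost(canvas);
            }
        }
    }
}
```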
The SystemUI application is a persistent process that provides the user with a set of UI (user interface) components for system-level information display and interaction.
A variety of themes may be provided in the theme module 411 for selection by the user, and the user may request to change the theme or only request to change the theme wallpaper.
When the user requests to change the theme, the theme switching module 421 may provide the related information of the changed theme to the wallpaper module 412, and the theme wallpaper management service 422 may provide the related information of the changed theme wallpaper to the wallpaper module 412, so that the wallpaper module 412 may update the used wallpaper correspondingly.
When a user updates the theme wallpaper, the theme wallpaper management service 422 may provide information regarding the changed theme wallpaper to the wallpaper module 412 so that the wallpaper module 412 may perform a corresponding update on the used wallpaper.
The screen saver service 423 can automatically turn off the screen of the terminal device through a low-power mode, i.e., DOZE mode. It can also automatically light the screen when, during screen-off, the terminal device receives an incoming call, a short message, or the like. The screen saver service 423 can provide information about the on/off screen state of the terminal device to the information screen display module 417, the system interface module 418, and the desktop launcher 419.
The power key management service 424 may provide information about the on/off screen state of the terminal device to the information screen display module 417, the system interface module 418, and the desktop launcher 419 according to the user's triggering of the power key. The user's power key operation can reflect whether the terminal device is unlocked.
The window management service 425 may control the hierarchy and display order of windows and provide corresponding state information to the information screen display module 417, the system interface module 418, and the desktop launcher 419.
The information screen display module 417 may send a corresponding control instruction to the wallpaper module 412 according to the obtained state information, and the wallpaper module 412 displays a corresponding information screen effect at an information screen display stage according to the control instruction.
The system interface module 418 may send a corresponding control instruction to the wallpaper module 412 according to the obtained status information, and the wallpaper module 412 displays a corresponding bright screen effect in a bright screen display stage according to the control instruction.
The desktop launcher 419 may control the launching of the desktop according to the obtained state information.
Optionally, the transfer of state information between different modules may be based on the same information transfer channel.
Optionally, the information screen display module 417 and the system interface module 418 may use the same message delivery channel to deliver control instructions.
In one implementation, reference information is first obtained, where the reference information may reflect part or all of the device mode scene, the external scene information, and the personalized setting information. The information screen display module 417 and the system interface module 418 may add the reference information to the control instruction to be sent, so as to pass the reference information to the wallpaper module 412. The wallpaper module 412 may add a corresponding target object to the original on/off-screen wallpaper according to the reference information in the control instruction to obtain a new wallpaper, and display the new wallpaper to which the target object has been added. Alternatively, the target object can be generated according to the reference information, and the original on/off-screen wallpaper and the generated target object can be displayed separately.
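As a concrete sketch of this instruction path, the fragment below packs reference information into the Bundle of a wallpaper command. WallpaperManager.sendWallpaperCommand is a public Android API, but the action string and extra keys here are illustrative assumptions, not the identifiers actually used by modules 417, 418, and 412.

```java
import android.app.WallpaperManager;
import android.os.Bundle;
import android.os.IBinder;

public final class ReferenceInfoSender {

    // hypothetical action name for a control instruction carrying reference information
    public static final String ACTION_REF_INFO = "com.example.action.REF_INFO";

    public static void sendReferenceInfo(WallpaperManager wm, IBinder windowToken,
                                         String deviceMode, String weather, String time) {
        Bundle extras = new Bundle();
        extras.putString("device_mode", deviceMode); // e.g. "airplane", "mute"
        extras.putString("weather", weather);        // e.g. "snow", "rain"
        extras.putString("time", time);              // e.g. "22:15"
        // the x, y, z positional arguments are unused for this kind of command
        wm.sendWallpaperCommand(windowToken, ACTION_REF_INFO, 0, 0, 0, extras);
    }
}
```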
In one implementation, the reference information added by the information screen display module 417 and the system interface module 418 is the same. Therefore, during the screen-on/off process of the terminal device, the image processing effect of adding the target object to the wallpaper in the screen-off display process can be the same as that in the bright-screen display process.
Based on the user's personalized settings, the two image processing effects may also differ. In one implementation, the reference information added by the information screen display module 417 and the system interface module 418 is different, so that during the screen-on/off process of the terminal device, the image processing effect of adding the target object to the wallpaper in the screen-off display process can differ from that in the bright-screen display process.
In a possible implementation, the information screen display module 417 may notify the wallpaper module 412 of the start time of the information screen display, and the system interface module 418 may notify the wallpaper module 412 of the start time of the bright-screen display.
In another possible implementation, a separate monitoring module may be provided; when the monitoring module detects the reference information, it may send the reference information to the wallpaper module 412, so that the information screen display module 417 and the system interface module 418 do not need to send it.
Referring to fig. 5, a display control method for a terminal device with a screen-off requirement may include the following steps 501 to 508:
Step 501, detecting whether any preset screen-on/off triggering operation exists, where the preset screen-on/off triggering operations at least include an operation of triggering a power key, an operation of executing one-key screen locking, and an operation of raising the terminal device; if yes, step 502 is executed; otherwise, step 501 is executed again.
The key corresponding to one-key screen locking may be a screen-locking control displayed on the screen of the terminal device. When a user holding the terminal device performs a hand-raising action, a data processing result reflecting whether an operation of raising the terminal device exists can be obtained based on the sensing data of a sensor in the terminal device.
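A heavily simplified sketch of such sensor-based detection is given below, assuming a bare accelerometer threshold; real raise-to-wake detection fuses several sensors and is considerably more robust, so this is only an illustration of the data path.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class RaiseToWakeDetector implements SensorEventListener {

    private static final float RAISE_THRESHOLD = 3.0f; // m/s^2 jump on the z axis, assumed
    private float lastZ = Float.NaN;
    private final Runnable onRaise;

    public RaiseToWakeDetector(SensorManager sm, Runnable onRaise) {
        this.onRaise = onRaise;
        sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_UI);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float z = event.values[2];
        // a sharp increase along z is taken as the device being raised toward the user
        if (!Float.isNaN(lastZ) && z - lastZ > RAISE_THRESHOLD) {
            onRaise.run(); // treat as the "raise the terminal device" trigger
        }
        lastZ = z;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // not needed for this sketch
    }
}
```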
Step 502, determining a requirement type of the screen on-off requirement according to the detected screen on-off triggering operation.
Step 503, if the determined requirement type is a screen-off requirement, determining a screen-locking screen-off process, and executing step 504 and step 505.
The life cycle of the lock-and-extinguish flow may include displaying the lock-screen wallpaper and then extinguishing the screen.
Step 504, controlling the starting of the information screen display process according to the determined lock-and-extinguish flow.
The information screen display process is usually started only after a period of time; the following steps 505 to 508 can be executed during this period to determine the screen-off animation effect, and the display process from the lock-screen animation to the screen-off animation can be realized based on the determined effect.
In addition, after the information screen display process is started, the screen first stays black for a period of time (e.g., 260 ms to 300 ms); the following steps 505 to 508 are performed during this black-screen period to determine the information screen animation effect, and the information screen display is then performed based on the determined effect, so that the display process from the lock-screen animation, through the black screen, to the information screen display animation shown in fig. 3 can be implemented.
Step 505, after starting the information screen display process, determining a screen-off dynamic effect control instruction according to the determined lock-and-extinguish flow, and executing step 506.
In one implementation, the screen-off dynamic effect control instruction may include a control instruction generated by the information screen display module 417 and a control instruction generated by the system interface module 418 in fig. 4, so as to implement the screen-off wallpaper display with a dynamic effect.
Step 506, determining whether a filter effect needs to be added according to the acquired reference information, where the reference information includes a device mode scene and external scene information; if a filter effect needs to be added, step 507 is executed, otherwise step 508 is executed.
For example, the acquired device mode scene may be a current mobile phone mode state, and the acquired external scene information may include weather information and time information.
In step 507, a filter effect is added to the wallpaper by performing video processing, and step 508 is performed.
According to this embodiment, the filter effect can be added to the original screen-off wallpaper, and the screen-off wallpaper with the filter effect added can then be displayed. In other embodiments, the wallpaper with the filter effect can also be generated, and it and the original screen-off wallpaper can be displayed separately.
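The decision of steps 506 and 507 can be pictured with the small sketch below, which maps an assumed device-mode string and weather string to a filter identifier; the concrete value sets are illustrative, not the patent's.

```java
public final class FilterDecision {

    // returns a filter identifier, or null when no filter is needed (step 508 directly)
    public static String chooseFilter(String deviceMode, String weather) {
        if ("airplane".equals(deviceMode)) {
            return "airplane_filter";
        }
        if ("vibrate".equals(deviceMode)) {
            return "vibration_filter";
        }
        if (weather != null) {
            switch (weather) {
                case "rain":  return "rain_filter";
                case "snow":  return "snowflake_filter";
                case "sunny": return "sunshine_filter";
            }
        }
        return null;
    }
}
```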
And step 508, displaying the wallpaper according to the determined screen-off dynamic effect control instruction so as to correspondingly display the screen-off dynamic effect.
Referring to fig. 6, a display control method for a terminal device with a bright screen requirement may include the following steps 601 to 608:
Step 601, detecting whether any preset screen-on/off triggering operation exists, where the preset screen-on/off triggering operations at least include an operation of triggering a power key, an operation of executing one-key screen locking, and an operation of raising the terminal device; if yes, step 602 is executed; otherwise, step 601 is executed again.
The key corresponding to one-key screen locking may be a screen-locking control displayed on the screen of the terminal device. When a user holding the terminal device performs a hand-raising action, a data processing result reflecting whether an operation of raising the terminal device exists can be obtained based on the sensing data of a sensor in the terminal device.
Step 602, determining a requirement type of the on-off screen requirement according to the detected on-off screen triggering operation.
Step 603, if the determined requirement type is a bright-screen requirement, determining a lock-and-light flow, and executing step 604 and step 605.
The life cycle of the lock-and-light flow may include displaying the lock-screen wallpaper and then lighting the screen.
Step 604, executing bright-screen display processing according to the determined lock-and-light flow.
The bright-screen display process usually takes a period of time to complete, and the following steps 605 to 608 may be executed during this period to determine the bright-screen animation effect; based on the determined effect, the bright-screen display animation process shown in fig. 2 can be implemented.
That is, in the present embodiment, during the process of executing step 604, step 605 to step 608 may be executed synchronously.
Step 605, determining a bright-screen dynamic effect control instruction according to the determined lock-and-light flow, and executing step 606.
In one implementation, the bright-screen dynamic effect control instruction may include a control instruction generated by the information screen display module 417 and a control instruction generated by the system interface module 418 in fig. 4, so as to implement the bright-screen wallpaper display with a dynamic effect.
Step 606, determining whether a filter effect needs to be added according to the obtained reference information, where the reference information includes a device mode scene and external scene information; if so, step 607 is executed, otherwise step 608 is executed.
For example, the acquired device mode scene may be a current mobile phone mode state, and the acquired external scene information may include weather information and time information.
Step 607, add filter effect on wallpaper by performing video processing, and perform step 608.
According to the embodiment, the filter effect can be added to the original bright screen wallpaper, and then the bright screen wallpaper with the filter effect can be displayed. In other embodiments, the wallpaper with the filter effect may also be generated, and the wallpaper and the original bright-screen wallpaper may be displayed separately.
And 608, displaying the wallpaper according to the determined bright screen dynamic effect control instruction so as to correspondingly display the bright screen dynamic effect.
Referring to fig. 7, fig. 7 shows a timing diagram of monitoring the terminal on/off state, which involves the operations of the screen locking service 701, the decoding module 702, and the state machine 703. Next, the process of monitoring the terminal on/off state is described with reference to fig. 7.
The screen locking service (KeyguardService) 701 may monitor and receive external input (typically from the end user) to the touch screen of the terminal; for example, it may detect that the user triggers the terminal power key to request waking the terminal and lighting the terminal screen, and send corresponding input information to the decoding module 702.
The input information sent to the decoding module 702 may include: starting to wake up the screen (such as onStartedWakingUp), stopping screen rest (such as onScreenRestStopped), the screen being turned on (such as onScreenTurningOn), the screen having been turned on (such as onScreenTurnedOn), starting to go to sleep (such as onStartedGoingToSleep), finishing going to sleep (such as onFinishedGoingToSleep), starting screen rest (such as onScreenRestStarted), and the screen having been turned off (such as onScreenTurnedOff).
A decoding module (KeyguardViewMediator) 702 can implement decoding and distribution processing of information related to the screen locking service 701.
The information sent to the state machine 703 may include an instruction to update the state machine (such as updateStateMachine).
The external input monitored by the screen locking service 701 may be input related to the terminal state or input unrelated to it. For example, if the user triggers the power key while the terminal is in the information screen display state, the terminal changes from the screen-locked state to the bright-screen state, and the input is related to the terminal state. Conversely, if the user operates the touch screen in the information screen display state (e.g., answering a call, recording information, photographing), the terminal can remain in the screen-locked state, and the input is unrelated to the terminal state.
For external inputs related to the terminal state, the decoding module 702 processes the input information and sends the processing result to the state machine 703, so that the state machine 703 can further determine, according to the processing result, whether there is a terminal state change that may cause a wallpaper display change; for external inputs unrelated to the terminal state, the decoding module 702 may not send the processing result to the state machine 703 after processing the input information.
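This dispatch rule can be sketched as follows. The event strings mirror the callbacks listed above; the StateMachine interface is an assumed stand-in for the dynamic effect controller, and the particular subset of state-relevant events is likewise an assumption.

```java
import java.util.Set;

public class KeyguardEventDecoder {

    public interface StateMachine {
        void updateStateMachine(String event);
    }

    // callbacks assumed to reflect a terminal on/off state change
    private static final Set<String> STATE_EVENTS = Set.of(
            "onStartedWakingUp", "onScreenTurningOn", "onScreenTurnedOn",
            "onStartedGoingToSleep", "onFinishedGoingToSleep", "onScreenTurnedOff");

    private final StateMachine stateMachine;

    public KeyguardEventDecoder(StateMachine stateMachine) {
        this.stateMachine = stateMachine;
    }

    public void onKeyguardEvent(String event) {
        if (STATE_EVENTS.contains(event)) {
            stateMachine.updateStateMachine(event); // state-related: forward
        }
        // state-unrelated input (answering a call, photographing, ...) is dropped here
    }
}
```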
A state machine (DynEffectController) 703, also called a dynamic effect controller, can monitor the on/off state changes of the terminal, and is mainly used for monitoring the terminal state related to the screen locking service 701. The state machine 703 may determine whether the terminal has a state change that causes a change in the wallpaper display, and may send the generated dynamic-effect wallpaper instruction (for example, sendDynEffectWallpaperCommand) to an internal determination module to make this determination.
If it is determined that the terminal has a state change that causes a wallpaper display change, a corresponding wallpaper sending instruction (such as sendWallpaperCommand) can be sent to the wallpaper service. Otherwise, if it is determined that the terminal has no state change that causes a wallpaper display change, the corresponding wallpaper sending instruction (such as sendWallpaperCommand) is not sent to the wallpaper service.
Further, the wallpaper service may display wallpaper in response to a received wallpaper sending instruction (e.g., sendWallpaperCommand).
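On the receiving side, a wallpaper service built on the public Android API would pick such a command up in Engine.onCommand, as in the hedged sketch below; the action string mirrors the hypothetical one used in the sender sketch earlier in this description.

```java
import android.os.Bundle;
import android.service.wallpaper.WallpaperService;

public class CommandAwareWallpaperService extends WallpaperService {

    @Override
    public Engine onCreateEngine() {
        return new CommandEngine();
    }

    private class CommandEngine extends Engine {
        @Override
        public Bundle onCommand(String action, int x, int y, int z,
                                Bundle extras, boolean resultRequested) {
            if ("com.example.action.REF_INFO".equals(action) && extras != null) {
                // trigger a redraw that overlays the target object for this scene
                redrawWithTargetObject(extras.getString("weather"));
            }
            return null; // no result bundle needed
        }

        private void redrawWithTargetObject(String weather) {
            // placeholder: lock the surface canvas and draw wallpaper plus target object
        }
    }
}
```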
Referring to fig. 8, fig. 8 shows a timing diagram of a display control method of a terminal device, which relates to operations of a state machine 801, a screen saver service 802, a wallpaper service 803, an encoder 804, an animation generation module 805, and a multimedia player 806. Next, the wallpaper display process will be described with reference to fig. 8.
As described above for the timing diagram of monitoring the terminal on/off state shown in fig. 7, the state machine 801 can determine whether the terminal has a state change that causes a wallpaper display change. The state machine 801 may send the generated dynamic-effect wallpaper instruction (for example, sendDynEffectWallpaperCommand) to the internal determination module to make this determination. If the terminal has a state change that causes a change in wallpaper display, state machine 801 may send a corresponding wallpaper sending instruction (such as sendWallpaperCommand) to wallpaper service 803.
In addition to the state machine 801, the screensaver service (DozeService) 802 may also monitor the on/off state changes of the terminal, and may mainly be used to monitor terminal state changes caused by the automatic screen-lock procedure. The screensaver service 802 can determine whether the terminal has a state change that causes a change in wallpaper display, and may send the generated dynamic-effect wallpaper instruction (e.g., sendDynEffectWallpaperCommand) to the internal determination module to make this determination.
If it is determined that the terminal has a state change that causes a wallpaper display change, a corresponding wallpaper sending instruction (e.g., sendWallpaperCommand) may be sent to wallpaper service 803. Otherwise, no corresponding wallpaper sending instruction (e.g., sendWallpaperCommand) is sent to wallpaper service 803.
In addition, the screensaver service 802 can also periodically (e.g., once every 3 s) send its own state to the wallpaper service 803, so that the wallpaper service 803 can also achieve a corresponding wallpaper display effect according to the state of the screensaver service 802.
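Such a periodic report can be sketched with a plain Handler loop, as below; the 3 s interval comes from the text, while reportState is an assumed placeholder for the real call into the wallpaper service.

```java
import android.os.Handler;
import android.os.Looper;

public class DozeStateReporter {

    private static final long INTERVAL_MS = 3000; // "once every 3 s" from the text
    private final Handler handler = new Handler(Looper.getMainLooper());
    private final Runnable tick = new Runnable() {
        @Override
        public void run() {
            reportState();                          // push current doze state
            handler.postDelayed(this, INTERVAL_MS); // schedule the next report
        }
    };

    public void start() { handler.post(tick); }
    public void stop()  { handler.removeCallbacks(tick); }

    private void reportState() {
        // placeholder: in the described design this would carry the screensaver
        // service's current state to the wallpaper service
    }
}
```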
The wallpaper service (SuperWallpaperService) 803 may obtain operation parameters in response to a wallpaper sending instruction (e.g., sendWallpaperCommand) sent by the state machine 801 or the screensaver service 802, and forward the command (e.g., via onCommand) to the encoder 804 according to the obtained operation parameters.
The operation parameters may be registration parameters, weather information, time information, and the like. The wallpaper service 803 may send an instruction to register broadcasts (e.g., registerReceivers), an instruction to obtain weather information (e.g., getWeatherInfo), and an instruction to obtain time information (e.g., getTimeInfo) to corresponding internal modules for processing.
The encoder (MediaCodecWrapper) 804 may respond to the control command sent by wallpaper service 803 and generate corresponding wallpaper according to the operation parameters acquired by wallpaper service 803. When dynamic wallpaper is displayed, the encoder 804 may generate each frame of the dynamic wallpaper image and issue an instruction to start the animation (such as startAnimation) to the animation generation module 805 to generate the dynamic wallpaper accordingly.
The encoder 804 may send the wallpaper to be displayed to the multimedia player 806 for wallpaper display, and may send the video stream returned by the animation generation module 805 to the multimedia player 806 for dynamic wallpaper display. In addition, the encoder 804 may send an instruction to release the output buffer (such as releaseOutputBuffer) to the animation generation module 805 for displaying the dynamic wallpaper.
An animation generation module (EffectGLSurfaceView) 805 may assemble each frame image generated by the encoder 804 into a video stream via OpenGL techniques and may return the video stream to the encoder 804.
The animation generation module 805 may issue an instruction (such as doEffectFrame) to its internal module to add an animation effect to one or more frames of the video stream, so as to generate the video stream.
A multimedia player (MediaCodec) 806 can display the wallpaper or video stream received from the encoder 804.
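The decode-and-render loop implied by this timing diagram can be condensed into the sketch below, using the public MediaCodec API: releaseOutputBuffer(index, true) renders each decoded frame to the Surface the codec was configured with. Codec configuration and input feeding are omitted, and the 10 ms dequeue timeout is an arbitrary choice.

```java
import android.media.MediaCodec;
import android.view.Surface;

public final class DynamicWallpaperRenderer {

    public static void renderLoop(MediaCodec decoder, Surface wallpaperSurface) {
        // decoder is assumed to be configured with wallpaperSurface and started:
        //   decoder.configure(format, wallpaperSurface, null, 0); decoder.start();
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (true) {
            int index = decoder.dequeueOutputBuffer(info, 10_000 /* us */);
            if (index >= 0) {
                decoder.releaseOutputBuffer(index, true); // true = render to the Surface
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    break; // dynamic wallpaper playback finished
                }
            }
        }
    }
}
```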
Referring to fig. 9, this embodiment provides a schematic diagram of a mobile phone wallpaper implementation.
As shown in fig. 9, the theme module 912 may obtain the theme pack of the theme selected by the user from the theme pack module 911 and send the theme wallpaper in the obtained theme pack to the theme wallpaper management service 910. The theme wallpaper management service (WallpaperManagerService) 910 provides parameters of the theme wallpaper to a wallpaper service (WallpaperService) 909. In turn, wallpaper module 906 may perform wallpaper display operations based on the theme wallpaper parameters managed by wallpaper service 909.
The mobile phone mode switching module 907 may acquire mobile phone mode switching information, such as switching of the mobile phone to the airplane mode, the mute mode, the dark display mode, and the like, and provide the mode switching information to the wallpaper module 906, so that the wallpaper module 906 may display wallpaper to which a target object corresponding to the mode switching information has been added.
The external scene information collection module 908 of the mobile phone may obtain external scene information of the mobile phone, such as weather, time and other information of an environment where the mobile phone is located, and provide the external scene information of the mobile phone to the wallpaper module 906, so that the wallpaper module 906 may display the wallpaper added with the target object corresponding to the external scene information.
A power key management service (PMS) 901 may provide information related to the on/off screen state of the terminal device to an information screen display module 903 and a system interface (SystemUI) module 904 according to the user's triggering of the power key.
A window management service (WMS) 902 may control the window hierarchy and display order and provide corresponding state information to the information screen display module 903, the system interface module 904, and the desktop launcher 905. The desktop launcher 905 may control the launching of the desktop according to the obtained state information.
The information screen display module 903 may send a corresponding control instruction to the wallpaper module 906 according to the obtained state information, and the wallpaper module 906 displays a corresponding information screen effect at an information screen display stage according to the control instruction.
The system interface module 904 may send a corresponding control instruction to the wallpaper module 906 according to the obtained state information, and the wallpaper module 906 displays a corresponding screen-up effect in a screen-up display stage according to the control instruction.
Referring to fig. 10, a display control method of a terminal device according to an embodiment of the present disclosure may include the following steps 1001 to 1006:
Step 1001, the user triggers a power key (Power key) to send a screen-on/off request.
Step 1002, when the power key is triggered, the screen locking service triggers the information screen display module to generate a control instruction including scene information according to the scene information collected by the scene monitoring module.
The scene monitoring module may include an ambient light sensor, a weather information acquisition module, a time information acquisition module, a season information acquisition module, and the like for acquiring external scene information, and may include an airplane mode module, a mute mode module, a vibration mode module, and the like for acquiring the device mode scene.
In step 1003, the information screen display module sends the generated control instruction including the scene information to the wallpaper module.
Step 1004, the wallpaper module monitors the control command.
And step 1005, the wallpaper module performs filter processing corresponding to the scene information on the wallpaper according to the scene information in the control instruction.
Depending on the scene information and the device mode information, the filter effects added to the wallpaper may include a rain filter, a sunshine filter, a snowflake filter, a 24-solar-terms filter, a vibration-effect filter, an airplane-mode filter, and the like.
And step 1006, the wallpaper module executes a dynamic effect display operation according to the control instruction so as to display the wallpaper added with the filter processing.
In another embodiment, the wallpaper may be left unprocessed, and the wallpaper and the filter effect image may be displayed separately, which can achieve the same visual effect as displaying the wallpaper with the filter effect image added to it.
Referring to fig. 11, another display control method for a terminal device according to an embodiment of the present application may include the following steps 1101 to 1107:
Step 1101, detecting whether the terminal device meets any preset screen-on/off triggering condition; if so, step 1102 is executed; otherwise, step 1101 is executed again.
The set screen-on/off triggering conditions may include: a power key is triggered, a one-key screen-locking control is triggered, the terminal device is raised, a preset automatic screen on/off program is executed, and the like.
If the terminal device meets any screen-on/off triggering condition, the terminal device can be considered to have an on/off state change, specifically a change from the on state to the off state or from the off state to the on state. When the on/off state changes, the corresponding bright-screen wallpaper or screen-off wallpaper is displayed.
Step 1102, determining a requirement type of a screen on-off requirement according to a screen on-off triggering condition met by the terminal equipment.
Step 1103, determining the wallpaper corresponding to the requirement type according to the determined requirement type.
Step 1104, acquiring reference information, where the reference information includes at least one of a device mode scene of the terminal device and external scene information of the terminal device.
Step 1105, determining, according to the acquired reference information, whether a target object corresponding to the reference information needs to be added to the wallpaper; if yes, step 1106 is executed; if not, the determined wallpaper corresponding to the requirement type is displayed during the screen-on or screen-off processing corresponding to the requirement type.
And step 1106, adding a target object corresponding to the reference information on the wallpaper according to the acquired reference information, so as to obtain the wallpaper added with the target object.
Step 1107, in the process of executing the screen-on or screen-off processing corresponding to the requirement type, displaying the wallpaper added with the target object.
In another embodiment, the wallpaper may be left unprocessed, and the wallpaper and the target object may be displayed separately, which can achieve the same visual effect as displaying the wallpaper with the target object added to it.
An embodiment of the present application provides a display control method of a terminal device, including: determining whether a preset first operation exists or not, wherein the first operation comprises one of a screen-on triggering operation and a screen-off triggering operation of the terminal equipment; in the presence of the first operation, acquiring reference information of the corresponding terminal device, the reference information including: at least one of a device mode scene of the terminal device, external scene information of the terminal device, and personalized setting information of the user; generating a target object according to the reference information; and displaying the target object in the process of executing a second operation corresponding to the first operation, wherein the second operation comprises one of screen on operation and screen off operation of the terminal equipment.
For example, the device mode scene of the terminal device may be an airplane mode, a mute mode, a vibration mode, and the like, and the external scene information of the terminal device may be season information, time information, weather information, ambient light sensing information, and the like.
In one embodiment, the personalized setting information may include: the time period for adding the target object, the object content corresponding to the various types of information in the reference information, and the like.
In another embodiment, the personalized setting information may include specific information categories, such as temperature, time, preset text content, and the like. Based on the personalized setting information, when the screen on-off operation is executed, the current temperature value, the current time, the preset text content and the like can be correspondingly displayed.
The target object may be object content generated according to the reference information, and its format differs from that of the original wallpaper. That is, the target object may not be a complete image or animation, but may be any element that can be displayed on a screen, such as an icon, a picture, a pattern, a character, a mask, a control, or a button. The content of the target object is not specifically limited in this embodiment and may be other feasible content not listed here.
For example, when the reference information is external scene information, such as snowy weather, the target object corresponding to the reference information may be image content having a snowy mask effect.
For another example, when the reference information is the personalized setting information of the specific character string, the target object corresponding to the reference information may be the specific character string.
In one implementation, when the terminal device lights or extinguishes the screen, the target object and the original bright-screen or screen-off wallpaper can be displayed separately.
Optionally, the target object may be displayed on a layer above the original bright-screen or screen-off wallpaper. Visually, the user then perceives a wallpaper display effect in which the target object has been added to the original bright-screen or screen-off wallpaper.
In another embodiment, the target object can be added to the original bright-screen or screen-off wallpaper to synthesize a new wallpaper, and the new wallpaper is displayed during the screen-on/off process to present the target object.
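The synthesis variant can be sketched with standard Bitmap and Canvas calls, as below; the centered placement of the target object is an illustrative assumption. Displaying the returned bitmap yields the same visual result as the layered variant described above.

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;

public final class WallpaperComposer {

    // draws the target object onto a mutable copy of the original wallpaper,
    // producing the "second wallpaper" with the target object added
    public static Bitmap compose(Bitmap originalWallpaper, Bitmap targetObject) {
        Bitmap result = originalWallpaper.copy(Bitmap.Config.ARGB_8888, true);
        Canvas canvas = new Canvas(result);
        float left = (result.getWidth() - targetObject.getWidth()) / 2f;  // assumed placement
        float top = (result.getHeight() - targetObject.getHeight()) / 2f;
        canvas.drawBitmap(targetObject, left, top, null);
        return result;
    }
}
```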
In one embodiment of the present application, in the presence of the first operation, the method further comprises: acquiring first wallpaper corresponding to a first operation; the displaying the target object in the process of executing the second operation corresponding to the first operation comprises the following steps: and respectively displaying the target object and the first wallpaper in the process of executing a second operation corresponding to the first operation.
The first wallpaper can be the original bright-screen or screen-off wallpaper. The first wallpaper can be static or dynamic. The first wallpaper does not contain a target object corresponding to the reference information.
In one embodiment, the target object may be located at an upper layer of the first wallpaper for display. The user can see the superimposed display effect of the target object and the first wallpaper.
In one embodiment of the present application, in the presence of the first operation, the method further comprises: acquiring first wallpaper corresponding to a first operation; adding a target object on the first wallpaper to obtain a second wallpaper; the display target object includes: displaying the second wallpaper.
By displaying the second wallpaper, the user may also be made to see the superimposed display effect of the target object and the first wallpaper.
In one embodiment of the present application, the personalization setting information includes: a time period for displaying the target object.
In one embodiment, the time period may include a specific time interval, such as 8:00 to 22:00 each day. In this case, from 8:00 to 22:00 each day, the target object corresponding to the reference information may be displayed in response to the screen-on/off triggering operation; outside this range, the target object corresponding to the reference information is not displayed in response to the screen-on/off triggering operation.
In another embodiment, the time period may include a time period type. The time period type may be, for example: a time period corresponding to the information screen display, a time period corresponding to the lock screen display, a time period corresponding to the desktop display, a time period corresponding to both the information screen display and the lock screen display, a time period corresponding to a specific interval within the screen-on/off display period (such as the first n milliseconds of the information screen display, where n is a positive number), and the like.
Taking the time period corresponding to the information screen display as an example, in response to the screen-on/off triggering operation, the target object corresponding to the reference information may be displayed during the information screen display, while at other times of the screen-on/off display, namely during the lock screen display and the desktop display, it is not displayed.
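A minimal sketch of the time-interval check is given below using java.time; the class shape and the inclusive boundary handling are assumptions. For the example above, new DisplayPeriod(LocalTime.of(8, 0), LocalTime.of(22, 0)) reproduces the 8:00 to 22:00 interval.

```java
import java.time.LocalTime;

public final class DisplayPeriod {

    private final LocalTime start; // e.g. 08:00
    private final LocalTime end;   // e.g. 22:00

    public DisplayPeriod(LocalTime start, LocalTime end) {
        this.start = start;
        this.end = end;
    }

    // the target object is shown only when "now" falls inside the configured period
    public boolean shouldShowTargetObject(LocalTime now) {
        return !now.isBefore(start) && !now.isAfter(end);
    }
}
```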
In an embodiment of the present application, the personalized setting information further includes: the information type corresponding to the time period; the generating of the target object according to the reference information includes: generating a target object according to first information which accords with the information type in the reference information; the display target object includes: displaying a first wallpaper corresponding to the first operation, and displaying the target object in the time period.
In this embodiment, the first wallpaper may be displayed during the bright-screen or screen-off display, and the target object may be displayed within the intersection of the bright-screen or screen-off display interval and the preset time period.
In an embodiment of the present application, the personalized setting information further includes: the information type corresponding to the time period; the method further comprises the following steps: according to the time period and first information which accords with the information type in the reference information, adding a target object corresponding to the first information on first wallpaper corresponding to the first operation to obtain second wallpaper; the display target object includes: displaying the second wallpaper such that the target object is displayed within the time period.
In this embodiment, the target object may be added to the corresponding portion of the video stream of the first wallpaper (such as only the information screen display portion), so that the target object is displayed only within the preset time period during the screen-on/off display.
The information type may be, for example, at least one of part or all of the device mode scenes, part or all of the types of information in the external scene information, and part or all of the types of information in the personalized setting information.
In one embodiment of the present application, the personalized setting information includes: at least one priority setting information of the priority between the device mode scene and the external scene information, the priority between different device mode scenes and the priority between different external scene information.
In an embodiment of the present application, the generating a target object according to reference information includes: acquiring target information with the highest priority in the reference information according to the priority setting information in the personalized setting information; and generating a target object according to the target information.
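The priority-based selection of these two embodiments can be sketched as follows; the ReferenceItem shape and the higher-number-wins convention are assumptions for illustration. The target object is then generated from the selected item alone.

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

public final class PrioritySelector {

    public static final class ReferenceItem {
        final String category; // e.g. "device_mode" or "weather"
        final String value;    // e.g. "airplane" or "snow"
        final int priority;    // taken from the priority setting information

        ReferenceItem(String category, String value, int priority) {
            this.category = category;
            this.value = value;
            this.priority = priority;
        }
    }

    // pick the single highest-priority piece of reference information
    public static Optional<ReferenceItem> pickTarget(List<ReferenceItem> referenceInfo) {
        return referenceInfo.stream()
                .max(Comparator.comparingInt(item -> item.priority));
    }
}
```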
In one embodiment of the present application, the reference information includes: at least one of a device mode scene of the terminal device and external scene information of the terminal device; the generating of the target object according to the reference information includes: according to each of the reference information, a target object is generated.
In one embodiment of the present application, the first operation comprises: any one of the operations of triggering a power key, executing a one-key screen locking operation, lifting the terminal equipment, sliding unlocking operation and a preset automatic screen on-off program.
In one embodiment of the present application, in the presence of the first operation, the method further comprises: determining a requirement type of the screen-on/off requirement according to the first operation; when the determined requirement type is a bright-screen requirement, determining that the first wallpaper corresponding to the first operation is bright-screen wallpaper for sequentially displaying an information screen display animation, a lock-screen animation, and a desktop animation; and when the determined requirement type is a screen-off requirement, determining that the first wallpaper is screen-off wallpaper for sequentially displaying a desktop animation, a lock-screen animation, and an information screen display animation.
An embodiment of the present application further provides an electronic chip installed in an electronic device (UE), the electronic chip including: a processor for executing computer program instructions stored in a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the method steps provided by any of the method embodiments of the present application.
An embodiment of the present application further provides a terminal device, where the terminal device includes a communication module, a memory for storing computer program instructions, and a processor for executing the program instructions, where the computer program instructions, when executed by the processor, trigger the terminal device to execute the method steps provided in any method embodiment of the present application.
An embodiment of the present application further provides a server device, which includes a communication module, a memory for storing computer program instructions, and a processor for executing the program instructions, wherein when the computer program instructions are executed by the processor, the server device is triggered to execute the method steps provided by any of the method embodiments of the present application.
An embodiment of the present application further provides an electronic device, which includes multiple antennas, a memory for storing computer program instructions, a processor for executing the computer program instructions, and a communication apparatus (such as a communication module capable of implementing 5G communication based on NR protocol), wherein when the computer program instructions are executed by the processor, the electronic device is triggered to execute the method steps provided by any method embodiment of the present application.
In particular, in an embodiment of the present application, one or more computer programs are stored in the memory, the one or more computer programs including instructions which, when executed by the apparatus, cause the apparatus to perform the method steps of the embodiments of the present application.
Specifically, in an embodiment of the present application, a processor of the electronic device may be a System On Chip (SOC), and the processor may include a Central Processing Unit (CPU), and may further include other types of processors. Specifically, in an embodiment of the present application, the processor of the electronic device may be a PWM control chip.
Specifically, in an embodiment of the present application, the processor may include, for example, a CPU, a DSP (Digital Signal Processor), or a microcontroller, and may further include a GPU (Graphics Processing Unit), an embedded neural-network processing unit (NPU), and an image signal processor (ISP); the processor may further include a necessary hardware accelerator or logic processing hardware circuit, such as an ASIC, or one or more integrated circuits for controlling the execution of the programs of the present application. Further, the processor may have the function of operating one or more software programs, which may be stored in a storage medium.
Specifically, in one embodiment of the present application, the memory of the electronic device may be a read-only memory (ROM), other types of static memory devices that can store static information and instructions, a Random Access Memory (RAM), or other types of dynamic memory devices that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage, optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, blu-ray disc, etc.), a magnetic disc storage medium, or other magnetic storage devices, or any computer-readable medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
In particular, in an embodiment of the present application, the processor and the memory may be combined into a processing device, and more generally, are independent components, and the processor is configured to execute the program code stored in the memory to implement the method described in the embodiment of the present application. In particular implementations, the memory may be integrated within the processor or may be separate from the processor.
Furthermore, the apparatuses, devices, and modules set forth in the embodiments of the present application may be specifically implemented by a computer chip or an entity, or implemented by a product with certain functions.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media having computer-usable program code embodied in the medium.
In the several embodiments provided in the present application, any function, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application.
In particular, an embodiment of the present application further provides a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer program causes the computer to execute the method steps provided in the embodiment of the present application.
An embodiment of the present application also provides a computer program product, which comprises a computer program that, when run on a computer, causes the computer to perform the method steps provided by the embodiments of the present application.
The description of embodiments herein is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices), and computer program products according to embodiments herein. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit, which is implemented in the form of a software functional unit, may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a Processor (Processor) to execute some steps of the methods according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In the embodiments of the present application, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an ..." does not exclude the presence of other like elements in a process, method, article, or apparatus comprising the element.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present application are described in a progressive manner, and the same and similar parts among the embodiments can be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the apparatus embodiment, since it is substantially similar to the method embodiment, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
Those of ordinary skill in the art will appreciate that the various elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of electronic hardware and computer software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The above description is only a preferred embodiment of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. A display control method of a terminal device, comprising:
determining whether a preset first operation exists or not, wherein the first operation comprises one of a screen-on triggering operation and a screen-off triggering operation of the terminal equipment;
acquiring reference information corresponding to the terminal device under the condition that the first operation exists, wherein the reference information comprises: at least one of a device mode scene of the terminal device, external scene information of the terminal device, and personalized setting information of a user;
generating a target object according to the reference information;
and displaying the target object in the process of executing a second operation corresponding to the first operation, wherein the second operation comprises one of screen-on operation and screen-off operation of the terminal equipment.
2. The method of claim 1, wherein in the presence of the first operation, the method further comprises:
acquiring first wallpaper corresponding to the first operation;
the displaying the target object in the process of executing a second operation corresponding to the first operation comprises:
and respectively displaying the target object and the first wallpaper in the process of executing a second operation corresponding to the first operation.
3. The method of claim 1, wherein in the presence of the first operation, the method further comprises:
acquiring first wallpaper corresponding to the first operation;
adding the target object to the first wallpaper to obtain a second wallpaper;
the displaying the target object comprises: and displaying the second wallpaper.
4. The method of claim 1, wherein the personalization setting information comprises: a time period for displaying the target object.
5. The method of claim 4, wherein the personalization setting information further comprises: the information type corresponding to the time period;
the generating a target object according to the reference information includes:
generating a target object according to first information which accords with the information type in the reference information;
the displaying the target object comprises:
displaying a first wallpaper corresponding to the first operation and displaying the target object within the time period.
6. The method of claim 4, wherein the personalization setting information further comprises: the information type corresponding to the time period;
the method further comprises the following steps:
according to the time period and first information which accords with the information type in the reference information, adding a target object corresponding to the first information on first wallpaper corresponding to the first operation to obtain second wallpaper;
the displaying the target object comprises: displaying the second wallpaper such that the target object is displayed within the time period.
7. The method of claim 1, wherein the personalization setting information comprises: at least one priority setting information of the priority between the device mode scene and the external scene information, the priority between different device mode scenes and the priority between different external scene information.
8. The method of claim 7, wherein generating the target object according to the reference information comprises:
acquiring target information with the highest priority in the reference information according to priority setting information in the personalized setting information;
and generating a target object according to the target information.
9. The method of claim 1, wherein the reference information comprises: at least one of a device mode scene of the terminal device and external scene information of the terminal device;
the generating of the target object according to the reference information includes:
and generating a target object according to each information in the reference information.
10. The method of claim 1, wherein the first operation comprises: any one of the operation of triggering a power key, the operation of executing one-key screen locking, the operation of lifting the terminal equipment, the operation of sliding unlocking and a preset automatic screen on-off program.
11. The method of claim 1, wherein in the presence of the first operation, the method further comprises: determining a requirement type of a screen on-off requirement according to the first operation;
under the condition that the determined demand type is a bright screen demand, determining that first wallpaper corresponding to the first operation is bright screen wallpaper for sequentially displaying an information screen display animation, a screen locking animation and a desktop animation;
and under the condition that the determined requirement type is a screen-off requirement, determining that the first wallpaper is a screen-off wallpaper for sequentially displaying a desktop animation, a screen-locking animation and a screen-displaying animation.
12. A display control apparatus of a terminal device, comprising:
the device comprises a determining module, a judging module and a judging module, wherein the determining module is used for determining whether a preset first operation exists or not, and the first operation comprises one of a screen-on triggering operation and a screen-off triggering operation of the terminal equipment;
an obtaining module, configured to obtain reference information corresponding to the terminal device when the first operation exists, where the reference information includes: at least one of a device mode scene of the terminal device, external scene information of the terminal device, and personalized setting information of a user;
the first processing module is used for generating a target object according to the reference information;
and the second processing module is used for displaying the target object in the process of executing a second operation corresponding to the first operation, wherein the second operation comprises one of screen on operation and screen off operation of the terminal equipment.
13. An electronic chip, comprising:
a processor for executing computer program instructions stored on a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the method of any of claims 1-11.
14. An electronic device, comprising a memory for storing computer program instructions, a processor for executing the computer program instructions, and a communication apparatus, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method according to any one of claims 1-11.
15. A computer-readable storage medium, in which a computer program is stored which, when run on a computer, causes the computer to carry out the method according to any one of claims 1-11.
CN202210843720.9A 2022-07-18 2022-07-18 Display control method, device, chip and equipment of terminal equipment Active CN115357317B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210843720.9A CN115357317B (en) 2022-07-18 2022-07-18 Display control method, device, chip and equipment of terminal equipment

Publications (2)

Publication Number Publication Date
CN115357317A true CN115357317A (en) 2022-11-18
CN115357317B CN115357317B (en) 2023-11-21

Family

ID=84031609

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210843720.9A Active CN115357317B (en) 2022-07-18 2022-07-18 Display control method, device, chip and equipment of terminal equipment

Country Status (1)

Country Link
CN (1) CN115357317B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120030575A1 (en) * 2010-07-27 2012-02-02 Cok Ronald S Automated image-selection system
US20150033193A1 (en) * 2013-07-25 2015-01-29 Here Global B.V. Methods for modifying images and related aspects
US20150029206A1 (en) * 2013-07-23 2015-01-29 Samsung Electronics Co., Ltd. Method and electronic device for displaying wallpaper, and computer readable recording medium
CN106412234A (en) * 2016-08-29 2017-02-15 乐视控股(北京)有限公司 Wallpaper replacement method and device
CN107621918A (en) * 2017-09-08 2018-01-23 维沃移动通信有限公司 The method to set up and mobile terminal of breath screen display content
CN107957834A (en) * 2017-11-26 2018-04-24 上海爱优威软件开发有限公司 With the associated terminal unlock method of weather
CN107977276A (en) * 2017-12-20 2018-05-01 维沃移动通信有限公司 A kind of based reminding method of Changes in weather, device and mobile terminal
US20190342444A1 (en) * 2016-12-30 2019-11-07 Huawei Technologies Co., Ltd. Automatic Wallpaper Setting Method, Terminal Device, and Graphical User Interface
CN111488091A (en) * 2020-04-16 2020-08-04 深圳传音控股股份有限公司 Interface display method of mobile terminal, mobile terminal and storage medium
CN112148410A (en) * 2020-09-29 2020-12-29 维沃移动通信有限公司 Image display method and electronic equipment
CN113824834A (en) * 2021-08-25 2021-12-21 荣耀终端有限公司 Control method for screen-off display and electronic equipment
CN114003319A (en) * 2020-07-28 2022-02-01 华为技术有限公司 Screen-off display method and electronic equipment
WO2022048506A1 (en) * 2020-09-03 2022-03-10 维沃移动通信有限公司 Wallpaper displaying method, device, and electronic device
CN114244953A (en) * 2020-09-07 2022-03-25 华为技术有限公司 Interface display method and electronic equipment

Also Published As

Publication number Publication date
CN115357317B (en) 2023-11-21

Similar Documents

Publication Publication Date Title
CN113362783B (en) Refresh rate switching method and electronic equipment
CN114390139B (en) Method for presenting video by electronic equipment in incoming call, electronic equipment and storage medium
CN113170037B (en) Method for shooting long exposure image and electronic equipment
CN114513847B (en) Positioning method, device, system, electronic equipment and storage medium
CN112860428A (en) High-energy-efficiency display processing method and equipment
CN112684969B (en) Always displaying method and mobile device
WO2020233593A1 (en) Method for displaying foreground element, and electronic device
WO2023207667A1 (en) Display method, vehicle, and electronic device
CN114003827A (en) Weather information display method and device and electronic equipment
CN114500732B (en) Interface display method, electronic equipment and storage medium
CN115357317B (en) Display control method, device, chip and equipment of terminal equipment
CN113495733A (en) Theme pack installation method and device, electronic equipment and computer readable storage medium
CN116110351B (en) Backlight control method, device, chip, electronic equipment and medium
CN114115772B (en) Method and device for off-screen display
CN116048831B (en) Target signal processing method and electronic equipment
WO2023116669A1 (en) Video generation system and method, and related apparatus
CN115495716B (en) Local authentication method and electronic equipment
CN116208705B (en) Equipment abnormality recovery method and electronic equipment
WO2023109636A1 (en) Application card display method and apparatus, terminal device, and readable storage medium
CN113254409B (en) File sharing method, system and related equipment
CN116704075A (en) Image processing method, device and storage medium
CN116528337A (en) Business collaboration method, electronic device, readable storage medium, and chip system
CN117407094A (en) Display method, electronic equipment and system
CN115904576A (en) Wallpaper application method, electronic device and storage medium
CN115480680A (en) Multi-device cooperative control method, terminal device and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant