CN117631933A - Display control method, display control device, electronic equipment and storage medium - Google Patents

Display control method, display control device, electronic equipment and storage medium

Info

Publication number
CN117631933A
CN117631933A (application CN202210952023.7A)
Authority
CN
China
Prior art keywords
target
display
display content
original
shader
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210952023.7A
Other languages
Chinese (zh)
Inventor
刘全瑞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202210952023.7A
Publication of CN117631933A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present disclosure relates to a display control method, a display control apparatus, an electronic device and a storage medium. The method is applied to a terminal device and includes the following steps: acquiring a state parameter of a target display effect, where the state parameter characterizes the stage the target display effect has reached; acquiring original display content, where the original display content includes a plurality of display windows; rendering the original display content with a target shader according to the state parameter to obtain target display content having the target display effect, where the target shader is the shader corresponding to the target display effect; and displaying the target display content.

Description

Display control method, display control device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the technical field of display control, and in particular to a display control method, a display control apparatus, an electronic device and a storage medium.
Background
In recent years, the display performance of terminal devices such as smartphones, tablet computers and wearable devices has improved steadily, giving users a better experience. The display content of a terminal device is composited from a plurality of display windows, and each display window is rendered by a different upper-layer application before composition. In other words, each application can only control its own display window within the display content and cannot control the display content as a whole. In the related art, when a display effect such as an animation or a special effect is rendered in real time over the whole display content, the applications can only render their respective display windows in real time. It is therefore difficult to keep the effect synchronized across different display windows, and the same rendering operation is repeated, wasting power and memory.
Disclosure of Invention
To overcome the problems in the related art, embodiments of the present disclosure provide a display control method, a display control apparatus, an electronic device, and a storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided a display control method, applied to a terminal device, the method including:
acquiring state parameters of a target display effect, wherein the state parameters are used for representing the stage of the target display effect;
acquiring original display content, wherein the original display content comprises a plurality of display windows;
according to the state parameters, rendering the original display content by using the target shader to obtain target display content with a target display effect, wherein the target shader is a shader corresponding to the target display effect;
and displaying the target display content.
In one embodiment, the plurality of display windows included in the original display content are respectively generated by a plurality of application programs in a one-to-one correspondence manner;
the obtaining the state parameters of the target display effect includes:
and acquiring the state parameters of the target display effect from any application program in the plurality of application programs.
In one embodiment, the method further includes:
generating the original display content according to the plurality of display windows, and adding the original display content to a buffer area.
In one embodiment, the obtaining the original display content includes:
acquiring the original display content from the buffer area.
in one embodiment, the terminal device has a display driving unit and a display screen;
the displaying the target display content comprises the following steps:
updating the original display content corresponding to the buffer area by using the target display content;
and controlling the display driving unit to read the target display content in the buffer area, and driving the display screen to display according to the target display content.
In one embodiment, the rendering the original display content by using the target shader according to the state parameter to obtain a target display content with a target display effect includes:
transmitting the state parameters and the original display content to a rendering engine;
and controlling the rendering engine to configure the target shader by using the state parameter, and inputting the original display content into the target shader for rendering processing to obtain target display content output by the target shader.
In one embodiment, the terminal device has a graphics processor; the method further comprises the steps of:
controlling the rendering engine to acquire the target shader from a plurality of shaders according to the target display effect, writing the target shader into the graphics processor, and configuring the target shader by using initial parameters, wherein each of the plurality of shaders corresponds to a different display effect.
In one embodiment, the obtaining the state parameter of the target display effect includes:
acquiring state parameters of a target display effect frame by frame according to a preset sequence;
the obtaining the original display content comprises the following steps:
acquiring original display contents frame by frame according to a preset sequence;
and according to the state parameter, rendering the original display content by using the target shader to obtain target display content with a target display effect, including:
according to the state parameters of each frame, rendering the original display content of the corresponding frame by using the target shader to obtain target display content with a target display effect;
the displaying the target display content comprises the following steps:
and displaying the target display content frame by frame according to a preset sequence.
According to a second aspect of embodiments of the present disclosure, there is provided a display control apparatus applied to a terminal device, the apparatus including:
the first acquisition module is used for acquiring state parameters of a target display effect, wherein the state parameters are used for representing the stage of the target display effect;
the second acquisition module is used for acquiring original display content, wherein the original display content comprises a plurality of display windows;
the rendering module is used for rendering the original display content by utilizing the target shader according to the state parameter to obtain target display content with a target display effect, wherein the target shader is a shader corresponding to the target display effect;
and the display module is used for displaying the target display content.
In one embodiment, the plurality of display windows included in the original display content are respectively generated by a plurality of application programs in a one-to-one correspondence manner;
the first obtaining module is specifically configured to:
and acquiring the state parameters of the target display effect from any application program in the plurality of application programs.
In one embodiment, the apparatus further includes a generating module configured to:
generating the original display content according to the plurality of display windows, and adding the original display content to a buffer area.
In one embodiment, the second obtaining module is specifically configured to:
acquiring the original display content from the buffer area.
in one embodiment, the terminal device has a display driving unit and a display screen;
the display module is specifically used for:
updating the original display content corresponding to the buffer area by using the target display content;
and controlling the display driving unit to read the target display content in the buffer area, and driving the display screen to display according to the target display content.
In one embodiment, the rendering module is specifically configured to:
transmitting the state parameters and the original display content to a rendering engine;
and controlling the rendering engine to configure the target shader by using the state parameter, and inputting the original display content into the target shader for rendering processing to obtain target display content output by the target shader.
In one embodiment, the terminal device has a graphics processor; the apparatus further comprises an initialization module for:
controlling the rendering engine to acquire the target shader from a plurality of shaders according to the target display effect, writing the target shader into the graphics processor, and configuring the target shader by using initial parameters, wherein each of the plurality of shaders corresponds to a different display effect.
In one embodiment, the first obtaining module is specifically configured to:
acquiring state parameters of a target display effect frame by frame according to a preset sequence;
the second obtaining module is specifically configured to:
acquiring original display contents frame by frame according to a preset sequence;
the rendering module is specifically configured to:
according to the state parameters of each frame, rendering the original display content of the corresponding frame by using the target shader to obtain target display content with a target display effect;
the display module is specifically used for:
and displaying the target display content frame by frame according to a preset sequence.
According to a third aspect of embodiments of the present disclosure, there is provided an electronic device, including a memory and a processor, where the memory is configured to store computer instructions executable on the processor, and the processor is configured to implement the display control method of the first aspect when executing the computer instructions.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of the first aspect.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects:
According to the display control method provided by the embodiments of the present disclosure, the state parameter of the target display effect and the original display content are acquired, the original display content is rendered with the target shader corresponding to the target display effect according to the state parameter to obtain the target display content having the target display effect, and the target display content is finally displayed. Because the target shader performs a single, unified rendering pass over all display windows of the original display content, the display effect stays synchronized across different display windows, no rendering work is repeated, and power and memory are not wasted.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and, together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart of a display control method shown in an exemplary embodiment of the present disclosure;
fig. 2 is a schematic structural view of a display control apparatus shown in an exemplary embodiment of the present disclosure;
fig. 3 is a block diagram of an electronic device shown in an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in this disclosure to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
In recent years, the display performance of terminal devices such as smartphones, tablet computers and wearable devices has improved steadily, giving users a better experience. The display content of a terminal device is composited from a plurality of display windows; for example, a lock-screen interface is composited from display windows such as a status bar, a notification bar, lock-screen wallpaper and a function bar. Each display window is rendered by a different upper-layer application before composition; in other words, each application can only control its own display window within the display content and cannot control the display content as a whole. In the related art, when a display effect such as an animation or a special effect is rendered in real time over the whole display content, the applications can only render their respective display windows in real time, because an application controls only its own display window and cannot obtain the whole display content. With this approach it is difficult to keep the effect synchronized across different display windows, and the same rendering operation is repeated, wasting power and memory.
Based on this, in a first aspect, at least one embodiment of the present disclosure provides a display control method. Referring to fig. 1, which illustrates the flow of the method, the method includes steps S101 to S104.
The method can be applied to terminal devices such as smartphones, tablet computers and wearable devices. It is particularly suited to scenarios in which the terminal device renders a display effect, such as an animation or a special effect, in real time over display content composed of a plurality of display windows.
In step S101, a state parameter of a target display effect is obtained, where the state parameter is used to characterize a stage of the target display effect.
The target display effect may be an animation effect or a special effect. The state parameter may be, for example, the progress of the target display effect. The display content of the terminal device includes a plurality of display windows that are generated by a plurality of applications in one-to-one correspondence; that is, each display window in the display content is controlled by a different upper-layer application. When an application renders each frame of its display window, it also obtains the state parameter of the target display effect corresponding to that frame. It can be appreciated that, in the related art, an application renders the target display effect in its own display window according to this state parameter.
Therefore, in this step, the state parameter of the target display effect may be acquired from any one of the plurality of applications. Since the state parameter is continuously updated as the display content is updated, the state parameter is preferably acquired frame by frame in a preset order. It will be understood that each time one frame's state parameter is obtained in this step, one frame of display content is displayed according to steps S102 to S104; this is how the display content with the target display effect is rendered in real time.
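As a purely illustrative sketch (none of the names below are defined by this disclosure or by any platform API), the per-frame delivery of the state parameter from an application to the system-side renderer could look roughly like this, written here in Kotlin:

```kotlin
// Hypothetical illustration only; EffectState and EffectStateListener are
// invented names, not part of this disclosure or of any platform API.
data class EffectState(
    val effectId: Int,    // identifies the target display effect (animation / special effect)
    val progress: Float   // state parameter: the stage the effect has reached, e.g. 0f..1f
)

// The system-side renderer registers with any one of the applications and
// receives one EffectState per frame, in the preset order.
fun interface EffectStateListener {
    fun onFrame(state: EffectState)
}
```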
In step S102, original display content is acquired, wherein the original display content includes a plurality of display windows.
The original display content is the display content to which the target display effect is to be added. This step can be executed by a lower-layer system module, because an upper-layer application cannot acquire the complete original display content.
The display windows included in the original display content are generated by a plurality of upper-layer applications in one-to-one correspondence. After the upper-layer applications have generated their respective display windows, a rendering engine can compose the plurality of display windows into the original display content and add it to a buffer area. Illustratively, the applications generate (i.e., render) their display windows frame by frame, so the rendering engine also generates the display content frame by frame and adds it to the buffer area frame by frame; that is, the display content in the buffer area is stored frame by frame.
Therefore, in this step, the original display content can be obtained from the buffer area. Since the display content in the buffer area is stored frame by frame, the original display content is preferably obtained frame by frame in a preset order. It can be understood that, to keep the display content and the target display effect continuous, one frame of original display content can be acquired after each frame's state parameter is acquired, forming a one-to-one, synchronized correspondence between them.
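As a toy illustration of the frame-by-frame buffer described above (the class and method names are hypothetical, and a real implementation would hold GPU buffers rather than bitmaps):

```kotlin
import android.graphics.Bitmap
import java.util.ArrayDeque

// Hypothetical model of the buffer area: composed frames are queued in order
// and consumed frame by frame by the effect-rendering step.
class FrameBufferRegion {
    private val frames = ArrayDeque<Bitmap>()

    // Called after the rendering engine composes the display windows into one frame.
    fun push(composedFrame: Bitmap) {
        frames.addLast(composedFrame)
    }

    // Called to obtain the next frame of original display content, in the preset order.
    fun pop(): Bitmap? = frames.pollFirst()
}
```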
In step S103, according to the state parameter, rendering processing is performed on the original display content by using the target shader, so as to obtain a target display content with a target display effect, where the target shader is a shader corresponding to the target display effect.
A plurality of shaders can be preconfigured in the terminal device, with each shader corresponding to a different display effect. Illustratively, the shaders may be built on the open-source graphics library Skia. In one possible embodiment, the rendering engine may be controlled to select the target shader from the plurality of shaders according to the target display effect, write the target shader to a graphics processing unit (GPU) of the terminal device, and configure the target shader with initial parameters (i.e., initialize the shader).
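As one hedged example of how such a preconfigured, Skia-backed shader might be set up, the sketch below uses android.graphics.RuntimeShader, available on Android 13 and later; the fade effect and all names here are illustrative only and are not effects or code defined by this disclosure:

```kotlin
import android.graphics.RuntimeShader

// AGSL (SkSL-derived) source for a hypothetical fade effect; a real effect
// library would keep one such program per display effect.
private const val FADE_AGSL = """
    uniform shader content;  // the composed original display content
    uniform float progress;  // state parameter: stage of the effect, 0..1
    half4 main(float2 coord) {
        half4 c = content.eval(coord);
        return half4(c.rgb * progress, c.a);
    }
"""

// Select the shader for the target display effect and configure it with
// initial parameters (i.e., initialize it) before the first frame is rendered.
fun createTargetShader(): RuntimeShader =
    RuntimeShader(FADE_AGSL).apply {
        setFloatUniform("progress", 0f)
    }
```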
In one possible embodiment, the state parameter and the original display content may be sent to the rendering engine in this step; the rendering engine is then controlled to configure the target shader with the state parameter and feed the original display content into the target shader for rendering, yielding the target display content output by the target shader.
It can be understood that, because the state parameter of the target display effect and the original display content are both obtained frame by frame, this step can render each frame of original display content with the target shader according to that frame's state parameter, obtaining the target display content with the target display effect. That is, the original display content is rendered frame by frame, with state parameters and original display content matched in one-to-one correspondence.
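Continuing the illustrative RuntimeShader sketch above, the per-frame rendering could look roughly as follows; renderFrame is a hypothetical helper, and the canvas is assumed to be hardware accelerated because RuntimeShader is evaluated on the GPU:

```kotlin
import android.graphics.Bitmap
import android.graphics.BitmapShader
import android.graphics.Canvas
import android.graphics.Paint
import android.graphics.RuntimeShader
import android.graphics.Shader

// Hypothetical per-frame helper: one shader pass over the whole composed frame.
fun renderFrame(canvas: Canvas, targetShader: RuntimeShader, original: Bitmap, progress: Float) {
    // configure the target shader with this frame's state parameter
    targetShader.setFloatUniform("progress", progress)
    // feed this frame's original display content (all display windows already composed) into the shader
    targetShader.setInputShader(
        "content",
        BitmapShader(original, Shader.TileMode.CLAMP, Shader.TileMode.CLAMP)
    )
    // a single unified pass over the frame yields the target display content
    val paint = Paint()
    paint.shader = targetShader
    canvas.drawPaint(paint)
}
```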
Because the shader renders the complete display content in one pass, the display effect stays synchronized across the plurality of display windows, and the memory and power otherwise wasted on repeated rendering are avoided.
In step S104, the target display content is displayed.
The terminal device is provided with a display driving unit and a display screen. The display screen displays the display content, and the display driving unit drives the display screen and controls what it displays.
For example, the corresponding original display content in the buffer area is first updated with the target display content; the display driving unit is then controlled to read the target display content from the buffer area and drive the display screen to display according to it.
It is understood that the target display content may be displayed frame by frame in a preset order. That is, each time one frame of target display content is obtained and the corresponding original display content in the buffer area is updated with it, the display driving unit is controlled to read that frame of target display content from the buffer area and drive the display screen to display it, proceeding frame by frame in the preset order.
According to the display control method provided by the embodiments of the present disclosure, the state parameter of the target display effect and the original display content are acquired, the original display content is rendered with the target shader corresponding to the target display effect according to the state parameter to obtain the target display content having the target display effect, and the target display content is finally displayed. Because the target shader performs a single, unified rendering pass over all display windows of the original display content, the display effect stays synchronized across different display windows, no rendering work is repeated, and power and memory are not wasted.
According to a second aspect of the embodiments of the present disclosure, a display control apparatus is provided, which is applied to a terminal device, please refer to fig. 2, and the apparatus includes:
a first obtaining module 201, configured to obtain a state parameter of a target display effect, where the state parameter is used to characterize a stage of the target display effect;
a second obtaining module 202, configured to obtain original display content, where the original display content includes a plurality of display windows;
the rendering module 203 is configured to perform rendering processing on the original display content by using the target shader according to the state parameter, so as to obtain target display content with a target display effect, where the target shader is a shader corresponding to the target display effect;
and the display module 204 is used for displaying the target display content.
In some embodiments of the present disclosure, the plurality of display windows included in the original display content are respectively generated by a plurality of application programs in one-to-one correspondence;
the first obtaining module is specifically configured to:
and acquiring the state parameters of the target display effect from any application program in the plurality of application programs.
In some embodiments of the present disclosure, the apparatus further includes a generating module configured to:
generating the original display content according to the plurality of display windows, and adding the original display content to a buffer area.
In some embodiments of the present disclosure, the second obtaining module is specifically configured to:
acquiring the original display content from the buffer area.
in some embodiments of the present disclosure, the terminal device has a display driving unit and a display screen;
the display module is specifically used for:
updating the original display content corresponding to the buffer area by using the target display content;
and controlling the display driving unit to read the target display content in the buffer area, and driving the display screen to display according to the target display content.
In some embodiments of the disclosure, the rendering module is specifically configured to:
transmitting the state parameters and the original display content to a rendering engine;
and controlling the rendering engine to configure the target shader by using the state parameter, and inputting the original display content into the target shader for rendering processing to obtain target display content output by the target shader.
In some embodiments of the present disclosure, the terminal device has a graphics processor; the apparatus further comprises an initialization module for:
controlling the rendering engine to acquire the target shader from a plurality of shaders according to the target display effect, writing the target shader into the graphics processor, and configuring the target shader by using initial parameters, wherein each of the plurality of shaders corresponds to a different display effect.
In some embodiments of the disclosure, the first obtaining module is specifically configured to:
acquiring state parameters of a target display effect frame by frame according to a preset sequence;
the second obtaining module is specifically configured to:
acquiring original display contents frame by frame according to a preset sequence;
the rendering module is specifically configured to:
according to the state parameters of each frame, rendering the original display content of the corresponding frame by using the target shader to obtain target display content with a target display effect;
the display module is specifically used for:
and displaying the target display content frame by frame according to a preset sequence.
The specific manner in which each module of the apparatus in the above embodiments performs its operations has been described in detail in the embodiments of the method of the first aspect and will not be repeated here.
According to a third aspect of embodiments of the present disclosure, referring to fig. 3, a block diagram of an electronic device is shown. For example, the apparatus 300 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, or the like.
Referring to fig. 3, apparatus 300 may include one or more of the following components: a processing component 302, a memory 304, a power supply component 306, a multimedia component 308, an audio component 310, an input/output (I/O) interface 312, a sensor component 314, and a communication component 316.
The processing component 302 generally controls overall operation of the apparatus 300, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing element 302 may include one or more processors 320 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 302 can include one or more modules that facilitate interactions between the processing component 302 and other components. For example, the processing component 302 may include a multimedia module to facilitate interaction between the multimedia component 308 and the processing component 302.
Memory 304 is configured to store various types of data to support operations at device 300. Examples of such data include instructions for any application or method operating on the device 300, contact data, phonebook data, messages, pictures, videos, and the like. The memory 304 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power component 306 provides power to the various components of the device 300. The power components 306 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 300.
The multimedia component 308 includes a screen that provides an output interface between the device 300 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 308 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the apparatus 300 is in an operational mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 310 is configured to output and/or input audio signals. For example, the audio component 310 includes a Microphone (MIC) configured to receive external audio signals when the device 300 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 304 or transmitted via the communication component 316. In some embodiments, audio component 310 further comprises a speaker for outputting audio signals.
The I/O interface 312 provides an interface between the processing component 302 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 314 includes one or more sensors for providing status assessments of various aspects of the apparatus 300. For example, the sensor assembly 314 may detect the on/off state of the device 300 and the relative positioning of components, such as the display and keypad of the device 300. The sensor assembly 314 may also detect a change in position of the device 300 or of a component of the device 300, the presence or absence of user contact with the device 300, the orientation or acceleration/deceleration of the device 300, and a change in temperature of the device 300. The sensor assembly 314 may also include a proximity sensor configured to detect the presence of nearby objects in the absence of any physical contact. The sensor assembly 314 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 314 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 316 is configured to facilitate wired or wireless communication between the apparatus 300 and other devices. The device 300 may access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G or 5G, or a combination thereof. In one exemplary embodiment, the communication component 316 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 316 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 300 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements, for performing the display control method described above.
In a fourth aspect, the present disclosure also provides, in an exemplary embodiment, a non-transitory computer-readable storage medium, such as the memory 304, including instructions executable by the processor 320 of the apparatus 300 to perform the display control method described above. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow the general principles of the disclosure and include such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (18)

1. A display control method, characterized by being applied to a terminal device, the method comprising:
acquiring state parameters of a target display effect, wherein the state parameters are used for representing the stage of the target display effect;
acquiring original display content, wherein the original display content comprises a plurality of display windows;
according to the state parameters, rendering the original display content by using the target shader to obtain target display content with a target display effect, wherein the target shader is a shader corresponding to the target display effect;
and displaying the target display content.
2. The display control method according to claim 1, wherein a plurality of display windows included in the original display content are generated by a plurality of application programs in one-to-one correspondence, respectively;
the obtaining the state parameters of the target display effect includes:
and acquiring the state parameters of the target display effect from any application program in the plurality of application programs.
3. The display control method according to claim 1, characterized by further comprising:
and generating the original display content according to the display windows, and adding the original display content to a cache region.
4. A display control method according to claim 3, wherein the acquiring the original display content includes:
and acquiring the original display content from the buffer memory area.
5. The display control method according to claim 3 or 4, wherein the terminal device has a display driving unit and a display screen;
the displaying the target display content comprises the following steps:
updating the original display content corresponding to the buffer area by using the target display content;
and controlling the display driving unit to read the target display content in the buffer area, and driving the display screen to display according to the target display content.
6. The display control method according to claim 1, wherein the rendering the original display content with the target shader according to the state parameter to obtain a target display content having a target display effect comprises:
transmitting the state parameters and the original display content to a rendering engine;
and controlling the rendering engine to configure the target shader by using the state parameter, and inputting the original display content into the target shader for rendering processing to obtain target display content output by the target shader.
7. The display control method according to claim 1, wherein the terminal device has a graphics processor; the method further comprises the steps of:
controlling the rendering engine to acquire the target shader from a plurality of shaders according to the target display effect, writing the target shader into the graphics processor, and configuring the target shader by using initial parameters, wherein each of the plurality of shaders corresponds to a different display effect.
8. The display control method according to any one of claims 1 to 7, characterized in that the acquiring the state parameter of the target display effect includes:
acquiring state parameters of a target display effect frame by frame according to a preset sequence;
the obtaining the original display content comprises the following steps:
acquiring original display contents frame by frame according to a preset sequence;
and according to the state parameter, rendering the original display content by using the target shader to obtain target display content with a target display effect, including:
according to the state parameters of each frame, rendering the original display content of the corresponding frame by using the target shader to obtain target display content with a target display effect;
the displaying the target display content comprises the following steps:
and displaying the target display content frame by frame according to a preset sequence.
9. A display control apparatus, characterized by being applied to a terminal device, comprising:
the first acquisition module is used for acquiring state parameters of a target display effect, wherein the state parameters are used for representing the stage of the target display effect;
the second acquisition module is used for acquiring original display content, wherein the original display content comprises a plurality of display windows;
the rendering module is used for rendering the original display content by utilizing the target shader according to the state parameter to obtain target display content with a target display effect, wherein the target shader is a shader corresponding to the target display effect;
and the display module is used for displaying the target display content.
10. The display control apparatus according to claim 9, wherein a plurality of display windows included in the original display content are respectively generated by a plurality of application programs in one-to-one correspondence;
the first obtaining module is specifically configured to:
and acquiring the state parameters of the target display effect from any application program in the plurality of application programs.
11. The display control device of claim 9, further comprising a generation module configured to:
and generating the original display content according to the display windows, and adding the original display content to a cache region.
12. The display control device according to claim 11, wherein the second acquisition module is specifically configured to:
and acquiring the original display content from the buffer memory area.
13. The display control apparatus according to claim 11 or 12, wherein the terminal device has a display driving unit and a display screen;
the display module is specifically used for:
updating the original display content corresponding to the buffer area by using the target display content;
and controlling the display driving unit to read the target display content in the buffer area, and driving the display screen to display according to the target display content.
14. The display control device of claim 9, wherein the rendering module is specifically configured to:
transmitting the state parameters and the original display content to a rendering engine;
and controlling the rendering engine to configure the target shader by using the state parameter, and inputting the original display content into the target shader for rendering processing to obtain target display content output by the target shader.
15. The display control apparatus according to claim 9, wherein the terminal device has a graphics processor; the apparatus further comprises an initialization module for:
controlling the rendering engine to acquire the target shader from a plurality of shaders according to the target display effect, writing the target shader into the graphics processor, and configuring the target shader by using initial parameters, wherein each of the plurality of shaders corresponds to a different display effect.
16. The display control device according to any one of claims 9 to 15, wherein the first acquisition module is specifically configured to:
acquiring state parameters of a target display effect frame by frame according to a preset sequence;
the second obtaining module is specifically configured to:
acquiring original display contents frame by frame according to a preset sequence;
the rendering module is specifically configured to:
according to the state parameters of each frame, rendering the original display content of the corresponding frame by using the target shader to obtain target display content with a target display effect;
the display module is specifically used for:
and displaying the target display content frame by frame according to a preset sequence.
17. An electronic device, comprising a memory and a processor, wherein the memory is configured to store computer instructions executable on the processor, and the processor is configured to implement the display control method of any one of claims 1 to 8 when executing the computer instructions.
18. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method of any one of claims 1 to 8.
CN202210952023.7A 2022-08-09 2022-08-09 Display control method, display control device, electronic equipment and storage medium Pending CN117631933A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210952023.7A CN117631933A (en) 2022-08-09 2022-08-09 Display control method, display control device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210952023.7A CN117631933A (en) 2022-08-09 2022-08-09 Display control method, display control device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117631933A (en)

Family

ID=90032547

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210952023.7A Pending CN117631933A (en) 2022-08-09 2022-08-09 Display control method, display control device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117631933A (en)

Similar Documents

Publication Publication Date Title
CN111580904B (en) Display state switching method, display state switching device and storage medium
EP3454192A1 (en) Method and device for displaying page
CN111078170B (en) Display control method, display control device, and computer-readable storage medium
CN109451341B (en) Video playing method, video playing device, electronic equipment and storage medium
CN111610912B (en) Application display method, application display device and storage medium
CN111866571B (en) Method and device for editing content on smart television and storage medium
CN108322673B (en) Video generation method and video generation device
CN111611034A (en) Screen display adjusting method and device and storage medium
CN112785672A (en) Image processing method and device, electronic equipment and storage medium
CN108829473B (en) Event response method, device and storage medium
US20210335390A1 (en) Method and device for generating dynamic image
CN106447747B (en) Image processing method and device
CN112882784A (en) Application interface display method and device, intelligent equipment and medium
CN105630486B (en) Typesetting method and device for desktop of intelligent terminal equipment
CN117119260A (en) Video control processing method and device
CN114268802B (en) Virtual space display method and device, electronic equipment and storage medium
CN109407942B (en) Model processing method and device, control client and storage medium
CN117631933A (en) Display control method, display control device, electronic equipment and storage medium
CN114827721A (en) Video special effect processing method and device, storage medium and electronic equipment
CN109754452B (en) Image rendering processing method and device, electronic equipment and storage medium
CN109389547B (en) Image display method and device
CN111246012B (en) Application interface display method and device and storage medium
CN113919311A (en) Data display method and device, electronic equipment and storage medium
CN110312117B (en) Data refreshing method and device
US20220291890A1 (en) Method for interaction between devices and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination