CN114579076B - Data processing method and related device

Data processing method and related device

Info

Publication number
CN114579076B
Authority
CN
China
Prior art keywords
frame rate
vsync
thread
frame
application
Prior art date
Legal status
Active
Application number
CN202210114714.XA
Other languages
Chinese (zh)
Other versions
CN114579076A (en)
Inventor
蔡立峰
沈赫
杜鸿雁
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202210114714.XA
Priority to CN202310358322.2A
Publication of CN114579076A
Application granted
Publication of CN114579076B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/147 Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Telephone Function (AREA)

Abstract

The embodiments of the present application provide a data processing method and a related apparatus, applied to the field of terminal technologies. The method includes the following steps: while an application is drawing and rendering images at a first frame rate, a frame rate control system receives a message carrying a second frame rate sent by a composition thread, where the ratio of the first frame rate to the second frame rate is an integer greater than 1; in response to the message, the frame rate control system adds an identifier for adjusting the time at which the drawn and rendered images are composited, and controls the application to draw and render images at the second frame rate; at a first moment, the frame rate control system controls the composition thread, based on the identifier, to composite the drawn and rendered images at the second frame rate, and controls the display driver to drive the screen to display the composited images at the second frame rate. In this way, drawing and rendering are switched first, and composition and display are switched afterwards, so that the display interval between each frame and the previous frame stays consistent with the drawing and rendering interval. This reduces the sliding-speed jumps caused by a mismatch between the display interval and the rendering interval, reduces stutter, and improves user experience.

Description

Data processing method and related device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a data processing method and a related apparatus.
Background
Currently, a user can browse various types of content through the display screen of a terminal device. When there is a large amount of content, the display screen cannot display all of it at once, and the user can slide to browse the related content on the display screen.
The interface display of the display screen of the terminal device generally needs to go through processes such as drawing, rendering, and composition. Illustratively, the interface drawing process of the terminal device may include background drawing, sub-view drawing, scroll bar drawing, and the like. The interface composition process of the terminal device may include processes such as vertex processing and pixel processing.
However, when the terminal device switches the screen refresh rate while the screen interface is changing, the terminal device may exhibit a stutter phenomenon.
Disclosure of Invention
The embodiments of the present application provide a data processing method and a related apparatus, which are applied to a terminal device. The method is used to solve the stutter caused by switching the screen refresh rate while the screen interface of the terminal device is changing.
In a first aspect, an embodiment of the present application provides a data processing method, which is applied to a terminal device, where the terminal device includes an application, a frame rate control system, a composition thread, a cache thread, and a display driver.
The method includes the following steps: while the application is drawing and rendering images at a first frame rate, the frame rate control system receives a message carrying a second frame rate sent by the composition thread, where the ratio of the first frame rate to the second frame rate is an integer greater than 1; in response to receiving the message carrying the second frame rate, the frame rate control system adds an identifier and controls the application to draw and render images at the second frame rate, where the identifier is used to adjust the time at which the composition thread composites the images drawn and rendered by the application; at a first moment, the frame rate control system controls the composition thread, based on the identifier, to composite at the second frame rate the images drawn and rendered by the application, and controls the display driver to drive the screen to display the images composited by the composition thread at the second frame rate. The first moment lies between the (1+A)-th Vsync-HW signal and the (2+A)-th Vsync-HW signal after the frame rate control system receives the message carrying the second frame rate, where A is the number of buffers in the buffer queue corresponding to the application in the cache thread when the frame rate control system receives the message carrying the second frame rate, and the Vsync-HW signal is used to trigger the display driver to drive the screen to display the images composited by the composition thread.
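By way of illustration only, the following Java sketch outlines the switch-down sequence described above: switch drawing and rendering to the second frame rate immediately, record the identifier, and schedule the composition and display switch for the first moment. All class, interface, and method names (FrameRateControlSystem, VsyncThread, onFrameRateMessage, and so on) are hypothetical stand-ins and are not taken from the patent or from any platform source code.

    interface VsyncThread {
        void setAppVsyncRate(int framesPerSecond);
        void scheduleCompositionAndDisplaySwitch(int framesPerSecond, long delayNs);
    }

    public class FrameRateControlSystem {
        private final VsyncThread vsyncThread;
        private volatile boolean adjustCompositionFlag;   // the "identifier" described above

        public FrameRateControlSystem(VsyncThread vsyncThread) {
            this.vsyncThread = vsyncThread;
        }

        public boolean hasIdentifier() {
            return adjustCompositionFlag;
        }

        // Called when the composition thread reports the second frame rate.
        public void onFrameRateMessage(int firstRate, int secondRate, int bufferedFrames) {
            // Only the case where firstRate / secondRate is an integer greater than 1 is handled here.
            if (secondRate <= 0 || firstRate % secondRate != 0 || firstRate / secondRate <= 1) {
                return;
            }
            adjustCompositionFlag = true;                 // add the identifier
            vsyncThread.setAppVsyncRate(secondRate);      // switch drawing and rendering first

            // First moment: (A + 1) Vsync-HW periods of the first frame rate after the message,
            // where A is the number of buffered frames in the application's buffer queue.
            long periodNs = 1_000_000_000L / firstRate;
            long delayNs = (bufferedFrames + 1L) * periodNs;
            vsyncThread.scheduleCompositionAndDisplaySwitch(secondRate, delayNs);
        }
    }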
Therefore, when the screen refresh rate of the terminal device is switched, the drawing and rendering process of images is switched first, and the composition process and the display process of images are switched afterwards, so that the display interval between each frame and the previous frame is consistent with the drawing and rendering interval, and the display rhythm of each frame is consistent with its drawing and rendering rhythm. This reduces the sliding-speed jumps caused by a mismatch between the display interval and the drawing and rendering interval, reduces stutter, and improves user experience. In addition, adjusting the image composition time based on the identifier reduces the cases in which no image is sent for display, further reducing stutter.
Optionally, the terminal device further includes a Vsync thread; in response to receiving the message carrying the second frame rate, the frame rate control system controlling the application to draw and render images at the second frame rate includes: after the frame rate control system receives the message carrying the second frame rate, the frame rate control system sends a first message to the Vsync thread, where the first message is used to indicate that the application frame rate is switched to the second frame rate; the Vsync thread generates a Vsync-APP signal at the second frame rate based on the first message and sends the Vsync-APP signal to the application; and the application draws and renders images based on the Vsync-APP signal.
In the embodiment of the present application, generating the Vsync-APP signal at the second frame rate may be understood as generating the Vsync-APP signal in the Vsync signal period of the second frame rate.
In this way, the time interval between Vsync-APP signals is modified, and the drawing and rendering process of images is switched from the first frame rate to the second frame rate; this manner is simple and convenient and saves computing resources.
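As a minimal sketch, assuming a hypothetical VsyncAppGenerator class that stands in for the Vsync thread, generating the Vsync-APP signal "at the second frame rate" can be pictured as simply ticking with the Vsync signal period of that frame rate:

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.ScheduledFuture;
    import java.util.concurrent.TimeUnit;

    class VsyncAppGenerator {
        private final ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
        private ScheduledFuture<?> ticking;

        // Switch the Vsync-APP tick period to the given frame rate; the old period is cancelled.
        synchronized void switchTo(int frameRate, Runnable sendVsyncAppToApplication) {
            if (ticking != null) {
                ticking.cancel(false);
            }
            long periodNs = 1_000_000_000L / frameRate;   // e.g. 60 fps -> about 16.67 ms
            ticking = timer.scheduleAtFixedRate(sendVsyncAppToApplication, 0, periodNs, TimeUnit.NANOSECONDS);
        }
    }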
Optionally, the terminal device further includes a Vsync thread; the identifier is a first identifier used to adjust, based on an offset of the Vsync-SF signal, the time at which the composition thread composites, at the second frame rate, the images drawn and rendered by the application. At the first moment, the frame rate control system controlling the composition thread to composite, at the second frame rate, the images drawn and rendered by the application, and controlling the display driver to drive the screen to display the images composited by the composition thread at the second frame rate, includes the following steps: at the first moment, the frame rate control system sends a second message to the Vsync thread, where the second message is used to indicate that the composition frame rate and the screen refresh rate are switched to the second frame rate; the Vsync thread generates a Vsync-SF signal at the second frame rate based on the second message and the first identifier and sends the Vsync-SF signal to the composition thread; the composition thread composites the drawn and rendered images based on the Vsync-SF signal; the Vsync thread sends a third message to the display driver based on the second message, where the third message is used to indicate that the screen refresh rate is switched to the second frame rate; the display driver controls the screen to generate the Vsync-HW signal at the second frame rate based on the third message, where the generation time of the Vsync-HW signal is later than that of the Vsync-SF signal; and upon receiving the Vsync-HW signal, the display driver controls the screen to display the composited image.
In the embodiment of the present application, generating the Vsync-SF signal at the second frame rate may be understood as generating the Vsync-SF signal at the Vsync signal period of the second frame rate. Generating the Vsync-HW signal at the second frame rate may be understood as generating the Vsync-HW signal at the Vsync signal period of the second frame rate.
In this way, the time interval between Vsync-SF signals and the time interval between Vsync-HW signals are modified, so that the image composition process and the image display process are respectively switched from the first frame rate to the second frame rate; this manner is simple and convenient and saves computing resources. In addition, adjusting the image composition time based on the offset of the Vsync-SF signal reduces the cases in which no image is sent for display, further reducing stutter.
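The following sketch illustrates, under the same hypothetical naming, how the offset of the Vsync-SF signal relative to the Vsync-HW signal could be represented; it is an illustrative model, not the claimed implementation:

    class VsyncSfTiming {
        private long sfOffsetNs;   // offset of Vsync-SF relative to Vsync-HW

        // Advance Vsync-SF by one Vsync signal period of the first (higher) frame rate.
        void advanceByOneFirstRatePeriod(int firstFrameRate) {
            sfOffsetNs = 1_000_000_000L / firstFrameRate;
        }

        // The Vsync-SF signal fires earlier than the corresponding Vsync-HW signal,
        // so composition of the queued frame can finish before the display deadline.
        long nextVsyncSfTimeNs(long nextVsyncHwTimeNs) {
            return nextVsyncHwTimeNs - sfOffsetNs;
        }
    }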
Optionally, the Vsync thread generating the Vsync-SF signal at the second frame rate based on the second message and the first identifier and sending the Vsync-SF signal to the composition thread includes: when receiving an image drawn and rendered by the application, the cache thread sends a fourth message to the frame rate control system, where the fourth message is used to indicate querying the first identifier; when the first identifier exists in the frame rate control system, the frame rate control system sends a fifth message to the cache thread, where the fifth message is used to indicate that the first identifier exists in the frame rate control system; the cache thread sends a sixth message to the Vsync thread, where the sixth message is used to indicate adjusting the offset; the Vsync thread generates the Vsync-SF signal at the second frame rate, advanced by the time corresponding to the offset, based on the sixth message; and the Vsync thread sends the Vsync-SF signal to the composition thread.
Thus, the image composition time is advanced based on the offset of the Vsync-SF signal, the waiting time for image composition is reduced, and images are composited in time, which reduces the cases in which no image is sent for display and further reduces stutter.
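A minimal sketch of this query flow follows, assuming hypothetical FrameRateControl and VsyncSfControl interfaces and assuming, as in the optional design described below, that the offset equals one Vsync period of the first frame rate:

    interface FrameRateControl {                 // stands in for the frame rate control system
        boolean hasFirstIdentifier();            // fourth/fifth message: query the first identifier
    }

    interface VsyncSfControl {                   // stands in for the Vsync thread
        void advanceNextVsyncSf(long offsetNs);  // sixth message: adjust the offset
    }

    class CacheThread {
        private final FrameRateControl frameRateControl;
        private final VsyncSfControl vsyncThread;

        CacheThread(FrameRateControl frameRateControl, VsyncSfControl vsyncThread) {
            this.frameRateControl = frameRateControl;
            this.vsyncThread = vsyncThread;
        }

        // Called when a drawn-and-rendered buffer arrives from the application.
        void onBufferQueued(int firstFrameRate) {
            if (frameRateControl.hasFirstIdentifier()) {
                long offsetNs = 1_000_000_000L / firstFrameRate;   // one Vsync period of the first rate
                vsyncThread.advanceNextVsyncSf(offsetNs);
            }
        }
    }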
Optionally, the time corresponding to the offset is the period duration of one vertical synchronization (Vsync) signal corresponding to the first frame rate.
In this way, the Vsync-SF signal is one Vsync signal period of the first frame rate ahead of the Vsync-HW signal, the composition time of images is advanced, and images are composited in time, so that an image can be composited before the Vsync-HW signal arrives. This reduces the cases in which no image is sent for display and further reduces stutter.
Optionally, the terminal device further includes a Vsync thread; the identifier is a second identifier used to instruct the composition thread to composite images immediately. At the first moment, the frame rate control system controlling the composition thread to composite, at the second frame rate, the images drawn and rendered by the application, and controlling the display driver to drive the screen to display the images composited by the composition thread at the second frame rate, includes: at the first moment, the frame rate control system sends a seventh message to the Vsync thread, where the seventh message is used to indicate that the composition frame rate and the screen refresh rate are switched to the second frame rate; the Vsync thread generates the Vsync-SF signal at the second frame rate based on the seventh message; when the cache thread receives an image drawn and rendered by the application, the cache thread sends the image drawn and rendered by the application to the composition thread based on the second identifier; the composition thread composites the image drawn and rendered by the application; the Vsync thread sends an eighth message to the display driver based on the seventh message, where the eighth message is used to indicate that the screen refresh rate is switched to the second frame rate; the display driver controls the screen to generate the Vsync-HW signal at the second frame rate based on the eighth message; and upon receiving the Vsync-HW signal, the display driver controls the screen to display the composited image.
Therefore, by modifying the image composition manner, a drawn and rendered image is composited immediately after it is received, which reduces the waiting time for image composition, composites images in time, further reduces the cases in which no image is sent for display, and further reduces stutter.
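As an illustrative sketch only, the "compose immediately" path selected by the second identifier can be modelled as the cache thread forwarding a freshly queued buffer straight to the composition thread instead of waiting for the next Vsync-SF signal; all names are hypothetical:

    interface FrameRateQuery {                       // stands in for the frame rate control system
        boolean hasSecondIdentifier();
    }

    interface CompositionThread {
        void compose(Object renderedBuffer);
    }

    class CacheThreadImmediate {
        private final FrameRateQuery frameRateControl;
        private final CompositionThread compositionThread;

        CacheThreadImmediate(FrameRateQuery frameRateControl, CompositionThread compositionThread) {
            this.frameRateControl = frameRateControl;
            this.compositionThread = compositionThread;
        }

        // Called when a drawn-and-rendered buffer arrives from the application.
        void onBufferQueued(Object renderedBuffer) {
            if (frameRateControl.hasSecondIdentifier()) {
                compositionThread.compose(renderedBuffer);   // compose immediately, no Vsync-SF wait
            }
            // Otherwise the buffer waits for the next Vsync-SF signal as usual.
        }
    }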
Optionally, when the cache thread receives an image drawn and rendered by the application, the cache thread sending the image drawn and rendered by the application to the composition thread based on the second identifier includes: when the cache thread receives the image drawn and rendered by the application, the cache thread sends a ninth message to the frame rate control system, where the ninth message is used to indicate querying the second identifier; when the second identifier exists in the frame rate control system, the frame rate control system sends a tenth message to the cache thread, where the tenth message is used to indicate that the second identifier exists; and the cache queue sends the image drawn and rendered by the application to the composition thread based on the tenth message.
Optionally, the difference between the first moment and the moment at which the frame rate control system receives the message carrying the second frame rate satisfies: difference = (buffer count + 1) × Vsync period duration corresponding to the first frame rate.
In this way, the calculation of the first moment can be simplified, and computing resources are saved.
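As a worked example of this formula (with illustrative numbers only, not values from the patent): for a switch from 120 Hz to 60 Hz with 2 buffers in the application's buffer queue, the first moment falls (2 + 1) × 8.33 ms ≈ 25 ms after the message is received. A small Java sketch of the same arithmetic:

    public class FirstMomentExample {
        public static void main(String[] args) {
            int firstFrameRate = 120;       // Hz before the switch (illustrative)
            int bufferCount = 2;            // buffers in the application's buffer queue (illustrative)
            double periodMs = 1000.0 / firstFrameRate;        // about 8.33 ms per Vsync period
            double delayMs = (bufferCount + 1) * periodMs;    // (buffer count + 1) periods, about 25 ms
            System.out.printf("first moment is about %.2f ms after the message%n", delayMs);
        }
    }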
Optionally, the buffer count is the sum of the number of buffers that have been rendered and the number of buffers that are being rendered in the buffer queue corresponding to the application.
Optionally, while the application is drawing and rendering images at the first frame rate, after the frame rate control system receives the message carrying the second frame rate sent by the composition thread, the method further includes: the frame rate control system calculates the ratio of the first frame rate to the second frame rate; when the ratio is an integer greater than 1, the frame rate control system obtains the buffer count from the buffer queue corresponding to the application; and the frame rate control system determines the first moment based on the buffer count.
Optionally, when the ratio is an integer greater than 1, the frame rate control system obtaining the buffer count from the buffer queue corresponding to the application includes: the frame rate control system obtains the focus window from the window manager, where the focus window corresponds to the application; and the frame rate control system obtains the buffer count from the buffer queue corresponding to the application based on the focus window.
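A minimal sketch of this decision flow follows, with hypothetical WindowManagerStub and BufferQueueStub interfaces standing in for the window manager and the application's buffer queue:

    interface WindowManagerStub {
        String getFocusWindowPackage();                       // focus window -> application
    }

    interface BufferQueueStub {
        int countRenderedAndRendering(String packageName);    // rendered + being-rendered buffers
    }

    class SwitchPlanner {
        private final WindowManagerStub windowManager;
        private final BufferQueueStub bufferQueues;

        SwitchPlanner(WindowManagerStub windowManager, BufferQueueStub bufferQueues) {
            this.windowManager = windowManager;
            this.bufferQueues = bufferQueues;
        }

        // Returns the first moment as a delay in nanoseconds, or -1 when no delayed switch is needed.
        long planFirstMomentNs(int firstRate, int secondRate) {
            if (secondRate <= 0 || firstRate % secondRate != 0 || firstRate / secondRate <= 1) {
                return -1;                                    // ratio is not an integer greater than 1
            }
            String focusApp = windowManager.getFocusWindowPackage();
            int bufferCount = bufferQueues.countRenderedAndRendering(focusApp);
            long periodNs = 1_000_000_000L / firstRate;
            return (bufferCount + 1L) * periodNs;             // (buffer count + 1) Vsync periods
        }
    }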
In a second aspect, an embodiment of the present application provides a terminal device, which may also be referred to as a terminal (terminal), a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), or the like. The terminal device may be a mobile phone (mobile phone), a smart TV, a wearable device, a tablet computer (Pad), a computer with a wireless transceiving function, a Virtual Reality (VR) terminal device, an Augmented Reality (AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in self-driving (self-driving), a wireless terminal in remote surgery (remote medical surgery), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), and so on.
The terminal device comprises a processor for invoking a computer program in a memory for performing the method according to the first aspect.
In a third aspect, embodiments of the present application provide a computer-readable storage medium storing computer instructions, which, when executed on a terminal device, cause the terminal device to perform the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a computer program product, which, when executed, causes a terminal device to perform the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip comprising a processor, the processor being configured to call a computer program in a memory to perform the method according to the first aspect.
It should be understood that the second aspect to the fifth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the beneficial effects achieved by the aspects and the corresponding possible implementations are similar and will not be described again.
Drawings
Fig. 1 is a schematic structural diagram of a hardware system of a terminal device according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a terminal device software system according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a terminal device interface display processing flow in a possible implementation;
FIG. 4 is a schematic diagram of an interface display processing flow corresponding to frame rate switching in one possible implementation;
FIG. 5 is a schematic diagram of an interface display processing flow in a possible implementation;
FIG. 6 is a schematic diagram of an interface display in a possible implementation;
fig. 7 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 8 is a schematic process diagram of interaction among modules in the data processing method according to the embodiment of the present application;
fig. 9 is a schematic view of a processing flow of terminal device interface display according to an embodiment of the present application;
fig. 10 is a schematic flowchart of a data processing method according to an embodiment of the present application;
fig. 11 is a schematic view of a terminal device interface display processing flow provided in an embodiment of the present application;
fig. 12 is a schematic view of a terminal device interface display processing flow provided in an embodiment of the present application;
fig. 13 is a schematic view of a terminal device interface display processing flow provided in an embodiment of the present application;
fig. 14 is a schematic view of a terminal device interface display processing flow provided in an embodiment of the present application;
fig. 15 is a schematic view of a terminal device interface display processing flow provided in an embodiment of the present application;
fig. 16 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application;
fig. 17 is a schematic hardware configuration diagram of a data processing apparatus according to an embodiment of the present application.
Detailed Description
In the embodiments of the present application, terms such as "first" and "second" are used to distinguish identical or similar items having substantially the same function and effect. For example, the first chip and the second chip are only used for distinguishing different chips, and their sequence order is not limited. Those skilled in the art will appreciate that the terms "first," "second," and the like do not denote any order or importance.
It should be noted that in the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate examples, illustrations or explanations. Any embodiment or design described herein as "exemplary" or "such as" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the embodiments of the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a alone, A and B together, and B alone, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
The embodiment of the application provides a data processing method which can be applied to electronic equipment with a display function.
The electronic device includes a terminal device, which may also be referred to as a terminal (terminal), a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), and so on. The terminal device may be a mobile phone (mobile phone), a smart TV, a wearable device, a tablet computer (Pad), a computer with a wireless transceiving function, a Virtual Reality (VR) terminal device, an Augmented Reality (AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in self-driving (self-driving), a wireless terminal in remote surgery (remote medical surgery), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), and so on. The embodiments of the present application do not limit the specific technology and the specific device form adopted by the terminal device.
In order to better understand the embodiments of the present application, the following describes the structure of the terminal device according to the embodiments of the present application:
fig. 1 shows a schematic configuration diagram of a terminal device 100. The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the terminal device 100. In other embodiments of the present application, terminal device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it may be called from memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, the processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the terminal device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus, enabling communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of receiving a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to implement the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a display screen serial interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture function of terminal device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the terminal device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device 100, and may also be used to transmit data between the terminal device 100 and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is an exemplary illustration, and does not limit the structure of the terminal device 100. In other embodiments of the present application, the terminal device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the terminal device 100. The charging management module 140 may also supply power to the terminal device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In other embodiments, the power management module 141 may be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may also be disposed in the same device.
The wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. The antennas in terminal device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the terminal device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the terminal device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the terminal device 100 can communicate with a network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code Division Multiple Access (CDMA), wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. GNSS may include Global Positioning System (GPS), global navigation satellite system (GLONASS), beidou satellite navigation system (BDS), quasi-zenith satellite system (QZSS), and/or Satellite Based Augmentation System (SBAS).
The terminal device 100 implements a display function by the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used for displaying images, displaying videos, receiving slide operations, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the terminal device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The terminal device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the terminal device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the terminal device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record video in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the terminal device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, a phonebook, etc.) created during use of the terminal device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the terminal device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The terminal device 100 may implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The terminal device 100 can listen to music through the speaker 170A, or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the terminal device 100 answers a call or voice information, it is possible to answer a voice by bringing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking near the microphone 170C through the mouth. The terminal device 100 may be provided with at least one microphone 170C. In other embodiments, the terminal device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the terminal device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The terminal device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the terminal device 100 detects the intensity of the touch operation from the pressure sensor 180A. The terminal device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but have different touch operation intensities may correspond to different operation instructions.
The gyro sensor 180B may be used to determine the motion attitude of the terminal device 100. In some embodiments, the angular velocity of the terminal device 100 about three axes (i.e., x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the terminal device 100, calculates the distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the terminal device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the terminal device 100 calculates an altitude from the barometric pressure measured by the barometric pressure sensor 180C, and assists in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The terminal device 100 may detect the opening and closing of a flip leather case using the magnetic sensor 180D. In some embodiments, when the terminal device 100 is a flip device, the terminal device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D, and then set features such as automatic unlocking upon flipping open according to the detected opening or closing state of the leather case or of the flip cover.
The acceleration sensor 180E can detect the magnitude of acceleration of the terminal device 100 in various directions (generally, three axes). The magnitude and direction of gravity can be detected when the terminal device 100 is stationary. The method can also be used for recognizing the posture of the terminal equipment, and is applied to application programs such as horizontal and vertical screen switching, pedometers and the like.
A distance sensor 180F for measuring a distance. The terminal device 100 may measure the distance by infrared or laser. In some embodiments, shooting a scene, the terminal device 100 may range using the distance sensor 180F to achieve fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The terminal device 100 emits infrared light outward through the light-emitting diode. The terminal device 100 detects infrared light reflected from a nearby object using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the terminal device 100. When insufficient reflected light is detected, the terminal device 100 can determine that there is no object near the terminal device 100. The terminal device 100 may use the proximity light sensor 180G to detect that the user is holding the terminal device 100 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light brightness. The terminal device 100 may adaptively adjust the brightness of the display screen 194 according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the terminal device 100 is in a pocket, in order to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The terminal device 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the terminal device 100 executes a temperature processing policy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds the threshold, the terminal device 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the terminal device 100 heats the battery 142 when the temperature is below another threshold to avoid abnormal shutdown of the terminal device 100 due to low temperature. In other embodiments, when the temperature is lower than a further threshold, the terminal device 100 performs boosting on the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting thereon or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the terminal device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human voice vibrating a bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signals acquired by the bone conduction sensor 180M, and the heart rate detection function is realized.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The terminal device 100 may receive a key input, and generate a key signal input related to user setting and function control of the terminal device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration prompts as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. Touch operations applied to different areas of the display screen 194 may also correspond to different vibration feedback effects. Different application scenarios (such as time reminders, receiving messages, alarm clocks, games, and the like) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the terminal device 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The terminal device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. The same SIM card interface 195 can be inserted with multiple cards at the same time. The types of the plurality of cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The terminal device 100 interacts with the network through the SIM card to implement functions such as communication and data communication. In some embodiments, the terminal device 100 employs eSIM, namely: an embedded SIM card. The eSIM card may be embedded in the terminal device 100 and cannot be separated from the terminal device 100.
The software system of the terminal device 100 may adopt a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, a cloud architecture, or the like. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the terminal device 100.
Fig. 2 is a block diagram of a software structure of a terminal device according to an embodiment of the present application. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into five layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, a hardware abstraction layer, and a kernel layer from top to bottom.
The application layer may include a series of application packages. As shown in fig. 2, the application packages may include phone, mailbox, calendar, camera, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, a frame rate control system, an image composition system, a view system, a package manager, an input manager, an activity manager, and a resource manager, among others.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The frame rate control system is used for adjusting the screen refresh rate.
The image synthesis system is used to control image synthesis and generate vertical synchronization (Vsync) signals.
The image synthesis system includes: a composition thread, a Vsync thread, and a cache (queue buffer) thread. The composition thread is woken up by the Vsync signal to perform composition. The Vsync thread is used to generate the next Vsync signal according to a Vsync signal request. The cache thread is used to store caches, generate Vsync signal requests, wake up the composition thread, and the like. One or more cache queues exist in the cache thread and are respectively used for storing caches corresponding to different applications.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The package manager is used for program management within the system, for example: application installation, uninstallation, upgrade, and the like.
The input manager is used to manage the programs of the input device. For example, the input manager may determine input operations such as a mouse click operation, a keyboard input operation, and a touch slide operation.
The activity manager is used for managing the life cycle of each application program and the navigation back function. It is responsible for the creation of the Android main thread and the maintenance of the life cycle of each application program.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: an image rendering library, an image composition library, a function library, a media library, an input processing library, and the like.
The image rendering library is used for rendering two-dimensional or three-dimensional images. The image composition library is used for composition of two-dimensional or three-dimensional images.
In a possible implementation manner, the application draws and renders an image through the image rendering library, and then sends the rendered image to a cache queue of the image synthesis system. Each time the Vsync signal arrives, the image synthesis system (e.g., SurfaceFlinger) sequentially acquires one frame of image to be synthesized from the cache queue, and then performs image synthesis through the image composition library.
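As an illustration only, this per-Vsync acquisition step can be sketched with a simple queue standing in for one cache queue of the image synthesis system; the class and method names below are hypothetical and are not the SurfaceFlinger API.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Illustrative stand-in for one cache queue: the application enqueues rendered frames,
    // and each Vsync-SF signal acquires exactly one frame to be synthesized.
    class BufferQueueSketch {
        private final Deque<String> queuedFrames = new ArrayDeque<>();

        // Called by the application after a frame has been drawn and rendered.
        void enqueueRenderedFrame(String frame) {
            queuedFrames.addLast(frame);
        }

        // Called once per Vsync-SF signal by the composition step.
        String acquireFrameToComposite() {
            return queuedFrames.pollFirst(); // null when no frame is waiting
        }
    }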
The function library provides macros, type definitions, character string operation functions, mathematical calculation functions, input and output functions, and the like used in the C language.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like.
The input processing library is used for processing a library of input devices, and can realize mouse, keyboard, touch input processing and the like.
The hardware abstraction layer may include a plurality of library modules, which may be, for example, hardware compositor (hwcomposer, HWC), camera library modules, and the like. The Android system can load corresponding library modules for the equipment hardware, and further the purpose that the application program framework layer accesses the equipment hardware is achieved. The device hardware may include, for example, an LCD display screen, a camera, etc. in the electronic device.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a Touch Panel (TP) driver, a display driver, a Bluetooth driver, a WIFI driver, a keyboard driver, a shared memory driver, a camera driver and the like.
The hardware may be audio devices, bluetooth devices, camera devices, sensor devices, etc.
The following describes an exemplary workflow of software and hardware of the terminal device 100 in conjunction with a scenario where an application is started or an interface is switched in the application.
When the touch sensor 180K in the touch panel receives a touch operation, the kernel layer processes the touch operation into an original input event (including information such as touch coordinates, touch force, and a time stamp of the touch operation). The raw input events are stored at the kernel layer. And the kernel layer reports the original input event to an input manager of the application program framework layer through the input processing library. And the input manager of the application program framework layer analyzes the information (including the operation type, the report point position and the like) of the original input event, determines the focus application according to the current focus, and sends the analyzed information to the focus application. The focus may be a touch point in a touch operation or a click position in a mouse click operation. The focus application is an application running in the foreground of the terminal equipment or an application corresponding to the touch position in the touch operation. The focus application determines a control corresponding to the original input event according to the parsed information (e.g., a hit position) of the original input event.
Taking the touch operation as a touch sliding operation, and taking a control corresponding to the touch sliding operation as a list control of the WeChat application as an example, the WeChat application calls an image rendering library in a system library to render and draw an image through a view system of an application program framework layer. And the WeChat application sends the rendered image to a cache queue of the image synthesis system. And synthesizing the rendered image drawn in the image synthesis system into a WeChat interface through an image synthesis library in the system library. The image synthesis system is driven by the display of the kernel layer, so that the screen (display screen) displays the corresponding interface of the WeChat application.
For ease of understanding, concepts related to the embodiments of the present application are described below by way of example.
1. Frame: refers to a single picture of the smallest unit in the interface display. A frame can be understood as a still picture and displaying a number of consecutive frames in rapid succession can create the illusion of motion of the object. The frame rate refers to the number of frames of a picture refreshed in 1 second, and can also be understood as the number of times of refreshing the picture per second by a graphics processor in the terminal device. A high frame rate may result in a smoother and more realistic animation. The greater the number of frames per second, the more fluid the displayed motion will be.
It should be noted that, before the interface displays the frame, processes such as drawing, rendering, and composition are usually required.
2. Frame drawing: refers to drawing a picture on the display interface. The display interface may be composed of one or more views, and each view may be drawn by a visual control of the view system. Each view is composed of sub-views, and one sub-view corresponds to a widget in the view; for example, one sub-view corresponds to a symbol in a picture view.
3. Frame rendering: refers to rendering the drawn view, adding a 3D effect, or the like. For example, the 3D effect may be a light effect, a shadow effect, a texture effect, and the like.
4. Frame synthesis: the process of compositing one or more rendered views into a display interface.
5. Sliding away from the hand: refers to the interface of the terminal device continuing to slide and be displayed at an initial speed after a touch slide on the interface ends and the finger leaves the screen.
For example, in applications such as setup, headline, etc., the interface displays a list layout, and after the finger that slides the interface is lifted, the interface may continue to slide at the initial speed after the finger has been lifted for a period of time.
It will be appreciated that when the interface includes a list control, the interface displays a list layout, that is, a list interface. The list control may be a ListView or a RecyclerView; the list control is not limited in the embodiment of the present application.
The following describes a display process of the interface of the terminal device 100 with software and hardware.
In order to improve the smoothness of display and reduce the occurrence of display jamming and the like, the terminal device generally performs display based on the Vsync signal to synchronize the flows of drawing, rendering, synthesizing, screen refreshing display and the like of an image.
It is understood that the Vsync signal is a periodic signal, and the period of the Vsync signal may be set according to the screen refresh rate, for example, when the screen refresh rate is 60Hz, the period of the Vsync signal may be 16.6ms, that is, the Vsync signal is periodically triggered by the terminal device generating a control signal every 16.6ms.
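As a small illustration, the Vsync signal period can be derived directly from the screen refresh rate; the helper below is illustrative and is not terminal device code.

    // Illustrative helper: Vsync signal period (in milliseconds) for a given screen refresh rate.
    final class VsyncPeriod {
        static double periodMs(double refreshRateHz) {
            return 1000.0 / refreshRateHz;
        }

        public static void main(String[] args) {
            // Prints 16.7 and 8.3; the text rounds 1000/60 down to 16.6 ms and 1000/120 to 8.3 ms.
            System.out.printf("60 Hz  -> %.1f ms%n", periodMs(60));
            System.out.printf("120 Hz -> %.1f ms%n", periodMs(120));
        }
    }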
It should be noted that the Vsync signal may be divided into a software Vsync signal and a hardware Vsync signal. The software Vsync signals include Vsync-APP and Vsync-SF. Vsync-APP is used to trigger the rendering process. Vsync-SF is used to trigger the synthesis flow. The hardware Vsync signal (Vsync-HW) is used to trigger the screen display refresh flow.
Typically, the software Vsync signal and the hardware Vsync signal maintain period synchronization. Taking 60Hz and 120Hz as examples, if Vsync-HW switches from 60Hz to 120Hz, the Vsync-APP and Vsync-SF change synchronously, both switch from 60Hz to 120Hz.
For example, fig. 3 is a schematic diagram of a terminal device interface display processing flow in a possible implementation. The contents displayed by the terminal device correspond to frame 1, frame 2, and frame 3 in the chronological order.
Specifically, taking the display of frame 1 as an example, the application of the terminal device draws and renders frame 1 through the view system of the application framework layer. After the rendering of frame 1 is completed, the application of the terminal device sends the rendered image data of frame 1 to the image synthesis system (e.g., SurfaceFlinger). The image synthesis system synthesizes the rendered frame 1. After frame 1 is synthesized, the terminal device may start the display driver by calling the kernel layer, and display the content corresponding to frame 1 on the screen (display screen). It can be understood that the application of the terminal device sends the rendered image data to the image synthesis system, and drawing and rendering frame 1 may refer to drawing and rendering the image data corresponding to frame 1.
Frame 2 and frame 3 are synthesized and displayed through a process similar to that of frame 1, which is not described in detail here. In fig. 3, each frame lags from drawing and rendering to display by 2 Vsync signal periods, so the display of the terminal device has hysteresis.
It should be noted that the terminal device may decrease the screen refresh rate to reduce the jamming when the system load is large, or increase the screen refresh rate to increase the fluency of the display when the system load is small.
For example, fig. 4 is a schematic diagram of an interface display processing flow corresponding to frame rate switching in a possible implementation. The contents displayed by the terminal device correspond to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 in order of time.
Specifically, taking the display of frame 2 as an example, the application of the terminal device draws and renders frame 2 through the view system of the application framework layer. After the rendering of frame 2 is completed, the application of the terminal device sends the rendered frame 2 to the image synthesis system (e.g., SurfaceFlinger). The image synthesis system synthesizes the rendered frame 2. After frame 2 is synthesized, the terminal device may start the display driver by calling the kernel layer, and display the content corresponding to frame 2. Frames 3, 4, 5, and 6 are synthesized and displayed through a process similar to that of frame 2, which is not described in detail here.
When the frame 2 is rendered, the frame rate control system of the terminal device decides to switch the frame rate (for example, from 120Hz to 60 Hz), and when the frame 3 is rendered, the frame rate is switched, and the period duration of the Vsync signal corresponding to the frame 3 is changed, so that the frame rate switching is completed.
Note that the terminal device determines the layout of the image and the like according to the displacement amount. In some sliding scenes (e.g., sliding away from the hand), the displacement amount of an image is related to the frame interval corresponding to the drawing and rendering of the previous frame (the previous Vsync cycle duration). Specifically, for example, in a constant-speed sliding scene, the displacement amount of the current frame during drawing and rendering is the sliding speed of the current frame multiplied by the frame interval of the previous frame (current frame Vsync-APP timestamp minus previous frame Vsync-APP timestamp). Illustratively, taking frame 3 in fig. 4 as an example, the displacement amount of frame 3 is the frame interval of frame 2 (timestamp of Vsync2 minus timestamp of Vsync1) multiplied by the sliding speed of frame 3.
The sliding speed at which the terminal device displays the image is obtained by dividing the displacement difference between the current frame and the previous frame (i.e., the displacement amount of the current frame) by the frame interval corresponding to the display of the previous frame (the display duration of the previous frame). Illustratively, taking frame 3 in fig. 4 as an example, the sliding speed of frame 3 is the displacement amount of frame 3 divided by the frame interval corresponding to the display of frame 2 (timestamp of Vsync4 minus timestamp of Vsync3).
Therefore, when the frame interval corresponding to the drawing and rendering of an image is consistent with the frame interval corresponding to its display, the image is displayed at the preset sliding speed. If the frame interval corresponding to the drawing and rendering of the image is not consistent with the frame interval corresponding to its display, the sliding speed of the display may jump, the displayed picture may appear stuck and not smooth, and the user experience may be poor.
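The two relations above can be written out as a small sketch; renderDisplacement and displaySpeed are illustrative names, and the numbers in main reproduce the frame 3 example of fig. 5.

    // Illustrative calculation for a constant-speed sliding scene.
    final class SlideMath {
        // Displacement drawn into the current frame: sliding speed x frame interval of the previous frame
        // (current Vsync-APP timestamp minus previous Vsync-APP timestamp).
        static double renderDisplacement(double speedPxPerMs, double currVsyncAppMs, double prevVsyncAppMs) {
            return speedPxPerMs * (currVsyncAppMs - prevVsyncAppMs);
        }

        // Sliding speed perceived on screen: displacement of the current frame divided by the
        // display duration of the previous frame (two consecutive display Vsync timestamps).
        static double displaySpeed(double displacementPx, double currVsyncHwMs, double prevVsyncHwMs) {
            return displacementPx / (currVsyncHwMs - prevVsyncHwMs);
        }

        public static void main(String[] args) {
            // Frame 3 in fig. 5: rendered against an 8.3 ms interval but displayed over a 16.6 ms interval,
            // so the perceived speed drops from 1 pixel/8.3 ms to 1 pixel/16.6 ms.
            double d = renderDisplacement(1.0 / 8.3, 8.3, 0.0);
            double v = displaySpeed(d, 41.5, 24.9);
            System.out.printf("displacement = %.1f pixel, perceived speed = %.3f pixel/ms%n", d, v);
        }
    }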
As can be seen in fig. 4, each frame in fig. 4 lags from drawing to display by 2 periods of the Vsync signal. When the screen refresh rate is switched, the frame interval corresponding to the frame 2 rendering is not consistent with the frame interval corresponding to the frame 2 displaying, and similarly, the frame interval corresponding to the frame 3 rendering is not consistent with the frame interval corresponding to the frame 3 displaying. This may cause the sliding speed of the display of the frames 3 and 4 to be different from the preset sliding speed, and the sliding speed of the display of the frames 3 and 4 may jump.
Next, the displacement amount and the sliding speed according to the flow in fig. 4 will be described with reference to fig. 5 and 6.
Illustratively, take the example where the list is slid at a constant speed, the screen refresh rate is switched from 120Hz to 60Hz, and the sliding speed is moved by 1pixel (pixel) every 8.3 milliseconds (ms). FIG. 5 is a schematic diagram of an interface display processing flow in a possible implementation. In fig. 5, the contents displayed by the terminal device correspond to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 in the order of time.
It is understood that the displacement amount is the product of the frame interval of the previous frame (current frame Vsync-APP timestamp minus previous frame Vsync-APP timestamp) and the sliding speed of the current frame. Illustratively, the displacement amount of frame 3 in FIG. 5 is (8.3 ms - 0 ms) × 1 pixel/8.3 ms, i.e., 1 pixel; similarly, the displacement amount of frame 4 is (24.9 ms - 8.3 ms) × 1 pixel/8.3 ms, i.e., 2 pixels.
As shown in fig. 5, when rendering is drawn by the terminal device in frame 2, frame rate switching is decided. When rendering starts at 8.3ms frame 3, frame rate switching has not been completed. Therefore, the shift amount of the frame 2 and the shift amount of the frame 3 are both 1pixel in relation to the screen refresh rate before switching (or the Vsync period duration before frame rate switching). At 24.9ms, frame rate switching is completed. The amount of displacement of the frame 4, the amount of displacement of the frame 5, and the amount of displacement of the frame 6 are all 2 pixels in relation to the screen refresh rate after switching (or the Vsync period duration after frame rate switching).
As can be seen from fig. 5, the frame interval corresponding to the frame 2 during rendering is 8.3ms-0ms, that is, 8.3ms, and the frame interval corresponding to the frame 2 during displaying is 41.5ms-24.9ms, that is, 16.6ms, so that the display rhythm of the terminal device is slowed, and the sliding speed during switching to the frame 3 display is reduced. Similarly, the sliding speed when the display is switched to the frame 2 display is decreased. The frame interval corresponding to the frame 3 during rendering is 24.9ms-8.3ms, namely 16.6ms, the frame interval corresponding to the frame 3 during displaying is 58.1ms-41.5ms, namely 16.6ms, the display rhythm of the frame 3 is the same as the rendering rhythm, and the sliding speed of the frame 4 is switched to be unchanged. When the frequency of 120Hz is switched to 60Hz, the sliding speed of the display of the terminal equipment is firstly reduced and then increased, so that the user perceives that the picture is jammed.
For ease of understanding, the display speed of fig. 5 is explained below with reference to fig. 6.
It is understood that the user perceives a change in speed when the picture is switched. Therefore, the sliding speed can be represented by dividing the difference between the displacement of the current frame and the displacement of the previous frame (i.e., the displacement amount of the current frame) by the display duration of the previous frame.
Illustratively, fig. 6 is a display diagram of the interface corresponding to frames 0, 1, 2, 3, 4, 5 and 6 in fig. 5. As shown in fig. 6, there is a triangle in the list interface. Take the absolute position of the display screen (screen) as 0-18pixel as an example. If the triangle position in frame 0 is at 0 and the displacement of frame 1 is 1pixel, then the triangle position in frame 1 is at 1pixel. The shift amount of frame 2 and the shift amount of frame 3 are both 1pixel. The displacement amount of the frame 4, the displacement amount of the frame 5, and the displacement amount of the frame 6 are all 2 pixels. The triangle positions are located at 2 pixels, 3 pixels, 5 pixels, 7 pixels, and 9 pixels in frame 2, frame 3, frame 4, frame 5, and frame 6, respectively.
In connection with fig. 5, at 8.3ms, the Vsync signal arrives, the display interface of the terminal device changes from frame 0 to frame 1, the position of the triangle moves from 0 to 1pixel, the moving speed is 1pixel/8.3ms, and the sliding speed perceived by the user is 1 pixel/(8.3 ms-0 ms), i.e. 1pixel/8.3ms. At 24.9ms, a Vsync signal arrives, the display interface of the terminal device is changed from frame 1 to frame 2, the moving speed of the triangle is 1pixel/16.6ms, and the sliding speed perceived by a user is 1 pixel/(24.9 ms-8.3 ms), namely 1pixel/16.6ms.
At 41.5 ms, the Vsync signal arrives, the display interface of the terminal device changes from frame 2 to frame 3, the moving speed of the triangle is 1 pixel/16.6 ms, and the sliding speed perceived by the user is 1 pixel/(41.5 ms - 24.9 ms), namely 1 pixel/16.6 ms. At 58.1 ms, the Vsync signal arrives, the display interface of the terminal device changes from frame 3 to frame 4, the moving speed of the triangle is 2 pixels/16.6 ms, and the sliding speed perceived by the user is 2 pixels/(58.1 ms - 41.5 ms), namely 2 pixels/16.6 ms. At 74.7 ms, the Vsync signal arrives, the display interface of the terminal device changes from frame 4 to frame 5, the moving speed of the triangle is 2 pixels/16.6 ms, and the sliding speed perceived by the user is 2 pixels/(74.7 ms - 58.1 ms), namely 2 pixels/16.6 ms.
In FIG. 5, the sliding speed changes from 1 pixel/8.3 ms to 1 pixel/16.6 ms and then to 2 pixels/16.6 ms. Because the sliding speed changes, the user perceives stutter, and the user experience is poor.
In summary, the screen refresh rate of the terminal device changes during the sliding process, and the frame interval corresponding to the rendering of the image may be greater than or less than the frame interval corresponding to the display of the image, which causes the sliding speed to jump (rise or fall) during the display, resulting in the image being stuck.
In view of this, embodiments of the present application provide a data processing method, when a terminal device performs frame rate switching, the application frame rate is switched first, and the composite frame rate and the screen refresh rate are switched in a delayed manner. Therefore, the frame interval corresponding to the image display is consistent with the frame interval corresponding to the frame image drawing and rendering, the jump of the sliding speed during the display is reduced, the pause phenomenon is further reduced, the image display is uniform and smooth, and the user experience is improved.
An application scenario provided by the embodiment of the present application is described below with reference to the drawings. Fig. 7 is a schematic view of an application scenario provided in the embodiment of the present application.
The terminal device may receive a user's slide-up or slide-down operation in the interface of the social application shown in a in fig. 7, or in the setting-related interface shown in b in fig. 7, the document interface shown in c in fig. 7, the goods browsing interface shown in d in fig. 7, or the like. The terminal device may also receive a user left-slide operation or right-slide operation in an interface shown by e in fig. 7, an electronic book interface shown by f in fig. 7, or the like. When the terminal device receives the sliding operation of the user, the terminal device performs the processes of frame drawing, rendering, composition and the like based on the sliding operation, and displays the content corresponding to the sliding operation.
The types of input events corresponding to the slide operation may be classified into a press (down), a move (move), and a lift (up).
In one possible scenario, after detecting the up event, the terminal device continues to slide the list display based on the initial speed at the time of the up event. When the sliding speed decreases to a certain speed threshold or a preset duration elapses, the terminal device switches the screen refresh rate from a high frame rate to a low frame rate (for example, from 120Hz to 60Hz) based on the method of the embodiment of the present application. In this way, the power consumption of the terminal device can be reduced by lowering the screen refresh rate, and the stutter caused by lowering the screen refresh rate can also be reduced.
It can be understood that the data processing method provided in the embodiment of the present application may also be applied to other scenes related to speed, and is not limited herein. In addition, the method provided by the embodiment of the application can be applied to a constant speed scene, a speed reduction scene and a speed increase scene. The embodiment of the present application is not limited to the change of the speed.
For convenience of understanding, the following describes a process of interaction between the modules involved in the data processing method provided in the embodiment of the present application with reference to fig. 8 and fig. 9.
Exemplarily, fig. 8 is a schematic process diagram of interaction between modules in the data processing method provided in the embodiment of the present application.
As shown in fig. 8, the system may include: an application, a frame rate control system, an image composition system (SurfaceFlinger), a window manager, a hardware compositor, and a display driver. The application includes an application main thread and an application rendering thread. The image composition system includes a Vsync thread, a cache thread, and a composition thread. The application main thread may also be referred to as a logic thread or a UI thread.
Taking the case where the preset duration has elapsed after entering the sliding-away-from-the-hand process as an example, when the touch panel receives a touch operation, the kernel layer processes the touch operation into an original input event (including information such as touch coordinates, touch force, and a timestamp of the touch operation). The original input event is stored at the kernel layer. The kernel layer reports the original input event to the input manager of the application framework layer through the input processing library. The input manager of the application framework layer parses the information of the original input event (including the operation type, the reported touch position, and the like), determines the focus application according to the current focus, and sends the parsed information of the original input event (such as a down event, a move event, or an up event) to the composition thread.
And when the terminal equipment receives the sliding operation while displaying at the second frame rate, the composition thread receives the down event, the move event and the up event sent by the input manager. And when the composition thread receives a down event sent by the input manager, sending a message for indicating the switching of the screen refresh rate to the frame rate control system, wherein the message carries the first frame rate. Illustratively, the first frame rate is 120Hz and the second frame rate is 60Hz.
S801, the synthetic thread receives the up event.
Illustratively, the synthetic thread receives information sent by the input manager, which carries the up event.
S802, after the preset duration elapses, the composition thread sends a message for indicating switching of the screen refresh rate to the frame rate control system, where the message carries the second frame rate.
Fig. 9 is a schematic diagram of a display process provided in an embodiment of the present application. As shown in fig. 9, if the first frame rate is 120Hz, the second frame rate is 60Hz. Also, in fig. 9, the contents displayed by the terminal device correspond to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 in the order of time.
In FIG. 9, the Vsync-APP signal arrives at 0 ms and the application main thread starts rendering frame 2. During the period of 0 ms-8.3 ms, the terminal device detects that the lift-off event has occurred and that the preset duration has elapsed. The composition thread sends a message instructing to switch the screen refresh rate to the frame rate control system.
It is understood that the preset time may be 1s or 3s. The specific value of the preset duration is not limited in the embodiment of the application.
In a possible implementation manner, the composition thread sets a timer when receiving the up event, and sends a message for instructing to switch the screen refresh rate to the frame rate control system after the timer finishes timing. Illustratively, the timer count may be 3s.
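This timer-based trigger can be sketched as follows, with a plain Java scheduler standing in for the composition thread's timer; FrameRateController and requestScreenRefreshRate are hypothetical names and are not a real Android API.

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Illustrative sketch: on the up event a timer is armed, and once the preset duration expires
    // the frame rate control system is asked to switch the screen refresh rate to the second frame rate.
    class FlingSwitchTimer {
        interface FrameRateController {
            void requestScreenRefreshRate(int frameRateHz);
        }

        private final ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();

        void onUpEvent(FrameRateController controller, int secondFrameRateHz, long presetDelayMs) {
            timer.schedule(
                    () -> controller.requestScreenRefreshRate(secondFrameRateHz), // e.g. 60 Hz
                    presetDelayMs,                                                // e.g. 3000 ms, as in the 3 s example
                    TimeUnit.MILLISECONDS);
        }
    }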
S803, when the ratio of the first frame rate to the second frame rate is an integer, the frame rate control system adds a first identifier. The first identifier is used to adjust the offset of Vsync-SF.
For example, in the flow shown in fig. 9, the ratio of the first frame rate to the second frame rate is 2, which is an integer, so the frame rate control system adds the first identifier.
S804, the frame rate control system inquires a focus window to the window manager. Adaptively, the window manager feeds back the application package name and the focus window layer corresponding to the focus window to the frame rate control system. The focus window is a window corresponding to an application operated by the foreground of the terminal equipment.
For example, in the flow shown in fig. 9, the focus window is a window corresponding to an input event.
It can be understood that S803 and S804 may be executed simultaneously or not, and the order of S803 and S804 is not limited in this embodiment of the application.
In a possible implementation manner, the frame rate control system may determine the focus application according to the application package name corresponding to the focus window, and further determine whether the focus application is in a pre-stored delayed-switching application list. If the focus application is in the delayed-switching application list, the following steps S805 to S830 are performed. If the focus application is not in the delayed-switching application list, the application frame rate and the screen refresh rate are switched synchronously.
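A minimal sketch of this list lookup is shown below; the package names and the class are hypothetical examples.

    import java.util.Set;

    // Illustrative check against the pre-stored delayed-switching application list.
    class DelayedSwitchPolicy {
        private final Set<String> delayedSwitchApps = Set.of("com.example.chat", "com.example.reader");

        // true: switch the application frame rate first and delay the screen refresh rate (S805-S830);
        // false: switch the application frame rate and the screen refresh rate synchronously.
        boolean shouldDelaySwitch(String focusPackageName) {
            return delayedSwitchApps.contains(focusPackageName);
        }
    }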
S805, the frame rate control system inquires the cache number in the cache queue corresponding to the focus window from the cache thread according to the focus window; adaptively, the buffer thread feeds back the buffer amount to the frame rate control system.
It should be noted that the buffers in the cache queue include one or more of the following: a cached buffer (queued buffer), a buffer being rendered (dequeued buffer), a buffer being composited (acquired buffer), and an unused buffer (free buffer).
A cached buffer may be understood as a buffer in which a rendered image is stored. A buffer being rendered may be understood as a buffer storing an image that the application is rendering. A buffer being composited may be understood as a buffer whose content is being composited by the composition thread. An unused buffer may be understood as a buffer that stores no image.
For example, taking the number of all buffers in the buffer queue as 20 as an example, at 10ms in fig. 9, the application is drawing rendering frame 3, the rendering buffer is frame 3, and the number is 1; the number of the cached caches in the cache queue is 0; the synthesizing thread is in synthesizing frame 2, the buffer being synthesized is frame 2, and the number is 1; the number of unused buffers is 20-1-0-1, i.e. 18.
At 16.5ms in fig. 9, the application does not perform rendering, and the number of the rendering caches is 0; the buffered buffer is frame 3, and the number is 1; the synthesis thread does not synthesize, and the number of the synthesized caches is 0; the number of unused buffers is 20-0-1-0, i.e. 19.
In the embodiment of the present application, the number of buffers is the sum of the number of cached buffers (queued buffers) and the number of buffers being rendered (dequeued buffers).
For example, in the flow shown in fig. 9, rendering is finished for frame 2, the number of buffered buffers is 1, the number of buffers being rendered is 0, the number of buffers is 1, and the buffer thread feeds back the buffer number to the frame rate control system as 1.
S806, the frame rate control system determines, according to the number of buffers, the delay duration M by which the switching of the screen refresh rate to the second frame rate is delayed relative to the switching of the application frame rate to the second frame rate. M satisfies the following formula: M = (number of buffers + 1) × the Vsync period corresponding to the first frame rate.
That is, when the ratio of the first frame rate to the second frame rate is an integer, the delay duration M = (number of buffers + 1) × the Vsync period corresponding to the first frame rate. When the number of buffers is 1, the delay duration M is 2 × the Vsync period corresponding to the first frame rate.
For example, in the process shown in fig. 9, the frame rate control system determines that the delay time for switching the screen refresh rate to the second frame rate is 2 × 8.3ms, i.e., 16.6ms, compared with the delay time for switching the application frame rate to the second frame rate.
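The calculation of M can be sketched as follows; the helper is illustrative and is not part of the embodiment.

    // Illustrative calculation of the delay duration M of S806, assuming the ratio of the
    // first frame rate to the second frame rate is an integer.
    final class DelayDuration {
        static double delayMs(int bufferCount, int firstFrameRateHz) {
            double vsyncPeriodMs = 1000.0 / firstFrameRateHz;
            return (bufferCount + 1) * vsyncPeriodMs;
        }

        public static void main(String[] args) {
            // Fig. 9: one buffer in the queue and a first frame rate of 120 Hz.
            // Prints 16.7 ms; the text rounds the 120 Hz period to 8.3 ms and therefore states 16.6 ms.
            System.out.printf("M = %.1f ms%n", delayMs(1, 120));
        }
    }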
In a possible implementation manner, when the ratio of the first frame rate to the second frame rate is an integer N and the ratio of (1 + number of buffers) to N is not an integer, the frame rate control system determines to adjust the offset of Vsync-APP so as to reduce it by the duration of ((1 + number of buffers) - kN) Vsync periods corresponding to the first frame rate, where k is an integer and (1 + number of buffers) - kN is an integer smaller than N.
In this way, glitches and the like are reduced when Vsync-HW subsequently calibrates Vsync-APP.
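The arithmetic of this adjustment can be sketched as follows; this is one reading of the formula above, with illustrative names, and not a verified implementation.

    // One reading of the Vsync-APP offset adjustment: when (1 + number of buffers) is not a multiple of N,
    // the number of first-frame-rate Vsync periods involved is (1 + buffers) - kN, i.e. the remainder modulo N.
    final class VsyncAppOffsetAdjustment {
        static double adjustmentMs(int bufferCount, int ratioN, int firstFrameRateHz) {
            int periods = (1 + bufferCount) % ratioN;        // equals (1 + buffers) - kN, with 0 <= periods < N
            return periods * (1000.0 / firstFrameRateHz);    // duration of that many first-frame-rate periods
        }
    }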
S807, the frame rate control system sends a first message to the Vsync thread; the first message is used for indicating that the application frame rate is switched to the second frame rate.
The application frame rate refers to a frame rate corresponding to application rendering.
In a possible implementation, the first message carries the second frame rate. Illustratively, the second frame rate is 60Hz.
And S809, after receiving the first message, storing the Vsync signal period corresponding to the second frame rate by the Vsync thread.
It is understood that the Vsync thread stores the Vsync signal period corresponding to the application frame rate as the Vsync period corresponding to the second frame rate after receiving the first message. The subsequent Vsync-APP signal is generated at a rhythm corresponding to the second frame rate.
And S810, the frame rate control system sleeps after sending the first message to the Vsync thread.
The terminal device may further perform S811 while the terminal device performs the above S802-S809 or before performing the above S802.
S811, the application main thread sends a Vsync-APP request to the Vsync thread.
And S812, the Vsync thread generates Vsync-APP according to the Vsync signal period corresponding to the first frame rate.
Illustratively, in fig. 9, the frame rate of the application is decided to be switched between 0ms and 8.3ms. The Vsync thread generates Vsync-APP with a period of the Vsync signal corresponding to 120Hz (8.3 ms). This Vsync-APP signal is sent to the application main thread at 8.3ms.
S813, when the application main thread receives the Vsync-APP, it calculates the frame interval according to the timestamp of the Vsync-APP.
Specifically, the application main thread calculates the difference between the timestamp of the Vsync-APP signal received this time and the timestamp of the Vsync-APP signal received last time; the difference is the frame interval corresponding to the drawing and rendering of the previous frame.
Illustratively, in fig. 9, the timestamp of the Vsync-APP signal received this time is 8.3 ms, and the timestamp of the Vsync-APP signal received last time is 0 ms. The application main thread calculates the frame interval used for drawing and rendering frame 3 as 8.3 ms - 0 ms, namely 8.3 ms.
S814, the application main thread calculates the displacement amount.
S815, the application main thread sends the displacement of the current frame to the application rendering thread so as to wake up the application rendering thread.
In a possible implementation, the displacement is the product of the frame interval and the velocity. It should be noted that the application main thread may determine the speed based on a pre-stored speed profile. Illustratively, taking the Vsync-APP signal of 8.3ms in fig. 9, the speed is 1pixel/8.3ms as an example, the current frame is frame 3, and the displacement of frame 3 is the product of the frame interval (8.3 ms) corresponding to the rendering of frame 2 rendering and 1pixel/8.3ms, i.e. 1pixel. The application main thread sends the displacement (1 pixel) of the current frame to the application rendering thread.
In a possible implementation, the application main thread does not perform S813-S815. The application main thread sends the speed of the current frame, the timestamp of the current Vsync-APP and the timestamp of the previous frame Vsync-APP to the application rendering thread to wake up the application rendering thread. Illustratively, the application main thread sends the speed (1 pixel/8.3 ms), the Vsync-APP timestamp of the current frame (8.3 ms), and the timestamp of the previous frame Vsync-APP (0 ms) to the application rendering thread.
And S816, the application rendering thread is awakened after receiving the displacement or the timestamp of the Vsync-APP signal, and a rendered image is drawn.
In a possible implementation manner, the application main thread is awakened after receiving the displacement amount, and starts to draw the rendered image.
S817, after the application rendering thread is awakened, it requests a cache from the cache thread to store the rendered image.
Adaptively, after receiving a cache request command sent by an application rendering thread, the cache thread reserves a space for storing a rendered image, and sends an instruction for instructing cache dequeuing to the application rendering thread.
S818, after the application rendering thread receives the instruction for indicating the cache dequeue, rendering the image according to the displacement.
S819, the application rendering thread sends the rendered image to a cache thread (cache enqueue).
S820, after receiving the rendered image sent by the application rendering thread, the cache thread inquires whether a first identifier exists in the frame rate control system.
It is understood that when the first identifier exists in the frame rate control system, the terminal device performs S821; when the first flag does not exist in the frame rate control system, the terminal apparatus executes S823.
Illustratively, in fig. 9, when the cache thread receives the rendered frame 3, it queries the frame rate control system whether the first identifier exists. At this time, the first flag exists in the frame rate control system.
And S821, when the first identifier exists in the frame rate control system, the buffer thread inquires the Vsync thread whether the application frame rate is switched to the second frame rate.
It is understood that when the application frame rate is not switched to the second frame rate, the terminal apparatus performs S823.
For example, in fig. 9, the cache thread queries the Vsync thread whether the application frame rate is switched to the second frame rate. At this time, the application frame rate is not completely switched.
S823, the cache thread requests the Vsync-SF signal from the Vsync thread.
S824, the Vsync thread sends a Vsync-SF signal to the composition thread according to the Vsync signal period corresponding to the first frame rate.
Illustratively, the Vsync thread sends a Vsync-SF signal to the composition thread at 16.6ms.
S825, after receiving the Vsync-SF signal, the composition thread composes an image.
In a possible implementation, the composition thread may also query the window manager for the focus application after receiving the Vsync-SF signal. And the synthesis thread queries a cache queue corresponding to the focus application in the cache thread according to the focus application so as to confirm the image needing to be synthesized.
And S826, the synthesis thread sends the synthesized image to a hardware synthesizer. The hardware synthesizer sends the synthesized image to a display driver for display.
S827, after the Vsync-HW signal arrives, the display driver drives the screen to display the synthesized image.
S828, after sleeping for the duration M, the frame rate control system ends the sleep.
It can be understood that the delay duration is related to the number of buffers, and when the terminal device executes S828, the application frame rate may complete the switching, and the application frame rate may also not complete the switching.
For example, in the flow shown in fig. 9, the sleep duration of the frame rate control system is 16.6 ms, and the moment 16.6 ms after S810 falls within the period from 16.6 ms to 24.9 ms. Therefore, when the terminal device performs S828, i.e., ends the sleep, the application frame rate has not completed switching.
For example, in the flow shown in fig. 11, the sleep duration of the frame rate control system is 24.9 ms, and the moment 24.9 ms after S810 falls within the period from 24.9 ms to 33.2 ms. Therefore, when the terminal device performs S828, that is, ends the sleep, the application frame rate has completed switching.
S829, the frame rate control system sends a second message to the Vsync thread, where the second message is used to indicate that both the composition frame rate and the screen refresh rate are switched to the second frame rate.
And S830, after receiving the second message, the Vsync thread stores a Vsync signal period corresponding to the second frame rate.
It is understood that the Vsync thread stores the Vsync signal period corresponding to the combined frame rate as the Vsync period corresponding to the second frame rate after receiving the second message. Subsequent Vsync-SF signals are generated at a cadence corresponding to the second frame rate.
S831, the Vsync thread sends the second frame rate to the hardware synthesizer.
And S832, the hardware synthesizer receives the second frame rate and sends the second frame rate to the display driver.
S833, the display driver drives the screen to switch the screen refresh rate to the second frame rate.
It will be appreciated that the screen generates the Vsync-HW signal in Vsync periods corresponding to the second frame rate.
Illustratively, in the flow shown in fig. 9, the Vsync-HW signal is generated at 24.9ms and the screen displays frame 3. The Vsync-HW signal is generated at 24.9ms +16.6ms, i.e. 41.5ms, and frame 4 is displayed on the screen.
Following the above S814, the application main thread also executes S834.
It is understood that the terminal device may simultaneously perform rendering, composition, display, and the like. For example, as shown in fig. 9, when rendering is performed on frame 3, frame 2 is synthesized and frame 1 is displayed.
S834, the application main thread sends a Vsync-APP request to the Vsync thread.
S835, the Vsync thread generates the Vsync-APP according to the Vsync signal period corresponding to the second frame rate, and sends the Vsync-APP to the application main thread. After receiving the Vsync-APP, the application main thread calculates the displacement amount and draws the rendered image through the application rendering thread. After the image is drawn and rendered, it is sent to the cache queue to wait for composition.
It is understood that the frame rate control system may end the sleep or the frame rate control system may not end the sleep when the end device performs S835. For example, in the flow shown in fig. 9, the application main thread sends a Vsync-APP request to the Vsync thread during 8.3ms-24.9 ms; and the Vsync thread generates Vsync-APP according to a Vsync signal period (16.6 ms) corresponding to the second frame rate, and sends the Vsync-APP to the application main thread when the Vsync signal period is 8.3ms +16.6ms, namely 24.9ms.
S836, after receiving the rendered image, the cache thread queries, to the frame rate control system, whether the first identifier exists.
It is understood that when the first identifier exists in the frame rate control system, the terminal device performs S837; when the first flag does not exist in the frame rate control system, the terminal device performs S839.
Illustratively, in fig. 9, when the cache thread receives the rendered frame 4, it queries the frame rate control system whether the first identifier exists. At this time, a first flag exists in the frame rate control system.
S837, when the first identifier exists in the frame rate control system, the cache thread inquires the Vsync thread whether the application frame rate is switched to the second frame rate.
It is to be understood that, when the application frame rate is not switched to the second frame rate, the terminal apparatus does not perform S838. When the application frame rate is switched to the second frame rate, the terminal apparatus executes S838 and S839.
For example, in fig. 9, when the cache thread receives the rendered frame 4 and the first flag exists in the frame rate control system, the Vsync thread queries whether the application frame rate is switched to the second frame rate. At this time, the application frame rate is switched to the second frame rate.
S838, the cache thread sends an instruction to the Vsync thread to instruct to adjust the offset of the Vsync-SF signal.
In a possible implementation, the command to adjust the offset of the Vsync-SF signal carries the offset of the Vsync-SF signal. For example, the offset amount may be 1 Vsync signal period corresponding to the first frame rate. In an embodiment of the present application, the Vsync signal period may refer to a duration corresponding to the Vsync signal period.
S839, the cache thread requests the Vsync-SF signal from the Vsync thread.
S840, the Vsync thread advances by 1 cycle of the Vsync signal corresponding to the first frame rate, generates a Vsync-SF signal, and sends the Vsync-SF signal to the composition thread.
The composition thread, upon receiving the Vsync-SF signal, composes the image and sends the composed image to the display driver via the hardware compositor. The display drive drives the screen to display an image.
Illustratively, in fig. 9, during 16.6 ms-24.9 ms the cache thread does not request the Vsync-SF signal from the Vsync thread, so the terminal device does not generate the Vsync-SF signal at 24.9 ms. After the rendering of frame 4 ends, the terminal device executes S838 and S839. The Vsync thread generates the Vsync-SF signal 1 Vsync signal period corresponding to the first frame rate (8.3 ms) in advance. Specifically, the Vsync thread generates the Vsync-SF signal at 41.5 ms - 8.3 ms, i.e., 33.2 ms, and the composition thread composites frame 4 at 33.2 ms. At 41.5 ms, frame 4 is displayed.
It will be appreciated that after the application receives Vsync-APP in S835, the application may proceed to send Vsync-APP requests to the Vsync thread. And the Vsync thread generates Vsync-APP according to a Vsync signal period corresponding to the second frame rate, and sends the Vsync-APP to the application main thread.
In a possible implementation manner, the image synthesis system cancels the Vsync-SF signal generated at the first frame rate after the second message, and synthesizes and draws the rendered image according to the Vsync-SF signal generated at the second frame rate.
For example, as shown in fig. 11, frame 4 starts rendering at 24.9ms, and when the rendering of frame 4 is finished, the rendering thread sends rendered frame 4 to a corresponding cache queue in the cache thread; after receiving the rendered frame 4, the cache thread inquires the frame rate control system that the first identifier exists, and inquires the Vsync thread that the application frame rate is switched to the second frame rate. The cache thread sends a message to the Vsync thread to adjust the Vsync-SF offset and requests a Vsync-SF signal.
The Vsync thread receives the second message during 24.9-33.2 ms. The Vsync thread cancels the Vsync-SF signal generated according to the first frame rate after the second message, namely cancels the Vsync-SF signal of 33.2 ms; the Vsync thread generates a Vsync-SF signal according to the Vsync-SF signal generated by the second frame rate, that is, the Vsync thread generates the Vsync-SF signal at 41.5ms. The composition thread starts composing frame 4 at 41.5ms.
In the above-described flow shown in fig. 8, the terminal device further reduces the stuck phenomenon of the terminal device by adjusting the offset amount of Vsync-SF. In a possible implementation manner, the terminal device may further reduce a stuck phenomenon of the terminal device by setting the compositing thread to be an immediate compositing manner.
For example, the first flag in fig. 8 may be replaced by a second flag, and the second flag is used to indicate immediate composition. In addition, when the cache thread inquires that the frame rate control system has the second identifier and the application frame rate is switched to the second frame rate, the cache thread informs the composition thread to immediately compose, and the composition thread composes and draws the rendered image. The manner in which the composition thread is set to composition immediately is similar to the flow illustrated in FIG. 8 described above. And will not be described in detail herein.
In a possible implementation manner, when the buffer thread queries that the frame rate control system has the second identifier and the application frame rate is switched to the second frame rate, the buffer thread may not request the Vsync-SF signal from the Vsync thread.
The data processing method according to the embodiment of the present application will be described in detail with reference to specific embodiments. The following embodiments may be combined with each other and may not be described in detail in some embodiments for the same or similar concepts or processes.
Fig. 10 is a flowchart illustrating a data processing method according to an embodiment of the present application. As shown in fig. 10, the method may include:
S1001, determining that the first frame rate is an integer multiple of the second frame rate.
Specifically, the frame rate control system determines that the first frame rate is an integer multiple of the second frame rate.
In the embodiment of the application, the first frame rate is the frame rate before the screen refresh rate is switched; the second frame rate is the frame rate after the screen refresh rate is switched.
It is understood that the ratio of the first frame rate to the second frame rate is an integer. Illustratively, the first frame rate may be 120Hz, and the second frame rate may be 60Hz; the first frame rate may be 120Hz and the second frame rate may be 40Hz. And are not limited herein.
S1002, inquiring the buffer quantity in the buffer queue.
Specifically, the frame rate control system queries the number of buffers in the cache queue. The number of buffers refers to the sum of the number of cached buffers (queued buffers) and the number of buffers being rendered (dequeued buffers).
In a possible implementation manner, a focus window is inquired, and the cache number in a cache queue corresponding to the focus window is inquired based on the application packet name corresponding to the focus window. Illustratively, the frame rate control system retrieves the focus application from the window manager. And querying a cache queue corresponding to the focus application in the image synthesis system according to the focus application, and further confirming the cache number in the corresponding cache queue.
In this way, it can be determined by how many Vsync periods the drawing and rendering of each frame image differs from its display, which facilitates the subsequent determination of the delay duration for switching the screen refresh rate.
S1003, determining the delay duration of switching the screen refreshing rate to the second frame rate based on the cache number, and comparing the delay duration of switching the application frame rate to the second frame rate.
Specifically, the frame rate control system determines, based on the buffer amount, a delay duration for switching the screen refresh rate to the second frame rate compared to switching the application frame rate to the second frame rate.
In the embodiment of the present application, the delay duration may be the sum of a first duration and a second duration. The first duration is the difference between the time of the first Vsync-HW after the frame rate control system receives the message carrying the second frame rate sent by the composition thread and the time at which the frame rate control system receives that message.
The second duration is between (number of buffers) × the Vsync cycle duration corresponding to the first frame rate and (number of buffers + 1) × the Vsync cycle duration corresponding to the first frame rate.
In a possible implementation, the delay duration satisfies: (number of buffers + 1) × Vsync cycle duration corresponding to the first frame rate.
It can be understood that each frame of image in the terminal device differs by 2 Vsync periods from the drawing and rendering of the frame image to its display. Each additional buffer queued in the cache queue increases the number of Vsync periods from the drawing and rendering of each frame image to its display by 1.
Delaying the screen refresh rate by (number of buffers + 1) × the Vsync period duration corresponding to the first frame rate makes the rhythm of the image during drawing and rendering consistent with the rhythm during display of the frame, so that the speed of the image during display is consistent with the expected speed, the jumping phenomenon is reduced, and stutter is reduced.
And S1004, switching the application frame rate to a second frame rate.
Specifically, the frame rate control system controls the image synthesis system to switch the application frame rate to the second frame rate.
It will be appreciated that the application frame rate is controlled by the Vsync-APP signal. Switching of the application frame rate to the second frame rate is achieved by controlling the time interval of adjacent Vsync-APP signals. The adjacent time intervals become Vsync period durations corresponding to the second frame rate.
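A minimal sketch of this idea is given below, assuming the Vsync thread simply stores the currently effective period and stamps each next signal one period after the previous one; the class is illustrative and is not the actual image synthesis system code.

    // Illustrative Vsync signal generator: the application frame rate follows the stored period,
    // so switching to the second frame rate only requires storing the new period (cf. S809/S830).
    class VsyncGenerator {
        private volatile double periodMs = 1000.0 / 120;   // first frame rate, e.g. 120 Hz
        private double lastTimestampMs = 0;

        void switchToFrameRate(int frameRateHz) {          // e.g. 60 Hz, the second frame rate
            periodMs = 1000.0 / frameRateHz;
        }

        // Timestamp of the next Vsync-APP signal: previous timestamp plus the currently stored period.
        double nextVsyncTimestampMs() {
            lastTimestampMs += periodMs;
            return lastTimestampMs;
        }
    }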
And S1005, after delaying the time length, switching the screen refreshing rate to a second frame rate.
Specifically, after the delay time, the frame rate control system controls the display driver to switch the screen refresh rate to the second frame rate through the image synthesis system.
It will be appreciated that the screen refresh rate is controlled by the Vsync-HW signal. Switching of the screen refresh rate to the second frame rate is achieved by controlling the time interval of adjacent Vsync-HW signals. The adjacent time intervals become Vsync period durations corresponding to the second frame rate.
In a possible implementation, the switching time of the composite frame rate is consistent with the switching time of the screen refresh rate.
It will be appreciated that the composite frame rate is controlled by the Vsync-SF signal. Switching of the combined frame rate to the second frame rate is achieved by controlling the time interval of adjacent Vsync-SF signals. The adjacent time intervals become Vsync period durations corresponding to the second frame rate.
In summary, when the frame rate is switched, the application frame rate is switched first and the screen refresh rate is switched afterwards, so that the display interval between the Mth frame and the (M-1)th frame is consistent with the rendering interval. The display cadence of the images thus matches the rendering cadence, which reduces the sliding-speed jumps caused by a mismatch between the display interval and the rendering interval, reduces stutter, and increases the user experience.
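The following sketch, added here for illustration with assumed names, only plans the two switch moments described in S1004 and S1005: the application frame rate switches at the decision time, while the composition frame rate and screen refresh rate switch only after the delay.

```cpp
// Illustrative sketch with assumed names: plan the two switch moments.
#include <iostream>

struct SwitchPlan {
    double appSwitchAtMs;      // when the Vsync-APP interval changes (S1004)
    double refreshSwitchAtMs;  // when the Vsync-SF/Vsync-HW intervals change (S1005)
};

SwitchPlan planSwitch(double decisionTimeMs, double firstPeriodMs, int bufferCount) {
    SwitchPlan plan;
    plan.appSwitchAtMs = decisionTimeMs;  // switch the application frame rate immediately
    plan.refreshSwitchAtMs =
        decisionTimeMs + (bufferCount + 1) * firstPeriodMs;  // delayed switch
    return plan;
}

int main() {
    // 120 Hz -> 60 Hz decided at 0 ms with 1 buffered frame (cf. the example of fig. 9):
    SwitchPlan p = planSwitch(0.0, 8.3, 1);
    std::cout << "application frame rate switches at " << p.appSwitchAtMs << " ms, "
              << "refresh rate switches at " << p.refreshSwitchAtMs << " ms\n";  // 0 and 16.6
    return 0;
}
```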
After the application frame rate is switched to the second frame rate, the terminal device further executes S1006 or S1007.
S1006, after the application frame rate is switched to the second frame rate, the offset of Vsync-SF is adjusted.
Specifically, the frame rate control system controls the image synthesis system to adjust the offset amount of Vsync-SF. Therefore, the image synthesis system can synthesize images in time and can display the images when Vsync-HW arrives, and the condition of no frame display is reduced.
In a possible implementation manner, the offset of Vsync-SF is increased by one Vsync period duration corresponding to the first frame rate.
When the offset is positive, Vsync-SF is advanced relative to Vsync-HW by the offset, that is, Vsync-SF timestamp = Vsync-HW timestamp - offset. It can also be understood that Vsync-SF is moved forward by the offset compared with Vsync-HW. Increasing the Vsync-SF offset by one Vsync period duration corresponding to the first frame rate therefore moves Vsync-SF forward by one additional such period.
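The sketch below, added for illustration with assumed names, only evaluates the offset relation just described; the timestamps match the fig. 9 example that follows.

```cpp
// Illustrative sketch: Vsync-SF timestamp = Vsync-HW timestamp - offset, so enlarging
// the offset by one Vsync period of the first frame rate moves Vsync-SF forward by it.
#include <iostream>

double vsyncSfTimestampMs(double vsyncHwTimestampMs, double offsetMs) {
    return vsyncHwTimestampMs - offsetMs;  // positive offset moves Vsync-SF earlier
}

int main() {
    const double firstPeriodMs = 8.3;   // Vsync period of the first frame rate (120 Hz)
    const double hwTimestampMs = 41.5;  // a Vsync-HW timestamp after the switch
    double offsetMs = 0.0;              // offset before the adjustment (assumed)
    std::cout << vsyncSfTimestampMs(hwTimestampMs, offsetMs) << " ms\n";                  // 41.5
    std::cout << vsyncSfTimestampMs(hwTimestampMs, offsetMs + firstPeriodMs) << " ms\n";  // 33.2
    return 0;
}
```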
In a possible implementation manner, when the ratio of the first frame rate to the second frame rate is an integer, the frame rate control system adds a first identifier, where the first identifier is used to adjust the Vsync-SF offset. When the image synthesis system queries that the first identifier exists in the frame rate control system and the application frame rate switching is complete, the Vsync-SF offset is adjusted.
In a possible implementation manner, when the ratio of the first frame rate to the second frame rate is an integer, the frame rate control system adds a first identifier, where the first identifier is used to adjust the Vsync-SF offset. When the image synthesis system queries that the frame rate control system has the first identifier, the Vsync-SF signal is generated at the second frame rate, advanced by the time corresponding to the offset.
It is understood that the image synthesis system may query the frame rate control system whether the first identifier exists or not when receiving the rendered image; alternatively, the image composition system may receive the first flag sent by the frame rate control system and adjust the Vsync-SF offset when the application frame rate switch is complete.
In a possible implementation, when the screen refresh rate is switched to the second frame rate, the Vsync-SF signal corresponding to the first frame rate is canceled.
S1001-S1006 will be described below with reference to different buffer counts.
A display flow when the second frame rate is lower than the first frame rate and the ratio of the first frame rate to the second frame rate is 2 according to the embodiment of the present application is described below with reference to fig. 9 and 11.
Fig. 9 is a schematic diagram of an interface display processing flow provided in an embodiment of the present application. In the scenario of the list sliding at a constant speed, the screen refresh rate is switched from 120Hz to 60Hz, and the sliding speed is 2 pixels/16.6 ms. In fig. 9, contents displayed by the terminal device correspond to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 in the order of time.
As shown in fig. 9, when the terminal device renders frame 2, the application frame rate switch is decided. After the switch, the interval of the Vsync-APP signal becomes the Vsync period corresponding to the second frame rate (16.6 ms). During 0 ms-8.3 ms, when the application frame rate switch is decided, the buffer count is 1. The screen refresh rate switch is decided after (1+1) × 8.3 ms, i.e., 16.6 ms. Therefore, during 16.6 ms-24.9 ms, the screen refresh rate is switched; after the switch, the interval of the Vsync-HW signal becomes the Vsync period corresponding to the second frame rate (16.6 ms). In addition, after the application frame rate is switched, the Vsync-SF signal offset is increased by one Vsync period corresponding to the first frame rate, that is, Vsync-SF is moved forward by one Vsync period corresponding to the first frame rate.
As shown in fig. 9, the Vsync-APP generation times after the application frame rate switch is decided are 8.3 ms, 24.9 ms, 41.5 ms, and 58.1 ms, respectively. When rendering of frame 3 starts at 8.3 ms, the application frame rate switch has not yet completed. Therefore, the displacement of frame 2 and the displacement of frame 3 are both 1 pixel, related to the application frame rate before switching (the first frame rate). At 41.5 ms, the application frame rate switch is complete. The displacements of frame 4, frame 5, and frame 6 are 2 pixels, related to the application frame rate after switching (the second frame rate).
During 16.6 ms-24.9 ms, the screen refresh rate switch is decided. The Vsync-HW generation times after the screen refresh rate switch is decided are 24.9 ms, 41.5 ms, and 58.1 ms, respectively. The screen refresh rate switch is not yet complete at 24.9 ms, so the sliding-speed calculation at 24.9 ms is related to the screen refresh rate before switching (the first frame rate). The screen refresh rate switch is complete at 41.5 ms, so the sliding-speed calculation at 41.5 ms is related to the screen refresh rate after switching (the second frame rate).
After 24.9 ms, the Vsync-SF signal offset is increased by one Vsync period corresponding to the first frame rate. The Vsync-SF signal generation times are therefore 41.5 ms minus 8.3 ms, 58.1 ms minus 8.3 ms, and 74.7 ms minus 8.3 ms, i.e., 33.2 ms, 49.8 ms, and 66.4 ms, respectively.
It can be understood that increasing the Vsync-SF signal offset by one Vsync period corresponding to the first frame rate reduces stutter caused by having no frame to display. Taking frame 4 as an example, the Vsync-SF generation time is advanced by 8.3 ms, so that frame 4 can be sent for display at 41.5 ms and the screen then updates to frame 4. If the offset were not adjusted, no new image would be available for display at 41.5 ms, the screen would drop a frame, and stutter would occur.
In fig. 9, at 8.3 ms, the display interface of the terminal device changes from frame 0 to frame 1, and the sliding speed is 1 pixel/8.3 ms. At 16.6 ms, the display interface changes from frame 1 to frame 2, and the sliding speed is 1 pixel/8.3 ms. At 24.9 ms, the display interface changes from frame 2 to frame 3, and the sliding speed is 1 pixel/8.3 ms. At 41.5 ms, the display interface changes from frame 3 to frame 4, and the sliding speed is 2 pixels/16.6 ms. At 58.1 ms, the display interface changes from frame 4 to frame 5, and the sliding speed is 2 pixels/16.6 ms. At 74.7 ms, the display interface changes from frame 5 to frame 6, and the sliding speed is 2 pixels/16.6 ms. Across the switch, the speed remains consistent and the pictures are displayed smoothly without stutter.
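As a quick check added here for illustration (the equality is implied by, though not written out in, the description), the per-millisecond sliding speed is the same before and after the switch:

```latex
\[
  \frac{1\ \text{pixel}}{8.3\ \text{ms}}
  = \frac{2\ \text{pixels}}{16.6\ \text{ms}}
  \approx 0.12\ \text{pixel/ms}
\]
```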
Fig. 11 is a schematic diagram of an interface display processing flow according to an embodiment of the present application. In the scenario of the list sliding at a constant speed, the screen refresh rate is switched from 120Hz to 60Hz, and the sliding speed is 2 pixels/16.6 ms. In fig. 11, the contents displayed by the terminal device correspond to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 in the order of time.
As shown in fig. 11, when the terminal device renders frame 2, the application frame rate switch is decided. After the switch, the interval of the Vsync-APP signal becomes the Vsync period corresponding to the second frame rate (16.6 ms). During 0 ms-8.3 ms, when the application frame rate switch is decided, the buffer count is 2. The screen refresh rate switch is decided after (2+1) × 8.3 ms, i.e., 24.9 ms. Therefore, during 24.9 ms-33.2 ms, the screen refresh rate switch is decided; after the switch, the interval of the Vsync-HW signal becomes the Vsync period corresponding to the second frame rate (16.6 ms). Further, after the application frame rate is switched, the Vsync-SF signal offset is increased by one Vsync period corresponding to the first frame rate, that is, Vsync-SF is moved forward by one Vsync period corresponding to the first frame rate. After the screen refresh rate is switched, the Vsync-SF signal corresponding to the first frame rate is canceled.
The Vsync-APP generation times after the application frame rate switch is decided are 8.3 ms, 24.9 ms, 41.5 ms, and 58.1 ms, respectively. When rendering of frame 3 starts at 8.3 ms, the application frame rate switch has not yet completed. Therefore, the displacement of frame 2 and the displacement of frame 3 are both 1 pixel, related to the application frame rate before switching (the first frame rate). At 41.5 ms, the application frame rate switch is complete. The displacements of frame 4, frame 5, and frame 6 are 2 pixels, related to the application frame rate after switching (the second frame rate).
During 24.9 ms-33.2 ms, the screen refresh rate switch is decided. The Vsync-HW generation times after the screen refresh rate switch is decided are 33.2 ms, 49.8 ms, 66.4 ms, and 83 ms, respectively. The screen refresh rate switch is not yet complete at 33.2 ms, so the sliding-speed calculation at 33.2 ms is related to the screen refresh rate before switching (the first frame rate). The screen refresh rate switch is complete at 49.8 ms, so the sliding-speed calculation at 49.8 ms is related to the screen refresh rate after switching (the second frame rate).
After 24.9 ms, the Vsync-SF signal corresponding to the first frame rate is canceled, and the Vsync-SF signal offset is increased by one Vsync period corresponding to the first frame rate. The Vsync-SF signal generation times are 49.8 ms minus 8.3 ms, 66.4 ms minus 8.3 ms, and 83 ms minus 8.3 ms, i.e., 41.5 ms, 58.1 ms, and 74.7 ms, respectively.
In fig. 11, at 8.3 ms, the display interface of the terminal device changes from frame -1 to frame 0, and the sliding speed is 1 pixel/8.3 ms. At 16.6 ms, the display interface changes from frame 0 to frame 1, and the sliding speed is 1 pixel/8.3 ms. At 24.9 ms, the display interface changes from frame 1 to frame 2, and the sliding speed is 1 pixel/8.3 ms. At 33.2 ms, the display interface changes from frame 2 to frame 3, and the sliding speed is 1 pixel/8.3 ms. At 49.8 ms, the display interface changes from frame 3 to frame 4, and the sliding speed is 2 pixels/16.6 ms. At 66.4 ms, the display interface changes from frame 4 to frame 5, and the sliding speed is 2 pixels/16.6 ms. At 83 ms, the display interface changes from frame 5 to frame 6, and the sliding speed is 2 pixels/16.6 ms. Across the switch, the speed remains consistent and the pictures are displayed smoothly without stutter.
A display flow when the second frame rate is lower than the first frame rate and the ratio of the first frame rate to the second frame rate is 3 according to the embodiment of the present application is described below with reference to fig. 12 and 13.
Fig. 12 is a schematic diagram of an interface display processing flow provided in an embodiment of the present application. In the scenario of the list sliding at a constant speed, the screen refresh rate is switched from 120Hz to 40Hz, and the sliding speed is 1 pixel/8.3 ms. In fig. 12, the contents displayed by the terminal device correspond to frame 0, frame 1, frame 2, frame 3, frame 4, and frame 5 in the order of time.
As shown in fig. 12, when the terminal device renders frame 2, the application frame rate switch is decided. After the switch, the interval of the Vsync-APP signal becomes the Vsync period corresponding to the second frame rate (24.9 ms). During 0 ms-8.3 ms, when the application frame rate switch is decided, the buffer count is 1. The screen refresh rate switch is decided after (1+1) × 8.3 ms, i.e., 16.6 ms. Thus, during 16.6 ms-24.9 ms, the screen refresh rate switch is decided. Further, after the application frame rate is switched, the Vsync-SF signal offset is increased by one Vsync period corresponding to the first frame rate, that is, Vsync-SF is moved forward by one Vsync period corresponding to the first frame rate.
The Vsync-APP generation times after the application frame rate switch is decided are 8.3 ms, 33.2 ms, 58.1 ms, and 83 ms, respectively. When rendering of frame 3 starts at 8.3 ms, the application frame rate switch has not yet completed. Therefore, the displacement of frame 2 and the displacement of frame 3 are both 1 pixel, related to the application frame rate before switching (the first frame rate). At 33.2 ms, the application frame rate switch is complete. The displacements of frame 4 and frame 5 are 3 pixels, related to the application frame rate after switching (the second frame rate). During 16.6 ms-24.9 ms, the screen refresh rate switch is decided.
The Vsync-HW generation times after the screen refresh rate switch is decided are 24.9 ms, 49.8 ms, and 74.7 ms, respectively. The screen refresh rate switch is not yet complete at 24.9 ms, so the sliding-speed calculation at 24.9 ms is related to the screen refresh rate before switching (the first frame rate). The screen refresh rate switch is complete at 49.8 ms, so the sliding-speed calculation at 49.8 ms is related to the screen refresh rate after switching (the second frame rate).
After 33.2 ms, the Vsync-SF signal offset is increased by one Vsync period corresponding to the first frame rate. The Vsync-SF signal generation times are 49.8 ms minus 8.3 ms and 74.7 ms minus 8.3 ms, i.e., 41.5 ms and 66.4 ms, respectively.
It can be understood that increasing the Vsync-SF signal offset by one Vsync period corresponding to the first frame rate reduces stutter caused by having no frame to display. Taking frame 4 as an example, the Vsync-SF generation timestamp of 49.8 ms becomes 41.5 ms, so that frame 4 can be sent for display at 49.8 ms and the screen updates to frame 4. If the offset were not adjusted, no image would be available for display at 49.8 ms, the screen would drop a frame, and stutter would occur.
In fig. 12, at 8.3 ms, the display interface of the terminal device changes from frame 0 to frame 1, and the sliding speed is 1 pixel/8.3 ms. At 16.6 ms, the display interface changes from frame 1 to frame 2, and the sliding speed is 1 pixel/8.3 ms. At 24.9 ms, the display interface changes from frame 2 to frame 3, and the sliding speed is 1 pixel/8.3 ms. At 49.8 ms, the display interface changes from frame 3 to frame 4, and the sliding speed is 3 pixels/24.9 ms. At 74.7 ms, the display interface changes from frame 4 to frame 5, and the sliding speed is 3 pixels/24.9 ms. Across the switch, the speed remains consistent and the pictures are displayed smoothly without stutter.
Exemplarily, fig. 13 is a schematic diagram of an interface display processing flow provided in an embodiment of the present application. In the scenario of the list sliding at a constant speed, the screen refresh rate is switched from 120Hz to 40Hz, and the sliding speed is 1 pixel/8.3 ms. In fig. 13, the contents displayed by the terminal device correspond to frame -1, frame 0, frame 1, frame 2, frame 3, frame 4, and frame 5 in the order of time.
As shown in fig. 13, when the terminal device renders frame 2, the application frame rate switch is decided. After the switch, the interval of the Vsync-APP signal becomes the Vsync period corresponding to the second frame rate (24.9 ms). During 0 ms-8.3 ms, when the application frame rate switch is decided, the buffer count is 2. The screen refresh rate switch is decided after (2+1) × 8.3 ms, i.e., 24.9 ms. Therefore, during 24.9 ms-33.2 ms, the screen refresh rate switch is decided. In addition, after the application frame rate is switched, the Vsync-SF signal offset is increased by one Vsync period duration corresponding to the first frame rate (8.3 ms), that is, Vsync-SF is moved forward by one Vsync period corresponding to the first frame rate.
The Vsync-APP generation times after the application frame rate switch is decided are 8.3 ms, 33.2 ms, 58.1 ms, and 83 ms, respectively. When rendering of frame 3 starts at 8.3 ms, the application frame rate switch has not yet completed. Therefore, the displacement of frame 2 and the displacement of frame 3 are both 1 pixel, related to the application frame rate before switching (the first frame rate). At 33.2 ms, the application frame rate switch is complete. The displacements of frame 4 and frame 5 are 3 pixels, related to the application frame rate after switching (the second frame rate). During 24.9 ms-33.2 ms, the screen refresh rate switch is decided. The Vsync-HW generation times after the screen refresh rate switch is decided are 33.2 ms, 58.1 ms, and 83 ms, respectively. The screen refresh rate switch is not yet complete at 33.2 ms, so the sliding-speed calculation at 33.2 ms is related to the screen refresh rate before switching (the first frame rate). The screen refresh rate switch is complete at 58.1 ms, so the sliding-speed calculation at 58.1 ms is related to the screen refresh rate after switching (the second frame rate).
After 33.2 ms, the Vsync-SF signal offset is increased by one Vsync period corresponding to the first frame rate. The Vsync-SF signal generation times are 58.1 ms minus 8.3 ms and 83 ms minus 8.3 ms, i.e., 49.8 ms and 74.7 ms, respectively.
It can be understood that increasing the Vsync-SF signal offset by one Vsync period corresponding to the first frame rate reduces stutter caused by having no frame to display. Taking frame 4 as an example, the Vsync-SF generation timestamp of 58.1 ms becomes 49.8 ms, so that frame 4 can be sent for display at 58.1 ms and the screen updates to frame 4. If the offset were not adjusted, no image would be available for display at 58.1 ms, the screen would drop a frame, and stutter would occur.
In fig. 13, at 8.3 ms, the display interface of the terminal device changes from frame -1 to frame 0, and the sliding speed is 1 pixel/8.3 ms. At 16.6 ms, the display interface changes from frame 0 to frame 1, and the sliding speed is 1 pixel/8.3 ms. At 24.9 ms, the display interface changes from frame 1 to frame 2, and the sliding speed is 1 pixel/8.3 ms. At 33.2 ms, the display interface changes from frame 2 to frame 3, and the sliding speed is 1 pixel/8.3 ms. At 58.1 ms, the display interface changes from frame 3 to frame 4, and the sliding speed is 3 pixels/24.9 ms. At 83 ms, the display interface changes from frame 4 to frame 5, and the sliding speed is 3 pixels/24.9 ms. Across the switch, the speed remains consistent and the pictures are displayed smoothly without stutter.
And S1007, after the application frame rate is switched to the second frame rate, setting the compositing thread to composite immediately.
Specifically, the frame rate control system controls the image synthesis system to set the compositing thread to composite immediately. In this way, images can be composited in time, which reduces the situations in which the terminal device has no frame to send for display.
In the embodiment of the application, setting the compositing thread to composite immediately may be understood as the terminal device invoking the compositing thread to composite the rendered image as soon as the cache thread receives it. It can also be understood as the terminal device being switched to a single-frame mode.
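The following sketch, added for illustration with assumed names (it is not the actual cache-thread code), shows what the single-frame mode amounts to: each rendered buffer is handed straight to a compositing callback instead of waiting for the next Vsync-SF signal.

```cpp
// Illustrative sketch of "immediate composition" (single-frame mode).
#include <functional>
#include <queue>
#include <utility>

struct RenderedFrame { int id; };

class CacheThread {
public:
    // Enable immediate composition by installing a compositing callback.
    void setImmediateComposition(std::function<void(const RenderedFrame&)> compose) {
        composeNow_ = std::move(compose);
    }

    void onBufferQueued(const RenderedFrame& frame) {
        if (composeNow_) {
            composeNow_(frame);    // single-frame mode: composite as soon as the buffer arrives
        } else {
            pending_.push(frame);  // normal mode: wait for the next Vsync-SF signal
        }
    }

private:
    std::function<void(const RenderedFrame&)> composeNow_;
    std::queue<RenderedFrame> pending_;
};
```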
In a possible implementation, the Vsync-SF signal is canceled by the terminal device. It can be understood that the condition that triggers image composition changes, so the terminal device may cancel the Vsync-SF signal.
In a possible implementation manner, when the ratio of the first frame rate to the second frame rate is an integer, the frame rate control system adds a second identifier, where the second identifier is used to indicate immediate composition. When the image synthesis system queries that the second identifier exists in the frame rate control system and the application frame rate switching is complete, the rendered image is composited immediately.
It can be understood that, when receiving the rendered image, the image synthesis system may query the frame rate control system whether the second identifier exists; alternatively, the image synthesis system may receive the second identifier sent by the frame rate control system and composite immediately once the application frame rate switch is complete.
S1001 to S1005 and S1007 will be described below with reference to different buffer counts.
Fig. 14 is a schematic diagram of an interface display processing flow according to an embodiment of the present application. In the scenario of the list sliding at a constant speed, the screen refresh rate is switched from 120Hz to 60Hz, and the sliding speed is 2 pixels/16.6 ms. In fig. 14, the contents displayed by the terminal device correspond to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 in the order of time.
As shown in fig. 14, when the terminal device renders frame 2, the application frame rate switch is decided. After the switch, the interval of the Vsync-APP signal becomes the Vsync period corresponding to the second frame rate (16.6 ms). During 0 ms-8.3 ms, when the application frame rate switch is decided, the buffer count is 1. The screen refresh rate switch is decided after (1+1) × 8.3 ms, i.e., 16.6 ms. Thus, during 16.6 ms-24.9 ms, the screen refresh rate switch is decided; after the switch, the interval of the Vsync-HW signal becomes the Vsync period corresponding to the second frame rate (16.6 ms). Further, after the application frame rate is switched, the compositing thread is set to composite the rendered image immediately and the Vsync-SF signal is canceled.
As shown in fig. 14, at 24.9 ms, the number of buffered frames is 0 and the compositing thread does not composite; when rendering of frame 4 finishes, the compositing thread composites frame 4. At 41.5 ms, the number of buffered frames is 0 and the compositing thread does not composite; when rendering of frame 5 finishes, the compositing thread composites frame 5. At 58.1 ms, the number of buffered frames is 0 and the compositing thread does not composite; when rendering of frame 6 finishes, the compositing thread composites frame 6.
The Vsync-APP generation times after the application frame rate switch is decided are 8.3 ms, 24.9 ms, 41.5 ms, and 58.1 ms, respectively. When rendering of frame 3 starts at 8.3 ms, the application frame rate switch has not yet completed. Therefore, the displacement of frame 2 and the displacement of frame 3 are both 1 pixel, related to the application frame rate before switching (the first frame rate). At 41.5 ms, the application frame rate switch is complete. The displacements of frame 4, frame 5, and frame 6 are 2 pixels, related to the application frame rate after switching (the second frame rate).
During 16.6 ms-24.9 ms, the screen refresh rate switch is decided. The Vsync-HW generation times after the screen refresh rate switch is decided are 24.9 ms, 41.5 ms, and 58.1 ms, respectively. The screen refresh rate switch is not yet complete at 24.9 ms, so the sliding-speed calculation at 24.9 ms is related to the screen refresh rate before switching (the first frame rate). The screen refresh rate switch is complete at 41.5 ms, so the sliding-speed calculation at 41.5 ms is related to the screen refresh rate after switching (the second frame rate).
In fig. 14, at 8.3 ms, the display interface of the terminal device changes from frame 0 to frame 1, and the sliding speed is 1 pixel/8.3 ms. At 16.6 ms, the display interface changes from frame 1 to frame 2, and the sliding speed is 1 pixel/8.3 ms. At 24.9 ms, the display interface changes from frame 2 to frame 3, and the sliding speed is 1 pixel/8.3 ms. At 41.5 ms, the display interface changes from frame 3 to frame 4, and the sliding speed is 2 pixels/16.6 ms. At 58.1 ms, the display interface changes from frame 4 to frame 5, and the sliding speed is 2 pixels/16.6 ms. At 74.7 ms, the display interface changes from frame 5 to frame 6, and the sliding speed is 2 pixels/16.6 ms. Across the switch, the speed remains consistent and the pictures are displayed smoothly without stutter.
Fig. 15 is a schematic diagram of an interface display processing flow provided in an embodiment of the present application. Take the scenario that the list slides at a constant speed, the screen refresh rate is switched from 120Hz to 60Hz, and the sliding speed is 2 pixels/16.6 ms as an example. In fig. 15, the contents displayed by the terminal device correspond to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 in the order of time.
As shown in fig. 15, when the terminal device renders frame 2, the application frame rate switch is decided. After the switch, the interval of the Vsync-APP signal becomes the Vsync period corresponding to the second frame rate (16.6 ms). During 0 ms-8.3 ms, when the application frame rate switch is decided, the buffer count is 2. The screen refresh rate switch is decided after (2+1) × 8.3 ms, i.e., 24.9 ms. Therefore, during 24.9 ms-33.2 ms, the screen refresh rate switch is decided; after the switch, the interval of the Vsync-HW signal becomes the Vsync period corresponding to the second frame rate (16.6 ms). Further, after the application frame rate is switched, the compositing thread is set to composite the rendered image immediately and the Vsync-SF signal is canceled.
As shown in fig. 15, at 24.9 ms, the number of buffered frames is 0 and the compositing thread does not composite; when rendering of frame 4 finishes, the compositing thread composites frame 4. At 41.5 ms, the number of buffered frames is 0 and the compositing thread does not composite; when rendering of frame 5 finishes, the compositing thread composites frame 5. At 58.1 ms, the number of buffered frames is 0 and the compositing thread does not composite; when rendering of frame 6 finishes, the compositing thread composites frame 6.
The Vsync-APP generation times after the application frame rate switch is decided are 8.3 ms, 24.9 ms, 41.5 ms, and 58.1 ms, respectively. When rendering of frame 3 starts at 8.3 ms, the application frame rate switch has not yet completed. Therefore, the displacement of frame 2 and the displacement of frame 3 are both 1 pixel, related to the application frame rate before switching (the first frame rate). At 41.5 ms, the application frame rate switch is complete. The displacements of frame 4, frame 5, and frame 6 are 2 pixels, related to the application frame rate after switching (the second frame rate).
During 24.9 ms-33.2 ms, the screen refresh rate switch is decided. The Vsync-HW generation times after the screen refresh rate switch is decided are 33.2 ms, 49.8 ms, 66.4 ms, and 83 ms, respectively. The screen refresh rate switch is not yet complete at 33.2 ms, so the sliding-speed calculation at 33.2 ms is related to the screen refresh rate before switching (the first frame rate). The screen refresh rate switch is complete at 49.8 ms, so the sliding-speed calculation at 49.8 ms is related to the screen refresh rate after switching (the second frame rate).
In fig. 15, at 8.3 ms, the display interface of the terminal device changes from frame -1 to frame 0, and the sliding speed is 1 pixel/8.3 ms. At 16.6 ms, the display interface changes from frame 0 to frame 1, and the sliding speed is 1 pixel/8.3 ms. At 24.9 ms, the display interface changes from frame 1 to frame 2, and the sliding speed is 1 pixel/8.3 ms. At 33.2 ms, the display interface changes from frame 2 to frame 3, and the sliding speed is 1 pixel/8.3 ms. At 49.8 ms, the display interface changes from frame 3 to frame 4, and the sliding speed is 2 pixels/16.6 ms. At 66.4 ms, the display interface changes from frame 4 to frame 5, and the sliding speed is 2 pixels/16.6 ms. At 83 ms, the display interface changes from frame 5 to frame 6, and the sliding speed is 2 pixels/16.6 ms. Across the switch, the speed remains consistent and the pictures are displayed smoothly without stutter.
It should be noted that Vsync-APP is implemented by a software timer, and after the screen refresh rate switch is complete, the terminal device calibrates Vsync-APP using the Vsync-HW timestamp to increase the accuracy of Vsync-APP. On the basis of the above embodiment, when the frame rate control system decides to switch the screen refresh rate, it also sets the offset of Vsync-APP used at calibration time. In this way, stutter and similar problems in subsequent Vsync-APP calibration based on Vsync-HW can be reduced.
In a possible implementation manner, when (1 + buffer count) is not an integer multiple of N, the offset of Vsync-APP is set such that the Vsync-APP generation time is advanced by ((1 + buffer count) - kN) Vsync periods corresponding to the first frame rate, where ((1 + buffer count) - kN) is smaller than N and k is an integer.
Illustratively, as shown in fig. 11 or fig. 15, the screen refresh rate switch is decided during 24.9 ms-33.2 ms, and the offset of Vsync-APP is increased by ((2+1) - 1×2) = 1 Vsync period duration corresponding to the first frame rate, i.e., 8.3 ms. Specifically, the generation time of the Vsync-HW signal is 66.4 ms, and the generation time of the Vsync-APP signal, 41.5 ms, is shifted forward relative to the generation time of the Vsync-HW signal.
In this way, when Vsync-APP is calibrated, the Vsync-APP interval remains unchanged, which reduces stutter and similar problems caused by a changed Vsync-APP interval when Vsync-APP is calibrated based on Vsync-HW.
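The sketch below, added for illustration with assumed names, computes only the magnitude of this calibration offset; how the offset is applied to the Vsync-APP generation time follows the description above.

```cpp
// Illustrative sketch: magnitude of the Vsync-APP calibration offset. N is the ratio of
// the first frame rate to the second; the offset is ((1 + buffer count) - kN) Vsync
// periods of the first frame rate, with k chosen so the result is non-negative and < N.
#include <iostream>

double vsyncAppCalibrationOffsetMs(int bufferCount, int ratioN, double firstPeriodMs) {
    int remainder = (1 + bufferCount) % ratioN;  // equals (1 + buffer count) - kN
    return remainder * firstPeriodMs;            // 0 when (1 + buffer count) is a multiple of N
}

int main() {
    // Example of fig. 11 / fig. 15: 2 buffered frames, 120 Hz -> 60 Hz (N = 2):
    std::cout << vsyncAppCalibrationOffsetMs(2, 2, 8.3) << " ms\n";  // 8.3
    return 0;
}
```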
The data processing method according to the embodiment of the present application has been described above, and the terminal device provided in the embodiment of the present application, which executes the data processing method, is described below. Those skilled in the art can understand that the method and the apparatus can be combined and referred to each other, and the terminal device provided in the embodiments of the present application can perform the steps in the data processing method.
Fig. 16 is a schematic structural diagram illustrating a data processing apparatus according to an embodiment of the present application. The data processing device may be a terminal device in the embodiment of the present application. The data processing apparatus includes: a display screen 1801 for displaying an image; one or more processors 1802; a memory 1803; a plurality of application programs; and one or more computer programs, wherein the one or more computer programs are stored in the memory 1803, the one or more computer programs comprising instructions which, when executed by the data processing apparatus, cause the data processing apparatus to perform the steps of the data processing method described above.
Fig. 17 is a schematic hardware configuration diagram of a data processing apparatus according to an embodiment of the present application. Referring to fig. 17, the apparatus includes: a memory 1901, a processor 1902, and an interface circuit 1903. The apparatus may also include a display 1904, where the memory 1901, the processor 1902, the interface circuit 1903, and the display 1904 may communicate; illustratively, they may communicate via a communication bus. The memory 1901 is used for storing computer-executable instructions, the processor 1902 controls their execution, and the interface circuit 1903 performs communication, so as to implement the data processing method provided by the embodiments of the present application.
Optionally, the interface circuit 1903 may also include a transmitter and/or a receiver. Optionally, the processor 1902 may include one or more CPUs, and may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the present application may be embodied directly in a hardware processor, or in a combination of hardware and software modules in the processor.
In a possible implementation manner, the computer execution instruction in the embodiment of the present application may also be referred to as an application program code, which is not specifically limited in the embodiment of the present application.
The data processing apparatus provided in the embodiment of the present application is used for executing the data processing method in the foregoing embodiment, and the technical principle and the technical effect are similar, and are not described herein again.
The embodiment of the application provides a terminal device, and the structure of the terminal device is shown in fig. 1. The memory of the terminal device may be configured to store at least one program instruction, and the processor is configured to execute the at least one program instruction to implement the technical solutions of the above-mentioned method embodiments. The implementation principle and technical effect are similar to those of the embodiments related to the method, and are not described herein again.
The embodiment of the application provides a chip. The chip comprises a processor for calling a computer program in the memory to execute the technical solution in the above embodiments. The principle and technical effects are similar to those of the related embodiments, and are not described herein again.
The embodiment of the present application provides a computer program product, which enables a terminal device to execute the technical solutions in the above embodiments when the computer program product runs on an electronic device. The implementation principle and technical effect are similar to those of the related embodiments, and are not described herein again.
The embodiment of the present application provides a computer-readable storage medium, on which program instructions are stored, and when the program instructions are executed by a terminal device, the terminal device is enabled to execute the technical solutions of the above embodiments. The principle and technical effects are similar to those of the related embodiments, and are not described herein again.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above embodiments are only for illustrating the embodiments of the present invention and are not to be construed as limiting the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made on the basis of the embodiments of the present invention shall be included in the scope of the present invention.

Claims (14)

1. A data processing method, applied to a terminal device, the terminal device including an application, a frame rate control system, a composition thread, a cache thread, and a display driver, the method comprising:
when the application draws a rendering image at a first frame rate, the frame rate control system receives a message which is sent by the synthesis thread and carries a second frame rate, and the ratio of the first frame rate to the second frame rate is an integer greater than 1;
in response to receiving the message carrying the second frame rate, the frame rate control system adds an identifier and controls the application to draw and render images at the second frame rate, wherein the identifier is used for adjusting the time for synthesizing the images drawn and rendered by the application by the synthesizing thread;
at a first moment, the frame rate control system controls the compositing thread to composite the image after application drawing rendering at the second frame rate based on the identification, and controls the display driver to drive a screen to display the image after compositing by the compositing thread at the second frame rate;
the first time is between the (1+A)th Vsync-HW signal and the (2+A)th Vsync-HW signal after the frame rate control system receives the message carrying the second frame rate, where A is the buffer count buffered in the buffer queue corresponding to the application in the cache thread when the frame rate control system receives the message carrying the second frame rate, and the Vsync-HW signal is used to trigger the display driver to display, on the screen, the image synthesized by the synthesis thread.
2. The method of claim 1, wherein the terminal device further comprises a Vsync thread;
the response to receiving the message carrying the second frame rate, the frame rate control system controlling the application to draw a rendering image at the second frame rate includes:
after the frame rate control system receives the message carrying the second frame rate, the frame rate control system sends a first message to a Vsync thread, wherein the first message is used for indicating that the application frame rate is switched to the second frame rate;
the Vsync thread generating a Vsync-APP signal at the second frame rate based on the first message and sending the Vsync-APP signal to the application;
the application renders a rendered image based on the Vsync-APP signal.
3. The method of claim 2, wherein the terminal device further comprises: a Vsync thread; the identification is a first identification used for adjusting the time for synthesizing the application-drawn rendered image at the second frame rate by the synthesizing thread based on the offset of a Vsync-SF signal;
the frame rate control system controls the compositing thread to composite the application rendered image at the second frame rate at the first moment, and controls the display driver to drive the screen to display the composited image of the compositing thread at the second frame rate, including:
at the first moment, the frame rate control system sends a second message to a Vsync thread, wherein the second message is used for indicating that the synthesis frame rate and the screen refresh rate are both switched to the second frame rate;
generating, by the Vsync thread, a Vsync-SF signal at the second frame rate based on the second message and the first identification and sending the Vsync-SF signal to the composition thread;
the composition thread composites the rendered image based on the Vsync-SF signal;
the Vsync thread sending a third message to the display driver based on the second message, the third message indicating that the screen refresh rate is switched to the second frame rate;
the display driver controls the screen, based on the third message, to generate a Vsync-HW signal at the second frame rate, the Vsync-HW signal being generated at a time later than the time of generation of the Vsync-SF signal;
the display driver controls the screen to display the synthesized image upon receiving the Vsync-HW signal.
4. The method of claim 3, wherein the Vsync thread generating and sending Vsync-SF signals to the composition thread at the second frame rate based on the second message and the first identification, comprises:
when receiving the image rendered by the application drawing, the cache thread sends a fourth message to the frame rate control system, wherein the fourth message is used for indicating to inquire the first identifier;
when the first identifier exists in the frame rate control system, the frame rate control system sends a fifth message to the cache thread, wherein the fifth message is used for indicating that the first identifier exists in the frame rate control system;
the cache thread sends a sixth message to the Vsync thread, wherein the sixth message is used for indicating to adjust the offset;
the Vsync thread generating a Vsync-SF signal at the second frame rate in advance of a time corresponding to the offset amount based on the sixth message;
the Vsync thread sends the Vsync-SF signal to the composition thread.
5. The method of claim 4, wherein the offset corresponds to the duration of one vertical synchronization (Vsync) signal period corresponding to the first frame rate.
6. The method of claim 2, wherein the terminal device further comprises: a Vsync thread; the identifier is a second identifier, and the second identifier is used for indicating the compositing thread to immediately composite the image;
the frame rate control system controls the compositing thread to composite the application rendered image at the second frame rate at the first moment, and controls the display driver to drive the screen to display the composited image of the compositing thread at the second frame rate, including:
at the first moment, the frame rate control system sends a seventh message to a Vsync thread, wherein the seventh message is used for indicating that the synthesis frame rate and the screen refresh rate are both switched to the second frame rate;
the Vsync thread generates a Vsync-SF signal at a Vsync signal period corresponding to the second frame rate based on the seventh message;
when the cache thread receives the image after the application drawing rendering, the cache thread sends the image after the application drawing rendering to the synthesis thread based on the second identifier;
the compositing thread synthesizes the image rendered by the application drawing;
the Vsync thread sending an eighth message to the display driver based on the seventh message, the eighth message for instructing the screen refresh rate to switch to the second frame rate;
the display driver controls the screen, based on the eighth message, to generate a Vsync-HW signal at the second frame rate;
the display driver controls the screen to display the synthesized image upon receiving the Vsync-HW signal.
7. The method of claim 6, wherein, when the cache thread receives the application-rendered image, the cache thread sending the application-rendered image to the composition thread based on the second identifier comprises:
when the cache thread receives the image rendered by the application drawing, the cache thread sends a ninth message to the frame rate control system, wherein the ninth message is used for indicating to inquire the second identifier;
when the second identifier exists in the frame rate control system, the frame rate control system sends a tenth message to the cache thread, wherein the tenth message is used for indicating that the second identifier exists;
and the cache thread sends the application-rendered image to the composition thread based on the tenth message.
8. The method according to any of claims 1-7, wherein a difference between the first time and a time when the frame rate control system receives the message carrying the second frame rate satisfies: (the buffer count + 1) × Vsync cycle duration corresponding to the first frame rate.
9. The method according to any one of claims 1 to 7, wherein the buffer count is the sum of the number of buffers for which the application has completed rendering and the number of buffers the application is currently rendering.
10. The method according to any of claims 1-7, wherein after the frame rate control system receives a message carrying a second frame rate sent by the composition thread when the application renders the rendered image at the first frame rate, the method further comprises:
the frame rate control system calculates the ratio of the first frame rate to the second frame rate;
when the ratio is an integer larger than 1, the frame rate control system acquires the cache number from the cache queue corresponding to the application;
the frame rate control system determines the first time based on the buffer amount.
11. The method according to claim 10, wherein when the ratio is an integer greater than 1, the frame rate control system obtains the buffer amount from the buffer queue corresponding to the application, and includes:
the frame rate control system acquires a focus window from a window manager, wherein the focus window corresponds to the application;
and the frame rate control system acquires the cache number from the cache queue corresponding to the application based on the focus window.
12. A terminal device, characterized in that the terminal device comprises a processor for invoking a computer program in a memory for executing the method according to any of claims 1-11.
13. A computer-readable storage medium, having stored thereon computer instructions, which, when run on a terminal device, cause the terminal device to perform the method of any one of claims 1-11.
14. A chip, characterized in that the chip comprises a processor for calling a computer program in a memory for performing the method according to any of claims 1-11.