CN114579075B - Data processing method and related device


Info

Publication number
CN114579075B
Authority
CN
China
Prior art keywords
frame rate
vsync
frame
application
thread
Prior art date
Legal status
Active
Application number
CN202210114699.9A
Other languages
Chinese (zh)
Other versions
CN114579075A (en)
Inventor
蔡立峰
沈赫
杜鸿雁
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202210114699.9A
Priority to CN202211502249.3A (CN116521115A)
Publication of CN114579075A
Priority to PCT/CN2023/071242 (WO2023142995A1)
Application granted
Publication of CN114579075B
Active legal status
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/147Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the application provide a data processing method and a related device, which are applied to the field of terminal technologies. The method comprises the following steps: when an application draws and renders images at a first frame rate, a frame rate control system receives a message carrying a second frame rate sent by a cache thread, where the ratio of the second frame rate to the first frame rate is an integer greater than 1; in response to the message, the frame rate control system controls the application to draw and render images at the second frame rate; and at a first moment, the frame rate control system controls the composition thread to composite, at the second frame rate, the images drawn and rendered by the application, and controls the display driver to drive the screen to display the composited images at the second frame rate. In this way, drawing and rendering are switched first, and composition and display are switched afterwards, so that the display interval between each frame of image and the previous frame is consistent with its drawing and rendering interval, sliding-speed jumps caused by a mismatch between the display interval and the drawing and rendering interval are reduced, stutter is reduced, and user experience is improved.

Description

Data processing method and related device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a data processing method and a related apparatus.
Background
Currently, a user can view various types of content through the display screen of a terminal device. When there is a large amount of content, the display screen cannot show all of it at once, and the user can slide or page through the related content on the display screen.
Interface display on the display screen of the terminal device generally goes through processes such as drawing, rendering, and composition. For example, the interface drawing process of the terminal device may include background drawing, sub-view drawing, scroll bar drawing, and the like. The interface composition process of the terminal device may include vertex processing, pixel processing, and the like.
However, when the screen refresh rate of the terminal device is switched while the screen interface is changing, a stutter phenomenon may occur on the terminal device.
Disclosure of Invention
The embodiments of the application provide a data processing method and a related device, which are applied to a terminal device. The method is used to solve the problem of stutter caused by switching the screen refresh rate while the screen interface of the terminal device is changing.
In a first aspect, an embodiment of the present application provides a data processing method, which is applied to a terminal device, where the terminal device includes an application, a frame rate control system, a composition thread, a cache thread, and a display driver.
The method comprises the following steps: when the application draws and renders images at a first frame rate, the frame rate control system receives a message carrying a second frame rate sent by the cache thread, where the ratio of the second frame rate to the first frame rate is an integer greater than 1; in response to receiving the message carrying the second frame rate, the frame rate control system controls the application to draw and render images at the second frame rate; at a first moment, the frame rate control system controls the composition thread to composite, at the second frame rate, the images drawn and rendered by the application, and controls the display driver to drive the screen to display, at the second frame rate, the images composited by the composition thread. The first moment lies between the (1+A)-th Vsync-HW signal and the (2+A)-th Vsync-HW signal after the frame rate control system receives the message carrying the second frame rate, where A is the number of buffers in the buffer queue corresponding to the application in the cache thread when the frame rate control system receives the message carrying the second frame rate, and the Vsync-HW signal is used to trigger the screen to display the composited image.
In this way, when the screen refresh rate of the terminal device is switched, the drawing and rendering process of the images is switched first, and the composition and display processes are switched afterwards, so that the display interval between each frame of image and the previous frame is consistent with its drawing and rendering interval, and the display rhythm of each frame matches its drawing and rendering rhythm. Sliding-speed jumps caused by a mismatch between the display interval and the drawing and rendering interval are therefore reduced, stutter is reduced, and user experience is improved.
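To make this two-stage switch concrete, the following is a minimal Java sketch. It is not taken from the patent or from any real framework; all class and method names (FrameRateController, switchAppVsyncPeriod, and so on) are hypothetical. It switches the application's drawing and rendering rate immediately and defers the switch of composition and display to the calculated first moment.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Hypothetical frame rate control system illustrating the two-stage switch.
public class FrameRateController {
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

    // Called when the cache thread reports that the frame rate should become secondRate.
    public void onFrameRateMessage(int firstRate, int secondRate, int bufferedFrames) {
        if (secondRate % firstRate != 0 || secondRate <= firstRate) {
            return; // only handle ratios that are integers greater than 1
        }
        // Stage 1: the application starts drawing and rendering at the second frame rate at once.
        switchAppVsyncPeriod(1_000_000_000L / secondRate);

        // Stage 2: composition and display follow once the already buffered frames are consumed.
        long delayNs = (bufferedFrames + 1L) * (1_000_000_000L / firstRate);
        scheduler.schedule(() -> {
            switchCompositionVsyncPeriod(1_000_000_000L / secondRate); // Vsync-SF
            switchHardwareVsyncPeriod(1_000_000_000L / secondRate);    // Vsync-HW
        }, delayNs, TimeUnit.NANOSECONDS);
    }

    private void switchAppVsyncPeriod(long periodNs) { /* notify the Vsync thread (first message) */ }
    private void switchCompositionVsyncPeriod(long periodNs) { /* notify the Vsync thread (second message) */ }
    private void switchHardwareVsyncPeriod(long periodNs) { /* notify the display driver (third message) */ }
}
```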
Optionally, the terminal device further includes a Vsync thread. Controlling, by the frame rate control system in response to receiving the message carrying the second frame rate, the application to draw and render images at the second frame rate includes: after the frame rate control system receives the message carrying the second frame rate, the frame rate control system sends a first message to the Vsync thread, where the first message indicates that the application frame rate is switched to the second frame rate; the Vsync thread generates Vsync-APP signals at the second frame rate based on the first message and sends the Vsync-APP signals to the application; and the application draws and renders images based on the Vsync-APP signals.
In the embodiment of the present application, generating the Vsync-APP signals at the second frame rate may be understood as generating the Vsync-APP signals with the Vsync signal period corresponding to the second frame rate.
Therefore, by modifying the time interval between Vsync-APP signals, the drawing and rendering process of the images is switched from the first frame rate to the second frame rate. This approach is simple and saves computing resources.
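A Vsync signal generator with an adjustable period could look roughly like the Java sketch below. The names (VsyncGenerator, setFrameRate) are hypothetical, and the real Vsync thread in the image composition system is considerably more involved.

```java
// Minimal sketch of a Vsync signal generator whose period can be switched at run time.
public class VsyncGenerator implements Runnable {
    private volatile long periodNs;  // e.g. 16_666_667 ns at 60 Hz, 8_333_333 ns at 120 Hz
    private final Runnable onVsync;  // callback delivering the Vsync-APP signal to the application

    public VsyncGenerator(long initialPeriodNs, Runnable onVsync) {
        this.periodNs = initialPeriodNs;
        this.onVsync = onVsync;
    }

    // Invoked when the first message indicates the application frame rate switches to the second frame rate.
    public void setFrameRate(int framesPerSecond) {
        this.periodNs = 1_000_000_000L / framesPerSecond;
    }

    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            long next = System.nanoTime() + periodNs;
            onVsync.run();  // the application draws and renders one frame per signal
            long sleepNs = next - System.nanoTime();
            if (sleepNs > 0) {
                try {
                    Thread.sleep(sleepNs / 1_000_000, (int) (sleepNs % 1_000_000));
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        }
    }
}
```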
Optionally, the terminal device further includes a Vsync thread. Controlling, by the frame rate control system at the first moment, the composition thread to composite the images drawn and rendered by the application at the second frame rate, and controlling the display driver to drive the screen to display the images composited by the composition thread at the second frame rate, includes: at the first moment, the frame rate control system sends a second message to the Vsync thread, where the second message indicates that the composition frame rate and the screen refresh rate are switched to the second frame rate; the Vsync thread generates Vsync-SF signals at the second frame rate based on the second message and sends the Vsync-SF signals to the composition thread; the composition thread composites the drawn and rendered images based on the Vsync-SF signals; the Vsync thread sends a third message to the display driver based on the second message, where the third message indicates that the screen refresh rate is switched to the second frame rate; the display driver controls the screen to generate Vsync-HW signals at the second frame rate based on the third message; and the display driver controls the screen to display the composited image upon receiving a Vsync-HW signal.
In the embodiment of the present application, generating the Vsync-SF signals at the second frame rate may be understood as generating the Vsync-SF signals with the Vsync signal period corresponding to the second frame rate, and generating the Vsync-HW signals at the second frame rate may be understood as generating the Vsync-HW signals with the Vsync signal period corresponding to the second frame rate.
In this way, by modifying the time interval between Vsync-SF signals and the time interval between Vsync-HW signals, the image composition process and the image display process are each switched from the first frame rate to the second frame rate. This approach is simple and saves computing resources.
Optionally, the difference between the first moment and the moment at which the frame rate control system receives the message carrying the second frame rate satisfies: (number of buffers + 1) × the Vsync cycle duration corresponding to the first frame rate.
In this way, the calculation of the first moment is simplified, and computing resources are saved.
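As a worked example, if the first frame rate is 60 Hz (a Vsync period of about 16.7 ms) and 2 buffers are queued, the first moment falls (2 + 1) × 16.7 ms, about 50 ms, after the message is received. The helper below illustrates the same arithmetic; the class and method names are hypothetical.

```java
// Hypothetical helper computing the first moment from the formula
// firstMoment = receiveTime + (bufferCount + 1) * vsyncPeriod(firstFrameRate).
public final class FirstMoment {
    private FirstMoment() {}

    public static long compute(long receiveTimeNs, int bufferCount, int firstFrameRate) {
        long vsyncPeriodNs = 1_000_000_000L / firstFrameRate;
        return receiveTimeNs + (bufferCount + 1L) * vsyncPeriodNs;
    }

    public static void main(String[] args) {
        // 60 Hz first frame rate, 2 queued buffers: the switch happens 3 Vsync periods (~50 ms) later.
        long now = System.nanoTime();
        long firstMoment = FirstMoment.compute(now, 2, 60);
        System.out.printf("Delay: %.1f ms%n", (firstMoment - now) / 1e6);
    }
}
```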
Optionally, the number of buffers is the sum of the number of rendered buffers and the number of buffers being rendered in the buffer queue corresponding to the application.
Optionally, when the application draws and renders images at the first frame rate, after the frame rate control system receives the message carrying the second frame rate sent by the composition thread, the method further includes: the frame rate control system calculates the ratio of the second frame rate to the first frame rate; when the ratio is an integer greater than 1, the frame rate control system obtains the number of buffers from the buffer queue corresponding to the application; and the frame rate control system determines the first moment based on the number of buffers.
Optionally, obtaining, by the frame rate control system when the ratio is an integer greater than 1, the number of buffers from the buffer queue corresponding to the application includes: the frame rate control system obtains a focus window from a window manager, where the focus window corresponds to the application; and the frame rate control system obtains the number of buffers from the buffer queue corresponding to the application based on the focus window.
Optionally, when the application draws and renders images at the first frame rate, before the frame rate control system receives the message carrying the second frame rate sent by the composition thread, the method further includes: when the application draws and renders images at the first frame rate, the cache thread receives other drawn and rendered images, where the other images correspond to windows other than the focus window; and after receiving the other drawn and rendered images, the cache thread sends the message carrying the second frame rate to the frame rate control system.
Therefore, when other windows are displayed on the display interface, the screen refresh rate is improved, the display is smooth and fine, and the user experience is improved.
Optionally, the other images correspond to pop-up reminders of the terminal device, or the other images correspond to screen capture animations of the terminal device.
Therefore, when the terminal equipment displays the popup prompt or displays the screenshot animation, the screen refresh rate is improved, the display is smooth and fine, and the user experience is improved.
In a second aspect, an embodiment of the present application provides a terminal device, which may also be referred to as a terminal (terminal), a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), or the like. The terminal device may be a mobile phone (mobile phone), a smart television, a wearable device, a tablet computer (Pad), a computer with a wireless transceiving function, a Virtual Reality (VR) terminal device, an Augmented Reality (AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in self-driving (self-driving), a wireless terminal in remote surgery (remote medical supply), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), and the like.
The terminal device comprises a processor for invoking a computer program in a memory for performing the method according to the first aspect.
In a third aspect, embodiments of the present application provide a computer-readable storage medium storing computer instructions that, when executed on a terminal device, cause the terminal device to perform the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a computer program product, which, when executed, causes a terminal device to perform the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip comprising a processor for invoking a computer program in a memory to perform a method as in the first aspect.
It should be understood that the second aspect to the fifth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the beneficial effects achieved by the aspects and the corresponding possible implementations are similar and will not be described again.
Drawings
Fig. 1 is a schematic structural diagram of a hardware system of a terminal device according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a terminal device software system according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a terminal device interface display processing flow in a possible implementation;
fig. 4 is a schematic diagram of an interface display processing flow corresponding to frame rate switching in possible implementations;
FIG. 5 is a schematic diagram of an interface display processing flow in a possible implementation;
FIG. 6 is a schematic diagram of an interface display in a possible implementation;
fig. 7 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 8 is a schematic process diagram of interaction among modules in the data processing method according to the embodiment of the present application;
fig. 9 is a schematic view of a processing flow of terminal device interface display according to an embodiment of the present application;
fig. 10 is a schematic flowchart of a data processing method according to an embodiment of the present application;
fig. 11 is a schematic view of a terminal device interface display processing flow provided in an embodiment of the present application;
fig. 12 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application;
fig. 13 is a schematic hardware structure diagram of a data processing apparatus according to an embodiment of the present application.
Detailed Description
In order to facilitate a clear description of the technical solutions of the embodiments of the present application, in the embodiments of the present application, words such as "first" and "second" are used to distinguish between identical items or similar items having substantially the same functions and effects. For example, a first chip and a second chip are only used to distinguish different chips, and their sequence is not limited. Those skilled in the art will appreciate that the words "first", "second", and the like do not denote any order or importance.
It should be noted that in the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate examples, illustrations or explanations. Any embodiment or design described herein as "exemplary" or "such as" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present relevant concepts in a concrete fashion.
In the embodiments of the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
The embodiment of the application provides a data processing method which can be applied to electronic equipment with a display function.
The electronic device includes a terminal device, which may also be referred to as a terminal (terminal), a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), and so on. The terminal device may be a mobile phone (mobile phone), a smart tv, a wearable device, a tablet computer (Pad), a computer with a wireless transceiving function, a Virtual Reality (VR) terminal device, an Augmented Reality (AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in self-driving (self-driving), a wireless terminal in remote surgery (remote medical supply), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), and so on. The embodiment of the present application does not limit the specific technology and the specific device form adopted by the terminal device.
In order to better understand the embodiments of the present application, the following describes the structure of the terminal device according to the embodiments of the present application:
fig. 1 shows a schematic configuration diagram of a terminal device 100. The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the terminal device 100. In other embodiments of the present application, terminal device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to finish the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it may be called from memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, a camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the terminal device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus, enabling communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of receiving a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, audio module 170 and wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a display screen serial interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture function of terminal device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the terminal device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device 100, and may also be used to transmit data between the terminal device 100 and a peripheral device. And the method can also be used for connecting a headset and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is an exemplary illustration, and does not limit the structure of the terminal device 100. In other embodiments of the present application, the terminal device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the terminal device 100. The charging management module 140 may also supply power to the terminal device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may also be disposed in the same device.
The wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. The antennas in terminal device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the terminal device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the terminal device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the terminal device 100 can communicate with a network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code Division Multiple Access (CDMA), wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. GNSS may include Global Positioning System (GPS), global navigation satellite system (GLONASS), beidou satellite navigation system (BDS), quasi-zenith satellite system (QZSS), and/or Satellite Based Augmentation System (SBAS).
The terminal device 100 implements a display function by the GPU, the display screen 194, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used for displaying images, displaying videos, receiving slide operations, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the terminal device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The terminal device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, the terminal device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the terminal device 100 selects a frequency point, the digital signal processor is used to perform fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record video in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor, which processes input information quickly by referring to a biological neural network structure, for example, by referring to a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the terminal device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like. The storage data area may store data (such as audio data, a phonebook, etc.) created during use of the terminal device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the terminal device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The terminal device 100 may implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into a sound signal. The terminal device 100 can listen to music through the speaker 170A, or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the terminal device 100 answers a call or voice information, it is possible to answer a voice by bringing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a voice signal to the microphone 170C by uttering a voice signal close to the microphone 170C through the mouth of the user. The terminal device 100 may be provided with at least one microphone 170C. In other embodiments, the terminal device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the terminal device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The terminal device 100 determines the intensity of the pressure from the change in the capacitance. When a touch operation is applied to the display screen 194, the terminal device 100 detects the intensity of the touch operation from the pressure sensor 180A. The terminal device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but have different touch operation intensities may correspond to different operation instructions.
The gyro sensor 180B may be used to determine the motion attitude of the terminal device 100. In some embodiments, the angular velocity of the terminal device 100 about three axes (i.e., x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the terminal device 100, calculates the distance to be compensated for the lens module according to the shake angle, and allows the lens to counteract the shake of the terminal device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the terminal device 100 calculates an altitude from the barometric pressure measured by the barometric pressure sensor 180C, and assists in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The terminal device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the terminal device 100 is a flip phone, the terminal device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D, and then set features such as automatic unlocking upon flipping open according to the detected opening or closing state of the holster or the flip cover.
The acceleration sensor 180E can detect the magnitude of acceleration of the terminal device 100 in various directions (generally, three axes). The magnitude and direction of gravity may be detected when the terminal device 100 is stationary. The method can also be used for identifying the attitude of the terminal equipment, and is applied to application programs such as horizontal and vertical screen switching, pedometers and the like.
A distance sensor 180F for measuring a distance. The terminal device 100 may measure the distance by infrared or laser. In some embodiments, shooting a scene, the terminal device 100 may range using the distance sensor 180F to achieve fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The terminal device 100 emits infrared light to the outside through the light-emitting diode. The terminal device 100 detects infrared reflected light from a nearby object using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the terminal device 100. When insufficient reflected light is detected, the terminal device 100 can determine that there is no object near the terminal device 100. The terminal device 100 may use the proximity light sensor 180G to detect that the user holds the terminal device 100 close to the ear for talking, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used in a holster mode or a pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The terminal device 100 may adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness. The ambient light sensor 180L can also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the terminal device 100 is in a pocket, in order to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The terminal device 100 may utilize the collected fingerprint characteristics to unlock a fingerprint, access an application lock, photograph a fingerprint, answer an incoming call with a fingerprint, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the terminal device 100 executes a temperature processing policy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the terminal device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the terminal device 100 heats the battery 142 when the temperature is below another threshold to avoid an abnormal shutdown caused by the low temperature. In other embodiments, when the temperature is below a further threshold, the terminal device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by the low temperature.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the terminal device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signals acquired by the bone conduction sensor 180M, and the heart rate detection function is realized.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The terminal device 100 may receive a key input, and generate a key signal input related to user setting and function control of the terminal device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration prompts as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to and detached from the terminal device 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The terminal device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. The same SIM card interface 195 can be inserted with multiple cards at the same time. The types of the plurality of cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The terminal device 100 interacts with the network through the SIM card to implement functions such as communication and data communication. In some embodiments, the terminal device 100 employs eSIM, namely: an embedded SIM card. The eSIM card may be embedded in the terminal device 100 and cannot be separated from the terminal device 100.
The software system of the terminal device 100 may adopt a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture, etc. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the terminal device 100.
Fig. 2 is a block diagram of a software structure of a terminal device according to an embodiment of the present application. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into five layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, a hardware abstraction layer, and a kernel layer from top to bottom.
The application layer may include a series of application packages. As shown in fig. 2, the application package may include phone, mailbox, calendar, camera, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, a frame rate control system, an image composition system, a view system, a package manager, an input manager, an activity manager, and a resource manager, among others.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The frame rate control system is used to adjust the screen refresh rate.
The image synthesis system is used to control image synthesis and generate vertical synchronization (Vsync) signals.
The image synthesis system includes a composition thread, a Vsync thread, and a cache thread. The composition thread is woken up by the Vsync signal to perform composition. The Vsync thread is used to generate the next Vsync signal according to a Vsync signal request. The cache thread is used to store buffers, generate Vsync signal requests, wake up the composition thread, and the like. One or more cache queues exist in the cache thread, and the cache queues are respectively used to store the buffers corresponding to different applications.
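The per-application cache queues could be modeled roughly as follows. This is an illustrative Java sketch with hypothetical names (CacheThreadQueues, bufferCount), not the actual image synthesis system.

```java
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.Map;
import java.util.Queue;

// Illustrative sketch of per-application buffer queues held by the cache thread:
// each application has its own queue of drawn-and-rendered buffers awaiting composition.
public class CacheThreadQueues {
    private final Map<String, Queue<byte[]>> queues = new HashMap<>();

    // The application enqueues a drawn and rendered buffer.
    public synchronized void enqueue(String appName, byte[] renderedBuffer) {
        queues.computeIfAbsent(appName, k -> new ArrayDeque<>()).add(renderedBuffer);
        // A real cache thread would also request the next Vsync signal and wake the composition thread here.
    }

    // The composition thread dequeues one buffer per Vsync-SF signal.
    public synchronized byte[] dequeue(String appName) {
        Queue<byte[]> q = queues.get(appName);
        return (q == null) ? null : q.poll();
    }

    // Number of queued buffers, as read by the frame rate control system when determining the first moment.
    public synchronized int bufferCount(String appName) {
        Queue<byte[]> q = queues.get(appName);
        return (q == null) ? 0 : q.size();
    }
}
```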
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The package manager is used for program management within the system, for example: application installation, uninstallation, upgrade, and the like.
The input manager is used to manage the programs of the input devices. For example, the input manager may determine input operations such as a mouse click operation, a keyboard input operation, and a touch slide operation.
The activity manager is used to manage the life cycle of each application program and the navigation back function. It is responsible for creating the Android main thread and maintaining the life cycle of each application program.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is a function which needs to be called by java language, and the other part is a core library of android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: an image rendering library, an image composition library, a function library, a media library, an input processing library, and the like.
The image rendering library is used for rendering two-dimensional or three-dimensional images. The image synthesis library is used for synthesizing two-dimensional or three-dimensional images.
In a possible implementation, the application draws and renders an image through the image rendering library, and then sends the drawn and rendered image to a cache queue of the image synthesis system. Each time a Vsync signal arrives, the image synthesis system (e.g., SurfaceFlinger) sequentially acquires one frame of image to be composited from the cache queue, and then performs image composition through the image synthesis library.
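The producer/consumer pattern described above can be sketched as follows. The names (CompositionLoop, submitRenderedFrame, onVsync) are hypothetical, and the real image synthesis system (such as SurfaceFlinger) is far more complex.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Simplified sketch: the application produces rendered frames into a queue, and a composition
// loop takes at most one frame per Vsync signal and composites it.
public class CompositionLoop {
    private final BlockingQueue<String> bufferQueue = new LinkedBlockingQueue<>();

    // Producer side: the application submits a drawn and rendered frame.
    public void submitRenderedFrame(String frame) {
        bufferQueue.offer(frame);
    }

    // Consumer side: called once per Vsync-SF signal.
    public void onVsync() {
        String frame = bufferQueue.poll();  // at most one frame per Vsync
        if (frame != null) {
            composite(frame);               // composite and hand off for display
        }
    }

    private void composite(String frame) {
        System.out.println("Compositing " + frame);
    }
}
```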
The function library provides macros, type definitions, character string operation functions, mathematical computation functions, input and output functions, and the like used in the C language.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like.
The input processing library is a library for processing input devices, and can implement mouse, keyboard, and touch input processing, among others.
The hardware abstraction layer may include a plurality of library modules, which may be, for example, hardware compositor (hwcomposer, HWC), camera library modules, and the like. The Android system can load corresponding library modules for the equipment hardware, and then the purpose that the application program framework layer accesses the equipment hardware is achieved. The device hardware may include, for example, an LCD display screen, a camera, etc. in the electronic device.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a Touch Panel (TP) driver, a display driver, a Bluetooth driver, a WIFI driver, a keyboard driver, a shared memory driver, a camera driver and the like.
The hardware may be audio devices, bluetooth devices, camera devices, sensor devices, etc.
The following describes an exemplary workflow of software and hardware of the terminal device 100 in conjunction with a scenario where an application is started or an interface is switched in the application.
When the touch sensor 180K in the touch panel receives a touch operation, the kernel layer processes the touch operation into an original input event (including information such as touch coordinates, touch force, and a time stamp of the touch operation). The original input events are stored at the kernel layer. The kernel layer reports the original input event to the input manager of the application program framework layer through the input processing library. The input manager of the application program framework layer parses the information of the original input event (including the operation type, the report point position, and the like), determines the focus application according to the current focus, and sends the parsed information to the focus application. The focus may be a touch point in a touch operation or a click position in a mouse click operation. The focus application is an application running in the foreground of the terminal device or an application corresponding to the touch position in the touch operation. The focus application determines the control corresponding to the original input event according to the parsed information (e.g., the report point position) of the original input event.
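The dispatch described above may be illustrated with the following simplified Java sketch; all names (RawInputEvent, findFocusApplication, the package name) are hypothetical and are used only to show how the parsed event reaches the focus application.

public class InputDispatchSketch {
    // Hypothetical representation of the original input event described above.
    record RawInputEvent(String type, float x, float y, long timestampMs) {}

    // Hypothetical focus lookup: returns the package name of the focus application.
    static String findFocusApplication(float x, float y) {
        return "com.example.foregroundapp";   // assumed value for illustration
    }

    // Parse the event and deliver it to the focus application.
    static void dispatch(RawInputEvent event) {
        String focusApp = findFocusApplication(event.x(), event.y());
        System.out.printf("deliver %s at (%.0f, %.0f) to %s%n",
                event.type(), event.x(), event.y(), focusApp);
    }

    public static void main(String[] args) {
        dispatch(new RawInputEvent("move", 120f, 600f, 16L));
    }
}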
Taking the case where the touch operation is a touch sliding operation and the control corresponding to the touch sliding operation is a list control of the WeChat application as an example, the WeChat application calls, through the view system of the application program framework layer, the image rendering library in the system library to draw and render the image. The WeChat application sends the drawn and rendered image to the cache queue of the image synthesis system. The image synthesis system synthesizes the drawn and rendered image into the WeChat interface through the image composition library in the system library. The image synthesis system then uses the display driver of the kernel layer to make the screen (display screen) display the corresponding interface of the WeChat application.
For ease of understanding, concepts related to embodiments of the present application are described below with examples.
1. Frame: refers to a single picture of the smallest unit in the interface display. A frame can be understood as a still picture and displaying a number of consecutive frames in rapid succession can create the illusion of motion of the object. The frame rate is the number of frames of a picture refreshed in 1 second, and can also be understood as the number of times of refreshing pictures per second of a graphics processor in the terminal device. A high frame rate may result in a smoother and more realistic animation. The greater the number of frames per second, the more fluid the displayed motion will be.
It should be noted that, before the interface displays the frame, processes such as drawing, rendering, and composition are usually required.
2. Frame drawing: refers to drawing a picture of the display interface. The display interface may be composed of one or more views, each of which may be drawn by a visual control of the view system. Each view is composed of sub-views, and one sub-view corresponds to one widget in the view; for example, one sub-view corresponds to one symbol in a picture view.
3. Frame rendering: refers to performing a rendering operation on the drawn view, adding a 3D effect, or the like. For example, the 3D effect may be a light effect, a shadow effect, a texture effect, and the like.
4. Frame synthesis: the process of synthesizing one or more rendered views into a display interface.
5. Sliding away from the hand: refers to the interface continuing to slide at the initial speed at the moment of leaving the hand, after the touch sliding on the interface of the terminal device ends.
For example, in applications such as Settings and Toutiao, the interface displays a list layout; after the finger performing the sliding is lifted, the interface may continue to slide for a period of time at the initial speed at the moment of leaving the hand.
It will be appreciated that when the interface includes a list control, the interface displays a list layout, that is, a list interface. The list control may be a ListView or a RecyclerView; the list control is not limited in the embodiment of the present application.
The following describes a display process of the interface of the terminal device 100 with software and hardware.
In order to improve the smoothness of display and reduce display jamming and the like, the terminal device generally performs display based on the Vsync signal to synchronize the flow of drawing, rendering, synthesizing, screen refreshing display and the like of an image.
It is understood that the Vsync signal is a periodic signal, and the period of the Vsync signal may be set according to the screen refresh rate. For example, when the screen refresh rate is 60Hz, the period of the Vsync signal may be 16.6ms, that is, the terminal device generates a control signal every 16.6ms to trigger the Vsync signal.
It should be noted that the Vsync signal may be divided into a software Vsync signal and a hardware Vsync signal. The software Vsync signal includes Vsync-APP and Vsync-SF. Vsync-APP is used to trigger the rendering process. Vsync-SF is used to trigger the synthesis flow. The hardware Vsync signal (Vsync-HW) is used to trigger the screen display refresh process.
Typically, the software Vsync signal and the hardware Vsync signal keep the periods synchronized. For example, with 60Hz and 120Hz changes, if Vsync-HW switches from 60Hz to 120Hz, the Vsync-APP and Vsync-SF change synchronously, both switching from 60Hz to 120Hz.
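The relation between the screen refresh rate and the Vsync period, and the fact that the three Vsync signals normally share one period, can be illustrated with the following small Java sketch; the names are illustrative only and are not the actual signal implementation.

public class VsyncPeriodSketch {
    enum VsyncType { VSYNC_APP, VSYNC_SF, VSYNC_HW }

    // Vsync period in milliseconds for a given refresh rate: 60 Hz -> about 16.6 ms, 120 Hz -> about 8.3 ms.
    static double periodMs(double refreshRateHz) {
        return 1000.0 / refreshRateHz;
    }

    public static void main(String[] args) {
        // The three signals normally share the same period for a given refresh rate.
        for (VsyncType t : VsyncType.values()) {
            System.out.printf("%s: %.1f ms at 60 Hz, %.1f ms at 120 Hz%n", t, periodMs(60), periodMs(120));
        }
    }
}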
Fig. 3 is a schematic diagram of a terminal device interface display processing flow in a possible implementation. The contents displayed by the terminal device correspond to frame 1, frame 2, and frame 3 in chronological order.
Specifically, taking the display of frame 1 as an example, the application of the terminal device draws and renders frame 1 through the view system of the application framework layer. After the rendering of frame 1 is completed, the application of the terminal device sends the rendered image data of frame 1 to the image synthesis system (e.g., SurfaceFlinger). The image synthesis system synthesizes the rendered frame 1. After frame 1 is synthesized, the terminal device may start the display driver by calling the kernel layer and display the content corresponding to frame 1 on the screen (display screen). It is understood that the application of the terminal device sends the rendered image data to the image synthesis system; sending the rendered frame 1 may refer to sending the rendered image data corresponding to frame 1.
Frame 2 and frame 3 are also drawn, rendered, synthesized, and displayed in a process similar to that of frame 1, and details are not described here. Each frame in fig. 3 lags from rendering to display by 2 Vsync signal periods, so the display of the terminal device has a certain hysteresis.
It should be noted that the terminal device may decrease the screen refresh rate to reduce the jamming when the system load is large, or increase the screen refresh rate to increase the fluency of the display when the system load is small.
For example, fig. 4 is a schematic diagram of an interface display processing flow corresponding to frame rate switching in possible implementations. The contents displayed by the terminal device correspond to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 in the chronological order.
Specifically, taking the display of frame 2 as an example, the application of the terminal device draws and renders frame 2 through the view system of the application framework layer. After the rendering of frame 2 is completed, the application of the terminal device sends the rendered frame 2 to the image synthesis system (e.g., SurfaceFlinger). The image synthesis system synthesizes the rendered frame 2. After frame 2 is synthesized, the terminal device may start the display driver by calling the kernel layer and display the content corresponding to frame 2. Frame 3, frame 4, frame 5, and frame 6 are drawn, rendered, synthesized, and displayed in a process similar to that of frame 2, and details are not described here.
When the frame 3 is rendered, the frame rate control system of the terminal device decides to switch the frame rate (for example, from 60Hz to 120 Hz), and when the frame 4 is rendered, the frame rate is switched, and the period duration of the Vsync signal corresponding to the frame 4 is changed, so that the frame rate switching is completed.
Note that the terminal device determines the layout of the image and the like by the amount of displacement. In some sliding scenes (e.g., a slide away from the hand), the amount of displacement of the image is related to the corresponding frame interval (previous Vsync cycle duration) at the time the previous frame was rendered. Specifically, for example, in a constant-speed sliding scene, the displacement of the current image (frame) during rendering is obtained by multiplying the sliding speed of the current frame by the frame interval of the previous frame (the current frame Vsync-App timestamp — the previous frame Vsync-App timestamp). Illustratively, taking the frame 3 in fig. 4 as an example, the displacement amount of the frame 3 is obtained by multiplying the frame interval of the frame 2 (the timestamp of the Vsync2 — the timestamp of the Vsync 1) by the sliding speed of the frame 3.
The sliding speed of the image display of the terminal device is obtained by dividing the displacement difference between the current frame and the previous frame (the displacement of the current frame) by the corresponding frame interval (the display duration of the previous frame) when the previous frame is displayed. Illustratively, taking the frame 3 in fig. 4 as an example, the sliding speed of the frame 3 is the displacement amount of the frame 3 divided by the corresponding frame interval (timestamp of Vsync4 — timestamp of Vsync 3) when the frame 2 is displayed.
Therefore, when the frame interval corresponding to the drawing and rendering of an image is consistent with the frame interval corresponding to its display, the image is displayed at the preset sliding speed. If the frame interval corresponding to the drawing and rendering of the image is not consistent with the frame interval corresponding to the display, the sliding speed of the display may jump, the displayed picture may appear to stutter, and the user experience may be poor.
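The two formulas above can be illustrated with the following Java sketch; the helper names are assumptions made for illustration. The example in main() shows that a frame drawn against a 16.6ms interval keeps its speed when displayed against a 16.6ms interval, but the perceived speed jumps when the same frame is displayed against an 8.3ms interval.

public class SlidingSpeedSketch {
    // Displacement of the current frame = sliding speed x frame interval of the previous frame
    // (current Vsync-APP timestamp minus previous Vsync-APP timestamp), both in ms.
    static double displacement(double speedPxPerMs, double prevDrawIntervalMs) {
        return speedPxPerMs * prevDrawIntervalMs;
    }

    // Perceived sliding speed = displacement of the current frame / display duration of the previous frame.
    static double perceivedSpeed(double displacementPx, double prevDisplayIntervalMs) {
        return displacementPx / prevDisplayIntervalMs;
    }

    public static void main(String[] args) {
        double speed = 2.0 / 16.6;                     // 2 pixels every 16.6 ms
        double d = displacement(speed, 16.6);          // drawn against a 16.6 ms interval: about 2 pixels
        System.out.println(perceivedSpeed(d, 16.6));   // displayed against 16.6 ms: speed unchanged
        System.out.println(perceivedSpeed(d, 8.3));    // displayed against 8.3 ms: perceived speed doubles (jump)
    }
}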
As can be seen in fig. 4, each frame in fig. 4 lags from drawing to display by 2 periods of the Vsync signal. When the screen refresh rate is switched, the frame interval corresponding to the frame 2 rendering is not consistent with the frame interval corresponding to the frame 2 displaying, and similarly, the frame interval corresponding to the frame 3 rendering is not consistent with the frame interval corresponding to the frame 3 displaying. This may cause the sliding speed of the display of the frames 3 and 4 to be different from the preset sliding speed, and the sliding speed of the display of the frames 3 and 4 may jump.
Next, the displacement amount and the sliding speed according to the flow in fig. 4 will be described with reference to fig. 5 and 6.
Illustratively, take the example where the list is slid at a constant speed, the screen refresh rate is switched from 60Hz to 120Hz, and the sliding speed is moved by 2 pixels (pixels) every 16.6 milliseconds (ms). FIG. 5 is a schematic diagram of an interface display processing flow in a possible implementation.
In fig. 5, the contents displayed by the terminal device correspond to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 in the order of time.
It is understood that the shift amount is the product of the frame interval of the previous frame (current frame Vsync-APP timestamp — previous frame Vsync-APP timestamp) and the sliding speed of the current frame. Illustratively, the displacement of frame 3 in FIG. 5 is (16.6 ms − 0 ms) × 2 pixels/16.6 ms, i.e., 2 pixels; similarly, the shift amount of frame 4 is (33.2 ms − 16.6 ms) × 2 pixels/16.6 ms, i.e., 2 pixels.
As shown in fig. 5, the terminal device decides frame rate switching when rendering is drawn in frame 3. When rendering starts at 33.2ms frame 4, frame rate switching has not been completed. Therefore, the shift amount of the frame 2, the shift amount of the frame 3, and the shift amount of the frame 4 are all 2 pixels in relation to the screen refresh rate before switching (or the Vsync period duration before frame rate switching). At 41.5ms, frame rate switching is completed. The shift amount of the frame 5 and the shift amount of the frame 6 are both related to the screen refresh rate after switching (or the Vsync period duration after frame rate switching), and are 1pixel.
As can be seen from fig. 5, the frame interval corresponding to frame 2 during drawing and rendering is 16.6ms − 0ms, that is, 16.6ms, while the frame interval corresponding to frame 2 during display is 41.5ms − 33.2ms, that is, 8.3ms. The display rhythm of the terminal device is therefore accelerated, and the sliding speed when switching to the display of frame 3 increases. Similarly, the sliding speed when switching to the display of frame 4 increases. The frame interval corresponding to frame 4 during drawing and rendering is 41.5ms − 33.2ms, that is, 8.3ms, and the frame interval corresponding to frame 4 during display is 58.1ms − 49.8ms, that is, 8.3ms; the display rhythm of frame 4 is the same as its drawing and rendering rhythm, and the sliding speed of frame 5 is unchanged. When 60Hz is switched to 120Hz, the sliding speed of the display of the terminal device first increases and then decreases, so that the user perceives stutter in the picture.
For ease of understanding, the display speed of fig. 5 is explained below with reference to fig. 6.
It is understood that the user perceives a change in speed when the picture is switched. Therefore, the sliding speed can be represented by the difference between the position of the current frame and that of the previous frame (the displacement amount of the current frame) divided by the display duration of the previous frame.
Illustratively, fig. 6 is a display diagram of the interface corresponding to frames 0, 1, 2, 3, 4, 5 and 6 in fig. 5. As shown in fig. 6, there is a triangle in the list interface. Take the absolute position of the display screen (screen) as 0-18pixel as an example. If the triangle position in frame 0 is at 0 and the displacement of frame 1 is 2 pixels, then the triangle position in frame 1 is at 2 pixels. The shift amount of frame 2, the shift amount of frame 3, and the shift amount of frame 4 are all 2 pixels. The shift amount of the frame 5 and the shift amount of the frame 6 are both 1pixel. The triangle positions are located at 4 pixels, 6 pixels, 8 pixels, 9 pixels, and 10 pixels in frame 2, frame 3, frame 4, frame 5, and frame 6, respectively.
In connection with fig. 5, at 16.6ms, the Vsync signal arrives, the display interface of the terminal device changes from frame 0 to frame 1, the position of the triangle moves from 0 to 2 pixels, the moving speed is 2 pixels/16.6 ms, and the sliding speed perceived by the user is 2 pixels/(16.6 ms-0 ms), i.e. 2 pixels/16.6 ms. At 33.2ms, a Vsync signal comes, the display interface of the terminal device changes from frame 1 to frame 2, the moving speed of the triangle is 2 pixels/16.6 ms, and the sliding speed perceived by the user is 2 pixels/(33.2 ms-16.6 ms), namely 2 pixels/16.6 ms.
At 41.5ms, the Vsync signal comes, the display interface of the terminal device changes from frame 2 to frame 3, the moving speed of the triangle is 2 pixels/8.3 ms, and the sliding speed perceived by the user is 2 pixels/(41.5 ms-33.2 ms), namely 2 pixels/8.3 ms. At 49.8ms, a Vsync signal arrives, the display interface of the terminal device is changed from frame 3 to frame 4, the moving speed of the triangle is 2 pixels/8.3 ms, and the sliding speed perceived by the user is 2 pixels/(49.8 ms-41.5 ms), namely 2 pixels/8.3 ms. At 58.1ms, the Vsync signal comes in, the display interface of the terminal device is changed from frame 4 to frame 5, the moving speed of the triangle is 1pixel/8.3ms, and the sliding speed perceived by the user is 1 pixel/(58.1 ms-49.8 ms), namely 1pixel/8.3ms. At 66.4ms, the Vsync signal comes in, the display interface of the terminal device changes from frame 5 to frame 6, the moving speed of the triangle is 1pixel/8.3ms, and the sliding speed perceived by the user is 1 pixel/(66.4 ms-58.1 ms), namely 1pixel/8.3ms.
In FIG. 5, the sliding speed changes from 2 pixels/16.6ms to 2 pixels/8.3ms and then to 1 pixel/8.3ms. Because the sliding speed changes, the user perceives stutter, and the user experience is poor.
In summary, the screen refresh rate of the terminal device changes during the sliding process, and the frame interval corresponding to the rendering of the image may be greater than or less than the frame interval corresponding to the display of the image, which causes the sliding speed to jump (rise or fall) during the display, resulting in the image being stuck.
In view of this, embodiments of the present application provide a data processing method, when frame rate switching occurs in a terminal device, an application frame rate is switched first, and switching of a composition frame rate and a screen refresh rate is delayed. Therefore, the frame interval corresponding to the image display is consistent with the frame interval corresponding to the frame image drawing and rendering, the jump of the sliding speed during display is reduced, the pause phenomenon is further reduced, the picture display is uniform and smooth, and the user experience is improved.
An application scenario provided by the embodiment of the present application is described below with reference to the drawings. Fig. 7 is a schematic view of an application scenario provided in the embodiment of the present application.
When the terminal device receives an upward slide operation on the list interface shown in a in fig. 7, it enters the list interface shown in b in fig. 7. In comparison with the list interface shown in a of fig. 7, the contents of the list corresponding display in the list interface shown in b of fig. 7 are changed.
In the process of list sliding display, the terminal device may also display other contents, and further the terminal device may increase the screen refresh rate to improve the fluency of display. Other content may be a pop-up reminder 701 in the interface shown in c in fig. 7, a screenshot animation 702 in the interface shown in d in fig. 7, and so on.
In one possible scenario, after the terminal device detects an up event in the sliding operation in the list interface in a in fig. 7, the sliding list display is continued based on the initial speed at the time of the up event. The terminal device may reduce the screen refresh rate in the display following the up event to reduce power consumption; when the terminal device displays other content on the list interface, the screen refresh rate may be switched based on the method of the embodiment of the present application to improve the fluency of the list interface. Other content includes, but is not limited to: pop-up window reminders, screen capture animations, charge reminders, and the like. The pop-up window reminding comprises: and message reminding of the application, such as short message reminding, weChat reminding and the like.
The types of input events corresponding to the slide operation may be classified into a press (down), a move (move), and a lift (up).
It is understood that the list interface of the embodiment of the present application may be an interface of a social application, a setting-related interface, a document interface, or a product browsing interface. The terminal device may receive a sliding operation of the user on the list interface, where the sliding operation may be an upward sliding operation, a downward sliding operation, a leftward sliding operation, or a rightward sliding operation.
It can be understood that the data processing method provided in the embodiment of the present application may also be applied to other scenes related to speed, and is not limited herein. In addition, the method provided by the embodiment of the application can be applied to a constant speed scene, a speed reduction scene and a speed increase scene. The embodiment of the present application is not limited to the change of the speed.
For convenience of understanding, the following describes, with reference to fig. 8 and 9, a process of interaction between the respective modules involved in the data processing method provided in the embodiment of the present application.
Fig. 8 is a schematic diagram illustrating a process of interaction between modules in the data processing method according to the embodiment of the present application.
As shown in fig. 8, the system may include: an application, a frame rate control system, an image composition system (SurfaceFlinger), a window manager, a hardware compositor, and a display driver. The application includes an application main thread and an application rendering thread. The image composition system includes a Vsync thread, a cache thread, and a composition thread. The application main thread may also be referred to as a logical thread or a UI thread.
Take the case where the terminal device performs a pop-up window reminder during sliding away from the hand as an example. When the touch panel receives a touch operation, the kernel layer processes the touch operation into an original input event (including information such as touch coordinates, touch force, and a time stamp of the touch operation). The original input events are stored at the kernel layer. The kernel layer reports the original input event to the input manager of the application program framework layer through the input processing library. The input manager of the application framework layer parses the information of the original input event (including the operation type, the report point position, and the like), determines the focus application according to the current focus, and sends the parsed information of the original input event (such as a down event, a move event, an up event, and the like) to the composition thread.
When the terminal device receives the sliding operation while displaying at the first frame rate, the composition thread receives the down event, the move event, and the up event sent by the input manager. When the composition thread receives the down event sent by the input manager, it sends a message for indicating switching of the screen refresh rate to the frame rate control system, and the message carries a second frame rate. Illustratively, the first frame rate is 60Hz and the second frame rate is 120Hz.
After a preset duration following receipt of the up event, the composition thread sends a message for indicating switching of the screen refresh rate to the frame rate control system, and the message carries the first frame rate.
S801, when the terminal equipment performs popup reminding, the cache thread receives cache enqueues of other layers except the focus window layer.
It should be noted that the focus window layer may be understood as the layer corresponding to the list in the display interface. The other layers except the focus window layer may be understood as the layer corresponding to the pop-up window reminder, corresponding to the pop-up window of the display interface.
In this embodiment of the present application, the focus window layer and the other layers may correspond to the same application, or may also correspond to different applications, which is not limited herein.
Fig. 9 is a schematic diagram of a display flow provided in an embodiment of the present application. As shown in fig. 9, if the first frame rate is 60Hz, the second frame rate is 120Hz. Also, in fig. 9, the contents displayed by the terminal device correspond to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 in the order of time.
In FIG. 9, at 16.6ms, the Vsync-APP signal arrives, and the application main thread starts drawing and rendering frame 3. During the period of 16.6ms-33.2ms, the cache thread of the terminal device receives the cache enqueue of layers other than the focus window layer.
S802, the cache thread sends a message for indicating the screen refresh rate switching to the frame rate control system, and the message carries the second frame rate.
Illustratively, in the process shown in fig. 9, during the period from 16.6ms to 33.2ms, the cache thread receives a cache enqueue of layers other than the focus window layer (such as the pop-up window reminder in fig. 9), and the cache thread sends a message for indicating switching of the screen refresh rate to the frame rate control system, where the message carries the second frame rate, and the second frame rate is 120Hz.
S803, the frame rate control system determines that the ratio of the second frame rate to the first frame rate is an integer.
For example, in the process shown in fig. 9, the frame rate control system determines that the ratio of the second frame rate to the first frame rate is 2, which is an integer.
S804, the frame rate control system queries the window manager for the focus window. Correspondingly, the window manager feeds back the application package name and the focus window layer corresponding to the focus window to the frame rate control system. The focus window is the window corresponding to the application running in the foreground of the terminal device.
For example, in the flow shown in fig. 9, the focus window is a window corresponding to an input event.
It can be understood that S803 and S804 may be executed simultaneously or not, and the order of S803 and S804 is not limited in this embodiment of the application.
In a possible implementation manner, the frame rate control system may determine the focus application according to the application package name corresponding to the focus window, and further determine whether the focus application is in a pre-stored delayed switching application list. If the focus application is in the delayed switching application list, the following steps S805 to S830 are performed (see the sketch after this paragraph). If the focus application is not in the delayed switching application list, the application frame rate and the screen refresh rate are switched synchronously.
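A minimal Java sketch of this check is given below; the list contents and package names are assumptions for illustration only, and the actual list is pre-stored by the frame rate control system.

import java.util.Set;

public class DelayedSwitchPolicySketch {
    // Illustrative entries only.
    static final Set<String> DELAYED_SWITCH_APPS = Set.of("com.tencent.mm", "com.example.listapp");

    static boolean useDelayedSwitch(String focusPackageName) {
        return DELAYED_SWITCH_APPS.contains(focusPackageName);
    }

    public static void main(String[] args) {
        System.out.println(useDelayedSwitch("com.tencent.mm"));    // true: delay the refresh rate switch
        System.out.println(useDelayedSwitch("com.example.game"));  // false: switch synchronously
    }
}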
S805, the frame rate control system queries the cache thread, according to the focus window, for the number of buffers in the cache queue corresponding to the focus window; correspondingly, the cache thread feeds back the buffer amount to the frame rate control system.
It should be noted that the buffers in the buffer queue include one or more of the following: a cached buffer (queued buffer), a buffer being rendered (dequeued buffer), a buffer being composited (acquired buffer), and an unused buffer (free buffer).
A cached buffer may be understood as a buffer in which a rendered image is stored. A buffer being rendered may be understood as a buffer storing an image that the application is drawing and rendering. A buffer being composited may be understood as a buffer whose image is being composited by the composition thread. An unused buffer may be understood as a buffer that does not store an image.
For example, taking the total number of buffers in the buffer queue as 20 as an example, at 17ms in fig. 9, the application is drawing and rendering frame 3, the buffer being rendered is frame 3, and the number is 1; the number of cached buffers in the buffer queue is 0; the composition thread is compositing frame 2, the buffer being composited is frame 2, and the number is 1; the number of unused buffers is 20-1-0-1, i.e., 18.
At 30ms in fig. 9, the application is not drawing or rendering, and the number of buffers being rendered is 0; the cached buffer is frame 3, and the number is 1; the composition thread is not compositing, and the number of buffers being composited is 0; the number of unused buffers is 20-0-1-0, i.e., 19.
In the embodiment of the present application, the buffer amount is the sum of the number of cached buffers (queued buffers) and the number of buffers being rendered (dequeued buffers).
For example, in the flow shown in fig. 9, the rendering is finished for frame 3, the number of buffered buffers is 1, the number of buffers being rendered is 0, the number of buffers is 1, and the buffer thread feeds back the buffer number to the frame rate control system as 1.
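A minimal Java sketch of the buffer counting used in S805 is given below; the state names follow the description above, and the numbers reproduce the example of frame 3 (1 cached buffer, 0 buffers being rendered). The structure is an assumption for illustration and is not the actual cache queue implementation.

import java.util.EnumMap;
import java.util.Map;

public class BufferCountSketch {
    // Buffer states as described above.
    enum BufferState { FREE, DEQUEUED /* being rendered */, QUEUED /* cached */, ACQUIRED /* being composited */ }

    // Buffer amount = cached buffers + buffers being rendered.
    static int bufferCount(Map<BufferState, Integer> queue) {
        return queue.getOrDefault(BufferState.QUEUED, 0) + queue.getOrDefault(BufferState.DEQUEUED, 0);
    }

    public static void main(String[] args) {
        Map<BufferState, Integer> queue = new EnumMap<>(BufferState.class);
        queue.put(BufferState.QUEUED, 1);     // frame 3 has been rendered and cached
        queue.put(BufferState.DEQUEUED, 0);   // nothing is being drawn at this moment
        queue.put(BufferState.ACQUIRED, 0);
        queue.put(BufferState.FREE, 19);
        System.out.println(bufferCount(queue));   // prints 1, matching the example above
    }
}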
S806, the frame rate control system determines, according to the buffer amount, the delay duration M by which the switching of the screen refresh rate to the second frame rate lags behind the switching of the application frame rate to the second frame rate. M satisfies the following formula: M = (buffer amount + 1) × Vsync period corresponding to the first frame rate.
When the ratio of the second frame rate to the first frame rate is an integer, the delay duration M = (buffer amount + 1) × Vsync period corresponding to the first frame rate. When the number of buffers is 1, the delay duration M is 2 × Vsync period corresponding to the first frame rate.
Illustratively, in the flowchart shown in fig. 9, the frame rate control system determines that the delay time for switching the screen refresh rate to the second frame rate is 2 × 16.6ms, i.e., 33.2ms, compared with the delay time for switching the application frame rate to the second frame rate.
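A small Java sketch of the delay computation in S806 is given below, under the assumption already stated that the ratio of the second frame rate to the first frame rate is an integer; the helper names are illustrative. The exact period of 60Hz is about 16.67ms here, whereas the text uses the rounded value 16.6ms (hence 33.2ms rather than 33.3ms).

public class DelayDurationSketch {
    static double vsyncPeriodMs(double frameRateHz) {
        return 1000.0 / frameRateHz;
    }

    // M = (buffer amount + 1) x Vsync period corresponding to the first frame rate.
    static double delayMs(int bufferCount, double firstFrameRateHz) {
        return (bufferCount + 1) * vsyncPeriodMs(firstFrameRateHz);
    }

    public static void main(String[] args) {
        System.out.println(delayMs(1, 60));   // about 33.3 ms (2 Vsync periods), the Fig. 9 case
        System.out.println(delayMs(2, 60));   // about 50 ms (3 Vsync periods), the Fig. 11 case
    }
}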
S807, the frame rate control system sends a first message to the Vsync thread; the first message is used for indicating that the application frame rate is switched to the second frame rate.
The application frame rate refers to a frame rate corresponding to application rendering.
In a possible implementation, the first message carries the second frame rate. Illustratively, the second frame rate is 120Hz.
S809, after receiving the first message, the Vsync thread stores the Vsync signal period corresponding to the second frame rate.
It is understood that, after receiving the first message, the Vsync thread stores the Vsync signal period corresponding to the application frame rate as the Vsync period corresponding to the second frame rate, and subsequent Vsync-APP signals are generated at the cadence corresponding to the second frame rate.
And S810, the frame rate control system sleeps after sending a first message to the Vsync thread.
The terminal device may further perform S811 while the terminal device performs the above S802-S809 or before performing the above S802.
S811, the application main thread sends a Vsync-APP request to the Vsync thread.
And S812, the Vsync thread generates Vsync-APP according to the Vsync signal period corresponding to the first frame rate.
Illustratively, in fig. 9, the frame rate of the application is decided to be switched between 16.6ms and 33.2ms. The Vsync thread generates Vsync-APP with a period of the Vsync signal corresponding to 60Hz (16.6 ms). This Vsync-APP signal is sent to the application main thread at 33.2ms.
S813, the application main thread calculates the frame interval according to the timestamp of the Vsync-APP when receiving the Vsync-APP.
Specifically, the application main thread calculates the difference between the timestamp of the Vsync-APP signal received this time and the timestamp of the Vsync-APP signal received last time; the difference is the frame interval corresponding to the drawing and rendering of the previous frame.
Illustratively, in fig. 9, the timestamp of the Vsync-APP signal received this time is 33.2ms, and the timestamp of the Vsync-APP signal received last time is 16.6ms. And calculating the frame interval corresponding to the drawing and rendering of the frame 3 by applying the main thread to be 33.2ms-16.6ms, namely 16.6ms.
S814, the application main thread calculates the displacement amount.
S815, the application main thread sends the displacement of the current frame to an application rendering thread so as to wake up the application rendering thread.
In a possible implementation, the displacement is the product of the frame interval and the speed. It should be noted that the application main thread may determine the speed based on a pre-stored speed curve. Illustratively, taking the Vsync-APP signal at 33.2ms in fig. 9 and a speed of 2 pixels/16.6ms as an example, the current frame is frame 4, and the displacement of frame 4 is the product of the frame interval (16.6ms) corresponding to the drawing and rendering of frame 3 and the speed 2 pixels/16.6ms, i.e., 2 pixels. The application main thread sends the displacement (2 pixels) of the current frame to the application rendering thread.
In a possible implementation, the application main thread does not perform S813-S815. The application main thread sends the speed of the current frame, the timestamp of the current Vsync-APP and the timestamp of the previous frame Vsync-APP to the application rendering thread to wake up the application rendering thread. Illustratively, the application main thread sends the speed (2 pixel/16.6 ms), the Vsync-APP timestamp of the current frame (33.2 ms), and the timestamp of the previous frame Vsync-APP (16.6 ms) to the application rendering thread.
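The calculation performed by the application main thread in S813-S815 may be illustrated with the following Java sketch; the class and field names are assumptions, and the speed is taken as the constant 2 pixels/16.6ms used in the example above.

public class MainThreadSketch {
    private double previousVsyncAppMs = 16.6;         // timestamp of the previous Vsync-APP signal
    private final double speedPxPerMs = 2.0 / 16.6;   // speed from a pre-stored speed curve (assumed constant)

    // Called on each Vsync-APP: compute the frame interval (S813) and the displacement (S814);
    // the returned displacement is what would be sent to the application rendering thread (S815).
    double onVsyncApp(double vsyncAppTimestampMs) {
        double frameIntervalMs = vsyncAppTimestampMs - previousVsyncAppMs;
        double displacementPx = speedPxPerMs * frameIntervalMs;
        previousVsyncAppMs = vsyncAppTimestampMs;
        return displacementPx;
    }

    public static void main(String[] args) {
        MainThreadSketch mainThread = new MainThreadSketch();
        System.out.println(mainThread.onVsyncApp(33.2));   // frame 4: 16.6 ms interval, about 2 pixels
        System.out.println(mainThread.onVsyncApp(41.5));   // frame 5: 8.3 ms interval, about 1 pixel
    }
}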
And S816, the application rendering thread is awakened after receiving the displacement or the timestamp of the Vsync-APP signal, and a rendered image is drawn.
In a possible implementation manner, the application rendering thread is awakened after receiving the displacement amount, and starts to draw the rendered image.
S817, after the application rendering thread is awakened, it requests a buffer from the cache thread to store the rendered image.
Correspondingly, after receiving the buffer request sent by the application rendering thread, the cache thread reserves a space for storing the rendered image, and sends an instruction for indicating buffer dequeue to the application rendering thread.
And S818, after receiving the instruction for indicating the dequeue of the cache, the application rendering thread draws a rendering image according to the displacement.
S819, the application rendering thread sends the rendered image to a cache thread (cache enqueue).
S820, the cache thread requests the Vsync-SF signal from the Vsync thread after receiving the rendered image sent by the application rendering thread.
After the above S814, the terminal device may further execute S821 when executing S814-S820.
It is understood that the terminal device may simultaneously perform rendering, composition, display, and the like. Illustratively, as shown in FIG. 9, frame 4 is rendered, frame 2 is displayed in combination with frame 3 and the pop-up window reminder.
S821, the application main thread sends a Vsync-APP request to the Vsync thread.
And S822, the Vsync thread generates Vsync-APP according to the Vsync signal period corresponding to the second frame rate, and sends the Vsync-APP to the application main thread.
For example, in the flow shown in fig. 9, the application main thread sends a Vsync-APP request to the Vsync thread during 33.2ms-41.5ms; the Vsync thread generates Vsync-APP according to the Vsync signal period (8.3ms) corresponding to the second frame rate, and sends the Vsync-APP to the application main thread at 33.2ms + 8.3ms, i.e., 41.5ms.
It is to be understood that, in general, the execution time of S822 is after the execution time of S819.
It is understood that the application main thread calculates the displacement amount upon receiving the Vsync-APP in S822, and the application rendering thread draws the rendered image. After the image is drawn and rendered, it is sent to the cache queue of the cache thread to wait for composition. The cache thread requests the Vsync-SF signal from the Vsync thread after receiving the rendered image sent by the application rendering thread.
After the application main thread calculates the displacement amount following the above S822, the application main thread may further execute S823 to continue requesting Vsync-APP.
S823, the application main thread sends a Vsync-APP request to the Vsync thread.
And S824, generating Vsync-APP by the Vsync thread according to the Vsync signal period corresponding to the second frame rate, and sending the Vsync-APP to the application main thread. In addition, the Vsync thread generates Vsync-SF in a Vsync signal period corresponding to the first frame rate, and transmits the Vsync-SF to the composition thread.
It will be appreciated that the Vsync thread generates a Vsync-APP signal in S824 in response to the Vsync-APP request in S823 described above. The Vsync thread generates a Vsync-SF signal in S824 in response to the Vsync-SF request of S820 described above.
Illustratively, as shown in FIG. 9, the Vsync thread sends the Vsync-APP signal to the application main thread at 49.8ms, and the Vsync thread sends the Vsync-SF signal to the composition thread at 49.8ms.
S825, after receiving the Vsync-SF signal, the composition thread composes an image.
In a possible implementation, the composition thread may also query the window manager for the focus application after receiving the Vsync-SF signal. And the synthesis thread inquires a cache queue corresponding to the focus application in the cache thread according to the focus application so as to confirm the image needing to be synthesized.
It is understood that, in S824, the generation timing of the Vsync-APP signal coincides with the generation timing of the Vsync-SF signal. Thus, the application draws and renders an image while the composition thread composites an image. Specifically, after receiving the Vsync-APP in S824, the application main thread calculates the displacement amount, and the application rendering thread draws the rendered image. After the image is drawn and rendered, it is sent to the buffer queue to wait for composition. The cache thread requests the Vsync-SF signal from the Vsync thread after receiving the rendered image sent by the application rendering thread.
Illustratively, as shown in fig. 9, at 49.8ms, the composition thread is at composition frame 4 and the application main thread is drawing rendering frame 6.
S826, the composition thread sends the synthesized image to the hardware synthesizer, and the hardware synthesizer sends the synthesized image to the display driver for display.
S827, when the Vsync-HW signal arrives, the display driver displays the synthesized image.
S828, after the frame rate control system has slept for the duration M, the sleep ends.
For example, in the flow shown in fig. 9, the sleep duration of the frame rate control system is 33.2ms, and 33.2ms after S810 falls within the period of 49.8ms to 66.4ms. The sleep therefore ends while the terminal device executes S825 (49.8ms-66.4ms), that is, while frame 4 is being synthesized.
S829, the frame rate control system sends a second message to the Vsync thread, where the second message is used to indicate that both the composition frame rate and the screen refresh rate are switched to the second frame rate.
And S830, after receiving the second message, the Vsync thread stores a Vsync signal period corresponding to the second frame rate.
It is to be understood that the Vsync thread stores the Vsync signal period corresponding to the synthesized frame rate as the Vsync period corresponding to the second frame rate after receiving the second message. Subsequent Vsync-SF signals are generated at a cadence corresponding to the second frame rate.
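A simplified Java sketch of this period bookkeeping is given below; the structure is an assumption for illustration and is not the actual Vsync thread implementation. It shows that the Vsync-APP period can be switched by the first message (S807/S809) while the Vsync-SF period is switched only by the second message (S829/S830).

public class VsyncThreadSketch {
    private double vsyncAppPeriodMs = 16.6;   // first frame rate: 60 Hz
    private double vsyncSfPeriodMs = 16.6;

    // First message: only the application frame rate is switched to the second frame rate.
    void onFirstMessage(double secondFrameRateHz) {
        vsyncAppPeriodMs = 1000.0 / secondFrameRateHz;
    }

    // Second message: the composition frame rate (and screen refresh rate) is switched as well.
    void onSecondMessage(double secondFrameRateHz) {
        vsyncSfPeriodMs = 1000.0 / secondFrameRateHz;
    }

    double nextVsyncAppMs(double lastVsyncAppMs) { return lastVsyncAppMs + vsyncAppPeriodMs; }
    double nextVsyncSfMs(double lastVsyncSfMs)   { return lastVsyncSfMs + vsyncSfPeriodMs; }

    public static void main(String[] args) {
        VsyncThreadSketch vsync = new VsyncThreadSketch();
        vsync.onFirstMessage(120);
        System.out.println(vsync.nextVsyncAppMs(33.2));   // about 41.5 ms: Vsync-APP already at the 8.3 ms cadence
        System.out.println(vsync.nextVsyncSfMs(49.8));    // 66.4 ms: Vsync-SF still at the 16.6 ms cadence
        vsync.onSecondMessage(120);
        System.out.println(vsync.nextVsyncSfMs(66.4));    // about 74.7 ms: Vsync-SF now at the 8.3 ms cadence
    }
}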
Illustratively, in the flow shown in fig. 9, the Vsync-SF signal is generated at 66.4ms, and the composition thread composites frame 5 (i.e., the image drawn and rendered in response to the Vsync signal in S822). The next Vsync-SF signal is generated at 66.4ms + 8.3ms, i.e., 74.7ms, and the composition thread composites frame 6.
S831, after receiving the second message, the Vsync thread sends the second frame rate to the hardware synthesizer.
And S832, the hardware synthesizer receives the second frame rate and sends the second frame rate to the display driver.
S833, the display driver drives the screen to switch the screen refresh rate to the second frame rate.
It will be appreciated that the screen generates the Vsync-HW signal in Vsync periods corresponding to the second frame rate.
Illustratively, in the flow shown in fig. 9, the Vsync-HW signal is generated at 66.4ms and frame 4 is displayed on the screen. The Vsync-HW signal is generated at 66.4ms +8.3ms, i.e. 74.7ms, and frame 5 is displayed on the screen.
S834, the Vsync thread generates Vsync-SF according to the Vsync signal period corresponding to the first frame rate, and sends the Vsync-SF to the composition thread.
It is understood that the terminal device responds here to the Vsync-SF signal requested after the above S822 and before S824, and sends the Vsync-SF to the composition thread. The composition thread receives the Vsync-SF signal and composites the image, and then sends the synthesized image to the display driver for display.
Illustratively, as shown in FIG. 9, the Vsync thread sends a Vsync-SF signal at 66.4ms to the composition thread, which composes frame 5.
S835, the Vsync thread generates Vsync-SF according to the Vsync signal period corresponding to the second frame rate, and sends the Vsync-SF to the composition thread.
It will be appreciated that the terminal device responds here to the Vsync-SF signal requested after the above S824, and sends the Vsync-SF to the composition thread. The composition thread, upon receiving the Vsync-SF signal, composites the image and then sends the synthesized image to the display driver for display.
Illustratively, as shown in FIG. 9, the Vsync thread sends a Vsync-SF signal to the composition thread at 74.7ms, which composes frame 6.
It should be noted that, when the terminal device displays the screen capture animation or performs the charging reminder in the process of sliding away from the hand, the implementation process of the terminal device is similar to the process shown in fig. 8, and details are not repeated here.
The data processing method according to the embodiment of the present application will be described in detail with reference to specific embodiments. The following embodiments may be combined with each other and may not be described in detail in some embodiments for the same or similar concepts or processes.
Fig. 10 is a flowchart illustrating a data processing method according to an embodiment of the present application. As shown in fig. 10, the method may include:
S1001, determining that the second frame rate is an integral multiple of the first frame rate.
Specifically, the frame rate control system determines that the second frame rate is an integer multiple of the first frame rate.
In a possible implementation manner, after receiving the message carrying the second frame rate sent by the composition thread, the frame rate control system determines that the second frame rate is an integral multiple of the first frame rate.
In the embodiment of the application, the first frame rate is the frame rate before the screen refresh rate is switched; the second frame rate is the frame rate after the screen refresh rate is switched.
It is understood that the ratio of the second frame rate to the first frame rate is an integer. Illustratively, the first frame rate may be 60Hz and the second frame rate may be 120Hz; or the first frame rate may be 40Hz and the second frame rate may be 120Hz. This is not limited herein.
S1002, inquiring the buffer number in the buffer queue.
Specifically, the frame rate control system queries the number of buffers in the buffer queue. The buffer amount refers to the sum of the number of cached buffers (queued buffers) and the number of buffers being rendered (dequeued buffers).
In a possible implementation manner, the focus window is queried, and the cache number in the cache queue corresponding to the focus window is queried based on the application package name corresponding to the focus window. Illustratively, the frame rate control system retrieves the focus application from the window manager. And querying a cache queue corresponding to the focus application in the image synthesis system according to the focus application, and further confirming the cache number in the corresponding cache queue.
In this way, the number of Vsync cycles by which each frame of image lags from drawing and rendering to display can be determined, which facilitates the subsequent determination of the delay duration for switching the screen refresh rate.
S1003, determining, based on the buffer amount, the delay duration by which the switching of the screen refresh rate to the second frame rate lags behind the switching of the application frame rate to the second frame rate.
Specifically, the frame rate control system determines, based on the buffer amount, the delay duration by which the switching of the screen refresh rate to the second frame rate lags behind the switching of the application frame rate to the second frame rate.
In the embodiment of the present application, the delay duration may be the sum of a first duration and a second duration. The first duration is the difference between the time of the first Vsync-HW signal after the frame rate control system receives the message carrying the second frame rate sent by the composition thread and the time at which the frame rate control system receives that message.
The second duration is (buffer amount + 1) × the Vsync cycle duration corresponding to the first frame rate.
In a possible implementation, the delay duration satisfies: (buffer count + 1) × Vsync cycle duration corresponding to the first frame rate.
It can be understood that each frame of image in the terminal device lags by 2 Vsync periods from drawing and rendering to display. Each additional buffer queued in the buffer queue increases the number of Vsync periods from drawing and rendering to display of each frame of image by 1.
Delaying the switching of the screen refresh rate by (buffer amount + 1) × the Vsync period duration corresponding to the first frame rate keeps the rhythm at which an image is drawn and rendered consistent with the rhythm at which that frame of image is displayed, so that the speed at which the image is displayed matches the expected speed, the jump phenomenon is reduced, and stutter is reduced.
And S1004, switching the application frame rate to a second frame rate.
Specifically, the frame rate control system controls the image synthesis system to switch the application frame rate to the second frame rate.
It will be appreciated that the application frame rate is controlled by the Vsync-APP signal. Switching of the application frame rate to the second frame rate is achieved by controlling the time interval of adjacent Vsync-APP signals. The adjacent time intervals become Vsync period durations corresponding to the second frame rate.
And S1005, after delaying the time length, switching the screen refreshing rate to a second frame rate.
Specifically, after the delay time, the frame rate control system controls the display driver to switch the screen refresh rate to the second frame rate through the image synthesis system.
It will be appreciated that the screen refresh rate is controlled by the Vsync-HW signal. Switching of the screen refresh rate to the second frame rate is achieved by controlling the time interval of adjacent Vsync-HW signals. The adjacent time intervals become Vsync period durations corresponding to the second frame rate.
In a possible implementation, the switching time of the composite frame rate is consistent with the switching time of the screen refresh rate.
It will be appreciated that the composite frame rate is controlled by the Vsync-SF signal. Switching of the combined frame rate to the second frame rate is achieved by controlling the time interval of adjacent Vsync-SF signals. The adjacent time intervals become Vsync period durations corresponding to the second frame rate.
In summary, when the frame rate is switched, the application frame rate is switched first and the screen refresh rate is switched later, so that the display interval between the Mth frame and the (M−1)th frame is consistent with the drawing and rendering interval. The display rhythm of the image is thus consistent with its drawing and rendering rhythm, which reduces the sliding speed jump caused by inconsistency between the display interval and the drawing and rendering interval, reduces stutter, and improves the user experience.
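A compact end-to-end Java sketch of the method of fig. 10 is given below, under the assumptions stated above (the ratio of the second frame rate to the first frame rate is an integer, and the delay is (buffer amount + 1) × the Vsync period of the first frame rate); the function names are illustrative only.

public class FrameRateSwitchSketch {
    static void switchFrameRate(int firstRateHz, int secondRateHz, int bufferCount) {
        if (secondRateHz % firstRateHz != 0) {                // S1001: the ratio must be an integer
            System.out.println("ratio is not an integer, switch synchronously");
            return;
        }
        double firstPeriodMs = 1000.0 / firstRateHz;
        double delayMs = (bufferCount + 1) * firstPeriodMs;   // S1002/S1003: delay from the buffer amount
        System.out.println("switch the application frame rate to " + secondRateHz + " Hz now");   // S1004
        System.out.printf("switch the composition frame rate and screen refresh rate after %.1f ms%n", delayMs);   // S1005
    }

    public static void main(String[] args) {
        switchFrameRate(60, 120, 1);   // the Fig. 9 scenario: a delay of about 33 ms
    }
}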
Next, the case where the second frame rate is higher than the first frame rate will be described. Fig. 9 and 11 correspond to a display flow when the second frame rate is greater than the first frame rate.
Fig. 9 is a schematic diagram of an interface display processing flow provided in an embodiment of the present application. Take the scenario that the list slides at a constant speed, the screen refresh rate is switched from 60Hz to 120Hz, and the sliding speed is 2 pixels/16.6 ms as an example. In fig. 9, the contents displayed by the terminal device correspond to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 in the order of time.
As shown in fig. 9, when rendering is performed on frame 3, the terminal device receives a popup prompt and decides to switch the application frame rate. After the switching, the interval of the Vsync-APP signal becomes the Vsync period (8.3 ms) corresponding to the second frame rate. During 16.6ms-33.2ms, when deciding to apply frame rate switching, the buffer amount is 1. The screen refresh rate switch is decided after (1 + 1) × 16.6, i.e. 33.2ms. Thus during 49.8ms-66.4ms, the screen refresh rate switch is decided. After the switching, both the intervals of the Vsync-SF signal and the intervals of the Vsync-HW signal become Vsync periods (8.3 ms) corresponding to the second frame rate.
The generation times of the Vsync-APP signal after the decision application frame rate switching (the time stamp of the Vsync-APP) are 33.2ms, 41.5ms, and 49.8ms, respectively. When rendering is started at the 33.2ms frame 4, frame rate switching is not completed due to the application frame rate. Therefore, the amount of displacement of the frame 2, the amount of displacement of the frame 3, and the amount of displacement of the frame 4 are 2 pixels in relation to the application frame rate (first frame rate) before switching. At 41.5ms, the application frame rate completes the frame rate switch. The shift amount of the frame 5 and the shift amount of the frame 6 are both related to the application frame rate (second frame rate) after switching, and are 1pixel.
The generation times of the Vsync-HW signal after the decision screen refresh rate is switched (time stamp of Vsync-HW) are 66.4ms, 74.7ms, and 83ms, respectively; the screen refresh rate does not complete the frame rate switching at 66.4ms, and the slip speed calculation at 66.4ms is related to the screen refresh rate before switching (first frame rate). The screen refresh rate completes the frame rate switching at 74.7ms, and the sliding speed calculation of 74.7ms is related to the screen refresh rate after switching (second frame rate).
In fig. 9, at 16.6ms, the display interface of the terminal device changes from frame 0 to frame 1, and the sliding speed is 2 pixels/16.6 ms. At 33.2ms, the display interface of the terminal equipment is changed from frame 1 to frame 2, and the sliding speed is 2 pixels/16.6 ms. At 49.8ms, the display interface of the terminal equipment is changed from frame 2 to frame 3, and the sliding speed is 2 pixels/16.6 ms. At 66.4ms, the display interface of the terminal equipment is changed from frame 3 to frame 4, and the sliding speed is 2pixel/16.6ms. At 74.7ms, the display interface of the terminal device changes from frame 4 to frame 5, and the sliding speed is 1pixel/8.3ms. At 83ms, the display interface of the terminal device changes from frame 5 to frame 6, and the sliding speed is 1pixel/8.3ms. When the pictures are switched, the speed is consistent, and the pictures are smoothly displayed without pause.
Fig. 11 is a schematic diagram of a display process provided in an embodiment of the present application. As shown in FIG. 11, in the scenario where the list slides at a constant speed, the screen refresh rate is switched from 60Hz to 120Hz, and the sliding speed is 2 pixels/16.6 ms. In fig. 11, the contents displayed by the terminal device correspond to frame-1, frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 in the order of time.
As shown in fig. 11, when the terminal device renders frame 3, it decides to apply frame rate switching. After the switching, the interval of the Vsync-APP signal becomes the Vsync period (8.3 ms) corresponding to the second frame rate. During 16.6ms-33.2ms, when the frame rate switching is decided to be applied, the buffer amount is 2. The screen refresh rate switch is decided after (2 + 1) × 16.6, i.e. 49.8ms. Thus during 66.4ms-83ms, the screen refresh rate switch is decided. After the switching, both the intervals of the Vsync-SF signal and the intervals of the Vsync-HW signal become Vsync periods (8.3 ms) corresponding to the second frame rate.
The generation time of the Vsync-APP after the application frame rate is switched is determined to be 33.2ms, 41.5ms and 49.8ms respectively; when rendering is started at the 33.2ms frame 4, frame rate switching is not completed due to the application frame rate. Therefore, the amount of displacement of the frame 2, the amount of displacement of the frame 3, and the amount of displacement of the frame 4 are 2 pixels in relation to the application frame rate (first frame rate) before switching. At 41.5ms, the application frame rate completes the frame rate switch. The shift amount of the frame 5 and the shift amount of the frame 6 are both related to the application frame rate (second frame rate) after switching, and are 1pixel.
The generation time of the Vsync-HW after the refresh rate of the decision screen is switched is 83ms, 91.3ms and 99.6ms; the screen refresh rate does not complete the frame rate switching at 83ms, and the slip rate calculation at 83ms is related to the screen refresh rate before switching (first frame rate). The screen refresh rate completes the frame rate switching at 91.3ms, and the sliding speed calculation of 91.3ms is related to the screen refresh rate after switching (second frame rate).
In fig. 11, at 16.6ms, the display interface of the terminal device changes from frame-1 to frame 0, and the sliding speed is 2 pixels/16.6 ms. And when the time is 33.2ms, the display interface of the terminal equipment is changed from frame 0 to frame 1, and the sliding speed is 2pixel/16.6ms. At 49.8ms, the display interface of the terminal equipment is changed from frame 1 to frame 2, and the sliding speed is 2 pixels/16.6 ms. At 66.4ms, the display interface of the terminal device changes from frame 2 to frame 3, and the sliding speed is 2 pixels/16.6 ms. At 83ms, the display interface of the terminal device changes from frame 3 to frame 4, and the sliding speed is 2 pixels/16.6 ms. At 91.3ms, the display interface of the terminal device changes from frame 4 to frame 5, and the sliding speed is 1pixel/8.3ms. At 99.6ms, the display interface of the terminal device changes from frame 5 to frame 6, and the sliding speed is 1pixel/8.3ms. When the pictures are switched, the speed is consistent, and the pictures are displayed smoothly without blockage.
The data processing method according to the embodiments of the present application has been described above; the terminal device provided in the embodiments of the present application for executing the data processing method is described below. Those skilled in the art will understand that the method and the apparatus may be combined with and refer to each other, and that the terminal device provided in the embodiments of the present application can perform the steps of the data processing method above.
Fig. 12 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application. The data processing apparatus may be the terminal device in the embodiments of the present application. The data processing apparatus includes: a display screen 1801 for displaying an image; one or more processors 1802; a memory 1803; a plurality of application programs; and one or more computer programs, wherein the one or more computer programs are stored in the memory 1803 and comprise instructions which, when executed by the data processing apparatus, cause the data processing apparatus to perform the steps of the data processing method described above.
Fig. 13 is a schematic hardware structure diagram of a data processing apparatus according to an embodiment of the present application. Referring to fig. 13, the apparatus includes a memory 1901, a processor 1902 and an interface circuit 1903, and may further include a display screen 1904. The memory 1901, the processor 1902, the interface circuit 1903 and the display screen 1904 may communicate with one another, for example via a communication bus. The memory 1901 is used for storing computer-executable instructions, which are executed under the control of the processor 1902 while the interface circuit 1903 performs communication, so as to implement the data processing method provided by the embodiments of the present application.
Optionally, the interface circuit 1903 may further include a transmitter and/or a receiver. Optionally, the processor 1902 may include one or more CPUs, or may be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the present application may be embodied as being executed directly by a hardware processor, or by a combination of hardware and software modules in the processor.
In a possible implementation manner, the computer-executable instructions in the embodiments of the present application may also be referred to as application program code, which is not specifically limited in the embodiments of the present application.
The data processing apparatus provided in the embodiment of the present application is configured to execute the data processing method in the foregoing embodiment, and the technical principle and the technical effect are similar, which are not described herein again.
The embodiment of the application provides a terminal device, and the structure of the terminal device is shown in fig. 1. The memory of the terminal device may be configured to store at least one program instruction, and the processor is configured to execute the at least one program instruction to implement the technical solutions of the above-mentioned method embodiments. The implementation principle and technical effect are similar to those of the related embodiments of the method, and are not described herein again.
The embodiment of the application provides a chip. The chip comprises a processor for calling a computer program in a memory to execute the technical solution in the above embodiments. The implementation principle and technical effect are similar to those of the related embodiments, and are not described herein again.
The embodiment of the present application provides a computer program product which, when run on a terminal device, enables the terminal device to execute the technical solutions in the above embodiments. The implementation principle and technical effects are similar to those of the related embodiments, and are not described herein again.
The embodiment of the present application provides a computer-readable storage medium, on which program instructions are stored, and when the program instructions are executed by a terminal device, the terminal device is enabled to execute the technical solutions of the above embodiments. The principle and technical effects are similar to those of the related embodiments, and are not described herein again.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above embodiments are only intended to illustrate the technical solutions of the present application and are not to be construed as limiting its scope; any modifications, equivalent substitutions, improvements and the like made on the basis of the embodiments of the present application shall fall within the protection scope of the present application.

Claims (12)

1. A data processing method, applied to a terminal device, wherein the terminal device comprises an application, a frame rate control system, a compositing thread, a cache thread and a display driver, and the method comprises the following steps:
when the application draws a rendering image at a first frame rate, the frame rate control system receives a message which is sent by the cache thread and carries a second frame rate, and the ratio of the second frame rate to the first frame rate is an integer greater than 1;
in response to receiving the message carrying the second frame rate, the frame rate control system controls the application to draw a rendered image at the second frame rate;
at a first moment, the frame rate control system controls the compositing thread to composite the image rendered by the application drawing at the second frame rate, and controls the display driver to drive a screen to display the image composited by the compositing thread at the second frame rate;
the first moment is between the (1+a)-th Vsync-HW signal and the (2+a)-th Vsync-HW signal after the frame rate control system receives the message carrying the second frame rate, where a is the buffer amount in the buffer queue corresponding to the application in the cache thread when the frame rate control system receives the message carrying the second frame rate, and the Vsync-HW signal is used to trigger the screen to display the synthesized image.
2. The method of claim 1, wherein the terminal device further comprises a Vsync thread;
the response to receiving the message carrying the second frame rate, the frame rate control system controlling the application to draw a rendering image at the second frame rate includes:
after the frame rate control system receives the message carrying the second frame rate, the frame rate control system sends a first message to a Vsync thread, wherein the first message is used for indicating that the application frame rate is switched to the second frame rate;
the Vsync thread generating a Vsync-APP signal at the second frame rate based on the first message and sending the Vsync-APP signal to the application;
the application renders a rendered image based on the Vsync-APP signal.
3. The method according to claim 1 or 2, wherein the terminal device further comprises: a Vsync thread;
the step of, at the first moment, the frame rate control system controlling the compositing thread to composite the application-rendered image at the second frame rate, and controlling the display driver to drive the screen to display, at the second frame rate, the image composited by the compositing thread, comprises:
at the first moment, the frame rate control system sends a second message to the Vsync thread, wherein the second message is used for indicating that the composition frame rate and the screen refresh rate are both switched to the second frame rate;
the Vsync thread generating a Vsync-SF signal at the second frame rate based on the second message and sending the Vsync-SF signal to the composition thread;
the compositing thread compositing a rendered image based on the Vsync-SF signal;
the Vsync thread sending a third message to the display driver based on the second message, the third message indicating that the screen refresh rate is switched to the second frame rate;
the display driver controls, based on the third message, the screen to generate a Vsync-HW signal at the second frame rate;
the display driver controls the screen to display the synthesized image upon receiving the Vsync-HW signal.
4. The method according to claim 1 or 2, wherein the difference between the first moment and the moment at which the frame rate control system receives the message carrying the second frame rate satisfies: (buffer amount + 1) × the Vsync period duration corresponding to the first frame rate.
5. The method according to claim 1 or 2, wherein the buffer amount is the sum of the number of rendered buffers in the buffer queue corresponding to the application and the number of buffers being rendered in the buffer queue corresponding to the application.
6. The method according to claim 1 or 2, wherein after the frame rate control system receives the message carrying the second frame rate sent by the cache thread when the application draws and renders the image at the first frame rate, the method further comprises:
the frame rate control system determines the ratio of the second frame rate to the first frame rate;
when the ratio is an integer greater than 1, the frame rate control system acquires the buffer amount from the buffer queue corresponding to the application;
the frame rate control system determines the first moment based on the buffer amount.
7. The method according to claim 6, wherein the step of, when the ratio is an integer greater than 1, the frame rate control system acquiring the buffer amount from the buffer queue corresponding to the application comprises:
the frame rate control system acquires a focus window from a window manager, wherein the focus window corresponds to the application;
and the frame rate control system acquires the buffer amount from the buffer queue corresponding to the application based on the focus window.
8. The method of claim 7, wherein before the frame rate control system receives the message carrying the second frame rate sent by the cache thread when the application draws and renders the image at the first frame rate, the method further comprises:
when the application draws and renders the image at the first frame rate, the cache thread receives a drawn and rendered second image, the second image corresponds to a second window, and the second window is different from the focus window;
and after receiving the second image, the cache thread sends the message carrying the second frame rate to the frame rate control system.
9. The method of claim 8, wherein the second image corresponds to a pop-up reminder of the terminal device, or the second image corresponds to a screenshot animation of the terminal device.
10. A terminal device, characterized in that the terminal device comprises a processor for invoking a computer program in a memory for performing the method according to any of claims 1-9.
11. A computer-readable storage medium, having stored thereon computer instructions, which, when run on a terminal device, cause the terminal device to perform the method of any one of claims 1-9.
12. A chip, characterized in that the chip comprises a processor for calling a computer program in a memory for performing the method according to any of claims 1-9.
CN202210114699.9A 2022-01-30 2022-01-30 Data processing method and related device Active CN114579075B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202210114699.9A CN114579075B (en) 2022-01-30 2022-01-30 Data processing method and related device
CN202211502249.3A CN116521115A (en) 2022-01-30 2022-01-30 Data processing method and related device
PCT/CN2023/071242 WO2023142995A1 (en) 2022-01-30 2023-01-09 Data processing method and related apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210114699.9A CN114579075B (en) 2022-01-30 2022-01-30 Data processing method and related device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202211502249.3A Division CN116521115A (en) 2022-01-30 2022-01-30 Data processing method and related device

Publications (2)

Publication Number Publication Date
CN114579075A CN114579075A (en) 2022-06-03
CN114579075B true CN114579075B (en) 2023-01-17

Family

ID=81772479

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210114699.9A Active CN114579075B (en) 2022-01-30 2022-01-30 Data processing method and related device
CN202211502249.3A Pending CN116521115A (en) 2022-01-30 2022-01-30 Data processing method and related device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202211502249.3A Pending CN116521115A (en) 2022-01-30 2022-01-30 Data processing method and related device

Country Status (2)

Country Link
CN (2) CN114579075B (en)
WO (1) WO2023142995A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114579075B (en) * 2022-01-30 2023-01-17 荣耀终端有限公司 Data processing method and related device
CN117711355A (en) * 2022-08-24 2024-03-15 荣耀终端有限公司 Screen refresh rate switching method and electronic equipment
CN116048833B (en) * 2022-08-31 2023-10-20 荣耀终端有限公司 Thread processing method, terminal equipment and chip system
CN116700654B (en) * 2022-09-15 2024-04-09 荣耀终端有限公司 Image display method, device, terminal equipment and storage medium
CN117891422A (en) * 2022-10-13 2024-04-16 荣耀终端有限公司 Image processing method and electronic device
CN118314000A (en) * 2022-10-17 2024-07-09 荣耀终端有限公司 Image prediction method, device, equipment and storage medium
CN117991961A (en) * 2022-11-07 2024-05-07 荣耀终端有限公司 Data processing method, device and storage medium
CN118138872A (en) * 2022-12-02 2024-06-04 荣耀终端有限公司 Method and device for stabilizing image frames
CN117112086B (en) * 2023-01-31 2024-07-09 荣耀终端有限公司 Data processing method and electronic equipment
CN118426722A (en) * 2023-01-31 2024-08-02 华为技术有限公司 Display method, display device, electronic equipment and storage medium
CN117058291B (en) * 2023-07-12 2024-07-26 荣耀终端有限公司 Video memory switching method and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112052097A (en) * 2020-10-15 2020-12-08 腾讯科技(深圳)有限公司 Rendering resource processing method, device and equipment for virtual scene and storage medium
CN113254120A (en) * 2021-04-02 2021-08-13 荣耀终端有限公司 Data processing method and related device
CN113347466A (en) * 2021-05-18 2021-09-03 深圳市腾讯网络信息技术有限公司 Data processing method, device and storage medium
CN113630572A (en) * 2021-07-09 2021-11-09 荣耀终端有限公司 Frame rate switching method and related device

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8797340B2 (en) * 2012-10-02 2014-08-05 Nvidia Corporation System, method, and computer program product for modifying a pixel value as a function of a display duration estimate
CN103593155B (en) * 2013-11-06 2016-09-07 华为终端有限公司 Display frame generating method and terminal device
CN105611316B (en) * 2014-11-21 2019-05-03 华为终端(东莞)有限公司 A kind of method, apparatus and system adjusting frame per second
CN107066383A (en) * 2017-03-15 2017-08-18 武汉斗鱼网络科技有限公司 A kind of application program smoothness degree detection method and device
KR102545078B1 (en) * 2018-10-01 2023-06-19 삼성전자주식회사 Display apparatus, method for controlling thereof and system
GB2578629B (en) * 2018-11-01 2022-02-23 Samsung Electronics Co Ltd Device and method for processing rendered frames
CN111460342B (en) * 2019-01-21 2023-04-25 阿里巴巴集团控股有限公司 Page rendering display method and device, electronic equipment and computer storage medium
WO2020191685A1 (en) * 2019-03-27 2020-10-01 华为技术有限公司 Frequency adjustment method and apparatus applied to terminal, and electronic device
CN110609645B (en) * 2019-06-25 2021-01-29 华为技术有限公司 Control method based on vertical synchronization signal and electronic equipment
CN113473229B (en) * 2021-06-25 2022-04-12 荣耀终端有限公司 Method for dynamically adjusting frame loss threshold and related equipment
CN113805832B (en) * 2021-09-15 2024-08-06 Oppo广东移动通信有限公司 Image data transmission method, device, terminal and medium
CN113741848B (en) * 2021-09-15 2024-02-23 Oppo广东移动通信有限公司 Image display method, DDIC, display screen module and terminal
CN113781949B (en) * 2021-09-26 2023-10-27 Oppo广东移动通信有限公司 Image display method, display driving chip, display screen module and terminal
CN114579075B (en) * 2022-01-30 2023-01-17 荣耀终端有限公司 Data processing method and related device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112052097A (en) * 2020-10-15 2020-12-08 腾讯科技(深圳)有限公司 Rendering resource processing method, device and equipment for virtual scene and storage medium
CN113254120A (en) * 2021-04-02 2021-08-13 荣耀终端有限公司 Data processing method and related device
CN113347466A (en) * 2021-05-18 2021-09-03 深圳市腾讯网络信息技术有限公司 Data processing method, device and storage medium
CN113630572A (en) * 2021-07-09 2021-11-09 荣耀终端有限公司 Frame rate switching method and related device

Also Published As

Publication number Publication date
CN116521115A (en) 2023-08-01
CN114579075A (en) 2022-06-03
WO2023142995A1 (en) 2023-08-03

Similar Documents

Publication Publication Date Title
CN114579075B (en) Data processing method and related device
CN114579076B (en) Data processing method and related device
CN113726950B (en) Image processing method and electronic equipment
CN113254120B (en) Data processing method and related device
CN113630572B (en) Frame rate switching method and related device
CN114895861A (en) Message processing method, related device and system
CN115048012B (en) Data processing method and related device
CN114089933B (en) Display parameter adjusting method, electronic device, chip and readable storage medium
CN116075808A (en) Image processing method and electronic equipment
CN112516590A (en) Frame rate identification method and electronic equipment
CN114531519A (en) Control method based on vertical synchronization signal and electronic equipment
WO2024156206A9 (en) Display method and electronic device
CN115904184B (en) Data processing method and related device
CN116414337A (en) Frame rate switching method and device
CN115904185A (en) Data processing method and related device
CN115686403A (en) Display parameter adjusting method, electronic device, chip and readable storage medium
CN114740986A (en) Handwriting input display method and related equipment
WO2023124227A9 (en) Frame rate switching method and device
WO2023124225A9 (en) Frame rate switching method and apparatus
WO2024066834A9 (en) Vsync signal control method, electronic device, storage medium and chip
CN116700578B (en) Layer synthesis method, electronic device and storage medium
WO2024159950A1 (en) Display method and apparatus, electronic device, and storage medium
CN116414336A (en) Frame rate switching method and device
CN113973152A (en) Unread message quick reply method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant