CN115686403A - Display parameter adjusting method, electronic device, chip and readable storage medium

Info

Publication number
CN115686403A
Authority
CN
China
Prior art keywords
trigger signal
frequency
display
preset
scene
Prior art date
Legal status
Pending
Application number
CN202110876250.1A
Other languages
Chinese (zh)
Inventor
钟辉
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202110876250.1A
Publication of CN115686403A

Abstract

The embodiments of this application relate to the technical field of display and provide a display parameter adjusting method, an electronic device, a chip and a readable storage medium. The method comprises: receiving an operation of a user; in response to the operation, determining whether the current scene of the electronic device is a preset scene; if so, determining whether the frequency of the display trigger signal at the current moment is a first preset frequency; if not, increasing the frequency of the display trigger signal; then determining whether a first time difference between the current moment and a first moment is greater than or equal to a first preset time difference threshold; and if it is, executing the processing flow of image display according to the adjusted display trigger signal. In effect, at least one trigger signal is inserted between the moment of the sliding operation and the next trigger moment of the initial display trigger signal, which reduces the delay of image display and improves how closely the display of the electronic device follows the user's touch.

Description

Display parameter adjusting method, electronic device, chip and readable storage medium
Technical Field
The embodiment of the application relates to the technical field of display, in particular to a display parameter adjusting method, electronic equipment, a chip and a readable storage medium.
Background
When displaying a frame of image, an electronic device usually performs rendering, layer composition, and image display in sequence according to a fixed flow. Generally, the electronic device uses a periodic vertical synchronization (Vsync) signal as the trigger signal for rendering, layer composition, and image display: when the trigger signal arrives, the corresponding processing flow is triggered. For example, as shown in Fig. 1, when the Nth Vsync signal arrives, the image data of frame N is rendered. When the (N+1)th Vsync signal arrives, layer composition is performed on the rendered data of frame N. The composed image data of frame N is then displayed within the period of the (N+2)th Vsync signal. Typically, the frequency of the Vsync signal equals the refresh rate of the display screen: the higher the refresh rate, the shorter the Vsync period, and the lower the refresh rate, the longer the Vsync period.
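To make the pipeline concrete, the following Java sketch (class and method names are hypothetical, not taken from the patent) models the three stages: on each Vsync tick the oldest composed frame is displayed, the oldest rendered frame is composed, and a new frame is rendered.

```java
// Minimal sketch (hypothetical, not from the patent) of a three-stage
// Vsync-driven pipeline: frame N is rendered on tick N, composed on
// tick N+1, and displayed on tick N+2, so each frame spends roughly
// two full Vsync periods in flight before it reaches the screen.
import java.util.ArrayDeque;
import java.util.Deque;

public class VsyncPipeline {
    private final Deque<Integer> renderedFrames = new ArrayDeque<>();
    private final Deque<Integer> composedFrames = new ArrayDeque<>();
    private int nextFrame = 0;

    /** Called once per Vsync tick; each stage consumes the previous stage's output. */
    public void onVsync() {
        // Stage 3: display the oldest composed frame, if any.
        if (!composedFrames.isEmpty()) {
            System.out.println("display frame " + composedFrames.pollFirst());
        }
        // Stage 2: compose the oldest rendered frame, if any.
        if (!renderedFrames.isEmpty()) {
            composedFrames.addLast(renderedFrames.pollFirst());
        }
        // Stage 1: render the next frame of application data.
        renderedFrames.addLast(nextFrame++);
    }

    public static void main(String[] args) {
        VsyncPipeline p = new VsyncPipeline();
        for (int tick = 0; tick < 5; tick++) {
            System.out.println("Vsync tick " + tick);
            p.onVsync();
        }
    }
}
```

Running main shows frame 0 being displayed on tick 2, matching the two-period pipeline latency described above.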
In some scenarios, the electronic device adjusts the refresh rate of the display screen when the user slides the display interface. However, the electronic device can only adjust the refresh rate within the blanking interval of a Vsync period. As shown in Fig. 2, suppose the electronic device receives the user's sliding operation while rendering frame N; the electronic device must then wait a duration T before the slid image (frame N+1) is displayed. The duration T comprises the interval between the moment the user slides the display interface and the trigger moment of the (N+1)th Vsync signal, plus two complete Vsync periods.
Sending the slid image for display in this way incurs a long delay, and the displayed image of the electronic device follows the user's finger poorly.
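As a rough worked example (the 60 Hz figure is assumed for illustration and not stated in the patent), the worst-case value of T can be estimated as one waiting period plus two full Vsync periods:

```java
// Worked example (assumed values): worst-case latency T from Fig. 2 at 60 Hz.
public class SlideLatency {
    public static void main(String[] args) {
        double refreshHz = 60.0;
        double periodMs = 1000.0 / refreshHz;    // ~16.67 ms per Vsync period
        double waitToNextVsyncMs = periodMs;     // worst case: the touch lands just after a tick
        double pipelineMs = 2 * periodMs;        // two full periods for composition + display
        double totalMs = waitToNextVsyncMs + pipelineMs;
        System.out.printf("T can approach %.1f ms at %.0f Hz%n", totalMs, refreshHz); // ~50 ms
    }
}
```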
Disclosure of Invention
The embodiments of the application provide a display parameter adjusting method, an electronic device, a chip and a computer-readable storage medium, which improve the efficiency of image display.
In a first aspect, a method for adjusting display parameters is provided, where the method is used for an electronic device, where the electronic device includes a touch screen, and the method includes:
receiving an operation input by a user through a touch screen;
determining, in response to receiving the operation, whether the scene of the electronic device is a preset scene;
if the scene is a preset scene, determining whether the frequency of the display trigger signal at the current moment is a first preset frequency; the display trigger signal is used for triggering the processing flow of image display;
if the frequency of the display trigger signal at the current moment is not the first preset frequency and the frequency of the display trigger signal at the current moment is less than the first preset frequency, increasing the frequency of the display trigger signal at the current moment to a second preset frequency to obtain a display trigger signal after frequency adjustment;
determining whether a first time difference between the current moment and a first moment is greater than or equal to a first preset time difference threshold, wherein the first moment is the next trigger moment indicated by the display trigger signal at the current moment;
and if the first time difference is greater than or equal to the first preset time difference threshold, executing the processing flow of image display according to the adjusted display trigger signal before the first moment.
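A minimal sketch of this decision flow follows, under stated assumptions: the class and method names are hypothetical, timestamps are in milliseconds, and frequencies are in Hz; none of these identifiers come from the patent.

```java
// Minimal sketch (hypothetical names) of the first-aspect decision flow:
// on a touch operation in a preset scene, raise the display trigger
// frequency and fire an extra trigger early when a full adjusted period
// still fits before the originally scheduled trigger moment.
public class DisplayParamAdjuster {
    private double triggerHz;            // current display trigger frequency
    private final double firstPresetHz;  // e.g. the highest refresh rate supported
    private final double secondPresetHz; // target frequency after adjustment

    public DisplayParamAdjuster(double triggerHz, double firstPresetHz, double secondPresetHz) {
        this.triggerHz = triggerHz;
        this.firstPresetHz = firstPresetHz;
        this.secondPresetHz = secondPresetHz;
    }

    /**
     * @param nowMs         current moment
     * @param firstMomentMs next trigger moment of the unadjusted signal
     * @param presetScene   whether scene recognition matched a preset scene
     * @return true if an extra (early) trigger should fire before firstMomentMs
     */
    public boolean onUserOperation(long nowMs, long firstMomentMs, boolean presetScene) {
        if (!presetScene) return false;
        if (triggerHz >= firstPresetHz) return false;  // already at the target frequency
        triggerHz = secondPresetHz;                    // raise the trigger frequency
        double thresholdMs = 1000.0 / secondPresetHz;  // first preset time difference threshold:
                                                       // one period of the adjusted signal
        return (firstMomentMs - nowMs) >= thresholdMs; // enough room for an inserted trigger
    }
}
```

The one-adjusted-period threshold used here reflects the embodiment below in which the first preset time difference threshold equals the period duration of the frequency-adjusted trigger signal.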
In an embodiment, the processing flow of image display includes a drawing and rendering flow, and the display trigger signal includes an application drawing and rendering trigger signal; the application drawing and rendering trigger signal is used for triggering the drawing and rendering flow;
the increasing the frequency of the display trigger signal at the current moment to a second preset frequency to obtain the frequency-adjusted display trigger signal includes:
increasing the frequency of the application drawing and rendering trigger signal at the current moment to the second preset frequency to obtain an adjusted application drawing and rendering trigger signal;
correspondingly, the executing the processing flow of image display according to the adjusted display trigger signal includes:
and executing the drawing and rendering flow according to the adjusted application drawing and rendering trigger signal.
Increasing the frequency of the application drawing and rendering trigger signal to the second preset frequency yields an adjusted signal whose frequency is higher than that of the current one. This is equivalent to inserting at least one application drawing and rendering trigger signal between the current moment and the next trigger moment indicated by the initial signal, so the drawing and rendering flow can be executed in advance and the current frame can be sent to the hardware for display in time. This reduces the display delay and improves both the smoothness of the picture and how closely the electronic device follows the user's touch.
In an embodiment, the first preset time difference threshold is the duration of one period of the frequency-adjusted application drawing and rendering trigger signal.
In an embodiment, the processing flow of image display includes a layer composition flow, and the display trigger signal further includes a layer composition trigger signal; the layer composition trigger signal is used for triggering the layer composition flow;
the increasing the frequency of the display trigger signal at the current moment to a second preset frequency to obtain the frequency-adjusted display trigger signal includes:
increasing the frequency of the layer composition trigger signal at the current moment to the second preset frequency to obtain an adjusted layer composition trigger signal;
correspondingly, the executing the processing flow of image display according to the adjusted display trigger signal includes:
and executing the layer composition flow according to the adjusted layer composition trigger signal.
Increasing the frequency of the layer composition trigger signal to the second preset frequency yields an adjusted signal whose frequency is higher than that of the current one. This is equivalent to inserting at least one layer composition trigger signal between the current moment and the next trigger moment indicated by the initial signal, so the layer composition flow can be executed in advance and the current frame can be sent to the hardware for display in time. This reduces the display delay and improves both the smoothness of the picture and how closely the electronic device follows the user's touch.
In an embodiment, the first preset frequency is the highest display refresh rate supported by the electronic device, and the second preset frequency is the highest display refresh rate supported by the electronic device.
In an embodiment, the first preset frequency is a highest refresh rate of a preset scene, and the second preset frequency is a highest refresh rate of the preset scene.
In an embodiment, the first preset frequency is a highest refresh rate of a preset scene and is not a highest display refresh rate supported by the electronic device, and the second preset frequency is a highest display refresh rate supported by the electronic device.
In an embodiment, the executing the processing flow of image display according to the adjusted display trigger signal before the first moment includes:
executing the processing flow of image display, according to the adjusted display trigger signal, at a moment that is a first time period before the first moment; the first time period is the duration of at least one period of the first preset frequency, or the duration of at least one period of the second preset frequency.
In an embodiment, after the processing flow of image display is executed according to the adjusted display trigger signal before the first moment, the method further includes:
adjusting the frequency of the display trigger signal to the first preset frequency after the first moment.
In an embodiment, where the processing flow of image display is the drawing and rendering flow and the display trigger signal is the application drawing and rendering trigger signal, after the processing flow of image display is executed according to the adjusted display trigger signal, the method further includes:
determining whether the frequency of the layer composition trigger signal at the current moment is a third preset frequency; the layer composition trigger signal is used for triggering the layer composition flow;
if the frequency of the layer composition trigger signal at the current moment is not the third preset frequency and is less than the third preset frequency, increasing the frequency of the layer composition trigger signal at the current moment to a fourth preset frequency to obtain a frequency-adjusted layer composition trigger signal;
determining whether a second time difference between the current moment and a second moment is greater than or equal to a second preset time difference threshold; the second moment is the next trigger moment indicated by the layer composition trigger signal at the current moment;
and if the second time difference is greater than or equal to the second preset time difference threshold, executing the layer composition flow according to the adjusted layer composition trigger signal before the second moment.
When the second time difference between the current moment and the next trigger moment indicated by the layer composition trigger signal at the current moment is greater than or equal to the second preset time difference threshold, the layer composition frequency is increased to the fourth preset frequency to obtain the adjusted layer composition trigger signal. Executing the layer composition flow in advance according to the adjusted signal effectively improves the efficiency of sending images for display, so that images reach the hardware in time and the picture is smoother. In a possible case, the second preset time difference threshold that triggers increasing the layer composition frequency may differ from the first preset time difference threshold that triggers increasing the drawing and rendering frequency, which makes the adjustment more flexible.
In an embodiment, the second preset time difference threshold is the duration of one period of the adjusted layer composition trigger signal.
In an embodiment, if the second preset frequency is higher than the fourth preset frequency, the method further includes:
when the amount of to-be-composed layer data stored in the buffer queue exceeds the capacity of the buffer queue, deleting the to-be-composed layer data in order from oldest to newest until the amount of to-be-composed layer data stored in the buffer queue no longer exceeds the capacity of the buffer queue.
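A minimal sketch of such a bounded buffer queue follows (the structure and names are assumed for illustration, not taken from the patent): when composition cannot keep up with the raised rendering rate, the oldest pending layer data is dropped first.

```java
// Sketch (assumed structure) of the buffer-queue trimming described above:
// when rendering outpaces composition, drop the oldest pending layer data
// until the queue is back within its capacity.
import java.util.ArrayDeque;
import java.util.Deque;

public class LayerBufferQueue {
    private final Deque<byte[]> pending = new ArrayDeque<>();
    private final int capacity;

    public LayerBufferQueue(int capacity) {
        this.capacity = capacity;
    }

    public void push(byte[] layerData) {
        pending.addLast(layerData);
        // Delete to-be-composed entries oldest-first until within capacity.
        while (pending.size() > capacity) {
            pending.pollFirst();
        }
    }

    public byte[] nextForComposition() {
        return pending.pollFirst(); // null if nothing is pending
    }
}
```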
In an embodiment, the first preset frequency is the highest display refresh rate supported by the electronic device.
In an embodiment, the third preset frequency is the highest display refresh rate supported by the electronic device.
In an embodiment, the operation includes a sliding operation, and the determining, in response to receiving the operation, whether the scene of the electronic device is a preset scene includes:
determining, in response to the received sliding operation, whether the current scene of the electronic device is a preset scene; the preset scene includes a list-type scene.
In an embodiment, the operation includes a click operation, and the determining, in response to receiving the operation, whether the scene of the electronic device is a preset scene includes:
determining, in response to the received click operation, whether the scene of the electronic device is a preset scene; the preset scene is a scene pre-stored in a scene list.
In one embodiment, the electronic device includes an application module, a window management module, a scene recognition module, and a frame rate control module;
the application program module receives operation input through the touch screen and responds to the operation to send window data to the window management module;
the window management module sends the acquired window data to the scene recognition module;
the scene recognition module acquires a preset scene information list, compares the received window data with the preset scene information list and determines the scene of the electronic equipment;
the scene recognition module sends a notification message to the frame rate control module, wherein the notification message carries an identifier corresponding to a scene of the electronic equipment;
the frame rate control module compares the scene indicated by the notification message with a pre-stored scene list, and determines that the scene indicated by the notification message is a preset scene when the scene indicated by the notification message is in the pre-stored scene list;
the frame rate control module determines whether the frequency of the display trigger signal at the current moment is a first preset frequency or not, and increases the frequency of the display trigger signal at the current moment to a second preset frequency under the condition that the frequency of the display trigger signal at the current moment is not the first preset frequency and is less than the first preset frequency to obtain the display trigger signal after frequency adjustment;
the frame rate control module determines whether a first time difference between the current time and the first time is greater than or equal to a first preset time difference threshold, and sends the adjusted display trigger signal before the first time when the first time difference is greater than or equal to the first preset time difference threshold.
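The interaction between scene recognition and frame rate control could be sketched as follows (the interfaces are hypothetical; DisplayParamAdjuster refers to the earlier sketch):

```java
// Sketch (hypothetical interfaces) of the module interaction: the window
// manager forwards window data to scene recognition, which notifies the
// frame rate controller; the controller then decides whether to raise the
// display trigger frequency as in the flow above.
import java.util.Set;

public class FrameRateControlModule {
    private final Set<String> presetScenes;      // pre-stored scene list
    private final DisplayParamAdjuster adjuster; // from the earlier sketch

    public FrameRateControlModule(Set<String> presetScenes, DisplayParamAdjuster adjuster) {
        this.presetScenes = presetScenes;
        this.adjuster = adjuster;
    }

    /** Invoked with the scene identifier carried in the scene recognition module's notification. */
    public void onSceneNotification(String sceneId, long nowMs, long firstMomentMs) {
        boolean preset = presetScenes.contains(sceneId);
        if (adjuster.onUserOperation(nowMs, firstMomentMs, preset)) {
            sendAdjustedTrigger(); // fire the extra trigger before firstMomentMs
        }
    }

    private void sendAdjustedTrigger() {
        System.out.println("adjusted display trigger sent");
    }
}
```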
In an embodiment, the electronic device further comprises an application drawing and rendering module; the display trigger signal includes an application drawing and rendering trigger signal;
the frame rate control module sends the adjusted application drawing and rendering trigger signal to the application drawing and rendering module at the first moment;
and the application drawing and rendering module executes the drawing and rendering flow when it receives the application drawing and rendering trigger signal.
In an embodiment, the frame rate control module sends the adjusted application drawing and rendering trigger signal to the application drawing and rendering module according to the adjusted frequency at the first moment.
In an embodiment, the electronic device further includes a layer composition module, and the display trigger signal further includes a layer composition trigger signal;
the frame rate control module sends the adjusted layer composition trigger signal to the layer composition module at the first moment;
and the layer composition module executes the layer composition flow when it receives the layer composition trigger signal.
In an embodiment, the frame rate control module sends the adjusted layer composition trigger signal to the layer composition module according to the adjusted frequency at the first moment.
In the method for adjusting display parameters described above, an operation of the user is received; in response to the operation, it is determined whether the current scene of the electronic device is a preset scene; if so, it is determined whether the frequency of the display trigger signal at the current moment is a first preset frequency; if not, the frequency of the display trigger signal is increased; it is then determined whether a first time difference between the current moment and the first moment is greater than or equal to a first preset time difference threshold; and if it is, the processing flow of image display is executed according to the adjusted display trigger signal. In effect, at least one trigger signal is inserted between the moment of the sliding operation and the next trigger moment of the initial display trigger signal, which reduces the delay of image display and improves how closely the display of the electronic device follows the user's touch.
In a second aspect, an electronic device is provided, where the electronic device includes a processor, and the processor is configured to couple with a memory, read instructions in the memory, and cause the electronic device to perform the method provided in the first aspect according to the instructions.
In a third aspect, a computer-readable storage medium is provided, which stores computer instructions that, when executed on an electronic device, cause the electronic device to perform the method provided in the first aspect.
In a fourth aspect, a chip is provided, the chip comprising a processor configured to couple with a memory and execute a computer program in the memory to perform the method provided in the first aspect.
In a fifth aspect, a computer program product is provided which, when run on an electronic device, causes the electronic device to perform the method provided in the first aspect.
Drawings
Fig. 1 is a schematic diagram of an image display process;
Fig. 2 is a schematic diagram of an image rendering process in response to a sliding operation;
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 4 is a block diagram of a software structure of an electronic device according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a Vsync signal according to an embodiment of the present application;
Fig. 6 is a schematic diagram of an image display process according to an embodiment of the present application;
Fig. 7 is a schematic diagram of a sliding operation according to an embodiment of the present application;
Fig. 8 is a block diagram of a software architecture of an electronic device according to another embodiment of the present application;
Fig. 9 is a schematic diagram of an image display process according to another embodiment of the present application;
Fig. 10 is a schematic diagram of an image display process according to another embodiment of the present application;
Fig. 11 is a schematic diagram of an image display process according to another embodiment of the present application;
Fig. 12 is a schematic diagram of an image display process according to another embodiment of the present application;
Fig. 13 is a flowchart of a method for adjusting display parameters according to an embodiment of the present application;
Fig. 14 is a schematic diagram of a screen refresh rate setting interface according to an embodiment of the present application;
Fig. 15 is a schematic diagram of an image display process according to another embodiment of the present application;
Fig. 16 is a schematic diagram of an image rendering process according to another embodiment of the present application;
Fig. 17 is a schematic diagram of an image display process according to another embodiment of the present application;
Fig. 18 is a flowchart of a method for adjusting display parameters according to another embodiment of the present application;
Fig. 19 is a flowchart of a method for adjusting display parameters according to another embodiment of the present application;
Fig. 20 is a flowchart of a method for adjusting display parameters according to another embodiment of the present application;
Fig. 21 is a flowchart of a method for adjusting display parameters according to another embodiment of the present application;
Fig. 22 is a schematic structural diagram of an apparatus for adjusting display parameters according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" herein merely describes an association between associated objects, and means that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
In the following, the terms "first", "second" and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first", "second", "third" may explicitly or implicitly include one or more of the features.
The method for adjusting display parameters provided by the embodiments of the application can be applied to an electronic device. Optionally, the electronic device is a terminal device, which may also be referred to as a terminal, user equipment (UE), a mobile station (MS), a mobile terminal (MT), or the like. The terminal device may be a mobile phone, a smart television, a wearable device, a tablet computer (Pad), a computer with a wireless transceiving function, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medicine (remote medical), a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, and the like. The embodiments of the present application do not limit the specific technology or the specific device form adopted by the terminal device.
For example, fig. 3 shows a schematic structural diagram of the electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be independent devices or may be integrated in one or more processors.
The controller can be the nerve center and command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of receiving a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to implement the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In other embodiments, the power management module 141 may be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160, so that electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), the fifth-generation (5G) wireless communication system, BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, and the application processor, etc.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data created during use of the electronic device 100 (such as audio data and a phone book), and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking near the microphone 170C through the mouth. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and perform directional recording.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that are applied to the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C to assist in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking upon flipping open according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes), and can detect the magnitude and direction of gravity when the electronic device 100 is stationary. It can also be used to recognize the posture of the electronic device, and is applied in landscape/portrait switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, electronic device 100 may utilize range sensor 180F to range for fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device 100 can determine that there is an object nearby; when insufficient reflected light is detected, it can determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear during a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to avoid an abnormal shutdown caused by low temperature. In other embodiments, when the temperature is lower than a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the bone mass vibrated by the human voice. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset, integrated into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the voice-vibrated bone mass acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic apparatus 100 by being inserted into the SIM card interface 195 or being pulled out of it. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time, and the types of the cards can be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
It should be noted that any of the electronic devices mentioned in the embodiments of the present application may include more or fewer modules than the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 4 is a block diagram of a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture of the electronic device 100 divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 4, the application package may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 4, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables the application to display notification information in the status bar, and can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications that appear in the top status bar of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is sounded, the electronic device vibrates, or an indicator light flashes.
The Android Runtime comprises a core library and a virtual machine, and is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part provides the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files, and is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide a fusion of the 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, a sensor driver, a Wi-Fi driver and the like.
It should be noted that the electronic device mentioned in the embodiments of the present application may include more or fewer modules than the electronic device shown. For example, the electronic device may also include a memory, a timer, and the like.
For ease of understanding, the examples are given in part for illustration of concepts related to embodiments of the present application.
1. Frame: refers to a single picture, the smallest unit in interface display. A frame can be understood as a still picture, and displaying a number of consecutive frames in rapid succession can create the illusion of motion. The frame rate refers to the number of frames refreshed in 1 second, and can also be understood as the number of times per second that the graphics processor in the terminal device refreshes the picture. A high frame rate produces smoother and more realistic animation: the greater the number of frames per second, the more fluid the displayed motion.
It should be noted that, before the interface displays the frame, processes such as drawing, rendering, and composition are usually required.
2. Frame drawing: refers to drawing a picture on the display interface. The display interface may be composed of one or more views. Each view may be drawn by a visual control of the view system, and each view is composed of sub-views; a sub-view corresponds to a widget in the view, for example, a sub-view corresponds to a symbol in a picture view.
3. Frame rendering: coloring the drawn view, adding a 3D effect, or the like. For example, the 3D effect may be a light effect, a shadow effect, a texture effect, and the like.
4. Frame composition: the process of compositing one or more rendered views into a display interface.
The following exemplarily describes the workflow of the software and hardware of the electronic device 100 in conjunction with a scenario in which an application is started or an interface is switched in an application.
When the touch sensor 180K receives a touch operation, the kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates, the touch force, and the time stamp of the touch operation). The raw input event is stored at the kernel layer. The kernel layer reports the raw input event to the input system of the application framework layer through the input processing library. The input system of the application framework layer parses the information of the raw input event (including the operation type, the reported point position, and the like), determines the focus application according to the current focus, and sends the parsed information to the focus application. The focus may be the touch point in a touch operation or the click position in a mouse click operation. The focus application is the application running in the foreground of the electronic device or the application corresponding to the touch position of the touch operation. The focus application determines the control corresponding to the raw input event according to the parsed information (e.g., the hit position) of the raw input event.
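For ease of understanding, this dispatch path may be pictured with a minimal sketch. The class and method names (RawInputEvent, InputSystem, FocusApp) are hypothetical illustrations for this description, not the actual Android input stack.

```java
// Hypothetical sketch of the raw-input-event dispatch described above; the
// names (RawInputEvent, InputSystem, FocusApp) are illustrative only.
class RawInputEvent {
    final float x, y;       // touch coordinates reported by the kernel layer
    final float pressure;   // touch force
    final long timestampNs; // time stamp of the touch operation

    RawInputEvent(float x, float y, float pressure, long timestampNs) {
        this.x = x;
        this.y = y;
        this.pressure = pressure;
        this.timestampNs = timestampNs;
    }
}

interface FocusApp {
    // The focus application resolves the control hit by the parsed event.
    void onInputEvent(float x, float y, long timestampNs);
}

class InputSystem {
    // Parses the raw event reported by the kernel layer and forwards the
    // parsed information to the application determined by the current focus.
    void dispatch(RawInputEvent event, FocusApp focusApp) {
        focusApp.onInputEvent(event.x, event.y, event.timestampNs);
    }
}
```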
Taking the touch operation as a touch sliding operation whose corresponding control is the list control of the WeChat application as an example, the WeChat application calls the application drawing and rendering module (not shown in the figure) to draw and render the image. The WeChat application then sends the rendered image to the buffer queue of the display composition process. The display composition process synthesizes the rendered image into the WeChat list interface through the layer composition module (not shown in the figure). The display composition process then invokes the display driver of the kernel layer to drive the LCD/LED screen, so that the LCD/LED screen displays the corresponding list interface of the WeChat application.
An application scenario provided by the embodiment of the present application is described below with reference to the drawings.
First, each flow in the image display process will be described.
In response to the user clicking or sliding the application list, the application acquires the data of the frame image to be displayed, processes it according to a fixed flow, and sends it to the display screen for display. The fixed flow mainly comprises a drawing/rendering flow, a layer composition flow, and a display sending flow. That is, before a certain frame of image is displayed on the display screen, the electronic device needs to sequentially execute the drawing/rendering flow, the layer composition flow, and the display sending flow. The drawing/rendering flow may be implemented by the application, and is therefore also referred to as the application drawing/rendering flow. The layer composition flow may be implemented by a layer composition module (SurfaceFlinger), and may therefore also be referred to as the SurfaceFlinger layer composition flow. The display sending flow is implemented by hardware, and is therefore also referred to as the hardware display sending flow.
Specifically, the drawing/rendering flow obtains multiple pieces of layer data to be displayed, and draws and renders them to generate the layer data to be synthesized. The layer composition flow synthesizes the layer data to be synthesized generated by the drawing/rendering flow, performs hardware mixed rendering (HWC), and generates the frame image data to be displayed. The hardware display sending flow performs hardware display processing on the frame image data to be displayed generated by the layer composition flow and pushes it to the display screen.
The electronic device typically uses the periodic Vsync signal as the trigger signal for each flow. The period frequency of the Vsync signal is related to the refresh rate of the display screen. For example, when the refresh rate of the display screen is 15Hz, the period frequency of the Vsync signal is also 15Hz, and the duration of one Vsync period is 66.67ms. That is, the electronic device outputs one Vsync signal every 66.67ms. Taking a display screen refresh rate of 15Hz as an example, how one frame of image is displayed is described in detail with the flow shown in fig. 1. As shown in fig. 1, when the Nth Vsync signal arrives, the application drawing/rendering module of the application layer is triggered to draw and render the multiple pieces of layer data to be displayed in the Nth frame image, generate the layer data to be synthesized, and store it in the buffer queue to be synthesized in SurfaceFlinger. When the (N+1)th Vsync signal arrives, SurfaceFlinger in the application framework layer acquires from the buffer queue the layer data to be synthesized that was generated in the Nth Vsync period, synthesizes it, and performs hardware mixed rendering on the synthesized layer to generate the frame image data to be displayed. It should be noted that SurfaceFlinger usually obtains the layer data to be synthesized from front to back in the order in which it was stored in the buffer queue. In the (N+2)th Vsync period, the display unit of the hardware layer performs hardware display processing on the frame image data to be displayed generated by SurfaceFlinger and pushes it to the display screen.
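For ease of understanding, this three-stage pipeline may be summarized with the following sketch: on each Vsync, the frame composed one period earlier is displayed, the frame rendered one period earlier is composed, and the current frame is rendered. It is a simplified model under the assumptions above, not the actual SurfaceFlinger implementation.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Simplified sketch of the Vsync-driven pipeline described above: frame N is
// drawn/rendered in the Nth period, composed in the (N+1)th period, and
// displayed in the (N+2)th period.
class VsyncPipeline {
    private final Queue<Integer> bufferQueue = new ArrayDeque<>(); // layer data to be synthesized
    private int composedFrame = -1; // frame composed in the previous period, -1 if none

    // Invoked once per Vsync period; n is the frame whose rendering starts now.
    void onVsync(int n) {
        if (composedFrame >= 0) {
            display(composedFrame); // push the frame composed last period to the panel
        }
        if (!bufferQueue.isEmpty()) {
            composedFrame = compose(bufferQueue.poll()); // compose last period's rendered frame
        }
        render(n); // draw and render frame n
    }

    void render(int frame)  { bufferQueue.add(frame); } // produce layer data to be synthesized
    int  compose(int frame) { return frame; }           // layer composition + hardware mixed rendering
    void display(int frame) { /* hardware display sending flow */ }
}
```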
The display screen in the electronic device comprises pixel points arranged in rows and columns, and the display controller can control each pixel point to display the corresponding brightness and color, so that the display screen displays one frame of picture. In an embodiment, taking the display screen as an organic light-emitting diode (OLED) display screen as an example, each pixel point may be composed of 3 OLEDs that respectively emit red (R), green (G), and blue (B); in combination, the 3 OLEDs can emit colors in RGB format. The display controller can control the OLEDs to display the corresponding brightness and color, so that the display screen displays one frame of picture. In an embodiment, the display screen of the electronic device may be located below the touch screen; the positions of the display screen and the touch screen are not limited in the embodiments of the present application.
Illustratively, the OLED display screen has a resolution of 1080 × 2400: one row of the OLED display screen includes 1080 pixel points, and one column includes 2400 pixel points. Taking the display controller controlling the first row of pixel points as an example, the display controller may output a horizontal synchronization (HSYNC) signal, where HSYNC indicates that the display controller is about to control the organic light emitting diodes in the first row of pixel points to work. After outputting HSYNC, the display controller may wait for a horizontal back porch (HBP) before starting to control the organic light emitting diodes in the first row of pixel points. After the display controller controls the last organic light emitting diode in the first row of pixel points, it may wait for a horizontal front porch (HFP) before outputting the line synchronization signal of the second row. In this way, the display controller can control the organic light emitting diodes in the pixel points of all the rows.
As shown in fig. 5, each row of pixel points has a horizontal blanking area (H-Porch), and the duration of the H-Porch of each row is equal to "the duration of the line synchronization signal + the horizontal back porch + the horizontal front porch", that is, H-Porch = HFP + HBP + HSW, where HSW is the duration of the line synchronization signal. The horizontal blanking area may also be referred to as the line blanking area.
When the display controller has controlled the organic light emitting diodes in all the rows of pixel points of one frame of picture, the display screen displays one frame of picture. After a frame is displayed, the display controller may wait for a vertical front porch (VFP) before outputting the frame synchronization signal (VSYNC), where VSYNC indicates that the display controller is about to start controlling the display screen to display the next frame. After outputting VSYNC, the display controller may wait for a vertical back porch (VBP) and then output the HSYNC of the first line of the next frame picture, so as to control the display screen to display the next frame picture as described above.
For one frame of picture, there is a vertical blanking area (V-Porch), and the duration of the V-Porch of each frame is equal to "the duration of the frame synchronization signal + the vertical back porch + the vertical front porch", that is, V-Porch = VFP + VBP + VSW, where VSW is the duration of the frame synchronization signal. The vertical blanking area may also be referred to as the frame blanking area.
The electronic device may adjust the display parameters during the VFP period in the frame blanking area. Generally, the duration of the frame blanking area is very short, about 1/1000 of the duration of one Vsync period. However, when the frequency of image refresh is lower than 24Hz, the change of the picture can be recognized by the human eye. Therefore, when the refresh rate of the display screen is lower than 24Hz (e.g., 15Hz), the duration of the display time is usually kept below 1000/24 = 41.67ms, and correspondingly the duration of the frame blanking area is longer: it is the duration of one Vsync period minus the display time. For example, at 15Hz the duration of the frame blanking area is the duration of one period of the 15Hz signal, 66.67ms, minus the display time of 41.67ms, i.e., 25ms.
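For ease of understanding, the arithmetic above can be checked with a short sketch; the 15Hz refresh rate and the 24Hz visibility threshold are the values from the example.

```java
// Worked check of the frame-blanking duration at a 15 Hz refresh rate.
class FrameBlanking {
    public static void main(String[] args) {
        double vsyncPeriodMs = 1000.0 / 15; // 66.67 ms, one Vsync period at 15 Hz
        double displayTimeMs = 1000.0 / 24; // 41.67 ms, keeping refresh above the 24 Hz visibility limit
        double frameBlankMs = vsyncPeriodMs - displayTimeMs; // 25 ms of frame blanking (VFP window)
        System.out.printf("frame blanking = %.2f ms%n", frameBlankMs);
    }
}
```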
Based on the above description, displaying one frame of image needs to go through the drawing/rendering flow, the layer composition flow, and the hardware display sending flow, and the drawing/rendering flow and the layer composition flow are each executed only when triggered by Vsync. Therefore, as shown in fig. 6, the duration required for displaying one frame of image at least includes the period duration (T1) of the Vsync that triggers the drawing/rendering flow, the period duration (T2) of the Vsync that triggers the layer composition flow, and the duration (T3) occupied by sending the frame for display in the hardware display sending flow, i.e., the sum of 2 complete Vsync periods and one display sending duration. The lower the refresh rate of the display screen, the longer the duration required to display one frame of image.
In a possible situation, when the picture on the display screen is static and not changed, the electronic device will usually adjust the refresh rate of the display screen to be low (for example, 15 Hz), so as to achieve energy saving and consumption reduction of the electronic device. When the picture displayed on the display screen changes, the electronic equipment can increase the refresh rate of the display screen so as to improve the use experience of a user.
For example, as shown in fig. 7, the user browses an interface of a social application as shown in (a) of fig. 7, a setting interface as shown in (b) of fig. 7, a document interface as shown in (c) of fig. 7, or a goods browsing interface as shown in (d) of fig. 7. If the user performs no operation for a long period and the picture displayed on the display screen does not change during that period, the electronic device reduces the refresh rate of the display screen to 15Hz. Upon receiving an up-slide operation or a down-slide operation of the user, the electronic device adjusts the refresh rate of the display screen to 60Hz. The electronic device then executes the drawing/rendering flow, the layer composition flow, and the hardware display sending flow in response to the up-slide or down-slide operation, and displays the content corresponding to that operation.
Continuing with fig. 7, the user browses the interface shown in (e) of fig. 7, or the electronic book browsing interface shown in (f) of fig. 7. If the user performs no operation for a long period and the picture displayed on the display screen does not change during that period, the electronic device reduces the refresh rate of the display screen to 15Hz. Upon receiving a left-slide operation or a right-slide operation of the user, the electronic device adjusts the refresh rate of the display screen up to 60Hz. The electronic device then executes the drawing/rendering flow, the layer composition flow, and the hardware display sending flow in response to the left-slide or right-slide operation, and displays the content corresponding to that operation.
In the related art, the Vsync signal that triggers the drawing/rendering flow is referred to as the application drawing/rendering trigger signal (APP-Vsync); the Vsync signal that triggers the layer composition flow is referred to as the layer composition trigger signal (SF-Vsync); and the Vsync signal that triggers the hardware display sending flow is referred to as the hardware display sending trigger signal (HW-Vsync). APP-Vsync and SF-Vsync may have the same period duration as HW-Vsync, and the period frequency of HW-Vsync is the same as the refresh rate of the display screen.
In one possible scenario, a software architecture diagram of an electronic device may be as shown in FIG. 8. The electronic device may include an application drawing rendering module 501, a layer composition module 502, a display screen 503, a window management module (Windows manager Service) 504, a scene recognition module 505, a frame rate control module 506, and a display driver 507.
The application drawing/rendering module 501 is a functional module in the application and belongs to the application layer. The layer composition module 502, the window management module 504, the scene recognition module 505, and the frame rate control module 506 belong to the application framework layer. The layer composition module 502 may be the SurfaceFlinger process. The window management module 504 receives the window data sent by the application; illustratively, the window data includes the application package name, the interface (activity) name, and the focus control type (e.g., a ListView list control). The frame rate control module 506 may adjust the period duration of APP-Vsync and/or SF-Vsync. The scene recognition module 505 may, in response to receiving the window data sent by the window management module 504, identify the scene information corresponding to the window data. The display screen 503 is a functional module in hardware and belongs to the hardware layer.
The application drawing/rendering module 501 draws and renders the layer data to be displayed according to the APP-Vsync sent by the frame rate control module 506, so as to generate the layer data to be synthesized. The layer composition module 502 synthesizes the layer data to be synthesized obtained from the application drawing/rendering module 501 according to the SF-Vsync sent by the frame rate control module 506, and performs hardware mixed rendering to generate the frame image data to be displayed. The layer composition module 502 may also send the frame image data to be displayed to the display screen 503. The display driver 507 of the kernel layer drives the display screen 503 to display the frame image data to be displayed, according to the HW-Vsync sent by the frame rate control module 506.
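For ease of understanding, the division of labor described above may be sketched as follows. The class and method names are hypothetical illustrations of the frame rate control module in fig. 8, not the actual implementation.

```java
// Hypothetical sketch of the frame rate control module in fig. 8: it owns the
// period frequencies of the three trigger signals that it delivers to the
// application draw/render module (APP-Vsync), the layer composition module
// (SF-Vsync), and the display driver (HW-Vsync). Names are assumptions.
class FrameRateControlModule {
    private double appVsyncHz = 15; // APP-Vsync frequency
    private double sfVsyncHz = 15;  // SF-Vsync frequency
    private double hwVsyncHz = 15;  // HW-Vsync frequency, equal to the display refresh rate

    // Raising APP-Vsync and/or SF-Vsync is the adjustment discussed below.
    void setAppVsyncHz(double hz) { appVsyncHz = hz; }
    void setSfVsyncHz(double hz)  { sfVsyncHz = hz; }

    double appVsyncPeriodMs() { return 1000.0 / appVsyncHz; }
    double sfVsyncPeriodMs()  { return 1000.0 / sfVsyncHz; }
    double hwVsyncPeriodMs()  { return 1000.0 / hwVsyncHz; }
}
```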
The duration required for displaying one frame image upon receiving the user's sliding operation is described in detail below with reference to fig. 9, taking as an example the case where the refresh rate of the display screen before the sliding operation is 15Hz and the electronic device increases the refresh rate to 120Hz after the sliding operation.
As shown in fig. 9, when the sliding operation of the user is received, the application drawing/rendering module is drawing and rendering the 1st frame image, which started when the Nth Vsync signal arrived, and the refresh rate of the display screen is adjusted from 15Hz up to 120Hz in the frame blanking area of the Nth Vsync period. As shown in fig. 9, the (N+1)th Vsync period is a period after the increase; when the (N+1)th Vsync signal arrives, the 2nd frame image is drawn and rendered, and when the (N+2)th Vsync signal arrives, layer composition is performed on the 2nd frame image. The 2nd frame image is not sent for display until the (N+3)th Vsync signal arrives. That is, the 2nd frame image displayed in response to the sliding operation can start to be presented only after a duration of A + B + C. If the difference between the time of the sliding operation and the trigger time indicated by the (N+1)th Vsync signal is 50ms, the display of the 2nd frame image needs to wait 50 + 8.33 + 8.33 = 66.66ms.
In one possible case, the trigger time indicated by APP-Vsync and/or the trigger time indicated by SF-Vsync may be advanced to avoid the above-mentioned delay in image display caused by the low refresh rate of the display screen. This is explained in detail below by means of the embodiments shown in fig. 10 to 12.
In one embodiment, the time delay of image display may be reduced by advancing the next trigger time indicated by APP-Vsync. This is explained in detail with reference to fig. 10. As shown in fig. 10, when the sliding operation of the user is received, the application drawing/rendering module is executing the drawing/rendering flow of frame 1. The frame rate control module advances the next trigger time indicated by APP-Vsync, for example, to the time shown in fig. 10. The application drawing/rendering module may execute the drawing/rendering flow of frame 2 at the next trigger time indicated by APP-Vsync after finishing the drawing/rendering flow of frame 1; it may also stop the drawing/rendering of frame 1 and execute the drawing/rendering flow of frame 2 when the next trigger time indicated by APP-Vsync arrives, which is not limited in the embodiments of the present application. By advancing the next trigger time indicated by APP-Vsync, the waiting duration required to display the image after sliding (frame 2) becomes A1 + B. Compared with fig. 9, as shown in fig. 10, the duration of A1 is less than that of A, and advancing APP-Vsync also saves the period duration C of one Vsync signal. That is, by advancing the next trigger time indicated by APP-Vsync, the duration required to display the image after sliding can be effectively reduced.
In one embodiment, the time delay of image display may be reduced by advancing the next trigger time indicated by SF-Vsync. This is explained in detail with reference to fig. 11. As shown in fig. 11, when the sliding operation of the user is received, the layer composition module is executing the layer composition flow of frame 1. The frame rate control module advances the next trigger time indicated by SF-Vsync, for example, to the time shown in fig. 11. The layer composition module may execute the layer composition flow of frame 2 at the next trigger time indicated by SF-Vsync after finishing the layer composition flow of frame 1; it may also directly stop the layer composition of frame 1 and execute the layer composition flow of frame 2 when the next trigger time indicated by SF-Vsync arrives. By advancing the next trigger time indicated by SF-Vsync, the image after sliding (frame 2) can be displayed once the period duration (66.67ms) of one initial Vsync signal ends. That is, by advancing the next trigger time indicated by SF-Vsync, the duration required to display the image after sliding can be effectively reduced.
In one embodiment, the next trigger time indicated by APP-Vsync and the next trigger time indicated by SF-Vsync can be advanced simultaneously to reduce the image display delay. This is explained in detail below with reference to fig. 12. As shown in fig. 12, when a slide operation by the user is received, the application rendering module is executing a rendering flow of rendering the frame 1. The frame rate control module advances the next trigger timing indicated by APP-Vsync, for example, to the timing shown in fig. 12. Meanwhile, the application rendering module can stop rendering the frame 1, and when the next trigger time indicated by the APP-Vsync arrives, a rendering process is executed on the frame 2. The frame rate control module may simultaneously advance the next trigger time indicated by SF-Vsync, and in one possible case, may advance to the time shown in fig. 12. The layer composition module may perform the layer composition process on frame 2 when the next trigger time indicated by SF-Vsync arrives. After the layer composition process performed on the frame 2 is completed, the display driver may drive the display screen to display the frame image to be displayed according to the HW-Vsync sent by the frame rate control module. That is, the time period required for displaying the slid image (frame 2) is A3+ A4. Compared with fig. 9, the period duration B occupied by the rendering process and the period duration C occupied by the layer composition process are saved. That is, by advancing the next trigger timing indicated by APP-Vsync and the next trigger timing indicated by SF-Vsync at the same time, the time period required to display frame 2 can be further reduced.
Fig. 13 is a schematic diagram illustrating a process of information interaction between modules in an embodiment.
As shown in fig. 13, includes:
Step 101: when a sliding operation is received, in response to the sliding operation, the window management module of the application framework layer acquires updated window data on the display screen.
In response to the user's sliding operation, the application sends window data to the window management module. The window data includes: the package name, the window name, the list control, the position of the switched window, the display order, the size and position of the elements in the window, the switching animation, and the like. The position of the window is its position on the display screen of the electronic device, and the elements in the window include but are not limited to: characters, pictures, and boxes displayed in the window. The switching animation represents the switching effect shown when the switched window is displayed. The window management module may include an activity manager service (AMS) and a window manager service (WMS). The AMS is used to manage the interfaces of applications, the WMS is used to manage windows, and information about all windows in the terminal device may be stored in the WMS.
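Illustratively, the window data listed above could be modeled as in the sketch below; the field names are assumptions for this description, not the actual WMS data structures.

```java
// Illustrative container for the window data listed above; the field names
// are assumptions for this sketch, not the actual WMS data structures.
class WindowData {
    String packageName;      // application package name
    String windowName;       // window (interface/activity) name
    String focusControlType; // focus control type, e.g. a ListView list control
    int x, y;                // position of the switched window on the display screen
    int displayOrder;        // display order of the window
    String switchAnimation;  // switching effect shown when the window is displayed
}
```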
Step 102: the window management module sends the acquired updated window data to the scene recognition module.
Step 103: in response to receiving the window data, the scene recognition module determines updated scene information.
For example, a scene information list may be preset in a configuration file of the electronic device, and may be stored in a memory of the electronic device. The scene information list may be as shown in table 1, and different scene categories correspond to different identification numbers (IDs). It should be noted that the scene information list may further include an ID corresponding to an interface in instant messaging and an ID corresponding to a list control.
TABLE 1
[Table 1 is reproduced as an image in the original publication; it maps each scene category to its identification number (ID).]
By loading the scene information list from the memory, the scene recognition module may determine the application category from the application package name in the received window data, and further determine the scene information identifier corresponding to that application category.
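Illustratively, this lookup may be sketched as follows. Since Table 1 is reproduced only as an image, the categories and IDs below are placeholders, and the class and method names are assumptions.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of step 103: look up the scene identifier from the
// application package name. The categories and IDs below are placeholders
// standing in for Table 1, which is not reproduced here.
class SceneRecognitionModule {
    // Preset scene information list loaded from the configuration file.
    private final Map<String, Integer> sceneIdByCategory = new HashMap<>();

    SceneRecognitionModule() {
        sceneIdByCategory.put("instant_messaging", 0); // placeholder IDs
        sceneIdByCategory.put("map", 5);
    }

    // Determine the scene information identifier for the updated window data;
    // returns null when the scene is not in the preset list.
    Integer identifyScene(String packageName) {
        return sceneIdByCategory.get(categoryOf(packageName));
    }

    // Placeholder classification: a real implementation would consult the
    // preset configuration to map a package name to an application category.
    private String categoryOf(String packageName) {
        return packageName.endsWith(".im") ? "instant_messaging" : "map";
    }
}
```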
Step 104: the scene recognition module sends a notification message to the frame rate control module, where the notification message carries the identifier corresponding to the current scene information.
Step 105: in response to receiving the notification message, the frame rate control module determines whether the updated scene indicated by the notification message is a preset scene.
Illustratively, the frame rate control module may obtain the scene identifier carried in the notification message and compare it with the scene identifiers pre-stored in the frame rate control module; if they are consistent, the current scene is a preset scene, and step 106 is executed.
If the scene identifier carried in the notification message is inconsistent with the scene identifiers pre-stored in the frame rate control module, the time delay of the picture change does not affect the user experience in the current scene, and it is not necessary to increase the period frequency of Vsync in advance.
Illustratively, if the scene indicated by the scene identifier carried in the notification message is a map scene, the map scene is not a scene pre-stored in the frame rate control module. When the map does not change for a long time, the refresh rate of the display screen is turned down. When a sliding operation is received, the corresponding map picture is displayed in response to the sliding operation; the delay of the map picture is not noticeable to the user, so the period frequency of Vsync does not need to be increased.
Step 106: when the frame rate control module determines that the updated scene indicated by the notification message is a preset scene, the frame rate control module determines whether to adjust the period frequency of APP-Vsync.
The preset scene may be a scene that has both a low refresh rate and a high refresh rate in the same mode in Table 2. When the frame rate control module acquires the updated scene from the notification message, it determines whether different refresh rates exist for the updated scene by looking up Table 2 (see the sketch after Table 2). For example, if the frame rate control module acquires the scene ID "0" from the notification message, it determines that the updated scene is an instant messaging scene. As can be seen from Table 2, different refresh rates exist for instant messaging in the same mode, so the instant messaging scene is determined to be a preset scene. Meanwhile, the frame rate control module may query the current refresh rate; if the current refresh rate is 15Hz, the frame rate control module determines to increase the refresh rate to 120Hz.
TABLE 2
[Table 2 is reproduced as an image in the original publication; for each scene and mode it lists the available refresh rates, e.g. a low rate of 15Hz and a high rate of 120Hz for instant messaging.]
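Illustratively, step 106 may be sketched as follows. Because Table 2 is reproduced only as an image, its contents are stood in for by a placeholder map, and the class and method names are assumptions.

```java
import java.util.Map;

// Hypothetical sketch of step 106: a scene is a preset scene when it has both
// a low and a high refresh rate in the same mode; if the current refresh rate
// is the low one, the frame rate control module decides to raise it.
class PresetSceneCheck {
    // sceneId -> {low refresh rate, high refresh rate} in the current mode.
    // Placeholder entries; the real mapping is Table 2.
    private final Map<Integer, int[]> ratesByScene =
            Map.of(0, new int[] {15, 120}); // e.g. instant messaging: 15 Hz / 120 Hz

    // Returns the refresh rate to raise to, or -1 when no adjustment is needed.
    int targetRefreshRate(int sceneId, int currentRateHz) {
        int[] rates = ratesByScene.get(sceneId);
        if (rates == null) return -1;                   // not a preset scene
        if (currentRateHz == rates[0]) return rates[1]; // at the low rate: raise
        return -1;                                      // already at the high rate
    }
}
```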
In one possible case, the user may set the mode type of a scene. For example, after the user clicks the mode setting option of an application, the electronic device presents an interface as shown in fig. 14, including a smart mode, a high-performance mode, and a standard mode. Taking the application being a browser as an example, when the user selects the smart mode, the refresh rates of the browser in the smart mode are consistent with those of the high-performance mode, that is, two refresh rates exist: a low refresh rate of 15Hz and a high refresh rate of 120Hz.
Further, whether to increase the period frequency of APP-Vsync may be determined according to the magnitude of the difference between the time of the sliding operation and the trigger time of the next drawing/rendering flow. When this difference is greater than the duration of one period of the raised APP-Vsync, the period frequency of APP-Vsync is increased. For example, as shown in fig. 15, when the difference A5 between the time of the sliding operation and the trigger time of the next drawing/rendering flow is greater than the one-period duration B of the raised APP-Vsync, the period frequency of APP-Vsync is increased. As shown in fig. 16, if the difference A6 between the time of the sliding operation and the trigger time of the next drawing/rendering flow is smaller than the one-period duration B of the raised APP-Vsync, the period frequency of APP-Vsync is not increased.
Note that the period frequency of APP-Vsync after being raised is an integer multiple of the initial Vsync period frequency. In this way, the time of the next drawing/rendering flow indicated by the raised APP-Vsync can be aligned with the time of the next layer composition flow and the time of the next hardware display sending flow indicated by Vsync.
Step 107: the frame rate control module sends APP-Vsync at the increased period frequency, together with the attribute identifier, to the application drawing/rendering module.
Increasing the period frequency of APP-Vsync is equivalent to adding at least one trigger signal for triggering the drawing/rendering flow between the time of the sliding operation and the trigger time indicated by the next Vsync signal, i.e., to advancing the trigger time of the next drawing/rendering flow.
In one possible case, the raised APP-Vsync may be as shown in fig. 17. Let X be the difference between the time of the sliding operation and the next trigger time indicated by the Vsync signal, and let B be the duration of one period of the raised Vsync signal. The difference Y between the trigger time indicated by the first added APP-Vsync and the time of the sliding operation is determined by Y = X - n × B, where n is the largest positive integer not exceeding X/B. As shown in fig. 17, this integer is 2, corresponding to two APP-Vsync triggers within the period of the initial Vsync. The trigger time indicated by the first APP-Vsync may be taken as the trigger time of the next drawing/rendering flow.
For example, if the period frequency of the raised APP-Vsync is 120Hz, the duration of one corresponding period is 8.33ms. The current Vsync has a period frequency of 15Hz and a corresponding period duration of 66.67ms. If the difference between the time of the sliding operation and the time of the next drawing/rendering flow indicated by the current Vsync is 20ms, the period frequency of APP-Vsync is increased to 120Hz, and the time of the next drawing/rendering flow indicated by the raised APP-Vsync is 20 - 2 × 8.33 = 3.34ms after the time of the sliding operation.
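For ease of understanding, the computation above may be reproduced in a short sketch using the 20ms / 120Hz example; the variable names are illustrative.

```java
// Sketch of the timing computation above, using the 20 ms / 120 Hz example.
class VsyncTiming {
    public static void main(String[] args) {
        double x = 20.0;   // ms from the sliding operation to the next trigger of the initial Vsync
        double b = 8.33;   // ms, one period of the raised 120 Hz APP-Vsync (1000 / 120)
        int n = (int) Math.floor(x / b); // n = 2: two added triggers fit before the next Vsync
        double y = x - n * b;            // 20 - 2 * 8.33 = 3.34 ms after the sliding operation
        boolean raise = x > b;           // raise the frequency only if at least one trigger fits
        System.out.printf("n=%d, Y=%.2f ms, raise=%b%n", n, y, raise);
    }
}
```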
Illustratively, the attribute identifier includes New and Replace. When the attribute identifier is New, the application drawing/rendering module continues to execute the drawing/rendering flow of the current frame; as shown in fig. 17, after the drawing of the 1st frame image is completed, the drawing of the 2nd frame image is performed. When the attribute identifier is Replace, the application drawing/rendering module stops executing the drawing/rendering flow of the current frame and starts executing the drawing/rendering flow of a new frame; as shown in fig. 17, the drawing of the 1st frame image is stopped and the drawing of the 2nd frame image is performed directly.
The frame rate control module sends APP-Vsync at the increased period frequency, with the attribute identifier assigned Replace, to the application drawing/rendering module.
Step 108: the application drawing/rendering module executes the drawing/rendering flow according to the raised APP-Vsync.
In response to the attribute identifier assigned Replace, the application drawing/rendering module stops executing the drawing/rendering flow of the current frame and starts executing the drawing/rendering flow of a new frame.
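Illustratively, the handling of the attribute identifier may be sketched as follows; the enum and method names are hypothetical, and the New branch is simplified (with New, the in-flight frame would finish before the next frame starts).

```java
// Hypothetical sketch of how the application draw/render module might handle
// the attribute identifier in step 108; enum and method names are assumptions.
enum VsyncAttribute { NEW, REPLACE }

class AppRenderer {
    private int currentFrame;

    // Invoked on each raised APP-Vsync together with its attribute identifier.
    void onAppVsync(VsyncAttribute attribute) {
        if (attribute == VsyncAttribute.REPLACE) {
            abortCurrentFrame(); // stop the draw/render flow of the current frame
        }
        renderFrame(++currentFrame); // start the draw/render flow of a new frame
    }

    void abortCurrentFrame() { /* cancel drawing of the in-flight frame */ }
    void renderFrame(int frame) { /* draw + render, then enqueue for composition */ }
}
```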
In a possible case, the frame rate control module may also increase the period frequency of SF-Vsync. The implementation process is similar to the embodiment shown in fig. 13 and is not repeated here.
In one possible scenario, the frame rate control module may adjust the periods of APP-Vsync and SF-Vsync simultaneously, as described in detail below with respect to the embodiment shown in FIG. 18.
As shown in fig. 18, includes:
Step 101: when a sliding operation is received, in response to the sliding operation, the window management module of the application framework layer acquires updated window data on the display screen.
Step 102: the window management module sends the acquired updated window data to the scene recognition module.
Step 103: in response to receiving the window data, the scene recognition module determines the updated scene information.
Step 104: the scene recognition module sends a notification message to the frame rate control module, where the notification message carries the identifier corresponding to the current scene information.
Step 105: in response to receiving the notification message, the frame rate control module determines whether the updated scene indicated by the notification message is a preset scene.
Step 109: when the frame rate control module determines that the updated scene indicated by the notification message is a preset scene, the frame rate control module determines whether to adjust the period frequencies of APP-Vsync and SF-Vsync.
It should be noted that, when adjusting the period frequencies of the APP-Vsync and the SF-Vsync, the frame rate control module may adjust the APP-Vsync and the SF-Vsync to the same period frequency, or may adjust the APP-Vsync and the SF-Vsync to different period frequencies, which is not limited in this embodiment of the present application.
In one possible case, the adjusted period frequency of APP-Vsync is greater than the adjusted period frequency of SF-Vsync. The speed at which the application drawing/rendering module generates the layer data to be synthesized is then greater than the speed at which the layer composition module processes it. Because the application drawing/rendering module stores the generated layer data to be synthesized in the buffer queue, the buffer queue may become full before the layer composition module can process the stored layer data in time. When the space available in the buffer queue for storing layer data to be synthesized is full, the application drawing/rendering module may delete layer data to be synthesized from the buffer queue in the order in which it was stored, from front to back, until the number of pieces of layer data to be synthesized stored in the buffer queue no longer exceeds the capacity of the buffer queue. It should be noted that when the layer composition module performs layer composition on the layer data to be synthesized, the earliest-stored layer data in the buffer queue is processed first.
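For ease of understanding, this drop-oldest policy may be sketched as follows; the capacity and names are illustrative assumptions.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch of the drop-oldest policy described above: when APP-Vsync runs faster
// than SF-Vsync, the buffer queue can fill; the earliest-stored to-be-synthesized
// layer data is then deleted first (front-to-back storage order).
class LayerBufferQueue<T> {
    private final int capacity;
    private final Deque<T> queue = new ArrayDeque<>();

    LayerBufferQueue(int capacity) { this.capacity = capacity; }

    // Producer side: the application draw/render module enqueues layer data.
    void push(T layerData) {
        while (queue.size() >= capacity) {
            queue.pollFirst(); // drop the earliest-stored entry when full
        }
        queue.addLast(layerData);
    }

    // Consumer side: the layer composition module always takes the
    // earliest-stored entry first.
    T pop() { return queue.pollFirst(); }
}
```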
Step 107: the frame rate control module sends APP-Vsync at the increased period frequency, together with the attribute identifier, to the application drawing/rendering module.
Step 108: the application drawing/rendering module executes the drawing/rendering flow according to the raised APP-Vsync.
Step 110: the frame rate control module sends SF-Vsync at the increased period frequency, together with the attribute identifier, to the layer composition module.
Step 111: the layer composition module executes the layer composition flow according to the raised SF-Vsync.
By advancing the trigger time indicated by APP-Vsync and the trigger time indicated by SF-Vsync, the drawing/rendering flow and the layer composition flow in the display pipeline are executed earlier, the time delay of displaying the image after sliding is reduced, and the tracking quality of the electronic device is improved.
The method for adjusting the display parameters according to the present application will be described in detail below with reference to specific examples. The following embodiments may be combined with each other and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 19 is a flowchart illustrating a method for adjusting display parameters according to an embodiment of the present application. As shown in fig. 19, the method for adjusting display parameters according to this embodiment includes:
S201, receiving an operation input by the user through the touch screen, and in response to the received operation, determining whether the scene of the electronic device is a preset scene.
Optionally, the operation may be a sliding operation or a clicking operation, which is not limited in this embodiment of the present application.
S202, when the scene of the electronic equipment is a preset scene, determining whether the frequency of the application drawing and rendering trigger signal at the current moment is a first preset frequency.
The first preset frequency may be a frequency threshold set by a user, or may be a display refresh rate corresponding to a changed scene after the scene is changed. In one possible scenario, the first predetermined frequency is the highest refresh rate supported by the display of the electronic device. If the frequency of the application rendering trigger signal at the current time is not the first preset frequency and is less than the first preset frequency, S203 is executed.
S203, determining whether the first time difference between the current time and the first time is greater than or equal to a first preset time difference threshold value.
The first time is the next trigger time indicated by the application drawing/rendering trigger signal at the current time. If the first time difference is greater than or equal to the first preset time difference threshold, S204 is executed.
The first preset time difference threshold may be a duration of one cycle of the adjusted application rendering trigger signal, and the first preset time difference threshold may also be a duration of one cycle of a highest refresh rate supported by the electronic device, which is not limited in this embodiment of the present application.
S204, increasing the frequency of the application drawing/rendering trigger signal at the current time to a second preset frequency to obtain the adjusted application drawing/rendering trigger signal.
The second preset frequency may be the same as or different from the first preset frequency, which is not limited in the embodiments of the present application. In one possible case, the second preset frequency is the highest display refresh rate supported by the electronic device. It should be noted that the highest display refresh rate supported by the electronic device may be an integer multiple of the second preset frequency, that is, Fv-app = (1/n) × F1, i.e., F1 = n × Fv-app, where n is a positive integer, Fv-app is the second preset frequency, and F1 is the highest refresh rate supported by the electronic device. Because the highest display refresh rate supported by the electronic device is an integer multiple of the second preset frequency, the application drawing/rendering trigger signal at the second preset frequency can be aligned with the hardware display sending trigger signal at the highest refresh rate of the electronic device.
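For ease of understanding, this alignment constraint may be sketched as follows; the frequencies shown are examples.

```java
// Sketch of the alignment constraint: the second preset frequency Fv-app is
// chosen so that the highest supported refresh rate F1 is an integer multiple
// of it (F1 = n * Fv-app), keeping the APP-Vsync triggers aligned with the
// hardware display sending trigger signal. The frequencies are examples.
class VsyncAlignment {
    static boolean alignsWithHwVsync(double fvApp, double f1) {
        double n = f1 / fvApp;
        return n >= 1 && Math.abs(n - Math.round(n)) < 1e-9; // n must be a positive integer
    }

    public static void main(String[] args) {
        System.out.println(alignsWithHwVsync(60, 120));  // true  (n = 2)
        System.out.println(alignsWithHwVsync(120, 120)); // true  (n = 1)
        System.out.println(alignsWithHwVsync(45, 120));  // false (120 / 45 is not an integer)
    }
}
```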
When the first time difference between the current time and the next trigger time indicated by the application drawing/rendering trigger signal at the current time is greater than the first preset time difference threshold, the frequency of the application drawing/rendering trigger signal is increased, so that the drawing/rendering flow can be executed in advance according to the adjusted trigger signal. This effectively improves the efficiency of sending images for display, allows the current frame image to be sent for display by the hardware in time, and improves the smoothness of the picture.
S205, before the first time, executing the drawing/rendering flow according to the adjusted application drawing/rendering trigger signal.
The frequency of the application drawing/rendering trigger signal is increased to the second preset frequency to obtain the adjusted trigger signal, so that the frequency of the adjusted trigger signal is higher than that of the current one. That is, at least one application drawing/rendering trigger signal is added between the current time and the next trigger time indicated by the initial trigger signal, so that the drawing/rendering flow can be executed in advance, the image of the current frame can be sent for display by the hardware in time, the time delay of sending images for display is reduced, and the smoothness of the picture and the tracking quality of the electronic device are improved.
As can be seen from Table 2, the same scene can have two refresh rates. Taking the instant messaging scene as an example, in the standard mode the refresh rates of the instant messaging scene include 15Hz and 60Hz. In one possible case, the first preset frequency may be the highest refresh rate of the preset scene, i.e., 60Hz, and the second preset frequency may be the highest refresh rate supported by the electronic device, e.g., 120Hz. After the first time, the frequency of the application drawing/rendering trigger signal can be adjusted to the first preset frequency of 60Hz, and the application drawing/rendering trigger signal is sent to the application drawing/rendering module at 60Hz.
The embodiment shown in fig. 19 focuses on a specific process that the electronic device can execute the rendering process in advance and improve the smoothness of the picture by increasing the frequency of the application rendering trigger signal. In a possible situation, the electronic device may further increase the frequency of the layer composition trigger signal, so as to further improve the fluency of the picture. The implementation method and the beneficial effect of the electronic device for increasing the frequency of the layer synthesis trigger signal are similar to those of the embodiment shown in fig. 19, and are not described herein again.
Further, on the basis of the embodiment shown in fig. 19, after the frequency of the application rendering trigger signal is increased, the electronic device may also increase the frequency of the layer composition trigger signal at the same time, which is described in detail by the embodiment shown in fig. 20 below.
Fig. 20 is a flowchart illustrating a method for adjusting display parameters according to an embodiment of the present disclosure. As shown in fig. 20, the method for adjusting display parameters provided in this embodiment includes:
s201, receiving an operation output by a user through a touch screen, responding to the received operation, and determining whether the scene of the electronic equipment is a preset scene.
S202, when the scene of the electronic equipment is a preset scene, determining whether the frequency of the application drawing and rendering trigger signal at the current moment is a first preset frequency.
The first preset frequency may be a frequency threshold set by the user, or may be the display refresh rate corresponding to the changed scene. In one possible case, the first preset frequency is the highest refresh rate supported by the display screen of the electronic device. If the frequency of the application drawing/rendering trigger signal at the current time is not the first preset frequency and is less than the first preset frequency, S203 is executed.
S203, determining whether the first time difference between the current time and the first time is greater than or equal to a first preset time difference threshold value. If the first time difference is greater than or equal to the first preset time difference threshold, S204 is executed.
S204, increasing the frequency of the application drawing/rendering trigger signal at the current time to a second preset frequency to obtain the adjusted application drawing/rendering trigger signal.
S205, before the first time, executing the drawing/rendering flow according to the adjusted application drawing/rendering trigger signal.
S206, determining whether the frequency of the layer synthesis trigger signal at the current moment is a first preset frequency. If the frequency of the layer synthesis trigger signal is not the first preset frequency and is less than the first preset frequency, S207 is executed.
S207, determining whether the first time difference between the current time and the first time is greater than or equal to the first preset time difference threshold.
If the first time difference is greater than or equal to the first preset time difference threshold, S208 is executed.
And the first moment is the next trigger moment indicated by the layer synthesis trigger signal at the current moment. The first preset time difference threshold may be a duration of one period of the adjusted layer composition trigger signal, and the first preset time difference threshold may also be a duration of one period of a highest refresh rate supported by the electronic device, which is not limited in this embodiment of the present application.
S208, increasing the frequency of the layer composition trigger signal to the second preset frequency to obtain the adjusted layer composition trigger signal.
S209, before the first time, executing the layer composition flow according to the adjusted layer composition trigger signal.
The adjusted layer composition trigger signal is obtained by increasing the frequency of the layer composition trigger signal, and the adjusted application drawing/rendering trigger signal is obtained by increasing the drawing/rendering frequency. On the basis of executing the drawing/rendering flow in time according to the adjusted application drawing/rendering trigger signal, the layer composition flow is further executed in time according to the adjusted layer composition trigger signal, so that the image can be sent for display by the hardware in time, and the smoothness of the picture is improved.
In a possible case, the rendering frequency may be increased when the rendering frequency is smaller than the first preset frequency, and the layer synthesis frequency may be increased when the layer synthesis frequency is smaller than the third preset frequency. Wherein the first preset frequency and the third preset frequency are different. Further, the rendering frequency may be adjusted up to a second preset frequency, and the layer composition frequency may be adjusted up to a fourth preset frequency. Wherein the second predetermined frequency is different from the fourth predetermined frequency. This is described in detail below with respect to the embodiment shown in fig. 21.
Fig. 21 is a schematic flowchart of a method for adjusting display parameters according to another embodiment of the present application, and as shown in fig. 21, the method includes:
s201, receiving an operation output by a user through a touch screen, responding to the received operation, and determining whether the scene of the electronic equipment is a preset scene.
S202, when the scene of the electronic equipment is a preset scene, determining whether the frequency of the application drawing and rendering trigger signal at the current moment is a first preset frequency.
The first preset frequency may be a frequency threshold preset by a user, or may be a display refresh rate corresponding to a scene after the scene is changed. In one possible scenario, the first predetermined frequency is the highest refresh rate supported by the display of the electronic device. If the rendering frequency of the application rendering trigger signal at the current moment is less than the first preset frequency, S203 is executed.
S203, determining whether the first time difference between the current time and the first time is greater than or equal to a first preset time difference threshold value. If the first time difference is greater than or equal to the first preset time difference threshold, S204 is executed.
S204, increasing the frequency of the application drawing and rendering trigger signal at the current moment to a second preset frequency to obtain the adjusted application drawing and rendering trigger signal.
S205, executing the drawing and rendering process according to the adjusted application drawing and rendering trigger signal before the first moment.
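Taken together, steps S201 to S205 amount to a guarded frequency boost: boost only in a preset scene, only when the current frequency is below the target, and only if at least one threshold's worth of time remains before the next scheduled trigger. A minimal sketch under assumed types (Kotlin; none of these names come from this application):

```kotlin
// Illustrative model of a trigger signal: its frequency and the next trigger
// moment it indicates (the "first moment"), in milliseconds.
data class TriggerSignal(val frequencyHz: Double, val nextTriggerMillis: Long)

fun maybeBoostRendering(
    signal: TriggerSignal,
    nowMillis: Long,
    isPresetScene: Boolean,
    firstPresetHz: Double,
    secondPresetHz: Double,
    firstThresholdMillis: Long,
    runRenderPass: () -> Unit
): TriggerSignal? {
    if (!isPresetScene) return null                           // S201: not a preset scene
    if (signal.frequencyHz >= firstPresetHz) return null      // S202: already at or above the target
    val firstTimeDiff = signal.nextTriggerMillis - nowMillis  // S203: time left before the first moment
    if (firstTimeDiff < firstThresholdMillis) return null
    val adjusted = signal.copy(frequencyHz = secondPresetHz)  // S204: raise the frequency
    runRenderPass()                                           // S205: extra render pass before the first moment
    return adjusted
}
```

Returning null models "leave the trigger signal unchanged"; a non-null result is the adjusted application drawing and rendering trigger signal.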
S210, determining whether the frequency of the layer synthesis trigger signal at the current moment is a third preset frequency. If the frequency of the layer synthesis trigger signal is not the third preset frequency and is less than the third preset frequency, S211 is executed.
The third preset frequency may be a frequency threshold set by a user, or may also be a display refresh rate corresponding to a scene of the electronic device, which is not limited in this embodiment of the application. In one possible scenario, the third predetermined frequency is the highest refresh rate supported by the display of the electronic device.
S211, determining whether a second time difference between the current moment and the second moment is greater than or equal to a second preset time difference threshold.
If the second time difference is greater than or equal to the second preset time difference threshold, S212 is executed.
The second moment is the next trigger moment indicated by the layer synthesis trigger signal at the current moment. The second preset time difference threshold may be the duration of one period of the adjusted layer synthesis trigger signal, or the duration of one period of the highest refresh rate supported by the electronic device; this is not limited in the embodiments of the present application.
S212, increasing the frequency of the layer synthesis trigger signal to a fourth preset frequency to obtain the adjusted layer synthesis trigger signal.
When the second time difference between the current moment and the next trigger moment indicated by the layer synthesis trigger signal at the current moment is greater than or equal to the second preset time difference threshold, the layer synthesis frequency is increased to the fourth preset frequency to obtain the adjusted layer synthesis trigger signal. Executing the layer synthesis process in advance according to the adjusted layer synthesis trigger signal effectively improves the efficiency of sending images for display, so that the image can be displayed by the hardware in time and the smoothness of the image is improved. In a possible case, the second preset time difference threshold that triggers the increase of the layer synthesis frequency may differ from the first preset time difference threshold that triggers the increase of the rendering frequency, which improves the flexibility of increasing the layer synthesis frequency.
In a possible case, the fourth preset frequency is different from the second preset frequency, so that each stage of the image display pipeline can be driven by its own trigger signal, which improves the flexibility of executing the stages of the pipeline.
It should be noted that the highest display refresh rate supported by the electronic device is an integer multiple of the fourth preset frequency, that is, F1 = n × Fv-sf, where n is a positive integer, Fv-sf is the fourth preset frequency, and F1 is the highest refresh rate supported by the electronic device. For example, if F1 is 120 Hz, Fv-sf may be 120 Hz, 60 Hz, 40 Hz, 30 Hz, or 24 Hz. Because the highest display refresh rate is an integer multiple of the fourth preset frequency, the layer synthesis trigger signal at the fourth preset frequency can be aligned with the hardware display trigger signal running at the highest refresh rate of the electronic device.
S213, executing the layer synthesis process according to the adjusted layer synthesis trigger signal before the second moment.
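Steps S210 to S213 mirror the rendering boost with their own target frequency and threshold, plus the F1 = n × Fv-sf alignment constraint. A hedged sketch, reusing the TriggerSignal type from the sketch above (all names are assumptions):

```kotlin
fun maybeBoostLayerSynthesis(
    signal: TriggerSignal,
    nowMillis: Long,
    thirdPresetHz: Double,
    fourthPresetHz: Double,
    secondThresholdMillis: Long,
    maxRefreshHz: Double,  // F1, the highest refresh rate the device supports
    runSynthesisPass: () -> Unit
): TriggerSignal? {
    // Alignment constraint: F1 must be an integer multiple of Fv-sf so the
    // boosted synthesis signal stays aligned with the hardware display signal.
    // (An exact-zero remainder is fine here because the rates are whole numbers.)
    require(maxRefreshHz % fourthPresetHz == 0.0) { "F1 must equal n x Fv-sf" }
    if (signal.frequencyHz >= thirdPresetHz) return null       // S210: already at or above the target
    val secondTimeDiff = signal.nextTriggerMillis - nowMillis  // S211: time left before the second moment
    if (secondTimeDiff < secondThresholdMillis) return null
    val adjusted = signal.copy(frequencyHz = fourthPresetHz)   // S212: raise the frequency
    runSynthesisPass()                                         // S213: synthesis pass before the second moment
    return adjusted
}
```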
Fig. 22 is a schematic structural diagram of a display parameter adjusting apparatus according to an embodiment of the present application. As shown in fig. 22, the display parameter adjusting apparatus of this embodiment may include: a receiving module 1101, a first determining module 1102, a second determining module 1103, an adjusting module 1104, and an executing module 1105, wherein:
The receiving module 1101 is configured to receive an operation input by a user through a touch screen.
The first determining module 1102 is configured to determine, in response to the received operation, whether the scene of the electronic device is a preset scene.
The second determining module 1103 is configured to determine, when the scene is the preset scene, whether the frequency of the display trigger signal at the current moment is a first preset frequency; the display trigger signal is used for triggering the processing flow of image display.
The adjusting module 1104 is configured to increase the frequency of the display trigger signal at the current moment to a second preset frequency to obtain a frequency-adjusted display trigger signal if the frequency of the display trigger signal at the current moment is not the first preset frequency and is less than the first preset frequency.
The executing module 1105 is configured to determine whether a first time difference between the current moment and the first moment is greater than or equal to a first preset time difference threshold, where the first moment is the next trigger moment indicated by the display trigger signal at the current moment, and to execute, if the first time difference is greater than or equal to the first preset time difference threshold, the processing flow of image display according to the adjusted display trigger signal before the first moment.
In an embodiment, the processing flow of image display includes a drawing and rendering flow, and the display trigger signal includes an application drawing and rendering trigger signal; the application drawing and rendering trigger signal is used for triggering the drawing and rendering flow.
The adjusting module 1104 is specifically configured to increase the frequency of the application rendering trigger signal at the current time to a second preset frequency, so as to obtain an adjusted application rendering trigger signal.
The executing module 1105 is specifically configured to execute the rendering process according to the adjusted application rendering trigger signal.
In an embodiment, the first preset time difference threshold is a duration of one cycle of the frequency-adjusted application rendering trigger signal.
In an embodiment, the processing flow of image display includes an image synthesis flow, and the display trigger signal further includes a layer synthesis trigger signal; the layer synthesis trigger signal is used for triggering a layer synthesis process.
The adjusting module 1104 is specifically configured to increase the frequency of the layer synthesis trigger signal at the current time to a second preset frequency, so as to obtain an adjusted layer synthesis trigger signal.
The executing module 1105 is specifically configured to execute the layer synthesis process according to the adjusted layer synthesis trigger signal.
In one embodiment, the first preset frequency is the highest display refresh rate supported by the electronic device.
In one embodiment, the second preset frequency is the highest display refresh rate supported by the electronic device.
In one embodiment, the second determining module 1103 is further configured to determine whether the frequency of the layer synthesis trigger signal at the current moment is a third preset frequency; the layer synthesis trigger signal is used for triggering a layer synthesis process.
The adjusting module 1104 is further configured to increase the frequency of the layer synthesis trigger signal at the current moment to a fourth preset frequency to obtain a frequency-adjusted layer synthesis trigger signal if the frequency of the layer synthesis trigger signal at the current moment is not the third preset frequency and the frequency of the layer synthesis trigger signal at the current moment is less than the third preset frequency.
The executing module 1105 is further configured to determine whether a second time difference between the current moment and the second moment is greater than or equal to a second preset time difference threshold, where the second moment is the next trigger moment indicated by the layer synthesis trigger signal at the current moment, and to execute, if the second time difference is greater than or equal to the second preset time difference threshold, the image synthesis process according to the adjusted layer synthesis trigger signal before the second moment.
In an embodiment, the second preset time difference threshold is a duration of one period of the adjusted layer synthesis trigger signal.
In an embodiment, if the second preset frequency is higher than the fourth preset frequency, the display parameter adjusting apparatus further includes a deleting module 1106, wherein:
The deleting module 1106 is configured to, when the amount of to-be-synthesized layer data stored in the buffer queue exceeds the number of entries the buffer queue can hold, delete the to-be-synthesized layer data from front to back in the order in which it was stored in the buffer queue, until the amount of to-be-synthesized layer data stored in the buffer queue no longer exceeds the capacity of the buffer queue.
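In other words, when rendering outpaces synthesis, the oldest queued layer data is discarded first. A minimal sketch of that overflow rule (Kotlin; the queue type and names are assumptions, since no concrete data structure is specified):

```kotlin
// Drop the oldest queued to-be-synthesized layer data until the buffer queue
// fits its capacity. Entries are enqueued front-to-back, so the front of the
// queue holds the oldest data and is deleted first.
fun <T> dropOldestUntilFits(bufferQueue: ArrayDeque<T>, capacity: Int) {
    while (bufferQueue.size > capacity) {
        bufferQueue.removeFirst()
    }
}
```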
In one embodiment, the first predetermined frequency is the highest display refresh rate supported by the electronic device.
In an embodiment, the third predetermined frequency is a highest display refresh rate supported by the electronic device.
In an embodiment, the operation is a sliding operation, and the first determining module 1102 is specifically configured to determine, in response to a received sliding operation, whether a current scene of the electronic device is the preset scene; the preset scenes comprise list type scenes.
In an embodiment, the operation is a click operation, and the first determining module 1102 is specifically configured to determine whether a scene of the electronic device is the preset scene in response to the received click operation; the preset scene is a scene prestored in a scene list.
The display parameter adjusting apparatus provided in this embodiment is configured to implement the display parameter adjusting method of the foregoing embodiments; the technical principle and technical effect are similar and are not described herein again.
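For concreteness, a hedged sketch of how the five modules of fig. 22 might be wired together, reusing the TriggerSignal type from the earlier sketches; the class and all method names are invented for illustration:

```kotlin
class DisplayParameterAdjuster(
    private val isPresetScene: (operation: String) -> Boolean,  // first determining module 1102
    private val firstPresetHz: Double,
    private val secondPresetHz: Double,
    private val firstThresholdMillis: Long,
    private val runDisplayPipeline: () -> Unit                  // executing module 1105
) {
    // Receiving module 1101: entry point for a touch-screen operation.
    fun onUserOperation(operation: String, signal: TriggerSignal, nowMillis: Long): TriggerSignal {
        if (!isPresetScene(operation)) return signal
        // Second determining module 1103: is the frequency already at the first preset frequency?
        if (signal.frequencyHz >= firstPresetHz) return signal
        // Adjusting module 1104: raise the frequency to the second preset frequency.
        val adjusted = signal.copy(frequencyHz = secondPresetHz)
        // Executing module 1105: run the image display pipeline before the first
        // moment if at least one threshold's worth of time remains.
        if (signal.nextTriggerMillis - nowMillis >= firstThresholdMillis) runDisplayPipeline()
        return adjusted
    }
}
```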
An embodiment of the present application provides an electronic device, and the structure of the electronic device is shown in fig. 3. The memory of the electronic device may be configured to store at least one program instruction, and the processor is configured to execute the at least one program instruction to implement the solution of the above-mentioned method embodiment. The implementation principle and technical effect are similar to those of the embodiments related to the method, and are not described herein again.
The embodiment of the application provides a chip. The chip comprises a processor, which is coupled with the memory and executes the computer program in the memory to execute the technical solution in the above embodiments. The principle and technical effects are similar to those of the related embodiments, and are not described herein again.
The embodiment of the present application provides a computer program product, which, when running on an electronic device, enables the electronic device to execute the technical solutions in the above embodiments. The implementation principle and technical effect are similar to those of the related embodiments, and are not described herein again.
The embodiment of the present application provides a computer-readable storage medium, on which program instructions are stored. When the program instructions are executed by an electronic device, the electronic device is enabled to execute the technical solutions of the above embodiments. The implementation principle and technical effect are similar to those of the related embodiments, and are not described herein again.

In summary, the above embodiments are only used for illustrating the technical solutions of the present application, not for limiting them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced, and such modifications or substitutions do not make the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (24)

1. A method for adjusting display parameters is applied to an electronic device, wherein the electronic device comprises a touch screen, and the method comprises the following steps:
receiving an operation input by a user through the touch screen;
in response to receiving the operation, determining whether the scene of the electronic equipment is a preset scene;
if so, determining whether the frequency of the display trigger signal at the current moment is a first preset frequency; the display trigger signal is used for triggering the processing flow of image display;
if the frequency of the display trigger signal at the current moment is not the first preset frequency and the frequency of the display trigger signal at the current moment is less than the first preset frequency, increasing the frequency of the display trigger signal at the current moment to a second preset frequency to obtain a display trigger signal after frequency adjustment;
determining whether a first time difference between the current time and a first time is greater than or equal to a first preset time difference threshold, wherein the first time is the next trigger time indicated by the display trigger signal of the current time;
and if the first time difference is greater than or equal to the first preset time difference threshold, executing the image display processing flow according to the adjusted display trigger signal before the first moment.
2. The method of claim 1, wherein the processing flow of the image display comprises a drawing and rendering flow, and the display trigger signal comprises an application drawing and rendering trigger signal; the application drawing and rendering trigger signal is used for triggering the drawing and rendering flow;
the step of increasing the frequency of the display trigger signal at the current moment to a second preset frequency to obtain a display trigger signal after frequency adjustment includes:
increasing the frequency of the application drawing rendering trigger signal at the current moment to the second preset frequency to obtain an adjusted application drawing rendering trigger signal;
correspondingly, the process flow for executing the image display according to the adjusted display trigger signal includes:
and executing the drawing and rendering process according to the adjusted application drawing and rendering trigger signal.
3. The method of claim 2, wherein the first preset time difference threshold is a duration of one period of the frequency-adjusted application drawing and rendering trigger signal.
4. The method according to any one of claims 1-3, wherein the processing flow of the image display comprises an image composition flow, and the display trigger signal further comprises an image layer composition trigger signal; the layer composition trigger signal is used for triggering the layer composition process;
the step of increasing the frequency of the display trigger signal at the current moment to a second preset frequency to obtain a display trigger signal after frequency adjustment includes:
increasing the frequency of the layer synthesis trigger signal at the current moment to the second preset frequency to obtain an adjusted layer synthesis trigger signal;
correspondingly, the process flow for executing the image display according to the adjusted display trigger signal includes:
and executing the layer synthesis process according to the adjusted layer synthesis trigger signal.
5. The method of any of claims 1-4, wherein the first preset frequency is the highest display refresh rate supported by the electronic device, and the second preset frequency is the highest display refresh rate supported by the electronic device.
6. The method according to any one of claims 1 to 4, wherein the first preset frequency is a highest refresh rate of the preset scene, and the second preset frequency is a highest refresh rate of the preset scene.
7. The method according to any of claims 1-4, wherein the first preset frequency is the highest refresh rate of the preset scene and is not the highest display refresh rate supported by the electronic device, and the second preset frequency is the highest display refresh rate supported by the electronic device.
8. The method according to any one of claims 1 to 7, wherein the executing the processing flow of the image display according to the adjusted display trigger signal before the first time comprises:
executing the image display processing flow according to the adjusted display trigger signal at a moment a first time interval before the first moment; the first time interval is a duration of at least one period of the first preset frequency, or a duration of at least one period of the second preset frequency.
9. The method according to any one of claims 1-8, wherein after the performing the processing flow of the image display according to the adjusted display trigger signal before the first time, the method further comprises:
and after the first moment, adjusting the frequency of the display trigger signal to the first preset frequency.
10. The method according to any one of claims 1 to 9, wherein, when the processing flow of the image display is a drawing and rendering flow, the display trigger signal is an application drawing and rendering trigger signal, and after the processing flow of the image display is executed according to the adjusted display trigger signal, the method further comprises:
determining whether the frequency of the layer synthesis trigger signal at the current moment is a third preset frequency; the layer synthesis trigger signal is used for triggering a layer synthesis process;
if the frequency of the layer synthesis trigger signal at the current moment is not the third preset frequency and the frequency of the layer synthesis trigger signal at the current moment is less than the third preset frequency, increasing the frequency of the layer synthesis trigger signal at the current moment to a fourth preset frequency to obtain an adjusted layer synthesis trigger signal;
determining whether a second time difference between the current time and a second time is greater than or equal to a second preset time difference threshold value; the second moment is the next trigger moment indicated by the layer synthesis trigger signal at the current moment;
and if the second time difference is greater than or equal to the second preset time difference threshold, executing the image synthesis process according to the adjusted layer synthesis trigger signal before the second moment.
11. The method according to claim 10, wherein the second preset time difference threshold is a duration of one period of the adjusted layer synthesis trigger signal.
12. The method according to claim 10 or 11, wherein, if the second preset frequency is higher than the fourth preset frequency, the method further comprises:
when the amount of to-be-synthesized layer data stored in the buffer queue exceeds the number of entries the buffer queue can hold, deleting the to-be-synthesized layer data from front to back in the order in which it was stored in the buffer queue, until the amount of to-be-synthesized layer data stored in the buffer queue no longer exceeds the capacity of the buffer queue.
13. The method of any of claims 10-12, wherein the first preset frequency is the highest display refresh rate supported by the electronic device.
14. The method of any of claims 10-13, wherein the third preset frequency is the highest display refresh rate supported by the electronic device.
15. The method of any of claims 1-14, wherein the operation comprises a slide operation, and wherein determining whether the context of the electronic device is a preset context in response to receiving the operation comprises:
determining whether the current scene of the electronic equipment is the preset scene or not in response to the received sliding operation; the preset scene comprises a list type scene.
16. The method of any of claims 1-14, wherein the operation comprises a click operation, and wherein determining whether the context of the electronic device is a preset context in response to receiving the operation comprises:
responding to the received click operation, and determining whether the scene of the electronic equipment is the preset scene; the preset scene is a scene prestored in a scene list.
17. The method of any of claims 1-16, wherein the electronic device comprises an application module, a window management module, a scene recognition module, and a frame rate control module;
the application program module receives operation input through the touch screen and responds to the operation to send window data to the window management module;
the window management module sends the acquired window data to the scene recognition module;
the scene recognition module acquires a preset scene information list, compares the received window data with the preset scene information list and determines the scene of the electronic equipment;
the scene recognition module sends a notification message to the frame rate control module, wherein the notification message carries an identifier corresponding to a scene of the electronic device;
the frame rate control module compares the scene indicated by the notification message with a pre-stored scene list, and determines that the scene indicated by the notification message is a preset scene when the scene indicated by the notification message is in the pre-stored scene list;
the frame rate control module determines whether the frequency of the display trigger signal at the current moment is a first preset frequency, and increases the frequency of the display trigger signal at the current moment to a second preset frequency under the conditions that the frequency of the display trigger signal at the current moment is not the first preset frequency and the frequency of the display trigger signal at the current moment is smaller than the first preset frequency, so as to obtain a display trigger signal with the adjusted frequency;
the frame rate control module determines whether a first time difference between the current time and a first time is greater than or equal to a first preset time difference threshold, and sends the adjusted display trigger signal before the first time when the first time difference is greater than or equal to the first preset time difference threshold.
18. The method of claim 17, wherein the electronic device further comprises an application rendering module; the display trigger signal comprises an application rendering trigger signal;
the frame rate control module sends an adjusted application drawing and rendering trigger signal to the application drawing and rendering module at a first moment;
and the application drawing and rendering module executes a drawing and rendering process when receiving the application drawing and rendering trigger signal.
19. The method of claim 18, wherein the frame rate control module sends the adjusted application rendering trigger signal to the application rendering module at the adjusted frequency prior to the first time.
20. The method according to any of claims 17-19, wherein the electronic device further comprises a layer composition module, and the display trigger signal further comprises a layer composition trigger signal;
the frame rate control module sends an adjusted layer synthesis trigger signal to the layer synthesis module at a first moment;
and the layer synthesis module executes the layer synthesis process when receiving the layer synthesis trigger signal.
21. The method of claim 20, wherein the frame rate control module sends the adjusted layer composition trigger signal to the layer composition module at an adjusted frequency before the first time.
22. An electronic device, comprising a processor configured to couple to a memory and to read instructions in the memory and to cause the electronic device to perform the method of any of claims 1-21 in accordance with the instructions.
23. A computer-readable storage medium having stored thereon computer instructions which, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1-21.
24. A chip comprising a processor for coupling with a memory and executing a computer program in the memory to perform the method of any one of claims 1 to 21.
CN202110876250.1A 2021-07-30 2021-07-30 Display parameter adjusting method, electronic device, chip and readable storage medium Pending CN115686403A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110876250.1A CN115686403A (en) 2021-07-30 2021-07-30 Display parameter adjusting method, electronic device, chip and readable storage medium

Publications (1)

Publication Number Publication Date
CN115686403A true CN115686403A (en) 2023-02-03

Family

ID=85060027

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110876250.1A Pending CN115686403A (en) 2021-07-30 2021-07-30 Display parameter adjusting method, electronic device, chip and readable storage medium

Country Status (1)

Country Link
CN (1) CN115686403A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116204149A (en) * 2023-05-06 2023-06-02 南京极域信息科技有限公司 Technology for improving display performance of Android operating system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination