WO2021000921A1 - Image processing method based on a vertical synchronization signal, and electronic device - Google Patents

Image processing method based on a vertical synchronization signal, and electronic device

Info

Publication number
WO2021000921A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
layer
rendering
layers
frame length
Prior art date
Application number
PCT/CN2020/100014
Other languages
English (en)
French (fr)
Inventor
王亮
李煜
陈健
吉星春
郭一方
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to US17/624,292 (US11887557B2)
Priority to JP2021578085 (JP7337968B2)
Priority to EP20834161.0 (EP3971715A4)
Publication of WO2021000921A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/451: Execution arrangements for user interfaces
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/005: General purpose rendering architectures
    • G06T15/50: Lighting effects
    • G06T15/503: Blending, e.g. for anti-aliasing
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20016: Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
    • G06T2207/20212: Image combination
    • G06T2207/20221: Image fusion; Image merging
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/001: Arbitration of resources in a display system, e.g. control of access to frame buffer by video controller and/or main processor
    • G09G5/18: Timing circuits for raster scan displays
    • G09G5/36: Characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/39: Control of the bit-mapped memory
    • G09G5/395: Arrangements specially adapted for transferring the contents of the bit-mapped memory to the screen
    • G09G5/397: Arrangements specially adapted for transferring the contents of two or more bit-mapped memories to the screen simultaneously, e.g. for mixing or overlay
    • G09G2320/00: Control of display operating conditions
    • G09G2320/02: Improving the quality of display appearance
    • G09G2320/0252: Improving the response speed
    • G09G2330/00: Aspects of power supply; Aspects of display protection and defect management
    • G09G2330/02: Details of power systems and of start or stop of display operation
    • G09G2330/021: Power management, e.g. power saving
    • G09G2340/00: Aspects of display data processing
    • G09G2340/10: Mixing of images, i.e. displayed pixel being the result of an operation, e.g. adding, on the corresponding input pixels
    • G09G2354/00: Aspects of interface with display user
    • G09G2360/00: Aspects of the architecture of display systems
    • G09G2360/08: Power processing, i.e. workload management for processors involved in display operations, such as CPUs or GPUs
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Definitions

  • the embodiments of the present application relate to the field of image processing and display technology, and in particular, to an image processing method and electronic device based on a vertical synchronization signal.
  • Fluency is an important aspect of human-computer interaction performance.
  • Fluency can include hand-following performance.
  • Fluency can be quantified as the length of the delay from the moment the user inputs an operation to the electronic product to the moment the electronic product displays the image corresponding to that operation.
  • The user operation may be input through a mouse or a button, or it may be a touch operation on the touch screen.
  • The delay time described above can be referred to as the response delay of the electronic device.
  • For a touch operation on the touch screen, the delay time may be referred to as the touch response delay.
  • the embodiments of the present application provide an image processing method and electronic device based on a vertical synchronization signal, which can shorten the response delay of the electronic device and improve the fluency of the electronic device.
  • an embodiment of the present application provides an image processing method based on a vertical synchronization signal, and the method can be applied to an electronic device including a display screen.
  • The method may include: in response to the first vertical synchronization signal, the electronic device draws one or more first layers and renders the one or more first layers; after rendering of the one or more first layers is completed, it performs layer composition on the rendered layers to obtain a first image frame; in response to the second vertical synchronization signal, it refreshes and displays the first image frame.
  • In the first case, the electronic device can complete layer drawing, rendering, and composition within one synchronization period; that is, the time required for these steps is less than or equal to one synchronization period.
  • the synchronization period is equal to the signal period of the second vertical synchronization signal.
  • With this method, the electronic device performs layer drawing, rendering, and composition in response to the first vertical synchronization signal, rather than waiting for the third vertical synchronization signal to arrive before composing the rendered layers.
  • In this way, the electronic device can complete the drawing, rendering, and composition of the layers within one synchronization period (such as the first synchronization period). That is, the method of the embodiments of the present application can shorten the response delay of the electronic device by one synchronization period and improve its fluency (such as hand-following performance).
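The saving of one synchronization period can be illustrated with a small timing sketch. This is an illustrative simulation, not part of the patent text: a 60 Hz display with a 16.67 ms signal period is assumed, and the function names are hypothetical.

```python
# Hypothetical timing sketch: compare the conventional pipeline, where layer
# composition waits for the third vertical synchronization signal, with the
# method here, where drawing, rendering, and composition all fit into the
# synchronization period that follows the first vertical synchronization signal.

PERIOD_MS = 16.67  # signal period of the second vertical synchronization signal (60 Hz)

def conventional_latency():
    """One period to draw and render, one period waiting to compose on the
    third vertical synchronization signal, then display at the next refresh."""
    return 3 * PERIOD_MS

def accelerated_latency():
    """Draw, render, and compose all complete within one synchronization
    period, then display at the next refresh."""
    return 2 * PERIOD_MS

saving = conventional_latency() - accelerated_latency()
print(round(saving, 2))  # exactly one synchronization period is saved
```

Under these assumptions the accelerated path shortens the response delay by one full synchronization period, matching the claim above.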
  • In the second case, the electronic device cannot complete layer drawing, rendering, and composition within one synchronization period; that is, the time required for these steps is longer than one synchronization period.
  • In that case, frame dropping may occur while image frames are being refreshed and displayed.
  • Through the method of the embodiments of the present application, frame dropping in the displayed image can be avoided, preventing the display screen from showing a repeated frame. In other words, the method can ensure the smoothness of the image displayed on the screen and thereby improve the user's visual experience.
  • In addition, the likelihood that the electronic device completes layer drawing, rendering, and composition within one synchronization period can be increased.
  • Specifically, when the electronic device detects a user operation, or when a user interface (UI) event occurs on the electronic device, the electronic device can, in response to the first vertical synchronization signal, draw one or more first layers and render the one or more first layers.
  • The user operation or UI event can trigger an update of the interface of the electronic device.
  • In a possible design, the method of the embodiments of the present application may further include: the electronic device performs forward scheduling of its hardware resources, so as to shorten the time it requires for layer drawing, layer rendering, and/or layer composition.
  • Specifically, the electronic device may execute one or more of the following hardware resource scheduling operations to shorten the time required for layer drawing, rendering, and/or composition.
  • Forward scheduling may include: increasing the operating frequency of the processor of the electronic device, selecting a large-core processor to execute the method, and increasing the operating frequency of the memory of the electronic device.
  • the processor may include a central processing unit (CPU) and/or a graphics processing unit (GPU).
  • the calculation speed of large-core processors is faster than that of small-core processors.
  • Forward scheduling of hardware resources can shorten the time the electronic device needs for layer drawing, rendering, and/or composition, and can increase the likelihood that the electronic device completes all three within one synchronization period.
  • If the electronic device can complete layer drawing, rendering, and composition within one synchronization period, its response delay can be shortened by one synchronization period and its fluency (such as hand-following performance) improved.
  • In a possible design, the electronic device can forward-schedule its hardware resources according to the first processing frame length, so as to shorten the time required for layer drawing, rendering, and/or composition. For example, taking increasing the operating frequency of the processor as an example: the larger the first processing frame length, the higher the operating frequency to which the processor is adjusted.
  • In a possible design, the electronic device can determine the number of times, or the probability, that it completed layer drawing, rendering, and composition within one synchronization period during the first statistical period, and forward-schedule its hardware resources accordingly to shorten the time required for layer drawing, rendering, and/or composition.
  • The fewer the times, or the smaller the probability, that the electronic device completed layer drawing, rendering, and composition within one synchronization period during the first statistical period, the higher the operating frequency to which the processor is adjusted.
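The probability-driven scheduling described above can be sketched as follows. This is an illustrative sketch only: the frequency levels, thresholds, and function names are hypothetical, not taken from the patent.

```python
# Hypothetical sketch: forward scheduling driven by how often the device
# finished draw + render + composition within one synchronization period
# during the previous statistical period.

PERIOD_MS = 16.67  # one synchronization period (60 Hz assumed)

def completion_ratio(frame_times_ms):
    """Fraction of frames whose total processing time fit in one period."""
    done = sum(1 for t in frame_times_ms if t <= PERIOD_MS)
    return done / len(frame_times_ms)

def pick_cpu_level(frame_times_ms, levels_mhz=(1200, 1800, 2400)):
    """The smaller the completion probability, the higher the chosen frequency."""
    r = completion_ratio(frame_times_ms)
    if r >= 0.9:
        return levels_mhz[0]   # almost always on time: lowest level
    if r >= 0.5:
        return levels_mhz[1]
    return levels_mhz[2]       # rarely on time: highest level

print(pick_cpu_level([12.0, 15.1, 14.3, 16.0]))  # all fit in one period -> 1200
print(pick_cpu_level([18.0, 21.5, 17.2, 19.9]))  # none fit -> 2400
```

The two thresholds here are arbitrary; the point is only the inverse relation between completion probability and the chosen operating frequency.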
  • In a possible design, the electronic device can forward-schedule its hardware resources according to the foreground application, so as to shorten the time required for layer drawing, rendering, and/or composition.
  • The foreground application is the application corresponding to the interface currently displayed on the screen.
  • A forward-scheduling method or strategy can be set for each application. For example, taking increasing the operating frequency of the processor as an example: the longer layer drawing, rendering, and composition take while the electronic device runs a given foreground application, the higher the operating frequency to which the processor is adjusted.
  • In a possible design, when the first processing frame length of the first statistical period is less than or equal to the preset single-frame length, the electronic device may, in response to the first vertical synchronization signal, draw one or more first layers and render them, and, after the one or more first layers are rendered, perform layer composition on the rendered layers to obtain the first image frame.
  • the foregoing first processing frame length is the sum of the first rendering frame length and the first SF frame length.
  • the first rendering frame length is the length of time required for layer drawing and rendering of the drawn layer.
  • the first SF frame length is the length of time required to perform layer composition on the rendered layer.
  • A first processing frame length within the preset single-frame length indicates that, during the first statistical period, the electronic device was able to complete layer drawing, rendering, and composition within one synchronization period. Then, in the statistical period following the first statistical period (that is, the statistical period containing the current moment), the electronic device is more likely to complete layer drawing, rendering, and composition within one synchronization period.
  • the foregoing preset single frame frame length is less than or equal to the signal period of the second vertical synchronization signal.
  • In another possible design, if the first processing frame length of the first statistical period (that is, the statistical period preceding the current moment) is greater than the preset single-frame length, it means that during that period the electronic device could not complete layer drawing, rendering, and composition within one synchronization period. Then, in the statistical period following the first statistical period (that is, the one containing the current moment), the electronic device is less likely to complete them within one synchronization period. In this case, the electronic device can draw and render one or more first layers in response to the first vertical synchronization signal, and, in response to the third vertical synchronization signal, perform layer composition on the rendered first layers to obtain the first image frame.
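The threshold test described above can be sketched as follows. This is illustrative only; the constant and function names are hypothetical, and the preset single-frame length is assumed here to equal one 60 Hz signal period.

```python
# Hypothetical sketch of the path selection: if the first processing frame
# length of the previous statistical period fit within the preset single-frame
# length, compose right after rendering (accelerated path); otherwise fall
# back to composing on the third vertical synchronization signal.

PRESET_SINGLE_FRAME_MS = 16.67  # assumed equal to the signal period here

def choose_path(first_processing_frame_len_ms):
    if first_processing_frame_len_ms <= PRESET_SINGLE_FRAME_MS:
        return "compose_after_render"   # draw, render, compose in one period
    return "compose_on_vsync3"          # legacy pipeline

print(choose_path(14.2))  # -> compose_after_render
print(choose_path(18.9))  # -> compose_on_vsync3
```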
  • When the first processing frame length is less than or equal to the preset single-frame length, the electronic device can, in response to the first vertical synchronization signal, draw one or more first layers and render them, and, after rendering is completed, perform layer composition on the rendered layers to obtain the first image frame.
  • Compared with waiting for the third vertical synchronization signal, the electronic device performs layer composition earlier, which increases the likelihood that it completes layer drawing, rendering, and composition within one synchronization period.
  • the method of the embodiment of the present application may further include: the electronic device obtains one or more second processing frame lengths of the first statistical period; according to one or more second Processing frame length, determine the first processing frame length.
  • each second processing frame length is the sum of the second rendering frame length and the second SF frame length.
  • the second rendering frame length is the length of time required for layer drawing and rendering of the drawn layer.
  • the second SF frame length is the length of time required for layer composition of the rendered layer.
  • In a possible design, the first processing frame length is the largest of the multiple second processing frame lengths; alternatively, the first processing frame length is the average of the multiple second processing frame lengths.
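The two aggregation options above can be sketched directly. The function names are illustrative; each second processing frame length is the rendering time plus the SF (composition) time, as defined in the surrounding text.

```python
# Sketch: derive the first processing frame length from the second processing
# frame lengths collected in the first statistical period, using either the
# maximum or the average, as the design above allows.

def second_processing_frame_lengths(render_ms, sf_ms):
    """Each sample is second rendering frame length + second SF frame length."""
    return [r + s for r, s in zip(render_ms, sf_ms)]

def first_processing_frame_length(samples_ms, mode="max"):
    if mode == "max":
        return max(samples_ms)
    return sum(samples_ms) / len(samples_ms)  # mode == "avg"

samples = second_processing_frame_lengths([10.0, 12.0, 9.0], [4.0, 5.0, 3.0])
print(first_processing_frame_length(samples, "max"))            # 17.0
print(round(first_processing_frame_length(samples, "avg"), 2))  # 14.33
```

Using the maximum is the conservative choice (a single slow frame forces the legacy path); the average tolerates outliers.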
  • In a possible design, performing layer composition on the rendered first layers to obtain the first image frame may include: in the first synchronization period, in response to the first vertical synchronization signal, the electronic device draws one or more first layers, renders them, and, after rendering is completed, performs layer composition on the rendered layers.
  • The first synchronization period is the synchronization period corresponding to the first vertical synchronization signal. That is, in the embodiments of the present application, the electronic device may begin layer composition within the same synchronization period (the first synchronization period) used for layer drawing and rendering.
  • In a possible design, the method of the embodiments of the present application may further include: if the first processing frame length is greater than the preset single-frame length, the electronic device forward-schedules its hardware resources to shorten the time required for layer drawing, rendering, and/or composition.
  • For the specific forward-scheduling method and its technical effects, refer to the descriptions in the possible designs above; they are not repeated here.
  • the above-mentioned one or more layers may include: layers drawn by the electronic device performing drawing tasks corresponding to one or more applications.
  • the one or more applications may include: at least one of one or more system-level applications and one or more user-level applications.
  • the system-level applications may include: status bar, launcher, navigation bar, wallpaper, etc.
  • the above-mentioned user-level applications may include: system applications of electronic devices such as "settings", “phone” and “short message”, and third-party applications that the electronic device can download from the application store in response to user operations.
  • third-party applications may include applications such as "WeChat", “Alipay”, and "Baidu Maps".
  • In a possible design, drawing and rendering one or more first layers in response to the first vertical synchronization signal and composing them into the first image frame may specifically include: in response to the first vertical synchronization signal, the electronic device draws and renders one or more first layers for each of the one or more applications; the electronic device then performs layer composition on the rendered first layers of the one or more applications to obtain the first image frame. That is, in response to the first vertical synchronization signal, the electronic device can perform layer drawing and rendering separately for each application, and then compose the rendered first layers of all of the applications into the first image frame.
  • In a possible design, obtaining the first image frame may specifically include: after the one or more first layers of a focus application, a key application, or an application strongly related to the fluency of the electronic device (among the one or more applications) have been rendered, performing layer composition on the rendered first layers of the one or more applications to obtain the first image frame.
  • That is, once the electronic device completes layer rendering for the focus application, it can begin layer composition on the first layers that have already been rendered to obtain the first image frame, even if layer rendering for other applications has not yet finished.
  • Likewise, once the focus layer, a key layer, or a layer strongly related to the fluency of the electronic device among the one or more first layers has been rendered, the electronic device can perform composition on the first layers already rendered for the one or more applications to obtain the first image frame.
  • In this case, the electronic device can determine the first processing frame length based on the first rendering frame length corresponding to the focus application among the one or more applications and on the first SF frame length corresponding to the one or more applications.
  • Alternatively, the electronic device can determine the first processing frame length based on the largest of the first rendering frame lengths corresponding to the one or more applications and on the first SF frame length corresponding to the one or more applications.
  • In a possible design, after the electronic device has forward-scheduled its hardware resources, when the screen refresh rate of the electronic device is greater than a preset refresh-rate threshold, the electronic device can negatively schedule its hardware resources to reduce power consumption. In this way, the electronic device can avoid frame dropping in the displayed image in the second case described above while keeping power consumption low, preventing the display screen from showing a repeated frame.
  • In another possible design, after forward scheduling, when the screen refresh rate is greater than the preset refresh-rate threshold, the electronic device performs this negative scheduling only if the first processing frame length is greater than a preset double-frame length.
  • the electronic device may perform one or more of the following negative scheduling to reduce the power consumption of the electronic device.
  • the above-mentioned negative scheduling includes: reducing the operating frequency of the processor of the electronic device, selecting a processor with a small core to execute the method, and reducing the operating frequency of the memory of the electronic device.
  • The preset double-frame length is less than or equal to K times the signal period of the second vertical synchronization signal, where K ≥ 2.
  • the method for the electronic device to increase the operating frequency of the processor may include: the electronic device increases the operating frequency of the processor according to a first preset step; or, The electronic device increases the operating frequency of the processor according to the difference between the first processing frame length and the preset single frame frame length.
  • the adjustment range of the operating frequency of the processor is proportional to the magnitude of the difference.
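The two ramp-up rules above can be sketched as follows. The step size and proportionality constant are illustrative assumptions, not values from the patent.

```python
# Sketch: raise the processor operating frequency either by a fixed first
# preset step, or in proportion to how far the first processing frame length
# overshoots the preset single-frame length (larger gap, larger boost).

PRESET_SINGLE_FRAME_MS = 16.67
FIRST_STEP_MHZ = 200       # hypothetical first preset step
GAIN_MHZ_PER_MS = 100      # hypothetical proportionality constant

def raise_freq_fixed(freq_mhz):
    return freq_mhz + FIRST_STEP_MHZ

def raise_freq_proportional(freq_mhz, first_processing_frame_len_ms):
    overshoot = max(0.0, first_processing_frame_len_ms - PRESET_SINGLE_FRAME_MS)
    return freq_mhz + GAIN_MHZ_PER_MS * overshoot

print(raise_freq_fixed(1800))                             # 2000
print(round(raise_freq_proportional(1800, 20.67)))        # 2200
```

The proportional variant realizes the statement that the frequency adjustment range grows with the difference between the first processing frame length and the preset single-frame length.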
  • the method of the embodiment of the present application may further include: if the first processing frame length satisfies a preset condition, reducing the operating frequency of the processor.
  • the first processing frame length satisfies the preset condition, specifically including: the first processing frame length is less than the preset single frame frame length; or, the first processing frame length is less than the preset single frame frame length, and the preset single frame frame length The difference with the first processing frame length is greater than the first preset duration.
  • If the first processing frame length is less than the preset single-frame length, the electronic device is likely able to complete layer drawing, rendering, and composition within one synchronization period. This may be because a high processor operating frequency speeds up computation enough for the work to fit in one synchronization period; however, an unnecessarily high operating frequency also raises the power consumption of the electronic device. Therefore, the electronic device can lower the operating frequency of the processor.
  • In a possible design, the method of the embodiments of the present application may further include: if the first processing frame length satisfies the preset condition for N consecutive statistical periods, lowering the operating frequency of the processor, where N ≥ 2 and N is a positive integer. This not only avoids a ping-pong effect when adjusting the processor's operating frequency, but also achieves fast ramp-up and slow ramp-down of the frequency.
  • In this way, the touch response delay of the electronic device can be shortened and its fluency (such as hand-following performance) improved, while keeping the electronic device's layer drawing, rendering, and composition pipeline stable.
  • the method for the electronic device to reduce the operating frequency of the processor may include: the electronic device reduces the operating frequency of the processor according to a second preset step.
  • the second preset step may be equal to the first preset step.
  • the second preset step may also be smaller than the first preset step.
  • In this way, the electronic device adjusts the operating frequency of the processor in a fast-up, slow-down manner, which helps the electronic device execute the method of the embodiments of the present application, shortens its touch response delay, and improves its fluency (such as hand-following performance).
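The fast-up, slow-down policy can be sketched as a small loop over statistical periods. All constants are illustrative assumptions: the frequency rises by a first preset step as soon as one period overruns, but falls by a smaller second step only after N consecutive periods satisfy the lowering condition.

```python
# Sketch of the fast-up / slow-down frequency policy described above.

PRESET_SINGLE_FRAME_MS = 16.67
UP_STEP = 200      # hypothetical first preset step (MHz)
DOWN_STEP = 50     # hypothetical second preset step, smaller for slow fall
N = 3              # consecutive qualifying periods required before lowering

def run_policy(freq_mhz, frame_lens_ms):
    good_streak = 0
    for fl in frame_lens_ms:
        if fl > PRESET_SINGLE_FRAME_MS:      # overran: raise immediately
            freq_mhz += UP_STEP
            good_streak = 0
        else:
            good_streak += 1
            if good_streak >= N:             # N good periods in a row: lower
                freq_mhz -= DOWN_STEP
                good_streak = 0
    return freq_mhz

# one overrun raises by 200; three good periods in a row lower by only 50
print(run_policy(1800, [18.0, 14.0, 15.0, 13.0]))  # 1950
```

Requiring N consecutive qualifying periods before lowering is what prevents the frequency from oscillating (the ping-pong effect) around the threshold.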
  • In a possible design, the method of the embodiments of the present application may further include: during a statistical period, while one or more third layers are being drawn and rendered, if the time spent at a first feature point is greater than a second preset duration corresponding to that feature point, adjusting the operating frequency of the processor to the maximum operating frequency of the processor.
  • The first feature point includes at least any one of the following: drawing the one or more third layers; rendering the one or more third layers; executing any function during the drawing of the one or more third layers; executing any function during the rendering of the one or more third layers.
  • one or more third layers are the layers that the electronic device is drawing or rendering in the statistical period.
  • the electronic device can instantaneously increase the frequency of the processor and adjust the operating frequency of the processor to the maximum operating frequency of the processor. After the processor instantaneously increases the frequency, the computing speed of the processor can be increased, and the time required for layer drawing, rendering, and composition of the electronic device can be shortened.
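The instantaneous boost can be sketched as a simple watchdog check. The maximum frequency and durations are illustrative assumptions.

```python
# Sketch: if the time spent at a monitored feature point (e.g. drawing or
# rendering a third layer, or a function called during those stages) exceeds
# its second preset duration, jump straight to the processor's maximum
# operating frequency rather than stepping up gradually.

MAX_FREQ_MHZ = 2800  # hypothetical maximum operating frequency

def check_feature_point(freq_mhz, elapsed_ms, preset_duration_ms):
    if elapsed_ms > preset_duration_ms:
        return MAX_FREQ_MHZ        # instantaneous frequency boost
    return freq_mhz                # on schedule: keep current frequency

print(check_feature_point(1800, 9.5, 8.0))  # overran -> 2800
print(check_feature_point(1800, 6.0, 8.0))  # on time -> 1800
```

Unlike the per-period policies above, this check fires mid-frame, which is why the text calls it an instantaneous frequency increase.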
  • In a possible design, the method of the embodiments of the present application may further include: if the third processing frame length is greater than the preset single-frame duration, then, in response to the third vertical synchronization signal, performing layer composition on the rendered layers to obtain an image frame.
  • the third processing frame length is the sum of the third rendering frame length and the third SF frame length.
  • the third rendering frame length is the length of time required to draw and render one or more third layers.
  • the third SF frame length is the length of time required for layer synthesis of one or more third layers to be rendered.
  • the foregoing preset single frame duration is the difference between the synchronization period and the preset delay threshold.
  • the preset delay threshold is greater than or equal to zero.
  • the above electronic device responds to the first vertical synchronization signal to draw one or more first layers, and render one or more first layers, and After the rendering of one or more first layers is completed, before performing layer synthesis on the rendered one or more first layers to obtain the first image frame, the method of the embodiment of the present application may further include: responding to the first image frame. An event starts the accelerated rendering mode.
  • the electronic device can not only draw one or more first layers and render the one or more first layers, but can also perform layer composition on the rendered one or more first layers to obtain the first image frame.
  • the foregoing first event may include: receiving a first operation of the user; and/or, the first processing frame length of the first statistical period being less than or equal to the preset single-frame duration, where the first statistical period is the statistical period immediately before the current moment.
  • the electronic device can exit the aforementioned accelerated rendering mode in response to the second event.
  • the second event may include: receiving a second operation of the user; and/or the first processing frame length of the first statistical period being greater than the preset single-frame duration.
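The enter/exit conditions for the accelerated rendering mode can be expressed as simple decision logic. This is an illustrative sketch; the parameter names are assumptions, and the "operation received" flags stand in for the first/second user operations described above:

```python
def should_enter_accelerated_rendering(first_operation_received,
                                       first_processing_frame_ms,
                                       preset_single_frame_ms):
    """First event: a first user operation was received, and/or the first
    processing frame length of the previous statistical period is less than
    or equal to the preset single-frame duration."""
    return first_operation_received or (
        first_processing_frame_ms <= preset_single_frame_ms)

def should_exit_accelerated_rendering(second_operation_received,
                                      first_processing_frame_ms,
                                      preset_single_frame_ms):
    """Second event: a second user operation was received, and/or the first
    processing frame length of the previous statistical period exceeds the
    preset single-frame duration."""
    return second_operation_received or (
        first_processing_frame_ms > preset_single_frame_ms)
```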
  • the method of the embodiment of the present application may further include: in response to the first vertical synchronization signal, drawing one or more second layers and rendering the one or more second layers; in response to the third vertical synchronization signal, performing layer composition on the rendered one or more second layers to obtain a second image frame; and in response to the second vertical synchronization signal, refreshing and displaying the second image frame.
  • the present application provides an electronic device that includes a touch screen, a memory, and one or more processors; the touch screen, the memory, and the processor are coupled; the memory is used to store computer program code
  • the computer program code includes computer instructions, and when the processor executes the computer instructions, the electronic device executes the method described in the first aspect and any of its possible design manners.
  • the present application provides a chip system that is applied to an electronic device including a touch screen; the chip system includes one or more interface circuits and one or more processors; the interface circuit and the processor are connected; the interface circuit is used to receive a signal from the memory of the electronic device and send the signal to the processor, the signal including the computer instructions stored in the memory; when the processor executes the computer instructions, the electronic device executes the method described in the first aspect and any of its possible designs.
  • the present application provides a computer storage medium that includes computer instructions which, when run on an electronic device, cause the electronic device to execute the method described in the first aspect and any of its possible designs.
  • this application provides a computer program product which, when run on a computer, causes the computer to execute the method described in the first aspect and any of its possible designs.
  • FIG. 1A is a schematic diagram of a software processing flow in which an electronic device displays an image in response to a touch operation according to an embodiment of this application;
  • FIG. 1B is a schematic diagram of delays in the software processing flow shown in FIG. 1A;
  • FIG. 2 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of this application;
  • FIG. 3 is a schematic diagram of an image processing flow provided by an embodiment of this application;
  • FIG. 4 is a flowchart of an image processing method based on a vertical synchronization signal provided by an embodiment of this application;
  • FIG. 5 is a schematic diagram of a vertical synchronization signal used to trigger layer drawing, a vertical synchronization signal used to trigger layer rendering, and a vertical synchronization signal used to trigger layer composition according to an embodiment of this application;
  • FIG. 6A is a schematic diagram of the principle of an image processing method based on a vertical synchronization signal according to an embodiment of this application;
  • FIG. 6B is a schematic diagram of the principle of another image processing method based on a vertical synchronization signal according to an embodiment of this application;
  • FIG. 6C is a schematic diagram of the principle of another image processing method based on a vertical synchronization signal according to an embodiment of this application;
  • FIG. 6D is a schematic diagram of the principle of another image processing method based on a vertical synchronization signal according to an embodiment of this application;
  • FIG. 7 is a schematic diagram of the principle of another image processing method based on a vertical synchronization signal according to an embodiment of this application;
  • FIG. 8A is a schematic diagram of the principle of another image processing method based on a vertical synchronization signal according to an embodiment of this application;
  • FIG. 8B is a flowchart of another image processing method based on a vertical synchronization signal according to an embodiment of this application;
  • FIG. 9 is a flowchart of another image processing method based on a vertical synchronization signal according to an embodiment of this application;
  • FIG. 10 is a schematic diagram of a display interface provided by an embodiment of this application;
  • FIG. 11 is a flowchart of another image processing method based on a vertical synchronization signal according to an embodiment of this application;
  • FIG. 12 is a flowchart of another image processing method based on a vertical synchronization signal according to an embodiment of this application;
  • FIG. 13 is a flowchart of another image processing method based on a vertical synchronization signal according to an embodiment of this application;
  • FIG. 14 is a schematic structural composition diagram of a phase adjustment apparatus for a vertical synchronization signal provided by an embodiment of this application;
  • FIG. 15 is a schematic diagram of a software processing flow in which another electronic device displays an image in response to a touch operation according to an embodiment of this application;
  • FIG. 16 is a schematic diagram of test results of a test scenario provided by an embodiment of this application;
  • FIG. 17 is a schematic structural diagram of a chip system provided by an embodiment of this application.
  • first and second are only used for descriptive purposes, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of indicated technical features. Thus, the features defined with “first” and “second” may explicitly or implicitly include one or more of these features. In the description of this embodiment, unless otherwise specified, “plurality” means two or more.
  • the embodiment of the present application provides an image processing method based on a vertical synchronization signal, and the method can be applied to an electronic device including a touch screen. Specifically, the method can be applied to the electronic device in the process of displaying images on the touch screen in response to the user's touch operation on the touch screen.
  • the delay time from "the user inputs a user operation to the electronic device" to "the electronic device displays the image corresponding to the user operation" may be referred to as the response delay of the electronic device.
  • the fluency of the electronic device (such as hand-following performance) can be reflected in the length of the response delay.
  • if the above-mentioned user operation is a touch operation, the above-mentioned fluency may be hand-following performance, and the above-mentioned response delay may be referred to as a touch response delay.
  • the touch response delay is the delay time from "the user's finger inputs a touch operation on the touch screen" to "the touch screen displays an image corresponding to the touch operation".
  • the longer the response delay of the electronic device, the worse the fluency (such as hand-following performance) of the electronic device; the shorter the response delay of the electronic device, the better the fluency (such as hand-following performance) of the electronic device.
  • the better the fluency (such as hand-following performance) of the electronic device, the better and smoother the user experience of controlling the electronic device through user operations (such as touch operations).
  • FIG. 1A, which takes the above-mentioned user operation as a touch operation as an example, shows a schematic diagram of the software processing flow of the electronic device during the process from "the user's finger inputs a touch operation on the touch screen" to "the touch screen displays an image corresponding to the touch operation".
  • the electronic device may include: a touch panel (TP)/TP driver (Driver) 10, an Input framework (i.e., Input Framework) 20, a UI framework (i.e., UI Framework) 30, a Display framework (i.e., Display Framework) 40, and a hardware display module 50.
  • the software processing flow of the electronic device may include the following steps (1) to (5).
  • Step (1) After the TP in the TP IC/TP driver 10 collects the touch operation of the user's finger on the TP of the electronic device, the TP driver reports the corresponding touch event to the Event Hub.
  • Step (2) The Input Reader thread of the Input framework 20 can read the touch event from the Event Hub and then send it to the Input Dispatcher thread; the Input Dispatcher thread uploads the touch event to the UI thread (such as DoFrame) in the UI framework 30.
  • Step (3) The UI thread in the UI frame 30 draws one or more layers corresponding to the touch event; the rendering thread (such as DrawFrame) performs layer rendering on one or more layers.
  • Step (4) The composition thread in the Display frame 40 performs layer composition on the drawn one or more layers (that is, the rendered one or more layers) to obtain an image frame.
  • Step (5) The liquid crystal display (LCD) driver of the hardware display module 50 receives the synthesized image frame, and the LCD displays the synthesized image frame. After the LCD displays the image frame, the image displayed by the LCD can be perceived by the human eye.
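The five steps above form a single pass through the pipeline, from touch collection to display. The sketch below is a highly simplified illustration in Python; the function and the string formats are hypothetical stand-ins, not actual driver or framework code:

```python
# Simplified model of the five-stage flow: TP driver -> Input framework ->
# UI framework (draw + render) -> Display framework (compose) -> hardware display.
def process_touch(touch_event):
    event = f"EventHub[{touch_event}]"        # (1) TP driver reports to the Event Hub
    dispatched = f"InputDispatcher({event})"  # (2) InputReader -> InputDispatcher -> UI thread
    layers = [f"layer({dispatched})"]         # (3) UI thread draws; render thread renders
    frame = "+".join(layers)                  # (4) composition thread composes an image frame
    return f"LCD<{frame}>"                    # (5) LCD driver receives and displays the frame
```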
  • the embodiment of this application analyzes the processing flow of the electronic device from "the user's finger inputs a touch operation on the touch screen" to "the image corresponding to the touch operation displayed on the touch screen is perceived by the human eye", and briefly describes the principle by which the electronic device shortens the response delay.
  • in step (1), when the TP IC/TP driver 10 collects the touch operation and reports the touch event to the Input framework 20, there may be a kernel delay as shown in FIG. 1B.
  • in step (2), when the Input framework 20 processes the touch event and inputs the touch event to the UI framework, there may be an input delay as shown in FIG. 1B.
  • in step (3), when the UI thread in the UI framework draws one or more layers corresponding to the touch event, there may be a drawing delay (also called a UI thread delay) as shown in FIG. 1B; and when the rendering thread renders the drawn layers, there may be a rendering delay as shown in FIG. 1B.
  • in step (4), when the composition thread in the Display framework 40 performs layer composition, there may be a composition delay as shown in FIG. 1B.
  • step (5) during the process of displaying the synthesized image frame by the hardware display module 50, there may be a display delay as shown in FIG. 1B.
  • An image processing method based on a vertical synchronization signal provided by an embodiment of the present application can shorten the "drawing delay", "rendering delay", and "composition delay" shown in FIG. 1B, so as to shorten the response delay of the electronic device and improve the fluency (such as hand-following performance) of the electronic device.
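To a first approximation, the total touch response delay is the sum of the per-stage delays in FIG. 1B, so shortening the drawing, rendering, and composition delays shortens the total. The numbers below are purely hypothetical, and the one-period saving illustrates the general idea of composing in the same synchronization period instead of waiting for the next signal:

```python
def response_delay_ms(kernel, inputd, drawing, rendering, composing, display):
    """Total response delay ~= sum of the per-stage delays of FIG. 1B."""
    return kernel + inputd + drawing + rendering + composing + display

# Hypothetical per-stage delays (ms): kernel, input, drawing, rendering,
# composition, display.
baseline = response_delay_ms(2, 2, 8, 6, 5, 17)

# If composition no longer waits one synchronization period (~16.67 ms at
# 60 Hz), the end-to-end response delay drops by up to that period.
improved = baseline - 16.67
```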
  • the electronic devices in the embodiments of the present application may be mobile phones, tablet computers, desktop computers, laptop computers, handheld computers, notebook computers, ultra-mobile personal computers (UMPC), netbooks, and cellular phones.
  • FIG. 2 is a schematic structural diagram of an electronic device 200 according to an embodiment of this application.
  • the electronic device 200 may include a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, buttons 290, a motor 291, an indicator 292, a camera 293, a display screen 294, a subscriber identification module (SIM) card interface 295, etc.
  • the sensor module 280 may include a pressure sensor 280A, a gyroscope sensor 280B, an air pressure sensor 280C, a magnetic sensor 280D, an acceleration sensor 280E, a distance sensor 280F, a proximity light sensor 280G, a fingerprint sensor 280H, a temperature sensor 280J, a touch sensor 280K, and the environment Light sensor 280L, and bone conduction sensor 280M, etc.
  • the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 200.
  • the electronic device 200 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 210 may include one or more processing units.
  • the processor 210 may include an application processor (AP), a modem processor, a GPU, an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 200.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 210 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 210 is a cache memory.
  • the memory can store instructions or data that have just been used or recycled by the processor 210. If the processor 210 needs to use the instruction or data again, it can be directly called from the memory. Repeated access is avoided, the waiting time of the processor 210 is reduced, and the efficiency of the system is improved.
  • the processor 210 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the interface connection relationship between the modules illustrated in this embodiment is merely a schematic description, and does not constitute a structural limitation of the electronic device 200.
  • the electronic device 200 may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the charging management module 240 is used to receive charging input from the charger. While the charging management module 240 charges the battery 242, it can also supply power to the electronic device through the power management module 241.
  • the power management module 241 is used to connect the battery 242, the charging management module 240 and the processor 210.
  • the power management module 241 receives input from the battery 242 and/or the charge management module 240, and supplies power to the processor 210, the internal memory 221, the external memory, the display screen 294, the camera 293, and the wireless communication module 260.
  • the power management module 241 may also be provided in the processor 210.
  • the power management module 241 and the charging management module 240 may also be provided in the same device.
  • the wireless communication function of the electronic device 200 can be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 200 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the mobile communication module 250 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 200.
  • the mobile communication module 250 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 250 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 250 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation via the antenna 1.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 270A, a receiver 270B, etc.), or displays an image or video through the display screen 294.
  • the wireless communication module 260 can provide wireless communication solutions applied to the electronic device 200, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc.
  • the wireless communication module 260 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 260 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 210.
  • the wireless communication module 260 may also receive the signal to be sent from the processor 210, perform frequency modulation, amplify it, and convert it into electromagnetic wave radiation via the antenna 2.
  • the antenna 1 of the electronic device 200 is coupled with the mobile communication module 250, and the antenna 2 is coupled with the wireless communication module 260, so that the electronic device 200 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • the GNSS may include global positioning system (GPS), global navigation satellite system (GLONASS), Beidou navigation satellite system (BDS), quasi-zenith satellite system (quasi -zenith satellite system, QZSS) and/or satellite-based augmentation systems (SBAS).
  • the electronic device 200 implements a display function through a GPU, a display screen 294, and an application processor.
  • the GPU is a microprocessor for image processing, connected to the display 294 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 294 is used to display images, videos, etc.
  • the display screen 294 includes a display panel.
  • the display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the display screen 294 in the embodiment of the present application may be a touch screen. That is, the touch sensor 280K is integrated in the display screen 294.
  • the touch sensor 280K may also be called a “touch panel”.
  • the display screen 294 may include a display panel and a touch panel, and a touch screen composed of the touch sensor 280K and the display screen 294 is also called a “touch screen”.
  • the touch sensor 280K is used to detect touch operations acting on or near it. A touch operation detected by the touch sensor 280K can be transmitted to the upper layer by a driver of the kernel layer (such as the TP driver) to determine the type of the touch event.
  • the display screen 294 may provide visual output related to the touch operation.
  • the touch sensor 280K may also be disposed on the surface of the electronic device 200, which is different from the position of the display screen 294.
  • the electronic device 200 may implement a shooting function through an ISP, a camera 293, a video codec, a GPU, a display screen 294, and an application processor.
  • the ISP is used to process the data fed back by the camera 293.
  • the camera 293 is used to capture still images or videos.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 200 may support one or more video codecs. In this way, the electronic device 200 can play or record videos in a variety of encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • NPU is a neural-network (NN) computing processor.
  • the NPU can realize applications such as intelligent cognition of the electronic device 200, such as image recognition, face recognition, voice recognition, text understanding, and so on.
  • the external memory interface 220 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 200.
  • the external memory card communicates with the processor 210 through the external memory interface 220 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 221 may be used to store computer executable program code, the executable program code including instructions.
  • the processor 210 executes various functional applications and data processing of the electronic device 200 by running instructions stored in the internal memory 221.
  • the processor 210 may execute instructions stored in the internal memory 221, and the internal memory 221 may include a program storage area and a data storage area.
  • the storage program area can store an operating system, at least one application program (such as a sound playback function, an image playback function, etc.) required by at least one function.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 200.
  • the internal memory 221 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), etc.
  • the electronic device 200 can implement audio functions, such as music playback and recording, through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the earphone interface 270D, and the application processor.
  • the audio module 270 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 270 can also be used to encode and decode audio signals.
  • the speaker 270A, also called a "horn", is used to convert audio electrical signals into sound signals.
  • the receiver 270B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • the microphone 270C, also called a "mike" or "mic", is used to convert sound signals into electrical signals.
  • the earphone interface 270D is used to connect wired earphones.
  • the pressure sensor 280A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 280A may be provided on the display screen 294.
  • the capacitive pressure sensor may include at least two parallel plates with conductive material. When a force is applied to the pressure sensor 280A, the capacitance between the electrodes changes.
  • the electronic device 200 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 200 detects the intensity of the touch operation according to the pressure sensor 280A.
  • the electronic device 200 may also calculate the touched position based on the detection signal of the pressure sensor 280A.
  • touch operations that act on the same touch location but have different touch operation strengths may correspond to different operation instructions.
  • the electronic device 200 may obtain the pressing force of the user's touch operation through the pressure sensor 280A.
  • the button 290 includes a power button, a volume button, and so on.
  • the button 290 may be a mechanical button. It can also be a touch button.
  • the electronic device 200 may receive key input, and generate key signal input related to user settings and function control of the electronic device 200.
  • the motor 291 can generate vibration prompts.
  • the motor 291 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • the indicator 292 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 295 is used to connect to the SIM card. The SIM card can be inserted into the SIM card interface 295 or pulled out from the SIM card interface 295 to achieve contact and separation with the electronic device 200.
  • the electronic device 200 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 295 may support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • Vertical synchronization signal 1: such as VSYNC_APP.
  • the vertical synchronization signal 1 can be used to trigger the drawing of one or more layers.
  • the "vertical synchronization signal 1 can be used to trigger the drawing of one or more layers" in the embodiment of the present application specifically means: the vertical synchronization signal 1 can be used to trigger the drawing of one or more layers and to trigger the rendering of the one or more layers. That is, in the embodiment of the present application, the drawn one or more layers refer to the rendered one or more layers.
  • in response to the vertical synchronization signal 1, the electronic device may draw one or more layers for each application through each of multiple drawing threads.
  • in response to the vertical synchronization signal 1, the electronic device can perform drawing tasks for one or more applications at the same time to draw the one or more layers corresponding to each application.
  • Vertical synchronization signal 2: such as VSYNC_SF.
  • the vertical synchronization signal 2 can be used to trigger layer synthesis of one or more drawn layers to obtain an image frame.
  • Vertical synchronization signal 3: such as HW_VSYNC.
  • the vertical synchronization signal 3 can be used to trigger the hardware to refresh the display image frame.
  • the vertical synchronization signal 1 (such as VSYNC_APP) in the embodiment of the present application is the first vertical synchronization signal described in the claims
  • the vertical synchronization signal 2 (such as VSYNC_SF) is the third vertical synchronization signal described in the claims.
  • the vertical synchronization signal 3 (HW_VSYNC) is the second vertical synchronization signal described in the claims.
  • the name of the vertical synchronization signal may be different in different systems or architectures.
  • the name of the vertical synchronization signal used to trigger the drawing of one or more layers (i.e., vertical synchronization signal 1) may not be VSYNC_APP.
  • regardless of the name of the vertical synchronization signal, as long as it is a synchronization signal with similar functions and conforms to the technical ideas of the method provided in the embodiments of this application, it shall fall within the protection scope of this application.
  • the definition of the vertical synchronization signal may be different.
  • the definition of the vertical synchronization signal 1 can be: the vertical synchronization signal 1 can be used to trigger the rendering of one or more layers;
  • the definition of the vertical synchronization signal 2 can be: the vertical synchronization signal 2 can be used to trigger the generation of an image frame based on one or more layers;
  • the definition of the vertical synchronization signal 3 can be: the vertical synchronization signal 3 can be used to trigger the display of the image frame.
  • the definition of the vertical synchronization signal is not limited. However, regardless of the definition of the vertical synchronization signal, as long as it is a synchronization signal with similar functions and conforms to the technical idea of the method provided in the embodiment of this application, it should be covered by the protection scope of this application.
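As a summary of the three definitions above, the division of labor among the signals can be sketched as follows (a minimal illustration in Python; the enum and mapping names are invented for this sketch and are not part of any actual system):

```python
from enum import Enum

class VsyncSignal(Enum):
    """Illustrative labels for the three vertical synchronization signals."""
    VSYNC_APP = 1   # vertical synchronization signal 1
    VSYNC_SF = 2    # vertical synchronization signal 2
    HW_VSYNC = 3    # vertical synchronization signal 3

# Each signal triggers one stage of the display pipeline described above.
STAGE_TRIGGERED_BY = {
    VsyncSignal.VSYNC_APP: "draw and render one or more layers",
    VsyncSignal.VSYNC_SF: "perform layer composition to obtain an image frame",
    VsyncSignal.HW_VSYNC: "refresh and display the image frame",
}

assert STAGE_TRIGGERED_BY[VsyncSignal.HW_VSYNC] == "refresh and display the image frame"
```

As the text notes, the signal names differ across systems or architectures; only the triggering roles matter.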
  • in response to a user operation (for example, the user's touch operation on the TP; as shown in FIG. 3, the finger touches the TP) or the occurrence of a UI event on the electronic device, the UI framework can call the UI thread at the moment when the vertical synchronization signal 1 arrives to draw one or more layers corresponding to the touch event, and then call the rendering thread to render the one or more layers.
  • the user operation may also be an operation input by the user through a mouse or a button.
  • when the electronic device responds to user operations input by the user through a mouse or keys, the fluency of the electronic device can also be improved by the method in the embodiments of the present application.
  • the hardware composer (HWC) can call the composition thread at the moment when the vertical synchronization signal 2 arrives to perform layer composition on one or more drawn layers (that is, one or more layers after rendering) to obtain an image frame;
  • the hardware display module can refresh and display the above-mentioned image frame on the LCD (i.e., the display screen, such as the above-mentioned display screen 294; the LCD is taken as an example here) when the vertical synchronization signal 3 arrives.
  • the aforementioned UI event may be triggered by the user's touch operation on the TP.
  • the UI event may be automatically triggered by the electronic device.
  • when the foreground application of the electronic device automatically switches the displayed interface, the aforementioned UI event can be triggered.
  • the foreground application is an application corresponding to the interface currently displayed on the display screen of the electronic device.
  • the UI framework is based on the vertical synchronization signal 1 to periodically draw and render layers;
  • the hardware synthesis HWC is based on the vertical synchronization signal 2 to periodically perform layer synthesis;
  • the LCD periodically refreshes the image frame based on the vertical synchronization signal 3.
  • the vertical synchronization signal 3 is a hardware signal triggered by the display drive of the electronic device.
  • the signal period T3 of the vertical synchronization signal 3 (such as HW_VSYNC) is determined according to the screen refresh rate of the display screen of the electronic device.
  • the signal period T3 of the vertical synchronization signal 3 is the reciprocal of the screen refresh rate of the display screen (such as an LCD) of the electronic device.
  • the screen refresh rate described in the embodiment of the present application is the screen refresh rate currently used by the electronic device. That is, the signal period T3 of the vertical synchronization signal 3 is the reciprocal of the screen refresh rate currently used by the electronic device.
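The reciprocal relationship can be illustrated with a small sketch (the function name is invented for illustration):

```python
def signal_period_ms(screen_refresh_rate_hz: float) -> float:
    """T3 is the reciprocal of the screen refresh rate currently in use."""
    return 1000.0 / screen_refresh_rate_hz

# At a 60 Hz refresh rate the period T3 is about 16.67 ms;
# at 120 Hz it is about 8.33 ms.
assert round(signal_period_ms(60), 2) == 16.67
assert round(signal_period_ms(120), 2) == 8.33
```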
  • the vertical synchronization signal 3 in the embodiment of the present application is a periodic discrete signal. For example, as shown in FIG. 5, a vertical synchronization signal 3 triggered by the hardware driver appears every signal period (such as T3).
  • the vertical synchronization signals 3 that appear multiple times in FIG. 5 arrive sequentially according to the signal period T3 of the vertical synchronization signal 3.
  • the vertical synchronization signal 1 and the vertical synchronization signal 2 are generated based on the vertical synchronization signal 3. That is, the vertical synchronization signal 3 may be a signal source of the vertical synchronization signal 1 and the vertical synchronization signal 2; or, the vertical synchronization signal 1 and the vertical synchronization signal 2 are synchronized with the vertical synchronization signal 3. Therefore, the signal period of the vertical synchronization signal 1 and the vertical synchronization signal 2 is the same as the signal period of the vertical synchronization signal 3, and the phases are the same. For example, as shown in FIG. 5, the signal period T1 of the vertical synchronization signal 1, the signal period T2 of the vertical synchronization signal 2 and the signal period T3 of the vertical synchronization signal 3 are the same.
  • the above-mentioned vertical synchronization signal 1 and vertical synchronization signal 2 are also periodic discrete signals.
  • a vertical synchronization signal 1 appears every signal period (such as T1);
  • a vertical synchronization signal 2 appears every signal period (such as T2);
  • the vertical synchronization signals 2 appearing multiple times in FIG. 5 arrive sequentially according to the signal period T2 of the vertical synchronization signal 2. Therefore, the vertical synchronization signal 3, the vertical synchronization signal 1, and the vertical synchronization signal 2 can all be regarded as periodic discrete signals.
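Because signals 1 and 2 are generated from (or synchronized with) signal 3 with the same period and phase, all three share the same arrival times; a brief sketch (the function name is invented for illustration):

```python
def vsync_timestamps(t0_ms: float, period_ms: float, n: int) -> list:
    """Arrival times of a periodic discrete vertical synchronization signal."""
    return [t0_ms + i * period_ms for i in range(n)]

period = 16.67  # e.g. a 60 Hz display
hw_vsync = vsync_timestamps(0.0, period, 4)   # vertical synchronization signal 3
vsync_app = vsync_timestamps(0.0, period, 4)  # signal 1: same period, same phase
vsync_sf = vsync_timestamps(0.0, period, 4)   # signal 2: same period, same phase
assert hw_vsync == vsync_app == vsync_sf
```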
  • the arrival of the vertical synchronization signal (such as the arrival of the vertical synchronization signal 1) described in the embodiment of this application refers to the arrival of the pulse edge of the vertical synchronization signal.
  • the arrival of the vertical synchronization signal 1 at time t1 refers to the arrival of the pulse edge of the vertical synchronization signal 1 at time t1; responding to the vertical synchronization signal 1 at time t1 means responding to the pulse edge of the vertical synchronization signal 1 at time t1.
  • the above-mentioned pulse edge is an edge of a pulse as visually observed on an oscilloscope or observation system. In different systems, it may be a rising edge, a falling edge, or both. In actual systems, it may be realized by means of a timer flip, an interrupt signal, etc.
  • the screen refresh rate of the display screen of the electronic device may be any value such as 60 Hertz (Hz), 70 Hz, 75 Hz, or 80 Hz.
  • the TP is a touch panel, which can be integrated in the above-mentioned display screen 294.
  • TP is also called a touch sensor, such as the above-mentioned touch sensor 280K.
  • the TP can periodically detect the user's touch operation. After the TP detects a touch operation, it can wake up the vertical synchronization signal 1 and the vertical synchronization signal 2, so as to trigger the UI framework to draw and render the layer based on the vertical synchronization signal 1, and the hardware composer HWC to perform layer composition based on the vertical synchronization signal 2.
  • the detection period of the TP detection touch operation is the same as the signal period T3 of the vertical synchronization signal 3 (such as HW_VSYNC).
  • in the conventional method, the drawing thread draws the layer based on the vertical synchronization signal 1, after which the rendering thread performs the layer rendering; the composition thread performs layer composition based on the vertical synchronization signal 2 (such as VSYNC_SF); therefore, the drawing, rendering and compositing of layers by the electronic device need to be completed in two synchronization cycles.
  • Frame1, Frame2, Frame3, and Frame4 shown in FIG. 3 each correspond to one synchronization period.
  • the time required for the electronic device to draw, render, and compose layers may be less than or equal to one synchronization period. That is to say, in the above two synchronization cycles (Frame2 and Frame3 as shown in FIG. 3), the electronic device may occupy only part of the time for drawing, rendering and compositing layers; the rest of the time is spent waiting for the arrival of the vertical synchronization signal 2 and the vertical synchronization signal 3. In this way, the response delay of the electronic device is unnecessarily prolonged, which affects the fluency of the electronic device (such as hand-following performance).
  • in the method of the embodiment of the present application, if the drawing, rendering and composition of the layers of the electronic device meets the single-frame rendering requirement (for example, the required time is less than or equal to one synchronization period), the drawing, rendering and compositing of the layers are performed within one synchronization cycle.
  • the "drawing delay", “rendering delay” and “compositing delay” shown in FIG. 1B can be shortened, so as to shorten the response delay of the electronic device and improve the fluency of the electronic device (such as hand-tracking performance).
  • the possibility that the electronic device completes the drawing, rendering, and composition of the layer in one synchronization cycle can be improved. That is, the gain in shortening the response delay of the electronic device can be up to one synchronization period.
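The saving of up to one synchronization period can be sketched numerically. The following model (the function name and the period-counting logic are simplifications invented for this sketch) counts how many whole synchronization periods elapse between the triggering vertical synchronization signal 1 and the display of the resulting frame, with and without waiting for the vertical synchronization signal 2:

```python
import math

def periods_until_display(draw_render_ms, compose_ms, period_ms, early_compose):
    if early_compose:
        # Method of this embodiment: composition starts as soon as rendering
        # ends, so the frame is ready ceil(total / period) periods later.
        return math.ceil((draw_render_ms + compose_ms) / period_ms)
    # Conventional method: composition waits for the next vertical
    # synchronization signal 2, and the composed frame then waits for the
    # next vertical synchronization signal 3.
    return (math.ceil(draw_render_ms / period_ms)
            + math.ceil(compose_ms / period_ms))

# At 60 Hz (~16.67 ms period), with 8 ms of draw+render and 4 ms of
# composition, the frame is displayed one period earlier.
assert periods_until_display(8, 4, 16.67, early_compose=False) == 2
assert periods_until_display(8, 4, 16.67, early_compose=True) == 1
```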
  • the execution subject of the image processing method based on the vertical synchronization signal provided by the embodiment of the present application may be a device that generates image frames.
  • the apparatus for generating image frames may be any one of the above-mentioned electronic devices (for example, the apparatus may be the electronic device 200 shown in FIG. 2).
  • the apparatus for generating an image frame may also be a CPU of an electronic device, or a control module in the electronic device for executing the image processing method based on the vertical synchronization signal.
  • the image processing method based on the vertical synchronization signal executed by the electronic device is taken as an example to illustrate the image processing method based on the vertical synchronization signal provided in the embodiment of the present application.
  • the first vertical synchronization signal is the above vertical synchronization signal 1 (such as the VSYNC_APP signal)
  • the third vertical synchronization signal is the above vertical synchronization signal 2 (such as the VSYNC_SF signal)
  • the second vertical synchronization signal is the above vertical synchronization signal 3 (such as the HW_VSYNC signal); these correspondences are taken as an example to describe the method of the embodiment of the present application.
  • the embodiment of the present application provides an image processing method based on a vertical synchronization signal.
  • the image processing method based on the vertical synchronization signal may include S401-S402.
  • in response to the vertical synchronization signal 1, the electronic device draws one or more first layers and renders the one or more first layers; after the rendering of the one or more first layers is completed, the electronic device performs layer composition on the one or more first layers to obtain the first image frame.
  • the delay time of the vertical synchronization signal 1 relative to the vertical synchronization signal 3 is zero, and the phase difference is zero.
  • likewise, the delay time of the vertical synchronization signal 2 relative to the vertical synchronization signal 3 is zero, and the phase difference is zero.
  • that the delay times of the vertical synchronization signal 1 and the vertical synchronization signal 2 relative to the vertical synchronization signal 3 are zero specifically means: whenever a vertical synchronization signal 3 arrives, a vertical synchronization signal 1 and a vertical synchronization signal 2 also arrive.
  • for example, at time t1, a vertical synchronization signal 1, a vertical synchronization signal 2 and a vertical synchronization signal 3 arrive at the same time; at time t2, a vertical synchronization signal 1, a vertical synchronization signal 2 and a vertical synchronization signal 3 arrive at the same time; at time t3, a vertical synchronization signal 1, a vertical synchronization signal 2 and a vertical synchronization signal 3 arrive at the same time; at time t4, a vertical synchronization signal 1, a vertical synchronization signal 2 and a vertical synchronization signal 3 arrive at the same time; at time t5, a vertical synchronization signal 1, a vertical synchronization signal 2 and a vertical synchronization signal 3 arrive at the same time.
  • as shown in FIG. 6A, at time t1, a vertical synchronization signal 1 arrives; in response to the vertical synchronization signal 1 at time t1, the electronic device may perform "drawing 1" and "rendering 1"; at time t2, a vertical synchronization signal 1 arrives; in response to the vertical synchronization signal 1 at time t2, the electronic device may perform "drawing 2" and "rendering 2".
  • in the conventional method, even if the electronic device (i.e., the UI thread and the rendering thread of the electronic device) has finished rendering one or more first layers, the electronic device (i.e., the HWC of the electronic device) will not perform layer synthesis on the rendered first layers until the vertical synchronization signal 2 arrives; only when the vertical synchronization signal 2 arrives will the HWC perform layer synthesis on the one or more rendered first layers to obtain an image frame.
  • as shown in (a) in FIG. 6A, even if the electronic device has completed the layer rendering (i.e., "rendering 1") at time t6, the vertical synchronization signal 2 can only arrive at time t2 after time t6; in response to the vertical synchronization signal 2 at time t2, the electronic device (i.e., the HWC of the electronic device) can perform layer synthesis (i.e., perform "image frame synthesis 1") to obtain a first image frame.
  • in other words, the electronic device needs to wait for Δt1 shown in (a) in FIG. 6A before performing "image frame synthesis 1".
  • similarly, as shown in (a) in FIG. 6A, even if the electronic device has completed the layer rendering (i.e., "rendering 2") at time t7, the vertical synchronization signal 2 can only arrive at time t3 after time t7; in response to the vertical synchronization signal 2 at time t3, the electronic device (i.e., the HWC of the electronic device) can perform layer synthesis (i.e., perform "image frame synthesis 2") to obtain a first image frame. In other words, the electronic device needs to wait for Δt2 shown in (a) in FIG. 6A before performing "image frame synthesis 2".
  • the electronic device does not need to wait for the vertical synchronization signal 2.
  • in the method of the embodiment of the present application, after the one or more first layers drawn and rendered in response to the vertical synchronization signal 1 have been rendered, the electronic device can start layer synthesis on the rendered first layers to obtain the first image frame.
  • the electronic device can perform layer synthesis on the rendered first layer in advance.
  • in response to the vertical synchronization signal 1, the electronic device may perform "drawing 1", "rendering 1", and "image frame synthesis 1".
  • as shown in (b) in FIG. 6A, at time t6, "drawing 1" and "rendering 1" have ended.
  • the electronic device may start layer synthesis at time t6, before the arrival of the vertical synchronization signal 2 at time t2, that is, perform "image frame synthesis 1". That is, the electronic device does not need to wait for the arrival of the vertical synchronization signal 2 at time t2 before starting to perform "image frame synthesis 1".
  • likewise, in response to the vertical synchronization signal 1, the electronic device may perform "drawing 2", "rendering 2", and "image frame synthesis 2". As shown in (b) in FIG. 6A, at time t7, "drawing 2" and "rendering 2" have ended.
  • the electronic device may start layer synthesis at time t7, before the arrival of the vertical synchronization signal 2 at time t3, that is, perform "image frame synthesis 2". That is, the electronic device does not need to wait for the arrival of the vertical synchronization signal 2 at time t3 before starting to perform "image frame synthesis 2".
  • if there is one first layer, the electronic device performing layer synthesis on this one first layer specifically includes: the electronic device performs format conversion on this first layer and converts the first layer into the first image frame.
  • if there are multiple first layers, the electronic device performing layer synthesis on the multiple first layers specifically includes: the electronic device performs layer synthesis on the multiple first layers to obtain the first image frame.
  • the electronic device performs layer synthesis on the rendered one or more first layers to obtain the first image frame, which may specifically include: after the rendering thread completes the rendering of the one or more first layers, the composition thread can be called to perform layer composition on the rendered one or more first layers to obtain the first image frame; or, after the rendering thread completes the rendering of the one or more first layers, it sends an instruction message to the composition thread to trigger the composition thread to perform layer composition on the rendered one or more first layers to obtain the first image frame; alternatively, when the composition thread detects that the rendering thread has completed the rendering of the one or more first layers, the composition thread performs layer composition on the rendered one or more first layers to obtain the first image frame.
  • the electronic device may immediately perform layer synthesis on the rendered one or more first layers after rendering the one or more first layers.
  • the electronic device may perform layer composition on the rendered one or more first layers after a certain delay time. In other words, in the actual implementation process, there may be a certain delay between the layer rendering and layer composition performed by the electronic device.
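One way to realize the "render, then compose without waiting" handoff described above is a producer/consumer arrangement between the two threads. The following is only a schematic sketch (the thread names, the queue-based handoff, and the string representations are all invented for illustration; the actual framework threads are not shown):

```python
import threading
import queue

rendered_layers = queue.Queue()  # rendering thread -> composition thread
composed_frames = []

def rendering_thread(layers):
    # Render each layer, then hand it to the composition thread immediately.
    for layer in layers:
        rendered_layers.put(f"rendered({layer})")
    rendered_layers.put(None)  # sentinel: rendering is finished

def composition_thread():
    # Compose as soon as a rendered layer arrives, instead of waiting for
    # the next vertical synchronization signal 2.
    while True:
        item = rendered_layers.get()
        if item is None:
            break
        composed_frames.append(f"frame[{item}]")

r = threading.Thread(target=rendering_thread, args=(["layer 1", "layer 2"],))
c = threading.Thread(target=composition_thread)
r.start(); c.start()
r.join(); c.join()
assert composed_frames == ["frame[rendered(layer 1)]", "frame[rendered(layer 2)]"]
```

The queue also naturally accommodates the "certain delay" variant mentioned above: composition begins whenever the composition thread dequeues the work, which need not be the instant rendering ends.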
  • in response to the vertical synchronization signal 3, the electronic device refreshes and displays the first image frame.
  • at time t2, a vertical synchronization signal 3 arrives; in response to the vertical synchronization signal 3 at time t2, the electronic device can refresh and display the image frame obtained by "image frame synthesis 1" (as the first image frame), that is, perform "image frame display 1"; at time t3, a vertical synchronization signal 3 arrives; in response to the vertical synchronization signal 3 at time t3, the electronic device refreshes and displays the image frame obtained by "image frame synthesis 2", that is, performs "image frame display 2".
  • the embodiment of the present application analyzes and explains, in combination with (b) in FIG. 6A, the principle and effect of shortening the response delay after the electronic device performs S401:
  • the time required for the electronic device to draw, render, and synthesize layers may be less than or equal to one synchronization period.
  • the electronic device executes S401 to complete drawing, rendering, and composition of the layer in one synchronization cycle (such as the first synchronization cycle).
  • for example, the electronic device completes "drawing 1", "rendering 1" and "image frame synthesis 1" during the synchronization period from time t1 to time t2, and completes "drawing 2", "rendering 2" and "image frame synthesis 2" during the synchronization period from time t2 to time t3 shown in (b) in FIG. 6A.
  • compared with the conventional method, the electronic device can refresh and display the first image frame one synchronization period in advance (for example, time t2 is ahead of time t3 by one synchronization period TZ). That is to say, through the method of the embodiment of the present application, the response delay of the electronic device can be shortened by one synchronization period, and the fluency of the electronic device (such as hand-following performance) can be improved.
  • the time required for the electronic device to draw, render, and synthesize the layer, or the time required for the electronic device to draw and render the layer may be greater than one synchronization period.
  • frame loss may occur during the process of refreshing the image frame on the display screen. Specifically, in the process of refreshing and displaying image frames on the display screen, a repeated image may be displayed. In this way, the smoothness of the image displayed on the display screen will be affected, thereby affecting the visual experience of the user.
  • as shown in (a) in FIG. 7, at time t1, a vertical synchronization signal 1 arrives; in response to the vertical synchronization signal 1 at time t1, the electronic device performs "drawing 1" and "rendering 1"; at time t2, a vertical synchronization signal 2 arrives; in response to the vertical synchronization signal 2 at time t2, the electronic device performs "image frame synthesis 1"; at time t3, a vertical synchronization signal 3 arrives; in response to the vertical synchronization signal 3 at time t3, the electronic device executes "image frame display 1".
  • through the method of the embodiment of the present application, it is possible to avoid the phenomenon of frame dropping in the displayed image, so as to prevent the display screen from displaying a repeated frame of image.
  • the method of the embodiment of the present application can ensure the smoothness of the image displayed on the display screen, thereby improving the user's visual experience.
  • in response to the vertical synchronization signal 1, the electronic device may perform "drawing 1", "rendering 1", and "image frame synthesis 1".
  • as shown in (b) in FIG. 7, at time t9, "drawing 1" and "rendering 1" have ended.
  • the electronic device may start layer synthesis at time t9, before the arrival of the vertical synchronization signal 2 at time t2, that is, execute "image frame synthesis 1". That is, the electronic device does not need to wait for the arrival of the vertical synchronization signal 2 at time t2 before starting to perform "image frame synthesis 1".
  • likewise, the electronic device may perform "drawing 2", "rendering 2", and "image frame synthesis 2". As shown in (b) in FIG. 7, at time t10, "drawing 2" and "rendering 2" have ended.
  • the electronic device may start layer composition at time t10, before the arrival of the vertical synchronization signal 2 at time t4, that is, execute "image frame synthesis 2". That is, the electronic device does not need to wait for the arrival of the vertical synchronization signal 2 at time t4 before starting to perform "image frame synthesis 2".
  • similarly, in response to the vertical synchronization signal 1, the electronic device may perform "drawing 3", "rendering 3", and "image frame synthesis 3", as shown in (b) in FIG. 7.
  • the electronic device may start layer synthesis at time t11, before the arrival of the vertical synchronization signal 2 at time t5, that is, execute "image frame synthesis 3". That is, the electronic device does not need to wait for the arrival of the vertical synchronization signal 2 at time t5 before starting to perform "image frame synthesis 3".
  • the above-mentioned one or more layers may include: layers drawn by the electronic device to execute drawing tasks corresponding to one or more applications.
  • the one or more applications may include: at least one of one or more system-level applications and one or more user-level applications.
  • the system-level applications may include: status bar, launcher, navigation bar, wallpaper, etc.
  • the above-mentioned user-level applications may include: system applications of electronic devices such as "settings", "phone” and “short message”, and third-party applications that the electronic device can download from the application store in response to user operations.
  • third-party applications may include applications such as "WeChat", "Alipay", and "Baidu Maps".
  • the drawing tasks described in the embodiments of the present application include "layer drawing” and "layer rendering".
  • S401 may specifically include S401a and S401b.
  • S401a In response to the vertical synchronization signal 1, the electronic device draws one or more first layers for each of the one or more applications, and renders the one or more first layers.
  • S401b The electronic device performs layer synthesis on one or more first layers rendered by the electronic device for one or more applications to obtain a first image frame.
  • the electronic device may uniformly synthesize the layers rendered by the electronic device for one or more applications to obtain an image frame.
  • the electronic device may perform layer synthesis on the layers rendered by the electronic device for each application to obtain multiple image frames.
  • in the embodiment of the present application, the case where the electronic device uniformly synthesizes the layers rendered for one or more applications to obtain one image frame is taken as an example for description.
  • the above one or more applications may include only one application.
  • the electronic device can perform a drawing task corresponding to the one application, drawing (that is, drawing and rendering) one or more layers. That is, the electronic device can execute S401a for this application to draw and render the corresponding one or more layers. Then, the electronic device may execute S401b to perform layer composition on one or more layers rendered for this application.
  • for example, the electronic device can execute S401a to draw and render one or more layers 1 for application 1; then, the electronic device can execute S401b to perform layer synthesis on the one or more layers 1 rendered for application 1 to obtain image frame 1.
  • the electronic device may execute "drawing 1" and "rendering 1" in response to the vertical synchronization signal 1 at time t1 to obtain one or more rendered layers 1; then, the electronic device performs layer synthesis on the rendered one or more layers 1, that is, performs "image frame synthesis 1", to obtain image frame 1; in response to the vertical synchronization signal 3 at time t2, the electronic device may perform "image frame display 1" to refresh and display image frame 1.
  • the above one or more applications may include multiple applications.
  • the electronic device may respectively execute the drawing tasks corresponding to the multiple applications, and draw (that is, draw and render) one or more layers for each of the multiple applications. That is, the electronic device may execute S401a for each of the multiple applications to draw and render the corresponding one or more layers. Then, the electronic device may execute S401b to perform layer synthesis on one or more layers respectively rendered for the multiple applications. That is, in S401b, the rendered one or more layers may include: layers obtained by the electronic device executing drawing tasks corresponding to multiple applications.
  • for example, the electronic device may draw and render one or more layers 1 for application 1, and draw and render one or more layers a for application a; the electronic device may then perform layer synthesis on the one or more layers 1 and the one or more layers a to obtain image frame a.
  • in response to the vertical synchronization signal 1 at time t1, the electronic device can execute "drawing 1" and "rendering 1" for application 1 to obtain one or more layers 1, and execute "drawing a" and "rendering a" for application a to obtain the above one or more layers a; then, the electronic device can perform "image frame synthesis 1+a" on the one or more layers 1 and the one or more layers a to obtain image frame a; in response to the vertical synchronization signal 3 at time t2, the electronic device may perform "image frame display 1+a" to refresh and display image frame a.
  • similarly, in response to the vertical synchronization signal 1, the electronic device can perform "drawing 2" and "rendering 2" for application 1 to obtain one or more layers 2, and perform "drawing b" and "rendering b" for application a to obtain the above one or more layers b; then, the electronic device can perform "image frame synthesis 2+b" on the one or more layers 2 and the one or more layers b to obtain image frame b; in response to the vertical synchronization signal 3 at time t3, the electronic device can execute "image frame display 2+b" to refresh and display image frame b.
  • "image frame synthesis 1+a" and "image frame synthesis 2+b" shown in FIG. 6B each represent one layer synthesis.
  • when the electronic device executes "image frame display 1+a" or "image frame display 2+b", what is refreshed and displayed is also one frame of image.
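The unified synthesis of layers from several applications into one frame ("image frame synthesis 1+a" above) can be sketched as follows (the data representation is invented for illustration):

```python
def compose_unified_frame(layers_by_app):
    """Combine the rendered layers of every application into one image frame."""
    all_layers = [layer
                  for layers in layers_by_app.values()
                  for layer in layers]
    return {"frame": tuple(all_layers)}

# One frame is produced from application 1's layer and application a's layer.
frame_a = compose_unified_frame({
    "application 1": ["layer 1"],
    "application a": ["layer a"],
})
assert frame_a == {"frame": ("layer 1", "layer a")}
```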
  • the method of the embodiment of the present application can be applied not only to a scenario where an electronic device displays an application interface in a single window, but also can be applied to a scenario where an electronic device displays multiple application interfaces in multiple windows.
  • the method of the embodiment of the present application can be applied to a scenario where an electronic device displays multiple windows on a horizontal screen, and can also be applied to a scenario where a folding screen electronic device displays multiple application interfaces in multiple windows.
  • the method for the electronic device to display the interface of the application in a single window or multiple windows will not be repeated here in the embodiment of the present application.
  • for ease of description, the case where the electronic device executes the method of the embodiments of the present application for one application is taken as an example. However, this does not mean that, in the method provided by the embodiment of the present application, the electronic device cannot simultaneously perform layer drawing, rendering, synthesis, and image frame display for multiple applications.
  • the electronic device may perform layer synthesis on the rendered layers to obtain the first image frame when the rendering of the layers of all applications in the above multiple applications ends.
  • for example, in response to the vertical synchronization signal 1 at time t1, the electronic device can execute "drawing 1" and "rendering 1" for application 1 to obtain one or more layers 1, and execute "drawing a" and "rendering a" for application a to obtain one or more layers a. As shown in FIG. 6B, the electronic device completes "rendering a" at time t15 and completes "rendering 1" at time t6. In case (1), even if the electronic device has completed "rendering a" at t15, because the electronic device has not yet completed "rendering 1", the electronic device will not start layer synthesis at t15 (that is, perform "image frame synthesis 1+a").
  • the electronic device has completed both "rendering a" and "rendering 1" at time t6; therefore, the electronic device can then perform "image frame synthesis 1+a" to obtain image frame a.
  • then, the electronic device may perform "image frame display 1+a", that is, refresh and display image frame a.
  • similarly, the electronic device can perform "image frame synthesis 2+b" to obtain image frame b.
  • the electronic device can then execute "image frame display 2+b", that is, refresh and display image frame b.
  • alternatively, the electronic device may perform layer synthesis on the rendered layers to obtain the first image frame when the rendering of the layers of some of the applications among the above multiple applications has finished.
  • for example, the above S401b can be replaced with: when the rendering of the one or more first layers of applications such as the focus application, key applications, or applications strongly related to the fluency of the electronic device among the one or more applications of the electronic device has finished, the electronic device performs layer composition on the one or more first layers that it has rendered for the one or more applications, to obtain the first image frame.
  • the focus application is the application corresponding to the user's touch operation received by the electronic device, that is, the application on which the actual focus is located.
  • the electronic device is completed at time t 15 the application of a "rendering a", at time t 6 to complete the application of a "1 rendering.”
  • the electronic device has not completed “rendering 1" at time t 15 ; however, the electronic device has completed “rendering a" of focus application a at time t 15 ; therefore, the electronic device can be at time t 15 Perform layer composition (that is, perform "image frame composition a").
• the electronic device completes "rendering b" of application a at time t 16 , and completes "rendering 2" of application 1 at time t 6 .
• the electronic device may perform layer synthesis at time t 16 (i.e., perform "image frame synthesis 1+b").
• the electronic device performs "image frame synthesis a" at time t 15 and performs "image frame synthesis 1+b" at time t 16 , which can reflect the interface changes of the focus application a in time and can improve the user's visual experience of the fluency of the electronic device.
• the electronic device may perform layer synthesis on the rendered layers to obtain the first image frame when the rendering of the preset layers of some of the above multiple applications has all finished.
• the electronic device may perform layer synthesis on one or more first layers that have been rendered, when the rendering of the preset layers of the focus application or of the application related to the fluency of the electronic device is finished, to obtain the first image frame.
• the preset layer may be, among the one or more first layers drawn by the electronic device for the focus application or for the application related to the fluency of the electronic device, a layer whose ratio of layer area to display screen area is greater than a preset ratio threshold.
• the preset layer may be, among the one or more first layers drawn by the electronic device for applications such as focus applications, key applications, or applications related to the fluency of the electronic device, the layer corresponding to the user's touch operation received by the electronic device, that is, the layer on which the actual focus is focused.
• the preset layer may be, among the one or more first layers drawn by the electronic device for applications such as focus applications, key applications, or applications related to the fluency of the electronic device, a layer strongly related to the fluency of the electronic device.
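The preset-layer selection rules above can be sketched as a simple filter (an illustrative sketch only; the `Layer` fields, the function name, and the 0.5 default threshold are assumptions, not part of the patent):

```python
from dataclasses import dataclass

@dataclass
class Layer:
    app: str          # owning application
    area: float       # on-screen area of the layer, in pixels
    is_focus: bool    # whether the actual focus is on this layer

def select_preset_layers(layers, screen_area, ratio_threshold=0.5):
    """Keep the first layers whose area ratio to the display screen exceeds
    the preset ratio threshold, or on which the actual focus is focused
    (the layer corresponding to the received touch operation)."""
    return [layer for layer in layers
            if layer.area / screen_area > ratio_threshold or layer.is_focus]
```

Layer synthesis would then be triggered as soon as the rendering of the selected layers finishes, rather than waiting for all layers of all applications.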
• the electronic device may perform layer synthesis at time t 17 (i.e., perform "image frame synthesis a1"), that is, perform layer synthesis on the preset layer obtained by "rendering a1".
• as shown in FIG. 6C, the electronic device completes "rendering b" of application a at time t 16 and completes "rendering 2" of application 1 at time t 6 ; the electronic device has completed "rendering 1", "rendering a2" and "rendering b1" before time t 18 .
  • “rendering b” may include “rendering b1" and “rendering b2".
  • “Rendering b1" is the rendering of the preset layer in "Rendering b".
• the electronic device can perform layer synthesis at t 18 (that is, execute "image frame synthesis 1+a2+b1"), that is, perform layer synthesis on the preset layers obtained by "rendering 1", "rendering a2" and "rendering b1".
  • the interface change of the focus application a is mainly reflected in the change of the aforementioned preset layer.
  • the user pays more attention to the influence of the change of the preset layer of the focus application a on the fluency of the electronic device.
• the electronic device performs "image frame synthesis a1" at time t 17 and performs "image frame synthesis 1+a2+b1" at time t 18 , which can reflect the changes in the preset layers of the focus application a in time and can enhance the user's visual experience of the fluency of the electronic device.
• the criticality of the above-mentioned applications or layers, or their degree of influence on the fluency of the electronic device, can be identified based on statistics (for example, in a laboratory), can be identified and compared through priority information preset by the application (for example, during development), or can be analyzed and predicted based on the user's behavior and focus.
  • the embodiment of the present application does not limit the specific method for the electronic device to determine the criticality of the application and the degree of influence of the application on the fluency of the electronic device.
  • the electronic device can also forwardly schedule the hardware resources of the electronic device, so as to shorten the time required for the electronic device to draw, render, and synthesize layers.
  • the electronic device may perform one or more of the following hardware resource scheduling to shorten the time required for the electronic device to perform layer drawing, layer rendering, and/or layer synthesis.
  • the above forward scheduling may include: increasing the operating frequency of the processor of the electronic device, selecting a large core processor to execute the above method, and increasing the operating frequency of the memory of the electronic device.
  • the processor may include a CPU and/or GPU.
  • the electronic device can forwardly schedule the hardware resources of the electronic device.
• forward scheduling of the hardware resources of the electronic device can shorten the time required for the electronic device to draw, render, and synthesize layers. In this way, the possibility that the electronic device completes the drawing, rendering, and synthesis of the layer in one synchronization cycle can be improved, thereby shortening the response delay of the electronic device.
• the electronic device may perform forward scheduling of the hardware resources of the electronic device according to the above-mentioned first processing frame length, so as to shorten the time required for the electronic device to perform layer drawing, layer rendering, and/or layer synthesis.
• the greater the first processing frame length, the greater the extent to which the electronic device performs forward scheduling of its hardware resources. For example, taking the electronic device increasing the operating frequency of the processor as an example, the larger the first processing frame length, the higher the operating frequency to which the electronic device adjusts the processor.
• the electronic device can perform forward scheduling of the hardware resources of the electronic device according to the number of times or the probability that the electronic device completes layer drawing, layer rendering, and layer synthesis in one synchronization period in the first statistical period, so as to shorten the time required for the electronic device to perform layer drawing, layer rendering, and/or layer synthesis.
  • the above probability is the ratio of the number of times that the electronic device completes layer drawing, layer rendering, and layer synthesis in one synchronization period in the first statistical period to the total number of times.
• the fewer the number of times, and the lower the probability, that the electronic device completes layer drawing, layer rendering, and layer synthesis in one synchronization period, the greater the extent to which the electronic device performs forward scheduling of its hardware resources. For example, taking the electronic device increasing the working frequency of the processor as an example, the fewer the number of times, and the lower the probability, that the electronic device completes layer drawing, layer rendering, and layer synthesis in one synchronization period in the first statistical period, the higher the working frequency to which the electronic device adjusts the processor.
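The proportionality described above — a larger first processing frame length, or fewer single-period completions (a lower probability), leading to a larger scheduling step — might be sketched as follows (the frequency table and the pressure formula are purely illustrative assumptions, not the patent's method):

```python
def pick_cpu_frequency(first_processing_frame_len_ms, completion_probability,
                       sync_period_ms=16.667,
                       freq_steps_mhz=(1200, 1600, 2000, 2400)):
    """Map scheduling pressure to an operating-frequency step: the larger
    the first processing frame length relative to the synchronization
    period, and the lower the probability of completing drawing, rendering,
    and synthesis in one period, the higher the chosen frequency."""
    # Overshoot grows past 0 once the frame length exceeds one period.
    overshoot = max(0.0, first_processing_frame_len_ms / sync_period_ms - 1.0)
    pressure = min(1.0, overshoot + (1.0 - completion_probability))
    index = min(len(freq_steps_mhz) - 1, int(pressure * len(freq_steps_mhz)))
    return freq_steps_mhz[index]
```

With a 15 ms frame length and every frame completed in one period, the sketch keeps the lowest step; as the frame length and miss rate grow, it climbs toward the highest step.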
• the electronic device can forwardly schedule the hardware resources of the electronic device according to the foreground application of the electronic device, so as to shorten the time required for the electronic device to perform layer drawing, layer rendering, and/or layer synthesis.
  • the foreground application may be an application corresponding to the interface currently displayed on the display screen of the electronic device.
• for example, taking the electronic device increasing the working frequency of the processor as an example, when the electronic device runs the foreground application, the longer the time required for layer drawing, rendering, and synthesis, the higher the working frequency to which the electronic device adjusts the processor.
  • the model code of a preset artificial intelligence (Artificial Intelligence, AI) model can be stored in the electronic device.
• the preset AI model is an AI model capable of performing forward scheduling of the hardware resources of the electronic device according to the "first processing frame length", the "number of times or probability that the electronic device completes layer drawing, layer rendering, and layer synthesis in one synchronization period in the first statistical period", or the "foreground application of the electronic device", so as to improve the possibility of single-frame rendering and synthesis.
• the preset AI model is obtained by training on samples of the "first processing frame length", the "number of times or probability that the electronic device completes layer drawing, layer rendering, and layer synthesis in one synchronization period in the first statistical period", or the "foreground application of the electronic device".
  • single-frame rendering synthesis means that the electronic device completes the drawing, rendering, and synthesis of layers in a synchronization cycle.
• the electronic device can run the model code of the preset AI model to perform forward scheduling of the hardware resources of the electronic device according to the "first processing frame length", the "number of times or probability that the electronic device completes layer drawing, layer rendering, and layer synthesis in one synchronization period in the first statistical period", or the "foreground application of the electronic device", so as to shorten the time required for the electronic device to draw, render, and synthesize layers.
• in this way, the possibility that the electronic device completes the drawing, rendering, and synthesis of the layer in one synchronization cycle can be improved, thereby shortening the response delay of the electronic device.
• the following describes the effect of the electronic device performing forward scheduling on its hardware resources so as to reduce the response delay.
• as shown in FIG. 7, it is a schematic timing diagram of image processing based on the vertical synchronization signal before the electronic device performs forward scheduling of its hardware resources.
• as shown in FIG. 8A, it is a timing diagram of image processing based on the vertical synchronization signal after the electronic device performs forward scheduling of its hardware resources. Comparing (b) in FIG. 7 with FIG. 8A, the following can be obtained:
• the electronic device starts to execute "drawing 1" and "rendering 1" at time t 1 shown in (b) in FIG. 7, and completes "rendering 1" at time t 9 ; the electronic device starts to execute "drawing 1" and "rendering 1" at time t 1 shown in FIG. 8A, and completes "rendering 1" at time t 12 , which is before time t 9 . That is, the time required for the electronic device to execute "drawing 1" and "rendering 1" shown in FIG. 8A is less than the time required for the electronic device to execute "drawing 1" and "rendering 1" shown in (b) in FIG. 7. The time required for the electronic device to execute "image frame synthesis 1" shown in FIG. 8A is less than the time required for the electronic device to execute "image frame synthesis 1" shown in (b) in FIG. 7.
• the electronic device starts to execute "drawing 2" and "rendering 2" at time t 2 shown in (b) in FIG. 7, and completes "rendering 2" at time t 10 ; the electronic device starts to execute "drawing 2" and "rendering 2" at time t 2 shown in FIG. 8A, and completes "rendering 2" at time t 13 , which is before time t 10 . That is, the time required for the electronic device to execute "drawing 2" and "rendering 2" shown in FIG. 8A is less than the time required for the electronic device to execute "drawing 2" and "rendering 2" shown in (b) in FIG. 7. The time required for the electronic device to execute "image frame synthesis 2" shown in FIG. 8A is less than the time required for the electronic device to execute "image frame synthesis 2" shown in (b) in FIG. 7.
• the electronic device starts to execute "drawing 3" and "rendering 3" at time t 3 shown in (b) in FIG. 7, and completes "rendering 3" at time t 11 ; the electronic device starts to execute "drawing 3" and "rendering 3" at time t 3 shown in FIG. 8A, and completes "rendering 3" at time t 14 , which is before time t 11 . That is, the time required for the electronic device to execute "drawing 3" and "rendering 3" shown in FIG. 8A is less than the time required for the electronic device to execute "drawing 3" and "rendering 3" shown in (b) in FIG. 7. The time required for the electronic device to execute "image frame synthesis 3" shown in FIG. 8A is less than the time required for the electronic device to execute "image frame synthesis 3" shown in (b) in FIG. 7.
  • the electronic device can shorten the time required for the electronic device to draw, render, and compose layers. In this way, the possibility that the electronic device can complete the drawing, rendering, and composition of the layer in one synchronization cycle can be improved.
• the electronic device can complete "drawing 1", "rendering 1", and "image frame synthesis 1" in the synchronization period (i.e., T Z ) from time t 1 to time t 2 , complete "drawing 2", "rendering 2", and "image frame synthesis 2" in the synchronization period (i.e., T Z ) from time t 2 to time t 3 , and complete "drawing 3", "rendering 3", and "image frame synthesis 3" in the synchronization period (i.e., T Z ) from time t 3 to time t 4 .
  • This embodiment describes specific conditions for the electronic device to schedule the hardware resources of the electronic device (for example, positive scheduling or negative scheduling). That is, the electronic device can schedule the hardware resources of the electronic device under the following conditions.
  • the electronic device performs forward scheduling on the hardware resources of the electronic device when the first processing frame length of the first statistical period is greater than the preset single frame frame length.
  • the preset single frame frame length in the embodiment of the present application is less than or equal to the synchronization period T Z.
  • the preset single frame frame length may be the difference between the synchronization period T Z and the preset delay threshold.
  • the preset delay threshold is greater than or equal to 0 ms.
  • the preset delay threshold may be 0ms, 1ms, 2ms, 1.5ms, 3ms, etc.
• taking the synchronization period T Z of 16.667 ms and the preset delay threshold of 1 ms as an example, the preset single frame frame length may be 15.667 ms.
• the electronic device can forwardly schedule the hardware resources of the electronic device to shorten the time required for the electronic device to draw, render, and synthesize the layer, and to improve the possibility that the electronic device completes the drawing, rendering, and synthesis of the layer in one synchronization cycle.
• the hardware resources of the electronic device can be negatively scheduled to avoid frame dropping in the image displayed on the display screen, and to reduce the power consumption of the electronic device while ensuring the fluency of the displayed image.
  • the preset refresh rate threshold may be 80 Hz, 90 Hz, or 85 Hz.
• the electronic device can perform forward scheduling of the hardware resources of the electronic device by default, thereby increasing the frame rate of the electronic device, avoiding frame drops in the image displayed on the display screen, and ensuring the fluency of the displayed image.
  • forward scheduling of the hardware resources of the electronic device will greatly increase the power consumption of the electronic device.
• after the electronic device performs forward scheduling of its hardware resources, when the electronic device is at a high screen refresh rate, combined with the solution in which the electronic device performs layer drawing, rendering, and synthesis in response to the vertical synchronization signal 1, the electronic device does not need to perform a greater degree of forward scheduling of hardware resources, or does not need to perform forward scheduling of hardware resources at all; as long as the electronic device can complete the drawing, rendering, and synthesis of the layer in two synchronization cycles, the phenomenon of frame loss in the image displayed on the display screen can be avoided and the fluency of the displayed image ensured.
  • the electronic device can perform negative scheduling of hardware resources to reduce the load of the electronic device on the premise of avoiding frame loss in the display image on the display screen and ensuring the smoothness of the display image on the display screen.
  • the foregoing negative scheduling may include: reducing the operating frequency of the processor of the electronic device, selecting a processor with a small core to execute the foregoing method, and reducing the operating frequency of the memory of the electronic device.
  • the processor may include a CPU and/or GPU.
  • the electronic device may perform negative scheduling on hardware resources when the screen refresh rate of the electronic device is greater than the preset refresh rate threshold.
• the electronic device can also lower the threshold for hardware resource scheduling of the electronic device, so as to provide a frame rate that matches the high screen refresh rate with lower hardware resource requirements, and to provide users with a high screen refresh rate experience without frame drops.
  • the hardware resources of the electronic device can be negatively scheduled to reduce the power consumption of the electronic device while avoiding frame loss in the display image displayed on the display screen and ensuring the fluency of the display image on the display screen.
  • the aforementioned preset double frame frame length is greater than the aforementioned preset single frame frame length.
  • the preset double frame frame length may be greater than the preset single frame frame length, and less than or equal to K times the signal period of the vertical synchronization signal 3 (ie, the aforementioned synchronization period).
  • K is greater than or equal to 2.
  • K can be 2 or 2.5.
  • the preset double frame frame length may be 22.2 ms.
• if the first processing frame length is greater than the preset double frame frame length, it means that the electronic device cannot complete the drawing, rendering, and synthesis of the layer in two synchronization cycles.
  • the frame loss shown in (a) in Figure 7 may occur.
• in this case, the electronic device can forwardly schedule the hardware resources of the electronic device to improve the possibility that the electronic device completes the drawing, rendering, and synthesis of the layer in two synchronization cycles.
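Taken together, the scheduling conditions of this embodiment can be sketched as one decision function (an illustrative sketch; the patent does not prescribe this exact logic, and the default values follow the examples in the text: a 16.667 ms synchronization period, a 1 ms delay threshold, an 80 Hz refresh rate threshold, and K = 2):

```python
def scheduling_direction(first_processing_ms, refresh_rate_hz,
                         sync_period_ms=16.667, delay_threshold_ms=1.0,
                         refresh_threshold_hz=80, k=2.0):
    """Return 'forward', 'negative', or 'none' per the conditions above:
    preset single frame frame length = synchronization period - delay
    threshold; preset double frame frame length = K * synchronization
    period (K >= 2)."""
    single_frame_ms = sync_period_ms - delay_threshold_ms  # e.g. 15.667 ms
    double_frame_ms = k * sync_period_ms                   # e.g. 33.334 ms
    if first_processing_ms > double_frame_ms:
        # Cannot finish drawing, rendering, and synthesis even in two
        # synchronization cycles: schedule hardware resources forward.
        return 'forward'
    if refresh_rate_hz > refresh_threshold_hz:
        # Two synchronization cycles suffice at a high refresh rate:
        # back off to reduce power consumption.
        return 'negative'
    if first_processing_ms > single_frame_ms:
        # Cannot finish in one synchronization cycle: schedule forward.
        return 'forward'
    return 'none'
```

For instance, a 16 ms frame length at 60 Hz exceeds the 15.667 ms single-frame threshold and triggers forward scheduling, while a 20 ms frame length at 90 Hz (with an 11.1 ms period) fits in two cycles and allows negative scheduling.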
• the electronic device may first predict whether the time required for the electronic device to perform layer drawing, rendering, and synthesis is less than or equal to the preset single frame frame length (the preset single frame frame length is less than the aforementioned synchronization period). If it is predicted that the above-mentioned time is less than or equal to the preset single frame frame length, it means that the electronic device is more likely to complete the drawing, rendering, and synthesis of the layer in one synchronization cycle. In this case, the electronic device executes the above S401-S402. Specifically, as shown in FIG. 8B, before S401 shown in FIG. 4, the embodiment of the present application may further include S801-S803.
  • the electronic device determines whether the first processing frame length of the first statistical period is less than or equal to a preset single frame frame length.
  • the foregoing first processing frame length is the sum of the first rendering frame length and the first SF frame length.
  • the first rendering frame length is the length of time required for layer drawing and rendering of the drawn layer.
  • the first SF frame length is the length of time required to perform layer synthesis on the rendered layer.
  • the electronic device may determine the foregoing first processing frame length through the following implementation (i) and implementation (ii).
  • the focused application is an application corresponding to the user's touch operation received by the electronic device.
  • the first rendering frame length is the length of time required for layer drawing and rendering of the drawn layer.
  • the first rendering frame length a of the application a is the length of time required for the electronic device to draw the layer for the application a and render the drawn layer.
  • the first rendering frame length b of application b is the length of time required for the electronic device to draw layers for application b and render the drawn layers.
  • the first rendering frame length c of the application c is the length of time required for the electronic device to draw layers for the application c and render the drawn layers.
  • the first SF frame length x is the time required for the electronic device to perform layer synthesis for the layer rendered for application a, the layer rendered for application b, and the layer rendered for application c.
  • the electronic device may determine the foregoing first processing frame length according to the first rendering frame length a and the first SF frame length x of the focus application a.
  • the first processing frame length is the sum of the first rendering frame length a and the first SF frame length x.
  • the electronic device may determine the foregoing first processing frame length according to the first rendering frame length b and the first SF frame length x.
  • the first processing frame length is the sum of the first rendering frame length b and the first SF frame length x.
  • the electronic device may periodically count the length of the first processing frame in each statistical period.
  • the first statistical period is the previous statistical period or an earlier statistical period at the current moment.
• the statistical period in the embodiment of the present application may be any duration such as 1 s, 2 s, 3 s, or 5 s.
  • the electronic device may execute S801a-S801b to obtain the first processing frame length of the first statistical period.
  • the electronic device obtains one or more second processing frame lengths of the first statistical period, where each second processing frame length is the sum of the second rendering frame length and the second SF frame length.
• the electronic device can count the time required for each layer drawing and rendering in the first statistical period (that is, the second rendering frame length) and the time required for layer synthesis of the rendered layer (that is, the second SF frame length), calculate the sum of each second rendering frame length and the corresponding second SF frame length, and obtain the total time required for each layer drawing, rendering, and synthesis (that is, the second processing frame length).
  • the time required for the electronic device to draw layer a and render the layer a is the second rendering frame length a; the time required for the electronic device to perform layer synthesis on the rendered layer a is the second SF frame length a.
  • the time required for the electronic device to draw layer b and render the layer b is the second rendering frame length b; the time required for the electronic device to perform layer synthesis on the rendered layer b is the second SF frame length b.
• the time required for the electronic device to draw the layer c and render the layer c is the second rendering frame length c; the time required for the electronic device to perform layer synthesis on the rendered layer c is the second SF frame length c.
  • the electronic device may calculate the sum of the second rendering frame length a and the second SF frame length a to obtain the second processing frame length a; calculate the sum of the second rendering frame length b and the second SF frame length b to obtain the second processing frame length b ; Calculate the sum of the second rendering frame length c and the second SF frame length c to obtain the second processing frame length c. In this way, the electronic device can obtain three second processing frame lengths in the first statistical period.
  • the electronic device determines the first processing frame length of the first statistical period according to one or more second processing frame lengths.
  • only one second processing frame length is included in the one or more second processing frame lengths.
  • the foregoing first processing frame length is equal to this second processing frame length.
  • the one or more second processing frame lengths may include multiple second processing frame lengths.
  • the first processing frame length is an average value of the aforementioned multiple second processing frame lengths.
  • the first processing frame length of the first statistical period may be an average value of the second processing frame length a, the second processing frame length b, and the second processing frame length c.
  • the one or more second processing frame lengths may include multiple second processing frame lengths.
  • the first processing frame length is the largest second processing frame length among the plurality of second processing frame lengths.
  • the first processing frame length of the first statistical period may be the maximum value of the second processing frame length a, the second processing frame length b, and the second processing frame length c.
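The statistics of implementation (ii) can be sketched as follows (an illustrative helper; the function name and `mode` parameter are assumptions): each second processing frame length is the sum of a second rendering frame length and its corresponding second SF frame length, and the first processing frame length is their average, or alternatively their maximum:

```python
def first_processing_frame_length(render_frame_lengths, sf_frame_lengths,
                                  mode="mean"):
    """Compute the first processing frame length of a statistical period.
    Each second processing frame length is the sum of a second rendering
    frame length and the corresponding second SF frame length; the first
    processing frame length is their average ('mean') or maximum ('max')."""
    second = [r + s for r, s in zip(render_frame_lengths, sf_frame_lengths)]
    if len(second) == 1:
        # Only one second processing frame length: use it directly.
        return second[0]
    return sum(second) / len(second) if mode == "mean" else max(second)
```

Using the maximum is the more conservative choice: the electronic device only predicts single-period completion when even the worst frame of the period fits.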
• the electronic device can complete layer drawing, rendering, and synthesis in one synchronization period in the first statistical period. Then, in the next statistical period of the first statistical period (that is, the statistical period in which the current moment is located), the electronic device is more likely to complete the drawing, rendering, and synthesis of the layer in one synchronization period. In this case, the electronic device can perform S401.
  • the electronic device drawing, rendering, and synthesizing layers in one synchronization period may specifically include: in response to the vertical synchronization signal 1, drawing one or more first images in the first synchronization period Layer, and render one or more first layers, and after one or more first layers are rendered, perform layer synthesis on the rendered one or more first layers to obtain the first image frame.
  • the first synchronization period is the synchronization period corresponding to the vertical synchronization signal 1.
• the first synchronization period may be the synchronization period T Z from time t 1 to time t 2 shown in (b) in FIG. 6A. That is to say, in the embodiment of the present application, the electronic device may start layer synthesis within the same synchronization period (i.e., the first synchronization period) in which layer drawing and rendering are performed.
• if the first processing frame length of the first statistical period (that is, the previous statistical period at the current moment) is greater than the preset single frame frame length, it means that the electronic device cannot complete the drawing, rendering, and synthesis of the layer in one synchronization period in the first statistical period. Then, in the next statistical period of the first statistical period (that is, the statistical period in which the current moment is located), the electronic device is less likely to complete the drawing, rendering, and synthesis of the layer in one synchronization period. In this case, the electronic device can perform S802-S803.
  • the electronic device draws one or more first layers, and renders one or more first layers.
• in response to the vertical synchronization signal 2, the electronic device performs layer synthesis on the one or more rendered first layers to obtain the first image frame.
  • the electronic device may perform layer synthesis on the rendered first layer in the second synchronization period to obtain the first image frame.
  • the second synchronization period is different from the first synchronization period described above.
  • the first synchronization period may be a synchronization period T Z from time t 1 to time t 2 ; the second synchronization period may be a synchronization period from time t 2 to time t 3 Period T Z.
• the electronic device does not perform layer synthesis at time t 6 shown in FIG. 6A (i.e., does not synthesize the image frame at that time); instead, after the vertical synchronization signal 2 arrives at time t 2 , the electronic device performs layer synthesis in response to the vertical synchronization signal 2 at time t 2 .
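The difference between the two synthesis paths can be modeled as follows (an illustrative sketch with assumed names, not the patent's implementation): in the accelerated path of S401, layer synthesis starts as soon as the rendering of the first layers ends, while in the path of S802-S803 it waits for the next vertical synchronization signal 2:

```python
import math

def synthesis_start_time(render_end_ms, accelerated, sync_period_ms=16.667):
    """Return when layer synthesis starts. Accelerated path: immediately
    after rendering ends. Normal path: at the next vertical synchronization
    signal 2 edge, modeled here as the next integer multiple of the
    synchronization period."""
    if accelerated:
        return render_end_ms
    return math.ceil(render_end_ms / sync_period_ms) * sync_period_ms
```

With rendering finishing 10 ms into the period, the accelerated path starts synthesis at 10 ms, whereas the normal path waits until the 16.667 ms edge, which is where the roughly one-period reduction in response delay comes from.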
• only when it is predicted that the time required for the electronic device to perform layer drawing, rendering, and synthesis is less than or equal to the preset single frame frame length does the electronic device, in response to the vertical synchronization signal 1, draw and render one or more first layers and, after the rendering of the one or more first layers is completed, perform layer synthesis on the rendered one or more first layers to obtain the first image frame.
  • This embodiment describes the conditions for the electronic device to execute the above-mentioned image processing method based on the vertical synchronization signal in any of the above-mentioned embodiments. Specifically, through this embodiment, it can be explained under what conditions or situations the electronic device can execute the above method.
  • the electronic device can execute the method of the embodiment of the present application in the accelerated rendering mode, such as S401-S402 and related steps.
  • the method in the embodiment of the present application may further include S901.
• in response to the first event, the electronic device starts the accelerated rendering mode.
• after the electronic device starts the accelerated rendering mode, before the arrival of the first vertical synchronization signal 2 after the rendering of the one or more first layers ends, the electronic device can start layer synthesis on the rendered layers to obtain an image frame.
• the following two implementation manners, namely implementation (I) and implementation (II), are used to illustrate the above-mentioned first event.
  • the above-mentioned first event may be: the electronic device receives the user's first operation.
  • the first operation is used to trigger the electronic device to start the accelerated rendering mode.
  • the first operation may be a user's click operation on the "accelerated rendering" option 1003 in the setting interface 1002 displayed on the mobile phone 1001. This first operation is used to enable the "accelerated rendering" option 1003 to trigger the mobile phone 1001 to start the accelerated rendering mode.
  • the first operation may be a user's click operation on the "Accelerate Rendering" button 1005 in the notification bar 1004 displayed on the mobile phone 1001.
  • the first operation is used to turn on the "accelerated rendering" button 1005 to trigger the mobile phone 1001 to start the accelerated rendering mode.
  • the above-mentioned first event may be that the first processing frame length of the first statistical period is less than or equal to the preset single frame frame length.
  • For detailed descriptions of the first statistical period, the preset single frame frame length, and the case where the first processing frame length is less than or equal to the preset single frame frame length, refer to the relevant content in the above-mentioned embodiment, which is not repeated here.
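The decision in implementation (II) can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: the function names and the 60 Hz refresh rate are assumptions, and the preset single frame frame length is taken as one synchronization period.

```python
# Hypothetical sketch of implementation (II): the accelerated rendering mode is
# started when the first processing frame length of the previous statistical
# period fits within the preset single frame frame length.

def preset_single_frame_length_ms(refresh_rate_hz: float = 60.0) -> float:
    """One synchronization period, i.e. the interval between vertical sync signals."""
    return 1000.0 / refresh_rate_hz

def should_start_accelerated_mode(first_rendering_ms: float, first_sf_ms: float,
                                  refresh_rate_hz: float = 60.0) -> bool:
    """The first processing frame length is the sum of the first rendering frame
    length and the first SF frame length; accelerated rendering is viable when
    that sum fits in one synchronization period."""
    first_processing_ms = first_rendering_ms + first_sf_ms
    return first_processing_ms <= preset_single_frame_length_ms(refresh_rate_hz)
```

For example, with an assumed 60 Hz display, an 8 ms draw/render plus a 6 ms composition (14 ms total) fits within the roughly 16.7 ms synchronization period, so the mode would be started.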
  • the electronic device can also exit the aforementioned accelerated rendering mode. After the electronic device exits the accelerated rendering mode, the electronic device can only perform layer synthesis in response to the vertical synchronization signal 2.
  • the method in the embodiment of the present application may further include S902-S905.
  • In response to the second event, the electronic device exits the accelerated rendering mode.
  • The following two implementations, namely implementation manner (a) and implementation manner (b), illustrate the foregoing second event.
  • the above-mentioned second event may be: the electronic device receives the second operation of the user.
  • the second operation is used to trigger the electronic device to exit the accelerated rendering mode.
  • the second operation corresponds to the above-mentioned first operation.
  • For example, after the electronic device has started the accelerated rendering mode by turning on the above-mentioned "Accelerated Rendering" option 1003 or "Accelerated Rendering" button 1005 in response to the user's first operation, the second operation may be the user's subsequent click operation on the "Accelerated Rendering" option 1003 or the "Accelerated Rendering" button 1005.
  • the mobile phone 1001 may turn off the "accelerated rendering” option 1003 or the "accelerated rendering” button 1005 to exit or close the accelerated rendering mode.
  • the above-mentioned second event may be that the first processing frame length of the first statistical period is greater than the preset single frame length.
  • For detailed descriptions of the first statistical period, the preset single frame frame length, and the case where the first processing frame length is greater than the preset single frame frame length, reference may be made to the related content in the foregoing embodiment, which is not repeated in the embodiment of this application.
  • the electronic device draws one or more second layers, and renders one or more second layers.
  • In response to the vertical synchronization signal 2, the electronic device performs layer synthesis on the rendered one or more second layers to obtain a second image frame.
  • After the electronic device exits the accelerated rendering mode, the electronic device cannot start layer synthesis on the rendered second layers before the arrival of the first vertical synchronization signal 2 after the rendering of the one or more second layers is completed. Instead, it waits for the arrival of the vertical synchronization signal 2; in response to the vertical synchronization signal 2, the electronic device performs layer synthesis on the rendered second layers to obtain a second image frame.
  • As shown in (a) of FIG. 6A, the electronic device does not perform layer synthesis as soon as rendering ends; instead, it waits for the arrival of the vertical synchronization signal 2 at time t2, and in response to the vertical synchronization signal 2 at time t2, performs layer synthesis on the rendered layers to obtain the synthesized image frame.
  • the electronic device refreshes and displays the second image frame.
  • the electronic device can start the accelerated rendering mode in response to a user's operation; it can also automatically start the accelerated rendering mode according to the time required for the electronic device to perform layer drawing, rendering, and composition in the statistical period.
  • the foregoing accelerated rendering mode is only a name of the working mode in which the electronic device executes the method of the embodiment of the present application.
  • the accelerated rendering mode may also have other names, which are not limited in the embodiment of the present application.
  • the aforementioned accelerated rendering mode may also be referred to as an accelerated rendering synthesis mode or an accelerated synthesis mode.
  • The electronic device may periodically obtain the first processing frame length of each statistical period; if the first processing frame length of a statistical period (such as the first statistical period) is greater than the preset single frame frame length, it means that the possibility that the electronic device can complete the drawing, rendering, and compositing of layers in one synchronization cycle is low.
  • the electronic device can perform forward scheduling on the hardware resources of the electronic device, so as to shorten the time required for the electronic device to perform layer drawing, layer rendering, and/or layer synthesis. In this way, the possibility that the electronic device can complete the drawing, rendering, and composition of the layer in one synchronization cycle can be improved.
  • The following takes the electronic device increasing the operating frequency of its processor as an example to illustrate the specific method by which the electronic device performs forward scheduling of hardware resources.
  • the method in the embodiment of the present application may further include S1101-S1104.
  • the electronic device obtains the first processing frame length of the first statistical period.
  • the electronic device may execute S801a-S801b in response to the end of the first statistical period to obtain the first processing frame length of the first statistical period.
  • the electronic device determines whether the first processing frame length of the first statistical period is less than or equal to a preset single frame frame length.
  • the foregoing first processing frame length is the sum of the first rendering frame length and the first SF frame length.
  • the first rendering frame length is the length of time required for layer drawing and rendering of the drawn layer.
  • the first SF frame length is the length of time required to perform layer synthesis on the rendered layer.
  • the electronic device may periodically count the length of the first processing frame in each statistical period.
  • the first statistical period is the previous statistical period of the current moment.
  • the statistical period in the embodiment of the present application may be any duration such as 1S, 2S, 3S, or 5S.
  • For the detailed description of S1102, reference may be made to the related introduction in S801, which is not repeated in the embodiment of the present application.
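Steps S1101 and S1102 can be sketched together: the frame length detection module collects the per-frame processing frame lengths during the statistical period and reduces them to the period's first processing frame length. The patent does not fix the reduction used, so taking the maximum is an assumption here (a conservative choice that only reports success if every frame fit within the period).

```python
# Hypothetical sketch of S1101: reduce the second processing frame lengths
# observed in one statistical period to that period's first processing frame
# length. Using max() is an illustrative assumption, not the patent's method.

def first_processing_frame_length(second_frame_lengths_ms):
    """second_frame_lengths_ms: one entry per frame in the statistical period,
    each being that frame's rendering frame length + SF frame length in ms."""
    if not second_frame_lengths_ms:
        raise ValueError("statistical period contained no frames")
    return max(second_frame_lengths_ms)
```

S1102 then compares this value against the preset single frame frame length to decide whether up-clocking is needed.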
  • The higher the working frequency of the processor, the faster the computing speed of the processor, and the shorter the time required for the electronic device to draw, render, and synthesize layers.
  • If the working frequency of the processor is sufficient to ensure that the electronic device completes the drawing, rendering, and synthesis of layers in one synchronization period, the electronic device does not need to increase the working frequency of the processor.
  • the electronic device can execute S1104 to increase the operating frequency of the processor.
  • The electronic device can increase the working frequency of the processor only when the current working frequency of the processor is less than the maximum working frequency of the processor. Therefore, as shown in FIG. 11, after S1102, if the first processing frame length of the first statistical period is greater than the preset single frame frame length, the electronic device may perform S1103.
  • the electronic device determines that the current operating frequency of the processor is less than the maximum operating frequency of the processor.
  • the electronic device may perform S1104. If the current operating frequency of the processor is already the maximum operating frequency of the processor, the electronic device does not need to adjust the operating frequency of the processor.
  • the electronic device increases the operating frequency of the processor.
  • the processor in the embodiment of the present application may include at least one of a CPU and a GPU.
  • The unit of the working frequency f of the processor may be hertz (Hz), kilohertz (kHz), megahertz (MHz), or gigahertz (GHz).
  • the electronic device may increase the operating frequency of the processor according to the first preset step.
  • the unit of the first preset step may be Hz, kHz, or MHz.
  • the first preset step can be pre-configured in the electronic device.
  • the first preset step may be set by the user in the electronic device.
  • the electronic device may increase the operating frequency of the processor according to the difference between the first processing frame length and the aforementioned preset single frame frame length, so that the second processing frame length in the next statistical period is less than or equal to the preset single frame frame length.
  • the adjustment range of the operating frequency of the processor by the electronic device is proportional to the magnitude of the aforementioned difference. That is, the greater the difference between the first processing frame length and the preset single frame frame length, the greater the adjustment range of the operating frequency of the processor by the electronic device. The smaller the difference between the first processing frame length and the preset single frame frame length, the smaller the adjustment range of the operating frequency of the processor by the electronic device.
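The proportional adjustment described above can be sketched as follows. The gain constant (`hz_per_ms`) and the clamping to a maximum frequency are illustrative assumptions; the patent only states that the adjustment range is proportional to the difference.

```python
# Illustrative sketch: the larger the amount by which the first processing frame
# length exceeds the preset single frame frame length, the larger the frequency
# increase. hz_per_ms is an assumed proportionality constant.

def raise_frequency(current_hz: float, max_hz: float,
                    first_processing_ms: float, single_frame_ms: float,
                    hz_per_ms: float = 50e6) -> float:
    overshoot_ms = first_processing_ms - single_frame_ms
    if overshoot_ms <= 0 or current_hz >= max_hz:
        return current_hz  # already fast enough, or already at the cap (cf. S1103)
    return min(max_hz, current_hz + overshoot_ms * hz_per_ms)
```

For example, a 4 ms overshoot raises a 1 GHz processor by an assumed 4 × 50 MHz = 200 MHz, while a frame that already fits leaves the frequency unchanged.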
  • the electronic device may adjust the operating frequency of the processor through the preset AI model according to the first processing frame length and the preset single frame frame length.
  • the preset AI model is obtained through a large number of sample training.
  • The preset AI model is an AI model with the function of "adjusting the working frequency of the processor according to the first processing frame length to improve the possibility of single-frame rendering and synthesis".
  • single-frame rendering synthesis means that the electronic device completes the drawing, rendering, and synthesis of layers in a synchronization cycle.
  • If the first processing frame length is less than or equal to the preset single frame frame length, the electronic device can perform S401-S402; if the first processing frame length is greater than the preset single frame frame length, the electronic device can perform S802, S803, and S402.
  • The electronic device executes S1103 when the first processing frame length is greater than the preset single frame frame length, that is, when the electronic device is less likely to complete the drawing, rendering, and composition of the layer in one synchronization cycle. Therefore, after S1103, if the current operating frequency of the electronic device is equal to the maximum operating frequency, it means that even with the processor working at the maximum operating frequency, the electronic device is unlikely to complete the drawing, rendering, and synthesis of the layer in one synchronization cycle.
  • the electronic device can use conventional solutions to draw, render, and compose layers. As shown in FIG. 11, after S1103, if the current operating frequency of the electronic device is equal to the maximum operating frequency, the electronic device can perform S802, S803, and S402.
  • In summary, when the first processing frame length is greater than the preset single frame frame length, that is, when the possibility that the electronic device completes the drawing, rendering, and synthesis of the layer in one synchronization period is low, the electronic device may increase the operating frequency of its processor. In this way, the computing speed of the processor can be increased, the time required for the electronic device to draw, render, and synthesize the layer can be shortened, and the possibility that the electronic device completes the drawing, rendering, and synthesis of the layer in one synchronization cycle can be improved.
  • the electronic device can complete the drawing, rendering, and composition of the layer in one synchronization cycle, the response delay of the electronic device can be shortened by one synchronization cycle, and the fluency of the electronic device (such as hand-following performance) can be improved.
  • Although a higher working frequency of the processor of an electronic device can increase the computing speed of the electronic device and shorten the time required for the electronic device to draw, render, and synthesize layers, the higher the working frequency of the processor, the greater the power consumption.
  • the electronic device may also reduce the operating frequency of the processor when the foregoing first processing frame length meets a preset condition. In this way, the power consumption of the electronic device can be reduced.
  • the method of the embodiment of the present application may further include S1201.
  • the first processing frame length satisfies a preset condition, which may specifically include: the first processing frame length is less than the preset single frame frame length.
  • In this case, the electronic device is more likely to complete the drawing, rendering, and composition of the layer in one synchronization cycle. This may be because the working frequency of the processor is high, which makes the calculation speed of the processor fast enough for the electronic device to complete the drawing, rendering, and synthesis of the layer in one synchronization cycle. However, an excessively high operating frequency of the processor causes the power consumption of the electronic device to be high; therefore, the electronic device can lower the operating frequency of the processor.
  • the first processing frame length satisfies the preset condition, which may specifically include: the first processing frame length is less than the preset single frame frame length, and the difference between the preset single frame frame length and the first processing frame length is greater than the first preset duration.
  • If the first processing frame length is less than the preset single frame frame length, and the difference between the preset single frame frame length and the first processing frame length is greater than the first preset duration, it means that after the electronic device completes the drawing, rendering, and compositing of the layer in one synchronization cycle, it may still have to wait for some time until the vertical synchronization signal 3 arrives before, in response to the vertical synchronization signal 3, the composited image frame is refreshed and displayed.
  • In this case, the operating frequency of the processor is generally higher than necessary. In order to reduce the power consumption of the electronic device, the electronic device can reduce the operating frequency of the processor.
  • the method by which the electronic device can lower the operating frequency of the processor may include: the electronic device lowers the operating frequency of the processor according to a second preset step.
  • the second preset step may be equal to the first preset step.
  • the second preset step may also be smaller than the first preset step.
  • the electronic device can adjust the operating frequency of the processor in a fast-up and slow-down manner. In this way, it is beneficial for the electronic device to execute the method of the embodiment of the present application, shorten the touch response time delay of the electronic device, and improve the fluency of the electronic device (such as hand-following performance).
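The "fast up, slow down" policy described above can be sketched with the two preset steps. The step values below are assumptions; the patent only requires that the second preset step may be smaller than or equal to the first.

```python
# Sketch of the fast-up, slow-down adjustment: the up step (first preset step)
# is larger than the down step (second preset step), so the frequency rises
# quickly when frames overrun the sync period and falls back gradually.

FIRST_PRESET_STEP_HZ = 200e6   # assumed value, used when raising the frequency
SECOND_PRESET_STEP_HZ = 50e6   # assumed value, smaller than the up step

def adjust_frequency(current_hz: float, min_hz: float, max_hz: float,
                     overran: bool) -> float:
    """overran: first processing frame length > preset single frame frame length."""
    if overran:
        return min(max_hz, current_hz + FIRST_PRESET_STEP_HZ)
    return max(min_hz, current_hz - SECOND_PRESET_STEP_HZ)
```

The asymmetry favors fluency over power savings: a single overrun recovers quickly, while down-clocking happens cautiously.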
  • the electronic device may reduce the working frequency of the processor only when the first processing frame length of N consecutive statistical periods meets the preset condition. For example, as shown in FIG. 12, after S1102 shown in FIG. 11, the method of the embodiment of the present application may further include S1201a and S1202.
  • the electronic device determines that the first processing frame length of N consecutive statistical periods meets a preset condition.
  • N is a positive integer.
  • N can be any positive integer such as 5, 4, 3, 2, or 6.
  • The method for the electronic device to determine that the first processing frame length of N consecutive statistical periods meets the preset condition may include: if the first processing frame length of one statistical period (for example, statistical period 1) meets the preset condition, the electronic device increases the count value of a counter by 1 (the initial value of the counter is 0); if the first processing frame length of the next statistical period after statistical period 1 also meets the preset condition, the electronic device again increases the count value of the counter by 1; if the first processing frame length of the next statistical period after statistical period 1 does not meet the preset condition, the electronic device clears the count value of the counter.
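The counter logic above can be sketched directly; class and method names are illustrative.

```python
# Sketch of S1201a: a counter tracks how many consecutive statistical periods
# satisfied the preset condition, resetting to zero on any period that did not.
# Only when the count reaches N may the device lower the processor frequency.

class ConsecutivePeriodCounter:
    def __init__(self, n: int):
        self.n = n
        self.count = 0  # initial value of the counter is 0

    def observe(self, condition_met: bool) -> bool:
        """Feed one statistical period's result; return True once N consecutive
        qualifying periods have been observed."""
        self.count = self.count + 1 if condition_met else 0
        return self.count >= self.n
```

A single non-qualifying period restarts the streak, which is what prevents the ping-pong behavior discussed below.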
  • the electronic device may enter the accelerated rendering mode only when it is determined that the first processing frame length of N consecutive statistical periods meets a preset condition.
  • the electronic device reduces the working frequency of the processor.
  • For the case where the first processing frame length satisfies the preset condition and the electronic device reduces the operating frequency of the processor, please refer to the related description in the above embodiment, which is not repeated here.
  • The electronic device may reduce the working frequency of the processor only when the first processing frame length of N consecutive statistical periods meets the preset condition. In this way, not only can the ping-pong phenomenon be avoided when adjusting the operating frequency of the processor, but the fast-rise, slow-fall adjustment of the operating frequency can also be achieved. Thus, under the premise of ensuring the system stability of the electronic device for layer drawing, rendering, and synthesis, the possibility of the electronic device completing the drawing, rendering, and synthesis of the layer in one synchronization cycle can be improved, so as to shorten the touch response delay of the electronic device and improve the fluency of the electronic device (such as hand-following performance).
  • In some cases, the electronic device may become unable to complete the drawing, rendering, and synthesis of the layer in one synchronization period. In this case, the electronic device may exit the accelerated rendering mode. Afterwards, the electronic device may restart the accelerated rendering mode in response to the aforementioned first event.
  • As shown in (a) of FIG. 6A, before entering the accelerated rendering mode, the electronic device waits for the arrival of the vertical synchronization signal 2 at time t2, and in response to the vertical synchronization signal 2 at time t2, performs layer synthesis (i.e., synthesizes the image frame); it then waits for the arrival of the vertical synchronization signal 3 at time t3, and in response to the vertical synchronization signal 3 at time t3, refreshes and displays the synthesized image frame.
  • As shown in (b) of FIG. 6A, after the electronic device enters the accelerated rendering mode, at time t.
  • the method of the embodiment of the present application may further include S1301.
  • the electronic device adjusts the working frequency of the processor to the maximum operating frequency of the processor.
  • The above-mentioned first feature point may include at least any one of the following: the electronic device draws one or more third layers; the electronic device renders one or more third layers; any function is executed during the process in which the electronic device draws one or more third layers; any function is executed during the process in which the electronic device renders one or more third layers.
  • a second preset duration may be set for each of the above-mentioned first feature points.
  • the second preset duration may be determined by counting the time required for a large number of electronic devices to perform the operation corresponding to the first feature point multiple times.
  • In a statistical period, if the time spent on a first feature point is greater than the second preset duration corresponding to that first feature point, it means that there is a high possibility that the electronic device, using the method corresponding to the accelerated rendering mode, cannot complete the drawing and rendering of the one or more third layers.
  • one or more third layers are the layers that the electronic device is drawing or rendering in the statistical period.
  • the electronic device can instantaneously increase the frequency of the processor and adjust the operating frequency of the processor to the maximum operating frequency of the processor. After the processor instantaneously increases the frequency, the computing speed of the processor can be increased, and the time required for layer drawing, rendering, and composition of the electronic device can be shortened.
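The instantaneous frequency increase of S1301 can be sketched as follows; the function name and threshold values are illustrative assumptions.

```python
# Sketch of S1301: if the time spent at a first feature point (e.g. drawing or
# rendering a third layer, or a function executed while doing so) exceeds that
# feature point's second preset duration, the processor frequency jumps straight
# to the maximum rather than stepping up gradually.

def frequency_after_feature_point(current_hz: float, max_hz: float,
                                  elapsed_ms: float,
                                  second_preset_duration_ms: float) -> float:
    if elapsed_ms > second_preset_duration_ms:
        return max_hz   # instantaneous boost: go directly to the maximum frequency
    return current_hz   # feature point completed in time; no change
```

Unlike the per-period proportional adjustment, this reacts within a single frame, trading power for a chance to still finish the overrunning frame in one synchronization cycle.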
  • the electronic device can exit the accelerated rendering mode.
  • the method of the embodiment of the present application may further include S1302.
  • the electronic device responds to the vertical synchronization signal 2 to perform layer synthesis on the rendered layers to obtain an image frame.
  • the third processing frame length is the sum of the third rendering frame length and the third SF frame length.
  • the third rendering frame length is the length of time required to draw and render one or more third layers.
  • the third SF frame length is the length of time required to perform layer synthesis on the rendered one or more third layers.
  • the electronic device can exit the accelerated rendering mode.
  • the electronic device will not perform layer synthesis on the rendered layers immediately; instead, it will wait for the arrival of the vertical synchronization signal 2, and in response to the vertical synchronization signal 2, combine the rendered layers to obtain an image frame.
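The exit check of S1302 can be sketched directly from the definitions above; the function name is an assumption.

```python
# Sketch of S1302: even after the instantaneous boost, if the third processing
# frame length (third rendering frame length + third SF frame length) still
# exceeds the preset single frame frame length, the device exits the accelerated
# rendering mode and falls back to synthesizing only on vertical sync signal 2.

def should_exit_accelerated_mode(third_rendering_ms: float, third_sf_ms: float,
                                 single_frame_ms: float) -> bool:
    third_processing_ms = third_rendering_ms + third_sf_ms
    return third_processing_ms > single_frame_ms
```

For example, with an assumed 16.7 ms period, a 12 ms render plus 8 ms composition (20 ms) triggers the exit, while 6 ms plus 6 ms does not.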
  • FIG. 13 shows a simplified flowchart of an image processing method based on a vertical synchronization signal provided by an embodiment of the present application.
  • the electronic device may execute S-1 to obtain the first processing frame length of the statistical period.
  • For the detailed description of S-1, please refer to S801a-S801b, S1101 and related descriptions in the above-mentioned embodiments, which are not repeated here.
  • The electronic device may execute S-2 to determine whether the first processing frame length of the statistical period is greater than the preset single frame frame length.
  • For the detailed description of S-2, reference may be made to S801 and related descriptions in the foregoing embodiments, which are not repeated here.
  • the electronic device can execute S-3 to increase the working frequency of the processor (that is, increase the frequency). For the detailed description of S-3, reference may be made to S1104 and related descriptions in the foregoing embodiments, which are not repeated here.
  • The electronic device may execute S-4 to determine whether the first processing frame length of N consecutive statistical periods meets the preset condition. For the detailed description of S-4, reference may be made to S1201a and related descriptions in the foregoing embodiments, which will not be repeated here. If the first processing frame length of N consecutive statistical periods meets the preset condition, S-5 and S-6 are executed.
  • the electronic device executes S-5 to lower the working frequency of the processor (ie, frequency reduction).
  • For the detailed description of S-5, reference may be made to S1202 and related descriptions in the foregoing embodiments, which will not be repeated here.
  • the electronic device executes S-6 to start the accelerated rendering mode.
  • the electronic device may also start the accelerated rendering mode in response to the first event. In the accelerated rendering mode, the electronic device can perform S401-S402.
  • the electronic device may execute S-7 to determine whether the time-consuming duration of the first feature point is greater than the second preset duration corresponding to the first feature point. If the time-consuming duration of the first feature point is greater than the second preset duration corresponding to the first feature point, the electronic device may execute S-8 to perform an instantaneous frequency increase.
  • For the detailed descriptions of S-7 and S-8, reference may be made to S1301 and related descriptions in the foregoing embodiments, which are not repeated here.
  • the electronic device continues to be in the accelerated rendering mode.
  • After the electronic device instantaneously increases the frequency (i.e., executes S-8), it can execute S-9 to determine whether the third processing frame length is greater than the preset single frame frame length. If the third processing frame length is greater than the preset single frame frame length, S-10 can be executed to exit the accelerated rendering mode.
  • For the detailed descriptions of S-9 and S-10, reference may be made to S1302 and related descriptions in the foregoing embodiments, which are not repeated here. If the third processing frame length is less than or equal to the preset single frame frame length, the accelerated rendering mode can be continued.
  • the electronic device may also exit the accelerated rendering mode in response to the second event.
  • the electronic device can use S903-S905 methods to perform layer drawing, rendering, synthesis, and image frame display.
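The per-period decision of the S-1 to S-6 portion of the flow can be condensed into one step function. The action names and return shape are illustrative, not the patent's API; the instantaneous-boost path (S-7 to S-10) is omitted for brevity.

```python
# Condensed sketch of the per-statistical-period flow in FIG. 13:
#   S-2: overrun -> S-3 raise frequency (and reset the qualifying streak);
#   otherwise count the qualifying period, and after N consecutive ones
#   perform S-5 (lower frequency) and S-6 (accelerated rendering mode).

def period_step(first_processing_ms: float, single_frame_ms: float,
                good_streak: int, n: int):
    """Return (action, new_streak) for one statistical period."""
    if first_processing_ms > single_frame_ms:
        return "raise_frequency", 0                       # S-2 yes -> S-3
    good_streak += 1
    if good_streak >= n:
        return "lower_frequency_and_accelerate", good_streak  # S-4 -> S-5, S-6
    return "accelerate", good_streak
```

Running this once per statistical period reproduces the up-clock/down-clock loop described above.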
  • FIG. 14 shows a schematic diagram of an optimization module provided by an embodiment of the present application.
  • the optimization module may be a functional module used to implement the method of the embodiment of the present application in a device or an electronic device that generates an image frame.
  • the optimization module may include: a service communication interface module 1401, a frame length detection module 1402, a single frame rendering strategy module 1403, and a dynamic adjustment algorithm module 1404.
  • The frame length detection module 1402 is used to obtain multiple second processing frame lengths in a statistical period (such as the first statistical period), determine the first processing frame length of the statistical period, and transmit the first processing frame length of the statistical period to the dynamic adjustment algorithm module 1404.
  • the frame length detection module 1402 may obtain multiple second processing frame lengths in the statistical period through the service communication interface module 1401.
  • the frame length detection module 1402 is used to support the electronic device to execute S801a-S801b and S1101 in the foregoing method embodiment, or other processes used in the technology described herein.
  • the dynamic adjustment algorithm module 1404 is configured to call the scheduling module to adjust the working frequency of the processor according to the first processing frame length determined by the frame length detection module 1402.
  • The dynamic adjustment algorithm module 1404 is used to support the electronic device in executing S801, S1102, S1103, S1104, S1201, S1201a, S1202, S1301, the operation of "judging that the third processing frame length is greater than the preset single frame frame length" in S1302, S901, S902, or other processes used in the technology described herein.
  • the single frame rendering strategy module 1403 is configured to respond to the control of the dynamic adjustment algorithm module 1404 to control the UI thread, the rendering thread, and the composition thread of the electronic device in a corresponding manner to perform layer drawing, rendering, and composition.
  • The single-frame rendering strategy module 1403 is used to support the electronic device in performing the operation of "performing layer composition" in S401, S402, S802, S803, S903, S904, S905, and S1302 in the above method embodiment, or other processes used in the technology described herein.
  • the software architecture shown in FIG. 1A may further include: the above optimization module 60.
  • the optimization module 60 may include: a service communication interface module 1401, a frame length detection module 1402, a single frame rendering strategy module 1403, and a dynamic adjustment algorithm module 1404.
  • FIG. 16 shows a schematic diagram of the test results of the sliding test scene of the "Contact” application when the robot speed of two brands of mobile phones is 100 millimeters (mm)/s.
  • FIG. 16 shows the test result of a mobile phone 1 of one brand (such as an iPhone XS) in the sliding test scenario of the "Contacts" application when the robot speed is 100 mm/s.
  • the touch response time of the mobile phone 1 is 82ms-114ms.
  • (b) of Figure 16 shows the test result of a mobile phone 2 of another brand (such as a Huawei mobile phone), before performing the method of the embodiment of this application, in the sliding test scene of the "Contacts" application when the robot speed is 100 mm/s.
  • the touch response time is 82ms-136ms.
  • FIG. 16 shows the test result of the sliding test scenario of the "contacts" application when the robot speed is 100 mm/s after the mobile phone 2 executes the method of the embodiment of the present application.
  • the touch response time is 65ms-84ms.
  • the method of the embodiment of the present application can greatly shorten the response delay of the electronic device, and improve the fluency of the electronic device (such as the ability to follow the hands).
  • Some embodiments of the present application provide an electronic device, which may include: a display screen (such as a touch screen), a memory, and one or more processors.
  • the display screen, memory and processor are coupled.
  • the memory is used to store computer program code, and the computer program code includes computer instructions.
  • the processor executes the computer instructions, the electronic device can execute various functions or steps executed by the electronic device in the foregoing method embodiments.
  • the structure of the electronic device can refer to the structure of the electronic device 200 shown in FIG. 2.
  • Some embodiments of the present application further provide a chip system, which may be applied to the foregoing electronic device. The chip system includes at least one processor 1701 and at least one interface circuit 1702.
  • The processor 1701 and the interface circuit 1702 may be interconnected by wires.
  • The interface circuit 1702 may be used to receive signals from other devices (such as the memory of the electronic device).
  • The interface circuit 1702 may also be used to send signals to other devices (such as the processor 1701 or the touch screen of the electronic device).
  • For example, the interface circuit 1702 can read instructions stored in the memory and send the instructions to the processor 1701.
  • When the processor 1701 executes the instructions, the electronic device can execute the steps in the foregoing embodiments.
  • The chip system may also include other discrete devices, which are not specifically limited in the embodiments of the present application.
  • The embodiments of the present application also provide a computer storage medium. The computer storage medium includes computer instructions; when the computer instructions run on the above-mentioned electronic device, the electronic device is caused to perform the functions or steps performed by the electronic device in the above-mentioned method embodiments.
  • The embodiments of the present application also provide a computer program product, which, when run on a computer, causes the computer to execute the functions or steps performed by the electronic device in the above method embodiments.
  • the disclosed device and method may be implemented in other ways.
  • the device embodiments described above are merely illustrative.
  • the division of the modules or units is only a logical function division.
  • In actual implementation, there may be other division methods; for example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate parts may or may not be physically separate.
  • The parts displayed as units may be one physical unit or multiple physical units; that is, they may be located in one place or distributed to multiple different places. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • each unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium.
  • Based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application.
  • The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Abstract

Embodiments of the present application provide an image processing method based on vertical synchronization signals and an electronic device, relating to the field of image processing and display technologies, and capable of shortening the response delay of the electronic device and improving the fluency of the electronic device (such as its touch-following performance). The specific solution includes: the electronic device, in response to a first vertical synchronization signal, draws one or more first layers and renders the one or more first layers, and after the rendering of the one or more first layers is completed, performs layer composition on the rendered one or more first layers to obtain a first image frame; and, in response to a second vertical synchronization signal, refreshes and displays the first image frame.

Description

Image processing method based on vertical synchronization signals, and electronic device
This application claims priority to the Chinese patent application No. 201910596178.X, entitled "Image processing method based on vertical synchronization signals, and electronic device", filed with the China National Intellectual Property Administration on July 3, 2019, and to the Chinese patent application No. 201910617101.6, entitled "Image processing method based on vertical synchronization signals, and electronic device", filed with the China National Intellectual Property Administration on July 9, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of the present application relate to the field of image processing and display technologies, and in particular, to an image processing method based on vertical synchronization signals and an electronic device.
Background
With the development of electronic technologies, the performance of various electronic products (such as mobile phones) is getting better and better, and consumers' requirements on the human-computer interaction performance of electronic products are also getting higher and higher. Fluency is an important aspect of human-computer interaction performance; for example, fluency may include touch-following performance.
Fluency may be reflected in the length of the delay time from "the user inputs a user operation to the electronic product" to "the electronic product displays the image corresponding to the user operation". For example, the user operation may be an operation input by the user through a mouse or a key, or a touch operation performed by the user on a touch screen. The delay time may be referred to as the response delay of the electronic device. For example, when the user operation is a touch operation, the delay may be referred to as a touch response delay. The longer the delay time, the worse the fluency (such as touch-following performance); the shorter the delay time, the better the fluency. Therefore, how to shorten the delay time and improve the fluency of electronic products is an urgent problem to be solved.
Summary
The embodiments of the present application provide an image processing method based on vertical synchronization signals and an electronic device, which can shorten the response delay of the electronic device and improve the fluency of the electronic device.
To achieve the above objective, the embodiments of the present application adopt the following technical solutions:
In a first aspect, an embodiment of the present application provides an image processing method based on vertical synchronization signals, which may be applied to an electronic device including a display screen. The method may include: in response to a first vertical synchronization signal, the electronic device draws one or more first layers and renders the one or more first layers, and after the rendering of the one or more first layers is completed, performs layer composition on the rendered one or more first layers to obtain a first image frame; and, in response to a second vertical synchronization signal, refreshes and displays the first image frame.
在第一种情况下,电子设备可以在一个同步周期完成图层的绘制、渲染和合成。即电子设备进行图层的绘制、渲染和合成所需的时长小于或等于一个同步周期。该同步周期等于第二垂直同步信号的信号周期。在这种情况下,电子设备响应于第一垂直同步信号,可以进行图层绘制、渲染和合成;而不是等待第三垂直同步信号到来后,响应于第三垂直同步信号才对渲染的第一图层进行图层合成。这样,电子设备便可以在一个同步周期(如第一同步周期)完成图层的绘制、渲染和合成。也就是说,通过本申请实施例的方法,可以将电子设备的响应延迟缩短一个同步周期,可以提升电子设备的流畅性(如跟手性能)。
在第二种情况下,电子设备不能在一个同步周期完成图层的绘制、渲染和合成。即电子设备进行图层的绘制、渲染和合成所需的时长大于一个同步周期。在这种情况下,如果采用 常规的方案“响应于第三垂直同步信号,进行图层合成”,显示屏刷新显示图像帧的过程中,可能会出现丢帧的现象。而通过本申请实施例的方法,可以避免显示图像出现丢帧现象,以避免显示屏显示一帧重复图像。也就是说,通过本申请实施例的方法,可以保证显示屏显示图像的流畅性,从而提升用户的视觉体验。并且,可以提高电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性。
结合第一方面，在一种可能的设计方式中，电子设备可以在检测到用户操作或者电子设备发生用户界面(User Interface,UI)事件时，响应于第一垂直同步信号，绘制一个或多个第一图层，并渲染一个或多个第一图层。该用户操作可以用于触发电子设备更新界面。
结合第一方面,在另一种可能的设计方式中,本申请实施例的方法还可以包括:电子设备对电子设备的硬件资源进行正向调度,以缩短电子设备进行图层绘制、图层渲染和/或图层合成所需的时长。
结合第一方面,在另一种可能的设计方式中,上述电子设备对电子设备的硬件资源进行正向调度,可以包括:电子设备执行以下硬件资源调度中的一项或多项,以缩短电子设备进行图层绘制、图层渲染和/或图层合成所需的时长。
其中,上述正向调度可以包括:调高电子设备的处理器的工作频率,选择大核的处理器执行上述方法,以及调高电子设备的内存工作频率。该处理器可以包括中央处理器(Central Processing Unit,CPU)和/或图形处理器(graphics processing unit,GPU)。
可以理解,处理器的工作频率越高,该处理器的运算速度越快,电子设备进行图层的绘制、渲染和合成所需的时长则越短。大核处理器的运算速度快于小核处理器的运算速度。电子设备的内存工作频率越高,该电子设备的读写速度越快,电子设备进行图层的绘制、渲染和合成所需的时长则越短。电子设备对所述电子设备的硬件资源进行正向调度,可以缩短电子设备进行图层绘制、图层渲染和/或图层合成所需的时长,提高电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性。这样,如果电子设备可以在一个同步周期完成图层的绘制、渲染和合成,便可以将电子设备的响应延迟缩短一个同步周期,可以提升电子设备的流畅性(如跟手性能)。
结合第一方面,在另一种可能的设计方式中,电子设备可以根据上述第一处理帧长,对电子设备的硬件资源进行正向调度,以缩短电子设备进行图层绘制、图层渲染和/或图层合成所需的时长。例如,以电子设备调高处理器的工作频率为例,上述第一处理帧长越大,电子设备调整处理器的工作频率时,将处理器的工作频率调整的越高。
结合第一方面,在另一种可能的设计方式中,电子设备可以根据第一统计周期内电子设备在一个同步周期内完成图层绘制、图层渲染和图层合成的次数或者概率,对电子设备的硬件资源进行正向调度,以缩短电子设备进行图层绘制、图层渲染和/或图层合成所需的时长。例如,以电子设备调高处理器的工作频率为例,第一统计周期内电子设备在一个同步周期内完成图层绘制、图层渲染和图层合成的次数越少,或者第一统计周期内电子设备在一个同步周期内完成图层绘制、图层渲染和图层合成的概率越小;电子设备调整处理器的工作频率时,将处理器的工作频率调整的越高。
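The frequency-scheduling strategy described in the preceding paragraphs can be sketched as follows. This is a minimal illustrative Python sketch of the decision logic only; the function name, the probability threshold, and the discrete frequency levels are assumptions for illustration and are not specified by the embodiments:

```python
def schedule_frequency(current_level, completed, total,
                       prob_threshold=0.9, max_level=10):
    """Positively schedule (raise) a hypothetical processor frequency level
    when, in the previous statistical period, the probability of finishing
    layer drawing, rendering and composition within one synchronization
    period was too low.

    completed: frames that finished within one sync period last period
    total:     total frames processed in the last statistical period
    """
    if total == 0:
        return current_level
    probability = completed / total
    if probability < prob_threshold:
        # The lower the completion probability, the larger the boost.
        deficit = prob_threshold - probability
        boost = max(1, round(deficit * max_level))
        return min(max_level, current_level + boost)
    return current_level
```

For example, completing only 5 of 10 frames in time would raise a level-4 processor to level 8 under these hypothetical parameters, while a 100% completion rate leaves the level unchanged.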
结合第一方面,在另一种可能的设计方式中,电子设备可以根据电子设备的前台应用,对电子设备的硬件资源进行正向调度,以缩短电子设备进行图层绘制、图层渲染和/或图层合成所需的时长。前台应用是显示屏当前显示的界面对应的应用。
可以理解，电子设备中可以安装多个应用。电子设备在前台运行不同应用时，进行图层绘制、渲染和合成所需的时间不同。因此，可以针对每一个应用设置一个用于对电子设备的硬件资源进行正向调度的方式或者策略。例如，以电子设备调高处理器的工作频率为例，电子设备运行前台应用时，进行图层绘制、渲染和合成所需的时长越大，电子设备调整处理器的工作频率时，将处理器的工作频率调整的越高。
结合第一方面,在另一种可能的设计方式中,上述电子设备可以在第一统计周期的第一处理帧长小于或等于预设单帧帧长的情况下,响应于第一垂直同步信号,绘制一个或多个第一图层,并渲染一个或多个第一图层,并在一个或多个第一图层渲染完成后,对渲染的一个或多个第一图层进行图层合成,以得到第一图像帧。
其中,上述第一处理帧长是第一渲染帧长和第一SF帧长之和。该第一渲染帧长是进行图层绘制和对绘制的图层进行渲染所需的时长。该第一SF帧长是对渲染的图层进行图层合成所需的时长。
可以理解,如果第一统计周期(即当前时刻的前一个统计周期)的第一处理帧长小于或等于预设单帧帧长,则表示在第一统计周期电子设备可以在一个同步周期完成图层的绘制、渲染和合成。那么在该第一统计周期的下一个统计周期(即当前时刻所在的统计周期),电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性则较高。
结合第一方面,在另一种可能的设计方式中,上述预设单帧帧长小于或等于第二垂直同步信号的信号周期。
结合第一方面,在另一种可能的设计方式中,如果第一统计周期(即当前时刻的前一个统计周期)的第一处理帧长大于预设单帧帧长,则表示在第一统计周期电子设备不能在一个同步周期完成图层的绘制、渲染和合成。那么,在该第一统计周期的下一个统计周期(即当前时刻所在的统计周期),电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性则较低。在这种情况下,电子设备则可以响应于第一垂直同步信号,绘制一个或多个第一图层;响应于第三垂直同步信号,对渲染的第一图层进行图层合成,以得到第一图像帧。
结合第一方面,在另一种可能的设计方式中,即使第一处理帧长大于预设单帧时长,电子设备也可以响应于第一垂直同步信号,绘制一个或多个第一图层,并渲染一个或多个第一图层,并在一个或多个第一图层渲染完成后,对渲染的一个或多个第一图层进行图层合成,以得到第一图像帧。这样,电子设备可以提前进行图层合成,可以提高电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性。
结合第一方面,在另一种可能的设计方式中,本申请实施例的方法还可以包括:电子设备获取第一统计周期的一个或多个第二处理帧长;根据一个或多个第二处理帧长,确定第一处理帧长。其中,每个第二处理帧长是第二渲染帧长和第二SF帧长之和。第二渲染帧长是进行图层绘制和对绘制的图层进行渲染所需的时长。第二SF帧长是对渲染的图层进行图层合成所需的时长。其中,在上述一个或多个第二处理帧长包括多个第二处理帧长的情况下,上述第一处理帧长是多个第二处理帧长中最大的第二处理帧长;或者,上述第一处理帧长是多个第二处理帧长的平均值。
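The relationship between the second processing frame lengths collected in the first statistical period and the resulting first processing frame length, together with the comparison against the preset single-frame length, can be sketched as follows (an illustrative Python sketch; the function names and the 60 Hz example values are assumptions, not part of the embodiments):

```python
SYNC_PERIOD_MS = 1000 / 60  # sync period at a 60 Hz screen refresh rate

def processing_frame_length(render_ms, sf_ms):
    """A (second) processing frame length: the time to draw and render the
    layers plus the time to compose the rendered layers (the SF length)."""
    return render_ms + sf_ms

def first_processing_frame_length(samples, strategy="max"):
    """Determine the first processing frame length from the second
    processing frame lengths of the first statistical period, either as
    their maximum or as their average, as the embodiment allows."""
    if strategy == "max":
        return max(samples)
    return sum(samples) / len(samples)

def use_accelerated_rendering(first_len_ms, delay_threshold_ms=0.0):
    """The preset single-frame length is the sync period minus a preset
    delay threshold (>= 0); the accelerated path is taken only when the
    first processing frame length does not exceed it."""
    return first_len_ms <= SYNC_PERIOD_MS - delay_threshold_ms
```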
结合第一方面,在另一种可能的设计方式中,上述响应于第一垂直同步信号,绘制一个或多个第一图层,并渲染一个或多个第一图层,并在一个或多个第一图层渲染完成后,对渲染的一个或多个第一图层进行图层合成,以得到第一图像帧,可以包括:电子设备响应于第一垂直同步信号,在第一同步周期绘制一个或多个第一图层,渲染一个或多个第一图层,并对在一个或多个第一图层渲染完成后,渲染的一个或多个第一图层进行图层合成,以得到第一图像帧。该第一同步周期是第一垂直同步信号对应的同步周期。也就是说,本申请实施例中,电子设备可以在进行图层绘制和渲染的一个同步周期(即第一同步周期)内,便开始进 行图层合成。
结合第一方面,在另一种可能的设计方式中,本申请实施例的方法还可以包括:如果第一处理帧长大于所述预设单帧帧长,电子设备对所述电子设备的硬件资源进行正向调度,以缩短电子设备进行图层绘制、图层渲染和/或图层合成所需的时长。其中,电子设备对电子设备的硬件资源进行正向调度的具体方法,以及达到的技术效果,可以参考上述可能的设计方式中的描述,本申请实施例这里不予赘述。
结合第一方面,在另一种可能的设计方式中,上述一个或多个图层可以包括:电子设备执行一个或多个应用对应的绘制任务所绘制的图层。该一个或多个应用可以包括:一个或多个系统级应用,以及一个或多个用户级应用中的至少一个。例如,该系统级应用可以包括:状态栏、launcher、导航栏和壁纸等。上述用户级应用可以包括:“设置”“电话”和“短消息”等电子设备的系统应用,以及电子设备可响应于用户的操作从应用商店下载的第三方应用。例如,第三方应用可以包括“微信”、“支付宝”、“百度地图”等应用。
结合第一方面,在另一种可能的设计方式中,上述电子设备响应于第一垂直同步信号,绘制一个或多个第一图层,并渲染一个或多个第一图层,并在一个或多个第一图层渲染完成后,对渲染的一个或多个第一图层进行图层合成,以得到第一图像帧,具体可以包括:电子设备响应于第一垂直同步信号,分别针对一个或多个应用中的每个应用,绘制一个或多个第一图层,并渲染一个或多个第一图层;电子设备对电子设备针对一个或多个应用渲染的一个或多个第一图层进行图层合成,以得到第一图像帧。也就是说,响应于第一垂直同步信号,电子设备可以针对每个应用进行图层绘制和渲染;然后再对电子设备针对一个或多个应用中所有应用渲染的一个或多个第一图层进行图层合成,得到第一图像帧。
结合第一方面,在另一种可能的设计方式中,上述在一个或多个第一图层渲染完成后,对渲染的一个或多个第一图层进行图层合成,以得到第一图像帧,具体可以包括:电子设备在一个或多个应用中的焦点应用、关键应用或者与电子设备的流畅性强相关的应用的一个或多个第一图层渲染完成后,对电子设备针对一个或多个应用已渲染的第一图层进行图层合成,以得到第一图像帧。也就是说,当电子设备完成焦点应用的图层渲染时,即使其他应用的图层渲染还未完成,电子设备也可以开始对已经完成渲染的第一图层进行图层合成,以得到第一图像帧。
结合第一方面,在另一种可能的设计方式中,上述在一个或多个第一图层渲染完成后,对渲染的一个或多个第一图层进行图层合成,以得到第一图像帧,具体可以包括:电子设备在上述一个或多个第一图层中的焦点图层、关键图层或者与电子设备的流畅性强相关的图层渲染完成后,对电子设备针对一个或多个应用已渲染的第一图层进行图层合成,以得到第一图像帧。
结合第一方面,在另一种可能的设计方式中,电子设备可以根据上述一个或多个应用中的焦点应用对应的第一渲染帧长,以及电子设备对一个或多个应用对应的第一SF帧长,确定上述第一处理帧长。
结合第一方面,在另一种可能的设计方式中,电子设备可以根据一个或多个应用中每个应用对应的第一渲染帧长中,最大的第一渲染帧长,以及电子设备对一个或多个应用对应的第一SF帧长,确定第一处理帧长。
结合第一方面，在另一种可能的设计方式中，电子设备对电子设备的硬件资源进行正向调度之后，在电子设备的屏幕刷新率大于预设刷新率阈值时，电子设备可以对电子设备的硬件资源进行负向调度，以降低电子设备的功耗。这样，电子设备可以在低功耗的前提下，避免上述第二种情况中，显示图像出现丢帧现象，以避免显示屏显示一帧重复图像。
结合第一方面,在另一种可能的设计方式中,电子设备对电子设备的硬件资源进行正向调度之后,当电子设备的屏幕刷新率大于预设刷新率阈值时,如果第一处理帧长大于预设双帧帧长,电子设备可以对电子设备的硬件资源进行负向调度,以降低电子设备的功耗。这样,电子设备可以在低功耗的前提下,避免上述第二种情况中,显示图像出现丢帧现象,以避免显示屏显示一帧重复图像。
结合第一方面,在另一种可能的设计方式中,电子设备可以执行以下负向调度中的一项或多项,以降低电子设备的功耗。其中,上述负向调度包括:调低电子设备的处理器的工作频率,选择小核的处理器执行所述方法,以及调低电子设备的内存工作频率。
结合第一方面,在另一种可能的设计方式中,上述预设双帧帧长小于或等于第二垂直同步信号的信号周期的K倍。其中,K≥2。
结合第一方面,在另一种可能的设计方式中,电子设备调高处理器的工作频率的方法可以包括:电子设备按照第一预设步进调高所述处理器的工作频率;或者,电子设备根据所述第一处理帧长与所述预设单帧帧长的差值,调高所述处理器的工作频率。该处理器的工作频率的调整幅度与所述差值的大小成正比。
结合第一方面,在另一种可能的设计方式中,本申请实施例的方法还可以包括:如果第一处理帧长满足预设条件,则调低处理器的工作频率。其中,第一处理帧长满足预设条件,具体包括:第一处理帧长小于预设单帧帧长;或者,第一处理帧长小于预设单帧帧长,且预设单帧帧长与第一处理帧长的差值大于第一预设时长。
可以理解,如果第一处理帧长小于预设单帧帧长,则表示电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性较高。在这种情况下,可能是因为处理器的工作频率较高,使得处理器的运算速度较快,从而使得电子设备可以一个同步周期完成图层的绘制、渲染和合成。但是,处理器的工作频率过高,会导致电子设备的功耗较大。因此,电子设备可以调低处理器的工作频率。
结合第一方面,在另一种可能的设计方式中,为了防止调整处理器的工作频率时出现乒乓现象。本申请实施例的方法还可以包括:如果连续N个统计周期的第一处理帧长满足预设条件,则调低处理器的工作频率,N≥2,N是正整数。这样,不仅可以防止调整处理器的工作频率时出现乒乓现象,还可以实现调整处理器的工作频率时的快升慢降。如此,可以在保证电子设备进行图层的绘制、渲染和合成的系统稳定性的前提下,缩短电子设备的触摸响应时延,提升电子设备的流畅性(如跟手性能)。
结合第一方面,在另一种可能的设计方式中,电子设备调低处理器的工作频率的方法可以包括:电子设备按照第二预设步进调低所述处理器的工作频率。
其中,第二预设步进可以等于第一预设步进。或者,第二预设步进也可以小于第一预设步进。
需要注意的是,在第二预设步进小于第一预设步进的情况下,电子设备可以以快升慢降的方式调整处理器的工作频率。这样,有利于电子设备执行本申请实施例的方法,缩短电子设备的触摸响应时延,提升电子设备的流畅性(如跟手性能)。
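The "fast rise, slow fall" frequency adjustment described above can be sketched as follows (an illustrative Python sketch; the step sizes, frequency bounds, and the value of N are hypothetical and not prescribed by the embodiments):

```python
def adjust_frequency(freq_mhz, first_len_ms, single_frame_ms,
                     up_step=100, down_step=40, max_freq=2800, min_freq=800,
                     slow_streak=0, down_after=3):
    """Fast-rise, slow-fall frequency adjustment sketch.

    Raise the frequency by a larger first preset step as soon as the first
    processing frame length exceeds the preset single-frame length; lower
    it by a smaller second preset step only after N (= down_after)
    consecutive statistical periods satisfy the preset condition, which
    avoids ping-ponging between frequencies.

    Returns (new_frequency_mhz, new_slow_streak).
    """
    if first_len_ms > single_frame_ms:
        return min(max_freq, freq_mhz + up_step), 0   # fast rise
    slow_streak += 1
    if slow_streak >= down_after:
        return max(min_freq, freq_mhz - down_step), 0  # slow fall
    return freq_mhz, slow_streak                       # wait for streak
```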
结合第一方面,在另一种可能的设计方式中,本申请实施例的方法还可以包括:在一个统计周期内,如果绘制并渲染一个或多个第三图层时第一特征点的耗费时长大于所述第一特征点对应的第二预设时长,则将处理器的工作频率调整为所述处理器的最大工作频率。其中,所述第一特征点至少包括以下任一种:绘制所述一个或多个第三图层;渲染所述一个或多个 第三图层;绘制所述一个或多个第三图层的过程中执行任一个函数;渲染所述一个或多个第三图层的过程中执行任一个函数。
可以理解,在一个统计周期内,如果第一特征点的耗费时长大于第一特征点对应的第二预设时长,则表示电子设备采用加速渲染模式对应的方法,无法完成一个或多个第三图层的绘制和渲染的可能性较高。
其中,一个或多个第三图层是该统计周期内电子设备正在绘制或渲染的图层。在这种情况下,电子设备可以对处理器进行瞬时提频,将处理器的工作频率调整为处理器的最大工作频率。处理器瞬时提频后,可以提升处理器的运算速度,进而可以缩短电子设备进行图层绘制、渲染和合成所需的时长。
结合第一方面,在另一种可能的设计方式中,在电子设备将处理器的工作频率调整为处理器的最大工作频率之后,本申请实施例的方法还可以包括:如果第三处理帧长大于预设单帧帧长,则响应于第三垂直同步信号,对渲染的图层进行图层合成得到图像帧。其中,第三处理帧长是第三渲染帧长和第三SF帧长之和。第三渲染帧长是绘制并渲染一个或多个第三图层所需的时长。第三SF帧长是对渲染的一个或多个第三图层进行图层合成所需的时长。
结合第一方面,在另一种可能的设计方式中,上述预设单帧时长是同步周期与预设时延阈值的差值。该预设时延阈值大于或等于零。
结合第一方面,在另一种可能的设计方式中,在上述电子设备则响应于第一垂直同步信号,绘制一个或多个第一图层,并渲染一个或多个第一图层,并在一个或多个第一图层渲染完成后,对渲染的一个或多个第一图层进行图层合成,以得到第一图像帧之前,本申请实施例的方法还可以包括:响应于第一事件,启动加速渲染模式。
其中,启动所述加速渲染模式后,电子设备响应于第一垂直同步信号,不仅可以绘制一个或多个第一图层,并渲染一个或多个第一图层,还可以对渲染的一个或多个第一图层进行图层合成,以得到第一图像帧。上述第一事件可以包括:接收到用户的第一操作;和/或,第一统计周期的第一处理帧长小于或等于预设单帧帧长。其中,第一统计周期是当前时刻的前一个统计周期。
结合第一方面,在另一种可能的设计方式中,电子设备响应于第二事件,可以退出上述加速渲染模式。该第二事件可以包括:接收到用户的第二操作;和/或,第一统计周期的第一处理帧长大于预设单帧帧长。
其中,在电子设备退出上述加速渲染模式之后,本申请实施例的方法还可以包括:响应于第一垂直同步信号,绘制一个或多个第二图层,并渲染一个或多个第二图层;响应于第三垂直同步信号,对渲染的一个或多个第二图层进行图层合成得到第二图像帧;响应于第二垂直同步信号刷新显示第二图像帧。
第二方面,本申请提供一种电子设备,该电子设备包括触摸屏、存储器和一个或多个处理器;所述触摸屏、所述存储器和所述处理器耦合;所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,当所述处理器执行所述计算机指令时,所述电子设备执行如第一方面及其任一种可能的设计方式所述的方法。
第三方面,本申请提供一种芯片系统,该芯片系统应用于包括触摸屏的电子设备;所述芯片系统包括一个或多个接口电路和一个或多个处理器;所述接口电路和所述处理器通过线路互联;所述接口电路用于从所述电子设备的存储器接收信号,并向所述处理器发送所述信号,所述信号包括所述存储器中存储的计算机指令;当所述处理器执行所述计算机指令时,所述电子设备执行如第一方面及其任一种可能的设计方式所述的方法。
第四方面,本申请提供一种计算机存储介质,该计算机存储介质包括计算机指令,当所述计算机指令在电子设备上运行时,使得所述电子设备执行如第一方面及其任一种可能的设计方式所述的方法。
第五方面,本申请提供一种计算机程序产品,当所述计算机程序产品在计算机上运行时,使得所述计算机执行如第一方面及其任一种可能的设计方式所述的方法。
可以理解地,上述提供的第二方面所述的电子设备,第三方面所述的芯片系统,第四方面所述的计算机存储介质,第五方面所述的计算机程序产品所能达到的有益效果,可参考第一方面及其任一种可能的设计方式中的有益效果,此处不再赘述。
附图说明
图1A为本申请实施例提供的一种电子设备响应于触摸操作显示图像的软件处理流程示意图;
图1B为图1A所示的软件处理流程中的延迟示意图;
图2为本申请实施例提供的一种电子设备的硬件结构示意图;
图3为本申请实施例提供的一种图像处理流程示意图;
图4为本申请实施例提供的一种基于垂直同步信号的图像处理方法流程图;
图5为本申请实施例提供的一种用于触发图层绘制的垂直同步信号、用于触发图层渲染的垂直同步信号和用于触发图层合成的垂直同步信号的示意图;
图6A为本申请实施例提供的一种基于垂直同步信号的图像处理方法原理示意图;
图6B为本申请实施例提供的另一种基于垂直同步信号的图像处理方法原理示意图;
图6C为本申请实施例提供的另一种基于垂直同步信号的图像处理方法原理示意图;
图6D为本申请实施例提供的另一种基于垂直同步信号的图像处理方法原理示意图;
图7为本申请实施例提供的另一种基于垂直同步信号的图像处理方法原理示意图;
图8A为本申请实施例提供的另一种基于垂直同步信号的图像处理方法原理示意图;
图8B为本申请实施例提供的另一种基于垂直同步信号的图像处理方法流程图;
图9为本申请实施例提供的另一种基于垂直同步信号的图像处理方法流程图;
图10为本申请实施例提供的一种显示界面示意图;
图11为本申请实施例提供的另一种基于垂直同步信号的图像处理方法流程图;
图12为本申请实施例提供的另一种基于垂直同步信号的图像处理方法流程图;
图13为本申请实施例提供的另一种基于垂直同步信号的图像处理方法流程图;
图14为本申请实施例提供的一种垂直同步信号的相位调整装置的结构组成示意图;
图15为本申请实施例提供的另一种电子设备响应于触摸操作显示图像的软件处理流程示意图;
图16为本申请实施例提供的一种测试场景的测试结果示意图;
图17为本申请实施例提供的一种芯片系统的结构示意图。
具体实施方式
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
本申请实施例提供一种基于垂直同步信号的图像处理方法，该方法可以应用于包括触摸屏的电子设备。具体的，该方法可以应用于电子设备响应于用户在触摸屏的触摸操作，在触摸屏显示图像的过程中。
其中,“用户向电子设备输入用户操作”到“电子设备显示该用户操作对应的图像”的延迟时间可以称为电子设备响应延迟。电子设备的流畅性(如跟手性能)可以体现为响应延迟的长度。例如,上述用户操作是触摸操作时,上述流畅性可以为跟手性能,上述响应延迟可以称为触摸响应延迟。该触摸响应延迟是从“用户手指在触摸屏输入触摸操作”到“触摸屏显示该触摸操作对应的图像”的延迟时间。
具体的,电子设备的响应延迟越长,电子设备的流畅性(如跟手性能)越差;电子设备的响应延迟越短,电子设备的流畅性(如跟手性能)越好。其中,电子设备的流畅性(如跟手性能)越好,用户通过用户操作(如触摸操作)控制电子设备的使用体验越好,感觉越流畅。通过本申请实施例的方法,可以提高电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性。从而可以缩短电子设备的响应延迟,提升电子设备的流畅性(如跟手性能),提升用户体验。
请参考图1A,其以上述用户操作是触摸操作为例,示出从“用户手指在触摸屏输入触摸操作”到“触摸屏显示该触摸操作对应的图像”过程中,电子设备的软件处理流程示意图。如图1A所示,电子设备可以包括:触控面板(touch panel,TP)/TP驱动(Driver)10、Input框架(即Input Framework)20、UI框架(即UI Framework)30、Display框架(即Display Framework)40和硬件显示模块50。
如图1A所示,电子设备的软件处理流程可以包括以下步骤(1)-步骤(5)。
步骤(1):TP IC/TP驱动10中的TP采集用户手指对电子设备的TP的触摸操作后,TP驱动向Event Hub上报相应的触摸事件。
步骤(2):Input框架20的Input Reader线程可以从Event Hub中读取触摸事件,然后向Input Dispatcher线程发送该触摸事件;由Input Dispatcher线程向UI框架30中的UI线程(如DoFrame)上传该触摸事件。
步骤(3):UI框架30中的UI线程绘制该触摸事件对应的一个或多个图层;渲染线程(如DrawFrame)对一个或多个图层进行图层渲染。
步骤(4):Display框架40中的合成线程对绘制的一个或多个图层(即渲染后的一个或多个图层)进行图层合成得到图像帧。
步骤(5):硬件显示模块50的液晶显示面板(Liquid Crystal Display,LCD)驱动可接收合成的图像帧,由LCD显示合成的图像帧。LCD显示图像帧后,LCD显示的图像可被人眼感知。
本申请实施例这里通过分析“用户手指在触摸屏输入触摸操作”到“触摸屏显示该触摸操作对应的图像被人眼感知”过程中、电子设备的处理流程,对电子设备缩短响应延迟的原理进行简单说明。
其中，上述步骤(1)中，TP IC/TP驱动10采集触摸操作并向Input框架20上报触控事件的过程中，可能会存在图1B所示的内核延迟。上述步骤(2)中，Input框架20处理触控事件并向UI框架输入触控事件的过程中，可能会存在图1B所示的输入延迟。上述步骤(3)中，UI框架中的UI线程绘制触摸事件对应的一个或多个图层，可能会存在图1B所示的绘制延迟(也称为UI线程延迟)；并且，渲染线程进行图层渲染可能会存在图1B所示的渲染延迟。上述步骤(4)中，Display框架40中的合成线程进行图层合成可能会存在图1B所示的合成延迟。上述步骤(5)中，硬件显示模块50显示合成的图像帧的过程中，可能会存在图1B所示的送显延迟。
本申请实施例提供的一种基于垂直同步信号的图像处理方法,可以通过缩短图1B所示的“绘制延迟”、“渲染延迟”和“合成延迟”,以缩短电子设备的响应延迟,提升电子设备的流畅性(如跟手性能)。
示例性的,本申请实施例中的电子设备可以是手机、平板电脑、桌面型、膝上型、手持计算机、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本,以及蜂窝电话、个人数字助理(personal digital assistant,PDA)、增强现实(augmented reality,AR)\虚拟现实(virtual reality,VR)设备等包括触摸屏的设备,本申请实施例对该电子设备的具体形态不作特殊限制。
下面将结合附图对本申请实施例的实施方式进行详细描述。
请参考图2,为本申请实施例提供的一种电子设备200的结构示意图。如图2所示,电子设备200可以包括处理器210,外部存储器接口220,内部存储器221,通用串行总线(universal serial bus,USB)接口230,充电管理模块240,电源管理模块241,电池242,天线1,天线2,移动通信模块250,无线通信模块260,音频模块270,扬声器270A,受话器270B,麦克风170C,耳机接口270D,传感器模块280,按键290,马达291,指示器292,摄像头293,显示屏294,以及用户标识模块(subscriber identification module,SIM)卡接口295等。
其中,传感器模块280可以包括压力传感器280A,陀螺仪传感器280B,气压传感器280C,磁传感器280D,加速度传感器280E,距离传感器280F,接近光传感器280G,指纹传感器280H,温度传感器280J,触摸传感器280K,环境光传感器280L,以及骨传导传感器280M等。
可以理解的是,本实施例示意的结构并不构成对电子设备200的具体限定。在另一些实施例中,电子设备200可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器210可以包括一个或多个处理单元,例如:处理器210可以包括应用处理器(application processor,AP),调制解调处理器,GPU,图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
控制器可以是电子设备200的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器210中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器210中的存储器为高速缓冲存储器。该存储器可以保存处理器210刚用过或循环使用的指令或数据。如果处理器210需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器210的等待时间,因而提高了系统的效率。
在一些实施例中,处理器210可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
可以理解的是,本实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备200的结构限定。在另一些实施例中,电子设备200也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块240用于从充电器接收充电输入。充电管理模块240为电池242充电的同时,还可以通过电源管理模块241为电子设备供电。
电源管理模块241用于连接电池242,充电管理模块240与处理器210。电源管理模块241接收电池242和/或充电管理模块240的输入,为处理器210,内部存储器221,外部存储器,显示屏294,摄像头293,和无线通信模块260等供电。在其他一些实施例中,电源管理模块241也可以设置于处理器210中。在另一些实施例中,电源管理模块241和充电管理模块240也可以设置于同一个器件中。
电子设备200的无线通信功能可以通过天线1,天线2,移动通信模块250,无线通信模块260,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备200中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。
移动通信模块250可以提供应用在电子设备200上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块250可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块250可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块250还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器270A,受话器270B等)输出声音信号,或通过显示屏294显示图像或视频。
无线通信模块260可以提供应用在电子设备200上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块260可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块260经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器210。无线通信模块260还可以从处理器210接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备200的天线1和移动通信模块250耦合,天线2和无线通信模块260耦合,使得电子设备200可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation  satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备200通过GPU,显示屏294,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏294和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器210可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏294用于显示图像,视频等。该显示屏294包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。
其中,本申请实施例中的显示屏294可以是触摸屏。即该显示屏294中集成了触摸传感器280K。该触摸传感器280K也可以称为“触控面板”。也就是说,显示屏294可以包括显示面板和触摸面板,由触摸传感器280K与显示屏294组成触摸屏,也称“触控屏”。触摸传感器280K用于检测作用于其上或附近的触摸操作。触摸传感器280K检测到的触摸操作后,可以由内核层的驱动(如TP驱动)传递给上层,以确定触摸事件类型。可以通过显示屏294提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器280K也可以设置于电子设备200的表面,与显示屏294所处的位置不同。
电子设备200可以通过ISP,摄像头293,视频编解码器,GPU,显示屏294以及应用处理器等实现拍摄功能。ISP用于处理摄像头293反馈的数据。摄像头293用于捕获静态图像或视频。数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。视频编解码器用于对数字视频压缩或解压缩。电子设备200可以支持一种或多种视频编解码器。这样,电子设备200可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备200的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口220可以用于连接外部存储卡，例如Micro SD卡，实现扩展电子设备200的存储能力。外部存储卡通过外部存储器接口220与处理器210通信，实现数据存储功能。例如将音乐、视频等文件保存在外部存储卡中。内部存储器221可以用于存储计算机可执行程序代码，所述可执行程序代码包括指令。处理器210通过运行存储在内部存储器221的指令，从而执行电子设备200的各种功能应用以及数据处理。例如，在本申请实施例中，处理器210可以通过执行存储在内部存储器221中的指令，执行本申请实施例提供的基于垂直同步信号的图像处理方法。内部存储器221可以包括存储程序区和存储数据区。其中，存储程序区可存储操作系统，至少一个功能所需的应用程序(比如声音播放功能，图像播放功能等)等。存储数据区可存储电子设备200使用过程中所创建的数据(比如音频数据，电话本等)等。此外，内部存储器221可以包括高速随机存取存储器，还可以包括非易失性存储器，例如至少一个磁盘存储器件，闪存器件，通用闪存存储器(universal flash storage，UFS)等。
电子设备200可以通过音频模块270,扬声器270A,受话器270B,麦克风170C,耳机接口270D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块270用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块270还可以用于对音频信号编码和解码。扬声器270A,也称“喇叭”,用于将音频电信号转换为声音信号。受话器270B,也称“听筒”,用于将音频电信号转换成声音信号。麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。耳机接口270D用于连接有线耳机。
压力传感器280A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器280A可以设置于显示屏294。压力传感器280A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器280A,电极之间的电容改变。电子设备200根据电容的变化确定压力的强度。当有触摸操作作用于显示屏294,电子设备200根据压力传感器280A检测所述触摸操作强度。电子设备200也可以根据压力传感器280A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。本申请实施例中,电子设备200可以通过压力传感器280A获取用户的触摸操作的按压力度。
按键290包括开机键,音量键等。按键290可以是机械按键。也可以是触摸式按键。电子设备200可以接收按键输入,产生与电子设备200的用户设置以及功能控制有关的键信号输入。马达291可以产生振动提示。马达291可以用于来电振动提示,也可以用于触摸振动反馈。指示器292可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。SIM卡接口295用于连接SIM卡。SIM卡可以通过插入SIM卡接口295,或从SIM卡接口295拔出,实现和电子设备200的接触和分离。电子设备200可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口295可以支持Nano SIM卡,Micro SIM卡,SIM卡等。
以下实施例中的方法均可以在具有上述硬件结构的电子设备200中实现。
以下对本申请实施例中涉及的术语进行介绍:
垂直同步信号1:如VSYNC_APP。该垂直同步信号1可以用于触发绘制一个或多个图层。需要注意的是,本申请实施例中“垂直同步信号1可以用于触发绘制一个或多个图层”具体是指:垂直同步信号1可以用于触发绘制一个或多个图层,并触发对该一个或多个图层进行渲染。即本申请实施例中,绘制的一个或多个图层是指渲染后的一个或多个图层。本申请实施例中,电子设备响应于垂直同步信号1,可以通过多个绘制线程中的每个绘制线程分别针对每个应用绘制一个或多个图层。即电子设备响应于垂直同步信号1,可以同时针对一个或多个应用执行绘制任务,以绘制每个应用对应的一个或多个图层。其中,上述一个或多个应用的详细描述可以参考以下实施例中的相关内容,本申请实施例这里不予赘述。
垂直同步信号2:如VSYNC_SF。该垂直同步信号2可以用于触发对绘制的一个或多个图层进行图层合成得到图像帧。
垂直同步信号3:如HW_VSYNC。该垂直同步信号3可以用于触发硬件刷新显示图像帧。
其中,本申请实施例中的垂直同步信号1(如VSYNC_APP)为权利要求书中所述的第一垂直同步信号,垂直同步信号2(如VSYNC_SF)为权利要求书中所述的第三垂直同步信号,垂直同步信号3(HW_VSYNC)为权利要求书中所述的第二垂直同步信号。
需要注意的是,在不同的系统或者架构中,垂直同步信号的名称可能不同。例如,在一些系统或者架构中,上述用于触发绘制一个或多个图层的垂直同步信号(即垂直同步信号1)的名称可能不是VSYNC_APP。但是,无论垂直同步信号的名称是什么,只要是具备类似功 能的同步信号,符合本申请实施例提供的方法的技术思路,都应涵盖在本申请的保护范围之内。
并且,在不同的系统或者架构中,对上述垂直同步信号的定义也可能不同。例如,在另一些系统或架构中,上述垂直同步信号1的定义可以为:垂直同步信号1可以用于触发渲染一个或多个图层;垂直同步信号2的定义可以为:垂直同步信号2可以用于触发根据一个或多个图层生成图像帧;垂直同步信号3的定义可以为:垂直同步信号3可以用于触发显示图像帧。本申请实施例中,对垂直同步信号的定义不作限定。但是,无论对垂直同步信号做何种定义,只要是具备类似功能的同步信号,符合本申请实施例提供的方法的技术思路,都应涵盖在本申请的保护范围之内。
结合图1A,响应于用户操作(例如,用户对TP的触摸操作,如图3所示,手指触摸TP)或者电子设备发生UI事件,UI框架可以在垂直同步信号1到来的时刻,调用UI线程绘制触控事件对应的一个或多个图层,再调用渲染线程以对该一个或多个图层进行渲染。例如,用户操作还可以是用户通过鼠标或者按键等输入的操作。电子设备响应于用户通过鼠标或按键等输入的用户操作,也可以通过本申请实施例的方法,提升电子设备的流畅性。然后,硬件合成(Hardware Composer,HWC)可以在垂直同步信号2到来的时刻,调用合成线程对绘制的一个或多个图层(即渲染后的一个或多个图层)进行图层合成得到图像帧;最后,硬件显示模块可以在垂直同步信号3到来的时刻,在LCD(即显示屏,如上述显示屏294,此处以LCD为例)刷新显示上述图像帧。其中,上述UI事件可以是由用户对TP的触摸操作触发的。或者,该UI事件可以是由电子设备自动触发的。例如,电子设备的前台应用自动切换画面时,可以触发上述UI事件。前台应用是电子设备的显示屏当前显示的界面对应的应用。
需要注意的是,UI框架是基于垂直同步信号1周期性的进行图层绘制和渲染的;硬件合成HWC是基于垂直同步信号2周期性的进行图层合成的;LCD是基于垂直同步信号3周期性的进行图像帧刷新的。
其中,垂直同步信号3是由电子设备的显示屏驱动触发的硬件信号。本申请实施例中,垂直同步信号3(如HW_VSYNC)的信号周期T3是根据电子设备的显示屏的屏幕刷新率确定的。具体的,垂直同步信号3的信号周期T3是电子设备的显示屏(如LCD)的屏幕刷新率的倒数。
例如,电子设备的显示屏的屏幕刷新率可以为60赫兹(Hz)、70Hz、75Hz或者80Hz等任一值。以屏幕刷新率是60Hz为例,上述垂直同步信号3的信号周期T3=1/60=0.01667秒(s)=16.667毫秒(ms)。需要注意的是,其中,电子设备可能支持多个不同的屏幕刷新率。例如,假设电子设备支持的最大屏幕刷新率为80Hz。那么,该电子设备则可能支持屏幕刷新率80Hz、60Hz或者40Hz等。本申请实施例中所述的屏幕刷新率是电子设备当前所使用的屏幕刷新率。即垂直同步信号3的信号周期T3是电子设备当前所使用的屏幕刷新率的倒数。
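The relationship between the screen refresh rate currently in use and the signal period T3 can be expressed directly (an illustrative Python sketch of the reciprocal relationship described above):

```python
def sync_period_ms(refresh_rate_hz):
    """Signal period T3 of vertical sync signal 3, in milliseconds:
    the reciprocal of the screen refresh rate currently in use."""
    return 1000.0 / refresh_rate_hz
```

At 60 Hz this gives about 16.667 ms, matching the example above; at 80 Hz it gives 12.5 ms.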
需要注意的是,本申请实施例中的垂直同步信号3是周期性离散信号。例如,如图5所示,每间隔一个信号周期(如T3)就会有一个由硬件驱动触发的垂直同步信号3,图5中多次出现的垂直同步信号3是按照垂直同步信号3的信号周期T3依次地到来。
而垂直同步信号1和垂直同步信号2是基于垂直同步信号3产生的。即垂直同步信号3可以是垂直同步信号1和垂直同步信号2的信号源;或者,垂直同步信号1和垂直同步信号2与垂直同步信号3同步。故垂直同步信号1和垂直同步信号2的信号周期与垂直同步信号3的信号周期相同,且相位一致。例如,如图5所示,垂直同步信号1的信号周期T1,垂直同步信号2的信号周期T2和垂直同步信号3的信号周期T3相同。即T1=T2=T3。并且,如图 5所示,垂直同步信号1、垂直同步信号2,以及垂直同步信号3的相位一致。可以理解的是,实际实施过程中,垂直同步信号1、垂直同步信号2,以及垂直同步信号3之间可能会因为各种因素(如处理性能)存在一定的相位误差。需要注意的是,在理解本申请实施例的方法时,上述相位误差被忽略。
需要注意的是,上述垂直同步信号1和垂直同步信号2也是周期性离散信号。例如,如图5所示,每间隔一个信号周期(如T1)就会有一个垂直同步信号1,每间隔一个信号周期(如T2)就会有一个垂直同步信号2,即图5中多次出现的垂直同步信号1是按照垂直同步信号1的信号周期T1依次地到来,图5中多次出现的垂直同步信号2是按照垂直同步信号2的信号周期T2依次地到来。因此,上述垂直同步信号3、垂直同步信号1和垂直同步信号2都可以看做是周期性离散信号。
由于垂直同步信号1、垂直同步信号2和垂直同步信号3都是周期性信号;因此,本申请实施例所述的垂直同步信号到来(如垂直同步信号1到来),都是指该垂直同步信号的脉冲边缘到来;响应于垂直同步信号(如响应于垂直同步信号1),都是指响应于该垂直同步信号的脉冲边缘。例如,如图5所示,t 1时刻的垂直同步信号1到来,是指t 1时刻垂直同步信号1的脉冲边缘到来;响应于t1时刻的垂直同步信号1,是指响应于t1时刻垂直同步信号1的脉冲边缘。
其中,上述脉冲边缘是一种从示波器或者观测系统中形象地观察到的脉冲的边缘。在不同系统中可能是上升沿或者下降沿或者二者都包括,在实际系统中上可以是通过定时器翻转、中断信号等方式实现。
本申请实施例中,上述垂直同步信号1、垂直同步信号2和垂直同步信号3的信号周期都可以称为同步周期T Z。即T1=T2=T3=T Z。也就是说,本申请实施例中的同步周期是电子设备的显示屏的屏幕刷新率的倒数。其中,图3所示的Frame1、Frame2、Frame3和Frame4均为上述同步周期。例如,电子设备的显示屏的屏幕刷新率可以为60赫兹(Hz)、70Hz、75Hz或者80Hz等任一值。以屏幕刷新率可以是60Hz为例,上述同步周期T Z=1/60=0.01667秒(s)=16.667毫秒(ms),即T1=T2=T3=T Z=16.667ms。
其中，TP是触控面板，可以集成在上述显示屏294中。TP也称为触摸传感器，如上述触摸传感器280K。TP可以周期性检测用户的触摸操作。TP检测到触摸操作后，可以唤醒上述垂直同步信号1和垂直同步信号2，以触发UI框架基于垂直同步信号1进行图层绘制和渲染，硬件合成HWC基于垂直同步信号2进行图层合成。其中，TP检测触摸操作的检测周期与垂直同步信号3(如HW_VSYNC)的信号周期T3相同。
可以理解,由于绘制线程是基于垂直同步信号1进行图层绘制的,然后再由渲染线程进行图层渲染;合成线程是基于垂直同步信号2(如VSYNC_SF)进行图层合成的;因此,电子设备进行图层的绘制、渲染和合成需要在两个同步周期才可以完成。
例如,如图3所示,图层的绘制和渲染在Frame2完成,而图像帧合成则在Frame3完成。其中,图3所示的Frame1、Frame2、Frame3和Frame4分别对应一个同步周期。
但是,在一些情况下,电子设备进行图层的绘制、渲染和合成所需的时长可能小于或等于一个同步周期。也就是说,在上述两个同步周期(如图3所示的Frame2和Frame3)中,电子设备可能只占用了一部分时间进行图层的绘制、渲染和合成;其他时间则在等待垂直同步信号2和垂直同步信号3的到来。如此,则会造成电子设备的响应延迟不必要的延长,影响电子设备的流畅性(如跟手性能)。
本申请实施例提供的方法中,可以在电子设备进行图层的绘制、渲染和合成所需的时长 满足单帧渲染要求(如上述所需的时长小于或等于一个同步周期)时,在一个同步周期内进行图层的绘制、渲染和合成。这样,便可以缩短图1B所示的“绘制延迟”、“渲染延迟”和“合成延迟”,以缩短电子设备的响应延迟,提升电子设备的流畅性(如跟手性能)。具体的,通过本申请实施例的方法,可以提高电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性。即电子设备缩短响应延迟的收益点可以为一个同步周期。
本申请实施例提供的一种基于垂直同步信号的图像处理方法的执行主体可以是生成图像帧的装置。该生成图像帧的装置可以是上述电子设备中的任一种(例如,相位调整装置可以为图2所示的电子设备200)。或者,该生成图像帧的装置还可以为电子设备的CPU,或者电子设备中的用于执行所述基于垂直同步信号的图像处理方法的控制模块。本申请实施例中以电子设备执行基于垂直同步信号的图像处理方法为例,说明本申请实施例提供的基于垂直同步信号的图像处理方法。
以下实施例中,以第一垂直同步信号是上述垂直同步信号1(如VSYNC_APP信号),第三垂直同步信号是上述垂直同步信号2(如VSYNC_SF信号),第二垂直同步信号是上述垂直同步信号3(如HW_VSYNC信号)为例,对本申请实施例的方法进行说明。
实施例(一)
本申请实施例提供一种基于垂直同步信号的图像处理方法。如图4所示,该基于垂直同步信号的图像处理方法可以包括S401-S402。
S401、响应于垂直同步信号1,电子设备绘制一个或多个第一图层,并渲染一个或多个第一图层,并在上述一个或多个第一图层渲染完成后,对渲染的一个或多个第一图层进行图层合成,以得到第一图像帧。
一般而言,垂直同步信号1相对于垂直同步信号3的延迟时间为零,相位差为零。例如,如图5所示,垂直同步信号1、垂直同步信号2相对于垂直同步信号3的延迟时间为零,相位差为零。其中,在垂直同步信号1的信号周期T1、垂直同步信号2的信号周期T2与垂直同步信号3的信号周期T3相同(即T1=T3=T Z)的前提下,垂直同步信号1、垂直同步信号2相对于垂直同步信号3的延迟时间为零,具体可以为:在一个垂直同步信号3到来的同时,也会有一个垂直同步信号1和一个垂直同步信号2到来。
例如,如图5或图6A中的(a)所示,在t 1时刻,一个垂直同步信号1、一个垂直同步信号2和一个垂直同步信号3同时到来;在t 2时刻,一个垂直同步信号1、一个垂直同步信号2和一个垂直同步信号3同时到来;在t 3时刻,一个垂直同步信号1、一个垂直同步信号2和一个垂直同步信号3同时到来;在t 4时刻,一个垂直同步信号1、一个垂直同步信号2和一个垂直同步信号3同时到来;在t 5时刻,一个垂直同步信号1、一个垂直同步信号2和一个垂直同步信号3同时到来。
示例性的,如图6A中的(a)所示,t 1时刻,一个垂直同步信号1到来;响应于t 1时刻的垂直同步信号1,电子设备可以执行绘制1和渲染1;t 2时刻,一个垂直同步信号1到来;响应于t 2时刻的垂直同步信号1,电子设备可以执行绘制2和渲染2。
一般而言,电子设备(即电子设备的HWC)是基于垂直同步信号2进行图层合成的。也就是说,即使电子设备(即电子设备的UI线程和渲染线程)已经完成一个或多个第一图层的渲染,但是如果没有检测到垂直同步信号2,HWC是不会进行对渲染的一个或多个第一图层进行图层合成的。HWC只会在垂直同步信号2的时刻到来时进行图层合成以得到图像帧。
例如,如图6A中的(a)所示,即使电子设备已经在t 6时刻完成图层的渲染(即“渲染1”),但是t 6时刻之后的t 2时刻,垂直同步信号2才可以到来;响应于t 2时刻的垂直同步信号 2,电子设备(即电子设备的HWC)才可以进行图层合成(即执行“图像帧合成1”)得到第一图像帧。也就是说,电子设备需要等待图6A中的(a)所示的Δt1,才可以执行“图像帧合成1”。
又例如,如图6A中的(a)所示,即使电子设备已经在t 7时刻完成图层的渲染(即“渲染2”),但是t 7时刻之后的t 3时刻,垂直同步信号2才可以到来;响应于t 3时刻的垂直同步信号2,电子设备(即电子设备的HWC)才可以进行图层合成(即执行“图像帧合成2”)得到第一图像帧。也就是说,电子设备需要等待图6A中的(a)所示的Δt2,才可以执行“图像帧合成2”。
如此,电子设备进行图层的绘制、渲染和合成需要在两个同步周期才可以完成。例如,如图6A中的(a)所示,“绘制1”和“渲染1”在t 1时刻至t 2时刻这一同步周期完成,而“图像帧合成1”则在t 2时刻至t 3时刻这一同步周期完成。又例如,如图6A中的(a)所示,“绘制2”和“渲染2”在t 2时刻至t 3时刻这一同步周期完成,而“图像帧合成2”则在t 3时刻至t 4时刻这一同步周期完成。
而本申请实施例中,电子设备不需要等待垂直同步信号2,响应于垂直同步信号1,绘制并渲染一个或多个第一图层渲染后,即在一个或多个第一图层渲染完成后,便可以开始对渲染的第一图层进行图层合成,以得到第一图像帧。也就是说,电子设备可以提前对渲染的第一图层进行图层合成。
例如,如图6A中的(b)所示,响应于垂直同步信号1,电子设备可以执行“绘制1”、“渲染1”和“图像帧合成1”。如图6A中的(b)所示,在t 6时刻,“绘制1”和“渲染1”已结束。电子设备可以在t 2时刻的垂直同步信号2到来之前,在t 6时刻便开始进行图层合成,即执行“图像帧合成1”。即电子设备不需要等待t 2时刻的垂直同步信号2到来,便可以开始执行“图像帧合成1”。
又例如,如图6A中的(b)所示,响应于垂直同步信号1,电子设备可以执行“绘制2”、“渲染2”和“图像帧合成2”。如图6A中的(b)所示,在t 7时刻,“绘制2”和“渲染2”已结束。电子设备可以在t 3时刻的垂直同步信号2到来之前,在t 6时刻便开始进行图层合成,即执行“图像帧合成2”。即电子设备不需要等待t 3时刻的垂直同步信号2到来,便可以开始执行“图像帧合成2”。
需要注意的是,在上述一个或多个第一图层仅包括一个第一图层的情况下,电子设备对这个一个第一图层进行图层合成,具体包括:电子设备对这一个第一图层进行格式转换,将这一个第一图层转换为第一图像帧。在上述一个或多个第一图层包括多个第一图层的情况下,电子设备对该多个第一图层进行图层合成,具体包括:电子设备对该多个第一图层进行图层合成,得到第一图像帧。
示例性的,S401中,电子设备对渲染的一个或多个第一图层进行图层合成以得到第一图像帧,具体可以包括:渲染线程完成一个或多个第一图层的渲染后,可以调用合成线程对渲染的一个或多个第一图层进行图层合成以得到第一图像帧;或者,渲染线程完成一个或多个第一图层的渲染后,向合成线程发送指示消息,以触发合成线程对渲染的一个或多个第一图层进行图层合成以得到第一图像帧;或者,合成线程可以在检测到渲染线程完成一个或多个第一图层的渲染时,对渲染的一个或多个第一图层进行图层合成以得到第一图像帧。
可以理解的是,S401中,电子设备可以在完成一个或多个第一图层的渲染后,立即对渲染的一个或多个第一图层进行图层合成。或者,S401中,电子设备在完成一个或多个第一图层的渲染后,可以在一定的延迟时间后,对渲染的一个或多个第一图层进行图层合成。也就 是说,在实际实施过程中,电子设备执行图层渲染和图层合成之间可以存在一定的延迟。
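The timing benefit of starting layer composition as soon as rendering completes, instead of waiting for vertical synchronization signal 2, can be illustrated with a simplified timing model (an illustrative Python sketch with hypothetical durations; it is not the actual compositor implementation):

```python
import math

def display_time(draw_render_ms, compose_ms, period_ms=16.667,
                 early_composition=True):
    """Time (from the vertical sync signal 1 that started drawing) at which
    the composed frame can first be displayed.

    early_composition=True: composition starts as soon as rendering ends,
    as in the embodiments; otherwise composition waits for the next
    vertical sync signal 2, as in the conventional scheme."""
    render_done = draw_render_ms
    if early_composition:
        compose_start = render_done
    else:
        # wait for the next vertical sync signal 2 tick
        compose_start = math.ceil(render_done / period_ms) * period_ms
    compose_done = compose_start + compose_ms
    # displayed at the first vertical sync signal 3 after composition ends
    return math.ceil(compose_done / period_ms) * period_ms
```

With hypothetical durations of 10 ms for drawing plus rendering and 4 ms for composition, the early-composition path displays the frame at the first sync tick (~16.667 ms), while the conventional path displays it one full synchronization period later (~33.334 ms), which is exactly the one-period saving described above.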
S402、响应于垂直同步信号3,电子设备刷新显示第一图像帧。
如图6A中的(b)所示,在t 2时刻,一个垂直同步信号3到来;响应于t 2时刻的垂直同步信号3,电子设备可以刷新显示执行“图像帧合成1”得到的图像帧(如第一图像帧),即执行“图像帧显示1”;在t 3时刻,一个垂直同步信号3到来;响应于t 3时刻的垂直同步信号3,电子设备刷新显示执行“图像帧合成2”得到的图像帧,即执行“图像帧显示2”。
实施例(一)中缩短电子设备的响应延迟的原理及效果分析。
本申请实施例这里结合图6A中的(b),对电子设备执行S401后,缩短电子设备的响应延迟的原理及效果进行分析说明:
(1)电子设备进行图层的绘制、渲染和合成所需的时长小于或等于一个同步周期的情况(即第一种情况)下,本申请实施例中缩短响应延迟的原理及效果说明。
其中,电子设备进行图层的绘制、渲染和合成所需的时长可能小于或等于一个同步周期。在这种情况下,电子设备执行S401可以在一个同步周期(如第一同步周期)完成图层的绘制、渲染和合成。
例如,如图6A中的(b)所示,电子设备在t 1时刻至t 2时刻这一同步周期完成了“绘制1”、“渲染1”和“图像帧合成1”;电子设备在图6A中的(b)所示的在t 2时刻至t 3时刻这一同步周期完成了“绘制2”、“渲染2”和“图像帧合成2”。
对比图6A中的(b)和图6A中的(a)可知:采用本申请实施例的方法,电子设备可以提前一个同步周期(如t 2时刻相比于t 3时刻,提前一个同步周期T Z)刷新显示第一图像帧。也就是说,通过本申请实施例的方法,可以将电子设备的响应延迟缩短一个同步周期,可以提升电子设备的流畅性(如跟手性能)。
(2)电子设备进行图层的绘制、渲染和合成所需的时长大于一个同步周期的情况(即第二种情况)下,本申请实施例中缩短响应延迟的原理及效果说明。
其中,电子设备进行图层的绘制、渲染和合成所需的时长,或者电子设备进行图层的绘制和渲染所需的时长可能大于一个同步周期。在这种情况下,采用常规的方案“响应于垂直同步信号2,进行图层合成”,显示屏刷新显示图像帧的过程中,可能会出现丢帧的现象。具体的,显示屏刷新显示图像帧的过程中,可能会显示一帧重复图像。这样,会影响显示屏显示图像的流畅性,从而影响用户的视觉体验。
例如,如图7中的(a)所示,在t 1时刻,一个垂直同步信号1到来;响应于t 1时刻的垂直同步信号1,电子设备执行“绘制1”和“渲染1”;在t 2时刻,一个垂直同步信号2到来;响应于t 2时刻的垂直同步信号2,电子设备执行“图像帧合成1”;在t 3时刻,一个垂直同步信号3到来;响应于t 3时刻的垂直同步信号3,电子设备执行“图像帧显示1”。如图7中的(a)所示,在t 2时刻,一个垂直同步信号1到来;响应于t 2时刻的垂直同步信号1,电子设备执行“绘制2”和“渲染2”。由于“绘制2”和“渲染2”无法在一个同步周期(如t 2时刻到t 3时刻这一同步周期)内完成,即电子设备在t 3时刻的垂直同步信号2到来之前,未完成“渲染2”;因此,电子设备只能等待t 4时刻的垂直同步信号2到来,响应于t 4时刻的垂直同步信号2,执行“图像帧合成2”。如此,电子设备也只能等待t 5时刻的垂直同步信号3到来,响应于t 5时刻的垂直同步信号3,电子设备执行“图像帧显示2”。
结合上述描述,由图7中的(a)可知:由于“绘制2”和“渲染2”无法在一个同步周期(如t 2时刻到t 3时刻这一同步周期)内完成,导致电子设备只能等待t 4时刻的垂直同步信号2到来,响应于t 4时刻的垂直同步信号2执行“图像帧合成2”;因此,在t 4时刻到t 5时刻 这一同步周期,显示屏显示图像出现丢帧现象,即显示屏会显示一帧重复图像,即显示屏在t 4时刻到t 5时刻继续执行“图像帧显示1”。其中,显示屏在t 4时刻到t 5时刻显示的图像帧与在t 3时刻到t 4时刻显示的图像帧是同一个图像帧。
而通过本申请实施例的方法,可以避免显示图像出现丢帧现象,以避免显示屏显示一帧重复图像。也就是说,通过本申请实施例的方法可以保证显示屏显示图像的流畅性,从而提升用户的视觉体验。
例如,如图7中的(b)所示,响应于垂直同步信号1,电子设备可以执行“绘制1”、“渲染1”和“图像帧合成1”。如图7中的(b)所示,在t 9时刻,“绘制1”和“渲染1”已结束。电子设备可以在t 2时刻的垂直同步信号2到来之前,在t 9时刻便开始进行图层合成,即执行“图像帧合成1”。即电子设备不需要等待t 2时刻的垂直同步信号2到来,便可以开始执行“图像帧合成1”。
又例如,如图7中的(b)所示,响应于垂直同步信号1,电子设备可以执行“绘制2”、“渲染2”和“图像帧合成2”。如图7中的(b)所示,在t 10时刻,“绘制2”和“渲染2”已结束。电子设备可以在t 4时刻的垂直同步信号2到来之前,在t 10时刻便开始进行图层合成,即执行“图像帧合成2”。即电子设备不需要等待t 4时刻的垂直同步信号2到来,便可以开始执行“图像帧合成2”。
再例如，如图7中的(b)所示，响应于垂直同步信号1，电子设备可以执行“绘制3”、“渲染3”和“图像帧合成3”。如图7中的(b)所示，在t11时刻，“绘制3”和“渲染3”已结束。电子设备可以在t5时刻的垂直同步信号2到来之前，在t11时刻便开始进行图层合成，即执行“图像帧合成3”。即电子设备不需要等待t5时刻的垂直同步信号2到来，便可以开始执行“图像帧合成3”。
由图7中的(b)可知:电子设备在t 10时刻开始执行“图像帧合成2”,在t 4时刻的垂直同步信号3到来之前便可以完成“图像帧合成2”。这样,电子设备便可以在t 4时刻的垂直同步信号3到来后,响应于t 4时刻的垂直同步信号3,执行“图像帧显示2”。如此,便可以避免显示屏显示图像出现丢帧现象,可以保证显示屏显示图像的流畅性,从而提升用户的视觉体验。
需要注意的是,在上述第二种情况下,电子设备虽然无法在一个同步周期完成图层的绘制、渲染和合成;如图7中的(b)所示,电子设备在t 1时刻到t 3时刻的两个同步周期才完成“绘制1”、“渲染1”和“合成1”;在t 2时刻到t 4时刻的两个同步周期才完成“绘制2”、“渲染2”和“合成2”;但是,可以避免显示屏显示图像出现丢帧现象,可以保证显示屏显示图像的流畅性,从而提升用户的视觉体验。并且,电子设备提前进行图层合成,可以提高电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性。
实施例(二)
本申请实施例,对电子设备针对一个或多个应用,执行上述方法的原理说明。
上述的一个或多个图层可以包括:电子设备执行一个或多个应用对应的绘制任务所绘制的图层。该一个或多个应用可以包括:一个或多个系统级应用,以及一个或多个用户级应用中的至少一个。例如,该系统级应用可以包括:状态栏、launcher、导航栏和壁纸等。上述用户级应用可以包括:“设置”“电话”和“短消息”等电子设备的系统应用,以及电子设备可响应于用户的操作从应用商店下载的第三方应用。例如,第三方应用可以包括“微信”、“支付宝”、“百度地图”等应用。本申请实施例所述的绘制任务包括“图层绘制”和“图层渲染”。
相应的，上述S401具体可以包括S401a和S401b。S401a：电子设备响应于垂直同步信号1，分别针对一个或多个应用中的每个应用，绘制一个或多个第一图层，并渲染一个或多个第一图层。S401b：电子设备对电子设备针对一个或多个应用渲染的一个或多个第一图层进行图层合成，以得到第一图像帧。
需要注意的是,本申请实施例中,电子设备可以对电子设备针对一个或多个应用渲染的图层,统一进行图层合成得到一个图像帧。或者,电子设备可以对电子设备针对每个应用渲染的图层,分别进行图层合成得到多个图像帧。本申请实施例(如S401b)中,以电子设备对电子设备针对一个或多个应用渲染的图层,统一进行图层合成得到一个图像帧为例,进行说明。
以下通过两种应用场景,对电子设备针对一个或多个应用,执行本申请实施例的方法,进行图层的绘制、渲染、合成以及图像帧显示的原理进行说明。
在应用场景(一)中,上述一个或多个应用可以仅包括一个应用。电子设备可以执行该一个应用对应的绘制任务,绘制(即绘制并渲染)一个或多个图层。即电子设备可以针对这一个应用执行S401a,以绘制并渲染对应的一个或多个图层。然后,电子设备可以执行S401b,对针对这一个应用渲染的一个或多个图层进行图层合成。
示例性的,以上述一个应用是应用1为例,电子设备可以执行S401a,针对应用1绘制并渲染一个或多个图层1;然后,电子设备可以执行S401b,对针对该应用1渲染的一个或多个图层1进行图层合成,得到图像帧1。
例如,如图6A中的(b)所示,电子设备响应于t 1时刻的垂直同步信号1,可以执行“绘制1”和“渲染1”,得到渲染的一个或多个图层1;然后,对渲染的一个或多个图层1进行图层合成,即执行“图像帧合成1”,得到图像帧1;响应于t 2时刻的垂直同步信号3,电子设备可执行“图像帧显示1”,刷新显示图像帧1。
在应用场景(二)中,上述一个或多个应用可以包括多个应用。电子设备可以分别执行该多个应用对应的绘制任务,分别针对该多个应用中的每个应用绘制(即绘制并渲染)一个或多个图层。即电子设备可以分别针对这多个应用中的每个应用执行S401a,以绘制并渲染对应的一个或多个图层。然后,电子设备可以执行S401b,对针对这多个应用分别渲染的一个或多个图层进行图层合成。即S401b中,渲染的一个或多个图层可以包括:电子设备执行多个应用对应的绘制任务,得到的图层。
示例性的,以上述多个应用是2个应用(包括应用1和应用a)为例,电子设备可以针对应用1,绘制并渲染得到一个或多个图层1;针对应用a绘制并渲染得到一个或多个图层a。然后,电子设备可以对上述一个或多个图层1和上述一个或多个图层a,进行图层合成得到图像帧a。
例如,如图6B所示,电子设备响应于t 1时刻的垂直同步信号1,可以针对应用1执行“绘制1”和“渲染1”,得到上述一个或多个图层1,针对应用a执行“绘制a”和“渲染a”,得到上述一个或多个图层a;然后,电子设备可以对一个或多个图层1和一个或多个图层a,执行“图像帧合成1+a”得到图像帧a;响应于t 2时刻的垂直同步信号3,电子设备可执行“图像帧显示1+a”,刷新显示图像帧a。又例如,如图6B所示,电子设备响应于t 2时刻的垂直同步信号1,可以针对应用1执行“绘制2”和“渲染2”,得到上述一个或多个图层2,针对应用a执行“绘制b”和“渲染b”,得到上述一个或多个图层b;然后,电子设备可以对一个或多个图层2和一个或多个图层b,执行“图像帧合成2+b”得到图像帧b;响应于t 3时刻的垂直同步信号3,电子设备可执行“图像帧显示2+b”,刷新显示图像帧b。
需要注意的是,图6B所示的“图像帧合成1+a”和“图像帧合成2+b”均表示一次图层合成。电子设备执行“图像帧显示1+a”或“图像帧显示2+b”时,刷新显示的也是一帧图像。
由上述描述可知,本申请实施例的方法不仅可以应用于电子设备以单窗口显示一个应用的界面的场景中,还可以应用于电子设备以多窗口显示多个应用的界面的场景中。例如,本申请实施例的方法可以应用于电子设备横屏显示多窗口的场景中,还可以应用于折叠屏电子设备以多窗口显示多个应用的界面的场景中。其中,电子设备以单窗口或多窗口显示应用的界面的方法,本申请实施例这里不予赘述。
需要注意的是,本申请实施例中,以电子设备针对一个应用,执行本申请实施例的方法为例进行说明。但是,并不表示本申请实施例提供的方法中,电子设备不能同时针对多应用进行图层的绘制、渲染、合成以及图像帧显示。
实施例(三)
本实施例针对上述应用场景(二),即多个应用的场景,对电子设备执行S401b的具体方法进行说明。
在情况(1)中,电子设备可以在上述多个应用中所有应用的图层渲染均结束时,对已经渲染的图层进行图层合成,得到第一图像帧。
例如,如图6B所示,电子设备响应于t 1时刻的垂直同步信号1,可以针对应用1执行“绘制1”和“渲染1”,得到上述一个或多个图层1,针对应用a执行“绘制a”和“渲染a”,得到上述一个或多个图层a。如图6B所示,电子设备在t 15时刻完成“渲染a”,在t 6时刻完成“渲染1”。在情况(1)中,即使电子设备在t 15时刻已经完成“渲染a”;但是,由于电子设备还未完成“渲染1”;因此,电子设备在t 15时刻不会开始进行图层合成(即执行“图像帧合成1+a”)。如图6B所示,在t 6时刻电子设备已完成“渲染a”和“渲染1”;因此,电子设备可以执行“图像帧合成1+a”,以得到图像帧a。响应于t 2时刻的垂直同步信号3,电子设备可执行“图像帧显示1+a”,即刷新显示图像帧a。
同理,在t 16时刻之后的t 7时刻电子设备已完成“渲染b”和“渲染2”;因此,电子设备可以执行“图像帧合成2+b”,以得到图像帧b。响应于t 3时刻的垂直同步信号3,电子设备可执行“图像帧显示2+b”,即刷新显示图像帧b。
在情况(2)中,电子设备可以在上述多个应用中部分应用的图层渲染均结束时,对已经渲染的图层进行图层合成,得到第一图像帧。
在情况(2)中,上述S401b可以替换为:电子设备在一个或多个应用中的焦点应用、关键应用或者与电子设备的流畅性强相关的应用等应用的一个或多个第一图层渲染结束时,对电子设备针对一个或多个应用已渲染的一个或多个第一图层进行图层合成,以得到第一图像帧。该焦点应用是电子设备接收的用户触摸操作对应的应用,即实际焦点所聚焦的应用。
例如,以焦点应用为例。结合上述实例,假设应用a是焦点应用。如图6C所示,电子设备在t 15时刻完成应用a的“渲染a”,在t 6时刻完成应用1的“渲染1”。在情况(2)中,虽然电子设备在t 15时刻还未完成“渲染1”;但是,电子设备在t 15时刻已经完成焦点应用a的“渲染a”;因此,电子设备可以在t 15时刻进行图层合成(即执行“图像帧合成a”)。又例如,如图6C所示,电子设备在t 16时刻完成应用a的“渲染b”,在t 6时刻完成应用1的“渲染2”。在情况(2)中,虽然电子设备在t 16时刻还未完成“渲染2”;但是,电子设备在t 16时刻已经完成焦点应用a的“渲染b”;因此,电子设备可以在t 16时刻进行图层合成(即执行“图像帧合成1+b”)。
可以理解,由于应用a是焦点应用;因此,用户更加关注应用a的界面变化。也就是说,用户更加关注焦点应用a的界面变化对电子设备的流畅性的影响。上述实例中,电子设备在t 15时刻执行“图像帧合成a”,在t 16时刻执行“图像帧合成1+b”,也可以在界面上体现出焦点应用a的界面变化,可以提升用户对电子设备的流畅性的视觉体验。
在情况(3)中,电子设备可以在上述多个应用中部分应用的预设图层渲染均结束时,对已经渲染的图层进行图层合成,得到第一图像帧。
示例性的,电子设备可以在焦点应用或者与电子设备的流畅性强相关的应用等应用的预设图层渲染结束时,对电子设备针对已渲染的一个或多个第一图层进行图层合成,以得到第一图像帧。其中,该预设图层可以是电子设备针对焦点应用或者与电子设备的流畅性强相关的应用等应用绘制的一个或多个第一图层中,图层面积与显示屏的面积的比值大于预设比例阈值的图层。或者,该预设图层可以是电子设备针对焦点应用、关键应用或者与电子设备的流畅性强相关的应用等应用绘制的一个或多个第一图层中,与电子设备接收的用户触摸操作对应的图层,即实际焦点所聚焦的图层。或者,该预设图层可以是电子设备针对焦点应用、关键应用或者与电子设备的流畅性强相关的应用等应用绘制的一个或多个第一图层中,与电子设备的流畅性强相关的图层。
例如,以焦点应用的预设图层为例。结合上述实例,假设应用a是焦点应用。如图6D所示,电子设备在t 15时刻完成应用a的“渲染a”,在t 6时刻完成应用1的“渲染1”;电子设备在t 15时刻之前的t 17时刻已完成“渲染a”中预设图层的渲染。其中,“渲染a”可以包括“渲染a1”和“渲染a2”。“渲染a1”是“渲染a”中预设图层的渲染。因此,如图6D所示,电子设备可以在t 17时刻进行图层合成(即执行“图像帧合成a1”),即对“渲染a1”渲染得到的预设图层进行图层合成。又例如,如图6D所示,电子设备在t 16时刻完成应用a的“渲染b”,在t 6时刻完成应用1的“渲染2”;电子设备在t 16时刻之前的t 18时刻已完成“渲染b”中预设图层的渲染。其中,“渲染b”可以包括“渲染b1”和“渲染b2”。“渲染b1”是“渲染b”中预设图层的渲染。因此,电子设备可以在t 18时刻进行图层合成(即执行“图像帧合成1+a2+b1”),即对“渲染1”、“渲染a2”和“渲染b1”渲染得到的预设图层进行图层合成。
可以理解,由于应用a是焦点应用;因此,用户更加关注应用a的界面变化。而焦点应用a的界面变化主要体现在上述预设图层的变化。也就是说,用户更加关注焦点应用a的预设图层的变化对电子设备的流畅性的影响。上述实例中,电子设备在t 17时刻执行“图像帧合成a1”,在t 18时刻执行“图像帧合成1+a2+b1”,也可以在界面上体现出焦点应用a的预设图层的变化,可以提升用户对电子设备的流畅性的视觉体验。
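上述情况(1)、情况(2)和情况(3)的区别在于图层合成的触发时机。以下是一个示意性片段(并非本申请的实现;其中的应用名、图层名“preset”以及各时间戳均为说明性假设):

```python
def compose_start(finish_times, strategy, focus=None):
    """三种图层合成触发策略的示意(时间戳为假设值,单位ms):
    'all'    情况(1):所有应用的图层渲染均结束后才开始合成;
    'focus'  情况(2):焦点应用的图层渲染结束即开始合成;
    'preset' 情况(3):焦点应用的预设图层渲染结束即开始合成。
    finish_times: {应用名: {图层名: 渲染完成时刻}}。"""
    if strategy == "all":
        return max(max(layers.values()) for layers in finish_times.values())
    if strategy == "focus":
        return max(finish_times[focus].values())
    if strategy == "preset":
        # 假设预设图层以"preset"为键标记
        return finish_times[focus]["preset"]
    raise ValueError(strategy)
```

可以看出,情况(3)的合成开始时刻不晚于情况(2),情况(2)不晚于情况(1),与正文中t 17早于t 15、t 15早于t 6的时序一致。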
需要注意的是,本申请实施例中,上述应用或图层的关键程度,或者其对电子设备的流畅性的影响程度,可以根据统计数据(例如在实验室中统计得到的数据)识别,或者根据应用(例如在开发时)预设的优先级信息识别和比较得到,也可以根据用户的行为、关注焦点等进行分析和预测得到。本申请实施例这里对电子设备确定应用的关键程度,以及应用对电子设备的流畅性的影响程度的具体方法不作限定。
实施例(四)
结合实施例(一),本实施例中,电子设备还可以对电子设备的硬件资源进行正向调度,以缩短电子设备进行图层的绘制、渲染和合成所需的时长。具体的,电子设备可以执行以下硬件资源调度中的一项或多项,以缩短电子设备进行图层绘制、图层渲染和/或图层合成所需的时长。其中,上述正向调度可以包括:调高电子设备的处理器的工作频率,选择大核的处理器执行上述方法,以及调高电子设备的内存工作频率。该处理器可以包括CPU和/或GPU。
可以理解,处理器的工作频率越高,该处理器的运算速度越快,电子设备进行图层的绘制、渲染和合成所需的时长则越短。并且,大核处理器的运算速度快于小核处理器的运算速度。电子设备的内存工作频率越高,该电子设备的读写速度越快,电子设备进行图层的绘制、渲染和合成所需的时长则越短。为了缩短电子设备进行图层的绘制、渲染和合成所需的时长,缩短电子设备的响应延迟,电子设备可以对电子设备的硬件资源进行正向调度。
其中,电子设备对电子设备的硬件资源进行正向调度,可以缩短电子设备进行图层的绘制、渲染和合成所需的时长。这样,可以提高电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性,从而缩短电子设备的响应延迟。
在一种实现方式中,电子设备可以根据上述第一处理帧长,对电子设备的硬件资源进行正向调度,以缩短电子设备进行图层绘制、图层渲染和/或图层合成所需的时长。其中,上述第一处理帧长越大,电子设备对电子设备的硬件资源进行正向调度的幅度越大。例如,以电子设备调高处理器的工作频率为例,上述第一处理帧长越大,电子设备调整处理器的工作频率时,将处理器的工作频率调整的越高。
在另一种实现方式中,电子设备可以根据第一统计周期内电子设备在一个同步周期内完成图层绘制、图层渲染和图层合成的次数或者概率,对电子设备的硬件资源进行正向调度,以缩短电子设备进行图层绘制、图层渲染和/或图层合成所需的时长。其中,上述概率是第一统计周期内电子设备在一个同步周期内完成图层绘制、图层渲染和图层合成的次数与总次数的比值。
其中,第一统计周期内电子设备在一个同步周期内完成图层绘制、图层渲染和图层合成的次数越少,概率越小;电子设备对电子设备的硬件资源进行正向调度的幅度越大。例如,以电子设备调高处理器的工作频率为例,第一统计周期内电子设备在一个同步周期内完成图层绘制、图层渲染和图层合成的次数越少,概率越小;电子设备调整处理器的工作频率时,将处理器的工作频率调整的越高。
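按单帧渲染合成的次数或概率决定提频幅度的思路,可以用如下示意性片段表示(并非本申请的实现;最大提频幅度max_step_hz为假设值):

```python
def boost_amount(hits, total, max_step_hz=200_000_000):
    """第一统计周期内电子设备在一个同步周期内完成图层绘制、渲染和
    合成的次数hits与总次数total之比(即概率)越小,正向调度的幅度越大。"""
    probability = hits / total if total else 1.0
    return int(max_step_hz * (1.0 - probability))
```

例如,概率为0时取最大提频幅度,概率为1时不提频,两者之间按比例取值;比例系数在实际系统中需要结合处理器的频率档位确定,此处仅作示意。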
在另一种实现方式中,电子设备可以根据电子设备的前台应用,对电子设备的硬件资源进行正向调度,以缩短电子设备进行图层绘制、图层渲染和/或图层合成所需的时长。该前台应用可以是电子设备的显示屏当前显示的界面对应的应用。
可以理解,电子设备中可以安装多个应用。电子设备在前台运行不同应用时,进行图层绘制、渲染和合成所需的时间不同。电子设备运行一个前台应用时,进行图层绘制、渲染和合成所需的时间越长;对电子设备的硬件资源进行正向调度的幅度越大。本申请实施例中,可以针对每一个应用设置一个用于对硬件资源进行正向调度的方式或者策略。
例如,以电子设备调高处理器的工作频率为例,电子设备运行前台应用时,进行图层绘制、渲染和合成所需的时长越大,电子设备调整处理器的工作频率时,将处理器的工作频率调整的越高。
在另一种实现方式中,电子设备中可以保存预设人工智能(Artificial Intelligence,AI)模型的模型代码。该预设AI模型是具备“根据‘第一处理帧长’、‘第一统计周期内电子设备在一个同步周期内完成图层绘制、图层渲染和图层合成的次数或者概率’或者‘电子设备的前台应用’,对电子设备的硬件资源进行正向调度,以提升单帧渲染合成的可能性”功能的AI模型。该预设AI模型是根据‘第一处理帧长’、‘第一统计周期内电子设备在一个同步周期内完成图层绘制、图层渲染和图层合成的次数或者概率’或者‘电子设备的前台应用’进行多次样本训练得到的。其中,单帧渲染合成是指电子设备在一个同步周期完成图层的绘制、渲染和合成。
电子设备可以运行该预设AI模型的模型代码,根据‘第一处理帧长’、‘第一统计周期内电子设备在一个同步周期内完成图层绘制、图层渲染和图层合成的次数或者概率’或者‘电子设备的前台应用’,对电子设备的硬件资源进行正向调度,以缩短电子设备进行图层的绘制、渲染和合成所需的时长。这样,可以提高电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性,从而缩短电子设备的响应延迟。
需要注意的是,电子设备对电子设备的硬件资源进行正向调度的具体方法可以参考下述实施例中的相关描述,本申请实施例这里不予赘述。
以下对本申请实施例中,电子设备通过对电子设备的硬件资源进行正向调度来缩短响应延迟的效果进行说明。例如,如图7中的(b)所示,为电子设备未对电子设备的硬件资源进行正向调度前,基于垂直同步信号进行图像处理的时序示意图。如图8A所示,为电子设备对所述电子设备的硬件资源进行正向调度后,基于垂直同步信号进行图像处理的时序示意图。对比图7中的(b)和图8A,可以得出:
电子设备在图7中的(b)所示的t 1时刻开始执行“绘制1”和“渲染1”,在t 9时刻完成“渲染1”;电子设备在图8A所示的t 1时刻开始执行“绘制1”和“渲染1”,在t 9时刻之前的t 12时刻完成“渲染1”。即电子设备执行图8A所示的“绘制1”和“渲染1”所需的时长小于电子设备执行图7中的(b)所示的“绘制1”和“渲染1”所需的时长。电子设备执行图8A所示的“图像帧合成1”所需的时长小于电子设备执行图7中的(b)所示的“图像帧合成1”所需的时长。
电子设备在图7中的(b)所示的t 2时刻开始执行“绘制2”和“渲染2”,在t 10时刻完成“渲染2”;电子设备在图8A所示的t 2时刻开始执行“绘制2”和“渲染2”,在t 10时刻之前的t 13时刻完成“渲染2”。即电子设备执行图8A所示的“绘制2”和“渲染2”所需的时长小于电子设备执行图7中的(b)所示的“绘制2”和“渲染2”所需的时长。电子设备执行图8A所示的“图像帧合成2”所需的时长小于电子设备执行图7中的(b)所示的“图像帧合成2”所需的时长。
电子设备在图7中的(b)所示的t 3时刻开始执行“绘制3”和“渲染3”,在t 11时刻完成“渲染3”;电子设备在图8A所示的t 3时刻开始执行“绘制3”和“渲染3”,在t 11时刻之前的t 14时刻完成“渲染3”。即电子设备执行图8A所示的“绘制3”和“渲染3”所需的时长小于电子设备执行图7中的(b)所示的“绘制3”和“渲染3”所需的时长。电子设备执行图8A所示的“图像帧合成3”所需的时长小于电子设备执行图7中的(b)所示的“图像帧合成3”所需的时长。
综上所述,电子设备可以缩短电子设备进行图层的绘制、渲染和合成所需的时长。这样,可以提高电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性。例如,如图8A所示,电子设备可以在t 1时刻至t 2时刻这一个同步周期(即T Z)完成“绘制1”、“渲染1”和“图像帧合成1”,在t 2时刻至t 3时刻这一个同步周期(即T Z)完成“绘制2”、“渲染2”和“图像帧合成2”,在t 3时刻至t 4时刻这一个同步周期(即T Z)完成“绘制3”、“渲染3”和“图像帧合成3”。
实施例(五)
本实施例对电子设备对电子设备的硬件资源进行调度(如正向调度或负向调度)的具体条件进行说明。即电子设备可以在以下条件下,对电子设备的硬件资源进行调度。
(1)、电子设备在第一统计周期的第一处理帧长大于预设单帧帧长时,对电子设备的硬件资源进行正向调度。
其中,本申请实施例中的预设单帧帧长小于或等于同步周期T Z。例如,该预设单帧帧长可以是上述同步周期T Z与预设时延阈值的差值。其中,预设时延阈值大于或等于0ms。示例性的,该预设时延阈值可以为0ms、1ms、2ms、1.5ms或者3ms等。例如,以同步周期T Z=16.667ms,预设时延阈值为1ms为例。该预设单帧帧长可以为15.667ms。
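预设单帧帧长(以及后文实施例(五)中的预设双帧帧长)的取值可以按如下方式计算;以下片段仅复现正文示例中的算式,其中的函数名为说明性假设:

```python
def single_frame_len(t_z_ms, delay_threshold_ms):
    """预设单帧帧长 = 同步周期T_Z - 预设时延阈值(预设时延阈值 >= 0)。"""
    return t_z_ms - delay_threshold_ms

def double_frame_len(t_z_ms, k=2):
    """预设双帧帧长大于预设单帧帧长,且小于或等于K*T_Z(K >= 2);
    此处取上界K*T_Z作示意。"""
    return k * t_z_ms
```

例如,T Z=16.667ms、预设时延阈值为1ms时,预设单帧帧长为15.667ms;T Z=11.1ms、K=2时,预设双帧帧长为22.2ms,与正文示例一致。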
其中,如果第一统计周期(即当前时刻的前一个统计周期)的第一处理帧长大于预设单帧帧长,则表示在第一统计周期电子设备不能在一个同步周期完成图层的绘制、渲染和合成。因此,电子设备可以对电子设备的硬件资源进行正向调度,以缩短电子设备进行图层的绘制、渲染和合成所需的时长,提高电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性。
(2)、电子设备在对电子设备的硬件资源进行正向调度之后,在电子设备的屏幕刷新率大于预设刷新率阈值的情况下,可以对电子设备的硬件资源进行负向调度,以在避免显示屏显示图像出现丢帧现象,保证显示屏显示图像的流畅性的前提下,降低电子设备的功耗。
其中,本申请实施例中,可以在电子设备的屏幕刷新率大于预设刷新率阈值的情况下,称电子设备处于高刷新率的情况下。例如,该预设刷新率阈值可以为80Hz、90Hz或者85Hz等。
一般而言,如果电子设备处于高屏幕刷新率的情况下,为了匹配高屏幕刷新率,电子设备可以默认对电子设备的硬件资源进行正向调度,从而提升电子设备的帧率,以避免显示屏显示图像出现丢帧现象,保证显示屏显示图像的流畅性。但是,在高屏幕刷新率的情况下,对电子设备的硬件资源进行正向调度,会较大程度的增大电子设备的功耗。
而本申请实施例中,电子设备对电子设备的硬件资源进行正向调度之后,在电子设备处于高屏幕刷新率的情况下,结合电子设备响应于垂直同步信号1进行图层绘制、渲染和合成的方案;电子设备不需要对硬件资源进行较大程度的正向调度,或者电子设备不需要对硬件资源进行正向调度,只要可以实现电子设备能够在两个同步周期完成图层的绘制、渲染和合成,便可以避免显示屏显示图像出现丢帧现象,保证显示屏显示图像的流畅性。在这种情况下,电子设备可以对硬件资源进行负向调度,以在避免显示屏显示图像出现丢帧现象,保证显示屏显示图像的流畅性的前提下,降低电子设备的负载。
示例性的,上述负向调度可以包括:调低电子设备的处理器的工作频率,选择小核的处理器执行上述方法,以及调低电子设备的内存工作频率。该处理器可以包括CPU和/或GPU。
综上所述,电子设备可以在电子设备的屏幕刷新率大于预设刷新率阈值的情况下,对硬件资源进行负向调度。这样,对于高屏幕刷新率的显示系统而言,不仅可以避免显示屏显示图像出现丢帧现象,保证显示屏显示图像的流畅性;还可以降低电子设备进行硬件资源调度的门限,以在更低要求的硬件资源下,提供与高屏幕刷新率匹配的帧率,为用户提供高屏幕刷新率不掉帧的使用体验。
进一步的,电子设备对电子设备的硬件资源进行正向调度之后,电子设备在电子设备的屏幕刷新率大于预设刷新率阈值的情况下,如果第一处理帧长大于预设双帧帧长,可以对电子设备的硬件资源进行负向调度,以在避免显示屏显示图像出现丢帧现象,保证显示屏显示图像的流畅性的前提下,降低电子设备的功耗。
其中,上述预设双帧帧长大于上述预设单帧帧长。例如,该预设双帧帧长可以大于预设单帧帧长,且小于或等于垂直同步信号3的信号周期(即上述同步周期)的K倍。其中,K大于或等于2。例如,K可以为2或2.5等。举例来说,以K=2为例,上述预设单帧帧长为11.1ms时,该预设双帧帧长可以为22.2ms。
可以理解,如果第一处理帧长大于预设双帧帧长,则表示电子设备不能在两个同步周期完成图层的绘制、渲染和合成。在这种情况下,则可能会出现图7中的(a)所示的丢帧现象。为了避免显示屏显示图像出现丢帧现象,保证显示屏显示图像的流畅性,电子设备可以对电子设备的硬件资源进行正向调度,以提高电子设备在两个同步周期完成图层的绘制、渲染和合成的可能性。
实施例(六)
相比于上述实施例(一),本实施例中,电子设备可以在执行S401之前,先预测电子设备进行图层绘制、渲染和合成所需的时长是否小于或等于预设单帧帧长(该预设单帧帧长小于上述同步周期)。如果预测得到上述时间小于或等于预设单帧帧长,则表示电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性较高。在这种情况下,电子设备才执行上述S401-S402。具体的,如图8B所示,在图4所示的S401之前,本申请实施例还可以包括S801-S803。
S801、电子设备判断第一统计周期的第一处理帧长是否小于或等于预设单帧帧长。
其中,上述第一处理帧长是第一渲染帧长和第一SF帧长之和。该第一渲染帧长是进行图层绘制和对绘制的图层进行渲染所需的时长。该第一SF帧长是对上述渲染的图层进行图层合成所需的时长。
示例性的,电子设备可以通过以下实现方式(i)和实现方式(ii),确定上述第一处理帧长。
实现方式(i):电子设备根据一个或多个应用中的焦点应用对应的第一渲染帧长,以及电子设备对一个或多个应用对应的第一SF帧长,确定第一处理帧长。其中,焦点应用是电子设备接收的用户触摸操作对应的应用。第一渲染帧长是进行图层绘制和对绘制的图层进行渲染所需的时长。
例如,假设上述一个或多个应用包括:应用a、应用b和应用c,应用a是焦点应用。其中,应用a的第一渲染帧长a是电子设备针对应用a进行图层绘制和对绘制的图层进行渲染所需的时长。应用b的第一渲染帧长b是电子设备针对应用b进行图层绘制和对绘制的图层进行渲染所需的时长。应用c的第一渲染帧长c是电子设备针对应用c进行图层绘制和对绘制的图层进行渲染所需的时长。第一SF帧长x是电子设备对针对应用a渲染的图层、针对应用b渲染的图层和针对应用c渲染的图层,进行图层合成所需的时长。
其中,电子设备可以根据焦点应用a的第一渲染帧长a和第一SF帧长x,确定上述第一处理帧长。例如,该第一处理帧长是第一渲染帧长a与第一SF帧长x之和。
实现方式(ii):电子设备根据一个或多个应用中每个应用对应的第一渲染帧长中,最大的第一渲染帧长,以及电子设备对一个或多个应用对应的第一SF帧长,确定第一处理帧长。
结合上述实例,假设第一渲染帧长b>第一渲染帧长a>第一渲染帧长c。那么,电子设备则可以根据第一渲染帧长b和第一SF帧长x,确定上述第一处理帧长。例如,该第一处理帧长是第一渲染帧长b与第一SF帧长x之和。
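实现方式(i)和实现方式(ii)确定第一处理帧长的逻辑,可以用如下示意性片段表示(并非本申请的实现;各帧长数值为假设值,单位ms):

```python
def first_processing_frame_len(render_lens, sf_len, focus=None):
    """确定第一处理帧长的两种实现方式的示意:
    实现方式(i):focus不为None时,取焦点应用对应的第一渲染帧长;
    实现方式(ii):否则取各应用第一渲染帧长中最大的第一渲染帧长;
    再与第一SF帧长求和得到第一处理帧长。"""
    render = render_lens[focus] if focus is not None else max(render_lens.values())
    return render + sf_len
```

例如,三个应用的第一渲染帧长分别为9ms、12ms、7ms,第一SF帧长为3ms时,按焦点应用(设其渲染帧长为9ms)得到12ms,按最大渲染帧长得到15ms。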
示例性的,电子设备可以周期性统计每一个统计周期内的第一处理帧长。该第一统计周期是当前时刻的前一个统计周期或者更早的统计周期。例如,本申请实施例中的统计周期可以为1s、2s、3s或者5s等任一时长。例如,电子设备可以执行S801a-S801b,以获取第一统计周期的第一处理帧长。
S801a、电子设备获取第一统计周期的一个或多个第二处理帧长,每个第二处理帧长是第二渲染帧长和第二SF帧长之和。
其中,电子设备可以统计第一统计周期内每次进行图层绘制和渲染所需的时长(即第二渲染帧长),以及对渲染后的图像进行图层合成所需的时长(即第二SF帧长),计算第二渲染帧长和对应第二SF帧长之和,得到每次进行图层绘制、渲染和合成所需的总时长(即第二处理帧长)。
例如,假设电子设备在第一统计周期内进行了3次图层绘制、渲染和合成。电子设备绘制图层a,并对图层a进行渲染所需的时长为第二渲染帧长a;电子设备对渲染的图层a进行图层合成所需的时长为第二SF帧长a。电子设备绘制图层b,并对图层b进行渲染所需的时长为第二渲染帧长b;电子设备对渲染的图层b进行图层合成所需的时长为第二SF帧长b。电子设备绘制图层c,并对图层c进行渲染所需的时长为第二渲染帧长c;电子设备对渲染的图层c进行图层合成所需的时长为第二SF帧长c。电子设备可以计算第二渲染帧长a与第二SF帧长a之和得到第二处理帧长a;计算第二渲染帧长b与第二SF帧长b之和得到第二处理帧长b;计算第二渲染帧长c与第二SF帧长c之和得到第二处理帧长c。如此,电子设备便可以得到第一统计周期内的三个第二处理帧长。
S801b、电子设备根据一个或多个第二处理帧长,确定第一统计周期的第一处理帧长。
在一种实现方式中,一个或多个第二处理帧长中仅包括一个第二处理帧长。在这种实现方式中,上述第一处理帧长等于这一个第二处理帧长。
在一种实现方式中,一个或多个第二处理帧长中可以包括多个第二处理帧长。第一处理帧长是上述多个第二处理帧长的平均值。结合上述实例,第一统计周期的第一处理帧长可以为第二处理帧长a、第二处理帧长b和第二处理帧长c的平均值。
在另一种实现方式中,一个或多个第二处理帧长中可以包括多个第二处理帧长。第一处理帧长是上述多个第二处理帧长中最大的第二处理帧长。结合上述实例,第一统计周期的第一处理帧长可以为上述第二处理帧长a、第二处理帧长b和第二处理帧长c中的最大值。
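上述几种由第二处理帧长确定第一处理帧长的实现方式,可以用如下示意性片段表示(并非本申请的实现;样本数值为假设值):

```python
def first_len_from_samples(second_lens, mode="avg"):
    """由第一统计周期内的一个或多个第二处理帧长确定第一处理帧长:
    仅有一个样本时直接取该样本;
    mode='avg' 取多个第二处理帧长的平均值;
    mode='max' 取多个第二处理帧长中最大的第二处理帧长。"""
    if len(second_lens) == 1:
        return second_lens[0]
    return max(second_lens) if mode == "max" else sum(second_lens) / len(second_lens)
```

取最大值的方式更保守(更倾向于判定无法单帧完成),取平均值的方式更贴近统计周期内的典型负载;两者对应正文中的两种实现方式。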
可以理解,如果第一统计周期(即当前时刻的前一个统计周期)的第一处理帧长小于或等于预设单帧帧长,则表示在第一统计周期电子设备可以在一个同步周期完成图层的绘制、渲染和合成。那么在该第一统计周期的下一个统计周期(即当前时刻所在的统计周期),电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性则较高。在这种情况下,电子设备则可以执行S401。
需要注意的是,本申请实施例中,电子设备在一个同步周期进行图层的绘制、渲染和合成具体可以包括:响应于垂直同步信号1,在第一同步周期绘制一个或多个第一图层,并渲染一个或多个第一图层,并在一个或多个第一图层渲染完成后,对渲染的一个或多个第一图层进行图层合成得到第一图像帧。该第一同步周期是垂直同步信号1对应的同步周期。例如,该第一同步周期可以为图6A中的(b)所示的t 1时刻至t 2时刻这一同步周期T Z。也就是说,本申请实施例中,电子设备可以在进行图层绘制和渲染的一个同步周期(即第一同步周期)内,便开始进行图层合成。
如果第一统计周期(即当前时刻的前一个统计周期)的第一处理帧长大于预设单帧帧长,则表示在第一统计周期电子设备不能在一个同步周期完成图层的绘制、渲染和合成。那么在该第一统计周期的下一个统计周期(即当前时刻所在的统计周期),电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性则较低。在这种情况下,电子设备则可以执行S802-S803。
S802、响应于垂直同步信号1,电子设备绘制一个或多个第一图层,并渲染一个或多个第一图层。
S803、响应于垂直同步信号2,电子设备对渲染的一个或多个第一图层进行图层合成得到第一图像帧。
具体的,响应于垂直同步信号2,电子设备可以在第二同步周期对渲染的第一图层进行图层合成得到第一图像帧。第二同步周期与上述第一同步周期不同。例如,如图6A中的(a)所示,第一同步周期可以为t 1时刻至t 2时刻这一同步周期T Z;第二同步周期可以为t 2时刻至t 3时刻这一同步周期T Z。
例如,如果第一统计周期的第一处理帧长大于预设单帧帧长,如图6A中的(a)所示,电子设备不会在t 6时刻进行图层合成(即图像帧合成);而是会在t 2时刻垂直同步信号2到来后,响应于t 2时刻的垂直同步信号2,进行图层合成。
本申请实施例中,电子设备可以在预测得到电子设备进行图层绘制、渲染和合成所需的时长小于或等于预设单帧帧长的情况下,才会响应于垂直同步信号1,绘制并渲染一个或多个第一图层,并在一个或多个第一图层渲染完成后,对渲染的一个或多个第一图层进行图层合成以得到第一图像帧。这样,可以减少出现上述第二种情况中“只能提前进行图层合成,而不能缩短电子设备的响应延迟”这一现象的可能性。
实施例(七)
本实施例对上述任一实施例中,电子设备执行上述基于垂直同步信号的图像处理方法的条件进行说明。具体的,通过该实施例,可以说明电子设备在什么条件或者情况下,可以执行上述方法。
其中,电子设备可以在加速渲染模式中,执行本申请实施例的方法,如S401-S402及其相关步骤。具体的,在上述S401之前,本申请实施例的方法还可以包括S901。例如,如图9所示,在图4所示的S401之前,本申请实施例的方法还可以包括S901。
S901、响应于第一事件,电子设备启动加速渲染模式。
其中,电子设备启动加速渲染模式后,在一个或多个第一图层渲染结束后的第一个垂直同步信号2到来之前,便可以开始对渲染的图层进行图层合成得到图像帧。
示例性的,本申请实施例这里通过以下两种实现方式,即实现方式(I)和实现方式(II),举例说明上述第一事件。
在实现方式(I)中,上述第一事件可以为:电子设备接收到用户的第一操作。该第一操作用于触发电子设备启动加速渲染模式。
例如,以电子设备是手机1001为例。如图10中的(a)所示,第一操作可以是用户对手机1001所显示的设置界面1002中“加速渲染”选项1003的点击操作。该第一操作用于开启“加速渲染”选项1003,以触发手机1001启动加速渲染模式。
又例如,如图10中的(b)所示,第一操作可以是用户对手机1001所显示的通知栏1004中“加速渲染”按钮1005的点击操作。该第一操作用于开启“加速渲染”按钮1005,以触发手机1001启动加速渲染模式。
在实现方式(II)中,上述第一事件可以为第一统计周期的第一处理帧长小于或等于预设单帧帧长。其中,第一统计周期、预设单帧帧长,以及第一处理帧长小于或等于预设单帧帧长的详细描述可以参考上述实施例中的相关内容,本申请实施例这里不予赘述。
进一步的,电子设备还可以退出上述加速渲染模式。电子设备退出加速渲染模式后,电子设备只能响应于垂直同步信号2,进行图层合成。具体的,本申请实施例的方法还可以包括S902-S905。例如,如图9所示,本申请实施例的方法还可以包括S902-S905。
S902、响应于第二事件,电子设备退出加速渲染模式。
示例性的,本申请实施例这里通过以下两种实现方式,即实现方式(a)和实现方式(b),举例说明上述第二事件。
在实现方式(a)中,上述第二事件可以为:电子设备接收到用户的第二操作。该第二操作用于触发电子设备退出加速渲染模式。
其中,第二操作与上述第一操作对应。例如,第二操作可以为电子设备响应于用户的第一操作,开启上述“加速渲染”选项1003或“加速渲染”按钮1005,以启动加速渲染模式后,接收到的用户对“加速渲染”选项1003或“加速渲染”按钮1005的点击操作。响应于该第二操作,手机1001可以关闭“加速渲染”选项1003或“加速渲染”按钮1005,以退出或关闭加速渲染模式。
在实现方式(b)中,上述第二事件可以为第一统计周期的第一处理帧长大于预设单帧帧长。其中,第一统计周期、预设单帧帧长,以及第一处理帧长大于预设单帧帧长的详细描述可以参考上述实施例中的相关内容,本申请实施例这里不予赘述。
S903、响应于垂直同步信号1,电子设备绘制一个或多个第二图层,并渲染一个或多个第二图层。
S904、响应于垂直同步信号2,电子设备对渲染的一个或多个第二图层进行图层合成得到第二图像帧。
其中,电子设备退出加速渲染模式后,电子设备则不能在一个或多个第二图层渲染结束后的第一个垂直同步信号2到来之前,便开始对渲染的第二图层进行图层合成。而是要等待垂直同步信号2的到来,响应于垂直同步信号2,电子设备对渲染的第二图层进行图层合成得到第二图像帧。
例如,如图6A中的(a)所示,电子设备不会在t 6时刻进行图层合成。而是等待t 2时刻垂直同步信号2的到来,响应于t 2时刻的垂直同步信号2,才可以对渲染的图层进行图层合成得到图像帧。
S905、响应于垂直同步信号3,电子设备刷新显示第二图像帧。
本申请实施例中,电子设备可以响应于用户的操作,启动加速渲染模式;也可以根据电子设备在统计周期内进行图层绘制、渲染和合成所需的时长,自动启动加速渲染模式。
需要注意的是,上述加速渲染模式只是电子设备执行本申请实施例的方法时所处的工作模式的一种名称。加速渲染模式也可以有其他的命名,本申请实施例对此不作限制。例如,上述加速渲染模式也可以称为加速渲染合成模式或者加速合成模式等。
实施例(八)
本实施例结合上述实施例,进一步说明本申请实施例的方法。具体的,电子设备可以周期性获取每一个统计周期的第一处理帧长;如果一个统计周期(如第一统计周期)的第一处理帧长大于预设单帧帧长,则表示电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性较低。这种情况下,在该实施例中,电子设备可以对电子设备的硬件资源进行正向调度,以缩短电子设备进行图层绘制、图层渲染和/或图层合成所需的时长。这样,可以提高电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性。其中,本申请实施例中,以电子设备调高电子设备的处理器的工作频率为例,对电子设备对硬件资源进行正向调度的具体方法进行举例说明。
具体的,本申请实施例的方法还可以包括S1101-S1104。例如,如图11所示,本申请实施例的方法还可以包括S1101-S1104。
S1101、电子设备获取第一统计周期的第一处理帧长。
其中,电子设备可以响应于第一统计周期结束,执行上述S801a-S801b,获取第一统计周期的第一处理帧长。
S1102、电子设备判断第一统计周期的第一处理帧长是否小于或等于预设单帧帧长。
其中,上述第一处理帧长是第一渲染帧长和第一SF帧长之和。该第一渲染帧长是进行图层绘制和对绘制的图层进行渲染所需的时长。该第一SF帧长是对上述渲染的图层进行图层合成所需的时长。
示例性的,电子设备可以周期性统计每一个统计周期内的第一处理帧长。该第一统计周期是当前时刻的前一个统计周期。例如,本申请实施例中的统计周期可以为1s、2s、3s或者5s等任一时长。其中,S1102的详细描述可以参考S801中的相关介绍,本申请实施例这里不予赘述。
可以理解,处理器的工作频率越高,该处理器的运算速度越快,电子设备进行图层的绘制、渲染和合成所需的时长则越短。其中,如果第一统计周期的第一处理帧长小于或等于预设单帧帧长,则表示处理器的工作频率已经足以保证电子设备在一个同步周期完成图层的绘制、渲染和合成,电子设备不需要再调高处理器的工作频率。
如果第一统计周期的第一处理帧长大于预设单帧帧长,则表示处理器的工作频率较低,不足以保证电子设备在一个同步周期完成图层的绘制、渲染和合成,电子设备则可以调高处理器的工作频率。例如,电子设备可以执行S1104,调高处理器的工作频率。但是,只有当处理器当前的工作频率小于处理器的最大工作频率时,电子设备才可以调高处理器的工作频率。因此,如图11所示,S1102之后,如果第一统计周期的第一处理帧长大于预设单帧帧长,电子设备可以执行S1103。
S1103、电子设备判断处理器当前的工作频率是否小于处理器的最大工作频率。
具体的,如果处理器当前的工作频率小于处理器的最大工作频率,电子设备可以执行S1104。如果处理器当前的工作频率为该处理器的最大工作频率,电子设备则不需要调整处理器的工作频率。
S1104、电子设备调高处理器的工作频率。
示例性的,本申请实施例中的处理器可以包括CPU和GPU中的至少一项。其中,处理器的工作频率f的单位可以为赫兹(Hz,简称赫)、千赫(kHz)、兆赫(MHz)或者吉赫(GHz)。
在一种实现方式中,电子设备可以按照第一预设步进调高处理器的工作频率。例如,第一预设步进的单位可以为Hz、kHz或者MHz等。该第一预设步进可以预先配置在电子设备中。或者,该第一预设步进可以由用户在电子设备中设置。
在另一种实现方式中,电子设备可以根据第一处理帧长与上述预设单帧帧长的差值,调高处理器的工作频率,以使下一个统计周期内的第二处理帧长小于或等于预设单帧帧长。其中,在该实现方式中,电子设备对处理器的工作频率的调整幅度与上述差值的大小成正比。也就是说,第一处理帧长与预设单帧帧长的差值越大,电子设备对处理器的工作频率的调整幅度越大。第一处理帧长与预设单帧帧长的差值越小,电子设备对处理器的工作频率的调整幅度越小。
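“调整幅度与差值成正比”的提频逻辑可以用如下示意性片段表示(并非本申请的实现;比例系数gain为假设值,单位Hz/ms):

```python
def raise_frequency(freq_hz, frame_len, single_frame, max_freq_hz, gain=2e7):
    """第一处理帧长超出预设单帧帧长越多,提频幅度越大;
    调整后的工作频率不超过处理器的最大工作频率。"""
    overshoot = frame_len - single_frame
    if overshoot <= 0:
        return freq_hz          # 未超出预设单帧帧长,无需提频
    return min(max_freq_hz, freq_hz + gain * overshoot)
```

例如,在同一当前频率下,第一处理帧长为20ms时的提频幅度大于18ms时的提频幅度;当按比例计算的目标频率超过最大工作频率时,取最大工作频率。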
在一些实施例中,电子设备可以根据第一处理帧长和预设单帧帧长,通过预设AI模型,调整处理器的工作频率。其中,该预设AI模型是经过大量样本训练得到的。该预设AI模型是具备“根据第一处理帧长调整处理器的工作频率,以提升单帧渲染合成的可能性”功能的AI模型。其中,单帧渲染合成是指电子设备在一个同步周期完成图层的绘制、渲染和合成。
如图11所示,在S1102之后,如果第一处理帧长小于或等于预设单帧帧长,电子设备则可以执行S401-S402;如果第一处理帧长大于预设单帧帧长,电子设备还可以执行S802、S803和S402。
其中,S1102之后,如果第一处理帧长大于预设单帧帧长,电子设备执行S1103。也就是说,电子设备执行S1103时,第一处理帧长大于预设单帧帧长,电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性较低。因此,在S1103之后,如果处理器当前的工作频率等于最大工作频率,则表示即使处理器工作在最大工作频率,电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性也较低。
在这种情况下,电子设备可以采用常规方案进行图层的绘制、渲染和合成。如图11所示,在S1103之后,如果处理器当前的工作频率等于最大工作频率,电子设备则可以执行S802、S803和S402。
需要注意的是,如图11所示,电子设备执行S401-S402时,该电子设备处于加速渲染模式;电子设备执行S802、S803和S402时,该电子设备已经退出加速渲染模式。
本申请实施例中,电子设备可以在第一处理帧长大于预设单帧帧长,即电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性较低时,调高电子设备的处理器的工作频率。如此,便可以提升处理器的运算速度,以缩短电子设备进行图层的绘制、渲染和合成所需的时长,以提高电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性。这样,如果电子设备可以在一个同步周期完成图层的绘制、渲染和合成,便可以将电子设备的响应延迟缩短一个同步周期,可以提升电子设备的流畅性(如跟手性能)。
可以理解,电子设备的处理器的工作频率较高,虽然可以提升电子设备的运算速度,缩短电子设备进行图层的绘制、渲染和合成所需的时长;但是,处理器的工作频率越高,功耗也就越大。本申请实施例中,电子设备还可以在上述第一处理帧长满足预设条件时,调低处理器的工作频率。这样,可以降低电子设备的功耗。具体的,本申请实施例的方法还可以包括S1201。
S1201、如果第一处理帧长满足预设条件,则调低处理器的工作频率。
在一种实现方式中,第一处理帧长满足预设条件,具体可以包括:第一处理帧长小于预设单帧帧长。
其中,如果第一处理帧长小于预设单帧帧长,则表示电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性较高。在这种情况下,可能是因为处理器的工作频率较高,使得处理器的运算速度较快,从而使得电子设备可以一个同步周期完成图层的绘制、渲染和合成。但是,处理器的工作频率过高,会导致电子设备的功耗较大。因此,电子设备可以调低处理器的工作频率。
在另一种实现方式中,第一处理帧长满足预设条件,具体可以包括:第一处理帧长小于预设单帧帧长,且预设单帧帧长与第一处理帧长的差值大于第一预设时长。
其中,如果第一处理帧长小于预设单帧帧长,且预设单帧帧长与第一处理帧长的差值大于第一预设时长,则表示电子设备在一个同步周期完成图层的绘制、渲染和合成后,还可能需要等待一段时间,才可以等到垂直同步信号3到来,响应于垂直同步信号3,刷新显示合成的图像帧。在这种情况下,处理器的工作频率一般较高。为了降低电子设备的功耗,电子设备可以调低处理器的工作频率。
本申请实施例中,电子设备调低处理器的工作频率的方法可以包括:电子设备按照第二预设步进调低处理器的工作频率。其中,第二预设步进可以等于第一预设步进,或者也可以小于第一预设步进。
需要注意的是,在第二预设步进小于第一预设步进的情况下,电子设备可以以快升慢降的方式调整处理器的工作频率。这样,有利于电子设备执行本申请实施例的方法,缩短电子设备的触摸响应时延,提升电子设备的流畅性(如跟手性能)。
在一些实施例中,为了防止调整处理器的工作频率时出现乒乓现象,电子设备可以在连续N个统计周期的第一处理帧长满足预设条件时,才调低处理器的工作频率。例如,如图12所示,在图11所示的S1102之后,本申请实施例的方法还可以包括S1201a和S1202。
S1201a、电子设备确定连续N个统计周期的第一处理帧长满足预设条件。
其中,N≥2,N是正整数。例如,N可以为5、4、3、2或者6等任一正整数。
示例性的,电子设备确定连续N个统计周期的第一处理帧长满足预设条件的方法可以包括:如果一个统计周期(如统计周期1)的第一处理帧长满足预设条件,电子设备可以将计数器的计数值加1;计数器的计数初始值为0;如果统计周期1的下一个统计周期的第一处理帧长满足预设条件,电子设备将计数器的计数值加1;如果统计周期1的下一个统计周期的第一处理帧长不满足预设条件,电子设备将计数器的计数值清零。
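上述计数器逻辑可以用如下示意性片段表示(并非本申请的实现;类名与接口均为说明性假设):

```python
class DownclockGate:
    """连续N个统计周期的第一处理帧长均满足预设条件时才允许降频;
    一旦出现不满足预设条件的统计周期,计数器清零。
    这一"快升慢降"的门限可防止调频时的乒乓现象。"""
    def __init__(self, n):
        self.n = n        # 所需的连续满足条件的统计周期数,N >= 2
        self.count = 0    # 计数初始值为0
    def update(self, meets_condition):
        """每个统计周期结束时调用一次;返回是否允许降频。"""
        self.count = self.count + 1 if meets_condition else 0
        return self.count >= self.n
```

例如N=3时,连续三个满足条件的统计周期后才允许降频;中途出现一个不满足条件的周期,计数即清零并重新累计。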
可选的,在一些实施例中,电子设备可以在确定连续N个统计周期的第一处理帧长满足预设条件时,才进入加速渲染模式。
S1202、电子设备调低处理器的工作频率。
需要注意的是,该实施例中,第一处理帧长满足预设条件,以及电子设备调低所述处理器的工作频率的方法,可以参考上述实施例中的相关描述,本申请实施例这里不予赘述。
本申请实施例中,电子设备可以在连续N个统计周期的第一处理帧长满足预设条件时,才调低处理器的工作频率。这样,不仅可以防止调整处理器的工作频率时出现乒乓现象,还可以实现调整处理器的工作频率时的快升慢降。如此,可以在保证电子设备进行图层的绘制、渲染和合成的系统稳定性的前提下,提高电子设备在一个同步周期完成图层的绘制、渲染和合成的可能性,以缩短电子设备的触摸响应时延,提升电子设备的流畅性(如跟手性能)。
其中,在每一个统计周期内,如果绘制并渲染一个或多个图层时某些特征点的耗时时长较大,则可能会使得电子设备无法在一个同步周期内完成图层的绘制、渲染和合成。在这种情况下,电子设备可能会退出加速渲染模式。然后,电子设备可以响应于上述第一事件重新启动加速渲染模式。
但是,电子设备每次进入加速渲染模式时,都会有一次丢帧。例如,如图6A中的(a)所示,电子设备在进入加速渲染模式前,等待t 2时刻的垂直同步信号2到来,响应于t 2时刻的垂直同步信号2,才会进行图层合成(即图像帧合成);等待t 3时刻的垂直同步信号3到来,响应于t 3时刻的垂直同步信号3,才会刷新显示合成的图像帧。但是,如图6A中的(b)所示,电子设备在进入加速渲染模式后,在t 7时刻便可以进行图层合成(即图像帧合成);响应于t 2时刻的垂直同步信号3,便可以刷新显示合成的图像帧。如此,在进入加速渲染模式时,则可能会有一帧图像(即一个图像帧)是电子设备在进入加速渲染模式之前响应于垂直同步信号2合成的。这一帧图像在电子设备进入加速渲染模式后,不会被显示屏刷新显示,即出现丢帧。
可以理解,偶尔一次丢帧并不会对显示屏的显示效果产生较大影响,也就不会对用户的视觉体验造成较大影响。但是,如果电子设备频繁启动和退出加速渲染模式,则会频繁的出现丢帧现象。频繁出现丢帧现象会影响显示屏的显示效果,进而会影响用户的视觉体验。
在一些实施例中,为了避免电子设备频繁启动和退出加速渲染模式,在电子设备进入加速渲染模式后,本申请实施例的方法还可以包括S1301。
S1301、在一个统计周期内,如果绘制并渲染一个或多个第三图层时第一特征点的耗费时长大于第一特征点对应的第二预设时长,电子设备则将处理器的工作频率调整为处理器的最大工作频率。
其中,上述第一特征点至少可以包括以下任一种:电子设备绘制一个或多个第三图层;电子设备渲染一个或多个第三图层;电子设备绘制一个或多个第三图层的过程中执行任一个函数;电子设备渲染一个或多个第三图层的过程中执行任一个函数。
本申请实施例中,可以针对上述每一种第一特征点设置一个第二预设时长。该第二预设时长可以是通过统计大量电子设备多次执行第一特征点对应的操作所需的时间确定的。
可以理解,在一个统计周期内,如果第一特征点的耗费时长大于第一特征点对应的第二预设时长,则表示电子设备采用加速渲染模式对应的方法,无法完成一个或多个第三图层的绘制和渲染的可能性较高。其中,一个或多个第三图层是该统计周期内电子设备正在绘制或渲染的图层。在这种情况下,电子设备可以对处理器进行瞬时提频,将处理器的工作频率调整为处理器的最大工作频率。处理器瞬时提频后,可以提升处理器的运算速度,进而可以缩短电子设备进行图层绘制、渲染和合成所需的时长。
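上述瞬时提频的判定可以用如下示意性片段表示(并非本申请的实现;函数名与各时长数值均为说明性假设):

```python
def frequency_after_feature_check(freq_hz, elapsed_ms, budget_ms, max_freq_hz):
    """某一第一特征点(如一次图层绘制,或绘制/渲染过程中的某个函数)
    的耗费时长elapsed_ms超过其对应的第二预设时长budget_ms时,
    瞬时将处理器的工作频率调整为最大工作频率;否则保持当前频率。"""
    return max_freq_hz if elapsed_ms > budget_ms else freq_hz
```

这一检查发生在统计周期内部、逐特征点进行,与按统计周期粒度的提频/降频(S1104、S1202)互为补充。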
如果电子设备对处理器进行瞬时提频后,电子设备还是无法在一个同步周期完成图层的绘制、渲染和合成;为了保证电子设备可以完成图层的绘制、渲染和合成,保证电子设备的显示屏的显示效果,电子设备可以退出加速渲染模式。具体的,在S1301之后,本申请实施例的方法还可以包括S1302。
S1302、如果第三处理帧长大于预设单帧帧长,电子设备响应于垂直同步信号2,对渲染的图层进行图层合成得到图像帧。
其中,第三处理帧长是第三渲染帧长和第三SF帧长之和。第三渲染帧长是绘制并渲染一个或多个第三图层所需的时长,第三SF帧长是对渲染的一个或多个第三图层进行图层合成所需的时长。
如果第三处理帧长大于预设单帧帧长,则表示处理器瞬时提频后,电子设备还是无法在一个同步周期完成上述一个或多个第三图层的绘制、渲染和合成。在这种情况下,电子设备可以退出加速渲染模式。也就是说,在一个或多个第一图层渲染结束后的第一个垂直同步信号2到来之前,电子设备不会对渲染的图层进行图层合成;而是会等待垂直同步信号2的到来,响应于垂直同步信号2,才会对渲染的图层进行图层合成得到图像帧。
实施例(九)
请参考图13,其示出本申请实施例提供的一种基于垂直同步信号的图像处理方法的简化流程图。
如图13所示,在一个统计周期结束后,电子设备可以执行S-1,获取该统计周期的第一处理帧长。其中,S-1的详细描述可以参考S801a-S801b、S1101和上述实施例中的相关描述,这里不予赘述。然后,电子设备可以执行S-2,判断该统计周期的第一处理帧长是否大于预设单帧帧长。其中,S-2的详细描述可以参考S801和上述实施例中的相关描述,这里不予赘述。
如果统计周期的第一处理帧长大于预设单帧帧长,电子设备则可以执行S-3,调高处理器的工作频率(即提频)。其中,S-3的详细描述可以参考S1104和上述实施例中的相关描述,这里不予赘述。如果统计周期的第一处理帧长小于或等于预设单帧帧长,电子设备则可以执行S-4,判断连续N个统计周期的第一处理帧长是否满足预设条件。其中,S-4的详细描述可以参考S1201a和上述实施例中的相关描述,这里不予赘述。如果连续N个统计周期的第一处理帧长满足预设条件,则执行S-5和S-6。其中,电子设备执行S-5可以调低处理器的工作频率(即降频)。其中,S-5的详细描述可以参考S1202和上述实施例中的相关描述,这里不予赘述。电子设备执行S-6可以启动加速渲染模式。其中,本申请实施例中,电子设备响应于第一事件也可以启动加速渲染模式。在加速渲染模式下,电子设备可以执行S401-S402。
并且,在加速渲染模式下,电子设备可以执行S-7,判断第一特征点的耗时时长是否大于第一特征点对应的第二预设时长。如果第一特征点的耗时时长大于第一特征点对应的第二预设时长,电子设备可以执行S-8,进行瞬时提频。其中,S-7和S-8的详细描述可以参考S1301和上述实施例中的相关描述,这里不予赘述。
如果第一特征点的耗时时长小于或等于第一特征点对应的第二预设时长,电子设备则继续处于加速渲染模式。电子设备瞬时提频(即执行S-8)之后,可以执行S-9,判断第三处理帧长是否大于预设单帧帧长。如果第三处理帧长大于预设单帧帧长,则可以执行S-10,退出加速渲染模式。其中,S-9和S-10的详细描述可以参考S1302和上述实施例中的相关描述,这里不予赘述。如果第三处理帧长小于或等于预设单帧帧长,则可以继续处于加速渲染模式。其中,本申请实施例中,电子设备响应于第二事件也可以退出加速渲染模式。其中,退出加速渲染模式后,电子设备可以采用S903-S905的方式,进行图层的绘制、渲染、合成以及图像帧的显示。
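图13简化流程中,一个统计周期结束时的决策分支(S-2至S-6)可以用如下示意性片段概括(并非本申请的实现;返回的动作名仅作标记):

```python
def next_action(first_len, single_frame, streak_ok):
    """一个统计周期结束时的决策示意:
    第一处理帧长大于预设单帧帧长 -> 提频(S-3);
    否则,若连续N个统计周期均满足预设条件 -> 降频并启动加速渲染模式(S-5、S-6);
    否则保持现状,继续统计。streak_ok对应S-4的判断结果。"""
    if first_len > single_frame:
        return "提频"
    if streak_ok:
        return "降频并加速渲染"
    return "保持现状"
```

统计周期内的特征点检查(S-7至S-10)在此之外单独进行,见上文S1301-S1302的相关描述。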
实施例(十)
请参考图14,其示出本申请实施例提供的一种优化模块的示意图。该优化模块可以是生成图像帧的装置或电子设备中,用于实现本申请实施例的方法的功能模块。如图14所示,该优化模块可以包括:服务通信接口模块1401、帧长检测模块1402、单帧渲染策略模块1403和动态调节算法模块1404。
帧长检测模块1402用于获取统计周期(如第一统计周期)内的多个第二处理帧长,并确定该统计周期的第一处理帧长,向动态调节算法模块1404传输该统计周期的第一处理帧长。其中,帧长检测模块1402可以通过服务通信接口模块1401获取统计周期内的多个第二处理帧长。例如,帧长检测模块1402用于支持电子设备执行上述方法实施例中的S801a-S801b和S1101,和/或用于本文所描述的技术的其它过程。
动态调节算法模块1404用于根据帧长检测模块1402确定的第一处理帧长,调用调度模块调整处理器的工作频率。例如,动态调节算法模块1404用于支持电子设备执行上述方法实施例中的S801,S1102,S1103,S1104,S1201,S1201a,S1202,S1301,S1302中“判断第三处理帧长大于预设单帧帧长”的操作,S901,S902,和/或用于本文所描述的技术的其它过程。
单帧渲染策略模块1403用于响应于动态调节算法模块1404的控制,控制电子设备的UI线程、渲染线程和合成线程采用对应的方式,进行图层的绘制、渲染和合成。例如,单帧渲染策略模块1403用于支持电子设备执行上述方法实施例中的S401,S402,S802,S803,S903,S904,S905,S1302中“进行图层合成”的操作,和/或用于本文所描述的技术的其它过程。
其中,如图15所示,图1A所示的软件架构还可以包括:上述优化模块60。该优化模块60中可以包括:服务通信接口模块1401、帧长检测模块1402、单帧渲染策略模块1403和动态调节算法模块1404。
实施例(十一)
请参考图16,其示出两种品牌的手机在机械手速为100毫米(mm)/s时,在“联系人”应用的滑动测试场景的测试结果示意图。
其中,图16中的(a)示出一种品牌的手机1(如iPhone XS手机)在机械手速为100mm/s时,在“联系人”应用的滑动测试场景的测试结果。如图16中的(a)所示,手机1的触摸响应时间为82ms-114ms。
图16中的(b)示出另一种品牌的手机2(如华为手机)在执行本申请实施例的方法之前,在机械手速为100mm/s时,在“联系人”应用的滑动测试场景的测试结果。如图16中的(b)所示,手机2在执行本申请实施例的方法前,触摸响应时间为82ms-136ms。
图16中的(c)示出上述手机2在执行本申请实施例的方法之后,在机械手速为100mm/s时,在“联系人”应用的滑动测试场景的测试结果。如图16中的(c)所示,手机2在执行本申请实施例的方法后,触摸响应时间为65ms-84ms。
对比图16中的(c)与图16中的(b)可知:以相同的机械手速(如100mm/s)在相同的测试场景进行测试,相比于手机2执行本申请实施例的方法之前的触摸响应时间(简称“触摸响应时间-前”,如82ms-136ms),手机2执行本申请实施例的方法之后的触摸响应时间(简称“触摸响应时间-后”,如65ms-84ms)缩短了较长的延迟时间。也就是说,通过本申请实施例的方法,可以缩短电子设备的响应延迟,提升电子设备的流畅性(如跟手性能)。
对比图16中的(a)与图16中的(b)可知:以相同的机械手速(如100mm/s)在相同的测试场景进行测试,相比于手机1的触摸响应时间(如82ms-114ms),上述“触摸响应时间-前”(如82ms-136ms)较长。
而对比图16中的(a)、图16中的(b)和图16中的(c)可知:以相同的机械手速(如100mm/s)在相同的测试场景进行测试,上述“触摸响应时间-后”(如65ms-84ms)不仅相比于上述“触摸响应时间-前”(如82ms-136ms)缩短了较长的延迟时间,相比于手机1的触摸响应时间(如82ms-114ms)也缩短了一定时间。
由上述测试场景可知:通过本申请实施例的方法,可以较大幅度的缩短电子设备的响应延迟,提升电子设备的流畅性(如跟手性能)。
实施例(十二)
本申请一些实施例提供了一种电子设备,该电子设备可以包括:显示屏(如触摸屏)、存储器和一个或多个处理器。该显示屏、存储器和处理器耦合。该存储器用于存储计算机程序代码,该计算机程序代码包括计算机指令。当处理器执行计算机指令时,电子设备可执行上述方法实施例中电子设备执行的各个功能或者步骤。该电子设备的结构可以参考图2所示的电子设备200的结构。
本申请实施例还提供一种芯片系统,如图17所示,该芯片系统包括至少一个处理器1701和至少一个接口电路1702。处理器1701和接口电路1702可通过线路互联。例如,接口电路1702可用于从其它装置(例如电子设备的存储器)接收信号。又例如,接口电路1702可用于向其它装置(例如处理器1701或者电子设备的触摸屏)发送信号。示例性的,接口电路1702可读取存储器中存储的指令,并将该指令发送给处理器1701。当所述指令被处理器1701执行时,可使得电子设备执行上述实施例中的各个步骤。当然,该芯片系统还可以包含其他分立器件,本申请实施例对此不作具体限定。
本申请实施例还提供一种计算机存储介质,该计算机存储介质包括计算机指令,当所述计算机指令在上述电子设备上运行时,使得该电子设备执行上述方法实施例中电子设备执行的各个功能或者步骤。
本申请实施例还提供一种计算机程序产品,当所述计算机程序产品在计算机上运行时,使得所述计算机执行上述方法实施例中电子设备执行的各个功能或者步骤。
通过以上实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是一个物理单元或多个物理单元,即可以位于一个地方,或者也可以分布到多个不同地方。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上内容,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (23)

  1. 一种基于垂直同步信号的图像处理方法,其特征在于,所述方法应用于包括显示屏的电子设备,所述方法包括:
    所述电子设备响应于第一垂直同步信号,绘制一个或多个第一图层,并渲染所述一个或多个第一图层,并在所述一个或多个第一图层渲染完成后,对渲染的所述一个或多个第一图层进行图层合成,以得到第一图像帧;
    所述电子设备响应于第二垂直同步信号,刷新显示所述第一图像帧。
  2. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    所述电子设备对所述电子设备的硬件资源进行正向调度,以缩短所述电子设备进行图层绘制、图层渲染和/或图层合成所需的时长。
  3. 根据权利要求1或2所述的方法,其特征在于,所述电子设备响应于第一垂直同步信号,绘制一个或多个第一图层,并渲染所述一个或多个第一图层,包括:
    所述电子设备检测到用户操作或者所述电子设备发生用户界面UI事件时,响应于所述第一垂直同步信号,绘制所述一个或多个第一图层,并渲染所述一个或多个第一图层。
  4. 根据权利要求1-3中任一项所述的方法,其特征在于,所述电子设备响应于第一垂直同步信号,绘制一个或多个第一图层,并渲染所述一个或多个第一图层,并在一个或多个第一图层渲染完成后,对渲染的所述一个或多个第一图层进行图层合成,以得到第一图像帧,包括:
    如果第一统计周期的第一处理帧长小于或等于预设单帧帧长,所述电子设备则响应于所述第一垂直同步信号,绘制一个或多个第一图层,并渲染所述一个或多个第一图层,并在所述一个或多个第一图层渲染完成后,对渲染的所述一个或多个第一图层进行图层合成,以得到所述第一图像帧;
    其中,所述第一处理帧长是第一渲染帧长和第一图层合成SF帧长之和,所述第一渲染帧长是进行图层绘制和对绘制的图层进行渲染所需的时长,所述第一SF帧长是对渲染的图层进行图层合成所需的时长。
  5. 根据权利要求4所述的方法,其特征在于,所述预设单帧帧长小于或等于所述第二垂直同步信号的信号周期。
  6. 根据权利要求4或5所述的方法,其特征在于,所述方法还包括:
    如果所述第一处理帧长大于所述预设单帧帧长,所述电子设备则响应于所述第一垂直同步信号,绘制一个或多个第一图层,并渲染所述一个或多个第一图层;响应于第三垂直同步信号,对渲染的所述一个或多个第一图层进行图层合成,以得到第一图像帧。
  7. 根据权利要求4或5所述的方法,其特征在于,所述方法还包括:
    如果所述第一处理帧长大于所述预设单帧帧长,所述电子设备对所述电子设备的硬件资源进行正向调度,以缩短所述电子设备进行图层绘制、图层渲染和/或图层合成所需的时长。
  8. 根据权利要求4-7中任一项所述的方法,其特征在于,所述一个或多个第一图层包括:所述电子设备执行一个或多个应用对应的绘制任务所绘制的图层;所述一个或多个应用包括:一个或多个系统级应用,以及一个或多个用户级应用中的至少一项;所述系统级应用包括:状态栏、launcher、导航栏和壁纸。
  9. 根据权利要求8所述的方法,其特征在于,所述在所述一个或多个第一图层渲染完成后,对渲染的所述一个或多个第一图层进行图层合成,以得到第一图像帧,包括:
    所述电子设备在所述一个或多个应用中的焦点应用、关键应用或者与所述电子设备的流畅性强相关的应用的一个或多个第一图层渲染完成后,对所述电子设备针对所述一个或多个应用已渲染的第一图层进行图层合成,以得到所述第一图像帧。
  10. 根据权利要求8所述的方法,其特征在于,所述在所述一个或多个第一图层渲染完成后,对渲染的所述一个或多个第一图层进行图层合成,以得到第一图像帧,包括:
    所述电子设备在所述一个或多个第一图层中的焦点图层、关键图层或者与所述电子设备的流畅性强相关的图层渲染完成后,对所述电子设备针对所述一个或多个应用已渲染的第一图层进行图层合成,以得到所述第一图像帧。
  11. 根据权利要求8-10中任一项所述的方法,其特征在于,所述方法还包括:
    所述电子设备根据所述一个或多个应用中的焦点应用对应的第一渲染帧长,以及所述电子设备对所述一个或多个应用对应的第一SF帧长,确定所述第一处理帧长。
  12. 根据权利要求8-10中任一项所述的方法,其特征在于,所述方法还包括:
    所述电子设备根据所述一个或多个应用中每个应用对应的第一渲染帧长中,最大的第一渲染帧长,以及电子设备对所述一个或多个应用对应的第一SF帧长,确定所述第一处理帧长。
  13. 根据权利要求1-12中任一项所述的方法,其特征在于,所述方法还包括:
    所述电子设备在所述电子设备的屏幕刷新率大于预设刷新率阈值时,所述电子设备对所述电子设备的硬件资源进行负向调度,以降低所述电子设备的功耗。
  14. 根据权利要求13所述的方法,其特征在于,所述方法还包括:
    所述电子设备在所述电子设备的屏幕刷新率大于预设刷新率阈值时,如果所述第一处理帧长大于预设双帧帧长,所述电子设备对所述电子设备的硬件资源进行负向调度,以降低所述电子设备的功耗。
  15. 根据权利要求14所述的方法,其特征在于,所述预设双帧帧长小于或等于所述第二垂直同步信号的信号周期的K倍,K≥2。
  16. 根据权利要求2或7所述的方法,其特征在于,所述电子设备对所述电子设备的硬件资源进行正向调度,以缩短所述电子设备进行图层绘制、图层渲染和/或图层合成所需的时长,包括:
    所述电子设备执行以下正向调度中的一项或多项,以缩短所述电子设备进行图层绘制、图层渲染和/或图层合成所需的时长;
    其中,所述正向调度包括:调高所述电子设备的处理器的工作频率,选择大核的处理器执行所述方法,以及调高所述电子设备的内存工作频率;所述处理器包括中央处理器CPU和/或图形处理器GPU。
  17. 根据权利要求2或7所述的方法,其特征在于,所述电子设备对所述电子设备的硬件资源进行正向调度,以缩短所述电子设备进行图层绘制、图层渲染和/或图层合成所需的时长,包括:
    所述电子设备根据所述第一处理帧长、所述第一统计周期内所述电子设备在一个同步周期内完成图层绘制、图层渲染和图层合成的次数或者概率,或者所述电子设备的前台应用或焦点应用,对所述电子设备的硬件资源进行正向调度,以缩短所述电子设备进行图层绘制、图层渲染和/或图层合成所需的时长;其中,所述前台应用是所述显示屏当前显示的界面对应的应用。
  18. 根据权利要求13-15中任一项所述的方法,其特征在于,所述电子设备执行以下负向调度中的一项或多项,以降低所述电子设备的功耗;
    其中,所述负向调度包括:调低所述电子设备的处理器的工作频率,选择小核的处理器执行所述方法,以及调低所述电子设备的内存工作频率;所述处理器包括CPU和/或图形处理器GPU。
  19. 根据权利要求4-18中任一项所述的方法,其特征在于,所述方法还包括:
    所述电子设备获取所述第一统计周期的一个或多个第二处理帧长,每个第二处理帧长是第二渲染帧长和第二SF帧长之和,所述第二渲染帧长是进行图层绘制和对绘制的图层进行渲染所需的时长,所述第二SF帧长是对渲染的图层进行图层合成所需的时长;
    所述电子设备根据所述一个或多个第二处理帧长,确定所述第一处理帧长;
    其中,所述一个或多个第二处理帧长包括多个第二处理帧长的情况下,所述第一处理帧长是所述多个第二处理帧长中最大的第二处理帧长;或者,所述第一处理帧长是所述多个第二处理帧长的平均值。
  20. 一种电子设备,其特征在于,所述电子设备包括触摸屏、存储器和一个或多个处理器;所述触摸屏、所述存储器和所述处理器耦合;所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,当所述处理器执行所述计算机指令时,所述电子设备执行如权利要求1-19中任一项所述的方法。
  21. 一种芯片系统,其特征在于,所述芯片系统应用于包括触摸屏的电子设备;所述芯片系统包括一个或多个接口电路和一个或多个处理器;所述接口电路和所述处理器通过线路互联;所述接口电路用于从所述电子设备的存储器接收信号,并向所述处理器发送所述信号,所述信号包括所述存储器中存储的计算机指令;当所述处理器执行所述计算机指令时,所述电子设备执行如权利要求1-19中任一项所述的方法。
  22. 一种计算机存储介质,其特征在于,包括计算机指令,当所述计算机指令在电子设备上运行时,使得所述电子设备执行如权利要求1-19中任一项所述的方法。
  23. 一种计算机程序产品,其特征在于,当所述计算机程序产品在计算机上运行时,使得所述计算机执行如权利要求1-19中任一项所述的方法。
PCT/CN2020/100014 2019-07-03 2020-07-03 一种基于垂直同步信号的图像处理方法及电子设备 WO2021000921A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US17/624,292 US11887557B2 (en) 2019-07-03 2020-07-03 Method for image processing based on vertical synchronization signals and electronic device
JP2021578085A JP7337968B2 (ja) 2019-07-03 2020-07-03 垂直同期信号に基づいた画像処理の方法及び電子機器
EP20834161.0A EP3971715A4 (en) 2019-07-03 2020-07-03 IMAGE PROCESSING METHOD BASED ON VERTICAL SYNCHRONIZATION SIGNALS, AND ELECTRONIC EQUIPMENT

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201910596178.X 2019-07-03
CN201910596178 2019-07-03
CN201910617101.6 2019-07-09
CN201910617101.6A CN110503708A (zh) 2019-07-03 2019-07-09 一种基于垂直同步信号的图像处理方法及电子设备

Publications (1)

Publication Number Publication Date
WO2021000921A1 true WO2021000921A1 (zh) 2021-01-07

Family

ID=68586206

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/100014 WO2021000921A1 (zh) 2019-07-03 2020-07-03 一种基于垂直同步信号的图像处理方法及电子设备

Country Status (5)

Country Link
US (1) US11887557B2 (zh)
EP (1) EP3971715A4 (zh)
JP (1) JP7337968B2 (zh)
CN (1) CN110503708A (zh)
WO (1) WO2021000921A1 (zh)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113190315A (zh) * 2021-04-29 2021-07-30 安徽华米健康医疗有限公司 一种显示器刷新方法及其装置
CN114089933A (zh) * 2021-06-09 2022-02-25 荣耀终端有限公司 显示参数的调整方法、电子设备、芯片及可读存储介质
CN115048012A (zh) * 2021-09-30 2022-09-13 荣耀终端有限公司 数据处理方法和相关装置
CN115190351A (zh) * 2022-07-06 2022-10-14 Vidaa国际控股(荷兰)公司 显示设备及媒资缩放控制方法
CN115550708A (zh) * 2022-01-07 2022-12-30 荣耀终端有限公司 数据处理方法及电子设备


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103593155A (zh) * 2013-11-06 2014-02-19 华为终端有限公司 显示帧生成方法和终端设备
US20160335737A1 (en) * 2015-05-14 2016-11-17 Qualcomm Innovation Center, Inc. Vsync aligned cpu frequency governor sampling
CN107220019A (zh) * 2017-05-15 2017-09-29 努比亚技术有限公司 一种基于动态vsync信号的渲染方法、移动终端及存储介质
CN108829475A (zh) * 2018-05-29 2018-11-16 北京小米移动软件有限公司 Ui绘制方法、装置及存储介质
CN110503708A (zh) * 2019-07-03 2019-11-26 华为技术有限公司 一种基于垂直同步信号的图像处理方法及电子设备


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113190315A (zh) * 2021-04-29 2021-07-30 Anhui Huami Health Medical Co., Ltd. Display refresh method and apparatus
CN114089933A (zh) * 2021-06-09 2022-02-25 Honor Device Co., Ltd. Display parameter adjustment method, electronic device, chip, and readable storage medium
CN114089933B (zh) * 2021-06-09 2022-09-02 Honor Device Co., Ltd. Display parameter adjustment method, electronic device, chip, and readable storage medium
CN115048012A (zh) * 2021-09-30 2022-09-13 Honor Device Co., Ltd. Data processing method and related apparatus
CN115550708A (zh) * 2022-01-07 2022-12-30 Honor Device Co., Ltd. Data processing method and electronic device
CN115550708B (zh) * 2022-01-07 2023-12-19 Honor Device Co., Ltd. Data processing method and electronic device
CN115190351A (zh) * 2022-07-06 2022-10-14 Vidaa International Holding (Netherlands) Co. Display device and media asset scaling control method
CN115190351B (zh) * 2022-07-06 2023-09-29 Vidaa International Holding (Netherlands) Co. Display device and media asset scaling control method

Also Published As

Publication number Publication date
CN110503708A (zh) 2019-11-26
JP7337968B2 (ja) 2023-09-04
JP2022538464A (ja) 2022-09-02
US20220358894A1 (en) 2022-11-10
EP3971715A1 (en) 2022-03-23
US11887557B2 (en) 2024-01-30
EP3971715A4 (en) 2022-07-06

Similar Documents

Publication Publication Date Title
WO2021000921A1 (zh) Image processing method based on vertical synchronization signal and electronic device
WO2020259457A1 (zh) Control method based on vertical synchronization signal and electronic device
WO2020177585A1 (zh) Gesture processing method and device
WO2021032097A1 (zh) Air gesture interaction method and electronic device
WO2022021895A1 (zh) Image processing method and electronic device
US20240021176A1 Image Processing Method Based on Vertical Synchronization Signal and Electronic Device
WO2021027678A1 (zh) Image processing method based on vertical synchronization signal and electronic device
WO2021057343A1 (zh) Operation method for electronic device and electronic device
US11899879B2 Stylus detection method, system, and related apparatus for switching frequencies for detecting input signals
WO2023142995A1 (zh) Data processing method and related apparatus
CN116991354A (zh) Data processing method and related apparatus
WO2022089153A1 (zh) Control method based on vertical synchronization signal and electronic device
WO2022068477A1 (zh) Event processing method and device
WO2023124227A1 (zh) Frame rate switching method and apparatus
WO2022143094A1 (zh) Window page interaction method and apparatus, electronic device, and readable storage medium
WO2023124225A1 (zh) Frame rate switching method and apparatus
CN116414336A (zh) Frame rate switching method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20834161

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020834161

Country of ref document: EP

Effective date: 20211214

ENP Entry into the national phase

Ref document number: 2021578085

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE