WO2023124225A1 - Frame rate switching method and device - Google Patents

Frame rate switching method and device

Info

Publication number
WO2023124225A1
WO2023124225A1 (PCT/CN2022/117925)
Authority
WO
WIPO (PCT)
Prior art keywords
frame
thread
vsync
frame rate
cycle
Prior art date
Application number
PCT/CN2022/117925
Other languages
English (en)
French (fr)
Other versions
WO2023124225A9 (zh)
Inventor
蔡立峰
孙学琛
张凯
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202210191921.5A (published as CN116414337A)
Application filed by 荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority to EP22879630.6A (published as EP4236301A4)
Publication of WO2023124225A1
Publication of WO2023124225A9

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402 Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281 Processing of video elementary streams involving reformatting operations of video signals by altering the temporal resolution, e.g. by frame skipping
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443 OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4433 Implementing client middleware, e.g. Multimedia Home Platform [MHP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443 OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4438 Window management, e.g. event handling following interaction with the user interface
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 Aspects of display data processing
    • G09G2340/04 Changes in size, position or resolution of an image
    • G09G2340/0407 Resolution change, inclusive of the use of different resolutions for different screen areas
    • G09G2340/0435 Change or adaptation of the frame rate of the video stream
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 Aspects of display data processing
    • G09G2340/10 Mixing of images, i.e. displayed pixel being the result of an operation, e.g. adding, on the corresponding input pixels
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 Aspects of display data processing
    • G09G2340/12 Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/39 Control of the bit-mapped memory
    • G09G5/393 Arrangements for updating the contents of the bit-mapped memory
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/39 Control of the bit-mapped memory
    • G09G5/395 Arrangements specially adapted for transferring the contents of the bit-mapped memory to the screen

Definitions

  • the embodiments of the present application relate to the technical field of terminals, and in particular, to a frame rate switching method and device.
  • In some display scenarios, the terminal device needs to switch the frame rate.
  • When the frame rate is switched, the sliding speed of the displayed image frames may jump, which causes the picture to stutter and degrades the user experience.
  • Embodiments of the present application provide a frame rate switching method and device to solve the problem of image-frame sliding-speed jumps caused by frame rate switching when an electronic device displays image frames.
  • In a first aspect, an embodiment of the present application proposes a frame rate switching method, which includes: an application thread draws and renders a first image frame in a first cycle based on the frame interval corresponding to a first frame rate; the application thread draws and renders a second image frame in a second cycle based on the frame interval corresponding to a second frame rate; a compositing thread sends a first frame-cut request to a hardware compositor in a third cycle; and the hardware compositor switches from the first frame rate to the second frame rate based on the first frame-cut request, so that the second image frame is displayed at the second frame rate.
  • the above-mentioned second period is located after the first period, and the second frame rate is different from the first frame rate.
  • the above-mentioned third period is located after the second period, or the third period coincides with the second period.
  • the above method can be applied to an electronic device, and the electronic device can be switched from the first frame rate to the second frame rate through the above method.
  • the second frame rate may be greater than the first frame rate, and the second frame rate may also be lower than the first frame rate.
  • The second frame rate and the first frame rate may be in an integer-multiple or non-integer-multiple relationship.
  • The statement that the application thread draws and renders the first image frame in the first cycle based on the frame interval corresponding to the first frame rate may mean that the application thread does so at an appropriate time within the first cycle; the present application does not limit the period of time within the first cycle during which the application thread draws and renders the first image frame.
  • Likewise, the statement that the application thread draws and renders the second image frame in the second cycle based on the frame interval corresponding to the second frame rate may mean that the application thread does so at an appropriate time within the second cycle; the present application does not limit the period of time within the second cycle during which the application thread draws and renders the second image frame.
  • It should be noted that the switch from the first frame rate to the second frame rate is not performed within the first, second, or third cycle; during these cycles the display frame rate is still the first frame rate. It is only that, starting from the second cycle, the second image frame is drawn and rendered in advance based on the frame interval corresponding to the second frame rate.
  • In this way, the image frame is drawn and rendered with the frame interval corresponding to the second frame rate in advance, and the hardware compositor is then notified to switch the frame rate, so that the image frame drawn and rendered with that frame interval can be displayed at the second frame rate. The drawing-and-rendering rhythm of the image frames thus matches the frame rate at display time, and the sliding speed of the image frames does not jump when they are displayed.
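The switching sequence described above can be sketched as a minimal model. This is purely illustrative (it is not the platform's actual implementation, and the function names here are hypothetical): the point is only that the drawing step changes one cycle before the display rate does.

```python
# Illustrative model of the three-cycle switching sequence (hypothetical
# names; the real logic lives in the platform's graphics stack).

def frame_interval_ms(frame_rate_hz: float) -> float:
    """The frame interval is the reciprocal of the frame rate."""
    return 1000.0 / frame_rate_hz

def switch_schedule(first_rate_hz: float, second_rate_hz: float):
    """Return (cycle description, interval used for drawing) pairs.

    The display still runs at the first rate through all three cycles;
    only the drawing/rendering step changes early, in the second cycle.
    """
    old = frame_interval_ms(first_rate_hz)
    new = frame_interval_ms(second_rate_hz)
    return [
        ("cycle 1: draw first image frame", old),
        ("cycle 2: draw second image frame in advance", new),
        ("cycle 3: send frame-cut request to hardware compositor", new),
    ]
```

For example, switching from 60 Hz to 120 Hz, the second image frame is already drawn with an ~8.33 ms step while the display still refreshes every ~16.67 ms.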
  • In a possible implementation, the compositing thread sends the first frame-cut request to the hardware compositor thread in the third cycle by calling the performSetActiveMode function in the third cycle, so as to wake up the hardware compositor thread and switch the frame rate from the first frame rate to the second frame rate.
  • In this way, the compositing thread can wake up the hardware compositor and control the frame rate switch by calling a dedicated function.
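The "call a function to wake the hardware compositor thread" pattern can be sketched with a condition variable. This is a hypothetical stand-in written for illustration only; the real performSetActiveMode is part of the platform's C++ compositor and has a different signature.

```python
import threading

class HwComposerStub:
    """Hypothetical model of the hardware-compositor thread: it sleeps
    until a frame-cut request arrives, then applies the new frame rate."""

    def __init__(self, initial_rate_hz: int):
        self.active_rate_hz = initial_rate_hz
        self._pending = None
        self._cv = threading.Condition()
        self.switched = threading.Event()
        threading.Thread(target=self._run, daemon=True).start()

    def perform_set_active_mode(self, rate_hz: int) -> None:
        # Called from the compositing thread in the third cycle: record
        # the requested mode and wake the hardware-compositor thread.
        with self._cv:
            self._pending = rate_hz
            self._cv.notify()

    def _run(self) -> None:
        # Hardware-compositor thread: wait for a request, then switch.
        with self._cv:
            while self._pending is None:
                self._cv.wait()
            self.active_rate_hz = self._pending
        self.switched.set()
```

Waking a dedicated thread (rather than switching inline) keeps the compositing thread from blocking on the hardware while the mode change takes effect.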
  • In a possible implementation, the above method further includes: the compositing thread sends a second frame-cut request to the Vsync thread in a fourth cycle; and the Vsync thread sends Vsync messages to the application thread at the second frame rate based on the second frame-cut request.
  • the above-mentioned fourth period is located after the second period, or the fourth period coincides with the second period.
  • In a possible implementation, the compositing thread sends the second frame-cut request to the Vsync thread in the fourth cycle by calling the setDuration function in the fourth cycle to set the period parameter corresponding to the second frame rate for the Vsync thread, so that the Vsync thread sends Vsync messages to the application thread at the second frame rate, thereby implementing the frame rate switch in software.
  • In this way, the compositing thread can make the Vsync thread switch the frame rate by calling a dedicated function, so that Vsync messages are sent to the application thread at the new second frame rate, realizing the frame rate switch in software.
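The period-parameter handover can be modeled as follows. Again, this is an illustrative sketch (the real setDuration belongs to the platform scheduler and is not shown in this document): the only behavior modeled is that after the call, Vsync messages are spaced by the second frame rate's period.

```python
class VsyncThreadStub:
    """Hypothetical model of the Vsync thread's period bookkeeping."""

    def __init__(self, frame_rate_hz: float):
        self.period_ns = round(1e9 / frame_rate_hz)

    def set_duration(self, frame_rate_hz: float) -> None:
        # The compositing thread hands over the period parameter of the
        # second frame rate; subsequent Vsync messages use this spacing.
        self.period_ns = round(1e9 / frame_rate_hz)

    def next_vsync_times(self, start_ns: int, count: int):
        # Vsync messages to the application thread are one period apart.
        return [start_ns + i * self.period_ns for i in range(1, count + 1)]
```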
  • In a possible implementation, the third cycle is the Nth cycle after the second cycle, where N is an integer greater than or equal to 1.
  • the third cycle is located after the second cycle, and the third cycle is adjacent to the second cycle.
  • the third cycle is located after the second cycle, but the third cycle is not adjacent to the second cycle, and there is a certain time interval between the third cycle and the second cycle.
  • In some cases, the time interval between the third cycle and the second cycle may be relatively long.
  • the above-mentioned third period coincides with the second period.
  • the application thread draws and renders image frames and the compositing thread sends the first frame cutting request to the hardware compositor thread in the same cycle (both in the second cycle).
  • In a possible implementation, the fourth cycle is the Nth cycle after the second cycle, where N is an integer greater than or equal to 1.
  • the fourth cycle is located after the second cycle, and the fourth cycle is adjacent to the second cycle.
  • the fourth cycle is located after the second cycle, but the fourth cycle is not adjacent to the second cycle, and there is a certain time interval between the fourth cycle and the second cycle.
  • In some cases, the time interval between the fourth cycle and the second cycle may be relatively long.
  • the above fourth period coincides with the second period.
  • the application thread draws and renders the image frame and the compositing thread sends the second frame cutting request to the Vsync thread in the same cycle (both in the second cycle).
  • In a possible implementation, the above method further includes: the application thread receives a first Vsync message sent by the Vsync thread, and the first Vsync message carries the frame interval corresponding to the second frame rate.
  • Since the first Vsync message carries the frame interval corresponding to the second frame rate, when the application thread receives the first Vsync message, it can draw and render the second image frame based on that frame interval.
  • In a possible implementation, the Vsync thread sends the first Vsync message to the application thread at the starting moment of the second cycle, so that the application thread can receive the first Vsync message at the starting moment of the second cycle, obtain the frame interval carried by the first Vsync message, and then draw and render the second image frame according to that frame interval.
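The "Vsync message carrying the new frame interval" can be sketched as a small message type plus an application-side handler. The field and function names here are illustrative assumptions, not the platform's actual message layout.

```python
from dataclasses import dataclass

@dataclass
class VsyncMessage:
    """Hypothetical layout of the first Vsync message described above."""
    vsync_id: int
    timestamp_ns: int
    frame_interval_ns: int  # interval of the second (target) frame rate

def on_vsync(msg: VsyncMessage) -> float:
    # Application-thread handler: read the carried frame interval and use
    # it as the time step when drawing and rendering the next image frame.
    return msg.frame_interval_ns / 1e6  # animation step in milliseconds
```

Carrying the interval inside the message is what lets the application thread start pacing content at the second frame rate before the hardware has actually switched.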
  • In a possible implementation, the above method further includes: the compositing thread receives an initial frame-cut request from the application thread; the compositing thread sends a frame-interval modification notification message to the Vsync thread; and the Vsync thread generates the first Vsync message based on the frame-interval modification notification message.
  • The above initial frame-cut request may carry the target frame rate to be switched to; the target frame rate here is in fact the second frame rate, so the initial frame-cut request may carry the second frame rate.
  • After receiving the initial frame-cut request, the compositing thread generates a frame-interval modification notification message based on the request; this message is used to notify the Vsync thread to modify the first Vsync message so that it carries the frame interval corresponding to the second frame rate.
  • In a possible implementation, the compositing thread receives the initial frame-cut request from the application thread in the first cycle, and the compositing thread sends the frame-interval modification notification message to the Vsync thread in the first cycle.
  • In this way, after receiving the initial frame-cut request, the compositing thread sends the frame-interval modification notification message to the Vsync thread within the same cycle, leaving the Vsync thread enough time to generate the first Vsync message carrying the frame interval corresponding to the second frame rate.
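The request flow above (application thread → compositing thread → Vsync thread, all within the first cycle) can be sketched with two stubs. This is an illustrative assumption about the call shape only; the real platform passes these messages through handler queues rather than direct calls.

```python
class VsyncStub:
    """Records the frame interval that the first Vsync message should
    carry, as set by the frame-interval modification notification."""
    def __init__(self):
        self.carried_interval_ns = None

    def modify_frame_interval(self, interval_ns: int) -> None:
        self.carried_interval_ns = interval_ns

class CompositingThreadStub:
    """Hypothetical sketch of the compositing thread's role in the flow."""
    def __init__(self, vsync_thread: VsyncStub):
        self.vsync_thread = vsync_thread

    def on_initial_frame_cut_request(self, target_rate_hz: float) -> None:
        # The initial frame-cut request carries the target (second) frame
        # rate.  Within the same (first) cycle, notify the Vsync thread so
        # it has time to build the first Vsync message.
        self.vsync_thread.modify_frame_interval(round(1e9 / target_rate_hz))
```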
  • In a possible implementation, the above method further includes: the compositing thread determines the cycle next to the cycle in which the first image frame is composited as the third cycle.
  • The above first frame-cut request may carry the frame number (framenumber) and/or the corresponding VsyncID of the first image frame. When compositing an image frame, the compositing thread can identify it by its frame number or corresponding VsyncID; when it determines that the image frame currently to be composited is the first image frame, it determines the cycle next to the one in which the first image frame is composited as the third cycle, and sends the first frame-cut request to the hardware compositor in the third cycle.
  • the compositing thread determines the next cycle as the third cycle when determining, according to the frame number, that the current image frame to be composited is the first image frame.
  • For example, if the frame number of the first image frame is 1 and its corresponding VsyncID is 1, the compositing thread checks the frame number or the corresponding VsyncID of each image frame when compositing it; when the frame number is 1 or the corresponding VsyncID is 1, the next cycle is determined as the third cycle.
  • In a possible implementation, the above method further includes: the compositing thread determines the cycle next to the cycle in which the first image frame is composited as the fourth cycle.
  • The above first frame-cut request may carry the frame number (framenumber) and/or the corresponding VsyncID of the first image frame. When compositing an image frame, the compositing thread can identify it by its frame number or corresponding VsyncID; when it determines that the image frame currently to be composited is the first image frame, it determines the cycle next to the one in which the first image frame is composited as the fourth cycle, and sends the second frame-cut request to the Vsync thread in the fourth cycle.
  • For example, when the compositing thread determines, according to the frame number, that the image frame currently to be composited is the first image frame, the next cycle is determined as the fourth cycle.
  • For example, if the frame number of the first image frame is 1 and its corresponding VsyncID is 1, the compositing thread checks the frame number or the corresponding VsyncID of each image frame when compositing it; when the frame number is 1 or the corresponding VsyncID is 1, the next cycle is determined as the fourth cycle.
  • In a possible implementation, the time intervals of the first cycle, the second cycle, and the third cycle are all equal to the frame interval corresponding to the first frame rate.
  • The above first, second, and third cycles all precede the switch and therefore correspond to the first frame rate; accordingly, their time intervals are all equal in size to the frame interval corresponding to the first frame rate.
  • Similarly, the time intervals of the first cycle, the second cycle, and the fourth cycle are all equal to the frame interval corresponding to the first frame rate.
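A quick check of the arithmetic, with illustrative numbers: all pre-switch cycles are paced by the first frame rate, even though the second image frame's content is already drawn at the second rate's interval.

```python
def cycle_lengths_ms(first_rate_hz: float, n_cycles: int):
    """All cycles before the hardware switch share the first frame
    rate's interval, since the display is still driven at that rate."""
    return [1000.0 / first_rate_hz] * n_cycles

# Example: switching from 60 Hz to 120 Hz, the first, second, and
# third/fourth cycles each last 1000/60 ~= 16.67 ms, while the second
# image frame is already drawn with a 1000/120 ~= 8.33 ms step.
```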
  • In a second aspect, an embodiment of the present application provides an electronic device. The electronic device may also be called a terminal, user equipment (UE), mobile station (MS), mobile terminal (MT), and so on.
  • The electronic device may be a mobile phone, a smart TV, a wearable device, a tablet computer (Pad), a computer with wireless transceiver functions, a virtual reality (VR) device, an augmented reality (AR) device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, etc.
  • the electronic device includes a processor, and the processor is used for invoking a computer program in a memory to execute the method according to the first aspect.
  • the embodiment of the present application provides a computer-readable storage medium, the computer-readable storage medium stores computer instructions, and when the computer instructions are run on the electronic device, the electronic device executes the method in the first aspect.
  • the embodiment of the present application provides a computer program product, which causes an electronic device to execute the method in the first aspect when the computer program is executed.
  • FIG. 1 is a schematic structural diagram of a hardware system of an electronic device provided in an embodiment of the present application
  • FIG. 2 is a schematic structural diagram of a software system of an electronic device provided in an embodiment of the present application
  • FIG. 4 is a schematic diagram of an application scenario provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a frame rate switching process of an electronic device in a conventional solution
  • FIG. 9 is a schematic diagram of a frame rate switching process of a frame rate switching method according to an embodiment of the present application.
  • FIG. 10 is an interactive diagram of the frame rate switching process of the frame rate switching method according to the embodiment of the present application.
  • FIG. 11 is a schematic diagram of a frame rate switching process of a frame rate switching method according to an embodiment of the present application.
  • FIG. 13 is a schematic diagram of a frame rate switching process of a frame rate switching method according to an embodiment of the present application.
  • words such as “first” and “second” are used to distinguish the same or similar items with basically the same function and effect.
  • the first chip and the second chip are only used to distinguish different chips, and their sequence is not limited.
  • Words such as "first" and "second" do not limit quantity or execution order, and do not necessarily indicate that the items referred to are different.
  • “at least one” means one or more, and “multiple” means two or more.
  • “And/or” describes the association relationship of associated objects, indicating that there may be three types of relationships, for example, A and/or B, which can mean: A exists alone, A and B exist at the same time, and B exists alone, where A, B can be singular or plural.
  • the character “/” generally indicates that the contextual objects are an “or” relationship.
  • “At least one of the following” or similar expressions refer to any combination of these items, including any combination of single or plural items.
  • At least one item (piece) of a, b, or c can represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where each of a, b, and c may be singular or plural.
  • Electronic equipment includes terminal equipment. An electronic device may also be called a terminal, user equipment (UE), mobile station (MS), mobile terminal (MT), and so on.
  • The electronic device may be a mobile phone, a smart TV, a wearable device, a tablet computer (Pad), a computer with wireless transceiver functions, a virtual reality (VR) device, an augmented reality (AR) device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, etc.
  • the embodiments of the present application do not limit the specific technology and specific device form adopted by the electronic device.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, and an antenna 2 , mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, earphone jack 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193, display screen 194, and A subscriber identification module (subscriber identification module, SIM) card interface 195 and the like.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • The controller can generate operation control signals according to instruction opcodes and timing signals, and controls instruction fetching and execution.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated access and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • The I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface to realize the shooting function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the interface connection relationship between the modules shown in the embodiment of the present application is a schematic illustration and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Antennas in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • Wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used for displaying images, displaying videos, receiving sliding operations, and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a mini-LED, a micro-LED, a micro-OLED, quantum dot light-emitting diodes (quantum dot light emitting diodes, QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be realized through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, such as saving music, video, and other files in the external memory card.
  • the electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor.
  • Speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
  • Electronic device 100 can listen to music through speaker 170A, or listen to hands-free calls.
  • Receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • the receiver 170B can be placed close to the human ear to listen to the voice.
  • the microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
  • the user can put his mouth close to the microphone 170C to make a sound, and input the sound signal to the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C.
  • the earphone interface 170D is used for connecting wired earphones.
  • the earphone interface 170D can be a USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • the gyro sensor 180B can determine the angular velocity of the electronic device 100 around three axes (i.e., the x, y, and z axes).
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip leather case.
  • when the electronic device 100 is a flip phone, the electronic device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D.
  • features such as automatic unlocking upon opening the flip cover can then be configured.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary.
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 may measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F for distance measurement to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light through the light emitting diode.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature handling strategy.
  • the touch sensor 180K is also called “touch device”.
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the position of the display screen 194 .
  • the bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice.
  • the keys 190 include a power key, a volume key and the like.
  • the keys 190 may be mechanical keys or touch keys.
  • the electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100 .
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status and battery level changes, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be connected to and separated from the electronic device 100 by inserting it into or pulling it out of the SIM card interface 195.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture, and the like.
  • the software structure of the electronic device 100 is exemplarily described by taking an Android (Android) system with a layered architecture as an example.
  • FIG. 2 is a block diagram of a software structure of an electronic device according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the Android system is divided into five layers, which are application layer, application program framework layer, Android runtime (Android runtime) and system library, hardware abstraction layer, and kernel layer from top to bottom.
  • the application layer can include a series of application packages.
  • the application program package may include application programs such as telephone, email, calendar, and camera.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a frame rate control system, an image composition system, a view system, a package manager, an input manager, an event manager, and a resource manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • the frame rate control system is used to adjust the screen refresh rate.
  • the image synthesis system is used for controlling image synthesis and generating a vertical synchronization (vertical synchronization, Vsync) signal.
  • the image synthesis system includes: a synthesis thread, a Vsync thread, and a cache queue (queue buffer) thread.
  • the compositing thread is used to be awakened by the Vsync signal for compositing.
  • the Vsync thread is used to generate the next Vsync signal according to the Vsync signal request.
  • the cache queue thread is used to store caches, generate Vsync signal requests, and wake up compositing threads.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the package manager is used for program management within the system, such as application installation, uninstallation, and upgrade.
  • the input manager is a program for managing input devices. For example, the input manager can determine input actions such as mouse clicks, keyboard inputs, and touch swipes.
  • the activity manager is used to manage the lifecycle of individual applications and the navigation back function. It is responsible for creating the Android main thread and maintaining the lifecycle of each application.
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • a system library can include multiple function modules. For example: image rendering library, image synthesis library, function library, media library and input processing library, etc.
  • the application draws and renders the image through the image rendering library, and then the application sends the drawn and rendered image to the buffer queue of the image composition system.
  • the image synthesis system may be, for example, surface flinger.
  • the image synthesis system sequentially acquires a frame of images to be synthesized from the cache queue, and then performs image synthesis through the image synthesis library.
  • the hardware abstraction layer can include multiple library modules, such as a hardware synthesizer (hwcomposer, HWC), camera library module, etc.
  • the Android system can load corresponding library modules for the device hardware, and then realize the purpose of the application framework layer accessing the device hardware.
  • Device hardware may include, for example, LCD displays and cameras in electronic devices.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a touch panel (TP) driver, a display driver, a Bluetooth driver, a WIFI driver, a keyboard driver, a shared memory driver, and a camera driver.
  • Hardware can be audio devices, bluetooth devices, camera devices, sensor devices, etc.
  • the kernel layer processes the touch operation into an original input event (including touch coordinates, touch force, time stamp of the touch operation and other information).
  • Raw input events are stored at the kernel level.
  • the kernel layer reports the original input event to the input manager of the application framework layer through the input processing library.
  • the input manager of the application framework layer parses the information of the original input event (including: operation type and reporting location, etc.), determines the focus application according to the current focus, and sends the parsed information to the focus application.
  • the focus can be a touch point in a touch operation or a click position in a mouse click operation.
  • the focus application is the application running in the foreground of the electronic device or the application corresponding to the touch position in the touch operation.
  • the focus application determines the control corresponding to the original input event according to the parsed original input event information (for example, the location of the reporting point).
  • Frame refers to a single picture that is the smallest unit in the interface display.
  • a frame can be understood as a still picture, and displaying multiple connected frames in rapid succession can form the illusion of object motion.
  • the frame rate refers to the number of frames that refresh the picture in 1 second, and can also be understood as the number of times the graphics processor in the electronic device refreshes the picture per second.
  • a high frame rate results in smoother and more realistic animations. The more frames per second, the smoother the displayed motion will be.
  • Frame rendering: coloring the drawn view or adding 3D effects.
  • the 3D effect may be a lighting effect, a shadow effect, a texture effect, and the like.
  • Frame compositing: the process of compositing one or more of the above-mentioned rendered views into a display interface.
  • the display process of the interface of the electronic device 100 will be described below in combination with software and hardware.
  • in order to improve display fluency and reduce display stuttering, electronic devices generally display based on Vsync signals to synchronize the processes of image drawing, rendering, synthesis, and screen refresh display.
  • the Vsync signal is a periodic signal, and the period of the Vsync signal can be set according to the screen refresh rate.
  • the period of the Vsync signal can be 16.6ms; that is, the electronic device generates a control signal every 16.6ms to trigger the Vsync signal cycle.
  • the software Vsync signal and the hardware Vsync signal are cycle-synchronized. Taking the change of 60Hz and 120Hz as an example, if Vsync-HW switches from 60Hz to 120Hz, Vsync-APP and Vsync-SF change synchronously, switching from 60Hz to 120Hz.
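The relationship between refresh rate and Vsync period used throughout this description can be sketched as follows (an illustrative aid, not part of the disclosure; the truncation to one decimal place mirrors the 16.6ms / 8.3ms / 11.1ms values quoted in the text):

```python
import math

def vsync_period_ms(refresh_rate_hz: float) -> float:
    """Vsync signal period for a given screen refresh rate, truncated to 0.1 ms."""
    return math.floor(10000.0 / refresh_rate_hz) / 10.0

# 60 Hz -> 16.6 ms, 120 Hz -> 8.3 ms, 90 Hz -> 11.1 ms
```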
  • FIG. 3 is a schematic diagram of an electronic device interface display processing flow in a possible implementation.
  • the images displayed by the electronic device correspond to frame 1, frame 2, and frame 3 in turn.
  • the application of the electronic device draws and renders frame 1 through the view system of the application framework layer.
  • the application of the electronic device sends the drawn and rendered frame 1 to an image compositing system (for example, surface flinger).
  • the image synthesis system synthesizes the drawn and rendered frame 1 .
  • the electronic device can start the display driver by calling the kernel layer, and display the content corresponding to frame 1 on the screen (display screen).
  • Frame 2 and frame 3 are also synthesized and displayed similar to the process of frame 1, which will not be repeated here.
  • each frame lags by 2 Vsync signal cycles from drawing to display, so the electronic device exhibits latency in the process of displaying image frames.
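The two-cycle lag described above can be modeled with a trivial helper (an illustrative sketch; the cycle numbering is an assumption for the example):

```python
def display_cycle(draw_cycle: int) -> int:
    """Cycle in which a frame drawn in draw_cycle reaches the screen.

    Drawing/rendering happens in cycle n, compositing in cycle n + 1,
    and display in cycle n + 2, giving the 2-Vsync-cycle lag noted above.
    """
    composite_cycle = draw_cycle + 1
    return composite_cycle + 1

# A frame drawn in cycle 0 is displayed in cycle 2, i.e. two Vsync
# periods (2 x 16.6 ms at 60 Hz) after drawing begins.
```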
  • the frame rate switching method of the embodiment of the present application can be applied to various application scenarios of electronic devices, and the application scenarios of the frame rate switching method of the embodiment of the present application will be described below with reference to the accompanying drawings.
  • FIG. 4 is a schematic diagram of an application scenario of an embodiment of the present application.
  • the electronic device can receive the user's swipe-up or swipe-down operation on the social application interface shown in a in Figure 4, the settings-related interface shown in b in Figure 4, the document interface shown in c in Figure 4, the product browsing interface shown in d in Figure 4, or another interface.
  • the electronic device may also receive a user's left-swipe operation or right-swipe operation on the interface shown in e in FIG. 4 , or the e-book interface shown in f in FIG. 4 .
  • the electronic device receives a user's sliding operation, the electronic device performs frame drawing, rendering, compositing and other processes based on the sliding operation, and displays content corresponding to the sliding operation.
  • the electronic device can switch from the current frame rate to another, lower frame rate to reduce the electronic device's power consumption.
  • the frame rate of the electronic device needs to be switched from 120Hz to 60Hz; the compositing thread decides to switch the frame rate between 0ms-8.3ms, and after two cycles (8.3ms-16.6ms and 16.6ms-33.2ms), the electronic device completes the switch from 120Hz to 60Hz at 33.2ms.
  • the frame rate switching process shown in FIG. 5 mainly includes steps S1 and S2, and these two steps are briefly introduced below.
  • the application main thread sends a frame cutting request message to the compositing thread.
  • the above frame cutting request carries the target frame rate and process ID (process ID, PID) of the current frame.
  • the target frame rate is the frame rate that the electronic device requests to switch to.
  • the application main thread sends a frame cutting request message to the composition thread within a period of 0ms-8.3ms.
  • the current frame rate of the electronic device is 120Hz.
  • the PID in the frame cutting request message is used to identify the corresponding process. For example, if the electronic device is currently displaying a picture of a certain game, then the PID in the frame cutting request message is used to identify the relevant process of the game.
  • the compositing thread sends a frame rate switching notification message to the Vsync thread.
  • the compositing thread determines that the target frame rate is 60Hz, so it decides to switch the frame rate from the current 120Hz to 60Hz, and sends frame rate switching notification messages to the hardware compositor and the Vsync thread to switch from the current frame rate to the target frame rate.
  • after the compositing thread sends the frame rate switching notification message, it takes two more cycles (8.3ms-16.6ms and 16.6ms-33.2ms) to complete the frame rate switch. As shown in Figure 5, the electronic device completes the frame rate switch at 33.2ms; at this time, the frequencies of the hardware integrated unit and the software Vsync signal have also been switched to 60Hz.
  • after receiving the frame rate switching request message sent by the application main thread at 0ms-8.3ms, the compositing thread decides to cut frames. Next, the compositing thread sends a frame rate switching notification message to the hardware compositor, so that the hardware compositor controls the hardware integrated circuit to switch the frame rate from 120Hz to 60Hz, and the hardware integrated circuit completes the frame rate switch at 33.2ms.
  • the compositing thread also sends a frame rate switching notification message to the Vsync thread to notify the Vsync thread to switch the software cycle to a new cycle during 16.6ms-33.2ms (the new cycle corresponds to the new frame rate of 60Hz), completing the frame rate switch.
  • the timer resets its timing based on the 16.6ms timestamp and the new frame interval (the frame interval corresponding to 60Hz is 16.6ms), so that the timer wakes up the Vsync thread according to the new timing (a timing matching the target frame rate).
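The timer re-arming step can be sketched numerically (an illustrative sketch; the function name and return shape are assumptions, not part of the disclosure):

```python
def wakeups_after_reset(reset_ts_ms: float, new_interval_ms: float, count: int):
    """Vsync-thread wake-up timestamps after the timer is re-armed.

    The timer restarts from the timestamp of the switch decision and
    thereafter fires at the frame interval of the target frame rate.
    """
    return [round(reset_ts_ms + i * new_interval_ms, 1) for i in range(1, count + 1)]

# Re-arming at the 16.6 ms timestamp with the 60 Hz interval (16.6 ms)
# gives wake-ups at 33.2 ms, 49.8 ms, ... matching the FIG. 5 timeline.
```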
  • the electronic device completed the switching of the frame rate from 120Hz to 60Hz at 33.2ms.
  • the frame interval used for frame 7 rendering has been modified to 16.6ms, and the displacement during display has been modified to 2Pixel.
  • the frame interval used in frame 2 rendering is still 8.3ms
  • the rendering displacement is 1Pixel
  • the displacement during display is 1Pixel
  • the corresponding time interval during display is 8.3ms
  • the sliding speed when displaying frames 0 to 2 is 1Pixel/8.3ms
  • the sliding speed when displaying frames 3 to 6 is 0.5Pixel/8.3ms
  • the sliding speed when displaying frame 7 is 1Pixel/8.3ms.
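The speed jump described for FIG. 5 can be reproduced numerically (an illustrative sketch; the normalization to pixels per 8.3ms follows the speeds quoted above, and the 1-pixel-per-8.3ms baseline is taken from the text):

```python
def speed_px_per_8p3ms(render_interval_ms: float, display_interval_ms: float) -> float:
    """On-screen sliding speed, normalized to pixels per 8.3 ms.

    The displacement baked into a frame is proportional to the frame
    interval used at render time (1 pixel per 8.3 ms); the on-screen
    speed divides that displacement by the interval actually elapsed
    between displayed frames.
    """
    displacement_px = render_interval_ms / 8.3
    return round(displacement_px / display_interval_ms * 8.3, 2)

# Frames 0-2: rendered and displayed at 8.3 ms   -> 1.0 px / 8.3 ms
# Frames 3-6: rendered for 8.3 ms, shown 16.6 ms -> 0.5 px / 8.3 ms (the jump)
# Frame 7:   rendered for 16.6 ms, shown 16.6 ms -> 1.0 px / 8.3 ms
```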
  • the frame rate switching process shown in FIG. 5 above switches from 120Hz to 60Hz, where the frame rates before and after the switch are integer multiples of each other.
  • not only does the display speed of the image frames jump when the frame rates before and after the switch are integer multiples; when they are non-integer multiples, a jump in the display speed of the image frames also occurs.
  • the frame rate of the electronic device needs to be switched from 120Hz to 90Hz.
  • the synthesis thread decides to switch the frame rate between 0ms-8.3ms.
  • the electronic device realizes the switching of the frame rate from 120Hz to 90Hz at 27.7ms.
  • the application main thread sends a frame cutting request message to the compositing thread.
  • the frame cutting request carries the target frame rate and PID of the current frame.
  • the main thread of the application sends a frame cutting request message to the synthesis thread within a period of 0ms-8.3ms.
  • the current frame rate of the electronic device is 120Hz and needs to be switched to a target frame rate of 90Hz; therefore, the target frame rate carried in the frame cutting request is 90Hz.
  • the compositing thread sends a frame rate switching notification message to the Vsync thread.
  • the compositing thread decides to switch the frame rate from the current 120Hz to 90Hz according to the target frame rate of 90Hz carried in the frame cutting request, and then sends a frame rate switching notification message to the hardware compositor and the Vsync thread in the period of 0ms-8.3ms, so as to switch from the current frame rate to the target frame rate.
  • after the compositing thread sends the frame rate switching notification message, it takes two more cycles (8.3ms-16.6ms and 16.6ms-27.7ms) to complete the frame rate switch. Therefore, in Figure 6, the frame rate switch is completed at 27.7ms; at this time, the frequencies of the hardware integrated unit and the software Vsync signal have also been switched to 90Hz.
  • after receiving the frame rate switching request message sent by the application main thread at 0ms-8.3ms, the compositing thread decides to cut frames. Next, the compositing thread sends a frame rate switching notification message to the hardware compositor, so that the hardware compositor controls the hardware integrated circuit to switch the frame rate from 120Hz to 90Hz, and the hardware integrated circuit completes the frame rate switch at 27.7ms.
  • the compositing thread also sends a frame rate switching notification message to the Vsync thread to notify the Vsync thread to switch the software cycle to a new cycle during 16.6ms-27.7ms (the new cycle corresponds to the new frame rate of 90Hz), completing the frame rate switch.
  • the timer resets its timing based on the 16.6ms timestamp and the new frame interval (the frame interval corresponding to 90Hz is 11.1ms), so that the timer wakes up the Vsync thread according to the new timing (a timing matching the target frame rate).
  • the sliding speed of frame 0 to frame 2 is 1Pixel/8.3ms
  • the sliding speed of frame 3 to frame 6 is 0.7Pixel/8.3ms
  • the sliding speed of frame 7 is 1Pixel/8.3ms.
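The same normalization explains the 0.7Pixel/8.3ms figure for the non-integer-multiple switch (an illustrative sketch; 0.7 is the one-decimal rounding of 8.3/11.1 ≈ 0.75, matching the value quoted above):

```python
def displayed_speed_px_per_8p3ms(displacement_px: float, display_interval_ms: float) -> float:
    """On-screen speed in pixels per 8.3 ms, rounded to one decimal as quoted."""
    return round(displacement_px / display_interval_ms * 8.3, 1)

# Frames 3-6 carry a 1-pixel displacement (rendered for an 8.3 ms interval)
# but are shown at the 90 Hz interval of 11.1 ms, dropping the speed to
# about 0.7 px / 8.3 ms before frame 7 restores 1 px / 8.3 ms.
```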
  • the embodiment of the present application provides a new frame rate switching method, which adjusts the frame interval used for image frame rendering in advance, and makes the Part of the image frames whose frame interval is adjusted in advance is displayed at the switched frame rate, so that the sliding speed of the image frames will not jump when the image frames are displayed, and the user experience is improved.
  • the frame rate switching method in the embodiment of the present application adjusts the frame interval used in the drawing and rendering of the image frame in advance, and determines the timing of the frame rate switching by tracking the frameNumber of the image frame, so that the frame used in the drawing and rendering is adjusted in advance Interval image frames can be displayed at the switched frame rate, so that the sliding speed of the image frames will not jump when the image frames are displayed.
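The principle can be illustrated with a toy model (a hedged sketch of the idea, not the patented implementation; the function names and the 2-cycle lag value are assumptions for the example): because a frame drawn in cycle n is displayed in cycle n + 2, the render-time frame interval must switch two cycles before the display-time frame rate does, so every frame shown at the new rate already carries a matching displacement.

```python
def render_interval_for(draw_cycle: int, switch_display_cycle: int,
                        old_ms: float, new_ms: float, lag: int = 2) -> float:
    """Frame interval handed to the app when it draws in draw_cycle.

    The interval is switched 'lag' cycles early so that frames which will
    be displayed at the new frame rate are rendered with the new interval.
    """
    return new_ms if draw_cycle + lag >= switch_display_cycle else old_ms

# 120 Hz -> 60 Hz with the display-side switch taking effect in cycle 4:
speeds = []
for draw_cycle in range(6):
    interval = render_interval_for(draw_cycle, 4, 8.3, 16.6)
    displacement = interval / 8.3                       # pixels baked in at render time
    shown_for = 16.6 if draw_cycle + 2 >= 4 else 8.3    # actual display interval
    speeds.append(round(displacement / shown_for * 8.3, 1))
# every displayed frame moves at 1.0 px / 8.3 ms: no jump in sliding speed
```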
  • the frame rate switching method in the embodiment of the present application will be described in detail below with reference to FIG. 7 .
  • the frame rate switching method shown in FIG. 7 can be performed by electronic equipment.
  • the frame rate after moment 8 is the second frame rate, and each moment corresponds to a VsyncID.
  • the method shown in FIG. 7 includes steps S101 to S104, and these four steps will be described in detail below.
  • the above-mentioned second period is located after the first period, and the second frame rate is different from the first frame rate. It can be understood that the situation shown in FIG. 7 is that the second period is located after the first period, and the second period is not adjacent to the first period. In fact, the present application only defines that the second period is located after the first period and does not limit whether the second period is adjacent to the first period. The second period may be adjacent to the first period or not.
  • the compositing thread sends a first frame cutting request to the hardware compositor in a third period.
  • the above-mentioned third period may be located after the second period, or, the above-mentioned third period and the second period coincide.
  • the above-mentioned third period is located after the second period, and the third period is not adjacent to the second period.
  • the number of Buffers stacked in the buffer queue is N, and N is an integer greater than or equal to 1.
  • the third cycle is located after the second cycle.
  • FIG. 7 shows the case where the number of stacked Buffers in the cache queue is 2. According to FIG. 7 , the third cycle is located after the second cycle, and the third cycle is not adjacent to the second cycle.
  • the compositing thread may specifically call the function performSetActiveMode in the third period, so as to wake up the hardware compositor thread and switch the frame rate from the first frame rate to the second frame rate.
  • the hardware synthesizer switches from the first frame rate to the second frame rate based on the first frame cutting request, so that the second image frame is displayed at the second frame rate.
  • the method shown in FIG. 7 also includes:
  • the compositing thread sends a second frame cutting request to the Vsync thread in the fourth period.
  • the Vsync thread sends a Vsync message to the application thread at the second frame rate based on the second frame cut request.
  • the fourth period is located after the second period, or the fourth period is coincident with the second period, that is to say, the fourth period cannot be located before the second period.
  • in S105 above, the compositing thread can control the Vsync thread to send the Vsync message to the application thread at the second frame rate by calling a corresponding function.
  • the number of Buffers stacked in the buffer queue is N, and N is an integer greater than or equal to 1.
  • the fourth cycle is located after the second cycle.
  • the number of Buffers stacked in the buffer queue is 0.
  • the fourth cycle also coincides with the second cycle.
  • steps S103 and S105 may be performed within one period.
  • the method shown in FIG. 7 further includes:
  • the application thread receives the first Vsync message sent by the Vsync thread.
  • the first Vsync message carries a frame interval corresponding to the second frame rate.
  • step S102a specifically includes: the application thread receives the first Vsync message sent by the Vsync thread at the beginning of the second period.
  • the method shown in FIG. 7 further includes:
  • the synthesis thread sends a frame interval modification notification message to the Vsync thread;
  • the Vsync thread generates a first Vsync message based on the frame interval modification notification message.
  • the above initial frame cut request is used to request the compositing thread to switch the frame rate from the current first frame rate to the second frame rate.
  • the above-mentioned frame interval modification notification message is used to notify the Vsync thread to modify the frame interval in the next Vsync message (that is, the first Vsync message) to the frame interval corresponding to the second frame rate.
  • the Vsync thread generates the first Vsync message. If the second frame rate is 90 Hz, the frame interval in the first Vsync message is 11.1 ms.
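As a quick check of the numbers used throughout this document, the frame interval is simply the reciprocal of the frame rate. A one-line sketch (rounding to one decimal place, matching the figures quoted here):

```python
def frame_interval_ms(frame_rate_hz: float) -> float:
    # The frame interval is the reciprocal of the frame rate,
    # expressed in milliseconds and rounded to one decimal place.
    return round(1000.0 / frame_rate_hz, 1)
```

This reproduces the document's values: 120 Hz gives 8.3 ms and 90 Hz gives 11.1 ms.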
  • the application thread may first initiate an initial frame cutting request to the synthesis thread to request frame cutting, so that the application can flexibly initiate a frame cutting request to the synthesis thread as needed.
  • the above step S102x specifically includes: the compositing thread receives an initial frame cutting request from the application thread in the first cycle;
  • the above step S102y specifically includes: the compositing thread sends a frame interval modification notification message to the Vsync thread in the first period.
  • after the composition thread receives the initial frame cutting request in the first cycle, it sends a frame interval modification notification message to the Vsync thread within the same cycle, which reserves sufficient time for the Vsync thread to modify the frame interval in the Vsync message.
  • the method shown in FIG. 7 further includes:
  • the synthesis thread can identify whether the image frame to be synthesized is the first image frame according to the frame number (frameNumber) and/or the corresponding VsyncID of the image frame, thereby determining whether the image frame currently to be synthesized is the first image frame.
  • the next cycle of synthesizing and processing the first image frame is determined as the third cycle, and the first frame cutting request is sent to the hardware synthesizer in the third cycle.
  • the time intervals of the first period, the second period, and the third period are all the same as the corresponding frame intervals of the first frame rate.
  • the time intervals of the first period, the second period and the third period are all 8.3ms.
  • the frame rate switching method of the embodiment of the present application will be described in detail below with reference to FIG. 8 , taking the switching of an electronic device from 120 Hz to 90 Hz as an example.
  • the frame rate of the electronic device is 120Hz when it is between 0ms-41.5ms, and the frame rate is 90Hz when it is between 41.5ms-63.7ms.
  • Each moment corresponds to a VsyncID.
  • the corresponding VsyncID is 1 at 0 ms, 2 at 8.3 ms, and so on.
  • the application thread receives Vsync messages from the Vsync thread, each carrying the corresponding VsyncID and frame interval. After receiving a Vsync message, the application thread draws and renders the corresponding image frame.
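The VsyncID-to-timestamp mapping described above (VsyncID 1 at 0 ms, VsyncID 2 at 8.3 ms, and so on) can be sketched as a hypothetical helper, assuming a constant 120 Hz interval before the switch:

```python
def vsync_timestamp_ms(vsync_id: int, interval_ms: float = 8.3) -> float:
    # VsyncID 1 corresponds to 0 ms; each subsequent ID adds one
    # frame interval (8.3 ms at 120 Hz).
    return round((vsync_id - 1) * interval_ms, 1)
```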
  • Fig. 8 includes steps S1001 to S1007, and these steps will be described in detail below.
  • the application thread draws and renders the first image frame at a frame interval corresponding to 120 Hz between 0 ms and 8.3 ms.
  • the application thread draws and renders the first image frame based on the frame interval of 8.3ms.
  • the application thread requests the compositing thread to switch the frame rate to 90Hz between 0ms-8.3ms.
  • the composition thread notifies the Vsync thread to modify the frame interval in the next Vsync message to 11.1ms between 0ms-8.3ms.
  • the Vsync thread sends the first Vsync message to the application thread at 16.6ms.
  • the frame interval carried in the first Vsync message is 11.1 ms.
  • in each cycle, the Vsync thread sends a Vsync message to the application thread.
  • the Vsync thread sends a Vsync message at 8.3ms in which the frame interval is still 8.3ms, unchanged, while in the Vsync message sent at 16.6ms the frame interval is modified to 11.1 ms.
  • the application thread draws and renders the second image frame at a frame interval corresponding to 90 Hz.
  • the application thread specifically draws and renders the second image frame based on the frame interval of 11.1ms.
  • the synthesis thread calls the function performSetActiveMode to wake up the hardware synthesizer thread to switch the frame rate from 120Hz to 90Hz.
  • the synthesis thread can start or wake up the hardware synthesizer by calling the function performSetActiveMode, so that the hardware synthesizer switches the frame rate from 120Hz to 90Hz.
  • after being woken up, the hardware synthesizer can control the hardware integration unit to switch the frame rate from 120Hz to 90Hz, and the hardware integration unit completes the switch at 52.6ms.
  • the synthesis thread calls the setDuration function to set a period parameter corresponding to 90 Hz to the Vsync thread, so that the Vsync thread sends a Vsync message at 90 Hz.
  • the synthesis thread can transfer the period parameter corresponding to 90Hz (specifically, the frame interval corresponding to 90Hz is 11.1ms) to the Vsync thread, so that the Vsync thread sends a Vsync message at a frequency of 90Hz.
  • the composition thread can notify the Vsync thread to switch the software cycle to 90Hz during 33.2ms-52.6ms to complete the frequency switching.
  • the timer will reset the timing time based on the 41.5ms timestamp and the new frame interval, so that the timer wakes up the Vsync thread according to the new timing time (a timing time matching 90Hz).
  • the compositing thread can switch the frame rate from 120Hz to 90Hz at 52.6ms by calling two different functions to synchronize the hardware integrated unit and the Vsync thread.
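The two-sided switch can be illustrated with a toy model. The names below mirror the functions named in the text, but this is a hypothetical sketch, not the Android SurfaceFlinger API:

```python
class VsyncThread:
    """Toy stand-in for the software Vsync side."""
    def __init__(self, interval_ms: float):
        self.interval_ms = interval_ms

    def set_duration(self, interval_ms: float):
        # Analogue of setDuration: switch the software Vsync period.
        self.interval_ms = interval_ms


class HardwareCompositor:
    """Toy stand-in for the hardware side."""
    def __init__(self, rate_hz: int):
        self.rate_hz = rate_hz

    def perform_set_active_mode(self, rate_hz: int):
        # Analogue of performSetActiveMode: switch the panel refresh rate.
        self.rate_hz = rate_hz


def switch_frame_rate(vsync: VsyncThread, hwc: HardwareCompositor, target_hz: int):
    # The compositing thread makes two calls so that the hardware
    # integration unit and the Vsync thread switch in step.
    hwc.perform_set_active_mode(target_hz)
    vsync.set_duration(round(1000.0 / target_hz, 1))
```

The point of the sketch is only that one caller (the compositing thread) drives both sides, so hardware refresh rate and software Vsync period change together.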
  • the time stamp of Vsync, the identification ID of each Vsync-APP, the identification ID of each Vsync-SF, and Buffer information are marked on the top of the drawing.
  • the frame rate of the electronic device needs to be switched from 120Hz to 90Hz.
  • the application main thread sends a frame cutting request (requesting a frame rate switch from 120Hz to 90Hz) to the synthesis thread. After receiving the frame cutting request message, the synthesis thread notifies the Vsync thread to modify in advance the frame interval used for drawing and rendering image frames, and the image frames whose frame interval was adjusted in advance are displayed at the switched frame rate, so that the rendering pace of these image frames stays consistent with the sliding speed at display time, avoiding a jump in sliding speed when the images are displayed.
  • the application main thread sends a frame cutting request message to the compositing thread.
  • the above frame cutting request message carries the VsyncId, frameNumber, target frame rate and PID of the current frame.
  • the application main thread sends a frame cutting request message to the compositing thread during 0ms-8.3ms.
  • the image frame that needs to be rendered in the current cycle is frame 4.
  • the compositing thread sends a frame interval modification notification message.
  • the synthesis thread sends a frame interval modification notification message to the Vsync thread, so as to notify the Vsync thread to modify the frame interval in the Vsync message in advance.
  • the frame rate of the electronic device is 120 Hz before the frame rate switching, and the corresponding frame interval is 8.3 ms.
  • the compositing thread learns that the target frame rate is 90 Hz, and the target frame interval corresponding to the target frame rate is 11.1 ms (11.1 ms is the reciprocal of 90 Hz). Therefore, the compositing thread needs to notify the Vsync thread to modify the frame interval to the target frame interval of 11.1 ms in the next Vsync message following the Vsync message corresponding to the frame cutting request initiated by the application main thread.
  • the compositing thread notifies the Vsync thread to change the frame interval to the target frame interval of 11.1 ms in the Vsync message with the ID of N+2.
  • the Vsync-app signal may also be called a Vsync message, and the Vsync message shall prevail in the following introduction.
  • the Vsync thread can wake itself up by setting a timer to fire at 16.6ms. After being woken up, the Vsync thread sends a Vsync message to the application main thread.
  • the VsyncID carried in the Vsync message is 3, the frame interval is 11.1ms, and the timestamp is 16.6ms.
  • the frame interval in each Vsync message is calculated according to the current frame rate, and the frame interval is the reciprocal of the current frame rate.
  • the frame interval used for drawing and rendering of frame 6 is modified in advance to 1/target frame rate. Therefore, the frame interval used for drawing and rendering of frame 6 is the same as the frame interval used for displaying frame 6, so as to prevent the speed jump of the image frame before and after frame rate switching when displaying.
  • the frame interval used when frame 6 is displayed is not the duration for which frame 6 stays on screen, but can be considered the frame interval at which the human eye perceives the display speed of frame 6. As shown in Figure 9, the time interval corresponding to the display of frame 6 is 41.5ms-52.6ms, i.e. 11.1ms.
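Why adjusting the rendering interval in advance avoids a perceived speed jump can be sketched numerically. In a scrolling animation, each frame advances by speed times frame interval; the sliding speed value below is hypothetical, chosen only for illustration:

```python
def per_frame_displacement(speed_px_per_ms: float, interval_ms: float) -> float:
    # A scrolling animation advances by speed x interval per frame.
    # If frame 6 is rendered with the post-switch 11.1 ms interval and
    # is then shown for 11.1 ms, its displacement matches its on-screen
    # time, so the perceived sliding speed stays constant across the
    # switch.
    return round(speed_px_per_ms * interval_ms, 2)
```

Rendering frame 6 with the old 8.3 ms interval but displaying it for 11.1 ms would make that frame appear to move slower than its neighbors, which is exactly the jump this method avoids.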
  • the compositing thread sends a frame rate switching notification message.
  • in step S3, the synthesis thread decides to switch the frame rate from 120 Hz to 90 Hz, and sends a frame rate switching notification message to the hardware synthesizer and the Vsync thread, so as to realize the frame rate switching.
  • after the synthesis thread receives the frame cutting request message sent by the application main thread, when compositing an image frame it judges whether the frameNumber of the image frame to be synthesized is the same as the frameNumber carried when the application main thread initiated the frame cutting request; if they are the same, it decides to switch the frame rate and sends a frame rate switching notification message to the hardware synthesizer and the Vsync thread.
  • the frameNumber of the image frame to be synthesized by the compositing thread is 4, and the frameNumber carried in the frame cutting request initiated by the application main thread is also 4. Therefore, during the period of 24.9ms-33.2ms, the compositing thread decides to switch the frame rate from the original 120Hz to 90Hz.
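The decision rule in step S3 reduces to a frameNumber comparison. A minimal sketch (the function name is hypothetical):

```python
def should_switch_frame_rate(pending_frame_number: int,
                             requested_frame_number: int) -> bool:
    # The compositing thread decides to switch only when the frame it
    # is about to composite is the one named in the frame cutting
    # request sent by the application main thread.
    return pending_frame_number == requested_frame_number
```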
  • the electronic device completes the switching of the frame rate at 52.6ms.
  • the compositing thread notifies the Vsync thread to switch the software cycle to a new cycle (the new cycle corresponds to the new frame rate) during 33.2ms-52.6ms to complete the frequency switching.
  • the timer will reset the timing time based on the 41.5ms timestamp and the new frame interval, so that the timer wakes up the Vsync thread according to the new timing time (a timing time that matches the target frame rate).
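The timer re-arming described above amounts to: next wake-up = last Vsync timestamp + new frame interval. A hypothetical helper using the document's example values:

```python
def next_wakeup_ms(last_vsync_ts_ms: float, new_interval_ms: float) -> float:
    # After the software switch, the timer resets its deadline from the
    # most recent Vsync timestamp (41.5 ms in the example) plus the new
    # 90 Hz frame interval, landing at 52.6 ms.
    return round(last_vsync_ts_ms + new_interval_ms, 1)
```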
  • the application main thread may also send a frame cutting request to the compositing thread in a time interval before or after 0ms-8.3ms. That is to say, the VsyncId carried in the frame cutting request may or may not be the VsyncId corresponding to the initial moment of the time interval in which the frame cutting request message is sent; this application does not limit this. The following examples illustrate this.
  • the application main thread can also send a frame cutting request message to the synthesis thread within the time interval of 8.3ms-16.6ms, with the VsyncId carried in the frame cutting request message being 2.
  • the synthesis thread may notify the Vsync thread to modify the frame interval carried in the Vsync message whose VsyncId follows the VsyncId carried in the frame cutting request, that is, the frame interval carried in the Vsync message whose VsyncId is 3.
  • the compositing thread still decides to cut frames between 24.9ms-33.2ms.
  • the application main thread can also send a frame cutting request message to the synthesis thread between 0ms-8.3ms, with the VsyncId carried in the frame cutting request message being 2.
  • the synthesis thread may notify the Vsync thread to modify the frame interval carried in the Vsync message whose VsyncId follows the VsyncId carried in the frame cutting request, that is, the frame interval carried in the Vsync message whose VsyncId is 3.
  • the compositing thread still decides to cut frames between 24.9ms-33.2ms.
  • the application main thread can also send a frame cutting request message to the synthesis thread between 8.3ms-16.6ms, with the VsyncId carried in the frame cutting request message being 1.
  • the synthesis thread can notify the Vsync thread to modify the frame interval carried in a Vsync message after the VsyncId carried in the frame cutting request, that is, the frame interval carried in the Vsync message whose VsyncId is 3.
  • the compositing thread still decides to cut frames between 24.9ms-33.2ms.
  • the timing of sending the frame cutting request message can be set relatively flexibly.
  • as long as the composition thread sends the frame interval modification notification message to the Vsync thread at an appropriate time, and sends the frame rate switching notification message to the hardware compositor and the Vsync thread, the image frames whose frame interval was changed during rendering can be displayed at the target frame rate.
  • FIG. 10 is a schematic diagram of the interaction between various modules during the frame rate switching process of the frame rate switching method provided in the embodiment of the present application.
  • Fig. 10 corresponds to the process shown in Fig. 9 above, and Fig. 10 also includes S1 to S3, and S1 to S3 in Fig. 10 will be introduced below.
  • the application main thread sends a frame cutting request message to the compositing thread between 0 ms and 8.3 ms, requesting frame cutting.
  • the frame cut request message in S1 is used to request the electronic device to switch from the current 120Hz to 90Hz.
  • the synthesis thread sends a frame interval modification notification message to the Vsync thread between 0ms and 8.3ms.
  • after receiving the frame cutting request message from the application main thread, the compositing thread decides to notify the Vsync thread to modify the frame interval in advance.
  • the Vsync thread can modify the frame interval to 11.1 ms in the Vsync message of the next VsyncId according to the pre-agreed rule.
  • the pre-agreed rule here may mean that the Vsync thread modifies the frame interval in the Vsync message corresponding to the next VsyncId of the VsyncId carried in the frame interval modification notification message, so that the modified frame interval corresponds to the target frame rate.
  • the pre-agreed rules here can be agreed in advance between the Vsync thread and the synthesis thread, so that after receiving the frame interval modification notification message the Vsync thread can modify the frame interval in the Vsync message according to the agreed rules.
  • the synthesis thread sends a frame rate switching notification message to the hardware synthesizer and the Vsync thread between 24.9ms-33.2ms.
  • FIG. 11 is a schematic diagram of a frame rate switching method according to an embodiment of the present application.
  • the frame rate of the electronic device is also switched from 120 Hz to 90 Hz, and the frame rate switching process shown in FIG. 11 also includes steps S1 to S3 .
  • the following mainly introduces S1 to S3 in detail.
  • the application main thread sends a frame cutting request message to the compositing thread.
  • the compositing thread sends a frame interval modification notification message.
  • the synthesis thread sends a frame interval modification notification message to the Vsync thread, so as to notify the Vsync thread to modify the frame interval in the Vsync message in advance.
  • the frame rate of the electronic device before the frame rate switching is 120 Hz, and the corresponding frame interval is 8.3 ms.
  • the compositing thread learns that the target frame rate is 90 Hz, and the target frame interval corresponding to the target frame rate is 11.1 ms. Therefore, the compositing thread needs to notify the Vsync thread to modify the frame interval to the target frame interval of 11.1 ms in the next Vsync message following the Vsync message corresponding to the frame cutting request initiated by the application main thread.
  • the composition thread needs to notify the Vsync thread to change the frame interval to the target frame interval of 11.1 ms in the Vsync message with the ID of N+2.
  • after the compositing thread receives the frame cutting request message, during the period of 16.6ms-24.9ms the application main thread requests the next Vsync message from the Vsync thread, with the request carrying the current process ID. The Vsync thread calculates the ID of the next Vsync message as 4; the queue data of Vsync messages is stored inside the Vsync thread, and the Vsync thread judges that the current Vsync message ID of 4 meets the N+2 requirement. Therefore, in the Vsync messages of frame 6 and subsequent image frames, the Vsync thread modifies the frame interval to 11.1 ms.
  • the compositing thread sends a frame rate switching notification message.
  • after the synthesis thread receives the frame cutting request message sent by the application main thread, when compositing an image frame it judges whether the frameNumber of the image frame to be synthesized is the same as the frameNumber carried when the application main thread initiated the frame cutting request; if they are the same, it decides to switch the frame rate.
  • the frameNumber of the image frame to be synthesized by the compositing thread is 4, and the frameNumber carried in the frame cutting request initiated by the application main thread is also 4. Therefore, during the period of 24.9ms-33.2ms, the compositing thread decides to switch the frame rate from the original 120Hz to 90Hz.
  • the frame rate switching is realized at 52.6 ms.
  • the frequencies of the hardware Vsync signal and the software Vsync signal have also already been switched to 90Hz.
  • after the frame rate control system decides to switch the frame rate between 24.9ms-33.2ms, it notifies the hardware synthesizer to control the hardware integration unit to switch the frame rate from 120Hz to 90Hz, and the hardware integration unit completes the switch at 60.9ms.
  • the compositing thread notifies the Vsync thread to switch the software cycle to a new cycle (the new cycle corresponds to the new frame rate) during 33.2ms-52.6ms to complete the frequency switching.
  • the timer will reset the timing time based on the 41.5ms timestamp and the new frame interval, so that the timer wakes up the Vsync thread according to the new timing time (a timing time that matches the target frame rate).
  • the application main thread sends a frame cutting request message to the compositing thread between 8.3ms and 16.6ms, requesting frame cutting.
  • the frame cut request message in S1 is used to request the electronic device to switch from the current 120Hz to 90Hz.
  • the synthesis thread sends a frame interval modification notification message to the Vsync thread between 8.3ms and 16.6ms.
  • the Vsync thread can modify the frame interval to 11.1 ms in the Vsync message of the next VsyncId according to the pre-agreed rule.
  • the pre-agreed rule here may mean that the Vsync thread modifies the frame interval in the Vsync message corresponding to the next VsyncId of the VsyncId carried in the frame interval modification notification message, so that the modified frame interval corresponds to the target frame rate.
  • the pre-agreed rules here can be agreed in advance between the Vsync thread and the synthesis thread, so that after receiving the frame interval modification notification message the Vsync thread can modify the frame interval in the Vsync message according to the agreed rules.
  • FIG. 13 is a schematic diagram of a frame rate switching method according to an embodiment of the present application.
  • the application main thread sends a frame cutting request message to the compositing thread.
  • the frame cutting request message may carry the VsyncId, frameNumber, target frame rate, and PID of the current frame.
  • the application main thread sends a frame cut request message to the compositing thread within a period of 16.6ms-24.9ms.
  • the current frame is frame 4, and the electronic device wants to switch the frame rate from 120Hz to 90Hz.
  • the compositing thread sends a frame interval modification notification message.
  • the frame rate of the electronic device before the frame rate switching is 120 Hz, and the corresponding frame interval is 8.3 ms.
  • the compositing thread learns that the target frame rate is 90Hz, and the target frame interval corresponding to the target frame rate is 11.1ms. Therefore, the compositing thread needs to notify the Vsync thread to modify the frame interval to the target frame interval of 11.1 ms in the next Vsync message following the Vsync message corresponding to the frame cutting request initiated by the application main thread.
  • the compositing thread needs to notify the Vsync thread to change the frame interval to the target frame interval of 11.1 ms in the Vsync message with the ID of N+2.
  • after the compositing thread receives the frame cutting request message, during the period of 24.9ms-33.2ms the application main thread requests the next Vsync message from the Vsync thread, with the request carrying the current process ID. The Vsync thread calculates the ID of the next Vsync message as 5; the queue data of Vsync messages is stored inside the Vsync thread, and the Vsync thread judges that the current Vsync message ID of 5 meets the N+2 requirement. Therefore, in the Vsync message whose VsyncID is 5 and in those of subsequent image frames, the Vsync thread modifies the frame interval to 11.1 ms.
  • the Vsync thread can wake itself up by setting a timer to fire at 33.2ms. After being woken up, the Vsync thread sends a Vsync message to the application main thread.
  • the VsyncID carried in the Vsync message is 5, and the frame interval is 11.1ms.
  • the rendering thread performs rendering processing on frame 6 according to the displacement amount of 1.3 Pixel, and compositing processing is performed in the next cycle after the rendering is completed.
  • the compositing thread sends a frame rate switching notification message.
  • in step S3, the synthesis thread decides to switch the frame rate from 120 Hz to 90 Hz, and sends a frame rate switching notification message to the hardware synthesizer and the Vsync thread, so as to realize the frame rate switching.
  • after the synthesis thread receives the frame cutting request sent by the application main thread, when compositing an image frame it judges whether the frameNumber of the image frame to be synthesized is the same as the frameNumber carried when the application main thread initiated the frame cutting request; if they are the same, it decides to switch the frame rate.
  • the frameNumber of the image frame to be synthesized by the compositing thread is 4, and the frameNumber carried in the frame cutting request initiated by the application main thread is also 4. Therefore, during the period of 24.9ms-33.2ms, the compositing thread decides to switch the frame rate from the original 120Hz to 90Hz.
  • the electronic device completes the switching of the frame rate at 52.6ms.
  • the frame rate switch is realized at 52.6ms; at this time, the frequencies of the hardware Vsync signal and the software Vsync signal have also been switched to 90Hz.
  • after the frame rate control system decides to switch the frame rate between 24.9ms-33.2ms, it notifies the hardware synthesizer to control the hardware integration unit to switch the frame rate from 120Hz to 90Hz, and the hardware integration unit completes the switch at 60.9ms.
  • the compositing thread notifies the Vsync thread to switch the software cycle to a new cycle (the new cycle corresponds to the new frame rate) during 33.2ms-52.6ms to complete the frequency switching.
  • the timer will reset the timing time based on the 41.5ms timestamp and the new frame interval, so that the timer wakes up the Vsync thread according to the new timing time (a timing time that matches the target frame rate).
  • FIG. 14 is a schematic diagram of the interaction between various modules during the frame rate switching process of the frame rate switching method provided by the embodiment of the present application.
  • the application main thread sends a frame cutting request message to the compositing thread between 16.6ms and 24.9ms, requesting frame cutting.
  • the frame cut request message in S1 is used to request the electronic device to switch from the current 120Hz to 90Hz.
  • the synthesis thread sends a frame interval modification notification message to the Vsync thread between 16.6ms and 24.9ms.
  • after receiving the frame cutting request message from the application main thread, the compositing thread decides to notify the Vsync thread to modify the frame interval in advance.
  • the Vsync thread can modify the frame interval to 11.1 ms in the Vsync message of the next VsyncId according to the pre-agreed rule.
  • the pre-agreed rule here may mean that the Vsync thread modifies the frame interval in the Vsync message corresponding to the next VsyncId of the VsyncId carried in the frame interval modification notification message, so that the modified frame interval corresponds to the target frame rate.
  • the pre-agreed rules here can be agreed in advance between the Vsync thread and the synthesis thread, so that after receiving the frame interval modification notification message the Vsync thread can modify the frame interval in the Vsync message according to the agreed rules.
  • the synthesis thread sends a frame rate switching notification message to the hardware synthesizer and the Vsync thread between 24.9ms-33.2ms.
  • the difference between S1 and S3 in Figure 9 is two cycle intervals;
  • the difference between S1 and S3 in Figure 11 is one cycle interval;
  • the difference between S1 and S3 in Figure 13 is zero cycle intervals (the time interval in which S1 occurs is adjacent to the time interval in which S3 occurs).
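The three cases compare as follows: S3 lands in a fixed cycle (the one in which the requested frame is composited), so the number of intervening cycles shrinks as the request is sent later. A sketch, assuming cycle indices count from 0 at 0 ms, so that the 24.9ms-33.2ms cycle has index 3:

```python
def cycles_between(request_cycle: int, decision_cycle: int = 3) -> int:
    # S3 occurs in the 24.9ms-33.2ms cycle (index 3) in all three
    # figures; the gap to S1 is the number of whole cycles in between.
    return decision_cycle - request_cycle - 1
```

With the request in the 0ms-8.3ms cycle (Figure 9) the gap is two cycles, in the 8.3ms-16.6ms cycle (Figure 11) one cycle, and in the 16.6ms-24.9ms cycle (Figure 13) zero cycles.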
  • the computer-executed instructions in the embodiments of the present application may also be referred to as application program codes, which is not specifically limited in the embodiments of the present application.
  • the frame rate switching device provided in the embodiment of the present application is used to implement the frame rate switching method in the above embodiment, and the technical principles and technical effects are similar, so they will not be repeated here.
  • An embodiment of the present application provides an electronic device, the structure of which is shown in FIG. 1 .
  • the memory of the electronic device can be used to store at least one program instruction, and the processor is used to execute the at least one program instruction, so as to realize the technical solutions of the above method embodiments. Its implementation principle and technical effect are similar to those of the related embodiments of the method above, and will not be repeated here.
  • An embodiment of the present application provides a chip.
  • the chip includes a processor, and the processor is used to call the computer program in the memory to execute the technical solutions in the above embodiments. Its implementation principle and technical effect are similar to those of the above-mentioned related embodiments, and will not be repeated here.
  • An embodiment of the present application provides a computer program product, which enables the electronic device to execute the technical solutions in the foregoing embodiments when the computer program product is running on the electronic device. Its implementation principle and technical effect are similar to those of the above-mentioned related embodiments, and will not be repeated here.
  • An embodiment of the present application provides a computer-readable storage medium, on which program instructions are stored.
  • the program instructions are executed by an electronic device, the electronic device executes the technical solutions of the foregoing embodiments. Its implementation principle and technical effect are similar to those of the above-mentioned related embodiments, and will not be repeated here.
  • Embodiments of the present application are described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to the embodiments of the present application. It should be understood that each procedure and/or block in the flowcharts and/or block diagrams, and combinations of procedures and/or blocks therein, can be realized by computer program instructions. These computer program instructions may be provided to a processing unit of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing equipment to produce a machine, such that the instructions executed by the processing unit produce an apparatus for realizing the functions specified in one or more procedures of the flowchart and/or one or more blocks of the block diagram.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Television Systems (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

A frame rate switching method and apparatus, applied in the field of terminal technology. The method includes: an application thread draws and renders a first image frame in a first cycle based on a frame interval corresponding to a first frame rate; the application thread draws and renders a second image frame in a second cycle based on a frame interval corresponding to a second frame rate, where the second cycle is located after the first cycle and the second frame rate is different from the first frame rate; a compositing thread sends a first frame cutting request to a hardware compositor in a third cycle, where the third cycle is located after the second cycle or coincides with the second cycle; and the hardware compositor switches from the first frame rate to the second frame rate based on the first frame cutting request, so that the second image frame is displayed at the second frame rate. This method can solve the problem of a speed jump appearing in the display of image frames on an electronic device due to frame rate switching.

Description

帧率切换方法及装置
本申请要求于2021年12月29日提交国家知识产权局、申请号为202111650488.9、发明名称为“降低刷新率时画面稳定的方法和装置”的中国专利申请的优先权,以及于2022年02月28日提交国家知识产权局、申请号为202210191921.5、发明名称为“帧率切换方法及装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及终端技术领域,尤其涉及帧率切换方法及装置。
背景技术
随着电子设备尤其是终端设备的发展,用户可以通过终端设备的显示屏来查看越来越多种类的内容,当显示屏需要显示的内容较多的时候,用户可以通过在显示屏上滑动操作来查看相关内容。
在某些情况下,为了降低系统负载或者提高用户体验,终端设备需要进行帧率的切换。但是在帧率切换过程中可能会导致图像帧在显示的时候出现滑动速度的跳跃,进而导致画面卡顿,用户体验不佳。
发明内容
本申请实施例提供了一种帧率切换方法及装置,以解决电子设备在显示图像帧时,由于帧率切换而导致的图像帧滑动速度跳跃的问题。
第一方面,本申请实施例提出一种帧率切换方法,该方法包括:应用线程在第一周期基于第一帧率对应的帧间隔绘制和渲染第一图像帧;应用线程在第二周期基于第二帧率对应的帧间隔绘制和渲染第二图像帧;合成线程在第三周期向硬件合成器发送第一切帧请求;硬件合成器基于第一切帧请求从第一帧率切换到第二帧率,以使得第二图像帧以第二帧率显示。
其中,上述第二周期位于第一周期之后,第二帧率与第一帧率不同。
此外,上述第三周期位于第二周期之后,或者,第三周期与第二周期重合。
上述方法可以应用于电子设备,通过上述方法可以将电子设备从第一帧率切换到第二帧率。
上述第二帧率可以大于第一帧率,上述第二帧率也可以小于第一帧率。上述第二帧率与第一帧率可以是整数倍关系或者非整数倍关系。
上述应用线程在第一周期基于第一帧率对应的帧间隔绘制和渲染第一图像帧可以是指应用线程在第一周期内的合适时间基于第一帧率对应的帧间隔绘制和渲染第一图像帧,本申请对应用线程在第一周期内的什么时间段来对第一图像帧进行绘制和渲染处理不做限定。
类似的,上述应用线程在第二周期基于第二帧率对应的帧间隔绘制和渲染第二图像帧可以是指应用线程在第二周期内的合适时间基于第二帧率对应的帧间隔绘制和渲染第二图像帧,本申请对应用线程在第二周期内的什么时间段来对第二图像帧进行绘制和渲染处理不做限定。
可以理解的是,上述第一周期,第二周期和第三周期的时间间隔内并未实现从第一帧率到第二帧率的切换,此时帧率还是第一帧率,只是从第二周期开始提前基于第二帧率对应的帧间隔对第二图像帧进行绘制渲染处理。
上述第二周期位于第一周期之后具体可以是指第二周期的起始时刻位于第一周期的结束时刻之后,或者第二周期的起始时刻与第一周期的结束时刻重合。
上述第三周期与第二周期重合可以是指第三周期的起始时刻与第二周期的起始时刻重合,并且第三周期的结束时刻也与第二周期的结束时刻重合。
本申请中,通过提前采用第二帧率对应的帧间隔对图像帧进行绘制和渲染处理,并在之后通知硬件合成器进行帧率的切换,使得采用第二帧率对应的帧间隔进行绘制和渲染处理的图像帧能够以第二帧率显示,这样图像帧绘制渲染的节奏能够与显示时的帧率匹配,从而使得图像帧在显示的时候不会出现滑动速度的跳跃。
在一种可能的实现方式中,合成线程在第三周期向硬件合成器线程发送第一切帧请求,包括:合成线程在第三周期调用函数performSetActiveMode,以唤醒硬件合成器线程将帧率从第一帧率切换到第二帧率。
本申请实施例中,合成线程通过调用专用的函数能够唤醒硬件合成器控制帧率的切换。
硬件合成器在被唤醒后可以控制硬件集成单元进行帧率的切换,使得硬件集成单元从第一帧率切换到第二帧率,从而实现了在硬件上的帧率切换。
在一种可能的实现方式中,上述方法还包括:合成线程在第四周期向Vsync线程发送第二切帧请求;Vsync线程基于第二切帧请求以第二帧率向应用线程发送Vsync消息。
上述第四周期位于第二周期之后,或者第四周期与第二周期重合。
在一种可能的实现方式中,上述合成线程在第四周期向Vsync线程发送第二切帧请求,包括:合成线程在第四周期调用setDuration函数将第二帧率对应的周期参数设置到Vsync线程,以使得Vsync线程以第二帧率向应用线程发送Vsync消息,从而在软件上实现了帧率切换。
本申请实施例中,合成线程通过调用专用的函数能够使得Vsync线程也实现帧率的切换,从而以新的第二帧率向应用线程发送Vsync消息,从而实现了在软件上的帧率切换。
在一种可能的实现方式中,在应用线程对应的缓存队列中堆叠的Buffer(缓存)数量为N的情况下,第三周期位于第二周期之后,其中,N为大于或者等于1的整数。
当缓存队列中的堆叠的Buffer数量为1的时候,第三周期位于第二周期之后,且第三周期与第二周期相邻。而当缓存队列中堆叠的Buffer数量大于1的时候,第三周期位于第二周期之后,但是第三周期与第二周期不相邻,第三周期与第二周期之间存在一定的时间间隔,并且当缓存队列中堆叠的Buffer数量越多的时候,第三周期与第二周期之间相隔的时间间隔越长。
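上述Buffer堆叠数量与第三周期位置之间的对应关系,可以用下面一段示意性的Python代码来概括(仅为帮助理解的草图,其中的函数名和周期编号方式均为本文示例中的假设,并非实际实现):

```python
def third_period_index(second_period_index: int, stacked_buffers: int) -> int:
    """根据缓存队列中堆叠的Buffer数量,推算第三周期的位置。

    假设:堆叠的Buffer数量为0时,第三周期与第二周期重合;
    每多堆叠一个Buffer,第三周期就比第二周期再靠后一个周期。
    """
    if stacked_buffers < 0:
        raise ValueError("Buffer数量不能为负")
    return second_period_index + stacked_buffers


# Buffer数量为0:第三周期与第二周期重合
assert third_period_index(2, 0) == 2
# Buffer数量为1:第三周期位于第二周期之后且与其相邻
assert third_period_index(2, 1) == 3
# Buffer数量为2:第三周期与第二周期之间相隔一个周期
assert third_period_index(2, 2) == 4
```

堆叠的Buffer越多,合成时机越靠后,第三周期与第二周期之间相隔的时间间隔也就越长,与正文的描述一致。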
在一种可能的实现方式中,在应用线程对应的缓存队列中堆叠的Buffer数量为0的情况下,上述第三周期与第二周期重合。
当缓存队列中堆叠的Buffer数量为0的时候,应用线程绘制和渲染图像帧与合成线程向硬件合成器线程发送第一切帧请求在同一个周期内进行(都是在第二周期进行)。
上述第四周期与第二周期之间的关系也与应用线程对应的缓存队列中堆叠的Buffer数量有关。
在一种可能的实现方式中,在应用线程对应的缓存队列中堆叠的Buffer数量为N的情况下,第四周期位于第二周期之后,其中,N为大于或者等于1的整数。
当缓存队列中的堆叠的Buffer数量为1的时候,第四周期位于第二周期之后,且第四周期与第二周期相邻。而当缓存队列中堆叠的Buffer数量大于1的时候,第四周期位于第二周期之后,但是第四周期与第二周期不相邻,第四周期与第二周期之间存在一定的时间间隔,并且当缓存队列中堆叠的Buffer数量越多的时候,第四周期与第二周期之间相隔的时间间隔越长。
在一种可能的实现方式中,在应用线程对应的缓存队列中堆叠的Buffer数量为0的情况下,上述第四周期与第二周期重合。
当缓存队列中堆叠的Buffer数量为0的时候,应用线程绘制和渲染图像帧与合成线程向Vsync线程发送第二切帧请求在同一个周期内进行(都是在第二周期进行)。
在一种可能的实现方式中,在应用线程在第二周期以第二帧率绘制和渲染第二图像帧之前,上述方法还包括:应用线程接收Vsync线程发送的第一Vsync消息,第一Vsync消息中携带有第二帧率对应的帧间隔。
本申请中,由于第一Vsync消息中携带的是第二帧率对应的帧间隔,因此,当应用线程在接收到第一Vsync消息后能够在第二周期根据该第二帧率对应的帧间隔对第二图像帧进行绘制和渲染处理。
在一种可能的实现方式中,上述应用线程接收Vsync线程发送的第一Vsync消息,包括:
应用线程在第二周期的起始时刻接收到Vsync线程发送的第一Vsync消息。
Vsync线程通过在第二周期的起始时刻向应用线程发送第一Vsync消息,能够使得应用线程在第二周期的起始时刻就能接收到该第一Vsync消息,并获取第一Vsync消息携带的帧间隔数据,进而根据该第一Vsync消息携带的帧间隔对第二图像帧进行绘制和渲染处理。
在一种可能的实现方式中,应用线程接收Vsync线程发送的第一Vsync消息之前,上述方法还包括:合成线程接收到来自应用线程的初始切帧请求;合成线程向Vsync线程发送帧间隔修改通知消息;Vsync线程基于帧间隔修改通知消息生成第一Vsync消息。
上述初始切帧请求可以携带要切换的目标帧率,这里的目标帧率其实就是第二帧率,因此,该初始切帧请求中可以携带第二帧率。在接收到该初始切帧请求后,合成线程基于该切帧请求生成帧间隔修改通知消息,该帧间隔修改通知消息用于通知Vsync线程对第一Vsync消息进行修改,使得第一Vsync消息携带第二帧率对应的帧间隔。
在一种可能的实现方式中,上述合成线程接收到来自应用线程的初始切帧请求,包括:合成线程在第一周期接收来自应用线程的初始切帧请求;上述合成线程向Vsync线程发送帧间隔修改通知消息,包括:合成线程在第一周期向Vsync线程发送帧间隔修改通知消息。
本申请中,合成线程在接收到初始切帧请求之后就在同样的周期内向Vsync线程发送帧间隔修改通知消息,为Vsync线程留出了足够的时间生成帧间隔为第二帧率对应的帧间隔的第一Vsync消息。
在一种可能的实现方式中,上述在合成线程在第三周期向硬件合成器发送第一切帧请求之前,上述方法还包括:合成线程将对第一图像帧进行合成处理的下一个周期确定为第三周期。
上述第一切帧请求可以携带第一图像帧的帧号(framenumber)和/或对应的VsyncID,合成线程在对图像帧进行合成处理时可以通过图像帧的帧号或者对应的VsyncID对图像帧进行识别,在确定当前待合成处理的图像帧为第一图像帧的时候将合成处理第一图像帧的下一个周期确定为第三周期,并在该第三周期向硬件合成器发送第一切帧请求。
在一种可能的实现方式中,合成线程根据帧号确定当前待合成处理的图像帧为第一图像帧时将下一个周期确定为第三周期。
具体地,第一图像帧的帧号为1,对应的VsyncID为1,则合成线程在对图像帧进行合成处理时要判断该图像帧的帧号或者对应的VsyncID,如果发现当前待合成处理的图像帧的帧号为1或者对应的VsyncID为1时,则将下一个周期确定为第三周期。
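合成线程根据帧号(或对应的VsyncID)识别第一图像帧、进而确定第三周期的判断逻辑,可以用如下示意性Python代码概括(函数名、周期编号方式均为本文假设,并非实际实现):

```python
def determine_third_period(compose_period: int, frame_number: int, vsync_id: int,
                           first_frame_number: int = 1, first_vsync_id: int = 1):
    """合成线程在合成每一帧时调用:
    帧号或VsyncID命中第一图像帧时,将合成该帧的下一个周期确定为第三周期;
    未命中时返回None,表示本周期不发送第一切帧请求。"""
    if frame_number == first_frame_number or vsync_id == first_vsync_id:
        return compose_period + 1
    return None


# 命中第一图像帧(帧号1/VsyncID 1):下一个周期即第三周期
assert determine_third_period(5, frame_number=1, vsync_id=1) == 6
# 未命中:不触发切帧
assert determine_third_period(5, frame_number=2, vsync_id=2) is None
```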
在一种可能的实现方式中,上述合成线程在第四周期向Vsync线程发送第二切帧请求之前,上述方法还包括:合成线程将对第一图像帧进行合成处理的下一个周期确定为第四周期。
上述第一切帧请求可以携带第一图像帧的帧号(framenumber)和/或对应的VsyncID,合成线程在对图像帧进行合成处理时可以通过图像帧的帧号或者对应的VsyncID对图像帧进行识别,在确定当前待合成处理的图像帧为第一图像帧的时候将合成处理第一图像帧的下一个周期确定为第四周期,并在该第四周期向Vsync线程发送第二切帧请求。
在一种可能的实现方式中,合成线程根据帧号确定当前待合成处理的图像帧为第一图像帧时将下一个周期确定为第四周期。
具体地,第一图像帧的帧号为1,对应的VsyncID为1,则合成线程在对图像帧进行合成处理时要判断该图像帧的帧号或者对应的VsyncID,如果发现当前待合成处理的图像帧的帧号为1或者对应的VsyncID为1时,将下一个周期确定为第四周期。
在一种可能的实现方式中,上述第一周期,第二周期以及第三周期的时间间隔大小均与第一帧率的对应的帧间隔大小相同。
上述第一周期,第二周期和第三周期对应于切换前的第一帧率,因此,上述第一周期,第二周期以及第三周期的时间间隔大小均与第一帧率对应的帧间隔大小相同。
在一种可能的实现方式中,上述第一周期,第二周期以及第四周期的时间间隔大小均与第一帧率的对应的帧间隔大小相同。
第三方面,本申请实施例提供了一种电子设备,电子设备也可以称为终端(terminal)、用户设备(user equipment,UE)、移动台(mobile station,MS)、移动终端(mobile terminal,MT)等。电子设备可以是手机(mobile phone)、智能电视、穿戴式设备、平板电脑(Pad)、带无线收发功能的电脑、虚拟现实(virtual reality,VR)电子设备、增强现实(augmented reality,AR)电子设备、工业控制(industrial control)中的无线终端、无人驾驶(self-driving)中的无线终端、远程手术(remote medical surgery)中的无线终端、智能电网(smart grid)中的无线终端、运输安全(transportation safety)中的无线终端、智慧城市(smart city)中的无线终端、智慧家庭(smart home)中的无线终端等等。
该电子设备包括处理器,处理器用于调用存储器中的计算机程序以执行如第一方面的方法。
第四方面,本申请实施例提供了一种计算机可读存储介质,计算机可读存储介质存储有计算机指令,当计算机指令在电子设备上运行时,使得电子设备执行如第一方面的方法。
第五方面,本申请实施例提供了一种计算机程序产品,当计算机程序被运行时,使得电子设备执行如第一方面的方法。
第六方面,本申请实施例提供了一种芯片,芯片包括处理器,处理器用于调用存储器中的计算机程序,以执行如第一方面的方法。
应当理解的是,本申请的第二方面至第六方面与本申请的第一方面的技术方案相对应,各方面及对应的可行实施方式所取得的有益效果相似,不再赘述。
附图说明
图1为本申请实施例提供的电子设备的硬件系统结构示意图;
图2为本申请实施例提供的电子设备的软件系统结构示意图;
图3为可能的实现方式中电子设备界面显示处理流程的示意图;
图4为本申请实施例提供的应用场景示意图;
图5为传统方案中电子设备的帧率切换过程的示意图;
图6为传统方案中电子设备的帧率切换过程的示意图;
图7为本申请实施例的帧率切换方法的帧率切换过程的示意图;
图8为本申请实施例的帧率切换方法的帧率切换过程的示意图;
图9为本申请实施例的帧率切换方法的帧率切换过程的示意图;
图10为本申请实施例的帧率切换方法的帧率切换过程的交互图;
图11为本申请实施例的帧率切换方法的帧率切换过程的示意图;
图12为本申请实施例的帧率切换方法的帧率切换过程的交互图;
图13为本申请实施例的帧率切换方法的帧率切换过程的示意图;
图14为本申请实施例的帧率切换方法的帧率切换过程的交互图。
具体实施方式
为了便于清楚描述本申请实施例的技术方案,在本申请的实施例中,采用了“第一”、“第二”等字样对功能和作用基本相同的相同项或相似项进行区分。例如,第一芯片和第二芯片仅仅是为了区分不同的芯片,并不对其先后顺序进行限定。本领域技术人员可以理解“第一”、“第二”等字样并不对数量和执行次序进行限定,并且“第一”、“第二”等字样也并不限定一定不同。
需要说明的是,本申请实施例中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其他实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。
本申请实施例中,“至少一个”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A,B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一项(个)”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b,或c中的至少一项(个),可以表示:a,b,c,a-b,a-c,b-c,或a-b-c,其中a,b,c可以是单个,也可以是多个。
本申请实施例提供的帧率切换方法,可以应用在具备显示功能的电子设备中。
电子设备包括终端设备,电子设备也可以称为终端(terminal)、用户设备(user equipment,UE)、移动台(mobile station,MS)、移动终端(mobile terminal,MT)等。电子设备可以是手机(mobile phone)、智能电视、穿戴式设备、平板电脑(Pad)、带无线收发功能的电脑、虚拟现实(virtual reality,VR)电子设备、增强现实(augmented reality,AR)电子设备、工业控制(industrial control)中的无线终端、无人驾驶(self-driving)中的无线终端、远程手术(remote medical surgery)中的无线终端、智能电网(smart grid)中的无线终端、运输安全(transportation safety)中的无线终端、智慧城市(smart city)中的无线终端、智慧家庭(smart home)中的无线终端等等。本申请的实施例对电子设备所采用的具体技术和具体设备形态不做限定。
为了能够更好地理解本申请实施例,下面对本申请实施例的电子设备的结构进行介绍:
图1示出了电子设备100的结构示意图。电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本申请实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从存储器中调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(serial clock line,SCL)。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现电子设备100的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现电子设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器, 也可以是有线充电器。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像、显示视频和接收滑动操作等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将电信号传递给ISP处理,转化为肉眼可见的图像。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,可执行程序代码包括指令。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。处理器110通过运行存储在内部存储器121的指令,和/或存储在设置于处理器中的存储器的指令,执行电子设备100的各种功能应用以及数据处理。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备100可以设置至少一个麦克风170C。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x、y和z轴)的角速度。
气压传感器180C用于测量气压。在一些实施例中,电子设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
磁传感器180D包括霍尔传感器。电子设备100可以利用磁传感器180D检测翻盖皮套的开合。在一些实施例中,当电子设备100是翻盖机时,电子设备100可以根据磁传感器180D检测翻盖的开合。进而根据检测到的皮套的开合状态或翻盖的开合状态,设置翻盖自动解锁等特性。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。
距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备100通过发光二极管向外发射红外光。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,电子设备100利用温度传感器180J检测的温度,执行温度处理策略。
触摸传感器180K,也称“触控器件”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。
电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构,等。本申请实施例以分层架构的Android(安卓)系统为例,示例性说明电子设备100的软件结构。
图2是本申请实施例的电子设备的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为五层,从上至下分别为应用层,应用程序框架层,安卓运行时(Android runtime)和系统库,硬件抽象层,以及内核层。
应用层可以包括一系列应用程序包。
如图2所示,应用程序包可以包括电话、邮箱、日历、相机等应用程序。
应用程序框架层为应用层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图2所示,应用程序框架层可以包括窗口管理器、帧率控制系统、图像合成系统、视图系统、包管理器、输入管理器、活动管理器和资源管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
帧率控制系统用于调节屏幕刷新率。
图像合成系统用于控制图像合成,以及产生垂直同步(vertical synchronization,Vsync)信号。
图像合成系统包括:合成线程、Vsync线程、缓存队列(queue buffer)线程。合成线程用于被Vsync信号唤醒进行合成。Vsync线程用于根据Vsync信号请求生成下一个Vsync信号。缓存队列线程用于存放缓存、产生Vsync信号请求,以及唤醒合成线程等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
包管理器用于系统内的程序管理,例如:应用程序安装、卸载和升级等。
输入管理器用于管理输入设备的程序。例如,输入系统可以确定鼠标点击操作、键盘输入操作和触摸滑动等输入操作。
活动管理器用于管理各个应用程序的生命周期以及导航回退功能。负责Android的主线程创建,各个应用程序的生命周期的维护。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
Android runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用层和应用程序框架层运行在虚拟机中。虚拟机将应用层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:图像渲染库、图像合成库、函数库、媒体库和输入处理库等。
图像渲染库用于二维或三维图像的渲染。图像合成库用于二维或三维图像的合成。
可能的实现方式中,应用通过图像渲染库对图像进行绘制渲染,然后应用将绘制渲染后的图像发送至图像合成系统的缓存队列中。每当Vsync信号到来时,图像合成系统(例如,surface flinger)从缓存队列中按顺序获取待合成的一帧图像,然后通过图像合成库进行图像合成。
函数库提供C语言中所使用的宏、类型定义、字符串操作函数、数学计算函数以及输入输出函数等。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4、H.264、MP3、AAC、AMR、JPG和PNG等。
输入处理库用于处理输入设备的库,可以实现鼠标、键盘和触摸输入处理等。
硬件抽象层,可以包含多个库模块,库模块如可以为硬件合成器(hwcomposer,HWC)、摄像头库模块等。Android系统可以为设备硬件加载相应的库模块,进而实现应用程序框架层访问设备硬件的目的。设备硬件可以包括如电子设备中的LCD显示屏、摄像头等。
内核层是硬件和软件之间的层。内核层至少包含触控(touch panel,TP)驱动、显示驱动、蓝牙驱动、WIFI驱动、键盘驱动、共用存储器驱动和相机驱动等。
硬件可以是音频设备、蓝牙设备、相机设备、传感器设备等。
下面结合应用程序启动或应用程序中发生界面切换的场景,示例性说明电子设备100软件以及硬件的工作流程。
当触控面板中触摸传感器180K接收到触摸操作时,内核层将触摸操作加工成原始输入事件(包括触摸坐标,触摸力度,触摸操作的时间戳等信息)。原始输入事件被存储在内核层。内核层通过输入处理库将原始输入事件上报至应用程序框架层的输入管理器。应用程序框架层的输入管理器解析该原始输入事件的信息(包括:操作类型和报点位置等),根据当前焦点确定焦点应用,并将解析后的信息发送至焦点应用。焦点可以是触摸操作中触碰点或者鼠标点击操作中点击位置。焦点应用为电子设备前台运行的应用或者触摸操作中触碰位置对应的应用。焦点应用根据解析后的原始输入事件的信息(例如,报点位置)确定该原始输入事件所对应的控件。
以该触摸操作是触摸滑动操作,该触摸滑动操作所对应的控件为微信应用的列表控件为例,微信应用通过应用程序框架层的视图系统,调用系统库中图像渲染库对图像进行绘制渲染。微信应用将绘制渲染后的图像发送至图像合成系统的缓存队列中。通过系统库中图像合成库将图像合成系统中绘制渲染后的图像合成为微信界面。图像合成系统通过内核层的显示驱动,使得屏幕(显示屏)显示微信应用的相应界面。
为了便于理解,示例的给出部分与本申请实施例相关概念的说明以供参考。
1、帧:是指界面显示中最小单位的单幅画面。一帧可以理解为一幅静止的画面,快速连续地显示多个相连的帧可以形成物体运动的假象。帧率是指在1秒钟时间里刷新图片的帧数,也可以理解为电子设备中图形处理器每秒钟刷新画面的次数。高的帧率可以得到更流畅和更逼真的动画。每秒钟帧数越多,所显示的动作就会越流畅。
需要说明的是,界面显示帧前通常需要经过绘制、渲染、合成等过程。
2、帧绘制:是指显示界面的图片绘制。显示界面可以由一个或多个视图组成,各个视图可以由视图系统的可视控件绘制,各个视图由子视图组成,一个子视图对应视图中的一个小部件,例如,其中的一个子视图对应图片视图中的一个符号。
3、帧渲染:是将绘制后的视图进行着色操作或增加3D效果等。例如:3D效果可以是灯光效果、阴影效果和纹理效果等。
4、帧合成:是将上述一个或多个渲染后的视图合成为显示界面的过程。
下面结合软件以及硬件对电子设备100的界面的显示过程进行说明。
需要说明的是,为了提高显示的流畅性,减少出现显示卡顿等现象,电子设备一般基于Vsync信号进行显示,以对图像的绘制、渲染、合成和屏幕刷新显示等流程进行同步。
可以理解的是,Vsync信号为周期性信号,Vsync信号周期可以根据屏幕刷新率进行设置,例如,屏幕刷新率为60Hz时,Vsync信号周期可以为16.6ms,即电子设备每间隔16.6ms生成一个控制信号使Vsync信号周期触发。
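“Vsync信号周期按屏幕刷新率设置”的换算可以用一段示意性的Python代码表示(按本文的写法将结果截断到0.1ms,函数名为假设):

```python
import math


def vsync_period_ms(refresh_rate_hz: float) -> float:
    # Vsync信号周期为刷新率的倒数;本文数值按截断到0.1ms的方式书写(如60Hz对应16.6ms)
    return math.floor(10000.0 / refresh_rate_hz) / 10.0


assert vsync_period_ms(60) == 16.6
assert vsync_period_ms(120) == 8.3
assert vsync_period_ms(90) == 11.1
```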
需要说明的是,Vsync信号可以分为软件Vsync信号和硬件Vsync信号。软件Vsync信号包括Vsync-APP和Vsync-SF。Vsync-APP用于触发绘制渲染流程。Vsync-SF用于触发合成流程。硬件Vsync信号(Vsync-HW)用于触发屏幕显示刷新流程。
通常情况下,软件Vsync信号和硬件Vsync信号保持周期同步。以60Hz和120Hz变化为例,若Vsync-HW从60Hz切换到120Hz,Vsync-APP、Vsync-SF同步变化,从60Hz切换到120Hz。
示例性的,图3为可能的实现中一种电子设备界面显示处理流程的示意图。按照时间顺序,电子设备显示的图像依次对应于帧1、帧2、和帧3。
具体的,以帧1的显示为例,电子设备的应用通过应用程序框架层的视图系统,对帧1进行绘制渲染。帧1绘制渲染完成后,电子设备的应用将绘制渲染好的帧1发送至图像合成系统(例如,surface flinger)。图像合成系统对绘制渲染好的帧1进行合成。帧1完成合成后,电子设备可以通过调用内核层启动显示驱动,在屏幕(显示屏)显示帧1对应的内容。帧2和帧3类似于帧1的过程也进行合成和显示,此处不再赘述。图3中每帧从绘制到显示,滞后2个Vsync信号周期,电子设备在显示图像帧的过程中具有滞后性。
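图3中“每帧从绘制到显示滞后2个Vsync信号周期”的节奏,可以用如下示意性Python代码表示(滞后周期数lag_periods=2沿用图3的情形,函数为帮助理解的草图):

```python
def display_time_ms(draw_vsync_index: int, period_ms: float, lag_periods: int = 2) -> float:
    # 在第draw_vsync_index个Vsync开始绘制渲染的帧,经过合成,
    # 要滞后lag_periods个Vsync周期才在屏幕上显示
    return round((draw_vsync_index + lag_periods) * period_ms, 1)


# 60Hz(周期16.6ms)下,第0个Vsync开始绘制的帧在33.2ms处显示
assert display_time_ms(0, 16.6) == 33.2
# 晚一个周期开始绘制的帧,也晚一个周期显示
assert display_time_ms(1, 16.6) == 49.8
```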
在某些情况下,如果电子设备的系统负载较大,则可以通过降低电子设备的屏幕刷新率来减少卡顿。在电子设备的系统画面静止、低帧率速度的视频场景下,也可以通过降低电子设备的屏幕刷新率来节约功耗。而在用户感知度较高的应用内滑动、应用切换、游戏等场景下,可以通过升高电子设备的刷新率来提升系统的流畅性,以提高用户体验。
本申请实施例的帧率切换方法可以应用在电子设备的多种应用场景,下面结合附图对本申请实施例的帧率切换方法的应用场景进行说明。
图4为本申请实施例的应用场景示意图。
电子设备可以在图4中的a所示的社交应用的界面,或在图4中的b所示的设置相关界面中,或者图4中的c所示的文档界面,或者图4中的d所示的商品浏览界面,等界面接收用户上滑操作或下滑操作。电子设备还可以在图4中的e所示的界面中,或者图4中的f所示的电子书界面,等接收到用户左滑操作或右滑动操作。当电子设备接收到用户的滑动操作时,电子设备基于滑动操作进行帧绘制、渲染、合成等过程,对滑动操作对应的内容进行显示。
在图4所示的多种滑动场景下,在某些情况下,当用户结束滑动后,为了降低帧率,电子设备可以从当前帧率切换到另一个较低的帧率,以降低电子设备的功耗。
在传统方案中,一般是在决策切帧后就进行帧率切换,并且在帧率切换完成后才更改图像帧绘制渲染处理时采用的帧间隔,这就使得一些图像帧绘制渲染处理时仍然采用的是旧的帧间隔,但是这些图像帧在显示的时候已经切换到新的帧率,这就导致一些图像帧在显示的时候会出现速度跳跃的情况,从而导致画面卡顿,使得用户体验不佳。
下面结合附图5对传统方案中电子设备的帧率切换流程进行简单的介绍。
在图5所示的示例中,电子设备的帧率要从120Hz切换到60Hz,其中,在0ms-8.3ms之间合成线程决策要进行帧率切换,经过两个周期(8.3ms-16.6ms和16.6ms-33.2ms),电子设备在33.2ms时实现了帧率从120Hz到60Hz的切换。
图5所示的帧率切换过程主要包括步骤S1和S2,下面对这两个步骤进行简单介绍。
S1,应用主线程向合成线程发送切帧请求消息。
上述切帧请求中携带当前帧的目标帧率和进程标识(process ID,PID)。
其中,目标帧率就是电子设备请求切换到的帧率,如图5所示,应用主线程在0ms-8.3ms的周期内向合成线程发送切帧请求消息,电子设备当前的帧率为120Hz,需要从当前帧率切换到的目标帧率为60Hz,那么,切帧请求中携带的目标帧率就为60Hz。
另外,切帧请求消息中的PID用于识别相应的进程,比如,电子设备当前显示的 是某个游戏的画面,那么,切帧请求消息中的PID用于识别该游戏的相关进程。
S2,合成线程向Vsync线程发送帧率切换通知消息。
如图5所示,合成线程在接收到应用主线程的切帧请求消息后,确定目标帧率为60Hz,因此决定将帧率从当前的120Hz切换到60Hz,并在0ms-8.3ms的周期内向硬件合成器和Vsync线程发送帧率切换通知消息,以实现从当前帧率到目标帧率的切换。
经过上述过程,在33.2ms时完成帧率的切换。
在合成线程发送帧率切换通知消息之后,需要再经过两个周期(8.3ms-16.6ms和16.6ms-33.2ms)完成帧率的切换,如图5所示,电子设备在33.2ms时实现了帧率切换,此时,硬件集成单元和软件Vsync信号的频率也已经切换到了60Hz。
具体地,合成线程在0ms-8.3ms接收到应用主线程发送的帧率切换请求消息后,决定进行切帧,接下来,合成线程向硬件合成器发送帧率切换通知消息,以使得硬件合成器控制硬件集成电路将帧率从120Hz切换到60Hz,硬件集成电路在33.2ms时完成帧率的切换。
此外,合成线程还向Vsync线程发送帧率切换通知消息,以通知Vsync线程在16.6ms-33.2ms期间将软件周期切换为新的周期(新的周期与新的帧率60Hz相对应),完成频率切换。在帧率切换完成之后,会对定时器基于16.6ms的时间戳及新的帧间隔(60Hz对应的帧间隔为16.6ms),重新设置定时时间,从而使得定时器后续根据新的定时时间(与目标帧率相匹配的定时时间)来唤醒Vsync线程。
下面对图5所示的帧率切换过程中画面帧在显示的时候出现速度跳跃的原因进行详细分析。
如图5所示,电子设备在33.2ms时完成了帧率从120Hz到60Hz的切换,帧7绘制渲染时采用的帧间隔已经修改成了16.6ms,渲染位移为2Pixel,显示时的位移量也为2Pixel,显示时对应的时间间隔为16.6ms,因此,帧7显示时的滑动速度为2Pixel/16.6ms=1Pixel/8.3ms。
而对于帧6来说,由于绘制渲染时采用的帧间隔为8.3ms,渲染位移为1Pixel,帧6显示时的位移增量也为1Pixel,但是显示时对应的时间间隔为16.6ms,因此,帧6显示时的滑动速度为1Pixel/16.6ms=0.5Pixel/8.3ms。类似的,帧3至帧5显示时的滑动速度也为0.5Pixel/8.3ms。
而对于帧2来说,由于绘制渲染时还未切换帧率,帧2绘制渲染时采用的帧间隔仍为8.3ms,渲染位移为1Pixel,显示时的位移量为1Pixel,显示时对应的时间间隔为8.3ms,因此,帧2显示时的滑动速度为1Pixel/8.3ms。类似的,帧0和帧1显示时的滑动速度也为1Pixel/8.3ms。
如图5所示,各个图像帧显示时的滑动速度如下:
帧0至帧2显示时的滑动速度均为1Pixel/8.3ms,帧3至帧6显示时的滑动速度均为0.5Pixel/8.3ms,帧7显示时的滑动速度为1Pixel/8.3ms。
因此,在帧2显示完毕到切换到帧3显示的过程中会出现滑动速度的跳跃(速度由1Pixel/8.3ms降低到0.5Pixel/8.3ms),而在帧6显示完毕到切换到帧7显示的过程中也会出现速度的跳跃(速度由0.5Pixel/8.3ms上升到1Pixel/8.3ms)。也就是说,在帧率从120Hz切换到60Hz前后,图像帧的滑动速度是先降低再升高,出现了速度的跳变。
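结合图5中的数值,上述各帧显示时滑动速度的计算可以用一段示意性的Python代码验证(sliding_speed为本文假设的函数名,速度统一换算为Pixel/8.3ms):

```python
def sliding_speed(displacement_px: float, display_interval_ms: float) -> float:
    # 显示时的滑动速度 = 显示位移量 / 显示时间间隔,统一换算为 Pixel/8.3ms
    return round(displacement_px / display_interval_ms * 8.3, 2)


# 帧0-帧2:按旧帧间隔(8.3ms)绘制渲染(位移1Pixel),也按旧帧率显示
assert sliding_speed(1, 8.3) == 1.0
# 帧3-帧6:仍按旧帧间隔绘制渲染(位移1Pixel),却以新帧率显示(间隔16.6ms),速度减半
assert sliding_speed(1, 16.6) == 0.5
# 帧7:已按新帧间隔绘制渲染(位移2Pixel),以新帧率显示,速度恢复
assert sliding_speed(2, 16.6) == 1.0
```

可见速度先从1.0降到0.5、再升回1.0,正是正文所述的速度跳变。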
因此,如图5所示,在电子设备的帧率切换过程中,由于部分帧在绘制渲染处理时的渲染位移与显示时的显示位移不相同(或者说绘制渲染时采用的帧间隔与显示时对应的时间间隔不同),或者由于传统方案切帧时是同时切换应用渲染帧率、合成帧率以及硬件设备帧率(三个Vsync一起切),而Android上存在Buffer堆积问题,导致旧帧率下渲染的Buffer在新帧率下显示,因此帧率切换过程中图像帧显示时会产生滑动速度的跳跃,使得用户感知到画面卡顿,用户体验不好。
上述图5所示的帧率切换过程是从120Hz切换到60Hz,切换前后的帧率是整数倍的关系。实际上,传统方案不仅在切换前后的帧率是整数倍的情况下存在图像帧显示速度跳变的情况,在切换前后的帧率是非整数倍的情况下也存在图像帧显示速度跳变的情况。为了更好地理解切换前后的帧率是非整数倍的情况下的切换过程,下面对传统方案中帧率切换前后的帧率是非整数倍的情况进行简单介绍。
如图6所示,电子设备的帧率要从120Hz切换到90Hz,其中,在0ms-8.3ms之间合成线程决策要进行帧率切换,经过两个周期(8.3ms-16.6ms和16.6ms-27.7ms),电子设备在27.7ms时实现了帧率从120Hz到90Hz的切换。
图6所示的帧率切换过程主要包括步骤S1和S2,下面对这两个步骤进行简单介绍。
S1,应用主线程向合成线程发起切帧请求消息。
该切帧请求中携带当前帧的目标帧率和PID。
如图6所示,应用主线程在0ms-8.3ms的周期内向合成线程发送切帧请求消息,电子设备当前的帧率为120Hz,需要从当前帧率切换到的目标帧率为90Hz,那么,切帧请求中携带的目标帧率为90Hz。
S2,合成线程向Vsync线程发送帧率切换通知消息。
如图6所示,在合成线程在接收到应用主线程的切帧请求消息后,合成线程根据切帧请求中携带的目标帧率90Hz,决定将帧率从当前的120Hz切换到90Hz,并在0ms-8.3ms的周期内向硬件合成器和Vsync线程发送帧率切换通知消息,以实现从当前帧率到目标帧率的切换。
经过上述过程,在27.7ms时完成帧率的切换。
在合成线程发送帧率切换通知消息之后,需要再经过两个周期(8.3ms-16.6ms和16.6ms-27.7ms)完成帧率的切换,因此,在图6中,在27.7ms时实现了帧率的切换,此时,硬件集成单元和软件Vsync信号的频率也已经切换到了90Hz。
具体地,合成线程在0ms-8.3ms接收到应用主线程发送的帧率切换请求消息后,决定进行切帧,接下来,合成线程向硬件合成器发送帧率切换通知消息,以使得硬件合成器控制硬件集成电路将帧率从120Hz切换到90Hz,硬件集成电路在27.7ms时完成帧率的切换。
此外,合成线程还向Vsync线程发送帧率切换通知消息,以通知Vsync线程在16.6ms-27.7ms期间将软件周期切换为新的周期(新的周期与新的帧率90Hz相对应),完成频率切换。在帧率切换完成之后,会对定时器基于16.6ms的时间戳及新的帧间隔(90Hz对应的帧间隔为11.1ms),重新设置定时时间,从而使得定时器后续根据新的 定时时间(与目标帧率相匹配的定时时间)来唤醒Vsync线程。
通过计算可以得到图6所示的各个图像帧显示时的滑动速度如下:
帧0到帧2显示时的滑动速度均为1Pixel/8.3ms,帧3到帧6显示时的滑动速度均为0.7Pixel/8.3ms,帧7显示时的滑动速度为1Pixel/8.3ms。
因此,在帧2显示完毕到切换到帧3显示的过程中会出现滑动速度的跳跃(速度由1Pixel/8.3ms降低到0.7Pixel/8.3ms),而在帧6显示完毕到切换到帧7显示的过程中也会出现速度的跳跃(速度由0.7Pixel/8.3ms上升到1Pixel/8.3ms)。也就是说,在帧率从120Hz切换到90Hz前后,图像帧的速度先降低再升高,出现了速度的跳变。
综上,根据图5和图6可知,传统方案在帧率切换过程中会出现图像帧显示的滑动速度跳变的情况。
为了解决上述图5和图6所示的过程中存在的问题,本申请实施例提供了一种新的帧率切换方法:通过提前调整图像帧绘制渲染时采用的帧间隔,并通过跟踪图像帧的frameNumber来确定帧率切换的时机,使得提前调整了帧间隔的图像帧能够以切换后的帧率显示,从而使得图像帧在显示的时候滑动速度不会出现跳变,提高用户体验。
下面结合图7对本申请实施例的帧率切换方法进行详细的介绍。图7所示的帧率切换方法可以由电子设备执行,如图7所示,电子设备在VsyncID=1至VsyncID=6之间时的帧率为第一帧率,在VsyncID=6至VsyncID=8之间时的帧率为第二帧率,每个时刻对应一个VsyncID。
图7所示的方法包括步骤S101至S104,下面分别对这四个步骤进行详细的介绍。
S101,应用线程在第一周期基于第一帧率对应的帧间隔绘制和渲染第一图像帧。
S102,应用线程在第二周期基于第二帧率对应的帧间隔绘制和渲染第二图像帧。
上述第一帧率和第二帧率对应的帧间隔可以是分别通过对第一帧率和第二帧率取倒数得到的。例如,第一帧率为120Hz,则第一帧率对应的帧间隔为8.3ms;第二帧率为90Hz,则第二帧率对应的帧间隔为11.1ms。
如图7所示,上述第二周期位于第一周期之后,第二帧率与第一帧率不同。可以理解的是图7中示出的情况是第二周期位于第一周期之后,且第二周期与第一周期不相邻。实际上,本申请中只限定第二周期位于第一周期之后而并不限定第二周期和第一周期是否相邻,第二周期与第一周期既可以相邻也可以不相邻。
可以理解的是,上述第一帧率可以是切换前的帧率,上述第二帧率可以是切换后的帧率,以电子设备从120Hz切换到90Hz为例,则上述第一帧率可以是120Hz,上述第二帧率可以为90Hz。在这种情况下,在S101中,应用线程在第一周期是基于120Hz对应的帧间隔8.3ms来绘制和渲染第一图像帧,在S102中应用线程在第二周期是基于90Hz对应的帧间隔11.1ms来绘制和渲染第二图像帧。
S103,合成线程在第三周期向硬件合成器发送第一切帧请求。
其中,上述第三周期可以位于第二周期之后,或者,上述第三周期与第二周期重合。
如图7所示,上述第三周期位于第二周期之后,且第三周期与第二周期不相邻。
可以理解的是上述应用线程在对图像帧进行绘制和渲染处理的时候对应的有缓存队列,该缓存队列可以堆叠有一定数量的Buffer。
可选地,上述缓存队列中堆叠的Buffer数量为N,N为大于或者等于1的整数,在这种情况下,上述第三周期位于第二周期之后。
可选地,上述缓存队列中堆叠的Buffer数量为0,在这种情况下,上述第三周期与第二周期重合。
图7示出的是缓存队列中堆叠的Buffer数量为2的情况,根据图7可知,上述第三周期位于第二周期之后,且第三周期与第二周期不相邻。
在上述S103中,合成线程具体可以在第三周期调用函数performSetActiveMode,以唤醒硬件合成器线程将帧率从所述第一帧率切换到所述第二帧率。
S104,硬件合成器基于第一切帧请求从第一帧率切换到所述第二帧率,以使得第二图像帧以第二帧率显示。
硬件合成器会基于第一切帧请求控制硬件集成单元从第一帧率切换到第二帧率,结合图7所示,硬件合成器会控制硬件集成单元在VsyncID=7的时刻切换到第二帧率。
对于电子设备来说,除了控制硬件单元切换帧率之外,还需要控制软件实现帧率的切换。
可选地,作为一个实施例,图7所示的方法还包括:
S105,合成线程在第四周期向Vsync线程发送第二切帧请求。
S106,Vsync线程基于第二切帧请求以第二帧率向应用线程发送Vsync消息。
上述S105中,该第四周期位于第二周期之后,或者该第四周期与第二周期重合,也就是说上述第四周期不可能位于第二周期之前。
上述S105中合成线程可以通过调用相应的函数来控制Vsync线程以第二帧率向应用线程发送Vsync消息。
具体地,上述S105具体包括:合成线程在第四周期调用setDuration函数设置第二帧率对应的周期参数到Vsync线程,以使得所述Vsync线程以所述第二帧率向应用线程发送Vsync消息。
可以理解的是上述应用线程在对图像帧进行绘制和渲染处理的时候对应的有缓存队列,该缓存队列可以堆叠有一定数量的Buffer。
可选地,上述缓存队列中堆叠的Buffer数量为N,N为大于或者等于1的整数,在这种情况下,上述第四周期位于第二周期之后。
可选地,上述缓存队列中堆叠的Buffer数量为0,在这种情况下,上述第四周期也与第二周期重合。
当上述第三周期和第四周期都与第二周期重合时,也就是第三周期和第四周期重合,在这种情况下步骤S103和S105可以是在同一个周期内进行的。
可选地,作为一个实施例,在上述步骤S102之前,图7所示的方法还包括:
S102a,应用线程接收Vsync线程发送的第一Vsync消息。
上述第一Vsync消息中携带有第二帧率对应的帧间隔。
例如,当第二帧率为90Hz时,上述第一Vsync消息中携带的帧间隔为11.1ms。
上述步骤S102a具体包括:应用线程在第二周期的起始时刻接收到Vsync线程发送的第一Vsync消息。
也就是说,当应用线程在第二周期接收到第一Vsync消息之后,接下来就在第二周期内根据第一Vsync消息携带的帧间隔(第二帧率对应的帧间隔)对第二图像帧进行绘制和渲染。
可选地,作为一个实施例,在上述步骤S102a之前,图7所示的方法还包括:
S102x,合成线程接收到来自应用线程的初始切帧请求;
S102y,合成线程向Vsync线程发送帧间隔修改通知消息;
S102z,Vsync线程基于帧间隔修改通知消息生成第一Vsync消息。
上述初始切帧请求用于请求合成线程将帧率由当前的第一帧率切换到第二帧率。
上述帧间隔修改通知消息用于通知Vsync线程将下下个Vsync消息(也就是第一Vsync消息)中的帧间隔修改为第二帧率对应的帧间隔,基于该帧间隔修改通知消息,Vsync线程生成第一Vsync消息。若第二帧率为90Hz的话,则该第一Vsync消息中的帧间隔为11.1ms。
本申请实施例中,应用线程可以先向合成线程发起初始切帧请求来请求切帧,使得应用可以灵活的根据需要向合成线程发起切帧请求。
可选地,作为一个实施例,上述步骤S102x具体包括:合成线程在第一周期接收来自应用线程的初始切帧请求;
可选地,作为一个实施例,上述步骤S102y具体包括:合成线程在第一周期向Vsync线程发送帧间隔修改通知消息。
本申请实施例中,合成线程在第一周期接收到初始切帧请求之后,就在该第一周期内向Vsync线程发送帧间隔修改通知消息,能够为Vsync线程留出充足的时间来修改Vsync消息中的帧间隔。
可选地,作为一个实施例,在上述步骤S103之前,图7所示的方法还包括:
S103a,合成线程将对第一图像帧进行合成处理的下一个周期确定为第三周期。
具体地,在步骤S103a中合成线程可以对每一个需要合成处理的图像帧进行识别,当确定要对第一图像帧要进行合成处理时,则确定在合成第一图像帧的下一个周期确定为第三周期,并在该第三周期执行步骤S103。
在上述步骤S103a中合成线程可以根据图像帧的帧号(framenumber)和/或对应的VsyncID来识别待合成处理的图像帧是否为第一图像帧,在确定当前待合成处理的图像帧为第一图像帧的时候,将合成处理第一图像帧的下一个周期确定为第三周期,并在该第三周期向硬件合成器发送第一切帧请求。
可选地,作为一个实施例,上述第一周期,第二周期以及第三周期的时间间隔大小均与第一帧率的对应的帧间隔大小相同。
例如,当上述第一帧率为120Hz时,则第一周期,第二周期以及第三周期的时间间隔大小均为8.3ms。
下面结合图8,以电子设备从120Hz切换到90Hz为例对本申请实施例的帧率切换方法进行详细的介绍。
如图8所示,电子设备在0ms-41.5ms之间时的帧率为120Hz,在41.5ms-63.7ms之间时的帧率为90Hz。每个时刻对应一个VsyncID,如图8所示,0ms时对应的VsyncID为1,8.3ms时对应的VsyncID为2等等。在每个VsyncID对应的时刻应用线程会接收到来自Vsync线程的Vsync消息,该Vsync消息中会携带对应的VsyncID和帧间隔,应用线程接收到Vsync消息后基于该Vsync消息中携带的帧间隔对对应的图像帧进行绘制渲染处理。
图8包括步骤S1001至S1007,下面对这些步骤进行详细介绍。
S1001,应用线程在0ms-8.3ms之间以120Hz对应的帧间隔绘制和渲染第一图像帧。
由于120Hz对应的帧间隔为8.3ms,因此,在S1001中应用线程是基于8.3ms的帧间隔绘制和渲染的第一图像帧。
具体地,Vsync线程内可以通过设置定时器在0ms时将自身线程唤醒,在被唤醒后,Vsync线程向应用线程发送Vsync消息,该Vsync消息携带的帧间隔为8.3ms,时间戳为0ms。应用线程在接收到来自Vsync线程的Vsync消息后,保存消息中的时间戳。
假设,图像帧正常显示时的滑动速度为1Pixel/8.3ms,则可以计算得到第一图像帧的位移量为8.3*1Pixel/8.3ms=1Pixel,接下来通知渲染线程,使得渲染线程根据1Pixel的位移量对第一图像帧进行渲染处理,完成渲染处理后,渲染线程将第一图像帧送入到缓存线程入队,待后续进行合成处理。
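上述位移量的计算(位移量=帧间隔×正常滑动速度)可以用一段示意性的Python代码表示(函数名为假设,数值沿用正文示例):

```python
def render_displacement_px(frame_interval_ms: float, speed_px_per_8_3ms: float = 1.0) -> float:
    # 渲染位移量 = 绘制渲染时采用的帧间隔 × 正常滑动速度(正文示例为1Pixel/8.3ms)
    return round(frame_interval_ms * speed_px_per_8_3ms / 8.3, 1)


# 120Hz对应的帧间隔8.3ms → 位移1Pixel(与S1001中的计算一致)
assert render_displacement_px(8.3) == 1.0
# 90Hz对应的帧间隔11.1ms → 位移约1.3Pixel(与后文S1005中的计算一致)
assert render_displacement_px(11.1) == 1.3
```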
S1002,应用线程在0ms-8.3ms之间向合成线程请求将帧率切换到90Hz。
在S1002中,应用线程可以根据应用需要发起切帧请求。例如,当电子设备的负载较大的时候应用主线程可以发起切帧请求,请求将帧率从120Hz切换到90Hz。
这里的切帧请求可以是应用线程向合成线程发送的切帧请求消息,该切帧请求消息可以携带当前帧的VsyncId和frameNumber。如图8所示,S1002中发送的切帧请求消息可以携带VsyncId=1,frameNumber=1,其中,frameNumber=1表示第一图像帧的帧号为1。
S1003,合成线程在0ms-8.3ms之间通知Vsync线程将下下个Vsync消息中的帧间隔修改为11.1ms。
在S1003中合成线程具体可以向Vsync线程发送帧间隔修改通知消息,以使得Vsync线程将下下个Vsync消息中的帧间隔修改为11.1ms,这里的下下个Vsync消息就是对应的VsyncId=3的第一Vsync消息。
S1004,Vsync线程在16.6ms时向应用线程发送第一Vsync消息。
该第一Vsync消息中携带的帧间隔为11.1ms。
具体地,在图8中,在0ms,8.3ms,16.6ms等处,Vsync线程都要向应用线程发送Vsync消息。合成线程通知Vsync线程将帧间隔修改为11.1ms后,Vsync线程在8.3ms发送了一个Vsync消息,该Vsync消息中的帧间隔仍然为8.3ms,保持不变,而Vsync线程在16.6ms发送另一个Vsync消息也就是S1004中发送的第一Vsync消息时要将帧间隔修改为11.1ms。
S1005,应用线程以90Hz对应的帧间隔绘制和渲染第二图像帧。
在S1005中,由于应用线程接收到了帧间隔修改后的第一Vsync消息,因此,应用线程开始以修改后的帧间隔对第二图像帧进行绘制和渲染处理。
由于90Hz对应的帧间隔是11.1ms,因此,在S1005中应用线程具体是基于11.1ms的帧间隔来绘制和渲染第二图像帧。
具体地,结合图8所示,Vsync线程内可以通过设置定时器在16.6ms时将自身线程唤醒,在被唤醒后,Vsync线程向应用线程发送第一Vsync消息,该第一Vsync消息携带的帧间隔为11.1ms,时间戳为16.6ms。应用线程在接收到来自Vsync线程的第一Vsync消息后,保存消息中的时间戳。
假设,图像帧正常显示时的滑动速度为1Pixel/8.3ms,则可以计算得到第二图像帧的位移量为11.1*1Pixel/8.3ms=1.3Pixel,接下来通知渲染线程,使得渲染线程根据1.3Pixel的位移量对第二图像帧进行渲染处理,完成渲染处理后,渲染线程将第二图像帧送入到缓存线程入队,待后续进行合成处理。
S1006,合成线程调用函数performSetActiveMode,以唤醒硬件合成器线程将帧率从120Hz切换到90Hz。
在S1006中,合成线程通过调用函数performSetActiveMode能够启动或者唤醒硬件合成器,以使得硬件合成器将帧率从120Hz切换到90Hz。
如图8所示,硬件合成器在被唤醒后可以控制硬件集成单元将帧率从120Hz切换到90Hz,硬件集成单元在52.6ms位置切换完成。
S1007,合成线程调用setDuration函数设置90Hz对应的周期参数到Vsync线程,以使得Vsync线程以90Hz发送Vsync消息。
在S1007中,合成线程通过调用setDuration函数,能够将90Hz对应的周期参数(具体可以是90Hz对应的帧间隔11.1ms)设置到Vsync线程,从而使得Vsync线程以90Hz的频率发送Vsync消息。
具体地,合成线程通过调用setDuration函数,能够通知Vsync线程在33.2ms-52.6ms期间将软件周期切换为90Hz,完成频率切换。在帧率切换完成之后,会对定时器基于41.5ms时间戳及新的帧间隔,重新设置定时时间,从而使得定时器后续根据新的定时时间(与90Hz相匹配的定时时间)来唤醒Vsync线程。
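上述“基于新的时间戳及新的帧间隔重新设置定时时间”的过程,可以用一段示意性的Python代码表示(函数名为假设,时间数值沿用正文示例,并非Android定时器的实际实现):

```python
def next_wakeup_times_ms(base_timestamp_ms: float, new_interval_ms: float, count: int):
    # 帧率切换完成后,定时器基于新的基准时间戳和新的帧间隔,
    # 重新排定后续唤醒Vsync线程的时刻
    return [round(base_timestamp_ms + i * new_interval_ms, 1) for i in range(1, count + 1)]


# 以41.5ms时间戳和90Hz对应的11.1ms帧间隔为基准,后续在52.6ms、63.7ms唤醒
assert next_wakeup_times_ms(41.5, 11.1, 2) == [52.6, 63.7]
```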
在S1006和S1007中,合成线程通过调用两种不同的函数能够使得硬件集成单元和Vsync线程同步在52.6ms将帧率从120Hz切换到90Hz。
为了更好的理解本申请实施例的帧率切换方法,下面结合图9至图14从另一个角度结合时序图和交互图对本申请实施例的帧率切换方法进行详细的介绍。
下面结合图9,以缓存队列中有两个buffer堆积的情况为例对本申请实施例的帧率切换方法的帧率切换过程进行介绍。
图9为本申请实施例的帧率切换方法的帧率切换过程的示意图。
如图9所示,附图上方标记了Vsync的时间戳、每个Vsync-APP的标识ID、每个Vsync-SF的标识ID以及Buffer信息。其中,缓存队列中有两个帧的buffer,每帧冒号后面的数字表示当前渲染帧相对于前一帧的位移,单位为像素(Pixel)。
如图9所示,电子设备的帧率要从120Hz切换到90Hz,该实施例中,应用主线程向合成线程发送切帧请求(请求帧率从120Hz切换到90Hz),合成线程在接收到该切帧请求消息后,通知Vsync线程提前对图像帧绘制渲染时采用的帧间隔进行修改,并且使得提前调整帧间隔的图像帧以切换后的帧率显示,从而使得这部分图像帧的渲染速度与显示时的滑动速度保持一致,避免图像显示时出现滑动速度的跳变。
图9所示的帧率切换过程主要包括步骤S1至S3,下面对这几个步骤进行详细介绍。
S1、应用主线程向合成线程发送切帧请求消息。
上述切帧请求消息中携带有当前帧的VsyncId,frameNumber,目标帧率和PID。
如图9所示,应用主线程在0ms-8.3ms期间向合成线程发送切帧请求消息,当前周期内需要进行绘制渲染处理的图像帧为帧4,并且该电子设备是要将帧率从120Hz切换到90Hz,因此,切帧请求中携带的VsyncId=1,frameNumber=4,目标帧率为90Hz。
在上述S1中应用主线程可以根据应用需要发起切帧请求,例如,当电子设备的负载较大的时候应用主线程可以发起切帧请求,请求切换到较低的帧率。而如果当前启动的应用程序是用户感知度较高的应用程序(例如,当前启动的应用程序为游戏应用)时,应用主线程可以发起切帧请求,请求切换到较高的帧率。
S2、合成线程发出帧间隔修改通知消息。
具体地,合成线程向Vsync线程发送帧间隔修改通知消息,以通知Vsync线程提前对Vsync消息中的帧间隔进行修改。
由于电子设备是要从120Hz切换到90Hz,因此,在帧率切换之前电子设备的帧率是120Hz,对应的帧间隔为8.3ms。合成线程在接收到切帧请求消息后,获知目标帧率为90Hz,该目标帧率对应的目标帧间隔为11.1ms(通过对90Hz取倒数得到11.1ms)。因此,合成线程需要通知Vsync线程在应用主线程发起切帧请求对应的Vsync消息的下下个Vsync消息中,将帧间隔修改为目标帧间隔11.1ms。
例如,应用主线程在接收到ID为N的Vsync-app信号之后发起切帧请求,则合成线程通知Vsync线程在ID为N+2的Vsync消息将帧间隔修改为目标帧间隔11.1ms。
本申请实施例中,Vsync-app信号也可以称为Vsync消息,下面以Vsync消息为准进行介绍。
如图9所示,在0ms-8.3ms期间,合成线程接收到切帧请求消息,接下来,在8.3ms-24.9ms期间,应用主线程向Vsync线程请求下一个Vsync消息,该请求中携带当前进程标识,Vsync线程计算下一个Vsync消息的标识为3,Vsync线程内部存储有Vsync消息的队列数据,Vsync线程判断当前Vsync消息的标识为3满足第N+2个信号的要求,因此,在Vsync消息的标识为3以及3以后的Vsync消息中,Vsync线程将帧间隔全部修改为11.1ms。
具体来说,Vsync线程内可以通过设置定时器在16.6ms时将自身线程唤醒,在被唤醒后,Vsync线程向应用主线程发送Vsync消息,该Vsync消息携带的Vsync标识为3,帧间隔11.1ms,时间戳16.6ms。应用主线程在接收到来自Vsync线程的Vsync消息后,保存消息中的时间戳,并计算得到帧6的位移量为11.1ms*1Pixel/8.3ms=1.3Pixel,接下来通知渲染线程,使得渲染线程根据1.3Pixel的位移量对帧6进行渲染处理,完成渲染处理后,渲染线程将帧6送入到缓存线程入队,待后续进行合成处理。
在传统方案(如图5和图6所示的方案)中,每个Vsync消息中的帧间隔都是根据当前帧率计算出来的,帧间隔为当前帧率的倒数。
而在本申请实施例中,如图9所示,这里是将帧6绘制渲染采用的帧间隔提前修改为1/目标帧率。从而使得帧6绘制渲染时采用的帧间隔与帧6显示时所采用的帧间隔相同,防止帧率切换前后图像帧在显示的时候出现速度的跳变。其中,帧6显示时所采用的帧间隔不是帧6的持续时间,而可以认为是人眼感受到帧6显示速度时所采用的帧间隔,如图9所示,帧6显示时对应的时间间隔是41.5ms-52.6ms,也就是11.1ms。
S3、合成线程发送帧率切换通知消息。
在步骤S3中,合成线程决定将帧率由120Hz切换到90Hz,并向硬件合成器和Vsync线程发送帧率切换通知消息,以实现帧率的切换。
合成线程的具体决策机制如下:
在合成线程接收到应用主线程发送的切帧请求消息之后,合成线程在对图像帧进行合成时判断要合成的图像帧的frameNumber与应用主线程发起切帧请求时携带的图像帧的frameNumber是否相同,如果相同的话则决策进行帧率的切换,向硬件合成器和Vsync线程发送帧率切换通知消息。
如图9所示,在24.9ms-33.2ms周期内,合成线程要合成处理的是图像帧的frameNumber为4,而应用主线程发起的切帧请求中携带的frameNumber也是4,因此,合成线程在24.9ms-33.2ms周期内,决定将帧率从原来的120Hz切换到90Hz。
经过上述过程,电子设备在52.6ms时完成帧率的切换。
在合成线程发送帧率切换通知消息后,需要经过两个周期完成帧率切换,因此,在图9中,在52.6ms时实现帧率切换,此时,硬件Vsync信号和软件Vsync信号的频率也已经切换到了90Hz。
如图9所示,帧率控制系统在24.9ms-33.2ms之间决策切换帧率后,通知硬件合成器控制硬件集成单元将帧率从120Hz切换到90Hz,硬件集成单元在52.6ms位置切换完成。
此外,合成线程通知Vsync线程在33.2ms-52.6ms期间将软件周期切换为新的周期(新的周期与新的帧率相对应),完成频率切换。在帧率切换完成之后,会对定时器基于41.5ms时间戳及新的帧间隔,重新设置定时时间,从而使得定时器后续根据新的定时时间(与目标帧率相匹配的定时时间)来唤醒Vsync线程。
在图9所示的过程中,应用主线程是在0ms-8.3ms之间向合成线程发送切帧请求消息,切帧请求消息中携带的VsyncId为1,Vsync线程是对切帧请求中携带的VsyncId的下下个VsyncId也就是VsyncId为3的Vsync消息中携带的帧间隔进行修改。
可以理解的是,应用主线程也可以在0ms-8.3ms之前的时间间隔或者之后的时间间隔向合成线程发送切帧请求。也就是说,切帧请求中携带的VsyncId既可以是发送该切帧请求消息的时间间隔的起始时刻对应的VsyncId,也可以不是发送该切帧请求的时间间隔的起始时刻对应的VsyncId,本申请对此并不限定。下面举例对此进行说明。
例如,以图9为例,应用主线程还可以在0ms-8.3ms之前的周期向合成线程发送切帧请求消息。例如,应用主线程可以在-8.3ms-0ms之间向合成线程发送切帧请求消息,该切帧请求消息中携带的VsyncId为0,在这种情况下,合成线程可以通知Vsync线程对切帧请求中携带的VsyncId之后的第3个VsyncId也就是VsyncId为3的Vsync消息中携带的帧间隔进行修改。而合成线程仍是在24.9ms-33.2ms之间决策切帧。
再如,仍以图9为例,应用主线程还可以在8.3ms-16.6ms的时间间隔内向合成线程发送切帧请求消息,该切帧请求消息中携带的VsyncId为2,在这种情况下,合成线程可以通知Vsync线程对切帧请求中携带的VsyncId的下个VsyncId也就是VsyncId为3的Vsync消息中携带的帧间隔进行修改。这种情况下,合成线程仍是在24.9ms-33.2ms之间决策切帧。
此外,在图9所示的过程中,应用主线程也可以在0ms-8.3ms之间向合成线程发送切帧请求消息,切帧请求消息中携带的VsyncId为2,在这种情况下,合成线程可以通知Vsync线程对切帧请求中携带的VsyncId的下个VsyncId也就是VsyncId为3的Vsync消息中携带的帧间隔进行修改。这种情况下,合成线程仍是在24.9ms-33.2ms之间决策切帧。
在图9所示的过程中,应用主线程也可以在8.3ms-16.6ms之间向合成线程发送切帧请求消息,切帧请求消息中携带的VsyncId为1,在这种情况下,合成线程可以通知Vsync线程对切帧请求中携带的VsyncId的下下个VsyncId也就是VsyncId为3的Vsync消息中携带的帧间隔进行修改。这种情况下,合成线程仍是在24.9ms-33.2ms之间决策切帧。
本方案中,切帧请求消息发送的时机可以相对灵活地设置。只要合成线程在合适的时机向Vsync线程发送帧间隔修改通知消息,以及向硬件合成器和Vsync线程发送帧率切换通知消息,使得在绘制渲染处理时更改了帧间隔的图像帧能够以目标帧率进行显示即可。
上文结合图9所示的帧率切换方法中的切换帧率的过程往往会涉及到很多模块之间的交互,为方便理解,下面结合图10对图9所示的帧率切换过程中涉及的各个模块之间的交互过程进行详细说明。
示例性的,图10为本申请实施例提供的帧率切换方法的帧率切换过程中各个模块之间交互的过程示意图。
图10与上述图9所示的过程对应,图10也包括S1至S3,下面对图10中的S1至S3进行介绍。
S1、应用主线程在0ms-8.3ms之间向合成线程发送切帧请求消息,请求切帧。
上述S1中的切帧请求消息中携带的VsyncId=1,frameNumber=4,目标帧率90Hz。
结合图9所示的内容,S1中的切帧请求消息用于请求电子设备从当前的120Hz切换到90Hz。
S2、合成线程在0ms-8.3ms向Vsync线程发送帧间隔修改通知消息。
合成线程在接收到来自应用主线程的切帧请求消息后,决定通知Vsync线程提前对帧间隔进行修改。S2中的帧间隔修改通知消息中携带切帧请求消息中包含的VsyncId以及目标帧率,具体地,该帧间隔修改通知消息中携带的VsyncId=1,目标帧率为90Hz。
Vsync线程在接收到帧间隔修改通知消息后,可以根据预先约定的规则,在下下个VsyncId的Vsync消息中将帧间隔修改为11.1ms。这里的预先约定的规则可以是指Vsync线程对帧间隔修改通知消息中携带的VsyncId的下下个VsyncId对应的Vsync消息中的帧间隔进行修改,使得修改后的帧间隔与目标帧率相对应。该规则可以提前在Vsync线程和合成线程之间约定好,这样Vsync线程在接收到帧间隔修改通知消息后就可以按照约定的规则对Vsync消息中的帧间隔进行修改了。
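上述“从切帧请求所携带VsyncId的下下个(即N+2个)Vsync消息起修改帧间隔”的约定规则,可以用如下示意性Python代码概括(函数与参数名均为本文假设):

```python
def frame_interval_for_vsync(vsync_id: int, request_vsync_id: int,
                             old_interval_ms: float, target_interval_ms: float) -> float:
    # 从切帧请求所携带VsyncId的下下个(N+2)Vsync消息起,帧间隔改为目标帧间隔;
    # 在此之前的Vsync消息仍携带旧帧率对应的帧间隔
    if vsync_id >= request_vsync_id + 2:
        return target_interval_ms
    return old_interval_ms


# 切帧请求携带VsyncId=1:VsyncId=2的Vsync消息仍为8.3ms,VsyncId=3起为11.1ms
assert frame_interval_for_vsync(2, 1, 8.3, 11.1) == 8.3
assert frame_interval_for_vsync(3, 1, 8.3, 11.1) == 11.1
assert frame_interval_for_vsync(4, 1, 8.3, 11.1) == 11.1
```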
S3、合成线程在24.9ms-33.2ms之间向硬件合成器和Vsync线程发送帧率切换通知消息。
可以理解的是,在S3之前,合成线程需要判断何时向硬件合成器和和Vsync线程发送帧率切换通知消息,由于S1中的切帧请求消息中携带的frameNumber=4,因此,在确定当前要合成处理的图像帧的frameNumber也为4的情况下就向硬件合成器和Vsync线程发送帧率切换通知消息,从而使得硬件集成单元和Vsync线程实现帧率的切换。
The frame-rate switching process with two buffers stacked in the buffer queue has been described in detail above with reference to Figures 9 and 10. The frame-rate switching process with one buffer stacked in the buffer queue is described below with reference to Figure 11.
Figure 11 is a schematic diagram of a frame-rate switching method according to an embodiment of this application.
As shown in Figure 11, the frame rate of the electronic device is likewise to be switched from 120 Hz to 90 Hz, and the frame-rate switching process shown in Figure 11 also includes steps S1 to S3, which are described in detail below.
S1. The application main thread sends a frame-switch request message to the compositor thread.
The frame-switch request message may carry the VsyncId of the current frame, the frameNumber, the target frame rate, and the PID.
As shown in Figure 11, the application main thread sends the frame-switch request message to the compositor thread during 8.3 ms–16.6 ms; the current frame is frame 4, and the electronic device is to switch the frame rate from 120 Hz to 90 Hz. Therefore, the frame-switch request carries VsyncId=2, frameNumber=4, and a target frame rate of 90 Hz.
S2. The compositor thread sends a frame-interval modification notification message.
In S2, the compositor thread sends a frame-interval modification notification message to the Vsync thread to notify it to modify the frame interval in the Vsync messages in advance.
Since the electronic device is to switch from 120 Hz to 90 Hz, the frame rate before the switch is 120 Hz, corresponding to a frame interval of 8.3 ms. After receiving the frame-switch request message, the compositor thread learns that the target frame rate is 90 Hz, corresponding to a target frame interval of 11.1 ms. The compositor thread therefore needs to notify the Vsync thread to change the frame interval to the target frame interval of 11.1 ms in the Vsync message two after the Vsync message corresponding to the application main thread's frame-switch request.
For example, if the application main thread initiates the frame-switch request after receiving the Vsync message with ID N, the compositor thread needs to notify the Vsync thread to change the frame interval to the target frame interval of 11.1 ms in the Vsync message with ID N+2.
As shown in Figure 11, the compositor thread receives the frame-switch request message during 8.3 ms–16.6 ms. Then, during 16.6 ms–24.9 ms, the application main thread requests the next Vsync message from the Vsync thread, the request carrying the current process identifier. The Vsync thread computes the ID of the next Vsync message to be 4. The Vsync thread internally stores queue data of Vsync messages and determines that the current Vsync message ID of 4 satisfies the N+2 signal requirement; therefore, in the Vsync messages of frame 6 and subsequent image frames, the Vsync thread changes the frame interval to 11.1 ms throughout.
Specifically, a timer may be set within the Vsync thread to wake it at 24.9 ms. Once awakened, the Vsync thread sends a Vsync message to the application main thread, carrying Vsync ID 4, a frame interval of 11.1 ms, and a timestamp of 24.9 ms. After receiving the Vsync message from the Vsync thread, the application main thread saves the timestamp in the message and computes the displacement of frame 6 as 11.1 ms × 1 Pixel / 8.3 ms ≈ 1.3 Pixel. It then notifies the render thread, which renders frame 6 according to the 1.3-Pixel displacement; after rendering is completed, the render thread enqueues frame 6 into the buffer queue to await subsequent compositing.
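The displacement computation above scales the per-frame animation step by the ratio of the new frame interval to the old one, keeping the on-screen speed constant across the switch. A simplified numeric sketch with illustrative names:

```python
def displacement_px(interval_ms: float,
                    px_per_old_interval: float = 1.0,
                    old_interval_ms: float = 8.3) -> float:
    # The per-frame step grows with the frame interval so that pixels per
    # millisecond stays constant before and after the frame-rate switch.
    return interval_ms * px_per_old_interval / old_interval_ms

assert abs(displacement_px(11.1) - 1.3) < 0.05  # ~1.3 Pixel, as in the text
assert displacement_px(8.3) == 1.0              # unchanged step at 120 Hz
```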
S3. The compositor thread sends a frame-rate switch notification message.
In step S3, the compositor thread decides to switch the frame rate from 120 Hz to 90 Hz, and sends a frame-rate switch notification message to the hardware compositor and the Vsync thread to carry out the switch.
The specific decision mechanism of the compositor thread is as follows:
After the compositor thread receives the frame-switch request message sent by the application main thread, when compositing an image frame it checks whether the frameNumber of the frame to be composited is the same as the frameNumber carried when the application main thread initiated the frame-switch request; if they are the same, it decides to switch the frame rate.
As shown in Figure 11, in the 24.9 ms–33.2 ms period, the frameNumber of the image frame to be composited by the compositor thread is 4, and the frameNumber carried in the frame-switch request initiated by the application main thread is also 4. Therefore, in the 24.9 ms–33.2 ms period the compositor thread decides to switch the frame rate from the original 120 Hz to 90 Hz.
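The decision rule above is a plain equality check; a sketch with illustrative names:

```python
def decide_switch(composing_frame_number: int, requested_frame_number: int) -> bool:
    # Switch only when the frame about to be composited is the very frame
    # named in the application main thread's frame-switch request.
    return composing_frame_number == requested_frame_number

assert decide_switch(4, 4)      # Figure 11: frameNumber matches, switch now
assert not decide_switch(3, 4)  # earlier frames: keep the old frame rate
```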
Through the above process, the electronic device completes the frame-rate switch at 52.6 ms.
After the compositor thread sends the frame-rate switch notification message, two periods are needed to complete the switch. Therefore, in Figure 11 the frame-rate switch takes effect at 52.6 ms, at which point the frequencies of the hardware Vsync signal and the software Vsync signal have also switched to 90 Hz.
As shown in Figure 11, after the frame-rate control system decides on the switch between 24.9 ms and 33.2 ms, it notifies the hardware compositor to control the hardware integration unit to switch the frame rate from 120 Hz to 90 Hz; the hardware integration unit completes the switch at the 60.9 ms position.
In addition, the compositor thread notifies the Vsync thread to switch the software period to the new period (corresponding to the new frame rate) during 33.2 ms–52.6 ms, completing the frequency switch. After the frame-rate switch is completed, the timer's firing time is reset based on the 41.5 ms timestamp and the new frame interval, so that the timer subsequently wakes the Vsync thread according to the new firing time (the firing time matching the target frame rate).
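Re-arming the timer from the 41.5 ms timestamp with the new ~11.1 ms interval reproduces the wake-up times in the figure; a sketch under those assumptions (names are illustrative):

```python
def wakeups_after(base_ts_ms: float, interval_ms: float, n: int) -> list[float]:
    # After the switch, the timer is reset from a base timestamp with the new
    # frame interval; subsequent software-Vsync wakeups fall at multiples of it.
    return [round(base_ts_ms + i * interval_ms, 1) for i in range(1, n + 1)]

# The first post-switch wakeup lands at the 52.6 ms point named in the text:
assert wakeups_after(41.5, 11.1, 3) == [52.6, 63.7, 74.8]
```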
Illustratively, Figure 12 is a schematic diagram of the interaction among the modules during the frame-rate switching process of the frame-rate switching method provided by an embodiment of this application.
S1. Between 8.3 ms and 16.6 ms, the application main thread sends a frame-switch request message to the compositor thread to request a frame-rate switch.
The frame-switch request message in S1 carries VsyncId=2, frameNumber=4, and a target frame rate of 90 Hz.
With reference to Figure 11, the frame-switch request message in S1 requests the electronic device to switch from the current 120 Hz to 90 Hz.
S2. Between 8.3 ms and 16.6 ms, the compositor thread sends a frame-interval modification notification message to the Vsync thread.
After receiving the frame-switch request message from the application main thread, the compositor thread decides to notify the Vsync thread to modify the frame interval in advance. The frame-interval modification notification message in S2 carries the VsyncId and the target frame rate contained in the frame-switch request message; specifically, it carries VsyncId=2 and a target frame rate of 90 Hz.
After receiving the frame-interval modification notification message, the Vsync thread may, according to a pre-agreed rule, change the frame interval to 11.1 ms in the Vsync message of the second VsyncId after the current one. The pre-agreed rule here may mean that the Vsync thread modifies the frame interval in the Vsync message corresponding to the second VsyncId after the VsyncId carried in the frame-interval modification notification message, so that the modified frame interval corresponds to the target frame rate. This rule can be agreed in advance between the Vsync thread and the compositor thread, so that upon receiving the frame-interval modification notification message the Vsync thread can modify the frame interval in the Vsync messages according to the agreed rule.
S3. Between 24.9 ms and 33.2 ms, the compositor thread sends a frame-rate switch notification message to the hardware compositor and the Vsync thread.
It can be understood that before S3, the compositor thread needs to determine when to send the frame-rate switch notification message to the hardware compositor and the Vsync thread. Since the frame-switch request message in S1 carries frameNumber=4, the compositor thread sends the frame-rate switch notification message to the hardware compositor and the Vsync thread once it determines that the frameNumber of the image frame currently to be composited is also 4, so that the hardware integration unit and the Vsync thread carry out the frame-rate switch.
Frame-rate switching with 2 buffers and with 1 buffer stacked in the buffer queue was described above with reference to Figures 9 to 12. In some cases, the buffer queue may contain no buffer; the rendered image frame then need not pass through the buffer and can be processed directly by the compositing layer.
The case where the buffer queue contains no buffer (0 buffers) is described below with reference to Figure 13.
Figure 13 is a schematic diagram of a frame-rate switching method according to an embodiment of this application.
As shown in Figure 13, the frame rate of the electronic device is likewise to be switched from 120 Hz to 90 Hz, and the frame-rate switching process shown in Figure 13 also includes steps S1 to S3, which are introduced below.
S1. The application main thread sends a frame-switch request message to the compositor thread.
The frame-switch request message may carry the VsyncId of the current frame, the frameNumber, the target frame rate, and the PID.
As shown in Figure 13, the application main thread sends the frame-switch request message to the compositor thread in the 16.6 ms–24.9 ms period; the current frame is frame 4, and the electronic device is to switch the frame rate from 120 Hz to 90 Hz. Therefore, the frame-switch request carries VsyncId=3, frameNumber=4, and a target frame rate of 90 Hz.
S2. The compositor thread sends a frame-interval modification notification message.
Since the electronic device is to switch from 120 Hz to 90 Hz, the frame rate before the switch is 120 Hz, corresponding to a frame interval of 8.3 ms. After receiving the frame-switch request, the compositor thread learns that the target frame rate is 90 Hz, corresponding to a target frame interval of 11.1 ms. The compositor thread therefore needs to notify the Vsync thread to change the frame interval to the target frame interval of 11.1 ms in the Vsync message two after the Vsync message corresponding to the application main thread's frame-switch request.
For example, if the application main thread initiates the frame-switch request after receiving the Vsync message with ID N, the compositor thread needs to notify the Vsync thread to change the frame interval to the target frame interval of 11.1 ms in the Vsync message with ID N+2.
As shown in Figure 13, the compositor thread receives the frame-switch request message during 16.6 ms–24.9 ms. Then, during 24.9 ms–33.2 ms, the application main thread requests the next Vsync message from the Vsync thread, the request carrying the current process identifier. The Vsync thread computes the ID of the next Vsync message to be 5. The Vsync thread internally stores queue data of Vsync messages and determines that the current Vsync message ID of 5 satisfies the N+2 signal requirement; therefore, in the Vsync messages with ID 5 and those of subsequent image frames, the Vsync thread changes the frame interval to 11.1 ms throughout.
Specifically, a timer may be set within the Vsync thread to wake it at 33.2 ms. Once awakened, the Vsync thread sends a Vsync message to the application main thread, carrying Vsync ID 5, a frame interval of 11.1 ms, and a timestamp of 33.2 ms. After receiving the Vsync message from the Vsync thread, the application main thread saves the timestamp in the message and computes the displacement of frame 6 as 11.1 ms × 1 Pixel / 8.3 ms ≈ 1.3 Pixel. It then notifies the render thread, which renders frame 6 according to the 1.3-Pixel displacement; compositing takes place in the period following the completion of rendering.
S3. The compositor thread sends a frame-rate switch notification message.
In step S3, the compositor thread decides to switch the frame rate from 120 Hz to 90 Hz, and sends a frame-rate switch notification message to the hardware compositor and the Vsync thread to carry out the switch.
The specific decision mechanism of the compositor thread is as follows:
After the compositor thread receives the frame-switch request sent by the application main thread, when compositing an image frame it checks whether the frameNumber of the frame to be composited is the same as the frameNumber carried when the application main thread initiated the frame-switch request; if they are the same, it decides to switch the frame rate.
As shown in Figure 13, in the 24.9 ms–33.2 ms period, the frameNumber of the image frame to be composited by the compositor thread is 4, and the frameNumber carried in the frame-switch request initiated by the application main thread is also 4. Therefore, in the 24.9 ms–33.2 ms period the compositor thread decides to switch the frame rate from the original 120 Hz to 90 Hz.
Through the above process, the electronic device completes the frame-rate switch at 52.6 ms.
After the compositor thread sends the frame-rate switch notification message, two periods are needed to complete the switch. Therefore, in Figure 13 the frame-rate switch takes effect at 52.6 ms, at which point the frequencies of the hardware Vsync signal and the software Vsync signal have also switched to 90 Hz.
As shown in Figure 13, after the frame-rate control system decides on the switch between 24.9 ms and 33.2 ms, it notifies the hardware compositor to control the hardware integration unit to switch the frame rate from 120 Hz to 90 Hz; the hardware integration unit completes the switch at the 60.9 ms position.
In addition, the compositor thread notifies the Vsync thread to switch the software period to the new period (corresponding to the new frame rate) during 33.2 ms–52.6 ms, completing the frequency switch. After the frame-rate switch is completed, the timer's firing time is reset based on the 41.5 ms timestamp and the new frame interval, so that the timer subsequently wakes the Vsync thread according to the new firing time (the firing time matching the target frame rate).
Illustratively, Figure 14 is a schematic diagram of the interaction among the modules during the frame-rate switching process of the frame-rate switching method provided by an embodiment of this application.
S1. Between 16.6 ms and 24.9 ms, the application main thread sends a frame-switch request message to the compositor thread to request a frame-rate switch.
The frame-switch request message in S1 carries VsyncId=3, frameNumber=4, and a target frame rate of 90 Hz.
With reference to Figure 13, the frame-switch request message in S1 requests the electronic device to switch from the current 120 Hz to 90 Hz.
S2. Between 16.6 ms and 24.9 ms, the compositor thread sends a frame-interval modification notification message to the Vsync thread.
After receiving the frame-switch request message from the application main thread, the compositor thread decides to notify the Vsync thread to modify the frame interval in advance. The frame-interval modification notification message in S2 carries the VsyncId and the target frame rate contained in the frame-switch request message; specifically, it carries VsyncId=3 and a target frame rate of 90 Hz.
After receiving the frame-interval modification notification message, the Vsync thread may, according to a pre-agreed rule, change the frame interval to 11.1 ms in the Vsync message of the second VsyncId after the current one. The pre-agreed rule here may mean that the Vsync thread modifies the frame interval in the Vsync message corresponding to the second VsyncId after the VsyncId carried in the frame-interval modification notification message, so that the modified frame interval corresponds to the target frame rate. This rule can be agreed in advance between the Vsync thread and the compositor thread, so that upon receiving the frame-interval modification notification message the Vsync thread can modify the frame interval in the Vsync messages according to the agreed rule.
S3. Between 24.9 ms and 33.2 ms, the compositor thread sends a frame-rate switch notification message to the hardware compositor and the Vsync thread.
It can be understood that before S3, the compositor thread needs to determine when to send the frame-rate switch notification message to the hardware compositor and the Vsync thread. Since the frame-switch request message in S1 carries frameNumber=4, the compositor thread sends the frame-rate switch notification message to the hardware compositor and the Vsync thread once it determines that the frameNumber of the image frame currently to be composited is also 4, so that the hardware integration unit and the Vsync thread carry out the frame-rate switch.
It should be understood that the frame-rate switching method of the embodiments of this application has been described in detail above with reference to Figures 9 to 14 for the cases where the number of buffers in the buffer queue is 0, 1, and 2. It can be understood that the embodiments of this application are also applicable to cases where the number of buffers in the buffer queue is greater than 2.
From the above analysis, S1 and S3 in Figure 9 are two periods apart, S1 and S3 in Figure 11 are one period apart, and S1 and S3 in Figure 13 are zero periods apart (the interval containing S1 is adjacent to the interval containing S3). It follows that the more buffers are stacked in the buffer queue, the longer the gap between S1 and S3; the fewer buffers are stacked, the shorter the gap. This is mainly because, when many buffers are stacked in the buffer queue, after the frame-switch request message is issued in S1 the presence of the buffered frames means that some time must elapse before the frame-rate switch notification message can be sent.
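A minimal model of this pattern, assuming one extra period of delay per stacked buffer (the function name is illustrative, not from the implementation):

```python
def periods_between_s1_and_s3(stacked_buffers: int) -> int:
    # Each buffer already stacked in the queue pushes the compositor's switch
    # decision (S3) one period further past the frame-switch request (S1).
    return stacked_buffers

assert periods_between_s1_and_s3(2) == 2  # Figure 9: two buffers, two periods apart
assert periods_between_s1_and_s3(1) == 1  # Figure 11: one buffer, one period apart
assert periods_between_s1_and_s3(0) == 0  # Figure 13: no buffer, adjacent intervals
```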
It can be understood that, in Figures 9, 11, and 13, the order of the arrows corresponding to S1 and S2 does not represent the actual times at which S1 and S2 occur. In fact, in Figures 9, 11, and 13 above, S1 always occurs before S2; that is, only after the application thread requests the frame-rate switch in S1 does the compositor thread send the notification message in S2.
In addition, Figures 9 to 14 above all describe switching the frame rate from 120 Hz to 90 Hz, that is, a non-integer-multiple relationship between the frame rates before and after the switch. In fact, the solution of this application is also applicable when the frame rates before and after the switch are in an integer-multiple relationship; the switching process in that case is the same as in the non-integer-multiple case and is not repeated here.
In a possible implementation, the computer-executable instructions in the embodiments of this application may also be referred to as application program code, which is not specifically limited in the embodiments of this application.
The frame-rate switching apparatus provided by the embodiments of this application is configured to perform the frame-rate switching method of the above embodiments; the technical principles and effects are similar and are not repeated here.
An embodiment of this application provides an electronic device; for its structure, refer to Figure 1. The memory of the electronic device may be configured to store at least one program instruction, and the processor is configured to execute the at least one program instruction to implement the technical solutions of the above method embodiments. The implementation principles and technical effects are similar to those of the above method-related embodiments and are not repeated here.
An embodiment of this application provides a chip. The chip includes a processor configured to invoke a computer program in a memory to execute the technical solutions of the above embodiments. The implementation principles and technical effects are similar to those of the above related embodiments and are not repeated here.
An embodiment of this application provides a computer program product which, when run on an electronic device, causes the electronic device to execute the technical solutions of the above embodiments. The implementation principles and technical effects are similar to those of the above related embodiments and are not repeated here.
An embodiment of this application provides a computer-readable storage medium on which program instructions are stored; when the program instructions are executed by an electronic device, the electronic device is caused to execute the technical solutions of the above embodiments. The implementation principles and technical effects are similar to those of the above related embodiments and are not repeated here.
The embodiments of this application are described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to the embodiments of this application. It should be understood that each process and/or block in the flowcharts and/or block diagrams, and combinations of processes and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, such that the instructions executed by the processing unit of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more processes of the flowcharts and/or one or more blocks of the block diagrams.
The specific implementations above further describe the objectives, technical solutions, and beneficial effects of the present invention in detail. It should be understood that the above are merely specific implementations of the present invention and are not intended to limit its protection scope; any modification, equivalent replacement, improvement, etc. made on the basis of the technical solutions of the present invention shall fall within the protection scope of the present invention.

Claims (15)

  1. A frame-rate switching method, characterized by comprising:
    drawing and rendering, by an application thread in a first period, a first image frame based on a frame interval corresponding to a first frame rate;
    drawing and rendering, by the application thread in a second period, a second image frame based on a frame interval corresponding to a second frame rate, wherein the second period is after the first period, and the second frame rate is different from the first frame rate;
    sending, by a compositor thread in a third period, a first frame-switch request to a hardware compositor, wherein the third period is after the second period, or the third period coincides with the second period; and
    switching, by the hardware compositor based on the first frame-switch request, from the first frame rate to the second frame rate, so that the second image frame is displayed at the second frame rate.
  2. The method according to claim 1, characterized in that the sending, by the compositor thread in the third period, the first frame-switch request to the hardware compositor thread comprises:
    calling, by the compositor thread in the third period, the function performSetActiveMode, to wake the hardware compositor thread to switch the frame rate from the first frame rate to the second frame rate.
  3. The method according to claim 1 or 2, characterized in that the method further comprises:
    sending, by the compositor thread in a fourth period, a second frame-switch request to a Vsync thread, the fourth period being after the second period, or the fourth period coinciding with the second period; and
    sending, by the Vsync thread based on the second frame-switch request, Vsync messages to the application thread at the second frame rate.
  4. The method according to claim 3, characterized in that the sending, by the compositor thread in the fourth period, the second frame-switch request to the Vsync thread comprises:
    calling, by the compositor thread in the fourth period, the setDuration function to set a period parameter corresponding to the second frame rate on the Vsync thread, so that the Vsync thread sends Vsync messages to the application thread at the second frame rate.
  5. The method according to any one of claims 1-4, characterized in that, in a case where the number of buffers stacked in a buffer queue corresponding to the application thread is N, the third period is after the second period, where N is an integer greater than or equal to 1.
  6. The method according to any one of claims 1-4, characterized in that, in a case where the number of buffers stacked in a buffer queue corresponding to the application thread is 0, the third period coincides with the second period.
  7. The method according to any one of claims 1-6, characterized in that, before the application thread draws and renders the second image frame at the second frame rate in the second period, the method further comprises:
    receiving, by the application thread, a first Vsync message sent by the Vsync thread, the first Vsync message carrying the frame interval corresponding to the second frame rate.
  8. The method according to claim 7, characterized in that the receiving, by the application thread, the first Vsync message sent by the Vsync thread comprises:
    receiving, by the application thread at the start of the second period, the first Vsync message sent by the Vsync thread.
  9. The method according to claim 7 or 8, characterized in that, before the application thread receives the first Vsync message sent by the Vsync thread, the method further comprises:
    receiving, by the compositor thread, an initial frame-switch request from the application thread;
    sending, by the compositor thread, a frame-interval modification notification message to the Vsync thread; and
    generating, by the Vsync thread, the first Vsync message based on the frame-interval modification notification message.
  10. The method according to claim 9, characterized in that the receiving, by the compositor thread, the initial frame-switch request from the application thread comprises:
    receiving, by the compositor thread in the first period, the initial frame-switch request from the application thread; and
    the sending, by the compositor thread, the frame-interval modification notification message to the Vsync thread comprises:
    sending, by the compositor thread in the first period, the frame-interval modification notification message to the Vsync thread.
  11. The method according to claim 10, characterized in that, before the compositor thread sends the first frame-switch request to the hardware compositor in the third period, the method further comprises:
    determining, by the compositor thread, the period following the compositing of the first image frame as the third period.
  12. The method according to any one of claims 1-11, characterized in that the interval durations of the first period, the second period, and the third period are all equal to the frame interval corresponding to the first frame rate.
  13. An electronic device, characterized in that the electronic device comprises a processor, the processor being configured to invoke a computer program in a memory to perform the method according to any one of claims 1-12.
  14. A computer-readable storage medium, characterized in that the computer-readable storage medium stores computer instructions which, when run on an electronic device, cause the electronic device to perform the method according to any one of claims 1-12.
  15. A chip, characterized in that the chip comprises a processor, the processor being configured to invoke a computer program in a memory to perform the method according to any one of claims 1-12.
PCT/CN2022/117925 2021-12-29 2022-09-08 帧率切换方法及装置 WO2023124225A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22879630.6A EP4236301A4 (en) 2021-12-29 2022-09-08 METHOD AND DEVICE FOR IMAGE FREQUENCY SWITCHING

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202111650488.9 2021-12-29
CN202111650488 2021-12-29
CN202210191921.5A CN116414337A (zh) 2021-12-29 2022-02-28 帧率切换方法及装置
CN202210191921.5 2022-02-28

Publications (2)

Publication Number Publication Date
WO2023124225A1 true WO2023124225A1 (zh) 2023-07-06
WO2023124225A9 WO2023124225A9 (zh) 2023-11-23

Family

ID=86693023

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/117925 WO2023124225A1 (zh) 2021-12-29 2022-09-08 帧率切换方法及装置

Country Status (2)

Country Link
EP (1) EP4236301A4 (zh)
WO (1) WO2023124225A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020126756A1 (en) * 2001-03-08 2002-09-12 Matsushita Electric Industrial Co., Ltd. Video encoding apparatus, video encoding method, and frame rate conversion apparatus
US20080192060A1 (en) * 2007-02-13 2008-08-14 Sony Computer Entertainment Inc. Image converting apparatus and image converting method
CN106657680A (zh) * 2017-03-10 2017-05-10 广东欧珀移动通信有限公司 一种移动终端帧率的控制方法、装置及移动终端
CN107786748A (zh) * 2017-10-31 2018-03-09 广东欧珀移动通信有限公司 图像显示方法及设备
CN113299251A (zh) * 2020-02-21 2021-08-24 联发科技股份有限公司 一种处理器的控制方法及处理器
CN113630572A (zh) * 2021-07-09 2021-11-09 荣耀终端有限公司 帧率切换方法和相关装置

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020062069A1 (en) * 2018-09-28 2020-04-02 Qualcomm Incorporated Frame composition alignment to target frame rate for janks reduction
KR20210101627A (ko) * 2020-02-10 2021-08-19 삼성전자주식회사 디스플레이를 포함하는 전자 장치와 이의 동작 방법
CN113625860B (zh) * 2021-07-23 2023-01-31 荣耀终端有限公司 模式切换方法、装置、电子设备及芯片系统

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020126756A1 (en) * 2001-03-08 2002-09-12 Matsushita Electric Industrial Co., Ltd. Video encoding apparatus, video encoding method, and frame rate conversion apparatus
US20080192060A1 (en) * 2007-02-13 2008-08-14 Sony Computer Entertainment Inc. Image converting apparatus and image converting method
CN106657680A (zh) * 2017-03-10 2017-05-10 广东欧珀移动通信有限公司 一种移动终端帧率的控制方法、装置及移动终端
CN107786748A (zh) * 2017-10-31 2018-03-09 广东欧珀移动通信有限公司 图像显示方法及设备
CN113299251A (zh) * 2020-02-21 2021-08-24 联发科技股份有限公司 一种处理器的控制方法及处理器
CN113630572A (zh) * 2021-07-09 2021-11-09 荣耀终端有限公司 帧率切换方法和相关装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4236301A4

Also Published As

Publication number Publication date
EP4236301A1 (en) 2023-08-30
WO2023124225A9 (zh) 2023-11-23
EP4236301A4 (en) 2024-02-28

Similar Documents

Publication Publication Date Title
CN109814766B (zh) 一种应用显示方法及电子设备
WO2020253719A1 (zh) 一种录屏方法及电子设备
WO2023142995A1 (zh) 数据处理方法和相关装置
WO2020259452A1 (zh) 一种移动终端的全屏显示方法及设备
CN113254120B (zh) 数据处理方法和相关装置
WO2020093988A1 (zh) 一种图像处理方法及电子设备
CN114579076B (zh) 数据处理方法和相关装置
WO2023000772A1 (zh) 模式切换方法、装置、电子设备及芯片系统
WO2023065873A1 (zh) 帧率调整方法、终端设备及帧率调整系统
CN115048012B (zh) 数据处理方法和相关装置
US11573829B2 (en) Task processing method and apparatus, terminal, and computer readable storage medium
WO2022037726A1 (zh) 分屏显示方法和电子设备
WO2023066395A1 (zh) 一种应用运行方法以及相关设备
CN113986070B (zh) 一种应用卡片的快速查看方法及电子设备
WO2023124225A1 (zh) 帧率切换方法及装置
WO2023124227A1 (zh) 帧率切换方法及装置
CN115686403A (zh) 显示参数的调整方法、电子设备、芯片及可读存储介质
CN116414336A (zh) 帧率切换方法及装置
WO2024066834A9 (zh) Vsync信号的控制方法、电子设备、存储介质及芯片
WO2024066834A1 (zh) Vsync信号的控制方法、电子设备、存储介质及芯片
CN116414337A (zh) 帧率切换方法及装置
CN115904184B (zh) 数据处理方法和相关装置
CN116700578B (zh) 图层合成方法、电子设备以及存储介质
CN116069187B (zh) 一种显示方法及电子设备
WO2023020528A1 (zh) 一种投屏方法、设备、存储介质及计算机程序产品

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2022879630

Country of ref document: EP

Effective date: 20230427