WO2023109184A1 - Refresh rate setting method and related device - Google Patents

Refresh rate setting method and related device

Info

Publication number
WO2023109184A1
Authority
WO
WIPO (PCT)
Prior art keywords
application
barrage
video
frame
interface
Prior art date
Application number
PCT/CN2022/115417
Other languages
English (en)
French (fr)
Inventor
冯晓刚
李飞
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority to EP22879601.7A (published as EP4224872A4)
Publication of WO2023109184A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44218Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4438Window management, e.g. event handling following interaction with the user interface
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Definitions

  • the present application relates to the field of display technology, in particular to a refresh rate setting method and related equipment.
  • the refresh rate is the number of times the screen is refreshed per second.
  • the refresh rate supported by electronic devices such as mobile phones and tablet computers continues to increase.
  • electronic devices usually refresh and display at a fixed high refresh rate, for example at a fixed 60 Hz, 90 Hz or 120 Hz.
  • the present application provides a refresh rate setting method and related equipment, aiming at satisfying the refresh rate requirement during video playback.
  • the embodiment of the present application discloses a method for setting a refresh rate, which is applied to an electronic device, and the electronic device is installed with an application.
  • the method includes:
  • in response to receiving a user operation that starts the application, the application interface of the application is refreshed and displayed at a first refresh rate. In response to receiving a user operation that plays a video, the video picture is displayed at the first refresh rate. While the video picture is displayed, the device determines whether barrage (bullet-screen comment) drawing occurs within a preset duration. If barrage drawing occurs within the preset duration, the barrage data of each displayed video frame is counted, and the application interface is then refreshed and displayed at a second refresh rate determined from that per-frame barrage data. If no barrage drawing occurs within the preset duration, the application interface is refreshed and displayed at a third refresh rate, which differs from the second refresh rate.
  • while the video picture is displayed, the device determines in real time whether barrage drawing occurs within the preset duration. If barrage drawing occurs within the preset duration, the application can be determined to have entered a barrage scene.
  • the barrage data of each displayed frame is counted, and the application interface is then refreshed and displayed at a second refresh rate derived from that per-frame barrage data, so that the interface is refreshed at a rate that meets the display requirements of the barrage scene.
  • if no barrage drawing occurs within the preset duration, the application can be considered to have entered a plain video playback scene, and the application interface is refreshed and displayed at the third refresh rate to meet the display requirements of that scene.
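The decision flow above can be sketched as follows. This is a minimal illustration, not the publication's implementation; the function name and the example rate values are assumptions:

```python
def choose_refresh_rate(barrage_drawn_within_preset: bool,
                        barrage_gear_rate: int,
                        video_source_fps: int) -> int:
    """Pick the refresh rate for the video playback interface.

    If barrage drawing occurred within the preset duration, the app is in a
    barrage scene and uses the second refresh rate (derived from per-frame
    barrage data). Otherwise it is in a plain playback scene and uses the
    third refresh rate (here assumed to track the video source frame rate).
    """
    if barrage_drawn_within_preset:
        return barrage_gear_rate   # second refresh rate
    return video_source_fps        # third refresh rate

# Barrage scene: the density-derived rate wins.
assert choose_refresh_rate(True, 90, 24) == 90
# Plain playback: follow the 24 fps movie source.
assert choose_refresh_rate(False, 90, 24) == 24
```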
  • refreshing and displaying the application interface of the application at the second refresh rate according to the barrage data of each displayed frame includes:
  • determining the second refresh rate from the barrage data of each displayed frame, and refreshing and displaying the application interface of the application at the determined second refresh rate.
  • the second refresh rate is determined according to the barrage data of each frame of the display screen, including:
  • the barrage gear is used to indicate the barrage density level of the displayed picture.
  • the second refresh rate is the preset refresh rate corresponding to the current barrage gear; the higher the density level indicated by the barrage gear, the higher the determined second refresh rate.
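A gear-to-rate mapping consistent with this description might look like the following sketch; the concrete rate values are illustrative assumptions, not values from the publication:

```python
# Hypothetical preset refresh rates per barrage gear; a denser gear
# maps to a higher rate, as the description requires.
GEAR_TO_RATE = {"low": 60, "high": 90}

def second_refresh_rate(gear: str) -> int:
    """Return the preset refresh rate for the current barrage gear."""
    return GEAR_TO_RATE[gear]

# Higher density level -> higher determined second refresh rate.
assert second_refresh_rate("high") > second_refresh_rate("low")
```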
  • the barrage data of one displayed frame includes: the number of barrage entries drawn in the frame, and/or the number of barrage characters drawn in the frame.
  • the current barrage gear is determined according to the currently counted barrage data of each frame of the display screen, including:
  • based on the number of barrage entries drawn in each displayed frame, the timestamp corresponding to each frame, and the barrage aging time, the total number of barrage entries shown in the current frame is calculated. The current barrage gear is then determined from the number of barrage entries drawn in the current frame and the total number of entries shown in the current frame.
  • the barrage gear includes: a high barrage gear or a low barrage gear. Determining the current barrage gear from the number of barrage entries drawn in the current frame and the total number of entries shown in the current frame includes:
  • if the number of barrage entries drawn in the current frame and the total number of entries shown in the current frame both fall below the corresponding preset values, the current barrage gear is determined to be the low barrage gear. If the number of entries drawn in the current frame is greater than or equal to the second preset value, or the total number of entries shown in the current frame is greater than or equal to the second preset value, the current barrage gear is determined to be the high barrage gear.
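The per-frame bookkeeping above can be sketched as follows. The sketch assumes each barrage entry stays visible for a fixed aging time; the threshold value and timestamps are made-up illustration values:

```python
def total_on_screen(frames, now, aging_time):
    """Total barrage entries still visible at time `now`.

    `frames` is a list of (timestamp, entries_drawn) pairs, one per displayed
    frame; an entry drawn at time t is assumed visible while now - t < aging_time.
    """
    return sum(n for t, n in frames if now - t < aging_time)

def barrage_gear(drawn_now, total_now, threshold):
    """High gear once either count reaches the preset threshold."""
    if drawn_now >= threshold or total_now >= threshold:
        return "high"
    return "low"

frames = [(0.0, 2), (0.5, 3), (4.8, 1)]   # (timestamp in s, entries drawn)
total = total_on_screen(frames, now=5.0, aging_time=5.0)
assert total == 4   # entries from t=0.5 and t=4.8 are still on screen
assert barrage_gear(drawn_now=1, total_now=total, threshold=10) == "low"
assert barrage_gear(drawn_now=1, total_now=12, threshold=10) == "high"
```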
  • counting the barrage data of each video frame includes:
  • counting the barrage data of a specific area of each displayed video frame.
  • before determining whether barrage drawing occurs within the preset duration, the method further includes:
  • determining, according to the layer information of the application interface, whether the current scene is a video playback scene.
  • the video playing scene is a scene of playing a video.
  • the determination of whether to perform barrage drawing within the preset time period includes: if it is determined to be a video playback scene, determining whether to perform barrage drawing within the preset time period.
  • before determining whether barrage drawing occurs within the preset duration, the method further includes:
  • determining whether it is a video playback scene includes: if it is determined that the application is in the playback whitelist, then according to the layer information of the application interface, determining whether it is a video playback scene.
  • determining whether the application is in the playback whitelist includes:
  • the playback whitelist includes: the package name of each app with video playback permission.
  • determining whether the layer information of the application interface carries the feature information of a video layer; if it does, the scene is determined to be a video playback scene, and if it does not, the scene is determined not to be a video playback scene.
  • the feature information of the video layer is: a SurfaceView field carried in the layer name of the video layer.
  • the third refresh rate is determined according to a video source frame rate of the video.
  • the method further includes: in response to receiving an operation of quitting video playback by the user, refreshing and displaying the application interface of the application at a first refresh rate.
  • the operating system of the electronic device includes: an application and an image synthesizer SurfaceFlinger; determining whether to perform barrage drawing within a preset duration includes:
  • the barrage drawing information of one frame includes: the number of barrage entries drawn in the frame, the timestamp corresponding to the frame, and the barrage data drawn and rendered in the frame.
  • the operating system of the electronic device further includes: a frame rate decision module.
  • the barrage data of one displayed frame includes: the number of barrage entries drawn in the frame. If barrage drawing occurs within the preset duration, counting the barrage data of each video frame includes:
  • if SurfaceFlinger receives the barrage drawing information of a frame within the preset duration, it stores the number of barrage entries drawn in the frame together with the frame's timestamp.
  • the operating system of the electronic device further includes: a frame rate decision module. Refreshing and displaying the application interface at the second refresh rate according to the barrage data of each displayed frame includes:
  • SurfaceFlinger determines the current barrage gear based on the stored number of barrage entries drawn in each displayed frame, the timestamp corresponding to each frame, and the barrage aging time.
  • SurfaceFlinger sends the current barrage gear information to the frame rate decision module.
  • the barrage gear information includes: the current barrage gear.
  • the frame rate decision module determines the second refresh rate according to the barrage gear information.
  • the frame rate decision module sends the determined second refresh rate to SurfaceFlinger.
  • SurfaceFlinger controls the display screen of the electronic device to refresh and display the application interface of the application at the second refresh rate.
  • SurfaceFlinger sending the current barrage gear information to the frame rate decision module includes:
  • the operating system of the electronic device further includes: a drawing-and-rendering module, which contains a barrage counter. Before SurfaceFlinger receives the barrage drawing information of a frame within the preset duration, the method further includes:
  • the drawing-and-rendering module determines whether barrage statistics are enabled. If they are, it controls the barrage counter to count the number of barrage entries drawn, and sends the barrage drawing information of the current frame to SurfaceFlinger at the pace of the Vsync signal.
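One way to picture the counter's role is the sketch below. It is a simplified illustration; the class and method names are invented for this sketch, not interfaces from the publication:

```python
class BarrageCounter:
    """Counts barrage entries drawn in the current frame; flushed each Vsync."""

    def __init__(self):
        self.count = 0

    def on_entry_drawn(self):
        """Called by the drawing-and-rendering module per drawn barrage entry."""
        self.count += 1

    def flush(self, timestamp):
        """Called at the Vsync pace: emit this frame's drawing info and reset."""
        info = {"entries": self.count, "timestamp": timestamp}
        self.count = 0
        return info

counter = BarrageCounter()
for _ in range(3):
    counter.on_entry_drawn()
info = counter.flush(timestamp=16.7)
assert info == {"entries": 3, "timestamp": 16.7}
assert counter.count == 0  # counter reset for the next frame
```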
  • the operating system of the electronic device further includes: a frame rate decision module. If the barrage drawing is not performed within the preset duration, the application interface of the application is refreshed and displayed according to the third refresh rate, including:
  • the frame rate decision module sends the determined third refresh rate to SurfaceFlinger, and SurfaceFlinger controls the display screen of the electronic device to refresh and display the application interface of the application at the third refresh rate.
  • the operating system of the electronic device further includes: a codec, MediaCodec. After the user's operation of playing the video is received, the method further includes:
  • the application calls MediaCodec to decode the video, and MediaCodec sends the video source frame rate of the video to the frame rate decision module.
  • the operating system of the electronic device includes: SurfaceFlinger. Determining, according to the layer information of the application interface, whether the scene is a video playback scene includes:
  • SurfaceFlinger determines whether the layer name in the layer information of the application interface carries a SurfaceView field. If it does, SurfaceFlinger determines that the scene is a video playback scene; if it does not, SurfaceFlinger determines that the scene is not a video playback scene.
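The layer-name check reduces to a substring test, sketched below. The example layer names are made up for illustration, though `SurfaceView` is the actual Android layer-name component the description relies on:

```python
def is_video_playback_scene(layer_name: str) -> bool:
    """A layer whose name carries the SurfaceView field indicates video playback."""
    return "SurfaceView" in layer_name

# Hypothetical layer names, in the style SurfaceFlinger assigns.
assert is_video_playback_scene("SurfaceView[com.example.video/PlayerActivity]")
assert not is_video_playback_scene("com.example.video/MainActivity#0")
```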
  • the operating system of the electronic device includes: an application, SurfaceFlinger, and a drawing-and-rendering module.
  • displaying the video picture at the first refresh rate includes:
  • the video playback interface is an application interface for displaying video images.
  • the application calls the drawing and rendering module to draw and render the layers of the video playback interface according to the rhythm of the Vsync signal corresponding to the first refresh rate.
  • the drawing-and-rendering module sends the drawn and rendered image data of the video playback interface to SurfaceFlinger at the pace of the Vsync signal corresponding to the first refresh rate.
  • SurfaceFlinger performs layer composition with the drawn and rendered image data of the video playback interface.
  • SurfaceFlinger outputs the composited image data of the video playback interface to the display screen at the pace of the Vsync signal corresponding to the first refresh rate.
  • the operating system of the electronic device further includes: an activity manager service AMS and a window management module WMS.
  • the application sends a request to start the video playback activity to AMS; the request to start the video playback activity carries the package name of the application and the name of the video playback interface.
  • AMS starts the video playback activity according to the application package name and video playback interface name.
  • the AMS sends the window information corresponding to the video playback interface to the WMS.
  • the WMS creates a window of the video playback interface according to the window information corresponding to the video playback interface.
  • WMS sends the layer information of the video playback interface to SurfaceFlinger.
  • the layer information carries the package name of the application.
  • the layer information of the video playback interface corresponds to the window of the video playback interface.
  • SurfaceFlinger creates the layers of the video playback interface according to the layer information of the video playback interface.
  • the operating system of the electronic device includes: an application, SurfaceFlinger, and MediaCodec. Determining whether the video is being decoded includes:
  • the application calls MediaCodec to decode the video, and MediaCodec sends the invocation information to SurfaceFlinger.
  • before the application calls MediaCodec to decode the video, the method further includes:
  • the WMS sends the information that the window of the video playback interface is created to the application.
  • determining whether the application is in the playback whitelist includes:
  • SurfaceFlinger determines whether the application is in the playback whitelist according to the package name of the application carried in the layer information of the application interface; the playback whitelist includes: the package name of each application with video playback permission.
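The whitelist check is a package-name lookup, sketched below with hypothetical package names (the publication does not list concrete entries):

```python
# Hypothetical playback whitelist: package names of apps with video playback permission.
PLAYBACK_WHITELIST = {"com.example.video", "com.example.player"}

def in_playback_whitelist(layer_info: dict) -> bool:
    """SurfaceFlinger reads the package name carried in the layer information."""
    return layer_info.get("package") in PLAYBACK_WHITELIST

assert in_playback_whitelist({"package": "com.example.video"})
assert not in_playback_whitelist({"package": "com.example.game"})
```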
  • the operating system of the electronic device includes: an application, SurfaceFlinger, a frame rate decision module, and a drawing-and-rendering module.
  • refreshing and displaying the application interface of the application at the first refresh rate includes:
  • SurfaceFlinger triggers the application to draw and render the layers of the application's main interface at the pace of the Vsync signal corresponding to the first refresh rate.
  • the main interface of the application is the application interface displayed after the application is started.
  • the application calls the drawing and rendering module to draw and render the layers of the main interface of the application.
  • the drawing-and-rendering module sends the drawn and rendered image data of the application's main interface to SurfaceFlinger at the pace of the Vsync signal corresponding to the first refresh rate.
  • SurfaceFlinger performs layer composition with the drawn and rendered image data of the application's main interface.
  • SurfaceFlinger outputs the composited image data of the application's main interface to the display screen at the pace of the Vsync signal corresponding to the first refresh rate.
  • the operating system of the electronic device further includes: a desktop launcher Launcher, an AMS, and a WMS.
  • before SurfaceFlinger triggers the application to draw and render the layers of the application's main interface at the pace of the Vsync signal corresponding to the first refresh rate, the method further includes:
  • the Launcher sends a request to start the application activity to AMS; the request to start the application activity carries the package name of the application.
  • the AMS starts the application Activity, and the AMS sends the window information corresponding to the main interface of the application to the WMS, and the window information carries the package name of the application.
  • the WMS creates a window of the application main interface according to the window information corresponding to the application main interface, and sends the package name of the application to the frame rate decision module.
  • WMS sends the layer information of the application's main interface to SurfaceFlinger; the layer information of the application's main interface corresponds to the window of the application's main interface.
  • the frame rate decision module determines the first refresh rate according to the package name of the application, and the first refresh rate is a preset refresh rate corresponding to the application.
  • SurfaceFlinger creates the layers of the application's main interface according to the layer information of the application's main interface.
  • the present application discloses an electronic device, including: one or more processors, memory and display screen.
  • the memory and the display screen are respectively coupled with one or more processors.
  • the display screen is used to display the application interface.
  • the memory is used to store computer program codes, and the computer program codes include computer instructions.
  • the electronic device executes the refresh rate setting method according to any one of the above first aspects.
  • FIG. 1 is a first schematic diagram of a video playback scene of a video application proposed in an embodiment of the present application
  • FIG. 2 is a hardware structural diagram of an electronic device proposed in an embodiment of the present application.
  • FIG. 3 is a system diagram of an electronic device proposed in an embodiment of the present application.
  • FIG. 4a is a flow chart of the method for setting the refresh rate proposed in the embodiment of the present application during the application startup phase
  • FIG. 4b is a flow chart of the method for setting the refresh rate proposed in the embodiment of the present application in the video playback stage;
  • FIG. 4c is a second schematic diagram of a video playback scene of a video application proposed in an embodiment of the present application.
  • Figure 4d is a third schematic diagram of a video playback scene of the video application proposed in the embodiment of the present application.
  • Fig. 4e is a flow chart of another refresh rate setting method proposed in the embodiment of the present application in the video playing stage.
  • one or more refers to one, two or more than two; "and/or” describes the association relationship of associated objects, indicating that there may be three types of relationships; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, and B exists alone, wherein A and B may be singular or plural.
  • the character "/" generally indicates that the contextual objects are an "or" relationship.
  • references to "one embodiment” or “some embodiments” or the like in this specification means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," etc. in various places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments" unless specifically stated otherwise.
  • the terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless specifically stated otherwise.
  • a plurality referred to in the embodiment of the present application means greater than or equal to two. It should be noted that in the description of the embodiments of the present application, words such as “first” and “second” are only used to distinguish the purpose of description, and cannot be understood as indicating or implying relative importance, nor can they be understood as indicating or imply order.
  • each application of the mobile phone can be set with a preset refresh rate corresponding to the application, and the preset refresh rates corresponding to different applications can be the same or different.
  • the mobile phone may refresh the screen at a preset refresh rate corresponding to the application during the running of the application.
  • the main interface of the mobile phone displays clock, calendar, gallery, memo, file management, email, music, calculator, video, sports health, weather and browser application.
  • playable videos such as video 1 , video 2 , video 3 , video 4 , and video 5 are displayed on the main interface of the application.
  • the application has been started, and the mobile phone refreshes at the preset refresh rate of 60 Hz corresponding to the application.
  • (3) of Figure 1 shows the playback interface of Video 1; the barrage switch of Video 1 is off at this time, and during the playback of Video 1 shown in (3) of Figure 1, the video picture is still refreshed at a rate of 60 Hz.
  • when the barrage switch is turned on, scrolling barrage appears in the video picture, and the video picture is still refreshed and displayed at 60 Hz.
  • the preset refresh rate corresponding to the application cannot meet the refresh rate requirement of the video playback process.
  • for example, if Video 1 played in (3) of Figure 1 is a movie, a refresh rate of only 24 Hz is required for a smooth display picture; refreshing the video picture at 60 Hz places an excessive load on the processor and causes high power consumption, which does not meet the low-power requirements of mobile phones.
  • that is, when the preset refresh rate corresponding to the application is higher than the refresh rate actually required in the video playback scene, the processor is easily overloaded and power consumption is high, which does not meet the low-power requirements of mobile phones.
  • moreover, the preset refresh rate corresponding to the application is fixed, while the refresh rate demanded by the video playback scene changes dynamically, so the fixed preset refresh rate cannot adapt to the different refresh rate needs of the video playback scene.
  • the embodiment of the present application proposes a refresh rate setting method to meet the refresh rate requirements of applications in different video playback scenarios.
  • the refresh rate setting method proposed in the embodiment of the present application can be applied to mobile phones, tablet computers, notebook computers, ultra-mobile personal computers (UMPC), handheld computers, netbooks, personal digital assistants (PDA), wearable electronic devices, smart watches, and other electronic devices in which video applications are installed.
  • the video applications mentioned in the embodiments of the present application all refer to applications with a function of playing videos.
  • the electronic device proposed in the embodiment of the present application may include a processor 110, an external memory interface 120, an internal memory 121, a charging management module 130, a power management module 131, a battery 132, an antenna 1, an antenna 2, a mobile communication module 140, a wireless communication module 150, a sensor module 170, a display screen 180, and so on.
  • the sensor module 170 may include a fingerprint sensor 170A, a touch sensor 170B and so on.
  • the structure shown in this embodiment does not constitute a specific limitation on the electronic device.
  • the electronic device may include more or fewer components than shown, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a video codec, and so on. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the processor 110 can be used to execute any refresh rate setting method proposed in the embodiments of the present application, including the image and barrage processing these methods involve. For details, refer to the relevant content of FIG. 4a, FIG. 4b, and FIG. 4e below, which is not repeated here.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • processor 110 may include one or more interfaces.
  • the interface may include a mobile industry processor interface (mobile industry processor interface, MIPI) and the like.
  • the MIPI interface can be used to connect peripheral devices such as the processor 110 and the display screen 180 .
  • the MIPI interface includes a display serial interface (display serial interface, DSI) and the like.
  • the processor 110 communicates with the display screen 180 through a DSI interface to realize the display function of the electronic device.
  • the processor 110 communicates with the display screen 180 through the DSI interface to display on the display screen 180 the images and barrages involved in any refresh rate setting method proposed in the embodiment of the present application.
  • the interface connection relationship among the modules shown in this embodiment is only a schematic illustration, and does not constitute a structural limitation of the electronic device.
  • the electronic device may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 130 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the power management module 131 is used for connecting the battery 132 , the charging management module 130 and the processor 110 .
  • the power management module 131 receives the input of the battery 132 and/or the charging management module 130 to provide power for the processor 110 , the internal memory 121 , the display screen 180 and so on.
  • the wireless communication function of the electronic device can be realized by the antenna 1, the antenna 2, the mobile communication module 140, the wireless communication module 150, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • the mobile communication module 140 can provide wireless communication solutions including 2G/3G/4G/5G applied to electronic devices.
  • the wireless communication module 150 can provide wireless communication solutions applied to the electronic device, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), and the like.
  • the antenna 1 of the electronic device is coupled to the mobile communication module 140, and the antenna 2 is coupled to the wireless communication module 150, so that the electronic device can communicate with the network and other devices through wireless communication technology.
  • the electronic device realizes the display function through the GPU, the display screen 180, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 180 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the GPU may be used to display on the display screen 180 the images and barrage involved in any refresh rate setting method proposed in the embodiment of the present application. Specifically, reference may be made to the relevant content in the following parts of FIG. 4a , FIG. 4b , and FIG. 4e , which will not be repeated here.
  • the display screen 180 is used to display images, videos and the like.
  • the display screen 180 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), etc.
  • the electronic device may include 1 or N display screens 180 , where N is a positive integer greater than 1.
  • the display screen 180 may be used to display the images and barrages involved in any refresh rate setting method proposed in the embodiment of the present application. For details, please refer to the relevant content in FIG. 4a, FIG. 4b and FIG. 4e, which will not be repeated here.
  • Video codecs are used to compress or decompress digital video.
  • An electronic device may support one or more video codecs.
  • the electronic device can play or record video in multiple encoding formats, for example: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • the video codec may be used to decode the video played by the video application involved in any refresh rate setting method proposed in the embodiment of the present application. Specifically, reference may be made to S4206 in FIG. 4b and related content in S4306 in FIG. 4e , which will not be repeated here.
  • the video codec can also be software, for example, it can be a codec (Media Codec) shown in FIG. 3 below.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device by executing instructions stored in the internal memory 121 .
  • the processor 110 may execute any refresh rate setting method mentioned in the embodiments of the present application by executing instructions stored in the internal memory 121 .
  • the audio module 160 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the speaker 160A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the fingerprint sensor 170A is used to collect fingerprints. Electronic devices can use the collected fingerprint features to unlock fingerprints, access application locks, take pictures with fingerprints, answer incoming calls with fingerprints, etc.
  • the touch sensor 170B is also called “touch device”.
  • the touch sensor 170B can be disposed on the display screen 180, and the touch sensor 170B and the display screen 180 form a touch screen, also called a "touchscreen".
  • an operating system runs on top of the above components.
  • Applications can be installed and run on this operating system.
  • Fig. 3 is a structural block diagram of an electronic device according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime (Android runtime) and system libraries, and the kernel layer.
  • the application layer may include a desktop launcher (Launcher) and a video application.
  • the Launcher is used to start the application Activity.
  • the Launcher sends a request to start the application Activity to the activity manager service module (Activity Manager Service, AMS) in response to the user's operation of clicking the icon of the video application.
  • Video applications are used to provide users with video playback functions.
  • the video application responds to the user's operation of clicking the video play, calls the codec (Media Codec) to decode the video, and sends a request to start the video play activity to the AMS.
  • the drawing and rendering thread of the drawing and rendering module is called to draw and render image data and/or barrage data, and the rendered data is sent to Surface Flinger for synthesis.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions. As shown in Figure 3, the application framework layer can include Media Codec, AMS, window management module (Window Manager Service, WMS), and drawing and rendering module.
  • Media Codec is used to process audio and video codec.
  • the Media Codec decodes the video to be played by the video application, and sends the frame rate of the video source and the invoked information to the frame rate decision module. For details, refer to the related content of step S4207 shown in FIG. 4b and step S4307 shown in FIG. 4e, which will not be repeated here.
  • AMS is used to manage the application process and the activity of the process.
  • the AMS responds to the start-application-Activity request sent by the Launcher, starts the application Activity, and then sends the window creation information to the WMS. For details, refer to the relevant content of steps S4102 to S4103, which will not be repeated here.
  • the AMS responds to the video playback activity request sent by the video application, starts the video playback activity, and sends the window creation information to the WMS. For details, refer to steps S4202 to S4203 in FIG. 4b and the related content of S4302 to S4303 in FIG. 4e, which will not be repeated here.
  • WMS is used to manage windows, such as being responsible for starting, adding, and deleting windows.
  • the WMS is used to create the window of the video application main interface according to the window information corresponding to the video application main interface sent by the AMS.
  • the WMS can also create a window of the video playback interface according to the window information of the video playback interface. For details, please refer to step S4204 in FIG. 4b and the related content of S4304 in FIG. 4e.
  • the WMS is used to deliver the package name of the video application to the frame rate decision module. For details, refer to the relevant content of step S4107 shown in FIG. 4a, which will not be repeated here.
  • the WMS is used to deliver the layer information of the main interface of the video application, or the layer information of the video playback interface, to Surface Flinger. For details, refer to step S4105 shown in FIG. 4a and step S4205 shown in FIG. 4b, which will not be repeated here.
  • the drawing and rendering module is used to start the drawing and rendering thread to perform drawing and rendering, and to obtain rendered image data and/or barrage data.
  • the video application calls the drawing and rendering thread in the drawing and rendering module, so that the drawing and rendering module draws the layers of the main interface of the video application, the layers of the video playback interface and/or the barrage. For details, refer to the relevant content of steps S4111 to S4112 shown in FIG. 4a, step S4214 shown in FIG. 4b, and S4316 shown in FIG. 4e, which will not be repeated here.
  • the drawing and rendering module also includes a barrage counter.
  • when receiving the information to enable barrage statistics sent by Surface Flinger, the barrage counter starts barrage statistics, counts the number of barrage entries, and sends the barrage drawing information to Surface Flinger, where the barrage drawing information carries the rendered barrage data, the number of barrage entries, and a timestamp. For details, refer to steps S4214 to S4218 shown in FIG. 4b and the related content of S4316 to S4319 shown in FIG. 4e, which will not be repeated here.
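The barrage counter described above can be sketched as follows. This is a minimal illustration, not the framework's actual implementation: the class name, method names, and the string format of the drawing information are all assumptions for illustration.

```java
// Hypothetical sketch of a barrage counter: it starts counting when
// Surface Flinger enables barrage statistics, counts entries drawn in the
// current frame, and packages the count together with a timestamp.
public class BarrageCounter {
    private boolean enabled = false;
    private int entryCount = 0;

    // Called when Surface Flinger sends "enable barrage statistics".
    public void enableStatistics() {
        enabled = true;
        entryCount = 0;
    }

    // Called once per barrage entry drawn in the current frame.
    public void onBarrageDrawn() {
        if (enabled) {
            entryCount++;
        }
    }

    // Builds the per-frame metadata sent to Surface Flinger: the entry
    // count and a timestamp (the rendered barrage data itself is omitted).
    public String buildDrawingInfo(long timestampMs) {
        int count = entryCount;
        entryCount = 0; // reset for the next frame
        return "entries=" + count + ";ts=" + timestampMs;
    }
}
```

In this sketch the counter resets after each frame's drawing information is built, so each report reflects only the entries drawn since the previous report.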
  • the Android runtime includes a core library and a virtual machine.
  • the Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules.
  • the system library includes: frame rate decision module and Surface Flinger.
  • the frame rate decision module is used to send a target refresh rate to Surface Flinger, so that Surface Flinger controls the synthesis of display data according to the target refresh rate.
  • the frame rate decision module is used to send the invoked information to Surface Flinger.
  • the frame rate decision module also receives scene information sent by Surface Flinger, determines a target refresh rate according to the scene information, and sends the target refresh rate to Surface Flinger.
  • the scene information includes: video scene information or barrage gear information.
  • for details, please refer to the related content of steps S4224 to S4225 and S4229 to S4230 shown in FIG. 4b, and S4326 to S4327 and S4331 to S4332 in FIG. 4e, which will not be repeated here.
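The mapping from scene information to a target refresh rate could look like the sketch below. The scene strings and the rate chosen for each scene are illustrative assumptions only; the application does not specify concrete values here.

```java
// A minimal sketch of how a frame rate decision module might map scene
// information (video scene or barrage gear) to a target refresh rate.
public class FrameRateDecision {

    public static int targetRefreshRate(String sceneInfo) {
        switch (sceneInfo) {
            case "video-playback":  return 30;  // plain video: a low rate saves power
            case "barrage-low":     return 60;  // sparse barrage
            case "barrage-medium":  return 90;  // moderate barrage density
            case "barrage-high":    return 120; // dense barrage: smooth scrolling
            default:                return 60;  // unknown scene: default rate
        }
    }
}
```

The design intent sketched here matches the text: barrage-free playback can use a low refresh rate, while denser barrage gears raise the rate so scrolling comments stay smooth.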
  • Surface Flinger is used to create a layer of the video application main interface or a layer of the video playback interface. For details, refer to step S4106 shown in FIG. 4a, step S4209 shown in FIG. 4b, and the related content of S4310 shown in FIG. 4e, which will not be repeated here.
  • Surface Flinger is used to trigger the video application to perform drawing and rendering. For details, please refer to the relevant content of step S4110 shown in FIG. 4a, which will not be repeated here.
  • Surface Flinger is also used to receive the layer information sent by the WMS and the package name sent by Media Codec, and to determine whether the current scene is a video playback scene according to the layer characteristics in the layer information and the package name. If it is a video playback scene, Surface Flinger sends the information to enable barrage statistics to the drawing and rendering module. For details, refer to steps S4206 to S4211 shown in FIG. 4b and the related content of S4309 to S4312 in FIG. 4e, which will not be repeated here.
  • Surface Flinger also receives the barrage drawing information sent by the drawing and rendering module, which carries the rendered barrage data, the number of barrage entries, and the timestamp, determines the barrage gear in the current barrage scene according to the barrage drawing information, and then sends the barrage scene information with the barrage gear to the frame rate decision module.
  • if the barrage drawing information is not received within the preset duration, the video playback scene information is sent to the frame rate decision module. For details, refer to steps S4219 to S4223 shown in FIG. 4b and the related content of S4321 to S4325 in FIG. 4e, which will not be repeated here.
  • Surface Flinger also receives and caches the rendered image data and barrage data sent by the video application, and cooperates with the hardware composer (Hardware Composer, HWC) to perform layer composition.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least HWC, and the HWC is used to combine and display image data by using hardware.
  • the HWC is used to receive the layers and the target refresh rate sent by Surface Flinger, process the layers according to the target refresh rate, and send them to the LCD at the hardware layer for display. For details, refer to the relevant content of S4113 in FIG. 4a, steps S4226 and S4231 shown in FIG. 4b, and S4328 and S4333 in FIG. 4e, which will not be repeated here.
  • Figure 4a is a schematic diagram of the start-up phase flow of any video application in the refresh rate setting method proposed by the embodiment of the present application, which is applied to the electronic device proposed by the embodiment of the present application, and specifically includes the following steps:
  • the Launcher sends a request for starting the application Activity to the AMS in response to the user's operation of starting the video application.
  • the Launcher receives the user's operation of starting the video application, and then sends a request for starting the application Activity to the AMS in response to the user's operation of starting the video application.
  • the user's operation of starting the video application may be the user's operation of clicking the video application icon, the user's voice operation of starting the video application, the user's operation of unlocking and automatically starting the video application, and the like.
  • the user clicks the icon of the video application so that the Launcher in the electronic device sends a request to start the application Activity to the AMS in response to the user's operation of clicking the video application icon.
  • Activity is an application component responsible for interacting with users, providing a screen or interface that users can use to interact in order to complete a certain task.
  • the application activity is mainly used to complete the task of starting the application and provide the user with the main interface of the application.
  • the request for starting the application Activity is used to request to start the video application Activity, so as to start the video application and display the main interface of the video application to the user.
  • the request to start the application Activity carries the package name of the video application; in other embodiments, it may also carry another identifier unique to the video application, such as the path of the video application. In some other embodiments, the request for starting the application Activity may carry other information besides the identifier unique to the video application.
  • a video application refers to an application with a video playback function, which may specifically be a browser, an application of a video platform, an application of a music platform capable of playing short music videos, and the like.
  • a video playback function which may specifically be a browser, an application of a video platform, an application of a music platform capable of playing short music videos, and the like.
  • for the execution process and principle of step S4101, reference may be made to the technical principles related to Activity in the application startup process of the Android system.
  • the AMS starts the application Activity.
  • the process of executing step S4102 is as follows: the AMS can respond to the request for starting the application Activity, create an application Activity according to the package name of the video application carried in the request, and start the created application Activity.
  • for the execution process and principle of the AMS starting the application Activity, reference may be made to the related technical principles of starting an application Activity by the AMS in the Android system.
  • the AMS sends the window information corresponding to the video application main interface to the WMS.
  • after the AMS starts the application Activity, it needs to complete the task of starting the video application and provide the user with the main interface of the video application, so that the user can interact with the controls on the main interface of the video application.
  • the window information corresponding to the main interface of the video application may be obtained from the installation information of the video application.
  • the AMS may determine the installation information of the video application according to the package name of the video application, and then obtain the window information corresponding to the main interface of the video application from the installation information of the video application.
  • the window information corresponding to the main interface of the video application may be determined through multiple interactions between the AMS and the video application.
  • the window information corresponding to the main interface of the video application can be used to create the window corresponding to the main interface of the video application.
  • the window information corresponding to the main interface of the video application can carry the package name of the video application. Specifically, it can include: window attribute information of the main interface of the video application, window hierarchy information, etc.
  • the window attribute information may be window name, window position, window size, etc.
  • the window level information corresponding to the video application main interface is used to indicate that the window corresponding to the video application main interface is placed at the top of the display interface.
  • after the AMS determines the window information corresponding to the video application main interface, it can send the window information to the WMS, so that the WMS can create a window corresponding to the video application main interface.
  • the execution process of step S4103 may be that the AMS calls the WMS, and sends the window information corresponding to the main interface of the video application to the WMS.
  • the AMS invokes the WMS thread, and applies to the WMS for creating each window of the video application main interface by issuing window information corresponding to the video application main interface.
  • the WMS creates a window of the video application main interface according to the window information corresponding to the video application main interface.
  • according to the window information corresponding to the main interface of the video application, the WMS can determine the size, position, level, etc. of the window of the main interface of the video application to be created, and then create the window.
  • steps S4103 to S4104 describe the process of creating a window through interaction between the AMS and the WMS. For the process and technical principles, reference may be made to how the AMS and the WMS interact to create windows in operating systems such as the Android system.
  • the WMS sends the layer information corresponding to the window of the video application main interface to Surface Flinger.
  • the layer information corresponding to the window of the main interface of the video application can be used to create the layer corresponding to that window, so that the layer can be displayed in the window area created in step S4104.
  • the layer information corresponding to the window of the main interface of the video application carries the package name of the video application, which may specifically include: the window name of the main interface of the video application, the layer name corresponding to the window of the main interface of the video application, and the window corresponding to the main interface of the video application The size of the layer, and the layer position corresponding to the window of the main interface of the video application, etc.
  • the package name of the video application may be carried in the layer name.
  • the layer information corresponding to the window of the main interface of the video application may be layer information including multiple layers, which depends on the number of layers to be created on the main interface of the video application, which is not limited in this application.
  • the layer information corresponding to the window of the main interface of the video application may be obtained from the installation information of the video application.
  • the AMS can determine the installation information of the video application through the package name of the video application, obtain the layer information from the installation information, and then send the layer information to the WMS. After the WMS obtains the layer information, it sends it to Surface Flinger.
  • the process of performing step S4105 may be: the WMS can call Surface Flinger, and apply to Surface Flinger for creating the layer corresponding to the window of the main interface of the video application by sending the layer information corresponding to that window to Surface Flinger.
  • the Surface Flinger creates a layer corresponding to the window of the video application main interface according to the layer information corresponding to the window of the video application main interface.
  • the layer information corresponding to the window of the main interface of the video application includes the window name of the main interface of the video application, the layer name corresponding to the window, the layer size, the layer position, and other information that enables layer creation; Surface Flinger can then create the layer corresponding to the window of the main interface of the video application according to this layer information.
  • in step S4106, the interaction between the WMS and Surface Flinger implements layer creation through functions such as ViewRootImpl. For the process and technical principles of steps S4105 to S4106, in which Surface Flinger creates the layer corresponding to the window of the main interface of the video application, reference may be made to the process and technical principles of layer creation through interaction between the WMS and Surface Flinger in the Android system.
  • the WMS sends the package name of the video application to the frame rate decision module.
  • step S4107 may be executed after the WMS finishes creating the window, that is, after step S4104. Since the window information received by the WMS in step S4103 carries the package name of the video application, at the current stage of starting the video application the WMS can send the package name obtained in step S4103 to the frame rate decision module, so that the frame rate decision module can determine that what needs to be started is a video application. The frame rate decision module can determine the preset refresh rate corresponding to the video application according to the package name of the video application.
  • this embodiment of the present application does not limit the specific way in which the WMS sends the package name of the video application to the frame rate decision module; it only needs to allow the frame rate decision module to determine the preset refresh rate corresponding to the video application through the package name delivered by the WMS.
  • the frame rate decision module determines a preset refresh rate corresponding to the video application according to the package name of the video application.
  • the frame rate decision module can find the preset refresh rate corresponding to the video application through the package name of the video application.
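The lookup described in steps S4108 to S4109 amounts to a table keyed by package name. The sketch below illustrates this under assumptions: the package names, the rates, and the 60 Hz default are hypothetical examples, not values from this application.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of a preset-refresh-rate table keyed by package name,
// as a frame rate decision module might consult after receiving the
// package name from the WMS.
public class PresetRefreshRates {

    private static final Map<String, Integer> TABLE = new HashMap<>();
    static {
        TABLE.put("com.example.videoapp", 60);  // a video application
        TABLE.put("com.example.gameapp", 120);  // a high-frame-rate application
    }

    // Returns the preset refresh rate for the package, or a 60 Hz default
    // when the package has no specific entry.
    public static int lookup(String packageName) {
        return TABLE.getOrDefault(packageName, 60);
    }
}
```

The default branch mirrors the remark in the surrounding text that, when all applications share one preset refresh rate, that single default can be sent to Surface Flinger directly.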
  • the frame rate decision module sends the preset refresh rate corresponding to the video application to Surface Flinger.
  • the target refresh rate used in the display process is the preset refresh rate corresponding to the video application.
  • the frame rate decision module can control the refresh rate corresponding to the video application by sending the preset refresh rate to Surface Flinger, and Surface Flinger synthesizes the image to be displayed according to the preset refresh rate.
  • step S4107 and step S4108 may not be executed.
  • if the preset refresh rates of all applications are the same, for example, the electronic device uses a default refresh rate of 60 Hz, then steps S4107 and S4108 no longer need to be performed, and in step S4109 the preset refresh rate used by all applications by default is sent to Surface Flinger as the target refresh rate.
  • Surface Flinger triggers the video application to draw and render the layer corresponding to the window of the main interface of the video application according to the rhythm of the vertical synchronization (Vsync, Vertical Sync) signal corresponding to the preset refresh rate.
  • the Vsync signal is used to control the rhythm of drawing and rendering of the display screen, layer synthesis, and display sending.
  • the preset refresh rate mentioned in step S4110 is the preset refresh rate corresponding to the video application, and Surface Flinger triggers the video application to start drawing and rendering the layer corresponding to the window of the main interface of the video application according to the rhythm of the Vsync signal corresponding to the preset refresh rate.
  • the period of the Vsync signal corresponds to a preset refresh rate.
  • the Vsync signal corresponding to the preset refresh rate can control the display screen to refresh and display according to the preset refresh rate.
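The correspondence between the Vsync period and the refresh rate is simply reciprocal: a 60 Hz refresh rate means one Vsync signal roughly every 16.6 ms. The helper below illustrates that arithmetic; it is not framework code.

```java
// The Vsync period is the reciprocal of the refresh rate. At 60 Hz the
// period is about 16.67 ms; at 120 Hz, about 8.33 ms.
public class VsyncPeriod {

    // Vsync period in nanoseconds for a refresh rate given in Hz
    // (integer division truncates the fractional nanosecond).
    public static long periodNs(int refreshRateHz) {
        return 1_000_000_000L / refreshRateHz;
    }
}
```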
  • the process of Surface Flinger triggering drawing and rendering according to the rhythm of the Vsync signal can refer to the relevant content of Surface Flinger in the Android system on the process and technical principle of triggering drawing and rendering.
  • the video application invokes the drawing and rendering module to draw and render the layer corresponding to the window of the main interface of the video application according to the rhythm of the Vsync signal corresponding to the preset refresh rate, and obtain image data of the main interface after drawing and rendering.
  • the image data of the main interface after drawing and rendering refers to the image data of the main interface of the video application after drawing and rendering, and can also be understood as the image data obtained after drawing and rendering the layer corresponding to the window of the main interface of the video application. image data.
  • the process of executing step S4111 may be: the video application calls the drawing and rendering thread of the drawing and rendering module according to the rhythm of the Vsync signal corresponding to the preset refresh rate, so that the drawing and rendering module draws and renders the layers of the main interface of the video application; the drawing and rendering module then obtains the image data of the main interface after drawing and rendering.
  • functions such as OpenGL ES and Skia may be called to perform drawing and rendering on the layer corresponding to the window of the main interface of the video application.
  • the drawing and rendering module sends the rendered main interface image data to the Surface Flinger according to the rhythm of the Vsync signal corresponding to the preset refresh rate.
  • the rendered main interface image data obtained by the drawing and rendering module in step S4111 can be sent to Surface Flinger according to the rhythm of the Vsync signal corresponding to the preset refresh rate.
  • the specific way in which the drawing and rendering module sends the rendered main interface image data to Surface Flinger according to the rhythm of the Vsync signal can also be realized by other modules.
  • the embodiment of the present application does not limit the specific implementation process of step S4112.
  • Surface Flinger uses the rendered main interface image data to perform layer synthesis according to the preset refresh rate corresponding to the video application, and sends the synthesized main interface image data to display.
• Surface Flinger performs layer composition on each frame of rendered main interface image data and sends the composited main interface data to the display screen (such as an LCD) for display, which enables the display screen of the electronic device to refresh and display the main interface of the video application according to the preset refresh rate corresponding to the video application.
• Surface Flinger performs layer composition on each frame of rendered main interface image data according to the rhythm of the Vsync signal corresponding to the preset refresh rate, and, at the same rhythm, sends the composited main interface data to the display screen (such as an LCD) for display; the display screen then refreshes and displays the main interface of the video application according to the preset refresh rate corresponding to the video application.
  • the main interface of the video application is displayed, and multiple videos that can be played are displayed on the main interface.
• after Surface Flinger obtains the rendered main interface image data, it can implement layer composition through the HWC. For example, Surface Flinger sends the target refresh rate to the HWC and then sends the rendered main interface image data to the HWC, so that the HWC performs layer composition on the rendered main interface image data according to the target refresh rate and outputs each frame of composited main interface image data to the display screen according to the target refresh rate.
• for the process of Surface Flinger sending the composited main interface image data for display, reference may be made to the related content of Surface Flinger compositing layers and sending them for display in the Android system.
  • the electronic device can display the main interface of the video application according to the preset refresh rate corresponding to the video application during the application startup phase for user interaction.
  • FIG. 4b is a schematic diagram of the video playback stage flow of any video application in the refresh rate setting method proposed by the embodiment of the present application, which is applied to the electronic device proposed by the embodiment of the present application.
• the flow shown in FIG. 4b may be executed after the main interface of the video application has been displayed through the flow shown in FIG. 4a; at this time, the foreground of the electronic device displays only the main interface of the video application.
• the specific process of the video playback stage includes the following steps:
• in response to a user's operation of playing a video, the video application sends a request for starting a video playback Activity to the AMS.
  • the video playing Activity is used to implement and provide a video playing interface, which the user can use to interactively complete the video playing task.
• the video application receives the user's operation of playing the video and, in response to that operation, sends the request for starting the video playback Activity to the AMS, so that the AMS starts the video playback Activity.
• the user's operation of playing the video may be the operation of clicking a certain video on the main interface of the video application displayed through the flow of FIG. 4a.
• it may also be the operation of searching on the main interface of the video application for a video to be watched and then clicking a video on the search result interface, or an operation in which the user inputs the video to be played by voice.
  • the request to start the video playing activity carries the package name of the video application and the name of the video playing interface of the video playing activity.
  • the name of the video playback interface can be used to describe the name of the interface where the video to be played is located. For example, referring to (1) in FIG. 4c , the user clicks on video 1, which in turn triggers the video application to send a request to start the video playback activity, and the request to start the video playback activity may carry the name of the playback interface of video 1.
• the request for starting the video playback activity may carry another type of identifier of the video application other than the package name, for example, the path of the video application.
  • the specific form of the identification is not limited.
• the request for starting the video playback activity may carry another type of identifier of the video playback interface other than the name of the video playback interface, which is not limited in this embodiment of the present application.
  • the request for starting the video playing activity may also carry other information about the video playing activity, and this embodiment of the present application does not limit the specific information in the request for starting the video playing activity.
• in some other embodiments, the video application or another application may also automatically trigger the video application to send the request for starting the video playback activity to the AMS; the specific manner of triggering the sending of the request to the AMS does not affect the implementation of the embodiment of the present application.
• for the execution process and principle of step S4201, reference may be made to the technical principles related to the Activity in the process of an application starting video playback in the Android system.
  • the AMS starts the video playing activity.
• the process of executing step S4202 may be that the AMS responds to the request for starting the video playback activity, creates a video playback Activity according to the package name of the video application and the name of the video playback interface carried in the request, and starts the created video playback Activity.
  • the AMS sends the window information corresponding to the video playback interface to the WMS.
  • the video playing interface is an application interface for displaying video images. After AMS starts the video playback activity, it needs to complete the video playback task and provide the user with the video playback interface of the video application for the user to interact with the controls of the video application playback interface to meet the user's video playback needs.
  • the window information corresponding to the video playing interface may be used to create a window corresponding to the video playing interface.
  • the window information corresponding to the video playback interface may be obtained from the installation information of the video application.
  • the AMS may determine the installation information of the video application according to the package name of the video application, and then obtain the window information corresponding to the video playback interface from the installation information of the video application.
  • the window information corresponding to the video playback interface may be determined through multiple interactions between the AMS and the video application.
  • the window information corresponding to the video playback interface may carry the package name of the video application, and specifically may include: window attribute information and window level information of the video playback interface.
  • the window attribute information may be window name, window position, window size and so on.
  • the window level information corresponding to the video playback interface is used to indicate that the window corresponding to the video playback interface is placed on the topmost layer of the display interface.
  • the AMS can send the window information to the WMS, so that the WMS can create a window corresponding to the video playback interface.
  • the process and principle of AMS sending the window information corresponding to the video playback interface to WMS can refer to the execution process and principle of step S4103 in Figure 4a above.
  • the difference is that the window information sent by AMS in step S4103 is related to the main interface of the video application.
  • the window information sent by the AMS in step S4203 is related to the video playback interface.
  • the WMS creates a window of the video playing interface according to the window information corresponding to the video playing interface.
• the WMS can determine, according to the window information corresponding to the video playback interface, the size, position, level, etc. of the window of the video playback interface to be created, and then create the window of the video playback interface.
• for step S4204, reference may be made to step S4104 shown in FIG. 4a above; the difference is that the window created in step S4104 is the window of the main interface of the video application, while the window created in step S4204 is the window of the video playback interface.
  • the WMS sends the window creation completion information to the video application.
• after completing step S4204, the WMS sends the window creation completion information to the video application, indicating that window creation is complete, so that the video application can trigger subsequent operations.
  • the window creation completion information may carry identification information of the created window, such as a window name, and the embodiment of the present application does not limit the specific form of the window creation completion information.
  • the video application calls the Media Codec to perform video decoding.
  • the video application triggers the execution of step S4206 after receiving the window creation completion information sent in step S4205.
  • the video application calls Media Codec to decode the video that the user needs to play in step S4201.
• for example, if the user clicks video 1, the video application calls Media Codec to decode video 1.
• for the execution process and principle of step S4206, reference may be made to the relevant technical principles of the video decoding process in the Android system.
  • the Media Codec sends the frame rate of the video source to the frame rate decision module.
• the video source frame rate refers to the frame rate of the source of the video that the user wants to play. Because in step S4206 Media Codec has decoded, under the invocation of the video application, the video that the user wants to play, Media Codec can obtain the video source frame rate from the video source of that video. In some other embodiments, Media Codec can also obtain other video source information, such as the video resolution.
• the video source frame rate of the video that the user wants to play indicates the refresh rate required by that video, and thus provides a reference for the frame rate decision module when deciding the refresh rate.
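• as an illustration of how a source frame rate can serve as a reference for a refresh rate decision, the following sketch picks the lowest supported panel rate that is an integer multiple of the source frame rate. The supported rates and the selection rule are assumptions for illustration only; this application does not specify the decision module's internal logic.

```python
# Hypothetical sketch: a frame rate decision module consuming the video
# source frame rate reported by Media Codec. The rule (lowest supported
# rate evenly divisible by the source fps) is an assumption, not taken
# from this application.
SUPPORTED_RATES = [30, 60, 90, 120]  # assumed panel refresh rates (Hz)

def decide_target_refresh_rate(source_fps, preset_rate=60):
    """Pick the lowest supported rate evenly divisible by the source fps."""
    for rate in sorted(SUPPORTED_RATES):
        if rate % source_fps == 0:
            return rate
    return preset_rate  # fall back to the application's preset refresh rate

print(decide_target_refresh_rate(30))  # 30
print(decide_target_refresh_rate(24))  # 120
print(decide_target_refresh_rate(25))  # no multiple available: 60
```

A 24 fps source, for example, maps to 120 Hz so that each video frame occupies an integer number of display refreshes.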
  • the WMS sends the layer information corresponding to the window of the video playback interface to Surface Flinger.
  • the layer information corresponding to the window of the video playback interface carries the package name of the video application.
  • the layer information corresponding to the window of the video playback interface can be used to create the layer of the video playback interface, and then the layer of the video playback interface can be displayed in the corresponding window area of step S4204.
  • the layer information corresponding to the video playback interface may specifically include: the layer name, layer size, layer position, window name, etc. corresponding to the window of the video playback interface.
  • the package name of the video application may be included in the layer name.
• for the execution process and principle of step S4208, reference may be made to the process and principle of step S4105 mentioned in FIG. 4a above; the difference is that the layer information sent in step S4105 is related to the main interface of the video application, while the layer information sent in step S4208 is related to the video playback interface.
  • the Surface Flinger creates a layer corresponding to the window of the video playback interface according to the layer information corresponding to the window of the video playback interface.
• for the execution process and principle of step S4209, reference may be made to the process and principle of step S4106 mentioned in FIG. 4a above; the difference is that the layer created in step S4106 is related to the main interface of the video application, while the layer created in step S4209 is related to the video playback interface.
  • S4203, S4204, S4208 and S4209 may be repeated steps.
• the display screen refreshes according to the preset refresh rate, so every time a new frame of the video playback interface needs to be displayed, steps S4203, S4204, S4208 and S4209 need to be performed again to update information such as the windows and layers.
• after step S4209, Surface Flinger triggers the video application to draw and render the layer corresponding to the window of the video playback interface according to the rhythm of the Vsync signal corresponding to the preset refresh rate of the video application. The video application then calls the drawing and rendering module, at that Vsync rhythm, to draw and render the layer corresponding to the window of the video playback interface, and the drawing and rendering module sends the rendered image data of the video playback interface to Surface Flinger at the same rhythm. Surface Flinger can then use the rendered image data of the video playback interface to perform layer composition according to the preset refresh rate corresponding to the video application and send the composited image data of the video playback interface for display, so that the display screen refreshes the video playback interface according to the preset refresh rate corresponding to the video application.
• it should be noted that the step of refreshing and displaying the video playback interface does not interfere with the execution of steps S4211 to S4224 described later; that is, this application does not limit the execution order between steps S4211 to S4224 and the step of refreshing and displaying the video playback interface according to the preset refresh rate corresponding to the video application.
• Surface Flinger can refresh and display the video playback interface according to the preset refresh rate corresponding to the video application until the refresh rate sent by the frame rate decision module is changed through steps S4211 to S4224, after which the video playback interface is refreshed at the new refresh rate.
  • the playback whitelist can be understood as a list of applications with video playback permission.
  • the layer information in step S4210 can be understood as the layer information corresponding to the window of the video playback interface. It can be known from the foregoing description of step S4208 that the layer information carries the package name of the video application, so it can be determined whether the video application is in the playback whitelist through the package name of the video application.
  • the package name of the video application can be obtained from the layer name carried in the layer information corresponding to the video playback interface.
  • the specific format of the layer name can be: package name/layer-specific name, and then the package name can be obtained from the layer information.
• the layer name of the video playback interface of a video application is "xxx.xxxx.xx/com.xx.bangumi.ui.page.detail.BangumiDetailActivityV3#0SurfaceView-xxx.xxxx.xx/com.bi[...]age.detail.BangumiDetailActivityV3#0"; as can be seen from the layer name, the package name of the video application is "xxx.xxxx.xx".
• in step S4210, Surface Flinger searches the playback whitelist for the package name of the video application; if the package name is found, it determines that the video application is in the playback whitelist, and if the package name cannot be found in the playback whitelist, the video application is not in the playback whitelist.
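• the whitelist check described above can be sketched as follows. The layer-name format "package name/layer-specific name" follows the example in the text; the whitelist entries here are made up for illustration.

```python
# Minimal sketch of extracting the application package name from a layer
# name of the form "package name/layer-specific name" and checking it
# against a playback whitelist. Whitelist contents are assumed.
PLAYBACK_WHITELIST = {"xxx.xxxx.xx", "com.example.video"}  # assumed entries

def package_from_layer_name(layer_name):
    """The text before the first '/' is taken as the package name."""
    return layer_name.split("/", 1)[0]

def in_playback_whitelist(layer_name):
    return package_from_layer_name(layer_name) in PLAYBACK_WHITELIST

layer = "xxx.xxxx.xx/com.xx.bangumi.ui.page.detail.BangumiDetailActivityV3#0"
print(package_from_layer_name(layer))  # xxx.xxxx.xx
print(in_playback_whitelist(layer))    # True
```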
• if the video application is in the playback whitelist, step S4211 is performed.
  • the video playback interface is displayed on the screen.
  • prompt information for reminding the user that the video application does not have the permission to play may also be displayed on the display screen.
  • step S4210 may not be executed, that is, step S4211 is directly executed after step S4209 is executed.
  • the Surface Flinger determines whether it is a video playback scene according to the layer information corresponding to the window of the video playback interface.
• the layer information corresponding to the window of the video playback interface contains layer feature information that can be used to determine whether the current scene is a video playback scene. Determining whether it is a video playback scene can be understood as determining whether the scene where the video application is currently located is a video playback scene.
  • the video playing scene refers to a scene where an application plays a video.
• the layer feature information may be a video layer (SurfaceView): if SurfaceView is included in the layer information, this indicates a video playback scene. For example, if the layer name in the layer information contains a video layer (SurfaceView) field, the scene can be determined to be a video playback scene.
• the layer name of the video playback interface of a video application is "xxx.xxxx.xx/com.xx.bangumi.ui.page.detail.BangumiDetailActivityV3#0SurfaceView-xxx.xxxx.xx/com.bi[...]age.detail.BangumiDetailActivityV3#0", which contains the "SurfaceView" field, so it is determined that the video application is currently in the video playback scene and the user has a video playback requirement.
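• the scene check described above reduces to a substring test on the layer name, as in this sketch (layer names mirror the example in the text):

```python
# Sketch of the video-playback-scene check: the scene is treated as video
# playback when the layer name carries a "SurfaceView" field.
def is_video_playback_scene(layer_name):
    return "SurfaceView" in layer_name

video_layer = ("xxx.xxxx.xx/com.xx.bangumi.ui.page.detail."
               "BangumiDetailActivityV3#0SurfaceView-xxx.xxxx.xx")
ui_layer = "xxx.xxxx.xx/com.xx.main.MainActivity#0"
print(is_video_playback_scene(video_layer))  # True
print(is_video_playback_scene(ui_layer))     # False
```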
• if step S4211 determines that it is a video playback scene, step S4212 is executed to enable barrage statistics and to further determine whether the current scene is a barrage scene within the video playback scene; if step S4211 determines that it is not a video playback scene, the process ends, that is, the frame rate decision module no longer resets the refresh rate, and the display screen continues to refresh according to the preset refresh rate corresponding to the video application mentioned in the flow of FIG. 4a.
• after step S4211 determines that the current scene is a video playback scene, the determination of the video playback scene is not repeated.
• in some other embodiments, after step S4211 determines that it is a video playback scene, if it is later determined that the scene is no longer a video playback scene, information of a non-video playback scene can be sent to the frame rate decision module to notify the frame rate decision module that the current scene is not a video playback scene, so that the target refresh rate is reset to the preset refresh rate corresponding to the video application.
• since Surface Flinger has determined through step S4211 that the current scene is a video playback scene, it can determine to enable barrage statistics, so that the drawing and rendering module starts counting barrages when drawing them.
  • Surface Flinger can further determine whether it is currently a barrage scene, and can determine the barrage gear in the barrage scene based on the barrage statistics data. Specifically, after Surface Flinger determines to enable barrage statistics, the drawing and rendering module can perform barrage statistics when drawing barrage. When Surface Flinger has not determined to enable barrage statistics, the rendering module does not perform barrage statistics. For details, reference may be made to the relevant content in subsequent steps S4213 to S4221, which will not be repeated here.
  • the manner in which Surface Flinger executes step S4212 may be that Surface Flinger sets the status of barrage statistics to an enabled state. For example, modify the status parameter of bullet chat statistics to the parameter corresponding to the enabled status.
  • the storage location of the barrage statistical state is not limited in this embodiment of the application.
  • the manner of performing step S4212 may also be: Surface Flinger sends the information of enabling the bullet chat statistics to the drawing and rendering module, so as to control the drawing and rendering module to enable the bullet chat statistics.
  • Surface Flinger determines whether bullet chatting drawing information is received within a preset time period.
  • the barrage drawing information is information related to drawing barrage by the rendering module.
• the barrage drawing information of the current frame includes: the number of barrage entries in the current frame, the timestamp of the current frame, and the rendered barrage data drawn in the current frame.
  • the bullet chat drawing information of the current frame can be understood as the bullet chat drawing information of the display screen of the current frame.
• if it is determined in step S4213 that barrage drawing information is received within the preset duration, the video application is considered to be in the barrage scene; otherwise, the video application is considered to be only in the video scene determined in step S4211, that is, it can be understood that the current video application does not need to display barrages during video playback, and step S4228 is executed.
  • the barrage scene can be understood as a scene in which a barrage is displayed during video playback.
• after Surface Flinger determines in step S4211 that the current scene is a video scene, it enables barrage statistics in step S4212. Therefore, if the application is currently in the barrage scene, the drawing and rendering module performs barrage statistics and barrage drawing, obtains the barrage drawing information of the current frame, and sends it to Surface Flinger.
• when the video application is in the barrage scene, the video application calls the drawing and rendering module to draw and render barrages according to the rhythm of the Vsync signal corresponding to the preset refresh rate of the video application and counts the number of barrage entries drawn; the drawing and rendering module then sends the barrage drawing information of the current frame to Surface Flinger at the rhythm of the Vsync signal. Therefore, in the barrage scene, Surface Flinger receives barrage drawing information at the rhythm of the Vsync signal.
  • the preset duration can be set according to actual application scenarios, experience, etc., for example, the preset duration can be set to 5 seconds, and the specific setting method of the preset duration is not limited in this embodiment of the present application.
  • S4213 may be a repeated step.
  • Surface Flinger can confirm whether the bullet chatting drawing information is received within the preset duration according to the rhythm of the Vsync signal corresponding to the current preset refresh rate, so as to realize the purpose of confirming in real time whether it is currently in the bullet chatting scene.
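• the timeout check of step S4213 can be sketched as below. Timestamps are plain seconds here for illustration; per the text, the real check follows the Vsync rhythm, and the 5-second duration is only the example value given above.

```python
# Hedged sketch of step S4213: the application is treated as being in a
# barrage scene only if barrage drawing information arrived within a
# preset duration (5 s in the example from the text).
PRESET_DURATION_S = 5.0  # example value

def is_barrage_scene(last_barrage_info_time, now):
    """True if barrage drawing info was received within the preset window."""
    return (now - last_barrage_info_time) <= PRESET_DURATION_S

print(is_barrage_scene(last_barrage_info_time=100.0, now=103.0))  # True
print(is_barrage_scene(last_barrage_info_time=100.0, now=106.0))  # False
```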
• in some other embodiments, after step S4211 determines that it is a video playback scene, the video playback scene information can also be directly sent to the frame rate decision module; the frame rate decision module can determine the target refresh rate using the video playback scene information and the video source frame rate, and then send the target refresh rate to Surface Flinger, which controls the display of the video playback interface according to the target refresh rate, similarly to the subsequent steps S4228 to S4231, which are not repeated here. In this case, after Surface Flinger controls the display screen to refresh according to the target refresh rate determined from the video playback scene information, step S4213 is executed; if it is then determined that no barrage drawing information is received within the preset duration, the process can end directly without executing steps S4228 to S4231 again.
  • the video application calls the drawing and rendering module to draw the barrage according to the rhythm of the Vsync signal.
  • the video application calls the drawing and rendering module to draw the barrage according to the rhythm of the Vsync signal corresponding to the current actual refresh rate.
  • the display screen is refreshing the display screen according to the preset refresh rate corresponding to the video application. Therefore, when the video application needs to draw a bullet chat, it can call the drawing and rendering module to draw the bullet chat according to the rhythm of the Vsync signal corresponding to the current preset refresh rate.
  • the barrage drawn by the video application can be understood as a barrage of a video to be played by the user.
• step S4214 may be triggered after the video application receives the user's operation of turning on the barrage, or the video application may, after receiving the user's operation of playing the video in the aforementioned step S4201, automatically trigger the barrage that needs to be displayed and then start executing step S4214.
• the drawing and rendering module can generate a text view (TextView) and a base canvas, and use functions such as drawText, Skia, HWUI and OpenGL ES to complete the barrage drawing.
• drawing and rendering barrages and displaying them on the display screen of the electronic device is a process that multiple modules can cooperate to complete.
• for the process and technical principles of barrage drawing and rendering, reference may be made to the process and technical principles of barrage drawing and rendering in the Android system.
• in step S4214, the drawing and rendering module is called to draw barrages if and only if the video application needs to draw barrages. If the video application does not need to draw barrages, for example, the user turns off the barrage switch or the video itself has no barrage, step S4214 is not executed, and steps S4215 to S4218 are not executed either. Therefore, step S4214 is not a step that must be executed for every displayed frame; it is executed if and only if a frame has a barrage requirement, and the specific manner of triggering step S4214 does not affect the implementation of this embodiment of the present application.
  • the drawing and rendering module determines whether to enable barrage statistics.
• if the drawing and rendering module determines that barrage statistics are enabled, step S4216 is executed. If the drawing and rendering module determines that barrage statistics are not enabled, there is currently no need to count barrages, so the process can end; that is, barrage statistics are not performed, and the subsequent steps S4216 to S4218 are not executed.
• the drawing and rendering module determines whether barrage statistics are enabled by acquiring the state of barrage statistics: if the state is enabled, barrage statistics are performed; otherwise, they are not. Since Surface Flinger determined in the aforementioned step S4212 to enable barrage statistics and set the state of barrage statistics to the enabled state, the state acquired by the drawing and rendering module when executing step S4215 is the enabled state, so the drawing and rendering module determines that barrage statistics are enabled and proceeds to step S4216.
• the state of barrage statistics can be set in Surface Flinger, and the drawing and rendering module can determine whether barrage statistics are enabled by obtaining that state from Surface Flinger. In some other embodiments, the drawing and rendering module may determine that barrage statistics are enabled when it receives the information of enabling barrage statistics, and may not enable barrage statistics when it does not receive that information.
• the drawing and rendering module determining to enable barrage statistics can be understood as determining to count the barrages currently being drawn: each time the video application calls the drawing and rendering module to draw a barrage, the drawing and rendering module is triggered to count the barrage currently being drawn.
  • the bullet chatting counter may be initialized and activated, so that the bullet chatting counter is in a state where counting can be performed.
• it should be noted that, within the current frame, the barrage counter is initialized and started only once. If the drawing and rendering module is called multiple times in the current frame, and barrage statistics are correspondingly determined to be enabled multiple times, the barrage counter is initialized and started the first time barrage statistics are determined to be enabled; once the barrage counter has been started, it is not initialized again.
• for example, the barrage counter does not count barrages before being initialized and started under the control of the drawing and rendering module, and starts counting the barrages drawn by the drawing and rendering module only after being initialized and started.
  • the manner of initializing and starting the barrage counter by the drawing and rendering module may be as follows: the drawing and rendering module may send a start signal to the barrage counter, and the barrage counter starts counting the barrage drawn by the drawing and rendering module after receiving the start signal.
• in some other embodiments, the barrage counter can enter a low-power or dormant state after the barrage drawing of a frame is completed, to save power consumption, until the next frame needs barrages and the drawing and rendering module again determines that barrage statistics are enabled and initializes and starts the barrage counter.
  • the drawing and rendering module controls adding one to the barrage counter.
• because the drawing and rendering module has determined in step S4215 that barrage statistics are enabled, it is triggered to control the barrage counter to count the barrage currently being drawn, so that the barrage counter is incremented by one and the barrage currently drawn by the drawing and rendering module is counted. Each time the video application needs to draw a barrage, it calls the drawing and rendering module once to draw it, which in turn triggers the drawing and rendering module to perform steps S4215 to S4217; therefore, the barrage counter can count the number of barrage entries drawn in the current frame.
• the drawing and rendering module can also control the barrage counter to count the number of barrage characters that need to be drawn in one frame, that is, to add the character count of the barrage currently being drawn. It should be noted that the barrage counter can collect many kinds of barrage-related data, including but not limited to those described in this application.
• the drawing and rendering module can control the barrage counter to collect statistics only for barrages drawn in a specific area.
• for example, the drawing and rendering module can control the barrage counter to count only the text in the specific area 20, thereby excluding interference from non-barrage text and obtaining accurate barrage data such as the number of barrage entries and barrage characters.
• the barrage counter counts barrages starting from zero. Each time the video application executes step S4214 and calls the drawing and rendering module to draw a barrage, the drawing and rendering module, having determined in step S4215 that barrage statistics are enabled, controls the counter to add one; statistics end only when all barrages that need to be drawn in the current frame have been drawn. That is, if multiple barrages need to be drawn in the current frame, steps S4214 to S4217 are repeated multiple times.
• the barrage counter can also count, under the control of the drawing and rendering module, the number of barrage characters that need to be drawn in one frame. It should be noted that the barrage counter can collect many kinds of barrage-related data, including but not limited to those described in this application.
• after the barrage drawing of a frame is completed, the count of the barrage counter can be cleared, so that counting for a later frame starts again from zero.
• steps S4214 to S4217 shown in this embodiment are only an exemplary process for counting barrages; there are many ways for the drawing and rendering module to count barrage data such as the number of barrages and barrage characters in the current frame, including but not limited to those in this embodiment.
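The counting flow of steps S4214 to S4217 can be sketched as follows. This is a minimal illustration; the class and method names (`BarrageCounter`, `start`, `count`, `finish`) are assumptions, not part of the original design.

```python
# Hypothetical sketch of the per-frame barrage counter described above.
class BarrageCounter:
    def __init__(self):
        self.started = False
        self.entries = 0
        self.chars = 0

    def start(self):
        # Initialize and start only once per frame; further "enable" signals
        # within the same frame are ignored because the counter is active.
        if not self.started:
            self.entries = 0
            self.chars = 0
            self.started = True

    def count(self, text):
        # Called once for each barrage drawn by the drawing and rendering module.
        if self.started:
            self.entries += 1
            self.chars += len(text)

    def finish(self):
        # Frame done: report the counts, clear, and return to the dormant state.
        result = (self.entries, self.chars)
        self.started = False
        return result

counter = BarrageCounter()
counter.start()
counter.start()  # a second start in the same frame is a no-op
for msg in ["hello", "nice shot", "gg"]:
    counter.count(msg)
entries, chars = counter.finish()
# entries == 3, chars == 16
```

The second `start()` call illustrates why the counter cannot be re-initialized mid-frame: otherwise the entries already counted in this frame would be lost.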
  • the drawing and rendering module sends the barrage drawing information of the current frame to the Surface Flinger according to the rhythm of the Vsync signal.
  • the drawing and rendering module sends the barrage drawing information of the current frame to Surface Flinger according to the rhythm of the Vsync signal corresponding to the current actual refresh rate.
• as described above, the current actual refresh rate of the electronic device is the preset refresh rate corresponding to the video application; therefore, when step S4218 is executed, the drawing and rendering module can send the barrage drawing information of the current frame to Surface Flinger following the rhythm of the Vsync signal corresponding to that preset refresh rate.
  • the bullet chatting drawing information of the current frame includes: the number of bullet chatting bars drawn in the current frame, the time stamp of the current frame, and the bullet chatting data after drawing and rendering of the current frame.
  • the number of barrage bars drawn in the current frame is the number of barrage bars that need to be drawn in the current frame as counted by the barrage counter
• the timestamp of the current frame is the timestamp corresponding to the current frame, and the rendered barrage data is the barrage data of the current frame after drawing and rendering.
  • the bullet chat drawing information may also carry other information, such as the package name of the video application, the number of bullet chat characters in the current frame counted by the bullet chat counter, and the like.
• the drawing and rendering module sends the barrage drawing information of the current frame to Surface Flinger according to the rhythm of the Vsync signal at the preset refresh rate corresponding to the video application. It should be noted that the drawing and rendering module does not send barrage drawing information to Surface Flinger in every frame: if no barrage is drawn in a given frame, steps S4214 to S4217 are not performed in that frame, no barrage drawing information is generated, and none is sent to Surface Flinger.
  • step S4218 may be performed by the barrage counter in the drawing and rendering module.
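The per-frame barrage drawing information described above could be modeled roughly as follows; all field names are assumptions for illustration, and the example values reuse the numbers given later in the text.

```python
# Illustrative shape of the "barrage drawing information" that the drawing
# and rendering module sends to Surface Flinger each frame it draws barrages.
from dataclasses import dataclass

@dataclass
class BarrageDrawInfo:
    entry_count: int        # number of barrages drawn in this frame
    timestamp_ns: int       # timestamp of this frame, in nanoseconds
    rendered_data: bytes    # barrage data after drawing and rendering
    package_name: str = ""  # optional: package name of the video application
    char_count: int = 0     # optional: barrage characters in this frame

info = BarrageDrawInfo(entry_count=3,
                       timestamp_ns=35257912694,
                       rendered_data=b"...",
                       package_name="com.example.video",
                       char_count=16)
```

Surface Flinger would store `entry_count` and `timestamp_ns` for the gear calculation, while `rendered_data` is used for layer composition and display.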
  • Surface Flinger determines that it receives bullet chatting drawing information within a preset time period, and stores the number of bullet chatting lines drawn in the current frame and the time stamp.
• the number of barrage entries in the barrage drawing information can be stored in correspondence with the timestamp, recording the total number of barrages drawn at that timestamp.
  • there are many ways to store the number of barrage entries and time stamps which are not limited in this embodiment of the present application.
  • the barrage number and timestamp can be stored in a table form.
• for example: the timestamp of the first frame is 35257912694 ns, and the drawing and rendering module draws a total of three barrages in the first frame; the timestamp of the second frame is 35274579360 ns, and two barrages are drawn in the second frame; the timestamp of the third frame is 35291246026 ns, and seven barrages are drawn in the third frame; and so on, recording the number of barrages drawn and the timestamp for each frame.
  • the drawn and rendered bullet chat data in the bullet chat drawing information is subsequently used for Surface Flinger synthesis and sent to the display screen.
• for the process of displaying the barrage data, reference may be made to the description of the image data sending and display process from step S4110 to step S4113 in Figure 4a.
  • Surface Flinger determines the total number of bullet chats displayed in the current frame according to the number of bullet chats stored, time stamp, and aging time of bullet chats.
  • the bullet chat aging time refers to the bullet chat survival time.
  • the time from when the barrage is displayed on the display to when the barrage completely disappears is the survival time of the barrage, that is, the aging time of the barrage.
• the barrage aging time may differ between scenes. For example, if double-speed playback is not enabled, the barrage aging time may be 5 seconds; if double-speed playback is enabled, it may be 3 seconds.
  • the bullet chatting aging time may be determined by Surface Flinger by obtaining the bullet chatting aging time of the video to be played in the current video playing scene.
• a preset, fixed barrage aging time may also be determined from experience, big data, and the like. In other words, there are many ways to determine the barrage aging time, which is not limited in this embodiment of the present application.
  • the stored barrage number, time stamp, and barrage aging time can be understood as the stored number of barrage bars drawn for each frame, the corresponding time stamp of each frame, and barrage aging time.
• Surface Flinger can determine, from the stored per-frame barrage counts, timestamps, and the aging time, the total number of barrages that have aged (that is, disappeared from the display), and subtract that total from the stored total number of drawn barrages to obtain the total number of barrages displayed in the current frame.
• in some embodiments, the barrage drawing information sent to Surface Flinger also includes the total number of barrage characters in the current frame.
• the total number of barrage characters to be displayed in the current frame can likewise be calculated from the stored character counts. For example, after calculating the total number of barrages displayed in the current frame as described above, the character counts of those barrages are summed to obtain the total number of barrage characters to be displayed in the current frame.
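The aging-based calculation described above can be sketched as follows; the function name, the data layout, and the example numbers are assumptions for illustration.

```python
# Sketch: derive the total number of barrages currently on screen from the
# stored per-frame counts and timestamps plus the barrage aging time.
def total_displayed(history, now_ns, aging_ns):
    """history: list of (timestamp_ns, entries_drawn) pairs, one per frame."""
    total_drawn = sum(n for _, n in history)
    # Barrages whose frame timestamp is older than the aging time have
    # already scrolled off the display.
    aged = sum(n for ts, n in history if now_ns - ts > aging_ns)
    return total_drawn - aged

history = [(0, 3), (2_000_000_000, 2), (6_000_000_000, 7)]
# With a 5 s aging time at t = 7 s, only the first frame's barrages have aged.
print(total_displayed(history, 7_000_000_000, 5_000_000_000))  # 9
```

In practice the stored history could also be pruned: entries older than the aging time can be dropped entirely, which keeps the table small.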
• the more barrages there are to display, the higher the required screen refresh rate: if the refresh rate is too low, scrolling barrages tend to stutter and the picture is not smooth. When fewer barrages need to be displayed, a lower refresh rate suffices and can still keep the picture smooth.
  • Surface Flinger determines the barrage gear of the current frame according to the number of barrage lines drawn in the current frame and the total number of barrage bars displayed.
• the barrage gear (level) is used to describe the barrage density of the picture shown on the display.
• the more barrages drawn in the current frame and the larger the total number displayed, the more barrages need to be shown and the higher the barrage density; conversely, fewer barrages need to be shown and the density is low.
  • the barrage gear may include: a high barrage gear, or a low barrage gear.
  • the low barrage gear is used to indicate that the barrage density of the current frame is low.
  • the high barrage gear is used to indicate that the barrage density of the current frame is high.
• the process of executing step S4221 may be: when the number of barrages drawn in the current frame is less than a first preset value and the total number of barrages displayed in the current frame is less than a second preset value, the barrage gear is determined to be the low gear; otherwise it is the high gear.
  • the first preset value and the second preset value can be set according to experience, for example, the first preset value can be set to 2, and the second preset value can be set to 5.
• the barrage gear can be divided into two gears (the high gear and the low gear), into three gears, or into more than three gears.
• for example, the gears may include a first barrage gear, a second barrage gear, and a third barrage gear, with the number of barrages represented increasing step by step from the first gear to the third gear.
• in some embodiments, when the barrage drawing information also includes the number of barrage characters in the current frame, the barrage gear can also be determined based on the number of barrage characters drawn in the current frame and/or the total number of barrage characters to be displayed in the current frame: the more characters are drawn and displayed, the more barrages currently need to be shown, and vice versa.
• in some embodiments, to ensure that the determined barrage gear is accurate, after Surface Flinger identifies the video playback scene, the first execution of step S4221 is delayed by a preset duration. After the delay, non-barrage text, such as the text in areas 10 and 30 of Figure 4d, is treated as barrages that have already passed the aging time and disappeared, so it does not interfere with the accuracy of the total displayed barrage count, and the barrage gear can be determined accurately.
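The two-gear decision of step S4221 can be sketched with the example thresholds from the text (first preset value 2, second preset value 5); the function name and the "otherwise high" fallback are assumptions for illustration.

```python
# Sketch of the gear decision in step S4221.
FIRST_PRESET = 2   # threshold on barrages drawn in the current frame
SECOND_PRESET = 5  # threshold on total barrages displayed in the current frame

def barrage_gear(drawn_in_frame, total_displayed):
    if drawn_in_frame < FIRST_PRESET and total_displayed < SECOND_PRESET:
        return "low"
    return "high"

print(barrage_gear(1, 3))  # low: few drawn, few on screen
print(barrage_gear(1, 7))  # high: many barrages already on screen
```

A three-or-more-gear variant would simply add further threshold pairs, one per gear boundary.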
• steps S4219 to S4221 can be executed in real time; for example, the barrage gear can be determined by executing steps S4219 to S4221 at the Vsync rhythm corresponding to the preset refresh rate.
  • the bullet chatting gear information is used to describe the bullet chatting gear in the bullet chatting scene, and may specifically include the bullet chatting gear.
• step S4222 can be understood as Surface Flinger determining whether the barrage gear information of the current frame has changed: it compares the gear of the current frame with the gear determined in the previous frame. If they differ, the gear information of the current frame is determined to have changed; if they are the same, it has not changed, in which case the flow ends and the check continues with the next frame.
• in addition, if the gear determined in step S4221 is the first gear ever determined, Surface Flinger also treats the gear information as changed in step S4222.
• when step S4222 determines that the barrage gear information has changed, step S4223 is triggered; if Surface Flinger determines that it has not changed, step S4223 need not be executed.
• the barrage gear information describes the current barrage scene and its gear, so the frame rate decision module can adjust the target refresh rate according to the gear to meet the requirements of the current barrage scene.
• the barrage gear information of the current frame can also include other information, for example the package name of the video application and the frame rate of the video source; this embodiment of the present application does not limit the information it contains.
• step S4223 is executed if and only if the barrage gear information has changed; when step S4222 determines that the gear information has not changed, step S4223 is not triggered. Since Surface Flinger sends the gear information to the frame rate decision module only when it changes, it does not need to send the current frame's gear information in real time, which saves power.
• for example, if the gear determined in the current frame is the high gear while the previously determined gear was the low gear, the gear information has changed and must be sent to the frame rate decision module. Likewise, when Surface Flinger determines the gear for the first time, it also needs to send the gear information to the frame rate decision module.
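The change-only notification of steps S4222 and S4223 can be sketched as follows; the class and callback names are illustrative assumptions.

```python
# Sketch: notify the frame rate decision module only when the barrage gear
# changes (or on the very first determination), saving power otherwise.
class GearNotifier:
    def __init__(self, send):
        self.send = send       # callback to the frame rate decision module
        self.last_gear = None  # no gear determined yet

    def on_frame(self, gear):
        if gear != self.last_gear:  # first gear, or gear changed
            self.last_gear = gear
            self.send(gear)         # otherwise stay silent to save power

sent = []
notifier = GearNotifier(sent.append)
for gear in ["low", "low", "high", "high", "low"]:
    notifier.on_frame(gear)
print(sent)  # ['low', 'high', 'low']
```

Only three of the five per-frame gears are actually sent, matching the power-saving rationale in the text.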
  • the frame rate decision module determines a target refresh rate according to the barrage gear information.
• the frame rate decision module can determine, from the received switching refresh rate information, the current barrage scene and the barrage gear in that scene, and then adjust the target refresh rate according to the barrage gear information.
  • a corresponding preset refresh rate can be allocated in advance for each barrage gear.
  • the preset refresh rate corresponding to the low barrage gear can be set to 30HZ
  • the preset refresh rate corresponding to the high barrage gear can be set to 40HZ.
• in some embodiments, the refresh rate determined from the barrage gear may be less than or equal to the preset refresh rate corresponding to the video application, and greater than or equal to the video source frame rate of the video being played by the video application.
• the target refresh rate may also be calculated from the barrage gear information through specific calculation rules; this embodiment of the present application does not limit the specific implementation of step S4224.
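A minimal sketch of step S4224, assuming the example gear rates above (30 Hz for the low gear, 40 Hz for the high gear) and the rule that the result stays between the video source frame rate and the application's preset refresh rate; function and constant names are assumptions.

```python
# Sketch: map the barrage gear to a target refresh rate, clamped to
# [video source frame rate, application preset refresh rate].
GEAR_RATES = {"low": 30, "high": 40}

def target_refresh_rate(gear, source_fps, app_preset_hz):
    rate = GEAR_RATES[gear]
    return max(source_fps, min(rate, app_preset_hz))

print(target_refresh_rate("low", 24, 60))   # 30
print(target_refresh_rate("high", 24, 60))  # 40
```

The clamp ensures the screen never refreshes slower than the video source (frames would be dropped) nor faster than the application's preset (power would be wasted).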
  • the frame rate decision module sends the target refresh rate to Surface Flinger.
  • the target refresh rate may be sent to Surface Flinger when the target refresh rate determined by the frame rate decision module changes.
  • the newly determined target refresh rate may also be sent to Surface Flinger at a preset frequency.
  • Surface Flinger controls the display of the video playback interface according to the target refresh rate.
• Surface Flinger draws and renders the layer of the video playback interface mentioned in step S4209 to obtain the rendered layer, merges it with the rendered barrage data received in step S4218, and sends the result to the display for display at the target refresh rate, so that the display refreshes the video playback interface at the target refresh rate.
  • the process of Surface Flinger controlling and displaying the video playback interface can refer to the drawing and rendering process of the main interface of the video application from step S4110 to step S4113 in Figure 4a.
• the main difference is that the target refresh rate here is no longer the preset refresh rate corresponding to the video application but the refresh rate determined from the barrage gear information, and what is displayed is no longer the main interface of the video application but the barrage-enabled video playback interface.
• for example, when the user clicks Video 1 on the main interface of the video application, the process shown in Figure 4b is triggered; after steps S4201 to S4212, it can be determined that the video application is currently in the video playback scene, and Surface Flinger also confirms enabling barrage statistics.
• the display interface changes to (2) in Figure 4c. Since Surface Flinger determines the barrage gear information only after a preset delay, the original 60 Hz refresh rate is still maintained when the playback interface of Video 1 shown in (2) of Figure 4c is first entered. Continue to refer to (2) in Figure 4c.
• after the preset delay, the low barrage gear is determined, and the refresh rate is switched to the 30 Hz rate of the low gear; the display interface changes to (3) in Figure 4c.
• at this time the number of displayed barrages is 3, a low-gear state, and the display is refreshed at 30 Hz.
• Surface Flinger can determine the barrage gear information from the number of barrages drawn in the current frame and the total number of barrages displayed in the current frame, and send switching refresh rate information carrying the gear information to the frame rate decision module, so that the module can adjust the target refresh rate according to the barrage gear. This meets the refresh rate requirements of the barrage scene and the smoothness requirements of the picture, without an excessive refresh rate that wastes power.
• in some embodiments, Surface Flinger determines the latest barrage gear in real time, but only when step S4222 determines that the gear has changed (for example, it differs from the gear determined in the previous frame) does it send the gear information again, to notify the frame rate decision module to readjust the target refresh rate.
• for example, at first the number of barrages in Video 1 is small and the low gear applies; after Video 1 has played for a while, the number of displayed barrages calculated by Surface Flinger in step S4220 increases and the number of barrages to be drawn also increases, so the gear determined in step S4221 becomes the high gear.
• the refresh rate is therefore switched again: the picture changes from (3) in Figure 4c to (4) in Figure 4c, and the display is refreshed at the 40 Hz rate of the high gear.
• after step S4226 is executed, Surface Flinger keeps refreshing the display at the target refresh rate determined from the barrage gear information. In the process, Surface Flinger also repeatedly executes step S4213, that is, it determines whether it is still in the barrage scene by checking whether barrage drawing information has been received within the preset time period.
• if it is no longer in the barrage scene, step S4228 needs to be executed to notify the frame rate decision module that the current scene has changed to the video playback scene.
• for example, when the video application stops drawing barrages, steps S4214 to S4218 are not executed for multiple consecutive frames.
• when Surface Flinger does not receive barrage drawing information for longer than the preset time, this means the video application is no longer in the barrage scene.
• since the judgment in the aforementioned step S4211 still indicates the video playback scene, it is determined that the application is only in the video playback scene and not in the barrage scene; step S4228 can then be executed to resend switching refresh rate information carrying the video playback scene to the frame rate decision module.
• the frame rate decision module then determines the target refresh rate again according to the video playback scene and the video source frame rate, so that the display of the electronic device refreshes the picture at the refresh rate for the video playback scene.
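The timeout check that detects leaving the barrage scene can be sketched as follows; the 1-second window and the function name are assumptions, since the text only speaks of an unspecified "preset time period".

```python
# Sketch: Surface Flinger concludes the application has left the barrage
# scene when no barrage drawing information arrives within the preset time.
PRESET_TIMEOUT_NS = 1_000_000_000  # assumed preset time period (1 s)

def in_barrage_scene(last_info_ns, now_ns):
    return now_ns - last_info_ns <= PRESET_TIMEOUT_NS

print(in_barrage_scene(5_000_000_000, 5_500_000_000))  # True: info is recent
print(in_barrage_scene(5_000_000_000, 6_500_000_000))  # False: barrage scene ended
```

When the check returns False while the layer/caller checks of step S4211 still indicate playback, only the plain video playback scene remains.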
• when it is determined in the aforementioned step S4213 that the application is in the video playback scene but not in the barrage scene, or when the video application changes from the barrage scene to the video playback scene in step S4227, Surface Flinger notifies the frame rate decision module that the current scene is the video playback scene, so that the module can reset the target refresh rate according to the video playback scene information.
  • the video playing scene information is used to indicate that the current playing scene is a video playing scene.
  • Surface Flinger sends the video playback scene information to the frame rate decision-making module to notify the frame rate decision-making module that it is currently a video playback scene, so the frame rate decision-making module needs to reset a target refresh rate suitable for the video playback scene.
  • the specific expression form of the video playing scene information is not limited in this embodiment of the application, for example, it may be represented by a specific symbol or a specific field.
  • the frame rate decision module determines a target refresh rate according to the video playing scene information and the frame rate of the video source.
• after the frame rate decision module receives the video playback scene information, it can determine that it is currently in the video playback scene. For example, in a video playback scene the module may determine the target refresh rate from the video source frame rate: the target refresh rate can be set to the source frame rate, or calculated from it using the refresh rate calculation rule for the video playback scene.
• in some embodiments, if step S4207 also sends video source information other than the frame rate, such as the video resolution, the target refresh rate may also be determined in combination with that other information; this embodiment does not limit the specific process of determining the target refresh rate from the video source frame rate.
  • the target refresh rate determined by using the frame rate of the video source in the video playback scenario can meet the requirements of the current video playback scenario.
  • the target refresh rate may also be determined only according to the video playback scene information.
  • the preset refresh rate corresponding to the video playback scene may be pre-stored, and when the video playback scene information is received, the frame rate decision module sets the target refresh rate as the preset refresh rate corresponding to the video playback scene.
• for example, if the preset refresh rate corresponding to the video playback scene is 48 Hz and the preset refresh rate corresponding to the video application is 60 Hz, the frame rate decision module adjusts the target refresh rate from the original 60 Hz to 48 Hz.
  • the frame rate decision module sends the target refresh rate to Surface Flinger.
  • step S4230 can refer to the aforementioned step S4225.
  • the main difference is that the target refresh rate determined in step S4225 is related to the barrage gear, while the refresh rate determined in step S4230 is related to the video playback scene.
  • Surface Flinger controls the display of the video playback interface according to the target refresh rate.
• for the process of step S4231, reference may be made to the above description of step S4226; the main difference is that the target refresh rate in step S4226 is determined from the barrage gear, while that in step S4231 is determined from the video playback scene. Details are not repeated here.
• for example, a high-gear barrage scene appears on the playback interface of Video 1. If the user turns off the barrage switch, steps S4227 to S4231 are triggered: it is determined that the application is no longer in the barrage scene but only in the video playback scene, the refresh rate switch is completed, and the picture on the display changes to (5) in Figure 4c.
• at this point the barrage switch is off, the playback interface of Video 1 no longer shows barrages, and the mobile phone refreshes the display at the 24 Hz refresh rate of the video playback scene.
• further, if the video application no longer plays video, Surface Flinger detects in step S4209 that the layer is destroyed and determines that it is not in the video playback scene. Since the determined scene information changes again, the scene information can be sent to the frame rate decision module again, and the module can readjust the target refresh rate, for example setting it to the preset refresh rate corresponding to the video application.
  • Surface Flinger determines that it is not in a video playback scene, it may also determine to turn off bullet chat statistics, and then the drawing and rendering module will stop bullet chat statistics.
• for example, when the user clicks the close icon to close the playback interface of Video 1, the mobile phone can determine that it is no longer in the video playback scene, and the refresh rate is switched again because of the scene change.
• the display changes to (6) in Figure 4c: the video application returns to its main interface, and the screen is refreshed at the 60 Hz preset refresh rate corresponding to the video application.
• in the embodiment of the present application, Surface Flinger can determine whether the video application is in the video playback scene from the layer information corresponding to the video playback interface and/or the caller information, and, when it is, send switching refresh rate information to the frame rate decision module so that the module readjusts the target refresh rate. The target refresh rate determined from the video playback scene can meet the performance requirements of the electronic device: it satisfies the display requirements of video playback without an excessive refresh rate that wastes power.
• Figure 4e is a schematic flowchart of the video playback stage of a video application in the refresh rate setting method proposed by another embodiment of the present application, applied to the electronic device proposed by the embodiment of the present application. The flow shown in Figure 4e can start after the flow shown in Figure 4a is completed and the main interface of the video application is displayed; at this time only the main interface of one video application is displayed in the foreground of the electronic device.
• the specific video playback stage process includes the following steps:
  • the video application In response to a user's operation of playing a video, the video application sends a request for starting a video playing activity to the AMS.
  • step S4301 for the execution process and principle of step S4301, reference may be made to the aforementioned step S4201 shown in FIG. 4b, which will not be repeated here.
  • the AMS starts the video playing activity.
  • step S4302 for the execution process and principle of step S4302, reference may be made to the aforementioned step S4202 shown in FIG. 4b, which will not be repeated here.
  • the AMS sends the window information corresponding to the video playback interface to the WMS.
  • step S4303 for the execution process and principle of step S4303, reference may be made to the aforementioned step S4203 shown in FIG. 4b, which will not be repeated here.
  • the WMS creates a window of the video playback interface according to the window information corresponding to the video playback interface.
  • step S4304 for the execution process and principle of step S4304, reference may be made to step S4204 shown in FIG. 4b above, which will not be repeated here.
  • the WMS sends information that the window creation is completed to the video application.
  • step S4305 for the execution process and principle of step S4305, reference may be made to the aforementioned step S4205 shown in FIG. 4b , which will not be repeated here.
  • the video application calls MediaCodec to perform video decoding.
  • step S4306 for the execution process and principle of step S4306, reference may be made to the aforementioned step S4206 shown in FIG. 4b, which will not be repeated here.
  • the Media Codec sends the frame rate of the video source and the called information to the frame rate decision module.
• the video source frame rate refers to the frame rate of the video source of the video the user wants to play. Since in step S4306 Media Codec has decoded that video under the call of the video application, Media Codec obtains the video source frame rate from the video source; in some other embodiments, Media Codec can also obtain other video source information such as the video resolution.
  • the called information is used to indicate that the Media Codec is called.
  • the called information may carry the package name of the video application to indicate that the MediaCodec is called by the video application.
  • Media Codec performs video decoding under the call of the video application, so MediaCodec can obtain the package name of the video application when the video application calls it, and then can carry the package name of the video application in the called information .
  • the called information may not carry the package name of the video application. Since the front end only displays the interface of a single video application in the scene shown in FIG. 4c, the application that calls Media Codec may not be specified in the called information.
  • S4308: the frame rate decision module sends the called information to Surface Flinger. The called information is used to indicate that Media Codec has been called. In some embodiments, the frame rate decision module may also send video source information, such as the video source frame rate and the video resolution, to Surface Flinger.
  • S4309: the WMS sends the layer information corresponding to the window of the video playback interface to Surface Flinger. For its execution process and principle, refer to step S4208 shown in FIG. 4b; it is not repeated here.
  • S4310: Surface Flinger creates the layer corresponding to the window of the video playback interface according to that layer information. For its execution process and principle, refer to step S4209 shown in FIG. 4b; it is not repeated here.
  • S4311: for the execution process and principle of step S4311, refer to step S4210 shown in FIG. 4b; it is not repeated here.
  • S4312: Surface Flinger determines, according to the layer information corresponding to the window of the video playback interface, whether the current scene is a video playback scene. If it is determined in step S4312 that it is a video playback scene, step S4313 is executed; if it is determined that it is not a video playback scene, the process ends and the display screen continues to refresh at the preset refresh rate corresponding to the video application. For its execution process and principle, refer to step S4211 shown in FIG. 4b; it is not repeated here.
  • S4313: Surface Flinger determines, according to the called information, that the video application has called Media Codec.
  • In step S4308, Surface Flinger received the called information, which indicates that Media Codec has been called. This further verifies that there is a current video playback requirement, since the video application calls Media Codec in order to perform video decoding.
  • On the basis of the determination in step S4312 that the current scene is a video playback scene, Surface Flinger verifies this again through step S4313, and then further executes step S4314 to determine whether the video playback scene is currently a barrage scene.
  • In some embodiments, after step S4313 determines that the video application has called Media Codec and thereby verifies that the current scene is a video playback scene, and before step S4314 is performed, the video playback scene information may also be sent directly to the frame rate decision module. The frame rate decision module then determines the target refresh rate according to the video playback scene information and the video source frame rate, and sends the target refresh rate to Surface Flinger, which controls the display of the video playback interface according to the target refresh rate. For the specific process and execution principle, refer to the subsequent steps S4330 to S4333; they are not repeated here. In this implementation, if it is determined in step S4315 that no barrage drawing information is received within the preset duration, the process can end directly, without performing steps S4330 to S4333 again.
  • Surface Flinger thus uses the called information to assist in verifying that the video application is in the video playback scene, which improves the recognition accuracy of the video playback scene.
  • Since step S4313 has re-verified that the current scene is a video playback scene, Surface Flinger can determine to enable barrage statistics, so that the drawing and rendering module starts counting barrages when drawing them. With barrage statistics enabled, Surface Flinger can further determine whether the current scene is a barrage scene, and can determine the barrage gear in the barrage scene based on the barrage statistics data.
  • S4314: Surface Flinger determines whether barrage drawing information is received within a preset duration. For its execution process and principle, refer to step S4212 shown in FIG. 4b; it is not repeated here.
  • S4315: the video application calls the drawing and rendering module to draw the barrage according to the rhythm of the Vsync signal. For its execution process and principle, refer to step S4213 shown in FIG. 4b; it is not repeated here.
  • S4316: the drawing and rendering module determines whether barrage statistics are enabled. For its execution process and principle, refer to step S4214 shown in FIG. 4b; it is not repeated here.
  • S4317: the drawing and rendering module increments the barrage counter by one. For its execution process and principle, refer to step S4215 shown in FIG. 4b; it is not repeated here.
  • S4318: for the execution process and principle of step S4318, refer to step S4216 shown in FIG. 4b; it is not repeated here.
  • S4319: the drawing and rendering module sends the barrage drawing information of the current frame to Surface Flinger according to the rhythm of the Vsync signal. For its execution process and principle, refer to step S4217 shown in FIG. 4b; it is not repeated here.
  • S4320: Surface Flinger determines that the barrage drawing information is received within the preset duration, and stores the number of barrages drawn in the current frame together with its timestamp. For its execution process and principle, refer to step S4218 shown in FIG. 4b; it is not repeated here.
  • S4321: Surface Flinger determines the total number of barrages displayed in the current frame according to the stored barrage counts, timestamps and barrage aging time. For its execution process and principle, refer to step S4219 shown in FIG. 4b; it is not repeated here.
  • S4322: Surface Flinger determines the barrage gear of the current frame according to the number of barrages drawn in the current frame and the total number of barrages displayed. For its execution process and principle, refer to step S4220 shown in FIG. 4b; it is not repeated here.
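The statistics kept across steps S4320 to S4321 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the class name `BarrageStats`, the field names and the aging-time value are all assumptions; only the idea of summing per-frame counts whose timestamps fall inside the aging window comes from the text above.

```python
# Minimal sketch of the barrage statistics kept by Surface Flinger
# (steps S4320 to S4321): for each frame it stores the number of
# barrages drawn together with a timestamp, and the total displayed in
# the current frame is the sum over records still inside the aging
# window (a barrage older than the aging time has scrolled off screen).

class BarrageStats:
    def __init__(self, aging_time_ms):
        # how long a drawn barrage stays visible on screen (assumption)
        self.aging_time_ms = aging_time_ms
        self.records = []  # list of (timestamp_ms, drawn_count)

    def store_frame(self, timestamp_ms, drawn_count):
        """Step S4320: store the count and timestamp of the current frame."""
        self.records.append((timestamp_ms, drawn_count))

    def total_displayed(self, now_ms):
        """Step S4321: barrages whose age is within the aging time are
        still on screen, so they count toward the displayed total."""
        return sum(count for ts, count in self.records
                   if now_ms - ts <= self.aging_time_ms)

stats = BarrageStats(aging_time_ms=8000)
stats.store_frame(0, 2)
stats.store_frame(5000, 3)
stats.store_frame(10000, 1)
# at t = 10000 ms the frame drawn at t = 0 has aged out (10000 > 8000)
print(stats.total_displayed(10000))  # 4
```

The barrage gear of step S4322 would then be derived from this total together with the per-frame drawn count.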
  • S4323: for the execution process and principle of step S4323, refer to step S4221 shown in FIG. 4b; it is not repeated here.
  • S4324: for the execution process and principle of step S4324, refer to step S4222 shown in FIG. 4b; it is not repeated here.
  • S4325: the frame rate decision module determines a target refresh rate according to the barrage gear information. For its execution process and principle, refer to step S4223 shown in FIG. 4b; it is not repeated here.
  • S4326: the frame rate decision module sends the target refresh rate to Surface Flinger. For its execution process and principle, refer to step S4224 shown in FIG. 4b; it is not repeated here.
  • S4327: Surface Flinger controls the display of the video playback interface according to the target refresh rate. For its execution process and principle, refer to step S4225 shown in FIG. 4b; it is not repeated here.
  • S4328: for the execution process and principle of step S4328, refer to step S4226 shown in FIG. 4b; it is not repeated here.
  • S4329: for the execution process and principle of step S4329, refer to step S4227 shown in FIG. 4b; it is not repeated here.
  • S4330: the frame rate decision module determines a target refresh rate according to the video playback scene information and the video source frame rate. For its execution process and principle, refer to step S4228 shown in FIG. 4b; it is not repeated here.
  • S4331: the frame rate decision module sends the target refresh rate to Surface Flinger. For its execution process and principle, refer to step S4229 shown in FIG. 4b; it is not repeated here.
  • S4332: for the execution process and principle of step S4332, refer to step S4230 shown in FIG. 4b; it is not repeated here.
  • S4333: for the execution process and principle of step S4333, refer to step S4231 shown in FIG. 4b; it is not repeated here.
  • For ease of description, the video application in the embodiments of this application is referred to simply as the application.
  • The first refresh rate refers to the preset refresh rate corresponding to the application; the second refresh rate refers to the refresh rate determined by the frame rate decision module in the barrage scene; the third refresh rate refers to the refresh rate determined by the frame rate decision module in the video playback scene.
  • The interfaces output on the display screen by the application, such as the video playback interface and the main application interface mentioned in the embodiments of this application, are collectively referred to as the application interface of the application.
  • The layer corresponding to the window of the video playback interface mentioned in the embodiments of this application is also referred to simply as the layer of the video playback interface, and the layer corresponding to the window of the main application interface as the layer of the main application interface.
  • After the electronic device responds to the user's operation of starting the application, it refreshes the displayed application interface of the application at the first refresh rate. The first refresh rate may be the preset refresh rate corresponding to the application.
  • In response to receiving the user's operation of playing a video, the electronic device displays the video playback interface at the first refresh rate. The video playback interface is used to display the video picture of the video the user plays. There are many ways of displaying the video playback interface at the first refresh rate in response to that operation, including but not limited to those proposed in the embodiments of this application.
  • From steps S4210 to S4231 in FIG. 4b and steps S4311 to S4333 in FIG. 4e, it can be seen that in the embodiments of this application, while the video picture is being displayed, the refresh rate is adjusted by determining whether barrage drawing is performed within the preset duration. Specifically, if barrage drawing is performed within the preset duration, the barrage data of each displayed frame of the video is counted, and the application interface of the application is refreshed at the second refresh rate according to the barrage data of each displayed frame.
  • If barrage drawing is not performed within the preset duration, the application interface of the application is refreshed at the third refresh rate, which is different from the second refresh rate. The third refresh rate is the refresh rate used in the video playback scene, and the second refresh rate is the refresh rate determined, in the barrage scene, according to the barrage data of each displayed frame.
  • In this way, in the barrage scene the refresh rate is adjusted to the second refresh rate, which meets the needs of the current scene, to refresh the application interface; and in the video playback scene it is adjusted to the third refresh rate, which meets the needs of the video playback scene.
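The overall policy just summarized can be condensed into a small dispatcher. This is purely an illustration: the concrete rates, the gear table and the function name are assumptions, not values given in the patent; only the three-way split between the first, second and third refresh rates follows the text above.

```python
# Sketch of the refresh-rate choice summarized above: in the barrage
# scene the second refresh rate follows the barrage gear; in the plain
# video playback scene the third refresh rate follows the video source
# frame rate; otherwise the application's preset (first) rate is used.
# All concrete numbers here are illustrative assumptions.

PRESET_RATE = 60                         # first refresh rate (app preset)
GEAR_RATES = {"low": 60, "high": 90}     # second refresh rate per gear

def choose_refresh_rate(playing_video, barrage_drawn_recently,
                        barrage_gear=None, source_fps=None):
    if not playing_video:
        return PRESET_RATE               # first refresh rate
    if barrage_drawn_recently:
        return GEAR_RATES[barrage_gear]  # second refresh rate
    # video playback scene: third refresh rate follows the source fps,
    # e.g. a 24 fps movie can be shown at a 24 Hz refresh rate
    return source_fps

print(choose_refresh_rate(False, False))                     # 60
print(choose_refresh_rate(True, True, barrage_gear="high"))  # 90
print(choose_refresh_rate(True, False, source_fps=24))       # 24
```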
  • In the several embodiments provided in this application, it should be understood that the disclosed system, device and method may be implemented in other ways.
  • The device embodiments described above are only illustrative. The division of the modules or units is only a logical function division; in actual implementation there may be other division methods. For example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented.
  • The mutual coupling, direct coupling or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • Each functional unit in the embodiments may be integrated into one processing unit, or each unit may physically exist separately, or two or more units may be integrated into one unit. The integrated units may be implemented in the form of hardware or in the form of software functional units.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of this embodiment, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments. The aforementioned storage medium includes: a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, an optical disk, or any other medium capable of storing program code.


Abstract

This application discloses a refresh rate setting method and a related device, relating to the field of display technology, and aims to meet an electronic device's refresh rate requirements during video playback. The specific solution is as follows: in response to receiving a user operation of starting an application, the application interface of the application is refreshed and displayed at a first refresh rate. In response to receiving a user operation of playing a video, the video picture is displayed at the first refresh rate. While the video picture is displayed, it is determined whether barrage drawing is performed within a preset duration. If barrage drawing is performed within the preset duration, the barrage data of each displayed frame of the video is counted, and the application interface of the application is refreshed at a second refresh rate according to the barrage data of each displayed frame, meeting the display requirements of the barrage scene. If barrage drawing is not performed within the preset duration, the application interface of the application is refreshed at a third refresh rate, meeting the display requirements of the video playback scene.

Description

Refresh rate setting method and related device
This application claims priority to the Chinese patent application No. 202111545016.7, entitled "Refresh rate setting method and related device", filed with the China National Intellectual Property Administration on December 16, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of display technology, and in particular to a refresh rate setting method and a related device.
Background
The refresh rate is the number of times the screen picture is refreshed per second. With the development of display technology, the refresh rates supported by electronic devices such as mobile phones and tablet computers keep increasing. At present, to keep the picture smooth, an electronic device usually refreshes the display at a fixed high refresh rate, for example a fixed 60 Hz, 90 Hz or 120 Hz.
However, in a video playback scene of the electronic device, the refresh rate requirement is often low. As a result, during video playback the actual refresh rate may be higher than the required refresh rate, overloading the processor and wasting system power.
Summary
This application provides a refresh rate setting method and a related device, with the purpose of meeting the refresh rate requirements during video playback.
To achieve the above purpose, this application provides the following technical solutions:
In a first aspect, an embodiment of this application discloses a refresh rate setting method, applied to an electronic device on which an application is installed. The method includes:
In response to receiving a user operation of starting the application, refreshing and displaying the application interface of the application at a first refresh rate. In response to receiving a user operation of playing a video, displaying the video picture at the first refresh rate. While displaying the video picture, determining whether barrage drawing is performed within a preset duration. If barrage drawing is performed within the preset duration, counting the barrage data of each displayed frame of the video, and then refreshing and displaying the application interface of the application at a second refresh rate according to the barrage data of each displayed frame. If barrage drawing is not performed within the preset duration, refreshing and displaying the application interface of the application at a third refresh rate, the third refresh rate being different from the second refresh rate.
In the embodiments of this application, while the video picture is displayed, it is determined in real time whether barrage drawing is performed within the preset duration. If barrage drawing is performed within the preset duration, it can be determined that the application has entered the barrage scene; therefore the barrage data of each displayed frame of the video is counted, and the application interface of the application is refreshed at the second refresh rate according to that barrage data, so that the interface is refreshed at a rate meeting the display requirements of the barrage scene. If barrage drawing is not performed within the preset duration, the application can be considered to have entered the video playback scene, and the application interface of the application is refreshed at the third refresh rate to meet the display requirements of the video playback scene.
In a possible implementation, refreshing and displaying the application interface of the application at the second refresh rate according to the barrage data of each displayed frame includes:
determining the second refresh rate according to the barrage data of each displayed frame, and refreshing and displaying the application interface of the application at the determined second refresh rate.
In another possible implementation, determining the second refresh rate according to the barrage data of each displayed frame includes:
determining the current barrage gear according to the currently counted barrage data of each displayed frame, the barrage gear indicating the magnitude of barrage density on the displayed frame; and determining the second refresh rate according to the current barrage gear, the second refresh rate being the preset refresh rate corresponding to the current barrage gear, where the greater the barrage density indicated by the barrage gear, the higher the determined second refresh rate.
In another possible implementation, the barrage data of one displayed frame includes: the number of barrages drawn in the frame, and/or the number of barrage characters drawn in the frame.
In another possible implementation, determining the current barrage gear according to the currently counted barrage data of each displayed frame includes:
calculating the total number of barrages displayed in the current frame according to the currently counted number of barrages drawn in each displayed frame, the timestamp corresponding to each displayed frame, and the barrage aging time; and determining the current barrage gear according to the number of barrages drawn in the current frame and the total number of barrages displayed in the current frame.
In another possible implementation, the barrage gear includes a high barrage gear or a low barrage gear. Determining the current barrage gear according to the number of barrages drawn in the current frame and the total number of barrages displayed in the current frame includes:
if the number of barrages drawn in the current frame is less than a first preset value and the total number of barrages displayed in the current frame is less than a second preset value, determining that the current barrage gear is the low barrage gear; and if the number of barrages drawn in the current frame is greater than or equal to the first preset value, or the total number of barrages displayed in the current frame is greater than or equal to the second preset value, determining that the current barrage gear is the high barrage gear.
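The two-branch rule above can be written directly. The threshold values below are illustrative assumptions; the patent does not fix concrete numbers for the first and second preset values.

```python
# Gear decision from the implementation above: low gear only when both
# the per-frame drawn count and the total displayed count are under
# their respective thresholds; otherwise high gear. Threshold values
# are assumptions standing in for the first and second preset values.

FIRST_PRESET = 5    # limit on barrages drawn in the current frame
SECOND_PRESET = 30  # limit on total barrages displayed in the frame

def barrage_gear(drawn_in_frame, total_displayed):
    if drawn_in_frame < FIRST_PRESET and total_displayed < SECOND_PRESET:
        return "low"
    return "high"

print(barrage_gear(2, 10))   # low
print(barrage_gear(8, 10))   # high
print(barrage_gear(2, 40))   # high
```

Note that the two branches are exact complements, so every frame falls into exactly one gear.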
In another possible implementation, counting the barrage data of each displayed frame of the video includes:
counting the barrage data of a specific region in each displayed frame of the video.
In another possible implementation, before determining whether barrage drawing is performed within the preset duration, the method further includes:
determining, according to the layer information of the application interface, whether the current scene is a video playback scene, the video playback scene being a scene in which a video is played; where determining whether barrage drawing is performed within the preset duration includes: if the scene is determined to be a video playback scene, determining whether barrage drawing is performed within the preset duration.
In another possible implementation, before determining whether barrage drawing is performed within the preset duration, the method further includes:
determining whether the video is being decoded; where, if the scene is determined to be a video playback scene, determining whether barrage drawing is performed within the preset duration includes: if the scene is determined to be a video playback scene and it is determined that the video is being decoded, determining whether barrage drawing is performed within the preset duration.
In another possible implementation, before determining, according to the layer information of the application interface, whether the current scene is a video playback scene, the method further includes:
determining whether the application is in a playback whitelist, the playback whitelist being a list of applications with video playback permission; where determining, according to the layer information of the application interface, whether the current scene is a video playback scene includes: if the application is determined to be in the playback whitelist, determining, according to the layer information of the application interface, whether the current scene is a video playback scene.
In another possible implementation, determining whether the application is in the playback whitelist includes:
determining, according to the package name of the application carried in the layer information of the application interface, whether the application is in the playback whitelist, the playback whitelist including the package name of every application with video playback permission.
In another possible implementation, determining, according to the layer information of the application interface, whether the current scene is a video playback scene includes:
determining whether the layer information of the application interface contains the feature information of a video layer; if it does, determining that the current scene is a video playback scene; if it does not, determining that the current scene is not a video playback scene.
In another possible implementation, the feature information of the video layer is the video layer SurfaceView field carried in the layer name.
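As a rough illustration of this check, a layer whose name carries the SurfaceView field is treated as a video layer. The sample layer names below are invented for the example; only the substring test on "SurfaceView" comes from the text above.

```python
# Sketch of the video-layer check: the scene is judged to be a video
# playback scene when any layer name of the application interface
# carries the "SurfaceView" field. Layer names here are invented.

def is_video_playback_scene(layer_names):
    return any("SurfaceView" in name for name in layer_names)

print(is_video_playback_scene(
    ["SurfaceView - com.example.video/PlayerActivity"]))  # True
print(is_video_playback_scene(
    ["com.example.video/MainActivity#0"]))                # False
```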
In another possible implementation, the third refresh rate is determined according to the video source frame rate of the video.
In another possible implementation, the method further includes: in response to receiving a user operation of exiting video playback, refreshing and displaying the application interface of the application at the first refresh rate.
In another possible implementation, the operating system of the electronic device includes the application and the image compositor Surface Flinger. Determining whether barrage drawing is performed within the preset duration includes:
Surface Flinger determining whether barrage drawing information of one frame is received within the preset duration, the barrage drawing information of one frame including: the number of barrages drawn in the frame, the timestamp corresponding to the frame, and the drawn and rendered barrage data of the frame.
In another possible implementation, the operating system of the electronic device further includes a frame rate decision module. The barrage data of one displayed frame includes the number of barrages drawn in the frame. If barrage drawing is performed within the preset duration, counting the barrage data of each displayed frame of the video includes:
if Surface Flinger receives the barrage drawing information of one frame within the preset duration, storing the number of barrages drawn in the frame and the timestamp corresponding to the frame.
In another possible implementation, the operating system of the electronic device further includes a frame rate decision module. Refreshing and displaying the application interface of the application at the second refresh rate according to the barrage data of each displayed frame includes:
Surface Flinger determining the current barrage gear according to the stored number of barrages drawn in each displayed frame, the timestamp corresponding to each displayed frame, and the barrage aging time. Surface Flinger sending the current barrage gear information, which includes the current barrage gear, to the frame rate decision module. The frame rate decision module determining the second refresh rate according to the barrage gear information and sending the determined second refresh rate to Surface Flinger. Surface Flinger controlling, at the second refresh rate, the display screen of the electronic device to refresh and display the application interface of the application.
In another possible implementation, Surface Flinger sending the current barrage gear information to the frame rate decision module includes:
if Surface Flinger determines that the current barrage gear has not changed, sending the current barrage gear information to the frame rate decision module.
In another possible implementation, the operating system of the electronic device further includes a drawing and rendering module, which includes a barrage counter. If Surface Flinger receives the barrage drawing information of one frame within the preset duration, then before Surface Flinger receives that information, the method further includes:
Surface Flinger determining to enable barrage statistics. The application calling the drawing and rendering module to draw the barrage according to the rhythm of the Vsync signal. The drawing and rendering module determining whether barrage statistics are enabled and, if so, controlling the barrage counter to count the number of barrages drawn. The drawing and rendering module sending the barrage drawing information of the current frame to Surface Flinger according to the rhythm of the Vsync signal.
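The enable/count/report handshake in this implementation can be mocked as a tiny counter driven once per Vsync. The class and method names below are illustrative assumptions; only the behavior (count only while statistics are enabled, report and reset on each Vsync) follows the text above.

```python
# Sketch of the drawing and rendering module's barrage counter:
# counting happens only while barrage statistics are enabled, and at
# each Vsync tick the per-frame count is reported together with a
# timestamp, then reset for the next frame.

class BarrageCounter:
    def __init__(self):
        self.enabled = False
        self.count = 0

    def enable_statistics(self):      # triggered by Surface Flinger
        self.enabled = True

    def on_barrage_drawn(self):       # called once per barrage drawn
        if self.enabled:
            self.count += 1

    def on_vsync(self, timestamp_ms):
        """Report this frame's drawing info and reset for the next frame."""
        info = {"timestamp_ms": timestamp_ms, "drawn": self.count}
        self.count = 0
        return info

c = BarrageCounter()
c.on_barrage_drawn()          # ignored: statistics not yet enabled
c.enable_statistics()
c.on_barrage_drawn()
c.on_barrage_drawn()
print(c.on_vsync(16))         # {'timestamp_ms': 16, 'drawn': 2}
```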
In another possible implementation, the operating system of the electronic device further includes a frame rate decision module. If barrage drawing is not performed within the preset duration, refreshing and displaying the application interface of the application at the third refresh rate includes:
if Surface Flinger does not receive barrage drawing information within the preset duration, sending the video playback scene information to the frame rate decision module, the video playback scene being a scene in which a video is played. The frame rate decision module determining the third refresh rate according to the video playback scene information and the video source frame rate, and sending the determined third refresh rate to Surface Flinger. Surface Flinger controlling, at the third refresh rate, the display screen of the electronic device to refresh and display the application interface of the application.
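The patent only states that the third refresh rate is determined from the video source frame rate. One plausible policy, shown here purely as an assumption and not taken from the patent, is to pick the lowest panel-supported refresh rate that is an integer multiple of the source frame rate, so that each source frame is shown a whole number of times.

```python
# Illustrative policy for deriving the third refresh rate from the
# video source frame rate: choose the lowest refresh rate supported by
# the panel that is an integer multiple of the source fps. The
# supported-rate list and the policy itself are assumptions.

SUPPORTED_RATES = [24, 30, 48, 60, 90, 120]  # ascending order

def third_refresh_rate(source_fps):
    for rate in SUPPORTED_RATES:
        if rate % source_fps == 0:
            return rate
    return SUPPORTED_RATES[-1]  # no exact multiple; fall back to highest

print(third_refresh_rate(24))  # 24  (a movie can refresh at 24 Hz)
print(third_refresh_rate(30))  # 30
print(third_refresh_rate(25))  # 120 (no multiple available; fallback)
```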
In another possible implementation, the operating system of the electronic device further includes the codec Media Codec. After receiving the user operation of playing the video, the method further includes:
the application calling Media Codec to decode the video, and Media Codec sending the video source frame rate of the video to the frame rate decision module.
In another possible implementation, the operating system of the electronic device includes Surface Flinger. Determining, according to the layer information of the application interface, whether the current scene is a video playback scene includes:
Surface Flinger determining whether the layer name in the layer information of the application interface carries the SurfaceView field; if the layer name carries the SurfaceView field, determining that the scene is a video playback scene; if it does not, determining that the scene is not a video playback scene.
In another possible implementation, the operating system of the electronic device includes the application, Surface Flinger and the drawing and rendering module. Displaying the video picture at the first refresh rate includes:
Surface Flinger triggering, at the Vsync signal rhythm corresponding to the first refresh rate, the application to draw and render the layer of the video playback interface, the video playback interface being the application interface used to display the video picture. The application calling, at that Vsync rhythm, the drawing and rendering module to draw and render the layer of the video playback interface. The drawing and rendering module sending, at that rhythm, the drawn and rendered video playback interface image data to Surface Flinger. Surface Flinger performing, at that rhythm, layer composition using the drawn and rendered video playback interface image data. Surface Flinger outputting, at that rhythm, the composed video playback interface image data to the display screen for display.
In another possible implementation, the operating system of the electronic device further includes AMS and the window management module WMS. Before Surface Flinger triggers, at the Vsync signal rhythm corresponding to the first refresh rate, the application to draw and render the layer of the video playback interface, the method further includes:
the application sending a request to start the video playback Activity to the AMS, the request carrying the package name of the application and the name of the video playback interface. The AMS starting the video playback Activity according to the package name of the application and the name of the video playback interface. The AMS sending the window information corresponding to the video playback interface to the WMS. The WMS creating the window of the video playback interface according to that window information. The WMS sending the layer information of the video playback interface, which carries the package name of the application and corresponds to the window of the video playback interface, to Surface Flinger. Surface Flinger creating the layer of the video playback interface according to that layer information.
In another possible implementation, the operating system of the electronic device includes the application, Surface Flinger and Media Codec. Determining whether the video is being decoded includes:
if Surface Flinger receives the called information, determining that the application has called Media Codec to decode the video, the called information indicating that Media Codec has been called; and if Surface Flinger does not receive the called information, determining that the application has not called Media Codec to decode the video.
In another possible implementation, if Surface Flinger receives the called information, then before Surface Flinger receives the called information, the method further includes:
the application calling Media Codec to decode the video, and Media Codec sending the called information to Surface Flinger.
In another possible implementation, before the application calls Media Codec to decode the video, the method further includes:
the WMS sending information that the window of the video playback interface has been created to the application.
In another possible implementation, determining whether the application is in the playback whitelist includes:
Surface Flinger determining, according to the package name of the application carried in the layer information of the application interface, whether the application is in the playback whitelist, the playback whitelist including the package name of every application with video playback permission.
In another possible implementation, the operating system of the electronic device includes the application, Surface Flinger, the frame rate decision module and the drawing and rendering module. Refreshing and displaying the application interface of the application at the first refresh rate includes:
Surface Flinger triggering, at the Vsync signal rhythm corresponding to the first refresh rate, the application to draw and render the layer of the main application interface, the main application interface being the application interface displayed after the application starts. The application calling, at that rhythm, the drawing and rendering module to draw and render the layer of the main application interface. The drawing and rendering module sending, at that rhythm, the drawn and rendered main application interface image data to Surface Flinger. Surface Flinger performing, at that rhythm, layer composition using the drawn and rendered main application interface image data. Surface Flinger outputting, at that rhythm, the composed main application interface image data to the display screen for display.
In another possible implementation, the operating system of the electronic device further includes the desktop launcher Launcher, AMS and WMS. Before Surface Flinger triggers, at the Vsync signal rhythm corresponding to the first refresh rate, the application to draw and render the layer of the main application interface, the method further includes:
the Launcher sending a request to start the application Activity, which carries the package name of the application, to the AMS. The AMS starting the application Activity and sending the window information corresponding to the main application interface, which carries the package name of the application, to the WMS. The WMS creating the window of the main application interface according to that window information and sending the package name of the application to the frame rate decision module. The WMS sending the layer information of the main application interface, which corresponds to the window of the main application interface, to Surface Flinger. The frame rate decision module determining the first refresh rate according to the package name of the application, the first refresh rate being the preset refresh rate corresponding to the application. Surface Flinger creating the layer of the main application interface according to that layer information.
In a second aspect, this application discloses an electronic device, including one or more processors, a memory and a display screen. The memory and the display screen are respectively coupled to the one or more processors. The display screen is used to display the application interface. The memory is used to store computer program code, the computer program code including computer instructions; when the one or more processors execute the computer instructions, the electronic device performs the refresh rate setting method of any one of the implementations of the first aspect.
It should be understood that the description of technical features, technical solutions, beneficial effects or similar language in this application does not imply that all of the features and advantages can be realized in any single embodiment. Rather, descriptions of features or beneficial effects mean that at least one embodiment includes the specific technical feature, technical solution or beneficial effect. Therefore, descriptions of technical features, technical solutions or beneficial effects in this specification do not necessarily refer to the same embodiment. Furthermore, the technical features, technical solutions and beneficial effects described in the embodiments may be combined in any suitable manner. Those skilled in the art will understand that an embodiment can be implemented without one or more of the specific technical features, technical solutions or beneficial effects of a particular embodiment. In other embodiments, additional technical features and beneficial effects may be identified in specific embodiments that do not embody all of the embodiments.
Brief Description of the Drawings
FIG. 1 is a first schematic diagram of a video playback scene of a video application proposed in an embodiment of this application;
FIG. 2 is a hardware structure diagram of an electronic device proposed in an embodiment of this application;
FIG. 3 is a system diagram of an electronic device proposed in an embodiment of this application;
FIG. 4a is a flowchart of the refresh rate setting method proposed in an embodiment of this application in the application startup phase;
FIG. 4b is a flowchart of the refresh rate setting method proposed in an embodiment of this application in the video playback phase;
FIG. 4c is a second schematic diagram of a video playback scene of a video application proposed in an embodiment of this application;
FIG. 4d is a third schematic diagram of a video playback scene of a video application proposed in an embodiment of this application;
FIG. 4e is a flowchart of another refresh rate setting method proposed in an embodiment of this application in the video playback phase.
Detailed Description
The technical solutions in the embodiments of this application will be described clearly and completely below with reference to the accompanying drawings. The terms used in the following embodiments are only for the purpose of describing specific embodiments, and are not intended to limit this application. As used in the specification and the appended claims, the singular expressions "a", "an", "the", "the above", "said" and "this" are intended to also include expressions such as "one or more", unless the context clearly indicates otherwise. It should also be understood that in the embodiments of this application, "one or more" means one, two or more; "and/or" describes an association relationship between associated objects and indicates that three relationships may exist: for example, A and/or B may represent A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects.
Reference in this specification to "one embodiment" or "some embodiments" and the like means that a specific feature, structure or characteristic described in connection with the embodiment is included in one or more embodiments of this application. Thus, the phrases "in one embodiment", "in some embodiments", "in some other embodiments", "in still other embodiments" and the like appearing in different places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments", unless otherwise specifically emphasized. The terms "including", "comprising", "having" and their variants all mean "including but not limited to", unless otherwise specifically emphasized.
"Multiple" in the embodiments of this application means two or more. It should be noted that in the description of the embodiments of this application, words such as "first" and "second" are used only for the purpose of distinguishing the description, and cannot be understood as indicating or implying relative importance or order.
For clarity and brevity of the following embodiments, a brief introduction of one refresh rate setting scheme is first given:
Taking an electronic device such as a mobile phone as an example, each application on the phone can be set with a corresponding preset refresh rate; the preset refresh rates of different applications may be the same or different. After the user taps to start an application, the phone can refresh the screen at the preset refresh rate corresponding to the application while the application runs.
Specifically, for example, as shown in (1) of FIG. 1, the phone home screen displays the Clock, Calendar, Gallery, Memo, File Manager, Email, Music, Calculator, Video, Health, Weather and Browser applications. The user taps the icon of the video application, and the phone interface changes to (2) of FIG. 1. As shown in (2) of FIG. 1, the main application interface shows playable videos such as Video 1, Video 2, Video 3, Video 4 and Video 5. At this point the application has started, and the phone refreshes at the preset refresh rate of 60 Hz corresponding to the application. The user taps Video 1 on the main application interface to play it, and the phone interface changes to (3) of FIG. 1, which shows the playback interface of Video 1; the barrage switch is off, and during the playback shown in (3) of FIG. 1 the video picture is still refreshed at 60 Hz. The user taps to enable the barrage, and the phone picture changes to (4) of FIG. 1; the barrage switch is now on, barrages scroll across the video picture, and the picture is still refreshed and displayed at 60 Hz.
However, in the application's video playback scene, the preset refresh rate corresponding to the application can hardly match the refresh rate requirements of the playback process. For example, when Video 1 played in (3) of FIG. 1 is a movie, a refresh rate of only 24 Hz is enough for smooth display; refreshing the video picture at 60 Hz overloads the processor and consumes more power, failing the phone's low power consumption requirement. For another example, in the playback-with-barrage scene shown in (4) of FIG. 1, scrolling barrages must also be displayed dynamically on the picture, so the refresh rate requirement is higher than in the barrage-off scene shown in (3) of FIG. 1; if the refresh rate is too low, the scrolling barrage is prone to stutter.
From the above description it can be seen that: (1) the preset refresh rate corresponding to the application is higher than the refresh rate actually required in the video playback scene, which easily overloads the processor and consumes more power, failing the phone's low power consumption requirement; and (2) the preset refresh rate corresponding to the application is fixed, while the refresh rate requirement in the video playback scene changes dynamically, so the preset refresh rate cannot adapt to the different refresh rate requirements of the video playback scene.
Based on the problems of the above technical solution, an embodiment of this application proposes a refresh rate setting method to meet the refresh rate requirements of applications in different video playback scenes.
The refresh rate setting method proposed in the embodiments of this application is applicable to electronic devices such as mobile phones, tablet computers, notebook computers, ultra-mobile personal computers (UMPC), handheld computers, netbooks, personal digital assistants (PDA), wearable electronic devices and smart watches, on which a video application is installed. The video applications mentioned in the embodiments of this application all refer to applications with a video playback function.
As shown in FIG. 2, the electronic device proposed in the embodiments of this application may include a processor 110, an external memory interface 120, an internal memory 121, a charging management module 130, a power management module 131, a battery 132, an antenna 1, an antenna 2, a mobile communication module 140, a wireless communication module 150, an audio module 160, a speaker 160A, a sensor module 170, and a display screen 180. The sensor module 170 may include a fingerprint sensor 170A, a touch sensor 170B, and the like.
It can be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device. In other embodiments, the electronic device may include more or fewer components than illustrated, combine some components, split some components, or have a different component arrangement. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), and a video codec. The different processing units may be independent devices or may be integrated in one or more processors. For example, in this application, the processor 110 may be used to execute any refresh rate setting method proposed in the embodiments of this application; for example, the GPU included in the processor 110 may be used to display, on the display screen 180, the images and barrages involved in any refresh rate setting method proposed in the embodiments of this application. For details, refer to the related content of FIG. 4a, FIG. 4b and FIG. 4e below, which is not repeated here.
A memory may also be provided in the processor 110 for storing instructions and data.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include a mobile industry processor interface (MIPI), etc.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display screen 180. The MIPI interface includes a display serial interface (DSI), etc. In some embodiments, the processor 110 and the display screen 180 communicate through the DSI interface to implement the display function of the electronic device. For example, in the embodiments of this application, the processor 110 and the display screen 180 communicate through the DSI interface to display, on the display screen 180, the images and barrages involved in any refresh rate setting method proposed in the embodiments of this application.
It can be understood that the interface connection relationship between the modules illustrated in this embodiment is only a schematic illustration and does not constitute a structural limitation on the electronic device. In other embodiments of this application, the electronic device may also adopt interface connection methods different from those in the above embodiments, or a combination of multiple interface connection methods.
The charging management module 130 is used to receive charging input from a charger. The charger may be a wireless charger or a wired charger.
The power management module 131 is used to connect the battery 132 and the charging management module 130 with the processor 110. The power management module 131 receives input from the battery 132 and/or the charging management module 130 and supplies power to the processor 110, the internal memory 121, the display screen 180, etc.
The wireless communication function of the electronic device can be implemented through the antenna 1, the antenna 2, the mobile communication module 140, the wireless communication module 150, the modem processor, the baseband processor, etc.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
The mobile communication module 140 can provide wireless communication solutions applied to the electronic device, including 2G/3G/4G/5G.
The wireless communication module 150 can provide wireless communication solutions applied to the electronic device, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks) and Bluetooth (BT).
In some embodiments, the antenna 1 of the electronic device is coupled to the mobile communication module 140, and the antenna 2 is coupled to the wireless communication module 150, so that the electronic device can communicate with networks and other devices through wireless communication technology.
The electronic device implements the display function through the GPU, the display screen 180, the application processor, etc. The GPU is a microprocessor for image processing and connects the display screen 180 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information. For example, in the embodiments of this application, the GPU may be used to display, on the display screen 180, the images and barrages involved in any refresh rate setting method proposed in the embodiments of this application. For details, refer to the related content of FIG. 4a, FIG. 4b and FIG. 4e below, which is not repeated here.
The display screen 180 is used to display images, videos, etc. The display screen 180 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light emitting diode (AMOLED), etc. In some embodiments, the electronic device may include 1 or N display screens 180, where N is a positive integer greater than 1.
A series of graphical user interfaces (GUIs) can be displayed on the display screen 180 of the electronic device, and these GUIs are the home screens of the electronic device. For example, in the embodiments of this application, the display screen 180 may be used to display the images and barrages involved in any refresh rate setting method proposed in the embodiments of this application. For details, refer to the related content of FIG. 4a, FIG. 4b and FIG. 4e, which is not repeated here.
The video codec is used to compress or decompress digital video. The electronic device may support one or more video codecs, so that it can play or record videos in multiple encoding formats, for example: moving picture experts group (MPEG) 1, MPEG2, MPEG3 and MPEG4. For example, in the embodiments of this application, the video codec may be used to decode the video played by the video application in any refresh rate setting method proposed in the embodiments of this application. For details, refer to the related content of S4206 in FIG. 4b and S4306 in FIG. 4e, which is not repeated here. In other embodiments, the video codec may also be software, for example the codec (Media Codec) shown in FIG. 3 below.
The external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device.
The internal memory 121 may be used to store computer executable program code, the executable program code including instructions. The processor 110 executes the various functional applications and data processing of the electronic device by running the instructions stored in the internal memory 121. For example, in this embodiment, the processor 110 may execute any refresh rate setting method mentioned in the embodiments of this application by executing the instructions stored in the internal memory 121. For details, refer to the related content of FIG. 4a, FIG. 4b and FIG. 4e below, which is not repeated here.
The audio module 160 is used to convert digital audio information into an analog audio signal output, and to convert an analog audio input into a digital audio signal.
The speaker 160A, also called the "horn", is used to convert an audio electrical signal into a sound signal.
The fingerprint sensor 170A is used to collect fingerprints. The electronic device can use the collected fingerprint characteristics to implement fingerprint unlocking, accessing the application lock, fingerprint photographing, fingerprint answering of incoming calls, etc.
The touch sensor 170B is also called a "touch device". The touch sensor 170B may be arranged on the display screen 180; the touch sensor 170B and the display screen 180 form a touch screen, also called a "touch panel".
In addition, an operating system runs on the above components, for example the iOS operating system, the Android open source operating system or the Windows operating system. Application programs can be installed and run on the operating system.
FIG. 3 is a structural block diagram of the electronic device of an embodiment of this application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
Referring to FIG. 3, the application layer may include the desktop launcher (Launcher) and the video application.
The Launcher is used to start the application Activity.
For example, in this embodiment, as shown in FIG. 3, the Launcher, in response to the user's operation of tapping the icon of the video application, issues a request to start the application Activity to the Activity Manager Service (AMS). For details, refer to the related content of step S4101 shown in FIG. 4a below, which is not repeated here.
The video application is used to provide the user with the video playback function.
For example, in this embodiment, as shown in FIG. 3, the video application, in response to the user's operation of tapping to play a video, calls the codec (Media Codec) to decode the video and issues a request to start the video playback Activity to the AMS. For details, refer to the related content of steps S4201 and S4206 shown in FIG. 4b, and steps S4301 and S4306 in FIG. 4e, which is not repeated here.
For example, in this embodiment, as shown in FIG. 3, after being triggered by the image compositor (Surface Flinger) to draw and render, the video application calls the drawing and rendering thread of the drawing and rendering module to draw and render the image data and/or barrage data, and sends the drawn and rendered data to Surface Flinger for composition. For details, refer to the related content of steps S4110 to S4112 shown in FIG. 4a, which is not repeated here.
The application framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer. The application framework layer includes some predefined functions. As shown in FIG. 3, the application framework layer may include Media Codec, AMS, the window management module (Window Manager Service, WMS), and the drawing and rendering module.
Media Codec is used to handle audio and video encoding and decoding.
For example, in this embodiment, as shown in FIG. 3, Media Codec decodes the video to be played by the video application and sends the video source frame rate and the called information to the frame rate decision module. For details, refer to the related content of step S4207 shown in FIG. 4b and step S4307 in FIG. 4e, which is not repeated here.
The AMS is used to manage application processes and the Activities of the processes.
For example, as shown in FIG. 3, in this embodiment the AMS, in response to the request to start the application Activity issued by the Launcher, starts the application Activity and then issues the window creation information to the WMS. For details, refer to the related content of steps S4102 to S4103 shown in FIG. 4a, which is not repeated here.
For another example, as shown in FIG. 3, in this embodiment the AMS, in response to the video playback Activity request issued by the video application, starts the video playback Activity and issues the window creation information to the WMS. For details, refer to the related content of steps S4202 to S4203 shown in FIG. 4b and S4302 to S4303 in FIG. 4e, which is not repeated here.
The WMS is used to manage windows; for example, it is responsible for starting, adding and deleting windows.
For example, in this embodiment, the WMS is used to create the window of the main interface of the video application according to the window information corresponding to that main interface sent by the AMS; for details, refer to the related content of step S4104 in FIG. 4a. The WMS can also create the window of the video playback interface according to the window information of the video playback interface; for details, refer to the related content of step S4204 in FIG. 4b and S4304 in FIG. 4e.
For example, in this embodiment, the WMS is used to issue the package name of the video application to the frame rate decision module. For details, refer to the related content of step S4107 shown in FIG. 4a, which is not repeated here.
For example, as shown in FIG. 3, in this embodiment the WMS is used to issue the layer information of the main interface of the video application, or the layer information of the video playback interface, to Surface Flinger. For details, refer to the related content of step S4105 shown in FIG. 4a and step S4205 shown in FIG. 4b, which is not repeated here.
The drawing and rendering module is used to start the drawing and rendering thread to perform drawing and rendering, obtaining the drawn and rendered image data and/or barrage data.
For example, as shown in FIG. 3, in this embodiment the video application calls the drawing and rendering thread in the drawing and rendering module, so that the drawing and rendering module draws the layer of the main interface of the video application, the layer of the video playback interface and/or the barrage. For details, refer to the related content of steps S4111 to S4112 shown in FIG. 4a, step S4214 shown in FIG. 4b and S4316 shown in FIG. 4e, which is not repeated here.
For another example, in this embodiment the drawing and rendering module further includes a barrage counter. On receiving the information to enable barrage statistics sent by Surface Flinger, the barrage counter enables barrage statistics, counts the number of barrages, and sends the barrage drawing information to Surface Flinger, where the barrage drawing information carries the drawn and rendered barrage data, the number of barrages and the timestamp. For details, refer to the related content of steps S4214 to S4218 shown in FIG. 4b and S4316 to S4319 shown in FIG. 4e, which is not repeated here.
The Android runtime includes core libraries and a virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
The core libraries consist of two parts: one part is the functional functions that the java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include multiple functional modules. For example, as shown in FIG. 3, the system library includes the frame rate decision module and Surface Flinger.
The frame rate decision module is used to send the target refresh rate to Surface Flinger, so that Surface Flinger controls the composition of display data according to the target refresh rate.
For example, as shown in FIG. 3, in this embodiment the frame rate decision module is used to send the called information to Surface Flinger. For details, refer to the related content of step S4308 shown in FIG. 4e, which is not repeated here.
For another example, in this embodiment the frame rate decision module also receives the scene information sent by Surface Flinger, determines the target refresh rate according to the scene information, and sends the target refresh rate to Surface Flinger. The scene information includes the video scene information or the barrage gear information. For details, refer to the related content of steps S4224 to S4225 and S4229 to S4230 shown in FIG. 4b, and steps S4326 to S4327 and S4331 to S4332 in FIG. 4e, which is not repeated here.
Surface Flinger is used to compose display data.
For example, as shown in FIG. 3, in this embodiment Surface Flinger is used to create the layer of the main interface of the video application or the layer of the video playback interface. For details, refer to the related content of step S4106 shown in FIG. 4a, step S4209 shown in FIG. 4b and S4310 shown in FIG. 4e, which is not repeated here.
For example, as shown in FIG. 3, in this embodiment Surface Flinger is used to trigger the video application to draw and render. For details, refer to the related content of step S4110 shown in FIG. 4a, which is not repeated here.
For another example, in this embodiment Surface Flinger also receives the layer information sent by the WMS and the package name sent by Media Codec, and determines, according to the layer features in the layer information and the package name, whether the current scene is a video playback scene; if so, it sends the information to enable barrage statistics to the drawing and rendering module. For details, refer to the related content of steps S4206 to S4211 shown in FIG. 4b and S4309 to S4312 in FIG. 4e, which is not repeated here.
For another example, in this embodiment Surface Flinger also receives the barrage drawing information sent by the drawing and rendering module, which carries the drawn and rendered barrage data, the number of barrages and the timestamp, determines the barrage gear of the barrage scene according to the barrage drawing information, and then sends the barrage scene information carrying the barrage gear to the frame rate decision module. For another example, when no barrage drawing information is received for more than the preset duration, the video playback scene information is sent to the frame rate decision module. For details, refer to the related content of steps S4219 to S4223 shown in FIG. 4b and S4321 to S4325 in FIG. 4e, which is not repeated here.
For another example, in this embodiment Surface Flinger also receives and buffers the drawn and rendered image data and barrage data sent by the video application, and cooperates with the Hardware Composer (HWC) to perform layer composition.
The kernel layer is the layer between hardware and software. The kernel layer contains at least the HWC, which is used to combine and display image data using hardware. For example, as shown in FIG. 3, in this embodiment the HWC is used to receive the layers and the target refresh rate sent by Surface Flinger, process the layers according to the target refresh rate, and send them to the LCD of the hardware layer for display. For details, refer to the related content of S4113 in FIG. 4a, steps S4226 and S4231 shown in FIG. 4b, and S4328 and S4333 in FIG. 4e, which is not repeated here.
It should be noted that although the embodiments of this application are described by taking the Android system as an example, the basic principles are also applicable to electronic devices based on operating systems such as iOS or Windows. It should also be noted that, in addition to an LCD, other types of display screens such as OLED may also be used in the hardware layer shown in FIG. 3; the embodiments of this application do not limit this.
参阅图4a,图4a为本申请实施例提出的刷新率的设置方法中的任意一个视频应用的启动阶段流程的示意图,应用于本申请实施例提出的电子设备,具体包括以下步骤:
S4101、Launcher响应于用户启动视频应用的操作,下发启动应用Activity的请求至AMS。
具体的,Launcher接收用户启动视频应用的操作,进而响应于用户启动视频应用的操作,下发启动应用Activity的请求至AMS。示例性的,用户启动视频应用的操作可以是用户点击视频应用图标的操作、用户语音启动视频应用的操作、用户解锁自动启动视频应用的操作等。例如图4a所示,用户需要使用视频应用时,用户点击视频应用的图标,以使得电子设备中的Launcher响应于用户点击视频应用图标的操作,下发启动应用Activity的请求至AMS。
其中，Activity是一个负责与用户交互的应用程序组件，提供一个屏幕或界面，供用户交互以完成某项任务。应用Activity主要用于完成启动应用的任务，为用户提供应用的主界面。启动应用Activity的请求用于请求启动视频应用Activity，以实现启动视频应用，向用户展示视频应用的主界面。
在一些实施例中,启动应用Activity的请求中携带有视频应用的包名,在另一些实施例中,也可以是携带有视频应用的路径等视频应用所特有的唯一标识。在另一些实施例中,启动应用Activity的请求中除了视频应用所特有的唯一标识外,还可以携带有其他的信息。
在本申请实施例中,视频应用指的是具有视频播放功能的应用,具体可以是浏览器、某视频平台的应用、可播放音乐短片的音乐平台的应用等等。举例说明,如图1中的(1)所示,电子设备为手机时,用户点击界面上的“视频app”的图标,进而可触发手机中的Launcher响应于用户点击视频应用图标的操作,下发启动应用Activity的请求至AMS。
具体的,步骤S4101的执行过程和原理可以参考Android系统中启动应用过程中涉及Activity相关技术原理。
S4102、AMS启动应用Activity。
在一些实施例中,执行步骤S4102的过程为:AMS可以响应于启动应用Activity的请求,根据启动应用Activity的请求中所携带的视频应用的包名,创建应用Activity,并启动创建了的应用Activity。
具体的,AMS启动应用Activity的执行过程和原理可以参考Android系统中AMS启动应用Activity的相关技术原理。
S4103、AMS发送视频应用主界面对应的窗口信息至WMS。
AMS启动应用Activity后,需要完成启动视频应用的任务,为用户提供视频应用的主界面,以供用户与视频应用主界面的控件进行交互。
示例性的,视频应用主界面对应的窗口信息可以从视频应用的安装信息中获取。例如,AMS可以根据视频应用的包名,确定出视频应用的安装信息,进而从视频应用的安装信息中获取视频应用主界面对应的窗口信息。示例性的,可以是AMS与视频应用之间通过多次交互的方式,确定出视频应用主界面对应的窗口信息。
视频应用主界面对应的窗口信息可用于创建视频应用主界面对应的窗口,视频应用主界面对应的窗口信息可以携带有视频应用的包名,具体可以包括有:视频应用主界面的窗口属性信息、窗口层级信息等。例如,窗口属性信息可以是窗口名、窗口位置、窗口大小等,而视频应用主界面对应的窗口层级信息用于说明将视频应用主界面对应的窗口置于显示界面的最顶层。AMS确定出视频应用主界面对应的窗口信息后,即可将该窗口信息发送给WMS,以供WMS创建视频应用主界面对应的窗口。
需要说明的是,视频应用主界面所需要创建的窗口数量的不同并不影响本申请实施例的实现,本申请对此不作限制。
在一些实施例中,步骤S4103的执行过程可以是,AMS调用WMS,向WMS发送视频应用主界面对应的窗口信息。示例性的,AMS调用WMS线程,通过下发视频应用主界面对应的窗口信息,向WMS申请创建视频应用主界面的每一个窗口。
S4104、WMS根据视频应用主界面对应的窗口信息,创建视频应用主界面的窗口。
由于视频应用主界面对应的窗口信息中,包括有视频应用主界面的窗口名、窗口大小、窗口位置等窗口的属性信息,以及窗口的层级信息等,因此WMS可以根据视频应用主界面对应的窗口信息,确定出所需创建的视频应用主界面的窗口的大小、位置、层级等,进而可实现窗口创建。
具体的,步骤S4103至步骤S4104所描述的AMS与WMS之间交互实现创建窗口的过程和技术原理,可以参考Android系统等操作系统中AMS与WMS之间交互实现创建窗口的过程和技术原理。
S4105、WMS将视频应用主界面的窗口对应的图层信息发送至Surface Flinger。
其中,视频应用主界面的窗口对应的图层信息可用于创建视频应用主界面的窗口对应的图层,进而能够实现将视频应用主界面的窗口对应的图层,显示在对应的步骤S4104的窗口区域。视频应用主界面的窗口对应的图层信息携带有视频应用的包名,具体可以包括: 视频应用主界面的窗口名,视频应用主界面的窗口对应的图层名,视频应用主界面的窗口对应的图层大小,以及视频应用主界面的窗口对应的图层位置等。示例性的,视频应用的包名可以携带于图层名中。
其中,视频应用主界面的窗口对应的图层信息可以是包含了多个图层的图层信息,具体根据视频应用主界面所需要创建的图层数量而定,本申请对此不作限制。
示例性的，视频应用主界面的窗口对应的图层信息可以从视频应用的安装信息中获取。例如，AMS可以通过视频应用的包名，确定出视频应用的安装信息，从视频应用的安装信息中获取图层信息，再将图层信息发送至WMS，WMS获取到图层信息之后，再发送至Surface Flinger。
示例性的,执行步骤S4105的过程可以是:WMS可以调用Surface Flinger,通过下发视频应用主界面的窗口对应的图层信息至Surface Flinger,以申请在Surface Flinger中创建视频应用主界面的窗口对应的图层。
S4106、Surface Flinger根据视频应用主界面的窗口对应的图层信息,创建视频应用主界面的窗口对应的图层。
由于视频应用主界面的窗口对应的图层信息中具有视频应用主界面的窗口名,窗口对应的图层名,图层大小,图层位置等能够实现图层创建的信息,进而Surface Flinger可以根据视频应用主界面的窗口对应的图层信息,创建出视频应用主界面窗口对应的图层。
示例性的，步骤S4105至步骤S4106的过程中，WMS和Surface Flinger之间的交互通过ViewRootImpl等函数实现图层创建。
具体的,步骤S4105至步骤S4106中所涉及的WMS和Surface Flinger之间交互通信,实现在Surface Flinger中创建视频应用主界面的窗口对应的图层的过程和技术原理,可以参考Android系统中WMS与Surface Flinger之间交互实现创建图层的过程和技术原理。
S4107、WMS向帧率决策模块发送视频应用的包名。
其中,步骤S4107可以是在WMS完成窗口创建,即步骤S4104之后执行。由于步骤S4103中,WMS接收到的窗口信息中携带有视频应用的包名,因此在当前处于启动视频应用的阶段,WMS可以将步骤S4103中获取到的包名,发送至帧率决策模块,进而可使得帧率决策模块确定出当前需要启动的是视频应用。帧率决策模块可以根据视频应用的包名,确定出与视频应用对应的预设刷新率。
需要说明的是,WMS向帧率决策模块发送视频应用的包名的具体实现方式很多,只需实现让帧率决策模块可通过WMS下发的包名,确定出视频应用对应的预设刷新率即可,具体发送视频应用的包名的方式本申请实施例不做限制。
S4108、帧率决策模块根据视频应用的包名,确定出视频应用对应的预设刷新率。
在一些实施例中,电子设备中预先存储有包名与预设刷新率之间的对应关系。因此帧率决策模块可以通过视频应用的包名,查找到视频应用对应的预设刷新率。
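上述通过包名查找应用对应的预设刷新率的对应关系（步骤S4108），可用如下示例性的Python草图示意。其中的包名、对应关系表以及默认值60HZ均为本示例的假设，并非本申请限定的实现：

```python
# 假设的包名与预设刷新率的对应关系表（单位：HZ）
PRESET_REFRESH_RATES = {
    "com.example.video": 60,   # 假设的视频应用包名
    "com.example.reader": 90,
}
DEFAULT_REFRESH_RATE = 60  # 假设所有应用默认使用60HZ

def lookup_preset_refresh_rate(package_name: str) -> int:
    """通过包名查找应用对应的预设刷新率，查找不到时退回默认刷新率。"""
    return PRESET_REFRESH_RATES.get(package_name, DEFAULT_REFRESH_RATE)
```

查找不到包名时退回默认刷新率，对应前述步骤S4107和步骤S4108可不执行、直接将默认预设刷新率作为目标刷新率下发的情形。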
S4109、帧率决策模块将视频应用对应的预设刷新率发送至Surface Flinger。
由于当前启动了视频应用,因此在显示过程中所使用的目标刷新率为视频应用对应的预设刷新率,帧率决策模块通过将视频应用对应的刷新率下发给Surface Flinger的方式,可控制Surface Flinger按照预设刷新率来合成需要显示的图像。
需要说明的是,在一些实施例中,也可以不执行步骤S4107和步骤S4108。例如,若所有应用的预设刷新率均是一致的时候,例如电子设备默认使用60HZ的刷新率,那么就不再需要执行步骤S4107和步骤S4108,而步骤S4109时则将所有应用默认使用的预设刷新率作为目标刷新率,下发给Surface Flinger即可。
S4110、Surface Flinger按照预设刷新率对应的垂直同步(Vsync,Vertical Sync)信号的节奏，触发视频应用对视频应用主界面的窗口对应的图层进行绘制渲染。
其中,Vsync信号用于控制显示画面的绘制渲染、图层合成、送显的节奏。步骤S4110中提及的预设刷新率为视频应用对应的预设刷新率,Surface Flinger按照预设刷新率对应的Vsync信号的节奏,触发视频应用启动对视频应用主界面的窗口对应的图层进行绘制渲染。Vsync信号的周期与预设刷新率相对应。预设刷新率对应的Vsync信号可以控制显示屏按照预设刷新率进行刷新显示。
具体的,Surface Flinger按照Vsync信号的节奏触发绘制渲染的过程可以参考Android系统中Surface Flinger对触发绘制渲染的过程和技术原理的相关内容。
S4111、视频应用按照预设刷新率对应的Vsync信号的节奏,调用绘制渲染模块对视频应用主界面的窗口对应的图层进行绘制渲染,得到绘制渲染后的主界面图像数据。
其中,绘制渲染后的主界面图像数据,指的是绘制渲染后的视频应用主界面的图像数据,也可以理解为是对视频应用主界面的窗口对应的图层进行绘制渲染之后,所得到的图像数据。
示例性的,执行步骤S4111的过程可以是,视频应用按照预设刷新率对应的Vsync信号的节奏,调用绘制渲染模块的绘制渲染线程,实现让绘制渲染模块对视频应用主界面的图层进行绘制渲染,进而绘制渲染模块可以得到绘制渲染后的主界面图像数据。
示例性的,绘制渲染模块进行绘制渲染的过程中,可以通过调用Open GL ES、Skia等函数对视频应用主界面的窗口对应的图层进行绘制渲染。
具体的,视频应用和绘制渲染模块之间交互实现绘制渲染图层的过程和技术原理,可以参考Android系统中对图层进行绘制渲染的过程和技术原理。
S4112、绘制渲染模块按照预设刷新率对应的Vsync信号的节奏,将绘制渲染后的主界面图像数据,发送至Surface Flinger。
步骤S4111中绘制渲染模块所得到的绘制渲染后的主界面图像数据,可以按照预设刷新率对应的Vsync信号的节奏,发送至Surface Flinger中。而具体的绘制渲染模块按照Vsync信号的节奏,将绘制渲染后的主界面图像数据发送至Surface Flinger的过程中,可以由其他模块共同配合实现。本申请实施例对于步骤S4112的具体实现过程不作限制。
S4113、Surface Flinger按照视频应用对应的预设刷新率,使用绘制渲染后的主界面图像数据进行图层合成,并将合成后的主界面图像数据送显。
Surface Flinger按照视频应用对应的预设刷新率，对每一帧的绘制渲染后的主界面图像数据进行图层合成，并将合成后的主界面数据送至显示屏（例如LCD）进行显示，以使得电子设备的显示屏上可以按照目标刷新率，刷新显示视频应用主界面。示例性的，Surface Flinger按照预设刷新率对应的Vsync信号的节奏，对每一帧的绘制渲染后的主界面图像数据进行图层合成，并按照Vsync信号的节奏将合成后的主界面数据送至显示屏（例如LCD）进行显示，进而实现显示屏按照视频应用对应的预设刷新率，刷新显示视频应用主界面。
举例说明,如图1中的(2)所示,按照60HZ的预设刷新率,显示视频应用的主界面,主界面上显示有可播放的多个视频。
示例性的,Surface Flinger得到绘制渲染后的主界面图像数据后,可以通过HWC实现图层合成,例如Surface Flinger将目标刷新率下发至HWC,然后将绘制渲染后的主界面图像数据发送给HWC,以使得HWC按照目标刷新率对绘制渲染后的主界面图像数据进行图层合成,HWC按照目标刷新率将每一帧合成后的主界面图像数据输出至显示屏显示。
具体的,Surface Flinger将合成后的主界面图像数据送显的过程可以参考Android系统中Surface Flinger合成图层并送显的相关内容。
通过步骤S4101至步骤S4113，电子设备可在应用启动阶段，实现按照视频应用对应的预设刷新率，显示视频应用主界面，以供用户交互。
参阅图4b,图4b为本申请实施例提出的刷新率的设置方法中任意一个视频应用的视频播放阶段流程的示意图,应用于本申请实施例提出的电子设备,图4b示出的流程可在图4a示出的流程执行完成,显示出视频应用的视频应用主界面之后再开始执行,此时电子设备的前端仅显示有一个视频应用的视频应用主界面,具体的视频播放阶段流程包括以下步骤:
S4201、视频应用响应于用户播放视频的操作,下发启动视频播放Activity的请求至AMS。
其中,视频播放Activity用于实现提供一个视频播放界面,用户可以用来交互完成视频播放任务。
通过图4a的流程启动了视频应用之后,视频应用接收用户播放视频的操作,进而响应于用户播放视频的操作,触发发送启动视频播放Activity的请求至AMS,以使得AMS启动视频播放Activity。
其中,用户播放视频的操作的具体形式有很多,示例性的,用户播放视频的操作可以是在前述图4a的流程中显示的视频应用主界面上点击某个视频的操作,还可以是用户在视频应用主界面上搜索需要观看的视频,然后在搜索结果界面上点击某个视频的操作,也可以是用户语音输入需要播放的视频的操作。
举例说明,如图4c中的(1)所示,在视频应用主界面上,用户点击视频1,进而可触发视频应用下发启动视频播放Activity的请求至AMS。其中,图4c的(1)的具体描述也可以参考前述对图1的(2)的相关描述,此处不再赘述。
需要说明的是,从显示视频应用主界面到视频应用接收了用户的播放视频操作的过程有很多种具体实现形式,本申请实施例对此不作限制。
在一些实施例中,启动视频播放Activity的请求中携带有视频应用的包名和视频播放Activity的视频播放界面名。其中,视频播放界面名能够用于说明需要播放的视频所在界面的名称。例如参阅图4c的(1),用户点击视频1,进而触发视频应用发送启动视频播放Activity的请求,该启动视频播放Activity的请求中则可以携带有视频1的播放界面名。
在一些实施例中，启动视频播放Activity的请求中可以携带有除视频应用的包名之外的其他类型的视频应用的标识，例如可以是视频应用的路径等，本申请实施例对于视频应用的标识的具体形式不作限制。在另一些实施例中，启动视频播放Activity的请求中可以携带有除视频播放界面名之外的其他类型的视频播放界面的标识，本申请实施例对此也不作限制。
在另一些实施例中,启动视频播放Activity的请求中还可以携带有其他的关于视频播放Activity的信息,本申请实施例对于启动视频播放Activity的请求中的具体信息不作限制。
需要说明的是,在另一些实施例中,也可以是视频应用或者其他应用自动触发视频应用下发启动视频播放Activity的请求至AMS,具体触发下发启动视频播放Activity的请求至AMS的方式的不同,并不影响本申请实施例的实现。
具体的,步骤S4201的执行过程和原理可以参考Android系统中在应用中启动视频播放过程中涉及Activity的相关技术原理。
S4202、AMS启动视频播放Activity。
在一些实施例中,执行步骤S4202的过程为:AMS可以响应于启动视频播放Activity的请求,根据启动视频播放Activity的请求中所携带的视频应用的包名和视频播放界面名,创建视频播放Activity,并启动创建了的视频播放Activity。
具体的,AMS启动视频播放Activity的执行过程和原理可以参考Android系统中AMS启动Activity的相关技术原理。
S4203、AMS发送视频播放界面对应的窗口信息至WMS。
其中,视频播放界面为用于显示视频画面的应用界面。AMS启动视频播放Activity后,需要完成视频播放的任务,为用户提供视频应用的视频播放界面,以供用户与视频应用播放界面的控件进行交互,满足用户的视频播放需求。视频播放界面对应的窗口信息可用于创建视频播放界面对应的窗口。
示例性的,视频播放界面对应的窗口信息可以从视频应用的安装信息中获取。例如,AMS可以根据视频应用的包名,确定出视频应用的安装信息,进而从视频应用的安装信息中获取视频播放界面对应的窗口信息。示例性的,可以是AMS与视频应用之间通过多次交互的方式,确定出视频播放界面对应的窗口信息。
视频播放界面对应的窗口信息可以携带有视频应用的包名,具体可以包括有:视频播放界面的窗口属性信息、窗口层级信息等。例如,窗口属性信息可以是窗口名、窗口位置、窗口大小等。而视频播放界面对应的窗口层级信息用于说明将视频播放界面对应的窗口置于显示界面的最顶层。AMS确定出视频播放界面对应的窗口信息后,即可将该窗口信息发送给WMS,以供WMS创建视频播放界面对应的窗口。
其中,AMS发送视频播放界面对应的窗口信息至WMS的过程和原理,可以参考前述图4a中步骤S4103的执行过程和原理,区别点在于步骤S4103需AMS发送的窗口信息与视频应用主界面相关,而步骤S4203中AMS发送的窗口信息与视频播放界面相关。
S4204、WMS根据视频播放界面对应的窗口信息,创建视频播放界面的窗口。
由于视频播放界面对应的窗口信息中，包括有视频播放界面的窗口名、窗口大小、窗口位置等窗口的属性信息，以及窗口的层级信息等，因此WMS可以根据视频播放界面对应的窗口信息，确定出所需创建的视频播放界面的窗口的大小、位置、层级等，进而可实现视频播放界面的窗口创建。
具体的,步骤S4204的执行过程和原理可以参考前述图4a示出的步骤S4104,区别点在于步骤S4104创建的窗口是视频应用主界面的窗口,而步骤S4204所创建的窗口是视频播放界面的窗口。
S4205、WMS发送窗口创建完成的信息至视频应用。
WMS在结束步骤S4204时,会将窗口创建完成的信息发送给视频应用,提示视频应用当前已完成窗口创建,视频应用可触发后续操作。
其中,窗口创建完成的信息可以携带有已完成创建的窗口的标识信息,例如窗口名等,本申请实施例对于窗口创建完成的信息的具体形式不作限制。
S4206、视频应用调用Media Codec进行视频解码。
示例性的,可以是视频应用在接收到步骤S4205所发送的窗口创建完成的信息之后,触发执行步骤S4206。
具体的,视频应用调用Media Codec,对步骤S4201中用户所需播放的视频进行视频解码。例如图4c的(1)所示,用户点击了视频1,因此视频应用会调用Media Codec对视频1进行视频解码。其中,图4c的(1)的相关内容可以参考前述对图1的(2)的描述,此处不再赘述。
具体的,步骤S4206的执行过程和原理可以参考Android系统中视频解码过程的相关技术原理。
S4207、Media Codec将视频源帧率发送给帧率决策模块。
其中,视频源帧率指的是用户所需播放的视频的视频源帧率。由于步骤S4206中,Media Codec在视频应用的调用下,对用户所需播放的视频进行了视频解码,因此Media Codec通过用户所需播放的视频的视频源,获取该视频的视频源帧率。在另一些实施例中,Media Codec还可以获取视频的分辨率等视频源的信息。
其中,用户所需播放视频的视频源帧率能够说明用户所需播放的视频的需求刷新率,进而可为帧率决策模块进行刷新率决策时提供参考,具体可参阅后续的步骤S4229的描述。
S4208、WMS将视频播放界面的窗口对应的图层信息发送至Surface Flinger。
其中,视频播放界面的窗口对应的图层信息中携带有视频应用的包名。视频播放界面的窗口对应的图层信息可用于创建视频播放界面的图层,进而能够实现将视频播放界面的图层显示在对应的步骤S4204的窗口区域。视频播放界面对应的图层信息具体可以包括:视频播放界面的窗口对应的图层名、图层大小、图层位置以及窗口名等。示例性的,视频应用的包名可以携带在图层名中。
具体的，步骤S4208的执行过程和原理，可以参考前述图4a提及的步骤S4105的过程和原理，区别点在于步骤S4105发送的图层信息与视频应用主界面相关，而步骤S4208中发送的图层信息与视频播放界面相关。
S4209、Surface Flinger根据视频播放界面的窗口对应的图层信息,创建视频播放界面的窗口对应的图层。
具体的，步骤S4209的执行过程和原理，可以参考前述图4a提及的步骤S4106的过程和原理，区别点在于步骤S4106创建的图层与视频应用主界面相关，而步骤S4209所创建的图层与视频播放界面相关。
需要说明的是,S4203、S4204、S4208以及S4209可以是重复执行的步骤。示例性的,在执行完前述图4a的流程之后,显示屏按照预设刷新率刷新显示屏,因此当需要显示新的一帧的视频播放界面时,就需要通过S4203、S4204、S4208以及S4209,重新更新窗口、图层等信息。
示例性的,在执行步骤S4209之后,Surface Flinger则按照视频应用对应的预设刷新率所对应的Vsync信号节奏,触发视频应用对视频播放界面的窗口对应的图层进行绘制渲染,然后视频应用按照预设刷新率对应的Vsync信号节奏,调用绘制渲染模块对视频播放界面的窗口对应的图层进行绘制渲染,绘制渲染模块按照预设刷新率所对应的Vsync信号节奏,发送绘制渲染后的视频播放界面的图像数据至Surface Flinger,Surface Flinger即可按照视频应用对应的预设刷新率,使用绘制渲染后的视频播放界面的图像数据进行图层合成,并将合成后的视频播放界面的图像数据送显,实现令显示屏按照视频应用对应的预设刷新率刷新视频播放界面。具体可以参阅图4a示出的步骤S4110至步骤S4113的相关内容,区别点在于显示的界面不同。
需要说明的是,按照视频应用对应的预设刷新率,刷新显示视频播放界面的步骤,与后续描述的步骤S4211至步骤S4224的步骤的执行过程互不干扰,即本申请对步骤S4211与按照视频应用对应的预设刷新率,刷新显示视频播放界面的步骤的执行先后顺序不作限制,在帧率决策模块未发送新的刷新率至Surface Flinger之前,Surface Flinger可以按照视频应用对应的预设刷新率刷新显示视频播放界面,直至通过步骤S4211至步骤S4224的步骤,帧率决策模块所发送的刷新率发生变更,再重新以新的刷新率去刷新视频播放界面。
S4210、Surface Flinger根据图层信息中的包名,确定出视频应用是否在播放白名单中。
其中,播放白名单可以理解为是具有视频播放权限的应用的名单。步骤S4210中的图层信息可以理解为是视频播放界面的窗口对应的图层信息。由前述对步骤S4208的描述可知,图层信息中携带有视频应用的包名,因此可以通过视频应用的包名,确定出视频应用是否在播放白名单中。
示例性的,视频播放界面对应的图层信息中携带的图层名中可获取到视频应用的包名。例如,图层名的具体格式可以为:包名/图层特有的名称,进而能从图层信息中得到包名。举例说明,某个视频应用的视频播放界面的图层名为“xxx.xxxx.xx/com.xx.bangumi.ui.page.detail.BangumiDetailActivityV3#0SurfaceView-xxx.xxxx.xx/com.bi[...]age.detail.BangumiDetailActivityV3#0”,由图层名可知,该视频应用的包名为“xxx.xxxx.xx”。
示例性的,步骤S4210的执行过程为:Surface Flinger从播放白名单中查找是否存在有视频应用的包名,若存在有视频应用的包名,则确定出视频应用在播放白名单中,若在播放白名单中查找不到视频应用的包名,则说明视频应用不在播放白名单中。
若确定出视频应用在播放白名单中,则认为视频应用具有播放权限,执行步骤S4211,若确定出视频应用不在播放白名单中,则认为视频应用不具有播放权限,则结束流程,即不在显示屏上显示视频播放界面。在另一些实施例中,还可以在显示屏上显示用于提醒用户该视频应用不具有播放权限的提示信息。
在另一些实施例中,也可以不执行步骤S4210,即执行步骤S4209之后直接执行步骤S4211。
S4211、Surface Flinger根据视频播放界面的窗口对应的图层信息,确定是否为视频播放场景。
其中,视频播放界面的窗口对应的图层信息中具有能够用于说明是否为视频播放场景的图层特征信息。确定是否为视频播放场景可以理解为是确定视频应用当前所处的场景是否为视频播放场景。视频播放场景指的是应用播放视频的场景。
示例性的,图层特征信息可以为视频图层(SurfaceView),若图层信息中具有SurfaceView,则说明处于视频播放场景。举例说明,若图层信息中的图层名中存在有视频图层(SurfaceView)的字段,即可确定出为视频播放场景。例如某个视频应用的视频播放界面的图层名为“xxx.xxxx.xx/com.xx.bangumi.ui.page.detail.BangumiDetailActivityV3#0SurfaceView-xxx.xxxx.xx/com.bi[...]age.detail.BangumiDetailActivityV3#0”,当中携带有“SurfaceView”字段,因此确定出该视频应用当前处于视频播放场景,用户具有视频播放需求。
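前述从图层名中解析包名（用于步骤S4210的白名单判断）以及通过SurfaceView字段识别视频播放场景（步骤S4211）的逻辑，可用如下示例性Python草图示意。图层名格式"包名/图层特有的名称"与白名单内容均取自前文描述，函数名为本示例的假设：

```python
def parse_package_name(layer_name: str) -> str:
    """从"包名/图层特有的名称"格式的图层名中解析出包名。"""
    return layer_name.split("/", 1)[0]

def is_in_play_whitelist(layer_name: str, whitelist: set) -> bool:
    """根据图层名中携带的包名，确定应用是否在播放白名单中。"""
    return parse_package_name(layer_name) in whitelist

def is_video_playback_scene(layer_name: str) -> bool:
    """图层名中携带有SurfaceView字段时，判定当前处于视频播放场景。"""
    return "SurfaceView" in layer_name
```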
若步骤S4211确定出为视频播放场景,则执行步骤S4212,开始进行弹幕统计,进一步确定当前是否为视频播放场景下的弹幕场景,若步骤S4211确定出不为视频播放场景,则结束流程,即不再让帧率决策模块重新设置刷新率,显示屏继续按照图4a流程中所提及的视频应用对应的预设刷新率,刷新显示画面。
需要说明的是,当Surface Flinger在步骤S4211中确定出当前为视频播放场景时,就不会再重复进行视频播放场景的确定。示例性的,当步骤S4211确定为视频播放场景之后,若检测到前述提及的步骤S4209所创建的图层销毁时,则确定出当前不再处于视频播放场景,进而可以发送一个非视频播放场景的信息至帧率决策模块,以通知帧率决策模块确定出不处于视频播放场景,将目标刷新率重新设置为视频应用对应的预设刷新率。
S4212、Surface Flinger确定开启弹幕统计。
由于Surface Flinger通过步骤S4211已确定出当前为视频播放场景,因此可以确定开启弹幕统计,以使得绘制渲染模块在进行弹幕绘制时可以开始统计弹幕。通过开启弹幕统计,Surface Flinger可以进一步确定当前是否为弹幕场景,以及可根据弹幕统计的数据确定弹幕场景下的弹幕档位。具体的,当Surface Flinger确定开启弹幕统计之后,绘制渲染模块即可在绘制弹幕时,进行弹幕统计。而在Surface Flinger未确定开启弹幕统计时,绘制渲染模块则不进行弹幕统计。具体可参考后续的步骤S4213至步骤S4221的相关内容,此处不再赘述。
示例性的,Surface Flinger执行步骤S4212的方式可以为,Surface Flinger将弹幕统计的状态设置为开启状态。例如,修改弹幕统计的状态参数为开启状态对应的参数。其中,弹幕统计状态的存储位置本申请实施例不做限制。
在另一些实施例中,执行步骤S4212的方式还可以是:Surface Flinger向绘制渲染模块发送开启弹幕统计的信息,以控制绘制渲染模块开启弹幕统计。
S4213、Surface Flinger确定预设时长内是否接收到弹幕绘制信息。
其中，弹幕绘制信息为绘制渲染模块绘制弹幕的相关信息，示例性的，当前一帧的弹幕绘制信息包括：当前一帧的弹幕条数、当前一帧的时间戳以及当前一帧的绘制渲染后的弹幕数据。当前一帧的弹幕绘制信息可以理解为是当前一帧显示画面的弹幕绘制信息。
具体的,若步骤S4213确定出预设时长内接收到弹幕绘制信息,则认为视频应用处于弹幕场景,如果S4213确定出预设时长内未接收到弹幕绘制信息,则认为视频应用不处于弹幕场景,只处于步骤S4211中确定出的视频场景,即可以理解为当前视频应用在视频播放过程中不需要显示弹幕,执行步骤S4228。其中,弹幕场景可以理解为是视频播放过程中显示有弹幕的场景。
具体的,Surface Flinger通过步骤S4211确定当前处于视频场景之后,在步骤S4212中开启了弹幕统计。因此,若当前处于弹幕场景,绘制渲染模块就会进行弹幕统计和弹幕绘制,进而会得到当前一帧的弹幕绘制信息发送至Surface Flinger。示例性的,当视频应用处于弹幕场景时,视频应用会按照视频应用的预设刷新率对应的Vsync信号的节奏,调用绘制渲染模块进行弹幕的绘制渲染,并对绘制的弹幕条数进行统计,进而绘制渲染模块会按照Vsync信号的节奏向Surface Flinger发送当前一帧的弹幕绘制信息。因此,弹幕场景下,Surface Flinger会按照Vsync信号的节奏接收到弹幕绘制信息。
当Surface Flinger在Vsync信号的节奏下,连续多帧都没有收到弹幕绘制信息,进而确定出预设时长内均没有收到弹幕绘制信息,即可以认为当前不处于弹幕场景,仅处于视频播放场景。示例性的,当Surface Flinger最近一次接收到弹幕绘制信息的时间戳与当前一帧的时间戳之间的时间差,大于或等于预设时长,则确定出在预设时长内均没有收到弹幕绘制信息,认为当前不处于弹幕场景,仅处于视频播放场景。
而Surface Flinger如果在预设时长内有接收到弹幕绘制信息,则可以认为当前处于弹幕场景。示例性的,Surface Flinger最新一次接收到弹幕绘制信息的时间戳与当前的时间戳之间的时间差,小于预设时长,就可以确定出在预设时长内有接收到弹幕绘制信息,当前处于弹幕场景。
示例性的,预设时长可以根据实际应用场景、经验等进行设定,例如,可以将预设时长设定为5秒,预设时长的具体设定方式本申请实施例对此不作限制。
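上述通过"最近一次收到弹幕绘制信息的时间戳与当前时间戳之差"与预设时长比较，来判断是否仍处于弹幕场景的逻辑，可用如下示例性Python草图示意。预设时长取5秒与前文示例一致，仅为假设：

```python
PRESET_DURATION_NS = 5_000_000_000  # 假设预设时长为5秒（单位：纳秒）

def in_barrage_scene(last_barrage_ts_ns: int, now_ns: int) -> bool:
    """时间差小于预设时长时，判定仍处于弹幕场景；
    大于或等于预设时长时，判定仅处于视频播放场景。"""
    return now_ns - last_barrage_ts_ns < PRESET_DURATION_NS
```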
需要说明的是,S4213可以是一个重复执行的步骤。例如,Surface Flinger可以根据当前的预设刷新率对应的Vsync信号的节奏,确认预设时长内是否接收到弹幕绘制信息,以实现实时确认当前是否处于弹幕场景的目的。
在另一些实施例中,步骤S4211确定出为视频播放场景之后,在执行步骤S4213之前,还可以直接发送视频播放场景信息至帧率决策模块,由帧率决策模块根据视频播放场景信息以及视频源帧率,确定出目标刷新率,然后将目标刷新率发送至Surface Flinger,由Surface Flinger按照目标刷新率控制视频播放界面的显示,具体过程和执行原理可以参考后续对步骤S4228至步骤S4231,此处不再赘述。Surface Flinger控制显示屏按照视频播放场景信息所确定出的目标刷新率,刷新显示画面之后,再执行步骤S4213,若确定出在预设时长内未接收到弹幕绘制信息,则可以直接结束流程,不需要再次执行步骤S4228至步骤S4231。
S4214、视频应用按照Vsync信号的节奏,调用绘制渲染模块绘制弹幕。
视频应用按照当前的实际刷新率对应的Vsync信号的节奏,调用绘制渲染模块绘制弹幕。示例性,由前述图4a的描述可知,启动视频应用之后,显示屏按照视频应用对应的预 设刷新率在刷新显示画面。因此,当视频应用需要绘制弹幕时,可以按照当前的预设刷新率对应的Vsync信号的节奏,调用绘制渲染模块进行弹幕绘制。其中,视频应用所绘制的弹幕可以理解为是用户所需播放的视频的弹幕。
示例性的，步骤S4214可以是在视频应用接收到用户点击开启弹幕的操作之后触发执行，也可以是在前述步骤S4201中接收到用户播放视频的操作之后，自动触发开启需播放视频的弹幕，进而开始执行步骤S4214。示例性的，如图4c的(2)所示，用户点击弹幕开关，即可触发视频应用执行步骤S4214。
示例性的,视频应用调用绘制渲染模块进行绘制弹幕时,绘制渲染模块可以生成文本视图(Textview)、基本画布(basecanvas),采用drawtext、Skia、HWUI、OPEN GL ES等函数完成弹幕绘制。示例性的,针对绘制每一帧的弹幕绘制过程,可以是每需要绘制一条弹幕,视频应用就调用一次绘制渲染模块进行弹幕绘制。可以理解为在绘制一帧画面的弹幕的过程中,步骤S4214中可以根据实际的弹幕绘制需求重复执行多次。绘制渲染模块绘制渲染弹幕,并将弹幕显示到电子设备的显示屏的过程可以由多个模块配合执行完成,具体可以参考Android系统中的弹幕绘制渲染的过程和技术原理。
需要说明的是，步骤S4214当且仅当视频应用需要绘制弹幕时，才调用绘制渲染模块进行弹幕绘制，因此如果视频应用不需要绘制弹幕时，例如用户关闭弹幕开关，或者视频本身无任何弹幕，则不会执行步骤S4214，进而也不会执行步骤S4215至S4218。因此步骤S4214并不是在显示每一帧画面时都必须执行的步骤，当且仅当一帧的画面具有弹幕绘制需求时才触发执行步骤S4214，而具体触发步骤S4214的方式的不同不影响本申请实施例的实现。
S4215、绘制渲染模块确定是否开启弹幕统计。
绘制渲染模块若确定出开启弹幕统计，则说明当前具有统计弹幕的需求，因此执行步骤S4216，绘制渲染模块若确定出不开启弹幕统计，则说明当前不具有统计弹幕的需求，因此可结束流程，即不进行弹幕统计，不再执行后续的步骤S4216至步骤S4218。
示例性的,绘制渲染模块通过获取弹幕统计的状态,确定是否开启弹幕统计。若弹幕统计为开启状态,则开启弹幕统计,若弹幕统计不为开启状态,则不开启弹幕统计。由于前述的步骤S4212中,Surface Flinger确定开启弹幕统计,将弹幕统计的状态确定为开启状态,因此绘制渲染模块在执行步骤S4215时获取到的弹幕统计的状态为开启状态,确定出开启弹幕统计,执行步骤S4216。
在一些实施例中,弹幕统计的状态可以设置于Surface Flinger中,绘制渲染模块可通过从Surface Flinger中获取弹幕统计的状态,来确定是否开启弹幕统计。在另一些实施例中,也可以是当绘制渲染模块接收到开启弹幕统计的信息时,确定出开启弹幕统计,当绘制渲染模块未接收到弹幕统计的信息时,则不开启弹幕统计。
在一些实施例中,绘制渲染模块确定开启弹幕统计,可以理解为是绘制渲染模块确定开启统计当前绘制的弹幕。视频应用每调用一次绘制渲染模块进行弹幕绘制,绘制渲染模块就被触发开启统计当前被调用进行绘制的弹幕。
在一些实施例中，步骤S4215中若确定出开启弹幕统计，则可以初始化启动弹幕计数器，令弹幕计数器处于可进行计数工作的状态。可选地，弹幕计数器可以仅初始化启动一次，在当前一帧内，若绘制渲染模块多次被调用，相应的多次确定开启弹幕统计，则在首次确定开启弹幕统计时进行初始化启动弹幕计数器后，由于弹幕计数器已处于启动状态，后续可不再初始化启动弹幕计数器。示例性的，弹幕计数器在未被绘制渲染模块控制初始化启动之前，不对弹幕进行统计，仅在被初始化启动之后才开始统计绘制渲染模块绘制的弹幕。示例性的，绘制渲染模块初始化启动弹幕计数器的方式可以为：绘制渲染模块可以发送启动信号给弹幕计数器，弹幕计数器接收到启动信号后，开始统计绘制渲染模块所绘制的弹幕。弹幕计数器可以在一帧画面的弹幕绘制结束之后，进入低功耗状态或者休眠状态，以节省功耗，直到下一帧需要进行弹幕绘制时，绘制渲染模块再次在确定开启弹幕统计时初始化启动弹幕计数器。
S4216、绘制渲染模块控制弹幕计数器计数加一。
由于步骤S4215中绘制渲染模块已确定开启弹幕统计,因此触发绘制渲染模块控制弹幕计数器对当前绘制的弹幕进行计数统计,以使得弹幕计数器计数加一,完成对当前绘制渲染模块所绘制的弹幕的条数进行计数统计。由于视频应用每需要绘制一条弹幕,均会调用一次绘制渲染模块进行弹幕绘制,进而触发绘制渲染模块执行步骤S4215至S4217。因此,弹幕计数器能够统计到当前一帧中绘制的弹幕条数。
在另一些实施例中,绘制渲染模块还可以控制弹幕计数器统计一帧画面中需要绘制的弹幕字符数,即控制弹幕计数器计数累加上当前被调用绘制的一条弹幕的字符数。需要说明的是,弹幕计数器可统计的弹幕相关的数据可以有很多方式,包括但不限于本申请所提出的内容。
示例性的,在一些实施例中,为了提高弹幕计数器统计的弹幕数据的精度,绘制渲染模块可以控制弹幕计数器在特定区域进行弹幕数据的统计,即仅统计在特定区域内绘制的弹幕。举例说明,如图4d所示,在图4d示出的视频播放界面上,为了避免区域10和区域30部分的文字被当成弹幕进行统计,绘制渲染模块可以控制弹幕计数器仅针对特定区域20内的文字进行统计,进而可以摒除非弹幕文字的干扰,统计得到准确的弹幕条数、弹幕字符数等弹幕数据。
S4217、弹幕计数器计数加一。
具体的,针对每一帧画面,弹幕计数器由零开始进行弹幕条数的统计,视频应用每执行步骤S4214,调用绘制渲染模块绘制一条弹幕,绘制渲染模块就会在步骤S4215中确定开启弹幕统计,控制弹幕计数器进行计数加一,直至当前一帧画面中需要绘制的弹幕条数绘制完毕,才结束统计。即在当前一帧画面中,若需要绘制多条弹幕,就会重复执行多次步骤S4214至步骤S4217。
在另一些实施例中,弹幕计数器还可以在绘制渲染模块的控制下统计一帧画面中需要绘制的弹幕字符数。需要说明的是,弹幕计数器可统计的弹幕相关的数据可以有很多方式,包括但不限于本申请所提出的内容。
示例性的,在一些实施例中,当弹幕计数器完成当前一帧画面的弹幕条数的统计后,可以将弹幕计数器的计数清零,进而可在之后需要统计某一帧画面的弹幕条数时,重新从零开始计数。
需要说明的是，本申请实施例中示出的步骤S4214至步骤S4217仅仅是示例性的对弹幕条数进行统计的过程，绘制渲染模块实现当前一帧的弹幕条数、弹幕字符数等弹幕数据的统计方式有很多，包括但不限于本申请实施例的内容。
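步骤S4214至步骤S4217所述"开启统计、逐条加一、按帧清零"的弹幕计数过程，可用如下示例性的Python草图示意。类名与接口均为本示例的假设，并非本申请限定的实现：

```python
class BarrageCounter:
    """示例性弹幕计数器：每帧从零开始统计绘制的弹幕条数。"""

    def __init__(self):
        self.enabled = False  # 是否已开启弹幕统计（对应步骤S4215的判断）
        self.count = 0        # 当前一帧已绘制的弹幕条数

    def start(self):
        """对应Surface Flinger确定开启弹幕统计（步骤S4212）。"""
        self.enabled = True

    def on_draw_barrage(self):
        """视频应用每调用绘制渲染模块绘制一条弹幕，计数加一（步骤S4216至S4217）。"""
        if self.enabled:
            self.count += 1

    def finish_frame(self) -> int:
        """当前一帧弹幕绘制完毕，返回本帧弹幕条数并将计数清零。"""
        n = self.count
        self.count = 0
        return n
```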
S4218、绘制渲染模块按照Vsync信号的节奏,发送当前一帧的弹幕绘制信息至Surface Flinger。
绘制渲染模块按照当前的实际刷新率对应的Vsync信号的节奏,发送当前一帧的弹幕绘制信息至Surface Flinger。示例性,执行完图4a的流程之后,电子设备当前的实际刷新率为视频应用对应的预设刷新率,因此执行步骤S4218时,绘制渲染模块可以按照预设刷新率对应的Vsync信号的节奏,发送当前一帧的弹幕绘制信息至Surface Flinger。
其中,当前一帧的弹幕绘制信息包括:当前一帧绘制的弹幕条数、当前一帧的时间戳以及当前一帧的绘制渲染后的弹幕数据。当前一帧绘制的弹幕条数为弹幕计数器所统计的当前的一帧画面内需要绘制的弹幕条数,当前一帧的时间戳为当前一帧画面对应的时间戳,绘制渲染后的弹幕数据也为当前一帧画面对应的绘制渲染后的弹幕数据。示例性的,弹幕绘制信息中还可以携带有其他信息,例如视频应用的包名,弹幕计数器统计的当前一帧的弹幕字符数等。
绘制渲染模块按照视频应用对应的预设刷新率的Vsync信号的节奏,发送当前一帧的弹幕绘制信息至Surface Flinger。需要说明的是,绘制渲染模块可以不用每一帧都发送弹幕绘制信息至Surface Flinger,若绘制渲染模块在某一帧不进行弹幕绘制,那么绘制渲染模块在该帧也不会执行步骤S4214至步骤S4217,进而该帧就不会产生弹幕绘制信息,不会将弹幕绘制信息发送至Surface Flinger。
示例性的,可以是绘制渲染模块中的弹幕计数器执行步骤S4218。
S4219、Surface Flinger确定在预设时长内收到弹幕绘制信息,存储当前一帧绘制的弹幕条数和时间戳。
示例性的,由于Surface Flinger接收到了步骤S4218发送的当前一帧的弹幕绘制信息,确定出了当前在预设时长内有接收到弹幕绘制信息,认为当前处于弹幕场景。进而可以通过存储当前一帧绘制的弹幕条数和时间戳,以使用存储的弹幕条数和时间戳,来确定弹幕场景下的弹幕档位,具体可参考后续对步骤S4220至步骤S4221的描述,此处不再赘述。
示例性的,可以将弹幕绘制信息中的弹幕条数和时间戳进行对应存储,记录该时间戳下总共绘制的弹幕条数。其中,存储弹幕条数和时间戳的方式有很多,本申请实施例对此不作限制。
示例性的,可以以表格形式存储弹幕条数和时间戳。举例说明,如表一所示,第一帧的时间戳为35257912694ns,绘制渲染模块在第一帧总共绘制了3条弹幕,第二帧的时间戳为35274579360ns,绘制渲染模块在第二帧总共绘制了2条弹幕,第三帧的时间戳为35291246026ns,绘制渲染模块在第三帧总共绘制了7条弹幕……以此类推,记录每一帧绘制的弹幕条数和时间戳。
表一
每一帧绘制的弹幕条数 时间戳(ns)
3 35257912694
2 35274579360
7 35291246026
…… ……
其中,弹幕绘制信息中的绘制渲染后的弹幕数据后续用于Surface Flinger合成送显至显示屏,显示弹幕数据的过程可以参见前述图4a中的步骤S4110至步骤S4113的图像数据送显过程的相关描述。
S4220、Surface Flinger根据已存储的弹幕条数、时间戳、以及弹幕老化时间,确定出当前一帧显示的总弹幕条数。
其中，弹幕老化时间指的是弹幕存活时间。针对每一条弹幕，该弹幕从开始在显示屏上显示至完全消失的时间，即为该弹幕存活时间，也就是弹幕老化时间。不同场景下，弹幕老化时间可能有所不同。例如未开倍速的视频，弹幕老化时间可能是5秒，开了倍速的视频，弹幕老化时间可能是3秒。
示例性的,弹幕老化时间可以由Surface Flinger通过获取当前视频播放场景下需播放的视频的弹幕老化时间的方式确定。在另一些实施例中,也可以是通过经验、大数据等确定一个预设置的、固定的弹幕老化时间。即弹幕老化时间的确定方式有很多,本申请实施例对此不做限制。
其中,已存储的弹幕条数、时间戳、以及弹幕老化时间可以理解为是已存储的每一帧绘制的弹幕条数、每一帧对应的时间戳、以及弹幕老化时间。
Surface Flinger根据已存储的每一帧的弹幕条数、时间戳、以及弹幕老化时间,可以确定出当前已经老化的(即已经消失在显示屏上的)总弹幕条数,将已存储的总弹幕条数,减去已经老化的总弹幕条数,即可得到当前一帧显示的总弹幕条数。
举例说明,如前述的表一所示,当弹幕老化时间为5秒,当前的一帧为第三帧时,可以计算出已存储的总弹幕条数为:3+2+7=12。而从第一帧到第三帧时间为:35291246026-35257912694=33333332ns,即0.033333332秒,因此第一帧到第三帧所绘制的弹幕均未老化,因此当前一帧显示的总弹幕条数就是12条。
在另一些实施例中,若步骤S4218中,发送给Surface Flinger的弹幕绘制信息中,还包括有当前一帧的弹幕字符总数,那么还可以根据已存储的弹幕字符总数,计算出当前一帧需显示的总弹幕字符数。例如,可以是通过前述描述计算出当前一帧显示的总弹幕条数之后,计算总弹幕条数内的弹幕字符数,进而得到当前一帧需显示的总弹幕字符数。
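步骤S4220中"已存储的总弹幕条数减去已老化的总弹幕条数"的计算，可用如下示例性Python草图示意。示例数据取自表一，弹幕老化时间取5秒与前文示例一致，仅为假设：

```python
def total_displayed_barrages(records, current_ts_ns, aging_ns):
    """records为(每一帧绘制的弹幕条数, 该帧时间戳ns)的列表；
    仅累加与当前时间戳之差小于弹幕老化时间（即尚未老化）的弹幕条数。"""
    return sum(n for n, ts in records if current_ts_ns - ts < aging_ns)

# 表一中的示例数据：(每一帧绘制的弹幕条数, 时间戳ns)
records = [(3, 35257912694), (2, 35274579360), (7, 35291246026)]
```

以第三帧为当前帧时，三帧的时间差均远小于5秒的弹幕老化时间，弹幕均未老化，当前一帧显示的总弹幕条数即为3+2+7=12条，与前文的计算结果一致。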
当需要显示的总弹幕条数越多时,所需求的屏幕刷新率就会越高,刷新率过低容易出现滚动的弹幕卡顿,画面不流畅的现象。而需要显示的总弹幕条数越少时,所需求的刷新率就较低,较低的刷新率即可保障画面流畅显示。
S4221、Surface Flinger根据当前一帧绘制的弹幕条数以及显示的总弹幕条数,确定当前一帧的弹幕档位。
其中，弹幕档位为用于说明显示屏上所显示的显示画面中弹幕密度的量级。当前一帧绘制的弹幕条数越多，显示的总弹幕条数越多，就认为当前需显示弹幕数量越多，弹幕密度大，反之则认为需显示的弹幕数量越少，弹幕密度小。
示例性的,弹幕档位可以包括:高弹幕档位,或低弹幕档位。低弹幕档位用于表示当前一帧画面的弹幕密度低。高弹幕档位用于表示当前一帧的弹幕密度高。执行步骤S4221的过程可以是:当当前一帧绘制的弹幕条数小于第一预设值,且当前一帧显示的总弹幕条数小于第二预设值,则确定出弹幕档位为低弹幕档位。反之,若当前一帧绘制的弹幕条数大于或等于第一预设值,或者,当前一帧显示的总弹幕条数大于或等于第二预设值,则确定出为高弹幕档位。其中,第一预设值和第二预设值可以按照经验等进行设定,例如第一预设值可以设定为2,第二预设值可以设定为5。
在另一些实施例中，弹幕档位除了可以被划分成高弹幕档位或低弹幕档位这两个弹幕档位，也可以被划分成3个或3个以上的多个弹幕档位。例如，弹幕档位可以包括：第一弹幕档位，第二弹幕档位，或者第三弹幕档位。第一弹幕档位、第二弹幕档位至第三弹幕档位所表征的弹幕数量逐级递增。
在另一些实施例中,若弹幕绘制信息中还包括有当前一帧的弹幕字符数,则在确定弹幕档位时,还可以根据当前一帧的弹幕字符数,和/或当前一帧需显示的总弹幕字符数来确定。当前一帧需绘制的弹幕字符数越多,需显示的总弹幕字符数越多,就认为当前需显示弹幕数量越多,反之则认为需显示的弹幕数量越少。
在另一些实施例中,为了确保确定出的弹幕档位准确,可以在Surface Flinger确定出视频播放场景之后,在首次执行步骤S4221确定弹幕档位时,先延时预设时长之后再执行。在延时预设时长之后,一些非弹幕的文字,例如图4d中的10区域和30区域内的文字,就会被当做已过了弹幕老化时间的消失弹幕,进而不会干扰到计算一帧显示的总弹幕条数的精确度,进而能够准确确定出弹幕档位。
需要说明的是,根据当前一帧绘制的弹幕条数以及显示的总弹幕条数,确定当前一帧的弹幕档位的具体方式有很多,包括但不限于本申请实施例所提出的内容。
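以高、低两个弹幕档位为例，步骤S4221的档位判定可用如下示例性Python草图示意。第一预设值取2、第二预设值取5与前文示例一致，阈值本身为经验性假设：

```python
FIRST_PRESET = 2   # 第一预设值：当前一帧绘制的弹幕条数阈值（假设取2）
SECOND_PRESET = 5  # 第二预设值：当前一帧显示的总弹幕条数阈值（假设取5）

def barrage_gear(drawn_in_frame: int, total_displayed: int) -> str:
    """绘制条数与显示总条数均低于各自阈值时为低弹幕档位，否则为高弹幕档位。"""
    if drawn_in_frame < FIRST_PRESET and total_displayed < SECOND_PRESET:
        return "low"
    return "high"
```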
还需要说明的是,前述提及的步骤S4219至步骤S4221可以是实时执行的,示例性的,可以是按照预设刷新率对应的Vsync节奏,通过执行步骤S4219至步骤S4221,确定每一帧的弹幕档位。
S4222、Surface Flinger确定弹幕档位信息发生变更。
其中,弹幕档位信息用于说明弹幕场景下的弹幕档位,具体可以包括弹幕档位。步骤S4222可以理解为是Surface Flinger确定当前一帧的弹幕档位信息是否发生了变更。具体的,可以是将当前一帧的弹幕档位与上一帧确定出的弹幕档位进行比较,若当前一帧的弹幕档位与上一帧的弹幕档位不同,则确定出当前一帧的弹幕档位信息发生变更,若当前一帧确定出的弹幕档位与上一帧的弹幕档位一致,则确定出当前一帧的弹幕档位信息没有发生变更,因此结束流程,继续确定下一帧的弹幕档位信息是否发生变更。
需要说明的是,若步骤S4221为首次确定出的弹幕档位,那么步骤S4222中Surface Flinger也会确定出弹幕档位信息发生变更。
当步骤S4222确定弹幕档位信息发生变更时,则会触发执行步骤S4223。若Surface Flinger确定弹幕档位信息没有发生变更,则可以不执行步骤S4223。
S4223、Surface Flinger发送弹幕档位信息至帧率决策模块。
其中,弹幕档位信息能够用于说明当前处于弹幕场景以及当前弹幕场景下的弹幕档位,进而帧率决策模块可以根据弹幕档位调整目标刷新率,以满足在当前的弹幕档位下的显示需求。弹幕档位信息除了可以包括当前一帧的弹幕档位,还可以包括其他的信息,例如可以包括视频应用的包名、视频源帧率等信息,本申请实施例对于弹幕档位信息中所包括的信息不做限制。
示例性,步骤S4223可以在当且仅当弹幕档位信息发生变更时再执行,即若没有执行步骤S4222,即当前一帧的弹幕档位信息没有发生变更时,则可以不触发执行步骤S4223。由于Surface Flinger可以仅在确定出弹幕档位信息发生变更时,发送弹幕档位信息给帧率决策模块,而不需要实时发送当前一帧的弹幕档位信息给帧率决策模块,进而可实现节省功耗。
举例说明,若当前一帧确定出的弹幕档位为高弹幕档位,而上一次确定出的弹幕档位为低弹幕档位,那么就需要发送高弹幕档位信息至帧率决策模块。又例如Surface Flinger首次确定出弹幕档位时,也需要发送弹幕档位信息至帧率决策模块。
S4224、帧率决策模块根据弹幕档位信息,确定目标刷新率。
帧率决策模块根据弹幕档位信息，可以确定出当前为弹幕场景，以及弹幕场景下的弹幕档位，进而可以根据弹幕档位信息，调整目标刷新率。
示例性的,可以预先针对每一个弹幕档位分配一个对应的预设刷新率。弹幕档位越高,说明需要显示的弹幕数量越多,对应的预设刷新率也就越高。例如,可以将低弹幕档位对应的预设刷新率设置为30HZ,高弹幕档位对应的预设刷新率为40HZ。在一些实施例中,为了使得弹幕档位确定出的刷新率适用于当前的视频应用,可以使得弹幕档位所确定出的刷新率,小于或等于视频应用对应的预设刷新率,且大于或等于视频应用播放视频的视频源帧率。
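按照上段的思路，每个弹幕档位对应一个预设刷新率，且该刷新率被约束为不小于视频源帧率、不大于应用对应的预设刷新率，可用如下示例性Python草图示意。低档30HZ、高档40HZ取自前文示例，夹取式的约束方式为本示例的假设：

```python
GEAR_PRESET_RATES = {"low": 30, "high": 40}  # 假设：低弹幕档位30HZ，高弹幕档位40HZ

def gear_refresh_rate(gear: str, source_fps: int, app_preset_rate: int) -> int:
    """由弹幕档位得到刷新率，并约束到[视频源帧率, 应用预设刷新率]区间内。"""
    rate = GEAR_PRESET_RATES[gear]
    return max(source_fps, min(rate, app_preset_rate))
```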
其中,根据弹幕档位信息,确定目标刷新率的方式有很多,例如还可以通过具体的运算规则,使用弹幕档位信息计算出目标刷新率,本申请实施例对步骤S4224的具体实施方式不做限制。
S4225、帧率决策模块将目标刷新率发送至Surface Flinger。
在一些实施例中,为了节省功耗,可以是在帧率决策模块确定出的目标刷新率发生变更时,再发送目标刷新率至Surface Flinger。
在另一些实施例中,也可以按照预设的频率发送最新确定出的目标刷新率至Surface Flinger。
S4226、Surface Flinger按照目标刷新率,控制视频播放界面的显示。
具体的,Surface Flinger对步骤S4209中所提及的视频播放界面的图层进行绘制渲染,得到绘制渲染后的视频播放界面的图层,然后将绘制渲染后的视频播放界面的图层以及S4218中接收到的绘制渲染后的弹幕数据进行合并图层处理,按照目标刷新率输送至显示屏显示,以使得显示屏按照目标刷新率,刷新显示视频播放界面。
其中，Surface Flinger控制显示视频播放界面的过程可以参考前述图4a中的步骤S4110至步骤S4113中的视频应用主界面的绘制渲染过程，主要区别点在于此时的目标刷新率不再是视频应用对应的预设刷新率，而是根据弹幕档位信息所确定出的刷新率，且显示的不再是视频应用主界面，而是视频播放的弹幕界面。
举例说明,如图4c所示,图4c的(1)中,用户在视频应用主界面点击了视频1,触发执行图4b示出的流程,经过步骤S4201至步骤S4212,可以确定出当前视频应用是处于视频播放场景,且Surface Flinger也确定开启弹幕统计,此时显示界面的画面变为了图4c的(2),由于Surface Flinger在经过预设时长之后,才确定弹幕档位信息,因此刚进入图4c的(2)示出的视频1播放界面时,仍然维持原有的刷新率60HZ,刷新率未变更。继续参阅图4c的(2),用户点击弹幕开关,将弹幕开关开启,触发执行了步骤S4213至步骤S4226,在延时了预设时长之后,确定出了为低弹幕档位,切换为低弹幕档位下的刷新率30HZ,显示界面变化为图4c的(3)。如图4c的(3)所示,显示的弹幕条数为3条,为低弹幕档位的状态,按照30HZ的刷新率刷新显示。
由于本申请实施例中,Surface Flinger能够根据当前一帧绘制的弹幕条数、以及当前一帧显示的总弹幕条数,确定出弹幕档位信息,进而能够向帧率决策模块发送携带有弹幕档位信息的切换刷新率信息,以使得帧率决策模块可以根据弹幕档位调整目标刷新率,进而能够满足弹幕场景下对刷新率的需求,在能够满足画面流畅性要求的同时,又不会造成刷新率多余,浪费电量的情况。
需要说明的是，由于每一帧画面所显示的弹幕都需要不断绘制，因此Surface Flinger中存储的弹幕条数以及时间戳也在不断增加，也需要不断执行步骤S4213和步骤S4221，确定出最新一帧的弹幕档位，即可以理解为Surface Flinger在实时确定最新的弹幕档位，在一些实施例中，可以仅在执行了步骤S4222，即弹幕档位确定发生变更时（例如与上一帧确定出的弹幕档位不同时）再发送弹幕档位信息，以通知帧率决策模块重新调整目标刷新率。
举例说明,如图4c的(3)所示的场景中,视频1上的弹幕条数较少,处于低弹幕档位,但在视频1播放一段时间后,手机中的Surface Flinger通过步骤S4220计算出的显示的弹幕条数增加,需要绘制的弹幕条数也增加,进而步骤S4221所确定出的弹幕档位变为了高弹幕档位,因此再次切换刷新率,当视频1的画面由图4c的(3)变为图4c的(4)之后,可以看出图4c的(4)上的弹幕数量相较于(3)明显增多,此时按照切换后的高弹幕档位下的刷新率40HZ刷新显示画面。
S4227、Surface Flinger确定在预设时长内未接收到弹幕绘制信息。
执行完步骤S4226之后,Surface Flinger一直按照弹幕档位信息所确定出的目标刷新率,刷新显示屏。但在执行步骤S4226的过程中,Surface Flinger也重复执行步骤S4213,即通过确定预设时长内是否接收到弹幕绘制的信息来确定当前是否仍处于弹幕场景。
当Surface Flinger确定出在预设时长内未接收到弹幕绘制信息,即也可以理解为是超过预设时长均未接收到弹幕绘制信息时,Surface Flinger确定出视频应用不再处于弹幕场景,而是变更为了视频播放场景,因此需执行步骤S4228,通知帧率决策模块当前的场景变更为了视频播放场景。
示例性的，若用户关闭弹幕或者视频播放过程中不再有需要显示的弹幕等一系列不再需要绘制弹幕的情况时，Surface Flinger会连续多帧都未接收到弹幕绘制信息，因此连续多帧都不会执行步骤S4214至步骤S4218，当Surface Flinger超过预设时长都未接收到弹幕绘制信息时，就说明视频应用不再处于弹幕场景，而此时在前述的步骤S4211的判断下仍然处于视频播放场景中，因此会确定出当前仅为视频播放场景，但不处于弹幕场景。
由于Surface Flinger确定出的场景由弹幕场景需要变更为视频播放场景,场景信息发生了变更,因此可以执行步骤S4228,重新发送携带有视频播放场景的切换刷新率信息至帧率决策模块,帧率决策模块重新根据视频播放场景和视频源帧率,确定目标刷新率,以使得电子设备的显示屏按照视频播放场景下的刷新率刷新画面。
S4228、Surface Flinger发送视频播放场景信息至帧率决策模块。
在通过前述的步骤S4213中确定出了只处于视频播放场景不处于弹幕场景的情况下,或者是通过步骤S4227确定出视频应用从原本的弹幕场景变为了视频播放场景的情况下,Surface Flinger通过将视频播放场景信息发送至帧率决策模块的方式,通知帧率决策模块当前的场景为视频播放场景,以使得帧率决策模块可根据视频播放场景信息重新设置目标刷新率。
其中,视频播放场景信息用于说明当前为视频播放场景。Surface Flinger将视频播放场景信息发送至帧率决策模块,以通知帧率决策模块当前为视频播放场景,因此需要帧率决策模块重新设置一个适用于视频播放场景的目标刷新率。其中,视频播放场景信息的具体表现形式本申请实施例不作限制,例如可以使用特定符号或特定字段来表示。
S4229、帧率决策模块根据视频播放场景信息和视频源帧率,确定出目标刷新率。
帧率决策模块接收到视频播放场景信息后,可确定出当前处于视频播放场景。示例性的,在视频播放场景下,帧率决策模块可以根据视频源帧率确定目标刷新率。例如可以将目标刷新率设置为视频源帧率。又例如,可以是根据视频源帧率,使用视频播放场景下的刷新率计算规则,计算出目标刷新率。
在另一些实施例中,若步骤S4207还发送了视频分辨率等除了视频源帧率之外的其他视频源信息,则还可以结合其他的视频源信息确定目标刷新率。本实施例对于根据视频源帧率确定出目标刷新率的具体过程不作限制。
由于视频源帧率能够说明当前需播放的视频对刷新率的需求,因此在视频播放场景下使用视频源帧率来确定出的目标刷新率,可以满足当前视频播放场景的需求。
在另一些实施例中，还可以仅根据视频播放场景信息，确定出目标刷新率。示例性的，可以是预先存储了视频播放场景对应的预设刷新率，当接收到视频播放场景信息时，帧率决策模块则将目标刷新率设置为视频播放场景对应的预设刷新率。例如，视频播放场景对应的预设刷新率为48HZ，而视频应用对应的预设刷新率为60HZ，则帧率决策模块在执行步骤S4229的过程中，会将目标刷新率从原本的60HZ，调整为48HZ。
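视频播放场景下根据视频源帧率确定目标刷新率的一种可能的计算规则，可用如下示例性Python草图示意。"从支持的刷新率中选取不小于视频源帧率的最小值"仅是本示例假设的一种规则，支持的刷新率档位也为假设，并非本申请限定的实现：

```python
SUPPORTED_RATES = (24, 30, 48, 60, 90, 120)  # 假设显示屏支持的刷新率档位（HZ）

def video_scene_refresh_rate(source_fps: int) -> int:
    """选取不小于视频源帧率的最小支持刷新率；均不满足时退回最大支持刷新率。"""
    for rate in sorted(SUPPORTED_RATES):
        if rate >= source_fps:
            return rate
    return max(SUPPORTED_RATES)
```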
S4230、帧率决策模块将目标刷新率发送至Surface Flinger。
步骤S4230的执行过程和原理可以参考前述提及的步骤S4225,区别点主要在于步骤S4225确定出的目标刷新率与弹幕档位相关,而步骤S4230确定出的刷新率与视频播放场景相关。
S4231、Surface Flinger按照目标刷新率,控制视频播放界面的显示。
其中，步骤S4231的执行过程和原理可以参考前述对步骤S4226的描述，主要区别点在于步骤S4226中的目标刷新率根据弹幕档位确定，而步骤S4231的目标刷新率根据视频播放场景确定，此处不再赘述。
具体的,举例说明,如图4c的(4)所示,视频1的播放界面上呈现出高弹幕档位的场景,此时当用户选择关闭弹幕开关时,则会触发执行步骤S4227至步骤S4231,确定出当前不处于弹幕场景,只处于视频播放场景,进而完成刷新率切换,显示屏显示的画面变化为图4c的(5),此时弹幕开关关闭,视频1的播放界面不再有弹幕,手机按照视频播放场景下的刷新率24HZ刷新显示画面。
在另一些实施例中,若视频应用不再进行视频播放时,则Surface Flinger会检测到步骤S4209中的图层销毁,进而Surface Flinger会确定出不处于视频播放场景,此时确定出的场景信息再次发生变更,因此可以再次发送视频播放场景信息至帧率决策模块,帧率决策模块即可再次调整目标刷新率,例如将目标刷新率设置为视频应用对应的预设刷新率。示例性的,Surface Flinger确定出不处于视频播放场景时,还可以确定关闭弹幕统计,进而绘制渲染模块就会停止弹幕统计。
举例说明,如图4c的(5)所示,当用户点击“←”图标,关闭视频1的播放界面时,手机可确定出不处于视频播放场景,由于场景变更,因此会再次切换刷新率。具体的,显示画面变为图4c的(6),视频应用回到了视频应用主界面,切换成了视频应用对应的预设刷新率60HZ进行刷新画面。
由于本申请实施例中,Surface Flinger能够根据视频播放界面对应的图层信息,和/或调用方信息,确定出视频应用是否处于视频播放场景,进而能够在视频应用处于视频播放场景时,通过向帧率决策模块发送切换刷新率信息,以使得帧率决策模块重新调整目标刷新率,根据视频播放场景所确定出的目标刷新率能够满足电子设备的性能需求,既能够满足视频播放的显示需求,又不会造成刷新率多余,浪费电量的情况。
参阅图4e,图4e为本申请另一实施例提出的刷新率的设置方法中任意一个视频应用的视频播放阶段流程的示意图,应用于本申请实施例提出的电子设备,图4e示出的流程可在图4a示出的流程执行完成,显示出视频应用的视频应用主界面之后再开始执行,此时电子设备的前端仅显示有一个视频应用的视频应用主界面,具体的视频播放阶段流程包括以下步骤:
S4301、视频应用响应于用户播放视频的操作,下发启动视频播放Activity的请求至AMS。
其中,步骤S4301的执行过程和原理可以参考前述图4b示出的步骤S4201,此处不再赘述。
S4302、AMS启动视频播放Activity。
其中,步骤S4302的执行过程和原理可以参考前述图4b示出的步骤S4202,此处不再赘述。
S4303、AMS发送视频播放界面对应的窗口信息至WMS。
其中,步骤S4303的执行过程和原理可以参考前述图4b示出的步骤S4203,此处不再赘述。
S4304、WMS根据视频播放界面对应的窗口信息,创建视频播放界面的窗口。
其中，步骤S4304的执行过程和原理可以参考前述图4b示出的步骤S4204，此处不再赘述。
S4305、WMS发送窗口创建完成的信息至视频应用。
其中,步骤S4305的执行过程和原理可以参考前述图4b示出的步骤S4205,此处不再赘述。
S4306、视频应用调用MediaCodec进行视频解码。
其中,步骤S4306的执行过程和原理可以参考前述图4b示出的步骤S4206,此处不再赘述。
S4307、Media Codec将视频源帧率和被调用的信息发送给帧率决策模块。
其中,视频源帧率指的是用户所需播放的视频的视频源帧率。由于步骤S4306中,Media Codec在视频应用的调用下,对用户所需播放的视频进行了视频解码,因此Media Codec通过用户所需播放的视频的视频源,获取该视频的视频源帧率。在另一些实施例中,Media Codec还可以获取视频的分辨率等视频源的信息。
被调用的信息用于说明Media Codec被调用。在一些实施例中,被调用的信息中可以携带有视频应用的包名,以说明MediaCodec是被视频应用调用的。具体的由于步骤S4306中,Media Codec在视频应用的调用下进行了视频解码,因此MediaCodec在视频应用调用时,可得到视频应用的包名,进而可以在被调用的信息中携带视频应用的包名。
在另一些实施例中,被调用的信息中也可以不携带有视频应用的包名。由于图4c示出的场景中前端仅显示单个视频应用的界面,因此被调用的信息中也可不说明调用Media Codec的应用。
S4308、帧率决策模块将被调用的信息发送给Surface Flinger。
其中,被调用的信息用于说明Media Codec被调用。在另一些实施例中,帧率决策模块还可以将视频源帧率、视频分辨率等视频源信息发送给Surface Flinger。
S4309、WMS将视频播放界面的窗口对应的图层信息发送至Surface Flinger。
其中,步骤S4309的执行过程和原理可以参考前述图4b示出的步骤S4208,此处不再赘述。
S4310、Surface Flinger根据视频播放界面的窗口对应的图层信息,创建视频播放界面的窗口对应的图层。
其中,步骤S4310的执行过程和原理可以参考前述图4b示出的步骤S4209,此处不再赘述。
S4311、Surface Flinger根据图层信息中的包名,确定出视频应用是否在播放白名单中。
其中,步骤S4311的执行过程和原理可以参考前述图4b示出的步骤S4210,此处不再赘述。
S4312、Surface Flinger根据视频播放界面的窗口对应的图层信息,确定是否为视频播放场景。
若步骤S4312确定出为视频播放场景,则执行步骤S4313,若确定出不为视频播放场景,则结束流程,继续按照视频应用对应的预设刷新率刷新显示画面。
其中,步骤S4312的执行过程和原理可以参考前述图4b示出的步骤S4211,此处不再赘述。
S4313、Surface Flinger根据被调用的信息,确定出视频应用调用了Media Codec。
由于步骤S4308中,Surface Flinger接收到了被调用的信息,该被调用的信息能够说明Media Codec被调用,进一步验证了当前具有视频播放需求,因此视频应用才会调用Media Codec进行视频解码。又结合步骤S4312中确定出当前为视频播放场景,Surface Flinger通过步骤S4313再次验证了当前为视频播放场景,进而可进一步执行步骤S4314,以确定当前是否为视频播放场景下的弹幕场景。
在一些实施例中,若Surface Flinger未接收到被调用的信息,即无法确定出视频应用调用了Media Codec,不能够验证当前是否为视频播放场景,因此可以结束流程,不执行后续的步骤S4314。
在另一些实施例中,步骤S4313确定出视频应用调用了Media Codec,验证了当前为视频播放场景之后,在执行步骤S4314之前,还可以直接发送视频播放场景信息至帧率决策模块,由帧率决策模块根据视频播放场景信息以及视频源帧率,确定出目标刷新率,然后将目标刷新率发送至Surface Flinger,由Surface Flinger按照目标刷新率控制视频播放界面的显示,具体过程和执行原理可以参考后续对步骤S4330至步骤S4333,此处不再赘述。Surface Flinger控制显示屏按照视频播放场景信息所确定出的目标刷新率刷新显示画面之后,再开始执行步骤S4314,若步骤S4315确定出在预设时长内未接收到视频应用进行弹幕绘制的信息,则可以直接结束流程,不需要再次执行步骤S4330至步骤S4333。
本申请实施例中,Surface Flinger通过被调用的信息辅助验证视频应用处于视频播放场景,提高了视频播放场景的识别准确性。
S4314、Surface Flinger确定开启弹幕统计。
由于步骤S4313已经再次验证了当前为视频播放场景,因此可以确定开启弹幕统计,以使得绘制渲染模块在进行弹幕绘制时可以开始统计弹幕。通过开启弹幕统计,Surface Flinger可以进一步确定当前是否为弹幕场景,以及可根据弹幕统计的数据确定弹幕场景下的弹幕档位。
其中,步骤S4314的执行过程和原理可以参考前述图4b示出的步骤S4212,此处不再赘述。
S4315、Surface Flinger确定预设时长内是否接收到弹幕绘制信息。
其中,步骤S4315的执行过程和原理可以参考前述图4b示出的步骤S4213,此处不再赘述。
S4316、视频应用按照Vsync信号的节奏,调用绘制渲染模块绘制弹幕。
其中,步骤S4316的执行过程和原理可以参考前述图4b示出的步骤S4214,此处不再赘述。
S4317、绘制渲染模块确定是否开启弹幕统计。
其中,步骤S4317的执行过程和原理可以参考前述图4b示出的步骤S4215,此处不再赘述。
S4318、绘制渲染模块控制弹幕计数器计数加一。
其中,步骤S4318的执行过程和原理可以参考前述图4b示出的步骤S4216,此处不再赘述。
S4319、弹幕计数器计数加一。
其中,步骤S4319的执行过程和原理可以参考前述图4b示出的步骤S4217,此处不再赘述。
S4320、绘制渲染模块按照Vsync信号的节奏,发送当前一帧的弹幕绘制信息至Surface Flinger。
其中,步骤S4320的执行过程和原理可以参考前述图4b示出的步骤S4218,此处不再赘述。
S4321、Surface Flinger确定在预设时长内收到弹幕绘制信息,存储当前一帧绘制的弹幕条数和时间戳。
其中,步骤S4321的执行过程和原理可以参考前述图4b示出的步骤S4219,此处不再赘述。
S4322、Surface Flinger根据已存储的弹幕条数、时间戳、以及弹幕老化时间,确定出当前一帧显示的总弹幕条数。
其中,步骤S4322的执行过程和原理可以参考前述图4b示出的步骤S4220,此处不再赘述。
S4323、Surface Flinger根据当前一帧绘制的弹幕条数以及显示的总弹幕条数,确定当前一帧的弹幕档位。
其中,步骤S4323的执行过程和原理可以参考前述图4b示出的步骤S4221,此处不再赘述。
S4324、Surface Flinger确定弹幕档位信息发生变更。
其中,步骤S4324的执行过程和原理可以参考前述图4b示出的步骤S4222,此处不再赘述。
S4325、Surface Flinger发送弹幕档位信息至帧率决策模块。
其中,步骤S4325的执行过程和原理可以参考前述图4b示出的步骤S4223,此处不再赘述。
S4326、帧率决策模块根据弹幕档位信息,确定目标刷新率。
其中,步骤S4326的执行过程和原理可以参考前述图4b示出的步骤S4224,此处不再赘述。
S4327、帧率决策模块将目标刷新率发送至Surface Flinger。
其中,步骤S4327的执行过程和原理可以参考前述图4b示出的步骤S4225,此处不再赘述。
S4328、Surface Flinger按照目标刷新率,控制视频播放界面的显示。
其中,步骤S4328的执行过程和原理可以参考前述图4b示出的步骤S4226,此处不再赘述。
S4329、Surface Flinger确定在预设时长内未接收到弹幕绘制信息。
其中,步骤S4329的执行过程和原理可以参考前述图4b示出的步骤S4227,此处不再赘述。
S4330、Surface Flinger发送视频播放场景信息至帧率决策模块。
其中,步骤S4330的执行过程和原理可以参考前述图4b示出的步骤S4228,此处不再赘述。
S4331、帧率决策模块根据视频播放场景信息和视频源帧率,确定出目标刷新率。
其中,步骤S4331的执行过程和原理可以参考前述图4b示出的步骤S4229,此处不再赘述。
S4332、帧率决策模块将目标刷新率发送至Surface Flinger。
其中,步骤S4332的执行过程和原理可以参考前述图4b示出的步骤S4230,此处不再赘述。
S4333、Surface Flinger按照目标刷新率,控制视频播放界面的显示。
其中,步骤S4333的执行过程和原理可以参考前述图4b示出的步骤S4231,此处不再赘述。
需要说明的是,图4a、图4b以及图4c中所提及的Launcher、视频应用、绘制渲染模块、AMS、WMS、Media Codec、帧率决策模块以及Surface Flinger的功能和技术原理的描述可以参考图3示出的相关内容。
为了描述方案更为简洁,本申请实施例中的视频应用简称为应用,本申请中使用第一刷新率指代应用对应的预设刷新率,使用第二刷新率指代帧率决策模块在弹幕场景下所确定出的刷新率,使用第三刷新率指代帧率决策模块在视频播放场景下所确定出的刷新率。本申请实施例中提及的视频播放界面、应用主界面等应用输出在显示屏上的界面,统称为应用的应用界面。而本申请实施例所提及的视频播放界面的窗口对应的图层,也简称为视频播放界面的图层,应用主界面的窗口对应的图层也简称为应用主界面的图层。
由前述图4a、图4b以及图4e的描述可知，本申请实施例的图4a示出的步骤S4101至步骤S4113中，电子设备在响应于接收用户启动应用的操作之后，按照第一刷新率刷新显示应用的应用界面。其中，第一刷新率可以为应用对应的预设刷新率。而实现响应于接收用户启动应用的操作，按照第一刷新率刷新显示应用的应用界面的方式有很多，包括但不限于本申请实施例所提出的内容。
参阅图4b示出的步骤S4201至步骤S4209的描述，以及图4e示出的步骤S4301至步骤S4310的描述可知，电子设备响应于接收用户播放视频的操作，按照第一刷新率显示视频播放界面，视频播放界面用于显示用户播放视频的视频画面。其中，实现响应于接收用户播放视频的操作，按照第一刷新率显示视频播放界面的方式有很多，包括但不限于本申请实施例所提出的内容。
继续参阅图4b的步骤S4210至步骤S4231、以及参阅图4e的S4311至S4333可知，本申请实施例中，在显示用户播放视频的视频画面的过程中，通过确定预设时长内是否进行弹幕绘制的方式，调整刷新率。具体的，在显示用户播放视频的视频画面的过程中，若在预设时长内进行弹幕绘制，则统计视频的每一帧显示画面的弹幕数据，根据每一帧显示画面的弹幕数据，按照第二刷新率刷新显示应用的应用界面。若在预设时长内未进行弹幕绘制，则按照第三刷新率刷新显示应用的应用界面，第三刷新率与第二刷新率不同，第三刷新率为视频播放场景下所使用的刷新率，第二刷新率为弹幕场景下，根据每一帧显示画面的弹幕数据所确定出的刷新率。通过在显示视频画面的过程中，不断通过确定预设时长内是否进行弹幕绘制的方式，可以识别出当前为视频播放场景还是弹幕场景，进而在处于弹幕场景时，能够通过统计到的弹幕数据，调整为满足当前场景需求的第二刷新率刷新应用界面，而在视频播放场景时，则调整为满足视频播放场景需求的第三刷新率去刷新应用界面。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。上述描述的系统,装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本实施例所提供的几个实施例中,应该理解到,所揭露的系统,装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本实施例各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器执行各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:快闪存储器、移动硬盘、只读存储器、随机存取存储器、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (62)

  1. 一种刷新率的设置方法,其特征在于,应用于电子设备,所述电子设备安装有应用,所述方法包括:
    响应于接收用户启动所述应用的操作;
    按照第一刷新率刷新显示所述应用的应用界面;
    响应于接收用户播放视频的操作;
    按照所述第一刷新率显示视频画面;
    在显示所述视频画面的过程中,确定预设时长内是否进行弹幕绘制;
    若在预设时长内进行弹幕绘制,则统计所述视频的每一帧显示画面的弹幕数据;
    根据所述每一帧显示画面的弹幕数据,按照第二刷新率刷新显示所述应用的应用界面;
    若在预设时长内未进行弹幕绘制,则按照第三刷新率刷新显示所述应用的应用界面,所述第三刷新率与所述第二刷新率不同。
  2. 根据权利要求1所述的方法,其特征在于,所述根据所述每一帧显示画面的弹幕数据,按照第二刷新率刷新显示所述应用的应用界面,包括:
    根据所述每一帧显示画面的弹幕数据,确定第二刷新率;
    按照确定出的所述第二刷新率刷新显示所述应用的应用界面。
  3. 根据权利要求2所述的方法,其特征在于,所述根据所述每一帧显示画面的弹幕数据,确定第二刷新率,包括:
    根据当前统计到的每一帧显示画面的弹幕数据,确定当前的弹幕档位;所述弹幕档位用于表示显示画面上的弹幕密度量级;
    根据所述当前的弹幕档位,确定第二刷新率;所述第二刷新率为当前的弹幕档位对应的预设刷新率;所述弹幕档位说明的弹幕密度量级越大,则确定出的第二刷新率越大。
  4. 根据权利要求1至3任一所述的方法,其特征在于,所述一帧显示画面的弹幕数据,包括:一帧显示画面中绘制的弹幕条数,和/或,一帧显示画面中绘制的弹幕字符数。
  5. 根据权利要求4所述的方法,其特征在于,所述根据当前统计到的每一帧显示画面的弹幕数据,确定当前的弹幕档位,包括:
    根据当前统计到的每一帧显示画面中绘制的弹幕条数、每一帧显示画面对应的时间戳以及弹幕老化时间,计算得到当前一帧显示画面中显示的总弹幕条数;
    根据当前一帧显示画面中绘制的弹幕条数和当前一帧显示画面中显示的总弹幕条数,确定当前的弹幕档位。
  6. The method according to claim 5, wherein the barrage level comprises a high barrage level or a low barrage level; and
    the determining the current barrage level according to the number of barrage comments drawn in the current display frame and the total number of barrage comments displayed in the current display frame comprises:
    if the number of barrage comments drawn in the current display frame is less than a first preset value and the total number of barrage comments displayed in the current display frame is less than a second preset value, determining that the current barrage level is the low barrage level; and
    if the number of barrage comments drawn in the current display frame is greater than or equal to the second preset value, or the total number of barrage comments displayed in the current display frame is greater than or equal to the second preset value, determining that the current barrage level is the high barrage level.
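The counting and classification steps of claims 5 and 6 can be sketched with a sliding window: per-frame drawn counts are kept with their timestamps, entries older than the barrage aging time are dropped, and the remainder approximates the total comments still on screen. The aging time and both thresholds are assumed values, and frames that fail the low-level condition are simply treated as high here; none of these constants come from the patent:

```python
from collections import deque

AGING_MS = 8000     # assumed barrage aging time: comments scroll off after 8 s
FIRST_PRESET = 5    # assumed threshold on comments drawn in the current frame
SECOND_PRESET = 40  # assumed threshold on total comments displayed

class BarrageStats:
    def __init__(self):
        self.history = deque()  # (timestamp_ms, drawn_count) per display frame

    def on_frame(self, ts_ms: int, drawn: int) -> str:
        """Record one frame's drawn count and return the current barrage level."""
        self.history.append((ts_ms, drawn))
        # Age out frames whose comments have already left the screen.
        while self.history and ts_ms - self.history[0][0] > AGING_MS:
            self.history.popleft()
        total = sum(count for _, count in self.history)
        if drawn < FIRST_PRESET and total < SECOND_PRESET:
            return "low"
        return "high"
```

A burst of sparse frames can still reach the high level once the aged total crosses the second threshold, which is the point of tracking both quantities.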
  7. The method according to any one of claims 1 to 6, wherein the collecting statistics on barrage comment data of each display frame of the video comprises:
    collecting statistics on barrage comment data in a specific region of each display frame of the video.
  8. The method according to any one of claims 1 to 7, wherein before the determining whether barrage comment drawing is performed within a preset duration, the method further comprises:
    determining, according to layer information of the application interface, whether a video playback scene is present, wherein the video playback scene is a scene in which a video is played; and
    the determining whether barrage comment drawing is performed within a preset duration comprises:
    if the video playback scene is determined, determining whether barrage comment drawing is performed within the preset duration.
  9. The method according to claim 8, wherein before the determining whether barrage comment drawing is performed within a preset duration, the method further comprises:
    determining whether the video is being decoded; and
    the if the video playback scene is determined, determining whether barrage comment drawing is performed within the preset duration comprises:
    if the video playback scene is determined and it is determined that the video is being decoded, determining whether barrage comment drawing is performed within the preset duration.
  10. The method according to claim 8 or 9, wherein before the determining, according to layer information of the application interface, whether a video playback scene is present, the method further comprises:
    determining whether the application is in a playback whitelist, wherein the playback whitelist is a list of applications having video playback permission; and
    the determining, according to layer information of the application interface, whether a video playback scene is present comprises:
    if it is determined that the application is in the playback whitelist, determining, according to the layer information of the application interface, whether the video playback scene is present.
  11. The method according to claim 10, wherein the determining whether the application is in a playback whitelist comprises:
    determining, according to a package name of the application carried in the layer information of the application interface, whether the application is in the playback whitelist, wherein the playback whitelist comprises a package name of each application having video playback permission.
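The whitelist check of claim 11 reduces to extracting a package name from the layer information and testing set membership. The package names and the layer-name format ("SurfaceView - pkg/Activity#0") below are illustrative assumptions; real Android layer-name formats vary by version:

```python
# Assumed whitelist of packages with video playback permission (examples only).
PLAYBACK_WHITELIST = {"com.example.videoapp", "com.example.liveapp"}

def parse_package(layer_name: str) -> str:
    """Pick the first dotted token out of a layer name as the package name.

    Assumes names shaped like "SurfaceView - com.example.videoapp/.Player#0".
    """
    for token in layer_name.replace("/", " ").split():
        if "." in token and not token.startswith("."):
            return token
    return ""

def in_playback_whitelist(layer_name: str) -> bool:
    return parse_package(layer_name) in PLAYBACK_WHITELIST
```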
  12. The method according to any one of claims 8 to 11, wherein the determining, according to layer information of the application interface, whether a video playback scene is present comprises:
    determining whether the layer information of the application interface contains feature information of a video layer;
    if the layer information of the application interface contains the feature information of the video layer, determining that the video playback scene is present; and
    if the layer information of the application interface does not contain the feature information of the video layer, determining that the video playback scene is not present.
  13. The method according to claim 12, wherein the feature information of the video layer is a video-layer SurfaceView field carried in a layer name.
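The scene test of claims 12 and 13 is a substring check over the layer names the compositor already holds. The sample layer names are illustrative; only the presence of the "SurfaceView" field matters:

```python
def is_video_playback_scene(layer_names: list[str]) -> bool:
    """Claim 12/13 sketch: a video playback scene is present when any
    layer name carries the "SurfaceView" field."""
    return any("SurfaceView" in name for name in layer_names)
```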
  14. The method according to any one of claims 1 to 13, wherein the third refresh rate is determined according to a video source frame rate of the video.
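Claim 14 does not fix how the third rate is derived from the source frame rate. One plausible policy, shown here purely as an assumption, is to pick the lowest panel-supported rate that is an integer multiple of the source fps, which avoids pull-down judder while saving power; the supported-rate list is invented for illustration:

```python
SUPPORTED_RATES = (30, 60, 90, 120)  # assumed panel modes, in Hz

def third_refresh_rate(source_fps: int) -> int:
    """Lowest supported rate evenly divisible by the source frame rate;
    falls back to the highest mode when no multiple exists (e.g. 25 fps
    on this assumed mode list)."""
    for rate in SUPPORTED_RATES:
        if rate % source_fps == 0:
            return rate
    return max(SUPPORTED_RATES)
```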
  15. The method according to any one of claims 1 to 14, further comprising:
    in response to receiving a user operation to exit video playback,
    refreshing and displaying the application interface of the application at the first refresh rate.
  16. The method according to any one of claims 1 to 15, wherein an operating system of the electronic device comprises the application and an image compositor Surface Flinger; and the determining whether barrage comment drawing is performed within a preset duration comprises:
    determining, by the Surface Flinger, whether barrage drawing information of a frame is received within the preset duration, wherein the barrage drawing information of a frame comprises: the number of barrage comments drawn in a display frame, the timestamp corresponding to the display frame, and the drawn and rendered barrage data of the display frame.
  17. The method according to claim 16, wherein the operating system of the electronic device further comprises a frame rate decision module; the barrage comment data of a display frame comprises the number of barrage comments drawn in the display frame; and the if barrage comment drawing is performed within the preset duration, collecting statistics on barrage comment data of each display frame of the video comprises:
    if the Surface Flinger receives the barrage drawing information of a frame within the preset duration, storing, by the Surface Flinger, the number of barrage comments drawn in the display frame and the timestamp corresponding to the display frame.
  18. The method according to claim 16 or 17, wherein the operating system of the electronic device further comprises a frame rate decision module; and the refreshing and displaying the application interface of the application at a second refresh rate according to the barrage comment data of each display frame comprises:
    determining, by the Surface Flinger, the current barrage level according to the stored number of barrage comments drawn in each display frame, the timestamp corresponding to each display frame, and the barrage aging time;
    sending, by the Surface Flinger, current barrage level information to the frame rate decision module, wherein the barrage level information comprises the current barrage level;
    determining, by the frame rate decision module, the second refresh rate according to the barrage level information;
    sending, by the frame rate decision module, the determined second refresh rate to the Surface Flinger; and
    controlling, by the Surface Flinger at the second refresh rate, a display screen of the electronic device to refresh and display the application interface of the application.
  19. The method according to claim 18, wherein the sending, by the Surface Flinger, the current barrage level information to the frame rate decision module comprises:
    if the Surface Flinger determines that the current barrage level has not changed, sending the current barrage level information to the frame rate decision module.
  20. The method according to any one of claims 16 to 19, wherein the operating system of the electronic device further comprises a drawing and rendering module, the drawing and rendering module comprises a barrage counter, and if the Surface Flinger receives the barrage drawing information of a frame within the preset duration, before the Surface Flinger receives the barrage drawing information of the frame within the preset duration, the method further comprises:
    determining, by the Surface Flinger, that barrage statistics collection is enabled;
    invoking, by the application at the pace of a Vsync signal, the drawing and rendering module to draw barrage comments;
    determining, by the drawing and rendering module, whether barrage statistics collection is enabled;
    if the drawing and rendering module determines that barrage statistics collection is enabled, controlling the barrage counter to count the number of drawn barrage comments; and
    sending, by the drawing and rendering module at the pace of the Vsync signal, the barrage drawing information of the current frame to the Surface Flinger.
  21. The method according to any one of claims 16 to 20, wherein the operating system of the electronic device further comprises a frame rate decision module; and the if no barrage comment drawing is performed within the preset duration, refreshing and displaying the application interface of the application at a third refresh rate comprises:
    if the Surface Flinger receives no barrage drawing information within the preset duration, sending video playback scene information to the frame rate decision module, wherein the video playback scene is a scene in which a video is played;
    determining, by the frame rate decision module, the third refresh rate according to the video playback scene information and the video source frame rate;
    sending, by the frame rate decision module, the determined third refresh rate to the Surface Flinger; and
    controlling, by the Surface Flinger at the third refresh rate, the display screen of the electronic device to refresh and display the application interface of the application.
  22. The method according to claim 20, wherein the operating system of the electronic device further comprises a codec Media Codec; and after the in response to receiving a user operation to play a video, the method further comprises:
    invoking, by the application, the Media Codec to decode the video; and
    sending, by the Media Codec, the video source frame rate of the video to the frame rate decision module.
  23. The method according to claim 8, wherein the operating system of the electronic device comprises Surface Flinger; and the determining, according to layer information of the application interface, whether a video playback scene is present comprises:
    determining, by the Surface Flinger, whether a layer name in the layer information of the application interface carries a SurfaceView field;
    if the Surface Flinger determines that the layer name carries the SurfaceView field, determining that the video playback scene is present; and
    if the Surface Flinger determines that the layer name does not carry the SurfaceView field, determining that the video playback scene is not present.
  24. The method according to any one of claims 1 to 23, wherein the operating system of the electronic device comprises the application, Surface Flinger, and a drawing and rendering module; and the displaying a video picture at the first refresh rate comprises:
    triggering, by the Surface Flinger at the pace of a Vsync signal corresponding to the first refresh rate, the application to draw and render a layer of a video playback interface, wherein the video playback interface is an application interface used to display the video picture;
    invoking, by the application at the pace of the Vsync signal corresponding to the first refresh rate, the drawing and rendering module to draw and render the layer of the video playback interface;
    sending, by the drawing and rendering module at the pace of the Vsync signal corresponding to the first refresh rate, drawn and rendered image data of the video playback interface to the Surface Flinger;
    performing, by the Surface Flinger at the pace of the Vsync signal corresponding to the first refresh rate, layer composition using the drawn and rendered image data of the video playback interface; and
    outputting, by the Surface Flinger at the pace of the Vsync signal corresponding to the first refresh rate, the composited image data of the video playback interface to the display screen for display.
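The Vsync pacing in claim 24 can be serialized into a toy timeline: each tick triggers draw/render, composition, and display output, and the tick interval follows directly from the refresh rate. In the real pipeline the stages of consecutive frames overlap across processes; this sketch only illustrates the ordering and the cadence:

```python
def vsync_period_ms(refresh_rate_hz: float) -> float:
    """Interval between consecutive Vsync signals at the given rate."""
    return 1000.0 / refresh_rate_hz

def run_pipeline(refresh_rate_hz: float, frames: int) -> list[str]:
    """Log the three per-tick stages of the claim-24 pipeline in order."""
    log = []
    period = vsync_period_ms(refresh_rate_hz)
    for i in range(frames):
        t = i * period
        log.append(f"{t:.1f}ms draw+render")  # app + drawing/rendering module
        log.append(f"{t:.1f}ms compose")      # Surface Flinger layer composition
        log.append(f"{t:.1f}ms display")      # composited data sent to the panel
    return log
```

At 60 Hz the tick interval is 1000/60 ≈ 16.7 ms, which is why a higher refresh rate shortens the whole per-frame budget, not just the display step.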
  25. The method according to claim 24, wherein the operating system of the electronic device further comprises an AMS and a window management module WMS; and before the triggering, by the Surface Flinger at the pace of the Vsync signal corresponding to the first refresh rate, the application to draw and render the layer of the video playback interface, the method further comprises:
    sending, by the application, a request to start a video playback Activity to the AMS, wherein the request to start the video playback Activity carries the package name of the application and a video playback interface name;
    starting, by the AMS, the video playback Activity according to the package name of the application and the video playback interface name;
    sending, by the AMS, window information corresponding to the video playback interface to the WMS;
    creating, by the WMS, a window of the video playback interface according to the window information corresponding to the video playback interface;
    sending, by the WMS, layer information of the video playback interface to the Surface Flinger, wherein the layer information carries the package name of the application, and the layer information of the video playback interface corresponds to the window of the video playback interface; and
    creating, by the Surface Flinger, a layer of the video playback interface according to the layer information of the video playback interface.
  26. The method according to claim 9, wherein the operating system of the electronic device comprises the application, Surface Flinger, and Media Codec; and the determining whether the video is being decoded comprises:
    if the Surface Flinger receives invocation information, determining that the application invokes the Media Codec to decode the video, wherein the invocation information indicates that the Media Codec is invoked; and
    if the Surface Flinger receives no invocation information, determining that the application does not invoke the Media Codec to decode the video.
  27. The method according to claim 26, wherein if the Surface Flinger receives the invocation information, before the Surface Flinger receives the invocation information, the method further comprises:
    invoking, by the application, the Media Codec to decode the video; and
    sending, by the Media Codec, the invocation information to the Surface Flinger.
  28. The method according to claim 27, wherein before the invoking, by the application, the Media Codec to decode the video, the method further comprises:
    sending, by the WMS, information indicating that window creation of the video playback interface is completed to the application.
  29. The method according to claim 10, wherein the determining whether the application is in a playback whitelist comprises:
    determining, by the Surface Flinger according to the package name of the application carried in the layer information of the application interface, whether the application is in the playback whitelist, wherein the playback whitelist comprises the package name of each application having video playback permission.
  30. The method according to claim 1, wherein the operating system of the electronic device comprises the application, Surface Flinger, a frame rate decision module, and a drawing and rendering module; and the refreshing and displaying an application interface of the application at a first refresh rate comprises:
    triggering, by the Surface Flinger at the pace of a Vsync signal corresponding to the first refresh rate, the application to draw and render a layer of an application main interface, wherein the application main interface is the application interface displayed after the application is started;
    invoking, by the application at the pace of the Vsync signal corresponding to the first refresh rate, the drawing and rendering module to draw and render the layer of the application main interface;
    sending, by the drawing and rendering module at the pace of the Vsync signal corresponding to the first refresh rate, drawn and rendered image data of the application main interface to the Surface Flinger;
    performing, by the Surface Flinger at the pace of the Vsync signal corresponding to the first refresh rate, layer composition using the drawn and rendered image data of the application main interface; and
    outputting, by the Surface Flinger at the pace of the Vsync signal corresponding to the first refresh rate, the composited image data of the application main interface to the display screen for display.
  31. The method according to claim 30, wherein the operating system of the electronic device further comprises a desktop launcher Launcher, an AMS, and a WMS; and before the triggering, by the Surface Flinger at the pace of the Vsync signal corresponding to the first refresh rate, the application to draw and render the layer of the application main interface, the method further comprises:
    sending, by the Launcher, a request to start an application Activity to the AMS, wherein the request to start the application Activity carries the package name of the application;
    starting, by the AMS, the application Activity;
    sending, by the AMS, window information corresponding to the application main interface to the WMS, wherein the window information carries the package name of the application;
    creating, by the WMS, a window of the application main interface according to the window information corresponding to the application main interface, and sending the package name of the application to the frame rate decision module;
    sending, by the WMS, layer information of the application main interface to the Surface Flinger, wherein the layer information of the application main interface corresponds to the window of the application main interface;
    determining, by the frame rate decision module, the first refresh rate according to the package name of the application, wherein the first refresh rate is a preset refresh rate corresponding to the application; and
    creating, by the Surface Flinger, a layer of the application main interface according to the layer information of the application main interface.
  32. An electronic device, comprising: one or more processors, a memory, and a display screen;
    wherein the memory and the display screen are separately coupled to the one or more processors;
    the display screen is configured to display an application interface; and
    the memory is configured to store computer program code, the computer program code comprising computer instructions, wherein when the one or more processors execute the computer instructions, the electronic device performs the method for setting a refresh rate according to any one of claims 1 to 31.
  33. A method for setting a refresh rate, applied to an electronic device on which an application is installed, the method comprising:
    in response to receiving a user operation to start the application,
    refreshing and displaying an application interface of the application at a first refresh rate;
    in response to receiving a user operation to play a video,
    displaying a video picture at the first refresh rate;
    during display of the video picture, determining, according to layer information of the application interface, whether a video playback scene is present, wherein the video playback scene is a scene in which a video is played;
    if the video playback scene is determined, determining whether barrage comment drawing is performed within a preset duration;
    if barrage comment drawing is performed within the preset duration, collecting statistics on barrage comment data of each display frame of the video;
    refreshing and displaying the application interface of the application at a second refresh rate according to the barrage comment data of each display frame; and
    if no barrage comment drawing is performed within the preset duration, refreshing and displaying the application interface of the application at a third refresh rate, wherein the third refresh rate is different from the second refresh rate.
  34. The method according to claim 33, wherein the refreshing and displaying the application interface of the application at a second refresh rate according to the barrage comment data of each display frame comprises:
    determining the second refresh rate according to the barrage comment data of each display frame; and
    refreshing and displaying the application interface of the application at the determined second refresh rate.
  35. The method according to claim 34, wherein the determining the second refresh rate according to the barrage comment data of each display frame comprises:
    determining a current barrage level according to the currently collected barrage comment data of each display frame, wherein the barrage level indicates a magnitude of barrage density on the display picture; and
    determining the second refresh rate according to the current barrage level, wherein the second refresh rate is a preset refresh rate corresponding to the current barrage level, and a greater barrage density magnitude indicated by the barrage level leads to a greater determined second refresh rate.
  36. The method according to any one of claims 33 to 35, wherein the barrage comment data of a display frame comprises: a number of barrage comments drawn in the display frame, and/or a number of barrage characters drawn in the display frame.
  37. The method according to claim 36, wherein the determining a current barrage level according to the currently collected barrage comment data of each display frame comprises:
    calculating a total number of barrage comments displayed in a current display frame according to the currently collected number of barrage comments drawn in each display frame, a timestamp corresponding to each display frame, and a barrage aging time; and
    determining the current barrage level according to the number of barrage comments drawn in the current display frame and the total number of barrage comments displayed in the current display frame.
  38. The method according to claim 37, wherein the barrage level comprises a high barrage level or a low barrage level; and
    the determining the current barrage level according to the number of barrage comments drawn in the current display frame and the total number of barrage comments displayed in the current display frame comprises:
    if the number of barrage comments drawn in the current display frame is less than a first preset value and the total number of barrage comments displayed in the current display frame is less than a second preset value, determining that the current barrage level is the low barrage level; and
    if the number of barrage comments drawn in the current display frame is greater than or equal to the second preset value, or the total number of barrage comments displayed in the current display frame is greater than or equal to the second preset value, determining that the current barrage level is the high barrage level.
  39. The method according to any one of claims 33 to 38, wherein the collecting statistics on barrage comment data of each display frame of the video comprises:
    collecting statistics on barrage comment data in a specific region of each display frame of the video.
  40. The method according to claim 33, wherein before the determining whether barrage comment drawing is performed within a preset duration, the method further comprises:
    determining whether the video is being decoded; and
    the if the video playback scene is determined, determining whether barrage comment drawing is performed within the preset duration comprises:
    if the video playback scene is determined and it is determined that the video is being decoded, determining whether barrage comment drawing is performed within the preset duration.
  41. The method according to claim 40, wherein before the determining, according to layer information of the application interface, whether a video playback scene is present, the method further comprises:
    determining whether the application is in a playback whitelist, wherein the playback whitelist is a list of applications having video playback permission; and
    the determining, according to layer information of the application interface, whether a video playback scene is present comprises:
    if it is determined that the application is in the playback whitelist, determining, according to the layer information of the application interface, whether the video playback scene is present.
  42. The method according to claim 41, wherein the determining whether the application is in a playback whitelist comprises:
    determining, according to a package name of the application carried in the layer information of the application interface, whether the application is in the playback whitelist, wherein the playback whitelist comprises a package name of each application having video playback permission.
  43. The method according to any one of claims 40 to 42, wherein the determining, according to layer information of the application interface, whether a video playback scene is present comprises:
    determining whether the layer information of the application interface contains feature information of a video layer;
    if the layer information of the application interface contains the feature information of the video layer, determining that the video playback scene is present; and
    if the layer information of the application interface does not contain the feature information of the video layer, determining that the video playback scene is not present.
  44. The method according to claim 43, wherein the feature information of the video layer is a video-layer SurfaceView field carried in a layer name.
  45. The method according to any one of claims 33 to 44, wherein the third refresh rate is determined according to a video source frame rate of the video.
  46. The method according to any one of claims 33 to 45, further comprising:
    in response to receiving a user operation to exit video playback,
    refreshing and displaying the application interface of the application at the first refresh rate.
  47. The method according to any one of claims 33 to 46, wherein an operating system of the electronic device comprises the application and an image compositor Surface Flinger; and the determining whether barrage comment drawing is performed within a preset duration comprises:
    determining, by the Surface Flinger, whether barrage drawing information of a frame is received within the preset duration, wherein the barrage drawing information of a frame comprises: the number of barrage comments drawn in a display frame, the timestamp corresponding to the display frame, and the drawn and rendered barrage data of the display frame.
  48. The method according to claim 47, wherein the operating system of the electronic device further comprises a frame rate decision module; the barrage comment data of a display frame comprises the number of barrage comments drawn in the display frame; and the if barrage comment drawing is performed within the preset duration, collecting statistics on barrage comment data of each display frame of the video comprises:
    if the Surface Flinger receives the barrage drawing information of a frame within the preset duration, storing, by the Surface Flinger, the number of barrage comments drawn in the display frame and the timestamp corresponding to the display frame.
  49. The method according to claim 47 or 48, wherein the operating system of the electronic device further comprises a frame rate decision module; and the refreshing and displaying the application interface of the application at a second refresh rate according to the barrage comment data of each display frame comprises:
    determining, by the Surface Flinger, the current barrage level according to the stored number of barrage comments drawn in each display frame, the timestamp corresponding to each display frame, and the barrage aging time;
    sending, by the Surface Flinger, current barrage level information to the frame rate decision module, wherein the barrage level information comprises the current barrage level;
    determining, by the frame rate decision module, the second refresh rate according to the barrage level information;
    sending, by the frame rate decision module, the determined second refresh rate to the Surface Flinger; and
    controlling, by the Surface Flinger at the second refresh rate, a display screen of the electronic device to refresh and display the application interface of the application.
  50. The method according to claim 49, wherein the sending, by the Surface Flinger, the current barrage level information to the frame rate decision module comprises:
    if the Surface Flinger determines that the current barrage level has not changed, sending the current barrage level information to the frame rate decision module.
  51. The method according to any one of claims 47 to 50, wherein the operating system of the electronic device further comprises a drawing and rendering module, the drawing and rendering module comprises a barrage counter, and if the Surface Flinger receives the barrage drawing information of a frame within the preset duration, before the Surface Flinger receives the barrage drawing information of the frame within the preset duration, the method further comprises:
    determining, by the Surface Flinger, that barrage statistics collection is enabled;
    invoking, by the application at the pace of a Vsync signal, the drawing and rendering module to draw barrage comments;
    determining, by the drawing and rendering module, whether barrage statistics collection is enabled;
    if the drawing and rendering module determines that barrage statistics collection is enabled, controlling the barrage counter to count the number of drawn barrage comments; and
    sending, by the drawing and rendering module at the pace of the Vsync signal, the barrage drawing information of the current frame to the Surface Flinger.
  52. The method according to any one of claims 47 to 51, wherein the operating system of the electronic device further comprises a frame rate decision module; and the if no barrage comment drawing is performed within the preset duration, refreshing and displaying the application interface of the application at a third refresh rate comprises:
    if the Surface Flinger receives no barrage drawing information within the preset duration, sending video playback scene information to the frame rate decision module, wherein the video playback scene is a scene in which a video is played;
    determining, by the frame rate decision module, the third refresh rate according to the video playback scene information and the video source frame rate;
    sending, by the frame rate decision module, the determined third refresh rate to the Surface Flinger; and
    controlling, by the Surface Flinger at the third refresh rate, the display screen of the electronic device to refresh and display the application interface of the application.
  53. The method according to claim 51, wherein the operating system of the electronic device further comprises a codec Media Codec; and after the in response to receiving a user operation to play a video, the method further comprises:
    invoking, by the application, the Media Codec to decode the video; and
    sending, by the Media Codec, the video source frame rate of the video to the frame rate decision module.
  54. The method according to claim 33, wherein the operating system of the electronic device comprises Surface Flinger; and the determining, according to layer information of the application interface, whether a video playback scene is present comprises:
    determining, by the Surface Flinger, whether a layer name in the layer information of the application interface carries a SurfaceView field;
    if the Surface Flinger determines that the layer name carries the SurfaceView field, determining that the video playback scene is present; and
    if the Surface Flinger determines that the layer name does not carry the SurfaceView field, determining that the video playback scene is not present.
  55. The method according to any one of claims 33 to 54, wherein the operating system of the electronic device comprises the application, Surface Flinger, and a drawing and rendering module; and the displaying a video picture at the first refresh rate comprises:
    triggering, by the Surface Flinger at the pace of a Vsync signal corresponding to the first refresh rate, the application to draw and render a layer of a video playback interface, wherein the video playback interface is an application interface used to display the video picture;
    invoking, by the application at the pace of the Vsync signal corresponding to the first refresh rate, the drawing and rendering module to draw and render the layer of the video playback interface;
    sending, by the drawing and rendering module at the pace of the Vsync signal corresponding to the first refresh rate, drawn and rendered image data of the video playback interface to the Surface Flinger;
    performing, by the Surface Flinger at the pace of the Vsync signal corresponding to the first refresh rate, layer composition using the drawn and rendered image data of the video playback interface; and
    outputting, by the Surface Flinger at the pace of the Vsync signal corresponding to the first refresh rate, the composited image data of the video playback interface to the display screen for display.
  56. The method according to claim 55, wherein the operating system of the electronic device further comprises an AMS and a window management module WMS; and before the triggering, by the Surface Flinger at the pace of the Vsync signal corresponding to the first refresh rate, the application to draw and render the layer of the video playback interface, the method further comprises:
    sending, by the application, a request to start a video playback Activity to the AMS, wherein the request to start the video playback Activity carries the package name of the application and a video playback interface name;
    starting, by the AMS, the video playback Activity according to the package name of the application and the video playback interface name;
    sending, by the AMS, window information corresponding to the video playback interface to the WMS;
    creating, by the WMS, a window of the video playback interface according to the window information corresponding to the video playback interface;
    sending, by the WMS, layer information of the video playback interface to the Surface Flinger, wherein the layer information carries the package name of the application, and the layer information of the video playback interface corresponds to the window of the video playback interface; and
    creating, by the Surface Flinger, a layer of the video playback interface according to the layer information of the video playback interface.
  57. The method according to claim 40, wherein the operating system of the electronic device comprises the application, Surface Flinger, and Media Codec; and the determining whether the video is being decoded comprises:
    if the Surface Flinger receives invocation information, determining that the application invokes the Media Codec to decode the video, wherein the invocation information indicates that the Media Codec is invoked; and
    if the Surface Flinger receives no invocation information, determining that the application does not invoke the Media Codec to decode the video.
  58. The method according to claim 57, wherein if the Surface Flinger receives the invocation information, before the Surface Flinger receives the invocation information, the method further comprises:
    invoking, by the application, the Media Codec to decode the video; and
    sending, by the Media Codec, the invocation information to the Surface Flinger.
  59. The method according to claim 58, wherein before the invoking, by the application, the Media Codec to decode the video, the method further comprises:
    sending, by the WMS, information indicating that window creation of the video playback interface is completed to the application.
  60. The method according to claim 41, wherein the determining whether the application is in a playback whitelist comprises:
    determining, by the Surface Flinger according to the package name of the application carried in the layer information of the application interface, whether the application is in the playback whitelist, wherein the playback whitelist comprises the package name of each application having video playback permission.
  61. The method according to claim 33, wherein the operating system of the electronic device comprises the application, Surface Flinger, a frame rate decision module, and a drawing and rendering module; and the refreshing and displaying an application interface of the application at a first refresh rate comprises:
    triggering, by the Surface Flinger at the pace of a Vsync signal corresponding to the first refresh rate, the application to draw and render a layer of an application main interface, wherein the application main interface is the application interface displayed after the application is started;
    invoking, by the application at the pace of the Vsync signal corresponding to the first refresh rate, the drawing and rendering module to draw and render the layer of the application main interface;
    sending, by the drawing and rendering module at the pace of the Vsync signal corresponding to the first refresh rate, drawn and rendered image data of the application main interface to the Surface Flinger;
    performing, by the Surface Flinger at the pace of the Vsync signal corresponding to the first refresh rate, layer composition using the drawn and rendered image data of the application main interface; and
    outputting, by the Surface Flinger at the pace of the Vsync signal corresponding to the first refresh rate, the composited image data of the application main interface to the display screen for display.
  62. The method according to claim 61, wherein the operating system of the electronic device further comprises a desktop launcher Launcher, an AMS, and a WMS; and before the triggering, by the Surface Flinger at the pace of the Vsync signal corresponding to the first refresh rate, the application to draw and render the layer of the application main interface, the method further comprises:
    sending, by the Launcher, a request to start an application Activity to the AMS, wherein the request to start the application Activity carries the package name of the application;
    starting, by the AMS, the application Activity;
    sending, by the AMS, window information corresponding to the application main interface to the WMS, wherein the window information carries the package name of the application;
    creating, by the WMS, a window of the application main interface according to the window information corresponding to the application main interface, and sending the package name of the application to the frame rate decision module;
    sending, by the WMS, layer information of the application main interface to the Surface Flinger, wherein the layer information of the application main interface corresponds to the window of the application main interface;
    determining, by the frame rate decision module, the first refresh rate according to the package name of the application, wherein the first refresh rate is a preset refresh rate corresponding to the application; and
    creating, by the Surface Flinger, a layer of the application main interface according to the layer information of the application main interface.
PCT/CN2022/115417 2021-12-16 2022-08-29 Refresh rate setting method and related device WO2023109184A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22879601.7A EP4224872A4 (en) 2021-12-16 2022-08-29 METHOD FOR ADJUSTING REFRESH FREQUENCY AND ASSOCIATED DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111545016.7A 2021-12-16 2021-12-16 Refresh rate setting method and related device
CN202111545016.7 2021-12-16

Publications (1)

Publication Number Publication Date
WO2023109184A1 true WO2023109184A1 (zh) 2023-06-22

Family

ID=80654737

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/115417 WO2023109184A1 (zh) Refresh rate setting method and related device

Country Status (3)

Country Link
EP (1) EP4224872A4 (zh)
CN (1) CN114205673A (zh)
WO (1) WO2023109184A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114245208A (zh) * 2021-12-16 2022-03-25 Honor Device Co., Ltd. Refresh rate setting method and related device
CN114205673A (zh) * 2021-12-16 2022-03-18 Honor Device Co., Ltd. Refresh rate setting method and related device
CN116095382B (zh) * 2022-06-02 2023-10-31 Honor Device Co., Ltd. Barrage comment recognition method and related apparatus
CN115079977A (zh) * 2022-06-15 2022-09-20 Beijing Zitiao Network Technology Co., Ltd. Interface drawing method and apparatus, electronic device, and storage medium
WO2024169421A1 (zh) * 2023-02-17 2024-08-22 Hisense Visual Technology Co., Ltd. Display device and display processing method
CN118349311B (zh) * 2024-06-14 2024-08-20 Beijing Linzhuo Information Technology Co., Ltd. Adaptive application rendering frame rate adjustment method based on visible-area measurement

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106210853A (zh) * 2016-07-08 2016-12-07 Shanghai Hode Information Technology Co., Ltd. Barrage comment display system and CPU consumption control method thereof
CN106933526A (zh) * 2017-03-10 2017-07-07 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Method and apparatus for dynamically adjusting screen refresh rate, and mobile terminal
WO2018161957A1 (zh) * 2017-03-10 2018-09-13 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Layer drawing control method and apparatus, and mobile terminal
CN109413480A (zh) * 2018-09-30 2019-03-01 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Picture processing method and apparatus, terminal, and storage medium
CN113438552A (zh) * 2021-05-19 2021-09-24 Honor Device Co., Ltd. Refresh rate adjustment method and electronic device
CN113766324A (zh) * 2020-06-02 2021-12-07 Shenzhen OnePlus Technology Co., Ltd. Video playback control method and apparatus, computer device, and storage medium
CN114205673A (zh) * 2021-12-16 2022-03-18 Honor Device Co., Ltd. Refresh rate setting method and related device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106919358B (zh) * 2017-03-10 2021-03-09 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Display control method and apparatus for mobile terminal, and mobile terminal
CN106919402B (zh) * 2017-03-10 2020-08-28 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Control method and apparatus for mobile terminal, and mobile terminal
CN109120988B (zh) * 2018-08-23 2020-07-24 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Decoding method and apparatus, electronic device, and storage medium
CN113362783B (zh) * 2020-03-06 2022-04-05 Huawei Technologies Co., Ltd. Refresh rate switching method and electronic device
CN111767013A (zh) * 2020-06-01 2020-10-13 OPPO (Chongqing) Intelligent Technology Co., Ltd. Control method, control apparatus, electronic apparatus, and computer-readable storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4224872A4

Also Published As

Publication number Publication date
EP4224872A4 (en) 2024-04-10
CN114205673A (zh) 2022-03-18
EP4224872A1 (en) 2023-08-09

Similar Documents

Publication Publication Date Title
WO2023109184A1 (zh) Refresh rate setting method and related device
WO2023109185A1 (zh) Refresh rate setting method and related device
CN107783830B (zh) Multi-task management method and terminal device
US10540056B2 (en) Video playing method and apparatus
US9626530B2 (en) Electronic device and method for protecting applications thereof
CN107544810B (zh) Method and apparatus for controlling application programs
US20110231396A1 (en) System and method for providing predictive contacts
CN110300328B (zh) Video playback control method and apparatus, and readable storage medium
CN110110262A (zh) Browser memory management method, apparatus, and device
US10268343B2 (en) Mobile user interface for contextual browsing while playing digital content
CN114661263B (zh) Display method, electronic device, and storage medium
CN106445449B (zh) Audio focus control apparatus and method
US11455075B2 (en) Display method when application is exited and terminal
CN113507646B (zh) Display device and browser multi-tab media playback method
US8363009B1 (en) Enhanced input using touch screen
US20180196584A1 (en) Execution of multiple applications on a device
US20240320009A1 (en) Data access method and apparatus, and non-transient computer-readable storage medium
US20200028961A1 (en) Switching presentations of representations of objects at a user interface
CN108228776A (zh) Data processing method and apparatus, storage medium, and electronic device
WO2019201134A1 (zh) Split-screen display method, storage medium, and electronic device
WO2024041047A1 (zh) Screen refresh rate switching method and electronic device
US8875049B1 (en) Sliding window manager
WO2023083218A1 (zh) Method for smooth picture display during screen projection, related apparatus, and system
CN113825014B (zh) Multimedia content playback method and apparatus, computer device, and storage medium
CN113096054A (zh) Image frame composition method and apparatus, and storage medium

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 18035380

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2022879601

Country of ref document: EP

Effective date: 20230424

NENP Non-entry into the national phase

Ref country code: DE