CN112516590A - Frame rate identification method and electronic equipment


Info

Publication number: CN112516590A
Authority: CN (China)
Prior art keywords: frame rate, application, target frame, measured, images
Legal status: Pending
Application number: CN201910888838.1A
Other languages: Chinese (zh)
Inventors: 李宗峰, 王绪, 周未来, 丁少文
Assignee (current and original): Huawei Technologies Co., Ltd.
Application filed by Huawei Technologies Co., Ltd.
Priority application: CN201910888838.1A
Related PCT application: PCT/CN2020/108714 (published as WO2021052070A1)

Classifications

    • A63F 13/52: Video games; controlling the output signals based on the game progress, involving aspects of the displayed game scene
    • G06F 9/50: Arrangements for program control; allocation of resources, e.g. of the central processing unit [CPU]
    • G06T 1/20: General purpose image data processing; processor architectures, processor configuration, e.g. pipelining
    • H04N 21/44004: Client devices; processing of video elementary streams involving video buffer management, e.g. video decoder buffer or video display buffer
    • H04N 21/44008: Client devices; processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H04N 21/4402: Client devices; processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N 21/440281: Client devices; reformatting operations of video signals by altering the temporal resolution, e.g. by frame skipping

Abstract

The present application relates to the field of electronic device technologies, and in particular, to a frame rate identification method and an electronic device. The method includes: performing frame stabilization on the image drawing and rendering performed by a first application according to a current target frame rate; determining the frame lengths of N consecutive images according to the times at which the drawing/rendering results of the first application are received, where the N consecutive images are images drawn and rendered by the first application; determining a measured frame rate according to the frame lengths of the N consecutive images; and determining a new target frame rate according to the measured frame rate and the current target frame rate, so as to perform frame stabilization on the image drawing and rendering performed by the first application according to the new target frame rate.

Description

Frame rate identification method and electronic equipment
Technical Field
The present application relates to the field of electronic device technologies, and in particular, to a frame rate identification method and an electronic device.
Background
With the explosive development of the mobile game industry, it is increasingly common for users to play games on mobile phones. The gaming experience is severely affected by problems such as frame rate jitter and stuttering during battles in large mobile games (e.g., Honor of Kings, Game for Peace). The frame rate is the frequency at which bitmap images, called frames, appear consecutively on a display, and may be expressed in frames per second (fps). Game applications such as Game for Peace and Honor of Kings may allow the user to set the frame rate. For example, Honor of Kings has a high frame rate mode and a normal frame rate mode, where the high frame rate mode corresponds to a frame rate of 60 fps and the normal frame rate mode corresponds to a frame rate of 30 fps. When the user selects the high frame rate mode, the frame rate is set to 60 fps when Honor of Kings runs on the electronic device. When the user selects the normal frame rate mode, the frame rate is set to 30 fps when Honor of Kings runs on the electronic device.
In order to give users a better gaming experience and reduce unnecessary power consumption of the electronic device, a frame stabilization scheme is provided. In the frame stabilization scheme, the operating system takes the set frame rate of the game application currently running on the electronic device as the target frame rate and adjusts the operating frequencies of the central processing unit (CPU), the graphics processing unit (GPU), and the like, so as to provide just enough performance (also referred to as computing resources).
Currently, a mobile phone manufacturer needs to cooperate with a third-party game vendor so that the third-party game application delivers its set frame rate to the operating system through a software development kit (SDK). This makes the identification of the target frame rate of the game application strongly dependent on third-party game vendors. Moreover, when the third-party game application is upgraded, delivery of the set frame rate may fail, making it difficult for the operating system to perform frame stabilization at a target frame rate equal or close to the set frame rate of the third-party game application.
Disclosure of Invention
The embodiments of this application provide a frame rate identification method and an electronic device, which can quickly determine a frame rate equal or close to the set frame rate of a third-party application for frame stabilization, without depending on the third party.
In a first aspect, an embodiment of this application provides a frame rate identification method, applied to an electronic device. The method includes: performing frame stabilization on the image drawing and rendering performed by a first application according to a current target frame rate; determining the frame lengths of N consecutive images according to the times at which the drawing/rendering results of the first application are received, where the N consecutive images are images drawn and rendered by the first application; determining a measured frame rate according to the frame lengths of the N consecutive images; and determining a new target frame rate according to the measured frame rate and the current target frame rate, so as to perform frame stabilization on the image drawing and rendering performed by the first application according to the new target frame rate.
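To make these steps concrete, the following is a minimal Kotlin sketch, under the assumption that the operating system timestamps each drawing/rendering result as it arrives; all names (measuredFrameRate, receiveTimesMs) are invented for illustration and do not reflect the patent's actual implementation.

```kotlin
// Hypothetical sketch of the first-aspect method: frame lengths are derived
// from the times at which the OS receives the application's rendering results,
// and a measured frame rate is computed from them.
fun measuredFrameRate(receiveTimesMs: List<Long>): Double {
    // Frame length of image i = receive time of image i minus that of image i-1.
    val frameLengthsMs = receiveTimesMs.zipWithNext { prev, cur -> cur - prev }
    val avgFrameLengthMs = frameLengthsMs.average()
    return 1000.0 / avgFrameLengthMs   // e.g. ~16.7 ms per frame -> ~60 fps
}

fun main() {
    val currentTargetFps = 30.0
    // Assumed receive timestamps (ms) of N consecutive rendered images.
    val receiveTimesMs = listOf(0L, 17L, 33L, 50L, 67L, 83L, 100L)
    val measuredFps = measuredFrameRate(receiveTimesMs)
    // A new target would then be derived from (measuredFps, currentTargetFps);
    // the possible rules are sketched in later examples.
    println("measured=%.1f fps, current target=%.1f fps".format(measuredFps, currentTargetFps))
}
```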
With reference to the first aspect, in a first possible implementation manner of the first aspect, determining the measured frame rate according to the frame lengths of the N consecutive images includes: determining a first interval from a plurality of preset frame rate intervals according to the average frame length of the N consecutive images; and determining the measured frame rate according to the first interval.
With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, determining the measured frame rate according to the first interval includes: using the upper-limit frame rate of the first interval as the measured frame rate.
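As a hedged illustration of these two implementation manners, the Kotlin sketch below maps an average frame length to one of several preset frame rate intervals and reports the interval's upper-limit frame rate as the measured frame rate. The interval boundaries are assumptions chosen only for the example; the patent does not fix them.

```kotlin
// Hypothetical preset frame rate intervals (fps); boundary values are invented.
data class FrameRateInterval(val lowerFps: Int, val upperFps: Int)

val presetIntervals = listOf(
    FrameRateInterval(21, 30),   // e.g. content running near 30 fps
    FrameRateInterval(31, 40),
    FrameRateInterval(41, 60),   // e.g. content running near 60 fps
    FrameRateInterval(61, 90),
    FrameRateInterval(91, 120),
)

// Pick the interval containing the raw rate implied by the average frame
// length, then report that interval's upper limit as the measured frame rate.
fun measuredFrameRate(avgFrameLengthMs: Double): Int? {
    val rawFps = 1000.0 / avgFrameLengthMs
    val interval = presetIntervals.firstOrNull { rawFps >= it.lowerFps && rawFps <= it.upperFps }
    return interval?.upperFps
}

fun main() {
    println(measuredFrameRate(17.5))  // raw ~57.1 fps -> interval 41..60 -> 60
}
```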
With reference to the first aspect or the first possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, determining the new target frame rate according to the measured frame rate and the current target frame rate includes: when the measured frame rate is equal to the current target frame rate and the number of first-type images in the N consecutive images is greater than a first threshold, adding the current target frame rate and a first preset frame rate to obtain a first frame rate, and using the first frame rate as the new target frame rate, where the frame rate corresponding to a first-type image is greater than a second frame rate, and the second frame rate is obtained by adding a second preset frame rate to the current target frame rate.
With reference to the first aspect or the first possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, determining the new target frame rate according to the measured frame rate and the current target frame rate includes: when the measured frame rate is equal to the current target frame rate, the number of first-type images in the N consecutive images is greater than a first threshold, and the number of first-type images in the N consecutive images is greater than the number of second-type images in the N consecutive images, adding the current target frame rate and a first preset frame rate to obtain a first frame rate, and using the first frame rate as the new target frame rate, where the frame rate corresponding to a first-type image is greater than a second frame rate, the second frame rate being obtained by adding a second preset frame rate to the current target frame rate; and the frame rate corresponding to a second-type image is less than a third frame rate, the third frame rate being obtained by subtracting a third preset frame rate from the current target frame rate.
With reference to the first aspect or the first possible implementation manner of the first aspect, in a fifth possible implementation manner of the first aspect, determining the new target frame rate according to the measured frame rate and the current target frame rate includes: when the measured frame rate is greater than the current target frame rate, using the measured frame rate as the new target frame rate.
With reference to the first aspect or the first possible implementation manner of the first aspect, in a sixth possible implementation manner of the first aspect, determining the new target frame rate according to the measured frame rate and the current target frame rate includes: using the larger of the measured frame rate and a first frame rate as the new target frame rate, where the first frame rate is obtained by adding the current target frame rate and a first preset frame rate.
With reference to the first aspect or the first possible implementation manner of the first aspect, in a seventh possible implementation manner of the first aspect, determining the new target frame rate according to the measured frame rate and the current target frame rate includes: when the measured frame rate is less than the current target frame rate, using the measured frame rate as the new target frame rate.
With reference to the first aspect or the first possible implementation manner of the first aspect, in an eighth possible implementation manner of the first aspect, determining the new target frame rate according to the measured frame rate and the current target frame rate includes: when the measured frame rate is less than the current target frame rate and the average sleep time of the rendering thread while the first application draws and renders the N consecutive images is greater than a second threshold, using the measured frame rate as the new target frame rate.
With reference to the first aspect or the first possible implementation manner of the first aspect, in a ninth possible implementation manner of the first aspect, determining the new target frame rate according to the measured frame rate and the current target frame rate includes: when the measured frame rate is less than the current target frame rate, the average sleep time of the rendering thread while the first application draws and renders the N consecutive images is greater than a second threshold, and the number of second-type images in the N consecutive images is greater than a third threshold, using the measured frame rate as the new target frame rate, where the frame rate corresponding to a second-type image is less than a third frame rate, the third frame rate being obtained by subtracting a third preset frame rate from the current target frame rate.
With reference to the first aspect or the first possible implementation manner of the first aspect, in a tenth possible implementation manner of the first aspect, the measured frame rate includes at least two measured frame rates; determining the new target frame rate according to the measured frame rates includes: when the measured frame rates are less than the current target frame rate and the at least two measured frame rates are equal, using the measured frame rate as the new target frame rate.
With reference to the first aspect or the first possible implementation manner of the first aspect, in an eleventh possible implementation manner of the first aspect, the measured frame rate includes at least two measured frame rates; determining the new target frame rate according to the measured frame rates includes: when the measured frame rates are less than the current target frame rate, the at least two measured frame rates are equal, and the number of second-type images in the N consecutive images is greater than a fourth threshold, using the measured frame rate as the new target frame rate, where the frame rate corresponding to a second-type image is less than a third frame rate, the third frame rate being obtained by subtracting a third preset frame rate from the current target frame rate.
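The third through eleventh implementation manners combine a small set of comparisons. The following Kotlin sketch condenses three representative rules; the thresholds, the preset frame rate offset, and all names are placeholders, not values from the patent.

```kotlin
// Placeholder thresholds and offsets; the patent leaves their values open.
const val FIRST_THRESHOLD = 10          // count of first-type (fast) images
const val SECOND_THRESHOLD_MS = 2.0     // average render-thread sleep time
const val PRESET_OFFSET_FPS = 5         // assumed "first preset frame rate"

fun newTargetFrameRate(
    measuredFps: Int,
    currentTargetFps: Int,
    firstTypeCount: Int,        // images whose frame rate exceeds target + offset
    avgRenderSleepMs: Double,   // average sleep time of the app's render thread
): Int = when {
    // Measured equals target, but many frames arrived faster than the target:
    // the app may actually want a higher frame rate, so probe upward.
    measuredFps == currentTargetFps && firstTypeCount > FIRST_THRESHOLD ->
        currentTargetFps + PRESET_OFFSET_FPS
    // Measured above target: adopt the measured rate directly.
    measuredFps > currentTargetFps -> measuredFps
    // Measured below target while the render thread sleeps a lot: the app is
    // deliberately pacing itself, so lower the target to the measured rate.
    measuredFps < currentTargetFps && avgRenderSleepMs > SECOND_THRESHOLD_MS ->
        measuredFps
    else -> currentTargetFps
}

fun main() {
    println(newTargetFrameRate(60, 60, firstTypeCount = 15, avgRenderSleepMs = 0.5)) // 65
    println(newTargetFrameRate(40, 60, firstTypeCount = 0, avgRenderSleepMs = 6.0))  // 40
}
```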
In a second aspect, an embodiment of this application provides a frame rate identification apparatus, including:
a frame stabilization unit, configured to perform frame stabilization on the image drawing and rendering performed by a first application according to a current target frame rate;
a first determining unit, configured to determine the frame lengths of N consecutive images according to the times at which the drawing/rendering results of the first application are received, where the N consecutive images are images drawn and rendered by the first application;
a second determining unit, configured to determine a measured frame rate according to the frame lengths of the N consecutive images; and
a third determining unit, configured to determine a new target frame rate according to the measured frame rate and the current target frame rate, so as to perform frame stabilization on the image drawing and rendering performed by the first application according to the new target frame rate.
With reference to the second aspect, in a first possible implementation manner of the second aspect, the second determining unit is further configured to determine a first interval from a plurality of preset frame rate intervals according to the average frame length of the N consecutive images, and determine the measured frame rate according to the first interval.
With reference to the first possible implementation manner of the second aspect, in a second possible implementation manner of the second aspect, the second determining unit is further configured to use the upper-limit frame rate of the first interval as the measured frame rate.
With reference to the second aspect or the first possible implementation manner of the second aspect, in a third possible implementation manner of the second aspect, the third determining unit is further configured to: when the measured frame rate is equal to the current target frame rate and the number of first-type images in the N consecutive images is greater than a first threshold, use a first frame rate, obtained by adding the current target frame rate and a first preset frame rate, as the new target frame rate, where the frame rate corresponding to a first-type image is greater than a second frame rate, the second frame rate being obtained by adding a second preset frame rate to the current target frame rate.
With reference to the second aspect or the first possible implementation manner of the second aspect, in a fourth possible implementation manner of the second aspect, the third determining unit is further configured to: when the measured frame rate is equal to the current target frame rate, the number of first-type images in the N consecutive images is greater than a first threshold, and the number of first-type images in the N consecutive images is greater than the number of second-type images in the N consecutive images, use a first frame rate, obtained by adding the current target frame rate and a first preset frame rate, as the new target frame rate, where the frame rate corresponding to a first-type image is greater than a second frame rate, the second frame rate being obtained by adding a second preset frame rate to the current target frame rate; and the frame rate corresponding to a second-type image is less than a third frame rate, the third frame rate being obtained by subtracting a third preset frame rate from the current target frame rate.
With reference to the second aspect or the first possible implementation manner of the second aspect, in a fifth possible implementation manner of the second aspect, the third determining unit is further configured to use the measured frame rate as the new target frame rate when the measured frame rate is greater than the current target frame rate.
With reference to the second aspect or the first possible implementation manner of the second aspect, in a sixth possible implementation manner of the second aspect, the third determining unit is further configured to use the larger of the measured frame rate and a first frame rate as the new target frame rate, where the first frame rate is obtained by adding the current target frame rate and a first preset frame rate.
With reference to the second aspect or the first possible implementation manner of the second aspect, in a seventh possible implementation manner of the second aspect, the third determining unit is further configured to use the measured frame rate as the new target frame rate when the measured frame rate is less than the current target frame rate.
With reference to the second aspect or the first possible implementation manner of the second aspect, in an eighth possible implementation manner of the second aspect, the third determining unit is further configured to use the measured frame rate as the new target frame rate when the measured frame rate is less than the current target frame rate and the average sleep time of the rendering thread while the first application draws and renders the N consecutive images is greater than a second threshold.
With reference to the second aspect or the first possible implementation manner of the second aspect, in a ninth possible implementation manner of the second aspect, the third determining unit is further configured to use the measured frame rate as the new target frame rate when the measured frame rate is less than the current target frame rate, the average sleep time of the rendering thread while the first application draws and renders the N consecutive images is greater than a second threshold, and the number of second-type images in the N consecutive images is greater than a third threshold, where the frame rate corresponding to a second-type image is less than a third frame rate, the third frame rate being obtained by subtracting a third preset frame rate from the current target frame rate.
With reference to the second aspect or the first possible implementation manner of the second aspect, in a tenth possible implementation manner of the second aspect, the measured frame rate includes at least two measured frame rates; the third determining unit is further configured to use the measured frame rate as the new target frame rate when the measured frame rates are less than the current target frame rate and the at least two measured frame rates are equal.
With reference to the second aspect or the first possible implementation manner of the second aspect, in an eleventh possible implementation manner of the second aspect, the measured frame rate includes at least two measured frame rates; the third determining unit is further configured to use the measured frame rate as the new target frame rate when the measured frame rates are less than the current target frame rate, the at least two measured frame rates are equal, and the number of second-type images in the N consecutive images is greater than a fourth threshold, where the frame rate corresponding to a second-type image is less than a third frame rate, the third frame rate being obtained by subtracting a third preset frame rate from the current target frame rate.
In a third aspect, an embodiment of this application provides an electronic device, including a processor and a memory, where the memory is configured to store computer-executable instructions. When the electronic device runs, the processor executes the computer-executable instructions stored in the memory, to cause the electronic device to perform the method of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer storage medium including computer instructions that, when executed on an electronic device, cause the electronic device to perform the method of the first aspect.
In a fifth aspect, the present application provides a computer program product, where the computer program product includes program code, which, when executed by a processor in an electronic device, implements the method of the first aspect.
With the frame rate identification method provided in the embodiments of this application, a frame rate equal or close to the set frame rate of an application can be quickly identified even when the operating system does not know, or does not accurately know, that set frame rate. The identified frame rate can be used as the operating system's target frame rate for frame stabilization, which improves the user experience of the application and avoids or reduces unnecessary power consumption overhead.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of this application;
Fig. 2 is a block diagram of a software structure of an electronic device according to an embodiment of this application;
Fig. 3 is a diagram illustrating a module in a system library receiving an image from an application according to an embodiment of this application;
Fig. 4 is a schematic block diagram of a frame rate identification apparatus according to an embodiment of this application;
Fig. 5 is a block diagram of a software structure of an electronic device according to an embodiment of this application;
Fig. 6 is a flowchart of a frame rate identification method according to an embodiment of this application;
Fig. 7 is a flowchart of a frame rate identification method according to an embodiment of this application;
Fig. 8 is a schematic block diagram of a frame rate identification apparatus according to an embodiment of this application;
Fig. 9 is a schematic block diagram of an electronic device according to an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise.
In the description of this specification, "/" indicates an "or" relationship between the associated objects; for example, A/B may represent A or B. "And/or" merely describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent the following three cases: only A exists, both A and B exist, and only B exists. In addition, in the description of the embodiments of this application, "a plurality of" means two or more.
In the description of this specification, the terms "first" and "second" are used for descriptive purposes only and shall not be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. The terms "comprising", "including", "having", and their variants mean "including but not limited to", unless otherwise specifically stated.
When an application (APP) that needs to draw and render dynamic images runs, the operating system of the electronic device may provide computing resources for all programs currently running on the electronic device according to the target frame rate of the APP.
Specifically, the application may be a game application, a video playing application, or the like. More specifically, the game application may be, for example, Honor of Kings, Game for Peace, or PUBG, which are not listed here one by one.
The application may be a third-party application. In one example, the user may download and install the third-party application from an application market or the like.
Computing resources represent computing power: the more computing resources, the stronger the computing power. It is easy to understand that the operating frequencies of the CPU, the GPU, the double data rate synchronous dynamic random access memory (DDR), and the like determine the computing power of the electronic device. The amount of computing resources is positively correlated with the operating frequencies of the CPU, the GPU, the DDR, and the like.
Taking application A as an example of an application that needs to draw and render dynamic images, the operating system of the electronic device may determine the frame length of an image according to the time at which it receives the image drawn and rendered by application A. Such an image may also be referred to as a drawing/rendering result of application A. The frame length of an image may be the time interval between the receiving time of the image and the receiving time of the immediately preceding image, that is, the most recent drawing/rendering result of application A that the operating system received before this image.
The operating system may determine the frame rate of an image based on its frame length. Specifically, the measured frame rate of the image may be obtained by dividing 1000 milliseconds by the frame length of the image. The measured frame rate of the image may represent or reflect how fast the operating system receives drawing/rendering results (images) from application A.
The operating system may adjust the computing resources based on the relationship between the measured frame rate of the image and the target frame rate. Specifically, the operating frequencies of the CPU, the GPU, the DDR, and the like may be adjusted so that the provided computing resources just stabilize the measured frame rate of the images drawn and rendered by application A at about the target frame rate, thereby improving the user experience and reducing or avoiding unnecessary power consumption overhead.
The above process may be referred to as performing frame stabilization on the image drawing and rendering performed by application A, where the target frame rate may be referred to as the target frame rate for frame stabilization of the image drawing and rendering performed by application A.
Next, examples of the frame stabilization scheme are described.
In some embodiments, suppose that at a first time, the operating system receives an image C drawn and rendered by application A. At a second time, the operating system receives an image D drawn and rendered by application A. Image D immediately follows image C; that is, image D is the first image the operating system receives from application A after image C. The time interval between the second time and the first time may be referred to as the frame length of image D. Dividing 1000 milliseconds by the frame length of image D yields the measured frame rate of image D. If the measured frame rate of image D is less than the target frame rate for frame stabilization (the target frame rate at which the image drawing and rendering performed by application A is stabilized), the operating system increases the computing resources it provides. If the measured frame rate of image D is greater than the target frame rate, the operating system reduces the computing resources it provides.
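A minimal sketch of this per-image decision, assuming invented names and an illustrative tolerance value, might look as follows:

```kotlin
// Assumed feedback rule: compare the measured rate implied by a frame length
// with the target frame rate and nudge the computing resources accordingly.
enum class Adjustment { INCREASE, DECREASE, KEEP }

fun adjustFor(measuredFps: Double, targetFps: Double, toleranceFps: Double = 1.0): Adjustment =
    when {
        measuredFps < targetFps - toleranceFps -> Adjustment.INCREASE  // under-performing
        measuredFps > targetFps + toleranceFps -> Adjustment.DECREASE  // wasting power
        else -> Adjustment.KEEP
    }

fun main() {
    // Frame length of image D is 20 ms -> 1000 / 20 = 50 fps against a 60 fps target.
    println(adjustFor(measuredFps = 1000.0 / 20.0, targetFps = 60.0)) // INCREASE
}
```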
In some embodiments, the operating system may determine the frame length of each of a plurality of consecutive images it receives from application A (each image being a drawing/rendering result of application A) and compute their average frame length. The frame length of each image is as described in the previous embodiment and is not repeated here. Dividing 1000 milliseconds by the average frame length yields the measured frame rate of the plurality of consecutive images. If the measured frame rate of the plurality of consecutive images is less than the target frame rate for frame stabilization, the operating system increases the computing resources it provides. If the measured frame rate is greater than the target frame rate, the operating system reduces the computing resources it provides.
In some embodiments, the operating system may divide the computing resources that the electronic device can provide into a plurality of levels, where different levels correspond to different computing capacities. Specifically, the operating frequencies of the CPU, the GPU, the DDR, and the like may be graded to obtain computing resources of different levels. For example, the level-A computing resources may correspond to a CPU frequency of 1200 MHz, a GPU frequency of 500 MHz, and a DDR frequency of 1066 MHz; the level-B computing resources to a CPU frequency of 1700 MHz, a GPU frequency of 600 MHz, and a DDR frequency of 1333 MHz; and the level-C computing resources to a CPU frequency of 2200 MHz, a GPU frequency of 700 MHz, and a DDR frequency of 1600 MHz; and so on, which are not listed here one by one. When turning the computing resources down or up, the operating system may adjust one level at a time or multiple levels at a time. For example, if the difference between the measured frame rate of the images and the target frame rate for frame stabilization is small, one level may be adjusted; if the difference is large, multiple levels may be adjusted at once. This embodiment merely illustrates one way of grading and adjusting computing resources and is not limiting; developers may grade computing resources and set specific adjustment rules based on experience or experiment.
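The grading and adjustment described above might be organized as in the sketch below; the level table echoes the example frequencies in the text (read as 1200/1700/2200 MHz for the CPU), and the step rules are placeholder assumptions rather than the patent's actual policy.

```kotlin
// Hypothetical computing-resource levels, patterned on the example in the text.
data class ResourceLevel(val cpuMHz: Int, val gpuMHz: Int, val ddrMHz: Int)

val levels = listOf(
    ResourceLevel(cpuMHz = 1200, gpuMHz = 500, ddrMHz = 1066), // level A
    ResourceLevel(cpuMHz = 1700, gpuMHz = 600, ddrMHz = 1333), // level B
    ResourceLevel(cpuMHz = 2200, gpuMHz = 700, ddrMHz = 1600), // level C
)

// Step one level for a small frame rate gap, several levels for a large one;
// the gap thresholds (1 fps, 5 fps) are invented for illustration.
fun nextLevelIndex(current: Int, measuredFps: Double, targetFps: Double): Int {
    val gap = targetFps - measuredFps
    val step = when {
        kotlin.math.abs(gap) <= 1.0 -> 0
        kotlin.math.abs(gap) <= 5.0 -> 1
        else -> 2
    }
    val direction = if (gap > 0) +1 else -1      // below target -> more resources
    return (current + direction * step).coerceIn(0, levels.lastIndex)
}

fun main() {
    // Gap of 10 fps below target -> jump two levels up, from level A to level C.
    println(levels[nextLevelIndex(current = 0, measuredFps = 50.0, targetFps = 60.0)])
}
```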
As can be seen from the above, the target frame rate for frame stabilization is an important basis for adjusting computing resources in the frame stabilization scheme. It is therefore important for the frame stabilization scheme that the operating system can obtain the target frame rate quickly and accurately.
In general, the user can set the frame rate of an application, for example, through a frame rate setting interface of the application, and expects the electronic device to display the application's screen at the frame rate set by the user. The frame rate set by the user may be referred to as the set frame rate. If the target frame rate used for frame stabilization of the application's image drawing and rendering is inconsistent with the set frame rate, it may be difficult for the electronic device to display the application's screen at the set frame rate, resulting in a poor user experience.
The embodiments of this application provide a frame rate identification method. When the operating system of the electronic device performs frame stabilization on the image drawing and rendering performed by a first application at a current target frame rate, the method can determine the measured frame rate of a plurality of consecutive images drawn and rendered by the first application, and determine, according to the measured frame rate and the current target frame rate, a new target frame rate that is equal or closer to the set frame rate, so as to perform frame stabilization on the image drawing and rendering performed by the first application according to the new target frame rate. In this way, the set frame rate of an application can be identified quickly and accurately. In particular, when the application is a third-party application, the set frame rate can be identified, and frame stabilization performed, without depending on the third party, improving the user experience.
The frame rate identification method can be applied to an electronic device. The electronic device may be a portable electronic device such as a mobile phone, a tablet computer, a digital camera, a personal digital assistant (PDA), a wearable device, or a laptop computer. Exemplary embodiments of the portable electronic device include, but are not limited to, portable electronic devices running iOS, Android, Microsoft, or another operating system. The portable electronic device may also be another portable electronic device, such as a laptop computer with a touch-sensitive surface (e.g., a touch panel). It should also be understood that, in some other embodiments of this application, the electronic device may not be a portable electronic device but a desktop computer with a touch-sensitive surface (e.g., a touch panel). The type of the electronic device is not specifically limited in the embodiments of this application.
Fig. 1 shows a schematic structural diagram of an electronic device 100 provided in an embodiment of the present application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, to transmit data between the electronic device 100 and a peripheral device, or to connect an earphone to play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and passes it to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information rapidly by borrowing the structure of biological neural networks, for example the transfer mode between neurons of the human brain, and that can also learn continuously by itself. Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data created during use of the electronic device 100 (such as audio data and a phone book), and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 110 performs various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory disposed in the processor.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mike", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal to the microphone 170C by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect wired headphones. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensor 180A, such as a resistive pressure sensor, an inductive pressure sensor, and a capacitive pressure sensor. A capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
The gyroscope sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope sensor 180B. The gyroscope sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyroscope sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the shake angle, and lets the lens counteract the shake of the electronic device 100 through reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used in navigation and motion-sensing game scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon opening may then be set according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The sensor can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
A distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature handling strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In other embodiments, when the temperature is lower than a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form what is commonly called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of touch event. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100 at a position different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of a bone mass vibrated by the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure beating signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal, acquired by the bone conduction sensor 180M, of the bone mass vibrated by the vocal part, so as to implement a voice function. The application processor may parse heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to the user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 2 is a block diagram of a software configuration of the electronic apparatus 100 according to the embodiment of the present invention.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 2, the application packages may include camera, gallery, calendar, phone, map, navigation, WLAN, music, video, short message, game, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, for example notifications of download completion or message alerts. The notification manager may also present notifications that appear in the top status bar of the system in the form of a chart or scroll-bar text, such as notifications from applications running in the background, or notifications that appear on the screen in the form of a dialog window. Examples include prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, and flashing an indicator light.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functional interfaces that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager, a frame rate identifier, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), and a two-dimensional graphics engine (e.g., SGL).
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The surface manager may provide a buffer queue and a SurfaceFlinger. The buffer queue, the SurfaceFlinger, and the applications in the application layer that need to perform dynamic picture drawing and rendering form a graphics producer-consumer model. Here, an application that needs to perform dynamic drawing and rendering is the producer, and the SurfaceFlinger is the consumer. The buffer queue may include a plurality of buffers that serve as carriers for image transfer.
Referring to fig. 3, when the application needs to draw and render an image, the application calls a buffer in an idle state in the buffer queue and draws the rendered image in the called buffer. The process of an application calling a buffer may be referred to as dequeuing the buffer. When the application finishes rendering the image in the called buffer, it hands the buffer back to the buffer queue; this handover process may be referred to as enqueuing the buffer. In other words, the buffer queue may receive from the application the images that the application draws and renders. With continued reference to fig. 3, the consumer, i.e., the SurfaceFlinger, may acquire from the buffer queue a buffer containing a rendered image and use the image for image merging, for example merging the image with the status bar. After using the image in a buffer, the SurfaceFlinger may release the buffer back into the buffer queue, where it is again in an idle state for cyclic reuse. The SurfaceFlinger may submit the merged image to a hardware buffer of the display screen for display.
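For illustration only, the producer-consumer life cycle described above (dequeue, enqueue, acquire, release) can be sketched in a few lines of Java. This is not the Android BufferQueue implementation; the class and method names below (SimpleBufferQueue, dequeueBuffer, and so on) are hypothetical and merely mirror the roles described in this embodiment.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Minimal sketch of the buffer life cycle described above. All names are
// hypothetical; the real BufferQueue/SurfaceFlinger interfaces differ.
class SimpleBufferQueue {
    private final Queue<int[]> freeBuffers = new ArrayDeque<>();   // idle buffers
    private final Queue<int[]> queuedBuffers = new ArrayDeque<>(); // rendered, awaiting the consumer

    SimpleBufferQueue(int bufferCount, int bufferSize) {
        for (int i = 0; i < bufferCount; i++) {
            freeBuffers.add(new int[bufferSize]);
        }
    }

    // Producer (application) side: take an idle buffer ("dequeue").
    synchronized int[] dequeueBuffer() {
        return freeBuffers.poll(); // null if no idle buffer is available
    }

    // Producer side: hand a rendered buffer back to the queue ("enqueue").
    synchronized void enqueueBuffer(int[] buffer) {
        queuedBuffers.add(buffer);
    }

    // Consumer (compositor) side: acquire a rendered buffer for merging.
    synchronized int[] acquireBuffer() {
        return queuedBuffers.poll();
    }

    // Consumer side: release a used buffer back to the idle pool for reuse.
    synchronized void releaseBuffer(int[] buffer) {
        freeBuffers.add(buffer);
    }
}
```

In this sketch, the consumer would merge the acquired image (for example with the status bar) before releasing the buffer, just as described above for the SurfaceFlinger.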
The frame rate identifier may obtain and record the reception time at which the buffer queue receives each image from the application program, and calculate the time interval between the reception times of two consecutively received images. The time interval between the reception times of two adjacent images may reflect or represent the rate (frame rate) at which the application draws and renders images. The frame rate identifier may determine a frame rate equal to or close to the set frame rate of the application according to the rate at which the application draws and renders images, which will be described below with reference to fig. 6.
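As a sketch of the bookkeeping just described, and under the assumption of a single callback invoked whenever the buffer queue receives an image, the interval between two consecutive reception times can be recorded as follows (all names hypothetical):

```java
// Hypothetical sketch: record reception times and derive the interval
// between two consecutively received images. An interval of about 16.7 ms
// would correspond to roughly 60 fps.
class ReceptionIntervalRecorder {
    private long lastReceiveTimeMs = -1;

    // Called each time the buffer queue receives a rendered image.
    // Returns the interval in milliseconds, or -1 for the very first image.
    long onImageReceived(long receiveTimeMs) {
        long intervalMs = (lastReceiveTimeMs < 0) ? -1 : receiveTimeMs - lastReceiveTimeMs;
        lastReceiveTimeMs = receiveTimeMs;
        return intervalMs;
    }
}
```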
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, among others. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The two-dimensional graphics engine is a two-dimensional drawing engine.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
The workflow of the software and hardware of the electronic device 100 is exemplarily described below in connection with a scenario of adjusting the target frame rate of the game application from the first frame rate to the second frame rate.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the time stamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the example in which the touch operation is a single-tap operation and the control corresponding to the single-tap operation is the control for the second frame rate of the game application, the application can respond to the touch operation and adjust its target frame rate from the first frame rate to the second frame rate.
Next, in some embodiments, a frame rate identification apparatus provided in an embodiment of the present application is described with reference to fig. 4. The device may be a software device included in an operating system of the electronic apparatus 100, i.e. the device belongs to the operating system side.
As shown in fig. 4, the apparatus may include an information acquisition module, a frame stabilizing module, and a frame rate identification module.
Next, application A will be taken as an example for description.
The application A may be an application program that requires dynamic picture drawing and rendering, and may be, for example, a game-type application (Peace Elite, Honor of Kings, or the like). The application may be a third-party application. In one example, the user may download and install the third-party application from an application marketplace or the like.
The information acquisition module can acquire the receiving moment when the operating system receives the image rendered and drawn by the application from the application and transmit the acquired receiving moment to the frame stabilizing module.
The frame stabilizing module can determine the frame length of each image according to the image reception times, and thus determine the measured frame rate of the images, and then adjust computing resources in combination with its current target frame rate so as to achieve frame stabilization. For the specific process of frame stabilization and the method for determining the frame length, reference may be made to the above description, and details are not repeated here.
In some embodiments, a developer of the frame rate recognition apparatus or a developer of the operating system may preset at least one initial target frame rate. When the application a starts to run, the frame stabilizing module may determine one initial target frame rate from at least one initial target frame rate as its current target frame rate.
In one example of these embodiments, the at least one initial target frame rate may be preset according to the type of application A; for example, for a game-type application, the initial target frame rates may be preset to 30 fps, 60 fps, and the like.
In one example of these embodiments, a developer of the frame rate recognition apparatus or a developer of the operating system may use a screen refresh rate of a display screen of the electronic device as a preset initial target frame rate.
Generally, the screen refresh rate of an electronic device is fixed and rarely changes, so the screen refresh rate can be used as the preset initial target frame rate. If the screen refresh rate of the electronic device is changed, for example when the user changes it through a device management program in the operating system, the frame stabilizing module can acquire the changed screen refresh rate and use it as the preset initial target frame rate.
When the application starts to run, the frame stabilizing module may use the preset initial target frame rate as its current target frame rate.
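A minimal sketch of this initialization, assuming a hypothetical helper that exposes the display refresh rate and a preset list for game-type applications:

```java
// Hypothetical sketch: choose the initial target frame rate when the
// application starts to run. The refresh-rate query stands in for whatever
// display-management interface the operating system actually exposes.
class InitialTargetFrameRate {
    static final int[] PRESET_RATES = {30, 60}; // example presets for game-type applications

    static int choose(int screenRefreshRateFps) {
        // Prefer the screen refresh rate, since it is fixed and rarely changes.
        if (screenRefreshRateFps > 0) {
            return screenRefreshRateFps;
        }
        // Otherwise fall back to a preset initial target frame rate.
        return PRESET_RATES[PRESET_RATES.length - 1];
    }
}
```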
The frame stabilizing module may transfer the frame length of each image to the frame rate identification module. Each time it receives the frame length of one image, the frame rate identification module judges whether the frame lengths of N consecutive images have been received. The specific value of N may be preset by a developer of the frame rate identification apparatus or a developer of the operating system, and may be, for example, 50, 100, or 200.
Each time the frame rate identification module has received the frame lengths of N consecutive images, it may determine the measured frame rate E corresponding to these N consecutive images according to the frame lengths of the N images. The measured frame rate E may then be compared with the current target frame rate to determine a new target frame rate. The new target frame rate may be equal to the set frame rate of application A, or closer to the set frame rate of application A than the current target frame rate is. The frame rate identification module may transmit the new target frame rate to the frame stabilizing module, so that at the next moment the frame stabilizing module performs frame stabilization with the new target frame rate as its current target frame rate. The specific process will be described below with reference to fig. 6.
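The interaction between the two modules might be wired as in the following sketch, which buffers N frame lengths, derives the measured frame rate E, and hands back a new target frame rate. The decision step is only a placeholder here; the comparison logic of step 611 is sketched at the end of this section. All names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of one identification round: collect the frame
// lengths of N consecutive images, derive the measured frame rate E, and
// update the target frame rate used by the frame stabilizing module.
class FrameRateIdentificationModule {
    static final int N = 100; // may be preset to, e.g., 50, 100, or 200
    private final List<Long> frameLengthsMs = new ArrayList<>();
    private int currentTargetFps;

    FrameRateIdentificationModule(int initialTargetFps) {
        this.currentTargetFps = initialTargetFps;
    }

    // Called by the frame stabilizing module with each image's frame length.
    // Returns the (possibly updated) target frame rate.
    int onFrameLength(long frameLengthMs) {
        frameLengthsMs.add(frameLengthMs);
        if (frameLengthsMs.size() == N) {
            double avgMs = frameLengthsMs.stream()
                    .mapToLong(Long::longValue).average().orElse(0);
            int measuredFps = (int) Math.round(1000.0 / avgMs); // measured frame rate E
            currentTargetFps = determineNewTarget(measuredFps, currentTargetFps);
            frameLengthsMs.clear(); // begin the next identification round
        }
        return currentTargetFps;
    }

    private int determineNewTarget(int measuredFps, int targetFps) {
        // Placeholder for the comparison logic of step 611 (conditions 1-4).
        return measuredFps;
    }
}
```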
It should be noted that the frame length calculation and transfer process has been introduced above using the example in which the information acquisition module acquires the reception times, and the frame stabilizing module calculates the frame lengths and sends them to the frame rate identification module. Other arrangements are possible.
In some embodiments, the information acquisition module may acquire the receiving time of the image, calculate the frame length of the image according to the receiving time of the image, transfer the frame length of the image to the frame stabilizing module, and transfer the frame length of the image to the frame rate identification module by the frame stabilizing module.
In some embodiments, the information acquisition module may acquire the receiving time of the image, calculate the frame length of the image according to the receiving time of the image, and transmit the frame length of the image to the frame stabilizing module and the frame rate identification module, respectively.
In some embodiments, the information acquisition module may transmit the receiving time of the acquired image to the frame stabilizing module and the frame rate identification module, respectively. The frame stabilizing module and the frame rate identification module respectively calculate the frame length of the image according to the receiving time of the image.
The frame rate identification apparatus provided in this embodiment of the application can, in the case where the operating system does not know, or does not accurately know, the set frame rate of the application, quickly identify a new target frame rate equal to or close to the set frame rate of the application, and can transmit the new target frame rate to the frame stabilizing module, so that the frame stabilizing module adjusts computing resources according to the new target frame rate. This improves the user's experience of the application and avoids or reduces unnecessary power consumption overhead.
Next, in some embodiments, with reference to fig. 5, an implementation principle of the frame rate identification method provided in the embodiments of the present application is described.
As shown in FIG. 5, the software framework of the electronic device may include an application and an operating system. The application may be a third-party application. The application can comprise a frame rate setting module and a drawing and rendering module. The frame rate setting module may set the frame rate in response to a user-initiated operation; the frame rate set by the application may be referred to as the set frame rate of the application. The drawing and rendering module may occupy or call a majority, or the vast majority, of the computing resources provided by the operating system to draw and render images according to the set frame rate. Therefore, the rate or frame rate at which the drawing and rendering module draws and renders images is limited by the set frame rate and by the computing resources provided by the operating system.
The operating system can comprise an image receiving module, a frame rate identification device and a computing resource providing module.
The image receiving module may receive the images drawn and rendered by the application. In some examples, the image receiving module may be the surface manager shown in fig. 2. In some examples, the image receiving module may be the buffer queue described above.
The frame rate identification device can comprise an information acquisition module, a frame stabilizing module and a frame rate identification module. The information acquisition module may acquire a reception time at which each image is received by the image reception module. The frame stabilizing module may determine a frame length of the image according to an interval of the receiving time of the image, and execute a frame stabilizing scheme according to the frame length of the image and the current target frame rate. The current target frame rate may be a preset initial target frame rate, or may be a frame rate recently acquired by the frame stabilizing module from the frame rate identification module. The frame stabilizing module may transmit the frame length of the image to the frame rate identification module, so that the frame rate identification module determines a new target frame rate. The frame rate identification module may pass the new target frame rate determined by the frame rate identification module to the frame stabilizing module. The functions of the modules of the frame rate identification apparatus can refer to the description of the embodiment shown in fig. 4.
The computing resource providing module provides computing resources according to the frame stabilization scheme to support the running of the electronic device. A majority, or the vast majority, of the resources provided by the computing resource providing module are occupied or called by the drawing and rendering module to draw and render images.
In this embodiment of the application, in the case where a third-party application does not inform the operating system of its set frame rate, or informs it inaccurately, a frame rate equal to or close to the set frame rate of the application can be quickly determined and transmitted to the frame stabilizing module, so that the frame stabilizing module adjusts computing resources according to the determined frame rate. This improves the user's experience of the application and avoids or reduces unnecessary power consumption overhead.
Next, referring to fig. 6, a frame rate identification method provided in the embodiment of the present application is illustrated, where the method may be applied to the electronic device 100 shown in fig. 1, and may be specifically executed by an operating system of the electronic device 100.
As shown in fig. 6, the method may include steps 601 to 613a (or 613b). The details are as follows.
Step 601: the operating system may perform frame stabilization for the image drawing and rendering performed by application A according to the current frame rate F.
The application a may be an application program that needs to perform dynamic image rendering, and specific reference may be made to the above description, which is not described herein again.
The operating system can adjust the computing resources it provides, taking the current frame rate F as the target frame rate for frame stabilization, so as to achieve frame stabilization. For the specific implementation of frame stabilization, reference may be made to the above description, and details are not repeated here.
In some embodiments, the method shown in fig. 6, that is, step 601 to step 613a (or step 613b), may be executed by the electronic device in a loop, where each pass of the loop may be referred to as a round of the identification process, or an identification period.
If the current round of identification process (or the current identification period) is a non-first round of identification process (or a non-first identification period) after the application a starts to operate, the current frame rate F in the current round of identification process (or the current identification period) is a new target frame rate determined by the previous round of identification process (or the previous identification period). The previous round of identification process (or the previous identification period) is a previous round of identification process adjacent to the present round of identification process (or the present identification period).
If the current round of identification process (or the current identification period) is the first round of identification process (the first identification period) after the application a starts to operate, the current frame rate F of the current round of identification process may be the preset initial target frame rate. For the preset initial target frame rate, reference may be made to the above description of the embodiment shown in fig. 4, and details are not described here again.
In some embodiments, step 601 of the first round of the recognition process (or first recognition cycle) may begin after the operating system receives at least one image from application A.
Step 603: record the frame length of the nth image that the operating system receives from application A, where the frame length of the nth image is determined by the reception time at which the operating system receives the nth image from application A and the reception time of the (n-1)th image. The nth image and the (n-1)th image are images drawn and rendered by application A.
The nth image and the (n-1) th image are two images which are sequentially received by the operating system from the application A, namely the (n-1) th image is adjacent to the nth image and is a previous image of the nth image. Or, the reception time of the (n-1) th image is adjacent to and before the reception time of the nth image.
The operating system may obtain the reception time of the (n-1)th image from application A and the reception time of the nth image. In an example, the operating system may obtain the enqueue time of the buffer corresponding to the (n-1)th image and the enqueue time of the buffer corresponding to the nth image, where the enqueue time may serve as the reception time; for details, refer to the description of the embodiment shown in fig. 2 above, which is not repeated here.
The operating system may calculate a time interval between the reception timing of the nth image and the reception timing of the (n-1) th image, and regard the calculated time interval as a frame length of the nth image. For example, if the reception time of the nth image is 80ms after the application a is started, and the reception time of the (n-1) th image is 60ms after the application a is started, the frame length of the nth image is 20 ms.
Each time step 603 is performed, step 605 may be performed to determine whether the frame lengths of N images have been recorded.
N may be a predetermined integer, such as 50, 100, 200, etc., and is not listed here.
If the frame lengths of N images have not yet been recorded, step 603 is executed again.
In some embodiments, if frame lengths of N images have been recorded, step 607 is performed to remove the abnormal frame length and smooth the fluctuation of the frame length.
An abnormal frame length refers to an obviously abnormal frame length caused by program bugs of the application, network anomalies, and the like. In a specific implementation, a developer may preset an anomaly-removal threshold according to experience or experiment, and remove any frame length that exceeds the threshold.
Smoothing frame length fluctuation means replacing the original frame lengths of m consecutive images with the average value of those m frame lengths, where m is an integer greater than 1. For example, among the N images, if the frame lengths of the 9th, 10th, and 11th images are 40 ms, 60 ms, and 50 ms, respectively, they are represented by the average of 40 ms, 60 ms, and 50 ms, namely 50 ms; that is, after smoothing, the frame lengths of the 9th, 10th, and 11th images are 50 ms, 50 ms, and 50 ms, respectively.
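The two operations of step 607 can be sketched as follows; the outlier threshold and the group size m are developer-chosen values, as described above, and the names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of step 607: drop abnormal frame lengths, then smooth
// fluctuation by replacing each group of m frame lengths with the group average.
class FrameLengthCleaner {
    static List<Double> clean(List<Long> frameLengthsMs, long outlierThresholdMs, int m) {
        // 1) Remove obviously abnormal frame lengths (program bugs, network anomalies).
        List<Long> kept = new ArrayList<>();
        for (long lenMs : frameLengthsMs) {
            if (lenMs <= outlierThresholdMs) {
                kept.add(lenMs);
            }
        }
        // 2) Smooth: e.g. 40 ms, 60 ms, 50 ms -> 50 ms, 50 ms, 50 ms for m = 3.
        List<Double> smoothed = new ArrayList<>();
        for (int i = 0; i < kept.size(); i += m) {
            int end = Math.min(i + m, kept.size());
            double sum = 0;
            for (int j = i; j < end; j++) {
                sum += kept.get(j);
            }
            double avg = sum / (end - i);
            for (int j = i; j < end; j++) {
                smoothed.add(avg);
            }
        }
        return smoothed;
    }
}
```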
After step 607, step 609 may be performed.
In some embodiments, in step 605, if the frame length of N images has been recorded, step 609 may be performed directly.
In step 609, the average frame length and the measured frame rate FD1 corresponding to the average frame length may be determined.
An average of the frame lengths of the N images, whether processed in step 607 or not, may be calculated, and the measured frame rate FD1 corresponding to the N images may be determined according to this average frame length.
Specifically, the measured frame rate FD1 corresponding to the N images can be obtained by dividing a duration of 1000 milliseconds by the average frame length. In one example, if the result of dividing the 1000 millisecond duration by the average frame length is not an integer, the result may be rounded. The rounding rule may be rounding up, rounding down, or rounding to the nearest integer.
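As a worked sketch of this computation: with an average frame length of 33.4 ms, 1000 / 33.4 ≈ 29.94, which rounds to 30 fps under round-to-nearest. In Java (names hypothetical):

```java
import java.util.List;

// Hypothetical sketch of step 609: FD1 = 1000 ms / average frame length.
class MeasuredFrameRate {
    static int fromFrameLengths(List<Double> frameLengthsMs) {
        double avgMs = frameLengthsMs.stream()
                .mapToDouble(Double::doubleValue).average().orElse(Double.NaN);
        // e.g. average 33.4 ms -> 1000 / 33.4 ≈ 29.94 -> 30 fps (round to nearest)
        return (int) Math.round(1000.0 / avgMs);
    }
}
```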
In some embodiments, the frame rate range from 0 fps to the highest settable frame rate may be divided into a plurality of preset frame rate intervals, where the interval from 0 to the lowest settable frame rate is one of the preset frame rate intervals. The lowest settable frame rate may be regarded as a floor frame rate: generally, if the target frame rate of application A falls below the lowest settable frame rate, the user experience of the dynamic picture of application A is poor. For example, if application A is a game-type application, the lowest settable frame rate is typically 20 fps. Typically, the highest settable frame rate may be equal to the refresh rate of the display screen of the electronic device, e.g., 60 fps. Generally, there are multiple settable frame rates between the lowest settable frame rate and the highest settable frame rate, and the difference between adjacent settable frame rates may be referred to as the frame rate adjustment step. A settable frame rate is a frame rate to which the application can adjust its target frame rate in response to a user-initiated operation; in Peace Elite, for example, 20 fps and 25 fps are settable frame rates, while 21 fps and the like are not.
In some embodiments, settable frame rates of various application types may be counted, so as to divide the frame rate interval according to the counting result.
In some embodiments, the settable frame rates for different application types may be preset based on experience or research analysis for multiple applications.
In one illustrative example, the settable frame rates of the game-like applications may be preset to 20fps, 25fps, 30fps, 35fps, 40fps, 45fps, 50fps, 55fps, 60 fps.
In one illustrative example, any one of a plurality of frame rate sections divided between a lowest settable frame rate to a highest settable frame rate may include one settable frame rate.
In one example, in any frame rate interval, the settable frame rate included in the interval is the upper limit of the interval. Specifically, taking application A as a game-type application as an example, the lowest settable frame rate of a game-type application is usually 20 fps, and the highest settable frame rate is usually 60 fps. The frame rate adjustment step is typically 5 fps; Peace Elite, for example, allows the user to adjust the target frame rate in integer multiples of 5 fps. The plurality of preset frame rate intervals for a game-type application may then be (0, 20], (20, 25], (25, 30], (30, 35], (35, 40], (40, 45], (45, 50], (50, 55], (55, 60].
In one example, in any frame rate interval, the settable frame rate included in the interval is near the upper limit of the interval. Specifically, taking a game-type application as an example, the plurality of preset frame rate intervals may be (0, 21], (21, 26], (26, 31], (31, 36], (36, 41], (41, 46], (46, 51], (51, 56], (56, 61].
It is then determined into which interval the result of dividing 1000 milliseconds by the average frame length of the N images falls, and the settable frame rate included in that interval is taken as the measured frame rate FD1. For example, when the settable frame rate is the upper limit of the frame rate interval, the upper limit of that interval is taken as the measured frame rate FD1. When the settable frame rate is a frame rate near the upper limit of the frame rate interval, the settable frame rate near the upper limit of that interval is taken as the measured frame rate FD1.
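This interval lookup can be sketched as below for the game-application example in which the upper bound of each interval is the settable frame rate. For instance, a raw result of 29.4 falls into (25, 30], so FD1 = 30 fps. Names and values are illustrative only.

```java
// Hypothetical sketch: snap a raw measured rate (1000 ms / average frame
// length) to the settable frame rate of the preset interval it falls into.
// Intervals for a game-type application: (0,20], (20,25], ..., (55,60].
class FrameRateSnapper {
    static final int[] SETTABLE_UPPER_BOUNDS = {20, 25, 30, 35, 40, 45, 50, 55, 60};

    static int snap(double rawFps) {
        for (int upperFps : SETTABLE_UPPER_BOUNDS) {
            if (rawFps <= upperFps) {
                return upperFps; // the interval's upper bound is the settable frame rate
            }
        }
        // Above the highest settable frame rate: clamp to it.
        return SETTABLE_UPPER_BOUNDS[SETTABLE_UPPER_BOUNDS.length - 1];
    }
}
```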
In some embodiments, in step 609, the number N1 of images whose measured frame rate is higher than the threshold Y1 may also be determined.
The measured frame rate of any single image can be obtained by dividing a duration of 1000 milliseconds by the frame length of that image; the measured frame rate of each image can then be compared with threshold Y1 to obtain the number N1 of images, among the N images, whose measured frame rate is higher than threshold Y1. In one example, the result of dividing the 1000 millisecond duration by the frame length of an image may be rounded; the rounding may be rounding up, rounding down, or rounding to the nearest integer.
The threshold Y1 may be the sum of the current frame rate F and a preset frame rate H, which may be an integer such as 2 or 3. In an example, the preset frame rate H may specifically be a frame rate smaller than the frame rate adjustment step. Taking application A as Peace Elite as an example, the preset frame rate H is a frame rate less than 5 fps, and its value may be an integer.
In one example, the preset frame rate H may be associated with the frame rate F, that is, different frame rates F may correspond to different preset frame rates H. For example, when the frame rate F is 60 fps, the preset frame rate H may be 4 fps; when the frame rate F is 40 fps, the preset frame rate H may be 3 fps; and so on, which are not listed here.
In some embodiments, in step 609, the number N2 of images whose measured frame rate is lower than the threshold Y2 may also be determined.
The measured frame rate of each image can be compared with threshold Y2 to obtain the number N2 of images, among the N images, whose measured frame rate is lower than threshold Y2.
The threshold Y2 may be the difference of the current frame rate F minus a preset frame rate K, where the preset frame rate K may be an integer such as 2 or 3. In one example, the preset frame rate K may be equal to the preset frame rate H. In an example, the preset frame rate K may specifically be a frame rate smaller than the frame rate adjustment step. Taking application A as Peace Elite as an example, the preset frame rate K is a frame rate less than 5 fps, and its value is an integer.
In one example, the preset frame rate K may be associated with the frame rate F, that is, different frame rates F may correspond to different preset frame rates K. For example, when the frame rate F is 60 fps, the preset frame rate K may be 4 fps; when the frame rate F is 40 fps, the preset frame rate K may be 3 fps; and so on, which are not listed here.
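The counts N1 and N2 described above can be obtained in a single pass over the N frame lengths, as in the following sketch; H and K are the preset margins, and the names are hypothetical.

```java
import java.util.List;

// Hypothetical sketch: count the images whose per-image measured frame rate
// exceeds Y1 = F + H (count N1) or falls below Y2 = F - K (count N2).
class ThresholdCounter {
    static int[] countN1N2(List<Long> frameLengthsMs, int currentFps, int presetH, int presetK) {
        int y1 = currentFps + presetH; // threshold Y1
        int y2 = currentFps - presetK; // threshold Y2
        int n1 = 0;
        int n2 = 0;
        for (long lenMs : frameLengthsMs) {
            int fps = (int) Math.round(1000.0 / lenMs); // per-image measured frame rate
            if (fps > y1) n1++;
            if (fps < y2) n2++;
        }
        return new int[] {n1, n2};
    }
}
```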
In some embodiments, at step 609, an average value FL of the sleep time of the rendering thread may also be determined.
The operating system may obtain the sleep duration of the rendering thread of application A while application A draws and renders images. Taking the nth image as an example, the operating system obtains the activity duration of the rendering thread of application A detected after the (n-1)th image is received and before the nth image is received, and calculates the difference between the frame length of the nth image and this activity duration, thereby obtaining the sleep duration of the rendering thread corresponding to the nth image.
More specifically, taking the Android system as an example, the kernel layer can obtain the running conditions of each thread running on the CPU, including the rendering thread of application A, so that the activity duration of the rendering thread of application A can be detected. Generally speaking, an application such as a game may be composed of a plurality of rendering threads; the activity duration of the rendering thread here specifically refers to the activity duration of the main rendering thread, i.e., the rendering thread with the longest activity duration among the plurality of rendering threads.
The average sleep duration FL may be calculated from the sleep times of the rendering threads of the respective images.
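Under the assumption that the kernel reports a per-image activity duration for the main rendering thread, the per-image sleep durations and their average FL could be computed as follows (names hypothetical):

```java
import java.util.List;

// Hypothetical sketch: per-image sleep duration of the main rendering thread
// = frame length - detected activity duration; FL is the average over the images.
class SleepDurationEstimator {
    static double averageSleepMs(List<Long> frameLengthsMs, List<Long> activeDurationsMs) {
        int n = Math.min(frameLengthsMs.size(), activeDurationsMs.size());
        if (n == 0) {
            return 0;
        }
        double totalSleepMs = 0;
        for (int i = 0; i < n; i++) {
            totalSleepMs += frameLengthsMs.get(i) - activeDurationsMs.get(i);
        }
        return totalSleepMs / n; // average sleep duration FL
    }
}
```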
In some embodiments, in step 611, it may be determined whether condition 1 is satisfied. The condition 1 is: FD1 < F, and FL > P1, and N2 > P2.
If the condition 1 is satisfied, in step 613a, the measured frame rate FD1 is set as a new frame rate F.
It will be readily appreciated that, when the current frame rate F is higher than the set frame rate of application A, the computing resources provided by the operating system are over-provisioned, as can be appreciated from the above description of the frame stabilization scheme. Specifically, application A is limited by its lower set frame rate, so the frame rate at which it draws and renders images is lower than the frame rate F. On the operating system side, since the frame rate at which application A draws and renders images (which may be represented by the measured frame rate of the rendered images) is lower than the current frame rate F, the operating system further increases the computing resources. Application A can then use sufficient computing resources to draw and render the N images at or near its set frame rate; that is, the measured frame rate FD1 is likely to be equal to the set frame rate of application A.
Further, since the supply of computing resources is sufficient, application A can finish rendering each image quickly; and since the set frame rate of application A is low, the time interval between the start-of-rendering times of adjacent images is long. These two factors together cause the sleep duration of the rendering thread to become longer.
Furthermore, because application A is limited by its relatively low set frame rate (that is, the set frame rate of application A is the factor limiting how fast it draws and renders images), the per-image measured frame rate of the rendered images is relatively low even though computing resources are sufficient, and the number of images whose measured frame rate is lower than threshold Y2 increases.
Therefore, when FD1 < F, FL > P1, and N2 > P2, it can be determined that the measured frame rate FD1 is equal to or close to the set frame rate of application A.
P1 and P2 may be values preset empirically or experimentally; for example, P1 may be 10 ms and P2 may be 20. When the values of P1, P2, and the like are preset experimentally, the following experimental protocol can be employed.
Taking a game-type application as an example, the operating system may be set to stabilize frames at a higher frame rate (e.g., 50 fps) while the set frame rate of the application is set to a lower frame rate (e.g., 30 fps), and characteristic values such as the measured frame rate FD1, the average sleep duration FL of the rendering thread, and the number N2 of images with a measured frame rate lower than threshold Y2 during the running of the game application can be observed. The frame rate at which the operating system stabilizes frames can then be set equal to the set frame rate of the application, and the same characteristic values can be observed again. The values of P1 and P2 are determined by comparing the differences in these characteristic values between the two groups of experiments.
In some embodiments, in step 611, it may be determined whether condition 2 is satisfied, where condition 2 is: FD1 < F, and N2 > P3, and FD1 = FD2 = FD3.
FD2 and FD3 are the measured frame rates corresponding to the average frame lengths of the images determined in the two rounds of the identification process (the two identification periods) preceding the current round (the current identification period). The previous two rounds and the current round are three consecutive rounds of the identification process (three consecutive identification periods). The target frame rate for frame stabilization in the previous two rounds is equal to the target frame rate for frame stabilization in the current round; that is, the frame rate F in the previous two rounds is equal to the frame rate F in the current round.
If the condition 2 is satisfied, in step 613a, the measured frame rate FD1 is set as a new frame rate F.
It will be readily appreciated that, when the current frame rate F is higher than the set frame rate of application A, the computing resources provided by the operating system are over-provisioned, as can be appreciated from the above description of the frame stabilization scheme. Specifically, application A is limited by its lower set frame rate, so the frame rate at which it draws and renders images is lower than the frame rate F. On the operating system side, since the frame rate at which application A draws and renders images (which may be represented by the measured frame rate of the rendered images) is lower than the current frame rate F, the operating system further increases the computing resources. Application A can then use sufficient computing resources to draw and render the N images at or near its set frame rate; that is, the measured frame rate FD1 is likely to be equal to or close to the set frame rate of application A.
Further, if the actual frame rates corresponding to the average frame lengths of the images during the three consecutive recognition rounds are all equal, it may be said that the actual frame rate FD1 is approximately equal to or close to the set frame rate of application a.
Moreover, because application A is limited by its relatively low set frame rate (that is, the set frame rate of application A is the factor limiting how fast it draws and renders images), the frame rate of the rendered images is relatively low even though computing resources are sufficient, and the number of images whose measured frame rate is lower than threshold Y2 increases. When the number of images with a measured frame rate lower than threshold Y2 is larger than P3, this further reflects that the set frame rate of application A is lower than the frame rate F. P3 can be preset empirically or experimentally; for example, P3 can be 10. Where the value of P3 is preset experimentally, the experimental procedure may refer to the above description of condition 1.
In some embodiments, in step 611, it may be determined whether condition 3 is satisfied, where condition 3 is: FD1 > F.
In an illustrative example, if condition 3 is satisfied, the measured frame rate FD1 may be used as the new frame rate F.
In an illustrative example, if condition 3 is satisfied, a first frame rate obtained by adding the preset frame rate S to the current frame rate F may be used as the new frame rate F. In one example, the preset frame rate S may be equal to the difference between adjacent settable frame rates. In one example, the preset frame rate S may be equal to the length of one of the preset frame rate intervals, where that interval is any interval other than the interval from 0 fps to the lowest settable frame rate. Taking application A as a game-type application as an example, the length of any interval other than the interval from 0 fps to the lowest settable frame rate is 5 fps, i.e., the preset frame rate S may be 5 fps.
In an illustrative example, if the condition 3 is satisfied, when the measured frame rate FD1 is not equal to the first frame rate, the larger of the measured frame rate FD1 and the first frame rate may be used as the new frame rate F. The first frame rate may be described with reference to the previous example, and is not described herein again.
It is easy to understand that when the current frame rate F is lower than the set frame rate of application a, the operating system stabilizes frames for the image drawing and rendering performed by application a according to the current frame rate F, so the computing resources available to application a while it renders the N images are insufficient, and its rendering frame rate falls below its set frame rate. Even so, the measured frame rate FD1 of the N images is higher than the current frame rate F, which indicates that the set frame rate of application a is higher than the frame rate F.
Because computing resources are insufficient while application a renders the N images, its rendering frame rate is lower than its set frame rate, so a large gap may remain between the measured frame rate FD1 (or the first frame rate) and the set frame rate of application a. The frame rate F can be updated continuously through subsequent rounds of the frame rate identification process (identification periods), so that the updated frame rate F moves closer to, or becomes equal to, the set frame rate of application a. The details are as follows.
The new frame rate F determined in step 613a of the current identification round (identification period) is used as the current frame rate F for frame stabilization in the next round. In step 611 of the next round, the relationship between that round's current frame rate F (i.e., the new frame rate F determined in the current round) and that round's measured frame rate FD1 is evaluated to determine the next new frame rate F, which is closer to the set frame rate of application a than the frame rate F determined in the current round. By analogy, a frame rate F equal to or close to the set frame rate of application a can be determined within a finite number of identification rounds.
Specifically, take the m-th and (m+1)-th identification rounds as an example, where round m+1 immediately follows round m. Suppose the new frame rate F determined in round m is the frame rate Z. The frame rate Z is then used as the current frame rate F in round m+1. In step 611 of round m+1, the measured frame rate FD1 determined in that round is compared with the frame rate Z, and a new frame rate F is determined. The new frame rate F determined in round m+1 is closer to, or equal to, the set frame rate of application a than the current frame rate F. In this way, a frame rate F equal to or close to the set frame rate of application a can be determined through a finite number of identification rounds.
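The round-by-round convergence just described may be sketched as follows (Python; run_round and update_rule are hypothetical callbacks standing in for one identification round and for the step 611/613a decision, respectively):

    # Each round stabilizes frames at the current F, measures FD1, and derives
    # the next F; iteration stops once F no longer changes.
    def identify_set_frame_rate(initial_f, run_round, update_rule, max_rounds=20):
        f = initial_f
        for _ in range(max_rounds):       # a finite number of rounds suffices
            fd1 = run_round(f)            # stabilize at F, render N images, measure FD1
            new_f = update_rule(fd1, f)   # apply the condition checks of step 611
            if new_f == f:                # converged at or near the set frame rate
                break
            f = new_f
        return f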
In some embodiments, in step 611, it may be determined whether condition 4 is satisfied, where condition 4 is: FD1 = F, N1 > N2, and N1 > P4, where N1 is the number of images among the N images whose frame rate is higher than the threshold Y1 and N2 is the number whose frame rate is lower than the threshold Y2. If condition 4 is satisfied, in step 613a the first frame rate, obtained by adding the preset frame rate S to the current frame rate F, may be used as the new frame rate F. For the preset frame rate S, refer to the description of step 613a when condition 3 is satisfied; details are not repeated here.
According to the above frame stabilization scheme, when the operating system detects that the measured frame rate of the images rendered by application a is higher than the target frame rate used for frame stabilization (i.e., the current frame rate F), it reduces the computing resources. It is easy to understand that application a then has insufficient computing resources while rendering the N images, so its rendering frame rate falls below its set frame rate; that is, insufficient resources become the factor limiting how fast application a renders images. If the operating system repeatedly detects that the measured frame rate of the images rendered by application a is higher than the frame rate F, it keeps reducing the computing resources until that measured frame rate approaches or equals the current frame rate F. Conversely, if the operating system detects that the measured frame rate is lower than the frame rate F, it increases the computing resources until the measured frame rate approaches or equals the current frame rate F. Through this process, the measured frame rate of the images rendered by application a stabilizes at the frame rate F, i.e., the measured frame rate FD1 equals the current frame rate F.
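The feedback behavior described in this paragraph may be sketched as follows (Python; the multiplicative resource model and the step size are purely illustrative assumptions, not the actual resource-scheduling policy of the operating system):

    # One control step of the frame-stabilizing loop: nudge computing resources
    # so that the measured frame rate tracks the target frame rate F.
    def stabilize_step(measured_rate, target_f, resources, step=0.05):
        if measured_rate > target_f:
            return resources * (1 - step)   # above target: reduce resources
        if measured_rate < target_f:
            return resources * (1 + step)   # below target: increase resources
        return resources                    # on target: hold steady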
Moreover, it is easy to understand that some of the N images have a lighter load; that is, application a can finish rendering such images without requiring more computing resources. For a lightly loaded image, application a needs fewer computing resources and can render it at a higher frame rate (for example, its set frame rate or a frame rate close to it). Therefore, when the number of images among the N images whose frame rate is higher than the threshold Y1 is greater than P4, this reflects that the set frame rate of application a is greater than the current frame rate F.
Similarly, when the number of images whose frame rate is higher than the threshold Y1 exceeds the number of images whose frame rate is lower than the threshold Y2, the set frame rate of application a may be greater than the current frame rate F. The value of P4 may be preset empirically or experimentally; for example, P4 may be 15. When the value of P4 is set in advance according to experiments, the following scheme may be employed.
For example, for a game application, the operating system may be configured to stabilize frames at a low target frame rate (e.g., 30 fps) while the application's set frame rate is high (e.g., 40 fps), and characteristic values such as the number of images with a frame rate higher than the threshold Y1 and the number with a frame rate lower than the threshold Y2 can be observed while the game runs. The target frame rate for frame stabilization can also be set equal to the application's set frame rate, and the same characteristic values observed. The value of P4 is then determined by comparing the characteristic values observed in the two sets of experiments.
When condition 4 is satisfied, the measured frame rate FD1 hardly represents or reflects the set frame rate of application a; although the new frame rate F is closer to the set frame rate of application a than the current frame rate F, a large gap may remain between them. The frame rate F can be updated continuously through subsequent rounds of the identification process so that it moves closer to, or becomes equal to, the set frame rate of application a. For details, refer to the above description of step 613a when condition 3 is satisfied; they are not repeated here.
If none of conditions 1, 2, 3, and 4 is satisfied, step 613b may be executed, and the current frame rate F is not updated. The current frame rate F is then used as the current frame rate F for the next identification round.
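Pulling conditions 1 to 4 together, the decision of steps 611/613a/613b may be sketched as follows (Python). The pairing of P1 and P2 with condition 1 and of P3 with condition 2 follows the worked example below and is an assumption, as are all field names and default values:

    from dataclasses import dataclass

    @dataclass
    class RoundStats:
        fd1: int             # measured frame rate FD1 of this round (fps)
        recent_fd1: tuple    # FD1 of the last three rounds, for condition 2
        avg_sleep_ms: float  # average render-thread sleep time, for condition 1
        n_fast: int          # images with frame rate above threshold Y1 (N1)
        n_slow: int          # images with frame rate below threshold Y2 (N2)

    def next_target_frame_rate(st, f, s=5, p1=10, p2=20, p3=10, p4=15):
        if st.fd1 < f and st.avg_sleep_ms > p1 and st.n_slow > p2:
            return st.fd1                      # condition 1: set rate is below F
        if st.fd1 < f and len(set(st.recent_fd1)) == 1 and st.n_slow > p3:
            return st.fd1                      # condition 2: set rate is below F
        if st.fd1 > f:
            return max(st.fd1, f + s)          # condition 3: set rate is above F
        if st.fd1 == f and st.n_fast > st.n_slow and st.n_fast > p4:
            return f + s                       # condition 4: probe one step upward
        return f                               # step 613b: keep the current F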
After step 613a or 613b, step 601 may be executed again to start the next identification round. The new frame rate F determined in step 613a of the current round (or the frame rate F left unchanged by step 613b) is used as the target frame rate for frame stabilization in the next round, i.e., as the next round's current frame rate F. Specifically, consider adjacent rounds A and B, where round B is the identification round following round A. Let the new frame rate determined in step 613a of round A be the frame rate F'. In round B, the operating system executes the frame stabilization scheme with the frame rate F' as the current frame rate.
The frame rate identification method provided by the embodiments of this application can quickly identify a frame rate equal to or close to the set frame rate of an application when the operating system does not know, or does not accurately know, that set frame rate, and can use the identified frame rate as the operating system's target frame rate for frame stabilization, which improves the user experience of the application and avoids or reduces unnecessary power consumption overhead.
Next, taking application a as the game Peace Elite as an example, the recognition effect of the frame rate identification method shown in fig. 6 is illustrated.
The value of N can be set to 100, P1 to 10 ms, P2 to 20, P3 to 10, and P4 to 15. The preset frame rate intervals are (0, 20), (21, 25), (26, 30), (31, 35), (36, 40), (41, 45), (46, 50), (51, 55), and (56, 60), in fps.
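The mapping from average frame length to measured frame rate implied by these intervals may be sketched as follows (Python; rounding to whole fps is an assumption made to avoid the gaps between adjacent intervals):

    INTERVALS = [(0, 20), (21, 25), (26, 30), (31, 35), (36, 40),
                 (41, 45), (46, 50), (51, 55), (56, 60)]

    def measured_frame_rate(avg_frame_len_ms):
        rate = round(1000.0 / avg_frame_len_ms)   # average frame length -> fps
        for low, high in INTERVALS:
            if low <= rate <= high:
                return high                       # upper-limit frame rate of the interval
        return INTERVALS[-1][1]                   # clamp rates above 60 fps

For example, an average frame length of 40 ms corresponds to 25 fps, which falls in the interval (21, 25), so the measured frame rate FD1 is 25 fps.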
1000 tests were performed as follows.
While Peace Elite is running, the user can adjust the frame rate from the original 20 fps to 25 fps through Peace Elite's frame rate setting interface; this 25 fps can be called the set frame rate of Peace Elite. With Peace Elite not informing the operating system that its set frame rate had been adjusted to 25 fps, 999 of the 1000 test results showed that the frame rate identification method provided by the embodiments of this application enabled the operating system, within 5 seconds, to stabilize frames for the image drawing and rendering performed by Peace Elite with 25 fps as the target frame rate. That is, the set frame rate of Peace Elite was identified within 5 seconds.
The following tests were also performed 1000 times.
While Peace Elite is running, the user can adjust the frame rate from the original 30 fps to 25 fps through Peace Elite's frame rate setting interface. With Peace Elite not informing the operating system that its set frame rate had been adjusted to 25 fps, 999 of the 1000 test results showed that the method enabled the operating system, within 5 seconds, to stabilize frames for the image drawing and rendering performed by Peace Elite with 25 fps as the target frame rate. That is, the set frame rate of Peace Elite was identified within 5 seconds.
The following tests were also performed 1000 times.
While Peace Elite is running, the user can adjust the frame rate from 40 fps to 60 fps through Peace Elite's frame rate setting interface. With Peace Elite not informing the operating system that its set frame rate had been adjusted to 60 fps, all 1000 of the 1000 test results showed that the method enabled the operating system, within 5 seconds, to stabilize frames for the image drawing and rendering performed by Peace Elite with 60 fps as the target frame rate. That is, the set frame rate of Peace Elite was identified within 5 seconds.
The following tests were also performed 1000 times.
While Peace Elite is running, the user can adjust the frame rate from 40 fps to 30 fps through Peace Elite's frame rate setting interface. With Peace Elite not informing the operating system that its set frame rate had been adjusted to 30 fps, all 1000 of the 1000 test results showed that the method enabled the operating system, within 5 seconds, to stabilize frames for the image drawing and rendering performed by Peace Elite with 30 fps as the target frame rate. That is, the set frame rate of Peace Elite was identified within 5 seconds.
Therefore, with the method shown in fig. 6, even when an application does not inform the operating system of its set frame rate, a frame rate equal to or close to that set frame rate can be identified quickly and accurately: the response time is under 5 seconds, the accuracy is above 99.9%, the false negative rate is below 0.1%, and the identification granularity can be as fine as 5 fps. The set frame rate of a third-party game can therefore be identified without relying on the game's cooperation, the set frame rates of various games can be identified, and, combined with the frame stabilization scheme, the user's gaming experience is effectively improved.
The embodiment of the application provides a frame rate identification method, which can be applied to electronic equipment. Referring to fig. 7, the method includes the following steps.
Step 701, according to the current target frame rate, performing frame stabilization on the image drawing and rendering performed by the first application. Reference may be made in particular to the above description of step 601 in fig. 6.
Step 703, determining frame lengths of N continuous images according to a receiving time of a first application rendering result, where the N continuous images are images rendered by the first application. Specifically, reference may be made to the above description of steps 603 and 605 in fig. 6.
Step 705, determining the measured frame rate according to the frame lengths of the N continuous images. Reference may be made specifically to the above description of step 609 in fig. 6.
Step 707, determining a new target frame rate according to the measured frame rate and the current target frame rate, so as to perform frame stabilization on the image drawing and rendering performed by the first application according to the new target frame rate. Reference may be made specifically to the above description of steps 611 and 613a in fig. 6.
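Steps 703 and 705 may be sketched as follows (Python; receive times are assumed to be millisecond timestamps, and all names are illustrative). The resulting average frame length can then be mapped to the measured frame rate with an interval lookup such as the one sketched earlier:

    # Per-frame lengths are the differences between consecutive receive times
    # of the first application's drawing and rendering results.
    def frame_lengths(receive_times_ms):
        return [t1 - t0 for t0, t1 in zip(receive_times_ms, receive_times_ms[1:])]

    def average_frame_length(receive_times_ms):
        lengths = frame_lengths(receive_times_ms)
        return sum(lengths) / len(lengths)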
In some embodiments, said determining a measured frame rate from frame lengths of said N consecutive images comprises: determining a first interval from a plurality of preset frame rate intervals according to the average frame length of the N continuous images; and determining the actually measured frame rate according to the first interval. Reference may be made specifically to the above description of step 609 in fig. 6.
In one example of these embodiments, determining the measured frame rate according to the first interval comprises: using the upper limit frame rate of the first interval as the measured frame rate.
In some embodiments, said determining a new target frame rate based on said measured frame rate and said current target frame rate comprises: when the actual measurement frame rate is equal to the current target frame rate and the number of first images in the N continuous images is larger than a first threshold value, adding the current target frame rate and a first preset frame rate to obtain a first frame rate, and using the first frame rate as the new target frame rate; the frame rate corresponding to the first type of image is greater than a second frame rate, and the second frame rate is obtained by adding a second preset frame rate to the current target frame rate.
Specifically, reference may be made to the above description of the case corresponding to condition 4 of step 611 in fig. 6.
In some embodiments, said determining a new target frame rate based on said measured frame rate and said current target frame rate comprises: when the actually measured frame rate is equal to the current target frame rate, the number of first-class images in the N continuous images is greater than a first threshold value, and the number of first-class images in the N continuous images is greater than the number of second-class images in the N continuous images, adding the current target frame rate and a first preset frame rate to obtain a first frame rate, and using the first frame rate as the new target frame rate; the frame rate corresponding to the first type of image is greater than a second frame rate, and the second frame rate is obtained by adding a second preset frame rate to the current target frame rate; and the frame rate corresponding to the second type of image is less than a third frame rate, and the third frame rate is obtained by subtracting a third preset frame rate from the current target frame rate.
Specifically, reference may be made to the above description of the case corresponding to condition 4 of step 611 in fig. 6.
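The first-type/second-type classification used by these embodiments may be sketched as follows (Python; identifying the second frame rate with F + S2 and the third frame rate with F - S3, and the default values of S2 and S3, are assumptions made for illustration):

    # Count first-type images (frame rate above the second frame rate) and
    # second-type images (frame rate below the third frame rate).
    def classify_images(per_image_rates, f, s2=5, s3=5):
        second_rate, third_rate = f + s2, f - s3
        n_first = sum(r > second_rate for r in per_image_rates)
        n_second = sum(r < third_rate for r in per_image_rates)
        return n_first, n_second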
In some embodiments, said determining a new target frame rate based on said measured frame rate and said current target frame rate comprises: when the measured frame rate is greater than the current target frame rate, using the measured frame rate as the new target frame rate.
Specifically, reference may be made to the above description of the case corresponding to condition 3 of step 611 in fig. 6.
In some embodiments, said determining a new target frame rate based on said measured frame rate and said current target frame rate comprises: using the higher of the measured frame rate and the first frame rate as the new target frame rate, wherein the first frame rate is obtained by adding the current target frame rate and a first preset frame rate.
Specifically, reference may be made to the above description of the case corresponding to condition 3 of step 611 in fig. 6.
In some embodiments, said determining a new target frame rate based on said measured frame rate and said current target frame rate comprises: when the measured frame rate is smaller than the current target frame rate, using the measured frame rate as the new target frame rate.
Specifically, reference may be made to the above description of the case corresponding to conditions 1 and 2 of step 611 in fig. 6.
In some embodiments, said determining a new target frame rate based on said measured frame rate and said current target frame rate comprises: when the measured frame rate is smaller than the current target frame rate and the average sleep time of rendering threads when the first application renders the N continuous images is larger than a second threshold, using the measured frame rate as the new target frame rate.
Specifically, reference may be made to the above description of the case corresponding to condition 1 of step 611 in fig. 6.
In some embodiments, said determining a new target frame rate based on said measured frame rate and said current target frame rate comprises: when the actually measured frame rate is smaller than the current target frame rate, the average sleep time of rendering threads when the first application performs rendering on the N continuous images is larger than a second threshold, and the number of second images in the N continuous images is larger than a third threshold, using the actually measured frame rate as the new target frame rate; and the frame rate corresponding to the second type of image is smaller than a third frame rate, and the third frame rate is obtained by subtracting a third preset frame rate from the current target frame rate.
Specifically, reference may be made to the above description of the case corresponding to condition 1 of step 611 in fig. 6.
In some embodiments, the measured frame rate comprises at least two measured frame rates; determining a latest target frame rate according to the measured frame rate includes: when the measured frame rate is smaller than the current target frame rate and the at least two measured frame rates are equal, using the measured frame rate as the latest target frame rate.
Specifically, reference may be made to the above description of the case corresponding to condition 2 of step 611 in fig. 6.
In some embodiments, the measured frame rate comprises at least two measured frame rates; determining a latest target frame rate according to the measured frame rate includes: when the actual measurement frame rate is smaller than the current target frame rate, the at least two frame rates are equal, and the number of second images in the N continuous images is larger than a fourth threshold, using the actual measurement frame rate as the latest target frame rate; and the frame rate corresponding to the second type of image is smaller than a third frame rate, and the third frame rate is obtained by subtracting a third preset frame rate from the current target frame rate.
Specifically, reference may be made to the above description of the case corresponding to condition 2 of step 611 in fig. 6.
The frame rate identification method provided by the embodiments of this application can quickly identify a frame rate equal to or close to the set frame rate of an application when the operating system does not know, or does not accurately know, that set frame rate, and can use the identified frame rate as the operating system's target frame rate for frame stabilization, which improves the user experience of the application and avoids or reduces unnecessary power consumption overhead.
The embodiment of the present application provides a frame rate recognition apparatus 800. Referring to fig. 8, the apparatus 800 includes:
a frame stabilizing unit 810, configured to stabilize a frame for image rendering and rendering performed by the first application according to the current target frame rate;
a first determining unit 820, configured to determine frame lengths of N consecutive images according to a receiving time at which a first application rendering result is received, where the N consecutive images are images rendered by the first application rendering;
a second determining unit 830, configured to determine an actual frame rate according to the frame lengths of the N consecutive images;
a third determining unit 840, configured to determine a new target frame rate according to the actual frame rate and the current target frame rate, so as to perform frame stabilization on the image rendering performed by the first application according to the new target frame rate.
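As an illustrative sketch, apparatus 800 may be modeled as a plain class whose methods correspond to units 810-840 (Python; the stabilization hook and update rule are injected as hypothetical callbacks, since the disclosure does not fix their implementation):

    class FrameRateRecognizer:
        def __init__(self, initial_target_f, stabilize_hook, update_rule):
            self.target_f = initial_target_f
            self._stabilize = stabilize_hook  # hypothetical OS frame-stabilizing API
            self._update = update_rule        # the step 611/613a decision rule

        def stabilize(self):                                   # unit 810
            self._stabilize(self.target_f)

        @staticmethod
        def frame_lengths(receive_times_ms):                   # unit 820
            return [b - a for a, b in zip(receive_times_ms, receive_times_ms[1:])]

        @staticmethod
        def measured_rate(lengths_ms):                         # unit 830
            return round(1000.0 / (sum(lengths_ms) / len(lengths_ms)))

        def update_target(self, fd1):                          # unit 840
            self.target_f = self._update(fd1, self.target_f)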
The apparatus 800 provided in the embodiments of the present application has been described above mainly from the perspective of the method flow. It is to be understood that, to implement the above functions, each electronic device includes corresponding hardware structures and/or software modules for performing the respective functions. Those skilled in the art will readily appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented as hardware or as a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments of the present application, functional modules may be divided for the electronic device and the like according to the method embodiment shown in fig. 7; for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module can be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiments of this application is schematic and is merely one kind of logical function division; other division manners are possible in actual implementation.
The apparatus provided by the embodiments of this application can quickly identify a frame rate equal to or close to the set frame rate of an application when the operating system does not know, or does not accurately know, that set frame rate, and can use the identified frame rate as the operating system's target frame rate for frame stabilization, which improves the user experience of the application and avoids or reduces unnecessary power consumption overhead.
The embodiment of the application provides an electronic device. Referring to fig. 9, the electronic device may include a processor 910 and a memory 920. The memory 920 is configured to store computer-executable instructions; when the electronic device runs, the processor 910 executes the computer-executable instructions stored in the memory 920 to cause the electronic device to perform the method shown in fig. 7. Specifically, the processor 910 is configured to: stabilize frames for the image drawing and rendering performed by a first application according to the current target frame rate; determine the frame lengths of N consecutive images according to the receiving times of the first application's rendering results, where the N consecutive images are images rendered by the first application; determine a measured frame rate according to the frame lengths of the N consecutive images; and determine a new target frame rate according to the measured frame rate and the current target frame rate, so as to stabilize frames for the image drawing and rendering performed by the first application according to the new target frame rate.
In some embodiments, the electronic device further includes a communication bus 930, wherein the processor 910 may communicate with the memory 920 via the communication bus 930 to retrieve and execute computer-executable instructions stored by the memory 920.
The specific implementation of each component/device of the electronic device in this embodiment can refer to the method embodiment shown in fig. 7 and is not described here again.
Therefore, under the condition that the set frame rate of the application is unknown or not accurately known by the operating system, the frame rate which is equal to or close to the set frame rate of the application can be quickly identified, and the identified frame rate can be used as the target frame rate of the operating system for stabilizing frames, so that the user experience of the application is improved, and unnecessary power consumption overhead is avoided or reduced.
The method steps in the embodiments of the present application may be implemented by hardware, or by software instructions executed by a processor. The software instructions may consist of corresponding software modules, which may be stored in random access memory (RAM), flash memory, read-only memory (ROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), a register, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an ASIC.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of this application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted via a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium can be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk (SSD)), among others.
It is to be understood that the various numerical references referred to in the embodiments of the present application are merely for descriptive convenience and are not intended to limit the scope of the embodiments of the present application.

Claims (16)

1. A frame rate identification method is characterized by being applied to electronic equipment; the method comprises the following steps:
according to the current target frame rate, performing frame stabilization on image drawing and rendering performed by the first application;
determining the frame length of N continuous images according to the receiving time of a first application drawing rendering result, wherein the N continuous images are images rendered by the first application drawing;
determining an actual measurement frame rate according to the frame lengths of the N continuous images;
and determining a new target frame rate according to the actual measurement frame rate and the current target frame rate so as to perform frame stabilization on the image drawing and rendering performed by the first application according to the new target frame rate.
2. The method of claim 1, wherein determining a measured frame rate from the frame lengths of the N successive images comprises:
determining a first interval from a plurality of preset frame rate intervals according to the average frame length of the N continuous images;
and determining the actually measured frame rate according to the first interval.
3. The method of claim 2, wherein determining the measured frame rate according to the first interval comprises:
using the upper limit frame rate of the first interval as the measured frame rate.
4. The method of claim 1 or 2, wherein determining a new target frame rate based on the measured frame rate and the current target frame rate comprises:
when the actual measurement frame rate is equal to the current target frame rate and the number of first images in the N continuous images is larger than a first threshold value, adding the current target frame rate and a first preset frame rate to obtain a first frame rate, and using the first frame rate as the new target frame rate;
the frame rate corresponding to the first type of image is greater than a second frame rate, and the second frame rate is obtained by adding a second preset frame rate to the current target frame rate.
5. The method of claim 1 or 2, wherein determining a new target frame rate based on the measured frame rate and the current target frame rate comprises:
when the actually measured frame rate is equal to the current target frame rate, the number of first-class images in the N continuous images is greater than a first threshold value, and the number of first-class images in the N continuous images is greater than the number of second-class images in the N continuous images, adding the current target frame rate and a first preset frame rate to obtain a first frame rate, and using the first frame rate as the new target frame rate;
the frame rate corresponding to the first type of image is greater than a second frame rate, and the second frame rate is obtained by adding a second preset frame rate to the current target frame rate; and the frame rate corresponding to the second type of image is less than a third frame rate, and the third frame rate is obtained by subtracting a third preset frame rate from the current target frame rate.
6. The method of claim 1 or 2, wherein determining a new target frame rate based on the measured frame rate and the current target frame rate comprises:
when the measured frame rate is greater than the current target frame rate, using the measured frame rate as the new target frame rate.
7. The method of claim 1 or 2, wherein determining a new target frame rate based on the measured frame rate and the current target frame rate comprises:
using the higher of the actually measured frame rate and the first frame rate as the new target frame rate, wherein the first frame rate is obtained by adding the current target frame rate and a first preset frame rate.
8. The method of claim 1 or 2, wherein determining a new target frame rate based on the measured frame rate and the current target frame rate comprises:
when the measured frame rate is smaller than the current target frame rate, using the measured frame rate as the new target frame rate.
9. The method of claim 1 or 2, wherein determining a new target frame rate based on the measured frame rate and the current target frame rate comprises:
when the actually measured frame rate is smaller than the current target frame rate and the average sleep time of rendering threads when the first application performs rendering on the N continuous images is larger than a second threshold, using the actually measured frame rate as the new target frame rate.
10. The method of claim 1 or 2, wherein determining a new target frame rate based on the measured frame rate and the current target frame rate comprises:
when the actually measured frame rate is smaller than the current target frame rate, the average sleep time of rendering threads when the first application performs rendering on the N continuous images is larger than a second threshold, and the number of second images in the N continuous images is larger than a third threshold, using the actually measured frame rate as the new target frame rate;
wherein the frame rate corresponding to the second type of image is less than a third frame rate, and the third frame rate is obtained by subtracting a third preset frame rate from the current target frame rate.
11. The method of claim 1 or 2, wherein the measured frame rate comprises at least two measured frame rates; determining a latest target frame rate according to the measured frame rate includes:
when the measured frame rate is smaller than the current target frame rate and the at least two measured frame rates are equal, using the measured frame rate as the latest target frame rate.
12. The method of claim 1 or 2, wherein the measured frame rate comprises at least two measured frame rates; determining a latest target frame rate according to the measured frame rate includes:
when the actual measurement frame rate is smaller than the current target frame rate, the at least two measured frame rates are equal, and the number of second images in the N continuous images is larger than a fourth threshold, using the actual measurement frame rate as the latest target frame rate;
wherein the frame rate corresponding to the second type of image is less than a third frame rate, and the third frame rate is obtained by subtracting a third preset frame rate from the current target frame rate.
13. An apparatus for frame rate identification, the apparatus comprising:
the frame stabilizing unit is used for stabilizing frames of image drawing and rendering performed by the first application according to the current target frame rate;
a first determining unit, configured to determine frame lengths of N consecutive images according to a receiving time at which a first application rendering result is received, where the N consecutive images are images rendered by the first application rendering;
a second determining unit, configured to determine an actual frame rate according to the frame lengths of the N continuous images;
and a third determining unit, configured to determine a new target frame rate according to the actual measurement frame rate and the current target frame rate, so as to perform frame stabilization on image rendering and rendering performed by the first application according to the new target frame rate.
14. An electronic device comprising a processor, a memory;
the memory is configured to store computer-executable instructions; when the electronic device runs, the processor executes the computer-executable instructions stored in the memory to cause the electronic device to perform the method of any one of claims 1-12.
15. A computer storage medium comprising computer instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-12.
16. A computer program product comprising program code for performing the method of any one of claims 1-12 when executed by a processor in an electronic device.
CN201910888838.1A 2019-09-19 2019-09-19 Frame rate identification method and electronic equipment Pending CN112516590A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910888838.1A CN112516590A (en) 2019-09-19 2019-09-19 Frame rate identification method and electronic equipment
PCT/CN2020/108714 WO2021052070A1 (en) 2019-09-19 2020-08-12 Frame rate identification method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910888838.1A CN112516590A (en) 2019-09-19 2019-09-19 Frame rate identification method and electronic equipment

Publications (1)

Publication Number Publication Date
CN112516590A (en) 2021-03-19

Family

ID=74884005

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910888838.1A Pending CN112516590A (en) 2019-09-19 2019-09-19 Frame rate identification method and electronic equipment

Country Status (2)

Country Link
CN (1) CN112516590A (en)
WO (1) WO2021052070A1 (en)


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8291460B1 (en) * 2010-02-12 2012-10-16 Adobe Systems Incorporated Rate adaptation based on dynamic performance monitoring
US8615160B2 (en) * 2010-06-18 2013-12-24 Adobe Systems Incorporated Media player instance throttling
CN107610039A (en) * 2016-07-12 2018-01-19 联发科技股份有限公司 Image processing method and image processing apparatus
CN106603543B (en) * 2016-12-22 2019-08-09 努比亚技术有限公司 Correct the synchronous method and device of stream medium audio and video

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103366391A (en) * 2013-06-26 2013-10-23 广州市动景计算机科技有限公司 Picture rendering method and picture rendering device of dynamic picture
WO2018008426A1 (en) * 2016-07-08 2018-01-11 ソニーセミコンダクタソリューションズ株式会社 Signal processing device and method, and imaging device
CN107786748A (en) * 2017-10-31 2018-03-09 广东欧珀移动通信有限公司 Method for displaying image and equipment
KR20190096685A (en) * 2018-02-09 2019-08-20 삼성전자주식회사 Display apparatus and control method for the same
CN108762465A (en) * 2018-03-27 2018-11-06 广东欧珀移动通信有限公司 Frame per second self-adapting regulation method, device, storage medium and intelligent terminal
CN109165103A (en) * 2018-10-15 2019-01-08 Oppo广东移动通信有限公司 Frame rate control method, device, terminal and storage medium
CN109857559A (en) * 2019-01-25 2019-06-07 维沃移动通信有限公司 Terminal control method and terminal
CN109800141A (en) * 2019-01-28 2019-05-24 Oppo广东移动通信有限公司 Determination method, apparatus, terminal and the storage medium of GPU performance bottleneck

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022100646A1 (en) * 2020-11-16 2022-05-19 深圳市万普拉斯科技有限公司 Picture frame rate adjustment method for electronic device, electronic device, and storage medium
CN115904184A (en) * 2021-09-30 2023-04-04 荣耀终端有限公司 Data processing method and related device
CN115904184B (en) * 2021-09-30 2024-03-19 荣耀终端有限公司 Data processing method and related device
CN114442792A (en) * 2022-02-09 2022-05-06 北京小米移动软件有限公司 Method and device for adjusting operating frequency of processor and storage medium
CN115278366A (en) * 2022-09-28 2022-11-01 天津卓朗昆仑云软件技术有限公司 Data processing method and device for video stream of virtual machine and electronic equipment
CN115278366B (en) * 2022-09-28 2023-03-24 天津卓朗昆仑云软件技术有限公司 Data processing method and device for video stream of virtual machine and electronic equipment
CN115665482A (en) * 2022-11-09 2023-01-31 腾讯科技(深圳)有限公司 Video rendering method and device, computer equipment and storage medium

Also Published As

Publication number Publication date
WO2021052070A1 (en) 2021-03-25


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210319