CN117499739A - Frame rate control method, device, computer equipment and storage medium - Google Patents

Frame rate control method, device, computer equipment and storage medium

Info

Publication number
CN117499739A
CN117499739A
Authority
CN
China
Prior art keywords
target
scenes
scene
frame rate
display frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410002823.1A
Other languages
Chinese (zh)
Inventor
徐士立
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202410002823.1A priority Critical patent/CN117499739A/en
Publication of CN117499739A publication Critical patent/CN117499739A/en
Pending legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44012Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses a frame rate control method, a device, computer equipment and a storage medium, which can be applied to various scenarios such as game, shopping and multimedia resource scenes. The method comprises: acquiring m initial scenes in a target application; determining n target scenes corresponding to the target application based on the m initial scenes and a duration threshold; determining n actual display frame rates based on the n target scenes and preset priorities; and, if any one of the n actual display frame rates does not reach the target display frame rate, performing frame interpolation processing on the target scene corresponding to that actual display frame rate, so that the actual display frame rate of each of the n target scenes reaches the target display frame rate. By applying frame interpolation only to scenes whose actual display frame rate falls short of the target display frame rate, the method balances user experience against terminal operating safety.

Description

Frame rate control method, device, computer equipment and storage medium
Technical Field
The present invention relates to the field of terminal technologies, and in particular, to a frame rate control method, a device, a computer device, and a storage medium.
Background
With the development of terminal technology, terminals have become indispensable tools in people's daily life and work. A terminal supports the operation of various applications, and each scene in those applications, such as a game scene or a video scene, requires high-frame-rate support so that the user can browse every scene smoothly.
Currently, all scenes in an application are generally run at a high frame rate so that the user can browse them smoothly.
However, running every scene at a high frame rate may cause the terminal temperature to rise excessively, compromising the terminal's operating safety; a balance between user experience and terminal operating safety cannot be achieved.
Disclosure of Invention
The main purpose of the application is to provide a frame rate control method, a device, a computer device and a storage medium, which can realize the balance between user experience and terminal operation safety.
In order to achieve the above object, in a first aspect, the present application provides a frame rate control method, including:
obtaining m initial scenes in a target application, wherein m is an integer greater than 1;
determining n target scenes corresponding to the target application based on m initial scenes and a duration threshold, wherein n is an integer greater than or equal to m;
determining n actual display frame rates based on the n target scenes and the preset priority, wherein the n actual display frame rates comprise the actual display frame rate of each of the n target scenes;
and if any one of the n actual display frame rates does not reach the target display frame rate, performing frame interpolation processing on the target scene corresponding to that actual display frame rate, so that the actual display frame rate of each of the n target scenes reaches the target display frame rate.
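As a non-authoritative sketch of the four claimed steps (all function names and the callback decomposition are hypothetical, not taken from the application), the overall flow could look like this in Python:

```python
def control_frame_rate(initial_scenes, duration_threshold, preset_priorities, target_fps,
                       determine_target_scenes, determine_actual_fps, interpolate):
    """Hypothetical outline of the claimed four-step method; the three
    callbacks stand in for the embodiment-specific sub-procedures."""
    # Step 2: split long initial scenes into n target scenes (n >= m).
    target_scenes = determine_target_scenes(initial_scenes, duration_threshold)
    # Step 3: derive each target scene's actual display frame rate from its priority.
    actual_fps = {s: determine_actual_fps(s, preset_priorities) for s in target_scenes}
    # Step 4: interpolate frames only where the actual rate falls short of the target.
    for scene, fps in actual_fps.items():
        if fps < target_fps:
            actual_fps[scene] = interpolate(scene, fps, target_fps)
    return actual_fps
```

The callbacks correspond to the embodiments detailed below.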
In an embodiment, determining n target scenes corresponding to the target application based on the m initial scenes and the duration threshold includes:
calculating the duration of each of the m initial scenes;
if the duration of each of the m initial scenes does not exceed the duration threshold, taking the m initial scenes as n target scenes corresponding to the target application, wherein m is equal to n;
if the duration of any one of the m initial scenes exceeds the duration threshold, performing scene division on any one of the m initial scenes, and determining n target scenes corresponding to the target application.
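A minimal sketch of this embodiment, assuming scenes are represented as a name-to-duration mapping and `subdivide` is a hypothetical callback implementing the scene-division step:

```python
def determine_target_scenes(initial_scenes, duration_threshold, subdivide):
    """initial_scenes: mapping scene_name -> duration; `subdivide` is a
    hypothetical callback returning a same-shaped mapping of sub-scenes."""
    target = {}
    for name, duration in initial_scenes.items():
        if duration <= duration_threshold:
            target[name] = duration          # keep the scene as-is
        else:
            target.update(subdivide(name))   # replace it with its sub-scenes
    return target
```

When no scene exceeds the threshold, the result equals the input (m = n); otherwise the over-long scenes are replaced by their sub-scenes, so n >= m as the claim states.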
In an embodiment, performing scene division on any one of the m initial scenes to determine the n target scenes corresponding to the target application includes:
selecting k variables corresponding to the initial scene, wherein each variable characterizes a factor affecting the operation frequency of the object, and k is an integer greater than 1;
calculating the information entropy of each of the k variables to obtain k information entropies;
sorting the k information entropies from large to small to obtain k sorted information entropies;
and performing scene division on the initial scene by using the variable corresponding to the first information entropy among the k sorted information entropies, thereby determining the n target scenes corresponding to the target application.
In an embodiment, performing scene division on any one of m initial scenes by using a variable corresponding to a first information entropy in k ordered information entropies, and determining n target scenes corresponding to a target application, where the determining includes:
performing scene division on any one of m initial scenes by using a variable corresponding to a first information entropy in the k ordered information entropies to obtain t sub-scenes, wherein t is an integer greater than 1;
calculating the duration of each of the t sub-scenes;
if the duration of each of the t sub-scenes does not exceed the duration threshold, taking the scenes except any one of the initial scenes and the t sub-scenes as n target scenes corresponding to the target application;
if the duration of any one of the t sub-scenes exceeds the duration threshold, adding a new variable, and dividing the scene of any one of the m initial scenes by using the added new variable to determine n target scenes corresponding to the target application, wherein the new variable is a variable corresponding to the next information entropy of the current information entropy.
In an embodiment, performing scene division on any one of m initial scenes by using the added new variable, and determining n target scenes corresponding to the target application includes:
performing scene division on any one of m initial scenes by utilizing a variable corresponding to the first information entropy and a variable corresponding to the second information entropy to obtain p sub-scenes, wherein p is an integer greater than 1;
calculating the duration of each of the p sub-scenes;
if the duration of each of the p sub-scenes does not exceed the duration threshold, taking the scenes except any one of the initial scenes and the p sub-scenes as n target scenes corresponding to the target application;
if the duration of any one of the p sub-scenes exceeds the duration threshold, repeating the steps of adding a new variable and dividing the scene of any one of the m initial scenes by using the added new variable until n target scenes corresponding to the target application are determined.
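The iterative variable-addition loop of this embodiment can be sketched as follows; `split` and `duration` are hypothetical callbacks, and stopping as soon as every sub-scene fits the threshold is the reading assumed here:

```python
def subdivide_until_fits(scene, ranked_vars, threshold, split, duration):
    """Add entropy-ranked variables one at a time until every sub-scene
    of `scene` fits within the duration threshold (names are illustrative)."""
    used = []
    sub_scenes = [scene]
    for var in ranked_vars:                      # largest information entropy first
        used.append(var)
        sub_scenes = split(scene, used)          # t, then p, ... sub-scenes
        if all(duration(s) <= threshold for s in sub_scenes):
            break                                # every sub-scene now fits
    return sub_scenes
```

Each pass re-divides the original scene with one more variable, mirroring the t-sub-scene and p-sub-scene steps above.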
In an embodiment, calculating a duration of each of the m initial scenes includes:
acquiring duration data of each initial scene;
and averaging all durations in the duration data except the shortest and the longest, to obtain the duration of each initial scene.
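A sketch of this trimmed-mean calculation (the fallback for two or fewer samples is an assumption the text does not address):

```python
def scene_duration(samples):
    """Average the recorded durations of a scene after discarding the
    single shortest and single longest sample, as the embodiment describes."""
    if len(samples) <= 2:
        return sum(samples) / len(samples)   # too few samples to trim
    trimmed = sorted(samples)[1:-1]
    return sum(trimmed) / len(trimmed)
```

Discarding the extremes makes the estimate robust against one-off outliers, such as an unusually slow login.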
In an embodiment, calculating the information entropy of each of the k variables to obtain k information entropies includes:
obtaining q feature subsets corresponding to each variable, wherein q is an integer greater than 1;
calculating the proportion of each of the q feature subsets within the variable to obtain q proportion values;
combining the q proportion values to obtain the information entropy of the variable;
and collecting the information entropy of each variable to obtain the k information entropies.
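Assuming the combination of proportion values means the standard Shannon entropy formula (the text does not spell out the formula), one variable's information entropy could be computed as:

```python
import math

def variable_entropy(subset_sizes):
    """Shannon entropy of one variable from the sizes of its q feature
    subsets: H = -sum(p_i * log2(p_i)), where p_i is each subset's proportion."""
    total = sum(subset_sizes)
    proportions = [s / total for s in subset_sizes]
    return -sum(p * math.log2(p) for p in proportions if p > 0)
```

A variable whose feature subsets are evenly balanced yields the highest entropy and, per the sorting step above, is used first for scene division.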
In an embodiment, determining n actual display frame rates based on n target scenes and a preset priority includes:
for each of the n target scenes, acquiring the operation frequency of the target scene;
determining the priority of each target scene based on the operation frequency and the preset priority of each target scene;
determining a rendering frame rate of each target scene based on the priority of each target scene;
determining an actual display frame rate for each target scene based on the rendering frame rate and the target display frame rate for each target scene;
and summarizing the actual display frame rate of each target scene to obtain n actual display frame rates.
In an embodiment, obtaining the operation frequency of each target scene includes:
for each object among all objects, acquiring the number of operations the object performs on each target scene;
dividing the number of operations of each object by the duration of each target scene to obtain the operation frequency of each object for that target scene;
adding the operation frequencies of all objects for each target scene to obtain a total operation frequency;
and dividing the total operation frequency by the number of all objects to obtain the operation frequency of each target scene.
In an embodiment, determining the priority of each target scene based on the operation frequency of each target scene and the preset priorities includes:
selecting, from the n target scenes, the target scene with the minimum operation frequency and tentatively setting it to the lowest priority;
calculating the ratio of the duration of that scene to the total duration of the n target scenes;
if the ratio is greater than or equal to the upper limit corresponding to the lowest priority, matching the scene to the next-higher priority above the lowest priority, thereby determining its priority;
if the ratio is smaller than the upper limit corresponding to the lowest priority, keeping the scene at the lowest priority;
and returning to the step of selecting the remaining target scene with the smallest operation frequency and setting it to the lowest available priority, until the priority of every target scene is determined.
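One plausible reading of this priority-assignment loop, with illustrative level names and per-level duty-ratio caps (the exact promotion rule is an interpretation, not stated verbatim in the embodiment):

```python
def assign_scene_priorities(scenes, levels, upper_limits):
    """scenes: name -> (operation_frequency, duration).
    levels: priorities ordered lowest -> highest.
    upper_limits: per-level cap on a scene's share of total duration.
    Scenes are processed from least to most frequently operated; a scene
    whose share of total playtime meets or exceeds its tentative level's
    cap is promoted one level."""
    total = sum(d for _, d in scenes.values())
    priorities = {}
    for rank, name in enumerate(sorted(scenes, key=lambda n: scenes[n][0])):
        level = min(rank, len(levels) - 1)
        duty_ratio = scenes[name][1] / total
        if duty_ratio >= upper_limits[levels[level]]:
            level = min(level + 1, len(levels) - 1)  # promote a long-running scene
        priorities[name] = levels[level]
    return priorities
```

The duty-ratio check prevents a scene that dominates playtime from being pinned to a low rendering frame rate merely because it is rarely operated.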
In one embodiment, determining an actual display frame rate for each target scene based on the rendering frame rate and the target display frame rate for each target scene includes:
multiplying the rendering frame rate of each target scene by the target display frame rate to obtain the actual display frame rate of each target scene.
In an embodiment, if any one of the n actual display frame rates does not reach the target display frame rate, performing frame interpolation processing on a target scene corresponding to any one of the n actual display frame rates, so that the actual display frame rate of each of the n target scenes reaches the target display frame rate, including:
if any one of the n actual display frame rates does not reach the target display frame rate, acquiring a simulation frame of a target scene corresponding to the any one actual display frame rate;
and inserting a preset number of simulation frames, at intervals, among the rendering frames of the target scene corresponding to that actual display frame rate, so that the actual display frame rate of each of the n target scenes reaches the target display frame rate, wherein the preset number is the difference between the target display frame rate and that actual display frame rate.
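The interpolation step can be sketched as follows; spacing the inserted simulation frames evenly is an assumption (the text says only "at intervals"), and the frames here are stand-in placeholder objects:

```python
def interpolate_frames(rendered_frames, simulated_frame, target_fps, actual_fps):
    """Insert (target_fps - actual_fps) copies of a simulated frame at
    roughly even intervals among one second's worth of rendered frames."""
    missing = target_fps - actual_fps        # the preset number of frames
    if missing <= 0:
        return list(rendered_frames)         # already at or above target
    out = []
    step = max(1, len(rendered_frames) // missing)
    inserted = 0
    for i, frame in enumerate(rendered_frames):
        out.append(frame)
        if inserted < missing and (i + 1) % step == 0:
            out.append(simulated_frame)      # interleave a simulation frame
            inserted += 1
    return out
```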
In a second aspect, an embodiment of the present application provides a frame rate control apparatus, including:
the acquisition module is used for acquiring m initial scenes in the target application, wherein m is an integer greater than 1;
the scene determining module is used for determining n target scenes corresponding to the target application based on m initial scenes and a duration threshold, wherein n is an integer greater than or equal to m;
the display frame rate determining module is used for determining n actual display frame rates based on n target scenes and preset priorities, wherein the n actual display frame rates comprise the actual display frame rate of each of the n target scenes;
and the frame inserting processing module is used for carrying out frame inserting processing on the target scene corresponding to any actual display frame rate if any actual display frame rate in the n actual display frame rates does not reach the target display frame rate, so that the actual display frame rate of each target scene in the n target scenes reaches the target display frame rate.
In a third aspect, embodiments of the present application provide an apparatus comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the steps of any of the methods described above when the computer program is executed.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program which, when executed by a processor, performs the steps of any of the methods described above.
The embodiment of the application provides a frame rate control method, a device, a computer device and a storage medium. First, m initial scenes in a target application are acquired, where m is an integer greater than 1. Then, n target scenes corresponding to the target application are determined based on the m initial scenes and a duration threshold, where n is an integer greater than or equal to m. Next, n actual display frame rates are determined based on the n target scenes and preset priorities, the n actual display frame rates comprising the actual display frame rate of each of the n target scenes. Finally, if any one of the n actual display frame rates does not reach the target display frame rate, frame interpolation is performed on the target scene corresponding to that actual display frame rate, so that the actual display frame rate of each of the n target scenes reaches the target display frame rate. By applying frame interpolation only to scenes whose actual display frame rate falls short of the target, the display frame rate of those scenes is improved while the safety hazards of running every scene at a high frame rate are avoided, thereby balancing user experience and terminal operating safety.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, are included to provide a further understanding of the application, together with its other features, objects and advantages. The drawings of the illustrative embodiments of the present application and their descriptions serve to illustrate the application and are not to be construed as unduly limiting it. In the drawings:
fig. 1 is a schematic structural diagram of a frame rate control system according to an embodiment of the present application;
fig. 2 is a flowchart of a frame rate control method according to an embodiment of the present application;
fig. 3 is a flowchart of another frame rate control method according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a frame inserting method according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a frame rate control device according to an embodiment of the present application;
fig. 6 is a schematic diagram of a computer device provided in an embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims of this application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein.
In the present embodiment, the term "module" or "unit" refers to a computer program or a part of a computer program having a predetermined function, and works together with other relevant parts to achieve a predetermined object, and may be implemented in whole or in part by using software, hardware (such as a processing circuit or a memory), or a combination thereof. Also, a processor (or multiple processors or memories) may be used to implement one or more modules or units. Furthermore, each module or unit may be part of an overall module or unit that incorporates the functionality of the module or unit.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and the numbering should not constitute any limitation on the implementation of the embodiments of the present application.
It should be understood that in this application, "comprising" and "having" and any variations thereof are intended to cover non-exclusive inclusion, such that a process, method, system, article or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, system, article or apparatus.
It should be understood that in this application, "plurality" means two or more. "And/or" merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates an "or" relationship between the surrounding objects. "Comprising A, B and C" means all three of A, B and C are included; "comprising A, B or C" means one of A, B and C is included; and "comprising A, B and/or C" means any one, any two, or all three of A, B and C are included.
It should be understood that in this application, "B corresponding to A" means that B is associated with A, and that B can be determined from A. Determining B from A does not mean determining B from A alone; B may also be determined from A and/or other information. A matches B when the similarity between A and B is greater than or equal to a preset threshold.
As used herein, "if" may be interpreted as "when", "upon", "in response to determining" or "in response to detecting", depending on the context.
The data involved in this application may be data authorized by testers or fully authorized by all parties, and the acquisition, transmission and use of such data comply with the relevant laws, regulations and standards of the relevant countries and regions. The implementations/embodiments of this application may be combined with one another.
The embodiment of the application can be applied to various scenes such as multimedia resource playing, voice interaction, information communication and the like.
The technical scheme of the present application is described in detail below with specific examples. The following embodiments may be combined with each other, and some embodiments may not be repeated for the same or similar concepts or processes.
Next, the present application will be described by way of specific embodiments with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a frame rate control system according to an embodiment of the present application.
As shown in fig. 1, the frame rate control system of the present application includes a client 101 and a server 102, where the client 101 includes a frame rendering module, a policy control module, an inserting frame processing module, and a first main logic module. The server 102 includes a storage module, a scene analysis module, a second main logic module, and a priority analysis module.
The scenes in the scene analysis module include, but are not limited to, various visual scenes such as game scenes, movie scenes, shopping scenes and the like.
Taking a game scenario as an example, during the development stage a game needs to be subdivided into different scenes. Scenes are subdivided according to their durations, such as a login/loading scene, a game-hall (preparation stage) scene, a game-play scene and the like, and different priorities are then preset for the different scenes.
When the game actually runs, different rendering frame rates are set according to the priorities of the scenes: a high-priority scene is given a high rendering frame rate, and a low-priority scene a low one. Because the preset scene priorities are not necessarily accurate, the user's operation information is reported synchronously during game running, the server analyzes the data, and the priorities of the different scenes are set and updated according to how frequently the user operates in them.
Applying the game scenario to the method corresponding to the frame rate control system shown in fig. 1 specifically includes the following:
the first main logic module establishes connection with the server 102, requests scene classification and scene priority classification policies from the server 102, locally caches the policies for the policies to control, collects user operation data corresponding to each game scene, and reports the user operation data to the server 102.
The strategy control module is used for determining the type and the corresponding priority of each game scene according to the locally cached scene classification and scene priority classification strategy in the game running process, determining the rendering frame rate of each game scene according to the type and the corresponding priority of each game scene, and informing the frame rendering module and the frame inserting processing module to execute actual rendering on each scene.
And the frame rendering module is used for executing frame rendering according to the rendering frame rate set by the strategy control module.
The frame inserting processing module computes interpolated-frame data according to the settings of the policy control module and outputs the result.
The second main logic module receives the policy request of the client 101, acquires corresponding policy data from the storage module and transmits the policy data to the client 101; the game scene and the user operation data reported by the client 101 are received and stored in the storage module for use by the analysis module.
The storage module stores scene classification, scene priority and corresponding rendering frame rate data, and stores scene data and user operation data reported by the client 101.
The scene analysis module optimizes the scene division logic according to the reported data stored in the storage module and updates the scene classification logic.
The priority analysis module reclassifies the priorities of the scenes according to the reported data stored in the storage module and updates the priorities in the storage module.
It should be noted that, the method corresponding to the other types of scenes applied to the frame rate control system shown in fig. 1 is similar to the game scene, and will not be repeated here.
The client 101 and the server 102 may communicate by any means, including but not limited to network communication; the network may include but is not limited to a wired network or a wireless network, where the wired network includes local area networks, metropolitan area networks and wide area networks, and the wireless network includes Bluetooth, Wi-Fi and other networks enabling wireless communication. The client 101 includes, but is not limited to, at least one of: a mobile phone (e.g., an Android phone or an iOS phone), a notebook computer, a tablet computer, a palmtop computer, a mobile internet device (Mobile Internet Devices, MID), a PAD, a desktop computer, a smart television, etc. The server 102 may be a site server or a remote server, and either may be an independent server or a service cluster formed by a plurality of servers. The above is merely an example and is not limited in any way in the present embodiment.
Referring to fig. 2, fig. 2 is a flowchart of a frame rate control method according to an embodiment of the present application. As shown in fig. 2, the method is applied to the server 102 shown in fig. 1, and includes the following steps:
step S201: and acquiring m initial scenes in the target application.
where m is an integer greater than 1, such as 2, 5, 10, etc.
Target applications include, but are not limited to, shopping applications, gaming applications, short video applications, and the like.
Taking a game application as an example, according to the running logic of a game, the game includes scenes such as login loading, a game hall (preparation stage), game play, and spectating (the player's own game round is over and the player waits for teammates).
The target application may be installed on the client 101, and the client 101 transmits the m initial scenes included in the target application (for example, a shopping application) to the server 102, so that the server 102 performs subsequent scene division and the like on the m initial scenes.
Step S202: and determining n target scenes corresponding to the target application based on the m initial scenes and the duration threshold.
Wherein n is an integer greater than or equal to m.
To determine the n target scenes corresponding to the target application based on the m initial scenes and the duration threshold, the duration of each of the m initial scenes is first calculated. If the duration of each of the m initial scenes does not exceed the duration threshold, the m initial scenes are taken as the n target scenes corresponding to the target application, where m equals n; if the duration of any one of the m initial scenes exceeds the duration threshold, that initial scene is divided into sub-scenes, and the n target scenes corresponding to the target application are thereby determined.
The durations of different types of scenes may be the same or different. It should be noted that a scene's duration can be neither too long nor too short: an overlong scene impairs the frame-interpolation scheduling effect, while an overshort scene not only increases the system load but also harms the user experience.
Thus, the duration of a scene needs to remain within a duration threshold, which may be set according to the particular application and scene, and may be expressed as a specific value or an interval of values, such as a duration threshold of 5 s to 60 s.
Further, by comparing the duration of each initial scene with the duration threshold, it is determined which initial scenes need further division and which do not, thereby determining the target scenes.
Taking a tactical competitive game as an example, the main flow experienced by the user roughly consists of several core scenes: initializing the game (downloading resources and logging in), entering the game hall, preparing for game play, boarding, parachuting, game play, sightseeing, and so on. It is evident that the game-play scene has a long duration, that the game content the user actually faces in this scene is rich and variable, and that it is the scene with the most frequent user operations and the greatest influence on user experience. If the game-play scene were treated as a single whole and assigned the highest rendering frame rate, its duration would be too long, that is, it would exceed the duration threshold. The system load would then stay excessively high for a long time, the terminal might overheat and reduce its operating frequency, terminal performance would drop sharply, and user experience would ultimately still be harmed.
Thus, for such scenes where the duration exceeds the duration threshold, further scene subdivision needs to be performed to determine the target scene.
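The threshold check in steps S201–S202 can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function, scene names, and the 60 s threshold are all assumptions for the example.

```python
def partition_scenes(initial_scenes, max_duration):
    """Split initial scenes into those kept as-is and those flagged
    for further subdivision, by comparing each duration (seconds)
    against the duration threshold.

    initial_scenes: dict of scene name -> duration in seconds.
    """
    kept, to_subdivide = [], []
    for name, duration in initial_scenes.items():
        (kept if duration <= max_duration else to_subdivide).append(name)
    return kept, to_subdivide

# Illustrative durations for a tactical game's core scenes
kept, pending = partition_scenes(
    {"login": 15, "lobby": 40, "gameplay": 900, "sightseeing": 120},
    max_duration=60)
# "gameplay" and "sightseeing" exceed the 60 s threshold and must be subdivided
```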
Performing scene division on any one of the m initial scenes and determining the n target scenes corresponding to the target application includes the following steps. First, k variables corresponding to the initial scene are selected, where each variable represents a factor affecting the object's operation frequency and k is an integer greater than 1. Next, the information entropy of each of the k variables is calculated to obtain k information entropies, which are sorted from largest to smallest. Finally, the initial scene is divided using the variable corresponding to the first information entropy among the k sorted information entropies, and the n target scenes corresponding to the target application are determined.
The method for determining the n target scenes corresponding to the target application includes the steps of: performing scene division on any one of m initial scenes by using a variable corresponding to a first information entropy in the k ordered information entropies to obtain t sub-scenes, wherein t is an integer greater than 1; calculating the duration of each of the t sub-scenes; if the duration of each of the t sub-scenes does not exceed the duration threshold, taking the scenes except any one of the initial scenes and the t sub-scenes as n target scenes corresponding to the target application; if the duration of any one of the t sub-scenes exceeds the duration threshold, adding a new variable, and dividing the scene of any one of the m initial scenes by using the added new variable to determine n target scenes corresponding to the target application, wherein the new variable is a variable corresponding to the next information entropy of the current information entropy.
The method for determining the n target scenes corresponding to the target application includes the steps of: performing scene division on any one of m initial scenes by utilizing a variable corresponding to the first information entropy and a variable corresponding to the second information entropy to obtain p sub-scenes, wherein p is an integer greater than 1; calculating the duration of each of the p sub-scenes; if the duration of each of the p sub-scenes does not exceed the duration threshold, taking the scenes except any one of the initial scenes and the p sub-scenes as n target scenes corresponding to the target application; if the duration of any one of the p sub-scenes exceeds the duration threshold, repeating the steps of adding a new variable and dividing the scene of any one of the m initial scenes by using the added new variable until n target scenes corresponding to the target application are determined.
The principle of further dividing the initial scene with the duration exceeding the duration threshold in the application is to select a variable which has an influence on the operation frequency of the user under the initial scene.
Suppose that, in the initial scene, the variables affecting the user operation frequency are the number of players on the same screen, the type of carried props, and the health value, and that the user operation data under the corresponding conditions can also be obtained. Information entropy is then used to determine the influence of each of these variables on the user operation frequency; that is, the information entropy of each variable is calculated, and the calculated information entropies are A1, A2, and A3.
The three calculated information entropies are then sorted from largest to smallest; assume the sorted order is A1, A2, A3. The variable corresponding to the largest information entropy, namely the number of players on the same screen, is selected to subdivide the initial scene to be divided into t sub-scenes. If the duration of each of the t sub-scenes does not exceed the duration threshold, the t divided sub-scenes together with the initial scenes that need no division are taken as the target scenes. If the duration of any of the t sub-scenes exceeds the duration threshold, the next variable is added to divide those sub-scenes; that is, the number of players on the same screen and the type of carried props are used together, yielding p sub-scenes.
If the duration of each of the p sub-scenes does not exceed the duration threshold, the p divided sub-scenes together with the initial scenes that need no division are taken as the target scenes. If the duration of any of the p sub-scenes exceeds the duration threshold, the next variable is added again; that is, the number of players on the same screen, the type of carried props, and the health value are all used to divide those sub-scenes, and so on until the duration of every divided sub-scene no longer exceeds the duration threshold.
If, after all the variables (the number of players on the same screen, the type of carried props, and the health value) have been traversed, some divided sub-scenes still exceed the duration threshold, the current set of variables is not comprehensive enough, and additional variables must be introduced to divide the initial scene until no divided sub-scene exceeds the duration threshold.
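The iterative subdivision described above can be sketched as a loop that adds entropy-ranked variables one at a time. This is a hedged sketch: the function names, the toy split function, and the durations are illustrative, and a real implementation would derive sub-scene durations from logged user data.

```python
def subdivide(scene, variables_by_entropy, split_fn, max_duration):
    """Add variables one at a time (highest information entropy first)
    until every resulting sub-scene fits within max_duration.

    split_fn(scene, variables) -> dict of sub-scene name -> duration.
    """
    used = []
    for var in variables_by_entropy:
        used.append(var)
        subs = split_fn(scene, used)
        if all(d <= max_duration for d in subs.values()):
            return used, subs
    raise ValueError("current variables are not comprehensive enough; "
                     "additional variables must be introduced")

# Toy split function: dividing by more variables yields shorter sub-scenes
fake_splits = {
    1: {"few-players": 300, "many-players": 80},
    2: {"few/rifle": 50, "few/pistol": 40, "many/rifle": 30, "many/pistol": 20},
}
used, subs = subdivide(
    "gameplay",
    ["players_on_screen", "prop_type", "health"],   # sorted by entropy, descending
    lambda scene, variables: fake_splits[len(variables)],
    max_duration=60)
# Two variables suffice: every sub-scene duration is now <= 60 s
```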
For calculating the duration of each initial scene in m initial scenes, acquiring duration data of each initial scene, and then carrying out average value calculation on all duration except the shortest duration and the longest duration in the duration data to obtain the duration of each initial scene.
Because the duration of the same initial scene can be greatly different due to different user operation habits, the method and the device can count all duration of operation of all users aiming at the same initial scene at a certain moment or in a certain time period, reject the longest duration and the shortest duration in all duration, calculate the average value of all the remaining duration, and take the calculated average value as the duration of the same initial scene.
For example, taking a scope-aiming scene as an example, most users operate for the scene for only one or two seconds, but some users may keep the scope for a few minutes or even longer, so when calculating the duration of the scope-aiming scene, the shortest 5% and the longest 5% of the duration of all users operating for the scene in a certain period of time can be removed, and then the average value of all the remaining durations is taken as the duration of the scene.
When the duration of any scene is calculated, the maximum and minimum values are removed, so that the influence of abnormal data on the calculation of a final result can be avoided, and the accuracy of the duration of the scene is improved.
It should be noted that, the duration of the initial scene and the divided sub-scenes are calculated in the same manner.
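The trimmed-mean duration calculation above can be sketched as follows. The 5% trim fraction follows the scope-aiming example in the text; the function and variable names are illustrative assumptions.

```python
def scene_duration(samples, trim=0.05):
    """Drop the shortest and longest `trim` fraction of duration samples,
    then average the rest, so outliers (e.g. a user holding the scope for
    minutes) do not distort the scene's duration."""
    s = sorted(samples)
    k = int(len(s) * trim)
    # With very few samples, still drop at least the single extremes
    trimmed = s[k:len(s) - k] if k else s[1:-1]
    return sum(trimmed) / len(trimmed)

# 20 samples: a 5% trim removes one sample from each end,
# discarding the 1 s and 180 s outliers
samples = [1] + [2] * 18 + [180]
scene_duration(samples)  # -> 2.0
```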
To calculate the information entropy of each of the k variables and obtain k information entropies, the q feature subsets corresponding to each variable are first obtained, where q is an integer greater than 1. The proportion of each of the q feature subsets within the variable is then calculated, yielding q proportion values. These q proportion values are aggregated to obtain the information entropy of the variable, and finally the information entropies of all variables are collected to obtain the k information entropies.
Assuming that the acquired variables affecting the user operation frequency are v1, v2, v3, ..., vk, the information entropy of each variable is calculated by the mathematical model shown in formula (1):

Ent(X) = -∑_{k=1}^{q} p_k · log₂(p_k)  (1)

wherein Ent(X) represents the information entropy of the variable X, p_k represents the proportion (also called the probability) of the k-th feature subset of the variable X within X, log₂(p_k) represents the base-2 logarithm of p_k, and X represents any one of the variables v1, v2, v3, ..., vk.
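Formula (1) can be computed directly from the sizes of a variable's feature subsets. A minimal sketch (function name and example subset sizes are assumptions):

```python
import math

def entropy(subset_counts):
    """Information entropy of a variable from the sizes of its q feature
    subsets: Ent(X) = -sum(p_k * log2(p_k)), with p_k = subset size / total.
    Empty subsets contribute nothing (the p*log p term tends to 0)."""
    total = sum(subset_counts)
    return -sum((c / total) * math.log2(c / total)
                for c in subset_counts if c)

# Two equally likely feature subsets give the maximum entropy for q = 2
entropy([50, 50])          # -> 1.0
# A variable whose samples all fall into one subset carries no information
entropy([100])             # -> 0.0
```

Variables can then be ranked by these values, largest first, to decide the subdivision order used in step S202.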
Step S203: based on the n target scenes and the preset priority, n actual display frame rates are determined.
Wherein the n actual display frame rates include an actual display frame rate for each of the n target scenes.
To determine the n actual display frame rates based on the n target scenes and the preset priorities: for each of the n target scenes, the operation frequency of the target scene is acquired; the priority of the target scene is determined based on its operation frequency and the preset priorities; the rendering frame rate of the target scene is determined based on its priority; the actual display frame rate of the target scene is then determined based on its rendering frame rate and the target display frame rate; finally, the actual display frame rates of all target scenes are collected to obtain the n actual display frame rates.
The method for acquiring the operation frequency of each target scene includes the following steps: for each of all objects, acquire the number of operations the object performs on the target scene; divide the object's number of operations by the duration of the target scene to obtain the object's operation frequency for the target scene; and add up the operation frequencies of all objects for the target scene to obtain the total operation frequency, which is taken as the operation frequency of the target scene.
For example, suppose the duration of some target scene is 10 s. To calculate the operation frequency of this target scene, all objects at a certain moment may first be acquired, say 3 objects, whose numbers of operations performed on the target scene are 20, 50, and 200 respectively. Dividing each object's number of operations by the duration of the target scene gives operation frequencies of 2 times/s, 5 times/s, and 20 times/s for the 3 objects. Finally, adding the operation frequencies of the 3 objects gives the operation frequency of the target scene: 2 + 5 + 20 = 27 times/s.
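The worked example above can be expressed in a few lines (the function name is an illustrative assumption):

```python
def scene_operation_frequency(op_counts, duration_s):
    """Each object's frequency is its operation count divided by the scene
    duration; the scene's operation frequency is the sum over all objects."""
    return sum(count / duration_s for count in op_counts)

# 3 objects with 20, 50, and 200 operations over a 10 s scene
scene_operation_frequency([20, 50, 200], duration_s=10)  # -> 27.0 times/s
```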
Wherein determining the priority of each target scene based on its operation frequency and the preset priorities includes: selecting the target scene with the smallest operation frequency from the n target scenes and tentatively setting it to the lowest priority; calculating the ratio of the duration of this target scene to the total duration of the n target scenes; if the ratio is greater than or equal to the duty-ratio upper limit corresponding to the lowest priority, matching this target scene against the next higher priority to determine its priority; if the ratio is smaller than the duty-ratio upper limit corresponding to the lowest priority, assigning this target scene the lowest priority; and returning to the step of selecting the target scene with the smallest operation frequency among the remaining target scenes until the priority of every target scene is determined.
The present application sets priorities (i.e., preset priorities) for target scenes in advance, where the preset priorities may include a plurality of priorities, and each priority corresponds to a matched rendering frame rate and duration duty cycle.
Five priorities, i.e., L1, L2, L3, L4, and L5 are set in the preset priority table shown in table 1, and the priorities of L1, L2, L3, L4, and L5 decrease in sequence.
Table 1 preset priority table
When n target scenes are acquired, the n target scenes need to be prioritized, that is, the corresponding priority is configured for each target scene in the n target scenes.
Assume that three target scenes are the first target scene, the second target scene and the third target scene, and the operating frequencies of the three target scenes are sequentially reduced.
In combination with table 1, when the three target scenes are prioritized, the target scenes are matched from the lowest priority L5, and the specific implementation steps are as follows:
First, the target scene with the smallest operation frequency (namely the third target scene) is tentatively set to the lowest priority L5. The ratio of the duration of the third target scene to the total duration of the three target scenes is then calculated. If this ratio is smaller than the duty-ratio upper limit corresponding to L5, the third target scene is assigned priority L5. If the ratio is greater than or equal to the duty-ratio upper limit corresponding to L5, the third target scene is matched against the next higher priority L4: if the ratio is smaller than the duty-ratio upper limit corresponding to L4, the third target scene is assigned priority L4; if the ratio is greater than or equal to the duty-ratio upper limit corresponding to L4, the third target scene is matched against priority L3, and so on until the priorities of all the target scenes are determined.
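The duty-ratio matching procedure can be sketched as a climb from the lowest priority upward. The upper limits below are illustrative placeholders, since the contents of Table 1 are not reproduced in this text; the function name is also an assumption.

```python
def assign_priority(scene_duration, total_duration, duty_limits):
    """Start at the lowest priority and move to the next higher priority
    while the scene's share of the total duration meets or exceeds the
    current level's duty-ratio upper limit.

    duty_limits: dict mapping priority name -> upper duty-ratio limit,
    ordered from lowest priority (L5) to highest (L1).
    """
    ratio = scene_duration / total_duration
    levels = list(duty_limits.items())
    for i, (level, limit) in enumerate(levels):
        if ratio < limit or i == len(levels) - 1:
            return level

# Placeholder limits standing in for Table 1
limits = {"L5": 0.10, "L4": 0.25, "L3": 0.50, "L2": 0.75, "L1": 1.01}
assign_priority(5, 100, limits)   # 0.05 < 0.10        -> "L5"
assign_priority(30, 100, limits)  # >= 0.10, >= 0.25,
                                  # < 0.50             -> "L3"
```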
Determining the actual display frame rate of each target scene based on its rendering frame rate and the target display frame rate includes: multiplying the rendering frame rate of the target scene (expressed as a proportion of the target display frame rate) by the target display frame rate to obtain the actual display frame rate of the target scene. The target display frame rate is the frame rate at which the terminal can display a scene; it can be set according to circumstances and is not specifically limited here.
After determining the priority of each target scene, since the rendering frame rate of each priority is recorded in the priority table, the rendering frame rate of each target scene can be determined directly by the priority. And by multiplying the rendering frame rate of each target scene by the target display frame rate, the actual display frame rate of each target scene can be obtained.
For example, if the target display frame rate is 120 and the rendering frame rate of a certain target scene is 100%, the actual display frame rate of that target scene is 120, that is, the target display frame rate is already reached.
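This relationship is a one-line multiplication; the sketch below assumes, consistent with the 50% example later in the text, that the rendering frame rate is expressed as a fraction of the target display frame rate (function name is illustrative):

```python
def actual_display_fps(render_ratio, target_fps):
    """Actual display frame rate = rendering frame rate (as a fraction
    of the target) multiplied by the target display frame rate."""
    return round(render_ratio * target_fps)

actual_display_fps(1.0, 120)  # -> 120: target already reached, no interpolation
actual_display_fps(0.5, 120)  # -> 60: 60 simulation frames must be inserted
```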
After the target scene, the priority corresponding to the target scene, the rendering frame rate and the actual display frame rate are determined through the embodiment, the related content can be recorded and stored, so that a user can conveniently and rapidly acquire the target scene and the related content.
Taking the target application as an example of the game application, the corresponding scene and the related content are recorded in table 2.
Table 2 scene information for gaming applications
Because of game-play iteration, changes in user operation habits, and other factors, the game scene division may become unreasonable over time. The target scenes and related content shown in table 2 therefore need to be updated periodically with the latest user data, so as to obtain the classification that best matches the current game play and user operation habits.
Step S204: if any one of the n actual display frame rates does not reach the target display frame rate, performing frame interpolation processing on the target scene corresponding to any one of the n actual display frame rates, so that the actual display frame rate of each of the n target scenes reaches the target display frame rate.
Performing frame interpolation on the target scene corresponding to any one of the n actual display frame rates that does not reach the target display frame rate, so that the actual display frame rate of each of the n target scenes reaches the target display frame rate, includes: if any one of the n actual display frame rates does not reach the target display frame rate, obtaining simulation frames for the target scene corresponding to that actual display frame rate, and inserting a preset number of simulation frames at intervals among the rendered frames of that target scene, so that the actual display frame rate of each of the n target scenes reaches the target display frame rate, wherein the preset number is the difference between the target display frame rate and that actual display frame rate.
In this application, only target scenes that do not reach the target display frame rate are subjected to frame interpolation. When inserting frames, to ensure the continuity of the visual effect, the inserted frames must uniformly follow the actually rendered frames. For example, if the target display frame rate is 120 and the rendering frame rate of a certain target scene in a game is 50%, the calculated actual display frame rate is 60. One simulation frame can then be inserted after each rendered frame, so that 60 simulation frames are inserted and the display frame rate of the target scene becomes rendered frames + simulation frames = 60 + 60 = 120, that is, the target display frame rate is reached.
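The uniform interleaving in the 50% example can be sketched as follows. This is an assumption-laden illustration: frames are stand-in values, a simulation frame is represented as a tag on the preceding rendered frame, and the spacing logic is one plausible way to keep the insertions uniform.

```python
def interleave_frames(rendered, target_fps):
    """Insert simulated frames among rendered frames, roughly uniformly,
    until the per-second frame count reaches target_fps. In the 50% case
    this degenerates to one simulated frame after every rendered frame."""
    missing = target_fps - len(rendered)
    if missing <= 0:
        return list(rendered)           # already at (or above) the target
    out = []
    step = len(rendered) / missing      # rendered frames per inserted frame
    inserted, acc = 0, 0.0
    for frame in rendered:
        out.append(frame)
        acc += 1
        if inserted < missing and acc >= step:
            out.append(("sim", frame))  # simulated frame derived from the
            inserted += 1               # last actually rendered frame
            acc -= step
    while inserted < missing:           # any remainder goes at the end
        out.append(("sim", rendered[-1]))
        inserted += 1
    return out

frames = interleave_frames(list(range(60)), target_fps=120)
# 60 rendered + 60 simulated = 120 frames, alternating
```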
This application determines the frame interpolation strategy of different target scenes according to their priorities: all frames of the highest-priority target scenes are actually rendered, while for the lowest-priority scenes only part of the frames are actually rendered and the remaining frames are interpolated. Dynamically interpolating frames for target scenes raises the display frame rate of the scenes while avoiding the operational safety risks to the terminal that would arise if every scene ran at a high frame rate, thereby balancing user experience against terminal operating safety.
Referring to fig. 3, fig. 3 is a flowchart illustrating another frame rate control method according to an embodiment of the present application. As shown in fig. 3, the method is applied to a game client and a game server of a game application, wherein the game server sequentially performs steps of periodic update, calculation of variable influence information entropy, rendering scene division, rendering scene priority classification, storage policy, query policy and the like, and the game client sequentially performs steps of starting a game, caching policy, starting scene & variable change monitoring, initializing completion and the like, and comprises the following steps:
Step S301: periodic update. When the game server is updated or the game version is updated, the game scene division, prioritization, and similar schemes need to be updated accordingly.
Step S302: and calculating the variable influence information entropy. For any one of the game scenes, information entropy of each variable (including a variable newly added due to a factor of version update or the like) affecting the operation frequency of the user operating this scene is calculated.
Step S303: rendering a scene division. And determining a new rendering scene division mode according to the rendering scene division mode in the step S202 and the recalculated information entropy.
Step S304: rendering scene priority classification. According to the rendering scene division manner in step S202, the corresponding duration and user operation frequency are calculated for the game scenes divided by the new rendering scene division manner, and the priority of each game scene is determined according to the calculation result and table 1.
Step S305: and storing the strategy. And storing the recalculated game scene and the corresponding priority.
Step S306: and (5) ending. And after successful storage, ending the updating flow.
Step S307: the game is started. The initialization flow is started after the game client is started.
Step S308: and querying the strategy. After the initialization flow is started, the game client requests the synchronous scene division and priority classification mode from the game server, and after the game server receives the request of the game client, the game server inquires the scene division and priority classification mode from the storage module and returns the scene division and priority classification mode to the game client.
Step S309: caching policies. And the game client receives the scene division and priority classification mode returned by the game server and then caches the scene division and priority classification mode in the local memory for subsequent processes.
Step S3010: scene & change monitoring is initiated. After the local cache succeeds, the client starts monitoring changes of the game scene and of the variables involved in the scene division scheme; if a scene change, or a change of a relevant variable within the scene, is detected, the frame interpolation method shown in fig. 4 is started.
Step S3011: the initialization is completed. And after the normal start scene and the change monitoring, the initialization flow is finished.
Referring to fig. 4, fig. 4 is a flowchart of a frame inserting method according to an embodiment of the present application. As shown in fig. 4, the method is applied to a game client and a game server of a game application, wherein the game client sequentially performs steps of monitoring scene and variable changes, matching rendering scenes, rendering scene change judgment, rendering frame rate change judgment, refreshing rendering frame rate, entering the next round of monitoring, and the game server performs steps of reporting data, and the method comprises the following steps:
Step S401: scene and variable changes are monitored. The game client starts a frame inserting method when the scene changes or the corresponding variable changes in the scene.
Step S402: and reporting the data. The game client acquires relevant data from the current change to the last change, such as game scenes, relevant variables, duration time, user operation data and the like, and reports the relevant data to the game server.
Step S403: matching the rendered scene. And obtaining the latest game scene, the corresponding priority and rendering frame rate according to the current game scene, the related variables and the scene division and priority classification modes of the local cache.
Step S404: and (5) rendering scene change judgment. And judging whether the latest game scene is consistent with the current game scene, and if the latest game scene is not changed, returning to the step S401 to continue monitoring.
Step S405: and judging the change of the rendering frame rate. If the game scene changes, judging whether the priority of the game scene, namely the rendering frame rate, changes.
Step S406: the rendering frame rate is kept unchanged. If the rendering frame rate has not changed, the current rendering frame rate is kept running, and then the next round of game scenes or related variables are waited for to change, and the step S402 is executed.
Step S407: the rendering frame rate is refreshed. If the rendering frame rate changes, the frame inserting method is executed at the new rendering frame rate.
Step S408: and the next round of monitoring is carried out. And ending the frame inserting process of the round, and carrying out the next round of monitoring.
The embodiment of the application provides a frame rate control method, which comprises the following steps: firstly acquiring m initial scenes in a target application, wherein m is an integer greater than 1, then determining n target scenes corresponding to the target application based on the m initial scenes and a duration threshold, wherein n is an integer greater than or equal to m, and determining n actual display frame rates based on the n target scenes and a preset priority, wherein the n actual display frame rates comprise the actual display frame rate of each target scene in the n target scenes, and if any actual display frame rate in the n actual display frame rates does not reach the target display frame rate, performing frame interpolation processing on the target scene corresponding to any actual display frame rate so that the actual display frame rate of each target scene in the n target scenes reaches the target display frame rate. According to the method and the device, the scene with the actual display frame rate not reaching the target display frame rate is subjected to frame inserting processing, so that the display frame rate of the scene can be improved, and meanwhile, the potential safety hazards of terminal operation caused by the fact that all the scenes are in high-frame operation are avoided, and therefore balance between user experience and terminal operation safety is achieved.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
The following are device embodiments of the present application, for details not described in detail therein, reference may be made to the corresponding method embodiments described above.
Fig. 5 shows a schematic structural diagram of a frame rate control device provided in an embodiment of the present application, and for convenience of explanation, only shows a portion related to the embodiment of the present application, and the frame rate control device includes an acquisition module 501, a scene determination module 502, a display frame rate determination module 503, and an interpolation frame processing module 504, which are specifically as follows:
an obtaining module 501, configured to obtain m initial scenes in a target application, where m is an integer greater than 1;
the scene determining module 502 is configured to determine n target scenes corresponding to the target application based on m initial scenes and a duration threshold, where n is an integer greater than or equal to m;
a display frame rate determining module 503, configured to determine n actual display frame rates based on the n target scenes and the preset priority, where the n actual display frame rates include an actual display frame rate of each of the n target scenes;
And the frame inserting processing module 504 is configured to perform frame inserting processing on the target scenes corresponding to any one of the n actual display frame rates if any one of the n actual display frame rates does not reach the target display frame rate, so that the actual display frame rate of each of the n target scenes reaches the target display frame rate.
In an embodiment, the scene determination module 502 is further configured to calculate a duration of each of the m initial scenes;
if the duration of each of the m initial scenes does not exceed the duration threshold, taking the m initial scenes as n target scenes corresponding to the target application, wherein m is equal to n;
if the duration of any one of the m initial scenes exceeds the duration threshold, performing scene division on any one of the m initial scenes, and determining n target scenes corresponding to the target application.
In an embodiment, the scene determining module 502 is further configured to select k variables corresponding to any one of m initial scenes, where the variables are used to represent variables affecting an operation frequency of the object, and k is an integer greater than 1;
calculating the information entropy of each variable in the k variables to obtain k information entropy;
Sorting the k information entropies from large to small to obtain k sorted information entropies;
and performing scene division on any one of m initial scenes by using a variable corresponding to the first information entropy in the k ordered information entropies, and determining n target scenes corresponding to the target application.
In an embodiment, the scene determining module 502 is further configured to perform scene division on any one of m initial scenes by using a variable corresponding to a first information entropy in the k ordered information entropies to obtain t sub-scenes, where t is an integer greater than 1;
calculating the duration of each of the t sub-scenes;
if the duration of each of the t sub-scenes does not exceed the duration threshold, taking the scenes except any one of the initial scenes and the t sub-scenes as n target scenes corresponding to the target application;
if the duration of any one of the t sub-scenes exceeds the duration threshold, adding a new variable, and dividing the scene of any one of the m initial scenes by using the added new variable to determine n target scenes corresponding to the target application, wherein the new variable is a variable corresponding to the next information entropy of the current information entropy.
In an embodiment, the scene determining module 502 is further configured to perform scene division on any one of the m initial scenes by using a variable corresponding to the first information entropy and a variable corresponding to the second information entropy to obtain p sub-scenes, where p is an integer greater than 1;
calculating the duration of each of the p sub-scenes;
if the duration of each of the p sub-scenes does not exceed the duration threshold, taking the scenes except any one of the initial scenes and the p sub-scenes as n target scenes corresponding to the target application;
if the duration of any one of the p sub-scenes exceeds the duration threshold, repeating the steps of adding a new variable and dividing the scene of any one of the m initial scenes by using the added new variable until n target scenes corresponding to the target application are determined.
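The iterative subdivision described above — split the over-long scene by the highest-entropy variable, then keep adding the next variable while any resulting sub-scene still exceeds the duration threshold — can be sketched as follows. This is a minimal illustration only; the event layout, function names, and the `duration_fn` callback are assumptions, not taken from the patent.

```python
def divide_scene(events, ranked_vars, threshold, duration_fn):
    """Split one over-long initial scene into sub-scenes.

    Variables are tried in descending information-entropy order: first the
    top-entropy variable alone, then the top two together, and so on, until
    no resulting sub-scene exceeds the duration threshold.
    """
    groups = {(): list(events)}  # fallback if no variables are given
    for depth in range(1, len(ranked_vars) + 1):
        key_vars = ranked_vars[:depth]
        groups = {}
        for event in events:
            key = tuple(event[v] for v in key_vars)
            groups.setdefault(key, []).append(event)
        if all(duration_fn(g) <= threshold for g in groups.values()):
            break  # every sub-scene is short enough
    return list(groups.values())
```

With `duration_fn=len` as a stand-in duration measure, three events split on one binary variable yield two sub-scenes of sizes two and one.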
In an embodiment, the scene determining module 502 is further configured to obtain duration data of each initial scene;
and averaging all durations in the duration data except the shortest duration and the longest duration to obtain the duration of each initial scene.
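The duration calculation above (an average that discards the single shortest and single longest samples) can be sketched as a short helper; the function name is illustrative, not from the patent.

```python
def scene_duration(duration_samples):
    """Average the recorded durations of one initial scene after discarding
    the single shortest and single longest samples, as described above."""
    if len(duration_samples) <= 2:
        raise ValueError("need more than two samples to trim both extremes")
    trimmed = sorted(duration_samples)[1:-1]  # drop min and max
    return sum(trimmed) / len(trimmed)
```

Discarding the extremes keeps one anomalous session (e.g. a player idling in a scene) from skewing the scene's nominal duration.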
In an embodiment, the scene determining module 502 is further configured to obtain q feature subsets corresponding to each variable, where q is an integer greater than 1;
calculating the proportion of each of the q feature subsets within each variable to obtain q proportion values;
computing the information entropy of each variable from the q proportion values;
and summarizing the information entropy of each variable to obtain k information entropies.
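The entropy computation above — subset proportions feeding the standard Shannon formula, then one entropy per variable sorted from large to small — can be sketched as follows. Function names are illustrative assumptions.

```python
import math

def variable_entropy(subset_sizes):
    """Shannon entropy of one variable from the sizes of its q feature
    subsets: each subset's proportion p_i feeds H = -sum(p_i * log2 p_i)."""
    total = sum(subset_sizes)
    proportions = [s / total for s in subset_sizes if s > 0]
    return -sum(p * math.log2(p) for p in proportions)

def ranked_entropies(subsets_per_variable):
    """Entropy of each of the k variables, sorted from large to small."""
    return sorted((variable_entropy(s) for s in subsets_per_variable),
                  reverse=True)
```

A variable whose values split evenly carries the most information (1 bit for two equal subsets) and is therefore tried first when dividing a scene; a variable with a single subset carries none.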
In an embodiment, the display frame rate determining module 503 is further configured to obtain, for each of the n target scenes, an operation frequency of each target scene;
determining the priority of each target scene based on the operation frequency and the preset priority of each target scene;
determining a rendering frame rate of each target scene based on the priority of each target scene;
determining an actual display frame rate for each target scene based on the rendering frame rate and the target display frame rate for each target scene;
and summarizing the actual display frame rate of each target scene to obtain n actual display frame rates.
In an embodiment, the display frame rate determining module 503 is further configured to obtain, for each object of all objects, the number of operations performed by each object on each target scene;
dividing the number of operations by the duration of each target scene to obtain the operation frequency of each object for each target scene;
adding the operation frequencies of each object for each target scene to obtain a total operation frequency;
and multiplying the total operation frequency by the number of all objects to obtain the operation frequency of each target scene.
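The per-scene operation frequency described above can be sketched as a short helper. Note the last step follows the text literally (the summed frequency is multiplied by the object count); the function name and argument layout are assumptions for illustration.

```python
def scene_operation_frequency(op_counts, duration):
    """Operation frequency of one target scene: each object's frequency is
    its operation count divided by the scene duration; the per-object
    frequencies are summed, and (following the text literally) the sum is
    multiplied by the number of objects."""
    per_object = [count / duration for count in op_counts]
    return sum(per_object) * len(op_counts)
```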
In an embodiment, the display frame rate determining module 503 is further configured to select a target scene with the smallest operation frequency from the n target scenes, and set the target scene with the smallest operation frequency as the lowest priority;
calculating the duty ratio of the duration of the target scene with the minimum operation frequency to the total duration of the n target scenes;
if the duty ratio is greater than or equal to the duty ratio upper limit corresponding to the lowest priority, matching the target scene with the minimum operation frequency to the priority immediately above the lowest priority, so as to determine the priority of the target scene with the minimum operation frequency;
if the duty ratio is smaller than the duty ratio upper limit corresponding to the lowest priority, determining the target scene with the minimum operating frequency as the lowest priority;
and returning to the step of selecting the target scene with the smallest operation frequency from the n target scenes, and setting the target scene with the smallest operation frequency as the lowest priority until the priority of each target scene is determined.
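The priority-assignment loop above — tentatively assign the lowest-frequency scene the lowest priority, and promote it one level if its share of the total duration reaches that level's cap — can be sketched as follows. Scene names, tuple layout, and `ratio_caps` are illustrative assumptions, not from the patent.

```python
def assign_priorities(scenes, ratio_caps):
    """Assign a priority level to each target scene.

    scenes: list of (name, operation_frequency, duration) tuples.
    ratio_caps: duration-share upper limit for each priority level,
    lowest priority first.  Each scene, taken in ascending frequency
    order, is tentatively given the lowest priority (level 0) and is
    promoted one level if its share of the total duration reaches
    that level's cap.
    """
    total_duration = sum(duration for _, _, duration in scenes)
    priorities = {}
    for name, _, duration in sorted(scenes, key=lambda s: s[1]):
        level = 0  # tentatively the lowest priority
        if duration / total_duration >= ratio_caps[level]:
            level += 1  # promote to the next-higher priority
        priorities[name] = level
    return priorities
```

The duty-ratio cap prevents a scene that dominates total play time from being rendered at the lowest frame rate merely because it receives few operations.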
In an embodiment, the display frame rate determining module 503 is further configured to multiply the rendering frame rate of each target scene by the target display frame rate to obtain the actual display frame rate of each target scene.
In an embodiment, the frame inserting processing module 504 is further configured to obtain a simulated frame of the target scene corresponding to any one of the n actual display frame rates if any one of the n actual display frame rates does not reach the target display frame rate;
and inserting a preset number of simulation frames into the rendering frames of the target scenes corresponding to any actual display frame rate at intervals so that the actual display frame rate of each of the n target scenes reaches the target display frame rate, wherein the preset number is the difference between any actual display frame rate and the target display frame rate.
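The frame-interpolation step above — insert as many simulated frames as the gap between the actual and target frame rates, spread at intervals among the rendered frames — can be sketched as follows. The frame representation (strings) and function name are assumptions for illustration only.

```python
def interleave_simulated_frames(rendered, target_fps):
    """Pad one second of rendered frames up to the target display rate by
    inserting simulated frames at even intervals; the number inserted is
    the difference between the target and actual frame rates."""
    deficit = target_fps - len(rendered)
    if deficit <= 0:
        return list(rendered)  # already at or above the target rate
    padded = []
    for i, frame in enumerate(rendered):
        padded.append(frame)
        # spread the deficit as evenly as possible between rendered frames
        inserts = (deficit * (i + 1) // len(rendered)
                   - deficit * i // len(rendered))
        padded.extend(f"sim_after_{frame}" for _ in range(inserts))
    return padded
```

For example, three rendered frames padded to a target of six yield one simulated frame after each render.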
Fig. 6 of the present application provides a schematic diagram of a computer device. As shown in fig. 6, the computer device 6 of this embodiment includes: a processor 601, a memory 602, and a computer program 603 stored in the memory 602 and executable on the processor 601. When the processor 601 executes the computer program 603, the steps of the frame rate control method embodiments described above are implemented, such as steps 201 to 204 shown in fig. 2. Alternatively, when executing the computer program 603, the processor 601 may implement the functions of the modules/units in the frame rate control apparatus embodiments described above, such as the functions of the modules/units 501 to 504 shown in fig. 5.
The present application also provides a readable storage medium having a computer program stored therein, which when executed by a processor is configured to implement the frame rate control method provided in the above-described various embodiments.
The readable storage medium may be a computer storage medium or a communication medium. Communication media include any medium that facilitates transfer of a computer program from one place to another. Computer storage media can be any available media that can be accessed by a general-purpose or special-purpose computer. For example, a readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Alternatively, the readable storage medium may be integral to the processor. The processor and the readable storage medium may reside in an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC). In addition, the ASIC may reside in a user device. The processor and the readable storage medium may also reside as discrete components in a communication device. The readable storage medium may be a read-only memory (ROM), a random-access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, etc.
The present application also provides a program product comprising execution instructions stored in a readable storage medium. The at least one processor of the apparatus may read the execution instructions from the readable storage medium, and execution of the execution instructions by the at least one processor causes the apparatus to implement the frame rate control method provided by the various embodiments described above.
In the above described embodiments of the apparatus, it is understood that the processor may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of a method disclosed in connection with the present application may be embodied directly in a hardware processor or in a combination of hardware and software modules within a processor.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them; although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (15)

1. A frame rate control method, comprising:
obtaining m initial scenes in a target application, wherein m is an integer greater than 1;
determining n target scenes corresponding to the target application based on the m initial scenes and a duration threshold, wherein n is an integer greater than or equal to m;
determining n actual display frame rates based on the n target scenes and a preset priority, wherein the n actual display frame rates comprise actual display frame rates of each of the n target scenes;
if any one of the n actual display frame rates does not reach the target display frame rate, performing frame interpolation processing on the target scene corresponding to the any one actual display frame rate, so that the actual display frame rate of each target scene in the n target scenes reaches the target display frame rate.
2. The frame rate control method according to claim 1, wherein the determining n target scenes corresponding to the target application based on the m initial scenes and a duration threshold includes:
calculating the duration of each initial scene in the m initial scenes;
if the duration of each initial scene in the m initial scenes does not exceed the duration threshold, taking the m initial scenes as n target scenes corresponding to the target application, wherein m is equal to n;
if the duration of any one of the m initial scenes exceeds the duration threshold, performing scene division on any one of the m initial scenes, and determining n target scenes corresponding to the target application.
3. The frame rate control method according to claim 2, wherein the performing scene division on any one of the m initial scenes to determine n target scenes corresponding to the target application includes:
selecting k variables corresponding to any one of the m initial scenes, wherein each variable represents a factor affecting the operation frequency of the object, and k is an integer greater than 1;
calculating the information entropy of each variable in the k variables to obtain k information entropies;
sorting the k pieces of information entropy from large to small to obtain k pieces of sorted information entropy;
and performing scene division on any one of the m initial scenes by utilizing a variable corresponding to the first information entropy in the k ordered information entropies, and determining n target scenes corresponding to the target application.
4. The frame rate control method according to claim 3, wherein the scene division is performed on any one of the m initial scenes by using a variable corresponding to a first information entropy of the k ordered information entropies, and determining n target scenes corresponding to the target application includes:
performing scene division on any one of the m initial scenes by utilizing a variable corresponding to a first information entropy in the k ordered information entropies to obtain t sub-scenes, wherein t is an integer greater than 1;
calculating the duration of each of the t sub-scenes;
if the duration of each sub-scene in the t sub-scenes does not exceed the duration threshold, taking the initial scenes other than the divided scene, together with the t sub-scenes, as n target scenes corresponding to the target application;
if the duration of any one of the t sub-scenes exceeds the duration threshold, adding a new variable, and dividing the scene of any one of the m initial scenes by using the added new variable, and determining n target scenes corresponding to the target application, wherein the new variable is a variable corresponding to the next information entropy of the current information entropy.
5. The frame rate control method according to claim 4, wherein the scene division is performed on any one of the m initial scenes by using the added new variable, and determining n target scenes corresponding to the target application includes:
performing scene division on any one of the m initial scenes by utilizing the variable corresponding to the first information entropy and the variable corresponding to the second information entropy to obtain p sub-scenes, wherein p is an integer greater than 1;
calculating the duration of each sub-scene in the p sub-scenes;
if the duration of each sub-scene in the p sub-scenes does not exceed the duration threshold, taking the initial scenes other than the divided scene, together with the p sub-scenes, as n target scenes corresponding to the target application;
and if the duration of any one of the p sub-scenes exceeds the duration threshold, repeating the steps of adding a new variable and dividing the scene of any one of the m initial scenes by using the added new variable until n target scenes corresponding to the target application are determined.
6. The frame rate control method of claim 2, wherein the calculating the duration of each of the m initial scenes comprises:
acquiring duration data of each initial scene;
and averaging all durations in the duration data except the shortest duration and the longest duration to obtain the duration of each initial scene.
7. The frame rate control method of claim 3, wherein said calculating the information entropy of each of the k variables, to obtain k information entropies, comprises:
q feature subsets corresponding to each variable are obtained, wherein q is an integer greater than 1;
calculating the proportion of each of the q feature subsets within each variable to obtain q proportion values;
computing the information entropy of each variable from the q proportion values;
and summarizing the information entropy of each variable to obtain the k information entropies.
8. The frame rate control method of claim 1, wherein the determining n actual display frame rates based on the n target scenes and a preset priority comprises:
acquiring the operation frequency of each target scene in the n target scenes;
determining the priority of each target scene based on the operation frequency of each target scene and the preset priority;
determining a rendering frame rate of each target scene based on the priority of each target scene;
determining an actual display frame rate of each target scene based on the rendering frame rate of each target scene and the target display frame rate;
and summarizing the actual display frame rate of each target scene to obtain the n actual display frame rates.
9. The frame rate control method of claim 8, wherein the acquiring the operating frequency of each target scene comprises:
for each object in all objects, acquiring the operation times of each object for executing operation on each target scene;
dividing the operation times by the duration time of each target scene to obtain the operation frequency of each object for each target scene;
adding the operation frequencies of each object aiming at each target scene to obtain the total operation frequency;
and multiplying the total operation frequency by the number of all objects to obtain the operation frequency of each target scene.
10. The frame rate control method of claim 8, wherein the determining the priority of each target scene based on the operation frequency of each target scene and the preset priority comprises:
selecting a target scene with the minimum operating frequency from n target scenes, and setting the target scene with the minimum operating frequency as the lowest priority;
calculating the duty ratio of the duration of the target scene with the minimum operation frequency to the total duration of the n target scenes;
if the duty ratio is greater than or equal to the duty ratio upper limit corresponding to the lowest priority, matching the target scene with the minimum operation frequency to the priority immediately above the lowest priority, so as to determine the priority of the target scene with the minimum operation frequency;
if the duty ratio is smaller than the duty ratio upper limit corresponding to the lowest priority, determining the target scene with the minimum operating frequency as the lowest priority;
and returning to the step of selecting the target scene with the smallest operation frequency from the n target scenes, and setting the target scene with the smallest operation frequency as the lowest priority level until the priority level of each target scene is determined.
11. The frame rate control method of claim 8, wherein the determining the actual display frame rate for each target scene based on the rendering frame rate for each target scene and the target display frame rate comprises:
multiplying the rendering frame rate of each target scene by the target display frame rate to obtain the actual display frame rate of each target scene.
12. The frame rate control method according to claim 1, wherein if any one of the n actual display frame rates does not reach a target display frame rate, performing frame interpolation processing on a target scene corresponding to the any one actual display frame rate so that the actual display frame rate of each of the n target scenes reaches the target display frame rate, includes:
if any one of the n actual display frame rates does not reach the target display frame rate, acquiring a simulation frame of a target scene corresponding to the any one actual display frame rate;
and inserting a preset number of simulation frames into the rendering frames of the target scenes corresponding to any actual display frame rate at intervals so that the actual display frame rate of each target scene in the n target scenes reaches the target display frame rate, wherein the preset number is the difference between any actual display frame rate and the target display frame rate.
13. A frame rate control apparatus, comprising:
the acquisition module is used for acquiring m initial scenes in the target application, wherein m is an integer greater than 1;
the scene determining module is used for determining n target scenes corresponding to the target application based on the m initial scenes and the duration threshold, wherein n is an integer greater than or equal to m;
a display frame rate determining module, configured to determine n actual display frame rates based on the n target scenes and a preset priority, where the n actual display frame rates include an actual display frame rate of each of the n target scenes;
and the frame inserting processing module is used for carrying out frame inserting processing on the target scene corresponding to any one of the n actual display frame rates if any one of the n actual display frame rates does not reach the target display frame rate, so that the actual display frame rate of each target scene in the n target scenes reaches the target display frame rate.
14. A computer device comprising a memory, and one or more processors communicatively coupled to the memory;
stored in the memory are instructions executable by the one or more processors to cause the one or more processors to implement the frame rate control method of any one of claims 1 to 12.
15. A computer-readable storage medium, comprising a program or instructions that, when run on a computer, implements the frame rate control method of any one of claims 1 to 12.
CN202410002823.1A 2024-01-02 2024-01-02 Frame rate control method, device, computer equipment and storage medium Pending CN117499739A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410002823.1A CN117499739A (en) 2024-01-02 2024-01-02 Frame rate control method, device, computer equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117499739A true CN117499739A (en) 2024-02-02

Family

ID=89667621

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410002823.1A Pending CN117499739A (en) 2024-01-02 2024-01-02 Frame rate control method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117499739A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110213670A (en) * 2019-05-31 2019-09-06 北京奇艺世纪科技有限公司 Method for processing video frequency, device, electronic equipment and storage medium
CN110798738A (en) * 2018-08-01 2020-02-14 Oppo广东移动通信有限公司 Frame rate control method, device, terminal and storage medium
CN112230758A (en) * 2020-11-09 2021-01-15 腾讯科技(深圳)有限公司 Frame rate adjustment method, device, equipment and computer readable storage medium
CN113099313A (en) * 2021-03-31 2021-07-09 杭州海康威视数字技术股份有限公司 Video slicing method and device and electronic equipment
CN114740965A (en) * 2022-05-05 2022-07-12 Oppo广东移动通信有限公司 Processing method and device for reducing power consumption of terminal, terminal and readable storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination