CN115550690A - Frame rate adjusting method, device, equipment and storage medium

Frame rate adjusting method, device, equipment and storage medium

Info

Publication number
CN115550690A
Authority
CN
China
Prior art keywords
frame rate
code
rate
decoding
target
Prior art date
Legal status
Granted
Application number
CN202211536842.XA
Other languages
Chinese (zh)
Other versions
CN115550690B (en)
Inventor
曹健
曹洪彬
黄永铖
杨小祥
陈思佳
宋美佳
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202211536842.XA
Publication of CN115550690A
Application granted
Publication of CN115550690B
Legal status: Active
Anticipated expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234363Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the spatial resolution, e.g. for clients with a lower screen resolution
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234381Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the temporal resolution, e.g. decreasing the frame rate by frame skipping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N21/2387Stream processing in response to a playback request from an end-user, e.g. for trick-play
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server
    • H04N21/6587Control parameters, e.g. trick play commands, viewpoint selection
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks

Abstract

The embodiment of the invention provides a frame rate adjusting method, apparatus, device and storage medium. The frame rate adjusting method includes: acquiring a first network bandwidth when a target type application program starts video stream transmission; determining a target frame rate according to the first network bandwidth and the decoding rendering capability information of the terminal device when the target type application program is started, where the decoding rendering capability information includes the decoding rendering capability of the terminal device for the code streams of M code rate points, M being a positive integer; and sending the target frame rate to a server so that the server adjusts a current frame rate to the target frame rate, the current frame rate including an encoding frame rate, or an encoding frame rate and a data acquisition frame rate. Because the server encodes with an encoding frame rate adapted to the rendering capability of the terminal device, the decoding rendering capabilities of different types of terminal devices can be used reasonably and effectively, the playing picture of the terminal device is smoother, and the effective utilization of hardware resources and bandwidth resources on the server side can be improved.

Description

Frame rate adjusting method, device, equipment and storage medium
Technical Field
The embodiment of the invention relates to the field of internet technology, and in particular to a frame rate adjusting method, apparatus, device and storage medium.
Background
Cloud gaming is an online gaming technology based on cloud computing technology. With the development of cloud rendering and video encoding technologies, cloud games have become increasingly popular and have become an important game form. In a cloud game, the running, rendering and other game logic is executed on a cloud server; the rendered game picture is compressed by video coding, the coded code stream is transmitted to the terminal device over the network, and the terminal device decodes, renders and plays the code stream.
In the prior art, in order to accommodate low-end terminal devices, that is, to ensure that all terminal devices can decode and render the code stream smoothly, the cloud server encodes the game pictures of a game scene into a code stream at a uniform, low encoding frame rate and transmits the code stream to the terminal devices over the network.
However, with this approach the encoding frame rate is adapted to the decoding rendering capability of only some terminal devices, and the decoding rendering capability of terminal devices of other models cannot be used reasonably and effectively.
Disclosure of Invention
Embodiments of the present invention provide a frame rate adjustment method, apparatus, device, and storage medium, which can make reasonable and effective use of decoding and rendering capabilities of different types of terminal devices, and make playing pictures of the terminal devices smoother.
In a first aspect, a frame rate adjustment method is provided, including:
acquiring a first network bandwidth when a target type application program starts video stream transmission;
determining a target frame rate according to the first network bandwidth and decoding rendering capability information of terminal equipment when the target type application program is started, wherein the decoding rendering capability information comprises decoding rendering capabilities of the terminal equipment on code streams of M code rate points, and M is a positive integer;
and sending the target frame rate to a server so that the server adjusts the current frame rate to the target frame rate, wherein the current frame rate comprises an encoding frame rate or an encoding frame rate and a data acquisition frame rate.
In a second aspect, a frame rate adjustment method is provided, including:
receiving a target frame rate sent by a terminal device, wherein the target frame rate is determined by the terminal device according to a first network bandwidth and decoding rendering capability information of the terminal device when a target type application program is started, the first network bandwidth is the network bandwidth when the target type application program starts video stream transmission, the decoding rendering capability information comprises the decoding rendering capability of the terminal device on code streams of M code rate points, and M is a positive integer;
adjusting the current encoding frame rate to the target frame rate;
coding a video to be transmitted according to the target frame rate to obtain a coded code stream;
and sending the coded code stream to the terminal equipment so that the terminal equipment can decode, render and play the code stream.
In a third aspect, a frame rate adjustment apparatus is provided, including:
the acquisition module is used for acquiring a first network bandwidth when a target type application program starts video streaming transmission;
a determining module, configured to determine a target frame rate according to the first network bandwidth and decoding rendering capability information of a terminal device when the target type application program is started, where the decoding rendering capability information includes decoding rendering capabilities of the terminal device on code streams of M code rate points, and M is a positive integer;
and the sending module is used for sending the target frame rate to a server so as to enable the server to adjust the current coding frame rate to the target frame rate.
In a fourth aspect, there is provided a frame rate adjusting apparatus, comprising:
the video stream transmission method comprises the steps that a receiving module is used for receiving a target frame rate sent by a terminal device, wherein the target frame rate is determined by the terminal device according to a first network bandwidth and decoding rendering capacity information of the terminal device when a target type application program is started, the first network bandwidth is the network bandwidth when the target type application program starts video stream transmission, the decoding rendering capacity information comprises the decoding rendering capacity of the terminal device on code streams of M code rate points, and M is a positive integer;
the adjusting module is used for adjusting the current coding frame rate to the target frame rate;
the encoding module is used for encoding the video to be transmitted according to the target frame rate to obtain an encoded code stream;
and the sending module is used for sending the coded code stream to the terminal equipment so that the terminal equipment can decode, render and play the code stream.
In a fifth aspect, there is provided a frame rate adjustment apparatus, including: a processor and a memory, the memory being adapted to store a computer program, the processor being adapted to call and run the computer program stored in the memory to perform the method as in the first aspect or its embodiments or in the second aspect or its embodiments.
A sixth aspect provides a computer readable storage medium for storing a computer program for causing a computer to perform a method as in the first aspect or embodiments thereof or the second aspect or embodiments thereof.
In a seventh aspect, there is provided a computer program product comprising computer program instructions to cause a computer to perform a method as in the first aspect or embodiments thereof or the second aspect or embodiments thereof.
In summary, in the embodiment of the present invention, the terminal device obtains a first network bandwidth when a target type application starts video streaming, determines a target frame rate according to the first network bandwidth and the decoding rendering capability information of the terminal device when the target type application is started, and sends the target frame rate to the server; the server adjusts the current frame rate to the target frame rate, where the current frame rate includes an encoding frame rate, or an encoding frame rate and a data acquisition frame rate. In this way, the frame rate is adjusted through end-cloud cooperation. Because the current frame rate of the server is adjusted in real time according to the first network bandwidth when the target type application starts video streaming and the decoding rendering capability information of the terminal device when the target type application is started, the code stream encoded by the server is produced based on the rendering capability information of the terminal device, so the encoding frame rate is well adapted to the rendering capability of the terminal device. The server encodes at an encoding frame rate adapted to the rendering capability of the terminal device, which avoids transmitting a code stream whose encoding frame rate is too high or too low to the terminal device. Therefore the decoding rendering capabilities of different types of terminal devices can be used reasonably and effectively, the playing picture of the terminal device is smoother, and the effective utilization of hardware resources and bandwidth resources on the server side can be improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic diagram of a video image processing process according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a video image processing process according to an embodiment of the present invention;
fig. 3 is a schematic view of an application scenario of a frame rate adjustment method according to an embodiment of the present invention;
fig. 4 is an interaction flowchart of a frame rate adjustment method according to an embodiment of the present invention;
fig. 5 is an interaction flow diagram of a frame rate adjustment method according to an embodiment of the present invention;
fig. 6 is a flowchart of a method for obtaining decoding rendering capability information of a terminal device according to an embodiment of the present invention;
FIG. 7 is a flowchart of a method for determining a frame rate table according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a frame rate adjustment apparatus according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a frame rate adjustment apparatus according to an embodiment of the present invention;
fig. 10 is a schematic block diagram of a frame rate adjusting apparatus provided in an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Moreover, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or server that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Before the technical solution of the embodiment of the present invention is introduced, the following is to introduce the related knowledge of the embodiment of the present invention:
1. Cloud technology refers to a hosting technology that unifies hardware, software, network and other resources in a wide area network or a local area network to realize the computation, storage, processing and sharing of data. Cloud technology is a general term for the network technology, information technology, integration technology, management platform technology, application technology and the like applied in the cloud computing business model; it can form a resource pool that is used on demand and is flexible and convenient. Cloud computing technology will become an important support: the background services of technical network systems, such as video websites, picture websites and other portal websites, require a large amount of computing and storage resources. With the development of the internet industry, each item may come to have its own identification mark that needs to be transmitted to a background system for logical processing; data of different levels are processed separately, and all kinds of industry data need strong system background support, which can only be realized through cloud computing.
2. Cloud gaming, also known as gaming on demand, is an online gaming technology based on cloud computing technology. Cloud gaming technology enables light-end devices (thin clients) with relatively limited graphics processing and data computing capabilities to run high-quality games. In a cloud game scene, the game does not run in the player's game terminal but in a cloud server, and the cloud server renders the game scene into a video and audio stream which is transmitted to the player's game terminal through the network. The player's game terminal does not need strong graphics computation and data processing capabilities; it only needs basic streaming media playing capability and the capability of acquiring player input instructions and sending them to the cloud server. In short, cloud gaming is a game mode based on cloud computing: in the operating mode of cloud games, all games run in the cloud server, and the cloud server compresses the rendered game pictures and transmits them to the player's game terminal through the network.
3. Frame rate: a term from the image field referring to the number of frames of a moving picture or video transmitted or displayed per second. The unit of the frame rate is FPS (frames per second). The higher the frame rate, the smoother the picture.
4. Bit rate, also called code rate: the amount of video (or audio) data per unit time, in bps (bits per second), typically expressed in kbps (kilobits per second) or Mbps (megabits per second). The bit rate determines the degree of video compression applied by the encoder and is a key factor in determining the final quality and file size of the video.
5. Video coding, which is a method of converting a file in an original video format into a file in another video format by a compression technology, wherein the converted data can be called a code stream.
6. Video decoding, which is the reverse process of video encoding.
7. End-cloud cooperation: a method in which the server (cloud) and the user side (terminal) negotiate and work together to improve video quality/smoothness, for example by jointly formulating or adjusting video coding, video processing or network transmission strategies. End-cloud cooperation can be used in scenes such as cloud gaming, cloud rendering and real-time communication.
In the prior art, in order to ensure that all terminal devices can decode and render the code stream smoothly, the cloud server encodes the game pictures of a game scene into a code stream at a uniform, low encoding frame rate and transmits the code stream to the terminal devices over the network. However, with this approach the encoding frame rate is adapted to the decoding rendering capability of only some terminal devices, and the decoding rendering capability of terminal devices of other models cannot be used reasonably and effectively.
In order to solve this technical problem, in the embodiment of the present invention, the terminal device obtains a first network bandwidth when a target type application starts video streaming, determines a target frame rate according to the first network bandwidth and the decoding rendering capability information of the terminal device when the target type application is started, and sends the target frame rate to the server, so that the server adjusts the current frame rate to the target frame rate, where the current frame rate includes an encoding frame rate, or an encoding frame rate and a data acquisition frame rate. In this way, the frame rate is adjusted through end-cloud cooperation. Because the current frame rate of the server is adjusted in real time according to the first network bandwidth when the target type application starts video streaming and the decoding rendering capability information of the terminal device when the target type application is started, the code stream encoded by the server is produced based on the rendering capability information of the terminal device, so the encoding frame rate is well adapted to the rendering capability of the terminal device. The server encodes at an encoding frame rate adapted to the rendering capability of the terminal device, which avoids transmitting a code stream whose encoding frame rate is too high or too low to the terminal device. Therefore the decoding rendering capabilities of different types of terminal devices can be used reasonably and effectively, the playing picture of the terminal device is smoother, and the effective utilization of hardware resources and bandwidth resources on the server side can be improved.
It should be understood that the technical solution of the embodiment of the present invention can be applied to the following scenarios, but is not limited to:
currently, for some video or image processing procedures in cloud-based scenes, the following can be performed: fig. 1 is a schematic view of a video image processing process according to an embodiment of the present invention, and fig. 2 is a schematic view of a video image processing process according to an embodiment of the present invention. As shown in fig. 1, the cloud server generates a video, performs video image acquisition, processes the acquired video image, and encodes the processed video image to obtain a code stream of the video image, and further, the cloud server may send the code stream to the terminal device, and the terminal device decodes the code stream, and finally performs display of the video image according to a decoding result. Or, as shown in fig. 2, the cloud server generates a video, performs video image acquisition, and encodes the acquired video image to obtain a code stream of the video image, and further, the cloud server may send the code stream to the terminal device, and the terminal device decodes the code stream, processes the decoded video image, such as sharpening, blurring, and denoising, and finally displays the processed video image.
For example, fig. 3 is a schematic view of an application scenario of the frame rate adjustment method provided in the embodiment of the present invention, as shown in fig. 3, the terminal device 110 may communicate with the server 120, where the server 120 may be any independent physical server, or may also be a cloud server that provides basic cloud services such as cloud service, cloud database, cloud computing, cloud function, cloud storage, network service, cloud communication, middleware service, domain name service, security service, and big data and artificial intelligence platform, or may also be a server cluster or distributed system formed by a plurality of physical servers, which is not limited in this embodiment of the present invention. The server 120 may include an encoding module for image encoding, which may include hardware devices/software code that compresses the analog audio-video signals into encoded data (encoded files).
The terminal device 110 may be a device with basic streaming media playing capability, human-computer interaction capability, and communication capability, such as a smart phone, a tablet computer, a desktop computer, or a smart television. A game client is installed and runs in the terminal device 110, and the game client may be an application program running in the terminal device 110. Alternatively, an instant messaging client or an audio/video conferencing client is installed and runs in the terminal device 110. The terminal device 110 may include a decoding module for decoding images, and the decoding module may include hardware devices/software code for converting encoded data (or an encoded file) into an analog audio-video signal.
In a cloud game scenario, a cloud server refers to a server running a game in the cloud, and has functions of video enhancement (pre-coding), video coding, and the like, but is not limited thereto. The terminal equipment is equipment which has rich man-machine interaction modes, has the capability of accessing the internet, is usually provided with various operating systems and has stronger processing capability. The terminal device may be a smart phone, a living room television, a tablet computer, a vehicle-mounted terminal, a player game terminal, such as a handheld game console, but is not limited thereto.
In the running process of the cloud game, the server 120 may invoke the coding module to perform image compression coding operation on a game picture obtained by rendering according to a video coding standard (such as h.264, h.265, and the like) to save network bandwidth occupied during subsequent image transmission, then may transmit coded data (coded frame) obtained by image compression coding to the terminal device 110 in a code stream manner, and after the terminal device 110 receives the coded data, the coded data may be decoded by the decoding module to obtain an original video frame, and then the rendering module renders and displays the video frame, that is, displays the original game picture.
In some possible embodiments, the application scenario shown in fig. 3 may further include: a base station, a core network side device, and the like, and fig. 3 exemplarily shows one terminal device and one server, and may actually include other numbers of terminal devices and servers, which is not limited in this embodiment of the present invention.
Illustratively, the technical scheme of the embodiment of the invention can be applied to audio and video interactive scenes, such as cloud games, interactive live broadcasts, video conferences, video calls and other scenes, and can provide the optimal frame rate, namely smoother experience, for the user according to the decoding and rendering capabilities of the terminal equipment.
The technical scheme of the embodiment of the invention is explained in detail as follows:
fig. 4 is an interaction flowchart of a frame rate adjustment method according to an embodiment of the present invention, which may be executed by, for example, the terminal device 110 and the server 120 shown in fig. 3, but is not limited thereto, and as shown in fig. 4, the method includes the following steps:
s101, the terminal equipment obtains a first network bandwidth when the target type application program starts video streaming transmission.
Specifically, various applications are installed and run in the terminal device, the target type application refers to an application having a video streaming function, and the target type application may be, for example, a game type application, an instant messaging type application, and an application having a video call or a voice call function.
When a user starts a target type application program to perform video streaming transmission, a terminal device can detect that the target type application program starts video streaming transmission, and at this time, the terminal device can perform network speed measurement. In this embodiment, the specific network testing method is not limited.
S102, the terminal device determines a target frame rate according to the first network bandwidth and decoding rendering capacity information of the terminal device when the target type application program is started, wherein the decoding rendering capacity information comprises decoding rendering capacity of the terminal device to code streams of M code rate points, and M is a positive integer.
Specifically, starting the target type application is different from the target type application starting video stream transmission: in time, the target type application is started first, before video streaming begins. For example, an instant messaging application is started first, and a video call is started (i.e., video streaming begins) afterwards.
In this embodiment, the decoding rendering capability information of the terminal device is the decoding rendering capability information acquired when the target type application program is started, and the decoding rendering capability information includes the decoding rendering capability of the terminal device on the code stream of the M code rate points.
Optionally, as an implementable manner, S102 may specifically be:
s1021, the terminal device determines a frame rate table according to the decoding rendering capability information of the terminal device when the target type application program is started, wherein the frame rate table comprises frame rates corresponding to the M code rate points.
S1022, the terminal device determines the target frame rate according to the frame rate table and the first network bandwidth.
Optionally, in an implementable manner, S1021 may specifically be:
s10211, determining a decoding rendering mode of the terminal equipment, wherein the decoding rendering mode comprises a synchronous model and an asynchronous mode.
S10212, calculating frame rates corresponding to the M code rate points according to the decoding rendering mode and the decoding rendering capability information of the terminal equipment.
Optionally, the decoding rendering capability includes an average decoding time and an average rendering time, and in S10212, frame rates corresponding to the M code rate points are calculated according to a decoding rendering mode and decoding rendering capability information of the terminal device, which may specifically be:
if the decoding rendering mode of the terminal equipment is a synchronous mode, the frame rate corresponding to any target code rate point in the M code rate points is a quotient of N and a first numerical value, the first numerical value is the sum of the average decoding time of the target code rate point and the average rendering time of the target code rate point, and N is a preset value.
Wherein, N may be 1 second(s), and the frame rate corresponding to any one target code rate point in the M code rate points is a quotient of 1 divided by the first value.
If the decoding rendering mode of the terminal equipment is an asynchronous mode, the frame rate corresponding to any target code rate point in the M code rate points is the quotient of N and a second numerical value, and the second numerical value is the maximum value of the average decoding time of the target code rate point and the average rendering time of the target code rate point.
Wherein, N may be 1 second(s), and the frame rate corresponding to any one target code rate point in the M code rate points is a quotient of 1 divided by the second value.
S10213, obtaining a frame rate table according to the frame rates respectively corresponding to the M code rate points.
Optionally, in an implementable manner, the obtaining of the frame rate table according to the frame rates respectively corresponding to the M code rate points may specifically be:
and determining code rate intervals corresponding to the M code rate points according to the frame rates corresponding to the M code rate points, and obtaining a frame rate table according to the code rate intervals corresponding to the M code rate points and the frame rates corresponding to the M code rate points.
Illustratively, Table 1 below is an example of a frame rate table:
Table 1. Frame rate table
| Code rate point | Corresponding code rate interval | Frame rate |
| 5 Mbps          | 0 to 5 Mbps                      | fps_5      |
| 10 Mbps         | 5 to 10 Mbps                     | fps_10     |
| i Mbps          | (i-5) to i Mbps                  | fps_i      |
Optionally, in S1022, the target frame rate is determined according to the frame rate table and the first network bandwidth, which may specifically be:
and searching a code rate interval where the first network bandwidth is located from the frame rate table, and determining the frame rate corresponding to the code rate interval where the first network bandwidth is located as the target frame rate. For example, the first network bandwidth is 7.5 Mbps, the code rate interval in which the first network bandwidth is located is 5 to 10Mbps, and the corresponding frame rate is fps _10, that is, the target frame rate is fps _10 at this time.
S103, the terminal device sends the target frame rate to a server.
S104, the server adjusts the current frame rate to be the target frame rate, wherein the current frame rate comprises an encoding frame rate or an encoding frame rate and a data acquisition frame rate.
Specifically, the current frame rate includes an encoding frame rate or an encoding frame rate and a data acquisition frame rate, that is, the server may adjust the current encoding frame rate to the target frame rate, or may adjust both the current encoding frame rate and the data acquisition frame rate to the target frame rate.
And S105, the server encodes the video to be transmitted according to the target frame rate to obtain an encoded code stream.
Specifically, in this embodiment, the server encodes the video to be transmitted according to the target frame rate to obtain an encoded code stream, and the encoding mode is not limited.
And S106, the server sends the coded code stream to the terminal equipment.
S107, the terminal equipment decodes, renders and plays the received code stream.
Specifically, the server sends the encoded code stream to the terminal device, and after receiving the encoded code stream, the terminal device may decode the encoded code stream first, and render and play the decoded code stream.
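As a non-authoritative sketch of the server side of S104 to S106, the handling might look like the class below; the encoder, capturer and transport objects and their method names are placeholders assumed for illustration, not an actual API:

```python
class FrameRateAdjustingServer:
    """Schematic server-side flow for S104-S106; encoder, capturer and transport
    are placeholder collaborators, not a concrete API."""

    def __init__(self, encoder, capturer, transport, adjust_capture=False):
        self.encoder = encoder            # hypothetical video encoder wrapper
        self.capturer = capturer          # hypothetical frame-capture component
        self.transport = transport        # hypothetical network sender
        self.adjust_capture = adjust_capture

    def on_target_frame_rate(self, target_fps):
        # S104: adjust the current encoding frame rate (and, optionally, the data
        # acquisition frame rate) to the target frame rate received from the terminal.
        self.encoder.set_frame_rate(target_fps)
        if self.adjust_capture:
            self.capturer.set_frame_rate(target_fps)

    def stream_frame(self, raw_frame):
        # S105: encode the video to be transmitted at the target frame rate.
        encoded_chunk = self.encoder.encode(raw_frame)
        # S106: send the encoded code stream to the terminal device.
        self.transport.send(encoded_chunk)
```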
In this embodiment, the decoding rendering capability information of the terminal device may be obtained when the target type application is started. Since the user may operate other functions after the application is started, video streaming may not begin immediately, and the application may run in the background for a period of time after being started. Therefore, in this embodiment, the current decoding rendering capability information of the terminal device is obtained and cached when the target type application is started, and this cached information is used when the target type application starts video streaming. An implementation of obtaining the decoding rendering capability information of the terminal device when the target type application is started is described in detail below.
Optionally, in an implementable manner, before S101, the method of this embodiment may further include:
and S108, the terminal equipment acquires the second network bandwidth when the target type application program is started.
Specifically, when the user starts the target type application program, the terminal device may detect that the target type application program is started, and at this time, the terminal device may perform network speed measurement, and specifically, a second network bandwidth when the target type application program is started may be obtained by using a preset network speed measurement method, where the second network bandwidth may be a current maximum network bandwidth measured by the terminal device. In this embodiment, the specific network testing method is not limited.
And S109, the terminal equipment acquires the decoding rendering capability information of the terminal equipment according to the second network bandwidth.
Specifically, the decoded rendering capability information of the terminal device is determined according to the second network bandwidth of the terminal device. In an implementable manner, S109 may specifically include:
s1091, determining M code rate points according to the second network bandwidth.
S1092, obtaining the test code stream of each code rate point in the M code rate points.
S1093, respectively decoding and rendering the test code stream of each code rate point in the M code rate points to obtain the decoding and rendering capabilities of the terminal equipment on the code streams of the M code rate points, wherein the decoding and rendering capabilities comprise average decoding time and average rendering time.
As an implementable manner, the determining M code rate points according to the second network bandwidth in S1091 specifically may include:
s10911, rounding up the second network bandwidth to obtain a rounded value of the second network bandwidth.
For example, a second network bandwidth of 10.1 Mbps is rounded up to 11 Mbps.
S10912, quantizing the rounded value of the second network bandwidth according to the quantization step size, to obtain a quantized value of the second network bandwidth.
The quantization step may be, for example, 5Mbps, 7 Mbps, 10Mbps, or other values, which is not limited in this embodiment.
As an implementable manner, the rounded value of the second network bandwidth is quantized according to the quantization step size, and may be quantized by the following formula:
max_rate = A × (max_rate_int / A + (max_rate_int mod A > 0 ? 1 : 0))    (1)
wherein max_rate is the quantized value of the second network bandwidth, A is the quantization step, and max_rate_int is the value obtained by rounding the second network bandwidth up. In formula (1), "/" is an integer division operation; taking A = 5 as an example, 11/5 = 2, 14/5 = 2, 15/5 = 3. The term (max_rate_int mod A > 0 ? 1 : 0) means that the remainder of max_rate_int divided by A is calculated first; if the remainder is greater than 0 the result is 1, otherwise the result is 0.
Taking the quantization step as 5 as an example, the quantization value of the second network bandwidth is:
max_rate = 5 × (max_rate_int / 5 + (max_rate_int mod 5 > 0 ? 1 : 0))
according to the above formula, for example, the quantized value of the network bandwidth 10.1Mbps is 15, and the quantized value of the network bandwidth 17 Mbps is 20.
S10913, determining M code rate points according to the quantization value and the quantization step of the second network bandwidth.
Specifically, for example, if the quantized value of the second network bandwidth is 15 Mbps and the quantization step is 5 Mbps, three code rate points of 5 Mbps, 10 Mbps and 15 Mbps may be determined, that is, M = 3.
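A minimal sketch of S10911 to S10913, assuming a default quantization step of 5 Mbps; the function names are introduced here for illustration only:

```python
import math


def quantize_bandwidth(second_network_bandwidth_mbps, step_mbps=5):
    """Round the measured bandwidth up, then quantize the rounded value up to the
    next multiple of the quantization step (S10911-S10912)."""
    max_rate_int = math.ceil(second_network_bandwidth_mbps)   # e.g. 10.1 -> 11
    return step_mbps * (max_rate_int // step_mbps
                        + (1 if max_rate_int % step_mbps > 0 else 0))


def rate_points(quantized_bandwidth_mbps, step_mbps=5):
    """Derive the M code rate points from the quantized bandwidth (S10913)."""
    return list(range(step_mbps, quantized_bandwidth_mbps + 1, step_mbps))


# quantize_bandwidth(10.1) == 15, quantize_bandwidth(17) == 20,
# rate_points(15) == [5, 10, 15], i.e. M = 3.
```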
As an implementation manner, S1092 may specifically be:
determining a test code stream of each code rate point in the M code rate points according to the test file;
optionally, the test file may be pre-stored, for example, the test file may be a file pre-stored locally, for example, M =3,3 code rate points are 5Mbps, 10Mbps, and 15Mbps, the test code stream of each of the M code rate points is determined according to the test file, and the test code stream with a code rate of 5Mbps, the test code stream with a code rate of 10Mbps, and the test code stream with a code rate of 15Mbps may be obtained from the test file, respectively.
Alternatively, the test file may be received by the terminal device from the server when the target type application is started. Correspondingly, the method of the embodiment may further include: the server receives a test file acquisition request sent by the terminal equipment, the test file acquisition request is sent when a target type application program is started, and the server sends a test file to the terminal equipment. The mode does not occupy the storage space of the terminal equipment.
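If the test file is fetched from the server, the exchange at application start might look like the following sketch; the request/response shapes and channel objects are placeholders, not a protocol defined by the embodiment:

```python
def fetch_test_file(client_channel):
    """Terminal side: request the test file from the server when the target type
    application starts, so no local storage is occupied by pre-stored test content."""
    client_channel.send({"type": "test_file_request"})
    return client_channel.receive()      # server replies with the test file


def handle_test_file_request(request, test_file_store, server_channel):
    """Server side: on receiving the test file acquisition request sent at
    application start, return the stored test file to the terminal device."""
    if request.get("type") == "test_file_request":
        server_channel.send(test_file_store.load())
```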
According to the frame rate adjusting method provided by the embodiment of the invention, the terminal device obtains a first network bandwidth when a target type application starts video streaming, determines a target frame rate according to the first network bandwidth and the decoding rendering capability information of the terminal device when the target type application is started, and sends the target frame rate to the server; the server adjusts the current frame rate to the target frame rate, where the current frame rate includes an encoding frame rate, or an encoding frame rate and a data acquisition frame rate. In this way, the frame rate is adjusted through end-cloud cooperation. Because the current frame rate of the server is adjusted in real time according to the first network bandwidth when the target type application starts video streaming and the decoding rendering capability information of the terminal device when the target type application is started, the code stream encoded by the server is produced based on the rendering capability information of the terminal device, so the encoding frame rate is well adapted to the rendering capability of the terminal device. The server encodes at an encoding frame rate adapted to the rendering capability of the terminal device, which avoids transmitting a code stream whose encoding frame rate is too high or too low to the terminal device. Therefore the decoding rendering capabilities of different types of terminal devices can be used reasonably and effectively, the playing picture of the terminal device is smoother, and the effective utilization of hardware resources and bandwidth resources on the server side can be improved.
The following describes in detail a technical solution of the frame rate adjustment method according to an embodiment of the present invention with reference to a specific embodiment.
Fig. 5 is an interaction flow diagram of a frame rate adjustment method according to an embodiment of the present invention, and as shown in fig. 5, the method according to the embodiment may include:
s201, the terminal device obtains a second network bandwidth when the target type application program is started.
Specifically, when the user starts the target type application program, the terminal device may detect that the target type application program is started, and at this time, the terminal device may perform network speed measurement. In this embodiment, the specific network testing method is not limited.
S202, the terminal device rounds the second network bandwidth upwards to obtain a rounded value of the second network bandwidth, and the rounded value of the second network bandwidth is quantized according to the quantization step size to obtain a quantized value of the second network bandwidth.
Specifically, in this embodiment the quantization step is taken as 5. For example, a second network bandwidth of 10.1 Mbps is rounded up to 11 Mbps.
The rounded value of the second network bandwidth is quantized according to the quantization step size, and the quantization can be performed by the following formula:
max_rate = 5 × (max_rate_int / 5 + (max_rate_int mod 5 > 0 ? 1 : 0))
wherein max_rate is the quantized value of the second network bandwidth, 5 is the quantization step, and max_rate_int is the value obtained by rounding the second network bandwidth up. Here "/" is an integer division operation, which keeps only the integer part of the quotient, e.g., 11/5 = 2, 14/5 = 2, 15/5 = 3. The term (max_rate_int mod 5 > 0 ? 1 : 0) means that the remainder of max_rate_int divided by 5 is calculated first; if the remainder is greater than 0 the result is 1, otherwise the result is 0.
According to the above formula, for example, the quantized value of the network bandwidth 10.1Mbps is 15, and the quantized value of the network bandwidth 17 Mbps is 20.
Optionally, S201 to S202 may be specifically executed by a network speed measurement module in the terminal device, and an output of the network speed measurement module is a quantized value of the second network bandwidth.
S203, the terminal device determines M code rate points according to the quantization value and the quantization step of the second network bandwidth, and determines the test code stream of each code rate point in the M code rate points according to the test file.
Specifically, for example, if the quantized value of the second network bandwidth is 15 Mbps and the quantization step is 5 Mbps, three code rate points of 5 Mbps, 10 Mbps and 15 Mbps may be determined, that is, M = 3.
S204, the terminal device respectively decodes and renders the test code stream of each code rate point in the M code rate points to obtain decoding and rendering capability information of the terminal device when the target type application program is started, wherein the decoding and rendering capability information comprises the decoding and rendering capability of the terminal device on the code streams of the M code rate points, and the decoding and rendering capability comprises average decoding time and average rendering time.
Optionally, the above S203-S204 may be executed by a decoding rendering capability detection module of the terminal device, and exemplarily, a process of obtaining the decoding rendering capability information of the terminal device according to the quantized value of the second network bandwidth is shown in conjunction with fig. 6. Fig. 6 is a flowchart of a method for acquiring decoding rendering capability information of a terminal device according to an embodiment of the present invention, and as shown in fig. 6, an acquiring process of the decoding rendering capability information of the terminal device may include:
and S1, inputting a quantized value of the second network bandwidth.
And S2, enabling an integer variable i =5.
Here, 5 is the quantization step size in S203.
And S3, judging whether the variable i is smaller than or equal to the quantization value of the second network bandwidth.
If yes, executing S4, otherwise executing S6.
And S4, recording the average decoding time and the average rendering time corresponding to the i Mbps code rate point to obtain the decoding rendering capability of the code stream of the i Mbps code rate point.
Specifically, the units of the average decoding time and the average rendering time are both milliseconds. The code stream with the code rate of i Mbps can be determined according to the test file. The test file may be pre-stored or retrieved from a server via network transmission.
S5, i = i + 5, and proceeds to S3.
And S6, outputting the decoding rendering capability information of the terminal equipment, and ending.
Specifically, the decoding rendering capability of the terminal device on the code streams of the M code rate points can be obtained through the above S1 to S5, that is, the decoding rendering capability information of the terminal device when the target type application program is started is obtained, and M is determined according to the quantization value of the second network bandwidth.
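The loop of Fig. 6 might be sketched as below; `load_test_stream`, `decoder` and `renderer` stand for the terminal's actual components and are assumptions, not an API defined by the embodiment:

```python
import time


def detect_decode_render_capability(quantized_bandwidth_mbps, load_test_stream,
                                    decoder, renderer, step_mbps=5):
    """Sketch of the Fig. 6 loop: for each code rate point i (5, 10, ..., quantized
    bandwidth), decode and render the test code stream of that code rate and record
    the average decoding time and average rendering time in milliseconds."""
    capability = {}
    for i in range(step_mbps, quantized_bandwidth_mbps + 1, step_mbps):
        frames = load_test_stream(i)     # test code stream of the i Mbps rate point
        decode_ms, render_ms = [], []
        for frame in frames:
            t0 = time.perf_counter()
            picture = decoder.decode(frame)
            t1 = time.perf_counter()
            renderer.render(picture)
            t2 = time.perf_counter()
            decode_ms.append((t1 - t0) * 1000.0)
            render_ms.append((t2 - t1) * 1000.0)
        capability[i] = (sum(decode_ms) / len(decode_ms),
                         sum(render_ms) / len(render_ms))
    return capability   # {rate point: (average decoding time, average rendering time)}
```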
S205, the terminal device determines a frame rate table according to the decoding rendering capability information of the terminal device when the target type application program is started, wherein the frame rate table comprises frame rates corresponding to the M code rate points.
In an implementation manner, S205 may specifically be:
and S2051, determining a decoding rendering mode of the terminal equipment, wherein the decoding rendering mode comprises a synchronous model and an asynchronous mode.
And S20522, calculating frame rates respectively corresponding to the M code rate points according to the decoding rendering mode and the decoding rendering capability information of the terminal equipment.
Optionally, the decoding rendering capability includes an average decoding time and an average rendering time, and in S2052, frame rates respectively corresponding to the M code rate points are calculated according to the decoding rendering mode and the decoding rendering capability information of the terminal device, which may specifically be:
if the decoding rendering mode of the terminal equipment is a synchronous mode, the frame rate corresponding to any target code rate point in the M code rate points is a quotient of N and a first numerical value, the first numerical value is the sum of the average decoding time of the target code rate point and the average rendering time of the target code rate point, and N is a preset value.
Wherein, N may be 1 second(s), and the frame rate corresponding to any one target code rate point in the M code rate points is a quotient of 1 divided by the first value.
If the decoding rendering mode of the terminal equipment is an asynchronous mode, the frame rate corresponding to any target code rate point in the M code rate points is the quotient of N and a second numerical value, and the second numerical value is the maximum value of the average decoding time of the target code rate point and the average rendering time of the target code rate point.
Wherein, N may be 1 second(s), and the frame rate corresponding to any one target code rate point in the M code rate points is a quotient of 1 divided by the second value.
S2053, determining code rate intervals corresponding to the M code rate points according to the frame rates corresponding to the M code rate points, and obtaining a frame rate table according to the code rate intervals corresponding to the M code rate points and the frame rates corresponding to the M code rate points.
Optionally, the S205 may be executed by a frame rate calculation module of the terminal device. Illustratively, a frame rate table acquisition process is shown in conjunction with fig. 7. Fig. 7 is a flowchart of a method for determining a frame rate table according to an embodiment of the present invention, and as shown in fig. 7, a process of determining, by a terminal device, a frame rate table according to decoding rendering capability information of the terminal device when a target type application is started may include:
and S11, inputting a quantized value of the second network bandwidth.
S12, inputting the decoding rendering capability of the terminal equipment to the code stream of the M code rate points, namely the average decoding time and the average rendering time of each code rate point in the M code rate points.
S13, let integer variable i =5.
Here, 5 is the quantization step size in S203.
And S14, judging whether the variable i is smaller than or equal to the quantized value of the second network bandwidth.
If yes, executing S15, otherwise executing S16.
And S15, judging whether the decoding rendering mode of the terminal equipment is an asynchronous mode.
If so, go to S17, otherwise go to S18.
S17, determining that the frame rate of the i-code rate point is a quotient of 1000 and a second numerical value, wherein the second numerical value is the maximum value of the average decoding time of the i-code rate point and the average rendering time of the i-code rate point.
I.e. the frame rate fps _ i =1000 ms/max for the i-rate point (average decoding time for the i-rate point, average rendering time for the i-rate point).
S18, determining that the frame rate of the i-code rate point is a quotient of 1000 and a first numerical value, wherein the first numerical value is the sum of the average decoding time of the i-code rate point and the average rendering time of the i-code rate point.
Namely, the frame rate fps _ i of the code rate point i =1000 ms/(the average decoding time of the code rate point i + the average rendering time of the code rate point i).
S19, i = i + 5, and proceeds to S14.
S16, determining code rate intervals corresponding to the M code rate points according to the frame rates corresponding to the M code rate points, and obtaining a frame rate table according to the code rate intervals corresponding to the M code rate points and the frame rates corresponding to the M code rate points.
For example, the obtained frame rate table is as shown in Table 1 above.
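For illustration, the Fig. 7 procedure can be sketched as follows; this is a sketch under the assumption that `capability` maps each code rate point to its average decoding and rendering times in milliseconds, as produced by the Fig. 6 step:

```python
def build_frame_rate_table(quantized_bandwidth_mbps, capability, asynchronous,
                           step_mbps=5):
    """Sketch of Fig. 7: compute the frame rate of each code rate point from its
    average decoding/rendering times (in ms) and attach the corresponding code rate
    interval. `capability` maps rate point i -> (avg_decode_ms, avg_render_ms)."""
    table = []
    for i in range(step_mbps, quantized_bandwidth_mbps + 1, step_mbps):
        avg_decode_ms, avg_render_ms = capability[i]
        if asynchronous:
            # Asynchronous mode: decoding and rendering overlap, so the slower of
            # the two limits the achievable frame rate (S17).
            fps_i = 1000.0 / max(avg_decode_ms, avg_render_ms)
        else:
            # Synchronous mode: a frame is decoded and then rendered, so the two
            # times add up (S18).
            fps_i = 1000.0 / (avg_decode_ms + avg_render_ms)
        table.append({"rate_point_mbps": i,
                      "interval_mbps": (i - step_mbps, i),   # e.g. 10 -> (5, 10]
                      "fps": fps_i})
    return table
```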
S206, the terminal device obtains a first network bandwidth when the target type application program starts video stream transmission.
S207, the terminal device searches the frame rate table for the code rate interval in which the first network bandwidth falls, and determines the frame rate corresponding to that code rate interval as the target frame rate.
S208, the terminal device sends the target frame rate to the server.
S209, the server adjusts the current frame rate to a target frame rate, wherein the current frame rate comprises an encoding frame rate or an encoding frame rate and a data acquisition frame rate.
S210, the server encodes the video to be transmitted according to the target frame rate to obtain an encoded code stream.
S211, the server sends the encoded code stream to the terminal device.
S212, the terminal equipment decodes, renders and plays the received code stream.
Specifically, after receiving the encoded code stream from the server, the terminal device may first decode it and then render and play the decoded code stream.
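A minimal terminal-side sketch of S206 to S208, assuming the table produced by the build_frame_rate_table sketch above; how the first network bandwidth is measured and how the value is signalled to the server are outside the scope of this sketch, select_target_frame_rate is a hypothetical name, and the clamping fallback is an assumption not stated in the embodiment.

def select_target_frame_rate(frame_rate_table, first_bandwidth):
    """frame_rate_table: list of ((low, high), fps); intervals are treated as (low, high]."""
    for (low, high), fps in frame_rate_table:
        if low < first_bandwidth <= high:          # S207: interval containing the first bandwidth
            return fps
    # Assumed fallback: clamp to the nearest interval when the bandwidth lies outside the table
    if first_bandwidth <= frame_rate_table[0][0][0]:
        return frame_rate_table[0][1]
    return frame_rate_table[-1][1]

# S208: the terminal device would then report the selected value to the server, e.g.
# send_to_server(select_target_frame_rate(table, first_bandwidth)), where send_to_server
# stands in for whatever signalling channel the application uses.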
According to the frame rate adjustment method provided by the embodiment of the present invention, the frame rate is adjusted through cooperation between the terminal and the cloud. Because the current frame rate of the server is adjusted in real time according to the first network bandwidth when the target type application program starts video streaming and according to the decoding rendering capability information of the terminal device when the target type application program is started, the code stream encoded by the server is produced with the decoding rendering capability of the terminal device taken into account, so the encoding frame rate can better match that capability. Since the server encodes with an encoding frame rate adapted to the decoding rendering capability of the terminal device, it avoids transmitting a code stream whose encoding frame rate is too high or too low. As a result, the decoding rendering capabilities of different types of terminal devices can be utilized reasonably and effectively, the picture played on the terminal device is smoother, and the effective utilization of hardware resources and bandwidth resources on the server side can be improved.
Fig. 8 is a schematic structural diagram of a frame rate adjustment apparatus according to an embodiment of the present invention. As shown in fig. 8, the frame rate adjustment apparatus may include: an acquiring module 11, a determining module 12 and a sending module 13.
The acquiring module 11 is configured to acquire a first network bandwidth when a target type application starts video streaming transmission;
the determining module 12 is configured to determine a target frame rate according to the first network bandwidth and decoding rendering capability information of the terminal device when the target type application program is started, where the decoding rendering capability information includes decoding rendering capabilities of the terminal device on code streams of M code rate points, and M is a positive integer;
the sending module 13 is configured to send the target frame rate to the server, so that the server adjusts the current frame rate to the target frame rate, where the current frame rate includes an encoding frame rate or an encoding frame rate and a data acquisition frame rate.
In one embodiment, the determination module 12 is configured to:
determining a frame rate table according to the decoding rendering capability information, wherein the frame rate table comprises frame rates corresponding to M code rate points respectively;
and determining the target frame rate according to the frame rate table and the first network bandwidth.
In an embodiment, the obtaining module 11 is further configured to:
acquiring a second network bandwidth when the target type application program is started;
and acquiring the decoding rendering capability information of the terminal equipment according to the second network bandwidth.
In an embodiment, the obtaining module 11 is specifically configured to:
determining M code rate points according to the second network bandwidth;
obtaining a test code stream of each code rate point in the M code rate points;
and respectively decoding and rendering the test code stream of each code rate point in the M code rate points to obtain the decoding and rendering capabilities of the terminal equipment on the code streams of the M code rate points, wherein the decoding and rendering capabilities comprise average decoding time and average rendering time.
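As a sketch only, the per-rate-point measurement just described could look like the following in Python; decode_frame and render_frame stand in for the platform decoder and renderer (they are not real APIs), and splitting the timing per frame in this way is an assumption about how the averages are obtained.

import time

def measure_capability(test_frames, decode_frame, render_frame):
    """Return (average decoding time, average rendering time) in milliseconds for one code rate point."""
    total_decode_ms = 0.0
    total_render_ms = 0.0
    for encoded_frame in test_frames:
        t0 = time.perf_counter()
        picture = decode_frame(encoded_frame)   # hypothetical decoder call
        t1 = time.perf_counter()
        render_frame(picture)                   # hypothetical renderer call
        t2 = time.perf_counter()
        total_decode_ms += (t1 - t0) * 1000.0
        total_render_ms += (t2 - t1) * 1000.0
    n = len(test_frames)
    return total_decode_ms / n, total_render_ms / n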
In an embodiment, the obtaining module 11 is specifically configured to:
rounding up the second network bandwidth to obtain a rounded value of the second network bandwidth;
quantizing the value of the second network bandwidth after being rounded according to the quantization step length to obtain a quantized value of the second network bandwidth;
and determining M code rate points according to the quantization value and the quantization step of the second network bandwidth.
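One possible reading of this step, sketched in Python: rounding the quantized value down to a multiple of the step is an assumption, since the embodiment only specifies rounding up the second network bandwidth and then quantizing it with the quantization step; rate_points_from_bandwidth is a hypothetical name.

import math

QUANT_STEP = 5  # assumed quantization step, as in the earlier sketch

def rate_points_from_bandwidth(second_bandwidth):
    rounded = math.ceil(second_bandwidth)                # round the second network bandwidth up
    quantized = (rounded // QUANT_STEP) * QUANT_STEP     # quantize with the step (assumed: round down)
    # The M code rate points are taken as the multiples of the step up to the quantized value
    return list(range(QUANT_STEP, quantized + 1, QUANT_STEP))

For example, a second network bandwidth of 17.3 gives a rounded value of 18, a quantized value of 15, and code rate points 5, 10 and 15 (M = 3).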
In an embodiment, the obtaining module 11 is specifically configured to:
determining a test code stream of each code rate point in the M code rate points according to the test file;
the test file is pre-stored, or the terminal device receives the test file from the server when the target type application program is started.
In one embodiment, the determination module 12 is configured to:
determining a decoding rendering mode of the terminal equipment, wherein the decoding rendering mode comprises a synchronous mode and an asynchronous mode;
calculating frame rates respectively corresponding to the M code rate points according to the decoding rendering mode and the decoding rendering capability information of the terminal equipment;
and obtaining a frame rate table according to the frame rates respectively corresponding to the M code rate points.
In an embodiment, the decoding rendering capability includes an average decoding time and an average rendering time, and the determining module 12 is specifically configured to:
if the decoding rendering mode of the terminal equipment is a synchronous mode, the frame rate corresponding to any target code rate point in the M code rate points is a quotient of N and a first numerical value, the first numerical value is the sum of the average decoding time of the target code rate point and the average rendering time of the target code rate point, and N is a preset value;
if the decoding rendering mode of the terminal equipment is an asynchronous mode, the frame rate corresponding to any target code rate point in the M code rate points is the quotient of N and a second numerical value, and the second numerical value is the maximum value of the average decoding time of the target code rate point and the average rendering time of the target code rate point.
In an embodiment, the determining module 12 is specifically configured to:
determining code rate intervals respectively corresponding to the M code rate points according to the frame rates respectively corresponding to the M code rate points;
and obtaining a frame rate table according to the code rate intervals respectively corresponding to the M code rate points and the frame rates respectively corresponding to the M code rate points.
In an embodiment, the determining module 12 is specifically configured to:
and searching the code rate interval where the first network bandwidth is located from the frame rate table, and determining the frame rate corresponding to the code rate interval where the first network bandwidth is located as the target frame rate.
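Purely for illustration, the three modules above could be composed as a small client object. The sketch reuses the rate_points_from_bandwidth, measure_capability, build_frame_rate_table and select_target_frame_rate helpers sketched earlier; every name here is hypothetical and the split of responsibilities is one possible mapping onto the acquiring, determining and sending modules.

class FrameRateAdjusterClient:
    """Hypothetical terminal-side composition of the acquiring, determining and sending modules."""

    def __init__(self, send_to_server):
        self.send_to_server = send_to_server     # stands in for the sending module 13
        self.frame_rate_table = None

    def on_app_start(self, second_bandwidth, stats, async_mode):
        # acquiring module 11 + determining module 12: build the frame rate table at start-up;
        # stats holds the per-point averages, e.g. as produced by measure_capability
        quantized = rate_points_from_bandwidth(second_bandwidth)[-1]
        self.frame_rate_table = build_frame_rate_table(quantized, stats, async_mode)

    def on_stream_start(self, first_bandwidth):
        # determining module 12 + sending module 13: pick and report the target frame rate
        target_fps = select_target_frame_rate(self.frame_rate_table, first_bandwidth)
        self.send_to_server(target_fps)
        return target_fps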
It is to be understood that the apparatus embodiments and the method embodiments may correspond to each other, and similar descriptions may refer to the method embodiments. To avoid repetition, details are not repeated here. Specifically, the frame rate adjustment apparatus shown in fig. 8 may execute the method embodiment corresponding to fig. 4, and the foregoing and other operations and/or functions of each module in the frame rate adjustment apparatus are respectively for implementing the corresponding flow in the method embodiment corresponding to fig. 4; for brevity, they are not described here again.
Fig. 9 is a schematic structural diagram of a frame rate adjusting apparatus according to an embodiment of the present invention, and as shown in fig. 9, the frame rate adjusting apparatus may include: a receiving module 21, an adjusting module 22, an encoding module 23 and a transmitting module 24.
The receiving module 21 is configured to receive a target frame rate sent by a terminal device, where the target frame rate is determined by the terminal device according to a first network bandwidth and decoding rendering capability information of the terminal device when a target type application program is started, the first network bandwidth is a network bandwidth when the target type application program starts video streaming transmission, the decoding rendering capability information includes decoding rendering capabilities of the terminal device on code streams of M code rate points, and M is a positive integer;
the adjusting module 22 is configured to adjust a current frame rate to a target frame rate, where the current frame rate includes an encoding frame rate or an encoding frame rate and a data acquisition frame rate;
the encoding module 23 is configured to encode a video to be transmitted according to the target frame rate to obtain an encoded code stream;
the sending module 24 is configured to send the encoded code stream to the terminal device, so that the terminal device performs decoding rendering and playing on the code stream.
In an embodiment, the receiving module 21 is further configured to receive a test file acquisition request sent by the terminal device, where the test file acquisition request is sent when the target type application program is started;
the sending module 24 is further configured to send the test file to the terminal device.
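A compact server-side counterpart, again illustrative only: encode_video and send_stream stand in for the real encoder and transport (they are not actual APIs), adjust_capture_rate is a hypothetical hook for the data acquisition frame rate, and the class simply mirrors the four modules described above.

class FrameRateAdjusterServer:
    """Hypothetical composition of the receiving, adjusting, encoding and sending modules."""

    def __init__(self, encode_video, send_stream, adjust_capture_rate=None):
        self.encode_video = encode_video                 # stands in for the encoding module 23
        self.send_stream = send_stream                   # stands in for the sending module 24
        self.adjust_capture_rate = adjust_capture_rate   # optional data acquisition frame rate hook
        self.encoding_fps = None

    def on_target_frame_rate(self, target_fps):
        # receiving module 21 + adjusting module 22: adopt the frame rate reported by the terminal
        self.encoding_fps = target_fps
        if self.adjust_capture_rate is not None:
            self.adjust_capture_rate(target_fps)         # also adjust the data acquisition frame rate

    def push_video(self, frames):
        # encoding module 23 + sending module 24: encode at the target frame rate and transmit
        bitstream = self.encode_video(frames, fps=self.encoding_fps)
        self.send_stream(bitstream)                      # the terminal decodes, renders and plays it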
It is to be understood that apparatus embodiments and method embodiments may correspond to one another and that similar descriptions may refer to method embodiments. To avoid repetition, the description is omitted here.
The frame rate adjusting apparatus according to the embodiments of the present invention is described above from the perspective of functional modules with reference to the drawings. It should be understood that the functional modules may be implemented by hardware, by instructions in software, or by a combination of hardware and software modules. Specifically, the steps of the method embodiment in the embodiment of the present invention may be implemented by integrated logic circuits of hardware in a processor and/or instructions in the form of software, and the steps of the method disclosed in connection with the embodiment of the present invention may be directly implemented by a hardware encoding processor, or implemented by a combination of hardware and software modules in the encoding processor. Alternatively, the software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, electrically erasable programmable memory, registers, or other storage medium known in the art. The storage medium is located in a memory, and a processor reads information in the memory and combines hardware thereof to complete steps of the above method embodiments.
Fig. 10 is a schematic block diagram of a frame rate adjusting apparatus provided in an embodiment of the present invention. The frame rate adjusting device may be a server or a terminal device in the above method embodiments.
As shown in fig. 10, the frame rate adjusting apparatus may include:
a memory 210 and a processor 220, the memory 210 being configured to store a computer program and to transfer the program code to the processor 220. In other words, the processor 220 may call and run a computer program from the memory 210 to implement the method in the embodiment of the present invention.
For example, the processor 220 may be configured to perform the above-described method embodiments according to instructions in the computer program.
In some of the embodiments of the present invention, the processor 220 may include, but is not limited to:
general purpose processors, digital Signal Processors (DSPs), application Specific Integrated Circuits (ASICs), field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, and the like.
In some embodiments of the present invention, the memory 210 includes, but is not limited to:
volatile memory and/or non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which serves as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DR RAM).
In some embodiments of the present invention, the computer program may be divided into one or more modules, and the one or more modules are stored in the memory 210 and executed by the processor 220 to perform the methods provided by the embodiments of the present invention. The one or more modules may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments are used for describing the execution process of the computer program in the frame rate adjustment device.
As shown in fig. 10, the frame rate adjusting apparatus may further include:
a transceiver 230, the transceiver 230 being connectable to the processor 220 or the memory 210.
The processor 220 may control the transceiver 230 to communicate with other devices, and specifically, may transmit information or data to the other devices or receive information or data transmitted by the other devices. The transceiver 230 may include a transmitter and a receiver. The transceiver 230 may further include one or more antennas.
It should be understood that the various components in the frame rate adjustment device are connected by a bus system, wherein the bus system includes a power bus, a control bus, and a status signal bus in addition to a data bus.
Embodiments of the present invention also provide a computer storage medium having a computer program stored thereon, where the computer program, when executed by a computer, enables the computer to execute the method of the above method embodiments. Alternatively, the embodiment of the present invention further provides a computer program product containing instructions, which when executed by a computer, cause the computer to execute the method of the above method embodiment.
When implemented in software, the above embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions described in accordance with the embodiments of the present invention are generated in whole or in part when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a Digital Video Disc (DVD)), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present embodiments.
In the several embodiments provided in the embodiments of the present invention, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the module is only one logical functional division, and other divisions may be realized in practice, for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
Modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. For example, functional modules in the embodiments of the present invention may be integrated into one processing module, or each module may exist alone physically, or two or more modules are integrated into one module.
The above description is only a specific implementation of the embodiments of the present invention, but the scope of the embodiments of the present invention is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the embodiments of the present invention, and all such changes or substitutions should be covered by the scope of the embodiments of the present invention. Therefore, the protection scope of the embodiments of the present invention shall be subject to the protection scope of the claims.

Claims (16)

1. A method for frame rate adjustment, comprising:
acquiring a first network bandwidth when a target type application program starts video stream transmission;
determining a target frame rate according to the first network bandwidth and decoding rendering capability information of terminal equipment when the target type application program is started, wherein the decoding rendering capability information comprises the decoding rendering capability of the terminal equipment on code streams of M code rate points, and M is a positive integer;
and sending the target frame rate to a server so that the server adjusts the current frame rate to the target frame rate, wherein the current frame rate comprises an encoding frame rate or an encoding frame rate and a data acquisition frame rate.
2. The frame rate adjustment method according to claim 1, wherein the determining the target frame rate according to the first network bandwidth and the decoding rendering capability information of the terminal device when the target type application is started comprises:
determining a frame rate table according to the decoding rendering capability information, wherein the frame rate table comprises frame rates respectively corresponding to the M code rate points;
and determining the target frame rate according to the frame rate table and the first network bandwidth.
3. The method of claim 1 or 2, wherein the method further comprises:
acquiring a second network bandwidth when the target type application program is started;
and acquiring the decoding rendering capability information of the terminal equipment according to the second network bandwidth.
4. The frame rate adjustment method according to claim 3, wherein the obtaining the decoding rendering capability information of the terminal device according to the second network bandwidth comprises:
determining the M code rate points according to the second network bandwidth;
acquiring a test code stream of each code rate point in the M code rate points;
and respectively decoding and rendering the test code stream of each code rate point in the M code rate points to obtain the decoding and rendering capabilities of the terminal equipment on the code streams of the M code rate points, wherein the decoding and rendering capabilities comprise average decoding time and average rendering time.
5. The method of claim 4, wherein the determining the M rate points according to the second network bandwidth comprises:
rounding up the second network bandwidth to obtain a rounded value of the second network bandwidth;
quantizing the value of the second network bandwidth after being rounded according to the quantization step length to obtain a quantized value of the second network bandwidth;
and determining the M code rate points according to the quantization value of the second network bandwidth and the quantization step size.
6. The method of claim 4, wherein the obtaining the test code stream of each of the M code rate points comprises:
determining a test code stream of each code rate point in the M code rate points according to the test file;
the test file is pre-stored, or the test file is received by the terminal device from the server when the target type application program is started.
7. The method according to claim 2, wherein the determining a frame rate table according to the decoding rendering capability information comprises:
determining a decoding rendering mode of the terminal equipment, wherein the decoding rendering mode comprises a synchronous mode and an asynchronous mode;
calculating frame rates respectively corresponding to the M code rate points according to a decoding rendering mode of the terminal equipment and the decoding rendering capability information;
and obtaining the frame rate table according to the frame rates respectively corresponding to the M code rate points.
8. The method according to claim 7, wherein the decoding rendering capability includes an average decoding time and an average rendering time, and the calculating, according to the decoding rendering mode of the terminal device and the decoding rendering capability information, frame rates corresponding to the M rate points respectively includes:
if the decoding rendering mode of the terminal equipment is a synchronous mode, a frame rate corresponding to any target code rate point in the M code rate points is a quotient of N and a first numerical value, the first numerical value is the sum of the average decoding time of the target code rate point and the average rendering time of the target code rate point, and N is a preset value;
if the decoding rendering mode of the terminal device is an asynchronous mode, the frame rate corresponding to any one target code rate point in the M code rate points is the quotient of the N and a second numerical value, and the second numerical value is the maximum value between the average decoding time of the target code rate point and the average rendering time of the target code rate point.
9. The method as claimed in claim 7, wherein the obtaining the frame rate table according to the frame rates corresponding to the M code rate points comprises:
determining code rate intervals corresponding to the M code rate points respectively according to the frame rates corresponding to the M code rate points respectively;
and obtaining the frame rate table according to the M code rate points, the code rate intervals respectively corresponding to the M code rate points and the frame rates respectively corresponding to the M code rate points.
10. The method of claim 2, wherein the determining the target frame rate according to the frame rate table and the first network bandwidth comprises:
and searching a code rate interval where the first network bandwidth is located from the frame rate table, and determining the frame rate corresponding to the code rate interval where the first network bandwidth is located as the target frame rate.
11. A method for frame rate adjustment, comprising:
receiving a target frame rate sent by a terminal device, wherein the target frame rate is determined by the terminal device according to a first network bandwidth and decoding rendering capability information of the terminal device when a target type application program is started, the first network bandwidth is the network bandwidth when the target type application program starts video stream transmission, the decoding rendering capability information comprises the decoding rendering capability of the terminal device on code streams of M code rate points, and M is a positive integer;
adjusting the current frame rate to the target frame rate, wherein the current frame rate comprises an encoding frame rate or an encoding frame rate and a data acquisition frame rate;
coding a video to be transmitted according to the target frame rate to obtain a coded code stream;
and sending the coded code stream to the terminal equipment so that the terminal equipment can decode, render and play the code stream.
12. The method of claim 11, wherein the method further comprises:
receiving a test file acquisition request sent by the terminal equipment, wherein the test file acquisition request is sent when the target type application program is started;
and sending the test file to the terminal equipment.
13. A frame rate adjustment apparatus, comprising:
the acquisition module is used for acquiring a first network bandwidth when a target type application program starts video streaming transmission;
a determining module, configured to determine a target frame rate according to the first network bandwidth and decoding rendering capability information of a terminal device when the target type application program is started, where the decoding rendering capability information includes decoding rendering capabilities of the terminal device on code streams of M code rate points, and M is a positive integer;
and the sending module is used for sending the target frame rate to a server so as to enable the server to adjust the current frame rate to the target frame rate, wherein the current frame rate comprises an encoding frame rate or an encoding frame rate and a data acquisition frame rate.
14. A frame rate adjustment apparatus, comprising:
the video coding method comprises the steps that a receiving module is used for receiving a target frame rate sent by terminal equipment, the target frame rate is determined by the terminal equipment according to a first network bandwidth and decoding rendering capacity information of the terminal equipment when a target type application program starts, the first network bandwidth is the network bandwidth when the target type application program starts video streaming transmission, the decoding rendering capacity information comprises the decoding rendering capacity of the terminal equipment on code streams of M code rate points, and M is a positive integer;
the adjusting module is used for adjusting the current frame rate to the target frame rate, wherein the current frame rate comprises a coding frame rate or a coding frame rate and a data acquisition frame rate;
the encoding module is used for encoding the video to be transmitted according to the target frame rate to obtain an encoded code stream;
and the sending module is used for sending the coded code stream to the terminal equipment so that the terminal equipment can decode, render and play the code stream.
15. A frame rate adjustment device, comprising:
a processor and a memory, the memory for storing a computer program, the processor for invoking and executing the computer program stored in the memory to perform the method of any of claims 1-10 or 11-12.
16. A computer-readable storage medium for storing a computer program which causes a computer to perform the method of any one of claims 1-10 or 11-12.
CN202211536842.XA 2022-12-02 2022-12-02 Frame rate adjusting method, device, equipment and storage medium Active CN115550690B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211536842.XA CN115550690B (en) 2022-12-02 2022-12-02 Frame rate adjusting method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN115550690A true CN115550690A (en) 2022-12-30
CN115550690B CN115550690B (en) 2023-04-14

Family

ID=84722360

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211536842.XA Active CN115550690B (en) 2022-12-02 2022-12-02 Frame rate adjusting method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115550690B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007088539A (en) * 2005-09-20 2007-04-05 Mitsubishi Electric Corp Video stream supply system and apparatus, and video stream receiving apparatus
JP2012029219A (en) * 2010-07-27 2012-02-09 Kddi Corp Data transmitter, program and method for completing data transmission before time limit
WO2015101092A1 (en) * 2013-12-31 2015-07-09 华为技术有限公司 Transmission mechanism adjusting method, server and client
CN105471865A (en) * 2015-11-23 2016-04-06 苏州工业园区云视信息技术有限公司 Method for dynamic network state adaptation of video stream
CN106657143A (en) * 2017-01-20 2017-05-10 中兴通讯股份有限公司 Streaming media transmission method and device, server and terminal
CN111882626A (en) * 2020-08-06 2020-11-03 腾讯科技(深圳)有限公司 Image processing method, apparatus, server and medium
CN111901635A (en) * 2020-06-17 2020-11-06 北京视博云信息技术有限公司 Video processing method, device, storage medium and equipment
CN112104879A (en) * 2020-11-13 2020-12-18 腾讯科技(深圳)有限公司 Video coding method and device, electronic equipment and storage medium
CN113347488A (en) * 2021-08-04 2021-09-03 腾讯科技(深圳)有限公司 Video rendering method, device, equipment and storage medium
CN113542795A (en) * 2020-04-21 2021-10-22 腾讯科技(深圳)有限公司 Video processing method and device, electronic equipment and computer readable storage medium
CN115379235A (en) * 2022-08-26 2022-11-22 腾讯科技(深圳)有限公司 Image decoding method and device based on buffer pool, readable medium and electronic equipment

Also Published As

Publication number Publication date
CN115550690B (en) 2023-04-14

Similar Documents

Publication Publication Date Title
CN114501062B (en) Video rendering coordination method, device, equipment and storage medium
CN111246209B (en) Adaptive encoding method, apparatus, electronic device, and computer storage medium
CN100366088C (en) System for mobile terminal receiving multimedia content, method and apparatus thereof
WO2021057697A1 (en) Video encoding and decoding methods and apparatuses, storage medium, and electronic device
CN113965751A (en) Screen content coding method, device, equipment and storage medium
CN114845106A (en) Video coding method, video coding device, storage medium and electronic equipment
CN116567228A (en) Encoding method, real-time communication method, apparatus, device and storage medium
CN116567346A (en) Video processing method, device, storage medium and computer equipment
US20240098316A1 (en) Video encoding method and apparatus, real-time communication method and apparatus, device, and storage medium
WO2021057686A1 (en) Video decoding method and apparatus, video encoding method and apparatus, storage medium and electronic device
WO2021057684A1 (en) Video decoding method and apparatus, video encoding method and apparatus, storage medium, and electronic apparatus
CN115550690B (en) Frame rate adjusting method, device, equipment and storage medium
US20220239920A1 (en) Video processing method, related apparatus, storage medium, and program product
CN114205359A (en) Video rendering coordination method, device and equipment
CN110545431B (en) Video decoding method and device, video encoding method and device
CN116567242A (en) Image processing method, device and equipment
CN116567297A (en) Frame rate adjustment method, device, equipment and storage medium
CN110582022A (en) Video encoding and decoding method and device and storage medium
JP2002223332A (en) System and method for image processing, and program
CN116708793B (en) Video transmission method, device, equipment and storage medium
CN113473180B (en) Wireless-based Cloud XR data transmission method and device, storage medium and electronic device
CN110677721B (en) Video encoding and decoding method and device and storage medium
CN114374841A (en) Optimization method and device for video coding rate control and electronic equipment
CN116567320A (en) Video processing coordination method, device, equipment and storage medium
CN117336531A (en) Video picture rendering method, device, terminal, storage medium and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant