CN111901635A - Video processing method, device, storage medium and equipment - Google Patents


Info

Publication number
CN111901635A
Authority
CN
China
Prior art keywords
parameter
rendering
terminal
transmission state
network transmission
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010555675.8A
Other languages
Chinese (zh)
Inventor
牛长锋 (Niu Changfeng)
王安琪 (Wang Anqi)
Current Assignee
Beijing Shiboyun Information Technology Co., Ltd.
Original Assignee
Beijing Shiboyun Information Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Beijing Shiboyun Information Technology Co., Ltd.
Priority to CN202010555675.8A
Publication of CN111901635A
Legal status: Pending

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 — Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/2405 — Monitoring of the internal components or processes of the server, e.g. server load
    • H04N21/2402 — Monitoring of the downstream path of the transmission network, e.g. bandwidth available
    • H04N21/234363 — Reformatting of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the spatial resolution, e.g. for clients with a lower screen resolution
    • H04N21/26216 — Content or additional data distribution scheduling performed under constraints involving the channel capacity, e.g. network bandwidth
    • H04N21/2662 — Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities

Abstract

The disclosure provides a video processing method, apparatus, storage medium, and device. The method comprises: monitoring the network transmission state of a terminal; determining a first rendering parameter and a first encoding parameter that correspond to the terminal and are adapted to the network transmission state; performing the picture-rendering operation of the application corresponding to the terminal according to the first rendering parameter, and encoding the rendered picture according to the first encoding parameter; and transmitting the encoded video data to the terminal over a network. The rendering parameters and encoding parameters are thus adjusted as the monitored network transmission state of the terminal changes, so that the resulting video data is adapted to the terminal's current network transmission state, network transmission delay is reduced, and user experience is improved.

Description

Video processing method, device, storage medium and equipment
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a video processing method, apparatus, storage medium, and device.
Background
Based on cloud streaming technology, applications such as cloud games and cloud VR (Virtual Reality) can run on a cloud rendering server. While an application runs, the cloud rendering server renders the application scene data it generates to obtain the application picture, collects the audio data it generates, encodes the picture and audio, and transmits them over a network to terminals such as televisions, mobile phones, PCs (personal computers), and VR headsets, which decode and present them. In this cloud-streaming mode of running applications, the terminal need not download and install the application or update an application client, greatly reducing the dependence of the running application on the terminal's hardware.
In the related art, while the cloud rendering server transmits a video stream to a terminal, a drop in the terminal's network bandwidth lengthens the stream's network transmission delay; the terminal then takes longer to receive the stream, so the picture it plays stutters or its responses to user operations lag, affecting the user experience.
Disclosure of Invention
In view of this, the present disclosure provides a video processing method, apparatus, storage medium, and device to reduce video transmission delay when the network transmission state of a terminal fluctuates, so that user experience is not affected.
Specifically, the present disclosure is realized by the following technical solutions:
in a first aspect, an embodiment of the present disclosure provides a video processing method, where the method is applied to a cloud rendering server, and the method includes:
monitoring the network transmission state of the terminal;
determining a first rendering parameter and a first coding parameter which are corresponding to the terminal and are adaptive to the network transmission state;
executing picture rendering of the terminal application according to the determined first rendering parameter which is adaptive to the network transmission state, and executing coding operation on a picture obtained through rendering according to the first coding parameter;
and transmitting the video data obtained by coding to the terminal through a network.
In a second aspect, an embodiment of the present disclosure provides a video processing apparatus, where the apparatus is applied to a cloud rendering server, and the apparatus includes:
the monitoring module is used for monitoring the network transmission state of the terminal;
the determining module is used for determining a first rendering parameter and a first coding parameter which are corresponding to the terminal and are adaptive to the network transmission state;
the execution module is used for executing the picture rendering of the terminal application according to the determined first rendering parameter which is adaptive to the network transmission state, and executing the coding operation on the picture obtained by rendering according to the first coding parameter;
and the transmission module is used for transmitting the video data obtained by encoding to the terminal through a network.
In a third aspect, the disclosed embodiments provide a machine-readable storage medium having stored thereon several computer instructions, which when executed implement the method according to the first aspect.
In a fourth aspect, an embodiment of the present disclosure provides an electronic device comprising a machine-readable storage medium and a processor, the machine-readable storage medium storing instruction code, and the processor communicating with the machine-readable storage medium and reading and executing the instruction code therein to implement the method of the first aspect.
In the video processing method and apparatus provided in the embodiments of the present disclosure, the network transmission state of a terminal is monitored, and a first rendering parameter and a first encoding parameter adapted to that state are determined. The cloud rendering server performs the picture-rendering operation of the application corresponding to the terminal according to the determined first rendering parameter adapted to the current network transmission state, encodes the rendered picture according to the determined first encoding parameter, and then transmits the encoded video data to the terminal over the network. In this embodiment, the cloud rendering server dynamically adjusts the rendering parameters and encoding parameters as the monitored network transmission state of the terminal changes, so that the resulting video data is adapted to the terminal's current network transmission state, network transmission delay is reduced, and user experience is improved.
Drawings
FIG. 1 is a schematic diagram of a cloud streaming system architecture;
FIG. 2 is a flow diagram illustrating a video processing method according to an exemplary embodiment of the present disclosure;
FIG. 3 is a schematic diagram illustrating steps of a method for determining a first rendering parameter and a first encoding parameter according to an exemplary embodiment of the present disclosure;
FIG. 4 is a schematic diagram illustrating steps of a method of calculating a first rendering parameter according to an exemplary embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a video processing apparatus shown in an exemplary embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a determination module of a video processing apparatus according to an exemplary embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "upon," "when," or "in response to determining," depending on the context.
FIG. 1 is a schematic diagram of a cloud streaming system architecture. Referring to fig. 1, applications such as cloud games and cloud VR are deployed on a cloud rendering server 100. The cloud rendering server 100 receives, over a network, the operation instructions a user issues from a terminal 200 to control the running application, parses them, and passes them to the application; the application responds to the instructions according to its own processing logic and generates application picture data, which the server renders into a video stream, encodes, and transmits to the terminal 200 over the network in real time, where the terminal 200 decodes and outputs it. In this mode, if the terminal's current network bandwidth is insufficient, the network transmission delay of the video stream grows, and the terminal's video playback lags or stutters, affecting the user experience. On this basis, the embodiments of the disclosure provide a video processing method and apparatus.
FIG. 2 is a flow diagram illustrating a video processing method according to an exemplary embodiment of the present disclosure. The video processing method provided in this embodiment is applied to a cloud rendering server, though the disclosure is not limited thereto. Referring to FIG. 2, the method includes the following steps S10–S40:
and S10, monitoring the network transmission state of the terminal.
The cloud rendering server may obtain the terminal's network transmission state directly from the data link layer, obtain it from the terminal side, or detect it itself. The network transmission state includes network bandwidth, network delay, terminal stall time, and the like.
In an optional embodiment, the terminal detects its own network transmission state in real time and uploads it to the cloud rendering server in real time, or sends its current network transmission state to the cloud rendering server together with each user operation instruction it sends; alternatively, if the terminal is a VR all-in-one device, a VR-plus-mobile-phone/PC combined device, or the like, it may send its current network transmission state to the cloud rendering server together with its posture information.
In an optional embodiment, when the network transmission delay is large, the first rendering parameter and first encoding parameter determined from the network transmission state uploaded by the terminal lag behind when used to render and encode the application's picture. To fully account for the influence of network transmission delay on the uploaded state, after detecting its own network transmission state the terminal estimates the network transmission state at the next moment from the currently detected state, where the next moment is determined by the current network transmission delay: the larger the delay, the longer the interval from the current moment to the next moment. The terminal uploads the next-moment network transmission state to the cloud rendering server, which determines a first rendering parameter and first encoding parameter matching it and uses them as the second rendering parameter and second encoding parameter for the application's rendering and encoding operations, until the second rendering parameter and second encoding parameter no longer match the network transmission state.
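The patent does not specify how the terminal estimates the next-moment network state. As a minimal sketch under that assumption, an exponentially weighted moving average (EWMA) over recent bandwidth samples is one simple estimator; the class and parameter names here are illustrative, not from the patent:

```python
class BandwidthPredictor:
    """Smooth recent bandwidth samples into an estimate usable as the
    'next moment' network transmission state (EWMA, one possible choice)."""

    def __init__(self, alpha: float = 0.3) -> None:
        self.alpha = alpha    # weight given to the newest sample
        self.estimate = None  # current smoothed estimate (Mb)

    def update(self, measured_mb: float) -> float:
        # First sample initializes the estimate; later samples blend in.
        if self.estimate is None:
            self.estimate = measured_mb
        else:
            self.estimate = (self.alpha * measured_mb
                             + (1 - self.alpha) * self.estimate)
        return self.estimate


predictor = BandwidthPredictor(alpha=0.3)
for sample in [10.0, 8.0, 6.0]:
    predicted = predictor.update(sample)
```

A larger `alpha` tracks sudden bandwidth drops faster at the cost of reacting to noise; tying `alpha` to the measured delay would mirror the patent's idea that a larger delay pushes the "next moment" further out.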
If the network transmission state includes network delay, the cloud rendering server may send a delay-detection instruction to the terminal; the terminal sends feedback information upon receiving it, and the server calculates the network delay from the timestamp at which the detection instruction was sent and the timestamp at which the feedback was received.
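The timestamp-difference probe just described can be sketched as follows. This is a minimal illustration with hypothetical callables standing in for the real network I/O; the returned value is the round trip between sending the detection instruction and receiving the feedback:

```python
import time


def measure_network_delay(send_probe, wait_for_feedback) -> float:
    """Record the send timestamp of the delay-detection instruction and
    the receive timestamp of the terminal's feedback; their difference
    is the measured round-trip network delay in seconds."""
    t_sent = time.monotonic()
    send_probe()          # transmit the delay-detection instruction
    wait_for_feedback()   # block until the terminal's feedback arrives
    return time.monotonic() - t_sent


# Simulated terminal whose feedback arrives after ~50 ms of latency.
rtt = measure_network_delay(lambda: None, lambda: time.sleep(0.05))
```

A monotonic clock is used so that wall-clock adjustments on the server cannot make the delay appear negative.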
If the network transmission state includes the terminal's stall time, the terminal records a first time at which it sends an operation instruction and a second time at which it obtains the video frame generated in response to that instruction, takes the difference between them as the stall time, and sends either that stall time, or a next-moment stall time predicted from it, to the cloud rendering server together with the next operation instruction.
In an embodiment of the disclosure, the network transmission state is a network bandwidth, and the cloud rendering server monitors the network bandwidth of the terminal in real time, for example, the cloud rendering server acquires the network bandwidth of the terminal from the data link layer in real time.
S20, determining a first rendering parameter and a first encoding parameter corresponding to the terminal and adapted to the network transmission state.
FIG. 3 is a schematic diagram illustrating steps of a method for determining a first rendering parameter and a first encoding parameter according to an exemplary embodiment of the present disclosure; referring to fig. 3, the step S20 of determining the first rendering parameter and first encoding parameter adapted to the network transmission state specifically includes the following steps S201 to S204:
S201, calculating the first rendering parameter and first encoding parameter that correspond to the terminal and are adapted to the network transmission state.
In an embodiment of the present disclosure, the first rendering parameter and first encoding parameter must be supported by the terminal's performance. For different terminals, the terminal's performance parameters need to be considered when calculating the first rendering parameter and first encoding parameter adapted to the current network transmission state: some terminals are weaker and can only support a 720p video picture, while others are stronger and can support higher-precision video pictures (such as 1080p or 4K); likewise, the performance parameters of VR terminals, television terminals, and mobile-phone terminals differ. Different adjustment modes for configuring the first rendering parameter and first encoding parameter are therefore applied according to the performance parameters of different terminals.
S202, obtaining the second rendering parameter and second encoding parameter the cloud rendering server currently uses to perform the application's rendering and encoding operations, and judging whether the second rendering parameter and second encoding parameter are consistent with the first rendering parameter and first encoding parameter.
S203, if they are not consistent, adjusting the second rendering parameter and second encoding parameter to be consistent with the first rendering parameter and first encoding parameter.
And S204, if the two are consistent, no adjustment is performed.
In this embodiment, the cloud rendering server monitors the terminal's network transmission state in real time, calculates the first rendering parameter and first encoding parameter adapted to that state, and compares them with the second rendering parameter and second encoding parameter it is currently using to perform the application's rendering and encoding operations. If they are consistent, rendering and encoding continue with the second rendering parameter and second encoding parameter; once those become inconsistent with the calculated first rendering parameter and first encoding parameter adapted to the network transmission state, the server adjusts the second rendering parameter and second encoding parameter it uses to the first rendering parameter and first encoding parameter.
It should be noted that the manner of calculating the first rendering parameter and first encoding parameter, of judging whether the second rendering parameter and second encoding parameter are consistent with them, and of adjusting the second rendering parameter and second encoding parameter may each be implemented in different ways, which this disclosure does not limit.
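The compare-and-adjust loop of steps S201–S204 can be sketched as follows. This is a minimal illustration, not the patent's implementation; the class, field, and method names are ours, and the parameter fields are examples of what rendering/encoding parameters might contain:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class StreamParams:
    """Example bundle of rendering/encoding parameters."""
    resolution: str     # e.g. "1080p"
    frame_rate: int     # frames per second
    bitrate_mb: float   # encoding bitrate


class ParamController:
    """Keep the parameters in use (the 'second' parameters) in sync with
    the freshly computed target (the 'first' parameters)."""

    def __init__(self, initial: StreamParams) -> None:
        self.current = initial  # second rendering/encoding parameters

    def reconcile(self, target: StreamParams) -> bool:
        """S202-S204: return True if an adjustment was made (S203),
        False if the parameters already matched (S204)."""
        if self.current == target:
            return False
        self.current = target
        return True


controller = ParamController(StreamParams("1080p", 60, 8.0))
changed = controller.reconcile(StreamParams("720p", 30, 4.0))  # state worsened
```

The frozen dataclass gives value equality for free, so the consistency check of S202 is a single `==` comparison.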
In a possible embodiment of the present disclosure, in step S201, calculating the first rendering parameter adapted to the network transmission state specifically includes the following steps A10–A30:
Step A10, comparing the network parameter characterizing the network transmission state with a preset threshold for that parameter.
In this embodiment, the preset threshold may be obtained from empirical values, and may be the same or different for terminals of different performance.
Step A20, if the network parameter is greater than or equal to the preset threshold, taking the set high-level rendering parameter supported by the terminal's performance as the first rendering parameter adapted to the network transmission state.
Step A30, if the network parameter is lower than the preset threshold, taking the set low-level rendering parameter supported by the terminal's performance as the first rendering parameter adapted to the network transmission state.
Thus, in this embodiment, when the current network transmission state is good, the picture is rendered at high quality using the high-level rendering parameters supported by the terminal; when the network transmission state is poor, low-quality rendering is performed using the low-level rendering parameters. Pictures rendered at high quality and pictures rendered at low quality are then compressed with correspondingly different encoding parameters, which avoids the situation where a picture rendered at high quality loses definition after low-quality encoding and the resources consumed by the high-quality rendering are wasted.
FIG. 4 is a schematic diagram illustrating steps of a method of calculating a first rendering parameter according to an exemplary embodiment of the present disclosure; referring to fig. 4, in this embodiment, step S201 of calculating the first rendering parameter adapted to the network transmission state specifically includes the following steps S2011–S2013:
and S2011, comparing the network parameter representing the network transmission state with a specified threshold range of the network parameter, and if the network parameter exceeds the upper limit value of the specified threshold range, taking the set first-level rendering parameter as a first rendering parameter adaptive to the network transmission state.
S2012, if the network parameter is in the specified threshold range, the set second level rendering parameter is used as the first rendering parameter adapted to the network transmission state.
S2013, if the network parameter is smaller than the lower limit value of the specified threshold range, the set third-level rendering parameter is used as the first rendering parameter adaptive to the network transmission state.
The first level rendering parameter is a highest level rendering parameter, the third level rendering parameter is a lowest level rendering parameter, and the second level rendering parameter is a rendering parameter between the highest level rendering parameter and the lowest level rendering parameter.
Taking network bandwidth as the network parameter representing the network transmission state, the mapping between network bandwidth and rendering parameters is shown in Table 1 below:
Network bandwidth (BW)    First rendering parameter
BW > BW_H                 First-level rendering parameters
BW_L ≤ BW ≤ BW_H          Second-level rendering parameters
BW < BW_L                 Third-level rendering parameters
The rendering parameters configured for the different bandwidths include any one or more of resolution, frame rate, and rendering quality level, where the rendering quality level includes anti-aliasing level parameters, light-and-shadow effect level parameters, and the like.
In this embodiment, for a terminal with a certain class of performance parameters, the specified threshold range is illustratively 5 Mb to 10 Mb: the upper limit BW_H of the specified threshold range is 10 Mb and the lower limit BW_L is 5 Mb. When the network bandwidth is greater than 10 Mb, the corresponding first-level rendering parameters are a resolution of 1080p at a frame rate of 60 frames; when the network bandwidth is between 5 Mb and 10 Mb inclusive, the corresponding second-level rendering parameters are a resolution of 720p at a frame rate of 30 frames; when the network bandwidth is less than 5 Mb, the corresponding third-level rendering parameters are a resolution of 480p at a frame rate of 20 frames.
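The three-tier selection of steps S2011–S2013 can be sketched as follows, using the illustrative thresholds and parameter values from this example; the function name and dict shape are ours, not from the patent:

```python
def select_rendering_params(bandwidth_mb: float,
                            upper_mb: float = 10.0,
                            lower_mb: float = 5.0) -> dict:
    """Map measured bandwidth to one of three rendering-parameter levels
    (the example thresholds of 5 Mb / 10 Mb for one class of terminal)."""
    if bandwidth_mb > upper_mb:
        # S2011: above the upper limit -> first-level (highest) parameters
        return {"resolution": "1080p", "frame_rate": 60}
    if bandwidth_mb >= lower_mb:
        # S2012: inside the specified threshold range -> second level
        return {"resolution": "720p", "frame_rate": 30}
    # S2013: below the lower limit -> third-level (lowest) parameters
    return {"resolution": "480p", "frame_rate": 20}
```

Per-terminal tuning would replace the default thresholds and returned values with ones matched to that terminal's performance parameters.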
In an embodiment of the present disclosure, the encoding parameter includes an encoding rate.
The above calculation of the first encoding parameter adapted to the network transmission state may proceed as follows: obtain the first encoding parameter from the calculated first rendering parameter and a mapping between rendering parameters and encoding parameters. In this embodiment, the corresponding first encoding parameter is obtained from the determined first rendering parameter, so different first encoding parameters are set for different first rendering parameters. This avoids the problem of video data rendered with high-quality rendering parameters being compressed into low-quality video data during encoding, which would waste the GPU resources spent in the rendering stage.
Alternatively, in another embodiment of the present disclosure, the adapted first encoding parameter is calculated from the network transmission state. Illustratively, the cloud rendering server determines a network bandwidth threshold from the current actual sending bitrate; if the network bandwidth stays below the threshold for longer than a first preset period, the server adjusts the encoder's bitrate to a first specified bitrate, and if the network bandwidth stays above the threshold for longer than the first preset period, the server adjusts it to a second specified bitrate, the second specified bitrate being greater than the first.
Illustratively, based on actual network testing, 40% of a 100 Mb bandwidth is configured as the actual sending bitrate, giving 100 × 40% = 40 Mb. The network bandwidth threshold is set as a specified multiple of the sending bitrate, for example 1.3 times, giving 40 × 1.3 = 52 Mb. Bitrate-reduction scenario: when the measured network bandwidth stays below 52 Mb for 10 consecutive seconds, the bitrate is reduced to half of the configured actual sending bitrate of 40 Mb, i.e. the first specified bitrate is 20 Mb. Bitrate-restoration scenario: when the network bandwidth is detected to have recovered to 52 Mb or above and the measured bandwidth stays at that level for 10 consecutive seconds, the bitrate is restored to the configured actual sending bitrate of 40 Mb.
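The sustained-for-10-seconds rule in this example can be sketched as a small hysteresis controller. This is an illustrative reconstruction of the example's numbers (40 Mb send rate, 1.3× threshold of 52 Mb, 10 s hold); the class and method names are ours:

```python
class BitrateController:
    """Halve the encoder bitrate after bandwidth stays below the threshold
    for a hold period; restore it after bandwidth stays at/above the
    threshold for the same period."""

    def __init__(self, send_rate: float = 40.0, multiple: float = 1.3,
                 hold_seconds: float = 10.0) -> None:
        self.normal_rate = send_rate            # second specified bitrate
        self.reduced_rate = send_rate / 2.0     # first specified bitrate
        self.threshold = send_rate * multiple   # e.g. 52 Mb
        self.hold = hold_seconds
        self.rate = send_rate
        self._since = None  # when the current below/above streak began
        self._below = None  # whether that streak is below the threshold

    def observe(self, bandwidth: float, now: float) -> float:
        """Feed one bandwidth measurement; return the bitrate to use."""
        below = bandwidth < self.threshold
        if below != self._below:
            # Measurement crossed the threshold: restart the streak timer.
            self._below, self._since = below, now
        elif now - self._since >= self.hold:
            # Condition held for the full period: apply the adjustment.
            self.rate = self.reduced_rate if below else self.normal_rate
        return self.rate


ctrl = BitrateController()
ctrl.observe(45.0, now=0.0)  # below 52 Mb: streak starts, rate unchanged
```

Requiring the condition to persist for the full hold period prevents the encoder from oscillating on momentary bandwidth dips or spikes.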
In an optional embodiment of the present disclosure, the user may separately enable or disable dynamic code rate adjustment and dynamic rendering parameter adjustment.
S30, according to the determined first rendering parameter adapted to the network transmission state, executing picture rendering of the terminal application, and executing coding operation on the rendered picture according to the first coding parameter.
And S40, transmitting the video data obtained by coding to the terminal through the network.
Furthermore, in this embodiment, when the network transmission state is poor, low-quality rendering and encoding are performed so that the resulting video data is small enough to suit the network transmission state, reducing network transmission delay; when the network transmission state is good, high-quality rendering and encoding operations are performed. The picture rendering parameters and coding parameters are adjusted dynamically according to changes in the network transmission state monitored in real time, giving good network adaptivity and improving the user experience. The scheme also avoids the waste of GPU and other resources that would occur if, under insufficient network bandwidth, only the coding parameters were lowered while the preceding rendering was still performed at high quality.
In a possible embodiment of the present disclosure, when the cloud rendering server has not yet acquired the network transmission state (for example, when the first frame of an application's picture is processed), the rendering and encoding operations for the application picture are performed with a default rendering parameter and a default encoding parameter, where the default rendering parameter and default encoding parameter may be the above-mentioned highest-level rendering parameter and the second specified code rate.
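The fallback above can be captured in a few lines. The preset values and the function name are assumptions for illustration; only the "default to the highest level until a state is observed" behavior comes from the text.

```python
DEFAULT_RENDERING = ("1920x1080", 60)   # highest-level preset (assumed values)
DEFAULT_BITRATE_MBPS = 40               # the "second specified code rate"

def parameters_for(network_state):
    """Return (rendering parameter, coding parameter) for one frame,
    falling back to defaults when no transmission state has been
    observed yet (e.g. while producing the first frame)."""
    if network_state is None:
        return DEFAULT_RENDERING, DEFAULT_BITRATE_MBPS
    # Adapted-parameter calculation (threshold comparison, mapping
    # lookup) would go here; it is not sketched in this fragment.
    raise NotImplementedError
```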
Fig. 5 is a schematic structural diagram of a video processing apparatus according to an exemplary embodiment of the present disclosure; referring to fig. 5, the apparatus is applied to a cloud rendering server, and includes:
a monitoring module 501, configured to monitor a network transmission state of a terminal;
a determining module 502, configured to determine a first rendering parameter and a first encoding parameter corresponding to the terminal and adapted to the network transmission state;
an executing module 503, configured to execute picture rendering of the terminal application according to the determined first rendering parameter adapted to the network transmission state, and execute a coding operation on a picture obtained through rendering according to the first coding parameter;
a transmission module 504, configured to transmit the encoded video data to the terminal through a network.
Optionally, the determining module 502 specifically includes:
a calculating unit 5021, configured to calculate a first rendering parameter and a first encoding parameter that are adaptive to the network transmission state;
a determining unit 5022, configured to obtain a current second rendering parameter and a current second encoding parameter, and determine whether the second rendering parameter and the second encoding parameter are respectively consistent with the first rendering parameter and the first encoding parameter;
an adjusting unit 5023, configured to, if they are inconsistent, adjust the second rendering parameter and the second encoding parameter to be consistent with the first rendering parameter and the first encoding parameter;
if the two are consistent, no adjustment is performed.
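The determining/adjusting behavior above reduces to a compare-and-apply step. A minimal sketch, assuming a hypothetical `apply_fn` hook that reconfigures the renderer and encoder:

```python
def reconcile(second_params, first_params, apply_fn):
    """Apply the newly computed (first) rendering/encoding parameters only
    when they differ from the currently active (second) ones, so a stable
    network state causes no needless reconfiguration."""
    if second_params != first_params:
        apply_fn(first_params)      # inconsistent: adjust to the new values
        return first_params
    return second_params            # consistent: no adjustment performed
```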
Optionally, the calculating unit 5021 is specifically configured to calculate a first rendering parameter adapted to the network transmission state through the following steps:
comparing the network parameter representing the network transmission state with a specified threshold range of the network parameter, and if the network parameter exceeds the upper limit value of the specified threshold range, taking the set first-level rendering parameter as a first rendering parameter adaptive to the network transmission state;
if the network parameter is in the specified threshold range, taking a set second-level rendering parameter as a first rendering parameter adaptive to the network transmission state;
if the network parameter is smaller than the lower limit value of the specified threshold range, taking the set third-level rendering parameter as a first rendering parameter adaptive to the network transmission state;
the first level rendering parameter is a highest level rendering parameter, the third level rendering parameter is a lowest level rendering parameter, and the second level rendering parameter is a rendering parameter between the highest level rendering parameter and the lowest level rendering parameter.
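The three-level selection above can be sketched as a range comparison. The level labels stand in for actual (resolution, frame rate) presets; the function name and defaults are assumptions for illustration.

```python
def select_rendering_level(network_param, lower, upper,
                           levels=("low", "mid", "high")):
    """Pick the first rendering parameter from a specified threshold
    range [lower, upper] on the network parameter (e.g. bandwidth)."""
    low, mid, high = levels
    if network_param > upper:   # exceeds the upper limit of the range
        return high             # first-level (highest) rendering parameter
    if network_param < lower:   # below the lower limit of the range
        return low              # third-level (lowest) rendering parameter
    return mid                  # within the range: second-level parameter
```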
Optionally, the monitoring module 501 is specifically configured to:
and monitoring the network bandwidth between the cloud rendering server and the terminal.
Optionally, the rendering parameters include: resolution and/or frame rate.
Optionally, the calculating unit 5021 is specifically configured to calculate a first encoding parameter adapted to the network transmission state through the following steps:
obtaining the first coding parameter according to the first rendering parameter obtained by calculation and the mapping relation between the rendering parameter and the first coding parameter;
or, the first coding parameter adapted to the network transmission state is obtained by calculation according to the network transmission state.
In the embodiments of the present disclosure, the network bandwidth of the terminal is monitored in real time, a first rendering parameter and a first encoding parameter adapted to the monitored bandwidth are determined, and the rendering and encoding operations of the application are executed according to them. The cloud rendering server thus adjusts the rendering and encoding operations dynamically as the terminal's network bandwidth changes, so that the video data finally transmitted to the terminal is adapted to the current bandwidth. Because the rendering operation is also adjusted in this scheme, it avoids the situation in which only the encoding parameters are tuned and earlier high-quality rendering work, and hence GPU resources, are wasted.
In another embodiment of the present disclosure, a machine-readable storage medium is further provided, on which a computer program is stored; when executed by a processor, the program implements the steps of the video processing method described above. By monitoring changes in the terminal's network transmission state, the cloud rendering server dynamically adjusts the rendering parameters and coding parameters, so that the final video data is adapted to the terminal's current network transmission state, reducing network transmission delay and improving the user experience.
Fig. 7 is a schematic structural diagram of an electronic device shown in an embodiment of the present disclosure. Referring to fig. 7, the electronic device 500 includes at least a memory (machine-readable storage medium) 502 and a processor 501; the memory 502 is connected to the processor 501 through a communication bus 503 and stores instruction code executable by the processor 501; the processor 501 reads and executes the instruction code from the memory 502 to implement the steps of the video processing method according to any of the above embodiments. The cloud rendering server monitors the terminal's network transmission state and dynamically adjusts the rendering parameters and coding parameters as that state changes, so that the final video data is adapted to the terminal's current network transmission state, reducing network transmission delay and improving the user experience.
The implementation process of the functions and actions of each unit in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the disclosed solution. One of ordinary skill in the art can understand and implement it without inventive effort.
Computers suitable for executing computer programs include, for example, general and/or special purpose microprocessors, or any other type of central processing unit. Generally, a central processing unit will receive instructions and data from a read-only memory and/or a random access memory. The basic components of a computer include a central processing unit for implementing or executing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer does not necessarily have such a device. Moreover, a computer may be embedded in another device, e.g., a mobile telephone, a Personal Digital Assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device such as a Universal Serial Bus (USB) flash drive, to name a few.
Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices (e.g., EPROM, EEPROM, and flash memory devices), magnetic disks (e.g., an internal hard disk or a removable disk), magneto-optical disks, and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. In other instances, features described in connection with one embodiment may be implemented as discrete components or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In some cases, multitasking and parallel processing may be advantageous. Moreover, the separation of various system modules and components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Thus, particular embodiments of the subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. Further, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some implementations, multitasking and parallel processing may be advantageous.
The above description is only exemplary of the present disclosure and should not be taken as limiting the disclosure, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (11)

1. A video processing method is applied to a cloud rendering server, and comprises the following steps:
monitoring the network transmission state of the terminal;
determining a first rendering parameter and a first coding parameter which are corresponding to the terminal and are adaptive to the network transmission state;
executing the picture rendering operation of the application corresponding to the terminal according to the first rendering parameter, and executing the coding operation on the picture obtained by rendering according to the first coding parameter;
and transmitting the video data obtained by coding to the terminal through a network.
2. The method of claim 1, wherein the determining the first rendering parameter and the first encoding parameter corresponding to the terminal and adapted to the network transmission state comprises:
calculating to obtain a first rendering parameter and a first coding parameter which are adaptive to the network transmission state and correspond to the terminal; wherein the first rendering parameter and the first encoding parameter are supported by a capability of the terminal;
acquiring a second rendering parameter and a second encoding parameter currently used by the cloud rendering server to execute the rendering and encoding operations of an application, and respectively judging whether the second rendering parameter and the second encoding parameter are consistent with the first rendering parameter and the first encoding parameter;
if not, adjusting the second rendering parameter and the second encoding parameter to be consistent with the first rendering parameter and the first encoding parameter;
if the two are consistent, no adjustment is performed.
3. The method according to claim 2, wherein the calculating to obtain the first rendering parameter corresponding to the terminal and adapted to the network transmission state includes:
comparing the network parameter representing the network transmission state with a preset threshold value of the network parameter;
if the network parameter is greater than or equal to the preset threshold, taking a set high-level rendering parameter supported by the performance of the terminal as a first rendering parameter adaptive to the network transmission state;
and if the network parameter is lower than the preset threshold value, taking the set low-level rendering parameter supported by the performance of the terminal as a first rendering parameter adaptive to the network transmission state.
4. The method of claim 2, wherein the calculating a first rendering parameter adapted to the network transmission status comprises:
comparing the network parameter representing the network transmission state with a specified threshold range of the network parameter, and if the network parameter exceeds the upper limit value of the specified threshold range, taking a first-level rendering parameter supported by the set terminal performance as a first rendering parameter adaptive to the network transmission state;
if the network parameter is within the specified threshold range, taking a second-level rendering parameter supported by the performance of the terminal as the first rendering parameter adapted to the network transmission state;
if the network parameter is smaller than the lower limit value of the specified threshold range, taking a third level rendering parameter supported by the set performance of the terminal as a first rendering parameter adaptive to the network transmission state;
the first level rendering parameter is a highest level rendering parameter, the third level rendering parameter is a lowest level rendering parameter, and the second level rendering parameter is a rendering parameter between the highest level rendering parameter and the lowest level rendering parameter.
5. The method according to any one of claims 1 to 4, wherein the monitoring the network transmission state of the terminal comprises:
and monitoring the network bandwidth between the cloud rendering server and the terminal.
6. The method of any of claims 1-4, wherein the rendering parameters include: any one or more of resolution, frame rate, and rendering quality level.
7. The method of claim 2, wherein the calculating a first coding parameter adapted to the network transmission state comprises:
obtaining the first coding parameter according to the first rendering parameter obtained by calculation and the mapping relation between the rendering parameter and the first coding parameter;
or, the first coding parameter adapted to the network transmission state is obtained by calculation according to the network transmission state.
8. A video processing apparatus, applied to a cloud rendering server, the apparatus comprising:
the monitoring module is used for monitoring the network transmission state of the terminal;
the determining module is used for determining a first rendering parameter and a first coding parameter which are corresponding to the terminal and are adaptive to the network transmission state;
the execution module is used for executing the picture rendering of the terminal application according to the determined first rendering parameter which is adaptive to the network transmission state, and executing the coding operation on the picture obtained by rendering according to the first coding parameter;
and the transmission module is used for transmitting the video data obtained by encoding to the terminal through a network.
9. The apparatus of claim 8, wherein the determining module is specifically configured to:
calculating to obtain a first rendering parameter and a first coding parameter which are adaptive to the network transmission state and correspond to the terminal; wherein the first rendering parameter and the first encoding parameter are supported by a capability of the terminal;
acquiring a current second rendering parameter and a current second encoding parameter, judging whether they are respectively consistent with the first rendering parameter and the first encoding parameter, and if not, adjusting the second rendering parameter and the second encoding parameter to be consistent with the first rendering parameter and the first encoding parameter;
if the two are consistent, no adjustment is performed.
10. A machine-readable storage medium having stored thereon computer instructions which, when executed, perform the method of any one of claims 1-7.
11. An electronic device, comprising: a machine-readable storage medium and a processor, the machine-readable storage medium: storing the instruction code; a processor: communicating with a machine-readable storage medium, reading and executing instruction code in the machine-readable storage medium, to implement the method of any one of claims 1-7.
CN202010555675.8A 2020-06-17 2020-06-17 Video processing method, device, storage medium and equipment Pending CN111901635A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010555675.8A CN111901635A (en) 2020-06-17 2020-06-17 Video processing method, device, storage medium and equipment

Publications (1)

Publication Number Publication Date
CN111901635A true CN111901635A (en) 2020-11-06

Family

ID=73206781

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010555675.8A Pending CN111901635A (en) 2020-06-17 2020-06-17 Video processing method, device, storage medium and equipment

Country Status (1)

Country Link
CN (1) CN111901635A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103888485A (en) * 2012-12-19 2014-06-25 华为技术有限公司 Method for distributing cloud computing resource, device thereof and system thereof
CN104580368A (en) * 2014-12-09 2015-04-29 东北大学 Virtual network embedded system and method for cloud rendering in optical data center network
US20160249078A1 (en) * 2013-10-15 2016-08-25 Sky Italia S.R.L. Cloud Encoding System
US20190295309A1 (en) * 2018-03-20 2019-09-26 Lenovo (Beijing) Co., Ltd. Image rendering method and system
CN110460496A (en) * 2012-12-27 2019-11-15 辉达公司 It is controlled by frame per second and realizes that network self-adapting time delay reduces
CN110572656A (en) * 2019-09-19 2019-12-13 北京视博云科技有限公司 coding method, image processing method, device and system
CN110930307A (en) * 2019-10-31 2020-03-27 北京视博云科技有限公司 Image processing method and device
CN111135569A (en) * 2019-12-20 2020-05-12 RealMe重庆移动通信有限公司 Cloud game processing method and device, storage medium and electronic equipment

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112565884A (en) * 2020-11-27 2021-03-26 北京达佳互联信息技术有限公司 Image processing method, image processing device, terminal, server and storage medium
CN112565884B (en) * 2020-11-27 2024-03-05 北京达佳互联信息技术有限公司 Image processing method, device, terminal, server and storage medium
CN112738553A (en) * 2020-12-18 2021-04-30 深圳市微网力合信息技术有限公司 Self-adaptive cloud rendering system and method based on network communication quality
WO2022127606A1 (en) * 2020-12-18 2022-06-23 微网优联科技(成都)有限公司 Adaptive cloud rendering system and method based on network communication quality
CN114727083A (en) * 2021-01-04 2022-07-08 中国移动通信有限公司研究院 Data processing method, device, terminal and network side equipment
WO2022160744A1 (en) * 2021-01-29 2022-08-04 稿定(厦门)科技有限公司 Gpu-based video synthesis system and method
CN113727142A (en) * 2021-09-02 2021-11-30 北京沃东天骏信息技术有限公司 Cloud rendering method and device and computer-storable medium
CN114281453A (en) * 2021-12-15 2022-04-05 天翼电信终端有限公司 Cloud mobile phone and terminal application interactive execution method and device, server and storage medium
CN114302125A (en) * 2021-12-30 2022-04-08 展讯通信(上海)有限公司 Image processing method and device, and computer readable storage medium
CN115550690A (en) * 2022-12-02 2022-12-30 腾讯科技(深圳)有限公司 Frame rate adjusting method, device, equipment and storage medium
CN115550690B (en) * 2022-12-02 2023-04-14 腾讯科技(深圳)有限公司 Frame rate adjusting method, device, equipment and storage medium
CN116440501A (en) * 2023-06-16 2023-07-18 瀚博半导体(上海)有限公司 Self-adaptive cloud game video picture rendering method and system
CN116440501B (en) * 2023-06-16 2023-08-29 瀚博半导体(上海)有限公司 Self-adaptive cloud game video picture rendering method and system

Similar Documents

Publication Publication Date Title
CN111901635A (en) Video processing method, device, storage medium and equipment
CA2975904C (en) Method and system for smart adaptive video streaming driven by perceptual quality-of-experience estimations
US10250664B2 (en) Placeshifting live encoded video faster than real time
US8218439B2 (en) Method and apparatus for adaptive buffering
US7543326B2 (en) Dynamic rate control
US7359004B2 (en) Bi-level and full-color video combination for video communication
US20170318323A1 (en) Video playback method and control terminal thereof
US11206431B2 (en) Systems and methods for selecting an initial streaming bitrate
US10177899B2 (en) Adapting a jitter buffer
CN110913245A (en) Method and device for controlling video transcoding code rate
US20110299588A1 (en) Rate control in video communication via virtual transmission buffer
WO2013059930A1 (en) Methods and apparatus for providing a media stream quality signal
EP2974207A1 (en) Playback stall avoidance in adaptive media streaming
CN108881931B (en) Data buffering method and network equipment
CN104270649A (en) Image encoding device and image encoding method
JP2016051927A (en) Image processing apparatus, image processing method, and program
CN111491201B (en) Method for adjusting video code stream and video frame loss processing method
JP2016059037A (en) Method and client terminal for receiving multimedia content split into at least two successive segments, and corresponding computer program product and computer readable medium
CN112929712A (en) Video code rate adjusting method and device
CN104022845A (en) Method, apparatus and system for adjusting bit rate of data block
CN113286146B (en) Media data processing method, device, equipment and storage medium
US7940843B1 (en) Method of implementing improved rate control for a multimedia compression and encoding system
CN108124155B (en) Code rate control method and device and electronic equipment
CN114629797A (en) Bandwidth prediction method, model generation method and equipment
CN114513668A (en) Live video hardware encoder control method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201106