CN114866814B - Network bandwidth allocation method and device


Info

Publication number
CN114866814B
Authority
CN
China
Prior art keywords
audio
video
buffer
cache
information
Prior art date
Legal status
Active
Application number
CN202210646907.XA
Other languages
Chinese (zh)
Other versions
CN114866814A (en)
Inventor
陆元亘
汪明君
Current Assignee
Shanghai Bilibili Technology Co Ltd
Original Assignee
Shanghai Bilibili Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Bilibili Technology Co Ltd
Priority to CN202210646907.XA
Publication of CN114866814A
Application granted
Publication of CN114866814B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238 Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N21/2385 Channel allocation; Bandwidth allocation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/231 Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion

Abstract

The application provides a network bandwidth allocation method and a device, wherein the network bandwidth allocation method comprises the following steps: determining an audio buffer area and a video buffer area, wherein the audio buffer area is smaller than the video buffer area; caching the audio data of the target audio and video into the audio buffer area, and caching the video data of the target audio and video into the video buffer area; acquiring audio buffer information of the audio buffer area, and acquiring playing buffer information corresponding to the target audio and video and video buffer information of the video buffer area; allocating video bandwidth and audio bandwidth to the target audio and video according to the audio buffer information; according to the method, bandwidth allocation between audio and video can be dynamically adjusted according to the audio buffer information, bandwidth waste is reduced, audio and video playback stutter is reduced, and user experience is improved.

Description

Network bandwidth allocation method and device
Technical Field
The application relates to the technical field of computers, in particular to a network bandwidth allocation method. The application also relates to a network bandwidth allocation device, a computing device and a computer readable storage medium.
Background
With the development of internet technology, more and more users watch video content through a terminal. When the terminal downloads audio and video content from a server, TCP (Transmission Control Protocol, a transport-layer communication protocol) creates a video receiving buffer area and an audio receiving buffer area in the terminal, and the two together use the bandwidth of the device to receive data until the respective receiving buffer areas are filled.
In general, the default video receiving buffer area and audio receiving buffer area of the system are the same size and both are large, while the code rate of the video is usually far greater than that of the audio. As a result, during playing, as long as the audio receiving buffer area is not full, the audio can always compete with the video for bandwidth, so that the video, which originally needs more bandwidth, is downloaded slowly, causing problems such as slow or even failed first-frame loading and stuttering video playback.
Disclosure of Invention
In view of this, the embodiment of the application provides a network bandwidth allocation method. The application also relates to a network bandwidth allocation device, a computing device and a computer readable storage medium, which are used for solving the problem in the prior art that, in the process of caching audio and video, audio data and video data compete for network bandwidth, thereby causing audio and video playback to stutter.
According to a first aspect of an embodiment of the present application, there is provided a network bandwidth allocation method, including:
determining an audio buffer area and a video buffer area, wherein the audio buffer area is smaller than the video buffer area;
Caching the audio data of the target audio and video into the audio cache region, and caching the video data of the target audio and video into the video cache region;
Acquiring audio buffer information of the audio buffer area, and acquiring playing buffer information corresponding to the target audio and video and video buffer information of the video buffer area;
distributing video bandwidth and audio bandwidth for the target audio and video according to the audio cache information;
And determining a target cache region adjustment strategy according to the playing cache information and the video cache information, and adjusting the audio cache region and the video cache region based on the target cache region adjustment strategy.
According to a second aspect of an embodiment of the present application, there is provided a network bandwidth allocation apparatus, including:
the determining module is configured to determine an audio cache area and a video cache area, wherein the audio cache area is smaller than the video cache area;
The buffer module is configured to buffer the audio data of the target audio and video to the audio buffer area and buffer the video data of the target audio and video to the video buffer area;
The acquisition module is configured to acquire the audio cache information of the audio cache region and acquire the playing cache information corresponding to the target audio and video and the video cache information of the video cache region;
the allocation module is configured to allocate video bandwidth and audio bandwidth for the target audio and video according to the audio cache information;
And the adjusting module is configured to determine a target cache area adjusting strategy according to the playing cache information and the video cache information, and adjust the audio cache area and the video cache area based on the target cache area adjusting strategy.
According to a third aspect of embodiments of the present application, there is provided a computing device comprising a memory, a processor and computer instructions stored on the memory and executable on the processor, the processor implementing the steps of the network bandwidth allocation method when executing the computer instructions.
According to a fourth aspect of embodiments of the present application, there is provided a computer readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the network bandwidth allocation method.
The network bandwidth allocation method provided by the application determines an audio buffer area and a video buffer area, wherein the audio buffer area is smaller than the video buffer area; caches the audio data of the target audio and video into the audio buffer area, and caches the video data of the target audio and video into the video buffer area; acquires audio buffer information of the audio buffer area, and acquires playing buffer information corresponding to the target audio and video and video buffer information of the video buffer area; allocates video bandwidth and audio bandwidth for the target audio and video according to the audio buffer information; and determines a target buffer area adjustment strategy according to the playing buffer information and the video buffer information, and adjusts the audio buffer area and the video buffer area based on the target buffer area adjustment strategy.
According to the embodiment of the application, by setting a larger video buffer area and a smaller audio buffer area and monitoring the audio buffer information in the audio buffer area, the allocation of video bandwidth and audio bandwidth is dynamically adjusted, so that more bandwidth is allocated to the video when the audio can be cached quickly, the video buffering efficiency is improved, bandwidth waste is reduced, audio and video playback stutter is reduced, and user experience is improved.
Drawings
Fig. 1 is a flowchart of a network bandwidth allocation method according to an embodiment of the present application;
fig. 2 is a schematic diagram of an audio/video play-starting and growth strategy according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an audio/video reduction adjustment strategy according to an embodiment of the present application;
fig. 4 is a process flow diagram of a network bandwidth allocation method applied to buffering an audio/video according to an embodiment of the present application;
Fig. 5 is a schematic structural diagram of a network bandwidth allocation device according to an embodiment of the present application;
FIG. 6 is a block diagram of a computing device according to one embodiment of the application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The present application may be embodied in many other forms than those herein described, and those skilled in the art will readily appreciate that the present application may be similarly embodied without departing from the spirit or essential characteristics thereof, and therefore the present application is not limited to the specific embodiments disclosed below.
The terminology used in the one or more embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the one or more embodiments of the application. As used in one or more embodiments of the application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present application refers to any or all possible combinations of one or more of the associated listed items.
It should be understood that, although the terms first, second, etc. may be used in one or more embodiments of the application to describe various information, this information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, a first may also be referred to as a second, and similarly, a second may also be referred to as a first, without departing from the scope of one or more embodiments of the application. The word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining", depending on the context.
First, terms related to one or more embodiments of the present application will be explained.
TCP receive buffer: the core buffer area in the terminal for temporarily storing the data received from the server may be further divided into an audio buffer area and a video buffer area for the audio and video files.
First frame time: the time spent from the video playing of the user click to the video first frame rendering, the first frame time comprises the time spent by the links of video request, data buffering and decoding and rendering at the player level besides the time spent by the service side on the user click, page creation and rendering.
Katon rate: the playback event in which a jam occurs is a proportion of the total playback event.
Seek: and rapidly dragging the audio and video from the current position to the given position and playing the audio and video.
When the terminal downloads the audio and video from the server, TCP creates a video buffer zone and an audio buffer zone, and the two share the network bandwidth of the terminal to receive data until the respective receiving buffer zones are filled. At present, the system by default fixes both buffer zones at a relatively large value so that the whole audio and video can be conveniently played and used. There is also an existing method that sets smaller audio and video buffers at the start of playing and then uses a simple growth strategy to make them grow to the maximum value.
However, in practical application, the code rate of the video is usually much larger than that of the audio. During playing, as long as the audio buffer area is not full, the audio always competes with the video for bandwidth, which results in slow downloading of the video that originally needs more bandwidth, and further causes problems such as slow or even failed first-frame loading and stuttering playback. In addition, setting a larger audio buffer for the audio, which has a smaller code rate, may also lead to a waste of traffic. Setting smaller audio and video buffers at the start of playing can optimize the first frame time and the loading failure rate to a certain extent, but if the growth strategy is too simple and lacks an adjustment strategy, the playback stutter rate may increase.
Based on this, in the present application, a network bandwidth allocation method is provided, and the present application relates to a network bandwidth allocation apparatus, a computing device, and a computer-readable storage medium, which are described in detail in the following embodiments one by one.
Fig. 1 shows a flowchart of a network bandwidth allocation method according to an embodiment of the present application, which specifically includes the following steps:
Step 102: and determining an audio buffer area and a video buffer area, wherein the audio buffer area is smaller than the video buffer area.
In practical application, when a user wants to watch a certain audio/video, the audio/video needs to be buffered to the terminal for watching. The target audio/video refers specifically to the audio/video the user wants to watch, and in general its video frames and audio frames correspond to each other. When the user watches the target audio/video on the terminal, the user can click a play button; the terminal then sends an acquisition instruction for the target audio/video to the server, and the server sends the audio data and video data of the target audio/video to the terminal. The terminal receives the audio data and video data by setting up an audio buffer area and a video buffer area: the audio data is buffered in the audio buffer area, the video data is buffered in the video buffer area, and both are then buffered into a playing buffer area, which is used for buffering the audio/video content that the player can play directly.
The user uses the terminal to send a caching instruction to the server, where the caching instruction specifically refers to an instruction for caching audio data and video data. After the caching instruction for the target audio and video is received, an audio caching area and a video caching area are generated for the target audio and video, and both the audio caching area and the video caching area are TCP receiving buffer areas.
The lengths of the audio buffer area and the video buffer area in the application are different, that is, the audio buffer area is smaller than the video buffer area. Specifically, in order to ensure that the audio and video do not stutter during playing, the audio buffer area and the video buffer area need to meet the following conditions:
1. To ensure that the audio and video can be played normally, a certain duration (number of frames) of audio and video needs to be cached.
2. Because the audio code rate is usually smaller than the video code rate, in order to keep the downloaded durations of the two as consistent as possible, the video buffer area must be larger than the audio buffer area.
3. To allocate network bandwidth reasonably, the audio buffer area cannot be too large; it should be filled as soon as possible so that bandwidth is handed over to the video while keeping the audio and the video synchronized.
It should be noted that, in this step, the audio buffer and the video buffer may correspond to different stages of buffering the audio and video: in the play-start stage, the initial audio buffer and video buffer are determined according to the playing duration of the target audio and video, while after playing has started, the audio buffer and the video buffer are the buffers obtained after adjustment by the adjustment policy.
Based on this, in the play-start stage, determining an audio buffer and a video buffer includes:
Determining the initial playing time length, the audio code rate and the video code rate of the target audio and video;
determining an initial audio buffer area based on the initial playing time length and the audio code rate;
And determining an initial video cache region based on the initial playing time length and the video code rate.
The initial playing duration is set specifically to meet condition 1 above; for example, the initial playing duration may be set to 2 seconds. The audio code rate specifically refers to the code rate of the audio in the target audio and video, and the video code rate specifically refers to the code rate of the video in the target audio and video. In practical application, after the target audio and video is determined, the audio code rate and the video code rate can be obtained directly based on the target audio and video.
After the initial playing time length, the audio code rate and the video code rate are determined, an initial audio buffer area can be determined according to the initial playing time length and the audio code rate, an initial video buffer area is determined according to the initial playing time length and the video code rate, specifically, the size of the initial audio buffer area can be obtained by multiplying the initial playing time length and the audio code rate, and the size of the initial video buffer area can be obtained by multiplying the initial playing time length and the video code rate. For example, the initial play time is 5 seconds, the audio rate is 50K, and the video rate is 200K, based on which it can be determined that the initial audio buffer is set to 250K and the initial video buffer is 1000K.
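As an illustrative, non-limiting sketch (not part of the original disclosure), the sizing rule above can be expressed as follows; the function name initial_buffer_sizes and the use of kilobyte-scale units are assumptions made only for illustration:

def initial_buffer_sizes(initial_play_seconds: float,
                         audio_code_rate_k: float,
                         video_code_rate_k: float) -> tuple[float, float]:
    """Return (initial_audio_buffer_k, initial_video_buffer_k)."""
    # Initial buffer = initial playing duration x code rate, per the rule above.
    return (initial_play_seconds * audio_code_rate_k,
            initial_play_seconds * video_code_rate_k)

# Example from the text: 5 s, 50K audio, 200K video -> (250, 1000), i.e. 250K and 1000K.
print(initial_buffer_sizes(5, 50, 200))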
In the subsequent operation, the audio buffer area and the video buffer area can be adjusted according to the corresponding buffer area adjustment strategy, and the adjusted audio buffer area and video buffer area are acquired.
Step 104: and caching the audio data of the target audio and video into the audio cache region, and caching the video data of the target audio and video into the video cache region.
After the audio data cache area and the video cache area of the target audio and video are determined, the audio data and the video data of the target audio and video can be received, the audio data are cached in the audio cache area, and the video data are cached in the video cache area.
Step 106: and acquiring the audio cache information of the audio cache region, and acquiring the playing cache information corresponding to the target audio and video and the video cache information of the video cache region.
In the process of caching audio data and video data, monitoring audio cache information of an audio cache region, wherein the audio cache information specifically refers to information of the audio data cached in the audio cache region, and can include audio cache duration, audio cache rate of the audio cache region and the like.
By acquiring the audio buffer information of the audio buffer area, the network bandwidth is conveniently distributed according to the audio buffer information, so that the audio bandwidth and the video bandwidth are dynamically adjusted in the process of buffering the audio data and the video data.
In practical application, audio data is cached to the audio buffer area and video data to the video buffer area. A playing buffer area is further arranged in the terminal, and the target audio and video data used for playing is cached in the playing buffer area. That is, the audio data of a target frame is obtained from the audio buffer area, the video data of the target frame is obtained from the video buffer area, the audio data and the video data corresponding to the target frame are combined to generate audio and video data, and the audio and video data is cached to the playing buffer area; the target audio and video data in the playing buffer area can be used directly by the player for playing.
The playing buffer information of the target audio and video in the playing buffer area is obtained, and specifically, the playing buffer information may include playable duration of the target audio and video, playing buffer rate of the playing buffer area, and the like.
The video buffering information specifically refers to information of video data buffered in the video buffering area, and may include a video buffering duration, a video buffering rate of the video buffering area, and the like.
In the process of caching the video data and the audio data of the target audio and video, the playing cache information in the playing cache area and the video cache information in the video cache area can be obtained in real time, and the sizes of the video cache area and the audio cache area can be adjusted in an assisted mode according to the playing cache information and the video cache information.
Step 108: and distributing video bandwidth and audio bandwidth for the target audio and video according to the audio cache information.
After the audio buffer information is obtained, the audio bandwidth and the video bandwidth can be dynamically adjusted in real time according to the specific condition of the audio buffer information, wherein the audio bandwidth specifically refers to the bandwidth allocated for buffering the audio data, the video bandwidth specifically refers to the bandwidth allocated for buffering the video data, in practical application, the total bandwidth of the terminal is fixed, and the audio bandwidth and the video bandwidth are determined from the total bandwidth.
In an embodiment of the present application, further, the audio buffer information includes an audio buffer rate, and accordingly, video bandwidth and audio bandwidth need to be allocated to a target audio and video according to the audio buffer rate, specifically, video bandwidth and audio bandwidth need to be allocated to the target audio and video according to the audio buffer information, including:
adjusting video bandwidth to be equal to audio bandwidth under the condition that the audio buffer rate is smaller than or equal to a preset audio threshold, wherein the preset audio threshold is determined according to the audio buffer area;
and adjusting video bandwidth to be larger than audio bandwidth under the condition that the audio buffer rate is larger than the preset audio threshold value.
The audio buffer rate specifically refers to a buffer ratio of audio data in an audio buffer, for example, the size of the audio buffer is 400K, and if the buffered audio data has 200K, the audio buffer rate is 50%; if the size of the audio buffer area is 500K, the buffered audio data has 450K, and the audio buffer rate is 90%.
The preset audio threshold specifically refers to a preset threshold for the audio data in the audio buffer zone. If the audio buffer rate of the audio buffer zone is greater than the preset audio threshold, it indicates that the data in the audio buffer zone is saturated at this time, and no more audio bandwidth needs to be allocated for buffering audio data; if the audio buffer rate of the audio buffer zone is smaller than or equal to the preset audio threshold, it indicates that there is still space in the audio buffer zone for buffering audio data, and the same bandwidth can be allocated to the audio data and the video data. Specifically, the preset audio threshold may be set relatively high, for example 99%, 99.5%, etc.; the specific setting of the preset audio threshold depends on the practical application.
In a specific embodiment of the present application, taking the preset audio threshold of 99% as an example, the audio buffer rate of the audio buffer area is obtained as 60%. Since the audio buffer rate is smaller than the preset audio threshold, it can be determined that the audio buffer area is not filled yet, and the audio bandwidth is kept the same as the video bandwidth.
In another specific embodiment provided by the application, taking the preset audio threshold value as 99% as an example, the audio buffer rate of the audio buffer area is obtained to be 99.3%, and at the moment, the audio buffer rate is larger than the preset audio threshold value, the audio buffer area can be determined to be filled, and more network bandwidth is allocated to the video bandwidth, so that more network traffic can be obtained for the video data, and the buffering of the video data is quickened.
By monitoring the audio buffer information (audio buffer rate) in the audio buffer area, the video bandwidth and the audio bandwidth can be dynamically adjusted, more bandwidth resources can be allocated to the video data under the condition that the audio buffer rate of the audio data in the audio buffer area is enough, the video buffer efficiency is improved, and the waste of network bandwidth is reduced.
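As an illustrative, non-limiting sketch of the allocation rule above (not part of the original disclosure); the function name, the 99% default threshold, the idea of splitting a known total bandwidth in half, and the small residual audio share are assumptions made only for illustration:

def allocate_bandwidth(total_bandwidth: float,
                       audio_buffer_rate: float,
                       preset_audio_threshold: float = 0.99) -> tuple[float, float]:
    """Return (audio_bandwidth, video_bandwidth); audio_buffer_rate is the fill ratio (0.0-1.0)."""
    if audio_buffer_rate <= preset_audio_threshold:
        # Audio buffer still has room: audio and video get equal bandwidth.
        return total_bandwidth / 2, total_bandwidth / 2
    # Audio buffer is effectively full: shift most bandwidth to the video.
    audio_bandwidth = total_bandwidth * 0.05  # small residual share, an assumed value
    return audio_bandwidth, total_bandwidth - audio_bandwidth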
It should be noted that, at this time, the audio buffer and the video buffer have not reached their maximum values, and the buffered audio data and video data are not yet all of the data of the target audio and video, so the sizes of the audio buffer and the video buffer also need to be dynamically adjusted, so that the audio buffer area and the video buffer area can eventually buffer all of the audio data and video data of the target audio and video.
Step 110: and determining a target cache region adjustment strategy according to the playing cache information and the video cache information, and adjusting the audio cache region and the video cache region based on the target cache region adjustment strategy.
The target buffer adjustment policy specifically refers to a buffer size adjustment policy determined according to current playing buffer information and video buffer information, where the adjustment policy is used to adjust the size of an audio buffer and the size of a video buffer. And adjusting the audio buffer area and the video buffer area according to the target buffer area adjusting strategy.
In the method provided by the application, the audio buffer area and the video buffer area are not mechanically increased according to time, but are dynamically and adaptively adjusted according to the current playing buffer information and video buffer information.
Specifically, the playing cache information comprises a cache playing time length, and the video cache information comprises a video cache rate;
Determining a target buffer adjustment strategy according to the play buffer information and the video buffer information, including:
under the condition that the playing cache information meets the preset playing condition, determining the target cache area adjusting strategy as the maximum adjusting strategy;
determining a target cache region adjustment strategy as an increase adjustment strategy under the condition that the play cache information does not meet a preset play condition and the video cache rate is greater than or equal to a first preset video threshold value;
Determining a target cache region adjustment strategy to be a maintenance adjustment strategy under the condition that the play cache information does not meet a preset play condition, and the video cache rate is larger than a second preset video threshold and smaller than the first preset video threshold;
And determining the target cache region adjustment strategy as a reduction adjustment strategy under the condition that the play cache information does not meet the preset play condition and the video cache rate is smaller than or equal to the second preset video threshold value.
In practical application, the playing cache information comprises cache playing time length, playing cache rate and the like; the video cache information comprises a video cache rate, and the cache playing time length specifically refers to the playing time length of a target audio/video supported to be played in a playing cache region, for example, if the playing cache region is cached with the target audio/video for 8 minutes, the playing time length of the cache is 8 minutes. The play buffer rate represents the buffer ratio of the audio and video in the play buffer area.
The video buffer rate is the same as the audio buffer rate described above, and is used to represent the buffer ratio of video data in the video buffer. The calculation method of the video buffer rate refers to the calculation method of the audio buffer rate in the above embodiment, and will not be described herein.
Furthermore, the buffer adjustment strategy is first judged according to the playing buffer information. If the playing buffer information meets the preset playing condition, it indicates that the user's current network speed is good, the network bandwidth does not need to be rationed, and the video buffer area and the audio buffer area can be directly adjusted to their maximums.
Specifically, the preset playing condition can be checked from two dimensions, namely the cached playing duration and the playing buffer rate. If the cached playing duration is longer than the preset playing duration and the playing buffer rate is greater than a preset buffer threshold, it can be determined that the playing buffer information meets the preset playing condition. For example, taking the preset playing duration as 30 seconds and the preset buffer threshold as 50%, if the current cached playing duration is 2 minutes and the playing buffer rate is 60%, the playing buffer information meets the preset playing condition, and it can be determined that the target buffer adjustment policy is the maximum adjustment policy.
In another embodiment provided by the present application, if the playing buffer information does not meet the preset playing condition, that is, the cached playing duration is less than or equal to the preset playing duration or the playing buffer rate is less than or equal to the preset buffer threshold, it indicates that the user's network condition is only average. Because the video code rate is higher than the audio code rate, the video data is consumed faster, and the current network condition of the terminal can be further judged through the video buffer rate of the video buffer zone.
Specifically, if the playing buffer information does not meet the preset playing condition and the video buffer rate is greater than or equal to the first preset video threshold, which indicates that the network state is good at this time, the size of the video buffer area and the audio buffer area can be further increased, so that more video data and audio data are stored, that is, the target buffer area adjustment strategy at this time is an increase adjustment strategy.
If the playing buffer information does not meet the preset playing condition and the video buffer rate is greater than the second preset video threshold and smaller than the first preset video threshold, it indicates that the network state is poor at this time and the video data in the video buffer area can barely support the playing of the target audio and video. The network bandwidth occupied by the audio data needs to be reduced as much as possible so that more network bandwidth resources are allocated for buffering the video data, so the target buffer area adjustment strategy at this time can be determined to be the maintenance adjustment strategy.
If the playing buffer information does not meet the preset playing condition and the video buffer rate is smaller than or equal to the second preset video threshold, it indicates that the user has performed a Seek operation on the target audio and video (rapidly dragging playback from the current position to a given position and playing from there) or that the network speed has suddenly dropped. At this time, the length of the audio buffer needs to be reduced to limit the network speed for which the audio data competes with the video data, that is, the target buffer adjustment strategy is determined to be the reduction adjustment strategy.
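As an illustrative, non-limiting sketch of the strategy selection described above (not part of the original disclosure); the 30-second and 50% values come from the example in the text, while the first and second video thresholds (0.8 and 0.2) and all names are assumptions made only for illustration:

from enum import Enum

class AdjustPolicy(Enum):
    MAXIMUM = "maximum"
    INCREASE = "increase"
    MAINTAIN = "maintain"
    DECREASE = "decrease"

def select_policy(cached_play_seconds: float,
                  play_buffer_rate: float,
                  video_buffer_rate: float,
                  preset_play_seconds: float = 30.0,
                  preset_buffer_threshold: float = 0.5,
                  first_video_threshold: float = 0.8,
                  second_video_threshold: float = 0.2) -> AdjustPolicy:
    # Preset playing condition: enough cached playing duration AND a high enough
    # play-buffer fill ratio (30 s / 50% are the example values from the text).
    if (cached_play_seconds > preset_play_seconds
            and play_buffer_rate > preset_buffer_threshold):
        return AdjustPolicy.MAXIMUM
    if video_buffer_rate >= first_video_threshold:
        return AdjustPolicy.INCREASE
    if second_video_threshold < video_buffer_rate < first_video_threshold:
        return AdjustPolicy.MAINTAIN
    return AdjustPolicy.DECREASE  # video_buffer_rate <= second_video_threshold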
After the target buffer adjustment strategy is determined, the audio buffer and the video buffer corresponding to the target audio and video can be adjusted according to the target buffer adjustment strategy. The following describes the adjustment of the audio buffer and the video buffer with four embodiments for each buffer adjustment strategy.
In one embodiment of the present application, when the target buffer adjustment policy is the maximum adjustment policy, adjusting the audio buffer and the video buffer based on the target buffer adjustment policy includes:
determining the maximum value of an audio buffer area and the maximum value of a video buffer area based on the target audio and video;
And adjusting the audio buffer area to the maximum value of the audio buffer area, and adjusting the video buffer area to the maximum value of the video buffer area.
The maximum value of the audio buffer area specifically refers to the maximum buffer value required by the audio buffer area, and the maximum value of the video buffer area specifically refers to the maximum buffer value required by the video buffer area. Under the condition that the target buffer area adjustment strategy is the maximum adjustment strategy, the fact that the current network speed of the user is better at the moment is indicated, and the video buffer area and the audio buffer area are directly adjusted to the corresponding maximum values.
It should be noted that, because the audio code rate and the video code rate are different, the maximum value of the audio buffer area and the maximum value of the video buffer area can be determined based on the audio code rate and the video code rate respectively, so that unnecessary waste of network bandwidth is avoided and the audio data in the audio buffer area and the video data in the video buffer area can support the same playing duration. Specifically, determining the maximum value of the audio buffer area and the maximum value of the video buffer area based on the target audio and video includes:
Determining the total playing time length, the audio code rate and the video code rate of the target audio and video;
Determining the maximum value of an audio buffer area according to the total playing time length and the audio code rate;
and determining the maximum value of the video buffer area according to the total playing time length and the video code rate.
The total playing time length of the target audio and video specifically refers to the maximum time length that the target audio and video can support playing, the maximum value of the audio buffer area and the maximum value of the video buffer area can be calculated according to the audio code rate and the video code rate, and specifically, the maximum value of the audio buffer area can be determined according to the product of the total playing time length and the audio code rate; and determining the maximum value of the video buffer area according to the product of the total playing time length and the video code rate. For example, the total playing duration is 500 seconds, the audio code rate is 50K, the video code rate is 200K, the maximum value of the audio buffer is 25000K, and the maximum value of the video buffer is 100000K.
In one embodiment of the present application, when the target buffer adjustment policy is an addition adjustment policy, adjusting the audio buffer and the video buffer based on the target buffer adjustment policy includes:
Determining audio buffer area increment information and video buffer area increment information;
and increasing the audio buffer area based on the audio buffer area increment information, and increasing the video buffer area based on the video buffer area increment information.
When the target buffer adjustment strategy is the increase adjustment strategy, it indicates that the network state is good and the capacity of the video buffer area and the audio buffer area can be further increased so that more video data and audio data can be stored. Specifically, the audio buffer increment information and the video buffer increment information are determined, where the audio buffer increment information specifically refers to the preset increase information of the audio buffer under the increase adjustment strategy, and the video buffer increment information specifically refers to the preset increase information of the video buffer under the increase adjustment strategy. For example, it may be determined that both the audio buffer increment information and the video buffer increment information are to double the current length. If, at this time, the audio buffer area is 250K and the video buffer area is 1000K, the audio buffer area is adjusted to 500K and the video buffer area to 2000K by doubling.
In one embodiment of the present application, when the target buffer adjustment policy is a maintenance adjustment policy, adjusting the audio buffer and the video buffer based on the target buffer adjustment policy includes:
and keeping the audio buffer area and the video buffer area unchanged.
Under the condition that the target buffer area adjustment strategy is the maintenance adjustment strategy, the condition that the network state is poor at the moment is indicated, the video data in the video buffer area are difficult to support the playing of the target audio and video, and the sizes of the audio buffer area and the video buffer area are kept unchanged at the moment.
In one embodiment of the present application, when the target buffer adjustment policy is a decrease adjustment policy, adjusting the audio buffer and the video buffer based on the target buffer adjustment policy includes:
determining audio buffer area decrement information and video buffer area decrement information;
And reducing the audio buffer area based on the audio buffer area reduction information, and reducing the video buffer area based on the video buffer area reduction information.
Under the condition that the target buffer adjustment strategy is a reduction adjustment strategy, the fact that a user performs Seek operation or suddenly reduces the network speed on the target audio and video is explained, at the moment, the length of an audio buffer is required to be reduced to limit the competing network speed of the audio data and the video data, specifically, the audio buffer reduction information and the video buffer reduction information are determined, the audio buffer reduction information specifically refers to preset reduction information of the audio buffer under the condition of reducing the adjustment strategy, and the video buffer reduction information specifically refers to preset reduction information of the video buffer under the condition of reducing the adjustment strategy.
Specifically, the decrement information may be determined according to a preset coefficient applied to the maximum value of the audio buffer area and the maximum value of the video buffer area, where the preset coefficient may be a decimal between 0 and 1. For example, with a preset coefficient of 0.3, an audio buffer maximum of 25000K and a video buffer maximum of 100000K, it can be determined that the audio buffer decrement information is 7500K and the video buffer decrement information is 30000K; in practical application, the audio buffer area may be reduced by 7500K and the video buffer area by 30000K.
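As an illustrative, non-limiting sketch of how the four adjustment strategies described above might be applied (not part of the original disclosure); the maximum values follow the "total duration x code rate" rule, the doubling and the 0.3 coefficient follow the examples in the text, and capping the increase at the maxima is an extra assumption:

def apply_policy(policy: str,
                 audio_buffer_k: float, video_buffer_k: float,
                 total_play_seconds: float,
                 audio_code_rate_k: float, video_code_rate_k: float,
                 decrease_coefficient: float = 0.3) -> tuple[float, float]:
    """Return the adjusted (audio_buffer_k, video_buffer_k).

    policy is one of "maximum", "increase", "maintain", "decrease".
    """
    # Maximum buffer = total playing duration x code rate (e.g. 500 s x 50K -> 25000K).
    audio_max = total_play_seconds * audio_code_rate_k
    video_max = total_play_seconds * video_code_rate_k
    if policy == "maximum":
        return audio_max, video_max
    if policy == "increase":
        # Example increment from the text: double both buffers.
        # Capping at the maxima is an extra assumption, not stated in the text.
        return min(audio_buffer_k * 2, audio_max), min(video_buffer_k * 2, video_max)
    if policy == "maintain":
        return audio_buffer_k, video_buffer_k
    # "decrease": shrink by coefficient x maximum (0.3 x 25000K = 7500K in the example).
    return (max(audio_buffer_k - decrease_coefficient * audio_max, 0.0),
            max(video_buffer_k - decrease_coefficient * video_max, 0.0))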
The network bandwidth allocation method provided by the application determines an audio buffer area and a video buffer area, wherein the audio buffer area is smaller than the video buffer area; caches the audio data of the target audio and video into the audio buffer area, and caches the video data of the target audio and video into the video buffer area; acquires audio buffer information of the audio buffer area, and acquires playing buffer information corresponding to the target audio and video and video buffer information of the video buffer area; allocates video bandwidth and audio bandwidth for the target audio and video according to the audio buffer information; and determines a target buffer area adjustment strategy according to the playing buffer information and the video buffer information and adjusts the audio buffer area and the video buffer area based on the target buffer area adjustment strategy. By this method, a larger video buffer area and a smaller audio buffer area are set, the audio buffer information in the audio buffer area is monitored, and the allocation of video bandwidth and audio bandwidth is dynamically adjusted through the audio buffer information, so that more bandwidth is allocated to the video when the audio can be cached quickly, the video buffering efficiency is improved, bandwidth waste is reduced, audio and video playback stutter is reduced, and user experience is improved.
Secondly, the sizes of the audio buffer area and the video buffer area can be dynamically adjusted according to the playing buffer information and the video buffer information, and adjusting the sizes of the audio buffer area and the video buffer area in turn influences the audio buffer information, so that the video bandwidth and the audio bandwidth are further dynamically adjusted, bandwidth waste is reduced, audio and video playback stutter is reduced, and user experience is improved.
Referring to fig. 2, fig. 2 shows a schematic diagram of an audio/video play-start growth strategy according to an embodiment of the present application. The buffer status of the user terminal is obtained; the upper-layer buffer can be understood as the audio and video data in the playing buffer area. If the upper-layer buffer is sufficient, it indicates that the user's current network speed is good, the network bandwidth does not need to be rationed, and the video buffer area and the audio buffer area can be directly adjusted to their maximums.
If the upper-layer buffer is insufficient, it is further judged whether the video data in the video buffer area is sufficient; if the video data in the video buffer area is sufficient, the audio buffer area and the video buffer area are doubled.
Referring to fig. 3, fig. 3 shows a schematic diagram of an audio/video reduction adjustment policy provided by an embodiment of the present application. As shown in fig. 3, the buffer status of the user terminal is obtained; when the user performs a Seek operation or the network speed suddenly drops, the upper-layer buffer is insufficient and there is little data in the video buffer, so the audio buffer and the video buffer need to be reset to smaller values. The smaller audio buffer fills up faster, which raises the audio buffer rate of the audio buffer area and thereby increases the video bandwidth.
The network bandwidth allocation method provided by the application is further described below, taking its application to buffering a certain audio/video as an example, with reference to fig. 4. Fig. 4 shows a process flow chart of a network bandwidth allocation method applied to buffering an audio/video according to an embodiment of the present application, which specifically includes the following steps:
step 402: and receiving the audio and video caching instruction, and distributing a video caching area and an audio caching area for the audio and video, wherein the video caching area is larger than the audio caching area.
Step 404: and averagely distributing video bandwidth and audio bandwidth, caching video data in a video cache area, and caching audio data in an audio cache area.
Step 406: after the audio data in the audio buffer area is full, the audio bandwidth is reduced, the video bandwidth is improved, and the buffer speed of the video data is increased.
Step 408: when the video data in the video buffer area is full, the audio buffer area is doubled, and the video buffer area is doubled.
Step 410: and averagely distributing video bandwidth and audio bandwidth, continuing to buffer video data to the video buffer area, and buffering audio data to the audio buffer area.
Step 412: after the audio data in the audio buffer area is full, the audio bandwidth is reduced, the video bandwidth is improved, and the buffer speed of the video data is increased.
Step 414: the network speed suddenly decreases, and the audio data in the audio buffer and the video data in the video buffer extremely decrease.
Step 416: the size of the audio buffer area is halved, and the size of the video buffer area is halved.
Step 418: waiting for the network speed to be increased, doubling the audio buffer area and doubling the video buffer area when the video data in the video buffer area is full.
Corresponding to the above method embodiment, the present application further provides an embodiment of a network bandwidth allocation device, and fig. 5 shows a schematic structural diagram of a network bandwidth allocation device according to an embodiment of the present application. As shown in fig. 5, the apparatus includes:
A determining module 502 configured to determine an audio buffer and a video buffer, wherein the audio buffer is smaller than the video buffer;
The buffer module 504 is configured to buffer the audio data of the target audio and video to the audio buffer, and buffer the video data of the target audio and video to the video buffer;
the obtaining module 506 is configured to obtain audio buffer information of the audio buffer, and obtain playing buffer information corresponding to the target audio and video and video buffer information of the video buffer;
an allocation module 508 configured to allocate a video bandwidth and an audio bandwidth to the target audio and video according to the audio buffer information;
And the adjusting module 510 is configured to determine a target buffer adjustment policy according to the play buffer information and the video buffer information, and adjust the audio buffer and the video buffer based on the target buffer adjustment policy.
Optionally, the audio buffering information includes an audio buffering rate;
the allocation module 508 is further configured to:
adjusting video bandwidth to be equal to audio bandwidth under the condition that the audio buffer rate is smaller than or equal to a preset audio threshold, wherein the preset audio threshold is determined according to the audio buffer area;
and adjusting video bandwidth to be larger than audio bandwidth under the condition that the audio buffer rate is larger than the preset audio threshold value.
Optionally, the video buffering information includes a video buffering rate;
the adjustment module 510 is further configured to:
under the condition that the playing cache information meets the preset playing condition, determining the target cache area adjusting strategy as the maximum adjusting strategy;
determining a target cache region adjustment strategy as an increase adjustment strategy under the condition that the play cache information does not meet a preset play condition and the video cache rate is greater than or equal to a first preset video threshold value;
Determining a target cache region adjustment strategy to be a maintenance adjustment strategy under the condition that the play cache information does not meet a preset play condition, and the video cache rate is larger than a second preset video threshold and smaller than the first preset video threshold;
And determining the target cache region adjustment strategy as a reduction adjustment strategy under the condition that the play cache information does not meet the preset play condition and the video cache rate is smaller than or equal to the second preset video threshold value.
Optionally, in the case that the target cache adjustment policy is the maximum adjustment policy, the adjustment module 510 is further configured to:
determining the maximum value of an audio buffer area and the maximum value of a video buffer area based on the target audio and video;
And adjusting the audio buffer area to the maximum value of the audio buffer area, and adjusting the video buffer area to the maximum value of the video buffer area.
Optionally, the adjusting module 510 is further configured to:
Determining the total playing time length, the audio code rate and the video code rate of the target audio and video;
Determining the maximum value of an audio buffer area according to the total playing time length and the audio code rate;
and determining the maximum value of the video buffer area according to the total playing time length and the video code rate.
Optionally, in the case that the target cache adjustment policy is an increase adjustment policy, the adjustment module 510 is further configured to:
Determining audio buffer area increment information and video buffer area increment information;
and increasing the audio buffer area based on the audio buffer area increment information, and increasing the video buffer area based on the video buffer area increment information.
Optionally, in the case that the target cache adjustment policy is a maintenance adjustment policy, the adjustment module 510 is further configured to:
and keeping the audio buffer area and the video buffer area unchanged.
Optionally, in the case that the target cache adjustment policy is a decrease adjustment policy, the adjustment module 510 is further configured to:
determining audio buffer area decrement information and video buffer area decrement information;
And reducing the audio buffer area based on the audio buffer area reduction information, and reducing the video buffer area based on the video buffer area reduction information.
Optionally, the determining module 502 is further configured to:
Determining the initial playing time length, the audio code rate and the video code rate of the target audio and video;
determining an initial audio buffer area based on the initial playing time length and the audio code rate;
And determining an initial video cache region based on the initial playing time length and the video code rate.
The network bandwidth allocation device provided by the application determines an audio buffer area and a video buffer area, wherein the audio buffer area is smaller than the video buffer area; caches the audio data of the target audio and video into the audio buffer area, and caches the video data of the target audio and video into the video buffer area; acquires audio buffer information of the audio buffer area, and acquires playing buffer information corresponding to the target audio and video and video buffer information of the video buffer area; allocates video bandwidth and audio bandwidth for the target audio and video according to the audio buffer information; and determines a target buffer area adjustment strategy according to the playing buffer information and the video buffer information and adjusts the audio buffer area and the video buffer area based on the target buffer area adjustment strategy. By means of this device, a larger video buffer area and a smaller audio buffer area are set, the audio buffer information in the audio buffer area is monitored, and the allocation of video bandwidth and audio bandwidth is dynamically adjusted through the audio buffer information, so that more bandwidth is allocated to the video when the audio can be cached quickly, the video buffering efficiency is improved, bandwidth waste is reduced, audio and video playback stutter is reduced, and user experience is improved.
Secondly, the sizes of the audio buffer area and the video buffer area can be dynamically adjusted according to the playing buffer information and the video buffer information, and adjusting the sizes of the audio buffer area and the video buffer area in turn influences the audio buffer information, so that the video bandwidth and the audio bandwidth are further dynamically adjusted, bandwidth waste is reduced, audio and video playback stutter is reduced, and user experience is improved.
The foregoing is a schematic solution of a network bandwidth allocation apparatus of this embodiment. It should be noted that, the technical solution of the network bandwidth allocation apparatus and the technical solution of the network bandwidth allocation method belong to the same concept, and details of the technical solution of the network bandwidth allocation apparatus, which are not described in detail, can be referred to the description of the technical solution of the network bandwidth allocation method.
Fig. 6 illustrates a block diagram of a computing device 600 provided in accordance with an embodiment of the present application. The components of computing device 600 include, but are not limited to, memory 610 and processor 620. The processor 620 is coupled to the memory 610 via a bus 630 and a database 650 is used to hold data.
Computing device 600 also includes access device 640, access device 640 enabling computing device 600 to communicate via one or more networks 660. Examples of such networks include the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the internet. The access device 640 may include one or more of any type of network interface (e.g., a Network Interface Card (NIC)) whether wired or wireless, such as an IEEE802.11 Wireless Local Area Network (WLAN) wireless interface, a worldwide interoperability for microwave access (Wi-MAX) interface, an ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a bluetooth interface, a Near Field Communication (NFC) interface, and so forth.
In one embodiment of the application, the above-described components of computing device 600, as well as other components not shown in FIG. 6, may also be connected to each other, such as by a bus. It should be understood that the block diagram of the computing device illustrated in FIG. 6 is for exemplary purposes only and is not intended to limit the scope of the present application. Those skilled in the art may add or replace other components as desired.
Computing device 600 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), mobile phone (e.g., smart phone), wearable computing device (e.g., smart watch, smart glasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 600 may also be a mobile or stationary server.
The processor 620, when executing the computer instructions, implements the steps of the network bandwidth allocation method.
The foregoing is a schematic illustration of a computing device of this embodiment. It should be noted that the technical solution of the computing device and the technical solution of the network bandwidth allocation method belong to the same concept; for details of the computing device that are not described here, reference may be made to the description of the network bandwidth allocation method.
An embodiment of the application also provides a computer-readable storage medium storing computer instructions that, when executed by a processor, implement the steps of the network bandwidth allocation method as described above.
The above is an exemplary version of a computer-readable storage medium of the present embodiment. It should be noted that the technical solution of the storage medium and the technical solution of the network bandwidth allocation method belong to the same concept; for details of the storage medium that are not described here, reference may be made to the description of the network bandwidth allocation method.
The foregoing describes certain embodiments of the present application. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The computer instructions include computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer readable medium may be added to or removed as appropriate according to the requirements of legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, in accordance with legislation and patent practice, the computer readable medium does not include electrical carrier signals and telecommunications signals.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of combinations of actions. However, those skilled in the art should understand that the present application is not limited by the order of actions described, as some steps may be performed in another order or simultaneously in accordance with the present application. Further, those skilled in the art will appreciate that the embodiments described in the specification are all preferred embodiments, and that the actions and modules involved are not necessarily all required by the present application.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
The preferred embodiments of the application disclosed above are intended only to assist in the explanation of the application. Alternative embodiments are not intended to be exhaustive or to limit the application to the precise form disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the application and the practical application, to thereby enable others skilled in the art to best understand and utilize the application. The application is limited only by the claims and the full scope and equivalents thereof.

Claims (12)

1. A method for allocating network bandwidth, comprising:
determining an audio buffer area and a video buffer area, wherein the audio buffer area is smaller than the video buffer area;
Caching the audio data of the target audio and video into the audio cache region, and caching the video data of the target audio and video into the video cache region;
acquiring audio buffer information of the audio buffer area, video buffer information of the video buffer area, and playing buffer information corresponding to the target audio and video, wherein the audio buffer information is information of the audio data buffered in the audio buffer area, the video buffer information is information of the video data buffered in the video buffer area, the audio buffer information comprises an audio buffer rate, the video buffer information comprises a video buffer rate, and the playing buffer information comprises a playable duration of the target audio and video or a playing buffer rate of a playing buffer area;
distributing video bandwidth and audio bandwidth for the target audio and video according to the audio cache information;
and determining a target cache region adjustment strategy according to the playing cache information and the video cache information, and adjusting the audio cache region and the video cache region based on the target cache region adjustment strategy, wherein the target cache region adjustment strategy comprises a maximum adjustment strategy, an increase adjustment strategy, a maintenance adjustment strategy or a reduction adjustment strategy.
2. The method of claim 1, wherein allocating video bandwidth and audio bandwidth for the target audio-video based on the audio buffering information comprises:
adjusting video bandwidth to be equal to audio bandwidth under the condition that the audio buffer rate is smaller than or equal to a preset audio threshold, wherein the preset audio threshold is determined according to the audio buffer area;
and adjusting video bandwidth to be larger than audio bandwidth under the condition that the audio buffer rate is larger than the preset audio threshold value.
3. The method of claim 1, wherein determining a target buffer adjustment policy based on the play buffer information and the video buffer information comprises:
determining the target cache region adjustment strategy to be the maximum adjustment strategy under the condition that the play cache information meets a preset play condition;
determining the target cache region adjustment strategy to be the increase adjustment strategy under the condition that the play cache information does not meet the preset play condition and the video cache rate is greater than or equal to a first preset video threshold;
determining the target cache region adjustment strategy to be the maintenance adjustment strategy under the condition that the play cache information does not meet the preset play condition and the video cache rate is greater than a second preset video threshold and less than the first preset video threshold;
and determining the target cache region adjustment strategy to be the reduction adjustment strategy under the condition that the play cache information does not meet the preset play condition and the video cache rate is less than or equal to the second preset video threshold.
4. The method of claim 3, wherein adjusting the audio buffer and the video buffer based on the target buffer adjustment policy if the target buffer adjustment policy is a maximum adjustment policy comprises:
determining the maximum value of an audio buffer area and the maximum value of a video buffer area based on the target audio and video;
And adjusting the audio buffer area to the maximum value of the audio buffer area, and adjusting the video buffer area to the maximum value of the video buffer area.
5. The method of claim 4, wherein determining an audio buffer maximum and a video buffer maximum based on the target audio-video comprises:
Determining the total playing time length, the audio code rate and the video code rate of the target audio and video;
Determining the maximum value of an audio buffer area according to the total playing time length and the audio code rate;
and determining the maximum value of the video buffer area according to the total playing time length and the video code rate.
6. The method of claim 3, wherein adjusting the audio buffer and the video buffer based on the target buffer adjustment policy if the target buffer adjustment policy is an increase adjustment policy comprises:
Determining audio buffer area increment information and video buffer area increment information;
and increasing the audio buffer area based on the audio buffer area increment information, and increasing the video buffer area based on the video buffer area increment information.
7. The method of claim 3, wherein adjusting the audio buffer and the video buffer based on the target buffer adjustment policy if the target buffer adjustment policy is a maintenance adjustment policy comprises:
and keeping the audio buffer area and the video buffer area unchanged.
8. The method of claim 3, wherein adjusting the audio buffer and the video buffer based on the target buffer adjustment policy if the target buffer adjustment policy is a decrease adjustment policy comprises:
determining audio buffer area decrement information and video buffer area decrement information;
And reducing the audio buffer area based on the audio buffer area reduction information, and reducing the video buffer area based on the video buffer area reduction information.
9. The method of claim 1, wherein determining an audio buffer and a video buffer comprises:
Determining the initial playing time length, the audio code rate and the video code rate of the target audio and video;
determining an initial audio buffer area based on the initial playing time length and the audio code rate;
And determining an initial video cache region based on the initial playing time length and the video code rate.
10. A network bandwidth allocation apparatus, comprising:
the determining module is configured to determine an audio cache area and a video cache area, wherein the audio cache area is smaller than the video cache area;
the buffer module is configured to buffer the audio data of the target audio and video to the audio buffer area and buffer the video data of the target audio and video to the video buffer area;
the acquisition module is configured to acquire audio cache information of the audio cache region, video cache information of the video cache region, and play cache information corresponding to the target audio and video, wherein the audio cache information is information of the audio data cached in the audio cache region, the video cache information is information of the video data cached in the video cache region, the audio cache information comprises an audio cache rate, the video cache information comprises a video cache rate, and the play cache information comprises a playable duration of the target audio and video or a play cache rate of a play cache region;
the allocation module is configured to allocate video bandwidth and audio bandwidth for the target audio and video according to the audio cache information;
And the adjusting module is configured to determine a target cache area adjusting strategy according to the playing cache information and the video cache information, and adjust the audio cache area and the video cache area based on the target cache area adjusting strategy, wherein the target cache area adjusting strategy comprises a maximum adjusting strategy, an increasing adjusting strategy, a maintaining adjusting strategy or a reducing adjusting strategy.
11. A computing device comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, wherein the processor, when executing the computer instructions, performs the steps of the method of any one of claims 1-9.
12. A computer readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the method of any one of claims 1-9.
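As an informal aside (not part of the claims), the cache-region sizing described in claims 5 and 9 amounts to multiplying a playing duration by a code rate. The sketch below illustrates this arithmetic; the unit conventions (seconds and bits per second) and the example figures are assumptions, not values taken from the application.

```python
def cache_size_bits(duration_s: float, bitrate_bps: float) -> float:
    """Cache size needed to hold duration_s seconds of data at bitrate_bps."""
    return duration_s * bitrate_bps

# Assumed example: a 600 s target audio/video with a 128 kbps audio track,
# a 2 Mbps video track, and an initial playing duration of 10 s.
initial_audio_cache = cache_size_bits(10, 128_000)       # claim 9: initial region
initial_video_cache = cache_size_bits(10, 2_000_000)
max_audio_cache = cache_size_bits(600, 128_000)          # claim 5: region maximum
max_video_cache = cache_size_bits(600, 2_000_000)
```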
CN202210646907.XA 2022-06-09 2022-06-09 Network bandwidth allocation method and device Active CN114866814B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210646907.XA CN114866814B (en) 2022-06-09 2022-06-09 Network bandwidth allocation method and device

Publications (2)

Publication Number Publication Date
CN114866814A CN114866814A (en) 2022-08-05
CN114866814B true CN114866814B (en) 2024-04-30

Family

ID=82623977

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210646907.XA Active CN114866814B (en) 2022-06-09 2022-06-09 Network bandwidth allocation method and device

Country Status (1)

Country Link
CN (1) CN114866814B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117440209B (en) * 2023-12-15 2024-03-01 牡丹江师范学院 Implementation method and system based on singing scene

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6111863A (en) * 1995-12-29 2000-08-29 Lsi Logic Corporation Method and apparatus for the dynamic allocation of signal bandwidth between audio, video and data signals
CN102724584A (en) * 2012-06-18 2012-10-10 Tcl集团股份有限公司 Method and device for playing network videos online and smart television
CN107517400A (en) * 2016-06-15 2017-12-26 成都鼎桥通信技术有限公司 Flow media playing method and DST PLAYER
CN109194974A (en) * 2018-09-28 2019-01-11 北京北斗方圆电子科技有限公司 Media low latency communication means and system for internet video live broadcasting
CN110634511A (en) * 2019-09-27 2019-12-31 北京西山居互动娱乐科技有限公司 Audio data processing method and device
CN111510761A (en) * 2019-01-30 2020-08-07 上海哔哩哔哩科技有限公司 First frame equalization current limiting method and device, computer equipment and readable storage medium
CN111918093A (en) * 2020-08-13 2020-11-10 腾讯科技(深圳)有限公司 Live broadcast data processing method and device, computer equipment and storage medium
CN113115100A (en) * 2021-04-23 2021-07-13 深圳力维智联技术有限公司 Video adjusting method, monitoring device, computer program product and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11094346B2 (en) * 2018-11-12 2021-08-17 Netflix, Inc. Systems and methods for adaptive streaming of multimedia content

Also Published As

Publication number Publication date
CN114866814A (en) 2022-08-05

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant