CN115086779B - Video transmission system


Info

Publication number
CN115086779B
Authority
CN
China
Prior art keywords
playing
video
video frame
information
network
Prior art date
Legal status
Active
Application number
CN202210893047.XA
Other languages
Chinese (zh)
Other versions
CN115086779A (en)
Inventor
邓志吉
张朝阳
刘明
叶奇
钟广海
李辉
Current Assignee
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Publication of CN115086779A
Application granted
Publication of CN115086779B


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/63 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/637 Control signals issued by the client directed to the server or network components
    • H04N21/6373 Control signals issued by the client directed to the server or network components for rate control, e.g. request to the server to modify its transmission rate
    • H04N21/647 Control signaling between network components and server or clients; Network processes for video distribution between server and clients, e.g. controlling the quality of the video stream, by dropping packets, protecting content from unauthorised alteration within the network, monitoring of network load, bridging between two different networks, e.g. between IP and wireless
    • H04N21/64723 Monitoring of network processes or resources, e.g. monitoring of network load
    • H04N21/64738 Monitoring network characteristics, e.g. bandwidth, congestion level
    • H04N21/64746 Control signals issued by the network directed to the server or the client
    • H04N21/64761 Control signals issued by the network directed to the server or the client directed to the server
    • H04N21/64769 Control signals issued by the network directed to the server or the client directed to the server for rate control

Abstract

The application relates to a video transmission system comprising an encoding end, a platform end and a playing end. The encoding end is used for acquiring first information and sending the first information to the playing end. The playing end is used for receiving the first information and acquiring second information; it is further used for determining, based on the first information and the second information, the network congestion condition within a preset duration, and/or determining, based on the second information, the video frame playing condition within the preset duration, and for sending the network congestion condition and/or the video frame playing condition to the platform end. The platform end is used for receiving the network congestion condition and/or the video frame playing condition and determining a code stream adjustment strategy based on the network congestion condition and/or the video frame playing condition; the platform end is further used for sending the code stream adjustment strategy to the encoding end and the playing end. Because the platform end coordinates and controls the code streams of the encoding end and the playing end, end-to-end matching of video encoding, video decoding and video transmission is ensured, and the quality of video transmission and playback is improved.

Description

Video transmission system
Technical Field
The application relates to the technical field of video, in particular to a video transmission system.
Background
With the development of wireless network technology and video technology, video playing applications are becoming more and more common. The video encoding end usually forwards video to the video playing end through the video platform end, so that an uplink network is formed between the video encoding end and the video platform end, and a downlink network is formed between the video platform end and the video playing end. However, the video encoding end cannot perceive the network quality of the downlink network or the video playing quality of the playing end, and the video playing end cannot perceive the network quality of the uplink network. The video encoding end and the video playing end are therefore perceptually isolated from each other, their code stream adjustment strategies are poorly coordinated, the overall playing experience of the video service is relatively poor, and problems such as playback stuttering and missing video recordings easily occur.
Therefore, how to provide a video transmission system that improves video playing quality is a problem to be solved by the present application.
Disclosure of Invention
The embodiment of the invention provides a video transmission system for improving the code stream adjustment strategy of a video, so as to improve the video playing quality and thereby effectively improve the user experience.
The application provides a video transmission system, which comprises a coding end, a platform end and a playing end, wherein:
The encoding end is used for acquiring first information, wherein the first information comprises video frame sending information of the encoding end within a preset duration, and the first information is sent to the playing end;
The playing end is used for receiving the first information and obtaining second information, and the second information comprises video frame receiving information and/or video frame playing buffer information of the playing end in the preset duration;
The playing end is further used for determining the network congestion condition within the preset duration based on the first information and the second information; and/or determining the video frame playing condition within the preset duration based on the second information;
The playing end is further used for sending the network congestion condition and/or the video frame playing condition to the platform end;
The platform end is used for receiving the network congestion condition and/or the video frame playing condition and determining a code stream adjustment strategy based on the network congestion condition and/or the video frame playing condition;
The platform end is also used for sending the code stream adjustment strategy to the encoding end and the playing end, and the code stream adjustment strategy is used for adjusting the code stream.
In one embodiment, the determining, based on the first information and the second information, a network congestion condition within the preset duration includes:
determining a first network state and/or a second network state based on the first information and the second information; wherein the first network state is associated with a video frame loss rate and the second network state is associated with a video frame delay;
the network congestion situation is determined based on the first network state and/or the second network state.
In one embodiment, the video transmission information includes a video frame transmission number, and the video reception information includes a video frame reception number; determining a first network state based on the first information and the second information, comprising:
determining a frame loss rate of the video frames in the preset duration based on the video frame sending number and the video frame receiving number;
If the video frame loss rate is greater than a first threshold value, determining that the first network state is an overload state; or if the video frame loss rate is equal to a second threshold value, determining that the first network state is a normal state; or if the video frame loss rate is greater than or equal to the second threshold value and less than the first threshold value, determining that the first network state is a low-load state.
In one embodiment, the video transmission information includes a video frame transmission interval, and the video reception information includes a video frame reception interval; determining a second network state based on the first information and the second information, comprising:
determining the change trend of the video frame time delay in the preset duration based on the video frame sending interval and the video frame receiving interval;
If the quantized value of the variation trend of the video frame time delay is larger than a third threshold value, and the overload frequency of the network bandwidth exceeds a fourth threshold value within the preset time period and/or the overload time of the network bandwidth exceeds a fifth threshold value, determining that the second network state is an overload state;
Or if the quantized value of the variation trend of the video frame time delay is smaller than a sixth threshold value, determining that the second network state is a low-load state;
Or if the quantized value of the variation trend of the video frame time delay is smaller than or equal to a third threshold value and larger than or equal to a sixth threshold value, and the normal times of the network bandwidth exceeds a seventh threshold value and/or the normal time of the network bandwidth exceeds an eighth threshold value within the preset time period, determining that the second network state is a normal state.
In one embodiment, the third threshold and the sixth threshold are dynamically adjusted according to an absolute value of a difference between a quantized value of a trend of variation of the video frame delay and the third threshold or the sixth threshold.
In one embodiment, determining the network congestion condition based on the first network state and/or the second network state comprises:
If the first network state and/or the second network state is an overload state, determining that the network bandwidth within the preset duration is in the overload state;
If the first network state and the second network state are both normal states, determining that the network bandwidth within the preset duration is in the normal state;
And if the first network state is a normal state and the second network state is a low-load state, determining that the network bandwidth within the preset duration is in the low-load state.
In one embodiment, the video receiving information includes a video frame receiving interval, and the video playing buffer information includes a video frame buffer time; based on the second information, determining the playing condition of the video frame within the preset duration comprises the following steps:
Determining the frame playing jitter time in the preset duration according to the video frame receiving interval and the video frame buffering time;
If the frame playing jitter time is smaller than or equal to a ninth threshold value, determining that video playing is not blocked in the preset duration; or if the frame playing jitter time is greater than a ninth threshold value, determining that video playing is blocked in the preset duration.
In one embodiment, the video receiving information includes a video frame receiving interval, and the video playing buffer information includes a video frame buffer time; based on the second information, determining the playing condition of the video frame within the preset duration comprises the following steps:
counting the number of frame play jitter in the preset duration according to the video frame receiving interval and the video frame buffering time;
If the number of the frame play jitter times is smaller than or equal to a tenth threshold value, determining that video play is not blocked in the preset duration; or if the number of the frame play jitter is greater than a tenth threshold, determining that video play is blocked in the preset duration.
In one embodiment, receiving the network congestion condition and/or the video frame playing condition, and determining the code stream adjustment policy based on the network congestion condition and/or the video frame playing condition includes:
When video playing is blocked and/or the network bandwidth is in an overload state within the preset duration, the code stream adjustment strategy is used for instructing the playing end to down-regulate the code stream; or
When video playing is not blocked and the network bandwidth is in a normal state within the preset duration, the code stream adjustment strategy is used for instructing the playing end to up-regulate the code stream.
In one embodiment, the platform end includes a video transmission monitoring module and a dynamic code rate balancing module, where:
the coding end is also used for sending the uplink network transmission bandwidth information to the platform end;
the playing end is also used for sending the downlink network transmission bandwidth information to the platform end;
The video transmission monitoring module is used for receiving the uplink network transmission bandwidth information and the downlink network transmission bandwidth information;
the video transmission monitoring module is further used for determining a code stream adjustment strategy based on the uplink network transmission bandwidth information, the downlink network transmission bandwidth information, the network congestion condition and/or the video frame playing condition, and sending the code stream adjustment strategy to the dynamic code rate balancing module;
if the code stream adjustment strategy is to adjust the code stream downwards, the dynamic code rate balancing module is used for controlling the playing end to reduce the playing rate and increasing the capacity of a receiving buffer module of the playing end based on the code stream adjustment strategy;
the dynamic code rate balancing module is also used for controlling the coding end to increase the capacity of the sending buffer module and reduce the coding code rate.
In one embodiment, if the code stream adjustment policy is to up-regulate the code stream and the dynamic code rate balancing module has sent a down-regulation code stream adjustment policy to the encoding end and the playing end, the dynamic code rate balancing module is further configured to determine a first delay time based on the downlink network transmission bandwidth information;
if the first delay time is greater than a preset threshold, the dynamic code rate balancing module is used for controlling the playing end to improve the playing rate based on the code stream adjustment strategy;
and if the first delay time is smaller than a preset threshold value, the dynamic code rate balancing module is used for restoring the playing rate of the playing end, the capacity of the receiving buffer module and the capacity of the sending buffer module to an initial state.
In one embodiment, the platform end includes a video transmission monitoring module and a key frame peak shifting module, where:
The video transmission monitoring module is used for determining whether key frame collision exists or not based on the first information and the second information;
if the key frame collision exists, determining a key frame peak shifting strategy, and sending the key frame peak shifting strategy to the key frame peak shifting module;
and the key frame peak shifting module sends the key frame peak shifting strategy to the coding end and the playing end.
In one embodiment, if there is a key frame collision, the key frame peak shifting module is configured to control the playing end to reduce a playing rate and increase a capacity of a receiving buffer module of the playing end;
The key frame peak shifting module is also used for controlling the coding end to increase the capacity of the sending buffer module and adjusting the key frame generation time sequence after the preset time.
In one embodiment, the playing end is further configured to send downlink network transmission bandwidth information to the platform end, and after the key frame peak shifting module sends the key frame peak shifting policy to the encoding end and the playing end, the key frame peak shifting module is further configured to determine a second delay time based on the downlink network transmission bandwidth information;
If the second delay time is greater than a preset threshold, the key frame peak shifting module is used for controlling the playing end to improve the playing speed;
and if the second delay time is smaller than a preset threshold value, the key frame peak shifting module is used for restoring the playing speed of the playing end, the capacity of the receiving buffer module and the capacity of the sending buffer module to an initial state.
The video transmission system provided by the application comprises an encoding end, a platform end and a playing end, wherein: the encoding end is used for acquiring first information, the first information comprising video frame sending information of the encoding end within a preset duration, and sending the first information to the playing end; the playing end is used for receiving the first information and acquiring second information, the second information comprising video frame receiving information and/or video frame playing buffer information of the playing end within the preset duration; the playing end is further used for determining the network congestion condition within the preset duration based on the first information and the second information, and/or determining the video frame playing condition within the preset duration based on the second information, and for sending the network congestion condition and/or the video frame playing condition to the platform end; the platform end is used for receiving the network congestion condition and/or the video frame playing condition and determining a code stream adjustment strategy based on them; the platform end is further used for sending the code stream adjustment strategy to the encoding end and the playing end, the code stream adjustment strategy being used for adjusting the code stream.
In the embodiment of the application, a playing end determines the network congestion condition and/or the video frame playing condition in the preset time by combining the video frame receiving information and/or the video frame playing buffer information of the playing end in the preset time and the video frame sending information of the encoding end in the preset time; the platform end determines a code stream adjustment strategy based on video frame playing conditions and/or network congestion conditions and sends the code stream adjustment strategy to the coding end and the playing end; and the coding end and the playing end can adjust the code stream based on the code stream adjustment strategy. The platform end coordinates and controls the code streams of the coding end and the playing end, so that the end-to-end matching of video coding, video decoding and video transmission is ensured, and the quality of video transmission and playing is improved.
Drawings
FIG. 1 is a diagram of a system architecture to which embodiments of the present application are applicable;
fig. 2 is a flow chart of a video code stream adjustment method according to an embodiment of the present application;
Fig. 3 is a schematic flow chart of determining a first network state according to an embodiment of the present application;
fig. 4 is a schematic flow chart of determining a second network state according to an embodiment of the present application;
fig. 5 is a schematic diagram of video frame interval delay congestion detection according to an embodiment of the present application;
fig. 6 is a flowchart illustrating a method for determining network congestion in combination with a first network status and a second network status according to an embodiment of the present application;
Fig. 7 is a schematic flow chart of video clip detection according to an embodiment of the present application;
Fig. 8 is a schematic flow chart of determining a code stream adjustment strategy according to an embodiment of the present application;
Fig. 9 is a schematic diagram of a video transmission system according to an embodiment of the present application;
fig. 10 is a schematic diagram of an execution flow of a code stream adjustment strategy according to an embodiment of the present application;
Fig. 11 is a flow chart of a multi-code stream adjustment method according to an embodiment of the present application;
Fig. 12 is a schematic diagram of a video stream according to an embodiment of the present application;
fig. 13 is a schematic diagram of a video stream according to an embodiment of the present application;
FIG. 14 is a flowchart of a key frame peak shifting method according to an embodiment of the present application;
fig. 15 is a flow chart of a multi-code stream adjustment method according to an embodiment of the present application;
fig. 16 is a schematic diagram of a video stream according to an embodiment of the present application;
Fig. 17 is a schematic diagram of a video stream according to an embodiment of the present application.
Detailed Description
The present invention will be described in further detail below with reference to the attached drawings, wherein it is apparent that the described embodiments are only some, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
First, a system architecture diagram to which embodiments of the present application are applicable will be described with reference to the specific drawings.
Fig. 1 shows a system architecture diagram to which an embodiment of the present application is applied, where the system includes a playing end, a coding end, and a platform end. The coding end and the platform end can be connected through a wireless network, so that the coding end can forward the video to the playing end through the platform end, and an uplink network side is formed; the playing end and the platform end can be connected through a wired network or a wireless network, and then the platform end can send video to the playing end, so that a downlink network side is formed.
It should be understood that the playing end may be a mobile phone, a decoding wall, an intelligent television, a computer, or other devices with display capability, and the embodiment of the present application is not limited in particular. The encoding end is a device with video encoding capability and code stream adjusting capability, for example, a mobile monitoring device, and the embodiment of the application is not particularly limited.
In the embodiment of the application, the encoding end is used for acquiring first information, the first information comprising video frame sending information of the encoding end within a preset duration, and sending the first information to the playing end; the playing end is used for receiving the first information and acquiring second information, the second information comprising video frame receiving information and/or video frame playing buffer information of the playing end within the preset duration; the playing end is further used for determining the network congestion condition within the preset duration based on the first information and the second information, and/or determining the video frame playing condition within the preset duration based on the second information, and for sending the network congestion condition and/or the video frame playing condition to the platform end; the platform end is used for receiving the network congestion condition and/or the video frame playing condition and determining a code stream adjustment strategy based on them; the platform end is further used for sending the code stream adjustment strategy to the encoding end and the playing end, the code stream adjustment strategy being used for adjusting the code stream. In this way, the platform end coordinates and controls the code streams of the encoding end and the playing end, end-to-end matching of video encoding, video decoding and video transmission is ensured, and the quality of video transmission and playback is improved.
Fig. 2 is a flow chart of a video code stream adjustment method according to an embodiment of the present invention, in which an execution body of the method takes a playing end (i.e. the playing end in fig. 1) and an encoding end (i.e. the encoding end in fig. 1) as an example, and the process includes the following steps:
s301: the encoding end transmits first information. Correspondingly, the playing end receives the first information.
The first information comprises video frame sending information of the coding end in a preset duration. In the embodiment of the application, the video frame sending information includes, but is not limited to, the number of video frame sending, the number of video frame encoding, the video frame sending time, the video frame encoding time, the frame type, the frame sequence number and the video frame sending time consumption.
Specifically, after the encoding end obtains the first information, the encoding end sends the first information to the platform end, and the platform end forwards the first information to the playing end; or after the encoding end obtains the first information, the encoding end directly sends the first information to the playing end through a transmission channel between the encoding end and the playing end.
S302: the playing end obtains the second information.
The second information comprises video frame receiving information and/or video frame playing buffer information of the playing end in a preset duration. In the embodiment of the application, the video frame receiving information includes, but is not limited to, video frame receiving number, video frame receiving time, frame type and frame number, and the video frame playing buffer information includes, but is not limited to, video frame buffer number, video frame playing number, frame type and frame number.
S303: the playing end determines network congestion conditions in a preset duration based on the first information and the second information; and/or determining the video frame playing condition within the preset duration based on the second information.
It should be noted that, in the embodiment of the present application, there are various implementations for determining the network congestion condition, including but not limited to the following ways:
In embodiment 1, a playing end determines a first network state based on first information and second information; based on the first network state, a network congestion condition is determined.
In one possible implementation, the first network state may be associated with the video frame loss rate. If the video sending information includes the video frame sending number and the video receiving information includes the video frame receiving number, the playing end can determine the video frame loss rate within the preset duration based on the video frame sending number and the video frame receiving number; if the video frame loss rate is greater than a first threshold, the first network state is determined to be the overload state; or if the video frame loss rate is equal to a second threshold, the first network state is determined to be the normal state; or if the video frame loss rate is greater than or equal to the second threshold and less than the first threshold, the first network state is determined to be the low-load state. When the video frame sending number is the number of encoded input video frames, the video frame loss rate within the preset duration is: loss = (number of encoded input video frames - number of received video frames) / number of encoded input video frames.
It will be understood that the "overload state" refers to a state in which the network bandwidth is insufficient, the "normal state" refers to a state in which the network bandwidth is normally used, and the "low-load state" refers to a state in which the network bandwidth has a certain load. Correspondingly, when the first network state is an overload state, the network congestion condition is insufficient network bandwidth; when the first network state is a normal state, the network congestion condition is that the network bandwidth is sufficient; when the first network state is a low-load state, the network congestion condition is that the network bandwidth is slightly insufficient.
For example, referring to fig. 3, take the first threshold as 10%, the second threshold as 0, and the preset duration as 5 minutes. If the number of encoded input video frames is 100 and the number of received video frames is 50, the video frame loss rate within 5 minutes is (100-50)/100 = 50%, which is greater than 10%, so the first network state is determined to be the overload state, i.e. the network bandwidth is insufficient. Or, if the number of encoded input video frames is 100 and the number of received video frames is 100, the loss rate is (100-100)/100 = 0, so the first network state is determined to be the normal state, i.e. the network bandwidth is sufficient. Or, if the number of encoded input video frames is 100 and the number of received video frames is 95, the loss rate is (100-95)/100 = 5%, which is greater than 0 and less than 10%, so the first network state is determined to be the low-load state, i.e. the network bandwidth carries a partial load.
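For illustration only (this sketch is not part of the patent text), the first-network-state decision described above can be written as follows; the default thresholds are the 10% and 0 values from the example, and the function and state names are assumptions.

```python
def classify_first_network_state(encoded_frames: int, received_frames: int,
                                 first_threshold: float = 0.10,
                                 second_threshold: float = 0.0) -> str:
    """Classify the first network state from the video frame loss rate.

    Minimal sketch of the rule above; names and default thresholds are
    illustrative assumptions, not a reference implementation.
    """
    if encoded_frames <= 0:
        raise ValueError("encoded_frames must be positive")
    loss = (encoded_frames - received_frames) / encoded_frames
    if loss > first_threshold:
        return "overload"      # network bandwidth insufficient
    if loss == second_threshold:
        return "normal"        # network bandwidth sufficient
    if second_threshold <= loss < first_threshold:
        return "low_load"      # network bandwidth slightly insufficient
    return "undetermined"

# e.g. classify_first_network_state(100, 50) returns "overload" (50% loss)
```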
In embodiment 2, the encoding end determines the second network state based on the first information and the second information, and determines the network congestion condition based on the second network state.
Referring to fig. 4, in one possible implementation, the second network state is associated with the video frame delay; the video sending information comprises a video frame sending interval, and the video receiving information comprises a video frame receiving interval; the process by which the playing end determines the second network state based on the first information and the second information comprises the following steps:
S501: and determining a quantized value of the change trend of the video frame time delay in the preset time length based on the video frame sending interval and the video frame receiving interval.
Further, determining a quantized value of a trend of variation of the video frame delay within a preset duration based on the video frame transmission interval and the video frame reception interval includes:
s5011: a video frame interval delay gradient value is determined based on the video frame transmission interval and the video frame reception interval.
Illustratively, the video frame interval delay gradient value (delta_ms) = video frame receive interval (recv_delta_ms) - video frame send interval (send_delta_ms).
S5012: and determining a quantized value of the variation trend of the video frame time delay according to the video frame interval time delay gradient value in the preset duration.
The detection principle of the delay gradient congestion detection algorithm based on video frame groups is shown in fig. 5: the encoding end continuously sends video frames to the playing end. Frame(i) denotes the ith video frame group, frame(i-1) the (i-1)th video frame group, and frame(i+1) the (i+1)th video frame group; each video frame group contains several video frames, for example consecutive video frames are divided into the 1st, 2nd, ..., (i-1)th and ith groups. t(i) denotes the arrival time of the last frame of video frame group i and t(i-1) the arrival time of the last frame of video frame group i-1; T(i) denotes the sending time of the last frame of video frame group i and T(i-1) the sending time of the last frame of video frame group i-1. The delay variation d(i) between two frame groups satisfies the following formula:
d(i)=(t(i)–t(i-1))–(T(i)–T(i-1));
where d(i) = 0 indicates that the network is not congested, d(i) > 0 indicates that the network is congested, and d(i) < 0 indicates that the network congestion is recovering.
Furthermore, the playing end can combine a single exponential smoothing prediction algorithm with a least-squares linear regression algorithm to accumulate, smooth and linearly regress the video frame interval delay gradient values within the preset duration, obtaining a quantized value of the variation trend of the video frame delay; this quantized value of the variation trend is used to characterize the degree of network congestion.
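As a rough sketch of the frame-group delay-gradient detection described above (assuming per-group send and arrival timestamps are available; the smoothing factor and the least-squares fit over the smoothed values are illustrative choices, since the text does not fix them):

```python
def delay_gradients(send_times, arrival_times):
    """d(i) = (t(i) - t(i-1)) - (T(i) - T(i-1)) over consecutive frame groups,
    where t(i) is the arrival time and T(i) the sending time of the last frame
    of group i; d > 0 suggests congestion, d < 0 suggests recovery."""
    return [(arrival_times[i] - arrival_times[i - 1]) -
            (send_times[i] - send_times[i - 1])
            for i in range(1, len(send_times))]


def delay_trend(gradients, alpha=0.9):
    """Quantize the delay trend: accumulate the gradients, smooth them with
    single exponential smoothing, then return the least-squares slope of the
    smoothed series (illustrative sketch; alpha is an assumed parameter)."""
    acc, smoothed = 0.0, []
    for d in gradients:
        acc += d
        prev = smoothed[-1] if smoothed else acc
        smoothed.append(alpha * prev + (1.0 - alpha) * acc)
    n = len(smoothed)
    if n < 2:
        return 0.0
    xs = range(n)
    mean_x, mean_y = sum(xs) / n, sum(smoothed) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, smoothed))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den if den else 0.0
```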
S502: and determining a second network state based on the quantized value of the trend of the video frame delay.
The playing end can compare the quantized value of the variation trend of the video frame time delay with a preset threshold value to determine a second network state; if the quantized value of the variation trend of the video frame time delay is larger than a third threshold value, and the overload frequency of the network bandwidth exceeds a fourth threshold value within a preset duration and/or the overload duration of the network bandwidth exceeds a fifth threshold value, determining that the second network state is an overload state; or if the quantized value of the variation trend of the video frame time delay is smaller than the sixth threshold value, determining that the second network state is a low-load state; or if the quantized value of the variation trend of the video frame time delay is smaller than or equal to the third threshold value and larger than or equal to the sixth threshold value, and the normal times of the network bandwidth exceeds the seventh threshold value and/or the normal time of the network bandwidth exceeds the eighth threshold value within the preset time period, determining that the second network state is a normal state.
Example 1: take the preset duration as 5 minutes, the third threshold as 10%, the fourth threshold as 2 times, and the fifth threshold as 1 minute. For example, if the quantized value of the variation trend of the video frame delay is 15%, the network bandwidth is overloaded 3 times within the 5 minutes, and the overload duration of the network bandwidth is 4 minutes, then the quantized value exceeds the third threshold, the number of overloads exceeds the fourth threshold and the overload duration exceeds the fifth threshold within the 5 minutes, so the second network state is determined to be the overload state. For another example, if the quantized value of the variation trend of the video frame delay is 15% and the network bandwidth is overloaded 3 times within the 5 minutes, the quantized value exceeds the third threshold and the number of overloads exceeds the fourth threshold, so the second network state is determined to be the overload state. For another example, if the quantized value of the variation trend of the video frame delay is 15% and the overload duration of the network bandwidth is 4 minutes, the quantized value exceeds the third threshold and the overload duration exceeds the fifth threshold within the 5 minutes, so the second network state is determined to be the overload state.
Example 2, the preset duration is exemplified by 5 minutes and the sixth threshold is exemplified by 5%. If the quantized value of the variation trend of the video frame time delay is 1%, determining that the second network state is a low-load state if the quantized value of the variation trend of the video frame time delay is smaller than a sixth threshold.
Example 3: take the preset duration as 5 minutes, the third threshold as 10%, the fourth threshold as 2 times, the fifth threshold as 1 minute, the sixth threshold as 5%, the seventh threshold as 50 times, and the eighth threshold as 3 minutes. For example, if the quantized value of the variation trend of the video frame delay is 8%, the network bandwidth is normal 55 times within the 5 minutes, and the normal duration of the network bandwidth is 4 minutes, then the quantized value is smaller than the third threshold and larger than the sixth threshold, and the number of normal occurrences exceeds the seventh threshold and the normal duration exceeds the eighth threshold within the 5 minutes, so the second network state is determined to be the normal state. For another example, if the quantized value is 8% and the network bandwidth is normal 55 times within the 5 minutes, the quantized value is smaller than the third threshold and larger than the sixth threshold, and the number of normal occurrences exceeds the seventh threshold, so the second network state is determined to be the normal state. For another example, if the quantized value is 8% and the normal duration of the network bandwidth is 4 minutes, the quantized value is smaller than the third threshold and larger than the sixth threshold, and the normal duration exceeds the eighth threshold within the 5 minutes, so the second network state is determined to be the normal state.
Correspondingly, when the second network state is an overload state, the network congestion condition is insufficient network bandwidth; when the second network state is a normal state, the network congestion condition is that the network bandwidth is sufficient; when the second network state is a low-load state, the network congestion condition is that the network bandwidth is slightly insufficient.
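The second-network-state rule above might be sketched as follows; the threshold arguments correspond to the third to eighth thresholds, and the overload/normal counters and durations are assumed to be maintained elsewhere over the preset duration.

```python
def classify_second_network_state(trend_value, overload_count, overload_duration_s,
                                  normal_count, normal_duration_s,
                                  third_thr, fourth_thr, fifth_thr_s,
                                  sixth_thr, seventh_thr, eighth_thr_s):
    """Classify the second network state from the quantized delay trend and the
    bandwidth overload/normal statistics within the preset duration (sketch)."""
    if trend_value > third_thr and (overload_count > fourth_thr or
                                    overload_duration_s > fifth_thr_s):
        return "overload"       # network bandwidth insufficient
    if trend_value < sixth_thr:
        return "low_load"       # network bandwidth slightly insufficient
    if sixth_thr <= trend_value <= third_thr and (normal_count > seventh_thr or
                                                  normal_duration_s > eighth_thr_s):
        return "normal"         # network bandwidth sufficient
    return "undetermined"
```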
Optionally, after the playing end determines the second network state, S503 may further be executed: and dynamically adjusting the third threshold value and the sixth threshold value according to the absolute value of the difference value between the quantized value of the variation trend of the video frame time delay and the third threshold value or the sixth threshold value. Therefore, the network congestion detection method based on video frame delay can be suitable for different application scenes.
Wherein, S503 specifically includes the following steps:
S5031: the absolute value trend _ deta of the difference between the quantized value trend of the trend of the video frame delay variation value and the dynamic threshold threshold_is determined.
S5032: an adjustment value for the dynamic threshold is determined based on the absolute value trend _ deta of the difference.
For example, the adjustment value of the dynamic threshold may be determined according to the following formula.
deta = k * (trend_deta - threshold_) * deta_t; where deta is the threshold adjustment value, k is the adjustment factor, trend_deta is the absolute value of the difference between the quantized value trend of the variation trend of the video frame delay and the dynamic threshold threshold_, and deta_t is the time difference since the last dynamic threshold update, with deta_t >= 100 ms.
The adjustment factor k is determined according to the absolute value trend_deta of the difference between the quantized value trend of the variation trend and the dynamic threshold threshold_. For example, if trend_deta < threshold_, the adjustment factor k is set to the up-regulation factor Kup (i.e. k is increased); if trend_deta >= threshold_, the adjustment factor k is set to the down-regulation factor Kdown (i.e. k is decreased).
S5033: and updating the dynamic threshold according to the adjustment value of the dynamic threshold.
It should be understood that the dynamic threshold refers to the third and sixth thresholds described above.
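A minimal sketch of the dynamic-threshold update in S5031-S5033, following the formula deta = k * (trend_deta - threshold_) * deta_t; Kup, Kdown and the 100 ms minimum update interval come from the text above, while the function signature itself is an assumption.

```python
def update_dynamic_threshold(threshold, trend, k_up, k_down, deta_t_ms,
                             min_interval_ms=100.0):
    """Update a dynamic threshold (the third or sixth threshold) from the
    absolute difference between the quantized delay trend and the threshold."""
    if deta_t_ms < min_interval_ms:
        return threshold                   # deta_t must be at least 100 ms
    trend_deta = abs(trend - threshold)
    # Choose the adjustment factor as described above.
    k = k_up if trend_deta < threshold else k_down
    deta = k * (trend_deta - threshold) * deta_t_ms
    return threshold + deta
```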
In embodiment 3, the playing end determines a first network state and a second network state based on the first information and the second information; based on the first network state and the second network state, a network congestion condition is determined.
Referring to fig. 6, the process of determining, by the playing end, the network congestion condition based on the first network state and the second network state may be: the playing end acquires a first network state and a second network state, and when the first network state and/or the second network state are/is determined to be in an overload state, the network bandwidth in a preset duration is determined to be in the overload state; or when the first network state and the second network state are both normal, determining that the network bandwidth within the preset duration is in the normal state; or when the first network state is determined to be in the low-load state and the second network state is determined to be in the normal state, determining that the network bandwidth within the preset duration is in the low-load state. For specific embodiments for determining the first network state and the second network state, please refer to the foregoing descriptions about embodiment 1 and embodiment 2, and a detailed description is omitted herein.
For example, taking 5 minutes as an example of the preset duration, when the first network state and/or the second network state are in the overload state within 5 minutes, determining that the network bandwidth within 5 minutes is in the overload state; or when the first network state and the second network state are both in a normal state within 5 minutes, determining that the network bandwidth within 5 minutes is in the normal state; or when the first network state is determined to be in a normal state and the second network state is determined to be in a low-load state, determining that the network bandwidth within 5 minutes is in the low-load state.
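Combining the two states as described above could look like the sketch below. Note that the text states the low-load combination in both orders (first state normal with second state low-load, and the reverse), so the mapping used here follows the example in the preceding paragraph and is an assumption.

```python
def network_congestion(first_state: str, second_state: str) -> str:
    """Combine the loss-rate-based and delay-based network states into the
    overall bandwidth condition for the preset duration (illustrative)."""
    if first_state == "overload" or second_state == "overload":
        return "overload"       # network bandwidth insufficient
    if first_state == "normal" and second_state == "normal":
        return "normal"         # network bandwidth sufficient
    if first_state == "normal" and second_state == "low_load":
        return "low_load"       # network bandwidth slightly insufficient
    return "undetermined"
```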
It should be noted that, the playing end determines the playing condition of the video frame within the preset duration based on the second information, and there are various embodiments, including but not limited to the following embodiments:
In embodiment 1, the video receiving information includes the video frame receiving interval, and the video playing buffer information includes the video frame buffer time; the playing end determines the video frame playing condition within the preset duration based on the second information as follows: determine the frame play jitter time within the preset duration according to the video frame receiving interval and the video frame buffer time; if the frame play jitter time is smaller than or equal to a ninth threshold, determine that video playing is not blocked within the preset duration; or, if the frame play jitter time is greater than the ninth threshold, determine that video playing is blocked within the preset duration. Here, frame play jitter time (jitter time) = video frame buffer time (buffer time) - video frame receiving interval (interval).
Illustratively, take the ninth threshold as 2 s, the video frame buffer time as 1 s, and the video frame receiving interval as 0.6 s. Then the frame play jitter time = 1 s - 0.6 s = 0.4 s; since the frame play jitter time is smaller than the ninth threshold, it is determined that video playing is not blocked within the preset duration.
In embodiment 2, the video receiving information includes a video frame receiving interval, and the video playing buffer information includes a video frame buffer time; the playing end determines the playing condition of the video frame within the preset duration based on the second information, and the method comprises the following steps: counting the number of frame play jitter in a preset duration according to the video frame receiving interval and the video frame buffering time; if the number of the frame play jitter times is smaller than or equal to a tenth threshold value, determining that video play is not blocked in a preset duration; or if the number of the frame play jitter is greater than a tenth threshold, determining that video play is blocked in a preset duration.
For example, take the tenth threshold as 3 times. If the number of frame play jitters within 5 minutes is 4, the number of frame play jitters is greater than the tenth threshold, so it is determined that video playing is blocked within the 5 minutes.
In embodiment 3, referring to fig. 7, after the playing end obtains the second information (i.e. the video frame buffer time and the video frame receiving interval), it determines the frame play jitter time; if the frame play jitter time is smaller than the preset threshold, it determines that video playing is blocked. Further, the number of frame play jitters within the preset duration is counted, and when the number of frame play jitters is greater than the tenth threshold, video playing is considered blocked within the preset duration. Alternatively, the frame play jitter duration within the preset duration is counted, and when the frame play jitter duration is greater than the ninth threshold, video playing is considered blocked within the preset duration and streaming is stopped. In this way, video stall detection is more flexible.
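The two stall checks above (by frame play jitter time and by jitter count) might be sketched as follows; the default thresholds are the 2 s and 3 times used in the examples, and the function names are assumed.

```python
def stuck_by_jitter_time(buffer_time_s: float, recv_interval_s: float,
                         ninth_thr_s: float = 2.0) -> bool:
    """Embodiment 1: frame play jitter time = buffer time - receive interval;
    playing is considered blocked if it exceeds the ninth threshold."""
    return (buffer_time_s - recv_interval_s) > ninth_thr_s


def stuck_by_jitter_count(jitter_count: int, tenth_thr: int = 3) -> bool:
    """Embodiment 2: playing is considered blocked if the number of frame play
    jitters within the preset duration exceeds the tenth threshold."""
    return jitter_count > tenth_thr
```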
S304: the platform end determines a code stream adjustment strategy based on video frame playing conditions and/or network congestion conditions.
Referring to fig. 8, in one possible implementation, the process by which the platform end determines the code stream adjustment strategy based on the video frame playing condition and/or the network congestion condition may be as follows: the platform end acquires the video frame playing condition and the network congestion condition; when video playing is blocked and/or the network bandwidth is in the overload state within the preset duration, the code stream adjustment strategy instructs the encoding end to down-regulate the code stream; or, when video playing is not blocked and the network bandwidth is in the normal state within the preset duration, the code stream adjustment strategy instructs the encoding end to up-regulate the code stream; or, when video playing is not blocked and the network bandwidth is in the low-load state within the preset duration, the code stream adjustment strategy keeps the code stream unchanged. It can be appreciated that the embodiment in fig. 8 is merely an example; the code stream adjustment policy in the present invention may be set based on actual scene requirements and is not limited to the correspondence between the video playing condition, the network congestion condition and the code stream adjustment strategy described here.
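As a sketch of the example correspondence in fig. 8 (state and policy names are assumed):

```python
def bitrate_policy(playing_blocked: bool, bandwidth_state: str) -> str:
    """Decide the code stream adjustment strategy from the video frame playing
    condition and the network congestion condition (illustrative sketch)."""
    if playing_blocked or bandwidth_state == "overload":
        return "down_regulate"   # instruct the encoding end to lower the stream
    if bandwidth_state == "normal":
        return "up_regulate"     # instruct the encoding end to raise the stream
    if bandwidth_state == "low_load":
        return "keep"            # leave the code stream unchanged
    return "keep"
```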
S305: the platform end sends a code stream adjustment strategy to the encoding end and the playing end. Correspondingly, the encoding end and the playing end receive the code stream adjustment strategy.
S306: the encoding end and the playing end adjust the code stream based on the code stream adjustment strategy.
Illustratively, if the code stream adjustment policy instructs the encoding end to down-regulate the code stream, the encoding end down-regulates the code stream, for example reducing it from 2M to 1M; if the code stream adjustment strategy instructs the encoding end to up-regulate the code stream, the encoding end up-regulates the code stream, for example increasing it from 1M to 2M.
In another embodiment, the platform end includes a video transmission monitoring module and a dynamic code rate balancing module, wherein: the coding end is also used for sending the uplink network transmission bandwidth information to the platform end; the playing end is also used for sending the downlink network transmission bandwidth information to the platform end; the video transmission monitoring module is used for receiving the uplink network transmission bandwidth information and the downlink network transmission bandwidth information; the video transmission monitoring module is also used for determining a code stream adjustment strategy based on uplink network transmission bandwidth information, downlink network transmission bandwidth information, network congestion condition and/or video frame playing condition, and transmitting the code stream adjustment strategy to the dynamic code rate balancing module; if the code stream adjustment strategy is to adjust the code stream downwards, the dynamic code rate balancing module is used for controlling the playing end to reduce the playing rate and increasing the capacity of the receiving buffer module of the playing end based on the code stream adjustment strategy; the dynamic code rate balancing module is also used for controlling the coding end to increase the capacity of the sending buffer module and reduce the coding code rate.
In an exemplary scenario, in a wireless environment such as 5G/4G/Wi-Fi, encoding ends such as network cameras access the platform end (including a cloud platform, NVR, EVS, etc.) on a large scale through the wireless network, and video is stored by the platform end and forwarded to the playing end for real-time playback. When the network is disturbed by the environment, video encoding, video decoding and video transmission often become mismatched, the network bandwidth fluctuates considerably, the overall transmission efficiency of the video is low, problems such as playback stuttering and missing video recordings easily occur, and the user's video experience is poor.
Specifically, because the video needs to be forwarded through the platform end, the video encoding end cannot perceive the downlink video transmission quality or the user's video playing quality, and the video playing end cannot perceive the uplink video transmission quality of the video encoding end. The video encoding end and the video decoding end are therefore perceptually isolated, video production, video consumption and video transmission are completely mismatched, and the strategies of the three are poorly coordinated. When the network bandwidth fluctuates (for example, the downlink transmission bandwidth is insufficient), the video encoding end cannot sense it and still produces video at the original encoding rate; video is then not transmitted in time, which can cause problems such as playback stuttering and screen artifacts at the video decoding end.
The platform end in this embodiment is provided with a video transmission monitoring module and a dynamic code rate balancing module, where the video transmission monitoring module is used to detect the network transmission quality from the encoding end to the playing end, and the dynamic code rate balancing module is used to regulate the code streams of the uplink and downlink transmission networks. Specifically, the video transmission monitoring module acquires the uplink network transmission bandwidth information and the downlink network transmission bandwidth information, determines the video transmission and playing quality of the video transmission system based on the uplink network transmission bandwidth information, the downlink network transmission bandwidth information, the network congestion condition and the video frame playing condition, formulates a corresponding code stream adjustment strategy and sends it to the dynamic code rate balancing module. The dynamic code rate balancing module determines whether the received code stream adjustment strategy is to down-regulate the code stream; if so, it controls the playing end to reduce the playing rate, increases the capacity of the receiving buffer module of the playing end, controls the encoding end to increase the capacity of the sending buffer module, and reduces the encoding code rate.
Referring to fig. 9, fig. 9 is a schematic diagram of a video transmission system according to an embodiment of the application.
The coding end and the playing end are connected to the platform end through protocols such as streaming media access and transmission, so that the coded code stream of the coding end can be forwarded to the playing end through the platform end for decoding and playing. The method comprises the following specific steps:
Step 1: the uplink bandwidth detection module of the coding end reports the current uplink network transmission bandwidth information and video frame sending information. The video frame transmission information comprises frame coding time, frame type, frame sequence number, frame transmission time consumption and the like. Specifically, the uplink network transmission bandwidth information is fed back to the video transmission monitoring module at the platform end, and the video frame sending information is directly fed back to the playing end.
Step 2: based on the video frame sending information fed back by the encoding end, together with the locally obtained video frame receiving information and video playing buffer information, the playing end evaluates the playing quality and the network transmission quality of the current code stream through its video decoding module and feedback bandwidth evaluation module, obtains the video frame playing condition and the network congestion condition, and feeds the video frame playing condition, the network congestion condition and the detected downlink network transmission bandwidth information back to the video transmission monitoring module of the platform end.
Step 3: the video transmission monitoring module at the platform end receives the uplink network transmission bandwidth information, the downlink network transmission bandwidth information, the network congestion condition and the video frame playing condition, and carries out video traffic average overload detection based on this information. When the video transmission network cannot carry the transmission of the currently encoded video, the video transmission monitoring module at the platform end formulates a code stream adjustment strategy and sends it to the dynamic code rate balancing module.
Specifically, the video transmission monitoring module of the platform end obtains the current playing smooth state of the playing end based on the video frame playing condition, and determines the current network bandwidth state from the encoding end to the playing end based on the uplink network transmission bandwidth information, the downlink network transmission bandwidth information and the network congestion condition. If the playing smooth state is monitored to be blocked or the network bandwidth state is in an overload state, a code stream adjustment strategy of down-regulating the code stream is output to the dynamic code rate balancing module; when the playing smooth state is monitored to be smooth and the network bandwidth state is in a normal state, a code stream adjustment strategy of up-regulating the code stream is output to the dynamic code rate balancing module; when the playing smooth state is monitored to be smooth and the network bandwidth state is in a low-load state, a code stream adjustment strategy of keeping the code stream unchanged is output to the dynamic code rate balancing module, and the dynamic code rate balancing module adjusts the code stream based on the code stream adjustment strategy. After this process is finished, it is determined whether monitoring of the code stream should be stopped; if so, the code stream adjusting process ends.
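The decision logic of this step can be summarized in a minimal sketch, assuming the playing smooth state and the network bandwidth state have already been classified as described above; the function and state names below are illustrative assumptions rather than part of the described system.

```python
# Minimal sketch of the Step 3 decision logic at the platform-end video
# transmission monitoring module. All names are illustrative assumptions.

def decide_stream_adjustment(play_state: str, bandwidth_state: str) -> str:
    """Map the playing smooth state and network bandwidth state to a policy.

    play_state:      "smooth" or "blocked"
    bandwidth_state: "overload", "normal" or "low_load"
    Returns one of "down", "up" or "unchanged".
    """
    if play_state == "blocked" or bandwidth_state == "overload":
        return "down"        # playback stutters or the network is overloaded
    if play_state == "smooth" and bandwidth_state == "normal":
        return "up"          # headroom detected, raise the encoding rate
    if play_state == "smooth" and bandwidth_state == "low_load":
        return "unchanged"   # keep the current code stream
    return "unchanged"       # conservative default for other combinations


if __name__ == "__main__":
    print(decide_stream_adjustment("blocked", "normal"))   # -> down
    print(decide_stream_adjustment("smooth", "normal"))    # -> up
    print(decide_stream_adjustment("smooth", "low_load"))  # -> unchanged
```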
Step 4: the dynamic code rate balancing module receives the code stream adjustment strategy and adjusts the code streams and buffers of the encoding end and the playing end based on the code stream adjustment strategy. Specifically, if the code stream adjustment strategy is to down-regulate the code stream, the dynamic code rate balancing module controls the playing end to reduce the playing rate and increases the capacity of the receiving buffer module of the playing end based on the code stream adjustment strategy; the dynamic code rate balancing module also controls the encoding end to increase the capacity of the sending buffer module and to reduce the encoding code rate.
Referring to fig. 10, fig. 10 is a schematic diagram illustrating an execution flow of a code stream adjustment strategy according to an embodiment of the present application.
In one embodiment, if the code stream adjustment policy is to down-regulate the code stream, the above Step 4 specifically includes the following steps (an illustrative code sketch follows these steps):
Step 1: determining the smaller value Bi of the uplink network transmission bandwidth and the downlink network transmission bandwidth based on the uplink network transmission bandwidth information and the downlink network transmission bandwidth information;
Step 2: the dynamic code rate balancing module executes the code stream adjustment strategy of down-regulating the code stream and reduces the playing rate of the video decoding module at the playing end, which then plays at α times the normal playing rate, where the value of α can be set according to the actual scene, preferably α ∈ [0.8, 1];
Step 3: the dynamic code rate balancing module increases the buffer capacity of the receiving buffer module at the playing end, for example by taking the received-video delay data into account and calculating the buffer capacity that needs to be added through the traffic peak balancing technique;
Step 4: the dynamic code rate balancing module increases the buffer capacity of the sending buffer module at the encoding end, for example by taking the sent-video delay data into account and calculating the buffer capacity that needs to be added through the traffic peak balancing technique;
Step 5: after waiting for a certain time, the dynamic code rate balancing module notifies the encoding end to reduce the encoding rate and controls the encoding end to encode at a new code rate Bt, where Bt = Bi × γ, the value of γ can be set according to the actual scene, preferably γ ∈ [0.5, 0.8].
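A minimal sketch of these five steps, assuming bandwidths are given in Mbps, abstracting the buffer sizing into a simple bandwidth-gap proxy for the traffic peak balancing technique, and using α = 0.9 and γ = 0.65 as illustrative defaults inside the preferred ranges; all function and field names are assumptions.

```python
# Minimal sketch of the down-regulation procedure (Steps 1-5).
# Parameter defaults and helper names are illustrative assumptions.

def down_regulate(uplink_bw_mbps: float,
                  downlink_bw_mbps: float,
                  alpha: float = 0.9,   # playback slow-down factor, alpha in [0.8, 1]
                  gamma: float = 0.65   # rate margin factor, gamma in [0.5, 0.8]
                  ) -> dict:
    # Step 1: take the smaller of the uplink and downlink bandwidths.
    bi = min(uplink_bw_mbps, downlink_bw_mbps)

    # Step 2: slow down playback to alpha times the normal rate.
    play_rate = alpha

    # Steps 3-4: enlarge the receive and send buffers; here the extra capacity
    # is simply taken proportional to the bandwidth gap, an assumed stand-in
    # for the traffic peak balancing technique mentioned in the text.
    extra_buffer_mbit = max(uplink_bw_mbps, downlink_bw_mbps) - bi

    # Step 5: after waiting, instruct the encoder to encode at Bt = Bi * gamma.
    bt = bi * gamma

    return {"play_rate": play_rate,
            "extra_buffer_mbit": extra_buffer_mbit,
            "new_encode_rate_mbps": bt}


if __name__ == "__main__":
    print(down_regulate(uplink_bw_mbps=10.0, downlink_bw_mbps=6.0))
    # -> play at 0.9x, enlarge buffers by ~4 Mbit, re-encode at 6.0 * 0.65 = 3.9 Mbps
```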
In this embodiment, the video transmission monitoring module at the platform end determines a code stream adjustment strategy according to the uplink network transmission bandwidth information, the downlink network transmission bandwidth information, the network congestion condition and/or the video frame playing condition, and sends the code stream adjustment strategy to the dynamic code rate balancing module. Based on the code stream adjustment strategy, the dynamic code rate balancing module controls the playing end to reduce the playing rate, increases the capacity of the receiving buffer module of the playing end, controls the encoding end to increase the capacity of the sending buffer module, and reduces the encoding code rate of the encoding end. By reducing the playing rate of the playing end and the encoding code stream of the encoding end, the bandwidths of the uplink and downlink networks are balanced, and by increasing the capacities of the receiving buffer module and the sending buffer module, video data is buffered dynamically. This ensures that video encoding, video decoding and playback, and video transmission are matched end to end without losing data during the adjustment, which improves the stability of the video transmission system and the smoothness of video playback, and thus the user experience.
In another embodiment, if the code stream adjustment policy is an up-regulation code stream, and the dynamic code rate balancing module sends the code stream adjustment policy of the down-regulation code stream to the encoding end and the playing end, the dynamic code rate balancing module is further configured to determine the first delay time based on the downlink network transmission bandwidth information; if the first delay time is greater than a preset threshold value, the dynamic code rate balancing module is used for controlling the playing end to improve the playing rate based on a code stream adjustment strategy; if the first delay time is smaller than the preset threshold, the dynamic code rate balancing module is used for restoring the playing rate of the playing end, the capacity of the receiving buffer module and the capacity of the sending buffer module to an initial state.
That is, if the code stream adjustment strategy is to up-regulate the code stream and the dynamic code rate balancing module has previously sent a code stream adjustment strategy of down-regulating the code stream to the encoding end and the playing end, the code streams and buffers of the encoding end and the playing end continue to be adjusted based on the code stream adjustment strategy, which includes: determining a first delay time based on the downlink network transmission bandwidth information, where the first delay time is the video buffering delay time of the receiving buffer module of the playing end. The first delay time is compared with a preset threshold; if the first delay time is larger than the preset threshold, the video buffering delay is too large, and the dynamic code rate balancing module controls the playing end to increase the playing rate so as to reduce the video buffering delay; if the first delay time is smaller than the preset threshold, the video buffering delay is normal, and the playing rate of the playing end, the capacity of the receiving buffer module and the capacity of the sending buffer module can be restored to normal.
In one embodiment, as shown in fig. 10, the steps include the following (an illustrative code sketch of this recovery logic follows these steps):
Step 1: the dynamic code rate balancing module calculates the video buffering delay Td of the receiving buffer module at the playing end, namely the first delay time, based on the downlink network transmission bandwidth information, and compares it with a preset threshold Ts2 to determine whether the video buffering delay is too large;
Step 2: if the video buffering delay Td is greater than the preset threshold Ts2, the video buffering delay is too large, and the dynamic code rate balancing module increases the playing rate of the video decoding module at the playing end, which then plays at β times the normal playing rate, where the value of β can be set according to the actual scene, preferably β ∈ [1, 1.2];
Step 3: if the video buffering delay Td is smaller than the preset threshold Ts2, the video buffering delay is normal, and the dynamic code rate balancing module restores the playing rate of the video decoding module at the playing end;
Step 4: if the buffer of the receiving buffer module at the playing end has not been restored to its default value, the buffer of the receiving buffer module is gradually reduced;
Step 5: if the buffer of the sending buffer module at the encoding end has not been restored to its default value, the buffer of the sending buffer module is gradually reduced.
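A minimal sketch of this recovery logic, assuming Td and Ts2 are given in milliseconds and using β = 1.1 as an illustrative default inside the preferred range; the same comparison pattern is reused later with the second delay time after key frame peak staggering. All names are assumptions.

```python
# Minimal sketch of the recovery logic after a down-regulation, assuming the
# buffering delay Td and threshold Ts2 are in milliseconds; names are assumptions.

def recover_after_upshift(td_ms: float,
                          ts2_ms: float,
                          beta: float = 1.1  # speed-up factor, beta in [1, 1.2]
                          ) -> dict:
    if td_ms > ts2_ms:
        # Buffering delay still too large: drain the receive buffer faster.
        return {"play_rate": beta, "restore_buffers": False}
    # Delay is back to normal: restore the playback rate and shrink the
    # receive/send buffers step by step towards their default capacity.
    return {"play_rate": 1.0, "restore_buffers": True}


if __name__ == "__main__":
    print(recover_after_upshift(td_ms=800, ts2_ms=500))  # keep playing at 1.1x
    print(recover_after_upshift(td_ms=300, ts2_ms=500))  # restore defaults
```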
Referring to fig. 11, fig. 11 is a flow chart illustrating a multi-code stream adjusting method according to an embodiment of the application.
Optionally, if the code streams of multiple encoding end devices need code rate balancing, the code stream on which the code rate balancing operation is preferentially performed needs to be determined first, which specifically includes the following steps (an illustrative code sketch of the selection follows these steps):
Step 1: the platform end determines the number M of code streams whose video traffic average is currently overloaded;
Step 2: if the number M of code streams with video traffic average overload exceeds 1, the code stream with the smallest ratio of uplink network transmission bandwidth to encoding code rate is selected from the M code streams; for example, code stream 1 has an uplink network transmission bandwidth of 10 Mbps and a video encoding rate of 5 Mbps, giving a ratio of 2, while code stream 2 has an uplink network transmission bandwidth of 9 Mbps and a video encoding rate of 3 Mbps, giving a ratio of 3, so code stream 1 is preferentially selected for the code rate balancing operation; alternatively, a code stream with higher priority is selected for the code rate balancing operation based on the traffic peak balancing technique;
Step 3: code rate adjustment is performed on the selected code stream according to the code rate balancing process for a single code stream.
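A minimal sketch of the selection rule in Step 2 (smallest ratio of uplink bandwidth to encoding rate); the record layout and names are assumptions.

```python
# Minimal sketch of selecting the code stream to balance first when several
# streams are overloaded: pick the one with the smallest ratio of uplink
# bandwidth to encoding rate. The tuple layout is an assumption.

def pick_stream_to_balance(streams: list[tuple[str, float, float]]) -> str:
    """streams: (stream_id, uplink_bandwidth_mbps, encode_rate_mbps)."""
    return min(streams, key=lambda s: s[1] / s[2])[0]


if __name__ == "__main__":
    overloaded = [("stream1", 10.0, 5.0),   # ratio 2
                  ("stream2", 9.0, 3.0)]    # ratio 3
    print(pick_stream_to_balance(overloaded))  # -> stream1
```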
In this embodiment, if the code stream adjustment strategy is to up-regulate the code stream and the dynamic code rate balancing module has previously sent a code stream adjustment strategy of down-regulating the code stream to the encoding end and the playing end, whether to restore the playing rate of the playing end, the capacity of the receiving buffer module and the capacity of the sending buffer module is determined according to the size of the first delay time, so that the video transmission quality and playing quality are restored in time while the video encoding rate, video decoding and playback, and video transmission remain matched end to end, which further improves the viewing experience of the user.
In another embodiment, the platform end includes a video transmission monitoring module and a key frame peak shifting module, wherein: the video transmission monitoring module is used for determining whether key frame collision exists or not based on the first information and the second information; if the key frame collision exists, determining a key frame peak shifting strategy, and sending the key frame peak shifting strategy to a key frame peak shifting module; the key frame peak shifting module sends a key frame peak shifting strategy to the encoding end and the playing end.
Referring to fig. 12-13, fig. 12-13 are schematic diagrams of video streams according to embodiments of the present application.
Illustratively, the abscissa in FIGS. 12-13 represents time in milliseconds (ms), and the ordinate represents the code stream in kilobytes per millisecond (KByte/ms). The encoding end sends video frames in real time at a fixed frequency and period, as shown in fig. 12, where a video key I frame is much larger (typically more than 10 times larger) than an ordinary P frame. Taking video with fps = 25 and GOP = 50 as an example of the traffic characteristics in the network, one video frame is generated every 40 ms, and one key I frame is generated every 50 × 40 ms = 2000 ms. When multiple encoding end devices send video frames at the same time, the more encoding end devices there are, the more likely it is that several of them send I frames at the same time point (an I frame collision). When an I frame collision occurs, a large burst of data appears in the network in a short time, causing data congestion, increased picture delay, stuttering and similar problems, which is particularly noticeable in a 5G/4G wireless network environment. Fig. 13 is a schematic diagram of the video streams when the I frames of two encoding end devices collide.
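The frame timing above can be reproduced with a small sketch; the collision tolerance of one frame slot (40 ms) is an assumption used only for illustration.

```python
# Illustration of the traffic pattern described above: with fps = 25 and
# GOP = 50, a frame is produced every 40 ms and an I frame every 2000 ms.
# The collision tolerance is an assumed value for illustration only.

def i_frame_times_ms(fps: int, gop: int, count: int, offset_ms: int = 0) -> list[int]:
    period_ms = int(gop * 1000 / fps)      # 50 * 40 ms = 2000 ms for fps=25, gop=50
    return [offset_ms + k * period_ms for k in range(count)]


def has_i_frame_collision(times_a: list[int], times_b: list[int],
                          tolerance_ms: int = 40) -> bool:
    # Two devices "collide" when they emit I frames within the same frame slot.
    return any(abs(a - b) <= tolerance_ms for a in times_a for b in times_b)


if __name__ == "__main__":
    cam1 = i_frame_times_ms(fps=25, gop=50, count=5, offset_ms=0)
    cam2 = i_frame_times_ms(fps=25, gop=50, count=5, offset_ms=0)      # same phase
    cam3 = i_frame_times_ms(fps=25, gop=50, count=5, offset_ms=1000)   # staggered
    print(has_i_frame_collision(cam1, cam2))  # True  (I frames coincide)
    print(has_i_frame_collision(cam1, cam3))  # False (peaks are staggered)
```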
The platform end further comprises a video transmission monitoring module and a key frame peak staggering module, wherein the video transmission monitoring module determines the time sequence distribution condition of the key frame based on video frame sending information in the first information and video frame receiving information in the second information, and further determines whether key frame collision exists. If the key frame collision exists, the video transmission monitoring module determines a key frame peak-shifting strategy based on the key frame collision condition and sends the key frame peak-shifting strategy to the key frame peak-shifting module. And after the key frame peak shifting module receives the key frame peak shifting strategy, the encoding end and the playing end are controlled to execute the peak shifting strategy.
In one embodiment, before the key frame collision detection is performed, the first information and the second information are already stored in the cache of the platform end for detection and analysis; for example, the platform end stores the first information and the second information in the cache while forwarding them, or the first information and the second information are acquired by the encoding end and the playing end respectively and then sent directly to the platform end for storage at the platform end. In another embodiment, when the key frame collision detection is performed, the platform end requests the encoding end and the playing end to send the first information and the second information in real time for detection and analysis.
Optionally, during key frame peak staggering, the dynamic buffer balancing module at the platform end controls the decoding adjusting module at the playing end in advance (adjusting the capacity of the receiving buffer module and the playing rate of the video decoding module) and controls the encoding adjusting module at the encoding end in advance (adjusting the capacity of the sending buffer module), so that the playing end can play video smoothly while the key frame peak staggering is maintained.
In this embodiment, the video transmission monitoring module determines whether a key frame collision exists based on the first information and the second information, and determines a key frame peak staggering strategy when a key frame collision exists; the key frame peak shifting module then controls the encoding end and the playing end to stagger the peaks based on this strategy. This reduces the traffic peak from the encoding end to the playing end, balances the bandwidths of the uplink and downlink networks, keeps the traffic from the encoding end to the playing end stable, and improves bandwidth utilization. Because no video data is lost during the adjustment, the fluency of video transmission and video playback is not affected, so picture stuttering during key frame peak staggering is avoided and the quality of video transmission and playback is guaranteed.
In another embodiment, if there is a key frame collision, the key frame peak shifting module is configured to control the playing end to reduce the playing rate and increase the capacity of the receiving buffer module of the playing end; the key frame peak shifting module is also used for controlling the coding end to increase the capacity of the sending buffer module and adjusting the key frame generation time sequence after the preset time.
In an exemplary embodiment, in the case of a key frame collision, the key frame peak shifting module controls the encoding end and the playing end to execute the key frame peak staggering strategy, which specifically includes: controlling the playing end to reduce the playing rate, increasing the capacity of the receiving buffer module of the playing end, controlling the encoding end to increase the capacity of the sending buffer module, and adjusting the key frame generation timing after a preset time.
Referring to fig. 14, fig. 14 is a flowchart illustrating a key frame peak shifting method according to an embodiment of the application.
In one embodiment, the key frame peak staggering method includes the following steps (an illustrative code sketch follows these steps):
Step 1: the playing end sends the downlink network transmission bandwidth information to the platform end, and the video transmission monitoring module at the platform end carries out I frame traffic peak superposition detection: it determines the receiving time interval Ti between the I frame of the current code stream and the I frame of the previous code stream, and determines whether I frame traffic peak superposition occurs by comparing Ti with a preset threshold Ts1;
Step 2: if the receiving time interval Ti is smaller than the preset threshold Ts1, I frame traffic peak superposition has occurred, and the playing rate of the video decoding module at the playing end is reduced so that it plays at α times the normal playing rate, preferably α ∈ [0.8, 1], at which the slowed playback is barely perceptible to the human eye;
Step 3: the key frame peak shifting module increases the capacity of the receiving buffer module of the playing end, calculating the buffer capacity that needs to be added by taking the received-video delay data into account;
Step 4: the key frame peak shifting module increases the capacity of the sending buffer module of the encoding end, calculating the buffer capacity that needs to be added by taking the sent-video delay data into account;
Step 5: after waiting for the preset time, the key frame peak shifting module notifies the encoding end to adjust the generation timing of the I frame of the current code stream, requiring the encoding end to regenerate an I frame at the current time and to continue video encoding according to the specified video GOP period.
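A minimal sketch of this staggering procedure, assuming Ti and Ts1 are given in milliseconds and α = 0.9 as an illustrative default; buffer sizing and the wait time are abstracted away, and all names are assumptions.

```python
# Minimal sketch of the key frame peak staggering procedure (Steps 1-5),
# assuming Ti/Ts1 are in milliseconds; names and defaults are assumptions.

def stagger_key_frames(ti_ms: float,
                       ts1_ms: float,
                       alpha: float = 0.9   # slow-down factor, alpha in [0.8, 1]
                       ) -> dict:
    # Step 1: peaks overlap when the I frame of this stream arrives too close
    # to the I frame of the previous stream.
    if ti_ms >= ts1_ms:
        return {"action": "none"}

    # Steps 2-4: slow playback and enlarge the receive/send buffers so no
    # video data is lost while the I frame timing is being moved.
    # Step 5: after a preset wait, ask the encoder to regenerate its I frame
    # now and continue with the configured GOP period from the new phase.
    return {"action": "stagger",
            "play_rate": alpha,
            "enlarge_buffers": True,
            "regenerate_i_frame": True}


if __name__ == "__main__":
    print(stagger_key_frames(ti_ms=20, ts1_ms=200))    # peaks overlap -> stagger
    print(stagger_key_frames(ti_ms=1000, ts1_ms=200))  # no collision -> none
```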
In this embodiment, the key frame peak shifting module controls the encoding end and the playing end to stagger the key frame peaks, and the playing rate of the playing end, the capacity of the receiving buffer module and the like are adjusted during peak staggering, so that the video encoding rate, video decoding and playback, and video transmission are matched end to end and video playback at the playing end remains stable during key frame peak staggering.
In another embodiment, the playing end is further configured to send the downlink network transmission bandwidth information to the platform end, and after the key frame peak shifting module sends a key frame peak shifting policy to the encoding end and the playing end, the key frame peak shifting module is further configured to determine a second delay time based on the downlink network transmission bandwidth information; if the second delay time is greater than the preset threshold, the key frame peak shifting module is used for controlling the playing end to increase the playing speed; if the second delay time is smaller than the preset threshold, the key frame peak shifting module is used for restoring the playing speed of the playing end, the capacity of the receiving buffer module and the capacity of the sending buffer module to an initial state.
That is, after the key frame peak shifting module has sent the key frame peak staggering strategy to the encoding end and the playing end and the strategy has been executed, the playing end sends the downlink network transmission bandwidth information to the platform end, and the platform end determines the video buffering delay time of the receiving buffer module of the playing end, namely the second delay time, based on the downlink network transmission bandwidth information. The second delay time is compared with a preset threshold; if the second delay time is greater than the preset threshold, the video buffering delay is too large, and the key frame peak shifting module controls the playing end to increase the playing rate so as to reduce the video buffering delay; if the second delay time is smaller than the preset threshold, the video buffering delay of the playing end is normal, and the playing rate of the playing end, the capacity of the receiving buffer module and the capacity of the sending buffer module are restored to the initial state.
In one embodiment, as shown in fig. 14, after the I-frame traffic peak is staggered, the key frame peak staggering method further includes:
Step 1: the key frame peak shifting module calculates the video buffering delay Td of the receiving buffer module at the playing end, namely the second delay time, and compares it with the preset threshold Ts2;
Step 2: if the video buffering delay Td is greater than the preset threshold Ts2, the video buffering delay is too large, and the playing rate of the video decoding module at the playing end is increased so that video is played at β times the normal rate, preferably β ∈ [1, 1.2], at which the accelerated playback is barely perceptible to the human eye;
Step 3: if the video buffering delay Td is smaller than the preset threshold Ts2, the video delay is normal, and the playing rate of the playing end, the capacity of the receiving buffer module and the capacity of the sending buffer module are each restored.
Referring to fig. 15, fig. 15 is a flow chart of a multi-code stream adjusting method according to an embodiment of the application.
Optionally, if there are multiple code streams whose I frame traffic peaks need to be balanced, the code stream to be adjusted first is determined. The specific steps are as follows (an illustrative code sketch of the selection follows these steps):
Step 1: the platform end determines the number N of code streams with I frame traffic peak superposition;
Step 2: if the number N of code streams with I frame traffic peak superposition exceeds 1, among the code streams whose I frame peaks overlap, a code stream whose code stream adjustment strategy is evaluated as down-regulating the code stream is selected as a first-priority adjustment code stream, a code stream evaluated as keeping the code stream unchanged is a second-priority adjustment code stream, and a code stream evaluated as up-regulating the code stream is a third-priority adjustment code stream. If several code streams within a priority need adjustment, the code stream with the smaller ratio of uplink network transmission bandwidth to encoding rate is adjusted first; for example, if the first-priority adjustment code streams include code stream 1 and code stream 2, where code stream 1 has an uplink network transmission bandwidth of 10 Mbps and an encoding rate of 5 Mbps, giving a ratio of 2, and code stream 2 has an uplink network transmission bandwidth of 9 Mbps and an encoding rate of 3 Mbps, giving a ratio of 3, code stream 1 is selected for priority adjustment;
Step 3: I frame timing adjustment is performed on the selected code stream according to the I frame traffic peak balancing process for a single code stream.
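A minimal sketch of the priority rule in Step 2, reusing the bandwidth-to-rate ratio as the tie-break within a priority tier; the record layout and field names are assumptions.

```python
# Minimal sketch of choosing which stream's I frame timing to adjust first
# when several streams' I frame peaks overlap. Priority follows the text:
# down-regulated streams first, then unchanged, then up-regulated; ties are
# broken by the smallest uplink-bandwidth / encoding-rate ratio.

PRIORITY = {"down": 0, "unchanged": 1, "up": 2}

def pick_stream_for_i_frame_shift(streams: list[dict]) -> str:
    """Each stream dict: {'id', 'policy', 'uplink_mbps', 'encode_mbps'}."""
    return min(streams,
               key=lambda s: (PRIORITY[s["policy"]],
                              s["uplink_mbps"] / s["encode_mbps"]))["id"]


if __name__ == "__main__":
    overlapping = [
        {"id": "stream1", "policy": "down", "uplink_mbps": 10.0, "encode_mbps": 5.0},
        {"id": "stream2", "policy": "down", "uplink_mbps": 9.0, "encode_mbps": 3.0},
        {"id": "stream3", "policy": "up",   "uplink_mbps": 8.0, "encode_mbps": 2.0},
    ]
    print(pick_stream_for_i_frame_shift(overlapping))  # -> stream1 (ratio 2 < 3)
```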
Referring to fig. 16-17, fig. 16-17 are schematic diagrams of video streams according to embodiments of the present application.
As shown in figs. 16-17, the abscissa represents time in seconds (s) and the ordinate represents the code stream in bits per 100 milliseconds (Bits/100ms). The encoding end comprises 8 encoding end devices. With the key frame peak staggering adjustment method of this embodiment, the traffic peak of 94 Mbps during the I frame collision is reduced to 53 Mbps after peak staggering, i.e. the traffic peak is reduced by (94 − 53)/94 ≈ 43.6%.
After executing the key frame peak staggering strategy, this embodiment adjusts the playing rate of the playing end, the capacity of the receiving buffer module and the capacity of the sending buffer module according to the second delay time, so that the traffic peak when key frames overlap is reduced, network congestion is relieved, and the bandwidth utilization is further improved.
It should be understood that, although the steps in the flowcharts of the above embodiments are shown sequentially as indicated by the arrows, these steps are not necessarily performed in the order indicated by the arrows. Unless explicitly stated herein, the order of execution of these steps is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in the flowcharts of the above embodiments may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different moments, and the order of these sub-steps or stages is not necessarily sequential; they may be performed in turn or alternately with at least part of the other steps or of the sub-steps or stages of other steps.
Those skilled in the art will appreciate that implementing all or part of the above-described methods in accordance with the embodiments may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed may comprise the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (Magnetoresistive Random Access Memory, MRAM), ferroelectric memory (Ferroelectric Random Access Memory, FRAM), phase change memory (Phase Change Memory, PCM), graphene memory, and the like. Volatile memory can include random access memory (Random Access Memory, RAM) or external cache memory, and the like. By way of illustration, and not limitation, RAM can take various forms such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM). The databases referred to in the embodiments provided herein may include at least one of a relational database and a non-relational database. The non-relational database may include, but is not limited to, a blockchain-based distributed database, and the like. The processor referred to in the embodiments provided in the present application may be a general-purpose processor, a central processing unit, a graphics processor, a digital signal processor, a programmable logic unit, a data processing logic unit based on quantum computing, or the like, but is not limited thereto.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The foregoing examples illustrate only a few embodiments of the application and are described in detail herein without thereby limiting the scope of the application. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the application, which are all within the scope of the application. Accordingly, the scope of the application should be assessed as that of the appended claims.

Claims (12)

1. A video transmission system, comprising an encoding end, a platform end and a playing end, wherein:
The encoding end is used for acquiring first information, wherein the first information comprises video frame sending information of the encoding end within a preset duration, and the first information is sent to the playing end;
The playing end is used for receiving the first information and obtaining second information, and the second information comprises video frame receiving information and video frame playing buffer information of the playing end in the preset duration;
The method is also used for determining network congestion conditions within the preset duration based on the first information and the second information; and determining a video frame playing condition within the preset duration based on the second information, wherein the video frame playing condition comprises whether video playing is blocked or not;
the network congestion condition and the video frame playing condition are sent to the platform end;
the platform end is used for receiving the network congestion condition and the video frame playing condition and determining a code stream adjustment strategy based on the network congestion condition and the video frame playing condition;
The platform end is also used for sending the code stream adjustment strategy to the encoding end and the playing end, and the code stream adjustment strategy is used for adjusting the code stream;
The platform end comprises a video transmission monitoring module and a dynamic code rate balancing module, wherein:
the coding end is also used for sending the uplink network transmission bandwidth information to the platform end;
the playing end is also used for sending the downlink network transmission bandwidth information to the platform end;
The video transmission monitoring module is used for receiving the uplink network transmission bandwidth information and the downlink network transmission bandwidth information;
The video transmission monitoring module is further used for determining a code stream adjustment strategy based on the uplink network transmission bandwidth information, the downlink network transmission bandwidth information, the network congestion condition and the video frame playing condition, and sending the code stream adjustment strategy to the dynamic code rate balancing module;
if the code stream adjustment strategy is to adjust the code stream downwards, the dynamic code rate balancing module is used for controlling the playing end to reduce the playing rate and increasing the capacity of a receiving buffer module of the playing end based on the code stream adjustment strategy;
the dynamic code rate balancing module is also used for controlling the coding end to increase the capacity of the sending buffer module and reduce the coding code rate;
If the code stream adjustment strategy is an up-regulation code stream, and the dynamic code rate balancing module sends the code stream adjustment strategy of the down-regulation code stream to the coding end and the playing end, the dynamic code rate balancing module is further used for determining a first delay time based on the downlink network transmission bandwidth information;
if the first delay time is greater than a preset threshold, the dynamic code rate balancing module is used for controlling the playing end to improve the playing rate based on the code stream adjustment strategy;
and if the first delay time is smaller than a preset threshold value, the dynamic code rate balancing module is used for restoring the playing rate of the playing end, the capacity of the receiving buffer module and the capacity of the sending buffer module to an initial state.
2. The system of claim 1, wherein the determining a network congestion condition within the preset duration based on the first information and the second information comprises:
determining a first network state and/or a second network state based on the first information and the second information; wherein the first network state is associated with a video frame loss rate and the second network state is associated with a video frame delay;
the network congestion situation is determined based on the first network state and/or the second network state.
3. The system of claim 2, wherein the video frame transmission information comprises a video frame transmission number and the video frame reception information comprises a video frame reception number; determining a first network state based on the first information and the second information, comprising:
determining a frame loss rate of the video frames in the preset duration based on the video frame sending number and the video frame receiving number;
If the video frame loss rate is greater than a first threshold value, determining that the first network state is an overload state; or if the video frame loss rate is less than a second threshold value, determining that the first network state is a normal state; or if the video frame loss rate is greater than or equal to the second threshold and less than the first threshold, determining that the first network state is a low-load state.
4. The system of claim 2, wherein the video frame transmission information comprises a video frame transmission interval and the video frame reception information comprises a video frame reception interval; determining a second network state based on the first information and the second information, comprising:
determining the change trend of the video frame time delay in the preset duration based on the video frame sending interval and the video frame receiving interval;
If the quantized value of the variation trend of the video frame time delay is larger than a third threshold value, and the overload frequency of the network bandwidth exceeds a fourth threshold value within the preset time period and/or the overload time of the network bandwidth exceeds a fifth threshold value, determining that the second network state is an overload state;
Or if the quantized value of the variation trend of the video frame time delay is smaller than a sixth threshold value, determining that the second network state is a low-load state;
Or if the quantized value of the variation trend of the video frame time delay is smaller than or equal to a third threshold value and larger than or equal to a sixth threshold value, and the normal times of the network bandwidth exceeds a seventh threshold value and/or the normal time of the network bandwidth exceeds an eighth threshold value within the preset time period, determining that the second network state is a normal state.
5. The system of claim 4, wherein the third threshold and the sixth threshold are dynamically adjusted based on an absolute value of a difference between a quantized value of a trend of the video frame delay and the third threshold or the sixth threshold.
6. The system according to claim 2, wherein determining the network congestion condition based on the first network state and/or the second network state comprises:
The first network state and/or the second network state are/is in an overload state, and the network bandwidth in the preset duration is determined to be in the overload state;
The first network state and the second network state are both normal states, and the network bandwidth in the preset duration is determined to be in the normal state;
And if the first network state is a normal state and the second network state is a low-load state, determining that the network bandwidth within the preset duration is in the low-load state.
7. The system of any one of claims 1-6, wherein the video frame reception information comprises a video frame reception interval and the video frame play buffer information comprises a video frame buffer time; based on the second information, determining the playing condition of the video frame within the preset duration comprises the following steps:
Determining the frame playing jitter time in the preset duration according to the video frame receiving interval and the video frame buffering time;
If the frame playing jitter time is smaller than or equal to a ninth threshold value, determining that video playing is not blocked in the preset duration; or if the frame playing jitter time is greater than a ninth threshold value, determining that video playing is blocked in the preset duration.
8. The system of any one of claims 1-6, wherein the video frame reception information comprises a video frame reception interval and the video frame play buffer information comprises a video frame buffer time; based on the second information, determining the playing condition of the video frame within the preset duration comprises the following steps:
counting the number of frame play jitter in the preset duration according to the video frame receiving interval and the video frame buffering time;
If the number of the frame play jitter times is smaller than or equal to a tenth threshold value, determining that video play is not blocked in the preset duration; or if the number of the frame play jitter is greater than a tenth threshold, determining that video play is blocked in the preset duration.
9. The system of any of claims 1-6, wherein receiving the network congestion condition and video frame play condition and determining a code stream adjustment policy based on the network congestion condition and video frame play condition comprises:
When video playing is blocked and network bandwidth is in an overload state within the preset duration, the code stream adjustment strategy is used for indicating the playing end to downwards adjust the code stream; or alternatively
And when the video playing is not blocked and the network bandwidth is in a normal state within the preset time, the code stream adjustment strategy is used for indicating the playing end to up-regulate the code stream.
10. The system of claim 1, wherein the platform side comprises a video transmission monitoring module and a key frame peak shifting module, wherein:
The video transmission monitoring module is used for determining whether key frame collision exists or not based on the first information and the second information;
if the key frame collision exists, determining a key frame peak shifting strategy, and sending the key frame peak shifting strategy to the key frame peak shifting module;
and the key frame peak shifting module sends the key frame peak shifting strategy to the coding end and the playing end.
11. The system of claim 10, wherein if there is a key frame collision, the key frame peak shifting module is configured to control the playing end to decrease a playing rate and increase a capacity of a receiving buffer module of the playing end;
The key frame peak shifting module is also used for controlling the coding end to increase the capacity of the sending buffer module and adjusting the key frame generation time sequence after the preset time.
12. The system of claim 11, wherein the playback end is further configured to send downstream network transmission bandwidth information to the platform end, and wherein the key frame peak shifting module is further configured to determine a second delay time based on the downstream network transmission bandwidth information after the key frame peak shifting module sends the key frame peak shifting policy to the encoding end and the playback end;
If the second delay time is greater than a preset threshold, the key frame peak shifting module is used for controlling the playing end to improve the playing speed;
and if the second delay time is smaller than a preset threshold value, the key frame peak shifting module is used for restoring the playing speed of the playing end, the capacity of the receiving buffer module and the capacity of the sending buffer module to an initial state.
CN202210893047.XA 2021-12-17 2022-07-27 Video transmission system Active CN115086779B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111551164.XA CN114222194A (en) 2021-12-17 2021-12-17 Video code stream adjusting method, device and system
CN202111551164X 2021-12-17

Publications (2)

Publication Number Publication Date
CN115086779A CN115086779A (en) 2022-09-20
CN115086779B true CN115086779B (en) 2024-04-16

Family

ID=80703537

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111551164.XA Pending CN114222194A (en) 2021-12-17 2021-12-17 Video code stream adjusting method, device and system
CN202210893047.XA Active CN115086779B (en) 2021-12-17 2022-07-27 Video transmission system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202111551164.XA Pending CN114222194A (en) 2021-12-17 2021-12-17 Video code stream adjusting method, device and system

Country Status (1)

Country Link
CN (2) CN114222194A (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115361585B (en) * 2022-08-19 2023-11-07 广州市百果园信息技术有限公司 Video playing and clamping prediction method, device, equipment and storage medium
CN115334307B (en) * 2022-10-11 2023-02-10 浙江大华技术股份有限公司 Data transmission method, front-end equipment, video acquisition system and medium
CN116233472B (en) * 2023-05-08 2023-07-18 湖南马栏山视频先进技术研究院有限公司 Audio and video synchronization method and cloud processing system
CN116320536B (en) * 2023-05-16 2023-08-18 瀚博半导体(上海)有限公司 Video processing method, device, computer equipment and computer readable storage medium
CN116828229B (en) * 2023-08-30 2023-11-24 湖南马栏山视频先进技术研究院有限公司 Transmission method and system for audio and video streams

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104581340B (en) * 2015-01-16 2018-02-16 京东方科技集团股份有限公司 Client, stream medium data method of reseptance and stream medium data transmission system

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1798085A (en) * 2004-12-28 2006-07-05 华为技术有限公司 Method for adjusting size of transmission buffer in control sub layer of wireless link
CN101583025A (en) * 2009-06-11 2009-11-18 中兴通讯股份有限公司 Streaming media playing method and device
CN101702711A (en) * 2009-10-30 2010-05-05 中兴通讯股份有限公司 Method and terminal for playing data
CN101753977A (en) * 2009-12-31 2010-06-23 中兴通讯股份有限公司 Method and device for adjusting network digital video play speed
CN101867804A (en) * 2010-06-01 2010-10-20 中兴通讯股份有限公司 Internet protocol television direct broadcast system and method
CN101938483A (en) * 2010-09-03 2011-01-05 中兴通讯股份有限公司 Method and system for distributing live broadcast contents
CN102547389A (en) * 2012-01-16 2012-07-04 何建亿 Network-adaptive streaming media quality of service (QoS) control method
WO2014107946A1 (en) * 2013-01-08 2014-07-17 北京信威通信技术股份有限公司 Method for smoothing code rate during real time video transmission in wireless network
CN105119755A (en) * 2015-09-10 2015-12-02 广州市百果园网络科技有限公司 Jitter buffer regulation method and device
CN105430532A (en) * 2015-11-18 2016-03-23 南京创维信息技术研究院有限公司 Control method and system for adaptive adjustment of video data transmission
CN107529097A (en) * 2016-06-20 2017-12-29 北京信威通信技术股份有限公司 A kind of method and device of adaptive regulating video buffer size
CN108259948A (en) * 2018-03-30 2018-07-06 武汉斗鱼网络科技有限公司 A kind of playback method, device, computer and storage medium that audio and video are broadcast live
CN109462773A (en) * 2018-08-31 2019-03-12 北京潘达互娱科技有限公司 A kind of plug-flow method, apparatus, electronic equipment and storage medium
CN110933516A (en) * 2018-09-19 2020-03-27 华为技术有限公司 Multimedia live broadcast method, device and equipment
CN109905326A (en) * 2019-03-26 2019-06-18 武汉大学 A kind of rate drawdown parameter optimization method based on the Congestion Level SPCC factor
CN111447459A (en) * 2020-05-14 2020-07-24 杭州当虹科技股份有限公司 Rtmp self-adaptive code rate realizing method
CN111615006A (en) * 2020-05-29 2020-09-01 高小翎 Video code conversion transmission control system based on network state self-evaluation
CN112019873A (en) * 2020-09-08 2020-12-01 北京金山云网络技术有限公司 Video code rate adjusting method and device and electronic equipment
CN112399141A (en) * 2020-10-16 2021-02-23 浙江大华技术股份有限公司 Data transmission method based on multiple front-end video devices and related device
CN112104879A (en) * 2020-11-13 2020-12-18 腾讯科技(深圳)有限公司 Video coding method and device, electronic equipment and storage medium
CN112953922A (en) * 2021-02-03 2021-06-11 西安电子科技大学 Self-adaptive streaming media control method, system, computer equipment and application

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Liu Gaoping; Song Zhihuan. Transmission control method for H.264 video streaming adaptive to network bandwidth. Journal of Zhejiang University (Engineering Science). 2012, (12), full text. *

Also Published As

Publication number Publication date
CN115086779A (en) 2022-09-20
CN114222194A (en) 2022-03-22

Similar Documents

Publication Publication Date Title
CN115086779B (en) Video transmission system
US8412364B2 (en) Method and device for sending and playing streaming data
US6075768A (en) Fair bandwidth sharing for video traffic sources using distributed feedback control
US6910079B2 (en) Multi-threshold smoothing
EP2137937B1 (en) Bandwidth allocation control in multiple video streaming
US10602139B2 (en) Embedded multimedia systems with adaptive rate control for power efficient video streaming
CN101971629B (en) Device and method for adaptation of target rate of video signals
CN104394484A (en) Wireless live streaming media transmission method
CN109660879B (en) Live broadcast frame loss method, system, computer equipment and storage medium
US20110138427A1 (en) Video Service Buffer Management in a Mobile Rate Control Enabled Network
CN109729437B (en) Streaming media self-adaptive transmission method, terminal and system
CN107205160A (en) A kind of player method and device of the video issued for server
US8483055B2 (en) Adaptive frame rate control for video in a resource limited system
JP2001169284A (en) Quantization step setting method in moving image encoder and moving image encoder using the method
JP2008507922A (en) Trick mode and speed transition
CN104967884A (en) Code stream switching method and code stream switching device
CN101562615A (en) Transmission method for MPEG-4 code based multimedia data stream self-adapting network bandwidth
KR100592547B1 (en) Packet scheduling method for streaming multimedia
US20170142029A1 (en) Method for data rate adaption in online media services, electronic device, and non-transitory computer-readable storage medium
CN107509120A (en) A kind of streaming media self-adapting transmission method based on buffer underflow probability Estimation
CN112866752A (en) Video code stream self-adaptive network bandwidth method, device, equipment and medium
JPH10336626A (en) Transfer method and transfer device for video data
JP2017069849A (en) Video control device, video distribution system and video control method
CN106210785B (en) Self-adaptive unidirectional control method and system for media stream network
US20190089759A1 (en) Video encoding circuit and wireless video transmission apparatus and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant