CN113542798A - Video stream transmission method, electronic device and storage medium - Google Patents

Video stream transmission method, electronic device and storage medium

Info

Publication number
CN113542798A
CN113542798A (application CN202110604920.4A)
Authority
CN
China
Prior art keywords
video stream
path
transmission
current video
level
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110604920.4A
Other languages
Chinese (zh)
Inventor
钟广海
严敏
李翔
叶奇
唐斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN202110604920.4A priority Critical patent/CN113542798A/en
Publication of CN113542798A publication Critical patent/CN113542798A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/23418Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/647Control signaling between network components and server or clients; Network processes for video distribution between server and clients, e.g. controlling the quality of the video stream, by dropping packets, protecting content from unauthorised alteration within the network, monitoring of network load, bridging between two different networks, e.g. between IP and wireless
    • H04N21/64784Data processing by the network
    • H04N21/64792Controlling the complexity of the content stream, e.g. by dropping packets

Abstract

The application discloses a video stream transmission method, an electronic device, and a computer-readable storage medium. The method includes: acquiring a current video stream, the current video stream comprising a plurality of video frames; ranking the plurality of video frames into levels; determining whether the path supports transmission of the current video stream; if transmission of the current video stream is not supported, discarding the video frames of the lowest level to update the current video stream; repeating the above steps until the path supports transmission of the current video stream, or the level of the video frames remaining in the current video stream is higher than a preset level; and transmitting the current video stream over the path. In this way, the efficiency with which the path transmits the current video stream can be improved.

Description

Video stream transmission method, electronic device and storage medium
Technical Field
The present application relates to the field of video transmission, and in particular, to a method for transmitting a video stream, an electronic device, and a computer-readable storage medium.
Background
In scenarios such as entertainment and surveillance, there is often a need for wireless transmission, i.e., a video stream captured at a sending end must be transmitted to a receiving end over a wireless network. Taking a surveillance scenario as an example, a mobile monitoring device is deployed in a monitored area, captures information from that area to form a video stream, and transmits the video stream to the receiving end over a wireless network.
However, existing methods of transmitting video streams are inefficient. To meet the transmission requirement, a method for improving the wireless transmission efficiency is needed.
Disclosure of Invention
The application provides a video stream transmission method, an electronic device, and a computer-readable storage medium, which can address the problem that existing video stream transmission methods are inefficient.
In order to solve the above technical problem, one technical solution adopted by the present application is to provide a video stream transmission method, the method comprising: acquiring a current video stream, wherein the current video stream comprises a plurality of video frames; ranking the plurality of video frames into levels; determining whether a path supports transmission of the current video stream; if transmission of the current video stream is not supported, discarding the video frames of the lowest level to update the current video stream; repeating the above steps until the path supports transmission of the current video stream, or the level of the video frames remaining in the current video stream is higher than a preset level; and transmitting the current video stream over the path.
In order to solve the above technical problem, another technical solution adopted by the present application is: an electronic device is provided, which comprises a processor and a memory connected with the processor, wherein the memory stores program instructions; the processor is configured to execute the program instructions stored by the memory to implement the above-described method.
In order to solve the above technical problem, the present application adopts another technical solution: there is provided a computer readable storage medium storing program instructions that when executed are capable of implementing the above method.
In this way, the video frames in the current video stream are classified into levels, and, according to the transmission capability of the path, it is decided whether and which video frames whose level is less than or equal to the preset level are discarded from the current video stream. That is, when the transmission capability of the path is insufficient to support transmission of video frames of all levels, video frames in the current video stream are discarded in order from the lowest level upward, so that the efficiency with which the path transmits the current video stream can be improved.
Drawings
Fig. 1 is a schematic flowchart of an embodiment of a video stream transmission method provided in the present application;
fig. 2 is a schematic view of a video streaming scenario when a path of the present application includes a cellular network and a WIFI network;
FIG. 3 is a schematic view of the detailed process of S13 in FIG. 1;
fig. 4 is a detailed flowchart of S133 in fig. 3;
FIG. 5 is a flow chart illustrating matching of corresponding paths for each level of video frames according to the present application;
fig. 6 is a schematic flow chart of another embodiment of a method for transmitting a video stream provided by the present application;
fig. 7 is a detailed flowchart of S22 in fig. 6;
FIG. 8 is a schematic flow chart illustrating the determination of the level of a video frame to be transmitted in a next video stream according to the present application;
fig. 9 is a schematic flowchart of a video stream transmission method according to another embodiment of the present application;
FIG. 10 is a flow chart illustrating an embodiment of a method for receiving a video stream provided herein;
FIG. 11 is a schematic structural diagram of an embodiment of an electronic device of the present application;
FIG. 12 is a schematic structural diagram of an embodiment of a computer-readable storage medium of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and "third" in this application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any indication of the number of technical features indicated. Thus, a feature defined as "first," "second," or "third" may explicitly or implicitly include at least one of the feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless explicitly specifically limited otherwise.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Fig. 1 is a schematic flowchart of an embodiment of a video stream transmission method provided in the present application. It should be noted that, if the result is substantially the same, the flow sequence shown in fig. 1 is not limited in this embodiment. As shown in fig. 1, the present embodiment may include:
s11: and acquiring the current video stream.
The current video stream comprises a plurality of video frames.
The current video stream referred to herein is one video slice in the entire video stream to be transmitted. It may be real-time (e.g., live streaming) or non-real-time.
The execution subject of this embodiment is a wireless sending end, for example a mobile monitoring device (mobile robot, unmanned aerial vehicle, camera), a mobile phone, a computer, or the like.
S12: a plurality of video frames are ranked.
The plurality of video frames may be ranked based on the frame type and its importance. Frame types may include key frames (I-frames) and non-key frames (P-frames and B-frames; hereinafter only P-frames are used as an example). P-frames can be further divided into emphasized P-frames and non-emphasized P-frames. For example, in a surveillance scenario, P-frames are divided into emphasized and non-emphasized P-frames based on traffic density or other smart detection events. An I-frame is more important than a P-frame, and an emphasized P-frame is more important than a non-emphasized P-frame. For SVC coding, there are at least two enhancement layers, and the lower the enhancement layer, the higher its level. Table 1 below is an example of the ranking result:
TABLE 1
Video frame level    Video frame type
Level1               Key frame fragments, e.g. an I-frame and its subsequent P-frames, or the base layer of SVC coding
Level2               Non-key frame fragments, e.g. emphasized P-frames, or a lower enhancement layer of SVC coding
Level3               Non-key frame fragments, e.g. non-emphasized P-frames, or a higher enhancement layer of SVC coding
The levels are ranked from high to low as Level1 > Level2 > Level3; the higher a video frame's level, the higher its importance.
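As an illustration only, the following sketch shows one way such a ranking could be implemented in code; it is not taken from the patent, and the frame fields (frame_type, is_emphasized, svc_layer) are assumed for the example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VideoFrame:
    seq: int                          # frame sequence number
    frame_type: str                   # "I" or "P" (B-frames omitted, as in the example above)
    is_emphasized: bool = False       # set by smart detection, e.g. traffic density
    svc_layer: Optional[int] = None   # None for non-SVC; 0 = base layer, 1, 2, ... = enhancement layers
    size_bits: int = 0                # encoded size of the frame

def rank_frame(frame: VideoFrame) -> int:
    """Return the frame's level following Table 1: 1 is most important, 3 is least."""
    if frame.svc_layer is not None:
        # SVC: base layer -> Level1, lowest enhancement layer -> Level2, higher layers -> Level3
        if frame.svc_layer == 0:
            return 1
        return 2 if frame.svc_layer == 1 else 3
    if frame.frame_type == "I":
        return 1
    return 2 if frame.is_emphasized else 3
```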
S13: and judging whether the path supports the transmission of the current video stream.
There may be one or more paths that can be used to transmit the current video stream, and the paths may be determined according to the network conditions of the actual transmission scenario. The paths may be cellular networks (e.g., 4G, 5G), WIFI networks, and the like. Fig. 2 is an example of a scenario for video streaming when the path includes a cellular network and a WIFI network. As shown in fig. 2, the mobile monitoring device may acquire a video stream, and may transmit the video stream to the base station through the cellular network, and then the base station transmits the video stream to the cloud platform; or the video stream can be transmitted to the wireless device through the WIFI network and then transmitted to the cloud platform by the wireless device.
If there is only one path, this step may be implemented as follows: determine whether the actual bandwidth of the path is greater than the required bandwidth. If so, the path is judged to support transmission of the current video stream; otherwise, the path is judged not to support transmission of the current video stream.
If there are multiple paths, this step determines whether the multiple paths support transmission of the current video stream, that is, whether any single path among them supports transmission of the current video stream, or whether a combination of the multiple paths supports it.
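Purely for illustration, a minimal sketch of the support check just described (the function names and the use of bandwidth sums for the path combination are assumptions, not the patent's wording):

```python
def single_path_supports(actual_bw_bps: float, required_bw_bps: float) -> bool:
    """Single-path case of S13: the path supports the current video stream only
    if its actual (measured) bandwidth exceeds the stream's required bandwidth."""
    return actual_bw_bps > required_bw_bps

def multi_path_supports(path_bws_bps: list[float], required_bw_bps: float) -> bool:
    """Multi-path case of S13: supported if some single path suffices, or if the
    combination of paths (here, the sum of their bandwidths) suffices."""
    if any(single_path_supports(bw, required_bw_bps) for bw in path_bws_bps):
        return True
    return sum(path_bws_bps) > required_bw_bps
```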
If the transmission of the current video stream is not supported, S14 is executed.
S14: the lowest ranked video frames are dropped to update the current video stream.
When discarding video frames from the current video stream so that the path can support its transmission, the video frames are discarded in order from the lowest level to higher levels.
For example, suppose the video frames fall into three levels, Level1, Level2, and Level3, ranked in importance from high to low as Level1 > Level2 > Level3. First determine whether the path supports transmission of the Level1, Level2, and Level3 frames together; if not, discard the Level3 frames. Then determine whether the path supports transmission of the Level1 and Level2 frames; if not, discard the Level2 frames.
After this step is executed, the process jumps to S13 to repeat the above steps until the path supports the transmission of the current video stream, or the level of the video frames remaining in the current video stream is higher than the preset level.
It should be understood that video frames whose level is higher than the preset level are the frames the receiving end must receive in order to decode. Assuming the frames above the preset level are those of Level1, then if only Level1 frames remain in the current video stream and the path still cannot support its transmission, S15 is performed directly to ensure normal transmission of the Level1 frames.
S15: the current video stream is transmitted using the path.
If a single path supports transmission of the current video stream, that single path is used directly to transmit it. If a combination of paths supports transmission of the current video stream, the current video stream is distributed across the different paths for transmission.
Through this embodiment, the video frames in the current video stream are classified into levels, and, according to the transmission capability of the path, it is decided whether and which video frames whose level is less than or equal to the preset level are discarded from the current video stream. That is, when the transmission capability of the path is insufficient to support transmission of video frames of all levels, video frames in the current video stream are discarded in order from the lowest level upward, which improves the efficiency with which the path transmits the current video stream.
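As a rough, non-authoritative sketch of the S11-S15 loop (the level_of and size_of callables, the bandwidth estimate, and the send callback are assumptions; level_of could be the rank_frame sketch given after Table 1):

```python
def transmit_with_level_dropping(frames, path_bandwidth_bps, segment_duration_s,
                                 level_of, size_of, preset_level=1, send=print):
    """S13-S15: drop frames level by level (least important first) until the path
    can carry what remains, or only frames at or above the preset level are left."""
    current = list(frames)
    while True:
        required_bps = sum(size_of(f) for f in current) / segment_duration_s
        if path_bandwidth_bps > required_bps:
            break                                  # the path supports the current stream
        levels_present = {level_of(f) for f in current}
        if max(levels_present) <= preset_level:
            break                                  # only indispensable frames remain: send anyway
        lowest = max(levels_present)               # numerically largest level = least important
        current = [f for f in current if level_of(f) != lowest]
    for f in current:
        send(f)                                    # S15: transmit what remains over the path
    return current
```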
Referring to fig. 3 in combination, in the case that there are a plurality of paths, S13 may include the following sub-steps:
s131: it is determined whether the actual bandwidth of a single path exists that is greater than the required bandwidth of the current video stream.
The multiple paths/single path support transmission of the current video stream if the actual bandwidth of a single path among the multiple paths is greater than the required bandwidth of the current video stream. When a single path supports the transmission of the current video stream, only the single path is transmitted, and turning off or sleeping the other path can reduce energy consumption.
If yes, executing S132; if not, S133 is executed.
Taking the example that the multiple paths include a cellular network and a WIFI network, an implementation manner of S131 is given below:
Priorities are set for the cellular network and the WIFI network. For example, the WIFI network may be prioritized over the cellular network to reduce cost. First, determine whether the actual bandwidth of the WIFI network is greater than the required bandwidth of the current video stream. If so, proceed to S132 and transmit the current video stream over the WIFI network; otherwise, determine whether the actual bandwidth of the cellular network is greater than the required bandwidth of the current video stream. If so, proceed to S132; otherwise, proceed to S133.
In other implementations, the actual bandwidth of the WIFI network and that of the cellular network can each be compared against the required bandwidth of the current video stream, and whether to proceed to S132 or S133 is then decided based on the results.
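For illustration only, a minimal sketch of the prioritized single-path selection described above (the priority order, path names, and return convention are assumptions):

```python
def choose_single_path(path_bws_bps: dict, required_bw_bps: float,
                       priority=("wifi", "cellular")):
    """S131 with priorities: try the preferred (e.g. cheaper) path first, then the
    next; return the chosen path name, or None if no single path suffices."""
    for name in priority:
        if path_bws_bps.get(name, 0.0) > required_bw_bps:
            return name       # -> S132: transmit over this single path
    return None               # -> S133: consider a combination of paths
```

For example, under the assumed names, choose_single_path({"wifi": 8e6, "cellular": 5e6}, 6e6) would return "wifi".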
S132: the current video stream is transmitted using a single path.
S133: it is determined whether a combination of the plurality of paths supports transmission of the current video stream.
When multiple paths exist and no single one of them is sufficient to support transmission of the current video stream, it can be determined whether a combination of the multiple paths supports its transmission; if so, the video frames in the current video stream are split across the paths by level, which improves transmission efficiency.
If the sum of the actual bandwidths of the multiple paths is greater than the required bandwidth, the combination of the multiple paths is judged to support transmission of the current video stream; otherwise, it is judged not to. Alternatively, when the sum of the actual bandwidths of the multiple paths is greater than the required bandwidth, a further determination may be made, and its result used to decide whether the combination supports transmission of the current video stream; see the following embodiment for a specific implementation.
If the combination of the multiple paths does not support the transmission of the current video stream, S134 is executed.
S134: it is determined that the plurality of paths do not support transmission of the current video stream.
Referring to fig. 4 in combination, if a further determination is needed in case the sum of the actual bandwidths of the plurality of paths is greater than the required bandwidth, S133 may include the following sub-steps:
s1331: and judging whether the sum of the actual bandwidths of the paths is larger than the required bandwidth or not.
If so, then S1332 is performed.
S1332: a corresponding path is matched for each level of video frames.
The corresponding path may be randomly matched for each level of video frames remaining in the current video stream.
However, in order to improve the transmission efficiency of the high-level video frames, a path with a high actual bandwidth may be matched for the high-level video frames, and a path with a low actual bandwidth may be matched for the low-level video frames. The following is illustrated in connection with fig. 5:
As shown in fig. 5, the actual bandwidth of the cellular network is w1, the actual bandwidth of the WIFI network is w2, the code stream size (required bandwidth) of the current video stream is k, the code stream size of the high-level video frames is kH, and the code stream size of the non-high-level video frames is k − kH.
The path whose actual bandwidth is the smaller of w1 and w2, denoted w, can be selected as the path for the non-high-level video frames; the path whose actual bandwidth is the larger of w1 and w2, denoted w', is selected as the path for the high-level video frames.
S1333: and judging whether the matching is successful.
Continuing to refer to FIG. 5, determining whether w > (k-kH) v is satisfied, where v (v > 1) is an amplification factor for controlling the degree to which the actual bandwidth is greater than the required bandwidth; if yes, matching is successful; if not, the matching fails. That is, the matching is successful only if the path matching the non-high/low level supports the transmission of the non-high/low level.
If the matching is successful, executing S1334; if the matching fails, S1335 is executed.
S1334: and distributing the video frames of each grade to a corresponding path for transmission.
S1335: it is determined that the combination of the plurality of paths does not support transmission of the current video stream.
In addition, if in S133 the combination of the multiple paths is judged to support transmission of the current video stream simply because the sum of their actual bandwidths exceeds the required bandwidth, then before performing S15 a corresponding path must still be matched for each level of video frames in the current video stream, so that in S15 the video frames of each level are transmitted over their corresponding paths.
It will be appreciated that, in general, the current video stream and the next video stream are the same size and therefore require the same bandwidth, and the video frames in the next video stream are classified in the same manner as those in the current video stream. If transmission of the next video stream begins directly after the current video stream is finished, the levels of frames transmitted over the path for the next video stream are the same as those transmitted for the current video stream. That is, if no frames were dropped when transmitting the current video stream, none are dropped when transmitting the next one; if frames were dropped when transmitting the current video stream, frames are also dropped when transmitting the next one, and the levels of the dropped frames are the same.
However, in practice the actual bandwidth of the path may vary with the environment, so the levels of the video frames to be transmitted in the next video stream may be determined before the next video stream is transmitted. The video frames to be transmitted are the frames that will be sent over the path.
If there are no dropped video frames in the current video stream, all the levels can be directly used as the levels of the video frames to be transmitted in the next video stream.
Referring to fig. 6 in combination, if there is a dropped video frame in the current video stream, the implementation of determining the level of the video frame to be transmitted in the next video stream may be as follows:
s21: and judging whether the actual bandwidth of the path is recovered.
The actual bandwidth of the path after restoration is greater than the actual bandwidth of the path before restoration.
If so, then S22 is executed.
S22: the level of the video frame to be transmitted in the next video stream is determined.
Referring to fig. 7 in combination, S22 may include the following sub-steps:
s221: and determining the size relation between the actual bandwidth of the recovered path and the required bandwidth corresponding to the next video stream.
The required bandwidth corresponding to the next video stream is the combined required bandwidth of the video frames of the level to be retained and the video frames of at least one level to be discarded in the next video stream. A level to be retained is a level of video frames that was actually transmitted in the current video stream, and a level to be discarded is a level of video frames that was discarded in the current video stream.
For example, suppose the video frame levels of the current video stream include level1, level2, and level3, where the level2 and level3 frames were dropped and the level1 frames were actually transmitted. Then level1 is the level to be retained, and level2 and level3 are the levels to be discarded. The combination of the retained level and at least one discarded level may be the frames of level1 and level2, of level1 and level3, or of level1, level2, and level3.
S222: whether to recover and the level to be reserved for recovery are determined based on the size relationship.
If the bandwidth of the recovered path can support the transmission of all the video frames of the grade to be discarded and the video frames of the grade to be reserved in the next video stream, the grade of the video frame to be transmitted in the next video stream is the grade to be discarded and the grade to be reserved; if the bandwidth of the recovered path can support the transmission of the video frames of the grade to be discarded and the video frames of the grade to be reserved in the part of the next video stream, the grade of the video frames to be transmitted in the next video stream is the grade to be discarded and the grade to be reserved in the part of the next video stream. Otherwise, the grade of the video frame to be transmitted in the next video stream is the grade to be reserved.
The following still takes level1 as the level to be retained and level2 and level3 as the levels to be discarded, to illustrate the implementation of S221-S222:
Example 1: Determine, respectively, whether the actual bandwidth of the recovered path is greater than the required bandwidth of the level1 and level2 frames, greater than the required bandwidth of the level1 and level3 frames, and greater than the required bandwidth of the level1, level2, and level3 frames.
If none of these hold, level2 and level3 are not recovered, i.e., the level of the frames to be transmitted is determined to be level1 only.
If the actual bandwidth of the recovered path is greater than the required bandwidth of the level1, level2, and level3 frames combined, level2 and level3 are recovered; the levels of the frames to be transmitted are determined to be level1, level2, and level3.
If the actual bandwidth is not greater than the required bandwidth of the level1, level2, and level3 frames combined, but is greater than the required bandwidth of the level1 and level2 frames, level2 is recovered; the levels to be transmitted are determined to be level1 and level2.
If the actual bandwidth is not greater than the required bandwidth of the level1 and level2 frames, but is greater than the required bandwidth of the level1 and level3 frames, level3 is recovered; the levels to be transmitted are determined to be level1 and level3.
Example 2: as shown in fig. 8, first, it is determined whether W > K × Q is satisfied, where W is the actual bandwidth of the recovered path, K is the required bandwidth of the video frames of level1, level2, and level3 (full level, all levels), and Q (Q > 1) is the amplification factor;
if W is more than K Q, determining level1, level2 and level3 as the video frame level to be transmitted; if not, judging whether W > (K-KL) (+) -Q is met, wherein KL is the required bandwidth of level 3;
if W > (K-KL) Q is met, determining level1 and level2 as the levels of the video frames to be transmitted; if W > (K-KL) Q is not satisfied, determining level1 as the level of the video frame to be transmitted.
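For illustration only, a minimal sketch of the Example 2 decision (the parameter names and the default amplification factor are assumptions):

```python
def levels_for_next_stream(W: float, K_all: float, K_low: float, Q: float = 1.2):
    """Decide which levels to transmit in the next video stream after the path's
    bandwidth has recovered (FIG. 8). W: recovered bandwidth; K_all: required
    bandwidth of level1 + level2 + level3; K_low (KL): required bandwidth of level3."""
    if W > K_all * Q:
        return ["level1", "level2", "level3"]   # recover both discarded levels
    if W > (K_all - K_low) * Q:
        return ["level1", "level2"]             # recover level2 only
    return ["level1"]                           # keep only the retained level
```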
The following describes, in an example, a transmission method of a video stream provided by the present application with reference to fig. 9:
Assume that the code stream size (required bandwidth) of the current video stream is b, the bandwidth of the WIFI network is m1, the bandwidth of the cellular network is m2 (with m1 > m2), and P and V are amplification factors. The current video stream corresponds to levels a1, a2, and a3, ranked in importance from high to low as a1 > a2 > a3.
1) Determine whether m1 > b × P holds; if so, go to 2); otherwise go to 3).
2) Transmit the current video stream over the WIFI network.
3) Determine whether m2 > b × P holds; if so, go to 4); otherwise go to 5).
4) Transmit the current video stream over the cellular network.
5) Determine whether m1 + m2 > b × V holds; if so, go to 6)-7); otherwise go to 10).
6) Match the WIFI network to the high-level video frames and the cellular network to the low-level video frames.
7) Determine whether the cellular network supports transmission of the low-level video frames, i.e., whether the matching succeeds; if the matching fails, go to 8); if it succeeds, go to 9).
8) Determine whether only a1 video frames remain in the current video stream; if so, go to 9); if not, go to 10).
9) Transmit the current video stream over the matched paths.
10) Discard the video frames of the lowest level to update the current video stream, then jump back to 1) and repeat the above steps.
After the current video stream has been transmitted in any of steps 2), 4), or 9), go to 11).
11) Determine whether any video frames were discarded from the current video stream; if so, go to 12). (Steps 11) and 12) are not shown in fig. 9.)
12) Determine whether the bandwidth (m1, m2, or m1 + m2) has recovered; if so, determine the levels of the video frames to be transmitted in the next video stream based on the relationship between the recovered bandwidth and the bandwidth required by the next video stream.
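As a rough end-to-end sketch of steps 1)-10) above with made-up numbers (the bandwidth values, the per-level bitrates, the factors P and V, and the margin used for the matching check in step 7) are all assumptions; the flow is simplified and not the patent's exact procedure):

```python
def fig9_example():
    m1, m2 = 4.0e6, 3.0e6         # WIFI and cellular bandwidth in bit/s (m1 > m2)
    P, V = 1.1, 1.2               # amplification factors
    level_bps = {"a1": 2.5e6, "a2": 2.0e6, "a3": 1.5e6}   # assumed per-level bitrates
    levels = ["a1", "a2", "a3"]   # importance ranked a1 > a2 > a3
    while True:
        b = sum(level_bps[l] for l in levels)
        if m1 > b * P:
            return ("wifi", levels)                         # steps 1)-2)
        if m2 > b * P:
            return ("cellular", levels)                     # steps 3)-4)
        high, low = levels[:1], levels[1:]
        b_low = sum(level_bps[l] for l in low)
        if m1 + m2 > b * V and m2 > b_low * P:              # steps 5)-7): combination and matching
            return ({"wifi": high, "cellular": low}, levels)
        if levels == ["a1"]:                                # step 8): only a1 remains
            return ("wifi", levels)                         # step 9): transmit anyway
        levels = levels[:-1]                                # step 10): drop the lowest level
```

With the numbers above, no single path or combination can carry all three levels, so a3 is dropped; on the second pass the combined paths suffice, with a1 sent over WIFI and a2 over the cellular network.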
In addition, in any of the determination steps of the above embodiments, it may further be detected whether pushing of the video stream has stopped. Stopping pushing the video stream means that the transmitting end has stopped transmitting the video stream; if it has stopped, the video stream is not transmitted further.
In addition, if a single path is used with a reliable transport protocol (such as TCP), the video stream received by the receiving end generally does not arrive out of order. However, if multiple paths are used to distribute and transmit the video stream, frame delivery times become inconsistent when the network conditions of the paths are asymmetric, and the stream received by the receiving end may contain out-of-order video frames. Likewise, if a single path is used with an unreliable transport protocol, frames may be transmitted repeatedly to avoid loss, and the received stream may also contain out-of-order video frames. The application therefore provides the following solution:
fig. 10 is a flowchart illustrating an embodiment of a method for receiving a video stream provided by the present application. As shown in fig. 10, the method includes:
s31: and acquiring the starting frame sequence number in the video stream from the sending end.
The receiving end can obtain the application layer protocol from the transmitting end, so as to obtain the starting frame sequence number from the application layer protocol.
S32: and judging whether the sequence number of the received current frame is equal to the sequence number of the starting frame.
If so, executing S33-S35; if not, then the video frame is an out-of-order video frame, and S37 is executed.
S33: the current frame is stored or decoded and the starting frame number is updated.
After decoding, the receiving end can play the current frame. Updating the starting frame sequence number means incrementing it by 1. After the starting frame sequence number is updated, S34 may be entered.
S34: and judging whether the video frames with the sequence numbers equal to the sequence number of the initial frame exist in the buffer queue.
Stored in the buffer sequence are video frames that have been received by the transmitting end but not decoded.
If so, then S35 is executed.
S35: the video frames with sequence numbers equal to the starting frame sequence number are stored or decoded, and the starting frame sequence number is updated.
S36: the current frame is added to the buffer queue or discarded.
If the sequence number of the current frame is less than the starting frame sequence number, the current frame is discarded. It will be appreciated that if the transmitting end retransmits frames to ensure reliable transmission, the sequence number of the current frame may be smaller than the starting frame sequence number, in which case the current frame needs to be discarded.
If the sequence number of the current frame is greater than the starting frame sequence number, the current frame is added to the buffer queue.
In addition, if the transmitted video stream is a real-time stream, the time that video frames spend in the buffer can be controlled by limiting the number of frames in the buffer queue, so that playback problems such as corrupted pictures, skipping, and stuttering are avoided. A specific implementation is: determine whether the number of video frames in the buffer queue is greater than a preset number, and if so, discard the non-key frames with the smallest sequence numbers. For example, determine whether there are two I-frames in the buffer queue; if so, discard the P-frames that precede the I-frame with the smaller sequence number, decode that I-frame, and update the starting frame sequence number.
In this way, the smoothness and real-time performance of the picture can be improved when playing a real-time stream.
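The following sketch illustrates the receiving-side reordering of S31-S36 together with the real-time pruning just described; it is a simplification, and the data structures, the queue limit, and the frame_type attribute are assumptions rather than the patent's definitions.

```python
class StreamReceiver:
    """Reorders incoming frames by sequence number before decoding (S31-S36)."""

    def __init__(self, start_seq: int, max_buffered: int = 64, decode=print):
        self.expected = start_seq     # S31: starting frame sequence number from the sender
        self.buffer = {}              # seq -> frame, received but not yet decoded
        self.max_buffered = max_buffered
        self.decode = decode          # stands in for "store or decode (then play)"

    def on_frame(self, seq: int, frame) -> None:
        if seq == self.expected:                      # S32/S33: the expected frame arrived
            self.decode(frame)
            self.expected += 1
            self._drain()                             # S34/S35: release in-order buffered frames
        elif seq < self.expected:                     # S36: duplicate/retransmitted frame
            return                                    # discard
        else:                                         # S36: out-of-order frame, buffer it
            self.buffer[seq] = frame
            self._prune_for_realtime()

    def _drain(self) -> None:
        while self.expected in self.buffer:
            self.decode(self.buffer.pop(self.expected))
            self.expected += 1

    def _prune_for_realtime(self) -> None:
        """Real-time streams: cap the buffer; drop stale non-key frames and jump to
        the earliest buffered I-frame so playback does not lag (the size threshold
        stands in for the patent's 'preset number' / two-I-frame check)."""
        if len(self.buffer) <= self.max_buffered:
            return
        key_seqs = sorted(s for s, f in self.buffer.items()
                          if getattr(f, "frame_type", "") == "I")
        if not key_seqs:
            return                                    # no key frame to jump to; keep waiting
        jump_to = key_seqs[0]
        for s in [s for s in self.buffer if s < jump_to]:
            del self.buffer[s]                        # discard P-frames before that I-frame
        self.expected = jump_to + 1
        self.decode(self.buffer.pop(jump_to))         # decode the I-frame, update the start seq
        self._drain()
```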
Fig. 11 is a schematic structural diagram of an embodiment of an electronic device according to the present application. As shown in fig. 11, the electronic device may include a processor 41, a memory 42 coupled to the processor 41.
Wherein the memory 42 stores program instructions for implementing the method of any of the above embodiments; processor 41 is operative to execute program instructions stored by memory 42 to implement the steps of the above-described method embodiments. The processor 41 may also be referred to as a CPU (Central Processing Unit). The processor 41 may be an integrated circuit chip having signal processing capabilities. The processor 41 may also be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components. A general purpose processor may be a microprocessor or the processor 41 may be any conventional processor or the like.
FIG. 12 is a schematic structural diagram of an embodiment of a computer-readable storage medium of the present application. As shown in fig. 12, the computer readable storage medium 50 of the embodiment of the present application stores program instructions 51, and the program instructions 51 implement the method provided by the above-mentioned embodiment of the present application when executed. The program instructions 51 may form a program file stored in the computer-readable storage medium 50 in the form of a software product, so as to enable a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute all or part of the steps of the methods according to the embodiments of the present application. And the aforementioned computer-readable storage medium 50 includes: various media capable of storing program codes, such as a usb disk, a mobile hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, or terminal devices, such as a computer, a server, a mobile phone, and a tablet.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. The above embodiments are merely examples and are not intended to limit the scope of the present disclosure, and all modifications, equivalents, and flow charts using the contents of the specification and drawings of the present disclosure or those directly or indirectly applied to other related technical fields are intended to be included in the scope of the present disclosure.

Claims (10)

1. A method for transmitting a video stream, comprising:
acquiring a current video stream, wherein the current video stream comprises a plurality of video frames;
ranking the plurality of video frames;
judging whether the path supports the transmission of the current video stream;
if the transmission of the current video stream is not supported, discarding the video frame with the lowest level to update the current video stream;
repeatedly executing the steps until the path supports the transmission of the current video stream, or the grade of the video frames left in the current video stream is higher than a preset grade;
and transmitting the current video stream by using the path.
2. The method of claim 1, wherein there are a plurality of paths, and wherein the determining whether a path supports transmission of the current video stream comprises:
and judging whether a plurality of paths support the transmission of the current video stream.
3. The method of claim 2, wherein determining whether the plurality of paths support transmission of the current video stream comprises:
judging whether the actual bandwidth of a single path is larger than the required bandwidth of the current video stream or not;
if yes, transmitting the current video stream by using the single path;
if not, judging whether the combination of the paths supports the transmission of the current video stream;
if the combination of the paths does not support the transmission of the current video stream, determining that the paths do not support the transmission of the current video stream.
4. The method of claim 3, wherein the determining whether the combination of the plurality of paths supports transmission of the current video stream comprises:
judging whether the sum of the actual bandwidths of the paths is larger than the required bandwidth or not;
if so, matching the corresponding path for the video frame of each level;
judging whether the matching is successful;
if the matching is successful, distributing the video frames of each grade to the corresponding paths for transmission;
and if the matching fails, judging that the combination of the paths does not support the transmission of the current video stream.
5. The method of claim 4, wherein the matching of the corresponding path for the video frames of each level comprises:
matching the path with the higher actual bandwidth to the video frames of the high level and matching the path with the lower actual bandwidth to the video frames of the low level;
the determining whether the matching is successful comprises:
if the path with the lower actual bandwidth supports transmission of the video frames of the low level, determining that the matching is successful; otherwise, determining that the matching fails.
6. The method of claim 1, further comprising, after said transmitting the current video stream using the path:
judging whether the discarded video frame exists in the current video stream or not;
and determining the grade of the video frame to be transmitted in the next video stream based on the judgment result.
7. The method of claim 6, wherein the determining the level of the video frame to be transmitted in the next video stream based on the determination result comprises:
if the judgment result is that the discarded video frame exists in the current video stream, judging whether the actual bandwidth of the path is recovered, wherein the actual bandwidth of the path after recovery is larger than the actual bandwidth of the path before recovery;
and if so, determining the grade of the video frame to be transmitted in the next video stream.
8. The method of claim 7, wherein the determining the level of the video frame to be transmitted in the next video stream comprises:
determining a size relation between an actual bandwidth of the recovered path and a required bandwidth corresponding to the next video stream, where the required bandwidth corresponding to the next video stream is a required bandwidth of a combination of the video frames of the to-be-reserved level and the video frames of the at least one to-be-discarded level in the next video stream;
determining whether to recover and the level to be reserved for recovery based on the size relationship.
9. An electronic device comprising a processor, a memory coupled to the processor, wherein,
the memory stores program instructions;
the processor is configured to execute the program instructions stored by the memory to implement the method of any of claims 1-8.
10. A computer-readable storage medium, characterized in that the storage medium stores program instructions that, when executed, implement the method of any of claims 1-8.
CN202110604920.4A 2021-05-31 2021-05-31 Video stream transmission method, electronic device and storage medium Pending CN113542798A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110604920.4A CN113542798A (en) 2021-05-31 2021-05-31 Video stream transmission method, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110604920.4A CN113542798A (en) 2021-05-31 2021-05-31 Video stream transmission method, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN113542798A true CN113542798A (en) 2021-10-22

Family

ID=78124490

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110604920.4A Pending CN113542798A (en) 2021-05-31 2021-05-31 Video stream transmission method, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN113542798A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1468001A (en) * 2002-06-27 2004-01-14 上海汉唐科技有限公司 Media flow self-adapting transmission method based on internet
US20080052414A1 (en) * 2006-08-28 2008-02-28 Ortiva Wireless, Inc. Network adaptation of digital content
US20140211681A1 (en) * 2013-01-25 2014-07-31 Cisco Technology, Inc. System and method for video delivery over heterogeneous networks with scalable video coding for multiple subscriber tiers
US20160255346A1 (en) * 2013-10-11 2016-09-01 Sony Corporation Decoding device, decoding method, encoding device, and encoding method
CN103974333A (en) * 2014-05-16 2014-08-06 西安电子科技大学 Load balancing method for SVC video services and moving speed
US20150350598A1 (en) * 2014-05-30 2015-12-03 Apple Inc. Redundant Transmission Channels for Real-Time Applications on Mobile Devices
CN106454432A (en) * 2016-10-18 2017-02-22 浙江大华技术股份有限公司 Video frame processing method and device
WO2018072675A1 (en) * 2016-10-18 2018-04-26 Zhejiang Dahua Technology Co., Ltd. Methods and systems for video processing

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114866838A (en) * 2022-05-31 2022-08-05 厦门蝉羽网络科技有限公司 Live broadcast goods taking method and system based on information processing
CN114866838B (en) * 2022-05-31 2024-02-23 厦门蝉羽网络科技有限公司 Live broadcast on-demand method and system based on information processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20211022)