CN108696773B - Real-time video transmission method and device


Info

Publication number
CN108696773B
CN108696773B (application CN201710233780.8A)
Authority
CN
China
Prior art keywords
video
frame
fragment
packet loss
video frame
Prior art date
Legal status
Expired - Fee Related
Application number
CN201710233780.8A
Other languages
Chinese (zh)
Other versions
CN108696773A (en)
Inventor
袁荣喜
周巍巍
张凯磊
Current Assignee
Suzhou Qianwen Wandaba Education Technology Co Ltd
Original Assignee
Suzhou Qianwen Wandaba Education Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Qianwen Wandaba Education Technology Co Ltd
Priority to CN201710233780.8A
Publication of CN108696773A
Application granted
Publication of CN108696773B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80: Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83: Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845: Structuring of content, e.g. decomposing content into time segments
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23: Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/231: Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion
    • H04N21/23106: Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion, involving caching operations
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60: Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/63: Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/643: Communication protocols


Abstract

The embodiment of the invention discloses a real-time video transmission method and device. The method comprises: performing fragment processing on a video coding frame generated in real time to form at least one video frame fragment; placing the video frame fragments into a sending buffer in sequence; and, according to reception acknowledgement responses returned by a video playing end for different video frame fragments, forwarding the video frame fragments of the video coding frame to the video playing end through a video server according to a set sending strategy, so that the video is played in real time. The technical scheme of the invention reduces video transmission delay, improves the reliability of video transmission and improves user experience.

Description

Real-time video transmission method and device
Technical Field
The embodiment of the invention relates to an information processing technology, in particular to a real-time video transmission method and a real-time video transmission device.
Background
With the continued spread and development of the internet, live video broadcasting on the mobile internet is growing explosively. Traditional live broadcasting is mostly one-way, such as live broadcasting by a television station or an operator: the user simply opens a terminal to watch, and the real-time requirement is not high. Mobile internet live video, however, often requires interaction between the recording end and the playing end, and this interaction is not limited to text but also includes video, so the real-time requirement of mobile internet live video is high.
At present, a typical real-time video data interaction method in mobile internet live video works as follows: video data is collected at the recording end and encoded in the X264 format; the video stream is pushed, under a Quality of Service (QoS) algorithm, to a Content Delivery Network (CDN) server for distribution based on the Real Time Messaging Protocol (RTMP) over the Transmission Control Protocol (TCP); and the playing end pulls the stream from the CDN server for playback.
In the prior art, because TCP-based video stream messages are large, the data interaction structure is complex, and the real-time performance of video transmission is very easily affected by network fluctuation, the video transmission delay over the whole network link is usually 1 to 3 seconds or worse.
Disclosure of Invention
The embodiment of the invention provides a real-time video transmission method and device, which aim to reduce video transmission delay, improve video transmission reliability and improve user experience.
In a first aspect, an embodiment of the present invention provides a method for transmitting a real-time video, including:
carrying out fragment processing on the video coding frame generated in real time to form at least one video frame fragment;
sequentially putting the video frame fragments into a sending buffer area;
and according to the receiving confirmation response returned by the video playing end aiming at different video frame fragments, forwarding the video frame fragments of the video coding frame to the video playing end through the video server according to a set sending strategy so as to play the real-time video.
In a second aspect, an embodiment of the present invention further provides a method for transmitting a real-time video, including:
receiving video frame fragments which are forwarded by a video server and sent by a video sending end, putting the video frame fragments into corresponding frame serial numbers in a receiving cache area, and updating the maximum continuous fragment number and a packet loss cache table of the continuously received video frame fragments;
periodically generating a corresponding receiving confirmation response according to the maximum continuous fragment number and the packet loss cache table;
sending the receiving confirmation response to the video server so that the video server forwards the receiving confirmation response to the video sending end;
and if all the video frame fragments in the frame sequence number in the receiving cache area are successfully received, combining all the video frame fragments in the frame sequence number to play the real-time video.
In a third aspect, an embodiment of the present invention further provides a real-time video transmission apparatus, configured at a video sending end, where the apparatus includes:
the fragment processing module is used for carrying out fragment processing on the video coding frame generated in real time to form at least one video frame fragment;
the sending buffer module is used for sequentially placing the video frame fragments into a sending buffer area;
and the fragment sending module is used for forwarding the video frame fragments of the video coding frame to the video playing end through the video server according to a set sending strategy according to a receiving confirmation response returned by the video playing end aiming at different video frame fragments so as to play the real-time video.
In a fourth aspect, an embodiment of the present invention further provides a real-time video transmission device, configured at a video playing end, where the device includes:
the fragment receiving module is used for receiving video frame fragments which are transmitted by a video transmitting end and forwarded by a video server, putting the video frame fragments into corresponding frame serial numbers in a receiving cache region, and updating the maximum continuous fragment number and a packet loss cache table of the continuously received video frame fragments;
a response generation module, configured to periodically generate a corresponding reception acknowledgement response according to the maximum continuous fragment number and the packet loss cache table;
a response sending module, configured to send the reception acknowledgement response to the video server, so that the video server forwards the reception acknowledgement response to the video sending end;
and the fragment merging module is used for merging all the video frame fragments in the frame number to play the real-time video if all the video frame fragments in the frame number in the receiving cache area are successfully received.
The embodiment of the invention divides a video coding frame into at least one video frame fragment and forwards the video frame fragments to the video playing end through the video server according to a set sending strategy for real-time playing. This solves the problem of large video delay caused by the TCP-based transmission mode in the prior art, reduces video transmission delay, improves the reliability of video transmission, and improves user experience.
Drawings
Fig. 1 is a schematic flowchart of a real-time video transmission method according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a real-time video transmission method according to a second embodiment of the present invention;
fig. 3 is a schematic flowchart of a real-time video transmission method according to a third embodiment of the present invention;
fig. 4 is a schematic flowchart of a real-time video transmission method according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of a real-time video transmission apparatus according to a fifth embodiment of the present invention;
fig. 6 is a schematic structural diagram of a real-time video transmission apparatus according to a sixth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart illustrating a real-time video transmission method according to an embodiment of the present invention. The method is applicable to real-time video transmission and can be executed by a real-time video transmission apparatus, which can be implemented in hardware and/or software and is typically integrated in a video sending end or any intelligent terminal with a video sending function. The method specifically comprises the following steps:
s110, carrying out fragment processing on the video coding frame generated in real time to form at least one video frame fragment.
Optionally, after real-time video data is collected by a video acquisition device such as a camera, it can be encoded in real time by a video encoder to form video coding frames. For a high-resolution video coding frame, the frame size is often larger than the network maximum transmission unit in a UDP (User Datagram Protocol) based transmission mode, so the video coding frame generated in real time needs to be fragmented. After at least one video frame fragment smaller than a preset size is formed, the video frame fragment is used as the transmission unit for the video data, which effectively avoids the high video delay caused by an oversized transmission unit and reduces the transmission delay of the video.
Preferably, the video coding frame generated in real time is a coding frame produced after the encoder encodes according to the H.264 standard and filters out B frames.
A B frame is a bidirectionally predictive frame, which is predictively coded with reference to subsequent video frames. Filtering out B frames reduces the coding delay and ensures the real-time performance of video transmission and playing.
Preferably, performing fragment processing on the video coding frame generated in real time to form at least one video frame fragment comprises: if the number of bytes in the video coding frame is smaller than the sum of a preset single-fragment byte count and a preset byte count, dividing the video coding frame into one video frame fragment; if the number of bytes in the video coding frame is S times the preset single-fragment byte count, dividing the video coding frame into S video frame fragments, where S is an integer greater than 0; if the number of bytes in the video coding frame is S times the preset single-fragment byte count plus an excess byte count, and the excess byte count is greater than zero and smaller than the preset byte count, dividing the video coding frame into S video frame fragments and placing the excess bytes into the last video frame fragment; and if the number of bytes in the video coding frame is S times the preset single-fragment byte count plus an excess byte count, and the excess byte count is not less than the preset byte count, dividing the video coding frame into (S+1) video frame fragments.
For example, the maximum fragment count of a video coding frame may be set to 500, the preset single-fragment byte count may be set to 800 bytes, and the preset byte count may be set to 50 bytes. When the number of bytes of a video coding frame is less than 800 + 50 = 850 bytes, the video coding frame is divided into only one video frame fragment; when the number of bytes is an integer multiple S of 800, the video coding frame is divided into S video frame fragments; when the number of bytes is S times 800 with N bytes remaining, then if 0 < N < 50, the N bytes are placed into the last of the S video frame fragments, giving S video frame fragments in total; if N is greater than or equal to 50, a separate video frame fragment is allocated for the remaining N bytes, giving (S+1) video frame fragments in total.
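To make the fragmentation rule concrete, the following is a minimal Python sketch using the example values from the preceding paragraph; the function and constant names are illustrative assumptions and are not taken from the patent.

```python
# Illustrative only: constants mirror the example values above (800-byte
# single fragment, 50-byte threshold, 500-fragment cap).
SINGLE_FRAGMENT_BYTES = 800
MIN_TAIL_BYTES = 50
MAX_FRAGMENTS_PER_FRAME = 500

def fragment_frame(frame: bytes) -> list:
    """Split one encoded video frame into video frame fragments."""
    n = len(frame)
    if n < SINGLE_FRAGMENT_BYTES + MIN_TAIL_BYTES:
        return [frame]                                        # one fragment
    s, tail = divmod(n, SINGLE_FRAGMENT_BYTES)
    fragments = [frame[i * SINGLE_FRAGMENT_BYTES:(i + 1) * SINGLE_FRAGMENT_BYTES]
                 for i in range(s)]
    if 0 < tail < MIN_TAIL_BYTES:
        fragments[-1] += frame[s * SINGLE_FRAGMENT_BYTES:]    # append tail to last fragment
    elif tail >= MIN_TAIL_BYTES:
        fragments.append(frame[s * SINGLE_FRAGMENT_BYTES:])   # extra (S+1)-th fragment
    assert len(fragments) <= MAX_FRAGMENTS_PER_FRAME
    return fragments
```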
And S120, sequentially putting the video frame fragments into a sending buffer area.
Optionally, all video frame fragments that are being transmitted and for which no reception acknowledgement response has been received are stored in the sending buffer. Compared with the unacknowledged transmission mode of the UDP protocol, the advantage of the sending buffer is that a video frame fragment can be sent again when its transmission fails: the sending buffer provides a temporary storage location for fragments that have been sent but not yet acknowledged, which improves the reliability of video transmission.
And S130, according to the receiving confirmation response returned by the video playing end aiming at different video frame fragments, forwarding the video frame fragments of the video coding frame to the video playing end through the video server according to the set sending strategy so as to play the real-time video.
For example, the reception acknowledgement response may include the fragment number of a video frame fragment whose reception is confirmed, preferably the maximum consecutive fragment number of the video frame fragments continuously received by the video playing end, and may further include the fragment numbers of video frame fragments that have not been received, i.e. the packet loss fragment numbers. The set sending strategy may be a reliable UDP-based transmission mode, i.e. a mode in which each message must be acknowledged after it is sent; specifically, reliable transmission of video frame fragments can be achieved through a sliding window.
Preferably, forwarding the video frame fragments of the video coding frame to the video playing end through the video server according to the set sending strategy, based on the reception acknowledgement responses returned by the video playing end for different video frame fragments, comprises: sending a set number of to-be-acknowledged video frame fragments to the video server, and storing the to-be-acknowledged video frame fragments in a sending window of the sending buffer; receiving a video frame fragment reception acknowledgement response sent by the video playing end and forwarded by the video server; if the reception acknowledgement response includes the maximum consecutive fragment number of the video frame fragments continuously received by the video playing end, sliding the sending window, deleting the acknowledged video frame fragments from the sending window, and generating idle sending window positions; if the reception acknowledgement response includes a packet loss fragment number from the video playing end, obtaining the video frame fragment corresponding to the packet loss fragment number and retransmitting it to the video playing end through the video server; continuing to send to-be-acknowledged video frame fragments matching the number of idle sending window positions to the video server, and storing them in the idle sending window positions; and returning to the operation of receiving the video frame fragment reception acknowledgement response sent by the video playing end and forwarded by the video server, until the acknowledged transmission of all video frame fragments is completed.
For example, assume the size of the sending window is 5: the 5 to-be-acknowledged video frame fragments with fragment numbers 1-5 are sent from the sending buffer to the video server and stored in the sending window. If the maximum consecutive fragment number continuously received by the video playing end is 5, the 5 to-be-acknowledged video frame fragments are deleted from the sending window, and the 5 to-be-acknowledged video frame fragments with fragment numbers 6-10 are stored in the sending window and sent. If the maximum consecutive fragment number continuously received by the video playing end is 3 and the packet loss fragment number is 4, video frame fragments 1-3 are deleted from the sending window, the video frame fragment with fragment number 4 is retransmitted, and the video frame fragments with fragment numbers 6-8 are sent at the same time.
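The sliding-window exchange in this example can be sketched as follows. The class, its fields and the callback style are assumptions for illustration only; the patent does not prescribe this structure.

```python
from collections import deque

class SlidingWindowSender:
    """Illustrative send-window logic for to-be-acknowledged video frame fragments."""

    def __init__(self, window_size: int = 5):
        self.window_size = window_size
        self.pending = deque()     # fragments sent but not yet acknowledged
        self.next_seq = 1          # next fragment number to hand to the video server

    def fill_window(self, send_buffer, send_to_server):
        """Send fragments from the send buffer until the window is full."""
        while len(self.pending) < self.window_size and self.next_seq in send_buffer:
            send_to_server(self.next_seq, send_buffer[self.next_seq])
            self.pending.append(self.next_seq)
            self.next_seq += 1

    def on_ack(self, max_consecutive, lost_seqs, send_buffer, send_to_server):
        """Handle a reception acknowledgement forwarded by the video server."""
        # Slide: drop every acknowledged fragment, freeing window positions.
        while self.pending and self.pending[0] <= max_consecutive:
            self.pending.popleft()
        # Retransmit the fragments reported as lost by the playing end.
        for seq in lost_seqs:
            if seq in send_buffer:
                send_to_server(seq, send_buffer[seq])
        # Refill the freed window positions with new to-be-acknowledged fragments.
        self.fill_window(send_buffer, send_to_server)
```

With a window of 5, an acknowledgement of max consecutive 3 and lost fragment 4 reproduces the behaviour described above: fragments 1-3 leave the window, fragment 4 is resent, and fragments 6-8 are newly sent.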
According to the technical scheme of this embodiment, the video coding frame is divided into at least one video frame fragment, and the video frame fragments are forwarded to the video playing end through the video server according to the set sending strategy for real-time playing. This solves the problem of large video delay caused by the TCP-based transmission mode in the prior art, reduces video transmission delay, improves the reliability of video transmission, and improves user experience.
Example two
Fig. 2 is a flowchart illustrating a real-time video transmission method according to a second embodiment of the present invention. This embodiment is optimized on the basis of the above embodiment and provides a preferred real-time video transmission method; specifically, it adds further processing after the video frame fragments are sequentially placed in the sending buffer. The method specifically comprises the following steps:
s210, carrying out fragment processing on the video coding frame generated in real time to form at least one video frame fragment.
And S220, sequentially putting the video frame fragments into a sending buffer area.
S230, periodically checking whether the sending buffer contains expired video frame fragments whose residence time exceeds a preset time threshold; if so, executing S240; if not, executing S260.
The check period may be 10 s, and the preset time threshold may be set according to the video playing progress at the video playing end. When the network is congested, many sent but unacknowledged fragment messages may accumulate in the sending buffer. Periodically checking for expired video frame fragments in the sending buffer has the advantage that fragments already overtaken by playback, i.e. fragments whose time has passed the position currently being played at the video playing end, can be cleared in time, relieving congestion and reducing video transmission delay.
S240, removing all video frame fragments corresponding to the expired video coding frame group associated with the expired video frame fragment from the sending buffer, and generating discard synchronization information according to the next video coding frame group located after the expired video coding frame group, wherein a video coding frame group comprises at least one video coding frame.
The expired video coding frame group may be the Group of Pictures (GOP) containing the video coding frame to which the expired video frame fragment belongs. For example, if a video frame fragment in a GOP is detected to be expired, the whole GOP is removed from the sending buffer, and the corresponding discard synchronization information is generated from the frame number of the key frame (i.e. the I frame) of the next GOP and the fragment number of the video frame fragments of that key frame, so that the receiving state of the video playing end can be synchronized. Preferably, the discard synchronization information can be synchronized to each video playing end via a handshake protocol.
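A rough sketch of the expiry check and the construction of the discard synchronization information, assuming the sending buffer is organized as an ordered list of GOP objects; every name and attribute here is hypothetical.

```python
import time

def purge_expired_gops(gops, expiry_seconds, now=None):
    """Remove every GOP containing an expired fragment; return discard sync info.

    `gops` is assumed to be an ordered list of objects carrying `fragments`
    (each with an `enqueued_at` time), `key_frame_number` and
    `key_frame_min_fragment` attributes - all illustrative names.
    """
    now = time.time() if now is None else now
    sync_messages, kept = [], []
    for i, gop in enumerate(gops):
        expired = any(now - f.enqueued_at > expiry_seconds for f in gop.fragments)
        if not expired:
            kept.append(gop)
        elif i + 1 < len(gops):
            nxt = gops[i + 1]  # the GOP that follows the expired one
            sync_messages.append({
                "key_frame_number": nxt.key_frame_number,           # frame number of its I frame
                "min_fragment_number": nxt.key_frame_min_fragment,  # first fragment of that I frame
            })
    gops[:] = kept             # expired GOPs are dropped from the send buffer
    return sync_messages
```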
Preferably, after all video frame slices corresponding to the expired video coding frame group associated with the video frame slice are removed from the transmission buffer, the method further includes: updating the discarding times of the overdue video coding frame group, and reducing the value of the data transmission parameter if the discarding times in the preset time interval exceeds a discarding threshold value; wherein, the data transmission parameters include: the resolution of the video encoder, and/or the data transmission rate.
For example, if expired video coding frame groups are discarded frequently during fragment transmission, video playing at the video playing end may be affected and playback stutter may occur. Therefore, when the discard count within the preset time interval exceeds the discard threshold, the resolution of the video encoder and/or the data transmission rate is reduced so that the encoded output better matches the available network bandwidth.
And S250, forwarding the discard synchronization information to at least one video playing end through the video server, so as to instruct the video playing end to give up receiving the video frame fragments corresponding to the expired video coding frame group and to discard those video frame fragments of the expired video coding frame group that have already been received.
For example, when an expired video coding frame group is discarded, the video sending end forwards the generated discard synchronization information to the associated video playing end(s) through the video server. The video playing end can then update the maximum consecutive fragment number of the video frame fragments it has continuously received according to the discard synchronization information, so that it starts receiving video frame fragments from the indicated frame number and fragment number and discards the previously received video frame fragments of the expired video coding frame group.
Preferably, the discard synchronization information includes a frame number corresponding to a key video coding frame in the next video coding frame group and a minimum slice number of a video frame slice in the key video coding frame.
The video coding frame group may be a GOP, and the key video coding frame may be an I frame in the GOP.
And S260, according to the receiving confirmation response returned by the video playing end aiming at different video frame fragments, forwarding the video frame fragments of the video coding frame to the video playing end through the video server according to the set sending strategy so as to play the real-time video.
According to the technical scheme of this embodiment, expired video frame fragments whose residence time in the sending buffer exceeds the preset time threshold are periodically checked and discarded in time, and the generated discard synchronization information is sent to each video playing end, so that the video playing ends give up receiving the video frame fragments corresponding to the expired video coding frame group and discard those already received, which relieves network congestion and reduces video transmission delay.
EXAMPLE III
Fig. 3 is a flowchart illustrating a real-time video transmission method according to a third embodiment of the present invention. The method is applicable to real-time video transmission and can be executed by a real-time video transmission apparatus, which can be implemented in hardware and/or software and is typically integrated in a video playing end or any intelligent terminal with video receiving and playing functions. The method specifically comprises the following steps:
s310, receiving the video frame fragments transmitted by the video transmitting end and forwarded by the video server, putting the video frame fragments into the corresponding frame serial numbers in the receiving buffer area, and updating the maximum continuous fragment number and the packet loss buffer table of the video frame fragments which are continuously received.
Optionally, the receiving buffer contains a plurality of frame buffers, each frame buffer corresponds to one frame number, and the frame buffer of each frame number is allocated fragment buffers for the corresponding number of video frame fragments. For example, after receiving a video frame fragment, the video playing end places it into the fragment buffer of the frame buffer under the corresponding frame number and updates the corresponding parameter records. For instance, when the video playing end receives 5 video frame fragments with frame number 1 and fragment numbers 1-5, the 5 video frame fragments are placed into the frame buffer with frame number 1 in the receiving buffer and stored by fragment number; at the same time, the maximum consecutive fragment number of the continuously received video frame fragments is updated to 5, and if packet loss has occurred, the lost fragment numbers are recorded in the packet loss cache table.
Preferably, before receiving the video frame fragments forwarded by the video server and sent by the video sending end, the method further includes: if receiving the discarding synchronization information transmitted by the video sending end and forwarded by the video server, updating the maximum continuous fragment number of the video frame fragments which are continuously received according to the discarding synchronization information, and discarding the video frame fragments which are received and correspond to the overdue video coding frame group, wherein the discarding synchronization information comprises the frame number corresponding to the key video coding frame in the next video coding frame group and the minimum fragment number of the video frame fragment in the key video coding frame, and the video coding frame group comprises at least one video coding frame.
For example, when an expired video coding frame group has been discarded at the video sending end, the video playing end receives the discard synchronization information generated by the video sending end and forwarded by the video server, and updates the maximum consecutive fragment number of the video frame fragments it has continuously received according to that information, i.e. sets the maximum consecutive fragment number to the minimum fragment number of the video frame fragments of the key video coding frame in the next video coding frame group. It then starts receiving video frame fragments from the frame number of that key video coding frame and that minimum fragment number, and discards the previously received video frame fragments of the expired video coding frame group.
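On the playing side, applying the discard synchronization information might look like the following sketch; the dictionary keys, the `state` object and the receive-buffer layout are assumptions made for illustration.

```python
def apply_discard_sync(sync, state, receive_buffer):
    """Apply discard synchronization information at the playing end (illustrative)."""
    # Resume reception from the key (I) frame of the next GOP: the maximum
    # consecutive fragment number jumps to that frame's minimum fragment number.
    state.base_seq = max(state.base_seq, sync["min_fragment_number"])
    # Drop already-received fragments belonging to the expired GOP, i.e. frames
    # numbered below the key frame of the next GOP.
    for frame_no in list(receive_buffer):
        if frame_no < sync["key_frame_number"]:
            del receive_buffer[frame_no]
```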
And S320, periodically generating a corresponding receiving confirmation response according to the maximum continuous fragment number and the packet loss cache table.
Optionally, the generation period of the reception acknowledgement response may be 10 ms. Sending a reception acknowledgement response for every single received video frame fragment could itself cause network congestion, so generating the reception acknowledgement response periodically reduces network congestion.
Preferably, the reception confirmation response includes a maximum consecutive slice number of video frame slices that have been continuously received, and a packet loss slice number obtained from the packet loss buffer table.
And S330, sending the receiving confirmation response to the video server so that the video server forwards the receiving confirmation response to the video sending end.
The purpose of sending the receiving confirmation response is to enable the video sending end to send the video frame fragments according to the receiving confirmation response, thereby improving the reliability of video transmission.
And S340, if all the video frame fragments in the frame sequence number in the receiving cache area are successfully received, combining all the video frame fragments in the frame sequence number to play the real-time video.
For example, when all the video frame fragments in the frame buffer corresponding to a frame number in the receiving buffer have been received and stored, the video frame fragments of that frame buffer are merged into a complete video coding frame, and the video coding frame is delivered to the upper-layer player for real-time decoding and playing.
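A minimal illustration of the merge step, assuming the frame buffer knows how many fragments the frame should contain; all names are invented for the example.

```python
def try_assemble_frame(frame_buffer):
    """Join a frame's fragments once all are present; return None otherwise.

    `frame_buffer` is assumed to map fragment numbers to payload bytes in
    `fragments` and to know `expected_fragment_count` (illustrative structure).
    """
    if len(frame_buffer.fragments) < frame_buffer.expected_fragment_count:
        return None                                    # still waiting for fragments
    ordered = sorted(frame_buffer.fragments.items())   # (fragment number, payload)
    return b"".join(payload for _, payload in ordered) # the complete encoded frame
```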
According to the technical scheme of this embodiment, the video frame fragments sent by the video sending end and forwarded by the video server are received, the corresponding parameters and the packet loss cache table are updated according to what has been received and stored in the receiving buffer, the periodically generated reception acknowledgement responses are returned to the video server, and finally the video frame fragments of a completely received frame number are merged for real-time playing, which improves the reliability and real-time performance of video transmission.
Example four
Fig. 4 is a flowchart illustrating a real-time video transmission method according to a fourth embodiment of the present invention. This embodiment is optimized on the basis of the above embodiments and provides a preferred real-time video transmission method; specifically, it refines the step of receiving the video frame fragments sent by the video sending end and forwarded by the video server, placing them under the corresponding frame numbers in the receiving buffer, and updating the maximum consecutive fragment number of the continuously received video frame fragments and the packet loss cache table. The method specifically comprises the following steps:
s410, receiving the current video frame fragment transmitted by the video transmitting end and forwarded by the video server, and acquiring the target fragment number of the current video frame fragment and the target frame sequence number of the video coding frame corresponding to the current video frame fragment.
The purpose of obtaining the target fragment number of the currently received video frame fragment and the target frame number of its video coding frame is to obtain the identity of the fragment, so that it can be stored according to the target fragment number and target frame number and all received video frame fragments can be managed in an orderly way.
Preferably, after receiving the current video frame fragment transmitted by the video transmitting end and forwarded by the video server, and acquiring the target fragment number of the current video frame fragment and the target frame number of the video coding frame corresponding to the current video frame fragment, the method further includes: and if the target fragment number is smaller than the maximum continuous fragment number of the video frame fragments which are continuously received or is larger than the sum of the maximum received fragment number of the video frame fragments which are received and a preset value, or the target frame sequence number is smaller than the minimum frame sequence number of the video coding frame which is received, discarding the video frame fragments.
For example, if the fragment number of the currently received video frame fragment is smaller than the maximum consecutive fragment number of the continuously received video frame fragments, or the frame number of the currently received video frame fragment is smaller than the minimum frame number of the received video coding frames, the fragment is expired, i.e. it has already been received or its frame has already been handled, so it should be discarded. If the fragment number of the currently received video frame fragment is greater than the sum of the maximum received fragment number and a preset value, the fragment number jumps too far; accepting it could make the amount of data to be stored in the packet loss cache table too large, so the fragment should also be discarded. Optionally, the preset value may be set according to the capacity of the packet loss cache table, for example to 2000.
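The three discard conditions can be summarized in a small helper; the preset value of 2000 and all parameter names are only the example values used above, not the patent's definitions.

```python
MAX_SEQ_JUMP = 2000  # preset value, e.g. bounded by the loss-table capacity

def should_discard(seq, frame_no, base_seq, max_seq, min_frame_no):
    """Return True if an arriving video frame fragment should be dropped.

    seq          - target fragment number of the arriving fragment
    frame_no     - target frame number of its video coding frame
    base_seq     - maximum consecutive fragment number already received
    max_seq      - maximum fragment number received so far
    min_frame_no - smallest frame number still held in the receive buffer
    """
    return (seq < base_seq                   # expired: already received earlier
            or seq > max_seq + MAX_SEQ_JUMP  # jumps too far ahead of the stream
            or frame_no < min_frame_no)      # its frame has already left the buffer
```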
And S420, storing the current video frame fragment in a receiving cache region and a storage space corresponding to the target fragment number under the target frame number.
Illustratively, if the current video frame fragment is the video frame fragment No. 5 in the frame 1, the video frame fragment is correspondingly stored in the storage space No. 5 under the frame 1 in the receiving buffer area.
And S430, updating the maximum continuous fragment number of the video frame fragments which have been continuously received and the maximum receiving fragment number of the video frame fragments which have been received according to the target fragment number.
Alternatively, the maximum received slice number of a video frame slice that has been received may be updated to the maximum of the target slice number and the maximum received slice number before the update. Optionally, if the target slice number is consecutive to the previously received slice number, the maximum consecutive slice number of the video frame slices that have been received consecutively may be updated to the target slice number; otherwise, keeping the value of the maximum continuous fragment number unchanged.
Illustratively, if video frame slices with slice numbers 1-5 have been continuously received and the currently received target slice number is 6, then the maximum continuous slice number of the video frame slices that have been continuously received and the maximum received slice number of the video frame slices that have been received are both updated to 6; if the currently received target slice number is 10, the maximum consecutive slice number is not changed (still 5), and the maximum received slice number is updated to 10.
S440, updating the packet loss cache table according to the maximum continuous fragment number and the maximum received fragment number.
Wherein, the packet loss buffer table comprises: a packet loss segment number and a packet loss timestamp corresponding to the packet loss segment number.
Illustratively, if the maximum continuous fragment number is consistent with the maximum receiving fragment number, it indicates that no packet loss occurs, and further the packet loss cache table does not need to be updated; if the maximum continuous fragment number is not consistent with the maximum receiving fragment number, it indicates that a packet loss situation occurs in the transmission process, and therefore the packet loss cache table needs to be updated.
Preferably, updating the packet loss cache table according to the maximum consecutive fragment number and the maximum received fragment number comprises: if the maximum received fragment number exceeds the maximum consecutive fragment number by at least 1, recording all packet loss fragment numbers between the maximum consecutive fragment number and the maximum received fragment number into the packet loss cache table and updating the packet loss timestamps corresponding to those packet loss fragment numbers; and if the target fragment number is already stored in the packet loss cache table, deleting the target fragment number from the packet loss cache table.
For example, when the maximum received fragment number minus the maximum consecutive fragment number is greater than or equal to 1, e.g. when the maximum consecutive fragment number is 5 and the maximum received fragment number is 10, the video frame fragments corresponding to all the fragment numbers between them (i.e. 6-9) are considered temporarily lost (possibly only reordered rather than truly lost, to be confirmed by subsequently received fragments). If these packet loss fragment numbers are not yet in the packet loss cache table, they are stored in it together with the corresponding packet loss timestamp (i.e. the current time minus one round-trip transmission time); if they are already in the packet loss cache table, only the corresponding packet loss timestamps are updated. The purpose of updating the packet loss timestamp of a packet loss fragment number is to provide a basis for generating the reception acknowledgement response in the subsequent steps. If a received target fragment number is found in the packet loss cache table, the fragment previously reported as lost has now arrived, so the target fragment number must be deleted from the packet loss cache table to prevent packet loss information from being sent repeatedly.
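Steps S430 and S440 together might be sketched as follows; `state` and its fields (`received`, `base_seq`, `max_seq`, `loss_table`) are illustrative names chosen for the example, not the patent's terms.

```python
import time

def on_fragment_received(seq, state, rtt_seconds):
    """Update sequence counters and the packet-loss cache table (illustrative).

    `state` is assumed to carry: `received` (set of stored fragment numbers),
    `base_seq` (maximum consecutive fragment number), `max_seq` (maximum
    received fragment number) and `loss_table` (fragment number -> timestamp).
    """
    state.received.add(seq)
    # A fragment previously recorded as lost has arrived: remove its entry.
    state.loss_table.pop(seq, None)
    # Update the maximum received fragment number.
    state.max_seq = max(state.max_seq, seq)
    # Advance the maximum consecutive fragment number while the run is unbroken.
    while state.base_seq + 1 in state.received:
        state.base_seq += 1
    # Fragment numbers between the two counters are tentatively lost: record
    # them with a timestamp of (now minus one round-trip time), as above.
    for lost in range(state.base_seq + 1, state.max_seq):
        if lost not in state.received and lost not in state.loss_table:
            state.loss_table[lost] = time.time() - rtt_seconds
```

With fragments 1-5 already received and fragment 10 arriving, this sketch leaves the maximum consecutive fragment number at 5, sets the maximum received fragment number to 10 and records fragments 6-9 in the loss table, matching the example above.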
S450, at every set time length, periodically obtaining from the packet loss cache table the packet loss fragment numbers whose packet loss timestamps satisfy a set time threshold condition.
Optionally, the acquisition period may be the same as the period for generating the reception acknowledgement response, preferably 10 ms, and the set time threshold may be one round-trip transmission time. For example, the packet loss cache table is scanned every 10 ms; each packet loss fragment number whose packet loss timestamp is older than one round-trip transmission time is obtained, its packet loss timestamp is updated to the current time, its packet loss counter is incremented, and all such packet loss fragment numbers are sent to the video sending end in the reception acknowledgement response to instruct the video sending end to resend the corresponding video frame fragments. Selecting only the packet loss fragment numbers that satisfy the set time threshold condition prevents the same packet loss fragment number from being reported to the video sending end too frequently.
Preferably, before obtaining the packet loss segment number of which the packet loss timestamp meets the condition of the set time threshold in the packet loss cache table, the method further includes: acquiring a minimum receiving fragment number of video frame fragments in video coding frames received earliest in a receiving buffer area; acquiring a target packet loss fragment number determined by the maximum continuous fragment number and the minimum receiving fragment number; and deleting the target packet loss fragment number from the packet loss cache table, and updating the maximum continuous fragment number to the minimum receiving fragment number of the video frame fragment in the earliest received video coding frame.
For example, the minimum received fragment number min_seq in the receiving buffer is obtained periodically, and the packet loss cache table is checked for packet loss fragment numbers in the interval (base_seq, min_seq), where base_seq is the maximum consecutive fragment number and min_seq is the minimum received fragment number. If such target packet loss fragment numbers exist, they are deleted from the packet loss cache table, and the value of the maximum consecutive fragment number base_seq is updated to min_seq.
And S460, constructing a receiving confirmation response according to the packet loss fragment number and the maximum continuous fragment number.
Optionally, the packet loss fragment numbers whose packet loss timestamps in the packet loss cache table are older than one round-trip transmission time, together with the current maximum consecutive fragment number, are forwarded to the video sending end through the video server in a reception acknowledgement response, instructing the video sending end to retransmit the video frame fragments corresponding to the packet loss fragment numbers and to send the next video frame fragments.
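Steps S450 and S460 can be illustrated with the periodic scan below; the 10 ms period, the round-trip-time threshold and the field names (including `loss_counts`) come from the example values above or are assumed for the sketch.

```python
import time

def build_ack(state, rtt_seconds, now=None):
    """Periodic (e.g. every 10 ms) construction of a reception acknowledgement.

    Reports the current maximum consecutive fragment number plus every loss
    fragment number whose packet-loss timestamp is older than one round-trip
    time; those timestamps are refreshed and their loss counters incremented
    so the same loss is not re-reported before another round trip elapses.
    """
    now = time.time() if now is None else now
    stale = []
    for seq, ts in list(state.loss_table.items()):
        if now - ts >= rtt_seconds:
            stale.append(seq)
            state.loss_table[seq] = now                              # refresh timestamp
            state.loss_counts[seq] = state.loss_counts.get(seq, 0) + 1
    return {"max_consecutive": state.base_seq, "lost": stale}
```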
And S470, sending the receiving confirmation response to the video server, so that the video server forwards the receiving confirmation response to the video sending end.
And S480, if all the video frame fragments in the frame serial number in the receiving cache area are successfully received, combining all the video frame fragments in the frame serial number to play the real-time video.
According to the technical scheme of this embodiment, the packet loss cache table and the related parameter values are updated in real time by monitoring the reception and storage of video frame fragments in the receiving buffer; the packet loss fragment numbers in the packet loss cache table and the maximum consecutive fragment number are obtained periodically, and a reception acknowledgement response is constructed and returned to the video sending end. This provides the basis for the video sending end to send video frame fragments, achieves reliable transmission of the video frame fragments, and, in cooperation with the video sending end, reduces the transmission delay of the video.
EXAMPLE five
Fig. 5 is a schematic structural diagram of a real-time video transmission apparatus according to a fifth embodiment of the present invention. The device can be generally integrated in a video transmitting end and all terminals containing video transmitting functions. Referring to fig. 5, the transmission apparatus of real-time video includes: a fragment processing module 510, a transmission buffer module 520, and a fragment transmission module 530, which are described in detail below.
A fragment processing module 510, configured to perform fragment processing on a video encoded frame generated in real time to form at least one video frame fragment;
a sending buffer module 520, configured to sequentially place the video frame fragments into a sending buffer area;
the fragment sending module 530 is configured to forward the video frame fragments of the video encoded frame to the video playing end through the video server according to a set sending policy according to a reception confirmation response returned by the video playing end for different video frame fragments, so as to perform real-time video playing.
The real-time video transmission apparatus provided by this embodiment divides a video coding frame into at least one video frame fragment and forwards the video frame fragments to the video playing end through the video server according to a set sending strategy for real-time playing, thereby solving the problem of large video delay caused by the TCP-based transmission mode in the prior art, reducing video transmission delay, improving the reliability of video transmission and improving user experience.
On the basis of the foregoing embodiments, the fragment processing module 510 may be specifically configured to:
if the number of bytes in the video coding frame is smaller than the sum of a preset single-fragment byte count and a preset byte count, dividing the video coding frame into one video frame fragment;
if the number of bytes in the video coding frame is S times the preset single-fragment byte count, dividing the video coding frame into S video frame fragments, where S is an integer greater than 0;
if the number of bytes in the video coding frame is S times the preset single-fragment byte count plus an excess byte count, and the excess byte count is greater than zero and smaller than the preset byte count, dividing the video coding frame into S video frame fragments and placing the excess bytes into the last video frame fragment;
and if the number of bytes in the video coding frame is S times the preset single-fragment byte count plus an excess byte count, and the excess byte count is not less than the preset byte count, dividing the video coding frame into (S+1) video frame fragments.
On the basis of the above embodiments, the method may further include:
the expiration checking module is used for periodically checking whether the expiration video frame fragments with the existence time exceeding a preset time threshold are included in the sending buffer area after the video frame fragments are sequentially placed in the sending buffer area;
a fragment removing module, configured to remove all video frame fragments corresponding to an expired video coding frame group associated with a video frame fragment from the sending buffer area if the sending buffer area includes the expired video frame fragment whose existing time exceeds a preset time threshold, and generate discard synchronization information according to a next video coding frame group located after the expired video coding frame group, where the video coding frame group includes at least one video coding frame;
a discarding information sending module, configured to forward the discarding synchronization information to the at least one video playing end through the video server, so as to instruct the video playing end to discard and receive the video frame slice corresponding to the expired video coding frame group, and discard the received video frame slice corresponding to the expired video coding frame group.
On the basis of the foregoing embodiments, the discard synchronization information includes a frame number corresponding to a key video coding frame in the next video coding frame group and a minimum slice number of a video frame slice in the key video coding frame.
On the basis of the above embodiments, the method may further include:
a value reduction module, configured to update the discarding times of an expired video coding frame group associated with the video frame fragment after all video frame fragments corresponding to the expired video coding frame group are removed from the sending buffer, and reduce a value of the data transmission parameter if the discarding times within a preset time interval exceeds a discarding threshold;
wherein the data transmission parameters include: the resolution of the video encoder, and/or the data transmission rate.
On the basis of the above embodiments, the video coding frame generated in real time is a coding frame produced after the encoder encodes according to the H.264 standard and filters out B frames.
On the basis of the foregoing embodiments, the fragment sending module 530 may be specifically configured to:
sending a set number of video frame fragments to be confirmed to the video server, and storing the video frame fragments to be confirmed in a sending window of a sending cache region;
receiving a video frame fragment receiving confirmation response transmitted by the video playing terminal and forwarded by the video server;
if the receiving confirmation response comprises the maximum continuous fragment number of the video frame fragments continuously received by the video playing end, sliding the sending window, deleting the confirmed video frame fragments from the sending window, and generating an idle sending window;
if the receiving confirmation response comprises the packet loss fragment number of the video playing end, acquiring a video frame fragment corresponding to the packet loss fragment number, and retransmitting the video frame fragment to the video playing end through the video server;
continuously sending the video frame fragments to be confirmed, which are matched with the number of the idle sending windows, to the video server, and storing the continuously sent video frame fragments to be confirmed in the idle sending windows;
and returning to execute the operation of receiving the video frame fragment receiving confirmation response transmitted by the video playing end and forwarded by the video server until the confirmation transmission of all the video frame fragments is completed.
The product can execute the method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
EXAMPLE six
Fig. 6 is a schematic structural diagram of a real-time video transmission apparatus according to a sixth embodiment of the present invention. The device can be generally integrated in a video playing end and all intelligent terminals containing video receiving and playing functions. Referring to fig. 6, the transmission apparatus of real-time video includes: the fragment receiving module 610, the response generating module 620, the response sending module 630, and the fragment merging module 640 are described in detail below.
The fragment receiving module 610 is configured to receive a video frame fragment transmitted by a video transmitting end and forwarded by a video server, put the video frame fragment into a corresponding frame number in a receiving cache area, and update a maximum continuous fragment number and a packet loss cache table of video frame fragments that have been continuously received;
a response generating module 620, configured to periodically generate a corresponding receiving confirmation response according to the maximum consecutive segment number and the packet loss cache table;
a response sending module 630, configured to send the reception acknowledgement response to the video server, so that the video server forwards the reception acknowledgement response to the video sender;
and a fragment merging module 640, configured to merge all video frame fragments in the frame number if all video frame fragments in the frame number in the receiving cache area are successfully received, so as to perform real-time video playing.
The real-time video transmission apparatus provided by this embodiment receives the video frame fragments sent by the video sending end and forwarded by the video server, updates the corresponding parameters and the packet loss cache table according to what has been received and stored in the receiving buffer, returns the periodically generated reception acknowledgement responses to the video server, and finally merges the video frame fragments of a completely received frame number for real-time playing, thereby improving the reliability and real-time performance of video transmission.
On the basis of the foregoing embodiments, the fragment receiving module 610 may include:
the sequence number acquisition submodule is used for receiving a current video frame fragment transmitted by a video transmitting end and forwarded by a video server, and acquiring a target fragment number of the current video frame fragment and a target frame sequence number of a video coding frame corresponding to the current video frame fragment;
the fragment storage submodule is used for storing the current video frame fragments in the receiving cache region and a storage space corresponding to the target fragment number under the target frame serial number;
the fragment number updating submodule is used for updating the maximum continuous fragment number of the video frame fragments which are already continuously received and the maximum receiving fragment number of the video frame fragments which are already received according to the target fragment number;
a buffer table updating submodule, configured to update a packet loss buffer table according to the maximum continuous fragment number and the maximum reception fragment number;
wherein the packet loss cache table comprises: a packet loss fragment number and a packet loss timestamp corresponding to the packet loss fragment number.
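For illustration only, the receiver-side state maintained by these submodules might be organized as in the following sketch; the Python class ReceiverState and its attribute names (recv_cache, received, max_continuous, max_received, loss_table) are hypothetical and not taken from the embodiment.

# Minimal sketch of the receiver-side state and of storing an arriving fragment
# (hypothetical names).
class ReceiverState:
    def __init__(self):
        self.recv_cache = {}       # frame sequence number -> {fragment number: payload}
        self.received = set()      # fragment numbers received so far
        self.max_continuous = -1   # maximum continuous fragment number already received
        self.max_received = -1     # maximum fragment number already received
        self.loss_table = {}       # packet loss fragment number -> packet loss timestamp

    def store(self, frame_seq, frag_num, payload):
        # Put the fragment into the slot for its fragment number under its frame sequence number.
        self.recv_cache.setdefault(frame_seq, {})[frag_num] = payload

    def update_numbers(self, frag_num):
        # Update the maximum received and maximum continuous fragment numbers.
        self.received.add(frag_num)
        if frag_num > self.max_received:
            self.max_received = frag_num
        while (self.max_continuous + 1) in self.received:
            self.max_continuous += 1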
On the basis of the foregoing embodiments, the cache table updating submodule may be specifically configured to:
if the difference between the maximum continuous fragment number and the maximum receiving fragment number is not less than 1, recording all packet loss fragment numbers between the maximum continuous fragment number and the maximum receiving fragment number into a packet loss cache table, and updating a packet loss timestamp corresponding to the packet loss fragment number;
and if the target fragment number is stored in the packet loss cache table, deleting the target fragment number in the packet loss cache table.
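Continuing the ReceiverState sketch above, the cache table update rule just described might look as follows; keeping the time at which a loss was first detected as its packet loss timestamp is an assumption made for this sketch.

# Minimal sketch of updating the packet loss cache table (hypothetical names).
import time

def update_loss_table(state, target_frag_num, now=None):
    now = now if now is not None else time.time()
    # The arriving fragment is no longer missing: remove it from the table if present.
    state.loss_table.pop(target_frag_num, None)
    # Record every not-yet-received fragment number between the maximum continuous
    # and the maximum received fragment number as a loss, with its detection time.
    if state.max_received - state.max_continuous >= 1:
        for num in range(state.max_continuous + 1, state.max_received):
            if num not in state.received:
                state.loss_table.setdefault(num, now)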
On the basis of the foregoing embodiments, the response generating module 620 may include:
the lost packet number acquisition submodule is used for setting time length at intervals and periodically acquiring a lost packet fragment number of which a lost packet timestamp meets a set time threshold condition in the lost packet cache table;
and the response construction submodule is used for constructing a receiving confirmation response according to the packet loss fragment number and the maximum continuous fragment number.
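Again continuing the ReceiverState sketch, the periodic construction of a reception confirmation response could be sketched as follows; the dictionary layout of the response is a hypothetical illustration, since the embodiment does not prescribe a concrete encoding.

# Minimal sketch of periodically building a reception confirmation response (hypothetical names).
import time

def build_ack(state, loss_age_threshold, now=None):
    now = now if now is not None else time.time()
    # Only losses whose timestamp satisfies the time threshold condition are reported.
    overdue = [num for num, ts in state.loss_table.items()
               if now - ts >= loss_age_threshold]
    return {"max_continuous": state.max_continuous,
            "lost_numbers": sorted(overdue)}

The response sending module would then forward such a response to the video server, which relays it to the video transmitting end.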
On the basis of the foregoing embodiments, the response generating module 620 may further include:
a minimum fragment number obtaining submodule, configured to, before the packet loss fragment numbers whose packet loss timestamps meet the set time threshold condition are acquired from the packet loss cache table, obtain the minimum received fragment number of the video frame fragments of the earliest received video coding frame in the receiving cache area;
a target fragment number obtaining submodule, configured to obtain a target packet loss fragment number determined by the maximum continuous fragment number and the minimum reception fragment number;
and the target fragment number deleting submodule is used for deleting the target packet loss fragment number from the packet loss cache table and updating the maximum continuous fragment number to the minimum receiving fragment number of the video frame fragment in the earliest received video coding frame.
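One possible reading of this resynchronization step, continuing the ReceiverState sketch, is given below; treating the fragment numbers between the maximum continuous number and that minimum received fragment number as the target packet loss fragment numbers to be dropped is an interpretation made for this sketch.

# Minimal sketch of resynchronizing to the earliest frame still buffered (hypothetical names).
def resync_to_earliest_frame(state, min_frag_of_earliest_frame):
    # Drop the target packet loss fragment numbers determined by the maximum continuous
    # fragment number and the minimum received fragment number of the earliest frame.
    for num in range(state.max_continuous + 1, min_frag_of_earliest_frame):
        state.loss_table.pop(num, None)
    # Advance the maximum continuous fragment number to that minimum received number.
    if min_frag_of_earliest_frame > state.max_continuous:
        state.max_continuous = min_frag_of_earliest_frame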
On the basis of the foregoing embodiments, the fragment receiving module 610 may further include:
the fragment discarding submodule is used for, after the current video frame fragment sent by the video transmitting end and forwarded by the video server has been received and the target fragment number of the current video frame fragment and the target frame sequence number of its corresponding video coding frame have been acquired, discarding the video frame fragment if the target fragment number is less than the maximum continuous fragment number of the continuously received video frame fragments, or greater than the sum of the maximum received fragment number and a preset value, or if the target frame sequence number is less than the minimum frame sequence number of the received video coding frames.
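For illustration, this validity check might be written as the following predicate over the ReceiverState sketch; window_allowance stands for the preset value and min_buffered_frame_seq for the minimum frame sequence number of the received video coding frames, both hypothetical parameter names.

# Minimal sketch of the discard check for an arriving fragment (hypothetical names).
def should_discard(state, target_frag_num, target_frame_seq,
                   window_allowance, min_buffered_frame_seq):
    return (target_frag_num < state.max_continuous
            or target_frag_num > state.max_received + window_allowance
            or target_frame_seq < min_buffered_frame_seq)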
On the basis of the above embodiments, the method may further include:
The frame fragment discarding module is configured to, before the video frame fragments sent by the video transmitting end and forwarded by the video server are received, and if discard synchronization information sent by the video transmitting end and forwarded by the video server is received, update the maximum continuous fragment number of the continuously received video frame fragments according to the discard synchronization information and discard the already received video frame fragments corresponding to an expired video coding frame group; the discard synchronization information includes the frame sequence number corresponding to the key video coding frame in the next video coding frame group and the minimum fragment number of the video frame fragments in that key video coding frame, and a video coding frame group includes at least one video coding frame.
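A possible handling of such discard synchronization information, continuing the ReceiverState sketch, is shown below; setting the maximum continuous fragment number to one less than the key frame's minimum fragment number is an assumption made for this sketch.

# Minimal sketch of applying discard synchronization information (hypothetical names).
def on_discard_sync(state, key_frame_seq, key_frame_min_frag_num):
    # Fragments of frames before the next group's key frame belong to the expired
    # video coding frame group and will never be completed, so drop them.
    for frame_seq in [s for s in state.recv_cache if s < key_frame_seq]:
        del state.recv_cache[frame_seq]
    # Jump the maximum continuous fragment number forward and forget older losses,
    # so that fragments of the new group are neither rejected nor re-requested.
    if key_frame_min_frag_num - 1 > state.max_continuous:
        state.max_continuous = key_frame_min_frag_num - 1
    for num in [n for n in state.loss_table if n < key_frame_min_frag_num]:
        del state.loss_table[num]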
The apparatus can execute the method provided by any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to the executed method.
It should be noted that the foregoing merely describes the preferred embodiments of the present invention and the technical principles employed. Those skilled in the art will understand that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements and substitutions can be made without departing from the scope of the invention. Therefore, although the present invention has been described in some detail through the above embodiments, it is not limited to these embodiments and may include other equivalent embodiments without departing from its spirit, the scope of the present invention being determined by the appended claims.

Claims (13)

1. A method for transmitting real-time video, comprising:
carrying out fragment processing on the video coding frame generated in real time to form at least one video frame fragment;
sequentially putting the video frame fragments into a sending buffer area;
forwarding, according to reception confirmation responses returned by a video playing end for different video frame fragments, the video frame fragments of the video coding frame to the video playing end through a video server according to a set sending strategy, so as to play the real-time video;
wherein the fragmenting of the video coding frame generated in real time to form at least one video frame fragment comprises:
if the number of bytes in the video coding frame is less than the sum of a preset single-fragment byte count and a preset byte count, dividing the video coding frame into one video frame fragment;
if the number of bytes in the video coding frame is exactly S times the preset single-fragment byte count, dividing the video coding frame into S video frame fragments, where S is an integer greater than 0;
if the number of bytes in the video coding frame is S times the preset single-fragment byte count plus an excess number of bytes, and the excess is greater than zero and less than the preset byte count, dividing the video coding frame into S video frame fragments and putting the bytes corresponding to the excess into the last video frame fragment;
if the number of bytes in the video coding frame is S times the preset single-fragment byte count plus an excess number of bytes, and the excess is not less than the preset byte count, dividing the video coding frame into (S+1) video frame fragments.
2. The method of claim 1, further comprising, after sequentially placing the video frame fragments into the sending buffer area:
periodically checking whether the sending buffer area contains expired video frame fragments whose dwell time exceeds a preset time threshold;
if so, removing from the sending buffer area all video frame fragments corresponding to the expired video coding frame group associated with those video frame fragments, and generating discard synchronization information according to the next video coding frame group located after the expired video coding frame group, wherein a video coding frame group comprises at least one video coding frame;
and forwarding the discard synchronization information to the at least one video playing end through the video server to instruct the video playing end to abandon receiving the video frame fragments corresponding to the expired video coding frame group and to discard the already received video frame fragments corresponding to the expired video coding frame group.
3. The method of claim 2, wherein the discard synchronization information comprises a frame number corresponding to a key video coded frame in the next group of video coded frames and a minimum slice number of a video frame slice in the key video coded frame.
4. The method of claim 2, further comprising, after removing all video frame slices corresponding to an expired set of video coding frames associated with the video frame slice from the transmit buffer:
updating the discarding times of the expired video coding frame group, and reducing the value of a data transmission parameter if the discarding times in a preset time interval exceeds a discarding threshold value;
wherein the data transmission parameters include: the resolution of the video encoder, and/or the data transmission rate.
5. The method according to any one of claims 1-4, wherein the video coding frames generated in real time are generated by an encoder according to the H.264 protocol, with B frames filtered out.
6. The method according to claim 1, wherein forwarding, according to the reception confirmation responses returned by the video playing end for different video frame fragments, the video frame fragments of the video coding frame to the video playing end through the video server according to the set sending strategy for real-time video playing comprises:
sending a set number of video frame fragments to be confirmed to the video server, and storing the video frame fragments to be confirmed in a sending window of a sending cache region;
receiving a video frame fragment receiving confirmation response transmitted by the video playing terminal and forwarded by the video server;
if the receiving confirmation response comprises the maximum continuous fragment number of the video frame fragments continuously received by the video playing end, sliding the sending window, deleting the confirmed video frame fragments from the sending window, and generating an idle sending window;
if the receiving confirmation response comprises the packet loss fragment number of the video playing end, acquiring a video frame fragment corresponding to the packet loss fragment number, and retransmitting the video frame fragment to the video playing end through the video server;
continuing to send to the video server further video frame fragments to be confirmed, matching the number of idle sending windows, and storing these continuously sent video frame fragments to be confirmed in the idle sending windows;
and returning to the operation of receiving the video frame fragment reception confirmation response sent by the video playing end and forwarded by the video server, until the confirmed transmission of all video frame fragments is completed.
7. A method for transmitting real-time video, comprising:
receiving video frame fragments which are forwarded by a video server and sent by a video sending end, putting the video frame fragments into corresponding frame serial numbers in a receiving cache area, and updating the maximum continuous fragment number and a packet loss cache table of the continuously received video frame fragments;
periodically generating a corresponding receiving confirmation response according to the maximum continuous fragment number and the packet loss cache table;
sending the receiving confirmation response to the video server so that the video server forwards the receiving confirmation response to the video sending end;
if all the video frame fragments in the frame serial number in the receiving cache area are successfully received, combining all the video frame fragments in the frame serial number to play the real-time video;
wherein the receiving of the video frame fragments sent by the video transmitting end and forwarded by the video server, the putting of the video frame fragments under the corresponding frame sequence number in the receiving cache area, and the updating of the maximum continuous fragment number of the continuously received video frame fragments and of the packet loss cache table comprise:
receiving a current video frame fragment transmitted by a video transmitting end and forwarded by a video server, and acquiring a target fragment number of the current video frame fragment and a target frame sequence number of a video coding frame corresponding to the current video frame fragment;
storing the current video frame fragment in the receiving cache area, in the storage space corresponding to the target fragment number under the target frame sequence number;
updating the maximum continuous fragment number of the video frame fragments which have been continuously received and the maximum receiving fragment number of the video frame fragments which have been received according to the target fragment number;
updating a packet loss cache table according to the maximum continuous fragment number and the maximum receiving fragment number;
wherein the packet loss cache table comprises: a packet loss fragment number and a packet loss timestamp corresponding to the packet loss fragment number;
the updating the packet loss cache table according to the maximum continuous fragment number and the maximum received fragment number includes:
if the difference between the maximum continuous fragment number and the maximum receiving fragment number is not less than 1, recording all packet loss fragment numbers between the maximum continuous fragment number and the maximum receiving fragment number into a packet loss cache table, and updating a packet loss timestamp corresponding to the packet loss fragment number;
and if the target fragment number is stored in the packet loss cache table, deleting the target fragment number in the packet loss cache table.
8. The method according to claim 7, wherein said periodically generating a corresponding reception acknowledgement response according to the maximum consecutive slice number and the packet loss buffer table comprises:
at intervals of a set time length, periodically performing the following operations:
acquiring a packet loss fragment number of which a packet loss timestamp meets a set time threshold condition in the packet loss cache table;
and constructing a receiving confirmation response according to the packet loss fragment number and the maximum continuous fragment number.
9. The method according to claim 8, wherein before obtaining the packet loss fragment number of which the packet loss timestamp satisfies the set time threshold condition in the packet loss cache table, the method further comprises:
acquiring the minimum receiving fragment number of video frame fragments in the video coding frames received earliest in the receiving buffer area;
acquiring a target packet loss fragment number determined by the maximum continuous fragment number and the minimum receiving fragment number;
deleting the target packet loss fragment number from the packet loss cache table, and updating the maximum continuous fragment number to the minimum receiving fragment number of the video frame fragment in the earliest received video coding frame.
10. The method according to claim 7, further comprising, after receiving the current video frame fragment sent by the video transmitting end and forwarded by the video server and acquiring the target fragment number of the current video frame fragment and the target frame sequence number of the video coding frame corresponding to the current video frame fragment:
and if the target fragment number is smaller than the maximum continuous fragment number of the video frame fragments which are continuously received or is larger than the sum of the maximum received fragment number of the video frame fragments which are received and a preset value, or the target frame sequence number is smaller than the minimum frame sequence number of the video coding frames which are received, discarding the video frame fragments.
11. The method of claim 7, further comprising, before receiving the video frame fragments forwarded by the video server and sent by the video sender:
if receiving discarding synchronization information which is forwarded by a video server and sent by a video sending end, updating the maximum continuous fragment number of the video frame fragments which are continuously received according to the discarding synchronization information, and discarding the video frame fragments which are received and correspond to an overdue video coding frame group, wherein the discarding synchronization information comprises the frame number corresponding to a key video coding frame in the next video coding frame group and the minimum fragment number of the video frame fragment in the key video coding frame, and the video coding frame group comprises at least one video coding frame.
12. A real-time video transmission apparatus configured at a video transmitting end, comprising:
the fragment processing module is used for carrying out fragment processing on the video coding frame generated in real time to form at least one video frame fragment;
the sending buffer module is used for sequentially placing the video frame fragments into a sending buffer area;
the fragment sending module is used for forwarding the video frame fragments of the video coding frame to the video playing end through the video server according to a set sending strategy to carry out real-time video playing according to a receiving confirmation response returned by the video playing end aiming at different video frame fragments;
wherein the fragment processing module is specifically configured to:
if the number of bytes in the video coding frame is less than the sum of a preset single-fragment byte count and a preset byte count, divide the video coding frame into one video frame fragment;
if the number of bytes in the video coding frame is exactly S times the preset single-fragment byte count, divide the video coding frame into S video frame fragments, where S is an integer greater than 0;
if the number of bytes in the video coding frame is S times the preset single-fragment byte count plus an excess number of bytes, and the excess is greater than zero and less than the preset byte count, divide the video coding frame into S video frame fragments and put the bytes corresponding to the excess into the last video frame fragment;
and if the number of bytes in the video coding frame is S times the preset single-fragment byte count plus an excess number of bytes, and the excess is not less than the preset byte count, divide the video coding frame into (S+1) video frame fragments.
13. A real-time video transmission device configured at a video playing end, comprising:
the fragment receiving module is used for receiving video frame fragments which are transmitted by a video transmitting end and forwarded by a video server, putting the video frame fragments into corresponding frame serial numbers in a receiving cache region, and updating the maximum continuous fragment number and a packet loss cache table of the continuously received video frame fragments;
a response generation module, configured to periodically generate a corresponding reception acknowledgement response according to the maximum continuous fragment number and the packet loss cache table;
a response sending module, configured to send the reception acknowledgement response to the video server, so that the video server forwards the reception acknowledgement response to the video sending end;
the fragment merging module is used for merging all the video frame fragments in the frame number to play the real-time video if all the video frame fragments in the frame number in the receiving cache area are successfully received;
the fragment receiving module comprises:
the sequence number acquisition submodule is used for receiving a current video frame fragment transmitted by a video transmitting end and forwarded by a video server, and acquiring a target fragment number of the current video frame fragment and a target frame sequence number of a video coding frame corresponding to the current video frame fragment;
the fragment storage submodule is used for storing the current video frame fragment in the receiving cache area, in the storage space corresponding to the target fragment number under the target frame sequence number;
the fragment number updating submodule is used for updating the maximum continuous fragment number of the video frame fragments which are already continuously received and the maximum receiving fragment number of the video frame fragments which are already received according to the target fragment number;
a buffer table updating submodule, configured to update a packet loss buffer table according to the maximum continuous fragment number and the maximum reception fragment number;
wherein the packet loss cache table comprises: a packet loss fragment number and a packet loss timestamp corresponding to the packet loss fragment number;
the cache table updating submodule is used for:
if the difference between the maximum continuous fragment number and the maximum receiving fragment number is not less than 1, recording all packet loss fragment numbers between the maximum continuous fragment number and the maximum receiving fragment number into a packet loss cache table, and updating a packet loss timestamp corresponding to the packet loss fragment number;
and if the target fragment number is stored in the packet loss cache table, deleting the target fragment number in the packet loss cache table.
CN201710233780.8A 2017-04-11 2017-04-11 Real-time video transmission method and device Expired - Fee Related CN108696773B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710233780.8A CN108696773B (en) 2017-04-11 2017-04-11 Real-time video transmission method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710233780.8A CN108696773B (en) 2017-04-11 2017-04-11 Real-time video transmission method and device

Publications (2)

Publication Number Publication Date
CN108696773A CN108696773A (en) 2018-10-23
CN108696773B true CN108696773B (en) 2021-03-09

Family

ID=63843374

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710233780.8A Expired - Fee Related CN108696773B (en) 2017-04-11 2017-04-11 Real-time video transmission method and device

Country Status (1)

Country Link
CN (1) CN108696773B (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109222854A (en) * 2018-11-19 2019-01-18 苏州新光维医疗科技有限公司 Wosap tv system and its picture signal transmission method
CN109714634B (en) * 2018-12-29 2021-06-29 海信视像科技股份有限公司 Decoding synchronization method, device and equipment for live data stream
CN111436009B (en) * 2019-01-11 2023-10-27 厦门雅迅网络股份有限公司 Real-time video stream transmission and display method and transmission and play system
CN109889543B (en) * 2019-03-26 2020-11-13 广州华多网络科技有限公司 Video transmission method, root node, child node, P2P server and system
CN110072128B (en) * 2019-04-22 2021-01-15 北京开广信息技术有限公司 Real-time pushing method of media stream and server
CN110417514B (en) * 2019-07-22 2022-02-01 北京地平线机器人技术研发有限公司 Data sending method and device and data receiving method and device
CN111491207A (en) * 2020-04-17 2020-08-04 北京三体云联科技有限公司 Video data processing method and device in live broadcast and electronic equipment
CN112235624B (en) * 2020-09-17 2021-09-03 成都成电光信科技股份有限公司 Video synchronous sending method supporting high-resolution LED dome screen display
CN112312204B (en) * 2020-09-30 2022-05-24 新华三大数据技术有限公司 Method and device for packaging video stream data fragments
CN113259715A (en) * 2021-05-07 2021-08-13 广州小鹏汽车科技有限公司 Method and device for processing multi-channel video data, electronic equipment and medium
CN113141415B (en) * 2021-05-07 2023-05-12 广州小鹏汽车科技有限公司 Remote driving system and method for vehicle, electronic device and storage medium
CN113254211B (en) * 2021-06-01 2023-04-07 广州小鹏汽车科技有限公司 Cache allocation method and device, electronic equipment and storage medium
CN113438520B (en) * 2021-06-29 2023-01-03 北京奇艺世纪科技有限公司 Data processing method, device and system
CN114024914B (en) * 2021-10-27 2024-03-01 杭州海康威视数字技术股份有限公司 Video data transmission method and device and electronic equipment
CN114363304B (en) * 2021-12-27 2024-04-19 浪潮通信技术有限公司 RTP video stream storage and playing method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101729879A (en) * 2009-12-15 2010-06-09 山东大学 Method for realizing real-time video transmission based on MIMO-OFDM system
CN102256182A (en) * 2011-07-26 2011-11-23 重庆大唐科技股份有限公司 RTP (Real-time Transport Protocol)-based video stream fragment framing method
CN105721950A (en) * 2016-03-30 2016-06-29 浙江宇视科技有限公司 Reliable media stream transmission device
CN106341738A (en) * 2015-07-08 2017-01-18 杭州海康威视数字技术股份有限公司 Streaming media network transmission bandwidth calculation method, server and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9894393B2 (en) * 2015-08-31 2018-02-13 Gopro, Inc. Video encoding for reduced streaming latency

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101729879A (en) * 2009-12-15 2010-06-09 山东大学 Method for realizing real-time video transmission based on MIMO-OFDM system
CN102256182A (en) * 2011-07-26 2011-11-23 重庆大唐科技股份有限公司 RTP (Real-time Transport Protocol)-based video stream fragment framing method
CN106341738A (en) * 2015-07-08 2017-01-18 杭州海康威视数字技术股份有限公司 Streaming media network transmission bandwidth calculation method, server and system
CN105721950A (en) * 2016-03-30 2016-06-29 浙江宇视科技有限公司 Reliable media stream transmission device

Also Published As

Publication number Publication date
CN108696773A (en) 2018-10-23

Similar Documents

Publication Publication Date Title
CN108696773B (en) Real-time video transmission method and device
CN108696772B (en) Real-time video transmission method and device
EP1482681B1 (en) Medium streaming distribution system
CN101861709B (en) Method and apparatus for adaptive forward error correction with merged automatic repeat request for reliable multicast in wireless local area networks
US9565482B1 (en) Adaptive profile switching system and method for media streaming over IP networks
EP2424241A1 (en) Method, device and system for forwarding video data
CN108600859B (en) Data slicing method and system
CN108616334B (en) Message transmission method, device, system, storage medium and electronic device
CN102932667B (en) Frame loss control and retransmission method and system in real-time streaming media uploading
JP2024509728A (en) Data retransmission processing method, device, computer equipment and computer program
CN109155707B (en) Requesting data retransmission in a multicast network
CN108696771B (en) Video playing method and device
CN107592185B (en) Forward retransmission method suitable for network coding transmission control protocol
CN113014586B (en) RTP data packet out-of-order processing and framing method and system
KR100526183B1 (en) Apparatus and Method for efficient data transmission/reception in Mobile Ad-hoc Network
CN111163362B (en) Video receiving method and system capable of self-adapting retransmission waiting time
CN109862400B (en) Streaming media transmission method, device and system
CN106790576B (en) interactive desktop synchronization method
CN117336796A (en) Real-time data transmission method
US11233716B2 (en) System for real-time monitoring with backward error correction
CN111741319A (en) Live broadcast data processing method and device and electronic equipment
CN114979793A (en) Live broadcast data transmission method, device, system, equipment and medium
CN109792444B (en) Play-out buffering in a live content distribution system
CN101277270A (en) Transmission method and system for flow medium data
JP6166445B1 (en) Application layer multicast delivery method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200817

Address after: No.259 Nanjing West Road, Tangqiao town, Zhangjiagang City, Suzhou City, Jiangsu Province

Applicant after: Suzhou Qianwen wandaba Education Technology Co.,Ltd.

Address before: Yangpu District State Road 200433 Shanghai City No. 200 Building 5 room 2002

Applicant before: SHANGHAI QIANWENWANDABA CLOUD TECH. Co.,Ltd.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210309