CN110933450A - Multi-channel live broadcast synchronization method, system, edge device, terminal and storage medium


Info

Publication number
CN110933450A
CN110933450A (application CN201910981387.6A)
Authority
CN
China
Prior art keywords: video data, time, channel, data, synchronization
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910981387.6A
Other languages
Chinese (zh)
Other versions
CN110933450B (en)
Inventor
赵璐
张健
莫东松
马晓琳
钟宜峰
张进
马丹
王科
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
MIGU Culture Technology Co Ltd
Original Assignee
MIGU Culture Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by MIGU Culture Technology Co Ltd
Priority to CN201910981387.6A
Publication of CN110933450A
Application granted
Publication of CN110933450B
Legal status: Active
Anticipated expiration

Classifications

    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD] (H ELECTRICITY; H04 ELECTRIC COMMUNICATION TECHNIQUE; H04N PICTORIAL COMMUNICATION, e.g. TELEVISION)
    • H04N21/2187 Live feed
    • H04N21/2393 Interfacing the upstream path of the transmission network involving handling client requests
    • H04N21/242 Synchronization processes, e.g. processing of PCR [Program Clock References]
    • H04N21/4122 Peripherals receiving signals from specially adapted client devices: additional display device, e.g. video projector
    • H04N21/6402 Address allocation for clients

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The embodiments of the invention relate to the field of communication and disclose a multi-channel live broadcast synchronization method, a multi-channel live broadcast synchronization system, an edge device, a terminal and a storage medium. The disclosed multi-channel live broadcast synchronization method is applied to an edge device and comprises the following steps: acquiring multiple channels of video data, where each channel of video data contains time mark information; and synchronizing the multiple channels of video data according to the time mark information and transmitting the synchronized multi-channel video data. By time-synchronizing the multiple channels of video data according to their time mark information and then sending the synchronized data, the desynchronization of the channels caused by delay during data transmission is eliminated as much as possible, the pictures of the multiple channels of video watched by a user stay as synchronous as possible, and the user experience is improved.

Description

Multi-channel live broadcast synchronization method, system, edge device, terminal and storage medium
Technical Field
The embodiment of the invention relates to the technical field of information, in particular to a multi-channel live broadcast synchronization method, a multi-channel live broadcast synchronization system, edge equipment, a terminal and a storage medium.
Background
With the development of science and technology, more and more people have moved beyond television programs to watching various types of live broadcasts, and more and more content with viewing value appears in daily life. As people spend longer watching live video, a single-channel live picture can no longer satisfy their viewing demands: for the same video or program, different viewing angles bring different experiences and feelings. With the development of 5G technology, UGC users upload and share live videos, the demand for multi-screen playback on a terminal grows day by day, and unsynchronized multi-screen live broadcasts severely affect the viewing experience and certain special functional requirements. To synchronize UGC with live broadcast information from television stations and the like, a common current method is to collect a single channel of each video, merge the multiple streams before transmission, transmit the merged video, and play it at the receiving end.
The inventor has found that the prior art has at least the following problem: the existing way of processing multi-channel data still cannot solve the problem of synchronizing the multiple channels, so the pictures finally seen by the user are poorly synchronized or out of sync, resulting in a poor user experience.
Disclosure of Invention
The embodiments of the invention aim to provide a multi-channel live broadcast synchronization method, a multi-channel live broadcast synchronization system, an edge device, a terminal and a storage medium, so that when a user watches a multi-channel live broadcast, the pictures of the multiple channels of video remain as synchronous as possible, meeting the user's viewing requirements and improving the viewing experience.
In order to solve the above technical problem, an embodiment of the present invention provides a multi-channel live broadcast synchronization method, which is applied to an edge device, and the method includes: acquiring multi-channel video data; wherein, each path of video data comprises time mark information; and synchronizing the multi-channel video data according to the time mark information and transmitting the synchronized multi-channel video data.
The embodiment of the invention also provides another multipath live broadcast synchronization method which is applied to a terminal and comprises the following steps: acquiring multi-channel video data; wherein, each path of video data comprises time mark information; and synchronizing the multi-channel video data according to the time mark information, and playing the synchronized multi-channel video.
The embodiment of the invention also provides a multi-channel live broadcast synchronization system, which comprises: a terminal and an edge device; the edge device is used for acquiring the multi-channel video data, synchronizing the multi-channel video data according to the time mark information of each channel of video data in the multi-channel video data and sending the synchronized multi-channel video data; the terminal is used for receiving the multi-channel video data and playing the video according to the received multi-channel video data; wherein the multiple channels of video data are sent by the edge device.
Embodiments of the present invention also provide an edge device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to execute the above multi-channel live broadcast synchronization method applied to the edge device.
An embodiment of the present invention further provides a terminal, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to execute the above multi-channel live broadcast synchronization method applied to the terminal.
The embodiment of the invention also provides a computer readable storage medium, which stores a computer program, and the computer program is executed by a processor to realize the multi-channel live broadcast synchronization method.
Compared with the prior art, in the embodiments of the invention the edge device, after acquiring the multiple channels of video data, synchronizes them according to the time mark information carried in the data and sends out the synchronized multi-channel video data. Because the edge device time-synchronizes the video data, desynchronization of the multi-channel pictures caused by the transmission delay each channel experiences on its way to the edge device is avoided, the multiple channels of video data received by the terminal are as synchronous as possible, the pictures of the played multi-channel videos stay synchronous, and the user's viewing experience is improved.
In addition, before the multiple channels of video data are acquired, each live broadcast device is time-synchronized; the multiple channels of video data, and the time mark information contained in each channel, are generated by the live broadcast devices after this time synchronization. Because the live broadcast devices are time-synchronized before they generate live video data, data packets carrying the same time mark in different channels are generated at the same moment, which avoids the situation where the multi-channel pictures remain out of sync after time synchronization because packets with the same time mark were actually generated at different times.
In addition, synchronizing the multiple channels of video data according to the time mark information and transmitting the synchronized data includes: arranging the data packets in each channel of video data in time order, where the time of a data packet is obtained from the time mark information of the video data to which it belongs; performing time synchronization on the multiple channels of video data at least once per synchronization period and sending the synchronized data; and, when the current synchronization period ends, directly sending the current first data packet of each channel and entering the next synchronization period. In one round of time synchronization, the current first data packet of each channel is taken as a packet to be detected and the times of all packets to be detected are compared: if they are not consistent, the packets whose times are not the latest are sent first, and the next round of time synchronization is performed; if they are consistent, all packets to be detected are sent simultaneously and the next round of time synchronization is performed. In this way the times of the current first data packets of the channels are checked repeatedly within each synchronization period, packets of the channels whose playing progress is ahead are delayed while the other channels' packets are sent first, and after repeated detection and sending the playing progress of all channels converges, so the transmitted multi-channel video data are synchronous and the user sees multi-channel video with synchronized pictures. At the end of each synchronization period, regardless of whether the times of the current packets are consistent, the current first data packet of every channel is sent to the user once before the next period begins; this prevents the channel with the fastest playing progress from stalling because its packets are held back too long when the progress cannot be equalized within one period.
In addition, after the data packets in each path of video data are arranged in time sequence, the method further comprises the following steps: arranging all paths of video data into a video data matrix, distributing storage resources for all paths of video data according to the following formula, and caching all paths of video data;
(Storage resource allocation formula; rendered as an image in the original publication.)
where q_i is the storage resource allocated to the i-th row of the video data matrix, Q is the total amount of storage resources, Δt is the maximum time difference among the times corresponding to the time marks of the first column of video data in the video data matrix, M_ij is the size of the data packet in row i, column j of the video data matrix, n_i is the number of data packets in row i, and m is the total number of rows of the video data matrix. Because each channel of video data has different storage requirements, allocating storage space per channel through this formula avoids wasting storage space, satisfies the needs of video data synchronization, and makes maximum use of the available storage resources.
Drawings
One or more embodiments are illustrated by the corresponding figures in the drawings, which are not meant to be limiting.
Fig. 1 is a flowchart of a multi-channel live broadcast synchronization method according to a first embodiment of the present invention;
fig. 2 is a schematic diagram of a synchronization process in a multi-channel live broadcast synchronization method according to a second embodiment of the present invention;
fig. 3 is a flowchart of a multi-channel live broadcast synchronization method according to a second embodiment of the present invention;
fig. 4 is a schematic diagram of a video data matrix in a multi-channel live broadcast synchronization method according to a second embodiment of the present invention;
fig. 5 is a schematic diagram illustrating list creation and update in a multi-channel live broadcast synchronization method according to a second embodiment of the present invention;
fig. 6 is a flowchart of a multi-channel live broadcast synchronization method according to a third embodiment of the present invention;
fig. 7 is a schematic structural diagram of a multi-channel live broadcast synchronization system according to a fourth embodiment of the present invention;
fig. 8 is a schematic structural diagram of an edge device according to a fifth embodiment of the present invention;
fig. 9 is a schematic diagram of a terminal structure according to a sixth embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the embodiments are described in detail below with reference to the accompanying drawings. Those of ordinary skill in the art will appreciate that numerous technical details are set forth in the embodiments in order to provide a better understanding of the present application; however, the claimed technical solution can be implemented without these details and with various changes and modifications based on the following embodiments. The division into embodiments is for convenience of description only, does not limit the specific implementation of the invention, and the embodiments may be combined and cross-referenced where there is no contradiction.
The first embodiment of the invention relates to a multi-channel live broadcast synchronization method applied to an edge device: multiple channels of video data containing time mark information are acquired, the channels are synchronized according to the received data and the time mark information of each channel, and the synchronized multi-channel video data are then sent to a terminal device. The terminal device may be any device with a video playing function, such as a mobile phone, a tablet computer or a cable television, which is not described in detail here.
Implementation details of the multi-channel live broadcast synchronization method of this embodiment are described below; they are provided only to ease understanding and are not necessary for implementing the solution.
A specific flowchart of the multi-channel live broadcast synchronization method of the present embodiment is shown in fig. 1, and specifically includes the following steps:
step 101, acquiring multi-channel video data.
Specifically, after receiving a video request from a user's terminal device, the edge device pulls from the cloud server and caches the multiple data streams of the video data corresponding to the request. Each acquired video data stream consists of multiple data packets, and each data packet carries its own time mark information.
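As a concrete illustration of this data model, the sketch below shows one way the time-marked packets and per-channel streams might be represented; the class and field names are illustrative assumptions, not terms used by the patent.

```python
from collections import deque
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class VideoPacket:
    """One data packet of a live video stream, carrying its time mark."""
    timestamp: int   # time mark information, e.g. generation time in milliseconds
    payload: bytes   # encoded audio/video data

    @property
    def size(self) -> int:
        return len(self.payload)

@dataclass
class VideoStream:
    """One channel (path) of video data cached at the edge device."""
    channel_id: str
    packets: deque = field(default_factory=deque)  # kept in time order

    def push(self, packet: VideoPacket) -> None:
        # Packets are appended in the order of their time marks, matching the
        # per-row arrangement of the video data matrix described later.
        self.packets.append(packet)

    def head(self) -> Optional[VideoPacket]:
        # The "current first data packet" inspected by the synchronization checks.
        return self.packets[0] if self.packets else None
```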
Step 102, synchronizing the multiple channels of video data and sending the synchronized multi-channel video data.
Specifically, after the multiple channels of video data are acquired, the data packets of each channel are arranged in time order according to the time mark information of that channel. The edge device then checks whether the times corresponding to the time marks of the current first data packets of the channels are consistent. When they are inconsistent, the latest of these times is determined, the video data whose first packet carries that latest time is temporarily held in the edge device, the current first packets of the other channels are sent, and after sending completes the check is repeated. When the times on the current first data packets of all channels are consistent, the detected first data packets are sent out simultaneously, and after sending completes the check is repeated again.
This embodiment therefore provides a multi-channel live broadcast synchronization method: the required multi-channel video data streams are pulled at the edge device according to the user's request; according to the time mark information of each channel, the current first data packets of the channels are repeatedly checked for consistent times; during each check the video data whose first packet carries the latest time is held back while the first packets of the other channels are sent, until the times of the current first packets of all channels agree, that is, until the channels are synchronized, at which point the synchronized multi-channel video data are sent. Through this repeated detection and synchronization at the edge device, the desynchronization caused by delay and parsing while the video data travel from the live broadcast devices to the edge device is resolved, the user can finally watch multi-channel video with pictures as synchronous as possible on the terminal device, and the user experience is greatly improved.
The second embodiment of the invention relates to a multi-channel live broadcast synchronization method and is an improvement on the first embodiment. In the second embodiment, before the multiple channels of video data are acquired, the live broadcast devices that generate the live data are time-synchronized, after which they generate and transmit the video data. Meanwhile the edge device merges identical user video requests it detects: video data requested by several identical requests is pulled and cached only once and, after synchronization, sent to all the users behind those requests. During video data synchronization the pulling and caching state of the video data in the edge device is updated periodically, so the available resources are used to the fullest. A diagram of the whole video synchronization process is shown in fig. 2.
A specific flowchart of a multi-channel live broadcast synchronization method according to a second embodiment of the present invention is shown in fig. 3, and specifically includes the following steps:
step 301, time synchronization is performed on the live broadcasting device.
Specifically, before acquiring the live video data requested by the user, the edge device detects live broadcast requests from the live broadcast devices in the live terminal device cloud and sends a time synchronization request to each device that initiates a live broadcast request. After receiving the time synchronization request, every device about to broadcast requests the current time from the same time synchronization server and sets its own time according to the time obtained from that server, so that the device times of all live broadcast devices are consistent.
In one example, a live broadcast device in the video live broadcast terminal device cloud asks a server in the service cloud to start a live broadcast, and the server issues an RTMP data transmission address to the device. When the edge device closest to the live broadcast device detects that the server has sent the RTMP address, it sends a time synchronization request to the device requesting the live broadcast. When the live broadcast device has received both the time synchronization request and the RTMP address, it requests the current time from the same clock synchronization server and updates its built-in time accordingly, ensuring that the built-in times of all live broadcast devices are consistent. The live broadcast device then captures video, attaches a time mark to each data packet of the video data, and transmits the generated video data, as illustrated by the sketch below.
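A minimal Python sketch of this device-side clock alignment, assuming a caller-supplied fetch_current_time() function that queries the shared time synchronization server (the function name and millisecond units are assumptions for illustration):

```python
import time

class LiveDeviceClock:
    """Keeps a live broadcast device's clock aligned to a shared time server."""

    def __init__(self) -> None:
        self.offset_ms = 0  # server time minus local time

    def synchronize(self, fetch_current_time) -> None:
        # fetch_current_time() is assumed to return the synchronization
        # server's current time in milliseconds; because every live broadcast
        # device queries the same server, their time marks become consistent.
        server_ms = fetch_current_time()
        local_ms = int(time.time() * 1000)
        self.offset_ms = server_ms - local_ms

    def now_ms(self) -> int:
        # Time mark attached to each data packet the device generates.
        return int(time.time() * 1000) + self.offset_ms
```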
Step 302, acquiring multiple channels of video data.
Specifically, the edge device obtains the user's video request and, according to it, pulls from the cloud server and caches the multiple channels of video data the user requires. It then arranges the data packets of each channel into a row in time order according to their time marks, producing a video data matrix. For identical video data requests from multiple users it generates, for each channel of video data, a device list containing the terminal device addresses and request times; this device list is updated every maintenance period M, and the allocation of storage space to each channel of video data is updated every resource update period T according to the following formula.
(Storage resource allocation formula; rendered as an image in the original publication.)
where q_i is the storage resource allocated to the i-th row of the video data matrix, Q is the total amount of storage resources, Δt is the maximum time difference among the times corresponding to the time marks of the first column of video data in the video data matrix, M_ij is the size of the data packet in row i, column j of the video data matrix, n_i is the number of data packets in row i, and m is the total number of rows of the video data matrix.
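The formula itself appears only as an image in the original publication, so the exact expression is not reproduced here. The following Python sketch shows one plausible reading of the stated variables, allocating the total storage Q across the rows in proportion to each row's average packet size weighted by the maximum time offset Δt; treat it as an illustrative assumption, not the patented formula.

```python
def allocate_storage(total_q: int, delta_t: float, matrix: list) -> list:
    """Illustrative (assumed) reading of the storage-allocation formula.

    total_q -- Q, the total amount of storage resource (bytes)
    delta_t -- maximum time difference among the first-column time marks
    matrix  -- one list of packet sizes M_ij per row i of the video data matrix
    """
    weights = []
    for row in matrix:
        n_i = len(row)                                  # number of packets in row i
        avg_size = sum(row) / n_i if n_i else 0.0
        # Assumption: a row needs enough space to absorb delta_t of lag at
        # its average packet size; the real formula is not reproduced here.
        weights.append(avg_size * delta_t)

    total = sum(weights)
    if total == 0:
        return [total_q // len(matrix)] * len(matrix)
    # q_i: storage assigned to row i, normalized so the budget sums to Q.
    return [int(total_q * w / total) for w in weights]
```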
In one example, the edge device obtains each terminal device's video data request together with the address of the requesting device and the request time, pulls and buffers the multiple channels of video data, arranges the data packets of each channel in time order, and combines the channels into a video data matrix. For identical video data requests from multiple users it generates an IP list of the corresponding terminal devices for each channel of video data, containing the devices' IP addresses and request times, and it re-examines the users' video data requests every maintenance period M, yielding a video data matrix with attached IP lists. A schematic of the video data matrix is shown in fig. 4, where M1 denotes the first data packet of the first video, N3 the third data packet of the second video, and L2 the second data packet of the third video. Data packets carrying the same time mark were generated at the same moment by different live broadcast devices; that is, M3, N3 and L3 were generated simultaneously by different devices. The addresses correspond to the different rows of multi-channel video data, and IPlist denotes the IP list of terminal devices corresponding to each video.
A schematic of how an IP list is created and updated is shown in fig. 5. A user's video request is examined to determine whether it targets new video data not yet contained in the video data matrix. When a request for a channel of video data already present in the matrix is detected, the edge device obtains the address and request time of the terminal device that sent the request and adds that device to the corresponding IP list; when the requested video data is not in the matrix, the required video data is pulled and cached and a corresponding IP list is generated. After an IP list has been created, the request time of each device in it is checked once every maintenance period M: the request time of each terminal device is compared with the current time, and when the difference exceeds a threshold of 1 s the corresponding IP address is deleted. The time-difference threshold can be set manually and is not limited by this embodiment. A sketch of this bookkeeping follows.
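A compact Python sketch of the IP-list bookkeeping just described; the 1 s staleness threshold follows the example in the text, while the class and method names are illustrative assumptions.

```python
import time

STALE_THRESHOLD_S = 1.0  # example threshold from the text; can be set manually

class IpList:
    """Terminal addresses (and request times) subscribed to one channel of video data."""

    def __init__(self) -> None:
        self.entries = {}  # IP address -> last request time (seconds)

    def add_request(self, ip: str) -> None:
        # Called whenever a terminal device requests this channel of video data.
        self.entries[ip] = time.time()

    def maintain(self) -> None:
        # Run once per maintenance period M: delete addresses whose request
        # time differs from the current time by more than the threshold.
        now = time.time()
        self.entries = {ip: t for ip, t in self.entries.items()
                        if now - t <= STALE_THRESHOLD_S}

    def is_empty(self) -> bool:
        # When no terminal device still requests this channel, the edge device
        # can drop its row from the video data matrix and release the storage.
        return not self.entries
```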
After the video data matrix has been generated, the storage resources of each channel in it are reallocated once every resource update period T; the default storage space per channel can be set manually. When the remaining storage space of a channel can still accommodate caching of a further data packet, the packet is cached synchronously; when the remaining space is insufficient, the packet is not cached and is forwarded directly to the terminal device upon receipt. In every resource update period the resource occupancy of each channel in the matrix is also checked. When a channel's resource occupancy is 0, i.e. no new data packets are arriving, it is concluded that the live broadcast of that channel has stopped and the channel no longer needs to be pulled and cached: it is deleted from the video data matrix, its cached data is cleared and its storage space is released. Likewise, when the IP list corresponding to a channel contains no IP addresses, it is concluded that no terminal device still requests that channel, and the channel is deleted from the matrix and its storage space released.
Step 303, determining whether the times corresponding to the current first data packets of the multiple channels of video data are consistent; if so, proceeding to step 304, and if not, proceeding to step 305.
Specifically, because the live broadcast devices have been time-synchronized and the time marks of the data packets of each channel are denoted t1, t2, t3 … tn in the order the packets were generated, the pictures of different channels that carry the same time mark are synchronous. When examining the multiple channels of video data it therefore suffices to take the current first data packet of each channel as the packet to be detected, read the time marks of these packets, and judge whether the channels are synchronous by whether the time marks agree.
Therefore, after the data packets of the multiple channels of video data have been arranged in time order, the current first data packet of each channel is taken as the packet to be detected and the time marks on these packets are compared; if all the time marks are the same, the process proceeds to step 304, and if they are not all the same, it proceeds to step 305.
In one example, a user watches a live broadcast of a ball game from three angles on a mobile phone. The edge device pulls the three channels of video data for the live broadcast the user wants to watch, arranges their data packets in time order to generate a video data matrix, then takes the current first data packet of each of the three channels as the packets to be detected and reads their time marks. If the time marks on the current first packets of the three channels are t1, t2 and t3, they are clearly different, so the three channels are judged to be out of sync and need to be synchronized; if the time marks on the three current first packets are all t2, they are clearly the same, so the three channels are judged to be synchronous and their current first data packets are directly sent out simultaneously.
Step 304, all data packets are sent to the user simultaneously.
Specifically, when the time marks of all the data packets currently detected by the edge device are consistent, it can be concluded that the multiple channels of video data are already synchronous and no synchronization operation is needed; the packets are sent directly, according to the terminal device IP list of each channel, to all the devices whose IP addresses appear in the lists, after which it is checked whether the current synchronization period has ended.
In one example the synchronization period is set to 40 ms. This value follows from the persistence of vision of the human eye, which cannot perceive changes within 40 ms, so the synchronization period only needs to be no longer than 40 ms; the value is given here merely as an example and the duration of the synchronization period is not limited.
A user watches a live broadcast of a ball game from three angles on a mobile phone at the same time. The edge device inspects the time marks of the three channels of video data and finds that the time mark of the current first data packet of each channel is t2, so it judges the three channels to be synchronous; it therefore sends the current first packets of the three channels directly to the users at the addresses in the terminal IP lists, and then checks whether the current synchronization period has ended.
And step 305, sending the rest data packets except the data packet with the latest corresponding time to the user.
Specifically, when the times corresponding to the time marks of the data packets currently detected by the edge device are inconsistent, it can be concluded that the playing progress of the channels differs. The latest of the detected times is then determined, the data packet carrying that latest time is temporarily held in the edge device, and the remaining packets are sent to the users at the terminal device addresses in the IP list of each channel, after which it is checked whether the current synchronization period has ended.
In one example, the edge device detects that a user is watching live broadcasts of a ball game from three angles on a mobile phone at the same time. It inspects the time mark of the current first data packet of each of the three channels and finds that, from the first to the third channel, they are t1, t2 and t3, so it judges the three channels to be out of sync. The packet with the latest time, i.e. the current first packet of the third channel, is held back, the packets of the other two channels are sent to the terminal device as synchronized video data, and it is then checked whether the current synchronization period has ended.
Step 306, determining whether the current synchronization period is finished, if so, entering step 307, and if not, returning to step 303.
Specifically, the multiple channels of video data may be synchronized several times within one synchronization period. After each synchronization, once the synchronized video data has been sent to the user, it must also be determined whether the current synchronization period has ended: if it has not, the channels are synchronized again and the synchronized data sent; if it has, the process proceeds to the next step.
In one example the synchronization period is set to 40 ms. After the edge device has sent the synchronized video data to the user's mobile phone, it computes from the start time of the current period and the current time how far into the period it is. If it detects that 30 ms of the current period have elapsed, the period has not yet ended and another round of video data synchronization is needed; if it detects that 40 ms have elapsed, the current period has ended, the synchronization within this period stops, and the process proceeds to the next step.
Step 307, sending the current first data packet of each video path to the user and entering the next synchronization period.
Specifically, when one synchronization period is judged to have ended, the times corresponding to the time marks of the current first data packets of the channels are no longer checked; the current first packet of every channel is sent directly to the user, and after sending completes the next synchronization period begins.
In one example the synchronization period is set to 40 ms. The edge device detects that a user is watching live broadcasts of a ball game from three angles on a mobile phone at the same time, synchronizes the three channels of video data and sends the synchronized data. During the time check of one synchronization pass within the current period, the time marks on the current first packets of the first to third channels are t1, t2 and t3, so the three channels are judged to be out of sync: the current first packet of the third channel is held in the edge device while the current first packets of the other two channels are sent as synchronized video data. When this sending completes, 40 ms are found to have elapsed since the start of the current period, so the period has ended; no further consistency check of the time marks is made, and the current first packets of all three channels are sent to the user together. This prevents the third channel, the one with the fastest playing progress in the pictures the user is watching, from failing to advance for more than 40 ms, which would produce stutter visible to the human eye. After these packets are sent, the next synchronization period begins and detection and synchronization of the three channels continues, as summarized in the sketch below.
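The per-period behaviour of steps 303 to 307 can be summarized in the following Python sketch. It is a simplified, single-threaded reading of the procedure: the send() callback, the packet attributes and the 40 ms period value are illustrative assumptions rather than elements prescribed by the patent.

```python
import time

SYNC_PERIOD_S = 0.040  # example value from the text (no longer than 40 ms)

def run_sync_period(streams, send):
    """One synchronization period over several channels of video data.

    streams -- list of collections.deque objects, one per channel, each holding
               packets in time order; packet.timestamp is the time mark
    send    -- callback that forwards one packet to the requesting terminals
    """
    period_start = time.monotonic()

    while all(streams):  # every channel still has a "current first data packet"
        if time.monotonic() - period_start >= SYNC_PERIOD_S:
            # Step 307: the period is over. Send every current first packet
            # regardless of its time mark, then enter the next period.
            for stream in streams:
                send(stream.popleft())
            return

        heads = [stream[0] for stream in streams]
        times = [packet.timestamp for packet in heads]

        if len(set(times)) == 1:
            # Step 304: all time marks agree, so the channels are in sync;
            # send all current first packets together.
            for stream in streams:
                send(stream.popleft())
        else:
            # Step 305: hold back the packet(s) carrying the latest time mark
            # and send the rest, so the slower channels can catch up.
            latest = max(times)
            for stream, packet in zip(streams, heads):
                if packet.timestamp < latest:
                    send(stream.popleft())
        # Step 306: loop back and check whether the current period has ended.
```

In a real deployment each channel's packets would keep arriving asynchronously, and send() would fan a packet out to every address in that channel's IP list.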
This embodiment therefore provides a multi-channel live broadcast synchronization method in which the live broadcast devices are time-synchronized before they generate and transmit the video data streams, so the video pictures carried by data packets with the same time mark in different channels were generated at the same moment. Meanwhile, by consolidating and updating the video requests, the available resources are used to the fullest for video synchronization; by synchronizing the data packets, the desynchronization of the multi-channel pictures caused by delay is resolved; and by sending the packets once directly when the synchronization period ends, stutter caused by the synchronization process itself is avoided. Multiple users are thus provided with multi-channel video data that is as synchronous and smooth as possible, greatly improving the user experience.
The steps of the above methods are divided only for clarity of description; in implementation they may be combined into one step, or a step may be split into several, and as long as the same logical relationship is preserved they all fall within the protection scope of this patent. Adding insignificant modifications to an algorithm or process, or introducing insignificant design changes, without altering its core design also falls within the protection scope of this patent.
The third embodiment of the invention relates to a multi-channel live broadcast synchronization method applied to a terminal device: after video data containing time mark information is acquired, the video data is synchronized according to the time mark information, and the synchronized multi-channel video is then played. The terminal device may be any device with a video playing function, such as a mobile phone, a tablet computer or a cable television, which is not described in detail here; this embodiment is described using the example of sending the synchronized multi-channel video data to a mobile phone player.
A specific flowchart of the multi-channel live broadcast synchronization method of the present embodiment is shown in fig. 6, and specifically includes the following steps:
step 601, acquiring multi-channel video data.
Specifically, according to the client's video request, the corresponding data streams of the multiple channels of video are acquired from the cloud server or the edge device and cached. Each acquired video data stream consists of multiple data packets, and each packet carries its own time mark information. The data packets of each channel are then arranged into a row in time order according to their time marks, producing a video data matrix, and the allocation of storage space to each channel of video data is updated every resource update period T according to the following formula.
(Storage resource allocation formula; rendered as an image in the original publication.)
where q_i is the storage resource allocated to the i-th row of the video data matrix, Q is the total amount of storage resources, Δt is the maximum time difference among the times corresponding to the time marks of the first column of video data in the video data matrix, M_ij is the size of the data packet in row i, column j of the matrix, n_i is the number of data packets in row i, and m is the total number of rows of the video data matrix.
In one example, after the required video data has been acquired and the video data matrix generated, the storage resources of each channel in the matrix are reallocated once every resource update period T. When the remaining storage space of a channel can still accommodate caching of a further data packet, the packet is cached synchronously; when the remaining space is insufficient, the packet is not cached and is forwarded directly to the player upon receipt. When the user stops watching a live video, the corresponding mobile phone memory is cleared; when no new live data arrives, the live broadcast is judged to have ended, the channel is deleted and the mobile phone memory is cleared.
Step 602, determining whether the time corresponding to the current first data packet of the multi-channel video data is consistent, if so, entering step 603, and if not, entering step 604.
Step 603, all data packets are sent to the player at the same time.
And step 604, sending the rest data packets except the data packet with the latest time to the player.
Step 605, determining whether the current synchronization period is finished, if so, entering step 606, and if not, returning to step 602.
Step 606, sending the current first data packet of each video path to the player and entering the next synchronization period.
Steps 602 to 606 in this embodiment are similar to steps 303 to 307 in the second embodiment and are not repeated here; the difference is that in this embodiment the processor of the terminal performs the synchronization of the multiple channels of video data and sends the synchronized data to a player, which plays the multiple channels of video.
This embodiment therefore provides a multi-channel live broadcast synchronization method in which the processor of the terminal synchronizes the multiple channels of video according to the time mark information of the multi-channel video data and then loads and plays the synchronized video, so the user sees multiple channels of video whose playing progress is synchronized, improving the user experience.
A fourth embodiment of the present invention relates to a multi-channel live broadcast synchronization system, as shown in fig. 7, including: edge device 701 and terminal 702:
the edge device 701 is configured to acquire multiple paths of video data, synchronize the multiple paths of video data according to time stamp information of each path of video data in the multiple paths of video data, and send the synchronized multiple paths of video data.
Specifically, the edge device 701 pulls and buffers the multiple channels of video data according to the user's request, determines from the time mark information of each channel whether the channels are synchronous, synchronizes those that are not, and then transmits the synchronized multi-channel video data, or transmits the data directly if it is already synchronous.
In one example, before acquiring the multiple channels of video data, when the edge device 701 detects that multiple live devices perform live broadcast requests, time synchronization requests are sent to the multiple live broadcast devices, so that the multiple live broadcast devices acquire current time from the same time synchronization server, and adjust their own time, so that the built-in time of each device requesting live broadcast is kept consistent.
In one example, after pulling and buffering the multiple channels of video data, the edge device 701 arranges the data packets of each channel in time order, performs time synchronization on the channels at least once per synchronization period and sends the synchronized data, and, when the current synchronization period ends, directly sends the current first data packet of each channel and enters the next period. In one round of time synchronization, the current first data packet of each channel is taken as the packet to be detected and the times of all packets to be detected are compared: if they are inconsistent, the packets whose times are not the latest are sent first and the next round of time synchronization is performed; if they are consistent, all packets to be detected are sent simultaneously and the next round of time synchronization is performed.
In one example, after arranging the data packets of each path of video data in the multiple paths of video data according to the time sequence, the edge device 701 arranges each path of video data into a video data matrix, allocates storage resources for each path of video data according to the following formula, and caches each path of video data;
(Storage resource allocation formula; rendered as an image in the original publication.)
where q_i is the storage resource allocated to the i-th row of the video data matrix, Q is the total amount of storage resources, Δt is the maximum time difference among the times corresponding to the time marks of the first column of video data in the video data matrix, M_ij is the size of the data packet in row i, column j of the video data matrix, n_i is the number of data packets in row i, and m is the total number of rows of the video data matrix.
A terminal 702, configured to receive multiple channels of video data, and play a video according to the received multiple channels of video data; wherein the multiple channels of video data are sent by the edge device.
Specifically, after receiving the synchronized multiple channels of video data sent by the edge device 701, the player of the terminal 702 loads and plays each corresponding channel of video for the user according to the data packets of the acquired multi-channel video data.
A fifth embodiment of the invention is directed to an edge device, as shown in fig. 8, comprising at least one processor and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable it to perform the multi-channel live broadcast synchronization method applied to an edge device as described above.
Where the memory and processor are connected by a bus, the bus may comprise any number of interconnected buses and bridges, the buses connecting together one or more of the various circuits of the processor and the memory. The bus may also connect various other circuits such as peripherals, voltage regulators, power management circuits, and the like, which are well known in the art, and therefore, will not be described any further herein. A bus interface provides an interface between the bus and the transceiver. The transceiver may be one element or a plurality of elements, such as a plurality of receivers and transmitters, providing a means for communicating with various other apparatus over a transmission medium. The data processed by the processor is transmitted over a wireless medium via an antenna, which further receives the data and transmits the data to the processor.
The processor is responsible for managing the bus and general processing and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management, and other control functions. And the memory may be used to store data used by the processor in performing operations.
A sixth embodiment of the present invention is directed to a terminal, as shown in fig. 9, comprising at least one processor and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable it to perform the multi-channel live broadcast synchronization method applied to a terminal as described above.
Where the memory and processor are connected by a bus, the bus may comprise any number of interconnected buses and bridges, the buses connecting together one or more of the various circuits of the processor and the memory. The bus may also connect various other circuits such as peripherals, voltage regulators, power management circuits, and the like, which are well known in the art, and therefore, will not be described any further herein. A bus interface provides an interface between the bus and the transceiver. The transceiver may be one element or a plurality of elements, such as a plurality of receivers and transmitters, providing a means for communicating with various other apparatus over a transmission medium. The data processed by the processor is transmitted over a wireless medium via an antenna, which further receives the data and transmits the data to the processor.
The processor is responsible for managing the bus and general processing and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management, and other control functions. And the memory may be used to store data used by the processor in performing operations.
A seventh embodiment of the present invention relates to a computer-readable storage medium storing a computer program. The computer program realizes the above-described method embodiments when executed by a processor.
That is, as those skilled in the art will understand, all or part of the steps of the methods of the above embodiments can be implemented by a program instructing the relevant hardware. The program is stored in a storage medium and includes several instructions that cause a device (which may be a single-chip microcomputer, a chip or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples for carrying out the invention, and that various changes in form and details may be made therein without departing from the spirit and scope of the invention in practice.

Claims (10)

1. A multi-channel live broadcast synchronization method is applied to edge equipment and is characterized by comprising the following steps:
acquiring multi-channel video data; wherein, each path of video data comprises time mark information;
and synchronizing the multi-channel video data according to the time mark information, and sending the synchronized multi-channel video data.
2. The multi-channel live broadcast synchronization method according to claim 1, further comprising, before the obtaining the multi-channel video data:
time synchronization is carried out on each live broadcast device; the multiple paths of video data and the time mark information contained in each path of video data are generated by each live broadcast device after time synchronization.
3. The multi-channel live broadcast synchronization method according to claim 1, wherein the synchronizing the multi-channel video data according to the time mark information and sending the synchronized multi-channel video data comprises:
arranging the data packets in each channel of video data in time order, wherein the time of a data packet is obtained from the time mark information of the video data to which the data packet belongs;
performing at least one time synchronization on the multi-channel video data within each synchronization period and sending the synchronized multi-channel video data, and, when the current synchronization period ends, directly sending the current first data packet of each channel of video data and entering the next synchronization period;
wherein, in one time synchronization, the current first data packet of each channel of video data is taken as a data packet to be detected, and whether the times of all the data packets to be detected are consistent is detected;
if the times are not consistent, the data packets to be detected whose times are not the latest are sent first, and the next time synchronization is entered; and
if the times are consistent, all the data packets to be detected are sent simultaneously, and the next time synchronization is entered.
4. The multi-channel live broadcast synchronization method according to claim 3, further comprising, after the data packets in each channel of video data are arranged in time order:
arranging the channels of video data into a video data matrix, allocating storage resources to each channel of video data according to the following formula, and caching each channel of video data:
[formula given in the original as image FDA0002235293940000011]
wherein q_i is the storage resource allocated to the i-th row of the video data matrix, Q is the total amount of storage resources, Δt is the maximum difference among the times corresponding to the time stamps of the first column of video data in the video data matrix, M_ij is the size of the data packet corresponding to the data in the i-th row and j-th column of the video data matrix, n_i is the number of data packets in the i-th row of the video data matrix, and m is the total number of rows of the video data matrix.
5. A multi-channel live broadcast synchronization method, applied to a terminal, characterized by comprising:
acquiring multi-channel video data, wherein each channel of video data comprises time mark information; and
synchronizing the multi-channel video data according to the time mark information, and playing the synchronized multi-channel video.
6. The multi-channel live broadcast synchronization method according to claim 5, wherein the synchronizing the multi-channel video data according to the time mark information and playing the synchronized multi-channel video comprises:
arranging the data packets in each channel of video data in time order, wherein the time of a data packet is obtained from the time mark information of the video data to which the data packet belongs;
performing at least one time synchronization on the multi-channel video data within each synchronization period and sending the synchronized multi-channel video data to a playing device, and, when the current synchronization period ends, directly sending the current first data packet of each channel of video data and entering the next synchronization period;
wherein, in one time synchronization, the current first data packet of each channel of video data is taken as a data packet to be detected, and whether the times of all the data packets to be detected are consistent is detected;
if the times are not consistent, the data packets to be detected whose times are not the latest are sent first, and the next time synchronization is entered; and
if the times are consistent, all the data packets to be detected are sent simultaneously, and the next time synchronization is entered.
7. A multi-channel live synchronization system, comprising: a terminal and an edge device;
the edge device is used for acquiring multi-channel video data, synchronizing the multi-channel video data according to the time mark information of each channel of video data in the multi-channel video data, and sending the synchronized multi-channel video data; and
the terminal is used for receiving the multi-channel video data and playing a video according to the received multi-channel video data, wherein the multi-channel video data is transmitted by the edge device.
8. An edge device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the multi-channel live broadcast synchronization method of any one of claims 1 to 4.
9. A terminal, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the multi-channel live broadcast synchronization method of claim 5 or 6.
10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the multi-channel live broadcast synchronization method of any one of claims 1 to 4, or of claim 5 or 6.
CN201910981387.6A 2019-10-16 2019-10-16 Multi-channel live broadcast synchronization method, system, edge device, terminal and storage medium Active CN110933450B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910981387.6A CN110933450B (en) 2019-10-16 2019-10-16 Multi-channel live broadcast synchronization method, system, edge device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN110933450A (en) 2020-03-27
CN110933450B CN110933450B (en) 2022-02-18

Family

ID=69848924

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910981387.6A Active CN110933450B (en) 2019-10-16 2019-10-16 Multi-channel live broadcast synchronization method, system, edge device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN110933450B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101938606A (en) * 2009-07-03 2011-01-05 北京大学 Method, system and device for propelling multimedia data
WO2012094974A1 (en) * 2011-01-11 2012-07-19 中兴通讯股份有限公司 Method, device and system for synchronizing media streams
CN105245977A (en) * 2015-10-10 2016-01-13 上海慧体网络科技有限公司 Method for synchronous live broadcast through multiple cameras
CN107071509A (en) * 2017-05-18 2017-08-18 北京大生在线科技有限公司 The live video precise synchronization method of multichannel
CN107277558A (en) * 2017-06-19 2017-10-20 网宿科技股份有限公司 A kind of player client for realizing that live video is synchronous, system and method
CN109348273A (en) * 2018-10-25 2019-02-15 希诺麦田技术(深圳)有限公司 Transmission Multiple Real-time Internet Video method, apparatus, base station equipment and storage medium
CN109769124A (en) * 2018-12-13 2019-05-17 广州华多网络科技有限公司 Mixed flow method, apparatus, electronic equipment and storage medium

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111526386A (en) * 2020-05-06 2020-08-11 北京三体云时代科技有限公司 Data transmission method and device based on auxiliary mixed screen equipment and data transmission system
CN112272305A (en) * 2020-09-28 2021-01-26 天下秀广告有限公司 Multi-channel real-time interactive video cache storage method
CN112272305B (en) * 2020-09-28 2023-03-24 天下秀广告有限公司 Multi-channel real-time interactive video cache storage method
CN112866733A (en) * 2021-01-05 2021-05-28 广东中兴新支点技术有限公司 Cloud director synchronization system and method of multiple live devices
CN113395528A (en) * 2021-06-09 2021-09-14 珠海格力电器股份有限公司 Live broadcast method and device, electronic equipment and storage medium
CN114554228A (en) * 2022-02-14 2022-05-27 腾讯科技(深圳)有限公司 Cloud application processing method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN110933450B (en) 2022-02-18

Similar Documents

Publication Publication Date Title
CN110933450B (en) Multi-channel live broadcast synchronization method, system, edge device, terminal and storage medium
US11792444B2 (en) Dynamic viewpoints of live event
EP3334175A1 (en) Streaming media and caption instant synchronization displaying and matching processing method, device and system
CN103324457B (en) Terminal and multi-task data display method
US7936790B2 (en) Synchronizing related data streams in interconnection networks
US10341672B2 (en) Method and system for media synchronization
CN109729373B (en) Streaming media data mixing method and device, storage medium and computer equipment
CN104186014B (en) A kind of method and device that net cast synchronizes
US10887646B2 (en) Live streaming with multiple remote commentators
KR20080076823A (en) Multicasting delivery system and multicasting delivery method
JP2010171697A (en) Video delivering system, video delivering device, and synchronization correcting processing device
CN102137278B (en) System and method for broadcasting and distributing streaming media based on mobile terminal
KR102167869B1 (en) Apparatus and method for receiving broadcast content from a broadcast stream and an alternate location
US11553215B1 (en) Providing alternative live media content
US11201903B1 (en) Time synchronization between live video streaming and live metadata
CN108293145A (en) Video distribution synchronizes
JP2015082845A (en) Method and device for ip video signal synchronization
JP2018509818A (en) Dynamic time window and cache mechanism in heterogeneous network transmission
KR20180090719A (en) Method and system for media synchronization
CN110971926A (en) Video playing method, video processing device and storage medium
CN107948703B (en) Method and device for synchronizing playing progress
CN108476333A (en) The adjacent streaming of Media Stream
US20170180769A1 (en) Simultaneous experience of media
JP5997500B2 (en) Broadcast communication cooperative receiver
CN105991469B (en) Dynamic time window and caching mechanism under a kind of heterogeneous network transmission

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant