EP3416335B1 - Optimization method and system on basis of network status of push terminal and push terminal - Google Patents

Optimization method and system on basis of network status of push terminal and push terminal

Info

Publication number
EP3416335B1
Authority
EP
European Patent Office
Prior art keywords
time
streaming media
data
transmitted
media data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP17894674.5A
Other languages
German (de)
French (fr)
Other versions
EP3416335A1 (en)
EP3416335A4 (en)
Inventor
Weikun DU
Guohuangshou CHEN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wangsu Science and Technology Co Ltd
Original Assignee
Wangsu Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wangsu Science and Technology Co Ltd filed Critical Wangsu Science and Technology Co Ltd
Publication of EP3416335A1 publication Critical patent/EP3416335A1/en
Publication of EP3416335A4 publication Critical patent/EP3416335A4/en
Application granted granted Critical
Publication of EP3416335B1 publication Critical patent/EP3416335B1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L43/00 Arrangements for monitoring or testing data switching networks
    • H04L43/08 Monitoring or testing based on specific metrics, e.g. QoS, energy consumption or environmental parameters
    • H04L43/0876 Network utilisation, e.g. volume of load or congestion level
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00 Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/08 Configuration management of networks or network elements
    • H04L41/0896 Bandwidth or capacity management, i.e. automatically increasing or decreasing capacities
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H04L65/65 Network streaming protocols, e.g. real-time transport protocol [RTP] or real-time control protocol [RTCP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H04L65/70 Media network packetisation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H04L65/75 Media network packet handling
    • H04L65/752 Media network packet handling adapting media to network capabilities
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H04L65/75 Media network packet handling
    • H04L65/762 Media network packet handling at the source
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/80 Responding to QoS
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W24/00 Supervisory, monitoring or testing arrangements
    • H04W24/02 Arrangements for optimising operational condition

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Environmental & Geological Engineering (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)

Description

    FIELD OF THE DISCLOSURE
  • The present disclosure generally relates to the field of network technology and, more particularly, relates to a network status of stream push terminal-based optimization method, a system, and a stream push terminal thereof.
  • BACKGROUND
  • In recent years, with the rapid development of Internet technology, various streaming media applications have gradually entered the daily life of the majority of Internet users. Many real-time businesses such as live streaming have attracted a large number of users with their original and innovative content. Therefore, how to reduce audio and video delay and improve the user experience has become the development direction of audio and video live streaming services.
  • At present, with the ongoing acceleration of national broadband and the rapid development of mobile communication technologies, people are no longer subject to geographical constraints. They can access text, pictures, and videos through the Internet anytime and anywhere, which greatly enriches their leisure time. With the popularization of smartphones and the rapid development of mobile bandwidth, pushing streams from mobile phones has become an important method of video streaming, and more and more people have begun to use mobile phones to broadcast live video.
  • However, existing mobile networks are unstable, highly variable, and easily affected by the environment and location, resulting in unstable stream push on mobile terminals. In addition, the upload bandwidth of mobile stream push terminals is uncontrollable and thus easily affected by external factors. In the event of bandwidth fluctuations, the viewing terminal is prone to lagging, a blurred screen, and increased delay, which affect the smoothness of video playback and thus the user experience. The document US 2014/0146676 A1 discloses a system that comprises a network device adapted to receive a media stream that includes a plurality of network packets. The network packets are transmitted or dropped based at least on a priority level associated with each network packet in the plurality of network packets.
  • BRIEF SUMMARY OF THE DISCLOSURE
  • To solve the problems in the existing technology, embodiments of the present disclosure provide an optimization method based on the network status of a stream push terminal, an optimization system based on the network status of a stream push terminal, and a stream push terminal thereof. The technical solutions are as indicated in the claims that follow.
  • The technical solutions provided by the embodiments of the present disclosure give rise to the following beneficial effects: they allow real-time inspection of the network status of the stream push terminal, real-time monitoring of variations in the upstream bandwidth, control of production from the source, reduction of the production volume, and reduction of the amount of uploading data. This allows clients to push streams smoothly even when the upstream bandwidth of the stream push terminal is not optimal, which greatly improves the viewing experience of users. When the bandwidth is detected to be improving, frame dropping is stopped, so that the quality of the streams pushed by the stream push terminal is ensured.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • To make the technical solutions in the embodiments of the present disclosure clearer, a brief introduction of the accompanying drawings consistent with descriptions of the embodiments will be provided hereinafter. It is to be understood that the following described drawings are merely some embodiments of the present disclosure. Based on the accompanying drawings and without creative efforts, persons of ordinary skill in the art may derive other drawings.
    • FIG. 1 is a flowchart of a network status of stream push terminal-based optimization method according to one embodiment of the present disclosure;
    • FIG. 2 is a flowchart of detailed sub-steps of Step S1 shown in FIG. 1 according to one embodiment of the present disclosure;
    • FIG. 3 is a schematic diagram of an internal structure of a network status of stream push terminal-based optimization system 10 according to one embodiment of the present disclosure; and
    • FIG. 4 is a schematic diagram of an internal structure of a determination module 111 of a stream push terminal 11 shown in FIG. 3 according to one embodiment of the present disclosure.
    DETAILED DESCRIPTION
  • To make the objectives, technical solutions, and advantages of the present disclosure clearer, specific embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are merely used to explain the present disclosure and shall not be construed as limiting the present disclosure.
  • A network status of stream push terminal-based optimization method consistent with the present disclosure will be described in detail hereinafter.
  • Referring to FIG. 1, a flowchart of a network status of stream push terminal-based optimization method according to one embodiment of the present disclosure is provided.
  • In one embodiment, the network status of stream push terminal-based optimization method is applied to a network status of stream push terminal-based optimization system, where the network status of stream push terminal-based optimization system includes a stream push terminal, a viewing terminal, and a relay terminal. The stream push terminal pushes the collected streaming media format data, through the relay terminal, to the viewing terminal for viewing. The stream push terminal refers to a terminal device of a user, such as a mobile phone, a tablet, or a personal digital assistant, etc. The relay terminal may include various transit devices in the network, such as routers and servers, and may specifically be a CDN (content delivery network) streaming media cluster. The viewing terminal may include various receiving and display terminals for streaming media data, such as mobile display terminals (e.g., a mobile phone, a tablet, etc.) and non-mobile display terminals (e.g., a desktop computer, etc.). The stream push terminal includes at least a voice collection device (e.g., a microphone, etc.) and a video capture device (e.g., a camera, etc.), through which the streaming media format data is collected. The collected streaming media format data is pushed, through forwarding by the relay terminal, to the viewing terminal for viewing.
  • The method of the present disclosure mainly optimizes the stream push terminal, that is, it optimizes from the source. The function of the stream push terminal may be divided into three processes: collection, encoding, and transmission, which affect one another. Collection comes first, then encoding, and then transmission; this is a production-and-consumption procedure. Starting from the transmission process, the present disclosure determines, based on the eventual consumption, whether the preceding processes should drop frames, and controls production accordingly, that is, it controls the collection frequency directly at the collection process.
  • At Step S1, the stream push terminal determines the time for transmission of the current data based on the present real-time upstream bandwidth.
  • In one embodiment, the Step S1 of determining, by the stream push terminal, the time for transmission of the current data based on the present real-time upstream bandwidth specifically includes seven sub-steps S11-S17, as shown in FIG. 2.
  • Referring to FIG. 2, a flowchart of detailed sub-steps of Step S1 shown in FIG. 1 according to one embodiment of the present disclosure is illustrated.
  • At Step S11, the stream push terminal collects audio data and/or video data.
  • In one embodiment, the stream push terminal collects voice data, such as pulse code modulation (PCM) format voice data, through its voice collection device (e.g., a microphone, etc.). At the same time, the stream push terminal collects video data in a format like YUV or voice data in a format like PCM in a callback manner through its video capture device (e.g., a camera, etc.) or an interface of the system.
  • In one embodiment, a specific collection process may be, for example, as follows (a short sketch of the parameter arithmetic follows this list):
    1. (1) For video: collect video data through its video capture device (e.g., a camera, etc.), and set up the frame rate, the width and height, etc. For example, the frame rate is 25 frames/sec, the width and height are 720*1080, and the system calls back YUV data every 1000 ms / 25 = 40 ms.
    2. (2) For audio: collect voice data through its voice collection device (e.g., a microphone, etc.), and set up the sampling rate, the number of channels, the number of bits, etc. For example, the sampling rate is 44,100, the number of channels is 2, and the number of bits is 16. That is, 44,100 samples are collected per second, and each sample has 2 channels, each with 16 bits.
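  • As a quick check of the arithmetic in the example above, the following sketch (our own illustration; the constant names are not from the patent) computes the YUV callback interval and the raw PCM byte rate from the stated collection parameters:

```python
# Illustrative sketch: derive the capture timing and raw audio rate from the
# example parameters (25 fps video; 44,100 Hz, 2-channel, 16-bit PCM audio).
FRAME_RATE = 25                 # frames per second
SAMPLE_RATE = 44_100            # audio samples per second
CHANNELS = 2
BITS_PER_SAMPLE = 16

callback_interval_ms = 1000 / FRAME_RATE                              # 1000 / 25 = 40 ms
pcm_bytes_per_second = SAMPLE_RATE * CHANNELS * BITS_PER_SAMPLE // 8  # 176,400 bytes/s

print(f"YUV callback every {callback_interval_ms:.0f} ms")
print(f"Raw PCM rate: {pcm_bytes_per_second} bytes/s")
```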
  • At Step S12, the collected audio data and/or video data are respectively audio-encoded and/or video-encoded. For example, the YUV and PCM data are respectively encoded into H.264 and AAC data using encoding tools such as FFMPEG, X264, FAAC, and hardware encoding.
  • In one embodiment, an H.264 video encoding method is used to encode the collected video data into H.264 format video data, and an advanced audio coding (AAC) method is used to encode the collected audio data into AAC format audio data.
  • In one embodiment, a specific encoding process may be, for example, as follows:
    1. (1) Video encoding: initialize the video encoding parameters, including the width and height, the frame rate, the bit rate, and the bit rate mode, etc., call the encoding interface of an encoder, encode the YUV data called back by the system into H.264 data, and encapsulate the H.264 data into a streaming media format through a streaming protocol submodule. For example, the width and height are 720*1080, the frame rate is 25, the bit rate is 1 Mbps, etc., and the YUV data is encoded into H.264 data using X264, FFMPEG, or hardware encoding.
    2. (2) Audio encoding: initialize the audio encoding parameters, including the sampling rate, the number of channels, and the number of bits, etc., and encode the PCM data called back by the system into AAC data through the encoding interface of an AAC encoder. For example, the sampling rate is 44,100, the number of channels is 2, the number of bits is 16, and the PCM data is encoded into AAC data using encoders such as FAAC or hardware encoding.
    3. (3) Sequentially place the audio/video data into the streaming protocol submodule based on the sequence of collection, and convert the audio/video data into streaming media format data.
  • At Step S13, the encoded audio data and/or the encoded video data are encapsulated into streaming media format data. Depending on the streaming protocol used, such as the RTMP (real-time messaging protocol) streaming protocol, the H.264 and AAC data are encapsulated into data conforming to the RTMP protocol.
  • In one embodiment, the RTMP protocol is employed to encapsulate the encoded audio data and/or the encoded video data into RTMP format streaming media data.
  • In one embodiment, a specific encapsulation process may be, for example, as follows:
    1. (1) Establish a connection with a server using the RTMP protocol, and acquire an address for broadcasting streams.
    2. (2) Retrieve the H.264 data, the AAC data, etc. that are generated by the encoders, generate a sequence based on the time of collection by the encoders, and based on the time in the sequence, encapsulate the H.264 data, the AAC data, etc., respectively, according to the RTMP protocol.
    3. (3) Place the encapsulated streaming media format data into a to-be-transmitted queue. When placing the encapsulated streaming media format data into the queue, the encoding information, such as the width and height of the video, the bit rate, the frame rate, the audio sampling rate, and the number of channels, is placed into the queue as well. The data placed into the queue may wait for data transmission by the transmission module.
  • At Step S14, the encapsulated streaming media format data is placed into the to-be-transmitted queue.
  • In one embodiment, a temporary storage area is assigned in the memory of the stream push terminal for temporarily storing the streaming media data. The streaming media data encapsulated according to the RTMP protocol is placed into the to-be-transmitted queue, and the to-be-transmitted queue is stored in the temporary storage area of the memory.
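  • As a rough illustration of the to-be-transmitted queue described above (a minimal sketch of our own; the class and field names are hypothetical, not from the patent), the encapsulated packets can be kept in an in-memory queue together with their encoding information until the transmission step retrieves them:

```python
# Minimal, hypothetical sketch of a to-be-transmitted queue held in a temporary
# storage area in memory. Each entry carries the RTMP-encapsulated payload plus
# the encoding information placed into the queue alongside it.
from collections import deque
from dataclasses import dataclass, field
from typing import Any, Deque, Dict

@dataclass
class MediaPacket:
    payload: bytes                                       # RTMP-encapsulated audio/video data
    timestamp_ms: int                                    # collection/encoding time, used for ordering
    info: Dict[str, Any] = field(default_factory=dict)   # width/height, bit rate, frame rate, sampling rate, channels

class ToBeTransmittedQueue:
    def __init__(self) -> None:
        self._packets: Deque[MediaPacket] = deque()      # temporary storage area in memory

    def put(self, packet: MediaPacket) -> None:
        self._packets.append(packet)                     # enqueue in collection order

    def get(self) -> MediaPacket:
        return self._packets.popleft()                   # transmission retrieves packets sequentially

    def buffered_bytes(self) -> int:
        # Amount of buffered remaining data, used later to estimate the transmission time.
        return sum(len(p.payload) for p in self._packets)

    def __len__(self) -> int:
        return len(self._packets)
```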
  • At Step S15, the streaming media format data is retrieved from the to-be-transmitted queue and is transmitted to the relay terminal.
  • At Step S16, the present real-time upstream bandwidth is calculated by determining in real-time an amount of streaming media format data transmitted in a unit of time.
  • In one embodiment, a specific calculation process may be, for example, as follows:
    1. (1) After retrieving the data from the queue, transmit a frame and record the time used for the transmission and the size of the transmitted data.
    2. (2) Because the delay is determined by the real-time transmission rate and the data buffered in the to-be-transmitted queue, a period for counting/calculation needs to be configured, which is called the unit of time. Based on the configured unit of time, determine the total data transmitted within the current unit of time; then the total data / the total duration = the transmission rate.
    3. (3) Calculate the amount of buffered remaining data in the to-be-transmitted queue, and determine the time for transmission of the current to-be-transmitted streaming media data based on the calculated upstream bandwidth and the data amount of the buffered remaining streaming media data in the to-be-transmitted queue.
  • In one embodiment, the unit of time can be flexibly configured based on the requirements. For example, the unit of time is 1 s. Partial data may be retrieved from the streaming media data in the to-be-transmitted queue and placed into the transmission module for transmission, and the amount of that part of the streaming media data transmitted in a unit of time may be determined in real time. For example, if 8 Mb of streaming media data is transmitted within a unit of time of 1 s, the calculated transmission rate for upstream data is 8 Mbps, so the present upstream bandwidth is 1 MB/s.
  • At Step S17, the time for transmission of the current data is determined based on the present real-time upstream bandwidth.
  • In one embodiment, after partial data has been transmitted to determine the upstream bandwidth, the amount of buffered remaining data in the to-be-transmitted queue is calculated. Based on the calculated upstream bandwidth and the amount of the buffered remaining streaming media data in the to-be-transmitted queue, the time for transmission of the current to-be-transmitted streaming media data is then calculated. For example, the unit of time is 1 s, the streaming media data in the to-be-transmitted queue has a total size of 12 Mb, and the transmission rate of the upstream data calculated from the previously transmitted part of the data (e.g., 8 Mb) in a unit of time is 8 Mbps, that is, the upstream bandwidth is 1 MB/s. The calculated amount of the remaining streaming media data is then 4 Mb. Therefore, based on the present real-time upstream bandwidth and the amount of the remaining streaming media data, the calculated time for transmission of the current to-be-transmitted streaming media data is 4 Mb / 8 Mbps = 500 ms.
  • That is, it is predicted to take 500 ms to finish transmission of the remaining streaming media data.
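  • A minimal sketch (ours, under the assumptions of the example above) of Steps S16 and S17: transmit data for one unit of time, derive the real-time upstream bandwidth from what was actually sent, and estimate how long the buffered remainder would take; `send` is a placeholder for the actual RTMP transmission call:

```python
# Hypothetical sketch of Steps S16/S17: measure the upstream rate over one unit of
# time and estimate the transmission time of the data still buffered in the queue.
import time
from collections import deque
from typing import Callable, Deque, Tuple

def estimate_transmission_time(queue: Deque[bytes],
                               send: Callable[[bytes], None],
                               unit_of_time_s: float = 1.0) -> Tuple[float, float]:
    sent_bytes = 0
    start = time.monotonic()
    while queue and time.monotonic() - start < unit_of_time_s:
        payload = queue.popleft()
        send(payload)                                # transmit one frame ...
        sent_bytes += len(payload)                   # ... and record its size

    elapsed = max(time.monotonic() - start, 1e-6)
    bandwidth_bytes_per_s = sent_bytes / elapsed     # present real-time upstream bandwidth
    remaining_bytes = sum(len(p) for p in queue)     # buffered remaining data
    if bandwidth_bytes_per_s == 0:
        return 0.0, float("inf")
    est_ms = remaining_bytes / bandwidth_bytes_per_s * 1000.0
    return bandwidth_bytes_per_s, est_ms

# With the example figures: 8 Mb (1 MB) sent in 1 s gives an upstream bandwidth of
# 8 Mbps (1 MB/s), and 4 Mb (0.5 MB) remaining yields an estimate of 4/8 s = 500 ms.
```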
  • Referring back to FIG. 1, at Step S2, the stream push terminal evaluates whether the time for transmission of the current data exceeds a predefined value.
  • In one embodiment, the predefined value can be flexibly defined based on the requirements. For example, the predefined value is 400 ms.
  • At Step S3, if the time for transmission of the current data exceeds the predefined value, the frame dropping approach is employed to reduce the amount of uploading data.
  • In one embodiment, a specific frame dropping process may be, for example, as follows:
    1. (1) Based on the total amount of data currently buffered and the current real-time transmission rate, calculate: the total amount of data/the real-time transmission rate = the time for transmission. The predefined time for transmission is denoted as T1, and the present time for transmission is denoted as Tn.
    2. (2) When encoding, initialize a frame rate of k, predefine a frame dropping lower limit k1 and an upper limit k2 as the current predefined frame rates, and denote the current actual frame rate as kn. Initialize kn = k; the range for the current frame rate lies between k1 and k2, that is, k1 ≤ kn ≤ k2. In general, by default, k2 = k; that is, the upper limit k2 is initialized to the initial frame rate. The new frame rate is kn = kn/(Tn/T1), and this frame rate changes in real time.
    3. (3) If the time for transmission of the current data exceeds the predefined value, that is, Tn > T1, the frame dropping approach is employed to reduce the amount of uploading data. Specific frame dropping process includes: if kn > k2, then kn = k2; and if kn < k1, then kn = k1.
    4. (4) If the time for transmission of the current data does not exceed the predefined value, that is, Tn < T1, then gradually increase the frame rate until it recovers to the current maximum frame rate. The formula for the increasing process is: new kn = kn/(Tn/T1).
  • In one embodiment, for example, the initialized frame rate is set as 25, the upper limit is 25, and the lower limit is 10. Assume that the calculated time for transmission of the current data is 500 ms, as stated in the above example. Since 500 ms is greater than the predefined value of 400 ms, if all data were uploaded, the limited bandwidth could lead to a poor viewing experience, such as lagging, a blurred screen, or increased delay, in the pushed live streams received by the viewing terminal. Therefore, in order to improve the viewing experience, the frame dropping module is activated, and the new frame rate is kn = kn/(Tn/T1). Based on the formula, the new kn = 25/(500/400), that is, kn = 20 frames/sec, or 20 frames are generated per second. The amount of uploading data is reduced by the frame dropping approach until the time for transmission of the buffered remaining data is less than the predefined value of 400 ms. If the time for transmission of the current data is later calculated as 300 ms and the current frame rate is 20, then the new frame rate kn = kn/(Tn/T1) = 20/(300/400) ≈ 26 frames/sec. Because kn > the upper limit k2, kn = k2 = 25 frames/sec.
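  • The frame-rate rule above can be summarized in a short sketch (our own; the function and parameter names are illustrative): the new rate is kn = kn/(Tn/T1), clamped between the lower and upper limits k1 and k2, so frames are dropped while the estimated transmission time exceeds the predefined value and the rate recovers once it falls back below it:

```python
# Illustrative sketch of the frame-rate control rule kn = kn / (Tn / T1),
# clamped to the predefined lower/upper limits [k1, k2].
def adjust_frame_rate(kn: float, tn_ms: float, t1_ms: float = 400.0,
                      k1: float = 10.0, k2: float = 25.0) -> float:
    """Return the new target frame rate from the current rate and transmission times."""
    if tn_ms <= 0:
        return k2                       # nothing buffered: run at the upper limit
    new_kn = kn / (tn_ms / t1_ms)       # kn = kn / (Tn / T1)
    return min(max(new_kn, k1), k2)     # clamp to k1 <= kn <= k2

# Reproducing the worked example: 25 fps with Tn = 500 ms -> 25/(500/400) = 20 fps;
# later, 20 fps with Tn = 300 ms -> about 26.7 fps, clamped back to the 25 fps upper limit.
```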
  • At Step S4, if the time for transmission of the current data does not exceed the predefined value and the frame rate reaches the upper limit, transmission of the remaining streaming media format data continues sequentially from the to-be-transmitted queue.
  • In one embodiment, a specific transmission process may be, for example, as follows:
    1. (1) The RTMP protocol-based connection with the server has already been established using the RTMP streaming protocol. To-be-transmitted streaming media format data is cyclically retrieved from the to-be-transmitted queue and transmitted to the streaming media server.
    2. (2) The server receives the streaming media format data from the client terminal, and distributes it to each client terminal.
  • In one embodiment, if the calculated time for transmission of the current data is 150 ms, which is less than the predefined value of 400 ms, it means that the current upstream bandwidth is sufficient. In order to ensure the quality of the pushed streams at the stream push terminal, frame dropping is stopped, and the frame rate is increased. Further, sequential transmission of the remaining streaming media format data from the to-be-transmitted queue continues, to ensure the smoothness of the stream push service and improve the viewing experience in the viewing terminal.
  • The TCP-based network status of stream push terminal-based optimization method provided by the present disclosure allows real-time inspection of the network status of the stream push terminal, real-time monitoring of variations in the upstream bandwidth, control of production from the source, reduction of the production volume, and reduction of the amount of uploading data. This allows clients to push streams smoothly even when the upstream bandwidth of the stream push terminal is not optimal, which greatly improves the viewing experience of users. When the bandwidth is detected to be improving, frame dropping is stopped, so that the quality of the streams pushed by the stream push terminal is ensured.
  • A network status of stream push terminal-based optimization system consistent with the present disclosure will be described in detail hereinafter.
  • Referring to FIG. 3, a schematic diagram of an internal structure of a TCP-based network status of stream push terminal-based optimization system 10 according to one embodiment of the present disclosure is illustrated.
  • In one embodiment, the network status of stream push terminal-based optimization system 10 includes a stream push terminal 11, a viewing terminal 13, and a relay terminal 12, where the stream push terminal 11 pushes the collected streaming media format data, through the relay terminal 12, to the viewing terminal 13 for viewing. The stream push terminal 11 refers to a terminal device of a user, including a mobile phone, a tablet, and a personal digital assistant, etc. The relay terminal 12 may include various transit devices in the network, such as routers and servers. The viewing terminal 13 may include various receiving and display terminals for streaming media data, such as mobile display terminals (e.g., a mobile phone, a tablet, etc.) and non-mobile display terminals (e.g., a desktop computer, etc.). The stream push terminal 11 includes at least a voice collection device (e.g., a microphone, etc.) and a video capture device (e.g., a camera, etc.), through which the streaming media format data is collected. The collected streaming media format data is pushed, through forwarding by the relay terminal 12, to the viewing terminal for viewing. Here, the relay terminal 12 may include multiple units. The stream push terminal 11 pushes the collected streaming media format data, through a single relay terminal 12 or multiple relay terminals 12, to the viewing terminal 13 for viewing.
  • The system of the present disclosure mainly optimizes the stream push terminal 11, that is, it optimizes from the source. The function of the stream push terminal 11 may be divided into three processes: collection, encoding, and transmission, which affect one another. Collection comes first, then encoding, and then transmission; this is a production-and-consumption procedure. Starting from the transmission process, the present disclosure determines, based on the eventual consumption, whether the preceding processes should drop frames, and controls production accordingly, that is, it controls the collection frequency directly at the collection process.
  • The stream push terminal 11 specifically includes: a determination module 111, an evaluation module 112, a frame dropping module 113, and a transmission module 114.
  • The determination module 111 is configured to determine the time for transmission of the current data based on the present real-time upstream bandwidth.
  • Here, the determination module 111 specifically includes: a collection submodule 1111, an encoding submodule 1112, a streaming protocol submodule 1113, and a bandwidth inspection submodule 1114, as shown in FIG. 4.
  • Referring to FIG. 4, a schematic diagram of an internal structure of a determination module 111 of a stream push terminal 11 shown in FIG. 3 according to one embodiment of the present disclosure is illustrated.
  • The collection submodule 1111 is configured to collect audio data and/or video data.
  • In one embodiment, the collection submodule 1111 collects voice data, such as PCM format voice data, through its voice collection device (e.g., a microphone, etc.). At the same time, the collection submodule 1111 collects video data in a format like YUV or voice data in a format like PCM in a callback manner through its video capture device (e.g., a camera, etc.) or an interface of the system.
  • In one embodiment, a specific collection process may be, for example, as follows:
    1. (1) For video: collect video data through its video capture device (e.g., a camera, etc.), and set up the frame rate, the width and height, etc. For example, the frame rate is 25 frames/sec, the width and height are 720*1080, and the system calls back YUV data every 1000 ms / 25 = 40 ms.
    2. (2) For audio: collect voice data through its voice collection device (e.g., a microphone, etc.), and set up the sampling rate, the number of channels, the number of bits, etc. For example, the sampling rate is 44,100, the number of channels is 2, and the number of bits is 16. That is, 44,100 samples are collected per second, and each sample has 2 channels, each with 16 bits.
  • The encoding submodule 1112 is configured to audio-encode and/or video-encode the collected audio data and/or video data, respectively. For example, the YUV and PCM data are respectively encoded into H.264 and AAC data using encoding tools such as FFMPEG, X264, FAAC, and hardware encoding.
  • In one embodiment, an H.264 video encoding method is used to encode the collected video data into H.264 format video data, and an AAC method is used to encode the collected audio data into AAC format audio data.
  • In one embodiment, a specific encoding process may be, for example, as follows:
    1. (1) Video encoding: initialize the video encoding parameters, including the width and height, the frame rate, the bit rate, and the bit rate mode, etc., call the encoding interface of an encoder, encode the YUV data called back by the system into H.264 data, and encapsulate the H.264 data into a streaming media format through a streaming protocol submodule. For example, the width and height are 720*1080, the frame rate is 25, the bit rate is 1 Mbps, etc., and the YUV data is encoded into H.264 data using X264, FFMPEG, or hardware encoding.
    2. (2) Audio encoding: initialize the audio encoding parameters, including the sampling rate, the number of channels, and the number of bits, etc., and encode the PCM data called back by the system into AAC data through the encoding interface of an AAC encoder. For example, the sampling rate is 44,100, the number of channels is 2, the number of bits is 16, and the PCM data is encoded into AAC data using encoders such as FAAC or hardware encoding.
    3. (3) Sequentially place the audio/video data into the streaming protocol submodule based on the sequence of collection, and convert the audio/video data into streaming media format data.
  • The streaming protocol submodule 1113 is configured to encapsulate the encoded audio data and/or the encoded video data into streaming media format data. Depending on the streaming protocol used, such as the RTMP streaming protocol, the H.264 and AAC data are encapsulated into data conforming to the RTMP protocol.
  • In one embodiment, the RTMP protocol is employed to encapsulate the encoded audio data and/or the encoded video data into RTMP format streaming media data.
  • In one embodiment, a specific encapsulation process may be, for example, as follows:
    1. (1) Establish a connection with a server using the RTMP protocol, and acquire an address for broadcasting streams.
    2. (2) Retrieve the H.264 data, the AAC data, etc. that are generated by the encoders, generate a sequence based on the time of collection by the encoders, and based on the time in the sequence, encapsulate the H.264 data, the AAC data, etc., respectively, according to the RTMP protocol.
    3. (3) Place the encapsulated streaming media format data into a to-be-transmitted queue. When placing the encapsulated streaming media format data into the queue, the encoding information, such as the width and height of the video, the bit rate, the frame rate, the audio sampling rate, and the number of channels, etc., is placed into the queue as well. The data placed into the queue may wait for data transmission by the transmission module.
  • The bandwidth inspection submodule 1114 is configured to place the encapsulated streaming media format data into the to-be-transmitted queue, calculate the present real-time upstream bandwidth by determining in real-time an amount of streaming media format data transmitted in a unit of time, and determine the time for transmission of the current data based on the present real-time upstream bandwidth.
  • In one embodiment, a temporary storage area is assigned in the memory of the stream push terminal 11 for temporarily storing the streaming media data. The streaming media data encapsulated according to the RTMP protocol is placed into the to-be-transmitted queue, and the to-be-transmitted queue is stored in the temporary storage area of the memory.
  • In one embodiment, a specific calculation process may be, for example, as follows:
    1. (1) After retrieving the data from the queue, transmit a frame and record the time for the transmission and the size of the transmitted data.
    2. (2) Because the delay is determined by the real-time transmission rate and the data buffered in the to-be-transmitted queue, a period for counting/calculation needs to be configured, which is called the unit of time. Based on the configured unit of time, determine the total data transmitted within the current unit of time; then the total data / the total duration = the transmission rate.
    3. (3) Calculate the amount of buffered remaining data in the to-be-transmitted queue, and determine the time for transmission of the current to-be-transmitted streaming media data based on the calculated upstream bandwidth and the data amount of the buffered remaining streaming media data in the to-be-transmitted queue.
  • In one embodiment, the unit of time can be flexibly configured based on the requirements. For example, the unit of time is 1 s. Partial data may be retrieved from the streaming media data in the to-be-transmitted queue and placed into the transmission module for transmission, and the amount of that part of the streaming media data transmitted in a unit of time is determined in real time. For example, if 8 Mb of streaming media data is transmitted within a unit of time of 1 s, the calculated transmission rate for upstream data is 8 Mbps, so the present upstream bandwidth is 1 MB/s.
  • In one embodiment, after partial data has been transmitted to determine the upstream bandwidth, the amount of buffered remaining data in the to-be-transmitted queue is calculated. Based on the calculated upstream bandwidth and the amount of the buffered remaining streaming media data in the to-be-transmitted queue, the time for transmission of the current to-be-transmitted streaming media data is then calculated. For example, the streaming media data in the to-be-transmitted queue has a total size of 12 Mb, and the transmission rate of the upstream data calculated from the previously transmitted part of the data (e.g., 8 Mb) is 8 Mbps, that is, the upstream bandwidth is 1 MB/s. The remaining streaming media data then amounts to 4 Mb. Therefore, based on the present real-time upstream bandwidth and the amount of the remaining streaming media data, the calculated time for transmission of the current to-be-transmitted streaming media data is 4 Mb / 8 Mbps = 500 ms. That is, it is predicted to take 500 ms to finish transmission of the remaining streaming media data.
  • Referring back to FIG. 3, the evaluation module 112 is configured to evaluate whether the time for transmission of the current data exceeds a predefined value.
  • In one embodiment, the predefined value can be flexibly defined based on the requirements. For example, the predefined value is 400 ms.
  • The frame dropping module 113 is configured to employ a frame dropping approach to reduce the amount of uploading data if the time for transmission of the current data exceeds the predefined value.
  • In one embodiment, a specific frame dropping process may be, for example, as follows:
    1. (1) Based on the total amount of data currently buffered, and the present real-time transmission rate, calculate: the total amount of data/the real-time transmission rate = the time for transmission. The predefined time for transmission is denoted as T1, and the present time for transmission is denoted as Tn.
    2. (2) When encoding, initialize a frame rate of k, predefine a frame dropping lower limit k1 and an upper limit k2 as the current predefined frame rates, and denote the current actual frame rate as kn. Initialize kn = k; the current frame rate range lies between k1 and k2, that is, k1 ≤ kn ≤ k2. In general, by default, k2 = k; that is, the upper limit k2 is initialized to the initial frame rate. The new frame rate is kn = kn/(Tn/T1), and this frame rate changes in real time.
    3. (3) If the time for transmission of the current data exceeds the predefined value, that is, Tn > T1, the frame dropping approach is employed to reduce the amount of uploading data. Specific frame dropping process includes: if kn > k2, then kn = k2; and if kn < k1, then kn = k1.
    4. (4) If the time for transmission of the current data does not exceed the predefined value, that is, Tn < T1, then gradually increase the frame rate until it recovers to the current maximum frame rate. The formula for the increasing process is: new kn = kn/(Tn/T1).
  • In one embodiment, for example, the initialized frame rate is set as 25, the upper limit is 25, and the lower limit is 10. Assume that the calculated time for transmission of the current data is 500 ms, as stated in the above example. Since 500 ms is greater than the predefined value of 400 ms, if all data were uploaded, the limited bandwidth could lead to a poor viewing experience, such as lagging, a blurred screen, or increased delay, in the pushed live streams received by the viewing terminal 13. Therefore, in order to improve the viewing experience, the frame dropping module is activated, and the new frame rate is kn = kn/(Tn/T1). Based on the formula, the new kn = 25/(500/400), that is, kn = 20 frames/sec, or 20 frames are generated per second. The amount of uploading data is reduced by the frame dropping approach until the time for transmission of the buffered remaining data is less than the predefined value of 400 ms. If the time for transmission of the current data is later calculated as 300 ms and the current frame rate is 20, then the new frame rate kn = kn/(Tn/T1) = 20/(300/400) ≈ 26 frames/sec. Because kn > the upper limit k2, kn = k2 = 25 frames/sec.
  • The transmission module 114 is configured to continue transmitting the remaining streaming media format data sequentially from the to-be-transmitted queue if the time for transmission of the current data does not exceed the predefined value and the frame rate reaches the upper limit.
  • In one embodiment, a specific transmission process may be, for example, as follows:
    1. (1) The RTMP protocol-based connection with the server has already been established using the RTMP streaming protocol. To-be-transmitted streaming media format data is cyclically retrieved from the to-be-transmitted queue and transmitted to the streaming media server.
    2. (2) The server receives the streaming media format data from the client terminal, and distributes it to each client terminal.
  • In one embodiment, if the calculated time for transmission of the current data is 150 ms, which is less than the predefined value of 400 ms, it means that the current upstream bandwidth is sufficient. In order to ensure the quality of the pushed streams in the stream push terminal 11, frame dropping is stopped, and the frame rate is increased. Further, sequential transmission of the remaining streaming media format data from the to-be-transmitted queue continues, to ensure the smoothness of the stream push service and improve the viewing experience in the viewing terminal.
  • The TCP-based network status of stream push terminal-based optimization system 10 provided by the present disclosure allows real-time inspection of the network status of the stream push terminal, real-time monitoring of variations in the upstream bandwidth, control of production from the source, reduction of the production volume, and reduction of the amount of uploading data. This allows clients to push streams smoothly even when the upstream bandwidth of the stream push terminal is not optimal, which greatly improves the viewing experience of users. When the bandwidth is detected to be improving, frame dropping is stopped, so that the quality of the streams pushed by the stream push terminal is ensured.
  • Embodiments of the above-described mechanisms are described merely for the illustrative purpose. The modules that are described as separate parts may or may not be physically separated, and the parts illustrated as modules may or may not be physical modules. That is, these modules may be located in one location, or distributed across multiple network entities. Based on actual needs, some or all of the modules may be selected to achieve the objectives of the present embodiments, which those of ordinary skill in the art may understand and implement without taking creative efforts.
  • Through the foregoing description of the embodiments, it is clear to those skilled in the art that each embodiment can be implemented by means of software plus a necessary general hardware platform, and certainly, by means of hardware as well. Based on this understanding, the technical solutions, or essentially the parts that contribute to the current technology, can be embodied in the form of a software product. This computer software product may be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disc, an optical disc, etc., and include a variety of instructions that cause a computing device (which may be a personal computer, a server, or a network device, etc.) to implement each embodiment or methods described in certain parts of each embodiment.

Claims (10)

  1. An optimization method based on the network status of a stream push terminal (11), the method comprising:
    calculating, by the stream push terminal (11), a present real-time upstream bandwidth by determining in real-time an amount of streaming media data retrieved from a to-be-transmitted queue and transmitted to a relay terminal (12) in a unit of time;
    determining, by the stream push terminal (11), time for transmission of current to-be-transmitted streaming media data in the to-be-transmitted queue based on the present real-time upstream bandwidth;
    evaluating, by the stream push terminal (11), whether the time for transmission of the current to-be-transmitted streaming media data exceeds a predefined value; and
    if the time for transmission of the current to-be-transmitted streaming media data exceeds the predefined value, employing a frame dropping approach to reduce an amount of uploading data.
  2. The optimization method based on the network status of a stream push terminal (11) according to claim 1, wherein before calculating a present real-time upstream bandwidth by determining in real-time an amount of streaming media data retrieved from a to-be-transmitted queue and transmitted to a relay terminal (12) in a unit of time, the method further comprises:
    collecting, by the stream push terminal (11), audio data and/or video data;
    audio-encoding and/or video-encoding, by the stream push terminal (11), the collected audio data and/or video data, respectively;
    encapsulating, by the stream push terminal (11), the encoded audio data and/or the encoded video data into streaming media data; and
    placing, by the stream push terminal (11), the encapsulated streaming media data into the to-be-transmitted queue.
  3. The optimization method based on the network status of a stream push terminal (11), according to claim 2, further comprising:
    if the time for transmission of the current to-be-transmitted streaming media data does not exceed the predefined value, continuing transmitting, by the stream push terminal (11), remaining streaming media data sequentially from the to-be-transmitted queue.
  4. An optimization system (10) based on the network status of a stream push terminal (11), the system (10) comprising a stream push terminal (11), a viewing terminal (13), and a relay terminal (12), the stream push terminal (11) pushing collected streaming media format data, through the relay terminal (12), to the viewing terminal for viewing, wherein the stream push terminal (11) specifically includes:
    a determination module (111) that is configured to calculate a present real-time upstream bandwidth by determining in real-time an amount of the streaming media data retrieved from a to-be-transmitted queue and transmitted to the relay terminal (12) in a unit of time and determine time for transmission of current to-be-transmitted streaming media data based on the present real-time upstream bandwidth;
    an evaluation module (112)
    that is configured to evaluate whether the time for transmission of the current to-be-transmitted streaming media data exceeds a predefined value; and
    a frame dropping module (113) that is configured to employ a frame dropping approach to reduce an amount of uploading data if the time for transmission of the current to-be-transmitted streaming media data exceeds the predefined value.
  5. The optimization system (10) based on the network status of a stream push terminal (11) according to claim 4, wherein the determination module (111) specifically includes:
    a collection submodule (1111) that is configured to collect audio data and/or video data;
    an encoding submodule (1112) that is configured to audio-encode and/or video-encode the collected audio data and/or video data, respectively;
    a streaming protocol submodule (1113) that is configured to encapsulate the encoded audio data and/or the encoded video data into streaming media data; and
    a bandwidth inspection submodule (1114) that is configured to place the encapsulated streaming media data into a to-be-transmitted queue, retrieve the streaming media data from the to-be-transmitted queue and transmit the streaming media data to the relay terminal (12),
    calculate the present real-time upstream bandwidth by determining in real-time an amount of streaming media data transmitted in a unit of time, and determine the time for transmission of the current to-be-transmitted streaming media data based on the present real-time upstream bandwidth.
  6. The optimization system (10) based on the network status of a stream push terminal (11) according to claim 5, wherein the stream push terminal (11) further specifically includes:
    a transmission module (114) that is configured to continue transmitting remaining streaming media data sequentially from the to-be-transmitted queue if the time for transmission of the current to-be-transmitted streaming media data does not exceed the predefined value.
  7. A stream push terminal (11), comprising: a determination module (111) that is configured to calculate a present real-time upstream bandwidth by determining in real-time an amount of the streaming media data retrieved from a to-be-transmitted queue and transmitted to a relay terminal (12) in a unit of time, and to determine the time for transmission of the current to-be-transmitted streaming media data based on the present real-time upstream bandwidth;
    an evaluation module (112) that is configured to evaluate whether the time for transmission of the current to-be-transmitted streaming media data exceeds a predefined value; and
    a frame dropping module (113) that is configured to employ a frame dropping approach to reduce an amount of uploading data if the time for transmission of the current to-be-transmitted streaming media data exceeds the predefined value.
  8. The stream push terminal (11) according to claim 7, wherein the determination module (111) specifically includes: a collection submodule (1111) that is configured to collect audio data and/or video data; an encoding submodule (1112) that is configured to audio-encode and/or video-encode the collected audio data and/or video data, respectively; a streaming protocol submodule (1113) that is configured to encapsulate the encoded audio data and/or the encoded video data into streaming media data; and a bandwidth inspection submodule (1114) that is configured to place the encapsulated streaming media data into a to-be-transmitted queue, retrieve the streaming media data from the to-be-transmitted queue, transmit the streaming media data to a relay terminal (12), and calculate the present real-time upstream bandwidth by determining in real-time an amount of streaming media data transmitted in a unit of time.
  9. The stream push terminal (11) according to claim 8, wherein the bandwidth inspection submodule (1114) is further configured to determine the time for transmission of the current to-be-transmitted streaming media data based on the present real-time upstream bandwidth.
  10. The stream push terminal (11) according to claim 9, wherein the stream push terminal (11) further specifically includes:
    a transmission module (114) that is configured to continue transmitting remaining streaming media data sequentially from the to-be-transmitted queue if the time for transmission of the current to-be-transmitted streaming media data does not exceed the predefined value.
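Mapping claims 4 to 10 onto code, one hypothetical decomposition into the three named modules could look like the sketch below. It is again an assumption-laden illustration rather than the patent's API; it reuses the Packet type from the previous sketch and leaves the transmission module of claims 6 and 10 as a plain send loop.

    class DeterminationModule:
        # Determination module (111): measures the real-time upstream bandwidth and
        # derives the transmission time of the currently queued streaming media data.
        def __init__(self):
            self.bandwidth_bps = None

        def update(self, bytes_sent: int, elapsed_s: float):
            if elapsed_s > 0:
                self.bandwidth_bps = 8 * bytes_sent / elapsed_s

        def transmission_time(self, queue) -> float:
            if not self.bandwidth_bps:
                return 0.0
            return 8 * sum(len(p.data) for p in queue) / self.bandwidth_bps


    class EvaluationModule:
        # Evaluation module (112): checks the transmission time against a predefined value.
        def __init__(self, predefined_value_s: float):
            self.predefined_value_s = predefined_value_s

        def exceeds(self, transmission_time_s: float) -> bool:
            return transmission_time_s > self.predefined_value_s


    class FrameDroppingModule:
        # Frame dropping module (113): reduces the amount of uploading data by
        # discarding droppable (non-key) video frames from the to-be-transmitted queue.
        @staticmethod
        def drop(queue):
            return [p for p in queue if p.kind != "video" or p.is_keyframe]

A transmission module (114) would then simply keep sending packets from the queue whenever the evaluation module reports that the predefined value is not exceeded.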
EP17894674.5A 2017-04-05 2017-05-16 Optimization method and system on basis of network status of push terminal and push terminal Active EP3416335B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710216528.6A CN106998268A (en) 2017-04-05 2017-04-05 A kind of optimization method and system and plug-flow terminal based on plug-flow terminal network situation
PCT/CN2017/084456 WO2018184277A1 (en) 2017-04-05 2017-05-16 Optimization method and system on basis of network status of push terminal and push terminal

Publications (3)

Publication Number Publication Date
EP3416335A1 EP3416335A1 (en) 2018-12-19
EP3416335A4 EP3416335A4 (en) 2019-04-03
EP3416335B1 true EP3416335B1 (en) 2020-02-26

Family

ID=59433870

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17894674.5A Active EP3416335B1 (en) 2017-04-05 2017-05-16 Optimization method and system on basis of network status of push terminal and push terminal

Country Status (4)

Country Link
US (1) US20200274910A1 (en)
EP (1) EP3416335B1 (en)
CN (1) CN106998268A (en)
WO (1) WO2018184277A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109348307A (en) * 2018-10-23 2019-02-15 å®‰å¾½ę…§č§†é‡‘ēž³ē§‘ęŠ€ęœ‰é™å…¬åø A kind of intelligent view cloud Teaching System audio-video low latency method
CN111225209B (en) * 2018-11-23 2022-04-12 北äŗ¬å­—čŠ‚č·³åŠØē½‘ē»œęŠ€ęœÆęœ‰é™å…¬åø Video data plug flow method, device, terminal and storage medium
CN110213549A (en) * 2019-07-03 2019-09-06 ęˆéƒ½ę±‡ēŗ³ę™ŗčƒ½ē§‘ęŠ€ęœ‰é™å…¬åø A kind of plug-flow method and system based on libRTMP
CN110557647A (en) * 2019-08-29 2019-12-10 å±±äøœęŸ„ēƒē½‘ē»œē§‘ęŠ€ęœ‰é™å…¬åø Simple stream pushing technology for direct recording and broadcasting conference
CN110650307A (en) * 2019-10-30 2020-01-03 å¹æå·žę²³äøœē§‘ęŠ€ęœ‰é™å…¬åø QT-based audio and video plug flow method, device, equipment and storage medium
CN113068062B (en) * 2021-04-06 2022-06-17 äøŠęµ·ē½‘ę¢Æꕰē ē§‘ęŠ€ęœ‰é™å…¬åø Automatic film arrangement push-streaming live broadcast method
CN117294851A (en) * 2023-11-23 2023-12-26 ę­¤čŠÆē§‘ꊀ(äøŠęµ·)ęœ‰é™å…¬åø Video streaming processing device and method

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8107524B2 (en) * 2001-03-30 2012-01-31 Vixs Systems, Inc. Adaptive bandwidth footprint matching for multiple compressed video streams in a fixed bandwidth network
US7571246B2 (en) * 2004-07-29 2009-08-04 Microsoft Corporation Media transrating over a bandwidth-limited network
CN1946079A (en) * 2006-11-02 2007-04-11 北äŗ¬å¤§å­¦ Selective frame losing method for network bandwidth self adaptive flow medium transmission
CN101990087A (en) * 2010-09-28 2011-03-23 ę·±åœ³äø­å…“力ē»“ꊀęœÆęœ‰é™å…¬åø Wireless video monitoring system and method for dynamically regulating code stream according to network state
EP2685742A4 (en) * 2011-04-07 2014-03-05 Huawei Tech Co Ltd Method, device and system for transmitting and processing media content
CN102905128B (en) * 2012-09-07 2016-08-03 ꭦ걉é•æę±Ÿé€šäæ”äŗ§äøšé›†å›¢č‚”ä»½ęœ‰é™å…¬åø Codec processor is the method for Rate Control during wireless video transmission
US9571404B2 (en) * 2012-11-09 2017-02-14 Aruba Networks, Inc. Method and system for prioritizing network packets
CN104135444A (en) * 2013-05-02 2014-11-05 č…¾č®Æē§‘ꊀļ¼ˆę·±åœ³ļ¼‰ęœ‰é™å…¬åø A method and a device for controlling multimedia data transmission flow
CN104602044B (en) * 2015-02-05 2019-02-15 ē§¦ę°øēŗ¢ A kind of RTMP Streaming Media public network live broadcast system and its design method
CN105357592B (en) * 2015-10-26 2018-02-27 å±±äøœå¤§å­¦č‹å·žē ”ē©¶é™¢ A kind of streaming media self-adapting transmitting selective frame losing method
CN105262699B (en) * 2015-10-29 2018-07-03 ęµ™ę±Ÿå¤§åŽęŠ€ęœÆč‚”ä»½ęœ‰é™å…¬åø A kind of network self-adapting code adjustment method and device
CN105791260A (en) * 2015-11-30 2016-07-20 ę­¦ę±‰ę–—é±¼ē½‘ē»œē§‘ęŠ€ęœ‰é™å…¬åø Network self-adaptive stream media service quality control method and device
CN105791836B (en) * 2016-03-07 2019-04-02 äø­å›½ē§‘å­¦é™¢č®”ē®—ꊀęœÆē ”ē©¶ę‰€ Method for video coding, video code flow adaptive transmission method and display methods

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
EP3416335A1 (en) 2018-12-19
WO2018184277A1 (en) 2018-10-11
US20200274910A1 (en) 2020-08-27
EP3416335A4 (en) 2019-04-03
CN106998268A (en) 2017-08-01

Similar Documents

Publication Publication Date Title
EP3416335B1 (en) Optimization method and system on basis of network status of push terminal and push terminal
US11349900B2 (en) Voice encoding and sending method and apparatus
KR100703399B1 (en) Transcoding apparatus and method for seamless video contents transmission
US9118801B2 (en) Optimizing video-call quality of service
EP3429124B1 (en) Optimizing video-call quality of service
US9497373B2 (en) Remote controlled studio camera system
US8737803B2 (en) Method and apparatus for storing and streaming audiovisual content
CN104737514A (en) A method and apparatus for distributing a media content service
CN105392020A (en) Internet video direct broadcasting method and system
CN102202210B (en) Method for mobile phone to play real-time monitoring video and mobile phone to play real-time monitoring video
CN101090486A (en) Monitoring device for multimedium monitoring information and its monitoring method
WO2006097937A2 (en) A method for a clustered centralized streaming system
US20120303797A1 (en) Scalable audiovisual streaming method and apparatus
CN101098467A (en) Network audio-video monitoring method and system
CN111629283B (en) Multi-stream media gateway service system and method
US20120304240A1 (en) Method and apparatus for selecting audiovisual content for streaming
WO2021093653A1 (en) Security method, apparatus and system easy to access by user
CN112584194A (en) Video code stream pushing method and device, computer equipment and storage medium
WO2012166444A2 (en) Scalable audiovisual streaming method and apparatus
CN110661992A (en) Data processing method and device
CN101483748A (en) Audio and video synchronization method and apparatus oriented to real-time video call application on 3G circuit switching network
US8588379B2 (en) Multimedia communication system, multimedia communication device and terminal
US8564639B2 (en) Multimedia communication system, multimedia communication device and terminal
CN110087020B (en) Method and system for realizing video networking conference by iOS equipment
US9491301B2 (en) Multimedia providing service

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20180810

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

A4 Supplementary search report drawn up and despatched

Effective date: 20190301

RIC1 Information provided on ipc code assigned before grant

Ipc: H04L 12/24 20060101AFI20190225BHEP

Ipc: H04L 29/06 20060101ALN20190225BHEP

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: H04L 29/06 20060101ALN20191128BHEP

Ipc: H04L 12/24 20060101AFI20191128BHEP

Ipc: H04L 12/26 20060101ALI20191128BHEP

RIC1 Information provided on ipc code assigned before grant

Ipc: H04L 12/26 20060101ALI20191203BHEP

Ipc: H04L 29/06 20060101ALN20191203BHEP

Ipc: H04L 12/24 20060101AFI20191203BHEP

INTG Intention to grant announced

Effective date: 20191217

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1238984

Country of ref document: AT

Kind code of ref document: T

Effective date: 20200315

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602017012412

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200526

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20200226

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200527

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200526

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200626

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200719

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1238984

Country of ref document: AT

Kind code of ref document: T

Effective date: 20200226

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602017012412

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200531

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200531

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

26N No opposition filed

Effective date: 20201127

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20200531

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200516

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200516

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200531

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20210531

Year of fee payment: 5

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20210514

Year of fee payment: 5

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602017012412

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: H04L0012240000

Ipc: H04L0041000000

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200226

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20220516

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220531

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220516

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240513

Year of fee payment: 8