CN113727144A - High-definition live broadcast system and streaming media method based on mixed cloud - Google Patents

High-definition live broadcast system and streaming media method based on mixed cloud

Info

Publication number
CN113727144A
CN113727144A (application number CN202111024639.XA)
Authority
CN
China
Prior art keywords
live broadcast
video signal
source station
cloud service
service source
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111024639.XA
Other languages
Chinese (zh)
Inventor
王红丽
张丽娟
邹继涛
官易楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China United Network Communications Group Co Ltd
Original Assignee
China United Network Communications Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China United Network Communications Group Co Ltd filed Critical China United Network Communications Group Co Ltd
Priority to CN202111024639.XA priority Critical patent/CN113727144A/en
Publication of CN113727144A publication Critical patent/CN113727144A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234309Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4 or from Quicktime to Realvideo
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/20Network architectures or network communication protocols for network security for managing network security; network security policies in general
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/239Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests
    • H04N21/2393Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests involving handling client requests
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/643Communication protocols

Abstract

The application provides a high-definition live broadcast system and a streaming media method based on a hybrid cloud. The method comprises the following steps: after the high-definition live broadcast system acquires a video signal, the video signal is sent back to a private cloud service source station in real time through RTMP; the private cloud service source station calls a preset transcoding rule pre-stored in the local service to transcode the video signal into a playing file, where the preset transcoding rule may be an HLS slicing algorithm; after the private cloud service source station finishes generating the playing file, the playing file is sent to the public cloud service source station; and after receiving a live broadcast request from a user terminal, the public cloud service source station distributes the playing file to the user terminal in a downlink manner according to the IP address of the user terminal. The method ensures exclusive resource use and data isolation for the live broadcast service and improves the security of the video signal during transmission.

Description

High-definition live broadcast system and streaming media method based on hybrid cloud
Technical Field
The application relates to the field of communication, and in particular to a high-definition live broadcast system and a streaming media method based on a hybrid cloud.
Background
With the development of communication technology and internet technology, the network live broadcast industry is rapidly emerging. Network live broadcasts offer rich content and powerful forms of expression. The "live broadcast + X" model is also spreading rapidly into various vertical industries with the rise of network live broadcasting, for example live online classes and live shopping.
Currently, network live broadcasting is generally realized by using a public cloud live broadcast platform. After the front-end equipment finishes acquiring the video signal, it uploads the video signal to a public cloud server. When a user wants to watch the live broadcast, the user terminal requests the video signal from the server, and viewing of the network live broadcast is realized.
However, the public cloud live broadcast platform has the problem of poor data security.
Disclosure of Invention
The application provides a high-definition live broadcast system and a streaming media method based on a hybrid cloud, which are used for solving the problem of poor data security of public cloud live broadcast platforms.
In a first aspect, the present application provides a high-definition live broadcast system based on a hybrid cloud, the system comprising: a private cloud service source station and a public cloud service source station;
the private cloud service source station is used for acquiring a video signal and transcoding the video signal according to a preset transcoding rule to obtain a playing file, wherein the playing file is a streaming media playing file obtained after the video signal is transcoded;
and the public cloud service source station is used for distributing the playing file to the user terminal in a downlink manner.
Optionally, the private cloud service source station further includes: a filtering pool;
the filtering pool is used for filtering the video signal according to a preset rule, and the video signal is a streaming media signal acquired and encoded by front-end equipment.
Optionally, the private cloud service source station further includes: a security pool;
the security pool is used for monitoring the video signals acquired by the private cloud service source station.
Optionally, the private cloud service source station further includes: a storage area;
the memory is used for storing the video signal and/or the playing file.
Optionally, the public cloud service source station includes a transcoding node;
the transcoding node is used for determining a video code rate and a transcoding strategy of the playing file according to an output format of a user terminal, the transcoding node transcodes the playing file according to the video code rate and the transcoding strategy, the video code rate comprises one of ultra-definition, high-definition and standard definition, and the transcoding strategy comprises a picture scale.
Optionally, the public cloud service source station includes: a scheduling station and an edge node;
the scheduling station is used for determining the edge node for downlink distribution according to the user terminal;
and the edge node is used for realizing the downlink distribution of the playing file.
Optionally, the system further comprises: a 5G node;
and the 5G node is used for sending the video signal uploaded by the front-end equipment to the private cloud service source station through the 5G streaming media.
In a second aspect, the present application provides a streaming media method, including:
acquiring a video signal, wherein the video signal is a streaming media signal acquired and encoded by front-end equipment;
transcoding the video signal according to a preset transcoding rule to obtain a playing file, wherein the playing file is a streaming media playing file obtained after transcoding the video signal;
and distributing the playing file in a downstream mode.
In a third aspect, the present application provides a readable storage medium storing a computer program; when a processor in the system executes the computer program, the hybrid cloud-based high-definition live broadcast system of the first aspect or of any possible design of the first aspect is implemented.
In a fourth aspect, the present application provides a computer program product comprising a computer program; when the computer program is executed by a processor, the hybrid cloud-based high-definition live broadcast system of the first aspect or of any possible design of the first aspect is implemented.
According to the hybrid cloud based high-definition live broadcast system and streaming media method, after the high-definition live broadcast system obtains the video signal, the video signal is sent back to a private cloud service source station in real time through the Real-Time Messaging Protocol (RTMP); the private cloud service source station calls a preset transcoding rule pre-stored in the local service to transcode the video signal into a playing file, where the preset transcoding rule may be an HTTP Live Streaming (HLS) slicing algorithm with adaptive bitrate; after the private cloud service source station finishes generating the playing file, the playing file is sent to the public cloud service source station; and after receiving a live broadcast request from a user terminal, the public cloud service source station distributes the playing file to the user terminal in a downlink manner according to the IP address of the user terminal. This achieves exclusive resource use for the high-definition live broadcast system, isolation of live broadcast service data from other services, and improved security and stability of the video signal.
Drawings
In order to more clearly illustrate the technical solutions in the present application or the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings described below show some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic view of a live broadcast scene according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a hybrid cloud-based high-definition live broadcast system according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of another hybrid cloud-based high-definition live broadcast system according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of another hybrid cloud-based high-definition live broadcast system according to an embodiment of the present application;
fig. 5 is a flowchart illustrating a streaming media method according to an embodiment of the present application;
fig. 6 is a schematic flowchart of another streaming media method according to an embodiment of the present application;
fig. 7 is a schematic flowchart of a 4K/VR live broadcast system according to an embodiment of the present application;
fig. 8 is a schematic flowchart of an AR live broadcasting system according to an embodiment of the present application;
fig. 9 is a schematic diagram of compatibility according to an embodiment of the present application.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the present application will be clearly and completely described below with reference to the drawings in the present application, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In this application, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In the present application, 5G refers to the fifth generation mobile communication technology, the current frontier of mobile communication development. 5G raises the requirements of mobile communication beyond 4G and advances it in many respects, such as speed, bandwidth, and latency. The peak data rate of 5G can reach 10 Gbit/s, and the network delay can be as low as 1 millisecond. The 5G international standards organization 3GPP defines three major directions for 5G application scenarios, namely Enhanced Mobile Broadband (eMBB), massive Machine-Type Communication (mMTC), and Ultra-Reliable Low-Latency Communication (URLLC). Currently, 5G applications can generally be divided into two broad categories: general applications and industry applications. General applications mainly include 5G-based ultra-high-definition video, 5G-based VR/AR, 5G-based networked unmanned aerial vehicles, 5G-based wireless robots, and the like. Industry applications mainly include the use of 5G in fields such as new media, the industrial internet, the Internet of Vehicles, telemedicine, smart cities, and rail transit.
The Private Cloud referred to in this application is a Virtual Private Cloud (VPC), also called a private network. The private cloud is a virtual network environment constructed from Elastic Cloud Servers (ECS) that provides isolation between users and allows users to configure and manage it autonomously. Using a private cloud can improve the security of resources on the user's cloud. The user can define network characteristics such as security groups, IP address segments, and bandwidth in the VPC. The user can also manage and configure the internal network through the VPC to make safe and rapid network changes. In addition, the user can interconnect the VPC on the cloud with a traditional Internet Data Center (IDC) through connection modes such as private lines, Virtual Private Networks (VPN), or Generic Routing Encapsulation (GRE), so as to build a hybrid cloud service. An advantage of the private cloud is that different tenants can be securely isolated from one another. The private cloud can also configure the network as needed based on Software-Defined Networking (SDN) technology. Furthermore, a tenant of the private cloud can connect the VPCs in the private cloud and the public cloud through technologies such as VPN, so as to form a cross-region hybrid cloud resource pool.
In the present application, a Content Delivery Network (CDN) is an intelligent virtual network constructed on top of the existing network. A CDN is built from edge servers deployed in various places. Through the load balancing of its central platform, a CDN lets users obtain the required content nearby, reducing network congestion and improving user access response speed and hit rate. With the arrival of high-definition and ultra-high-definition (4K) video, over a million pieces of content are transmitted over the network to user terminals every day. Rapidly updated video products and the demand for a high-quality video experience place higher requirements on network concurrency and capacity. A content distribution network with a traditional siloed ("chimney") architecture has difficulty meeting the requirements of managing and distributing many types of content. To resolve the bottleneck that streaming media data creates in the backbone network, CDN technology is integrated. Relying on a high-quality node network, the CDN combines monitoring and intelligent scheduling mechanisms, intelligent routing strategies, and service quality monitoring, so that users can enjoy a high-quality video viewing and web browsing experience on all kinds of terminals.
Streaming media is also called the streaming mode. In streaming media, multimedia files such as video and audio files are turned into one or more data packets by a special compression method. The data packets are transmitted from the server to the user terminal and usually maintain consistency, continuity, and real-time behaviour during transmission. The traditional, non-streaming playing mode needs the whole file to be downloaded and stored locally before playing. Streaming media does not need to wait for the whole file; playback can start as soon as enough initial data has arrived, while the remaining data continues to be downloaded and played. When media files are transmitted over a network, a compression algorithm is generally applied to the original data to reduce the dependence on network bandwidth. Common domestic live broadcast protocols include the Real-Time Messaging Protocol (RTMP), HTTP Live Streaming (HLS, an adaptive-bitrate slicing protocol), HTTP-FLV, and the like. RTMP is a proprietary protocol developed by Adobe for transmitting audio and video data between a Flash player and a server. RTMP is a plaintext protocol working over TCP, uses port 1935 by default, and has a relatively low network delay, generally 1-3 s. FLV (Flash Video) is another video format introduced by Adobe; it is a container format for streaming media data transmitted over a network. An FLV file consists of an FLV Header and an FLV Body composed of tags, the file suffix is .flv, and it loads very quickly. HTTP-FLV encapsulates the streaming media data in FLV format and then transmits it to the client over the HTTP protocol. HLS differs most from the above protocols in that it does not request a complete data stream at once. HLS cuts the streaming media data into consecutive short TS files on the server and accesses those TS files in sequence through an M3U8 index file. The client therefore only needs to play the files obtained from the server continuously and in order to realize audio and video playback. The advantages of HLS are good performance, delivery over ordinary HTTP, good CDN support, built-in multi-bitrate adaptation, and the ability to pass through firewalls; its drawbacks are poor real-time behaviour and high delay, generally more than 10 s.
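As a concrete illustration of the HLS indexing described above, the following is a minimal sketch in Python that builds an M3U8 media playlist referencing a sequence of TS slices; the segment names, durations, and tag values are illustrative assumptions rather than values taken from this application.

```python
# Minimal sketch: build an HLS media playlist (M3U8) that indexes TS slices in order.
# Segment names and the 10-second target duration are illustrative assumptions.

def build_media_playlist(segments, target_duration=10):
    """Return an M3U8 media playlist string for a list of (filename, duration) pairs."""
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for name, duration in segments:
        lines.append(f"#EXTINF:{duration:.3f},")  # duration of the next TS slice
        lines.append(name)                        # URI of the TS slice
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    print(build_media_playlist(
        [("live_000.ts", 10.0), ("live_001.ts", 10.0), ("live_002.ts", 8.5)]
    ))
```

A player fetches such a playlist repeatedly and plays the listed TS slices in order, which is why the delay of HLS is tied to the slice duration and playlist length.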
With the development of communication technology and internet technology, the network live broadcast industry is rapidly emerging. Network live broadcasts offer rich content and powerful forms of expression, and the "live broadcast + X" model is spreading rapidly into various vertical industries, such as live online classes and live shopping. As the industry develops, users' requirements for live broadcast platforms also rise. When a user watches a network live broadcast, the live video needs to remain in high definition to give a good viewing experience. When the anchor streams over the network, the live broadcast network needs to remain highly reliable, so that stuttering caused by too many concurrent viewers does not degrade the broadcast. At the same time, both the anchor and the viewers need the live broadcast to stay in a low-delay state, to avoid the poor experience caused by interaction delay. Currently, network live broadcasting is generally realized with a public cloud live broadcast platform: after the front-end equipment finishes acquiring the video signal, it uploads the video signal to a public cloud server, and when a user wants to watch, the user terminal requests the video signal from the server. However, live broadcasting on a public cloud platform has the problem of poor data security. In addition, because data cannot be exchanged directly between different operators, user access is limited when the service is hosted in a single-operator (single-line) machine room.
Furthermore, because direct interconnection between different operators cannot be achieved, operator machine rooms are usually connected by a single line. With a single line, the IDC machine room and its servers are reached over one operator's line only, and the access response delay for users of other operators fluctuates widely. The prior art uses the Border Gateway Protocol (BGP) to transport data and information between different host gateways, the Internet, or autonomous systems. BGP is a path-vector protocol that maintains paths to different hosts, networks, and gateway routers and makes routing decisions based on them. To solve the problem of interconnecting a single-line IDC machine room with other operators, a common BGP machine room is accessed with mixed bandwidth: the IP of one line is mapped to the IP of another line, and a program then automatically selects the fastest line for each operator, so that cross-operator access is distributed reasonably. However, the BGP machine room scheme has the disadvantage of high capital and construction cost.
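The "program that selects the fastest line" is not detailed here; the following minimal sketch, under the assumption that each operator line exposes a reachable test endpoint (the addresses below are hypothetical placeholders), shows one simple way such a selection could be made by probing TCP connect latency.

```python
# Illustrative sketch only: pick the lowest-latency operator line by probing a
# test endpoint on each line. Endpoints are hypothetical placeholders.
import socket
import time

LINES = {
    "operator_a": ("198.51.100.10", 80),  # placeholder test endpoint on line A
    "operator_b": ("203.0.113.20", 80),   # placeholder test endpoint on line B
}

def probe_latency(host, port, timeout=1.0):
    """Return the TCP connect time in seconds, or None if the line is unreachable."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return None

def fastest_line():
    results = {name: probe_latency(*addr) for name, addr in LINES.items()}
    reachable = {k: v for k, v in results.items() if v is not None}
    return min(reachable, key=reachable.get) if reachable else None

if __name__ == "__main__":
    print("selected line:", fastest_line())
```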
To address these problems, the application provides a high-definition live broadcast system based on a hybrid cloud. The source station system of the live broadcast service is built on a private cloud service source station. The private cloud service source station can provide the live broadcast service with functions such as a unified Virtual Private Cloud (VPC), elastic IP addresses, Server Load Balancing (SLB), and data storage. At the same time, the private cloud service source station can safeguard the video data through modules such as the private cloud security pool firewall, an intrusion prevention system, and security auditing. Using the private cloud service source station fully exploits the advantages of the private cloud environment, such as exclusive resource use, data isolation, security, and stability. The application also uses a public cloud service source station to build a streaming media forwarding mechanism, so that the playing files can be distributed downstream quickly through the public cloud service source station. By building a hybrid cloud high-definition live broadcast system from a public cloud service source station and a private cloud service source station, the application constructs a push/pull-stream multi-link backup mechanism and a flexible transcoding mechanism for the video. The hybrid cloud based high-definition live broadcast system solves the problem of downstream edge acceleration for the private cloud system's service data, and solves the problem of bandwidth access from a private cloud platform to different regions and different operator networks. The application also combines a 5G network with Mobile Edge Computing (MEC) to obtain a 5G node. The 5G node can further reduce network delay and meets the network requirements for wireless backhaul of high-definition video in application scenarios such as fixed-point live broadcast and roaming live broadcast. Using the 5G node in the high-definition live broadcast system supports diversified live broadcast scenarios such as 5G + 4K/8K live broadcast, AR/VR live broadcast, and holographic interactive live broadcast.
Compared with the mature public cloud live broadcast platforms on the market, the hybrid cloud based high-definition live broadcast system used in this application can realize data interworking between the public network and the private cloud's private network, support functions such as video transmission, stream forwarding, transcoding, and storage, and to a large extent guarantee exclusive platform resources, data isolation, security, and stability for the live broadcast platform.
Meanwhile, the public cloud service source station is used as a supplementary scheme to the operator's private cloud architecture, with the public cloud service nodes acting as a transmission pipeline. The streaming media signal can be transmitted to a public cloud service node over the corresponding operator's bandwidth, and the public cloud service node then forwards it to the live broadcast system inside the private cloud over the interconnection bandwidth. At the same time, the distribution acceleration mechanism of the invention adopts a CDN framework that fuses multiple vendors, further improving downstream edge acceleration of the private cloud system's service data and realizing an optimized line strategy with multi-mode switching at the playing front end.
In addition, in the application, the private cloud service source station builds a push/pull-stream multi-link backup mechanism and a flexible cloud transcoding mechanism based on the three protocols RTMP, HLS, and HTTP-FLV. The public cloud service source station adopts a transcoding and compression strategy to adjust the picture aspect ratio (width and height) and the frame resolution of the streaming media, which solves the live broadcast stuttering caused by an unstable network environment or unsuitable front-end equipment.
In addition, the 5G node is used, and combining it with a 5G network in practical applications further reduces the network delay. The high-definition live broadcast system can meet the live broadcast requirements of multi-scenario fusion such as fixed/roaming interactive live broadcast, AR interactive live broadcast, and holographic interactive live broadcast. Experimental tests show that, in the acquisition-editing-broadcasting process, using a 5G network and RTMP protocol streams in the acquisition and editing links is 1-2 seconds faster than using only Wi-Fi and 4G wireless networks, and about 10 seconds faster than real-time streams in M3U8 format over Wi-Fi and 4G wireless networks with the HLS protocol. The 5G node itself is a combination of 5G and MEC; it can further reduce network delay and has good application value for remote, cross-site production and broadcasting in actual production.
The technical solution of the present application will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 1 shows a schematic view of a live scene provided in an embodiment of the present application. As shown in the figure, in the live broadcast process, three parts, namely front-end equipment, a high-definition live broadcast system and a user terminal, are mainly involved.
The front-end equipment is the live broadcast equipment used by the anchor. It may be equipment that acquires video signals, such as a high-definition/4K camera, a VR camera, screen recording equipment, or a microphone. The video signal collected by the front-end device may specifically include a high-definition/4K signal, a VR signal, an AR signal, a mobile phone push-stream signal, a 5G backhaul signal, a microphone audio signal, DDR/GFX graphics data, pictures in Joint Photographic Experts Group (JPEG) format, and the like. After acquisition is completed, the front-end device may encode the video data into a streaming media signal, i.e., a video signal, and upload the video signal to the high-definition live broadcast system. After the high-definition live broadcast system obtains the video signal, it transcodes the video signal into a playing file. The playing file may be a streaming media playing file in TS and M3U8 format. The high-definition live broadcast system can distribute the playing file to each of its edge nodes. When a user terminal requests to watch the live broadcast, the high-definition live broadcast system can select a nearby edge node according to the user terminal and push the playing file to the user terminal. After receiving the playing file, the user terminal plays the live video with the playing file, realizing live viewing.
Fig. 2 shows a schematic structural diagram of a hybrid cloud-based high-definition live broadcast system according to an embodiment of the present application, and as shown in fig. 2, the hybrid cloud-based high-definition live broadcast system 10 according to the present embodiment may include: a private cloud service source station 11 and a public cloud service source station 12.
The private cloud service source station 11 is configured to acquire a video signal, transcode the video signal according to a preset transcoding rule, and obtain a play file, where the play file is a streaming media play file obtained after transcoding the video signal.
In this embodiment, the front-end device acquires video data. The video data typically includes picture and audio signals. To upload the picture and audio, the front-end device converts the video data into a video signal and uploads it: the video data collected by the front-end equipment is processed by an encoder (a software or hardware encoder) to obtain the video signal. The video signal may be transmitted as a real-time signal stream over network protocols such as HTTP, RTMP, UDP, and RTSP. The video signal has a wide bitrate fluctuation range during transmission and places higher requirements on the adaptive capability of the player.
The data transmission between the front-end device and the high-definition live broadcast system 10 may be implemented based on a Real Time Messaging Protocol (RTMP). After acquiring the video signal, the high-definition live broadcasting system 10 feeds the video signal back to the private cloud service source station 11 through the RTMP in real time. The private cloud service source station 11 calls a preset transcoding rule pre-stored in the local service to transcode the video signal into a playing file. The preset transcoding rule may be an HLS slicing algorithm. That is, the private cloud service source station 11 transcodes the video signal into the play file using the HLS slicing algorithm. The playing file may be a streaming media playing file in a TS and M3U8 format.
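By way of illustration, a minimal sketch of this slicing step is given below. It assumes an FFmpeg-based slicer invoked from Python; this application specifies the HLS slicing rule and the TS/M3U8 output but does not name a particular tool, and the RTMP URL and output path are hypothetical.

```python
# Sketch of the private-cloud transcoding step, assuming an FFmpeg-based slicer
# (this application specifies HLS slicing into TS/M3U8 but not a specific tool).
import subprocess

def slice_rtmp_to_hls(rtmp_url, out_dir, segment_seconds=10):
    """Pull an RTMP stream and cut it into TS slices indexed by an M3U8 playlist."""
    cmd = [
        "ffmpeg",
        "-i", rtmp_url,                  # incoming RTMP video signal
        "-c:v", "copy", "-c:a", "copy",  # keep source codecs; re-encode here if needed
        "-f", "hls",
        "-hls_time", str(segment_seconds),           # duration of each TS slice
        "-hls_list_size", "6",                       # keep a sliding window in the playlist
        "-hls_segment_filename", f"{out_dir}/live_%05d.ts",
        f"{out_dir}/live.m3u8",                      # M3U8 index of the TS slices
    ]
    return subprocess.run(cmd, check=True)

# Example (hypothetical URL and path):
# slice_rtmp_to_hls("rtmp://private-source.example/live/stream1", "/srv/hls/stream1")
```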
The public cloud service source station 12 is configured to distribute the play file to the user terminal in a downlink manner.
In this embodiment, after the private cloud service source station 11 completes generation of the play file, the play file is sent to the public cloud service source station 12. After receiving the live broadcast request of the user terminal, the public cloud service source station 12 distributes the broadcast file to the user terminal in a downlink manner according to the IP address of the user terminal. After the user terminal receives the playing file, the user terminal can realize live broadcast watching through the playing file.
When the public cloud service source station 12 distributes the play file in a downlink, the configured transmission protocol may include one of an RTMP, an HLS, and an HTTP-FLV. The public cloud service source station 12 selects one transmission protocol from the above transmission protocols to complete the configuration of data real-time transmission.
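The selection policy itself is not specified; the following minimal sketch illustrates one plausible way to choose among the three configured protocols based on the latency and compatibility trade-offs described earlier (the decision rules are assumptions for illustration only).

```python
# Illustrative sketch: choose one downstream delivery protocol per session.
# The decision rules below are assumptions; the application only states that one
# of RTMP, HLS, and HTTP-FLV is configured for real-time delivery.

def choose_protocol(needs_low_latency: bool, http_only_client: bool) -> str:
    if needs_low_latency and http_only_client:
        return "HTTP-FLV"   # low delay, carried over plain HTTP
    if needs_low_latency:
        return "RTMP"       # roughly 1-3 s delay, TCP port 1935
    return "HLS"            # higher delay (10 s+), but firewall- and CDN-friendly

print(choose_protocol(needs_low_latency=True, http_only_client=True))   # HTTP-FLV
```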
In one example, after acquiring the video signal, the high-definition live broadcasting system 10 may further send the video signal to the public cloud service source station 12, and the public cloud service source station 12 distributes the video signal to each user terminal according to the live broadcast requests of the user terminals.
The hybrid cloud based high-definition live broadcast system provided in this embodiment includes a private cloud service source station and a public cloud service source station. After the high-definition live broadcast system acquires the video signal, the video signal is sent back to the private cloud service source station in real time through RTMP. The private cloud service source station calls a preset transcoding rule pre-stored in the local service to transcode the video signal into a playing file; the preset transcoding rule may be an HLS slicing algorithm. After the private cloud service source station finishes generating the playing file, the playing file is sent to the public cloud service source station. After receiving a live broadcast request from a user terminal, the public cloud service source station distributes the playing file to the user terminal in a downlink manner according to the IP address of the user terminal. By using the private cloud service source station, the application guarantees exclusive resource use for the high-definition live broadcast system, isolates the live broadcast service data from other services, and improves the security and stability of the video signal. Meanwhile, by controlling the video code rate and determining the transcoding strategy, the playing file is transcoded in advance, which reduces the delay at the user terminal and improves the user experience. In addition, by using the private cloud service source station in combination with the public cloud service source station, the application avoids the cross-operator access problem caused by operators' data being unable to interwork, improves the transmission efficiency of the video signal, further reduces the delay, and improves the user experience.
Fig. 3 shows a schematic structural diagram of another hybrid cloud-based high-definition live broadcast system according to an embodiment of the present application, and based on the embodiment shown in fig. 2, as shown in fig. 3, the hybrid cloud-based high-definition live broadcast system 10 of the present embodiment may include: 5G node 13.
The 5G node 13 is configured to send the video signal uploaded by the front-end device to the private cloud service source station 11 through a 5G streaming media.
In this embodiment, the front-end device realizes fast uploading of the video signal through the 5G node. Wherein the 5G node may include a 5G network and an MEC. The MEC is connected with the 5G base station, and a 5G streaming media private network channel from the front-end equipment to the private cloud service source station can be constructed. The 5G streaming media private network channel is used for realizing the data security intercommunication between the public network and the private cloud service source station, so that the video transmission efficiency is improved, and the security of video signal transmission is further improved.
The public cloud service source station 12 may include a scheduling station 121 and an edge node 122. The scheduling station 121 is configured to determine the edge node for downlink distribution according to the user terminal. The edge node 122 is used to implement the downlink distribution of the playing file.
In this embodiment, the public cloud service source station 12 may specifically include two data processing nodes, namely the scheduling station 121 and the edge node 122.
The edge nodes 122 are CDN edge nodes deployed in each region for the public cloud service source station. The front-end device selects a nearby edge node 122 to upload the video signal. The nearby edge node 122 may be the edge node closest to the front-end device. Alternatively, it may be an idle edge node within a certain range of the front-end device. Alternatively, it may be the optimal edge node selected by weighing the distance and the load of each edge node 122. After the edge node 122 receives the video signal uploaded by the front-end device, the edge node 122 may push the video signal to the scheduling station 121.
The edge node 122 may also be used for distributing and accelerating the playing file. When a user terminal sends a live broadcast request, the scheduling station 121 distributes the playing file to a suitable edge node 122 according to the IP address of the user terminal. The suitable edge node 122 may be the edge node closest to the user terminal, an idle edge node within a certain range of the user terminal, or the optimal edge node selected by weighing the distance and the load of each edge node 122. After the edge node 122 receives the playing file distributed by the scheduling station 121, the edge node 122 pushes the playing file to the user terminal. Optionally, the edge node 122 may further improve the transmission efficiency of the playing file by re-encoding, so as to accelerate the distribution of the playing file.
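A minimal sketch of such an edge-node choice is given below; the scoring that combines distance and load is an illustrative assumption (the scheduling station is only required to weigh distance and busyness, not to use this particular formula), and the node names and numbers are hypothetical.

```python
# Illustrative sketch of the scheduling station's edge-node choice: combine the
# distance to the terminal with the node's load. The weighting is an assumption.
from dataclasses import dataclass

@dataclass
class EdgeNode:
    name: str
    distance_km: float   # distance to the requesting user terminal
    load: float          # 0.0 (idle) .. 1.0 (saturated)

def pick_edge_node(nodes, w_distance=0.5, w_load=0.5):
    """Return the node with the lowest combined distance/load score."""
    max_dist = max(n.distance_km for n in nodes) or 1.0   # avoid division by zero
    def score(n):
        return w_distance * (n.distance_km / max_dist) + w_load * n.load
    return min(nodes, key=score)

nodes = [
    EdgeNode("edge-a", distance_km=12.0, load=0.9),    # close but busy
    EdgeNode("edge-b", distance_km=120.0, load=0.2),   # farther but idle
]
print(pick_edge_node(nodes).name)
```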
The edge node 122 may be a CDN edge node from multiple vendors.
The scheduling station 121 is used to realize intelligent scheduling of the edge nodes 122. When the front-end device uploads the video signal through the 5G node, the 5G node pushes the video signal to the scheduling station 121. The scheduling station 121 sources the video signal back to the private cloud service source station 11. After the private cloud service source station 11 finishes transcoding the video signal, it streams the playing file to the scheduling station 121. When the playing file can be distributed directly, the scheduling station 121 distributes it to the edge nodes 122 according to the condition of each edge node 122. Alternatively, the scheduling station 121 may send the playing file to the transcoding node 123 for transcoding.
In one example, the public cloud service source station 12 also includes a transcoding node 123. The transcoding node is used for determining a video code rate and a transcoding strategy of the playing file according to the output format of the user terminal, the transcoding node transcodes the playing file according to the video code rate and the transcoding strategy, the video code rate comprises one of ultra-definition, high-definition and standard definition, and the transcoding strategy comprises a picture scale.
In this example, the transcoding node 123 may transcode the playing file according to the video code rate and the transcoding strategy. According to the video code rate, the transcoding node 123 can produce ultra-definition, high-definition, and standard-definition video. According to the picture scale, the transcoding node 123 may transcode the picture scale of the playing file to a preset scale. When the public cloud service source station 12 receives a live broadcast request from a user terminal, the scheduling station 121 determines the video code rate and picture scale required by the user terminal from the live broadcast request, and the transcoding node 123 transcodes the playing file according to the video code rate and picture scale obtained by the scheduling station 121.
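The concrete resolutions and code rates behind the three definitions are not given here; the sketch below pairs them with commonly used values purely for illustration (all numbers are assumptions) to show how a transcoding node could map a terminal's requested output format to a code rate and picture scale.

```python
# Illustrative transcoding ladder for the transcoding node. The resolutions and
# bitrates below are assumed values, not figures taken from this application.
TRANSCODE_LADDER = {
    "ultra_definition":    {"width": 3840, "height": 2160, "bitrate_kbps": 15000},
    "high_definition":     {"width": 1920, "height": 1080, "bitrate_kbps": 4000},
    "standard_definition": {"width": 1280, "height": 720,  "bitrate_kbps": 1500},
}

def transcode_policy(output_format: dict) -> dict:
    """Map the terminal's requested output format to a code rate and picture scale."""
    tier = TRANSCODE_LADDER[output_format.get("definition", "high_definition")]
    aspect_w, aspect_h = output_format.get("picture_scale", (16, 9))  # requested aspect ratio
    width = tier["width"]
    return {
        "bitrate_kbps": tier["bitrate_kbps"],
        "width": width,
        "height": round(width * aspect_h / aspect_w),
    }

print(transcode_policy({"definition": "standard_definition", "picture_scale": (4, 3)}))
```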
The hybrid cloud based high-definition live broadcast system provided in this embodiment further includes a 5G node. The front-end equipment uploads the video signal to the scheduling station through the 5G node. In the application, the 5G node is used to build the dedicated 5G streaming media network channel, realizing secure data interworking between the public network and the private cloud service source station, improving video transmission efficiency, and further improving the security of video signal transmission. The public cloud service source station in the application may include a scheduling station, edge nodes, and a transcoding node. The front-end equipment selects a nearby edge node to upload the video signal, and the edge node may push the video signal to the scheduling station. The scheduling station sources the video signal back to the private cloud service source station, which generates the playing file. The scheduling station obtains the playing file and, through the transcoding node, transcodes it to a suitable video code rate and transcoding strategy. The scheduling station can also distribute the playing file to the edge nodes, and the edge nodes send the playing file to the user terminals to realize the live broadcast. In the application, the scheduling station balances the resources of the edge nodes; the transcoding node makes the playing file meet the playback requirements of the user terminal, improving playback efficiency and reducing delay; and the edge nodes send the playing file from nearby locations and accelerate its delivery, achieving the effect of reducing delay.
Fig. 4 shows a schematic structural diagram of another high-definition live broadcast system based on a hybrid cloud according to an embodiment of the present application. Based on the embodiments shown in fig. 2 and fig. 3, as shown in fig. 4, the private cloud service source station 11 of this embodiment may include: a filtering pool 112, a security pool 113, and a storage area 114.
The filtering pool 112 is configured to filter a video signal according to a preset rule, where the video signal is a streaming media signal acquired and encoded by a front-end device.
In this embodiment, the private cloud service source station 11 may include a filtering pool 112. The filtering pool 112 is the first line of defense for video signals entering the private cloud service source station 11. The filtering pool 112 may include a firewall and a security gateway, which filter the video signal. When a video signal passes the firewall and security gateway in the filtering pool 112, the video signal is a safe signal. When the video signal contains abnormal content such as viruses, the firewall and security gateway in the filtering pool 112 intercept it.
The security pool 113 is used for monitoring the video signals acquired by the private cloud service source station.
In this embodiment, the private cloud service source station 11 may include a security pool 113. The security pool 113 is the second line of defense for video signals entering the private cloud service source station 11. The security pool 113 may include detection probes deployed in the core switching area; the detection probes inspect mirrored traffic of the whole network. The security pool 113 may also include security situation awareness deployed in the operation and maintenance management area. The security situation awareness communicates with the detection probes, so that the application, data, and identity security of the internal system are monitored and protected precisely. Using the security pool 113 greatly improves the security of the platform and its operation and maintenance management capability.
The storage area 114 is used for storing video signals and/or playing files.
In this embodiment, the private cloud service source station 11 may include a storage area 114. The storage area 114 stores the video signal sourced back to the private cloud service source station 11 on the server. The video signal is the original media. Storing the video signal makes it convenient for a user terminal to request a replay of the live broadcast.
The private cloud service source station in the hybrid cloud based high-definition live broadcast system provided in this embodiment may include a filtering pool, a security pool, and a storage area. The filtering pool is the first line of defense for video signals entering the private cloud service source station and is used to filter abnormal content out of the video signal. The security pool is the second line of defense and is used to precisely monitor and protect the application, data, and identity security of the internal system. In the application, the filtering pool and security pool filter the video signals and monitor and protect the private cloud service source station, improving its security. In addition, by using the storage area, the application stores the video signal sourced back to the private cloud service source station 11 on the server, which improves the utilization of the live video and makes it convenient to retrieve the live video later.
Fig. 5 shows a schematic flowchart of a streaming media method provided in an embodiment of the present application, and as shown in fig. 5, the streaming media method of the present embodiment may include:
s101, acquiring a video signal, wherein the video signal is a streaming media signal acquired and encoded by front-end equipment.
In this embodiment, the high-definition live broadcast system obtains the video signal collected by the front-end equipment. The front-end equipment may be equipment that acquires video signals, such as a high-definition/4K camera, a VR camera, screen recording equipment, or a microphone. The video signal collected by the front-end device may specifically include a high-definition/4K signal, a VR signal, an AR signal, a mobile phone push-stream signal, a 5G backhaul signal, a microphone audio signal, DDR/GFX graphics data, a JPEG picture, and the like. After acquisition is completed, the front-end device may encode the video data into a streaming media signal, i.e., a video signal, and upload the video signal to the high-definition live broadcast system.
S102, transcoding the video signal according to a preset transcoding rule to obtain a playing file, wherein the playing file is a streaming media playing file obtained after the video signal is transcoded.
In this embodiment, after acquiring the video signal, the high-definition live broadcast system transcodes it to obtain the playing file. The transcoding rule of the high-definition live broadcast system is preset. In the application, to realize streaming media data transmission, the high-definition live broadcast system transcodes the video signal with the HLS algorithm: it cuts the video signal into consecutive short TS files and indexes the cut TS files in order through an M3U8 index file.
And S103, distributing the playing file in a downstream mode.
In this embodiment, the high-definition live broadcast system distributes the playing file to each user terminal in a downlink manner. When the user terminal receives the playing file, it plays the received TS files in order according to the M3U8 index file.
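A minimal sketch of this terminal-side behaviour is shown below: it reads the M3U8 index and fetches the referenced TS slices in their listed order. The playlist URL and the decoder hook are hypothetical placeholders.

```python
# Sketch of the terminal-side behaviour: read the M3U8 index and fetch the TS
# slices in listed order. URLs and the decoder hook are hypothetical.
import urllib.request
from urllib.parse import urljoin

def fetch_segments(playlist_url):
    """Yield the TS segment bytes referenced by an M3U8 media playlist, in order."""
    with urllib.request.urlopen(playlist_url) as resp:
        playlist = resp.read().decode("utf-8")
    for line in playlist.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):          # non-tag lines are segment URIs
            segment_url = urljoin(playlist_url, line)
            with urllib.request.urlopen(segment_url) as seg:
                yield seg.read()                       # hand the slice to the decoder

# Example (hypothetical URL):
# for ts_bytes in fetch_segments("https://edge-node.example/live/stream1/live.m3u8"):
#     feed_to_decoder(ts_bytes)   # hypothetical player/decoder hook
```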
The streaming media method provided in the embodiment of the present application is implemented based on the system embodiment, and specific implementation principles and technical effects thereof can be referred to the system embodiment, which is not described herein again.
Fig. 6 shows a schematic flow chart of another streaming media method provided in an embodiment of the present application, and based on the embodiment shown in fig. 5, as shown in fig. 6, the streaming media method of the present embodiment may include:
s201, the edge node acquires a video signal acquired by the front-end equipment.
S202, the edge node encodes the video signal collected by the front-end device through a signal encoder (a software/hardware encoder), and the encoded video signal is a streaming media signal.
And S203, the edge node uploads the video signal to the scheduling station, where the transmission mode includes real-time signal streams over network protocols such as HTTP, RTMP, UDP, and RTSP.
And S204, the scheduling station sends the video signal back to the private cloud service source station.
And S205, the private cloud service source station divides the video signal into playing files by using an HLS slicing algorithm stored in the local service, wherein the playing files comprise TS files and M3U8 files.
And S206, the private cloud service source station stores the video signal and the platform service information.
And S207, the playing file is pulled from the private cloud service source station to the scheduling station.
And S208, the scheduling station forwards the playing file to the transcoding node of the public cloud service source station.
S209, the transcoding node transcodes the playing file, where the transcoding includes video code rate control and transcoding strategy determination; the video code rate covers three definitions, namely ultra-definition, high-definition, and standard definition, and the transcoding strategy includes picture scale adjustment.
S210, the transcoding node distributes the transcoded playing file downstream to the edge node.
S211, the edge node distributes the playing file to the user terminal in an accelerated manner.
The streaming media method provided in the embodiment of the present application is implemented based on the system embodiment, and specific implementation principles and technical effects thereof can be referred to the system embodiment, which is not described herein again.
Based on the above embodiments, the hybrid cloud based high-definition live broadcast system provided by the application can be applied to various live broadcast scenarios, such as fixed/roaming interactive live broadcast, AR interactive live broadcast, and holographic interactive live broadcast.
As shown in fig. 7, when the high-definition live broadcast system is applied to 4K/VR live broadcasting, a 5G + 4K/VR live broadcast mode is used. The high-definition live broadcast system collects, over a 5G network, the video data acquired by the 4K/VR camera. The video data is encoded with a 4K/VR encoder in H.265 format. In the high-definition live broadcast system, the video code rate bandwidth of the uplink data can reach 60 Mbit/s. Based on the high-definition live broadcast system, users can realize ultra-high-definition live broadcast and 360-degree panoramic live broadcast. Meanwhile, the high-definition live broadcast system shown in fig. 7 can also provide services such as channel management, a media library, a portal center, and application tool management. The high-definition live broadcast system can also interface with a platform owned by the customer to realize functions such as remote access, remote viewing, and online interaction. It can further support the fusion of various video devices such as surveillance cameras, live broadcast cameras, and conference cameras. In summary, using the high-definition live broadcast system of the application in the 4K/VR live broadcast process not only breaks through regional barriers but also promotes informatized collaboration and realizes popularized production and distribution of live content.
Fig. 8 is a flowchart of the system when the high-definition live broadcast system is applied to AR live broadcasting. As shown in the figure, based on the 5G node, the high-definition live broadcast system achieves effects such as high-quality model rendering, rapid import of digital 3D models, and high-precision motion and expression capture. Applied to AR live broadcast, the high-definition live broadcast system realizes low-delay model driving; when the RTMP protocol is adopted, the delay can be as low as about 2 s, enabling on-stage interaction between a virtual anchor and a real anchor.
To apply the hybrid cloud based high-definition live broadcast system to various live broadcast scenarios, the compatibility of the system needs to be ensured. Fig. 9 is a schematic diagram of the compatibility relationships of the high-definition live broadcast system. The high-definition live broadcast system can chain its services with multiple third-party platforms. As shown in the figure, the high-definition live broadcast system pulls streams from a third-party platform and then either plays them locally or forwards them to other third-party platforms for playback. The high-definition live broadcast system can pull real-time stream signals from the RTMP/RTSP/HLS pull addresses provided by a third-party platform and then forward them to other third-party platforms, completing multi-platform linked live broadcasting. In practical applications, information acquisition is most efficient with non-conventional acquisition equipment, mainly IP cameras.
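The forwarding step could, for example, be realized with a relay process such as the minimal sketch below; the use of FFmpeg and the URLs are assumptions for illustration, not details given for this system.

```python
# Illustrative relay for multi-platform linked live broadcasting: pull a stream
# from a third-party RTMP/RTSP/HLS address and re-push it to another platform.
# FFmpeg is an assumed tool; the URLs are hypothetical placeholders.
import subprocess

def relay_stream(source_url, target_rtmp_url):
    """Pull from source_url and re-push to target_rtmp_url without re-encoding."""
    cmd = [
        "ffmpeg",
        "-i", source_url,      # third-party pull address (RTMP/RTSP/HLS)
        "-c", "copy",          # pass the audio/video through unchanged
        "-f", "flv",           # RTMP carries FLV-muxed streams
        target_rtmp_url,
    ]
    return subprocess.run(cmd, check=True)

# relay_stream("rtmp://thirdparty.example/live/room1",
#              "rtmp://other-platform.example/live/room1")
```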
The present application also provides a computer-readable storage medium, in which a computer program is stored, and the computer program is used for implementing the methods provided by the above-mentioned various embodiments when being executed by a processor.
The present application further provides a computer program product comprising a computer program stored in a computer-readable storage medium. At least one processor of a device can read the computer program from the computer-readable storage medium, and execution of the computer program by the at least one processor causes the device to implement the methods provided in the foregoing embodiments.
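For orientation only, the skeleton below mirrors the three method steps implemented by such a program (acquire a video signal, transcode it according to a preset rule into a playing file, distribute the playing file downlink). All function and parameter names are hypothetical placeholders and the bodies are deliberately left unimplemented; this is a schematic sketch, not the program product of the embodiments.

```python
# Schematic skeleton only: mirrors the claimed streaming media method. Nothing here
# is taken from the actual program product; names and defaults are illustrative.
from dataclasses import dataclass

@dataclass
class TranscodeRule:
    codec: str = "h264"
    bitrate_kbps: int = 4000
    container: str = "hls"   # streaming-media playing format, e.g. HLS/DASH

def acquire_video_signal(ingest_url: str) -> bytes:
    """Receive the streaming-media signal collected and encoded by front-end equipment."""
    raise NotImplementedError("pull the signal from the private cloud ingest point")

def transcode(signal: bytes, rule: TranscodeRule) -> bytes:
    """Transcode the video signal according to the preset rule to obtain a playing file."""
    raise NotImplementedError("invoke the transcoding node of the source station")

def distribute_downlink(playing_file: bytes, edge_node: str) -> None:
    """Distribute the playing file downlink to user terminals via an edge node."""
    raise NotImplementedError("hand the playing file to the selected edge node")

def streaming_media_method(ingest_url: str, edge_node: str) -> None:
    signal = acquire_video_signal(ingest_url)
    playing_file = transcode(signal, TranscodeRule())
    distribute_downlink(playing_file, edge_node)
```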
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into modules is only a division of logical functions, and an actual implementation may use another division; for instance, a plurality of modules may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be electrical, mechanical or in another form.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, and such modifications or replacements do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A high definition live broadcast system based on a hybrid cloud, the system comprising: a private cloud service source station and a public cloud service source station;
the private cloud service source station is used for acquiring a video signal and transcoding the video signal according to a preset transcoding rule to obtain a playing file, wherein the playing file is a streaming media playing file obtained after the video signal is transcoded;
and the public cloud service source station is used for distributing the playing file to a user terminal in a downlink manner.
2. The hybrid cloud based high definition live broadcast system according to claim 1, wherein the private cloud service source station further comprises: a filtering pool;
the filtering pool is used for filtering the video signal according to a preset rule, wherein the video signal is a streaming media signal acquired and encoded by front-end equipment.
3. The hybrid cloud based high definition live broadcast system according to claim 2, wherein the private cloud service source station further comprises: a security pool;
the security pool is used for monitoring the video signal acquired by the private cloud service source station.
4. The hybrid cloud based high definition live broadcast system according to claim 3, wherein the private cloud service source station further comprises: a storage area;
the storage area is used for storing the video signal and/or the playing file.
5. The hybrid cloud based high definition live broadcast system of claim 1, wherein the public cloud service source station comprises a transcoding node;
the transcoding node is used for determining a video code rate and a transcoding strategy for the playing file according to an output format of a user terminal, and transcoding the playing file according to the video code rate and the transcoding strategy, wherein the video code rate is one of ultra-high definition, high definition and standard definition, and the transcoding strategy comprises a picture scale.
6. The hybrid cloud based high definition live broadcast system of claim 5, wherein the public cloud service source station comprises: a dispatching desk and an edge node;
the dispatching desk is used for determining an edge node for downlink distribution according to the user terminal;
and the edge node is used for realizing the downlink distribution of the playing file.
7. The hybrid cloud based high definition live broadcast system according to any one of claims 1 to 6, further comprising: a 5G node;
and the 5G node is used for sending the video signal uploaded by the front-end equipment to the private cloud service source station in the form of 5G streaming media.
8. A method for streaming media, the method comprising:
acquiring a video signal, wherein the video signal is a streaming media signal acquired and encoded by front-end equipment;
transcoding the video signal according to a preset transcoding rule to obtain a playing file, wherein the playing file is a streaming media playing file obtained after transcoding the video signal;
and distributing the playing file in a downlink manner.
9. A computer-readable storage medium, in which a computer program is stored, wherein the computer program, when executed by a processor, implements the hybrid cloud based high definition live broadcast system according to any one of claims 1 to 7.
10. A computer program product, characterized in that the computer program product comprises a computer program which, when executed by a processor, implements the hybrid cloud based high definition live broadcast system of any one of claims 1 to 7.