CN111405308A - Method and device for sending live audio and video data

Method and device for sending live audio and video data

Info

Publication number
CN111405308A
Authority
CN
China
Prior art keywords
server
audio
video data
live
data processing
Prior art date
Legal status
Granted
Application number
CN202010215403.3A
Other languages
Chinese (zh)
Other versions
CN111405308B (en)
Inventor
郭志鸣
孙聪
Current Assignee
Beijing Century TAL Education Technology Co Ltd
Original Assignee
Beijing Three Body Cloud Times Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Three Body Cloud Times Technology Co ltd filed Critical Beijing Three Body Cloud Times Technology Co ltd
Priority to CN202010215403.3A priority Critical patent/CN111405308B/en
Publication of CN111405308A publication Critical patent/CN111405308A/en
Application granted granted Critical
Publication of CN111405308B publication Critical patent/CN111405308B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/2368Multiplexing of audio and video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4126The peripheral being portable, e.g. PDAs or mobile phones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The embodiment of the application provides a method and a device for sending live audio and video data, wherein the method comprises the following steps: the data processing server receives first live audio and video data sent by a first media server; and the data processing server sends the first live audio and video data to a target transit server so that the target transit server sends the first live audio and video data to a second media server, wherein the target transit server is the transit server, among the plurality of transit servers, with the best connectivity to the data processing server. According to the embodiment of the application, the live audio and video data are sent through the transit server with the best connectivity to the data processing server, so that the data processing server and the target transit server achieve the best transmission effect and the sound quality of mic-connected (co-streaming) users is guaranteed.

Description

Method and device for sending live audio and video data
Technical Field
The present application relates to the field of network technologies, and in particular, to a method and an apparatus for sending live audio and video data.
Background
With the development of the internet, streaming-media applications on the internet have also developed widely. Live webcasting is an important component of streaming media and attracts more and more users with its rich content and interactivity. The mic-connection (co-streaming) form of live broadcasting, in which users in live broadcast rooms connect their microphones, has greatly enriched how live broadcasts are presented and improved the viewing experience of users.
In existing live mic-connection methods, the live audio and video data of the several users who are mic-connected are generally pushed to a data processing server, and the data processing server completes the mixing (for example, server-side mixing) and forwarding of the live audio and video data in the live broadcast room, thereby enabling voice intercommunication between the mic-connected users.
In the process of implementing the invention, the inventors found the following problem in the prior art: when the users in a live broadcast room span different regions (especially large regions, for example, different countries), the complexity of the global network means that the network links between the data processing server and the media servers accessed by all users usually cannot be guaranteed to be in a good state, which can result in poor sound quality when users in the live broadcast room are mic-connected.
Disclosure of Invention
An object of the embodiments of the present application is to provide a method and an apparatus for sending live audio and video data, so as to solve the prior-art problem of poor sound quality when users in a live broadcast room are mic-connected.
In a first aspect, an embodiment of the present application provides a method for sending live audio and video data. The method is applied to an audio and video live broadcast system that includes a first media server, a second media server, a data processing server, and a plurality of transit servers, where the first media server is connected to the data processing server. The method includes: the data processing server receives first live audio and video data sent by the first media server; the data processing server sends the first live audio and video data to a target transit server so that the target transit server sends the first live audio and video data to the second media server, wherein the target transit server is the transit server, among the plurality of transit servers, with the best connectivity to the data processing server (or, the target transit server is a transit server, among the plurality of transit servers, that meets a preset condition). In other words, in the embodiment of the application, the data processing server receives the first live audio and video data sent by the first media server and then sends it to the target transit server, so that the target transit server sends it on to the second media server, where the target transit server is the transit server with the best connectivity to the data processing server among the plurality of transit servers. Because the live audio and video data are sent through the transit server with the best connectivity to the data processing server, the data processing server and the target transit server achieve the best transmission effect, the sound quality of the mic-connected users is guaranteed, and the prior-art problem of poor sound quality when users in a live broadcast room are mic-connected is solved.
In one possible embodiment, the audio and video live broadcast system further includes a third media server, and the method further includes: the data processing server receives second live audio and video data sent by the target transit server, wherein the second live audio and video data are sent to the target transit server by the second media server; the data processing server mixes the first live audio and video data and the second live audio and video data to obtain third live audio and video data; and the data processing server sends the third live audio and video data to the target transit server so that the target transit server sends the third live audio and video data to the third media server.
Therefore, the embodiment of the application can also send the live audio and video data through the transit server with the best connectivity to the data processing server, so that the data processing server and the target transit server achieve the best transmission effect, the quality of the live pictures is ensured, and the viewing experience of the audience is improved.
In a second aspect, an embodiment of the present application provides a method for sending live audio and video data. The method is applied to an audio and video live broadcast system that includes a first media server, a second media server, a data processing server, and a plurality of transit servers, where the first media server is connected to the data processing server. The method includes: the target transit server receives first live audio and video data sent by the data processing server, wherein the target transit server is the transit server, among the plurality of transit servers, with the best connectivity to the data processing server, and the first live audio and video data are sent to the data processing server by the first media server; and the target transit server sends the first live audio and video data to the second media server.
In one possible embodiment, the audio and video live broadcast system further includes a third media server, and the method further includes: the target transit server receives third live audio and video data sent by the data processing server, wherein the third live audio and video data are obtained by the data processing server mixing second live audio and video data with the first live audio and video data, and the second live audio and video data are sent to the target transit server by the second media server; and the target transit server sends the third live audio and video data to the third media server.
In a third aspect, an embodiment of the present application provides a device for sending live audio and video data. The device is applied to an audio and video live broadcast system that includes a first media server, a second media server, a data processing server and a plurality of transit servers, where the first media server is connected to the data processing server, and the device is applied to the data processing server. The device includes: a first receiving module, configured to receive first live audio and video data sent by the first media server; and a first sending module, configured to send the first live audio and video data to a target transit server so that the target transit server sends the first live audio and video data to the second media server, wherein the target transit server is the transit server, among the plurality of transit servers, with the best connectivity to the data processing server.
In a possible embodiment, the audio and video live broadcast system further includes a third media server, and the device further includes: the first receiving module is further configured to receive second live audio and video data sent by the target transit server, wherein the second live audio and video data are sent to the target transit server by the second media server; a mixing module, configured to mix the first live audio and video data and the second live audio and video data to obtain third live audio and video data; the first sending module is further configured to send the third live audio and video data to the target transit server, so that the target transit server sends the third live audio and video data to the third media server.
In a fourth aspect, an embodiment of the present application provides an apparatus for sending live audio and video data. The apparatus is applied to an audio and video live broadcast system that includes a first media server, a second media server, a data processing server, and a plurality of transit servers, where the first media server is connected to the data processing server. The apparatus is applied to a target transit server, and the target transit server is the transit server, among the plurality of transit servers, with the best connectivity to the data processing server. The apparatus includes: a second receiving module, configured to receive first live audio and video data sent by the data processing server, wherein the first live audio and video data are sent to the data processing server by the first media server; and a second sending module, configured to send the first live audio and video data to the second media server.
In a possible embodiment, the audio-video live system further includes a third media server, and the apparatus further includes: the second receiving module is used for receiving third live broadcast audio and video data sent by the data processing server, wherein the third live broadcast audio and video data are obtained by the data processing server after the second live broadcast audio and video data and the first live broadcast audio and video data are subjected to sound mixing, and the second live broadcast audio and video data are sent to the target transit server by the second media server; and the second sending module is also used for sending third live audio and video data to a third media server.
In a fifth aspect, the present application provides a storage medium having a computer program stored thereon, where the computer program is executed by a processor to perform the method of the first aspect or any optional implementation manner of the first aspect.
In a sixth aspect, the present application provides a storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the method of the second aspect or any optional implementation manner of the second aspect.
In a seventh aspect, an embodiment of the present application provides an electronic device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the method of the first aspect or any of the alternative implementations of the first aspect.
In an eighth aspect, an embodiment of the present application provides an electronic device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the method of the second aspect or any of the alternative implementations of the second aspect.
In a ninth aspect, the present application provides a computer program product which, when run on a computer, causes the computer to perform the method of the first aspect or any possible implementation manner of the first aspect.
In a tenth aspect, the present application provides a computer program product which, when run on a computer, causes the computer to perform the method of the second aspect or any possible implementation of the second aspect.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
Fig. 1 shows a schematic diagram of an application scenario in the prior art;
FIG. 2 illustrates a schematic diagram of an application scenario to which embodiments of the present application are applicable;
fig. 3 is a flowchart illustrating a method for sending live audio/video data according to an embodiment of the present application;
fig. 4 shows a block diagram of a system for sending live audio/video data according to an embodiment of the present application;
fig. 5 shows a specific flowchart of a method for sending live audio/video data according to an embodiment of the present application;
fig. 6 shows a block diagram of an apparatus for sending live audio/video data according to an embodiment of the present application;
fig. 7 shows a block diagram of another apparatus for sending live audio/video data according to an embodiment of the present application;
fig. 8 shows a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, fig. 1 is a schematic diagram illustrating an application scenario 100 in the prior art. Specifically, the application scenario 100 includes: data processing server 110, media server 120, media server 130, and media server 140. And, the application scenario 100 further includes: mobile terminals 121 and 122 in communication with the media server 120 (or, in other words, accessing the media server 120), and mobile terminals 131 and 132 in communication with the media server 130, and mobile terminal 141 in communication with the media server 140.
Among them, the media server 120 and the mobile terminals 121 and 122 correspond to a first area, and the network link between the media server 120 and the data processing server 110 is good. Likewise, the media server 140 and the mobile terminal 141 correspond to a third area, and the network link between the media server 140 and the data processing server 110 is also good.
However, the network link between the data processing server 110 and the media server 130, which corresponds to a second area, is poor (the dotted line in fig. 1 indicates a network link with poor connectivity). When a user corresponding to a mobile terminal in one of the other areas (e.g., the mobile terminal 121 or the mobile terminal 141) is mic-connected with the user corresponding to the mobile terminal 131, the sound quality between the mic-connected users may therefore suffer because of this network link.
Moreover, since the live audio and video data exchanged between the mobile terminal 131 and the mobile terminal 132, which are in the same area, must also pass through the data processing server 110, the sound quality of mic-connected users within that same area is likewise degraded by the poor network link.
In addition, when the user corresponding to the mobile terminal 131 (or the mobile terminal 132) broadcasts live, the live audio and video data of the mobile terminal 131 must also pass through the data processing server 110 before reaching a user watching the live broadcast (for example, the user corresponding to the mobile terminal 141), so the viewing experience of that user is poor as well.
Based on this, the embodiment of the application provides a scheme for sending live audio and video data: the data processing server receives the first live audio and video data sent by the first media server, and the data processing server sends the first live audio and video data to a target transit server, so that the target transit server sends the first live audio and video data to the second media server, wherein the target transit server is the transit server, among the plurality of transit servers, with the best connectivity to the data processing server. Because the live audio and video data are sent through the transit server with the best connectivity to the data processing server, the sound quality of the mic-connected users is guaranteed, and the prior-art problem of poor sound quality when users in a live broadcast room are mic-connected is solved.
To facilitate understanding of the embodiments of the present application, some terms in the embodiments of the present application are first explained herein as follows:
Mixing refers to combining two or more live audio and video streams into a single live audio and video stream.
"Connectivity" refers to the quality of a network link's transmission performance.
Referring to fig. 2, fig. 2 shows an illustrative diagram of an application scenario 200 to which embodiments of the present application are applicable. Specifically, the application scenario 200 includes: a data processing server 210, a media server 220, a mobile terminal 221, a relay server 230, a media server 240, and a mobile terminal 241.
It should be noted that the number of the data processing servers 210 in the embodiment of the present application may be multiple, and for convenience of description, only one data processing server 210 is shown in fig. 2.
It should be further noted that, in the embodiment of the present application, the number of the transit servers 230 may also be multiple, and this is also for convenience of description, so fig. 2 only shows one transit server 230.
The data processing server 210, the media server 220, and the mobile terminal 221 may be divided into a first area, and the relay server 230, the media server 240, and the mobile terminal 241 may be divided into a second area.
It should be understood that the type of the data processing server 210 may be set according to actual requirements, and the embodiment of the present application is not limited thereto.
For example, the data processing server 210 may be a single server, a server group, or the like, wherein the data processing server 210 may be centralized or distributed (for example, the data processing server 210 may be a distributed system).
Correspondingly, the media server 220, the transit server 230 and the media server 240 are all similar to the data processing server 210, i.e. they can all set the type of server according to actual requirements, and will not be described in detail here, and refer to the related description of the data processing server 210 in the foregoing.
It should also be understood that the specific type of the mobile terminal 221 may also be set according to actual requirements, and the embodiment of the present application is not limited thereto.
For example, the mobile terminal 221 may be a mobile phone, a tablet computer, a wearable device, or the like.
Correspondingly, the mobile terminal 241 may be similar to the mobile terminal 221, i.e. it may also set the type of the mobile terminal according to actual requirements, and will not be described in detail here, and refer to the foregoing description of the mobile terminal 221.
In the embodiment of the present application, in the case that the user corresponding to the mobile terminal 221 performs live broadcasting, at this time, the mobile terminal 221 may communicate with the media server 220, and the media server 220 may communicate with the data processing server 210. In the case where the user corresponding to mobile terminal 221 desires to have a live connection with the user corresponding to mobile terminal 241, mobile terminal 241 may now communicate with media server 240.
Subsequently, a scheduling server (not shown in fig. 2) may transmit attribute information (including an IP address and a server number) of the media server 240 to the data processing server 210. The data processing server 210 may then query, according to the attribute information of the media server 240, a pre-acquired transit server list for the transit servers belonging to the same area as the media server 240. The transit server list may record the transit servers of a plurality of different areas (for example, the first area may be configured with 3 transit servers and the addresses of those 3 transit servers). Through the transit server list, the data processing server 210 may determine that the transit servers belonging to the same area as the media server 240 include the transit server 230 and other transit servers (not shown in fig. 2).
Subsequently, the data processing server 210 performs route probing on each of the plurality of transit servers in the second area, selects the transit server 230 as the one with the best connectivity to the data processing server 210, and takes the transit server 230 as the target transit server. The data processing server 210 then establishes a communication link with the transit server 230, and the transit server 230 establishes a communication link with the media server 240. Thus, the user corresponding to the mobile terminal 221 and the user corresponding to the mobile terminal 241 can carry out a live mic connection.
In addition, the data processing server 210 may further mix the live audio and video data of the mobile terminal 221 and the live audio and video data of the mobile terminal 241 to obtain mixed live audio and video data. And, the data processing server 210 may transmit the mixed live audio and video data to other mobile terminals (not shown in fig. 2), so that the viewers can watch the live content of the live broadcast room.
It should be noted that the scheme for sending live audio/video data provided in the embodiment of the present invention may be further extended to other suitable application scenarios, and is not limited to the application scenario 200 shown in fig. 2. In addition, although only 2 media servers and 2 mobile terminals are shown in fig. 2, it should be understood by those skilled in the art that the application scenario 200 may include more media servers or mobile terminals in the process of practical application, and the embodiment of the present application is not limited thereto.
Referring to fig. 3, fig. 3 shows a flowchart of a method for sending live audio/video data according to an embodiment of the present application. The method shown in fig. 3 is applied to an audio and video live broadcasting system, the audio and video live broadcasting system includes a first media server, a second media server, a data processing server and a plurality of relay servers, the first media server is connected with the data processing server, and the method includes:
in step S310, the data processing server sends a test packet for testing connectivity to each of the plurality of transit servers.
It should be understood that, since the data processing server has the functions of mixing and relaying, the data processing server may also be referred to as a mixing/relaying server, and the embodiments of the present application are not limited thereto.
Specifically, when the user corresponding to the first media server wants to perform a live mic connection with the user corresponding to the second media server, the first media server is already connected to the data processing server. The scheduling server may acquire attribute information of the second media server, wherein the attribute information includes the IP address of the second media server and the server number of the second media server. Subsequently, the scheduling server transmits the attribute information of the second media server to the data processing server. Correspondingly, the data processing server receives the attribute information of the second media server sent by the scheduling server.
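Purely as an illustration, the attribute information exchanged here could be modeled as a small record; the field names and the extra region field are assumptions, since the patent only specifies an IP address and a server number:

```python
from dataclasses import dataclass

@dataclass
class MediaServerInfo:
    """Attribute information passed from the scheduling server to the data
    processing server; the patent only names an IP address and a server number."""
    ip_address: str
    server_id: str
    region: str = ""   # hypothetical extra field, used by the later region-lookup sketch

# Example: attribute information describing the second media server
second_media_server = MediaServerInfo(ip_address="203.0.113.10",
                                      server_id="media-b-01",
                                      region="B")
```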
It should be understood that, although the foregoing description shows that the attribute information includes an IP address and a server number, it should be understood by those skilled in the art that the attribute information may also include other information besides the IP address and the server number, and the embodiment of the present application is not limited thereto.
It should also be understood that the manner in which the scheduling server obtains the attribute information of the second media server may also be set according to actual requirements, and the embodiment of the present application is not limited thereto.
For example, the second media server may send a request to the scheduling server. Since the scheduling server may store the attribute information of the media servers in each area, after receiving the request it can determine attribute information such as the IP address and server number of the second media server directly by querying its records.
As another example, the second media server may send a message carrying attribute information such as its own IP address and server number to the data processing server, so that the data processing server determines the attribute information of the second media server from that message.
It should also be understood that the first media server may also be referred to as an audio-video access server, and the embodiments of the present application are not limited thereto.
Correspondingly, other media servers may also be referred to as an audio/video access server, and the embodiments of the present application are not limited thereto.
In addition, since the data processing server may obtain in advance a transit server list that records the transit servers of the entire network, the data processing server may find, from the transit server list and according to the attribute information of the second media server, the plurality of transit servers in the same area as the second media server.
It should be understood that the manner in which the data processing server obtains the transit server list in advance may be set according to actual requirements, and the embodiment of the present application is not limited thereto. For example, the data processing server may store a transit server list in advance, and the data processing server may obtain the transit server list by reading the memory. For another example, the transit server list may be stored in a database, and the data processing server may acquire the transit server list in advance by downloading from the database.
In addition, once the data processing server has determined the plurality of transit servers in the same area as the second media server, it may send a test packet to each of these transit servers to perform a connectivity test.
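A minimal sketch of this lookup-and-probe step, assuming the transit server list is a simple region-to-address mapping and that connectivity is probed with timestamped UDP test packets (the list layout, addresses and probe format are illustrative assumptions, not taken from the patent):

```python
import socket
import time

# Assumed layout of the pre-acquired transit server list: region -> [(ip, port), ...]
TRANSIT_SERVER_LIST = {
    "B": [("198.51.100.1", 7000), ("198.51.100.2", 7000), ("198.51.100.3", 7000)],
}

def transit_servers_in_region(region):
    """Look up the transit servers that share a region with the second media server."""
    return TRANSIT_SERVER_LIST.get(region, [])

def send_test_packets(servers, count=10, timeout=0.5):
    """Send `count` test packets to each transit server and record, per packet,
    the send time and the receive time of its feedback packet (None if lost)."""
    results = {}
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    for addr in servers:
        samples = []
        for seq in range(count):
            sent_at = time.time()
            try:
                sock.sendto(f"probe-{seq}".encode(), addr)
                sock.recvfrom(1024)                  # feedback packet from the transit server
                samples.append((sent_at, time.time()))
            except OSError:                          # timeout or unreachable link
                samples.append((sent_at, None))
        results[addr] = samples
    sock.close()
    return results
```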
In step S320, the data processing server receives a feedback packet sent by at least one relay server of the plurality of relay servers.
It should be appreciated that, since the network link between the data processing server and one or more of the plurality of transit servers may be down, the data processing server may not receive a feedback packet after sending a test packet over such a link.
Specifically, after the data processing server sends a test packet for testing connectivity to each of the plurality of transit servers, a transit server that receives the test packet (e.g., the subsequently determined target transit server) may generate a feedback packet from the test packet and send that feedback packet back to the data processing server.
Step S330, the data processing server parses the feedback packets sent by the at least one transit server to obtain connectivity test parameters. The connectivity test parameters comprise at least one of the following: packet loss rate, network status, and network delay.
It should be understood that, although the foregoing lists packet loss rate, network status and network delay, those skilled in the art should understand that the test parameters may also include other parameters besides these three, and the embodiment of the present application is not limited thereto.
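Continuing the sketch above, the feedback packets could be reduced to the connectivity test parameters named here; the exact parameter set and units are assumptions:

```python
from dataclasses import dataclass

@dataclass
class ConnectivityReport:
    packet_loss: float    # fraction of test packets that received no feedback packet
    avg_delay_ms: float   # mean round-trip delay of the answered test packets

def parse_feedback(samples):
    """Turn the (send_time, receive_time_or_None) pairs from send_test_packets
    into the connectivity test parameters: packet loss rate and network delay."""
    answered = [(sent, recv) for sent, recv in samples if recv is not None]
    loss = 1.0 - len(answered) / len(samples) if samples else 1.0
    delays_ms = [(recv - sent) * 1000.0 for sent, recv in answered]
    avg_delay = sum(delays_ms) / len(delays_ms) if delays_ms else float("inf")
    return ConnectivityReport(packet_loss=loss, avg_delay_ms=avg_delay)
```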
Step S340, the data processing server determines, according to the connectivity test parameters, the transit server with the best connectivity to the data processing server, and takes that transit server as the target transit server.
Specifically, the data processing server may sort all the transit servers in order from good to bad in connectivity according to the test parameters of connectivity. And the data processing server may take the first transit server with the best connectivity in the ranking results as the target transit server.
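A sketch of the ranking step; the patent only requires ordering from best to worst connectivity, so the specific weighting of packet loss against delay below is an assumption:

```python
def connectivity_score(report):
    """Lower is better: packet loss is weighted heavily, delay breaks ties (assumed weighting)."""
    return report.packet_loss * 1000.0 + report.avg_delay_ms

def pick_target_transit_server(reports):
    """`reports` maps a transit-server address to its ConnectivityReport.
    Sort from best to worst connectivity and take the first-ranked server."""
    ranked = sorted(reports.items(), key=lambda item: connectivity_score(item[1]))
    return ranked[0][0] if ranked else None
```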
It should be noted that, although the determination process of the target transit server is shown in the foregoing steps S310 to S340, it should be understood by those skilled in the art that the target transit server may be determined by other methods as long as it is ensured that the determined transit server is the transit server with the best connectivity, and the embodiment of the present application is not limited thereto.
In step S350, the data processing server receives the first live audio and video data sent by the first media server. Correspondingly, the first media server sends the first live audio and video data to the data processing server.
It should be understood that the first live audio and video data may also be referred to as a first live audio and video data stream, and the embodiments of the present application are not limited thereto.
Correspondingly, other live audio and video data may also be referred to as a live audio and video data stream, and the embodiments of the present application are not limited thereto.
Step S360, the data processing server sends the first live audio and video data to the target transit server, so that the target transit server sends the first live audio and video data to the second media server. Here, the target transit server is the transit server, among the plurality of transit servers, with the best connectivity to the data processing server. Correspondingly, the target transit server receives the first live audio and video data sent by the data processing server.
It should be understood that, after the data processing server determines the target transit server, it may record information about the area that the target transit server is responsible for. If the data processing server later receives attribute information of another media server in the same area as the second media server, it can then directly reuse the link with the target transit server to transmit the live audio and video data and notify the target transit server to connect to that other media server.
In addition, since the connectivity of the network may change over time, the data processing server in the embodiment of the present application may repeat steps S310 to S330 after a preset time to re-determine the connectivity between the data processing server and each transit server in the area. After determining the transit server with the best current connectivity, the data processing server may check whether it is the same transit server as the previously determined target transit server.
When the transit server with the best current connectivity is the same as the previously determined target transit server, the data processing server can continue to send the live audio and video data to that target transit server. However, when it is not the same transit server, the data processing server may disconnect the network link with the previous target transit server and establish a network link with the transit server that currently has the best connectivity, that is, that transit server subsequently serves as the new target transit server.
It should be understood that the preset time may be set according to actual requirements, and the embodiments of the present application are not limited thereto.
For example, the preset time may be 5 minutes, 1 hour, 1 day, one week, or the like.
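Putting the previous sketches together, the periodic re-evaluation described above might look like the loop below; the 5-minute interval is one of the example preset times, and the helper functions are the hypothetical ones defined earlier, so this is an illustrative sketch rather than the patented implementation:

```python
import time

PROBE_INTERVAL_S = 5 * 60   # "preset time"; 5 minutes is one of the examples given above

def maintain_target_transit_server(region, current_target):
    """Re-probe the region's transit servers after each preset interval and
    switch the link only if a different transit server now has the best connectivity."""
    while True:
        time.sleep(PROBE_INTERVAL_S)
        servers = transit_servers_in_region(region)
        samples = send_test_packets(servers)
        reports = {addr: parse_feedback(s) for addr, s in samples.items()}
        best = pick_target_transit_server(reports)
        if best is not None and best != current_target:
            # Disconnect from the previous target and re-establish the network
            # link with the new best-connectivity transit server (link handling
            # itself is outside this sketch).
            current_target = best
        # Otherwise keep sending live audio/video data over the existing link.
```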
It should be noted that, although the foregoing describes a process in which the data processing server sends live audio and video data to the target transit server, a person skilled in the art should understand that the target transit server may also send live audio and video data to the data processing server, and the embodiment of the present application is not limited to this.
It should be further noted that, when the user corresponding to the first media server wants to perform a live mic connection with the user corresponding to the second media server, the data processing server may, for the mic connection itself, simply forward the first live audio and video data sent by the first media server and the second live audio and video data sent by the second media server. For the audience in the live broadcast room, however, the data processing server needs to mix the live audio and video data of the mic-connected users (for example, the first live audio and video data and the second live audio and video data) and push the mixed live audio and video data to the mobile terminal devices corresponding to the audience.
For example, when the audio and video live broadcast system further comprises a third media server, the data processing server receives second live audio and video data sent by the target transit server, wherein the second live audio and video data are sent to the target transit server by the second media server; the data processing server mixes the first live audio and video data and the second live audio and video data to obtain third live audio and video data; and the data processing server sends the third live audio and video data to the target transit server so that the target transit server sends the third live audio and video data to the third media server. Because the connectivity between the data processing server and the target transit server is optimal, the audience in the live broadcast room can watch the live broadcast clearly, and their viewing experience is improved.
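The patent does not detail how the mixing itself is performed; as a purely illustrative sketch, server-side mixing of two decoded 16-bit PCM streams could be as simple as sample-wise addition with clipping (real systems would also align timestamps and re-encode):

```python
def mix_pcm(first_samples, second_samples):
    """Mix two decoded 16-bit PCM sample lists into one stream by summing the
    overlapping samples and clipping to the int16 range."""
    length = min(len(first_samples), len(second_samples))
    mixed = []
    for i in range(length):
        s = first_samples[i] + second_samples[i]
        mixed.append(max(-32768, min(32767, s)))   # clip to 16-bit range
    return mixed

# Example: the "third" live audio data obtained from the first and second streams
third_samples = mix_pcm([1000, -2000, 30000], [500, -500, 10000])
print(third_samples)   # [1500, -2500, 32767]
```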
It should be understood that, although the foregoing takes a live mic connection between two mobile devices as an example, the embodiment of the present application can also implement a live mic connection among more than two users, and the embodiment of the present application is not limited thereto.
In addition, when two users in the same area perform a mic connection and the media servers and the data processing server corresponding to the two users are all located in that same area, the data processing server can obtain the corresponding live audio and video data directly from the media servers corresponding to the two users, thereby realizing the mic connection.
It should also be noted that, when two users in the same area perform a mic connection but the media servers corresponding to the two users and the data processing server are not in the same area, the media servers corresponding to the two users correspond to a single target transit server. Since that target transit server only relays data while the data processing server provides the mixing and forwarding functions, the data processing server can obtain the live audio and video data of the two users through the target transit server, thereby achieving the mic connection.
Therefore, in the embodiment of the application, the data processing server receives the first live audio and video data sent by the first media server and sends the first live audio and video data to the target transit server, so that the target transit server sends the first live audio and video data to the second media server, where the target transit server is the transit server, among the plurality of transit servers, with the best connectivity to the data processing server. Because the live audio and video data are sent through the transit server with the best connectivity to the data processing server, the data processing server and the target transit server achieve the best transmission effect, the sound quality of the mic-connected users is guaranteed, and the prior-art problem of poor sound quality for mic-connected users in a live broadcast room is solved.
In order to facilitate understanding of the embodiments of the present application, the following description will be given by way of specific examples.
Referring to fig. 4, fig. 4 is a block diagram illustrating a structure of a system for sending live audio and video data according to an embodiment of the present application. The system shown in fig. 4 comprises: data processing server 411, data processing server 412, data processing server 413, media server 420, mobile terminal 421, media server 430, mobile terminal 431, media server 440, mobile terminal 441, relay server 451, relay server 452, relay server 453, media server 460, mobile terminal 461, media server 470, mobile terminal 471, media server 480, and mobile terminal 481.
It should be noted that, as will be understood by those skilled in the art, the system shown in fig. 4 is only illustrative and not limiting to the system for transmitting live audiovisual data. For example, the number of data processing servers and the like can be further increased for the area a.
To facilitate understanding of the system for transmitting live audiovisual data shown in fig. 4, the method shown in fig. 5 is described below.
Referring to fig. 5, fig. 5 shows a specific flowchart of a method for sending live audio/video data according to an embodiment of the present application. The method shown in fig. 5 includes:
step S510, in the process of deploying the media server, the global server may be divided into a plurality of regions according to the division standards such as country or region.
It should be understood that the division standard may be set according to actual requirements, and the embodiments of the present application are not limited thereto.
Specifically, as shown in fig. 4, the data processing server 411, the data processing server 412, the data processing server 413, the media server 420, the mobile terminal 421, the media server 430, the mobile terminal 431, the media server 440, and the mobile terminal 441 may be divided into devices within an area a by region.
And, the transit server 451, the transit server 452, the transit server 453, the media server 460, the mobile terminal 461, the media server 470, the mobile terminal 471, the media server 480, and the mobile terminal 481 may also be divided into devices within the area B.
In step S520, when the first mic-connected live broadcast user enters the live broadcast room, a media server and a data processing server may be allocated to that user.
To facilitate understanding of step S520, the following description takes the case where the user corresponding to the mobile terminal 421 enters the live broadcast room as the first mic-connected user. Correspondingly, the process for a user corresponding to another mobile terminal entering the live broadcast room as the first mic-connected user is similar and will not be described again here.
For example, when the user corresponding to the mobile terminal 421 enters the live broadcast room as the first mic-connected user, the scheduling server (not shown in fig. 4) allocates a first gateway server (not shown in fig. 4) to the mobile terminal 421, and the first gateway server allocates the media server 420 to the mobile terminal 421. The scheduling server may also allocate the data processing server 411 to the mobile terminal 421 according to the area where the mobile terminal 421 is located, its operator, and so on. The scheduling server may also notify the data processing server 411 to obtain the live audio and video data of the mobile terminal 421 from the media server 420.
In step S530, when other mic-connected users enter the live broadcast room, a media server may be allocated to each of them.
To facilitate understanding of step S530, the following description takes the case where the user corresponding to the mobile terminal 461 enters the live broadcast room as another mic-connected user. Correspondingly, the process for users corresponding to other mobile terminals entering the live broadcast room as other mic-connected users is similar and will not be described again here.
For example, when the user corresponding to the mobile terminal 461 enters the live broadcast room as another mic-connected user and performs a mic connection, the scheduling server may allocate a second gateway server (not shown in fig. 4) for the mobile terminal 461 to access, and the second gateway server allocates the media server 460 to the mobile terminal 461. The scheduling server also notifies the data processing server 411 to obtain the live audio and video data of the mobile terminal 461 from the media server 460.
In step S540, the data processing server obtains the live audio and video data from the media servers corresponding to the other mic-connected users.
Specifically, the data processing server needs to judge whether the first mic-connected user and the other mic-connected users are in the same area. When they are in the same area, the data processing server obtains the live audio and video data directly from the media servers corresponding to the other mic-connected users; when they are not in the same area, the data processing server selects the target transit server with the best connectivity from the plurality of transit servers corresponding to the area where the other mic-connected users are located, and accordingly sends or obtains the live audio and video data through that target transit server (a sketch of this decision is given below).
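A compact sketch of this same-area / cross-area decision, reusing the hypothetical helpers from the earlier sketches (MediaServerInfo, transit_servers_in_region, send_test_packets, parse_feedback, pick_target_transit_server):

```python
def route_live_stream(data_server_region, peer_media_server):
    """Decide where the data processing server obtains the other mic-connected
    user's live audio/video data: directly from that user's media server when
    both are in the same area, otherwise via the best-connectivity transit
    server in the peer's area."""
    if peer_media_server.region == data_server_region:
        return ("direct", (peer_media_server.ip_address, None))
    servers = transit_servers_in_region(peer_media_server.region)
    samples = send_test_packets(servers)
    reports = {addr: parse_feedback(s) for addr, s in samples.items()}
    return ("via_transit", pick_target_transit_server(reports))
```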
In order to facilitate understanding of step S540, the following description is made by way of specific examples.
For example, when the user corresponding to the mobile terminal 421 and the user corresponding to the mobile terminal 461 perform a mic connection, the data processing server 411 needs to make the following determination: if the mobile terminal 421 and the mobile terminal 461 are located in the same area, the data processing server 411 may obtain the live audio and video data directly from the media server 460 corresponding to the mobile terminal 461; if the mobile terminal 421 and the mobile terminal 461 are in different areas, the data processing server 411 may perform route probing on all online transit servers in the area B where the mobile terminal 461 is located to obtain connectivity test parameters, determine from those parameters that the transit server 451 has the best connectivity to the data processing server 411, and use the transit server 451 as the target transit server. The data processing server 411 may then establish a network link with the transit server 451 and request the transit server 451 to establish an audio forwarding service for the live broadcast. The data processing server 411 also notifies the transit server 451 to acquire the live audio and video data of the mobile terminal 461 from the media server 460.
For another example, when the user corresponding to the mobile terminal 461 and the user corresponding to the mobile terminal 471 in area B perform a live mic connection, both the mobile terminal 461 and the mobile terminal 471 can transmit their live audio and video data to the transit server 451, so that the mic connection can be realized through the transit server 451. For the audience in the live broadcast room, however, the transit server 451 needs to send the live audio and video data of the mobile terminal 461 and of the mobile terminal 471 to the data processing server 411, and the data processing server 411 mixes them and sends the mixed live audio and video data to the audience watching the live broadcast of the mobile terminal 461 and the mobile terminal 471. In this way, the live audio and video data are sent through the transit server with the best connectivity to the data processing server, so that the data processing server and the target transit server achieve the best transmission effect, the quality of the live audio and video is guaranteed, and the live viewing experience is improved.
In step S550, the data processing server generates a transit route record for the live broadcast room. The transit route record can store which transit server is used for sending live audio and video data in a given area, so that the transit server for that area can subsequently be taken directly as the target transit server by consulting the record.
In order to facilitate understanding of step S550, the following description is made by way of specific examples.
For example, when the target transit server determined in step S540 is the transit server 451, the transit route record generated by the data processing server 411 is: the transit server for sending live audio and video data in area B is the transit server 451.
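A sketch of the transit route record, assumed here to be a simple per-live-room mapping from area to the target transit server already serving that area:

```python
# Assumed shape of the transit route record kept for a live broadcast room:
# area -> address of the target transit server for that area.
transit_route_record = {"B": ("198.51.100.1", 7000)}

def target_for_area(area, choose_target):
    """Reuse the recorded target transit server for an area when a relevant
    record exists; otherwise run the probing/selection step (choose_target)
    once and record its result for later mic-connected users."""
    if area not in transit_route_record:
        transit_route_record[area] = choose_target(area)
    return transit_route_record[area]
```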
In step S560, when other mic-connected users subsequently enter the live broadcast room, if the data processing server holds a relevant transit route record, the live audio and video data can be forwarded through the target transit server in that record. A relevant transit route record here means a record for a transit server in the same area as the other mic-connected users.
In order to facilitate understanding of step S560, the following description is made by way of specific examples.
For example, when the user corresponding to the mobile terminal 471 enters the live broadcast room and the data processing server 411 determines that the mobile terminal 461 and the mobile terminal 471 are both in area B, the data processing server 411 may directly reuse the network link between the data processing server 411 and the transit server 451, and notify the transit server 451 to obtain the live audio and video data of the mobile terminal 471 from the media server 470.
In addition, if the data processing server holds no transit route record relevant to the other mic-connected user, the data processing server may carry out the transmission of the live audio and video data through step S540.
In step S570, the target transit server forwards the live audio and video data of its corresponding area to the data processing server, which mixes and forwards them for the users.
In order to facilitate understanding of step S570, the following description is made by way of specific examples.
For example, the transit server 451 forwards the live audio and video data of area B to the data processing server 411, so that for the mic-connected live broadcast users the data processing server 411 can forward the live audio and video data of area B. For the audience in the live broadcast room, however, the data processing server 411 needs to mix the live audio and video data of area B and/or of area A, and send the mixed live audio and video data to the user terminals corresponding to the audience (for example, the mobile terminal 441 or the mobile terminal 481). Thus, according to the technical solution of the embodiment of the present application, the audio and video forwarding between the users within area B is completed by the target transit server (that is, the transit server 451), and the forwarding of audio and video data between the users in area B and the users in area A is completed by the data processing server 411 together with the transit server 451.
Therefore, compared with the prior art, the embodiment of the application can send the live audio and video data through the transit server with the best connectivity to the data processing server, so that the data processing server and the target transit server achieve the best transmission effect, and better audio and video intercommunication quality can be achieved regardless of whether the mic-connected users in a live broadcast room are in different regions or in the same region.
It should be understood that the processing method described above is only exemplary, and those skilled in the art may make various modifications to it; the modified solutions also fall within the protection scope of the embodiments of the present application.
Further, while the operations of the method of the present application are depicted in the drawings in a particular order, this does not require or imply that the operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Rather, the steps depicted in the flowcharts may be executed in a different order. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be broken down into multiple steps. For example, for fig. 3, steps S330 to S340 may be combined into one step.
Referring to fig. 6, fig. 6 shows a block diagram of an apparatus 600 for sending live audio and video data according to an embodiment of the present application. It should be understood that the apparatus 600 corresponds to the data processing server side of the method embodiments of fig. 3 or fig. 5 described above and can perform the steps related to the data processing server side in those method embodiments; its specific functions may be referred to in the description above, and detailed descriptions are omitted here as appropriate to avoid repetition. The apparatus 600 includes at least one software functional module that can be stored in a memory in the form of software or firmware, or embedded in an Operating System (OS) of the apparatus 600. Specifically, the apparatus 600 is applied to an audio and video live broadcast system, the audio and video live broadcast system includes a first media server, a second media server, a data processing server and a plurality of transit servers, the first media server is connected to the data processing server, the apparatus 600 is applied to the data processing server, and the apparatus 600 includes:
a first receiving module 610, configured to receive first live audio and video data sent by the first media server; and a first sending module 620, configured to send the first live audio and video data to a target transit server, so that the target transit server sends the first live audio and video data to the second media server, where the target transit server is the transit server with the best connectivity with the data processing server among the plurality of transit servers.
In a possible embodiment, the audio and video live broadcast system further includes a third media server, and the apparatus 600 further includes: the first receiving module 610 is further configured to receive second live audio and video data sent by the target transit server, where the second live audio and video data is sent to the target transit server by the second media server; a mixing module (not shown) is configured to mix the first live audio and video data and the second live audio and video data to obtain third live audio and video data; and the first sending module 620 is further configured to send the third live audio and video data to the target transit server, so that the target transit server sends the third live audio and video data to the third media server.
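A minimal sketch (an assumption) of how the modules of the apparatus 600 could be wired on the data processing server side is given below. The transit.send() and mixer.mix() calls are placeholders for the actual transport and mixing, which the description does not detail.

class LiveAVSendingApparatus:
    def __init__(self, target_transit_server, mixer):
        self.transit = target_transit_server  # best-connectivity transit server
        self.mixer = mixer

    # First receiving module 610 + first sending module 620
    def on_first_live_av(self, first_av: bytes) -> None:
        self.transit.send(first_av)  # relayed on to the second media server

    # First receiving module 610 (second stream) + mixing module + module 620
    def on_second_live_av(self, first_av: bytes, second_av: bytes) -> None:
        third_av = self.mixer.mix(first_av, second_av)
        self.transit.send(third_av)  # relayed on to the third media server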
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the apparatus described above may refer to the corresponding process in the foregoing method, and will not be described in detail herein.
Referring to fig. 7, fig. 7 shows a block diagram of another apparatus 700 for sending live audio and video data according to an embodiment of the present application. It should be understood that the apparatus 700 corresponds to the target transit server side of the method embodiments of fig. 2 or fig. 3 described above and can perform the steps related to the target transit server side in those method embodiments; its specific functions may be referred to in the description above, and detailed descriptions are omitted here as appropriate to avoid repetition. The apparatus 700 includes at least one software functional module that can be stored in a memory in the form of software or firmware, or embedded in an Operating System (OS) of the apparatus 700. Specifically, the apparatus 700 is applied to an audio and video live broadcast system, the audio and video live broadcast system includes a first media server, a second media server, a data processing server and a plurality of transit servers, the first media server is connected with the data processing server, the apparatus 700 is applied to a target transit server, the target transit server is the transit server with the best connectivity with the data processing server among the plurality of transit servers, and the apparatus 700 includes:
a second receiving module 710, configured to receive first live audio and video data sent by the data processing server, where the first live audio and video data is sent to the data processing server by the first media server; and a second sending module 720, configured to send the first live audio and video data to the second media server.
In a possible embodiment, the audio and video live broadcast system further includes a third media server, and the apparatus 700 further includes: the second receiving module 710 is further configured to receive third live audio and video data sent by the data processing server, where the third live audio and video data is obtained by the data processing server mixing second live audio and video data with the first live audio and video data, and the second live audio and video data is sent to the target transit server by the second media server; and the second sending module 720 is further configured to send the third live audio and video data to the third media server.
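For comparison, a minimal sketch (an assumption) of the counterpart apparatus 700 on the target transit server side is given below: it only relays data onward and never mixes. The media-server send() calls are placeholders.

class TransitRelayApparatus:
    def __init__(self, second_media_server, third_media_server):
        self.second_media_server = second_media_server
        self.third_media_server = third_media_server

    # Second receiving module 710 + second sending module 720
    def relay_first_live_av(self, first_av: bytes) -> None:
        self.second_media_server.send(first_av)

    def relay_third_live_av(self, third_av: bytes) -> None:
        self.third_media_server.send(third_av)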
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the apparatus described above may refer to the corresponding process in the foregoing method, and will not be described in detail herein.
The embodiment of the present application further provides an electronic device, which may be provided in the data processing server or in the target transit server.
Fig. 8 shows a block diagram of an electronic device 800 according to an embodiment of the present application. As shown in fig. 8, the electronic device 800 may include a processor 810, a communication interface 820, a memory 830, and at least one communication bus 840. The communication bus 840 is used to implement direct-connection communication among these components. The communication interface 820 of the device in the embodiment of the present application is used for signaling or data communication with other node devices. The processor 810 may be an integrated circuit chip having signal processing capability. The processor 810 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed by such a processor. A general-purpose processor may be a microprocessor, or the processor 810 may be any conventional processor or the like.
The memory 830 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like. The memory 830 stores computer-readable instructions, and when the computer-readable instructions are executed by the processor 810, the electronic device 800 may perform the steps of the corresponding device side in the method embodiments of fig. 3 or fig. 5. For example, when the electronic device 800 is provided in a data processing server, the memory 830 stores computer-readable instructions, and when the computer-readable instructions are executed by the processor 810, the electronic device 800 may perform the steps of the data processing server side in the method embodiments of fig. 3 or fig. 5 described above.
The electronic device 800 may further include a memory controller, an input/output unit, an audio unit, and a display unit.
The memory 830, the memory controller, the processor 810, the peripheral interface, the input/output unit, the audio unit, and the display unit are electrically connected to each other directly or indirectly to realize data transmission or interaction. For example, these components may be electrically coupled to each other via one or more communication buses 840. The processor 810 is adapted to execute executable modules stored in the memory 830, such as software functional modules or computer programs comprised by the electronic device 800.
The input/output unit is used for the user to input data, so as to realize interaction between the user and the server (or the local terminal). The input/output unit may be, but is not limited to, a mouse, a keyboard, and the like.
The audio unit provides an audio interface to the user, which may include one or more microphones, one or more speakers, and audio circuitry.
The display unit provides an interactive interface (for example, a user interface) between the electronic device and a user, or is used to display image data for the user's reference. In this embodiment, the display unit may be a liquid crystal display or a touch display. In the case of a touch display, it may be a capacitive touch screen or a resistive touch screen supporting single-point and multi-point touch operations, which means that the touch display can sense touch operations generated simultaneously at one or more positions on the touch display and pass the sensed touch operations to the processor for calculation and processing.
It is to be understood that the configuration shown in fig. 8 is merely exemplary, and that the electronic device 800 may include more or fewer components than shown in fig. 8, or have a different configuration than shown in fig. 8. The components shown in fig. 8 may be implemented in hardware, software, or a combination thereof.
The present application further provides a storage medium having stored thereon a computer program which, when executed by a processor, performs the method of the method embodiments described above.
The present application further provides a computer program product which, when run on a computer, causes the computer to perform the method of the method embodiments described above.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the system described above may refer to the corresponding process in the foregoing method, and will not be described in detail herein.
It should be noted that, in the present specification, each embodiment is described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other. For the device class embodiment, since it is basically similar to the method embodiment, the description is relatively simple, and for the relevant points, refer to the partial description of the method embodiment.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program codes.

It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application; various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement, or the like made within the spirit and principle of the present application shall be included in the protection scope of the present application. It should be noted that like reference numbers and letters refer to like items in the following figures; therefore, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto; any person skilled in the art could readily conceive of changes or substitutions within the technical scope disclosed in the present application, and all such changes or substitutions shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method for sending live audio and video data, characterized in that the method is applied to an audio and video live broadcast system, the audio and video live broadcast system comprises a first media server, a second media server, a data processing server and a plurality of transit servers, and the first media server is connected with the data processing server, the method comprising:
the data processing server receives first live audio and video data sent by the first media server; and
the data processing server sends the first live audio and video data to a target transit server, so that the target transit server sends the first live audio and video data to the second media server, wherein the target transit server is the transit server with the best connectivity with the data processing server among the plurality of transit servers.
2. The method of claim 1, wherein the audio and video live broadcast system further comprises a third media server, and the method further comprises:
the data processing server receives second live audio and video data sent by the target transit server, wherein the second live audio and video data is sent to the target transit server by the second media server;
the data processing server mixes the first live audio and video data and the second live audio and video data to obtain third live audio and video data; and
the data processing server sends the third live audio and video data to the target transit server, so that the target transit server sends the third live audio and video data to the third media server.
3. A method for sending live audio and video data, characterized in that the method is applied to an audio and video live broadcast system, the audio and video live broadcast system comprises a first media server, a second media server, a data processing server and a plurality of transit servers, and the first media server is connected with the data processing server, the method comprising:
a target transit server receives first live audio and video data sent by the data processing server, wherein the target transit server is the transit server with the best connectivity with the data processing server among the plurality of transit servers, and the first live audio and video data is sent to the data processing server by the first media server; and
the target transit server sends the first live audio and video data to the second media server.
4. The method of claim 3, wherein the audio and video live broadcast system further comprises a third media server, and the method further comprises:
the target transit server receives third live audio and video data sent by the data processing server, wherein the third live audio and video data is obtained by the data processing server mixing second live audio and video data with the first live audio and video data, and the second live audio and video data is sent to the target transit server by the second media server; and
the target transit server sends the third live audio and video data to the third media server.
5. An apparatus for sending live audio and video data, characterized in that the apparatus is applied to an audio and video live broadcast system, the audio and video live broadcast system comprises a first media server, a second media server, a data processing server and a plurality of transit servers, the first media server is connected with the data processing server, and the apparatus is applied to the data processing server, the apparatus comprising:
a first receiving module, configured to receive first live audio and video data sent by the first media server; and
a first sending module, configured to send the first live audio and video data to a target transit server, so that the target transit server sends the first live audio and video data to the second media server, wherein the target transit server is the transit server with the best connectivity with the data processing server among the plurality of transit servers.
6. The apparatus of claim 5, wherein the audio and video live broadcast system further comprises a third media server, and the apparatus further comprises:
the first receiving module is further configured to receive second live audio and video data sent by the target transit server, wherein the second live audio and video data is sent to the target transit server by the second media server;
a mixing module, configured to mix the first live audio and video data and the second live audio and video data to obtain third live audio and video data; and
the first sending module is further configured to send the third live audio and video data to the target transit server, so that the target transit server sends the third live audio and video data to the third media server.
7. An apparatus for sending live audio and video data, characterized in that the apparatus is applied to an audio and video live broadcast system, the audio and video live broadcast system comprises a first media server, a second media server, a data processing server and a plurality of transit servers, the first media server is connected with the data processing server, the apparatus is applied to a target transit server, and the target transit server is the transit server with the best connectivity with the data processing server among the plurality of transit servers, the apparatus comprising:
a second receiving module, configured to receive first live audio and video data sent by the data processing server, wherein the first live audio and video data is sent to the data processing server by the first media server; and
a second sending module, configured to send the first live audio and video data to the second media server.
8. The apparatus of claim 7, wherein the audio and video live broadcast system further comprises a third media server, and the apparatus further comprises:
the second receiving module is further configured to receive third live audio and video data sent by the data processing server, wherein the third live audio and video data is obtained by the data processing server mixing second live audio and video data with the first live audio and video data, and the second live audio and video data is sent to the target transit server by the second media server; and
the second sending module is further configured to send the third live audio and video data to the third media server.
9. A storage medium, having stored thereon a computer program which, when executed by a processor, performs the method for sending live audio and video data according to any one of claims 1 to 4.
10. An electronic device, characterized in that the electronic device comprises: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor; when the electronic device is running, the processor and the memory communicate with each other over the bus, and the machine-readable instructions, when executed by the processor, perform the method for sending live audio and video data according to any one of claims 1 to 4.
CN202010215403.3A 2020-03-24 2020-03-24 Method and device for sending live audio and video data Active CN111405308B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010215403.3A CN111405308B (en) 2020-03-24 2020-03-24 Method and device for sending live audio and video data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010215403.3A CN111405308B (en) 2020-03-24 2020-03-24 Method and device for sending live audio and video data

Publications (2)

Publication Number Publication Date
CN111405308A true CN111405308A (en) 2020-07-10
CN111405308B CN111405308B (en) 2022-05-03

Family

ID=71431153

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010215403.3A Active CN111405308B (en) 2020-03-24 2020-03-24 Method and device for sending live audio and video data

Country Status (1)

Country Link
CN (1) CN111405308B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160105489A1 (en) * 2014-10-14 2016-04-14 Alcatel-Lucent Usa Inc. Distribution of cloud services in a cloud environment
CN106488263A (en) * 2016-10-24 2017-03-08 北京小米移动软件有限公司 Push the method and device of live broadcast stream media data
US20180310033A1 (en) * 2017-04-20 2018-10-25 Tellybean Oy Computer implemented method for providing multi-camera live broadcasting service
CN107819833A (en) * 2017-10-20 2018-03-20 贵州白山云科技有限公司 A kind of method and device for accessing live even wheat
CN110881135A (en) * 2017-10-20 2020-03-13 贵州白山云科技股份有限公司 Method, device, equipment and medium for optimizing microphone-connected transmission protocol
CN109995741A (en) * 2018-01-02 2019-07-09 武汉斗鱼网络科技有限公司 Connect wheat realization method and system in a kind of network direct broadcasting
CN108259989A (en) * 2018-01-19 2018-07-06 广州华多网络科技有限公司 Method, computer readable storage medium and the terminal device of net cast
CN109168018A (en) * 2018-10-17 2019-01-08 北京潘达互娱科技有限公司 Company's wheat converging system, method, apparatus and own server in a kind of live streaming

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
李如平: "基于P2P网络的流媒体直播技术研究", 《铜陵学院学报》 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113891387A (en) * 2021-11-12 2022-01-04 山东亚华电子股份有限公司 Method and device for detecting audio and video communication link
CN113891387B (en) * 2021-11-12 2024-03-29 山东亚华电子股份有限公司 Method and equipment for detecting audio and video communication link

Also Published As

Publication number Publication date
CN111405308B (en) 2022-05-03

Similar Documents

Publication Publication Date Title
US20200186373A1 (en) Method and system for sharing and discovery
US8887185B2 (en) Method and system for providing virtual co-presence to broadcast audiences in an online broadcasting system
CN111836074B (en) Live wheat-connecting method and device, electronic equipment and storage medium
US20060029092A1 (en) Transmission optimization for application-level multicast
US20020184314A1 (en) Method and system for transmitting multicast data signals
CN105656910B (en) Media transmission server, media transmission system, user terminal and media transmission method
US20140344854A1 (en) Method and System for Displaying Speech to Text Converted Audio with Streaming Video Content Data
JP2015534323A (en) Media negotiation method, device, and system for multi-stream conferencing
US11412278B1 (en) Streaming video trunking
CN112788053A (en) Real-time communication method, device, server, system and storage medium
US9402056B2 (en) Collaboration extension system
JP2007104193A (en) Video distribution system, video distribution method, and video synchronization sharing apparatus
Boronat et al. Wersync: A web platform for synchronized social viewing enabling interaction and collaboration
KR20220137038A (en) Interaction methods, devices and electronic devices
CN111405308B (en) Method and device for sending live audio and video data
CN114257572B (en) Data processing method, device, computer readable medium and electronic equipment
KR101968847B1 (en) Provision of information system and method for individual broadcasting
BR112014006764B1 (en) METHODS AND TERMINAL FOR PROVIDING INTERACTIVE SERVICES WITHIN A NETWORK FOR DISTRIBUTION OF TELEVISION CONTENT
CN111803924A (en) Multi-terminal synchronous display method and device of cloud game and readable storage medium
KR20160073667A (en) Method and system for providing chatting service and chatting server
KR20100023473A (en) Individual broadcasting system and method for providing chatting service on individual broadcasting using iptv
CN115379279A (en) Multi-screen linkage interaction method, device and system, storage medium and electronic equipment
CN112055365B (en) Method and device for terminal networking, terminal and readable storage medium
US20200195999A1 (en) Identifying user devices for interactive media broadcast participation
CN112019791A (en) Multi-party audio and video call method and system based on education examination

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20210302

Address after: Room 1702-03, Lantian Hesheng building, 32 Zhongguancun Street, Haidian District, Beijing 100082

Applicant after: BEIJING CENTURY TAL EDUCATION TECHNOLOGY Co.,Ltd.

Address before: 102200 b5-005 maker Plaza, 338 Huilongguan East Street, Huilongguan town, Changping District, Beijing

Applicant before: Beijing three body cloud times Technology Co.,Ltd.

GR01 Patent grant
GR01 Patent grant