CN117750076A - Video code stream scheduling method, system, equipment and storage medium - Google Patents
- Publication number: CN117750076A
- Application number: CN202311764701.8A
- Authority: CN (China)
- Prior art keywords: picture, code stream, media server, resolution, server
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
The application provides a video code stream scheduling method, system, equipment, and storage medium. The method comprises the following steps: a media server of a picture to be synthesized acquires a picture synthesis requirement; the media server of the picture to be synthesized determines, according to the picture synthesis requirement, the source terminal of the specified code stream corresponding to each picture channel; for each source terminal not connected to this server, the media server of the picture to be synthesized determines, according to the picture synthesis requirement, the specified resolution of the specified code stream corresponding to that source terminal; the media server of the picture to be synthesized subscribes from the SFU server to acquire the specified code stream for that source terminal at the specified resolution; and the media server of the picture to be synthesized synthesizes the picture based on the specified code streams of the picture channels. By adopting the method and the device, the encoding and decoding resources of the media server can be saved more effectively, and dynamic synthesis in a multi-terminal conference can be satisfied.
Description
Technical Field
The present disclosure relates to the field of data processing technologies, and in particular, to a method, a system, an apparatus, and a storage medium for scheduling a video code stream.
Background
Video conferencing is an important application of multimedia communication technology, and hybrid conferences between WebRTC-based multi-stream terminals and AVC-based single-stream terminals are becoming more and more common. Fig. 1 shows a prior-art video conferencing architecture. Each media server can accommodate N terminals; when the number of terminals joining a conference exceeds N, a new media server is automatically occupied. Two media servers are shown by way of example in Fig. 1. The AVC terminal is an AVC single-stream terminal based on the H.323/SIP protocols, whose main video supports sending one code stream and receiving one code stream. The RTC terminal is a terminal that supports sending SVC/simulcast multi-layer video code streams and subscribing to multiple code streams for its own composition. The SFU server is a WebRTC code stream forwarding server that supports multi-stream publishing and subscription. The DSW server is a code stream forwarding server for AVC terminals that supports single-stream forwarding. DEC denotes the decoding port corresponding to a terminal's uplink code stream, and ENC the encoding port corresponding to a terminal's downlink code stream. The uplink code stream of a terminal is the audio/video code stream sent by the terminal to the media server; the downlink code stream of a terminal is the audio/video code stream sent by the media server to the terminal.
Current network video conferencing is generally implemented on a CloudMCU, a pure-software MCU without hardware encoding/decoding, whose performance limits the number of 1080p encoding/decoding ports. When there are many terminals, the media servers communicate over network ports with limited bandwidth and cannot directly forward raw YUV streams. Because the AVC code stream is a single stream, and the network links between media servers are unsuitable for raw YUV streams, compressed code streams such as the H.264/H.265 streams commonly used on networks must still be used across servers; since 1080p code streams are relayed directly, the target media server consumes considerable CPU for decoding, so multi-path cross-server composition cannot be satisfied. For example, as can be seen from Fig. 1, in the existing architecture each media server reserves 10% of its ports for relay encoding/decoding: with only 50 1080p media ports, there are only 5 relay ports. A multipoint conference with 80 terminals then requires two media servers, one connecting 45 terminals and one connecting 35. Media server 1 must acquire code streams from the other media server and perform picture composition in its synthesizer. When multiple pictures are customized, the limited number of relay ports means that only 5 code streams from other media servers can be relayed; more than 5 triggers an insufficient-resources prompt. Moreover, when a media server fetches a code stream across servers, it can only obtain the highest-resolution stream, so the synthesizer end must occupy very high decoding resources and the resource consumption of the media server is very large.
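The port arithmetic in the example above can be made concrete with a minimal Python sketch. The 50-port and 10% figures come from the example in the text; the function itself and its names are purely illustrative, not part of any real system:

```python
import math

def plan_servers(num_terminals, ports_per_server=50, relay_fraction=0.10):
    """Reproduce the legacy-architecture arithmetic from the example above:
    each server reserves a fraction of its 1080p ports for relay
    encoding/decoding, and the remaining ports connect terminals."""
    relay_ports = round(ports_per_server * relay_fraction)  # ports kept for relaying
    capacity = ports_per_server - relay_ports               # ports left for terminals
    servers_needed = math.ceil(num_terminals / capacity)
    return {"relay_ports": relay_ports,
            "terminal_capacity": capacity,
            "servers_needed": servers_needed}
```

With 80 terminals this yields 5 relay ports, a capacity of 45 terminals per server, and 2 servers, matching the 45 + 35 split described above.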
Therefore, in the existing video conference architecture, the media resources of the media server where a source terminal is located cannot be used directly to relieve the resource shortage on the equipment where the synthesizer is located, and the requirement of composing pictures from more multi-path terminals across media servers cannot be met.
Disclosure of Invention
Aiming at the problems in the prior art, the purpose of the application is to provide a video code stream scheduling method, a system, equipment and a storage medium, which can more effectively save the encoding and decoding resources of a media server and meet the dynamic synthesis under a multi-terminal conference.
The embodiment of the application provides a video code stream scheduling method, which comprises the following steps:
a media server for synthesizing pictures acquires picture synthesizing requirements;
the media server of the picture to be synthesized determines the source terminal of the appointed code stream corresponding to each picture channel according to the picture synthesis requirement;
for each source terminal which is not connected with the server, the media server of the picture to be synthesized determines the appointed resolution of the appointed code stream corresponding to the source terminal according to the picture synthesis requirement;
the media server of the picture to be synthesized subscribes from the SFU server to acquire the appointed code stream corresponding to the source terminal and the appointed resolution, wherein the media server connected with the source terminal is configured to process the source code stream of the source terminal into code streams with multiple resolutions and then issue the code streams to the SFU server;
and the media server of the pictures to be synthesized synthesizes the pictures based on the specified code streams of the picture channels.
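The steps above can be sketched as a short Python routine. All names here (`Requirement`, `schedule_streams`, the `subscribe` callback) are illustrative assumptions standing in for the patent's servers and interfaces, not an actual API:

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    # channel index -> (source terminal id, specified resolution), as carried
    # by the picture synthesis requirement
    channels: dict

def schedule_streams(requirement, local_terminals, subscribe):
    """Minimal sketch of the claimed steps: for each picture channel, take
    the uplink stream directly when the source terminal is connected to this
    media server; otherwise subscribe to the (terminal, resolution) pair
    from the SFU server."""
    streams = {}
    for channel, (terminal_id, resolution) in requirement.channels.items():
        if terminal_id in local_terminals:
            # source terminal is in this server's group: decode its uplink
            streams[channel] = ("local-decode", terminal_id)
        else:
            # cross-group source: fetch the published stream via the SFU
            streams[channel] = subscribe(terminal_id, resolution)
    return streams
```

The composing server never needs the remote terminal's grouping or IP; the SFU subscription key (terminal, resolution) is all it uses.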
According to the video code stream scheduling method, when the media server of the picture to be synthesized acquires a picture synthesis requirement, it determines, based on that requirement, the source terminals from which specified code streams must be acquired. When a source terminal does not belong to the same group as this media server, the code streams published by the other media servers are subscribed from the SFU server; after the code streams are acquired by subscription, picture synthesis is performed, and the synthesized code stream can be pushed to the conference terminals for display. Therefore, when code streams are acquired across groups, the media server at the synthesizer end need not concern itself with the grouping or IP information of the source terminal; it only needs to subscribe to the required code streams from the SFU server, which makes the video conference implementation simpler. Even the code streams of source terminals connected to other servers need no transit encoding and decoding, so no transit ports need to be provided in the media server: the structure of the media server is more concise, resources are utilized more effectively, and a concise video conference architecture remains achievable even with more groups. Moreover, when subscribing to code streams from the SFU server, only the resolution required by the picture synthesis requirement needs to be subscribed, rather than always fetching the highest-resolution stream, which reduces the resource occupation of the synthesizer during decoding, saves the encoding and decoding resources of the media server more effectively, and satisfies dynamic synthesis in a multi-terminal conference. The video code stream scheduling method can thus remove the limit on the number of terminals while saving media encoding and decoding resources, realize cross-group multi-terminal conferences, and ensure that users perceive no difference or limitation at all when customizing multiple pictures in a large-capacity multi-terminal conference.
In some embodiments, the media server connected to the source terminal is further configured to send the issued code stream information and the corresponding terminal information to the conference management terminal after issuing the code streams with the multiple resolutions to the SFU server;
the media server for obtaining the picture to be synthesized needs includes: the media server of the picture to be synthesized acquires a picture synthesis requirement initiated by the conference management terminal, wherein the picture synthesis requirement comprises source terminal information of a specified code stream corresponding to each picture channel;
the media server of the picture to be synthesized subscribes to obtain a specified code stream corresponding to the source terminal and the specified resolution from the SFU server, and the method comprises the following steps: and subscribing and acquiring a code stream from the SFU server by the media server of the picture to be synthesized based on the source terminal information and the appointed resolution.
In some embodiments, the determining, by the media server of the picture to be synthesized, the specified resolution of the specified code stream corresponding to the source terminal according to the picture synthesis requirement includes performing, for each source terminal not connected to the server, the following steps:
the media server of the picture to be synthesized sends request information to the SFU server, wherein the request information comprises picture channel port information of the server corresponding to the specified code stream corresponding to the source terminal;
the media server of the picture to be synthesized acquires first selectable resolution information of the specified code stream from the SFU server;
and the media server of the picture to be synthesized determines the appointed resolution of the appointed code stream according to the picture synthesis requirement and the first selectable resolution information of the appointed code stream.
In some embodiments, the picture composition requirement further includes a picture composition type corresponding to each picture channel;
the media server of the picture to be synthesized determines the designated resolution of the designated code stream according to the picture synthesis requirement and the first selectable resolution information of the designated code stream, and comprises the following steps:
the media server of the picture to be synthesized determines second selectable resolution information corresponding to the picture synthesis type of the picture channel according to a preset mapping relation between picture synthesis types and resolutions;
and the media server of the picture to be synthesized determines the appointed resolution of the corresponding appointed code stream according to the first selectable resolution information and the second selectable resolution information.
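The resolution-selection step above amounts to intersecting the resolutions actually published for the stream (first selectable set) with those suited to the channel's composition type (second selectable set). A minimal sketch, where the mapping table and its entries are assumptions since the text does not give concrete values:

```python
# Assumed mapping from picture synthesis type to acceptable resolutions,
# listed in preference order. Purely illustrative.
TYPE_TO_RESOLUTIONS = {
    "large_pane": ["1080p", "720p"],
    "small_pane": ["360p", "180p"],
}

def pick_resolution(available_from_sfu, composition_type):
    """Choose the specified resolution: the first resolution acceptable for
    the composition type (second selectable set) that the SFU actually has
    published for this stream (first selectable set)."""
    for res in TYPE_TO_RESOLUTIONS[composition_type]:
        if res in available_from_sfu:
            return res
    raise LookupError("no published resolution fits this composition type")
```

A small pane thus never pulls a 1080p stream just because it exists, which is where the decoding savings come from.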
In some embodiments, after determining the source terminal of the specified code stream corresponding to each picture channel according to the picture synthesis requirement, the media server of the picture to be synthesized further includes the following steps:
The media server of the picture to be synthesized respectively judges whether each source terminal is connected with the media server of the picture to be synthesized;
for each first type source terminal connected with the media server of the picture to be synthesized, the media server of the picture to be synthesized acquires the source code stream of the first type source terminal and decodes the source code stream to obtain the designated code stream corresponding to the first type source terminal;
and for each second-type source terminal which is not connected with the media server of the picture to be synthesized, after the media server of the picture to be synthesized determines the appointed resolution corresponding to the second-type source terminal according to the picture synthesis requirement, subscribing and acquiring the appointed code stream corresponding to the second-type source terminal and the appointed resolution from the SFU server.
In some embodiments, the method further comprises the steps of:
the media server receives a source code stream of a first resolution of a terminal connected with the media server;
the media server adopts a decoding interface to decode the source code stream with the first resolution to obtain a decoded code stream;
the media server adopts two coding interfaces to code the decoded code stream to respectively obtain a code stream with a second resolution and a code stream with a third resolution;
And the media server issues the source code stream with the first resolution, the code stream with the second resolution and the code stream with the third resolution to the SFU server.
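The one-decode/two-encode publish step above can be sketched as follows. The concrete resolutions 720p and 360p are assumptions; the text only speaks of first, second, and third resolutions, and every name here is illustrative:

```python
def publish_three_rungs(source, publish):
    """Sketch of the publish step: the source code stream (first resolution)
    passes through unchanged, a single DEC port decodes it once, and two
    ENC ports re-encode the decoded frames at two lower resolutions; all
    three code streams are then issued to the SFU server."""
    decoded = ("decoded", source["terminal"])       # the single decoding port
    rungs = {
        source["resolution"]: source,               # source stream, as received
        "720p": ("encoded", decoded, "720p"),       # first encoding port (assumed)
        "360p": ("encoded", decoded, "360p"),       # second encoding port (assumed)
    }
    for resolution, stream in rungs.items():
        publish(source["terminal"], resolution, stream)
    return rungs
```

Each terminal's uplink therefore costs one DEC and two ENC ports on its own server, and every other server can pick any rung without relaying.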
In some embodiments, the method for synthesizing pictures by the media server based on the specified code stream of each picture channel comprises the following steps:
the media server of the picture to be synthesized synthesizes the picture based on the appointed code stream according to the picture synthesis requirement to obtain a synthesized code stream;
the media server of the picture to be synthesized encodes the synthesized code stream to obtain synthesized code streams with various resolutions, and transmits the synthesized code streams to the SFU server;
and the first terminal subscribes and acquires the synthesized code stream with the specific resolution from the SFU server.
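The compose-and-republish flow of the three steps above can be sketched in Python. The resolution ladder and all function names are illustrative assumptions:

```python
def compose_and_publish(channel_streams, encode, publish,
                        ladder=("1080p", "720p", "360p")):
    """Sketch of the final stage: stitch the per-channel streams into one
    composite picture, encode the composite at several resolutions (the
    ladder is assumed), and issue each to the SFU server so any terminal
    can subscribe at the resolution it needs."""
    composite = ("composite", tuple(sorted(channel_streams)))
    encoded = {res: encode(composite, res) for res in ladder}
    for res, stream in encoded.items():
        publish("composite", res, stream)
    return encoded
```

A subscribing terminal then requests just one rung, e.g. `("composite", "360p")`, rather than the full-resolution composite.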
The embodiment of the application also provides a video code stream scheduling system, which is applied to the video code stream scheduling method, wherein the system comprises a plurality of media servers, and each media server comprises:
the encoding and decoding module comprises a decoding interface, a first encoding interface and a second encoding interface, wherein the encoding and decoding module is used for decoding the source code stream with the first resolution by adopting the decoding interface to obtain a decoded code stream after receiving the source code stream with the first resolution from a terminal connected with the server, and respectively encoding the decoded code stream by adopting the first encoding interface and the second encoding interface to obtain a code stream with the second resolution and a code stream with the third resolution, and the encoding and decoding module is also used for issuing the source code stream with the first resolution, the code stream with the second resolution and the code stream with the third resolution to an SFU server;
the synthesizer, which is used for acquiring a picture synthesis requirement, determining the source terminal of the specified code stream corresponding to each picture channel according to the picture synthesis requirement, determining, for each source terminal not connected to this server, the specified resolution of the specified code stream corresponding to that source terminal according to the picture synthesis requirement, subscribing from the SFU (Selective Forwarding Unit) server to acquire the specified code stream for that source terminal at the specified resolution, and performing picture synthesis based on the specified code streams of the picture channels.
With this video code stream scheduling system, an architecture in which multiple media servers cooperate is designed: after receiving the code streams of the terminals connected to it, the encoding/decoding module of each media server processes them into code streams of multiple resolutions and publishes these to the SFU server, and the synthesizer of each media server subscribes to cross-server terminal code streams from the SFU server when picture synthesis is needed. Therefore, when code streams must be acquired across groups, the media server at the synthesizer end need not concern itself with the grouping or IP information of the source terminal; it only needs to subscribe to the required code streams from the SFU server, which makes the video conference implementation simpler. Even the code streams of source terminals connected to other servers need no transit encoding and decoding, so no transit ports need to be provided in the media server: the structure of the media server is more concise, resources are utilized more effectively, and a concise video conference architecture remains achievable even with more groups. When subscribing to code streams from the SFU server, only the resolution required by the picture synthesis requirement needs to be subscribed, rather than always fetching the highest-resolution stream, which reduces the resource occupation of the synthesizer during decoding, saves the encoding and decoding resources of the media server more effectively, and satisfies dynamic synthesis in a multi-terminal conference. The method can thus remove the limit on the number of terminals while saving media encoding and decoding resources, realize cross-group multi-terminal conferences, and ensure that users perceive no difference or limitation at all when customizing multiple pictures in a large-capacity multi-terminal conference.
The embodiment of the application also provides video conference equipment, which comprises:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the video bitstream scheduling method via execution of the executable instructions.
By adopting the video conference equipment provided by the application, the processor executes the video code stream scheduling method when executing the executable instruction, so that the beneficial effects of the video code stream scheduling method can be obtained.
The embodiment of the application also provides a computer readable storage medium for storing a program, which when executed by a processor, implements the steps of the video code stream scheduling method.
By adopting the computer readable storage medium provided by the application, the stored program realizes the steps of the video code stream scheduling method when being executed, thereby obtaining the beneficial effects of the video code stream scheduling method.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the following drawings.
Fig. 1 is a schematic diagram of a prior art video conferencing architecture;
FIG. 2 is a flow chart of a video bitstream scheduling method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a video bitstream scheduling architecture according to an embodiment of the present application;
FIG. 4 is a schematic architecture diagram of a first media server implementation of an embodiment of the present application;
FIG. 5 is a schematic diagram of an architecture of a second media server implementation of an embodiment of the present application;
FIG. 6 is a schematic diagram of a seven-picture composite layout according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a video code stream scheduling apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural view of a computer storage medium according to an embodiment of the present application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments can be embodied in many forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and thus a repetitive description thereof will be omitted. Although the terms "first" or "second" etc. may be used herein to describe certain features, these features should be interpreted in a descriptive sense only and not for purposes of limitation as to the number and importance of the particular features.
As shown in fig. 2, in an embodiment, the application provides a video code stream scheduling method, which includes the following steps:
S100: a media server for synthesizing pictures acquires a picture synthesis requirement;
in this embodiment, the picture synthesis requirement of the video conference may be initiated by a conference management terminal. Specifically, when a media server acquires a picture synthesis requirement initiated by the conference management terminal, that media server performs the following steps S200 to S500 to carry out picture synthesis, i.e., it acts as the media server of the picture to be synthesized. The picture synthesis requirement comprises source terminal information of the specified code stream corresponding to each picture channel; the source terminal information may be identification information such as the ID of the source terminal. The conference management terminal is configured to manage the IDs of all terminals currently participating in the conference (including AVC terminals and/or RTC terminals) and the IDs of the published code streams;
the conference management terminal can dynamically group the conference terminals: when a single media server can accommodate N terminals, the terminals joining the conference first are connected to one media server, and when the number of conference terminals exceeds N, new media servers are automatically occupied. Each media server is allocated independent encoding/decoding port resources, and the terminals accessing the same media server belong to the same group;
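The dynamic grouping above can be sketched in a few lines of Python; the function name and list-of-lists representation are illustrative, not the conference management terminal's actual data model:

```python
def assign_groups(terminal_ids, capacity_n):
    """Sketch of dynamic grouping: the first N conference terminals attach
    to one media server; each further batch of N occupies a new media
    server, and terminals on the same server form one group."""
    return [terminal_ids[i:i + capacity_n]
            for i in range(0, len(terminal_ids), capacity_n)]
```

For example, seven terminals with a per-server capacity of three occupy three media servers, the last holding the single overflow terminal.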
S200: the media server of the picture to be synthesized determines the source terminal of the appointed code stream corresponding to each picture channel according to the picture synthesis requirement;
the media server of the picture to be synthesized is provided with a synthesizer, and the synthesizer is used for synthesizing the appointed code streams of a plurality of picture channels into one picture according to the picture synthesis requirement so as to push the appointed code streams to a terminal of a participant for display, so that the picture channels are correspondingly provided with a plurality of source terminals, wherein the source terminals refer to terminal equipment participating in a video conference, such as a mobile phone, a tablet computer, a notebook computer and the like;
S300: for each source terminal which is not connected with the server, the media server of the picture to be synthesized determines the specified resolution of the specified code stream corresponding to the source terminal according to the picture synthesis requirement;
the server refers to the media server of the picture to be synthesized, and when the media server of the picture to be synthesized determines that a source terminal is not connected with the server, the specified code stream of the specified resolution of the source terminal is obtained through the steps S300 and S400;
s400: the picture media server to be synthesized subscribes and acquires the appointed code stream corresponding to the source terminal and the appointed resolution from the SFU server, wherein the media server connected with the source terminal is configured to process the source code stream of the source terminal into code streams with multiple resolutions and then issue the code streams to the SFU server;
The media server connected to the source terminal and the media server of the picture to be synthesized may have the same structure and function, and the definition herein is different only to indicate that the two are the code stream demander and the code stream provider in one picture synthesis, and in some scenarios, the plurality of media servers may also be the code stream demander and the code stream provider, for example, the first media server needs to synthesize the picture of the source terminal connected to the second media server, or the second media server needs to synthesize the picture of the source terminal connected to the first media server. Therefore, each media server of the application has the functions of processing the code stream into the code streams with multiple resolutions and then distributing the code streams to the SFU server after receiving the code streams of the terminal connected with the server, and subscribing the terminal code streams crossing the media server from the SFU server when picture synthesis is needed;
further, the media server connected with the source terminal is further configured to issue the code streams with the multiple resolutions to the SFU server, and then send the issued code stream information and the corresponding terminal information to the conference management terminal, that is, the conference management terminal manages the terminal information and the code stream information of all the terminals;
In this embodiment, the subscribing, by the media server of the picture to be synthesized, from the SFU server to obtain a specified code stream corresponding to the source terminal and the specified resolution includes: the media server of the picture to be synthesized subscribes to acquire a specified code stream from the SFU server based on the source terminal information and the specified resolution;
S500: the media server of the picture to be synthesized synthesizes the picture based on the specified code streams of the picture channels.
According to the above video code stream scheduling method, when the media server of the picture to be synthesized acquires a picture synthesis requirement in step S100, it determines in step S200, based on that requirement, the source terminals from which specified code streams must be acquired; when a source terminal does not belong to the same group as this media server, the code streams published by the other media servers are subscribed from the SFU server through steps S300 and S400; and after the code streams are acquired by subscription, picture synthesis is performed in step S500, the synthesized code stream being available for pushing to the conference terminals for display. Therefore, when code streams are acquired across groups, the media server at the synthesizer end need not concern itself with the grouping or IP information of the source terminal; it only needs to subscribe to the required code streams from the SFU server, which makes the video conference implementation simpler. Even the code streams of source terminals connected to other servers need no transit encoding and decoding, so no transit ports need to be provided in the media server: the structure of the media server is more concise, resources are utilized more effectively, and a concise video conference architecture remains achievable even with more groups. When subscribing to code streams from the SFU server, only the resolution required by the picture synthesis requirement needs to be subscribed, rather than always fetching the highest-resolution stream, which reduces the resource occupation of the synthesizer during decoding, saves the encoding and decoding resources of the media server more effectively, and satisfies dynamic synthesis in a multi-terminal conference. The video code stream scheduling method can thus remove the limit on the number of terminals while saving media encoding and decoding resources, realize cross-group multi-terminal conferences, and ensure that users perceive no difference or limitation at all when customizing multiple pictures in a large-capacity multi-terminal conference.
The application also provides a video code stream scheduling system for implementing the above video code stream scheduling method. The system comprises a plurality of media servers, and each media server comprises:
an encoding and decoding module, comprising a decoding interface, a first encoding interface and a second encoding interface. After receiving a source code stream with a first resolution from a terminal connected to the server, the encoding and decoding module decodes the source code stream through the decoding interface to obtain a decoded code stream, and encodes the decoded code stream through the first encoding interface and the second encoding interface respectively to obtain a code stream with a second resolution and a code stream with a third resolution. The encoding and decoding module is also used to issue the source code stream with the first resolution, the code stream with the second resolution and the code stream with the third resolution to an SFU server;
a synthesizer, used to acquire a picture synthesis requirement, determine the source terminal of the specified code stream corresponding to each picture channel according to the picture synthesis requirement, determine, for each source terminal not connected to this server, the specified resolution of the corresponding specified code stream according to the picture synthesis requirement, subscribe to and acquire the specified code stream corresponding to that source terminal and specified resolution from the SFU server, and perform picture synthesis based on the specified code streams of all picture channels.
With this video code stream scheduling system, an architecture cooperating across a plurality of media servers is designed. The encoding and decoding module of each media server, after receiving the code stream of a terminal connected to it, processes that code stream into code streams of multiple resolutions and distributes them to the SFU server, and the synthesizer of each media server can subscribe to cross-server terminal code streams from the SFU server when picture synthesis is needed. Therefore, when code streams need to be acquired across groups, the media server at the synthesizer end does not need to care about the group information and IP information of the source terminal; it only needs to subscribe to the required code streams from the SFU server, making the video conference implementation simpler. Even for the code stream of a source terminal connected to a non-local server, no transit encoding and decoding is required and no transit port needs to be arranged in the media server, so the structure of the media server is simpler and resources are used more effectively; even when there are many groups, a simple video conference architecture can still be realized. When subscribing to code streams from the SFU server, only the code stream whose resolution matches the picture synthesis requirement needs to be subscribed, rather than the highest-resolution code stream every time, which reduces the resource occupation of the synthesizer during decoding, saves the encoding and decoding resources of the media server, and satisfies dynamic synthesis in multi-terminal conferences. The video code stream scheduling system can thus cancel the limit on the number of terminals while saving media encoding and decoding resources, realize cross-group multi-terminal meetings, and ensure that a user perceives no difference or limitation when customizing multiple pictures in a large-capacity multi-terminal conference.
Fig. 3 to 5 exemplarily show the video stream scheduling architecture of this embodiment, in which N media servers are shown; the number N of media servers may be any positive integer greater than or equal to 2. Each media server has a codec module and a synthesizer, and is connected to at least one terminal. Each media server also has an ENC interface that, after the synthesizer synthesizes a picture, transmits the synthesized picture to a terminal or issues it to an SFU. For example, the first media server is connected to AVC terminal 1 and RTC terminal 1 and configured with codec module 1, synthesizer 1, ENC interface 3 and ENC interface 4; the second media server is connected to AVC terminal 2 and configured with codec module 2, synthesizer 2 and ENC interface 7; the third media server is connected to AVC terminal 3 and RTC terminal 3 and configured with codec module 3, synthesizer 3, ENC interface 8 and ENC interface 9; the fourth media server is connected to AVC terminal 4 and configured with codec module 4, synthesizer 4 and ENC interface 10; and the Nth media server is connected to AVC terminal N and configured with codec module N, synthesizer N and ENC interface N. Each media server therefore has the functions of publishing the media data code stream of its terminal to the SFU server and acquiring code streams from the SFU server for synthesis. In practical applications, each media server may be connected to more terminals, and the number of media servers can be configured according to the actual number of conference terminals. In the figures, for convenience of drawing, the SFU server is shown in multiple locations; in practice, each SFU server block shown may be implemented by one SFU server device or cluster. Likewise, for ease of drawing, the DSW server is shown in multiple locations, and in practice each DSW server block shown may be implemented by one DSW server device or cluster.
As shown in fig. 3 and 4, each media server is provided with a codec module, and each codec module includes a high-resolution DEC interface (decoding interface) and two low-resolution ENC interfaces (encoding interfaces). The single stream sent by an AVC terminal can be converted by the codec module into a multi-layer RTC code stream and released to the SFU server. Taking the 1080p code stream acquired from an AVC terminal as an example: the two low-resolution ENC interfaces output 1/2-scale and 1/4-scale code streams respectively, forming a simulcast 3-layer code stream that is issued to the SFU network; that is, the 1080p code stream is turned into a 3-layer code stream of 1080p, 540p and 270p, meeting the 3-layer code stream requirement of SFU publishing.
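The 1/2- and 1/4-scale layer set described above can be sketched as follows; this is an illustrative Python sketch, and the function name and layer factors are assumptions for illustration rather than part of the patent:

```python
# Hypothetical sketch of the codec module's simulcast layer set: the DEC/ENC
# pipeline keeps the source resolution and adds 1/2- and 1/4-scale layers.
def simulcast_layers(width: int, height: int) -> list:
    """Return the 3-layer simulcast set: source, 1/2 scale, 1/4 scale."""
    return [(width // s, height // s) for s in (1, 2, 4)]

# A 1080p source yields 1920x1080, 960x540 (540p) and 480x270 (270p),
# matching the 3-layer code stream issued to the SFU network.
layers = simulcast_layers(1920, 1080)
```

Each tuple corresponds to one layer a subscriber can pick, which is what lets a synthesizer request only the resolution its picture channel needs.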
In practical applications, each media server may operate as the media server of a picture to be synthesized, as a media server connected to a source terminal, or as both at once, providing source streams for pictures synthesized by other media servers. In the following description, referring to fig. 4 and fig. 5, a picture synthesis performed by the first media server is taken as an example; the first media server is the media server of the picture to be synthesized, i.e. the code stream demander. In this picture synthesis, the source terminals corresponding to the synthesized picture channels at least include AVC terminal 1 and AVC terminal 2, where AVC terminal 1 is a first-type source terminal connected to the first media server, AVC terminal 2 is a second-type source terminal not connected to the first media server, and the second media server is the media server connected to AVC terminal 2, i.e. the code stream provider. Source terminals are classified based on whether they are connected to the media server of the code stream demander: source terminals connected to the media server of the picture to be synthesized are first-type source terminals, and source terminals not connected to it (but connected to other media servers) are second-type source terminals. A picture composition may involve one or more first-type source terminals together with one or more second-type source terminals, or only second-type source terminals. When there are only first-type source terminals and no second-type source terminals, cross-media-server composition is not involved. As shown in fig. 
4, the codec module 1 of the first media server includes a DEC interface 1, an ENC interface 1 and an ENC interface 2, and may also include a relay module 1, a relay module 2 and a relay module 3, where the relay module 1 directly inputs 1080p code streams output by the AVC terminal 1 to the SFU server, the DEC interface 1 decodes 1080p code streams into YUV code streams and inputs them to the ENC interface 1 and the ENC interface 2, the ENC interface 1 outputs 540p code streams to the SFU server, and the ENC interface 2 outputs 270p code streams to the SFU server.
As shown in fig. 5, the codec module 2 of the second media server includes a DEC interface 2, an ENC interface 5 and an ENC interface 6, and may also include a relay module 4, a relay module 5 and a relay module 6, where the relay module 4 directly inputs 1080p code streams output by the AVC terminal 2 to the SFU server, the DEC interface 2 decodes 1080p code streams into YUV code streams and inputs them to the ENC interface 5 and the ENC interface 6, the ENC interface 5 outputs 540p code streams to the SFU server, and the ENC interface 6 outputs 270p code streams to the SFU server. The synthesizer 1 of the first media server may subscribe to and obtain the specified code stream of the AVC terminal 2 as the source terminal from the SFU server when synthesizing the picture.
Therefore, in this embodiment, each ENC interface can encode the code stream into a compressed H.264/H.265 code stream for transmission between media servers, effectively reducing the transmission bandwidth between media servers and saving resource consumption on the target media server.
In this embodiment, when the media server of the picture to be synthesized performs picture synthesis, some picture channels may correspond to source terminals in the same group as the server, while others correspond to source terminals in a different group. Step S200: after the media server of the picture to be synthesized determines the source terminal of the specified code stream corresponding to each picture channel according to the picture synthesis requirement, the method further comprises the following steps:
The media server of the picture to be synthesized respectively judges whether each source terminal is connected with the media server of the picture to be synthesized;
in the example of fig. 3 to 5, when synthesizer 1 of the first media server synthesizes pictures, the corresponding source terminals include AVC terminal 1 and AVC terminal 2. AVC terminal 1 and the first media server belong to the same group, so AVC terminal 1 is a first-type source terminal of this synthesis; AVC terminal 2 and the first media server belong to different groups, so AVC terminal 2 is a second-type source terminal of this synthesis;
for each first-type source terminal connected to the media server of the picture to be synthesized, after that media server acquires the source code stream of the first-type source terminal, it decodes the stream through the decoding interface to obtain the specified code stream of the first-type source terminal. Taking fig. 4 as an example, after receiving the source code stream of AVC terminal 1, the first media server decodes it through DEC interface 1 to obtain the specified code stream corresponding to AVC terminal 1; specifically, the first media server directly acquires the decoded YUV code stream from DEC interface 1 as the specified code stream to be synthesized for AVC terminal 1. Therefore, when the source terminal and the synthesizer belong to the same group, the YUV code stream is used for synthesis directly, and DEC decoding of an SFU-subscribed code stream does not need to be enabled;
for each second-type source terminal connected to the second media server, after the first media server determines the corresponding specified resolution according to the picture composition requirement in step S300, it subscribes to and acquires, in step S400, the code stream corresponding to that second-type source terminal and specified resolution from the SFU server as the specified code stream of that second-type source terminal;
for AVC terminal 2, the first media server determines the specified resolution corresponding to AVC terminal 2 according to the picture composition requirement and subscribes to the code stream of AVC terminal 2 at that resolution from the SFU server. For example, when AVC terminal 2 corresponds to the 1st picture channel of the 7-picture style, the first media server determines the specified resolution for AVC terminal 2 as 540p, and synthesizer 1 subscribes to the 540p code stream of AVC terminal 2 from the SFU server.
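The per-channel decision of steps S200 to S400 can be summarized in a small sketch; the function and field names below are hypothetical, chosen only to illustrate the first-type/second-type branch:

```python
# Illustrative sketch of the synthesizer's per-channel decision: a first-type
# source terminal (connected to this media server) reuses the locally decoded
# YUV stream, while a second-type terminal is subscribed from the SFU at the
# specified resolution. All names here are assumptions for illustration.
def plan_channel(source_id: str, local_terminals: set, specified_resolution: str) -> dict:
    if source_id in local_terminals:
        # Same group: use the DEC interface's YUV output directly,
        # no SFU subscription or extra decode needed.
        return {"source": source_id, "path": "local_yuv"}
    # Cross-group: subscribe the simulcast layer matching the channel size.
    return {"source": source_id, "path": "sfu_subscribe",
            "resolution": specified_resolution}

# AVC terminal 2 is not local to the first media server, so its 540p layer
# is subscribed from the SFU; AVC terminal 1 stays on the local YUV path.
plan = plan_channel("AVC2", {"AVC1", "RTC1"}, "540p")
```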
In this embodiment, step S300: for each source terminal not connected to this server, the media server of the picture to be synthesized determines the specified resolution of the corresponding specified code stream according to the picture synthesis requirement, which includes executing the following steps for each such source terminal:
The media server of the picture to be synthesized sends request information to the SFU server, wherein the request information comprises the source terminal information of the source terminal and picture channel port information of the server corresponding to the appointed code stream;
after receiving the request information, the SFU server allocates a corresponding port for the picture channel, so that the code stream can be transmitted between the picture channel port of the media server of the picture to be synthesized, which corresponds to the specified code stream, and the port allocated by the SFU server;
the media server of the picture to be synthesized acquires first selectable resolution information of the specified code stream from the SFU server. The first selectable resolution information lists the resolutions at which the media server connected to the source terminal issued that terminal's code stream to the SFU server; that is, the SFU server can provide selectable resolutions for the specified code stream. For example, if the media server connected to the source terminal issued the terminal's 1080p, 540p and 270p code streams, the first selectable resolution information from the SFU server includes 1080p, 540p and 270p;
and the media server of the picture to be synthesized determines the specified resolution of the specified code stream corresponding to each source terminal not connected to this server, according to the picture synthesis requirement and the first selectable resolution information of that specified code stream.
In this embodiment, the picture composition requirement further includes the picture composition style to be composed, where the picture composition style includes the picture composition type corresponding to each picture channel. For example, one picture composition style (as shown in fig. 6) has 7 picture channels in total, with 3 large picture channels each occupying 1/4 of the picture and 4 small picture channels each occupying 1/16 of the picture; the picture composition type of the 1st picture channel is the 1st picture channel of the 7-picture style, the picture composition type of the 2nd picture channel is the 2nd picture channel of the 7-picture style, and so on. Another picture composition style has 9 picture channels in total, each occupying 1/9 of the picture; the picture composition type of the 1st picture channel is the 1st picture channel of the 9-picture style, the picture composition type of the 2nd picture channel is the 2nd picture channel of the 9-picture style, and so on. Yet another picture composition style has 10 picture channels in total, with 2 large picture channels each occupying 1/4 of the picture and 8 small picture channels each occupying 1/16 of the picture; the picture composition type of the 1st picture channel is the 1st picture channel of the 10-picture style, the picture composition type of the 2nd picture channel is the 2nd picture channel of the 10-picture style, and so on. The mapping relation between the various picture composition types and resolutions is established in advance, so that the resolution most suitable for each picture composition type can be determined;
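One possible way to encode these composition styles is as a table mapping each style to the picture fraction of its channels; the structure and names below are assumptions for illustration, not the patent's data format:

```python
# Hypothetical encoding of the picture-composition styles: each style maps to
# a list of per-channel area fractions of the 1080p base picture, in channel
# order (large channels first, as in the styles described above).
STYLES = {
    "7-picture": [1 / 4] * 3 + [1 / 16] * 4,    # 3 large + 4 small channels
    "9-picture": [1 / 9] * 9,                   # 9 equal channels
    "10-picture": [1 / 4] * 2 + [1 / 16] * 8,   # 2 large + 8 small channels
}

# Sanity check: each style's channel fractions tile the whole base picture.
assert all(abs(sum(fractions) - 1.0) < 1e-9 for fractions in STYLES.values())
```

With such a table, the picture composition type of the nth channel is simply the pair (style name, channel index), and its fraction drives the resolution lookup described next.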
The media server of the picture to be synthesized determines the specified resolution of the specified code stream of each source terminal not connected to this server, according to the picture synthesis requirement and the first selectable resolution information of that specified code stream, which comprises the following steps:
the media server of the picture to be synthesized determines, according to the preset mapping relation between picture composition types and resolutions, the second selectable resolution information corresponding to the picture composition type of each picture channel. The second selectable resolution information is the resolution queried from the mapping relation. For example, for the 7-picture-channel style, the 3 large picture channels (the 1st, 2nd and 3rd picture channels) each occupy 1/4 of the area, so their width and height are 1/2 those of 1080p and the corresponding second selectable resolution is 540p; the 4 small picture channels each occupy 1/16 of the area, so their width and height are 1/4 those of 1080p and the corresponding second selectable resolution is 270p;
the media server of the picture to be synthesized determines the specified resolution of the specified code stream according to the first selectable resolution information and the second selectable resolution information;
specifically, when the first selectable resolution information and the second selectable resolution information have a coincident resolution, that coincident resolution is selected as the specified resolution; when there is no coincidence, the resolution in the first selectable resolution information that is closest to and greater than the second selectable resolution is selected as the specified resolution. For example, if the first selectable resolution information includes 720p, 360p and 180p and the second selectable resolution is 270p, 360p is selected as the specified resolution.
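The selection rule can be sketched as follows, with resolutions represented by their vertical line counts (540 for 540p). The fallback branch for when no published layer is greater than the requirement is an assumption not spelled out in the text:

```python
# Sketch of the specified-resolution selection rule: prefer a resolution that
# appears in both the publisher's layers (first selectable) and the layout's
# requirement (second selectable); otherwise take the closest published layer
# still greater than the requirement. Function name is hypothetical.
def pick_resolution(first_selectable: list, second_selectable: int) -> int:
    if second_selectable in first_selectable:
        return second_selectable            # coincident resolution
    larger = [r for r in first_selectable if r > second_selectable]
    if larger:
        return min(larger)                  # closest layer above the need
    # Assumed fallback (not in the text): best available layer.
    return max(first_selectable)

assert pick_resolution([1080, 540, 270], 540) == 540   # coincident case
assert pick_resolution([720, 360, 180], 270) == 360    # closest-greater case
```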
In this embodiment, each media server may perform decoding and encoding processing on the code stream of the terminal connected to the media server to obtain code streams with three resolutions, and issue the code streams to the SFU server. The video code stream scheduling method further comprises the following steps:
the media server receives a source code stream of a first resolution of a terminal connected with the media server;
taking fig. 4 as an example, for the first media server, the first terminal is AVC terminal 1; AVC terminal 1 sends a 1080p code stream to relay module 1 through the DSW server, and the first resolution is 1080p;
the media server decodes the source code stream with the first resolution to obtain a decoded code stream; taking fig. 4 as an example, for the first media server, the DEC interface 1 decodes 1080p source code stream to obtain decoded YUV code stream;
the media server encodes the decoded code stream by adopting two encoding interfaces to respectively obtain a code stream with a second resolution and a code stream with a third resolution;
taking fig. 4 as an example, for the first media server, DEC interface 1 sends the decoded YUV code stream to ENC interface 1 and ENC interface 2; ENC interface 1 encodes it to obtain a 540p (second resolution) code stream, and ENC interface 2 encodes it to obtain a 270p (third resolution) code stream;
The media server issues the source code stream with the first resolution, the code stream with the second resolution and the code stream with the third resolution to an SFU server;
taking fig. 4 as an example, for the first media server, the 1080p, 540p and 270p code streams are issued to the SFU server through relay module 1, relay module 2 and relay module 3, and the issued code stream IDs together with the ID of the corresponding AVC terminal 1 are sent to the conference management terminal. Thus, other media servers can subscribe to the code stream of AVC terminal 1 at a specified resolution from the SFU server. For example, the second, third and fourth media servers shown in fig. 3 can subscribe to a specified code stream at a specified resolution from the SFU server when needed; each of them likewise implements the release of code streams at three resolutions through the above steps, and after release to the SFU server, other media servers can subscribe as needed.
For example, as shown in fig. 5, the second media server may use a similar method to distribute the code stream of AVC terminal 2 via the SFU server. Taking fig. 5 as an example, relay module 4 of the second media server receives the 1080p code stream sent by AVC terminal 2 through the DSW server; DEC interface 2 decodes the 1080p code stream to obtain a decoded YUV code stream and sends it to ENC interface 5 and ENC interface 6; ENC interface 5 encodes it to obtain a 540p code stream, and ENC interface 6 encodes it to obtain a 270p code stream. The 1080p, 540p and 270p code streams are issued to the SFU server, and the issued code stream IDs together with the ID of the corresponding AVC terminal 2 are transmitted to the conference management terminal. Thus, other media servers can subscribe to the code stream of AVC terminal 2 at a specified resolution from the SFU server.
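A minimal model of this publish step, assuming a hypothetical stream-ID scheme (the patent does not specify how code stream IDs are formed):

```python
# Illustrative model of the publish step: a media server announces three
# layers of a terminal's stream to the SFU and reports (stream ID, terminal
# ID) pairs to conference management. All identifiers are assumptions.
def publish_layers(terminal_id: str, layers: list) -> list:
    published = []
    for layer in layers:
        stream_id = f"{terminal_id}-{layer}"   # assumed ID scheme
        published.append({"stream_id": stream_id,
                          "terminal": terminal_id,
                          "resolution": layer})
    return published

# After publishing, any other media server can subscribe one of the three
# stream IDs without knowing the publisher's group or IP information.
records = publish_layers("AVC2", ["1080p", "540p", "270p"])
```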
In the existing video conference architecture, the media servers have a primary-secondary division: only media server 1 is provided with a synthesizer, so the synthesis workload of media server 1 is large and its structure is very complex. In contrast, the media servers of the video conference architecture of this embodiment have no primary-secondary division; each media server contains a synthesizer and a codec module. When a terminal of the first media server needs a composite picture including a terminal of the second media server, a request is sent to the corresponding SFU server. After receiving the request, the SFU server sends a code stream request to the codec module 2 of the second media server; codec module 2 then starts to work, converts the 1080p code stream into 3 layers of code streams and sends them to the SFU server for other media servers to subscribe to and acquire. Each AVC terminal supports entering the multi-picture of both its own group's media server and other groups' media servers; the media server an AVC terminal belongs to supports both picture synthesis using YUV code streams and converting the terminal's single-stream 1080p code stream into a simulcast 3-layer code stream released to the SFU network, from which other media servers can subscribe to a suitable code stream layer for decoding and synthesis.
In this embodiment, step S500: the media server of the picture to be synthesized performs picture synthesis based on the specified code streams of each picture channel, which comprises the following steps:
the media server of the picture to be synthesized performs picture synthesis on the specified code stream to be synthesized according to the picture synthesis requirement to obtain a synthesized code stream; for example, the synthesizer 1 synthesizes the specified code streams of the respective picture channels according to the picture synthesis style to obtain synthesized code streams.
Specifically, when the media server of the picture to be synthesized synthesizes the specified code streams according to the picture synthesis requirement, it decodes the specified code streams of all picture channels subscribed to and acquired from the SFU server; because the resolution of each picture channel is low, the resources occupied by decoding are effectively reduced. Taking a 1080p picture containing 16 picture channels as an example: a 1080p synthesis base map is adopted and the picture is cut into 16 sub-pictures, so the resolution of each picture channel is 270p. Although 16 code streams are input, 16 channels of 1080p DEC decoding are not required, only 16 channels of 270p DEC decoding, whose equivalent DEC decoding resources are approximately equal to 1 channel of 1080p; compared with the existing case of 16 channels of 1080p DEC decoding, this saves about 14.5 channels of 1080p DEC resource consumption. In the 25-picture case, the DEC decoding at the synthesizer end is approximately equal to 2 channels of 1080p, saving 23 channels of DEC decoding resources compared with the existing 25 channels of 1080p DEC decoding. Therefore, with the video code stream scheduling method, when picture composition across media servers is involved, the consumption of DEC decoding resources no longer grows linearly with the number of terminals but grows sub-linearly; the more composition styles and terminals there are, the less average resource is consumed per terminal.
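The decoding-resource arithmetic above can be checked with a back-of-envelope model in which decode cost scales with pixel area; this simple model gives roughly 15 units saved for the 16-picture case rather than the 14.5 reported, so treat it as an approximation of the patent's accounting rather than its exact figures:

```python
# Back-of-envelope model: decode cost is proportional to pixel area, measured
# in units of one 1080p decode. With n equal channels cut from a 1080p base,
# each channel covers 1/n of the area (270p is 1/16 the area of 1080p).
def dec_cost_in_1080p_units(n_channels: int, per_channel_fraction: float) -> float:
    return n_channels * per_channel_fraction

naive = dec_cost_in_1080p_units(16, 1.0)         # subscribe 1080p everywhere
scheduled = dec_cost_in_1080p_units(16, 1 / 16)  # subscribe 270p layers instead
# naive = 16.0 vs scheduled = 1.0: about 15 units of 1080p DEC avoided,
# consistent in magnitude with the ~14.5-channel saving described above.
```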
Further, in this embodiment, the media server of the to-be-synthesized picture synthesizes the specified code stream according to the picture synthesis requirement, and after obtaining the synthesized code stream, the method further includes the following steps:
and the media server of the picture to be synthesized encodes the synthesized code stream to obtain synthesized code streams with various resolutions, and transmits the synthesized code streams to the SFU server.
Taking fig. 3 as an example, after obtaining the synthesized code stream, synthesizer 1 may encode it into a 3-layer code stream and issue it to the SFU server;
the first terminal subscribes and acquires a synthesized code stream with specific resolution from the SFU server;
if the first terminal is an AVC device, it may obtain the highest-resolution synthesized code stream directly through the DSW server, or subscribe to a code stream of suitable resolution from the SFU server.
As shown in fig. 3 to 5, the video code stream scheduling method is also effectively compatible with RTC terminals: an RTC terminal can subscribe to a synthesized code stream of suitable resolution from the SFU, or subscribe to multiple synthesized code streams.
The embodiment of the application also provides video conference equipment, which comprises a processor; a memory having stored therein executable instructions of the processor; wherein the processor is configured to perform the steps of the video bitstream scheduling method via execution of the executable instructions.
Those skilled in the art will appreciate that the various aspects of the present application may be implemented as a system, method, or program product. Accordingly, aspects of the present application may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit", "module" or "system".
An electronic device 600 according to this embodiment of the present application is described below with reference to fig. 7. The electronic device 600 shown in fig. 7 is merely an example, and should not be construed as limiting the functionality and scope of use of the embodiments herein.
As shown in fig. 7, the electronic device 600 is in the form of a general purpose computing device. Components of electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different system components (including the memory unit 620 and the processing unit 610), a display unit 640, etc.
Wherein the storage unit stores program code that is executable by the processing unit 610 such that the processing unit 610 performs the steps according to various exemplary embodiments of the present application described in the above video bitstream scheduling method section of the present specification. For example, the processing unit 610 may perform the steps as shown in fig. 2.
The memory unit 620 may include readable media in the form of volatile memory units, such as Random Access Memory (RAM) 6201 and/or cache memory unit 6202, and may further include Read Only Memory (ROM) 6203.
The storage unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 630 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 600, and/or any device (e.g., router, modem, etc.) that enables the electronic device 600 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 650. Also, electronic device 600 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 over the bus 630. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 600, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
By adopting the video conference equipment provided by the application, the processor executes the video code stream scheduling method when executing the executable instruction, so that the beneficial effects of the video code stream scheduling method can be obtained.
Embodiments of the present application also provide a computer-readable storage medium for storing a program which, when executed by a processor, implements the steps of the video bitstream scheduling method. In some possible embodiments, aspects of the present application may also be implemented in the form of a program product comprising program code; when the program product runs on a terminal device, the program code causes the terminal device to carry out the steps of the various exemplary embodiments of the present application described in the video bitstream scheduling method section of this specification.
Referring to fig. 8, a program product 800 for implementing the above method according to an embodiment of the present application may employ a portable compact disc read-only memory (CD-ROM), include program code, and run on a terminal device such as a personal computer. The program product of the present application is not limited thereto, however; in this document, a readable storage medium may be any tangible medium that contains or stores a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, Random Access Memory (RAM), Read-Only Memory (ROM), Erasable Programmable Read-Only Memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including but not limited to electromagnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium, other than a readable storage medium, that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" programming language. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server cluster. Where a remote computing device is involved, it may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computing device (for example, through the Internet using an Internet service provider).
With the computer-readable storage medium provided by the present application, the stored program, when executed, implements the steps of the video bitstream scheduling method, thereby obtaining the beneficial effects of that method.
The foregoing is a further detailed description of the present application in connection with specific preferred embodiments, and the practice of the present application is not to be construed as limited to this description. Those skilled in the art to which the present application pertains may make several simple deductions or substitutions without departing from the spirit of the present application, and all such deductions and substitutions shall be deemed to fall within the scope of the present application.
Claims (10)
1. A video bitstream scheduling method, comprising the following steps:
a picture-compositing media server acquires a picture composition requirement;
the picture-compositing media server determines, according to the picture composition requirement, the source terminal of the specified bitstream corresponding to each picture channel;
for each source terminal not connected to the picture-compositing media server, the picture-compositing media server determines, according to the picture composition requirement, the specified resolution of the specified bitstream corresponding to that source terminal;
the picture-compositing media server subscribes to and acquires, from the SFU (Selective Forwarding Unit) server, the specified bitstream corresponding to the source terminal and the specified resolution, wherein the media server connected to the source terminal is configured to process the source bitstream of the source terminal into bitstreams of multiple resolutions and publish them to the SFU server; and
the picture-compositing media server performs picture composition based on the specified bitstreams of the picture channels.
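The per-channel scheduling decision of claim 1 can be sketched in a few lines of Python. All names here (`plan_subscriptions`, the shape of the requirement dict) are illustrative assumptions, not taken from the patent:

```python
# Minimal sketch, assuming a hypothetical requirement dict per picture channel.
def plan_subscriptions(requirement, locally_connected):
    """For each picture channel, decide whether the source bitstream is
    obtained locally (terminal connected to this media server) or by
    subscribing to the SFU server at the specified resolution."""
    local_plan, sfu_plan = [], []
    for channel, spec in requirement.items():
        terminal, resolution = spec["terminal"], spec["resolution"]
        if terminal in locally_connected:
            # Source bitstream is available on this media server: decode directly.
            local_plan.append((channel, terminal))
        else:
            # Not connected here: subscribe via the SFU for the specified
            # bitstream at the specified resolution.
            sfu_plan.append((channel, terminal, resolution))
    return local_plan, sfu_plan

requirement = {
    "ch1": {"terminal": "termA", "resolution": "1080p"},
    "ch2": {"terminal": "termB", "resolution": "360p"},
}
local_plan, sfu_plan = plan_subscriptions(requirement, locally_connected={"termA"})
```

The point of the split is that locally connected terminals never round-trip through the SFU, while remote ones are fetched at exactly the resolution the composition needs.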
2. The video bitstream scheduling method according to claim 1, wherein the media server connected to the source terminal is further configured to, after publishing the bitstreams of the plurality of resolutions to the SFU server, send the published bitstream information and the corresponding terminal information to a conference management terminal;
the acquiring of the picture composition requirement by the picture-compositing media server comprises: the picture-compositing media server acquires a picture composition requirement initiated by the conference management terminal, wherein the picture composition requirement comprises source terminal information of the specified bitstream corresponding to each picture channel; and
the subscribing to and acquiring, from the SFU server, of the specified bitstream corresponding to the source terminal and the specified resolution comprises: the picture-compositing media server subscribes to and acquires the bitstream from the SFU server based on the source terminal information and the specified resolution.
3. The video bitstream scheduling method according to claim 2, wherein the determining of the specified resolution of the specified bitstream corresponding to the source terminal according to the picture composition requirement comprises performing the following steps for each source terminal not connected to the picture-compositing media server:
the picture-compositing media server sends request information to the SFU server, the request information comprising the picture-channel port information of the server corresponding to the specified bitstream of that source terminal;
the picture-compositing media server acquires, from the SFU server, first selectable resolution information of the specified bitstream; and
the picture-compositing media server determines the specified resolution of the specified bitstream according to the picture composition requirement and the first selectable resolution information of the specified bitstream.
4. The video bitstream scheduling method according to claim 3, wherein the picture composition requirement further comprises the picture composition type corresponding to each picture channel; and
the determining of the specified resolution of the specified bitstream according to the picture composition requirement and the first selectable resolution information comprises the following steps:
the picture-compositing media server determines, according to a preset mapping between picture composition types and resolutions, second selectable resolution information corresponding to the picture composition type of the picture channel; and
the picture-compositing media server determines the specified resolution of the corresponding specified bitstream according to the first selectable resolution information and the second selectable resolution information.
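The resolution selection of claims 3–4 amounts to intersecting two candidate sets: what the SFU actually offers for the stream (first selectable) and what the layout slot calls for (second selectable). A hedged sketch, in which the mapping values and helper names are assumptions for illustration:

```python
# Second selectable resolutions: a preset mapping from picture-composition
# type to resolutions suited to that layout slot (values are illustrative).
TYPE_TO_RESOLUTIONS = {
    "main": ["1080p", "720p"],      # large primary pane
    "thumbnail": ["360p", "180p"],  # small side pane
}

def pick_resolution(first_selectable, composition_type):
    """Intersect the resolutions the SFU offers for the stream (first
    selectable) with those mapped from the composition type (second
    selectable), taking the first match in preference order."""
    second_selectable = TYPE_TO_RESOLUTIONS[composition_type]
    for res in second_selectable:
        if res in first_selectable:
            return res
    return None  # no overlap: the caller must fall back or renegotiate

chosen = pick_resolution(["1080p", "360p", "180p"], "thumbnail")
```

Ordering the per-type list by preference means the highest acceptable resolution wins whenever it is available.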
5. The video bitstream scheduling method according to claim 1, wherein, after the picture-compositing media server determines the source terminal of the specified bitstream corresponding to each picture channel according to the picture composition requirement, the method further comprises the following steps:
the picture-compositing media server determines, for each source terminal, whether that source terminal is connected to it;
for each first-type source terminal connected to the picture-compositing media server, the picture-compositing media server acquires the source bitstream of that first-type source terminal and decodes it to obtain the specified bitstream corresponding to that terminal; and
for each second-type source terminal not connected to the picture-compositing media server, the picture-compositing media server determines, according to the picture composition requirement, the specified resolution corresponding to that second-type source terminal, and then subscribes to and acquires, from the SFU server, the specified bitstream corresponding to that terminal and the specified resolution.
6. The video bitstream scheduling method according to claim 1, further comprising the following steps:
the media server receives a source bitstream of a first resolution from a terminal connected to it;
the media server decodes the first-resolution source bitstream through a decoding interface to obtain a decoded stream;
the media server encodes the decoded stream through two encoding interfaces to obtain a bitstream of a second resolution and a bitstream of a third resolution, respectively; and
the media server publishes the first-resolution source bitstream, the second-resolution bitstream, and the third-resolution bitstream to the SFU server.
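The one-decode/two-encode fan-out of claim 6 can be sketched as below; `decode` and `encode` are stand-ins for real codec interfaces, not APIs named in the patent:

```python
def decode(bitstream):
    # Stand-in decoder: recover raw frames from the source bitstream.
    return {"frames": bitstream["payload"], "resolution": bitstream["resolution"]}

def encode(decoded, resolution):
    # Stand-in encoder: re-encode the raw frames at the requested resolution.
    return {"payload": decoded["frames"], "resolution": resolution}

def publish_simulcast(source_bitstream, extra_resolutions=("720p", "360p")):
    """Decode the first-resolution source once, encode it at two further
    resolutions, and return all three bitstreams for publication to the SFU."""
    decoded = decode(source_bitstream)
    published = [source_bitstream]        # first resolution: passed through as-is
    for res in extra_resolutions:         # second and third resolutions
        published.append(encode(decoded, res))
    return published  # in a real system, each entry is published to the SFU

streams = publish_simulcast({"payload": "raw", "resolution": "1080p"})
```

Decoding once and encoding N times is the usual trade-off here: one decoder instance per terminal, with the per-resolution cost paid only on the encode side.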
7. The video bitstream scheduling method according to claim 1, wherein the performing of picture composition by the picture-compositing media server based on the specified bitstreams of the picture channels comprises the following steps:
the picture-compositing media server composes the picture from the specified bitstreams according to the picture composition requirement to obtain a composite bitstream;
the picture-compositing media server encodes the composite bitstream into composite bitstreams of multiple resolutions and publishes them to the SFU server; and
a first terminal subscribes to and acquires, from the SFU server, the composite bitstream of a specific resolution.
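The compose-then-republish step of claim 7 can be sketched as follows; the function names and layout representation are illustrative assumptions, not taken from the patent:

```python
def compose(channel_streams, layout):
    """Place each channel's decoded picture into its layout slot."""
    return {slot: channel_streams[ch] for ch, slot in layout.items()}

def encode_multi(composite, resolutions=("1080p", "720p", "360p")):
    """Encode the composed picture at each resolution, yielding the
    composite bitstreams to publish to the SFU; a terminal then
    subscribes to the one matching its own capability."""
    return {res: {"picture": composite, "resolution": res} for res in resolutions}

composite = compose({"ch1": "speaker", "ch2": "slides"},
                    layout={"ch1": "left", "ch2": "right"})
outputs = encode_multi(composite)
```

Publishing the composite at several resolutions mirrors claim 6 for source streams: the SFU forwards, and each subscriber picks the rendition it can handle.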
8. A video bitstream scheduling system, applied to the video bitstream scheduling method of any one of claims 1 to 7, the system comprising a plurality of media servers, each media server comprising:
a codec module comprising a decoding interface, a first encoding interface, and a second encoding interface, the codec module being configured to, after receiving a source bitstream of a first resolution from a terminal connected to the server, decode the first-resolution source bitstream through the decoding interface to obtain a decoded stream, and encode the decoded stream through the first and second encoding interfaces to obtain a bitstream of a second resolution and a bitstream of a third resolution, respectively, the codec module being further configured to publish the first-resolution source bitstream, the second-resolution bitstream, and the third-resolution bitstream to an SFU server; and
a composition module configured to acquire a picture composition requirement, determine the source terminal of the specified bitstream corresponding to each picture channel according to the picture composition requirement, determine, for each source terminal not connected to the server, the specified resolution of the specified bitstream corresponding to that source terminal according to the picture composition requirement, subscribe to and acquire from the SFU server the specified bitstream corresponding to the source terminal and the specified resolution, and perform picture composition based on the specified bitstreams of the picture channels.
9. A video conferencing device, comprising:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the video bitstream scheduling method of any one of claims 1 to 7 via execution of the executable instructions.
10. A computer-readable storage medium storing a program, wherein the program, when executed by a processor, implements the steps of the video bitstream scheduling method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311764701.8A CN117750076A (en) | 2023-12-21 | 2023-12-21 | Video code stream scheduling method, system, equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311764701.8A CN117750076A (en) | 2023-12-21 | 2023-12-21 | Video code stream scheduling method, system, equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117750076A true CN117750076A (en) | 2024-03-22 |
Family
ID=90252314
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311764701.8A Pending CN117750076A (en) | 2023-12-21 | 2023-12-21 | Video code stream scheduling method, system, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117750076A (en) |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10250683B2 (en) | Server node arrangement and method | |
CN100583985C (en) | Method, apparatus and system for switching pictures in video service | |
US9300705B2 (en) | Methods and systems for interfacing heterogeneous endpoints and web-based media sources in a video conference | |
KR100880150B1 (en) | Multi-point video conference system and media processing method thereof | |
US20080117965A1 (en) | Multiple-Channel Codec and Transcoder Environment for Gateway, Mcu, Broadcast, and Video Storage Applications | |
US8773497B2 (en) | Distributed real-time media composer | |
US20110208837A1 (en) | Method and system for data communications in cloud computing architecture | |
CN112543297A (en) | Video conference live broadcasting method, device and system | |
CN101370114A (en) | Video and audio processing method, multi-point control unit and video conference system | |
US8984156B2 (en) | Multi-party mesh conferencing with stream processing | |
CN111385515B (en) | Video conference data transmission method and video conference data transmission system | |
WO2021093882A1 (en) | Video meeting method, meeting terminal, server, and storage medium | |
CN106303377A (en) | Video monitoring processing method and processing device | |
CA2897920A1 (en) | Video conference virtual endpoints | |
US20200329083A1 (en) | Video conference transmission method and apparatus, and mcu | |
CN101316352B (en) | Method and device for implementing multiple pictures of conference television system, video gateway and implementing method thereof | |
WO2023241693A1 (en) | Media service orchestration method and apparatus, and media server and storage medium | |
CN117750076A (en) | Video code stream scheduling method, system, equipment and storage medium | |
CN112788429B (en) | Screen sharing system based on network | |
WO2017173953A1 (en) | Server, conference terminal, and cloud conference processing method | |
CN203206388U (en) | Multi-point control unit used for video conferences | |
CN112995570B (en) | Information processing method, device, equipment, system and storage medium | |
CN113038183B (en) | Video processing method, system, device and medium based on multiprocessor system | |
WO2023274094A1 (en) | Method for processing media stream in video conference and related product | |
CN115567671A (en) | Method for processing media stream in video conference and related product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||