CN114513668A - Live video hardware encoder control method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN114513668A
CN114513668A (application CN202210175910.8A)
Authority
CN
China
Prior art keywords
video
hardware encoder
data
coding
encoding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210175910.8A
Other languages
Chinese (zh)
Inventor
鲍琦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Cubesili Information Technology Co Ltd
Original Assignee
Guangzhou Cubesili Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Cubesili Information Technology Co Ltd filed Critical Guangzhou Cubesili Information Technology Co Ltd
Priority to CN202210175910.8A priority Critical patent/CN114513668A/en
Publication of CN114513668A publication Critical patent/CN114513668A/en
Pending legal-status Critical Current


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/146 Data rate or code amount at the encoder output
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/44 Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/23418 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/24 Monitoring of processes or resources, e.g. monitoring of server load, available bandwidth, upstream requests
    • H04N21/2402 Monitoring of the downstream path of the transmission network, e.g. bandwidth available

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The application relates to the technical field of network live broadcast and provides a live video hardware encoder control method and device, computer equipment, and a storage medium. The method comprises creating, through a first thread, a first hardware encoder with first encoding parameters, and encoding the live video data to obtain first video encoding data. When the hardware encoder is to be switched, a second hardware encoder is created through a second thread with second encoding parameters; the first video encoding data encoded by the first hardware encoder and the second video encoding data encoded by the second hardware encoder are both acquired, the first video encoding data continues to be sent to the video receiver, and sending switches to the second video encoding data once a preset trigger condition is met. Seamless switching between the two hardware encoders can thereby be achieved, reducing stalling of the live video.

Description

Live video hardware encoder control method and device, computer equipment and storage medium
Technical Field
The embodiment of the application relates to the technical field of network live broadcast, in particular to a live video hardware encoder control method and device, computer equipment and a storage medium.
Background
During network live broadcast, the network environments of the anchor terminal and the audience terminal vary widely, and fluctuations in network bandwidth often degrade the viewing experience. Therefore, while the anchor is broadcasting, the video coding parameters need to be adjusted continuously in real time to match the current network bandwidth. Otherwise, when the network is poor, encoding at too high a code rate further congests the network and causes stalling; and when the network has recovered, continuing to encode at a low code rate yields poor video quality and likewise harms the viewing experience.
In practice, especially on overseas or outdoor networks, the network state is highly unstable and bandwidth varies widely, placing high demands on how quickly the video coding parameters can be adjusted in real time. To balance image quality and smoothness, the adjustment covers not only the code rate but also the broadcast resolution, the frame rate, and even the coding mode.
Live video on mobile terminals usually relies on a hardware encoder, and restarting a hardware encoder takes a long time. If the working periods of the hardware encoders before and after the restart do not connect well and the gap between them is long, the live video stream becomes discontinuous and the audience terminal stalls.
Disclosure of Invention
The embodiments of the application provide a live video hardware encoder control method and device, computer equipment, and a storage medium, which can reduce the video stalling caused by restarting a hardware encoder. The technical scheme is as follows:
in a first aspect, an embodiment of the present application provides a method for controlling a live video hardware encoder, including:
creating a first hardware encoder according to the first encoding parameter through the first thread;
encoding live video data through the first hardware encoder to obtain first video encoding data, and sending the first video encoding data to a video receiver;
acquiring real-time network bandwidth between a video sender and a video receiver, and judging whether to switch a hardware encoder according to the real-time network bandwidth;
if the hardware encoder is judged to be switched, acquiring a second encoding parameter adaptive to the real-time network bandwidth;
creating a second hardware encoder according to the second encoding parameter through a second thread, and encoding live video data through the first hardware encoder and the second hardware encoder at the same time to obtain first video encoding data encoded by the first hardware encoder and second video encoding data encoded by the second hardware encoder;
and continuing to send the first video coding data to the video receiver until a preset trigger condition is met, and switching to send the second video coding data to the video receiver.
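The first-aspect flow above can be sketched as a minimal single-process simulation in Python. Everything here is hypothetical: `StubEncoder` merely stands in for a real hardware encoder, frame indices stand in for video frames, and the "preset trigger condition" is assumed to be the first keyframe produced by the second encoder.

```python
class StubEncoder:
    """Hypothetical stand-in for a hardware encoder; real code would wrap a
    platform encoder (e.g. a mobile hardware codec) instead."""
    def __init__(self, params):
        self.params = params

    def encode(self, frame):
        # Tag each frame with the encoder's parameters to mimic encoded output;
        # pretend a keyframe is emitted every 30 frames.
        return {"frame": frame, "params": self.params, "keyframe": frame % 30 == 0}


def run_switch(frames, first_params, second_params, switch_at):
    """Encode with the first encoder; from `switch_at` onward run both encoders
    in parallel, and switch the outgoing stream at the next keyframe produced
    by the second encoder (an assumed trigger condition)."""
    first = StubEncoder(first_params)
    second = None
    switched = False
    sent = []
    for i in frames:
        out1 = first.encode(i)
        if not switched and second is None and i >= switch_at:
            second = StubEncoder(second_params)  # created on a second thread in practice
        if second is not None:
            out2 = second.encode(i)
            if out2["keyframe"]:
                sent.append(out2)                # trigger met: switch the outgoing stream
                first, second = second, None     # the old encoder can now be released
                switched = True
                continue
        sent.append(out1)
    return sent
```

In a real implementation each encoder would run on its own thread, and the trigger condition might instead be timestamp alignment between the two encoded streams; the point illustrated is only that encoded output never stops during the switch.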
In a second aspect, an embodiment of the present application provides a live video hardware encoder control apparatus, including:
the first hardware encoder creating unit is used for creating a first hardware encoder according to the first encoding parameter through a first thread;
the first coding unit is used for coding live video data through the first hardware coder to obtain first video coding data and sending the first video coding data to a video receiver;
the second coding parameter acquisition unit is used for acquiring the real-time network bandwidth between a video sender and a video receiver, judging whether to switch a hardware encoder according to the real-time network bandwidth, and acquiring a second coding parameter adaptive to the real-time network bandwidth when judging to switch the hardware encoder;
a second hardware encoder creating unit, configured to create, by a second thread, a second hardware encoder according to the second encoding parameter;
the double-encoder encoding unit is used for encoding live video data through the first hardware encoder and the second hardware encoder at the same time to obtain first video encoding data encoded by the first hardware encoder and second video encoding data encoded by the second hardware encoder;
and the switching unit is used for continuing to send the first video coding data to the video receiver until a preset trigger condition is met, and switching to send the second video coding data to the video receiver.
In a third aspect, embodiments of the present application provide a computer device comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the live video hardware encoder control method according to the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium storing a computer program, which when executed by a processor implements the steps of the live video hardware encoder control method according to the first aspect.
According to the method and device, after the live broadcast starts, a first hardware encoder is created through a first thread according to first encoding parameters, and the live video data is encoded to obtain first video encoding data. During broadcasting, the real-time network bandwidth between the video sender and the video receiver is acquired, a corresponding second encoding parameter is obtained according to that bandwidth, and whether to switch the hardware encoder is decided accordingly. When switching, a second hardware encoder is created through a second thread according to the second encoding parameters, the live video data is fed simultaneously into both encoders, the first video encoding data from the first hardware encoder and the second video encoding data from the second hardware encoder are obtained, the first video encoding data continues to be sent to the video receiver, and once a preset trigger condition is met, sending switches to the second video encoding data. Seamless switching between the old and new hardware encoders is thereby achieved, reducing interruptions and stalling of the live video.
For a better understanding and implementation, the technical solutions of the present application are described in detail below with reference to the accompanying drawings.
Drawings
Fig. 1 is a schematic view of an application scenario of a live video hardware encoder control method according to an embodiment of the present application;
fig. 2 is a schematic diagram of a general live video hardware encoder control method;
fig. 3 is a schematic flowchart of a live video hardware encoder control method according to a first embodiment of the present application;
fig. 4 is a schematic structural diagram of a live video hardware encoder control apparatus according to a second embodiment of the present application;
fig. 5 is a schematic structural diagram of a computer device according to a third embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "upon", "when", or "in response to determining", depending on the context.
As will be appreciated by those skilled in the art, the terms "transmitting end," "receiving end," "client," "terminal," and "terminal device" as used herein cover both devices that include only a wireless signal receiver with no transmit capability and devices whose receiving and transmitting hardware supports two-way communication over a two-way communication link. Such devices may include: cellular or other communication devices, with or without a single-line or multi-line display; PCS (Personal Communications Service) devices, which may combine voice, data processing, facsimile and/or data communication capabilities; PDAs (Personal Digital Assistants), which may include a radio frequency receiver, a pager, Internet/intranet access, a web browser, a notepad, a calendar and/or a GPS (Global Positioning System) receiver; and conventional laptop and/or palmtop computers or other devices that have and/or include a radio frequency receiver. As used herein, a "client" or "terminal device" may be portable, transportable, installed in a vehicle (aeronautical, maritime and/or land-based), or situated and/or configured to operate locally and/or in a distributed fashion at any other location on earth and/or in space. The "sending end," "receiving end," "client," "terminal" and "terminal device" used herein may also be a communication terminal, a network access terminal, or a music/video playing terminal, for example a PDA, an MID (Mobile Internet Device) and/or a mobile phone with a music/video playing function, or a smart television, set-top box or similar device.
The hardware referred to by the names "server", "client", "service node", etc. is essentially a computer device with the performance of a personal computer, and is a hardware device having necessary components disclosed by the von neumann principle, such as a central processing unit (including an arithmetic unit and a controller), a memory, an input device, an output device, etc., wherein a computer program is stored in the memory, and the central processing unit loads a program stored in an external memory into the internal memory to run, executes instructions in the program, and interacts with the input and output devices, thereby accomplishing specific functions.
It should be noted that the concept of "server" as referred to in this application can be extended to the case of a server cluster. According to the network deployment principle understood by those skilled in the art, the servers should be logically divided, and in physical space, the servers may be independent from each other but can be called through an interface, or may be integrated into one physical computer or a set of computer clusters. Those skilled in the art will appreciate this variation and should not be so limited as to restrict the implementation of the network deployment of the present application.
Referring to fig. 1, fig. 1 is a schematic view of an application scenario of a live video hardware encoder control method provided in an embodiment of the present application, where the application scenario includes an anchor client 101, a server 102, and a viewer client 103 provided in the embodiment of the present application, and the anchor client 101 and the viewer client 103 interact with each other through the server 102.
The anchor client 101 is a client that transmits a live video, and is generally a client used by an anchor (i.e., a live anchor user) in live streaming.
The viewer client 103 refers to the end that receives and watches the live video, and is typically a client used by a viewer (i.e., a live viewer user) watching the video in a network live broadcast.
The hardware at which the anchor client 101 and viewer client 103 are directed is essentially a computer device, and in particular, as shown in fig. 1, it may be a type of computer device such as a smart phone, smart interactive tablet, and personal computer. Both the anchor client 101 and the viewer client 103 may access the internet via known network access means to establish a data communication link with the server 102.
Server 102, acting as a business server, may be responsible for further connecting with related audio data servers, video streaming servers, and other servers providing related support, etc., to form a logically associated server cluster for serving related terminal devices, such as anchor client 101 and viewer client 103 shown in fig. 1.
In the embodiment of the present application, the anchor client 101 and the audience client 103 may join in the same live broadcast room (i.e., a live broadcast channel), where the live broadcast room is a chat room implemented by means of an internet technology, and generally has an audio/video broadcast control function. The anchor user is live in the live room through the anchor client 101, and the audience of the audience client 103 can log in the server 102 to enter the live room to watch the live.
In the live broadcast room, the anchor and the audience can interact through known online modes such as voice, video, and text. Generally, the anchor user performs for the audience in the form of an audio/video stream, and interactive behaviors may also arise during the interaction. Of course, the application form of the live broadcast room is not limited to online entertainment and can be extended to other relevant scenarios, such as user pairing interaction, video conferencing, product recommendation and sale, and any other scenario requiring similar interaction.
Specifically, the viewer watches live broadcast as follows: the viewer can click to access a live application (e.g., YY) installed on the viewer client 103 and choose to enter any one of the live rooms, and the viewer client 103 is triggered to load a live room interface for the viewer, wherein the live room interface includes a plurality of interactive components, and the viewer can watch live in the live room by loading the interactive components and perform various online interactions.
In the live broadcast process, after the anchor client 101 starts broadcasting, the live video data collected in this session needs to be encoded and then sent to the server 102. After receiving the live video data from the anchor client 101, the server 102 may forward it directly to the viewer clients 103 that have joined the corresponding live broadcast room; alternatively, the server 102 may receive live video data from multiple anchor clients 101, decode and merge it (in a multi-anchor co-hosting, or "lian mai", scenario), then re-encode the merged data and send it to the viewer clients 103 in the corresponding live broadcast room.
Therefore, the video sending end in this embodiment may refer to the anchor client 101, or the server 102, and in some specific scenarios, may also be the viewer client 103; the video receiving end in the embodiment of the present application may be the server 102, or the viewer client 103, and in some specific scenarios, may also be the anchor client 101.
Generally, the control flow executed by the anchor client 101 after starting broadcast is as shown in fig. 2 and includes: S201, creating a first hardware encoder; S202, encoding the live video data according to the first encoding parameter; when the network between the anchor client 101 and the server degrades (or improves), S203, first closing the first hardware encoder and releasing it, thereby completing its destruction; S204, creating a second hardware encoder; S205, controlling the second hardware encoder to start and initialize; S206, feeding live video data to the second hardware encoder, which begins encoding it; and S207, outputting the second video encoding data. Thus, during the whole interval from closing the first hardware encoder to the second hardware encoder outputting the second video encoding data, no encoded data enters the stream.
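The gap in this sequential prior-art flow can be illustrated with a toy timeline (all names and numbers here are hypothetical): frames that arrive after the old encoder has been destroyed but before the new one is ready have no encoder to go to.

```python
def sequential_restart(frames, restart_at, restart_cost):
    """Hypothetical timeline of the prior-art flow (S201-S207): the old encoder
    is destroyed before the new one is created, so frames arriving during the
    restart window cannot be encoded and fall out of the stream."""
    encoded, dropped = [], []
    encoder_ready_at = 0
    for t in frames:
        if t == restart_at:
            # Destroy the old encoder, then create/initialize the new one;
            # the new encoder becomes usable only after `restart_cost` ticks.
            encoder_ready_at = t + restart_cost
        if t < restart_at or t >= encoder_ready_at:
            encoded.append(t)
        else:
            dropped.append(t)  # gap: no encoder available -> stuttering at viewers
    return encoded, dropped
```

With, say, a restart at tick 3 costing 4 ticks, ticks 3 to 6 produce no encoded output, which is exactly the discontinuity the dual-encoder scheme below avoids.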
Moreover, because releasing and creating a hardware encoder are both time-consuming, the live video data collected by the anchor client 101 cannot be continuously encoded and streamed during the period spanning the destruction of the old hardware encoder and the creation of the new one. This breaks the continuity of the live video stream, and the live video watched by the viewer client 103 stalls. If large network jitter occurs frequently, the hardware encoder of the anchor client 101 is restarted and switched many times, severely harming the viewing experience.
Based on the above, an embodiment of the present application provides a live video hardware encoder control method, please refer to fig. 3, where fig. 3 is a schematic flow diagram of a live video hardware encoder control method according to a first embodiment of the present application, and the method includes the following steps:
s301: creating a first hardware encoder according to the first encoding parameter through the first thread;
s302: encoding live video data through the first hardware encoder to obtain first video encoding data, and sending the first video encoding data to a video receiver;
s303: acquiring real-time network bandwidth between a video sender and a video receiver, and judging whether to switch a hardware encoder according to the real-time network bandwidth;
s304: if the hardware encoder is judged to be switched, acquiring a second encoding parameter adaptive to the real-time network bandwidth;
s305: creating a second hardware encoder according to the second encoding parameter through a second thread, and encoding live video data through the first hardware encoder and the second hardware encoder at the same time to obtain first video encoding data encoded by the first hardware encoder and second video encoding data encoded by the second hardware encoder;
s306: and continuing to send the first video coding data to the video receiver until a preset trigger condition is met, and switching to send the second video coding data to the video receiver.
According to the live video hardware encoder control method, after broadcasting starts, the first hardware encoder is created through the first thread according to the first encoding parameters, and the live video data is encoded to obtain first video encoding data. During broadcasting, the real-time network bandwidth between the video sender and the video receiver is acquired, a corresponding second encoding parameter is obtained according to that bandwidth, and whether to switch the hardware encoder is decided accordingly. When switching, a second hardware encoder is created through a second thread according to the second encoding parameters, the live video data is fed simultaneously into both encoders, the first video encoding data from the first hardware encoder and the second video encoding data from the second hardware encoder are obtained, the first video encoding data continues to be sent to the video receiver, and once a preset trigger condition is met, sending switches to the second video encoding data. Seamless switching between the old and new hardware encoders is thereby achieved, reducing disconnection and stalling of the live video.
In a possible embodiment, the live video hardware encoder control method of the embodiment of the present application is explained with the anchor client as the video sending end and the server as the video receiving end. Steps S301 to S306 are described in detail below.
With respect to step S301, the anchor client creates a first hardware encoder according to the first encoding parameter through the first thread.
The first thread may be a live software thread run by the anchor client or an independent thread otherwise created by the anchor client. And creating a first hardware encoder locally at the anchor client through the first thread, and initializing the first hardware encoder with the first encoding parameter.
After the first hardware encoder is initialized successfully, it is bound to the first thread; the encoding function of the graphics processor is invoked in the first thread, and the graphics processor sends each drawn image to a corresponding image buffer, where it is provided to the first hardware encoder for encoding.
After the anchor client starts broadcasting, it acquires the first encoding parameter and creates the first hardware encoder accordingly. The first encoding parameter may be a preset default encoding parameter; the encoding parameters may include one or more of: encoding width and height, encoding frame rate, and code rate. The default encoding parameters can be uniformly configured or adjusted by the server.
The first encoding parameter may also be derived from the real-time bandwidth; the specific method is as follows:
the anchor client responds to a broadcast command, acquires a real-time network bandwidth between a video sender and a video receiver (namely the real-time network bandwidth between the anchor client and the server), acquires a first coding parameter adaptive to the real-time network bandwidth, and creates a first hardware encoder according to the first coding parameter through a first thread.
Wherein, the broadcast command may include identification information of a video sender and identification information of a video receiver, and the step of obtaining the real-time network bandwidth between the video sender and the video receiver in response to the broadcast command includes:
and according to the identification information of the video sender and the identification information of the video receiver, communication address information of the video sender and the video receiver is obtained, and according to the communication address information, network testing is carried out to obtain the real-time network bandwidth between the video sender and the video receiver.
Specifically, the network test is performed according to the communication address information of the anchor client and the server to obtain the real-time network bandwidth between them. The real-time network bandwidth may be obtained by the anchor client executing the network test, or it may be sent to the anchor client after the server executes the test.
In one embodiment, the first encoding parameter may be obtained by:
acquiring a first coding rate adaptive to the real-time network bandwidth;
determining the coding rate range of the coder to which the first coding rate belongs according to a plurality of preset coding rate ranges of the coder and preset coding parameters corresponding to the coding rate ranges of the coders, and acquiring the preset coding parameters corresponding to the coding rate ranges of the coders as the first coding parameters.
As shown in table 1 below, the first encoding parameters include: coding width and height, coding frame rate and code rate.
A plurality of coding rate ranges (rate thresholds) for different gears, and the encoding parameters corresponding to each gear's rate threshold, are preset. After the anchor client starts broadcasting, it performs bandwidth detection, maps the detection result to a corresponding first coding rate, and matches that rate in real time against the preset rate thresholds; if it falls within a gear's range, the encoding parameters of that gear are used as the first encoding parameters.
TABLE 1
Gear          Code rate threshold   Encoding parameters
Default gear  -                     [W, H, FPS, BitRate]
Gear 1        [Rmin1, Rmax1)        [W1, H1, FPS, BitRate1]
Gear 2        [Rmin2, Rmax2)        [W2, H2, FPS, BitRate2]
Gear 3        [Rmin3, Rmax3)        [W3, H3, FPS, BitRate3]
The code rate threshold Rmin in Table 1 is the lowest coding rate of the corresponding gear, and Rmax is the highest coding rate of the corresponding gear. In the encoding parameters [W, H, FPS, BitRate] of Table 1, W and H are the encoding width and height, FPS is the encoding frame rate, and BitRate is the code rate. The code rate thresholds of gears 1-3 decrease in gear order, and the encoding parameters decrease correspondingly. The default gear corresponds to the encoding parameters used directly after the anchor client starts; that is, before the anchor client performs bandwidth detection to obtain the real-time bandwidth, encoding may be performed according to the encoding parameters corresponding to the default gear.
If the anchor client determines from the real-time bandwidth that the mapped coding rate falls in gear 1, the encoding parameters of gear 1 are used as the first encoding parameters.
In a possible embodiment, the ranges of coding rates (rate thresholds) of the respective gears do not overlap.
Further, the first coding rate adapted to the real-time network bandwidth may be obtained by determining the bandwidth threshold corresponding to the real-time network bandwidth according to a plurality of preset bandwidth thresholds and the coding rate corresponding to each bandwidth threshold, and taking the coding rate corresponding to that bandwidth threshold as the first coding rate.
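As a minimal illustration of the two-stage lookup described above (not code from the application; all thresholds, gear values, and function names are hypothetical), the measured real-time bandwidth is first mapped to a first coding rate, which is then matched against the non-overlapping per-gear rate ranges [Rmin, Rmax) of Table 1 to select the preset encoding parameters:

```python
# Hypothetical gear table: (Rmin_kbps, Rmax_kbps, (width, height, fps, bitrate_kbps))
GEARS = [
    (3000, 8000, (1920, 1080, 30, 4000)),  # gear 1
    (1500, 3000, (1280, 720, 30, 2000)),   # gear 2
    (0, 1500, (960, 540, 30, 800)),        # gear 3
]
DEFAULT_PARAMS = (1280, 720, 30, 2000)     # default gear, used before detection

def coding_rate_for_bandwidth(bandwidth_kbps):
    """Map measured bandwidth to a target coding rate (here: reserve 20% headroom)."""
    return int(bandwidth_kbps * 0.8)

def params_for_rate(rate_kbps):
    """Return the preset encoding parameters of the gear whose [Rmin, Rmax) contains the rate."""
    for rmin, rmax, params in GEARS:
        if rmin <= rate_kbps < rmax:
            return params
    return DEFAULT_PARAMS

first_rate = coding_rate_for_bandwidth(2500)   # e.g. 2500 kbps measured bandwidth
first_params = params_for_rate(first_rate)     # falls in gear 2's range
```

The headroom factor and the gear boundaries are placeholders; only the lookup structure (bandwidth to rate, rate to gear, gear to parameters) follows the text.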
In step S302, the live video data is encoded by the first hardware encoder to obtain first video encoded data, and the first video encoded data is sent to the video receiving side.
The anchor client captures live video data and encodes it through the first hardware encoder; the first hardware encoder encodes the live video data with the first encoding parameters to obtain the first video encoded data.
The anchor client then sends the first video encoded data produced by the first hardware encoder to the server over the network.
After receiving the first video encoded data, the server sends it to the audience clients that have joined the corresponding live broadcast room. Alternatively, the server may receive first video encoded data from multiple anchor clients, decode and merge it (in a multi-anchor co-streaming scene), re-encode the merged live video data, and send it to the audience clients that have joined the corresponding live broadcast room.
After receiving the first video encoded data, the audience client performs the corresponding decoding operation and plays it. In an embodiment, when sending the first video encoded data, the video sender also sends corresponding decoding information; the video receiver receives the decoding information, obtains a first decoding parameter corresponding to the first encoding parameter, and decodes the first video encoded data with the first decoding parameter to obtain the live video data.
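The decoding-information handshake mentioned above might look like the following sketch (a hypothetical illustration; the field names, codec string, and helper functions are assumptions, not the application's actual format):

```python
# Hypothetical sketch: the video sender ships a small decoding-information
# header alongside the first video encoded data, and the video receiver
# derives matching decoder parameters from it. All field names are invented.

def make_decoding_info(encoding_params):
    """Build decoding information from the first encoding parameters [W, H, FPS, BitRate]."""
    w, h, fps, _bitrate = encoding_params
    return {"codec": "h264", "width": w, "height": h, "fps": fps}

def decoder_params_from_info(info):
    """First decoding parameters the receiver configures its decoder with."""
    return (info["codec"], info["width"], info["height"], info["fps"])

info = make_decoding_info((1280, 720, 30, 2000))
dec_params = decoder_params_from_info(info)
```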
Step S303, acquiring a real-time network bandwidth between a video sender and a video receiver, and judging whether to switch a hardware encoder according to the real-time network bandwidth;
A network test is performed according to the communication address information of the anchor client and the server to obtain the real-time network bandwidth between them. The real-time network bandwidth may be obtained by the anchor client executing the network test, or may be sent to the anchor client after the server executes the network test.
In one embodiment, the step of determining whether to switch the hardware encoder according to the real-time network bandwidth includes:
acquiring a second coding rate adapted to the real-time network bandwidth;
determining, according to the second coding rate and a plurality of preset encoder coding rate ranges, the encoder coding rate range to which the second coding rate belongs;
and if that coding rate range does not overlap the coding rate range of the first hardware encoder, judging that the hardware encoder is to be switched.
For example, if the anchor client determines from the real-time bandwidth that the mapped second coding rate falls in gear 2 while the currently working first hardware encoder corresponds to gear 1, then the coding rate range corresponding to the second coding rate does not overlap that of the first hardware encoder, and it is judged that the hardware encoder is to be switched.
Further, the second coding rate adapted to the real-time network bandwidth may be obtained by determining the bandwidth threshold corresponding to the real-time network bandwidth according to a plurality of preset bandwidth thresholds and the coding rate corresponding to each bandwidth threshold, and taking the coding rate corresponding to that bandwidth threshold as the second coding rate.
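The switching judgment above can be sketched as follows (the gear table mirrors Table 1 but every number is invented): the encoder is switched only when the second coding rate lands in a gear whose range does not contain the first hardware encoder's current rate range:

```python
# Hypothetical per-gear rate ranges [Rmin_kbps, Rmax_kbps), mirroring Table 1.
GEARS = [(3000, 8000), (1500, 3000), (0, 1500)]  # gear 1, gear 2, gear 3

def gear_index(rate_kbps):
    """Index of the gear whose half-open range contains the rate, or None."""
    for i, (rmin, rmax) in enumerate(GEARS):
        if rmin <= rate_kbps < rmax:
            return i
    return None

def should_switch(second_rate_kbps, current_gear):
    """Switch only when the second rate falls in a different (non-overlapping) gear."""
    new_gear = gear_index(second_rate_kbps)
    return new_gear is not None and new_gear != current_gear
```

Because the gear ranges are non-overlapping, "different gear" and "rate range does not overlap the first encoder's range" coincide, which is what makes this simple index comparison sufficient.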
In step S304, if it is determined to switch the hardware encoder, a second encoding parameter adapted to the real-time network bandwidth is obtained.
The second encoding parameters adapted to the real-time network bandwidth may be obtained either as preset default second encoding parameters, or derived from the acquired real-time network bandwidth.
In one embodiment, the second encoding parameter is obtained by:
acquiring a second coding rate adapted to the real-time network bandwidth;
determining, according to a plurality of preset encoder coding rate ranges and the preset encoding parameters corresponding to each range, the encoder coding rate range to which the second coding rate belongs, and acquiring the preset encoding parameters corresponding to that range as the second encoding parameters.
The detailed acquiring manner of the second encoding parameter may refer to the acquiring manner of the first encoding parameter, and is not described herein again.
In step S305, a second hardware encoder is created by a second thread according to the second encoding parameters, and live video data is encoded by the first hardware encoder and the second hardware encoder simultaneously, so as to obtain first video encoded data from the first hardware encoder and second video encoded data from the second hardware encoder.
The second thread may be an independent thread, separate from the live-streaming software thread run by the anchor client; it is independent of the first thread and may run in parallel with it. A second hardware encoder is created locally on the anchor client through the second thread, and the second hardware encoder is initialized with the second encoding parameters to construct the GL environment required by the second hardware encoder.
After the second hardware encoder is successfully initialized, it is bound to the second thread; the encoding function of the graphics processor is invoked in the second thread, and the graphics processor delivers the rendered image to a corresponding image buffer, which is supplied to the second hardware encoder for encoding.
After the second hardware encoder is successfully initialized, it is run through the second thread and encodes live video data according to the second encoding parameters; meanwhile, the first hardware encoder continues to run through the first thread, encoding the live video data according to the first encoding parameters. First video encoded data from the first hardware encoder and second video encoded data from the second hardware encoder are thereby obtained.
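The parallel operation of the two encoder threads can be sketched as follows (a toy Python illustration: the "encoders" merely tag frames, standing in for real hardware encoders bound to the first and second threads; only the fan-out structure follows the text):

```python
import queue
import threading

# Toy stand-ins for hardware encoders: each thread drains its own input queue
# and appends "encoded" frames tagged with its name. The point is the
# structure -- two encoders running in parallel threads over the same frames --
# not real codec work.

def encoder_worker(name, in_q, out):
    while True:
        frame = in_q.get()
        if frame is None:          # sentinel: stop encoding
            break
        out.append((name, frame))  # pretend-encode the frame

q1, q2 = queue.Queue(), queue.Queue()
out1, out2 = [], []
t1 = threading.Thread(target=encoder_worker, args=("enc1", q1, out1))
t2 = threading.Thread(target=encoder_worker, args=("enc2", q2, out2))
t1.start()
t2.start()

for frame in range(5):             # fan each live frame out to both encoders
    q1.put(frame)
    q2.put(frame)
q1.put(None)
q2.put(None)
t1.join()
t2.join()
```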
In one embodiment, texture features of the live video data may be obtained, and the first video encoding data and the second video encoding data obtained by encoding the same frame of live video data may be respectively marked according to the texture features.
Texture features describe surface properties of the scene corresponding to an image or image region, such as the coarseness and density of the image texture. Common texture features include the gray-level co-occurrence matrix, autoregressive texture models, Tamura texture features, wavelet transforms, and the like. Under the inventive concept of the present application, a person skilled in the art may select any commonly used texture feature to describe the live video data, and mark the first video encoded data and the second video encoded data obtained by encoding the same frame of live video data with that texture feature.
Therefore, in a subsequent step, the end frame and the start frame at which the switch between the first video encoded data and the second video encoded data occurs can be obtained from the marks.
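A minimal sketch of this frame marking (here a hash of the raw pixels serves as a cheap stand-in for the texture features named above; the streams and offsets are invented):

```python
import hashlib

# A per-frame fingerprint is attached to both encoders' outputs so that
# outputs encoded from the same source frame can be matched when switching.

def frame_mark(raw_frame_bytes):
    return hashlib.sha256(raw_frame_bytes).hexdigest()[:16]

frames = [bytes([i]) * 64 for i in range(4)]                  # toy "raw frames"
stream1 = [("enc1-data", frame_mark(f)) for f in frames]
stream2 = [("enc2-data", frame_mark(f)) for f in frames[2:]]  # enc2 starts later

# Locate where stream2's first frame lines up inside stream1: the frame just
# before that match is the sending end frame of the first encoded stream.
start_mark = stream2[0][1]
end_index = next(i for i, (_d, m) in enumerate(stream1) if m == start_mark) - 1
```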
S306: and continuing to send the first video coding data to the video receiver until a preset trigger condition is met, and switching to send the second video coding data to the video receiver.
In the present application, after it is determined that the encoder needs to be switched, the first hardware encoder that is currently encoding is not closed immediately. Instead, live video data continues to be delivered to the first hardware encoder so that it keeps encoding, and the first video encoded data is output continuously and without interruption.
After the preset trigger condition is met, the anchor client switches to sending the second video encoded data to the video receiver. The preset trigger condition may be: the second hardware encoder has been successfully initialized and has encoded the input live video data according to the second encoding parameters to obtain the corresponding first frame of second video encoded data.
In one embodiment, after the second hardware encoder run by the second thread is successfully initialized for encoding, live video data is encoded according to the second encoding parameters and the corresponding second video encoded data is output. The anchor client, through the running first thread, obtains the second video encoded data from the second thread via a callback; when the first frame of second video encoded data is called back from the second thread, the anchor client switches to sending the second video encoded data to the video receiver.
Under the concept of the present application, the preset trigger condition may also be set to obtaining any frame of the second video encoded data, to the elapse of a preset time, and so on; it is not limited to the above.
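The first-frame trigger can be sketched as follows (an illustrative Python toy, not the application's code; the callback and event names are invented):

```python
import threading

# Sketch of the preset trigger condition: the first stream stays active until
# the second encoder's first output frame arrives via a callback (from the
# second thread), which sets an event that flips the active stream.

switch_ready = threading.Event()
frames_from_second_encoder = 0

def on_second_encoder_output(encoded_frame):
    """Callback invoked for each frame output by the second hardware encoder."""
    global frames_from_second_encoder
    frames_from_second_encoder += 1
    if frames_from_second_encoder == 1:   # first frame of second video encoded data
        switch_ready.set()                # preset trigger condition met

def active_stream():
    return "second" if switch_ready.is_set() else "first"

before = active_stream()                  # still sending the first stream
on_second_encoder_output(b"\x00")         # first frame arrives from second thread
after = active_stream()                   # switched to the second stream
```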
The step of the anchor client switching to sending the second video encoded data to the video receiver may include:
according to the mark in step S305, acquiring the first video coding data and the second video coding data obtained by coding the same frame of live video data, and determining a sending end frame of the first video coding data and a sending start frame of the second video coding data;
and stopping transmitting the first video encoding data after the transmission end frame, and transmitting the second video encoding data from the transmission start frame.
The sending start frame of the second video encoded data may be its first frame; the sending end frame of the first video encoded data may be the frame of first video encoded data, obtained according to the marks, immediately preceding the first frame of the second video encoded data.
Under the concept of the present application, the sending start frame of the second video encoded data is not limited to its first frame; it may also be any frame of the second video encoded data, or the frame corresponding to the elapse of a preset time. The sending end frame of the first video encoded data is the frame of first video encoded data immediately preceding the sending start frame of the second video encoded data.
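The handover itself reduces to sending stream1 up to and including its sending end frame, then stream2 from its sending start frame. A toy sketch (frames are placeholder strings; the indices are hypothetical):

```python
def splice_streams(stream1, stream2, end_frame_index, start_frame_index=0):
    """Stop sending stream1 after end_frame_index; continue with stream2
    from start_frame_index, so the receiver sees no gap or duplicate frame."""
    return stream1[:end_frame_index + 1] + stream2[start_frame_index:]

stream1 = ["f1-0", "f1-1", "f1-2", "f1-3"]
stream2 = ["f2-2", "f2-3", "f2-4"]        # second encoder started at source frame 2
sent = splice_streams(stream1, stream2, end_frame_index=1)
```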
Preferably, after switching to sending the second video encoded data to the video receiver, the method further includes the following step:
and stopping the encoding of the live video data by the first hardware encoder, and releasing the first hardware encoder.
In an alternative embodiment, the first hardware encoder may stop encoding live video data after it finishes encoding the sending end frame of the first video encoded data. Releasing the first hardware encoder frees part of the occupied terminal performance, making operation smoother.
Further, the first hardware encoder may be destroyed to free up more terminal performance.
In another alternative embodiment, the first hardware encoder is not released and destroyed immediately; it may instead be released and destroyed after a preset time elapses or a preset condition is met, so that hardware encoders need not be destroyed and created repeatedly when the bandwidth fluctuates back and forth.
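The deferred-release policy above can be sketched as a small grace-period timer (a hypothetical illustration; the class, method names, and grace value are invented, and times are plain numbers injected by the caller so the policy is easy to test):

```python
class DeferredRelease:
    """Defer destroying a retired hardware encoder for a grace period, so that
    repeated bandwidth fluctuation does not trigger constant destroy/create
    cycles."""

    def __init__(self, grace_s):
        self.grace_s = grace_s
        self.retired_at = None          # None: encoder is in use / reclaimed

    def retire(self, now):
        """Mark the old encoder as no longer sending; start the grace timer."""
        self.retired_at = now

    def reclaim(self, now):
        """Bandwidth swung back: reuse the old encoder, cancel pending destroy."""
        self.retired_at = None

    def should_destroy(self, now):
        """True once the encoder has stayed retired for the full grace period."""
        return self.retired_at is not None and now - self.retired_at >= self.grace_s
```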
Please refer to fig. 4, which is a schematic structural diagram of a live video hardware encoder control apparatus according to a second embodiment of the present application. The apparatus may be implemented as all or part of a server in software, hardware, or a combination of both.
The apparatus 40 comprises:
a first hardware encoder creating unit 401, configured to create, by a first thread, a first hardware encoder according to a first encoding parameter;
a first encoding unit 402, configured to encode live video data by using the first hardware encoder to obtain first video encoded data, and send the first video encoded data to a video receiving side;
a second encoding parameter obtaining unit 403, configured to obtain a real-time network bandwidth between a video sender and a video receiver, determine whether to switch a hardware encoder according to the real-time network bandwidth, and obtain a second encoding parameter adapted to the real-time network bandwidth when determining to switch the hardware encoder;
a second hardware encoder creating unit 404, configured to create, by a second thread, a second hardware encoder according to the second encoding parameter;
a dual-encoder encoding unit 405, configured to encode live video data through the first hardware encoder and the second hardware encoder at the same time, and obtain first video encoded data encoded by the first hardware encoder and second video encoded data encoded by the second hardware encoder;
the switching unit 406 is configured to continue to send the first video encoding data to the video receiving side until a preset trigger condition is met, and switch to send the second video encoding data to the video receiving side.
It should be noted that, when the live video hardware encoder control apparatus provided in the foregoing embodiment executes live video hardware encoder control, only the division of the above functional modules is taken as an example, and in practical applications, the above functions may be allocated to different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to complete all or part of the above described functions. In addition, the live video hardware encoder control device and the live video hardware encoder control method provided by the above embodiments belong to the same concept, and details of implementation processes are shown in the method embodiments and are not described herein again.
Please refer to fig. 5, which is a schematic structural diagram of a computer device according to a third embodiment of the present application. As shown in fig. 5, the computer device 21 may include: a processor 210, a memory 211, and a computer program 212 stored in the memory 211 and operable on the processor 210, such as: a live video hardware encoder control program; the steps of the live video hardware encoder control method in the above embodiments are implemented when the processor 210 executes the computer program 212.
The processor 210 may include one or more processing cores. The processor 210 is connected to various parts of the computer device 21 through various interfaces and lines, and executes the various functions of the computer device 21 and processes data by running or executing the instructions, programs, code sets or instruction sets stored in the memory 211 and calling the data in the memory 211. Optionally, the processor 210 may be implemented in at least one hardware form among Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 210 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and so on; the GPU renders and draws the content to be displayed by the touch display screen; the modem handles wireless communication. It will be understood that the modem may also not be integrated into the processor 210 and may instead be implemented by a separate chip.
The memory 211 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 211 includes a non-transitory computer-readable medium. The memory 211 may be used to store instructions, programs, code sets, or instruction sets. The memory 211 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as touch instructions), instructions for implementing the above method embodiments, and the like; the data storage area may store the data referred to in the above method embodiments. Optionally, the memory 211 may also be at least one storage device located remotely from the processor 210.
The embodiment of the present application further provides a computer storage medium, where the computer storage medium may store a plurality of instructions, where the instructions are suitable for being loaded by a processor and executing the method steps of the foregoing embodiment, and a specific execution process may refer to specific descriptions of the foregoing embodiment, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the apparatus/terminal device embodiments described above are merely illustrative; the division into modules or units is only a logical functional division, and there may be other divisions in actual implementation: for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium and used for instructing relevant hardware, and when the computer program is executed by a processor, the steps of the above-described embodiments of the method may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc.
The present invention is not limited to the above-described embodiments, and various modifications and variations of the present invention are intended to be included within the scope of the claims and the equivalent technology of the present invention if they do not depart from the spirit and scope of the present invention.

Claims (11)

1. A method for controlling a live video hardware encoder, the method comprising the steps of:
creating a first hardware encoder according to the first encoding parameter through the first thread;
encoding live video data through the first hardware encoder to obtain first video encoding data, and sending the first video encoding data to a video receiver;
acquiring real-time network bandwidth between a video sender and a video receiver, and judging whether to switch a hardware encoder according to the real-time network bandwidth;
if the hardware encoder is judged to be switched, acquiring a second encoding parameter adaptive to the real-time network bandwidth;
creating a second hardware encoder according to the second encoding parameter through a second thread, and encoding live video data through the first hardware encoder and the second hardware encoder at the same time to obtain first video encoding data encoded by the first hardware encoder and second video encoding data encoded by the second hardware encoder;
and continuing to send the first video coding data to the video receiver until a preset trigger condition is met, and switching to send the second video coding data to the video receiver.
2. The method of claim 1, wherein the step of simultaneously encoding live video data by the first hardware encoder and the second hardware encoder to obtain first video encoding data encoded by the first hardware encoder and second video encoding data encoded by the second hardware encoder comprises:
acquiring texture features of the live video data, and marking the first video coding data and the second video coding data obtained by coding the same frame of live video data according to the texture features;
the step of switching to transmitting the second video coding data to the video receiver comprises:
acquiring the first video coding data and the second video coding data obtained by coding the same frame of live video data according to the marks, and determining a sending end frame of the first video coding data and a sending start frame of the second video coding data;
and stopping transmitting the first video encoding data after the transmission end frame, and transmitting the second video encoding data from the transmission start frame.
3. The live video hardware encoder control method of claim 1, wherein the preset trigger condition is:
and the second hardware encoder is successfully initialized and encodes the input live video data according to the second encoding parameters to obtain a corresponding first frame of second video encoding data.
4. The live video hardware encoder control method of claim 1, further comprising, after sending the second video encoding data to the video recipient, the steps of:
and stopping the encoding of the live video data by the first hardware encoder, and releasing the first hardware encoder.
5. The live video hardware encoder control method of claim 1, wherein the step of creating a first hardware encoder by a first thread according to the first encoding parameter comprises:
responding to a broadcasting instruction, acquiring a real-time network bandwidth between a video sender and a video receiver, and acquiring a first coding parameter adaptive to the real-time network bandwidth;
a first hardware encoder is created by a first thread in accordance with a first encoding parameter.
6. The live video hardware encoder control method of claim 5, wherein the opening instruction includes identification information of a video sender and identification information of a video receiver;
the step of responding to the broadcasting instruction and acquiring the real-time network bandwidth between the video sender and the video receiver comprises the following steps:
acquiring communication address information of the video sender and the video receiver according to the identification information of the video sender and the identification information of the video receiver;
and according to the communication address information, performing network test to obtain the real-time network bandwidth between the video sender and the video receiver.
7. The method of any of claims 1-6, wherein the step of determining whether to switch a hardware encoder based on the real-time network bandwidth comprises:
acquiring a second coding rate adaptive to the real-time network bandwidth;
determining the coding rate range of the coder to which the second coding rate belongs according to the second coding rate and the coding rate ranges of a plurality of preset coders;
and if the coding rate range is not overlapped with the coding rate range of the first hardware encoder, judging to switch the hardware encoders.
8. The method of any of claims 1-6, wherein the step of obtaining a second coding rate that is compatible with the real-time network bandwidth comprises:
determining the bandwidth threshold corresponding to the real-time network bandwidth according to a plurality of preset bandwidth thresholds and the coding rate corresponding to each bandwidth threshold, and acquiring the coding rate corresponding to that bandwidth threshold as the second coding rate.
9. A live video hardware encoder control apparatus, comprising:
the first hardware encoder creating unit is used for creating a first hardware encoder according to the first encoding parameter through a first thread;
the first coding unit is used for coding live video data through the first hardware coder to obtain first video coding data and sending the first video coding data to a video receiver;
the second coding parameter acquisition unit is used for acquiring the real-time network bandwidth between a video sender and a video receiver, judging whether to switch a hardware encoder according to the real-time network bandwidth, and acquiring a second coding parameter adaptive to the real-time network bandwidth when judging to switch the hardware encoder;
a second hardware encoder creating unit, configured to create, by a second thread, a second hardware encoder according to the second encoding parameter;
the double-encoder encoding unit is used for encoding live video data through the first hardware encoder and the second hardware encoder at the same time to obtain first video encoding data encoded by the first hardware encoder and second video encoding data encoded by the second hardware encoder;
and the switching unit is used for continuing to send the first video coding data to the video receiver until a preset trigger condition is met, and switching to send the second video coding data to the video receiver.
10. A computer device, comprising: processor, memory and computer program stored in the memory and executable on the processor, characterized in that the steps of the method according to any of claims 1 to 8 are implemented when the processor executes the computer program.
11. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 8.
CN202210175910.8A 2022-02-24 2022-02-24 Live video hardware encoder control method and device, computer equipment and storage medium Pending CN114513668A (en)

Publications (1)

Publication Number Publication Date
CN114513668A true CN114513668A (en) 2022-05-17

Family

ID=81553585


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115866248A (en) * 2022-11-08 2023-03-28 格兰菲智能科技有限公司 Video transcoding method and device, computer equipment and storage medium
CN115866248B (en) * 2022-11-08 2024-01-19 格兰菲智能科技有限公司 Video transcoding method, device, computer equipment and storage medium

Similar Documents

Publication Publication Date Title
US9479737B2 (en) Systems and methods for event programming via a remote media player
RU2506715C2 (en) Transmission of variable visual content
WO2018120946A1 (en) Method and apparatus for determining video image abnormality, and terminal device
CN111711833B (en) Live video stream push control method, device, equipment and storage medium
AU2010260683B2 (en) Apparatus and method for transmitting and receiving a user interface in a communication system
US10945038B2 (en) Media content distribution
AU2003258912B2 (en) Audio visual media encoding system
US10225305B2 (en) Adaptive data segment delivery arbitration for bandwidth optimization
CN111629239B (en) Screen projection processing method, device, equipment and computer readable storage medium
JP2006134326A (en) Method for controlling transmission of multimedia data from server to client based on client's display condition, method and module for adapting decoding of multimedia data in client based on client's display condition, module for controlling transmission of multimedia data from server to client based on client's display condition and client-server system
AU2003258912A2 (en) Audio visual media encoding system
US9226003B2 (en) Method for transmitting video signals from an application on a server over an IP network to a client device
CN114513668A (en) Live video hardware encoder control method and device, computer equipment and storage medium
US20080104659A1 (en) Prioritized real-time data transmission
CN114286128A (en) Live video parameter adjusting method, system, device, equipment and storage medium
CN102821309A (en) System and method for transferring streaming media based on desktop sharing
CN112929704A (en) Data transmission method, device, electronic equipment and storage medium
CN115314727A (en) Live broadcast interaction method and device based on virtual object and electronic equipment
CN113747181A (en) Network live broadcast method, live broadcast system and electronic equipment based on remote desktop
CN114640849B (en) Live video encoding method, device, computer equipment and readable storage medium
CN114760528A (en) Method, system, device, computer equipment and medium for video data transmission
KR102359367B1 (en) Method and apparatus for game streaming
CN116016972A (en) Live broadcasting room beautifying method, device and system, storage medium and electronic equipment
CN115883863A (en) Method, device, medium and electronic equipment for obtaining event live broadcast delay
CN116761002A (en) Video coding method, virtual reality live broadcast method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination