CN112543350A - Live broadcast interaction method and device - Google Patents
- Publication number
- CN112543350A CN112543350A CN202011410640.1A CN202011410640A CN112543350A CN 112543350 A CN112543350 A CN 112543350A CN 202011410640 A CN202011410640 A CN 202011410640A CN 112543350 A CN112543350 A CN 112543350A
- Authority
- CN
- China
- Prior art keywords
- interactive
- stream
- video stream
- rendering
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/23614—Multiplexing of additional data and video streams
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/218—Source of audio or video content, e.g. local disk arrays
- H04N21/2187—Live feed
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/239—Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
The application provides a live broadcast interaction method and device. The method, applied to a server, comprises the following steps: synthesizing a live video stream sent by an anchor client and an interactive rendering stream into one video stream to obtain a composite video stream, the interactive rendering stream being obtained by the server encoding an interactive interface comprising at least preset interactive elements; sending the composite video stream to user clients; upon receiving touch data sent by any user client, determining the interactive operation indicated by the touch data; responding to the interactive operation to obtain result data; rendering the result data onto the interactive rendering stream to obtain a target rendering stream; and sending to the user client the video stream obtained by synthesizing the target rendering stream with the live video stream received when the target rendering stream was obtained. Because the client only needs to send touch data, the flow the client executes does not change when new interactive operations are added; the client therefore need not be upgraded, which improves upgrade efficiency.
Description
Technical Field
The present application relates to the field of video processing, and in particular, to a live broadcast interaction method and apparatus.
Background
In live broadcasting, a user can interact with the anchor; for example, the user presents a gift to the anchor.
Currently, the process of implementing interaction among the user's client, the anchor's client, and the server may proceed as follows: the anchor's client sends the live video stream to the user's client through the server; the user's client renders preconfigured interactive elements onto the live video stream and displays the rendered video stream. When the user's client receives an interaction instruction (for example, presenting a gift), it looks up the data corresponding to that instruction in a preset correspondence between interaction instructions and preset data (for gift-giving, the preset data may include the gift instruction and a gift code) and sends the data to the server. The server calls the corresponding service interface, responds to the received data, and sends the response result data to the anchor's client and the users' clients, and the user's client renders the response result data onto the live video stream for display.
However, whenever interactive operations between users and the anchor are added, the interaction instructions and object codes agreed between the user's client and the server must be adjusted. Both the user's client and the server therefore need to be upgraded, which reduces upgrade efficiency.
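To make the background concrete, the prior scheme can be sketched as a preset table, kept on the user's client, from interaction instruction to agreed data (such as the gift instruction and gift code). All table entries and names below are illustrative assumptions, not values from the patent; the point is that a new interaction is unknown until every client's table is updated, which is why both sides must be upgraded.

```python
# Hypothetical client-side table of agreed instruction data (prior scheme).
CLIENT_TABLE = {
    "send_gift": {"instruction": "GIFT", "code": 0x01},
}

def data_for(instruction):
    """Look up the preset data agreed between client and server."""
    return CLIENT_TABLE[instruction]

# A newly added "vote" interaction is unknown until the client itself is upgraded:
try:
    data_for("vote")
    vote_known = True
except KeyError:
    vote_known = False
```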
Disclosure of Invention
The application provides a live broadcast interaction method and device, and aims to solve the problem of low upgrading efficiency.
In order to achieve the above object, the present application provides the following technical solutions:
the application provides a live broadcast interaction method, which is applied to a server and comprises the following steps:
synthesizing a live video stream sent by an anchor client and an interactive rendering stream into one video stream to obtain a composite video stream; the interactive rendering stream is obtained by the server encoding an interactive interface comprising at least preset interactive elements;
sending the composite video stream to a user client;
upon receiving touch data sent by any user client, determining the interactive operation indicated by the touch data; the touch data includes: the action and area triggered by the user on the user client;
responding to the interactive operation to obtain result data;
rendering the result data to the interactive rendering stream to obtain a target rendering stream;
and sending, to the user client, the video stream obtained by synthesizing the target rendering stream with the live video stream received when the target rendering stream was obtained.
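The server-side steps above can be sketched as one pass of the protocol. This is a minimal illustrative sketch, not an implementation defined by the patent: all function names (`encode_ui`, `compose`, `determine_operation`, `render_onto`), the dictionary stream representation, and the result string are assumptions standing in for real video encoding, synthesis, and service calls.

```python
def encode_ui(elements):
    """Encode an interactive interface (a list of element names) as a rendering stream."""
    return {"type": "render_stream", "elements": list(elements)}

def compose(live_stream, render_stream):
    """Synthesize a live video stream and a rendering stream into one video stream."""
    return {"type": "composite", "live": live_stream, "ui": render_stream}

def determine_operation(touch, mapping):
    """Map the (action, area) pair in the touch data to the interactive operation it indicates."""
    return mapping.get((touch["action"], touch["area"]))

def render_onto(result, render_stream):
    """Render response result data onto the interactive rendering stream."""
    updated = dict(render_stream)
    updated["overlay"] = result
    return updated

# One pass: compose and send, receive touch data, respond, render, re-compose.
mapping = {("click", "gift_icon"): "send_gift"}
ui = encode_ui(["gift_icon"])
composite = compose("live-frame-0", ui)               # sent to user clients
op = determine_operation({"action": "click", "area": "gift_icon"}, mapping)
result = f"operation {op} acknowledged"               # placeholder result data
target = render_onto(result, ui)                      # target rendering stream
final = compose("live-frame-1", target)               # sent back to user clients
```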
Optionally, the determining the interactive operation indicated by the touch data includes:
and determining, according to a preset correspondence among interactive operations, trigger actions, and trigger areas, the interactive operation corresponding to the action and area triggered by the user on the user client.
Optionally, before the synthesizing the live video stream and the interactive rendering stream into a video stream to obtain a synthesized video stream, the method further includes:
judging whether a preset trigger condition of the target interactive operation is met or not;
under the condition that the preset trigger condition of the target interactive operation is not met, the interactive interface comprising at least preset interactive elements is: an interactive interface comprising only the preset interactive elements.
Optionally, under the condition that the preset trigger condition of the target interactive operation is met, the interactive interface comprising at least the preset interactive elements is: an interactive interface comprising the interactive elements corresponding to the target interactive operation and the preset interactive elements.
The application also provides a live broadcast interaction method, which is applied to a user client and comprises the following steps:
upon receiving a video stream synthesized from a live video stream and an interactive rendering stream, decoding and displaying the video stream;
upon receiving a trigger instruction indicating that the user has triggered an interactive operation, sending trigger data consisting of at least the trigger action and trigger area indicated by the trigger instruction to the server;
under the condition of receiving a target rendering stream sent by a server, decoding and displaying the target rendering stream; the target rendering stream is obtained by rendering result data to the interactive rendering stream by the server; the result data is obtained by the server side through responding to the interactive operation indicated by the trigger data.
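The user-client side reduces to two behaviours: decode-and-display whatever stream arrives, and send trigger data upstream. The sketch below is an illustrative assumption (the `decode` placeholder and dictionary payload are not from the patent); a real client would use an actual video decoder.

```python
def decode(stream):
    """Placeholder for video decoding; a real client would decode frames here."""
    return f"decoded({stream})"

displayed = []

def on_video_stream(stream):
    # Decode and display any composite video stream or target rendering stream received.
    displayed.append(decode(stream))

def on_trigger(action, area):
    # The only upstream message the client ever sends: trigger action and trigger area.
    return {"action": action, "area": area}

on_video_stream("composite-0")
payload = on_trigger("click", "vote_button")  # would be sent to the server
```

Because the client never interprets the interaction itself, adding a new interactive operation changes nothing in this code, which is the upgrade-efficiency argument the patent makes.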
The application also provides a live broadcast interaction device, which is applied to a server and comprises:
the synthesizing module is used for synthesizing a live video stream sent by an anchor client and an interactive rendering stream into one video stream to obtain a composite video stream; the interactive rendering stream is obtained by the server encoding an interactive interface comprising at least preset interactive elements;
the first sending module is used for sending the composite video stream to a user client;
the determining module is used for determining, upon receiving touch data sent by any user client, the interactive operation indicated by the touch data; the touch data includes: the action and area triggered by the user on the user client;
the response module is used for responding to the interactive operation to obtain result data;
the rendering module is used for rendering the result data to the interactive rendering stream to obtain a target rendering stream;
and the second sending module is used for sending the video stream obtained by synthesizing the target rendering stream and the live video stream received when the target rendering stream is obtained to the user client.
Optionally, the determining module is configured to determine an interactive operation indicated by the touch data, and includes:
the determining module is specifically configured to determine, according to a preset correspondence among interactive operations, trigger actions, and trigger areas, the interactive operation corresponding to the action and area triggered by the user on the user client.
Optionally, the apparatus further comprises:
the judging module is used for judging whether a preset triggering condition of target interactive operation is met or not before the synthesizing module synthesizes the live video stream and the interactive rendering stream into a video stream to obtain a synthesized video stream;
under the condition that the preset trigger condition of the target interactive operation is not met, the interactive interface comprising at least preset interactive elements is: an interactive interface comprising only the preset interactive elements.
Optionally, under the condition that the preset trigger condition of the target interactive operation is met, the interactive interface comprising at least the preset interactive elements is: an interactive interface comprising the interactive elements corresponding to the target interactive operation and the preset interactive elements.
The application also provides a live broadcast interaction device, which is applied to a user client and comprises:
the first decoding and displaying module is used for decoding and displaying a video stream under the condition of receiving the video stream synthesized by a live video stream and an interactive rendering stream;
the third sending module is used for sending, upon receiving a trigger instruction indicating that the user has triggered an interactive operation, trigger data consisting of at least the trigger action and trigger area indicated by the trigger instruction to the server;
the second encryption display module is used for decoding and displaying the target rendering stream under the condition of receiving the target rendering stream sent by the server; the target rendering stream is obtained by rendering result data to the interactive rendering stream by the server; the result data is obtained by the server side through responding to the interactive operation indicated by the trigger data.
According to the live broadcast interaction method and device provided by the application, upon receiving a live video stream sent by the anchor client, the server synthesizes the live video stream and an interactive rendering stream into one video stream to obtain a composite video stream; sends the composite video stream to the user clients; upon receiving touch data sent by any user client, determines the interactive operation indicated by the touch data; responds to the interactive operation to obtain result data; renders the result data onto the interactive rendering stream to obtain a target rendering stream; and sends to the user client the video stream obtained by synthesizing the target rendering stream with the live video stream received when the target rendering stream was obtained.
It can be seen that, in this application, the interactive rendering stream is generated by the server. After the user triggers an interactive operation, the user client only needs to send the touch data; the server determines the interactive operation indicated by the touch data, responds to it, renders the resulting data onto the interactive rendering stream to obtain the target rendering stream, and sends the target rendering stream to the user client. Throughout the processing of an interactive operation, the client only needs to send touch data, so the flow the client executes does not change when interactive operations are added; the client need not be upgraded, only the server, and upgrade efficiency is therefore improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a flowchart of a live broadcast interaction method disclosed in an embodiment of the present application;
fig. 2 is a flowchart of another live broadcast interaction method disclosed in the embodiment of the present application;
fig. 3 is a schematic structural diagram of a live broadcast interaction apparatus disclosed in an embodiment of the present application;
fig. 4 is a schematic structural diagram of another live broadcast interaction device disclosed in the embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1 is a live broadcast interaction method provided in an embodiment of the present application, which may include the following steps:
and S101, the anchor client sends the recorded live video stream to a server.
In this embodiment, the anchor client records video of the anchor in real time to obtain a live video stream and sends the generated live video stream to the server. That is, in this embodiment, the anchor client continuously sends the live video stream to the server.
S102, the server side judges whether preset trigger conditions of target interactive operation are met, if not, S103 is executed, and if yes, S112 is executed.
In the embodiment of the application, interactions between the user and the anchor can be divided into two types. The first type comprises interactions between the user and the live broadcast room that occur only when a certain trigger condition is met, for example, voting; for convenience of description, this type is referred to as a target interactive operation. The second type comprises interactions that may occur at any time between the user and the anchor, such as presenting gifts.
In this embodiment, each target interactive operation corresponds to a preset trigger condition. The content of each trigger condition is determined according to the actual situation and is not limited by this embodiment.
In this step, the server determines whether a preset trigger condition of the target interactive operation is satisfied, if not, the action of S103 is executed, and if so, the action of S112 is executed.
Taking voting as an example of the target interactive operation: in practice, if voting needs to be started, the administrator starts the voting program from the background, that is, the voting program is started on the server. In this step, the preset trigger condition corresponding to voting is whether the voting program has been started. If the start of the voting program is not detected, S103 is performed; otherwise, S112 is performed.
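The branch at S102 can be sketched as a simple membership check. The `started_programs` registry and operation name below are illustrative assumptions, not structures defined by the patent; the only point is that starting the background program flips which step runs.

```python
# Hypothetical server-side registry of background programs the administrator has started.
started_programs = set()

def trigger_condition_met(operation):
    """Preset trigger condition for a target interactive operation: its program is started."""
    return operation in started_programs

# Before the administrator starts voting, S103 runs (preset elements only);
# afterwards, S112 runs (voting element added as well).
branch_before = "S112" if trigger_condition_met("vote") else "S103"
started_programs.add("vote")  # administrator starts the voting program
branch_after = "S112" if trigger_condition_met("vote") else "S103"
```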
S103, the server side synthesizes the received live video stream sent by the anchor client side and the first interactive rendering stream into a video stream to obtain a first synthesized video stream.
This step is executed when the preset trigger condition of the target interactive operation is not met. In this step, the currently received live video stream and a first interactive rendering stream are synthesized into one video stream, referred to for convenience as the first composite video stream. The first interactive rendering stream is obtained by the server encoding an interactive interface comprising only preset interactive elements, where the preset interactive elements are the interactive elements of the second type of interactive operation. That is, the first interactive rendering stream is obtained by the server encoding an interactive interface comprising only the interactive elements of the second type of interactive operation.
In this step, the server synthesizes the first interactive rendering stream with the live video stream received when the first interactive rendering stream is obtained, where a specific implementation manner of the synthesis is the prior art, and is not described herein again.
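The patent treats the synthesis itself as prior art; purely for illustration, one common form of synthesis is a per-frame alpha blend of the rendered UI frame over the live frame. The grayscale pixel lists and fixed alpha below are assumptions; a production server would composite real decoded frames (e.g. with an overlay filter in a media pipeline).

```python
def blend_frame(live, ui, alpha):
    """Overlay a UI frame on a live frame (equal-sized flat grayscale pixel lists, 0-255)."""
    return [round((1 - alpha) * l + alpha * u) for l, u in zip(live, ui)]

live_frame = [100, 100, 100]   # live video pixels
ui_frame = [255, 0, 255]       # rendered interactive-interface pixels
composite = blend_frame(live_frame, ui_frame, alpha=0.5)
```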
And S104, the server side sends the first composite video stream to the user client side.
And S105, the user client decodes and displays the first composite video stream.
In this step, the client decodes the first composite video stream and displays the decoded data stream, where a specific implementation manner of decoding and displaying is the prior art and is not described herein again.
In this embodiment, after the data stream decoded by the user client is displayed on the screen, the specifically displayed image includes interactive elements of the second type of interactive operation, for example, icons representing gift gifts.
S106, upon receiving a trigger instruction indicating that the user has triggered an interactive operation, the user client sends trigger data consisting of at least the trigger action and trigger area to the server.
If the user wants to present a gift to the anchor, the user may trigger the icon on the client that represents gift-giving. In this case, the user client receives a trigger instruction indicating that the user has triggered an interactive operation, and the client acquires the trigger action, the coordinates of the trigger position, and a coordinate region generated from those coordinates. For convenience of description, the trigger action and trigger area acquired by the user client are referred to as touch data. Optionally, the touch data may further include the coordinates of the trigger position.
For example, the user client receives an instruction to trigger gifting of a gift, and the user client may acquire an action (e.g., a click, double-click, or touch gesture, etc.) to trigger gifting of the gift, coordinates of a location on a screen where a picture of the gift is triggered, and a trigger area generated by the coordinates.
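Constructing the touch data can be sketched as below. The region scheme (a fixed-size box around the touch point) and the field names are illustrative assumptions; the patent only requires that the touch data carry at least the trigger action and a region derived from the trigger coordinates.

```python
def make_touch_data(action, x, y, half=20):
    """Build touch data from a trigger action and screen coordinates.

    The region is assumed here to be a box of +/- `half` pixels around the point;
    the patent does not specify how the region is generated.
    """
    region = (x - half, y - half, x + half, y + half)
    return {"action": action, "coords": (x, y), "region": region}

# e.g. a click on the gift icon drawn around screen position (120, 300):
payload = make_touch_data("click", 120, 300)
```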
S107, under the condition that the server side receives the touch data sent by any user client side, the interactive operation indicated by the touch data is determined.
In this step, the server determines the interactive operation indicated by the touch data when receiving the trigger data sent by any user client.
Optionally, the server is configured with a preset corresponding relationship between the trigger action and the trigger area in advance, and in this step, the server determines the interactive operation corresponding to the trigger action and the trigger area in the received touch data according to the corresponding relationship.
And S108, the server responds to the interactive operation to obtain result data.
In this step, the server may call an interface corresponding to the interactive operation, implement a response to the interactive operation, and obtain result data. Taking the interactive operation as presenting gifts as an example, the result data obtained in this step may be "xx presents xx".
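The response at S108 amounts to dispatching the determined operation to its service interface and formatting result data such as "xx presents xx". The handler registry, argument names, and message template below are illustrative assumptions, not interfaces defined by the patent.

```python
# Hypothetical registry mapping each interactive operation to its service interface.
handlers = {
    "send_gift": lambda user, gift: f"{user} presents {gift}",
}

def respond(operation, **kwargs):
    """Call the service interface corresponding to the operation; return result data."""
    handler = handlers.get(operation)
    if handler is None:
        raise KeyError(f"no service interface registered for {operation!r}")
    return handler(**kwargs)

result = respond("send_gift", user="alice", gift="rocket")
```

Adding a new interactive operation only means registering a new handler on the server, which is consistent with the patent's claim that only the server needs upgrading.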
S109, the server renders the result data to the first interactive rendering stream to obtain a first target rendering stream.
In this step, the server renders the result data into a first interactive rendering stream, and for convenience of description, the rendered result is referred to as a first target rendering stream. The specific implementation manner of rendering is the prior art, and is not described herein again.
S110, the server side sends the video stream obtained by synthesizing the first target rendering stream and the live video stream received when the first target rendering stream is obtained to the user client side.
In this step, the server synthesizes the live video stream received when the first target rendering stream is obtained with the first target rendering stream, and sends the synthesized video stream to each user client.
It should be noted that, in this embodiment, after the server executes this step, in the case of receiving a live video stream, the server still synthesizes the live video stream and the first interactive rendering stream.
S111, the client decodes and displays the first target rendering stream when receiving the first target rendering stream sent by the server.
In this step, the first target rendering stream is decoded, and the decoded data stream is displayed, where a specific implementation manner of decoding and displaying is the prior art, and is not described herein again.
And S112, the server side synthesizes the live video stream and the second interactive rendering stream into a video stream to obtain a second synthesized video stream.
This step is executed when the preset trigger condition corresponding to the target interactive operation is met.
In this step, the second interactive rendering stream is obtained by encoding an interactive interface including an interactive element corresponding to the target interactive operation and a preset interactive element, that is, encoding an interactive interface of an interactive element of the first type of interactive operation and an interactive element of the second type of interactive operation, so as to obtain the second interactive rendering stream. For example, the first type of interaction includes voting, the second type of interaction includes a gift presentation, and in this step, an interaction interface including interaction elements (e.g., icons) corresponding to the voting and the gift presentation, respectively, is encoded.
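The difference between the first and second interactive interfaces reduces to which elements are encoded: preset (second-type) elements are always present, and target-operation elements such as voting are added only once the trigger condition holds. The element names below are illustrative assumptions.

```python
PRESET_ELEMENTS = ["gift_icon"]  # second-type interactions, always shown

def interface_elements(active_target_ops):
    """Elements of the interactive interface to encode into the rendering stream."""
    return PRESET_ELEMENTS + sorted(active_target_ops)

first_interface = interface_elements(set())        # encoded into the first rendering stream
second_interface = interface_elements({"vote"})    # encoded into the second rendering stream
```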
And S113, the server sends the second composite video stream to the user client.
And S114, the user client decodes and displays the second composite video stream.
In this step, the second composite video stream is decoded to obtain a decoded data stream, and the decoded data stream is displayed, where a specific implementation manner of decoding and displaying is the prior art, and is not described herein again.
S115, upon receiving a trigger instruction indicating that the user has triggered an interactive operation, the user client sends trigger data consisting of at least the trigger action and trigger area to the server.
In this step, the touch data includes: the action and area that the user triggers on the user client.
The meaning and specific implementation principle of this step may refer to S106, and are not described herein again.
S116, the server determines the interactive operation indicated by the touch data under the condition that the touch data sent by any user client is received.
The meaning and specific implementation principle of this step can refer to S107, and are not described herein again.
And S117, the server responds to the interactive operation to obtain result data.
The meaning and specific implementation principle of this step may refer to S108, and are not described herein again.
S118, the server renders the result data to a second interactive rendering stream to obtain a second target rendering stream.
The meaning and specific implementation principle of this step may refer to S109, and are not described herein again.
S119, the server side sends the second target rendering stream and the video stream obtained by synthesizing the live video stream received when the second target rendering stream is obtained to the user client side.
The meaning and specific implementation principle of this step may refer to S110, and are not described herein again.
S120, the user client decodes and displays the second target rendering stream under the condition that the second target rendering stream sent by the server is received.
The meaning and specific implementation principle of this step may refer to S111, which is not described herein again.
Fig. 2 is a still another live broadcast interaction method provided in the embodiment of the present application, which may include the following steps:
S201, the anchor client sends the recorded live video stream to a server.
The meaning and specific implementation of this step may refer to S101, which is not described herein again.
S202, the server side combines the live video stream and the interactive rendering stream sent by the anchor client side into a video stream to obtain a combined video stream.
In this embodiment, the interactive rendering stream is obtained by encoding, by the service end, an interactive interface at least including a preset interactive element.
Under the condition that the preset trigger condition of the target interactive operation is not met, the interactive interface at least comprising the preset interactive elements specifically comprises the following steps: and the interactive interface only comprises preset interactive elements.
Under the condition that a preset trigger condition of the target interactive operation is met, an interactive interface at least comprising preset interactive elements specifically comprises the following steps: the interactive interface comprises interactive elements corresponding to the target interactive operation and preset interactive elements.
And S203, the server side sends the synthesized video stream to the user client side.
S204, the user client decodes and displays the video stream under the condition of receiving the video stream synthesized by the live video stream and the interactive rendering stream.
In this step, the received video stream is decoded, and the decoded data stream is displayed. The specific implementation manner of decoding and displaying is the prior art, and is not described herein again.
S205, upon receiving a trigger instruction indicating that the user has triggered an interactive operation, the user client sends trigger data consisting of at least the trigger action and trigger area to the server.
In this embodiment, the touch data includes: the action and area that the user triggers on the user client. Optionally, in practice, the touch data may further include the coordinates of the position on the screen where the user triggered the interactive operation.
The meaning and specific implementation principle of this step may refer to S106, and are not described herein again.
S206, under the condition that the server side receives touch data sent by any user client side, the interactive operation indicated by the touch data is determined.
The meaning and specific implementation principle of this step may refer to S107, which is not described herein again.
S207, the server responds to the interactive operation to obtain result data.
The meaning and specific implementation principle of this step may refer to S108, and are not described herein again.
And S208, rendering the result data to the interactive rendering stream by the server side to obtain a target rendering stream.
The meaning and specific implementation principle of this step may refer to S109, and are not described herein again.
S209, the server sends, to the user client, the video stream obtained by synthesizing the target rendering stream and the live video stream received when the target rendering stream is obtained.
The meaning and specific implementation principle of this step may refer to S110, which is not described herein again.
S210, after receiving the video stream, the user client decodes and displays the received video stream.
In this step, the user client decodes the received video stream and displays the decoded video stream. The specific implementation manner of decoding and displaying is the prior art, and is not described herein again.
The embodiment has the following beneficial effects:
Beneficial effect one:
in this embodiment, the server synthesizes the live video stream and the interactive rendering stream, and sends the synthesized video stream to the user client.
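The server-side synthesis described above can be pictured as per-frame compositing: each frame of the interactive rendering stream acts as an RGBA overlay that is alpha-blended onto the corresponding live video frame before re-encoding. The following is a minimal sketch under that assumption, using nested lists of pixels purely for illustration (a real implementation would operate on decoded frame buffers).

```python
def composite_frame(live, overlay):
    """Alpha-blend an RGBA overlay frame onto an RGB live frame (illustrative)."""
    out = []
    for live_row, over_row in zip(live, overlay):
        row = []
        for (lr, lg, lb), (orr, og, ob, oa) in zip(live_row, over_row):
            a = oa / 255.0  # overlay alpha in [0, 1]
            row.append((
                round(orr * a + lr * (1 - a)),
                round(og * a + lg * (1 - a)),
                round(ob * a + lb * (1 - a)),
            ))
        out.append(row)
    return out

# A fully opaque overlay pixel replaces the live pixel;
# a fully transparent one leaves the live pixel unchanged.
live = [[(100, 100, 100), (100, 100, 100)]]
overlay = [[(255, 0, 0, 255), (0, 0, 0, 0)]]
print(composite_frame(live, overlay))  # -> [[(255, 0, 0), (100, 100, 100)]]
```

Because this blending happens on the server, the client receives a single ordinary video stream and needs no overlay logic of its own, which is exactly the point of this beneficial effect.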
Beneficial effect two:
in this embodiment, the rendering operation is performed by the server, so the server can use high-quality materials and perform complex 3D calculations, achieving rendering effects that cannot be achieved by a prior-art client.
Beneficial effect three:
in this embodiment, the user client only sends the touch data and decodes and displays the video stream sent by the server. The user client therefore does not need to download scripts or load programs, and can quickly adapt to the needs of different live broadcast interactions.
Fig. 3 shows a live broadcast interaction apparatus provided in an embodiment of the present application, applied to a server. The live broadcast interaction apparatus may include: a synthesizing module 301, a first sending module 302, a determining module 303, a response module 304, a rendering module 305, and a second sending module 306, wherein:
the synthesizing module 301 is configured to synthesize a live video stream sent by an anchor client and an interactive rendering stream into one video stream, to obtain a synthesized video stream; the interactive rendering stream is obtained by the server encoding an interactive interface at least comprising preset interactive elements;
a first sending module 302, configured to send the composite video stream to a user client;
a determining module 303, configured to determine, when touch data sent by any user client is received, an interactive operation indicated by the touch data; the touch data includes: actions and areas triggered by the user on the user client;
a response module 304, configured to respond to the interaction operation to obtain result data;
a rendering module 305, configured to render the result data to the interactive rendering stream, so as to obtain a target rendering stream;
a second sending module 306, configured to send, to the user client, the video stream obtained by synthesizing the target rendering stream and the live video stream received when the target rendering stream is obtained.
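The cooperation of modules 303 to 306 (corresponding to steps S206 to S209) can be sketched as one server-side handler: look up the interactive operation from the touch data, respond to it to obtain result data, render the result into the rendering stream, then composite with the current live frame. Everything below is illustrative; the responder, renderer, and compositor are toy stand-ins, not the patent's actual implementation.

```python
def handle_touch(touch, correspondence, respond, render, composite,
                 live_frame, rendering_frame):
    """Hypothetical sketch of server steps S206-S209 for one touch event."""
    # S206: determine the interactive operation from the (action, area) pair.
    operation = correspondence.get((touch["action"], touch["area"]))
    if operation is None:
        return None                              # no mapped operation; ignore
    result = respond(operation)                  # S207: obtain result data
    target = render(rendering_frame, result)     # S208: target rendering stream
    return composite(live_frame, target)         # S209: composite for sending

# Toy stand-ins for the real responder/renderer/compositor:
out = handle_touch(
    {"action": "tap", "area": "vote_panel"},
    {("tap", "vote_panel"): "cast_vote"},
    respond=lambda op: {"op": op, "ok": True},
    render=lambda frame, result: frame + [result],
    composite=lambda live, target: {"live": live, "overlay": target},
    live_frame="frame-42",
    rendering_frame=[],
)
print(out)
```

Note that the live frame used in the final step is whichever frame the server is receiving when the target rendering stream becomes available, matching the wording of the second sending module.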
Optionally, the determining module 303 is configured to determine the interactive operation indicated by the touch data, and includes:
the determining module 303 is specifically configured to determine, according to a preset correspondence among interactive operations, trigger actions, and trigger areas, the interactive operation corresponding to the action and the area triggered by the user on the user client.
Optionally, the apparatus may further include a judging module, configured to judge, before the live video stream and the interactive rendering stream are synthesized into one video stream to obtain the synthesized video stream, whether the preset trigger condition of the target interactive operation is met. Under the condition that the preset trigger condition of the target interactive operation is not met, the interactive interface at least comprising preset interactive elements is: an interactive interface comprising only the preset interactive elements.
Optionally, under the condition that a preset trigger condition of the target interactive operation is met, the interactive interface at least including the preset interactive element is: and the interactive interface comprises interactive elements corresponding to the target interactive operation and the preset interactive elements.
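The two cases above amount to conditionally extending the element set that is rendered into the interactive rendering stream. A minimal sketch, assuming illustrative element names (none of which appear in the patent):

```python
# Hypothetical preset interactive elements that are always present.
PRESET_ELEMENTS = ["like_button", "comment_box"]

def build_interface(target_elements, condition_met: bool):
    """Return the element list for the interactive interface.

    When the target operation's trigger condition is met, its elements are
    added alongside the preset ones; otherwise only the preset elements remain.
    """
    if condition_met:
        return PRESET_ELEMENTS + list(target_elements)
    return list(PRESET_ELEMENTS)

print(build_interface(["lottery_wheel"], condition_met=True))
print(build_interface(["lottery_wheel"], condition_met=False))
```

Because the interface is assembled server-side before encoding, toggling the condition changes what every viewer sees without any update on the user client.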
Fig. 4 shows a live broadcast interaction apparatus provided in an embodiment of the present application, applied to a user client, and including: a first decoding display module 401, a third sending module 402, and a second decoding display module 403, wherein:
the first decoding display module 401 is configured to, in the case of receiving a video stream synthesized from a live video stream and an interactive rendering stream, decode and display the video stream.
A third sending module 402, configured to send, to the server, trigger data composed of at least a trigger action indicated by the trigger instruction and the trigger area, under the condition that a trigger instruction representing that the user triggers the interactive operation is received;
a second decoding display module 403, configured to decode and display a target rendering stream in the case of receiving the target rendering stream sent by the server; the target rendering stream is obtained by the server rendering result data to the interactive rendering stream; the result data is obtained by the server responding to the interactive operation indicated by the trigger data.
The functions described in the method of the embodiment of the present application, if implemented in the form of software functional units and sold or used as independent products, may be stored in a storage medium readable by a computing device. Based on such understanding, part of the contribution to the prior art of the embodiments of the present application or part of the technical solution may be embodied in the form of a software product stored in a storage medium and including several instructions for causing a computing device (which may be a personal computer, a server, a mobile computing device or a network device) to execute all or part of the steps of the method described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts among the embodiments are referred to each other.
In the above description of the disclosed embodiments, features described in various embodiments in this specification can be substituted for or combined with each other to enable those skilled in the art to make or use the present application.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (10)
1. A live broadcast interaction method is applied to a server side and comprises the following steps:
synthesizing a live video stream sent by an anchor client and an interactive rendering stream into one video stream to obtain a synthesized video stream; the interactive rendering stream is obtained by the server encoding an interactive interface at least comprising preset interactive elements;
sending the composite video stream to a user client;
under the condition of receiving touch data sent by any user client, determining interactive operation indicated by the touch data; the touch data includes: actions and areas triggered by the user on the user client;
responding to the interactive operation to obtain result data;
rendering the result data to the interactive rendering stream to obtain a target rendering stream;
and sending, to the user client, the video stream obtained by synthesizing the target rendering stream and the live video stream received when the target rendering stream is obtained.
2. The method of claim 1, wherein the determining the interaction indicated by the touch data comprises:
determining, according to a preset corresponding relation among interactive operations, trigger actions, and trigger areas, the interactive operation corresponding to the action and the area triggered by the user on the user client.
3. The method of claim 1, wherein before the synthesizing the live video stream and the interactive rendering stream into a single video stream to obtain a synthesized video stream, the method further comprises:
judging whether a preset trigger condition of the target interactive operation is met or not;
under the condition that the preset trigger condition of the target interactive operation is not met, the interactive interface at least comprising preset interactive elements is as follows: and the interactive interface only comprises preset interactive elements.
4. The method according to claim 3, wherein in case that a preset trigger condition of the target interactive operation is met, the interactive interface at least comprising preset interactive elements is: and the interactive interface comprises interactive elements corresponding to the target interactive operation and the preset interactive elements.
5. A live broadcast interaction method is applied to a user client and comprises the following steps:
in the case of receiving a video stream synthesized from a live video stream and an interactive rendering stream, decoding and displaying the video stream;
under the condition that a trigger instruction representing the trigger interactive operation of a user is received, sending trigger data at least consisting of a trigger action indicated by the trigger instruction and a trigger area to the server;
under the condition of receiving a target rendering stream sent by a server, decoding and displaying the target rendering stream; the target rendering stream is obtained by rendering result data to the interactive rendering stream by the server; the result data is obtained by the server side through responding to the interactive operation indicated by the trigger data.
6. A live broadcast interaction device is applied to a server side and comprises:
the system comprises a synthesis module, a video processing module and a video processing module, wherein the synthesis module is used for synthesizing a live video stream and an interactive rendering stream sent by a main broadcasting client into a video stream to obtain a synthesized video stream; the interactive rendering stream is obtained by encoding an interactive interface at least comprising preset interactive elements by the server;
the first sending module is used for sending the composite video stream to a user client;
the determining module is used for determining the interactive operation indicated by the touch data under the condition of receiving the touch data sent by any user client; the touch data includes: actions and areas triggered by the user on the user client;
the response module is used for responding to the interactive operation to obtain result data;
the rendering module is used for rendering the result data to the interactive rendering stream to obtain a target rendering stream;
and the second sending module is used for sending, to the user client, the video stream obtained by synthesizing the target rendering stream and the live video stream received when the target rendering stream is obtained.
7. The apparatus of claim 6, wherein the determining module is configured to determine the interaction indicated by the touch data, and comprises:
the determining module is specifically configured to determine, according to a preset correspondence among interactive operations, trigger actions, and trigger areas, the interactive operation corresponding to the action and the area triggered by the user on the user client.
8. The apparatus of claim 6, further comprising:
the judging module is used for judging whether a preset triggering condition of target interactive operation is met or not before the synthesizing module synthesizes the live video stream and the interactive rendering stream into a video stream to obtain a synthesized video stream;
under the condition that the preset trigger condition of the target interactive operation is not met, the interactive interface at least comprising preset interactive elements is as follows: and the interactive interface only comprises preset interactive elements.
9. The apparatus according to claim 8, wherein in case that a preset trigger condition of the target interactive operation is met, the interactive interface at least including the preset interactive elements is: and the interactive interface comprises interactive elements corresponding to the target interactive operation and the preset interactive elements.
10. A live broadcast interaction device is applied to a user client and comprises the following components:
the first decoding and displaying module is used for decoding and displaying a video stream under the condition of receiving the video stream synthesized by a live video stream and an interactive rendering stream;
the third sending module is used for sending triggering data which is at least composed of a triggering action indicated by the triggering instruction and a triggering area to the server side under the condition that the triggering instruction for representing the triggering interactive operation of the user is received;
the second decoding display module is used for decoding and displaying the target rendering stream under the condition of receiving the target rendering stream sent by the server; the target rendering stream is obtained by rendering result data to the interactive rendering stream by the server; the result data is obtained by the server through responding to the interactive operation indicated by the trigger data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011410640.1A CN112543350A (en) | 2020-12-03 | 2020-12-03 | Live broadcast interaction method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112543350A true CN112543350A (en) | 2021-03-23 |
Family
ID=75015996
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011410640.1A Pending CN112543350A (en) | 2020-12-03 | 2020-12-03 | Live broadcast interaction method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112543350A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101101219A (en) * | 2006-07-06 | 2008-01-09 | 株式会社查纳位资讯情报 | Vehicle-mounted displaying device and displaying method employed for the same |
CN101488333A (en) * | 2009-01-22 | 2009-07-22 | 中兴通讯股份有限公司 | Image display device and display outputting method thereof |
US10021458B1 (en) * | 2015-06-26 | 2018-07-10 | Amazon Technologies, Inc. | Electronic commerce functionality in video overlays |
CN108810599A (en) * | 2017-04-27 | 2018-11-13 | 腾讯科技(上海)有限公司 | Net cast method, apparatus and computer equipment |
CN111698547A (en) * | 2019-03-11 | 2020-09-22 | 腾讯科技(深圳)有限公司 | Video interaction method and device, storage medium and computer equipment |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113596561A (en) * | 2021-07-29 | 2021-11-02 | 北京达佳互联信息技术有限公司 | Video stream playing method and device, electronic equipment and computer readable storage medium |
CN113596561B (en) * | 2021-07-29 | 2023-06-27 | 北京达佳互联信息技术有限公司 | Video stream playing method, device, electronic equipment and computer readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20210323 |