CN114374856B - Interaction method and device based on live broadcast - Google Patents

Interaction method and device based on live broadcast

Info

Publication number: CN114374856B
Authority: CN (China)
Prior art keywords: live broadcast, interaction, live, picture, task
Legal status: Active
Application number: CN202210078799.0A
Other languages: Chinese (zh)
Other versions: CN114374856A (en)
Inventor: 董佳音
Current assignee: Beijing Youku Technology Co Ltd
Original assignee: Beijing Youku Technology Co Ltd
Application filed by Beijing Youku Technology Co Ltd
Priority to CN202210078799.0A
Publication of CN114374856A
Application granted
Publication of CN114374856B

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/85Providing additional services to players
    • A63F13/86Watching games played by other players
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/85Providing additional services to players
    • A63F13/87Communicating with other players during game play, e.g. by e-mail or chat
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/08Speech classification or search
    • G10L15/18Speech classification or search using natural language modelling
    • G10L15/1822Parsing for meaning understanding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/23418Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43076Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The embodiments of the present application provide a live-broadcast-based interaction method and device. While a server pushes the live broadcast picture of a live broadcast end to a receiving end, a first user at the receiving end can select a live broadcast element in the live broadcast picture displayed at the receiving end and configure interaction task configuration information associated with that element. The server can create an interaction task for the live broadcast element using the interaction task configuration information and send it to the live broadcast end, and the live broadcast end can correspondingly display the task information of the interaction task in the pushed live broadcast picture.

Description

Interaction method and device based on live broadcast
Technical Field
The embodiment of the application relates to the technical field of live broadcasting, in particular to an interaction method and device based on live broadcasting.
Background
With the rapid development of computer and internet technologies, watching various live broadcasts on multimedia live broadcast platforms has become part of people's daily lives.
In the existing live interaction process, interaction between the audience and the anchor is usually realized through a comment area provided by the live broadcast platform: viewers can leave messages for the anchor in the comment area while watching the live broadcast, thereby interacting with the anchor.
However, this kind of live interaction offers only a single interaction mode and a poor interaction effect.
Disclosure of Invention
The embodiment of the application provides a live-broadcast-based interaction method and device, which are used for providing more interaction modes for a live broadcast terminal and a receiving terminal and increasing interaction diversity of live broadcast.
In a first aspect, an embodiment of the present application provides a live-based interaction method, including:
Acquiring a live broadcast picture sent by a live broadcast end, and pushing the live broadcast picture to a receiving end; determining, according to the live broadcast picture, a live broadcast element selected by a first user of the receiving end and interaction task configuration information associated with the live broadcast element; and creating an interaction task of the live broadcast element according to the interaction task configuration information, and sending the interaction task of the live broadcast element to the live broadcast end.
In this way, while the server pushes the live broadcast picture of the live broadcast end to the receiving end, the first user of the receiving end can select a live broadcast element in the live broadcast picture displayed at the receiving end and configure interaction task configuration information associated with that element; the server then creates an interaction task for the live broadcast element using the configuration information and sends it to the live broadcast end, and the live broadcast end correspondingly displays the task information of the interaction task in the pushed live broadcast picture.
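As an illustration only, the server-side portion of steps two and three might be sketched as follows (all names below, such as InteractionTask and create_and_send_task, are assumptions made for the example and are not part of the claims; the picture-pushing step is sketched separately later):

    from dataclasses import dataclass
    from typing import Callable, Tuple

    @dataclass
    class InteractionTask:
        initiator_id: str   # first user (viewer) who configured the task
        receiver_id: str    # second user (anchor) expected to complete it
        element: str        # selected live broadcast element, e.g. "billiard ball No. 8"
        target_state: str   # element state that marks the task as completed
        result: str         # interaction result delivered on completion

    def create_and_send_task(viewer_id: str, anchor_id: str, element: str,
                             config: Tuple[str, str],
                             send_to_anchor: Callable[[InteractionTask], None]) -> InteractionTask:
        # Build the task from the selected element and its interaction task
        # configuration information, then send it to the live broadcast end.
        target_state, result = config
        task = InteractionTask(viewer_id, anchor_id, element, target_state, result)
        send_to_anchor(task)
        return task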
Optionally, determining, according to the live broadcast picture, the live broadcast element selected by the first user of the receiving end and the interaction task configuration information associated with the live broadcast element includes:
Acquiring the live broadcast element selected by the first user of the receiving end in the live broadcast picture; determining an interaction task configuration item corresponding to the live broadcast element in a preset interaction task configuration library; and acquiring the interaction task configuration information selected by the first user of the receiving end from the interaction task configuration item.
Thus, different interaction task configuration items are configured for different live broadcast elements in advance and collected into an interaction task configuration library. When the server determines the live broadcast element selected by the first user, it can directly look up the interaction task configuration item corresponding to that element in the library, so that the first user can pick the desired interaction task configuration information from the configuration item. In this way, the generated interaction task offers more choices of interaction mode and meets the first user's personalized requirements for the interaction task.
Optionally, acquiring a live broadcast element selected by the first user of the receiving end on a live broadcast picture includes:
Acquiring a picture area selected by a first user of a receiving end in a live broadcast picture; and carrying out image recognition processing on the picture area to obtain the live broadcast elements included in the picture area.
Thus, the server determines the live broadcast element selected by the first user by acquiring the picture area circled by the first user in the live broadcast picture and performing image recognition on it, so this interaction mode is simple and convenient for the first user.
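A minimal sketch of this recognition step, assuming a numpy-style frame array and a generic detector; detect_objects is a placeholder for whatever target recognition or target detection model is actually used:

    from typing import List, Tuple

    BBox = Tuple[int, int, int, int]  # (left, top, right, bottom), in frame pixels

    def detect_objects(crop) -> List[Tuple[str, float]]:
        # Placeholder for an arbitrary target recognition / target detection model;
        # it should return (label, confidence) pairs for the cropped image.
        raise NotImplementedError("plug in the actual recognition model here")

    def element_in_region(frame, region: BBox) -> str:
        # frame is assumed to be an H x W x C image array (e.g. a numpy array).
        left, top, right, bottom = region
        crop = frame[top:bottom, left:right]            # the area circled by the viewer
        candidates = detect_objects(crop)               # recognize objects inside the area
        label, _ = max(candidates, key=lambda c: c[1])  # keep the most confident object
        return label                                    # e.g. "billiard ball No. 1"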
Optionally, acquiring a live broadcast element selected by the first user of the receiving end on a live broadcast picture includes:
acquiring voice information sent by a first user of a receiving end; carrying out semantic recognition on the voice information to obtain a semantic recognition result; and determining the live broadcast element corresponding to the semantic recognition result in the live broadcast picture.
Thus, the server determines the live broadcast element selected by the first user by acquiring the voice information input by the first user and performing semantic recognition on it, so this interaction mode is simple and convenient for the first user.
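A simplified sketch of this path, in which the transcribed text is matched against the names of elements currently visible in the picture; transcribe stands in for the speech and semantic recognition step and is an assumption of the example:

    from typing import Iterable, Optional

    def transcribe(audio: bytes) -> str:
        # Placeholder for the speech recognition / semantic parsing step.
        raise NotImplementedError("plug in the actual speech recognizer here")

    def element_from_voice(audio: bytes, visible_elements: Iterable[str]) -> Optional[str]:
        text = transcribe(audio).lower()       # e.g. "put billiard ball no. 1 in the pocket"
        for name in visible_elements:          # names of elements in the current picture
            if name.lower() in text:
                return name                    # first element mentioned in the utterance
        return None                            # nothing recognizable was mentioned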
Optionally, the interactive task configuration item includes: a target configuration item and a result configuration item;
Acquiring the interaction task configuration information selected by the first user of the receiving end from the interaction task configuration item includes: acquiring target configuration information and result configuration information selected by the first user of the receiving end from the interaction task configuration item; the target configuration information describes the target state of the live broadcast element when the interaction task is completed, and the result configuration information describes the interaction result delivered when the interaction task is completed.
Thus, the server also obtains the target configuration information and the result configuration information determined by the first user for the selected live broadcast element, which makes it easy for the server to create the corresponding interaction task for that element and improves the playability of the interaction task.
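One possible way to represent the two kinds of configuration information (the field names and example values are illustrative, not taken from the patent):

    from dataclasses import dataclass

    @dataclass
    class TargetConfig:
        element: str       # live broadcast element the task is attached to
        target_state: str  # state of the element when the task is completed

    @dataclass
    class ResultConfig:
        kind: str          # e.g. "points_gift", "expression_package", "text_picture"
        payload: str       # gift identifier, sticker identifier, or text to display

    # Example selection made by the first user for the element "billiard ball No. 8":
    selection = (TargetConfig("billiard ball No. 8", "target ball pocketed"),
                 ResultConfig("points_gift", "gift_123"))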
Optionally, the interaction method further includes: when it is determined that the live broadcast end has completed the interaction task, sending the interaction result of the interaction task to the live broadcast end.
Thus, after sending the interaction task to the live broadcast end, the server monitors whether the live broadcast end has completed it and, once the live broadcast end completes the interaction task, sends the interaction result to the live broadcast end, realizing the task interaction.
Optionally, the interaction method further includes: determining the current state of the live broadcast element in the live broadcast picture sent by the live broadcast end; and judging whether the current state of the live broadcast element meets the target state described by the interaction task.
Thus, depending on the interaction task configuration information used when each interaction task was created, every interaction task includes a target state describing the state the live broadcast element must reach for the task to be completed; based on this target state, the server monitors the current state of the live broadcast element in the live broadcast picture to determine whether the live broadcast end has completed the interaction task.
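A minimal polling sketch of this completion check; get_element_state stands in for the server's analysis of the live broadcast picture and deliver_result for pushing the interaction result:

    from typing import Callable, Iterable

    def watch_task(element: str, target_state: str, frames: Iterable,
                   get_element_state: Callable, deliver_result: Callable) -> bool:
        # get_element_state(frame, element) stands in for the picture analysis that
        # extracts the element's current state from each pushed live broadcast picture.
        for frame in frames:
            if get_element_state(frame, element) == target_state:
                deliver_result()  # send the configured interaction result to the live end
                return True       # the target state has been reached: task completed
        return False              # the stream ended before the target state was met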
In a second aspect, an embodiment of the present application provides a live-based interaction method, including:
Acquiring and displaying a live broadcast picture of a live broadcast terminal sent by a server; responding to the interactive operation triggered by the first user on the live broadcast picture, and sending an operation result of the interactive operation to a server, wherein the operation result of the interactive operation is used for representing live broadcast elements selected by the first user in the live broadcast picture and interactive task configuration information associated with the live broadcast elements; the live broadcast element and the interaction task configuration information are used for generating interaction tasks of the live broadcast element.
Thus, when the first user at the receiving end views the live broadcast picture of the live broadcast end pushed by the server, the first user can, through interactive operations on the live broadcast picture, select a live broadcast element and configure interaction task configuration information associated with it. The server can use this configuration information to create an interaction task for the live broadcast element and send it to the live broadcast end, and the live broadcast end correspondingly displays the task information of the interaction task in the pushed live broadcast picture. In this way, interaction between the receiving end and the live broadcast end is tied to the live broadcast elements of the game, which enriches the interaction modes available during live broadcast and provides a better interaction effect.
Optionally, the interaction operation includes a circle selection operation for determining the live element;
Sending the operation result of the interactive operation to the server includes: determining the picture area of the live broadcast picture circled by the circle selection operation; and sending the picture area to the server, where the picture area is used to determine the live broadcast element selected by the first user in the live broadcast picture.
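A receiving-end sketch of this upload; the endpoint path and payload keys are assumptions for illustration only:

    import json
    import urllib.request

    def send_circled_region(server_url: str, room_id: str,
                            left: int, top: int, right: int, bottom: int) -> None:
        # The endpoint path "/interaction/region" and the payload keys are assumptions.
        payload = {
            "room_id": room_id,                    # which live room the selection belongs to
            "region": [left, top, right, bottom],  # picture area circled by the first user
        }
        req = urllib.request.Request(
            url=server_url + "/interaction/region",
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        urllib.request.urlopen(req)  # the server runs image recognition on this area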
Optionally, the interactive operation includes a voice input operation for determining a live element;
Sending the operation result of the interactive operation to the server includes: determining the voice information input by the voice input operation; and sending the voice information to the server, where the voice information is used to determine the live broadcast element selected by the first user in the live broadcast picture.
Optionally, the interactive operation includes a selection operation for determining interactive task configuration information associated with the live element;
Sending the operation result of the interactive operation to the server includes: determining, according to the selection operation, the target configuration information and the result configuration information selected by the first user from the interaction task configuration item, where the interaction task configuration item is determined by the server from a preset interaction task configuration library according to the live broadcast element selected by the first user; and sending the target configuration information and the result configuration information to the server.
Thus, the first user can perform various interactive operations directly on the interface of the receiving end to quickly select a live broadcast element and confirm its interaction task configuration information, which facilitates creation of the interaction task and improves the user experience.
In a third aspect, an embodiment of the present application provides a live-based interaction method, including:
Sending a live broadcast picture to a server, where the live broadcast picture is used for being pushed to a receiving end; receiving an interaction task of a live broadcast element sent by the server, and displaying task information of the interaction task at the live broadcast element in the live broadcast picture; where the interaction task is created by the server according to the live broadcast element selected in the live broadcast picture by a first user at the receiving end and interaction task configuration information associated with the live broadcast element.
Thus, while broadcasting, the second user at the live broadcast end also receives interaction tasks issued by the first user through the server; because this interaction mode is tied to the live broadcast elements of the game, it enriches the interaction modes available during live broadcast and provides a better interaction effect.
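A live-end sketch of displaying the task information at the element; the shape of the incoming task message and the draw_overlay rendering hook are assumptions:

    from typing import Callable, Dict, Tuple

    Box = Tuple[int, int, int, int]  # bounding box of an element in the live picture

    def show_task(task: Dict[str, str], element_boxes: Dict[str, Box],
                  draw_overlay: Callable[[Box, str], None]) -> None:
        # element_boxes maps element names to their positions in the current picture,
        # as located by the live end or delivered together with the task.
        box = element_boxes.get(task["task_object"])
        if box is None:
            return  # the element is not visible in the current picture
        text = f'{task["task_target"]} -> {task["task_result"]}'
        draw_overlay(box, text)  # render the task information next to the element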
Optionally, the interaction method further includes: receiving the interaction result of the interaction task sent by the server, and displaying the interaction result on the live broadcast picture; where the interaction result is sent when the server determines that the current state of the live broadcast element in the live broadcast picture meets the target state described by the interaction task.
Thus, the second user can complete the corresponding interaction task according to the task information displayed at the live broadcast element on the current live interface of the live broadcast end and thereby obtain the interaction result, which makes the interaction more interesting and more effective.
In a fourth aspect, an embodiment of the present application provides a live-based interaction device, including:
the stream pushing module is used for acquiring a live broadcast picture sent by a live broadcast terminal and pushing the live broadcast picture to a receiving terminal;
the task processing module is used for determining, according to the live broadcast picture, a live broadcast element selected by a first user at the receiving end and interaction task configuration information associated with the live broadcast element, and creating an interaction task of the live broadcast element according to the interaction task configuration information;
and the stream pushing module is further used for sending the interaction task of the live broadcast element to the live broadcast terminal.
In a fifth aspect, an embodiment of the present application provides a live-based interaction device, including:
the receiving module is used for acquiring and displaying a live broadcast picture of the live broadcast terminal sent by the server;
The interactive module is used for responding to the interactive operation triggered by the first user on the live broadcast picture, sending an operation result of the interactive operation to the server, wherein the operation result of the interactive operation is used for representing live broadcast elements selected by the first user in the live broadcast picture and interactive task configuration information associated with the live broadcast elements; the live broadcast element and the interaction task configuration information are used for generating interaction tasks of the live broadcast element.
In a sixth aspect, an embodiment of the present application provides a live-based interaction device, including:
the sending module is used for sending a live broadcast picture to the server, where the live broadcast picture is used for being pushed to the receiving end;
The task execution module is used for receiving the interactive task of the live broadcast element sent by the server and displaying task information of the interactive task at the live broadcast element in the live broadcast picture; the interaction task is created by the server according to the live broadcast element selected by the first user at the receiving end in the live broadcast picture and interaction task configuration information associated with the live broadcast element.
In a seventh aspect, an embodiment of the present application provides an electronic device, including: at least one processor; and a memory;
the memory stores computer-executable instructions;
the at least one processor executes the computer-executable instructions stored in the memory, causing the at least one processor to perform the method of any one of the first, second, or third aspects.
In an eighth aspect, an embodiment of the present application provides a computer-readable storage medium having stored therein computer-executable instructions which, when executed by a processor, implement a method according to any of the first or second or third aspects.
In a ninth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when executed by a processor, implements a method according to any of the first or second or third aspects.
The embodiments of the present application provide a live-broadcast-based interaction method and device. While a server pushes the live broadcast picture of a live broadcast end to a receiving end, a first user at the receiving end can select a live broadcast element in the live broadcast picture displayed at the receiving end and configure interaction task configuration information associated with that element. The server can create an interaction task for the live broadcast element using the interaction task configuration information and send it to the live broadcast end, and the live broadcast end can correspondingly display the task information of the interaction task in the pushed live broadcast picture, thereby enriching the interaction modes between the live broadcast end and the receiving end and improving the interaction effect.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a schematic diagram of a network architecture on which the present application is based;
Fig. 2 is an interaction schematic diagram of an interaction method based on live broadcast provided by the application;
Fig. 3 is a schematic flow chart of an interaction method based on live broadcast according to an embodiment of the present application;
fig. 4 is a flow chart of another interaction method based on live broadcast according to an embodiment of the present application;
fig. 5 is a first schematic diagram of a live-based interaction method according to an embodiment of the present application;
fig. 6 is a second schematic diagram of a live-based interaction method according to an embodiment of the present application;
Fig. 7 is a third schematic diagram of a live-based interaction method according to an embodiment of the present application;
Fig. 8 is a schematic flow chart of another interaction method based on live broadcast according to an embodiment of the present application;
fig. 9 is a fourth schematic diagram of a live-based interaction method according to an embodiment of the present application;
fig. 10 is a block diagram of an interactive device based on live broadcast according to an embodiment of the present application;
Fig. 11 is a block diagram of another interactive device based on live broadcast according to an embodiment of the present application;
fig. 12 is a block diagram of still another interactive device based on live broadcast according to an embodiment of the present application;
fig. 13 is a schematic diagram of a hardware structure of an electronic device according to the present application.
Specific embodiments of the present disclosure have been shown by way of the above drawings and will be described in more detail below. These drawings and the written description are not intended to limit the scope of the disclosed concepts in any way, but rather to illustrate the disclosed concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
In the technical solution of the present application, the collection, storage, use, processing, transmission, provision, and disclosure of all kinds of information comply with the relevant laws and regulations and do not violate public order and good customs.
With the rapid development of computer and internet technologies, watching various live broadcasts through multimedia live broadcast platforms has become an indispensable part of people's lives.
In existing live broadcasting, an anchor needs to push the current live broadcast picture to viewers through a live broadcast server so that the viewers can watch the anchor's current live broadcast picture.
To make live broadcasts more interesting, live broadcast platforms usually provide a comment area, and viewers can leave messages in the comment area while watching the anchor's live broadcast, thereby interacting with the anchor.
However, in this live interaction manner, the anchor has to pay attention to the messages in the comment area in real time, which interferes with the live broadcast itself; a game anchor, for example, often needs to keep playing and can hardly follow the comments in real time, so viewers' messages are easily ignored or missed, which dampens the viewers' enthusiasm for interaction. Moreover, the interaction mode is single and the interaction effect is poor, which harms the interaction experience of both the viewers and the anchor.
To address these problems, the present application provides a live-broadcast-based interaction method and device, which provide more interaction modes for both live broadcast parties and increase the interaction diversity of game live broadcast.
Referring to fig. 1, fig. 1 is a schematic diagram of a network architecture according to the present application, where the network architecture shown in fig. 1 may specifically include a server, a receiving end, and a live end.
The server may specifically be a server cluster disposed at a cloud, where the server may be configured to provide a live broadcast service related to a live broadcast platform for a terminal including a receiving end and a live broadcast end.
The receiving end and the live broadcast end can be hardware terminals with network communication, information display, and operation functions, including but not limited to smart phones, tablet computers, desktop computers, internet-of-things devices, and the like. Both the receiving end and the live broadcast end can run the live broadcast platform provided by the server; an anchor (i.e., the second user in the present application) using the live broadcast end can start a live broadcast through the live broadcast platform and share the live broadcast picture with viewers (i.e., first users in the present application) using receiving ends.
In addition, the receiving end and the live broadcast end can communicate and interact with the server, so that the server can realize the processes of pushing live broadcast pictures, creating and sending interaction tasks, finishing judgment and the like.
Specifically, fig. 2 is an interaction schematic diagram of the live-broadcast-based interaction method provided by the present application. With reference to fig. 1 and fig. 2, when the live broadcast starts, the live broadcast end pushes the current picture of the second user (i.e., the anchor) to the receiving end through the server. The first user at the receiving end (i.e., a viewer, or a player playing together with the anchor) then sees the live broadcast picture synchronously displayed on the display interface of the receiving end.
The first user can then perform interactive operations directly on the display interface of the receiving end to select a live broadcast element from the live broadcast picture displayed on the display interface and configure corresponding interaction task configuration information for the selected live broadcast element. After acquiring the live broadcast element and the interaction task configuration information, the server creates a corresponding interaction task and sends it to the live broadcast end.
At this time, the live broadcast end displays relevant information of the interaction task at the live broadcast element in the live broadcast picture so as to prompt the second user to complete the interaction task and realize interaction with the first user.
Of course, as the second user's live broadcast continues, the second user can change the state of the live broadcast element through various operations. The specific content of these operations depends on the live broadcast type; for example, when the live broadcast type is game live broadcast, the operation may be a game operation, and when the live broadcast type is sports live broadcast, the operation may be the use of certain exercise equipment.
By monitoring the state of the live broadcast element, the server can determine in real time whether the second user at the live broadcast end has completed the interaction task. When the interaction task is completed, the server pushes the interaction result triggered by the interaction task to the live broadcast end, and the interaction result is displayed in the live broadcast picture of the live broadcast end. The whole interaction is then complete.
It should be noted that, in the embodiment of the present application, the live broadcast picture of the live broadcast end is continuously pushed to the receiving end, that is, when the live broadcast picture of the live broadcast end displays the interactive task or the interactive result, the live broadcast picture displayed by the receiving end synchronously displays the interactive task or the interactive result.
Through the interaction mode, more interaction possibilities are provided for the first user and the second user, and diversified interaction requirements of interaction users are met.
In an exemplary embodiment, in a sports live broadcast scene, the second user at the live broadcast end pushes a picture of the second user exercising to the receiving end through the live broadcast platform for the first user to watch. While watching, the first user selects equipment in the live broadcast scene, such as a dumbbell or a skipping rope, as the live broadcast element, and configures corresponding interaction task configuration information for the selected element according to the prompts of the server, so as to send the second user at the live broadcast end different interaction tasks, such as exercising with the dumbbell or exercising with the skipping rope, thereby interacting with the second user.
In another exemplary embodiment, in a food live broadcast scene, the second user at the live broadcast end pushes a picture of the second user eating to the receiving end through the live broadcast platform for the first user to watch. While watching, the first user selects items or food in the live broadcast scene, such as chopsticks or a particular dish, as the live broadcast element, and configures corresponding interaction task configuration information for the selected element according to the prompts of the server, so as to send the second user at the live broadcast end different interaction tasks, such as eating with the chopsticks or eating a particular food, thereby interacting with the second user.
That is, the live-broadcast-based interaction method provided by the present application can be applied to live interaction in different types of live broadcast scenes, including but not limited to game, sports, food, and e-commerce live broadcasts. When the method is used in any type of live broadcast scene, the first user can select a live broadcast element of that scene in the live broadcast picture and configure corresponding interaction task configuration information for it, so that the server creates an interaction task and interaction between the two parties is realized.
The live-broadcast-based interaction method and device provided by the application are respectively described in detail through specific embodiments from different angles. The following embodiments may be combined with each other and may not be described in detail in some embodiments for the same or similar concepts or processes.
It should be noted that, the execution body of the live-broadcast-based interaction method provided in this embodiment is the aforementioned server, and fig. 3 is a schematic flow chart of the live-broadcast-based interaction method provided in this embodiment. As shown in fig. 3, the live-based interaction method may include the following steps:
Step 301, acquiring a live broadcast picture sent by a live broadcast terminal, and pushing the live broadcast picture to a receiving terminal.
Specifically, the second user at the live end may send a live request to the server, and establish a communication link with the server after the server agrees to the request. Through the communication link, the server can continuously receive the data stream from the live broadcast end, wherein the data stream comprises the current live broadcast picture of the live broadcast end.
In addition, the server receives a viewing request from the receiving end for viewing the live broadcast of the live broadcast end, and establishes a communication link with the receiving end after agreeing to the request. Through the communication link, the server can push the received data stream of the live broadcast end to the receiving end so as to be watched by a first user of the receiving end.
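A minimal relay sketch of this push process, assuming one push callback per established viewer link (the function and parameter names are illustrative):

    from typing import Callable, Dict, Iterable

    def relay_live_stream(frames: Iterable[bytes],
                          viewers: Dict[str, Callable[[bytes], None]]) -> None:
        # frames: data stream arriving over the anchor's communication link;
        # viewers: one push callback per receiving end with an established link.
        for frame in frames:
            for push in viewers.values():
                push(frame)  # push the current live broadcast picture to each receiver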
Step 302, determining a live broadcast element selected by a first user of the receiving end according to the live broadcast picture and interaction task configuration information associated with the live broadcast element.
In this step, the server can, through interaction with the receiving end, acquire the live broadcast element selected by the first user of the receiving end in the live broadcast picture; determine the interaction task configuration item corresponding to the live broadcast element in a preset interaction task configuration library; and acquire the interaction task configuration information selected by the first user of the receiving end from the interaction task configuration item.
Live broadcast elements are the components of the live broadcast scene corresponding to the live broadcast picture; in general, the available live broadcast elements differ from one live broadcast scene to another.
In an exemplary game live broadcast scene, the live broadcast element may be a game element, i.e., a component of the game shown in the live game picture; the selectable game elements differ according to the type of game shown in the picture.
For example, when the game live broadcast picture shows a billiards game, the game elements available for the first user to select include the "billiard balls", the "cue", the "pockets", and the like; when it shows a checkers game, the selectable game elements include the "checker pieces", etc.; and when it shows a card game or a battle game, the selectable game elements include cards, avatars, a score panel, and the like.
In other live scenes, the live elements will vary from live scene type to live scene type.
For example, in a live scene of sports live, the live elements may be equipment elements, and different equipment elements may refer to different sports equipment, etc. In addition, in a live scene of the food live broadcast, the live broadcast element can be a food material element, and different food material elements can be used for representing different types of foods.
The server may determine the live broadcast element in any of several ways:
For example, the server may obtain a picture area selected by the first user of the receiving end in the live broadcast picture; and carrying out image recognition processing on the picture area to obtain the live broadcast elements included in the picture area.
Specifically, the first user performs a region selection operation in the live broadcast picture displayed by the receiving end. The server receives the image corresponding to the picture area selected by the first user and uploaded by the receiving end, and performs image recognition on the picture area using a series of algorithms, including target recognition and target detection algorithms, so as to recognize the object the first user wants to select in the picture area, i.e., the live broadcast element.
The server may also obtain voice information sent by the first user at the receiving end; carrying out semantic recognition on the voice information to obtain a semantic recognition result; and determining the live broadcast element corresponding to the semantic recognition result in the live broadcast picture.
Specifically, the first user may click a voice input button to input voice information. After the input is completed, the server receives the voice information input by the first user, and determines the live broadcast element expressed by the first user in the voice information by utilizing a series of algorithms such as a semantic detection algorithm, a semantic segmentation algorithm, a semantic analysis algorithm and the like.
It should be noted that, for different live broadcast types, the manner of selecting the live broadcast element may also be different.
For example, in a game live broadcast scene, for 2D visual games such as a "billiards game", the first user may determine the game element interactively by circling or by voice. For somatosensory games such as VR games, the game element may instead be selected through gesture recognition or limb behavior recognition: the camera of the receiving end captures the gesture or limb behavior information of the first user, and the server recognizes this information to determine the game element selected by the first user.
Of course, after determining the live broadcast element, the server also determines the interaction task configuration information corresponding to it. Because each live broadcast element refers to a different object, the server configures, for different live broadcast elements, interaction task configuration items matched with the attributes of each element, so that the first user can configure personalized interaction tasks for the element. Inside the server, an interaction task configuration library is maintained, in which each live broadcast element is associated with a number of interaction task configuration items, such as a target configuration item and a result configuration item.
The target configuration item represents the range of completion conditions of the interaction tasks of the live broadcast element; the interaction task configuration library stores several pieces of target configuration information under the target configuration item, each corresponding to one completion condition of an interaction task, i.e., describing the target state of the live broadcast element when the interaction task is completed. The result configuration item represents the range of interaction results of the interaction tasks of the live broadcast element; several pieces of result configuration information are stored under the result configuration item, each describing the interaction result delivered when an interaction task is completed.
The target configuration item and the corresponding result configuration item of each live broadcast element are stored in the interaction task configuration library in a pre-associated manner. In application, the receiving end, in response to the interaction operation of the first user, sends the operation result to the server, and the server determines the target configuration information and the result configuration information corresponding to the element from the interaction task configuration library according to the operation result, for example by looking up the element ID.
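For illustration, such a library could be keyed by element ID, with contents following the billiards and dumbbell examples in this description (the IDs and entries are assumed):

    # Illustrative contents of the interaction task configuration library: each live
    # element ID maps to its target configuration items and result configuration items.
    INTERACTION_TASK_CONFIG_LIBRARY = {
        "billiard_no_8": {
            "target_items": ["target ball pocketed", "target ball hits a non-target ball"],
            "result_items": ["points gift", "expression package", "text and picture"],
        },
        "dumbbell_red": {
            "target_items": ["exercise using the dumbbell"],
            "result_items": ["points gift", "expression package"],
        },
    }

    def config_items_for(element_id: str) -> dict:
        # Lookup performed by the server once the selected element has been recognized.
        return INTERACTION_TASK_CONFIG_LIBRARY.get(
            element_id, {"target_items": [], "result_items": []})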
For example, for the game element "billiard ball" in a game live broadcast scene, the interaction tasks may include several task completion conditions such as "pocketing the ball" or "hitting another ball". Under the completion condition "pocketing the ball", the target configuration information is described as "target ball pocketed"; under the completion condition "hitting another ball", the target configuration information is described as "target ball hits a non-target ball".
The first user may input text or voice information such as "send a gift if it goes into the pocket", so that the server selects "target ball pocketed" as the target configuration information under the target configuration item of the game element "billiard ball".
Similarly, for the equipment element "dumbbell" in a sports live broadcast scene, the interaction tasks may include task completion conditions such as "the dumbbell moves". Under the completion condition "the dumbbell moves", the target configuration information is described as "exercise using the dumbbell".
The first user may input text or voice information such as "exercise with the dumbbell and I will send a gift", so that the server selects "exercise using the dumbbell" as the target configuration information for the element "dumbbell".
For example, for different live broadcast elements, the interaction tasks may include reward-type interaction results related to electronic certificate transactions, such as a "points gift", and graphic interaction results related to information delivery, such as an "expression package" or "text and picture". Under the reward interaction result "points gift", the result configuration information can be described as "additionally send a certain points gift"; under a graphic interaction result, the result configuration information can be described as "display a certain expression package".
The server can determine the interaction result selected by the first user according to the first user's interaction operation and obtain the corresponding result configuration information (e.g., "send a certain points gift").
That is, in this embodiment, the server further obtains the target configuration information and the result configuration information determined by the first user for the selected live broadcast element, so that the server creates an interaction task corresponding to the live broadcast element conveniently, and the playability of the interaction task is improved.
Step 303, creating an interaction task of the live broadcast element according to the interaction task configuration information, and sending the interaction task of the live broadcast element to the live broadcast end.
Specifically, using the interaction task configuration information, the server creates an interaction task. Optionally, the interaction task includes the initiating user of the task, the receiving user, a task object, a task target, and a task result. Illustratively, one interaction task includes the following information: "first user ID", "second user ID", "billiard ball No. 8", "target ball pocketed", "additionally send a points gift".
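As a hedged illustration, the example task above could be serialized with fields like these (the keys are assumed, not part of the patent):

    # Illustrative serialization of the example interaction task quoted above.
    example_task = {
        "initiator": "first user ID",        # viewer who configured the task
        "receiver": "second user ID",        # anchor expected to complete it
        "task_object": "billiard ball No. 8",
        "task_target": "target ball pocketed",
        "task_result": "additionally send a points gift",
    }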
After the interaction task is created, it is issued to the live broadcast end for the second user of the live broadcast end to view and execute.
In an optional embodiment, after issuing the interaction task, the server further determines whether the live broadcast end has completed the interaction task. Once the server determines that the live broadcast end has completed it, the interaction result of the interaction task is sent to the live broadcast end.
Specifically, the server can determine the current state of the live broadcast element in the live broadcast picture sent by the live broadcast end, and judge whether the current state of the live broadcast element meets the target state described by the interaction task.
For example, when the interaction task includes "first user ID", "second user ID", "billiard ball No. 8", "target ball pocketed", and "additionally send a points gift", the server can determine in real time the current position of "billiard ball No. 8" in the live broadcast picture provided by the live broadcast end; once it determines that "billiard ball No. 8" has been pocketed, it determines that the interaction task is completed. At this time, the server executes "additionally send a points gift", that is, it sends the "points gift" provided by the first user to the account of the second user at the live broadcast end, completing the electronic certificate transaction.
As another example, when the interaction task includes "first user ID", "second user ID", "red dumbbell", "the dumbbell moves", and "send an expression package", the server can determine in real time the current position of the "red dumbbell" in the live broadcast picture provided by the live broadcast end; once it determines that the position of the "red dumbbell" has moved, it determines that the interaction task is completed. At this time, the server executes "send an expression package", that is, it sends the "expression package" provided by the first user to the interface of the second user at the live broadcast end, completing one expression-package-based information interaction.
By the method, after the interaction task is sent to the live broadcast end, the server can monitor the completion condition of the interaction task by the live broadcast end and send the interaction result to the live broadcast end after the live broadcast end completes the interaction task, so that task interaction is realized.
On the basis of the foregoing embodiment, fig. 4 is a schematic flow chart of another live-broadcast-based interaction method according to an embodiment of the present application, where an execution body of the method is the aforementioned receiving end.
As shown in fig. 4, the live-based interaction method may include the following steps:
step 401, acquiring and displaying a live broadcast picture of a live broadcast terminal sent by a server.
Step 402, responding to the interactive operation triggered by the first user on the live broadcast picture, and sending an operation result of the interactive operation to a server, wherein the operation result of the interactive operation is used for representing live broadcast elements selected by the first user in the live broadcast picture and interactive task configuration information associated with the live broadcast elements; the live broadcast element and the interaction task configuration information are used for generating interaction tasks of the live broadcast element.
Corresponding to the foregoing embodiment, the receiving end may send a viewing request for viewing the live broadcast of the live broadcast end to the server. After receiving the viewing request, the server establishes a communication link with the receiving end. Through the communication link, the receiving end acquires the data stream from the live broadcast end, and a first user of the receiving end can watch a live broadcast picture of a second user of the live broadcast end during the game.
In this embodiment, when the first user views the live broadcast picture, the first user can interact with the second user at the live broadcast end in an interaction task manner, so that user experience is improved.
During interaction, the first user needs to select the live broadcast elements in the game live broadcast picture and the interaction task configuration information associated with the live broadcast elements through interaction operation so that the server can generate corresponding interaction tasks, and the first user can achieve the aim through multiple interaction operations.
The first user may select the live element in the live view through a plurality of types of interactions:
In an alternative embodiment, the interactive operation includes a circle selection operation for determining the live broadcast element: the receiving end determines the picture area of the live broadcast picture circled by the circle selection operation and sends the picture area to the server, where the picture area is used to determine the live broadcast element selected by the first user in the live broadcast picture.
Taking a game live broadcast scene as an example, fig. 5 is a first schematic diagram of the live-broadcast-based interaction method provided by an embodiment of the present application. Fig. 5 shows the current interface 501 of the receiving end, in which the received game live broadcast picture is displayed. The first user can circle part of the current interface 501, as shown in fig. 5, through touch operations on the receiving end such as tapping and sliding, obtaining the circled picture area 502.
The image of the picture area 502 is then sent to the server, so that the server performs processing including the image recognition described in the foregoing embodiment and obtains the live broadcast element corresponding to the picture area 502. Illustratively, as shown in fig. 5, the live broadcast element is "billiard ball No. 1".
In another alternative embodiment, the interactive operation includes a voice input operation for determining a live element; the receiving end can determine the voice information input by the voice input operation; and sending voice information to a server, wherein the voice information is used for determining the live broadcast element selected by the first user on the live broadcast picture.
Taking a game live broadcast scene as an example, fig. 6 is a second schematic diagram of the live-broadcast-based interaction method provided by an embodiment of the present application. Fig. 6 shows the current interface 601 of the receiving end, in which the received live broadcast picture is displayed. The interface 601 further includes a voice input component 602; the first user can tap the voice input component 602 so that the receiving end invokes its voice acquisition hardware to capture the first user's voice. A corresponding prompt then appears in the interface 601; as shown on the right side of fig. 6, a prompt 603 such as "capturing voice" is displayed.
The voice information is then sent to the server, so that the server performs processing including the semantic recognition described in the foregoing embodiment and obtains the live broadcast element corresponding to the voice information. For example, as shown in fig. 6, if the voice information contains "... billiard ball No. 1 ...", the live broadcast element is "billiard ball No. 1".
It should be noted that the way the live broadcast element is selected can also depend on how the live broadcast scene is presented. For scenes presented in a 2D visual manner, such as a "billiards game" or an "eating show", the first user may determine the live broadcast element interactively by circling, by voice, and so on. For scenes presented in a somatosensory manner, such as VR games and VR sports live broadcasts, the live broadcast element can be selected through gesture recognition or limb behavior recognition: the camera of the receiving end captures the first user's gesture or limb behavior information, and the server recognizes this information to determine the live broadcast element selected by the first user.
To improve the diversity of interaction tasks, after the first user selects a live broadcast element through an interactive operation, the server provides interaction task configuration items corresponding to the selected live broadcast element, so that the first user can further select interaction task configuration information.
Specifically, the interactive operation includes a selection operation for determining the interaction task configuration information associated with the live broadcast element. According to the selection operation, the receiving end determines the target configuration information and the result configuration information selected by the first user among the interaction task configuration items, where the interaction task configuration items are determined by the server from a preset interaction task configuration library according to the live broadcast element selected by the first user; the receiving end then sends the target configuration information and the result configuration information to the server.
Taking a live game scene as an example, fig. 7 is a third schematic diagram of the live-broadcast-based interaction method according to an embodiment of the present application. Fig. 7 shows a current interface 701 of the receiving end, which displays not only the live game picture but also the interaction task configuration items associated with the live broadcast element, such as a target configuration item 702 and a result configuration item 703.
As in the previous embodiments, for the live broadcast element "billiard ball No. 1", the interaction task may include a plurality of task completion conditions, such as "pocketing the ball" or "hitting other balls". As shown in fig. 7, the first user may enter text information such as "enter the hole and send" in the target configuration item 702, so that the server obtains "target ball pocketed" as the target configuration information for the live broadcast element "billiard ball No. 1" under the target configuration item.
As described in the foregoing embodiments, for the live broadcast element "billiard ball No. 1", the interaction task gameplay may include "score gift" and other reward interaction results related to electronic certificate transactions. As shown in fig. 7, the first user may select, from the plurality of "score gift" options provided by the result configuration item 703, the reward interaction result of the electronic certificate transaction he wishes to perform as the result configuration information.
Of course, it should be noted that the result configuration item here only takes the interaction result of an electronic certificate transaction as an example; in other implementations, the result configuration item may also provide selectable interaction results such as "emoji stickers" or "text and pictures".
After the first user completes the configuration of the target configuration item 702 and the result configuration item 703, the first user clicks a confirmation button such as "give", so that the receiving end uploads the operation results of the target configuration item 702 and the result configuration item 703 configured by the first user to the server. Based on the uploaded operation results, the server takes "target ball pocketed" as the target configuration information, takes the electronic certificate transaction corresponding to the "god hand" gift as the result configuration information, and generates a corresponding interaction task for the second user at the live broadcast end to view and execute.
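The task-creation step can be pictured with the minimal sketch below; the field names, the in-memory task store, and the example values are assumptions made for illustration and do not prescribe how the server of the embodiment stores interaction tasks.

import itertools
from dataclasses import dataclass
from typing import Dict

@dataclass
class InteractionTask:
    task_id: int
    live_element: str   # e.g. "billiard ball No. 1"
    target_state: str   # target configuration information, e.g. "target ball pocketed"
    reward: str         # result configuration information, e.g. the "god hand" gift
    issued_by: str      # the first user at the receiving end
    completed: bool = False

_task_ids = itertools.count(1)
_tasks: Dict[int, InteractionTask] = {}

def create_interaction_task(live_element: str, target_info: str,
                            result_info: str, first_user: str) -> InteractionTask:
    """Build a task from the uploaded operation results and store it so it can
    later be pushed to the live broadcast end of the second user."""
    task = InteractionTask(next(_task_ids), live_element, target_info,
                           result_info, first_user)
    _tasks[task.task_id] = task
    return task

# Example: the configuration uploaded from the receiving end in fig. 7.
print(create_interaction_task("billiard ball No. 1", "target ball pocketed",
                              "god hand", "spectator A"))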
In this way, the first user at the receiving end can send various interaction tasks to the second user at the live broadcast end according to his own needs, which makes the interaction more playable and improves the user experience.
On the basis of the foregoing embodiment, fig. 8 is a schematic flow chart of another live-broadcast-based interaction method according to an embodiment of the present application, where an execution body of the method is the live broadcast end mentioned above.
As shown in fig. 8, the live-broadcast-based interaction method may include the following steps:
Step 801: send the live broadcast picture to the server, where the live broadcast picture is to be pushed to the receiving end.
Step 802: receive the interaction task of a live broadcast element sent by the server, and display the task information of the interaction task at the live broadcast element in the live broadcast picture; the interaction task is created by the server according to the live broadcast element selected in the live broadcast picture by the first user at the receiving end and the interaction task configuration information associated with the live broadcast element.
Corresponding to the foregoing embodiments, the second user at the live broadcast end may send a live broadcast request to the server and establish a communication link with the server after the server grants the request. Through the communication link, the server may continuously receive a data stream from the live broadcast end, where the data stream carries the live broadcast picture of the live broadcast end.
In this embodiment, the live broadcast picture of the second user is pushed to one or more receiving ends, and the first user at a receiving end may issue an interaction task to the live broadcast end by using the interaction method described in the foregoing embodiments. That is, after the server creates the interaction task according to the interactive operation of the first user, the live broadcast end receives the interaction task issued by the server and displays the task information of the interaction task at the live broadcast element in the live broadcast picture.
Taking a live game scene as an example, fig. 9 is a fourth schematic diagram of the live-broadcast-based interaction method according to an embodiment of the present application. Fig. 9 shows a current interface 901 of the live broadcast end, which displays the live game picture of the second user.
The interface 901 further displays the task information 902 of the interaction task, and it can be seen that the display position of the task information 902 is determined by the live broadcast element corresponding to the interaction task. Taking fig. 9 as an example, since the interaction task of the task information 902 is created for "billiard ball No. 1", the task information 902 is displayed at "billiard ball No. 1" in the current interface 901. In addition, a prompt message "enter the hole and send" is displayed in the task information 902, through which the second user learns the completion condition of the interaction task.
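As a hedged sketch of how the display position of the task information might be derived from the live broadcast element, the helper below anchors a label just above the element's current bounding box and keeps it inside the frame; the coordinate handling is an assumption for illustration only.

from typing import Tuple

def task_info_anchor(element_box: Tuple[int, int, int, int],
                     frame_size: Tuple[int, int], margin: int = 12) -> Tuple[int, int]:
    """Place the task information label next to the live broadcast element.

    element_box is (left, top, right, bottom) of, e.g., "billiard ball No. 1";
    the label is anchored just above the element and clamped to the frame.
    """
    left, top, right, bottom = element_box
    width, height = frame_size
    x = min(max((left + right) // 2, 0), width - 1)
    y = max(top - margin, 0)
    return x, y

# Example: anchor the prompt "enter the hole and send" above the element.
print(task_info_anchor((400, 300, 460, 360), (1280, 720)))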
As the second user operates and the game progresses, when the second user pockets billiard ball No. 1, the server determines that the current state of the live broadcast element "billiard ball No. 1" in the live broadcast picture of the second user's live broadcast end is "pocketed", which meets the target state described by the interaction task. At this moment, the server sends the interaction result of the interaction task to the live broadcast end. Referring to the right diagram of fig. 9, the interaction result 903, that is, a result containing a prompt message such as 'please accept the "god hand" given by spectator A', is displayed on the current interface 901 of the live broadcast end.
That is, the live broadcast end receives the interaction result of the interaction task sent by the server and displays the interaction result on the live broadcast picture; the interaction result is sent when the server determines that the current state of the live broadcast element in the live broadcast picture meets the target state described by the interaction task.
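The server-side check that triggers the interaction result can be pictured with the sketch below, which reuses the InteractionTask structure sketched earlier; the state labels and the delivery callback are assumptions introduced for illustration.

from typing import Callable, Dict

def check_and_reward(task, current_states: Dict[str, str],
                     send_to_live_end: Callable[[str], None]) -> bool:
    """Compare the current state of the task's live broadcast element with the
    target state described by the task; if they match, push the interaction
    result (here a gift prompt) to the live broadcast end."""
    if task.completed:
        return False
    if current_states.get(task.live_element) == task.target_state:
        task.completed = True
        send_to_live_end(f'please accept the "{task.reward}" given by {task.issued_by}')
        return True
    return False

# Example: the recognized state of "billiard ball No. 1" now matches the target.
# check_and_reward(task, {"billiard ball No. 1": "target ball pocketed"}, print)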
In this way, the second user at the live broadcast end can receive various interaction tasks sent by the first user at the receiving end, which makes the interaction more playable and improves the user experience.
Corresponding to the live-broadcast-based interaction method provided in the foregoing embodiments, fig. 10 is a block diagram of a live-broadcast-based interaction device according to an embodiment of the present application. For convenience of explanation, only the portions relevant to the embodiments of the present application are shown. Referring to fig. 10, the live-broadcast-based interaction device is provided at the server and includes:
The pushing module 1010 is configured to obtain a live broadcast picture sent by the live broadcast terminal, and push the live broadcast picture to the receiving terminal;
The task processing module 1020 is configured to determine the live broadcast element selected in the live broadcast picture by the first user at the receiving end and the interaction task configuration information associated with the live broadcast element, and to create an interaction task of the live broadcast element according to the interaction task configuration information;
the pushing module 1010 is further configured to send the interaction task of the live broadcast element to the live broadcast end.
Optionally, the task processing module 1020 is specifically configured to: obtain the live broadcast element selected in the live broadcast picture by the first user at the receiving end; determine, in a preset interaction task configuration library, the interaction task configuration item corresponding to the live broadcast element; and obtain the interaction task configuration information selected by the first user at the receiving end among the interaction task configuration items.
Optionally, the task processing module 1020 is specifically configured to: obtain the picture area selected by the first user at the receiving end in the live broadcast picture; and perform image recognition processing on the picture area to obtain the live broadcast element included in the picture area.
Optionally, the task processing module 1020 is specifically configured to: obtain the voice information sent by the first user at the receiving end; perform semantic recognition on the voice information to obtain a semantic recognition result; and determine, in the live broadcast picture, the live broadcast element corresponding to the semantic recognition result.
Optionally, the interaction task configuration item includes a target configuration item and a result configuration item;
the task processing module 1020 is specifically configured to obtain the target configuration information and the result configuration information selected by the first user at the receiving end among the interaction task configuration items, where the target configuration information describes the target state of the live broadcast element when the interaction task is completed, and the result configuration information describes the interaction result delivered when the interaction task is completed.
Optionally, the task processing module 1020 is further configured to send an interaction result of the interaction task to the live broadcast end when it is determined that the live broadcast end completes the interaction task.
Optionally, the task processing module 1020 is further configured to determine the current state of the live broadcast element in the live broadcast picture sent by the live broadcast end, and to determine whether the current state of the live broadcast element meets the target state described by the interaction task.
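As a structural illustration only, the following sketch mirrors how the pushing module 1010 and the task processing module 1020 might cooperate in code; the transport layer, the dictionary-based configuration library, and the message formats are assumptions and are not prescribed by the embodiment.

from typing import Any, Dict, List, Tuple

class InMemoryTransport:
    """Trivial stand-in for whatever messaging layer connects server and clients."""
    def __init__(self) -> None:
        self.outbox: List[Tuple[str, Dict[str, Any]]] = []
    def send(self, target: str, message: Dict[str, Any]) -> None:
        self.outbox.append((target, message))

class PushModule:
    """Counterpart of module 1010: relays live broadcast pictures and tasks."""
    def __init__(self, transport: InMemoryTransport) -> None:
        self.transport = transport
    def push_picture(self, receiver_id: str, frame: Any) -> None:
        self.transport.send(receiver_id, {"type": "picture", "frame": frame})
    def push_task(self, live_end_id: str, task: Dict[str, Any]) -> None:
        self.transport.send(live_end_id, {"type": "task", "task": task})

class TaskProcessingModule:
    """Counterpart of module 1020: turns user selections into interaction tasks."""
    def __init__(self, config_library: Dict[str, Dict[str, Any]]) -> None:
        self.config_library = config_library  # element name -> configuration items
    def configuration_items(self, live_element: str) -> Dict[str, Any]:
        return self.config_library.get(live_element, {})
    def create_task(self, live_element: str, target_info: str,
                    result_info: str, first_user: str) -> Dict[str, Any]:
        return {"element": live_element, "target": target_info,
                "reward": result_info, "from": first_user, "completed": False}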
Corresponding to the live-broadcast-based interaction method provided in the foregoing embodiments, fig. 11 is a block diagram of another live-broadcast-based interaction device according to an embodiment of the present application. For convenience of explanation, only the portions relevant to the embodiments of the present application are shown. Referring to fig. 11, the live-broadcast-based interaction device is provided at a terminal; specifically, it may be provided at the receiving end in a live broadcast scene such as a live game scene.
The live-broadcast-based interaction device includes:
A receiving module 1110, configured to acquire and display a live broadcast picture of a live broadcast end sent by a server;
The interaction module 1120 is configured to respond to an interaction operation triggered by the first user on the live broadcast picture, and send an operation result of the interaction operation to the server, where the operation result of the interaction operation is used to represent a live broadcast element selected by the first user in the live broadcast picture and interaction task configuration information associated with the live broadcast element; the live broadcast element and the interaction task configuration information are used for generating interaction tasks of the live broadcast element.
Optionally, the interactive operation includes a circling operation for determining the live broadcast element;
the interaction module 1120 is specifically configured to: determine the picture area of the live broadcast picture enclosed by the circling operation; and send the picture area to the server, where the picture area is used to determine the live broadcast element selected by the first user in the live broadcast picture.
Optionally, the interactive operation includes a voice input operation for determining the live broadcast element;
the interaction module 1120 is specifically configured to: determine the voice information input through the voice input operation; and send the voice information to the server, where the voice information is used to determine the live broadcast element selected by the first user in the live broadcast picture.
Optionally, the interactive operation includes a selection operation for determining the interaction task configuration information associated with the live broadcast element;
the interaction module 1120 is specifically configured to: determine, according to the selection operation, the target configuration information and the result configuration information selected by the first user among the interaction task configuration items, where the interaction task configuration items are determined by the server from a preset interaction task configuration library according to the live broadcast element selected by the first user; and send the target configuration information and the result configuration information to the server.
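On the receiving-end side, the operation result sent to the server by the interaction module can be pictured as a single payload that carries whichever interactive operation the first user performed; the JSON field names below are assumptions introduced for illustration only.

import json
from typing import Optional, Tuple

def build_operation_result(circled_area: Optional[Tuple[int, int, int, int]] = None,
                           voice_text: Optional[str] = None,
                           target_info: Optional[str] = None,
                           result_info: Optional[str] = None) -> str:
    """Package the first user's interactive operation for upload to the server."""
    payload = {}
    if circled_area is not None:
        left, top, right, bottom = circled_area
        payload["picture_area"] = {"left": left, "top": top,
                                   "right": right, "bottom": bottom}
    if voice_text is not None:
        payload["voice_text"] = voice_text
    if target_info is not None or result_info is not None:
        payload["configuration"] = {"target": target_info, "result": result_info}
    return json.dumps(payload)

# Example: a circled area plus the chosen target state and reward.
print(build_operation_result(circled_area=(400, 300, 460, 360),
                             target_info="target ball pocketed",
                             result_info="god hand"))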
Fig. 12 is a block diagram of still another live-broadcast-based interaction device according to an embodiment of the present application, corresponding to the live-broadcast-based interaction method provided in the foregoing embodiments. For convenience of explanation, only the portions relevant to the embodiments of the present application are shown. Referring to fig. 12, the live-broadcast-based interaction device is provided at a terminal; specifically, it may be provided at the live broadcast end in a live broadcast scene.
The live-broadcast-based interaction device comprises:
the sending module 1210 is configured to send a live broadcast picture to the server, where the live broadcast picture is used for pushing to a receiving end;
The task execution module 1220 is configured to receive an interactive task of a live broadcast element sent by the server, and display task information of the interactive task at the live broadcast element in the live broadcast picture; the interaction task is created by the server according to the live broadcast element selected by the first user at the receiving end in the live broadcast picture and interaction task configuration information associated with the live broadcast element.
Optionally, the task execution module 1220 is further configured to receive the interaction result of the interaction task sent by the server and display the interaction result on the live broadcast picture, where the interaction result is sent when the server determines that the current state of the live broadcast element in the live broadcast picture meets the target state described by the interaction task.
Fig. 13 is a schematic diagram of a hardware structure of an electronic device according to the present application, and as shown in fig. 13, an embodiment of the present application provides an electronic device, where a memory of the electronic device may be configured to store at least one program instruction, and a processor is configured to execute the at least one program instruction, so as to implement a technical solution of the foregoing method embodiment. The implementation principle and technical effects are similar to those of the related embodiments of the method, and are not repeated here.
The embodiment of the application provides a chip. The chip comprises a processor for invoking a computer program in a memory to perform the technical solutions in the above embodiments. The principle and technical effects of the present application are similar to those of the above-described related embodiments, and will not be described in detail herein.
An embodiment of the present application provides a computer program product, which when executed on an electronic device, causes the electronic device to execute the technical solution in the foregoing embodiment. The principle and technical effects of the present application are similar to those of the above-described related embodiments, and will not be described in detail herein.
An embodiment of the present application provides a computer readable storage medium, on which program instructions are stored, which when executed by an electronic device, cause the electronic device to execute the technical solution of the above embodiment. The principle and technical effects of the present application are similar to those of the above-described related embodiments, and will not be described in detail herein.
The foregoing detailed description of the application has been presented for purposes of illustration and description only and is not intended to limit the scope of the application.

Claims (11)

1. A live-based interaction method, comprising:
Acquiring a live broadcast picture sent by a live broadcast terminal, and pushing the live broadcast picture to a receiving terminal;
Acquiring a picture area selected by a first user of a receiving end in a live broadcast picture; performing image recognition processing on the picture area to obtain live broadcast elements contained in the picture area;
determining an interaction task configuration item corresponding to the live broadcast element in a preset interaction task configuration library;
Acquiring interaction task configuration information selected by a first user of a receiving end in an interaction task configuration item; the live broadcast elements refer to constituent elements in a live broadcast scene corresponding to a live broadcast picture; the live broadcast elements are different according to different live broadcast scene types;
and creating an interaction task of the live broadcast element according to the interaction task configuration information, and sending the interaction task of the live broadcast element to the live broadcast end.
2. The interaction method of claim 1, further comprising:
And when the live broadcast terminal is determined to finish the interaction task, sending an interaction result of the interaction task to the live broadcast terminal.
3. The interaction method of claim 2, further comprising:
Determining the current state of the live broadcast element in the live broadcast picture sent by the live broadcast terminal;
And judging whether the current state of the live broadcast element meets the target state described by the interaction task.
4. A live-based interaction method, comprising:
acquiring and displaying a live broadcast picture of a live broadcast terminal sent by a server;
responding to the interactive operation triggered by the first user on the live broadcast picture, and sending an operation result of the interactive operation to a server, wherein the operation result of the interactive operation is used for representing live broadcast elements selected by the first user in the live broadcast picture and interactive task configuration information associated with the live broadcast elements; the live broadcast element and the interaction task configuration information are used for generating interaction tasks of the live broadcast element sent to the live broadcast end by the server; the live broadcast elements refer to constituent elements in a live broadcast scene corresponding to a live broadcast picture; the live broadcast elements are different according to different live broadcast scene types;
the interactive operation comprises a circling operation for determining a live element;
Sending the operation result of the interactive operation to a server, wherein the method comprises the following steps:
determining a picture area of the live broadcast picture selected by the circling operation;
and sending the picture area to a server, wherein the picture area is used for determining the live broadcast element selected by the first user on the live broadcast picture.
5. The interactive method according to claim 4, wherein the interactive operation comprises a voice input operation for determining a live element;
Sending the operation result of the interactive operation to a server, wherein the method comprises the following steps:
Determining voice information input by the voice input operation;
And sending the voice information to a server, wherein the voice information is used for determining the live broadcast element selected by the first user on the live broadcast picture.
6. The interactive method of claim 5, wherein the interactive operation comprises a selected operation for determining interactive task configuration information associated with the live element;
The step of sending the operation result of the interactive operation to a server comprises the following steps:
according to the selected operation, determining target configuration information and result configuration information selected by the first user in the interactive task configuration item; the interactive task configuration item is obtained by the server through determining in a preset interactive task configuration library according to the live broadcast element selected by the first user;
and sending the target configuration information and the result configuration information to a server.
7. A live-based interaction method, comprising:
The method comprises the steps of sending a live broadcast picture to a server, wherein the live broadcast picture is used for pushing to a receiving end;
Receiving an interaction task of a live broadcast element sent by the server, and displaying task information of the interaction task at the live broadcast element in the live broadcast picture; the interaction task is created by the server by obtaining, according to the picture area selected by a first user of the receiving end in the live broadcast picture, the live broadcast element included in the picture area, determining an interaction task configuration item corresponding to the live broadcast element in a preset interaction task configuration library, and obtaining interaction task configuration information selected by the first user of the receiving end in the interaction task configuration item; the live broadcast element refers to a constituent element in a live broadcast scene corresponding to the live broadcast picture; the live broadcast element is different according to different live broadcast scene types.
8. The method of interaction of claim 7, further comprising:
Receiving an interaction result of the interaction task sent by the server, and displaying the interaction result on the live broadcast picture;
And the task interaction result is sent when the server determines that the current state of the live broadcast element in the live broadcast picture meets the target state described by the interaction task.
9. An electronic device, comprising:
at least one processor; and
A memory;
The memory stores computer-executable instructions;
the at least one processor executing computer-executable instructions stored in the memory causes the at least one processor to perform the method of any one of claims 1-3 or claims 4-6 or claims 7-8.
10. A computer readable storage medium having stored therein computer executable instructions which, when executed by a processor, implement the method of any one of claims 1-3 or claims 4-6 or claims 7-8.
11. A computer program product comprising computer instructions which, when executed by a processor, implement the method of any one of claims 1-3 or claims 4-6 or claims 7-8.
CN202210078799.0A 2022-01-24 2022-01-24 Interaction method and device based on live broadcast Active CN114374856B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210078799.0A CN114374856B (en) 2022-01-24 2022-01-24 Interaction method and device based on live broadcast

Publications (2)

Publication Number Publication Date
CN114374856A CN114374856A (en) 2022-04-19
CN114374856B true CN114374856B (en) 2024-04-30

Family

ID=81146403

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210078799.0A Active CN114374856B (en) 2022-01-24 2022-01-24 Interaction method and device based on live broadcast

Country Status (1)

Country Link
CN (1) CN114374856B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107592575A (en) * 2017-09-08 2018-01-16 广州华多网络科技有限公司 A kind of live broadcasting method, device, system and electronic equipment
CN108391174A (en) * 2018-03-22 2018-08-10 乐蜜有限公司 Living broadcast interactive method, apparatus and electronic equipment
CN109195001A (en) * 2018-07-02 2019-01-11 广州虎牙信息科技有限公司 Methods of exhibiting, device, storage medium and the terminal of present is broadcast live
CN111711831A (en) * 2020-06-28 2020-09-25 腾讯科技(深圳)有限公司 Data processing method and device based on interactive behavior and storage medium
CN112492401A (en) * 2019-09-11 2021-03-12 腾讯科技(深圳)有限公司 Video-based interaction method and device, computer-readable medium and electronic equipment
CN112995687A (en) * 2021-02-08 2021-06-18 腾竞体育文化发展(上海)有限公司 Interaction method, device, equipment and medium based on Internet
CN113411624A (en) * 2021-06-15 2021-09-17 北京达佳互联信息技术有限公司 Game live broadcast interaction method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN114374856A (en) 2022-04-19

Similar Documents

Publication Publication Date Title
CN109005417B (en) Live broadcast room entering method, system, terminal and device for playing game based on live broadcast
CN109011574B (en) Game interface display method, system, terminal and device based on live broadcast
CN109068182B (en) Live broadcast room entering method, system, terminal and device for playing game based on live broadcast
CN107680157B (en) Live broadcast-based interaction method, live broadcast system and electronic equipment
CN110519611B (en) Live broadcast interaction method and device, electronic equipment and storage medium
CN112334886B (en) Content distribution system, content distribution method, and recording medium
CN102263907B (en) Play control method of competition video, and generation method and device for clip information of competition video
CN109391834A (en) A kind of play handling method, device, equipment and storage medium
CN106899891A (en) The interactive method and apparatus of guess
US12086845B2 (en) Systems and methods for dynamically modifying video game content based on non-video gaming content being concurrently experienced by a user
Scheible et al. MobiToss: a novel gesture based interface for creating and sharing mobile multimedia art on large public displays
CN113115061A (en) Live broadcast interaction method and device, electronic equipment and storage medium
CN112312142B (en) Video playing control method and device and computer readable storage medium
CN112516589A (en) Game commodity interaction method and device in live broadcast, computer equipment and storage medium
CN114466209A (en) Live broadcast interaction method and device, electronic equipment, storage medium and program product
CN109173272A (en) Game interaction method, system, server and device based on live streaming
CN110072138B (en) Video playing method, video playing equipment and computer readable storage medium
CN112675537A (en) Game prop interaction method and system in live broadcast
CN108076379B (en) Multi-screen interaction realization method and device
CN109754298A (en) Interface information providing method, device and electronic equipment
CN113596558A (en) Interaction method, device, processor and storage medium in game live broadcast
CN111479119A (en) Method, device and system for collecting feedback information in live broadcast and storage medium
CN114191823A (en) Multi-view game live broadcast method and device and electronic equipment
CN114938459A (en) Virtual live broadcast interaction method and device based on barrage, storage medium and equipment
CN109754275A (en) Data object information providing method, device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant