CN110545442B - Live broadcast interaction method and device, electronic equipment and readable storage medium - Google Patents


Info

Publication number
CN110545442B
Authority
CN
China
Prior art keywords
game
data
client
scene
feature
Prior art date
Legal status
Active
Application number
CN201910919153.9A
Other languages
Chinese (zh)
Other versions
CN110545442A
Inventor
庄宇轩
陈泽宇
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201910919153.9A
Publication of CN110545442A
Application granted
Publication of CN110545442B
Legal status: Active (granted)

Classifications

    • A63F 13/52: Video games; controlling the output signals based on the game progress, involving aspects of the displayed game scene
    • A63F 13/60: Video games; generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • H04N 21/2187: Selective content distribution; live feed
    • H04N 21/435: Selective content distribution; processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • H04N 21/4781: Selective content distribution; supplemental services; games
    • H04N 21/4788: Selective content distribution; supplemental services; communicating with other users, e.g. chatting

Abstract

The present application relates to the field of image processing technologies, and in particular to a live broadcast interaction method and apparatus, an electronic device, and a readable storage medium. According to the interactive game selected by the first client, first real-scene data of the scene where a first user is located, and second real-scene data of the scene where a second user is located, a first virtual prop corresponding to a first feature in the first user's scene and a second virtual prop corresponding to a second feature in the second user's scene can be generated. According to behavior data of the first user operating the first feature and behavior data of the second user operating the second feature, the first client and the second client can be controlled to present a virtual game scene matched with the interactive game, with a first interactive behavior corresponding to the first virtual prop and a second interactive behavior corresponding to the second virtual prop presented in that scene. This interaction mode improves the live broadcast interaction effect and the utilization efficiency of live broadcast resources.

Description

Live broadcast interaction method and device, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a live broadcast interaction method and apparatus, an electronic device, and a readable storage medium.
Background
As network technologies have matured, network interactive live broadcasting has become widely known. Network interactive live broadcasting generally refers to a network live broadcast that contains interactive content; it has evolved from early text interaction to voice interaction, and from voice interaction to video interaction.
Currently, interaction between the audience and the anchor can be roughly divided into bullet-comment (danmaku) interaction, audience gift-giving interaction, and anchor task-completion interaction. However, these existing interaction modes are limited in form and highly homogeneous: they lack interactive immersion, audiences have little willingness to participate, the interactive effect is poor, and live broadcast resources and network resources cannot be fully utilized.
Disclosure of Invention
In view of this, an object of the present application is to provide a live broadcast interaction method and apparatus, an electronic device, and a readable storage medium that can improve the live broadcast interaction effect and the utilization efficiency of live broadcast resources.
The application mainly comprises the following aspects:
in a first aspect, an embodiment of the present application provides a live broadcast interaction method, where the live broadcast interaction method includes:
responding to live broadcast interaction initiated by a first client, acquiring first real-scene data of a scene where a first user is located, which is collected by the first client, and acquiring second real-scene data of a scene where a second user is located, which is collected by a second client invited to carry out live broadcast interaction;
generating a first virtual prop corresponding to a first feature in a scene where the first user is located and a second virtual prop corresponding to a second feature in a scene where the second user is located according to the interactive game selected by the first client and the first real-world data and the second real-world data;
and controlling the first client and the second client to present a virtual game scene matched with the interactive game according to the behavior data of the first user for operating the first feature acquired by the first client and the behavior data of the second user for operating the second feature acquired by the second client, and presenting a first interactive behavior corresponding to the first virtual prop and a second interactive behavior corresponding to the second virtual prop in the virtual game scene.
In a possible implementation manner, the acquiring, in response to the live broadcast interaction initiated by the first client, second real-scene data of a scene where the second user is located, which is acquired by a second client invited to perform the live broadcast interaction, includes:
responding to the live broadcast interaction initiated by the first client, and sending interaction invitation information to the second client which displays the live broadcast page of the first client;
and after the second client accepts the interaction invitation, controlling the camera of the second client to start, and acquiring the second real-scene data captured by the camera.
In a possible implementation manner, before the generating, according to the interactive game selected by the first client and the first real-scene data and the second real-scene data, of a first virtual prop corresponding to a first feature in the scene where the first user is located and a second virtual prop corresponding to a second feature in the scene where the second user is located, the live broadcast interaction method further includes:
sending a game list to the first client, and determining the interactive game selected by the first client according to the game list, where the game list includes a game name and a game profile for each interactive game.
In a possible implementation manner, the generating, according to the interactive game selected by the first client and the first real-scene data and the second real-scene data, a first virtual prop corresponding to a first feature in the scene where the first user is located and a second virtual prop corresponding to a second feature in the scene where the second user is located includes:
acquiring game data of the interactive game from an interactive game library according to the interactive game selected by the first client; the game data comprise description information of the first feature, description information of the second feature, description information of the first virtual prop, description information of the second virtual prop, scene data of a preset game scene and game description;
and generating a first virtual prop corresponding to a first feature in the scene where the first user is located and a second virtual prop corresponding to a second feature in the scene where the second user is located according to the game data, the first real-scene data and the second real-scene data.
In a possible implementation manner, the generating, according to the game data, the first real-scene data, and the second real-scene data, a first virtual prop corresponding to a first feature in the scene where the first user is located and a second virtual prop corresponding to a second feature in the scene where the second user is located includes:
identifying the first feature from the first real-scene data according to the description information of the first feature in the game data, and identifying the second feature from the second real-scene data according to the description information of the second feature in the game data;
and converting the first feature into the first virtual prop according to the description information of the first virtual prop in the game data, and converting the second feature into the second virtual prop according to the description information of the second virtual prop in the game data.
In one possible embodiment, the first feature and the second feature include:
a specific part of a human body, an article of a preset shape, or a preset picture.
In a possible implementation manner, after the generating of a first virtual prop corresponding to a first feature in the scene where the first user is located and a second virtual prop corresponding to a second feature in the scene where the second user is located, the live broadcast interaction method further includes:
sending game descriptions corresponding to the interactive games to the first client and the second client;
and responding to an instruction for starting the interactive game sent by the first client, acquiring the behavior data of the first user operating the first feature collected by the first client and the behavior data of the second user operating the second feature collected by the second client.
In one possible embodiment, the presenting of the virtual game scene matched with the interactive game includes:
scene data of a preset game scene corresponding to the interactive game is obtained from the game data of the interactive game;
presenting the virtual game scene based on the scene data.
In one possible embodiment, the presenting the virtual game scene includes:
extracting environment data from the first real-scene data; the environment data is the data other than the character data in the first real-scene data;
presenting the virtual game scene based on the environment data.
In one possible implementation, the live interaction method further includes:
when it is detected that the first client and the second client have completed the interactive game, calculating a first game result of the first client in the interactive game process and a second game result of the second client in the interactive game process;
and generating a game result ranking list of the interactive game according to the first game result and the second game result, and controlling the first client and the second client to present the game result ranking list.
In a second aspect, an embodiment of the present application further provides a live broadcast interaction device, where the live broadcast interaction device includes:
the first acquisition module is used for responding to the live broadcast interaction initiated by the first client, acquiring first real-scene data of a scene where a first user is located and acquired by the first client, and acquiring second real-scene data of a scene where a second user is located and acquired by a second client invited to carry out live broadcast interaction;
the first generation module is used for generating a first virtual prop corresponding to a first feature in a scene where the first user is located and a second virtual prop corresponding to a second feature in a scene where the second user is located according to the interactive game selected by the first client and the first real-scene data and the second real-scene data;
and the control module is used for controlling the first client and the second client to present a virtual game scene matched with the interactive game according to the behavior data of the first user for operating the first feature, which is acquired by the first client, and the behavior data of the second user for operating the second feature, which is acquired by the second client, and presenting a first interactive behavior corresponding to the first virtual prop and a second interactive behavior corresponding to the second virtual prop in the virtual game scene.
In a possible implementation, the first obtaining module is configured to obtain the second real-scene data according to the following steps:
responding to the live broadcast interaction initiated by the first client, and sending interaction invitation information to the second client displaying the live broadcast page of the first client;
and after the second client accepts the interaction invitation, controlling the camera of the second client to start, and acquiring the second real-scene data captured by the camera.
In a possible implementation manner, the live interaction apparatus further includes:
the determining module is used for sending a game list to the first client and determining the interactive game selected by the first client according to the game list, where the game list includes a game name and a game profile for each interactive game.
In one possible embodiment, the first generating module comprises:
the first acquisition unit is used for acquiring game data of the interactive game from an interactive game library according to the interactive game selected by the first client; the game data comprise description information of the first feature, description information of the second feature, description information of the first virtual prop, description information of the second virtual prop, scene data of a preset game scene and game description;
and the generating unit is used for generating a first virtual prop corresponding to a first feature in a scene where the first user is located and a second virtual prop corresponding to a second feature in a scene where the second user is located according to the game data, the first real-scene data and the second real-scene data.
In a possible embodiment, the generating unit is configured to generate the first virtual prop and the second virtual prop according to the following steps:
identifying the first feature from the first real-scene data according to the description information of the first feature in the game data, and identifying the second feature from the second real-scene data according to the description information of the second feature in the game data;
and converting the first feature into the first virtual prop according to the description information of the first virtual prop in the game data, and converting the second feature into the second virtual prop according to the description information of the second virtual prop in the game data.
In one possible embodiment, the first feature and the second feature include:
a specific part of a human body, an article of a preset shape, or a preset picture.
In a possible implementation manner, the live interaction apparatus further includes:
the sending module is used for sending game descriptions corresponding to the interactive games to the first client and the second client;
and the second acquisition module is used for responding to an instruction sent by the first client for starting the interactive game, and acquiring the behavior data of the first user operating the first feature collected by the first client and the behavior data of the second user operating the second feature collected by the second client.
In one possible embodiment, the control module comprises:
the second obtaining unit is used for obtaining scene data of a preset game scene corresponding to the interactive game from the game data of the interactive game;
and the first presentation unit is used for presenting the virtual game scene based on the scene data.
In one possible embodiment, the control module further comprises:
an extraction unit configured to extract environment data from the first real-scene data; the environment data is the data other than the character data in the first real-scene data;
and the second presentation unit is used for presenting the virtual game scene based on the environment data.
In a possible implementation manner, the live interaction apparatus further includes:
the computing module is used for computing a first game result of the first client in the interactive game process and a second game result of the second client in the interactive game process when the first client and the second client are detected to finish the interactive game;
and the second generating module is used for generating a game result ranking list of the interactive game according to the first game result and the second game result, and controlling the first client and the second client to present the game result ranking list.
In a third aspect, an embodiment of the present application further provides an electronic device, including a processor, a memory, and a bus. The memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate through the bus, and when the machine-readable instructions are executed by the processor, the steps of the live broadcast interaction method according to the first aspect or any one of its possible implementation manners are performed.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the live broadcast interaction method described in the first aspect or any one of the possible implementation manners of the first aspect are performed.
In the embodiments of the present application, according to the interactive game selected by the first client, the first real-scene data of the scene where the first user is located, and the second real-scene data of the scene where the second user is located, a first virtual prop corresponding to a first feature in the first user's scene and a second virtual prop corresponding to a second feature in the second user's scene can be generated. According to the behavior data of the first user operating the first feature and the behavior data of the second user operating the second feature, the first client and the second client can be controlled to present a virtual game scene matched with the interactive game, and a first interactive behavior corresponding to the first virtual prop and a second interactive behavior corresponding to the second virtual prop are presented in that scene. In this way, the user's actual operations can be merged into the virtual game scene, enhancing the user's sense of immersion, improving the live broadcast interaction effect, and improving the utilization efficiency of live broadcast resources.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting the scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 illustrates an architecture diagram of a live broadcast platform provided in an embodiment of the present application;
fig. 2 is a flowchart illustrating a live interaction method provided in an embodiment of the present application;
fig. 3 is a first functional block diagram of a live broadcast interaction apparatus according to an embodiment of the present application;
fig. 4 is a second functional block diagram of the live broadcast interaction apparatus according to an embodiment of the present application;
fig. 5 is a functional block diagram of the first generation module in fig. 3;
fig. 6 is a functional block diagram of the control module in fig. 3;
fig. 7 shows a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
To make the purposes, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the drawings in the embodiments of the present application. It should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit the scope of protection of the present application; further, the schematic drawings are not drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of a flowchart may be performed out of order, and that steps without logical dependency may be reversed in order or performed concurrently. In addition, under the guidance of the present disclosure, one skilled in the art may add one or more other operations to a flowchart, or remove one or more operations from it.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
Although the following embodiments are described in connection with the particular application scenario of live broadcast interaction, those skilled in the art can apply the general principles defined herein to other embodiments and application scenarios without departing from the spirit and scope of the present disclosure.
The method, the apparatus, the electronic device, or the computer-readable storage medium described in the embodiments of the present application may be applied to any scene that needs to perform live broadcast interaction.
It should be noted that, before the present application, interaction between the audience and the anchor in existing schemes could be roughly divided into bullet-comment (danmaku) interaction, audience gift-giving interaction, and anchor task-completion interaction. However, these existing interaction modes are limited in form and highly homogeneous: they lack interactive immersion, audiences have little willingness to participate, the interactive effect is poor, and live broadcast resources and network resources cannot be fully utilized.
In view of the above problem, in the embodiments of the present application, according to the interactive game selected by the first client, the first real-scene data of the scene where the first user is located, and the second real-scene data of the scene where the second user is located, a first virtual prop corresponding to a first feature in the first user's scene and a second virtual prop corresponding to a second feature in the second user's scene may be generated. According to the behavior data of the first user operating the first feature and the behavior data of the second user operating the second feature, the first client and the second client may be controlled to present a virtual game scene matched with the interactive game, and a first interactive behavior corresponding to the first virtual prop and a second interactive behavior corresponding to the second virtual prop may be presented in that scene. In this way, the user's actual operations can be merged into the virtual game scene, enhancing the user's sense of immersion, improving the live broadcast interaction effect, and improving the utilization efficiency of live broadcast resources. Moreover, adding the live broadcast interaction mode of the present application to a live broadcast increases the diversity of interaction modes and promotes user participation in the live broadcast.
For the convenience of understanding of the present application, the technical solutions provided in the present application will be described in detail below with reference to specific embodiments.
Fig. 1 is a schematic diagram of the architecture of a live broadcast platform according to an embodiment of the present application. As shown in fig. 1, the live broadcast platform includes a first client 110, a second client 120, and a server 130; a first user uses the first client 110 to exchange information, through the server 130, with a second user of the second client 120. The number of second clients 120 shown in fig. 1 is only an example and is not a limitation on the number of second clients 120. It should be noted that the first client 110 may be an anchor-side client and the second client 120 an audience-side client; the anchor user may be the first user, and an audience user may be a second user. A client can be a personal computer or a mobile terminal.
It should be noted that the present application does not limit the execution subject of the live broadcast interaction method; the execution subject may be a server, a cloud platform, or a client.
Fig. 2 is a flowchart of a live broadcast interaction method according to an embodiment of the present application. As shown in fig. 2, the live broadcast interaction method provided in the embodiment of the present application includes the following steps:
s201: responding to the live broadcast interaction initiated by the first client, acquiring first live-action data of a scene where the first user is located and acquired by the first client, and acquiring second live-action data of a scene where the second user is located and acquired by the second client invited to carry out the live broadcast interaction.
In the specific implementation, after the first user uses the first client to start the live broadcast platform, live broadcast is performed, and in the live broadcast process, the first user sends an instruction for starting the live broadcast interaction function to the server through the first client. The server side responds to the live broadcast interaction initiated by the first client side, controls to start the camera of the first client side, acquires first live-action data of a scene where the first user is located, acquired by the first client side through the camera, and acquires second live-action data of a scene where the second user is located, acquired by the second client side, according to the determined second client side invited to carry out the live broadcast interaction.
Here, the first real-scene data may be an image of a scene in which the first user is located, the first real-scene data includes portrait data of the first user and environment data of an environment in which the first user is located, the second real-scene data may be an image of a scene in which the second user is located, and the second real-scene data includes portrait data of the second user and an environment image of an environment in which the second user is located.
Further, in step S201, the responding to the live broadcast interaction initiated by the first client and acquiring the second real-scene data of the scene where the second user is located, collected by the second client invited to carry out the live broadcast interaction, includes the following steps:
responding to the live broadcast interaction initiated by the first client, and sending interaction invitation information to the second clients displaying the live broadcast page of the first client; and after a second client accepts the interaction invitation, controlling the camera of that second client to start, and acquiring the second real-scene data captured by the camera.
In a specific implementation, after the server receives the instruction from the first client to perform live broadcast interaction, it starts the live broadcast interaction function and sends interaction invitation information to the second clients of the second users watching the first user's live broadcast, where the second clients displaying the first client's live broadcast page are determined to be those second clients. The server then receives acceptance information from at least one second client, determines the second users of the accepting second clients as users participating in the live broadcast interaction, controls the cameras of the accepting second clients to start, and acquires the second real-scene data captured by those cameras. Because the interaction invitation information is sent to the second clients of the second users watching the first user's live broadcast, the second clients participating in the interactive game can be determined, and the second real-scene images they collect can be acquired.
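For illustration, the invitation flow above can be sketched as server-side logic. This is a minimal sketch in Python, not the patent's implementation: the server transport object, its methods, and the message names are all assumptions introduced here.

    # Minimal sketch of the invitation flow in step S201 (illustrative only).
    # The `server` object, its methods, and the message names are hypothetical.

    def start_live_interaction(server, first_client_id):
        """Invite every second client that is displaying the first client's live page."""
        for viewer_id in server.clients_viewing(first_client_id):
            server.push(viewer_id, {"type": "interaction_invite", "anchor": first_client_id})

    def on_invite_accepted(server, second_client_id):
        """When a second client accepts: start its camera and register it as a participant."""
        server.push(second_client_id, {"type": "camera_on"})
        server.register_participant(second_client_id)  # its second real-scene data now streams in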
S202: and generating a first virtual prop corresponding to a first feature in the scene where the first user is located and a second virtual prop corresponding to a second feature in the scene where the second user is located according to the interactive game selected by the first client and the first real-scene data and the second real-scene data.
In specific implementation, the live broadcast interactive platform provides a plurality of interactive games for live broadcast interaction, after the server determines the interactive game selected by the first user through the first client, according to the interactive game, the first feature required by the interactive game is identified from the acquired first live-action data, the first feature is converted into a first virtual item required by the interactive game, the second feature required by the interactive game is identified from the acquired second live-action data, and the second feature is converted into a second virtual item required by the interactive game.
Before step S202, the live broadcast interaction method further includes:
sending a game list to the first client, and determining the interactive game selected by the first client according to the game list, where the game list includes a game name and a game profile for each interactive game.
In a specific implementation, after responding to the instruction from the first client to start the live broadcast interaction function, the server sends the first client a game list carrying the game name and game profile of each interactive game. After confirming the interactive game that the first user selected from the game list through the first client, the server identifies, according to that game, the first feature matched with the interactive game from the first real-scene data and the second feature matched with the interactive game from the second real-scene data.
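As a concrete illustration, a game list entry might look like the following; the patent only requires that each entry carry a game name and a game profile, so the field names here are assumptions.

    # Hypothetical shape of the game list sent to the first client.
    game_list = [
        {"game_id": "pinball",
         "name": "Pinball",
         "profile": "The anchor's head becomes the ball; a viewer's arm becomes the paddle."},
        {"game_id": "avatar",
         "name": "Avatar interaction",
         "profile": "Both users' bodies are mapped to cartoon characters in a shared scene."},
    ]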
Further, in step S202, the generating, according to the interactive game selected by the first client and the first real-scene data and the second real-scene data, of a first virtual prop corresponding to a first feature in the scene where the first user is located and a second virtual prop corresponding to a second feature in the scene where the second user is located includes the following steps:
Step 2021: acquiring game data of the interactive game from an interactive game library according to the interactive game selected by the first client; the game data includes the description information of the first feature, the description information of the second feature, the description information of the first virtual prop, the description information of the second virtual prop, scene data of a preset game scene, and a game description.
In a specific implementation, the live broadcast interaction platform provides various interactive games for live broadcast interaction, and the game data of these interactive games are stored in an interactive game library; after determining the interactive game selected by the first client, the server can acquire the game data of that interactive game from the library.
It should be noted that the game data of an interactive game includes description information of at least one feature, description information of the virtual prop corresponding to that feature, scene data of a preset game scene, and a game description. A feature may be a specific part of a human body, such as a head, an arm, or a thigh; an article of a preset shape, such as a basketball or a cube; or a preset picture, such as a photograph.
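A possible in-memory layout for this game data is sketched below; the class and field names are assumptions that simply mirror the items listed above.

    from dataclasses import dataclass

    @dataclass
    class FeatureSpec:
        kind: str    # "body_part", "shaped_object", or "picture"
        detail: str  # e.g. "head", "arm", "basketball", or a photo reference

    @dataclass
    class GameData:
        first_feature: FeatureSpec   # description information of the first feature
        second_feature: FeatureSpec  # description information of the second feature
        first_prop: str              # description of the first virtual prop, e.g. "ball"
        second_prop: str             # description of the second virtual prop, e.g. "paddle"
        scene_data: dict             # scene data of the preset game scene
        game_description: str        # game description sent to both clients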
Step 2022: and generating a first virtual prop corresponding to a first feature in the scene where the first user is located and a second virtual prop corresponding to a second feature in the scene where the second user is located according to the game data, the first real-scene data and the second real-scene data.
In a specific implementation, the server acquires game data of the interactive game from an interactive game database according to the interactive game selected by the first client, further identifies a first feature from the first real-scene data and a second feature from the second real-scene data according to the game data, converts the first feature into a first virtual item and converts the second feature into a second virtual item according to the game data, and specifically, an image processing tool can be used to correspondingly convert an image of the first feature into an image of the first virtual item and convert an image of the second feature into an image of the second virtual item.
Here, an Augmented Reality Development Kit (AR SDK) may be embedded in the image processing tool in advance, and then the obtained live-action image is processed by the image processing tool, so that the feature in the live-action image may be identified, and the feature may be correspondingly converted into a three-dimensional virtual prop, where the three-dimensional virtual prop is obtained by performing three-dimensional modeling according to the feature.
Further, in step 2022, the generating, according to the game data, the first real-scene data, and the second real-scene data, of a first virtual prop corresponding to a first feature in the scene where the first user is located and a second virtual prop corresponding to a second feature in the scene where the second user is located includes the following steps:
Step (1): identifying the first feature from the first real-scene data according to the description information of the first feature in the game data, and identifying the second feature from the second real-scene data according to the description information of the second feature in the game data.
In a specific implementation, the server acquires the game data of the interactive game, which includes the description information of the first feature and of the second feature, from the interactive game library according to the interactive game selected by the first client; it then identifies the first feature from the first real-scene data according to the description information of the first feature, and the second feature from the second real-scene data according to the description information of the second feature. Here, the first feature and the second feature may be the same feature, or may be two different features that interact and match each other in the interactive game.
Step (2): converting the first feature into the first virtual prop according to the description information of the first virtual prop in the game data, and converting the second feature into the second virtual prop according to the description information of the second virtual prop in the game data.
In a specific implementation, after the first feature is identified from the first real-scene data, feature points are extracted from it according to the description information of the first virtual prop, and a three-dimensional model of the first virtual prop is built from those feature points, thereby converting the first feature into the first virtual prop; likewise, after the second feature is identified from the second real-scene data, feature points are extracted from it according to the description information of the second virtual prop, and a three-dimensional model of the second virtual prop is built from those feature points, thereby converting the second feature into the second virtual prop.
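Steps (1) and (2) together form a detect-then-model pipeline. The sketch below assumes a generic AR toolkit wrapper `ar` with detect, feature_points, and build_prop operations; these names are placeholders, not the API of any real SDK.

    def generate_props(game_data, first_scene_img, second_scene_img, ar):
        """Step (1): identify each feature in its real-scene data; step (2): model the props."""
        first_feature = ar.detect(first_scene_img, game_data.first_feature)     # e.g. find the head
        second_feature = ar.detect(second_scene_img, game_data.second_feature)  # e.g. find the arm
        # Extract feature points and build a three-dimensional model per prop description.
        first_prop = ar.build_prop(ar.feature_points(first_feature), game_data.first_prop)
        second_prop = ar.build_prop(ar.feature_points(second_feature), game_data.second_prop)
        return first_prop, second_prop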
In one example, the interactive game is a pinball game: the first feature is a head, the second feature is an arm, the first virtual prop is a ball, and the second virtual prop is a paddle. The head is recognized from the first real-scene data and correspondingly converted into the ball, and the arm is recognized from the second real-scene data and correspondingly converted into the paddle.
Further, after step S202, the live broadcast interaction method further includes:
sending the game description corresponding to the interactive game to the first client and the second client; and responding to an instruction for starting the interactive game sent by the first client, acquiring the behavior data of the first user operating the first feature collected by the first client and the behavior data of the second user operating the second feature collected by the second client.
In a specific implementation, after determining the interactive game selected by the first client, the server sends the game description corresponding to the interactive game to the first client and to each second client that accepted the interaction invitation, so that the first user and the second user both know how the interactive game is played and can carry out the game according to the description. After receiving the instruction from the first client to start the interactive game, the server enters the interactive game; at this point it acquires, in real time, the first real-scene data collected by the first client and determines from it the behavior data of the first user operating the first feature, and acquires, in real time, the second real-scene data collected by the second client and determines from it the behavior data of the second user operating the second feature.
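The real-time acquisition described here amounts to a per-frame loop. A minimal sketch, assuming hypothetical latest_frame, game_finished, and track helpers:

    def behavior_stream(server, first_id, second_id, game_data, ar):
        """Yield per-frame behavior data for both features after the start-game instruction."""
        while not server.game_finished(first_id, second_id):
            frame_1 = server.latest_frame(first_id)   # first real-scene data, in real time
            frame_2 = server.latest_frame(second_id)  # second real-scene data, in real time
            # Behavior data: how each user is moving their feature in this frame.
            yield (ar.track(frame_1, game_data.first_feature),
                   ar.track(frame_2, game_data.second_feature))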
S203: and controlling the first client and the second client to present a virtual game scene matched with the interactive game according to the behavior data of the first user for operating the first feature acquired by the first client and the behavior data of the second user for operating the second feature acquired by the second client, and presenting a first interactive behavior corresponding to the first virtual prop and a second interactive behavior corresponding to the second virtual prop in the virtual game scene.
In specific implementation, when entering an interactive game, a server side acquires first real-scene data acquired by a first client side in real time, determines behavior data of a first user operating a first feature object from the first real-scene data, acquires second real-scene data acquired by a second client side in real time, determines behavior data of a second user operating a second feature object from the second real-scene data, further converts the first feature object into a first virtual item capable of generating a first interactive behavior according to the behavior data of the first user operating the first feature object, converts the second feature object into a second virtual item capable of generating a second interactive behavior according to the behavior data of the second user operating the second feature object, and further controls the first client side and the second client side to present a virtual game scene matched with the interactive game and present a first interactive behavior of the first virtual item and a second interactive behavior corresponding to the second virtual item in the virtual game scene.
It should be noted that, in the process of an interactive game between a first user at a first client and a second user at a second client, a server presents a virtual game scene matched with the interactive game at the first client and the second client, the first user operates a first feature object in front of a camera of the first client, the second user operates a second feature object in front of a camera of the second client, and the server presents the virtual game scene, an image of an interactive behavior of the first virtual item and an image of an interactive behavior of the second virtual item at the first client and the second client together, where both the first user and the second user can see a change state of the first virtual item corresponding to the first feature object and a change state of the second virtual item corresponding to the second feature on screens of the first client and the second client at the same time, where the change states include a change in position and a change in shape.
Further, the virtual game scene matched with the interactive game is presented in either of the following modes.
Mode 1: scene data of a preset game scene corresponding to the interactive game is obtained from the game data of the interactive game, and the virtual game scene is presented based on that scene data.
In a specific implementation, after determining the interactive game selected through the first client, the server obtains the game data corresponding to the interactive game from the interactive game library, obtains the scene data of the preset game scene carried in the game data, and controls the first client and the second client to present the virtual game scene according to that scene data.
Here, the virtual game scene is the game scene of the interactive game itself, and different interactive games have different virtual game scenes.
Mode 2: environment data is extracted from the first real-scene data, where the environment data is the data other than the character data in the first real-scene data, and the virtual game scene is presented based on the environment data.
In a specific implementation, after determining the interactive game selected by the first user, the server determines the virtual game scene matched with the interactive game; when the virtual game scene is to be determined from the first real-scene data, the server separates the environment data from the character data in the first real-scene data, extracts and processes the environment data, and controls the first client and the second client to present the virtual game scene.
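Mode 2 is essentially a person/background separation problem. The patent does not name a segmentation tool, so the following sketch, which uses MediaPipe's selfie-segmentation solution to blank out the character data and keep only the environment, is one possible implementation rather than the patent's own.

    import cv2
    import mediapipe as mp

    def extract_environment(frame_bgr):
        """Return the real-scene frame with character pixels removed (environment data only)."""
        with mp.solutions.selfie_segmentation.SelfieSegmentation(model_selection=0) as seg:
            result = seg.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
        person_mask = result.segmentation_mask > 0.5  # True where the character is
        environment = frame_bgr.copy()
        environment[person_mask] = 0                  # drop character data, keep the environment
        return environment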
In one example, the interactive game is a pinball game: the first feature is a head, the second feature is an arm, the first virtual prop is a ball, and the second virtual prop is a paddle. The head is recognized from the first real-scene data and converted into the ball, and the arm is recognized from the second real-scene data and converted into the paddle. The first user moves the ball in the virtual game scene by moving his or her head, and the second user moves his or her arm so that the paddle in the virtual game scene meets and catches the ball, thereby realizing interaction in the virtual scene.
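The pinball example maps the two behavior streams onto the two props once per frame. A toy sketch of that mapping follows; every class and method in it is hypothetical.

    def pinball_frame(ball, paddle, head_behavior, arm_behavior, scene):
        """One frame of the pinball example: the head drives the ball, the arm the paddle."""
        ball.x = scene.map_to_scene_x(head_behavior.x)   # first user's head steers the ball
        paddle.x = scene.map_to_scene_x(arm_behavior.x)  # second user's arm positions the paddle
        ball.advance()                                   # apply the ball's velocity for this frame
        if paddle.intersects(ball):                      # the paddle meets the ball: a catch
            ball.bounce()
            scene.add_score(player="second")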
In another example, the interactive game is an avatar interaction: the first feature is the body of the first user, the second feature is the body of the second user, the first virtual prop is cartoon character 1, and the second virtual prop is cartoon character 2. The scene where the first user is located is converted into the virtual game scene, and each user controls his or her own body to realize interaction between cartoon character 1 and cartoon character 2 in the virtual game scene, where the interaction may be face pinching, handshaking, and the like.
Further, after step S203, the live broadcast interaction method further includes the following steps:
when it is detected that the first client and the second client have completed the interactive game, calculating a first game result of the first client in the interactive game process and a second game result of the second client in the interactive game process; and generating a game result ranking list of the interactive game according to the first game result and the second game result, and controlling the first client and the second client to present the game result ranking list.
In a specific implementation, when it is detected that the first client and the second client have completed the interactive game, the first game result of the first client and the second game result of the second client during the game are calculated according to the first behavior data of the first virtual prop and the second behavior data of the second virtual prop during the interactive game, where the game results include scores, points, and the like. A game result ranking list of the interactive game is then generated according to the first game result and the second game result, and the first client and the second client are controlled to present the ranking list.
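Generating the game result ranking list reduces to a sort over the per-client results. A minimal sketch, with assumed field names:

    def build_leaderboard(results):
        """results maps a client id to its game result (score/points); best ranks first."""
        ordered = sorted(results.items(), key=lambda item: item[1], reverse=True)
        return [{"rank": i + 1, "client": client_id, "score": score}
                for i, (client_id, score) in enumerate(ordered)]

    # Example: build_leaderboard({"first_client": 12, "second_client": 9})
    # -> [{'rank': 1, 'client': 'first_client', 'score': 12},
    #     {'rank': 2, 'client': 'second_client', 'score': 9}]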
In the embodiments of the present application, according to the interactive game selected by the first client, the first real-scene data of the scene where the first user is located, and the second real-scene data of the scene where the second user is located, a first virtual prop corresponding to a first feature in the first user's scene and a second virtual prop corresponding to a second feature in the second user's scene can be generated; and according to the behavior data of the first user operating the first feature and the behavior data of the second user operating the second feature, the first client and the second client can be controlled to present a virtual game scene matched with the interactive game, with the first interactive behavior corresponding to the first virtual prop and the second interactive behavior corresponding to the second virtual prop presented in that scene. In this way, the user's actual operations are merged into the virtual game scene, enhancing the user's sense of immersion, improving the live broadcast interaction effect, and improving the utilization efficiency of live broadcast resources. Moreover, adding the live broadcast interaction mode of the present application to a live broadcast increases the diversity of interaction modes and promotes user participation in the live broadcast.
Based on the same inventive concept, an embodiment of the present application further provides a live broadcast interaction apparatus corresponding to the live broadcast interaction method. Since the principle by which the apparatus solves the problem is similar to that of the live broadcast interaction method of the present application, the implementation of the apparatus may refer to the implementation of the method, and repeated parts are not described again.
Fig. 3 is a first functional block diagram of a live broadcast interaction apparatus 300 according to an embodiment of the present disclosure, fig. 4 is a second functional block diagram of the live broadcast interaction apparatus 300, fig. 5 is a functional block diagram of the first generation module 320 in fig. 3, and fig. 6 is a functional block diagram of the control module 330 in fig. 3.
As shown in fig. 3 and 4, the live interactive apparatus 300 includes:
the first acquiring module 310 is configured to respond to a live broadcast interaction initiated by a first client, acquire first real-scene data of a scene where a first user is located, which is acquired by the first client, and acquire second real-scene data of a scene where a second user is located, which is acquired by a second client invited to perform the live broadcast interaction;
a first generating module 320, configured to generate, according to the interactive game selected by the first client and the first real-scene data and the second real-scene data, a first virtual prop corresponding to a first feature in the scene where the first user is located and a second virtual prop corresponding to a second feature in the scene where the second user is located;
a control module 330, configured to control, according to the behavior data of the first user operating the first feature collected by the first client and the behavior data of the second user operating the second feature collected by the second client, the first client and the second client to present a virtual game scene matched with the interactive game, and to present a first interactive behavior corresponding to the first virtual prop and a second interactive behavior corresponding to the second virtual prop in the virtual game scene.
In a possible implementation, as shown in fig. 3 and 4, the first obtaining module 310 is configured to obtain the second real-scene data according to the following steps:
responding to the live broadcast interaction initiated by the first client, and sending interaction invitation information to the second client which displays the live broadcast page of the first client;
and after the second client accepts the interaction invitation, controlling the camera of the second client to start, and acquiring the second real-scene data captured by the camera.
In a possible implementation manner, as shown in fig. 4, the live interactive apparatus 300 further includes:
a determining module 340, configured to send a game list to the first client and determine the interactive game selected by the first client according to the game list, where the game list includes a game name and a game profile for each interactive game.
In one possible embodiment, as shown in fig. 5, the first generating module 320 includes:
a first obtaining unit 322, configured to obtain, according to the interactive game selected by the first client, game data of the interactive game from an interactive game library; the game data comprise description information of the first feature, description information of the second feature, description information of the first virtual prop, description information of the second virtual prop, scene data of a preset game scene and game description;
a generating unit 324, configured to generate, according to the game data, the first real-scene data, and the second real-scene data, a first virtual prop corresponding to a first feature in the scene where the first user is located and a second virtual prop corresponding to a second feature in the scene where the second user is located.
In a possible embodiment, as shown in fig. 5, the generating unit 324 is configured to generate the first virtual prop and the second virtual prop according to the following steps:
identifying the first feature from the first real-scene data according to the description information of the first feature in the game data, and identifying the second feature from the second real-scene data according to the description information of the second feature in the game data;
and converting the first feature into the first virtual prop according to the description information of the first virtual prop in the game data, and converting the second feature into the second virtual prop according to the description information of the second virtual prop in the game data.
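A sketch of this two-step recognize-then-convert pipeline; recognize and convert stand in for whatever computer-vision and rendering components an implementation would supply:

    # Hypothetical sketch of generating unit 324: recognition followed by
    # conversion, driven entirely by the description information above.
    def generate_props(game_data, first_scene, second_scene, recognize, convert):
        first_feature = recognize(first_scene, game_data.first_feature_desc)
        second_feature = recognize(second_scene, game_data.second_feature_desc)
        first_prop = convert(first_feature, game_data.first_prop_desc)
        second_prop = convert(second_feature, game_data.second_prop_desc)
        return first_prop, second_prop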
In one possible embodiment, the first and second features include:
a specific part of a human body, an article having a predetermined shape, or a predetermined picture.
In a possible implementation manner, as shown in fig. 4, the live interactive apparatus 300 further includes:
a sending module 350, configured to send a game description corresponding to the interactive game to the first client and the second client;
a second obtaining module 360, configured to obtain, in response to an instruction sent by the first client to start the interactive game, behavior data of the first user operating the first feature collected by the first client and behavior data of the second user operating the second feature collected by the second client.
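A sketch of how modules 350 and 360 could cooperate, assuming hypothetical show, wait_for and collect_behavior helpers:

    # Hypothetical flow: distribute the game description, then collect
    # behavior data once the first client starts the game.
    def start_interactive_game(server, first_client, second_client, game_data):
        for client in (first_client, second_client):
            client.show(game_data.game_description)   # module 350
        server.wait_for(first_client, event="start_game")
        behavior_a = first_client.collect_behavior()   # operating feature 1
        behavior_b = second_client.collect_behavior()  # operating feature 2
        return behavior_a, behavior_b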
In one possible embodiment, as shown in fig. 6, the control module 330 includes:
a second obtaining unit 332, configured to obtain scene data of a preset game scene corresponding to the interactive game from the game data of the interactive game;
a first presenting unit 334, configured to present the virtual game scene based on the scene data.
In one possible implementation, as shown in fig. 6, the control module 330 further includes:
an extracting unit 336, configured to extract environment data from the first real-scene data; wherein the environment data is the data in the first real-scene data other than the character data;
a second presenting unit 338, configured to present the virtual game scene based on the environment data.
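One plausible reading of "environment data" is everything in a frame except the person; the sketch below assumes a person-segmentation model is available (segment_person is hypothetical):

    # Hypothetical extraction of environment data from one video frame.
    import numpy as np

    def extract_environment(frame: np.ndarray, segment_person) -> np.ndarray:
        person_mask = segment_person(frame)  # boolean mask, True on the person
        environment = frame.copy()
        environment[person_mask] = 0         # remove the character data
        return environment                   # used as the scene's backdrop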
In a possible implementation manner, as shown in fig. 4, the live interaction apparatus 300 further includes:
a calculating module 370, configured to calculate a first game result of the first client in the interactive game process and a second game result of the second client in the interactive game process when it is detected that the first client and the second client complete the interactive game;
the second generating module 380 is configured to generate a game result ranking list of the interactive game according to the first game result and the second game result, and to control the first client and the second client to present the game result ranking list.
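A sketch of modules 370 and 380, with user_id and show_ranking assumed as hypothetical client attributes:

    # Hypothetical end-of-game flow: score both clients, rank the results,
    # and have both clients present the same ranking list.
    def finish_game(first_client, second_client, first_score, second_score):
        ranking = sorted(
            [(first_client.user_id, first_score),
             (second_client.user_id, second_score)],
            key=lambda entry: entry[1], reverse=True)
        for client in (first_client, second_client):
            client.show_ranking(ranking)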
In the embodiment of the application, according to the interactive game selected by the first client, the first real-scene data of the scene where the first user is located and the second real-scene data of the scene where the second user is located, the first virtual prop corresponding to the first feature in the scene where the first user is located and the second virtual prop corresponding to the second feature in the scene where the second user is located can be generated; and according to the behavior data of the first user operating the first feature and the behavior data of the second user operating the second feature, the first client and the second client can be controlled to present a virtual game scene matched with the interactive game, and to present, in the virtual game scene, a first interactive behavior corresponding to the first virtual prop and a second interactive behavior corresponding to the second virtual prop. In this way, the actual operations of the users are merged into the virtual game scene, which enhances the users' sense of immersion, improves the live broadcast interaction effect, and improves the utilization efficiency of live broadcast resources. Moreover, adding the live broadcast interaction mode of the present application to a live broadcast increases the diversity of interaction modes available in the live broadcast and promotes user participation.
Based on the same application concept, referring to fig. 7, which is a schematic structural diagram of an electronic device 700 provided in the embodiment of the present application, the electronic device 700 includes: a processor 710, a memory 720 and a bus 730. The memory 720 stores machine-readable instructions executable by the processor 710; when the electronic device 700 runs, the processor 710 communicates with the memory 720 via the bus 730, and the machine-readable instructions, when executed by the processor 710, perform the steps of the live broadcast interaction method shown in fig. 2.
In particular, the machine readable instructions, when executed by the processor 710, may perform the following:
responding to a live broadcast interaction initiated by a first client, acquiring first real-scene data of a scene where a first user is located collected by the first client, and acquiring second real-scene data of a scene where a second user is located collected by a second client invited to perform the live broadcast interaction;
generating, according to the interactive game selected by the first client, the first real-scene data and the second real-scene data, a first virtual prop corresponding to a first feature in the scene where the first user is located and a second virtual prop corresponding to a second feature in the scene where the second user is located;
and controlling, according to the behavior data of the first user operating the first feature collected by the first client and the behavior data of the second user operating the second feature collected by the second client, the first client and the second client to present a virtual game scene matched with the interactive game, and presenting, in the virtual game scene, a first interactive behavior corresponding to the first virtual prop and a second interactive behavior corresponding to the second virtual prop.
In the embodiment of the application, according to the interactive game selected by the first client, the first real-scene data of the scene where the first user is located and the second real-scene data of the scene where the second user is located, the first virtual prop corresponding to the first feature in the scene where the first user is located and the second virtual prop corresponding to the second feature in the scene where the second user is located can be generated; and according to the behavior data of the first user operating the first feature and the behavior data of the second user operating the second feature, the first client and the second client can be controlled to present a virtual game scene matched with the interactive game, and to present, in the virtual game scene, a first interactive behavior corresponding to the first virtual prop and a second interactive behavior corresponding to the second virtual prop. In this way, the actual operations of the users are merged into the virtual game scene, which enhances the users' sense of immersion, improves the live broadcast interaction effect, and improves the utilization efficiency of live broadcast resources. Moreover, adding the live broadcast interaction mode of the present application to a live broadcast increases the diversity of interaction modes available in the live broadcast and promotes user participation.
Based on the same application concept, an embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored; when the computer program is executed by a processor, the steps of the live broadcast interaction method shown in fig. 2 are performed.
Specifically, the storage medium may be a general storage medium, such as a removable disk or a hard disk. When the computer program on the storage medium is executed, the above live broadcast interaction method can be performed, thereby improving the live broadcast interaction effect and the utilization efficiency of live broadcast resources.
It is clear to those skilled in the art that, for convenience and brevity of description, reference may be made to the corresponding processes in the foregoing method embodiments for the specific working processes of the system and apparatus described above, which are not repeated here.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is only one logical division, and there may be other divisions in actual implementation. For example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some communication interfaces, devices or units, and may be in electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as an independent product, they may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any changes or substitutions that a person skilled in the art could readily conceive of within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. A live broadcast interaction method is characterized by comprising the following steps:
responding to live broadcast interaction initiated by a first client, acquiring first live-action data of a scene where a first user is located and acquired by the first client, and acquiring second live-action data of a scene where a second user is located and acquired by a second client invited to carry out live broadcast interaction;
according to the interactive game selected by the first client, the first live-action data and the second live-action data, identifying a first feature from the first live-action data and a second feature from the second live-action data, and generating a first virtual prop corresponding to the first feature in the scene where the first user is located and a second virtual prop corresponding to the second feature in the scene where the second user is located; wherein the first virtual prop is generated by converting the first feature, and the second virtual prop is generated by converting the second feature; the first feature and the second feature comprise: a specific part of a human body, an article with a preset shape, and a preset picture; and the first feature and the second feature are two different features that interact and cooperate with each other in the interactive game;
controlling, according to the behavior data of the first user operating the first feature acquired by the first client and the behavior data of the second user operating the second feature acquired by the second client, the first client and the second client to present a virtual game scene matched with the interactive game, and presenting, in the virtual game scene, a first interactive behavior corresponding to the first virtual prop and a second interactive behavior corresponding to the second virtual prop; wherein images of the virtual game scene and of the interactive behaviors of the first virtual prop and the second virtual prop are presented simultaneously on both the first client and the second client;
the presenting the virtual game scene includes:
extracting environment data from the first live-action data; the environment data is data except character data in the first live-action data;
presenting the virtual game scene based on the environmental data.
2. The live broadcast interaction method of claim 1, wherein the obtaining second live-action data of a scene where a second user is located, which is collected by a second client invited to perform live broadcast interaction, in response to the live broadcast interaction initiated by the first client comprises:
responding to the live broadcast interaction initiated by the first client, and sending interaction invitation information to the second client displaying the live broadcast page of the first client;
and after the second client accepts the interaction invitation, controlling to turn on a camera of the second client, and acquiring the second live-action data collected by the camera.
3. The live interaction method as claimed in claim 1, wherein before generating a first virtual item corresponding to a first feature in a scene of the first user and a second virtual item corresponding to a second feature in a scene of the second user according to the interactive game selected by the first client and the first and second live-action data, the live interaction method further comprises:
sending a game list to the first client, and determining the interactive game selected by the first client according to the game list; wherein the game list comprises a game name and a game profile corresponding to each interactive game in the game list.
4. The live broadcast interaction method of claim 1, wherein the generating a first virtual item corresponding to a first feature in a scene of the first user and a second virtual item corresponding to a second feature in a scene of the second user according to the interactive game selected by the first client and the first and second live-action data comprises:
acquiring game data of the interactive game from an interactive game library according to the interactive game selected by the first client; the game data comprise description information of the first feature, description information of the second feature, description information of the first virtual prop, description information of the second virtual prop, scene data of a preset game scene and game description;
and generating, according to the game data, the first live-action data and the second live-action data, a first virtual prop corresponding to a first feature in the scene where the first user is located and a second virtual prop corresponding to a second feature in the scene where the second user is located.
5. The live interaction method as claimed in claim 4, wherein the generating, according to the game data, the first live-action data and the second live-action data, a first virtual item corresponding to a first feature in a scene of the first user and a second virtual item corresponding to a second feature in a scene of the second user comprises:
identifying the first feature from the first live-action data according to the description information of the first feature in the game data, and identifying the second feature from the second live-action data according to the description information of the second feature in the game data;
and converting the first feature into the first virtual prop according to the description information of the first virtual prop in the game data, and converting the second feature into the second virtual prop according to the description information of the second virtual prop in the game data.
6. The live interaction method of claim 1, wherein after the generating a first virtual item corresponding to a first feature in a scene of the first user and a second virtual item corresponding to a second feature in a scene of the second user, the live interaction method further comprises:
sending game descriptions corresponding to the interactive games to the first client and the second client;
and in response to an instruction for starting the interactive game sent by the first client, acquiring the behavior data of the first user operating the first feature collected by the first client and the behavior data of the second user operating the second feature collected by the second client.
7. The live interaction method of claim 1, wherein the presenting a virtual game scene matched with the interactive game comprises:
acquiring scene data of a preset game scene corresponding to the interactive game from the game data of the interactive game;
presenting the virtual game scene based on the scene data.
8. The live interaction method of claim 1, further comprising:
when the first client and the second client are detected to finish the interactive game, calculating a first game result of the first client in the interactive game process and a second game result of the second client in the interactive game process;
and generating a game result ranking list of the interactive game according to the first game result and the second game result, and controlling the first client and the second client to present the game result ranking list.
9. A live interactive device, comprising:
the first acquisition module is used for responding to the live broadcast interaction initiated by the first client, acquiring first real-scene data of a scene where a first user is located and acquired by the first client, and acquiring second real-scene data of a scene where a second user is located and acquired by a second client invited to carry out the live broadcast interaction;
a first generating module, configured to identify, according to the interactive game selected by the first client, the first real-scene data and the second real-scene data, a first feature from the first real-scene data and a second feature from the second real-scene data, and generate a first virtual prop corresponding to the first feature in the scene where the first user is located and a second virtual prop corresponding to the second feature in the scene where the second user is located; wherein the first virtual prop is generated by converting the first feature, and the second virtual prop is generated by converting the second feature; the first feature and the second feature comprise: a specific part of a human body, an article with a preset shape, and a preset picture; and the first feature and the second feature are two different features that interact and cooperate with each other in the interactive game;
a control module, configured to control, according to the behavior data of the first user operating the first feature collected by the first client and the behavior data of the second user operating the second feature collected by the second client, the first client and the second client to present a virtual game scene matched with the interactive game, and to present, in the virtual game scene, a first interactive behavior corresponding to the first virtual prop and a second interactive behavior corresponding to the second virtual prop; wherein images of the virtual game scene and of the interactive behaviors of the first virtual prop and the second virtual prop are presented simultaneously on both the first client and the second client;
the control module further comprises:
an extracting unit, configured to extract environment data from the first real-scene data; wherein the environment data is the data in the first real-scene data other than the character data;
and the second presentation unit is used for presenting the virtual game scene based on the environment data.
10. The live interactive device as claimed in claim 9, wherein the first obtaining module is configured to obtain the second real-scene data according to the following steps:
responding to the live broadcast interaction initiated by the first client, and sending interaction invitation information to the second client displaying the live broadcast page of the first client;
and after the second client accepts the interaction invitation, controlling to turn on a camera of the second client, and acquiring the second real-scene data collected by the camera.
11. The live interaction device of claim 9, further comprising:
a determining module, configured to send a game list to the first client, and determine the interactive game selected by the first client according to the game list; wherein the game list comprises a game name and a game profile corresponding to each interactive game in the game list.
12. The live interaction device of claim 9, wherein the first generating module comprises:
the first acquisition unit is used for acquiring game data of the interactive game from an interactive game library according to the interactive game selected by the first client; the game data comprise description information of the first feature, description information of the second feature, description information of the first virtual prop, description information of the second virtual prop, scene data of a preset game scene and game description;
and the generating unit is used for generating a first virtual prop corresponding to a first feature in a scene where the first user is located and a second virtual prop corresponding to a second feature in a scene where the second user is located according to the game data, the first real-scene data and the second real-scene data.
13. The live interaction device of claim 12, wherein the generating unit is configured to generate the first virtual item and the second virtual item according to the following steps:
identifying the first feature from the first real-scene data according to the description information of the first feature in the game data, and identifying the second feature from the second real-scene data according to the description information of the second feature in the game data;
and converting the first feature into the first virtual prop according to the description information of the first virtual prop in the game data, and converting the second feature into the second virtual prop according to the description information of the second virtual prop in the game data.
14. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over the bus when the electronic device is operated, the machine-readable instructions, when executed by the processor, performing the steps of the live interaction method of any of claims 1 to 8.
15. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, performs the steps of the live interaction method as claimed in any one of claims 1 to 8.
CN201910919153.9A 2019-09-26 2019-09-26 Live broadcast interaction method and device, electronic equipment and readable storage medium Active CN110545442B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910919153.9A CN110545442B (en) 2019-09-26 2019-09-26 Live broadcast interaction method and device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910919153.9A CN110545442B (en) 2019-09-26 2019-09-26 Live broadcast interaction method and device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN110545442A CN110545442A (en) 2019-12-06
CN110545442B (en) 2022-12-16

Family

ID=68714562

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910919153.9A Active CN110545442B (en) 2019-09-26 2019-09-26 Live broadcast interaction method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN110545442B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111389015A (en) * 2020-02-27 2020-07-10 腾讯科技(深圳)有限公司 Method and device for determining game props and storage medium
CN111399653A (en) * 2020-03-24 2020-07-10 北京文香信息技术有限公司 Virtual interaction method, device, equipment and computer storage medium
CN111918090B (en) * 2020-08-10 2023-03-28 广州繁星互娱信息科技有限公司 Live broadcast picture display method and device, terminal and storage medium
CN113018849A (en) * 2021-03-30 2021-06-25 广州虎牙科技有限公司 Game interaction method, related device and equipment
CN113179446B (en) 2021-04-26 2022-05-27 北京字跳网络技术有限公司 Video interaction method and device, electronic equipment and storage medium
CN113873283A (en) * 2021-09-30 2021-12-31 思享智汇(海南)科技有限责任公司 Multi-player-participated AR game live broadcast system and method
CN114189743B (en) * 2021-12-15 2023-12-12 广州博冠信息科技有限公司 Data transmission method, device, electronic equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10567466B2 (en) * 2017-04-06 2020-02-18 Microsoft Technology Licensing, Llc Co-streaming within a live interactive video game streaming service
CN107566911B (en) * 2017-09-08 2021-06-29 广州方硅信息技术有限公司 Live broadcast method, device and system and electronic equipment

Also Published As

Publication number Publication date
CN110545442A (en) 2019-12-06

Similar Documents

Publication Publication Date Title
CN110545442B (en) Live broadcast interaction method and device, electronic equipment and readable storage medium
CN112348969B (en) Display method and device in augmented reality scene, electronic equipment and storage medium
CN111641844B (en) Live broadcast interaction method and device, live broadcast system and electronic equipment
CN111556278B (en) Video processing method, video display device and storage medium
CN111638793B (en) Display method and device of aircraft, electronic equipment and storage medium
CN110119700B (en) Avatar control method, avatar control device and electronic equipment
CN112148189A (en) Interaction method and device in AR scene, electronic equipment and storage medium
CN110472099B (en) Interactive video generation method and device and storage medium
CN111640197A (en) Augmented reality AR special effect control method, device and equipment
CN111638797A (en) Display control method and device
US11882336B2 (en) Method and system for interaction in live streaming
CN111643900A (en) Display picture control method and device, electronic equipment and storage medium
US20170171621A1 (en) Method and Electronic Device for Information Processing
CN111651057A (en) Data display method and device, electronic equipment and storage medium
CN112905014A (en) Interaction method and device in AR scene, electronic equipment and storage medium
CN111569436A (en) Processing method, device and equipment based on interaction in live broadcast fighting
CN111640169A (en) Historical event presenting method and device, electronic equipment and storage medium
CN108134945B (en) AR service processing method, AR service processing device and terminal
CN108537149B (en) Image processing method, image processing device, storage medium and electronic equipment
CN113648650A (en) Interaction method and related device
CN111651049B (en) Interaction method, device, computer equipment and storage medium
CN111569414B (en) Flight display method and device of virtual aircraft, electronic equipment and storage medium
CN112333498A (en) Display control method and device, computer equipment and storage medium
CN115624740A (en) Virtual reality equipment, control method, device and system thereof, and interaction system
CN114489337A (en) AR interaction method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant