CN113115061B - Live broadcast interaction method and device, electronic equipment and storage medium
- Publication number
- CN113115061B (application CN202110370982.3A)
- Authority
- CN
- China
- Prior art keywords
- live broadcast
- interaction
- virtual object
- live
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- All classifications fall under H—ELECTRICITY → H04—ELECTRIC COMMUNICATION TECHNIQUE → H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION → H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/2187—Live feed (under H04N21/218—Source of audio or video content, e.g. local disk arrays)
- H04N21/4307—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4725—End-user interface for requesting additional data associated with the content using interactive regions of the image, e.g. hot spots
- H04N21/482—End-user interface for program selection
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of the present disclosure relate to a live broadcast interaction method and apparatus, an electronic device, and a storage medium. The method includes: determining an interaction area on a live layer, where the live layer is superimposed on the live interface of a virtual object, the live interface is the interface displayed synchronously on every terminal that has entered the virtual object's live room, the content displayed on the live layer and the live content displayed on the live interface together constitute the content displayed on the target user's terminal, and the interaction area stays frame-synchronized with the live interface; in response to a touch operation by the target user in the interaction area, determining interaction information corresponding to the touch operation; and displaying the interaction information at a preset position on the live layer. The embodiments of the present disclosure achieve a personalized live interaction effect, so that different users see differentiated display content during live broadcast.
Description
Technical Field
The present disclosure relates to the field of internet live broadcast technologies, and in particular, to a live broadcast interaction method and apparatus, an electronic device, and a storage medium.
Background
The development of internet live broadcast technology has made live streaming a popular form of entertainment and consumption. In existing live broadcast modes, users can interact in a live room by chatting with the anchor, presenting gifts to the anchor, commenting, and so on.
However, in existing live broadcast modes, a large number of users enter the same live room, the interaction information between each user and the anchor is synchronized directly to the live room's display through the background server, and the live pictures seen by the users are essentially identical, so a personalized live interaction effect cannot be achieved.
Disclosure of Invention
In order to solve the technical problems or at least partially solve the technical problems, embodiments of the present disclosure provide a live broadcast interaction method, apparatus, electronic device, and storage medium.
In a first aspect, an embodiment of the present disclosure provides a live broadcast interaction method, applied to a terminal entering a virtual object live broadcast room, including:
determining an interaction area on a live layer, where the live layer is superimposed on the live interface of the virtual object, the live interface is the interface displayed synchronously on every terminal that has entered the virtual object's live room, the content displayed on the live layer and the live content displayed on the live interface together constitute the content displayed on the target user's terminal, and the interaction area stays frame-synchronized with the live interface;
responding to the touch operation of a target user in the interaction area, and determining interaction information corresponding to the touch operation;
and displaying the interactive information at a preset position on the live broadcasting layer.
In a second aspect, an embodiment of the present disclosure further provides a live broadcast interaction apparatus, configured at a terminal entering a virtual object live broadcast room, including:
an interaction area determining module, configured to determine an interaction area on the live layer, where the live layer is superimposed on the live interface of the virtual object, the live interface is the interface displayed synchronously on every terminal that has entered the virtual object's live room, the content displayed on the live layer and the live content displayed on the live interface together constitute the content displayed on the target user's terminal, and the interaction area stays frame-synchronized with the live interface;
the interaction information determining module is used for responding to the touch operation of the target user in the interaction area and determining interaction information corresponding to the touch operation;
and the interactive information display module is used for displaying the interactive information at a preset position on the live broadcast layer.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including a memory and a processor, where the memory stores a computer program, and when the computer program is executed by the processor, the electronic device is enabled to implement any one of the live broadcast interaction methods provided in the embodiments of the present disclosure.
In a fourth aspect, an embodiment of the present disclosure further provides a computer-readable storage medium, where a computer program is stored in the storage medium, and when the computer program is executed by a computing device, the computing device is enabled to implement any one of the live broadcast interaction methods provided in the embodiments of the present disclosure.
Compared with the prior art, the technical solution provided by the embodiments of the present disclosure has at least the following advantages. For each terminal entering the virtual object's live room, a live layer is superimposed on the live interface of the virtual object displayed by that terminal. An interaction area on the live layer receives touch operations from the target user; the terminal determines interaction information from the target user's touch operation and then displays it at a preset position on the live layer. The content displayed on the live layer and the live content displayed on the live interface together constitute the content displayed on the target user's terminal. During live broadcast, different users thus view the same display content through the live interface while viewing personalized, per-user display content through the live layer. This solves the problem that existing live broadcast schemes cannot achieve a personalized interactive live effect, achieves a personalized live interaction effect, and lets different users see differentiated display content during live broadcast.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It will be apparent to those skilled in the art that other drawings can be derived from these drawings without inventive effort.
Fig. 1 is a flowchart of a live broadcast interaction method according to an embodiment of the present disclosure;
fig. 2 is a flowchart of another live broadcast interaction method according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a terminal display interface related to limb interaction according to an embodiment of the present disclosure;
fig. 4 is a flowchart of another live broadcast interaction method according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram of a terminal display interface related to item production according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of another terminal display interface related to item production according to an embodiment of the present disclosure;
fig. 7 is a flowchart of another live broadcast interaction method according to an embodiment of the present disclosure;
fig. 8 is a schematic diagram of a terminal display interface related to item finding according to an embodiment of the present disclosure;
fig. 9 is a flowchart of another live broadcast interaction method according to an embodiment of the present disclosure;
fig. 10 is a schematic diagram of a terminal display interface related to live image generation according to an embodiment of the present disclosure;
fig. 11 is a schematic diagram of another terminal display interface related to live image generation according to an embodiment of the present disclosure;
fig. 12 is a schematic structural diagram of a live broadcast interaction apparatus according to an embodiment of the present disclosure;
fig. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
In order that the above objects, features and advantages of the present disclosure may be more clearly understood, aspects of the present disclosure will be further described below. It should be noted that the embodiments and features of the embodiments of the present disclosure may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure, but the present disclosure may be practiced in other ways than those described herein; it is to be understood that the embodiments disclosed in the specification are only a few embodiments of the present disclosure, and not all embodiments.
Fig. 1 is a flowchart of a live broadcast interaction method according to an embodiment of the present disclosure. The method can be applied to a live broadcast scene of a virtual object, where the virtual object is a virtual anchor distinct from a real person and its behavior is controlled by the live broadcast background device. The virtual object may be a 2D or 3D animated character, animal, or object, whose appearance and behavior are generated, driven, and controlled by the background device. The virtual object's actions may also be driven by a real person, animal, or object carrying sensing equipment: the control device collects the sensing data and drives the virtual object to produce corresponding actions, and the collection, driving, and live broadcast can all occur in real time. The live broadcast interaction method provided by the embodiments of the present disclosure can be executed by a live broadcast interaction apparatus, which can be implemented in software and/or hardware and integrated into an electronic device with computing capability, such as a mobile phone, tablet computer, desktop computer, or other terminal.
As shown in fig. 1, a live interaction method provided by the embodiment of the present disclosure may include:
s101, determining an interactive area on a live broadcast layer, wherein the live broadcast layer is superposed on a live broadcast interface of a virtual object, the live broadcast interface refers to an interface which is synchronously displayed in each terminal entering a live broadcast room of the virtual object, the content displayed on the live broadcast layer and the live broadcast content displayed on the live broadcast interface are jointly used as the content displayed on the terminal of a target user, and the interactive area and the live broadcast interface keep frame synchronization.
In the embodiments of the present disclosure, the interaction area on the live layer (also called the live floating layer) receives the target user's touch operations, so that the terminal can determine interaction information from those operations and then display it at a preset position on the live layer. During live broadcast, different users view the same display content through the live interface (also called the live picture) while viewing personalized display content through the live layer, which solves the problem that existing live broadcast schemes cannot achieve a personalized live interaction effect and achieves such an effect.
The live layer itself may be invisible to the user; for example, it may be set to a fully transparent state. The interaction area on the live layer can prompt the user according to a preset prompt strategy, so that the user can clearly locate the effective position of the interaction area. For example, after the target user's terminal enters the virtual object's live room, position prompt information about the interaction area can be displayed on the live interface, or a dynamic effect can be shown at the interaction area's position for a preset duration; once the preset duration elapses, the prompt information or the dynamic effect disappears.
Because the interaction area stays frame-synchronized with the live interface, the frames of the live video stream on which the interaction area appears are the same on every terminal and are determined by the live scene of the particular virtual object. The appearance and disappearance of the interaction area are dynamic processes. For example, as the live video stream plays, the interaction area may exist on the live layer from the Nth frame of the live interface through the (N+M)th frame: it appears at the Nth frame, disappears at the (N+M)th frame, and does not exist at the (N-1)th or (N+M+1)th frame. N and M are integers whose specific values can be preset for different live scenes; the embodiments of the present disclosure do not limit them. The size of the interaction area and the interaction modes it supports can likewise be determined by the live scene.
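As an illustrative, non-limiting sketch of the frame-window behavior described above (the function name and example frame values are hypothetical, not taken from this disclosure), the per-frame check may be expressed as follows:

```python
# Illustrative sketch only: decide whether the interaction area on the live
# layer is active for the current frame of the live video stream.
# n (first active frame) and m (window length) are scene-specific presets.

def interaction_region_active(frame_index: int, n: int, m: int) -> bool:
    """The area exists from frame N through frame N+M, inclusive."""
    return n <= frame_index <= n + m

# Example window: the area is overlaid from frame 300 to frame 420.
assert interaction_region_active(300, n=300, m=120)
assert not interaction_region_active(299, n=300, m=120)  # frame N-1: no area
assert not interaction_region_active(421, n=300, m=120)  # frame N+M+1: no area
```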
In an optional implementation manner, determining an interaction area on a live layer includes:
searching, among multiple interaction types, for the target interaction type matching the live content displayed on the live interface; and determining the interaction area on the live layer according to the target interaction type, where the multiple interaction types include limb interaction, item production, item finding, and live image generation.
The live content displayed on the live interface differs across interaction types; it can include, but is not limited to, the live background and the virtual object's actions, expressions, clothing, and speech. The current target interaction type, i.e., the current specific live scene, can be determined from the live content currently displayed on the live interface, and the interaction area on the live layer can then be determined.
Specifically, limb interaction means that during live broadcast the user physically interacts with the virtual object through the terminal screen, for example touching fingertips with the virtual object or stroking the virtual object's head through the screen. In this case, the interaction area may be a preset region determined from the position of the virtual object's target limb part: for fingertip touch, a circular region of preset size centered on the virtual object's fingertip; for head stroking, a circular region of preset size determined from the virtual object's head.
Item production means that during live broadcast the user produces a specific item, such as clothing or a backpack with a specific color or style, through touch operations on the terminal screen according to a production task given by the virtual object. In this case, the interaction area may be a preset region supporting the user's production operations, for example the region of the live layer corresponding to the upper half of the terminal screen.
Item finding means that during live broadcast the user searches for a specific item through touch operations on the terminal screen according to a finding task given by the virtual object. In this case, the interaction area may be a response region determined from the item to be found, for example a region of preset size covering that item.
Live image generation, also called photographing the virtual object, means that during live broadcast the user generates a live image (a photo of the virtual object) through touch operations on the terminal screen, according to the content displayed on the live interface and the virtual object's image generation (photographing) requirements. In this case, the interaction area may be a response region that triggers the image generation (photographing) operation, for example the region of the live layer corresponding to the entire terminal screen.
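The following illustrative sketch summarizes how the four interaction types named above may map to an interaction-area geometry on the live layer; all shapes, sizes, and names are assumptions for illustration rather than limitations of this disclosure:

```python
# Illustrative sketch: one possible mapping from interaction type to the
# interaction-area geometry on the live layer. Sizes are arbitrary examples.

from dataclasses import dataclass

@dataclass
class CircleArea:
    cx: float
    cy: float
    radius: float

@dataclass
class RectArea:
    x: float
    y: float
    width: float
    height: float

def area_for_interaction(kind: str, screen_w: int, screen_h: int,
                         anchor_x: float = 0.0, anchor_y: float = 0.0):
    """anchor_x/anchor_y: e.g. a fingertip or item position, when relevant."""
    if kind == "limb_interaction":       # circle around the target limb part
        return CircleArea(anchor_x, anchor_y, radius=60)
    if kind == "item_production":        # drawing board, e.g. upper half of screen
        return RectArea(0, 0, screen_w, screen_h / 2)
    if kind == "item_finding":           # rectangle covering the item to be found
        return RectArea(anchor_x - 50, anchor_y - 50, 100, 100)
    if kind == "live_image_generation":  # the whole screen acts as the shutter
        return RectArea(0, 0, screen_w, screen_h)
    raise ValueError(f"unknown interaction type: {kind}")
```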
S102, responding to the touch operation of the target user in the interactive area, and determining interactive information corresponding to the touch operation.
The interaction information corresponding to the user's touch operation differs across interaction scenes. For example, the interaction information for the current target user's touch operation can be determined from the correspondence between user touch operations and interaction information in each interaction scene, serving as feedback to the touch operation. The interaction information may be downloaded to the terminal in advance together with the area information of the interaction area, or it may be matched at the server side and loaded to the terminal after a preset user operation is received.
S103, displaying the interactive information at a preset position on the live broadcast layer.
The display position of the interaction information on the live layer can be determined by the live scene. For example, for limb interaction, the preset position can be determined from the position and action of the virtual object's target limb part; for item production or item finding, from the user's touch position in the interaction area; and for live image generation, it can be any position of the live layer corresponding to the terminal screen.
Displaying the interaction information enriches what the terminal can show during live broadcast, makes interaction between the user and the virtual object more engaging, and improves the user's experience of watching the live broadcast.
Optionally, the live broadcast interaction method provided by the embodiments of the present disclosure may further include: displaying, on the live interface, shared information corresponding to the target interaction type, where the shared information is derived from the touch results of the touch operations of one, several, or all users in the virtual object's live room.
For example, each terminal that has entered the virtual object's live room can determine the user's touch result from the user's touch operation in the interaction area and send it to the server along with the terminal identifier or user identifier. The server aggregates and analyzes the touch results from all terminals to generate the shared information, then sends it to each terminal so that the same shared information is displayed on every terminal.
The touch result may include, but is not limited to: whether the user's touch pressure reaches a threshold, whether the touch position falls within the standard touch position range, whether the user completed the required interactive operation, whether the user completed the item to be produced, the evaluation result of the item the user produced, whether the user successfully found the item to be found, whether the user triggered generation of the required live image, the evaluation result of the live image the user triggered, and so on. Accordingly, the shared information may include, but is not limited to, the server's ranking of users based on the received touch results, the total number of users in the live room, the total number of produced items, and so on. Displaying the shared information organically combines the personalized display content on each terminal with the content displayed synchronously on all terminals in the live video stream, enriching the ways live interaction can be realized.
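A minimal server-side sketch of this aggregation, with field names and the ranking rule assumed purely for illustration, may look like:

```python
# Illustrative sketch: aggregate per-terminal touch results into shared
# information that the server pushes back to every terminal.

def build_shared_info(touch_results: list[dict]) -> dict:
    """touch_results: one entry per terminal, e.g.
    {"user_id": "u1", "completed": True, "score": 87} (fields assumed)."""
    completed = [r for r in touch_results if r.get("completed")]
    ranking = sorted(completed, key=lambda r: r.get("score", 0), reverse=True)
    return {
        "total_users": len(touch_results),           # total users in the live room
        "total_completed": len(completed),           # e.g. total produced items
        "ranking": [r["user_id"] for r in ranking],  # broadcast to all terminals
    }
```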
In the embodiments of the present disclosure, for each terminal entering the virtual object's live room, a live layer is superimposed on the live interface of the virtual object displayed by that terminal. An interaction area on the live layer receives touch operations from the target user; the terminal determines interaction information from the target user's touch operation and then displays it at a preset position on the live layer. The content displayed on the live layer and the live content displayed on the live interface together constitute the content displayed on the target user's terminal. During live broadcast, different users thus view the same display content through the live interface while viewing personalized, per-user display content through the live layer. This solves the problem that existing live broadcast schemes cannot achieve a personalized interactive live effect, achieves a personalized live interaction effect, and lets different users see differentiated display content during live broadcast.
Fig. 2 is a flowchart of another live broadcast interaction method according to an embodiment of the present disclosure, further optimized and expanded on the basis of the above technical solutions; it can be combined with each of the optional embodiments above. Fig. 2 illustrates the embodiments of the present disclosure using limb interaction as the target interaction type, which should not be construed as a specific limitation.
As shown in fig. 2, a live interaction method provided by the embodiment of the present disclosure may include:
s201, searching a target interaction type matched with the live broadcast content in the multiple interaction types according to the live broadcast content displayed on the live broadcast interface, wherein the target interaction type is limb interaction.
S202, determining an interaction area on a live broadcast layer according to the action of a target limb part of a virtual object on a live broadcast interface and the position of the target limb part.
The interaction area receives the touch operation the target user directs at the virtual object's target limb part. Taking the virtual object's hand as the target limb part as an example, the action of the target limb part may be extending a fingertip, indicating that the virtual object wishes to touch fingertips with the target user (also called fingertip interaction); the interaction area can then be determined from the virtual object's hand action and hand position, for example a circular region of preset size centered on the virtual object's fingertip. Alternatively, the action may be extending the palm forward, indicating that the virtual object wishes to touch palms with the target user; the interaction area may then be a rectangular region of preset size determined from the palm position. As another example, the target limb part may be the virtual object's head, and its action may be shaking the head, indicating that the virtual object wants the target user to pat or touch its head.
In an optional implementation, the live broadcast interaction method provided by the embodiments of the present disclosure further includes: determining prompt information matching the action of the virtual object's target limb part and displaying it on the live interface, where the prompt information prompts the target user to perform, in the interaction area, a touch operation matching that action. The prompt information can be realized by at least one of a static image effect, a dynamic image effect, and text. It can disappear from the live interface after being displayed for a specific time, or disappear once the terminal detects the user's touch operation.
Fig. 3 is a schematic diagram of a terminal display interface related to limb interaction according to an embodiment of the present disclosure; it illustrates the embodiments of the present disclosure and should not be construed as a specific limitation. As shown in the left diagram of fig. 3, after the virtual object raises its hand and extends a fingertip, prompt information can be displayed on the live interface based on the fingertip position; the prompt information may include ripples formed by dynamic circles and a dynamic virtual hand, prompting the target user to touch fingertips with the virtual object. The right diagram of fig. 3 shows the display effect of the interaction information using a heart-shaped image as an example: the interaction information determined from the target user's touch operation is dynamic, and its display position is determined from the virtual object's fingertip position.
S203, responding to the touch operation of the target user in the interaction area, and determining interaction information corresponding to the touch operation.
For the limb interaction live scene, the interaction information corresponding to the touch operation can be determined dynamically from the target user's touch information, such as touch pressure and touch position, which makes the interaction more engaging. Optionally, the interaction information includes a feedback dynamic effect matching the action of the virtual object's target limb part. For example, if the distance between the target user's touch position and the center of the interaction area is smaller than a first distance threshold, or the touch pressure is greater than a first pressure threshold, the interaction information may be a dynamic heart effect formed by heart-shaped images; if the distance is greater than or equal to the first distance threshold, or the pressure is less than or equal to the first pressure threshold, the interaction information may be a dynamic ripple effect formed by circles. The threshold values can be chosen flexibly.
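The feedback rule just described can be sketched as follows; the threshold values are hypothetical and, per the above, can be chosen flexibly:

```python
# Illustrative sketch of the feedback rule: a touch close to the area's
# center, or a sufficiently hard press, yields the heart effect; any other
# touch yields the ripple effect.

import math

FIRST_DISTANCE_THRESHOLD = 30.0  # px, assumed value
FIRST_PRESSURE_THRESHOLD = 0.7   # normalized pressure, assumed value

def feedback_effect(touch_x: float, touch_y: float,
                    center_x: float, center_y: float,
                    pressure: float) -> str:
    distance = math.hypot(touch_x - center_x, touch_y - center_y)
    if distance < FIRST_DISTANCE_THRESHOLD or pressure > FIRST_PRESSURE_THRESHOLD:
        return "dynamic_heart_effect"
    return "dynamic_ripple_effect"
```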
And S204, displaying the interactive information at a preset position on the live broadcasting layer.
The preset position for displaying the interactive information can still be determined according to the position of the target limb of the virtual object and the action of the target limb. For example, for fingertip touches, the presentation position of the interaction information may be determined based on the fingertip position of the virtual object.
In addition, during live broadcast, the server can collect the identifiers (or user names) of the users participating in limb interaction with the virtual object on each terminal, together with their touch operation times, rank the user identifiers by touch operation time, and send the ranking to each terminal for display. The server can also send different rewards to users according to their rank (higher-ranked users receive larger rewards) to encourage participation in limb interaction.
Through the limb interaction between the virtual object and the target user during live broadcast, the embodiments of the present disclosure solve the problem that existing live broadcast schemes cannot achieve a personalized live interaction effect, achieve such an effect, and let different users see differentiated display content during live broadcast.
Fig. 4 is a flowchart of another live broadcast interaction method according to an embodiment of the present disclosure, further optimized and expanded on the basis of the above technical solutions; it can be combined with each of the optional embodiments above. Fig. 4 illustrates the embodiments of the present disclosure using item production as the target interaction type, which should not be construed as a specific limitation. For item production, the interaction information corresponding to the user's touch operation in the interaction area includes the color the user selects, and the preset position for displaying the interaction information on the live layer includes the position where that color is filled.
As shown in fig. 4, a live broadcast interaction method provided by the embodiment of the present disclosure may include:
s301, searching a target interaction type matched with the live broadcast content in the plurality of interaction types according to the live broadcast content displayed on the live broadcast interface, wherein the target interaction type is article manufacturing.
S302, determining the interaction area on the live layer according to the target interaction type, where the interaction area is a preset drawing board area displaying the item to be produced.
Tools needed for producing the item, such as a color palette, brushes, and erasers, can also be displayed in the preset drawing board area. The preset drawing board area receives the target user's touch operations so that the user can complete the item. The drawing board area's position on the live layer and its size and shape are preset; the embodiments of the present disclosure do not limit them specifically. The item to be produced may be any item, such as clothing, a bag, or an image, and can be determined by the production task the virtual object poses during live broadcast.
S303, in response to the target user's touch operation in the preset drawing board area, determining the color the target user selects for filling the item to be produced.
A color selection area is displayed within the preset drawing board area. After choosing a specific color, the target user touches the corresponding color position, and the terminal determines the selected color from the touch operation.
S304, in response to the target user's touch operation on the item to be produced, determining the fill position the target user selects on the live layer.
In the embodiments of the present disclosure, the target user's first touch operation can determine the selected color, and the second touch operation can determine the position where the user wants that color filled.
It should be noted that during item production different users select different fill colors, so the fill colors of the item displayed on different users' terminals differ, and different production effects are shown on different terminals.
In addition, during production, the preset drawing board area supports operations such as zooming and moving the item being produced, giving the user great flexibility.
S305, displaying the color selected by the target user at a filling position on the live broadcast layer.
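Steps S303 through S305 amount to a two-step touch flow: the first touch selects a color, the second fills it at the touched position. A minimal sketch, with all class and field names assumed for illustration:

```python
# Illustrative sketch of the select-then-fill flow on the preset drawing board.

class DrawingBoard:
    def __init__(self, palette: dict):
        self.palette = palette   # {(x, y): "#rrggbb"} palette cell positions
        self.selected_color = None
        self.fills = {}          # {(x, y): color} fills shown on the live layer

    def on_touch(self, x: int, y: int, on_palette: bool) -> None:
        if on_palette:
            # Step S303: a touch in the color selection area picks a color.
            self.selected_color = self.palette.get((x, y))
        elif self.selected_color is not None:
            # Steps S304/S305: a touch on the item fixes the fill position,
            # and the selected color is displayed there.
            self.fills[(x, y)] = self.selected_color
```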
After filling in the colors of some or all regions of the item, the target user can submit the item to indicate that production is complete. After submission, the terminal can score the produced item based on a preset scoring rule, for example the smaller the difference between the user's fill colors and the item's preset standard fill colors, the higher the score, and vice versa; the score can be displayed on the live layer. The scoring operation can also be performed by the server: the server scores the produced item sent by the terminal based on a preset scoring rule, then sends the score back to the terminal, based on the user identifier or terminal identifier, for display on the live layer.
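The scoring rule described above (smaller color difference, higher score) can be sketched as follows; the distance metric, the white default for unfilled positions, and the 0-100 scale are assumptions:

```python
# Illustrative sketch: score a produced item by comparing the user's fill
# colors with the item's preset standard fill colors.

def score_item(user_fills: dict, standard_fills: dict) -> float:
    """Both dicts map a fill position to an (r, g, b) color."""
    if not standard_fills:
        return 0.0
    total = 0.0
    for pos, std in standard_fills.items():
        usr = user_fills.get(pos, (255, 255, 255))  # unfilled = white, assumed
        diff = sum(abs(a - b) for a, b in zip(usr, std)) / (3 * 255)
        total += 1.0 - diff                         # smaller diff -> higher score
    return round(100 * total / len(standard_fills), 1)
```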
Fig. 5 is a schematic diagram of a terminal display interface related to item production, using a T-shirt as the item to be produced; it illustrates the embodiments of the present disclosure and should not be construed as a specific limitation. As shown in fig. 5, the preset drawing board area displays the T-shirt to be produced, and the user's touch operations fill in its colors. During color filling, the target user can enlarge the T-shirt and move it to the middle of the drawing board area, making it easier to fill specific colors at specific positions until the T-shirt is finished.
In an optional implementation manner, the live broadcast interaction method provided by the embodiment of the present disclosure may further include:
playing the virtual object's spoken comments about the target produced item, where the target produced item is selected, according to a preset selection strategy, from the items submitted by one, several, or all users in the virtual object's live room; the preset selection strategy may include, but is not limited to: choosing the highest-scoring item among all produced items, choosing an item at random, or choosing the earliest-submitted item. The spoken comments can be configured in advance for the actual scene and are not limited by the embodiments of the present disclosure; for example, "Thank you all for the items you made, I like them very much; next I will show you one of them";
the target production item is presented on the virtual object. For example, the display mode of the target production item by the virtual object may be determined according to the type of the production item, for example, for a production item of clothing, the target production item may be displayed in a mode that the virtual object wears the target production item on the body, and for other production items of non-clothing, the target production item may be displayed in a mode that the virtual object puts the target production item on the hand.
Fig. 6 is a schematic diagram of another terminal display interface related to item production, using a user-submitted T-shirt as an example; it illustrates the embodiments of the present disclosure and should not be construed as a specific limitation. As shown in the left diagram of fig. 6, after the target user submits a finished T-shirt, the terminal can display the corresponding score. The server can take the highest-scoring T-shirt among those submitted by all terminals as the target T-shirt, fuse it with the virtual object in the live video stream using image fusion so that the virtual object appears to wear it, and send the fused result to each terminal for display; the display effect is shown in the right diagram of fig. 6.
Through the user's item production during live broadcast and the virtual object's display of the target produced item, the embodiments of the present disclosure solve the problem that existing live broadcast schemes cannot achieve a personalized live interaction effect, achieve such an effect, and let different users see differentiated display content during live broadcast.
Fig. 7 is a flowchart of another live broadcast interaction method according to an embodiment of the present disclosure, further optimized and expanded on the basis of the above technical solutions; it can be combined with each of the optional embodiments above. Fig. 7 illustrates the embodiments of the present disclosure using item finding as the target interaction type, which should not be construed as a specific limitation.
As shown in fig. 7, a live interaction method provided by the embodiment of the present disclosure may include:
s401, according to the live broadcast content displayed on the live broadcast interface, searching a target interaction type matched with the live broadcast content in the multiple interaction types, wherein the target interaction type is article searching.
S402, determining the interaction area on the live layer according to the target interaction type, where the interaction area is the area of the live layer covering the item to be found.
The item to be found can be any type of item and is determined by the finding task the virtual object poses; for example, if during live broadcast the virtual object says "Welcome to the pot-finding game", the item to be found is a pot. Because the item's display position on the live interface, its shape, and other factors vary, the interaction area's position and size on the live layer can be determined case by case; for example, the interaction area may be a rectangular region covering the item to be found.
And S403, responding to the touch operation of the target user in the interaction area, and determining the touch position of the target user.
S404, if the target user's touch position matches the position of the item to be found within the interaction area, determining that the target user has found the item successfully, and generating a dynamic effect indicating a successful find.
For example, if the distance between the target user's touch position and the center of the item to be found within the interaction area is smaller than a distance threshold (whose value can be chosen adaptively), or the touch position falls inside the effective interaction area, the touch position matches the item's position and the target user is determined to have found the item successfully. If the distance is greater than or equal to the threshold, or the touch position falls outside the effective interaction area, the touch position does not match and the target user is determined not to have found the item.
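A sketch of this hit test, with the distance threshold and region representation assumed for illustration:

```python
# Illustrative sketch: a touch counts as a successful find when it is close
# enough to the item's center or falls inside the item's effective region.

import math

def found_item(touch: tuple, item_center: tuple, item_rect: tuple,
               distance_threshold: float = 40.0) -> bool:
    (tx, ty), (cx, cy) = touch, item_center
    x, y, w, h = item_rect  # rectangle covering the item to be found
    near_center = math.hypot(tx - cx, ty - cy) < distance_threshold
    inside_region = x <= tx <= x + w and y <= ty <= y + h
    return near_center or inside_region
```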
Whether the user successfully finds the article to be found or not, a corresponding dynamic effect can be generated to prompt the target user of the current article finding result.
S405, displaying the dynamic effect indicating a successful find on the live layer, based on the position of the item to be found.
For example, when an item is found successfully, an image frame in a dynamic specific color can be superimposed over it on the live layer, indicating that the user found the item inside the frame; at the same time, a dynamic effect of the item alternately enlarging and shrinking can be displayed on the live layer based on the item's position. The dynamic effect can stop being displayed once its presentation time reaches a corresponding time threshold.
Optionally, the live broadcast interaction method provided by the embodiment of the present disclosure may further include:
and displaying the types of the articles to be searched and the quantity of the articles under each type which are searched successfully on the live broadcasting layer. The type of the object to be searched is related to the object searching task provided by the virtual object, for example, the object to be searched provided by the virtual object is a living article-pot, in the live broadcasting process, the type of the object to be searched displayed on the live broadcasting layer is a pot, and the number of the successfully searched objects can be updated and displayed in real time according to the touch operation of the user.
Fig. 8 is a schematic diagram of a terminal display interface related to item finding, using a pan as the item to be found; it illustrates the embodiments of the present disclosure and should not be construed as a specific limitation. As shown in fig. 8, during live broadcast the target user can participate in item finding under the virtual object's guidance; after the target user's touch operation shows that the pan was found successfully, a red image frame (i.e., a display frame) surrounding the pan can be dynamically displayed on the live layer at the pan's display position.
In addition, the virtual object can respond verbally to the target user's finding result; for example, it may congratulate a user who found the item ("Congratulations, xxx, you found it!") or encourage a user to keep searching, which improves the interactivity of the live broadcast.
Because different users' finding results differ, the display content on different users' terminals differs during live broadcast, which solves the problem that existing live broadcast schemes cannot achieve a personalized live interaction effect, achieves such an effect, and lets different users see differentiated display content during live broadcast.
Fig. 9 is a flowchart of another live broadcast interaction method according to an embodiment of the present disclosure, further optimized and expanded on the basis of the above technical solutions; it can be combined with each of the optional embodiments above. Fig. 9 illustrates the embodiments of the present disclosure using live image generation as the target interaction type, which should not be construed as a specific limitation.
As shown in fig. 9, a live broadcast interaction method provided by the embodiment of the present disclosure may include:
s501, searching a target interaction type matched with the live broadcast content in the multiple interaction types according to the live broadcast content displayed on the live broadcast interface, wherein the target interaction type is live broadcast image generation.
S502, determining an interaction area on the live broadcast layer according to the target interaction type, wherein the interaction area is a photographing operation area in the live broadcast layer.
For example, the photographing operation area may be an area on a live layer corresponding to the whole terminal screen, or may be an area for a part of the terminal screen, which is not specifically limited in the embodiment of the present disclosure.
S503, displaying photographing prompt information on the live interface, where the prompt information tells the target user the conditions that must be satisfied to trigger photographing, the conditions including at least one of the live background displayed in the live interface, the virtual object's posture, and the virtual object's expression.
For example, during live broadcast, when the live duration reaches the trigger time of a live image generation task (also called a photographing task), photographing prompt information can be displayed on the live interface; alternatively, the prompt information is displayed when the terminal detects that the virtual object has posed a live image generation task to the target user based on the virtual object's image generation requirements. For instance, the virtual object dances, performs, or strikes a preset pose while saying something like "Please take a photo of me"; the terminal detects from this behavior and voice that the virtual object has posed a live image generation task, and displays the photographing prompt information on the live interface.
The live background may be information describing an environment in which the virtual object is currently located, such as indoor or outdoor environment information; the posture and expression of the virtual object can be used for describing the behavior and state of the virtual object, and can be preset by the server.
Fig. 10 is a schematic diagram of a terminal display interface related to live image generation according to an embodiment of the present disclosure; its left diagram shows one way of displaying the photographing prompt information, illustrating the embodiments of the present disclosure without limiting them. As shown in the left diagram of fig. 10, one prompt item ("background lighting") corresponds to a posture of the virtual object, another ("open lighting") to an expression of the virtual object, and a third ("side lighting under sunset") to both a posture of the virtual object and the live background; the target user needs to take a photo of the virtual object satisfying all three conditions. The live background itself is not shown in fig. 10; in practice, the corresponding live background is displayed according to the content of the live video stream.
And S504, determining the picture of the virtual object in response to the photographing triggering operation of the target user in the photographing operation area.
When the target user determines, by following the live broadcast, that the live background, the virtual object's posture, and the virtual object's expression satisfy the conditions in the photographing prompt information, the user can photograph the virtual object through a touch operation in the interaction area.
In an optional implementation manner, in the process of determining the photo of the virtual object, the live broadcast interaction method provided by the embodiment of the present disclosure may further include:
and determining the photographing focal length in response to the photographing focal length adjusting operation of the target user in the photographing operation area, so as to determine the picture of the virtual object based on the photographing focal length. The embodiment of the disclosure supports that a target user adjusts the photographing focal length in the process of photographing the virtual object, for example, the target user performs left-right sliding or up-down sliding in the interaction area to realize focal length adjustment, so that the effect of simulating a real photographing scene is realized, and a clear picture is ensured to be photographed for the virtual object.
Fig. 11 is a schematic diagram of another terminal display interface related to live image generation according to an embodiment of the present disclosure. As shown in fig. 11, in the live broadcasting process, a prompt message of focal length adjustment may also be displayed on the live broadcasting interface to prompt the user to perform focal length adjustment.
S505, determining a photographing evaluation result of the target user based on the photo of the virtual object.
For a live image generation scene, the interaction information corresponding to the touch operation of the target user in the interaction area is the photographing evaluation result of the target user. The terminal may evaluate the photos taken by the target user for the virtual object based on a preset photo evaluation strategy, determine the photographing evaluation result, and display it. Alternatively, the terminal may send the photo taken by the target user for the virtual object to the server, and the server evaluates the photo according to the preset photo evaluation strategy and sends the photographing evaluation result back to the terminal.
In an alternative embodiment, determining the photographing evaluation result of the target user based on the photo of the virtual object includes: comparing the photo information of the virtual object with standard photo information to determine the photographing evaluation result of the target user. Specifically, the smaller the difference between the photo information of the virtual object and the standard photo information, the higher the score of the photographing evaluation result; the larger the difference, the lower the score.
The photo information of the virtual object includes at least one of a photographing trigger time, a live background, a posture of the virtual object, an expression of the virtual object, and a target item (such as an ornament) on the virtual object. The standard photo information includes at least one of a standard photographing trigger time, a standard live background, a standard photographing posture of the virtual object, a standard photographing expression of the virtual object, and a standard target item on the virtual object. The standard photo information is the description information corresponding to a standard photo taken of the virtual object.
Specifically, the photographing trigger time is the moment at which the target user's photographing triggering operation occurs during the live broadcast, and the standard photographing trigger time is the ideal moment for that operation, preset by the server; the smaller the time difference between the two, the higher the score of the target user's photographing evaluation result, and vice versa. Similarly, the smaller the difference between the live background shown in the photo and the standard live background, between the posture of the virtual object shown in the photo and the standard photographing posture, between the expression shown in the photo and the standard photographing expression, or between the target item on the virtual object shown in the photo and the standard target item, the higher the score of the photographing evaluation result, and vice versa. In a particular application, the photo information participating in the evaluation may be determined from the information included in the photo of the virtual object.
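As a concrete, purely illustrative reading of this scoring rule, the following sketch compares each field of the photo information with the standard photo information. The field set, the 25-point weights, and the 2-second time window are assumptions; the patent only requires that smaller differences yield higher scores.

```kotlin
import kotlin.math.abs

// All names, weights, and thresholds here are illustrative assumptions.
data class PhotoInfo(
    val triggerTimeMs: Long,   // photographing trigger time
    val background: String,    // live background identifier
    val pose: String,          // posture of the virtual object
    val expression: String     // expression of the virtual object
)

fun evaluate(photo: PhotoInfo, standard: PhotoInfo): Int {
    // Time component: full 25 points at zero difference, falling linearly
    // to 0 points at an assumed 2-second difference.
    val dt = abs(photo.triggerTimeMs - standard.triggerTimeMs)
    val timeScore = ((2000L - dt).coerceIn(0L, 2000L) * 25L / 2000L).toInt()
    // Categorical components: exact match scores 25 points each; a real
    // system might instead measure visual similarity of the rendered frame.
    val bgScore = if (photo.background == standard.background) 25 else 0
    val poseScore = if (photo.pose == standard.pose) 25 else 0
    val exprScore = if (photo.expression == standard.expression) 25 else 0
    return timeScore + bgScore + poseScore + exprScore
}
```

Under this sketch, a total near 100 could map to a label such as "Perfect" and lower bands to "Great" as shown in fig. 10, though the patent does not specify any particular score-to-label mapping.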
S506, displaying the photographing evaluation result at a preset position on the live broadcast layer.
The preset position for displaying the photographing evaluation result may be determined reasonably according to the interface display layout, and is not specifically limited in the embodiment of the present disclosure. For example, the photographing evaluation result may be displayed in the area of the live broadcast layer corresponding to the upper half of the terminal screen; exemplarily, as shown in the right diagram of fig. 10, the photographing evaluation results "Great" and "Perfect" are displayed in the area corresponding to the upper third of the terminal screen.
In addition, the server may determine the reward that the target user can obtain according to the target user's photographing evaluation result, and synchronize the reward to the target user's account. The higher the target user's photographing evaluation result, or the higher the user ranks based on the photographing evaluation result, the more rewards the target user can obtain, which improves users' enthusiasm for participating in live broadcast interaction.
Optionally, before determining the picture of the virtual object in response to a photographing triggering operation of the target user in the photographing operation region, the live broadcast interaction method provided by the embodiment of the present disclosure may further include:
determining the type information of the photos to be taken based on the photographing prompt information;
and determining the total shooting times of the target user based on the type information of the photos to be taken, wherein the total shooting times are used for determining the remaining shooting times of the target user in the process of determining the photos of the virtual object.
After determining the picture of the virtual object in response to a photographing triggering operation of the target user in the photographing operation region, the live broadcast interaction method provided by the embodiment of the present disclosure may further include:
displaying the remaining shooting times of the target user in a shooting times display area on the live broadcast layer. Each time the target user performs a photographing triggering operation, the remaining shooting times decrease by one.
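A minimal sketch of this counter follows; it assumes the simple case where every photographing triggering operation is accepted as one shot (the patent allows richer tasks, for example requiring several photos per type).

```kotlin
// Illustrative bookkeeping for the remaining shooting times shown on the layer.
class ShotCounter(totalShots: Int) {
    var remaining: Int = totalShots
        private set

    // Returns false when no shots are left; each accepted photographing
    // triggering operation reduces the remaining shooting times by one.
    fun onShutterPressed(): Boolean {
        if (remaining <= 0) return false
        remaining -= 1
        return true
    }
}
```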
Continuing with fig. 10, based on the photographing prompt information, it may be determined that the photos to be taken include 3 types, and thus that the total shooting times of the target user is not less than 3 (for example, at least 2 photos of some type may need to be taken); fig. 10 takes a total of 3 shots as an example. After the target user takes the back view shot of the virtual object, the remaining shooting times displayed on the live broadcast layer is 2; after the target user then takes the heart gesture shot, the remaining shooting times displayed is 1. The remaining shot is used to take the side view of the virtual object at sunset. In addition, it should be noted that after the photographing prompt information is displayed and before the photos are taken, the virtual object is allowed, according to live broadcast logic preset by the server, to change outfits and to switch between different live backgrounds (showing the virtual object in different environments), which increases the interest of live broadcast interaction.
In the embodiment of the present disclosure, the target user photographs and interacts with the virtual object. Different users take photos of the virtual object with different information, so the photographing evaluation results displayed on the live broadcast layer differ; different users also photograph at different speeds, so the remaining shooting times displayed on the live broadcast layer differ as well. The embodiment of the present disclosure thereby solves the problem that existing live broadcast schemes cannot achieve a personalized live broadcast interaction effect, realizes personalized live broadcast interaction, and enables different users to see differentiated display content during live broadcast.
Fig. 12 is a schematic structural diagram of a live broadcast interaction apparatus according to an embodiment of the present disclosure. The apparatus may be implemented in software and/or hardware and may be integrated on an electronic device with computing capability, for example a user terminal such as a mobile phone, a tablet computer, or a desktop computer.
As shown in fig. 12, a live interaction apparatus 600 provided in an embodiment of the present disclosure may include an interaction area determining module 601, an interaction information determining module 602, and an interaction information presenting module 603, where:
an interaction region determining module 601, configured to determine an interaction area on a live broadcast layer, where the live broadcast layer is superimposed on a live broadcast interface of a virtual object, the live broadcast interface is an interface displayed synchronously in each terminal entering the virtual object live broadcast room, the content displayed on the live broadcast layer and the live broadcast content displayed on the live broadcast interface together constitute the content displayed on the terminal of the target user, and the interaction area keeps frame synchronization with the live broadcast interface;
the interaction information determining module 602 is configured to determine, in response to a touch operation of a target user in an interaction area, interaction information corresponding to the touch operation;
and the interactive information display module 603 is configured to display the interactive information at a preset position on the live layer.
Optionally, the interaction region determining module 601 includes:
the target interaction type determining unit is used for searching a target interaction type matched with the live broadcast content in the plurality of interaction types according to the live broadcast content displayed on the live broadcast interface;
the interactive area determining unit is used for determining an interactive area on a live broadcast layer according to the target interactive type;
wherein the plurality of interaction types include limb interaction, item production, item finding, and live image generation.
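Purely as an illustration of how the target interaction type determining unit might key its matching (the patent does not specify the mechanism), the live content could carry a tag that the terminal maps to one of the four interaction types; the tag names below are invented.

```kotlin
// The four interaction types come from the text above; the content tags and
// the tag-based lookup are assumptions about how matching might be keyed.
enum class InteractionType { LIMB_INTERACTION, ITEM_PRODUCTION, ITEM_FINDING, LIVE_IMAGE_GENERATION }

fun findTargetInteractionType(liveContentTag: String): InteractionType? =
    when (liveContentTag) {
        "high_five"  -> InteractionType.LIMB_INTERACTION
        "drawing"    -> InteractionType.ITEM_PRODUCTION
        "hide_seek"  -> InteractionType.ITEM_FINDING
        "photo_pose" -> InteractionType.LIVE_IMAGE_GENERATION
        else -> null // no interaction area is shown for unmatched content
    }
```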
Optionally, the live broadcast interaction apparatus 600 provided in the embodiment of the present disclosure further includes:
the shared information display module is used for displaying the shared information corresponding to the target interaction type on a live broadcast interface; the shared information is obtained based on a touch result corresponding to the touch operation of the user in the virtual object live broadcast room.
Optionally, if the target interaction type is a limb interaction, the interaction region determining unit is specifically configured to:
and determining an interaction area on the live broadcast layer according to the action of the target limb part of the virtual object on the live broadcast interface and the position of the target limb part.
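As a hedged sketch of this unit (the rectangular shape and the padding are assumptions; the patent only requires that the area follow the action and position of the target limb part), the interaction area could be a padded rectangle around the limb position reported for the current frame.

```kotlin
// Illustrative: derive the touchable interaction area from the reported
// position of the target limb part; paddingPx is an assumed tolerance.
data class Area(val left: Float, val top: Float, val right: Float, val bottom: Float)

fun interactionAreaFor(limbX: Float, limbY: Float, paddingPx: Float = 60f): Area =
    Area(limbX - paddingPx, limbY - paddingPx, limbX + paddingPx, limbY + paddingPx)
```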
Optionally, the live broadcast interaction apparatus 600 provided in the embodiment of the present disclosure further includes:
and the action prompt information display module is used for determining prompt information matched with the action of the target limb part of the virtual object and displaying the prompt information on the live broadcast interface, wherein the prompt information is used for prompting a target user to execute touch operation matched with the action of the target limb part of the virtual object in the interaction area.
Optionally, the interaction information comprises feedback dynamic effects matching the motion of the target limb part of the virtual object.
Optionally, if the target interaction type is article manufacturing, the interaction area is a preset drawing board area, and an article to be manufactured is displayed in the preset drawing board area;
the interaction information determining module 602 is specifically configured to:
responding to the touch operation of a target user in a preset drawing board area, and determining the color selected by the target user and used for filling the to-be-manufactured object;
the interaction information presentation module 603 includes:
the filling position determining unit is used for responding to the touch operation of the target user on the object to be manufactured and determining the filling position selected by the target user on the live broadcast layer;
and the color display unit is used for displaying the color selected by the target user at the filling position on the live broadcasting layer.
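The following sketch illustrates one possible shape for this palette-then-fill flow of the item production interaction; the region identifiers and ARGB color encoding are assumptions, not taken from the patent.

```kotlin
// Two-step flow sketch: pick a color in the palette area, then tap a region
// of the article to be made; the layer records the chosen color per region.
class PaintBoard {
    private var selectedColor: Int = 0xFF000000.toInt() // default: opaque black
    private val regionColors = mutableMapOf<Int, Int>() // regionId -> ARGB color

    fun onPaletteTap(color: Int) { selectedColor = color }                     // color pick
    fun onArticleTap(regionId: Int) { regionColors[regionId] = selectedColor } // fill position
    fun colorOf(regionId: Int): Int? = regionColors[regionId]                  // for redraw
}
```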
Optionally, the live broadcast interaction apparatus 600 provided in the embodiment of the present disclosure further includes:
the system comprises an exchange voice playing module, a target production object and a virtual object live broadcast server, wherein the exchange voice playing module is used for playing an exchange voice of the virtual object about the target production object, and the target production object is obtained by screening production objects submitted by a user in a virtual object live broadcast room according to a preset selection strategy;
and the article display module is used for displaying the target production article on the virtual object.
Optionally, if the target interaction type is article finding, the interaction area is an area covering an article to be found in the live broadcast layer;
the mutual information determination module 602 includes:
the touch position determining unit is used for responding to the touch operation of the target user in the interaction area and determining the touch position of the target user;
the article searching success determining unit is used for determining that the target user successfully searches the article to be searched if the touch position of the target user is matched with the position of the article to be searched in the interaction area, and generating a dynamic effect for representing the successful article searching;
the interaction information display module 603 is specifically configured to:
and displaying a dynamic effect for indicating that the object is successfully searched based on the position of the object to be searched on the live broadcast layer.
Optionally, the live broadcast interaction apparatus 600 provided in the embodiment of the present disclosure further includes:
and the article information display module is used for displaying the types of the articles to be searched and the quantity of the articles under each type which are searched successfully on the live broadcast layer.
Optionally, if the target interaction type is live image generation, the interaction area is a photographing operation area in the live broadcast layer;
the interaction information determination module 602 includes:
the shooting prompt information display unit is used for displaying the shooting prompt information on a live interface, wherein the shooting prompt information is used for prompting a target user to trigger conditions which need to be met by shooting, and the conditions comprise at least one of a live background, a posture of a virtual object and an expression of the virtual object, which are displayed in the live interface;
the photo determining unit is used for responding to the photographing triggering operation of the target user in the photographing operation area and determining the photo of the virtual object;
a photographing evaluation result determining unit for determining a photographing evaluation result of the target user based on the photograph of the virtual object;
the interaction information display module 603 is specifically configured to:
and displaying the shooting evaluation result at a preset position on the live broadcast layer.
Optionally, the live broadcast interaction apparatus 600 provided in the embodiment of the present disclosure further includes:
and the photographing focal length determining module is used for responding to the photographing focal length adjusting operation of the target user in the photographing operation area, determining the photographing focal length and determining the picture of the virtual object based on the photographing focal length.
Optionally, the photographing evaluation result determining unit is specifically configured to:
and comparing the photo information of the virtual object with the standard photo information to determine the photographing evaluation result of the target user.
Optionally, the photo information of the virtual object includes at least one of a photographing trigger time, a live background, a gesture of the virtual object, an expression of the virtual object, and a target item on the virtual object;
the standard photo information comprises at least one of standard photographing trigger time, standard live background, standard photographing posture of the virtual object, standard photographing expression of the virtual object and standard target object on the virtual object.
Optionally, the live broadcast interaction apparatus 600 provided in the embodiment of the present disclosure further includes:
the photo type information determining module is used for determining the type information of the photo to be shot based on the shooting prompt information;
the shooting frequency determining module is used for determining the total shooting frequency of the target user based on the type information of the photo to be shot, wherein the total shooting frequency is used for determining the residual shooting frequency of the target user in the process of determining the photo of the virtual object;
and the residual shooting frequency display module is used for displaying the residual shooting frequency of the target user in the shooting frequency display area on the live broadcast layer.
The live broadcast interaction apparatus provided by the embodiment of the present disclosure can execute any live broadcast interaction method provided by the embodiments of the present disclosure, and has the functional modules and beneficial effects corresponding to the executed method. For details not described in the apparatus embodiments, reference may be made to the description of any method embodiment of the present disclosure.
Fig. 13 is a schematic structural diagram of an electronic device provided in the embodiment of the present disclosure, which exemplarily illustrates an electronic device implementing the live broadcast interaction method provided by the embodiment of the present disclosure. The electronic devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and in-vehicle terminals (e.g., car navigation terminals), and fixed terminals such as digital TVs, desktop computers, smart home devices, wearable electronic devices, and servers. The electronic device shown in fig. 13 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in fig. 13, the electronic device 700 includes one or more processors 701 and memory 702.
The processor 701 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 700 to perform desired functions.
The live broadcast interaction method provided by the embodiment of the disclosure is applied to a terminal entering a virtual object live broadcast room, and includes: determining an interactive area on a live broadcast layer, wherein the live broadcast layer is superposed on a live broadcast interface of a virtual object, the live broadcast interface refers to an interface synchronously displayed in each terminal entering a live broadcast room of the virtual object, the content displayed on the live broadcast layer and the live broadcast content displayed on the live broadcast interface are jointly used as the content displayed on the terminal of a target user, and the interactive area and the live broadcast interface are kept in frame synchronization; responding to the touch operation of a target user in an interaction area, and determining interaction information corresponding to the touch operation; and displaying the interactive information at a preset position on the live broadcasting layer. It should be understood that electronic device 700 may also perform other alternative embodiments provided by the disclosed method embodiments.
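To make the frame synchronization requirement concrete, the following sketch assumes (this is an assumption, not a transport specified by the patent) that each decoded frame carries metadata describing the interaction area valid for that frame, so that touch dispatch always consults the area of the frame currently on screen.

```kotlin
// Illustrative frame-sync sketch: per-frame metadata keeps the interaction
// area on the live broadcast layer aligned with the displayed video frame.
data class FrameMeta(
    val pts: Long, // presentation timestamp of the frame
    val left: Float, val top: Float, val right: Float, val bottom: Float
)

class InteractionLayer {
    @Volatile private var current: FrameMeta? = null

    // Called by the player for each frame about to be displayed.
    fun onFrameRendered(meta: FrameMeta) { current = meta }

    // Touch dispatch checks against the area of the frame on screen now.
    fun isInsideInteractionArea(x: Float, y: Float): Boolean {
        val m = current ?: return false
        return x in m.left..m.right && y in m.top..m.bottom
    }
}
```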
In one example, the electronic device 700 may further include: an input device 703 and an output device 704, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
The input device 703 may include, for example, a keyboard, a mouse, and the like.
The output device 704 may output various information including the determined distance information, direction information, and the like to the outside. The output devices 704 may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto, among others.
Of course, for simplicity, only some of the components of the electronic device 700 relevant to the present disclosure are shown in fig. 13, omitting components such as buses, input/output interfaces, and the like. In addition, electronic device 700 may include any other suitable components depending on the particular application.
In addition to the methods and apparatus described above, embodiments of the present disclosure may also be a computer program product comprising a computer program or computer program instructions that, when executed by a processor, cause a computing device to implement any of the live interaction methods provided by embodiments of the present disclosure.
The computer program product may write program code for performing the operations of embodiments of the present disclosure in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the consumer electronic device, partly on the consumer electronic device, as a stand-alone software package, partly on the consumer electronic device and partly on a remote electronic device, or entirely on the remote electronic device.
Furthermore, embodiments of the present disclosure may also provide a computer-readable storage medium on which computer program instructions are stored, where the computer program instructions, when executed by a processor, cause a computing device to implement any live interaction method provided by embodiments of the present disclosure.
The live broadcast interaction method provided by the embodiment of the disclosure is applied to a terminal entering a virtual object live broadcast room, and includes: determining an interactive area on a live broadcast layer, wherein the live broadcast layer is superposed on a live broadcast interface of a virtual object, the live broadcast interface refers to an interface synchronously displayed in each terminal entering a live broadcast room of the virtual object, the content displayed on the live broadcast layer and the live broadcast content displayed on the live broadcast interface are jointly used as the content displayed on the terminal of a target user, and the interactive area and the live broadcast interface are kept in frame synchronization; responding to the touch operation of a target user in an interaction area, and determining interaction information corresponding to the touch operation; and displaying the interactive information at a preset position on the live broadcast layer. It should be understood that the computer program instructions, when executed by a processor, may also cause a computing device to implement alternative embodiments provided by the disclosed method embodiments.
A computer-readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It is noted that, in this document, relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of additional like elements in the process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present disclosure, which will enable those skilled in the art to understand or practice the present disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (16)
1. A live broadcast interaction method, applied to a terminal entering a virtual object live broadcast room, the method comprising the following steps:
determining an interaction area on a live broadcast layer, wherein the live broadcast layer is superposed on a live broadcast interface of the virtual object, the live broadcast interface refers to an interface which is synchronously displayed in each terminal entering a live broadcast room of the virtual object, the content displayed on the live broadcast layer and the live broadcast content displayed on the live broadcast interface are jointly used as the content displayed on the terminal of a target user, and the interaction area and the live broadcast interface keep frame synchronization;
responding to the touch operation of a target user in the interaction area, and determining interaction information corresponding to the touch operation;
displaying the interactive information at a preset position on the live broadcast layer, wherein the interactive information displayed on the live broadcast layer is personalized display content aiming at a target user;
the determining the interaction area on the live layer includes:
searching a target interaction type matched with the live broadcast content in a plurality of interaction types according to the live broadcast content displayed on the live broadcast interface;
determining an interaction area on the live broadcast layer according to the target interaction type;
wherein the multiple interaction types comprise limb interaction, article making, article finding and live image generation;
displaying the shared information corresponding to the target interaction type on the live broadcast interface; and obtaining the shared information based on a touch result corresponding to the touch operation of the user in the virtual object live broadcast room.
2. The method according to claim 1, wherein if the target interaction type is the limb interaction, the determining an interaction area on the live layer according to the target interaction type includes:
and determining an interaction area on the live broadcast layer according to the action of the target limb part of the virtual object on the live broadcast interface and the position of the target limb part.
3. The method of claim 2, further comprising:
and determining prompt information matched with the action of the target limb part of the virtual object, and displaying the prompt information on the live broadcast interface, wherein the prompt information is used for prompting a target user to execute touch operation matched with the action of the target limb part of the virtual object in the interaction area.
4. The method according to claim 2 or 3, wherein the interaction information comprises a feedback dynamic effect matched with the action of the target limb part of the virtual object.
5. The method according to claim 1, wherein if the target interaction type is the item production, the interaction area is a preset drawing board area in which an item to be produced is displayed;
the determining the interaction information corresponding to the touch operation in response to the touch operation of the target user in the interaction area comprises:
responding to the touch operation of the target user in the preset drawing board area, and determining the color selected by the target user and used for filling the article to be made;
the displaying of the interaction information at a preset position on the live broadcast layer comprises:
responding to the touch operation of the target user on the article to be manufactured, and determining the filling position selected by the target user on the live broadcast layer;
and displaying the color selected by the target user at a filling position on the live broadcast layer.
6. The method of claim 5, further comprising:
playing the communication voice of the virtual object about a target manufactured article, wherein the target manufactured article is obtained by screening manufactured articles submitted by a user in the virtual object live broadcast room according to a preset selection strategy;
displaying the target production item on the virtual object.
7. The method according to claim 1, wherein if the target interaction type is the item finding, the interaction area is an area covering an item to be found in the live layer;
the determining the interaction information corresponding to the touch operation in response to the touch operation of the target user in the interaction area comprises:
responding to the touch operation of the target user in the interaction area, and determining the touch position of the target user;
if the touch position of the target user is matched with the position of the object to be searched in the interaction area, determining that the target user successfully searches the object to be searched, and generating a dynamic effect for representing successful object searching;
the displaying of the interaction information at the preset position on the live broadcasting layer comprises:
and displaying the dynamic effect for representing the successful finding of the article on the live broadcast layer based on the position of the article to be found.
8. The method of claim 7, further comprising:
and displaying the types of the articles to be searched and the quantity of the articles under each type which are searched successfully on the live broadcast layer.
9. The method according to claim 1, wherein if the target interaction type is the live image generation, the interaction area is a photographing operation area in the live broadcast layer;
the determining the interaction information corresponding to the touch operation in response to the touch operation of the target user in the interaction area comprises:
displaying photographing prompt information on the live broadcast interface, wherein the photographing prompt information is used for prompting the target user of conditions that a triggered photograph needs to satisfy, the conditions comprising at least one of a live broadcast background displayed in the live broadcast interface, a posture of the virtual object, and an expression of the virtual object;
determining a picture of the virtual object in response to a photographing triggering operation of a target user in the photographing operation area;
determining a photographing evaluation result of a target user based on the photograph of the virtual object;
the displaying of the interaction information at the preset position on the live broadcasting layer comprises:
and displaying the photographing evaluation result at a preset position on the live broadcast layer.
10. The method of claim 9, wherein in determining the photograph of the virtual object, further comprising:
and responding to the photographing focal length adjusting operation of the target user in the photographing operation area, and determining the photographing focal length so as to determine the picture of the virtual object based on the photographing focal length.
11. The method of claim 9, wherein determining a result of the evaluation of the photograph of the target user based on the photograph of the virtual object comprises:
and comparing the photo information of the virtual object with the standard photo information to determine the photographing evaluation result of the target user.
12. The method of claim 11, wherein the photo information of the virtual object comprises at least one of the photograph trigger time, the live background, the pose of the virtual object, the expression of the virtual object, and a target item on the virtual object;
the standard photo information comprises at least one of standard photographing triggering time, standard live broadcast background, standard photographing posture of the virtual object, standard photographing expression of the virtual object and standard target object on the virtual object.
13. The method of claim 9, further comprising:
determining the type information of the photo to be photographed based on the photographing prompt information;
determining the total shooting times of a target user based on the type information of the photo to be photographed, wherein the total shooting times are used for determining the remaining shooting times of the target user in the process of determining the photo of the virtual object;
the method further comprises: displaying the remaining shooting times of the target user in a shooting times display area on the live broadcast layer.
14. A live interactive device configured for a terminal entering a virtual object live room, comprising:
the interactive area determining module is used for determining an interactive area on a live broadcasting layer, wherein the live broadcasting layer is superposed on a live broadcasting interface of the virtual object, the live broadcasting interface refers to an interface which is synchronously displayed in each terminal entering a live broadcasting room of the virtual object, the content displayed on the live broadcasting layer and the live broadcasting content displayed on the live broadcasting interface are jointly used as the content displayed on the terminal of a target user, the interactive area and the live broadcasting interface keep frame synchronization, and the content displayed on the live broadcasting layer is personalized display content aiming at the target user;
the interaction information determining module is used for responding to the touch operation of the target user in the interaction area and determining interaction information corresponding to the touch operation;
the interactive information display module is used for displaying the interactive information at a preset position on the live broadcast layer;
the interactive area determination module comprises:
the target interaction type determining unit is used for searching a target interaction type matched with the live broadcast content in the multiple interaction types according to the live broadcast content displayed on the live broadcast interface;
the interactive area determining unit is used for determining an interactive area on a live broadcast layer according to the target interactive type;
the multiple interaction types comprise limb interaction, article making, article searching and live image generation;
the shared information display module is used for displaying the shared information corresponding to the target interaction type on the live broadcast interface; and obtaining the shared information based on a touch result corresponding to the touch operation of the user in the virtual object live broadcast room.
15. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program that, when executed by the processor, performs the live interaction method of any of claims 1-13.
16. A computer-readable storage medium, having stored thereon a computer program that, when executed by a computing device, causes the computing device to implement the live interaction method of any of claims 1-13.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110370982.3A CN113115061B (en) | 2021-04-07 | 2021-04-07 | Live broadcast interaction method and device, electronic equipment and storage medium |
PCT/CN2022/076542 WO2022213727A1 (en) | 2021-04-07 | 2022-02-17 | Live broadcast interaction method and apparatus, and electronic device and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110370982.3A CN113115061B (en) | 2021-04-07 | 2021-04-07 | Live broadcast interaction method and device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113115061A CN113115061A (en) | 2021-07-13 |
CN113115061B true CN113115061B (en) | 2023-03-10 |
Family
ID=76714563
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110370982.3A Active CN113115061B (en) | 2021-04-07 | 2021-04-07 | Live broadcast interaction method and device, electronic equipment and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113115061B (en) |
WO (1) | WO2022213727A1 (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113115061B (en) * | 2021-04-07 | 2023-03-10 | 北京字跳网络技术有限公司 | Live broadcast interaction method and device, electronic equipment and storage medium |
CN113596496A (en) * | 2021-07-28 | 2021-11-02 | 广州博冠信息科技有限公司 | Interaction control method, device, medium and electronic equipment for virtual live broadcast room |
CN115278285B (en) * | 2022-07-26 | 2024-01-30 | 北京字跳网络技术有限公司 | Live broadcast picture display method and device, electronic equipment and storage medium |
CN115842936A (en) * | 2022-12-02 | 2023-03-24 | 上海哔哩哔哩科技有限公司 | Multi-anchor live broadcasting method and device |
CN116030191B (en) * | 2022-12-21 | 2023-11-10 | 北京百度网讯科技有限公司 | Method, device, equipment and medium for displaying virtual object |
CN115937430B (en) * | 2022-12-21 | 2023-10-10 | 北京百度网讯科技有限公司 | Method, device, equipment and medium for displaying virtual object |
CN116843800B (en) * | 2023-08-29 | 2023-11-24 | 深圳有咖互动科技有限公司 | Animation information transmission method, device, electronic equipment and computer readable medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110850983A (en) * | 2019-11-13 | 2020-02-28 | 腾讯科技(深圳)有限公司 | Virtual object control method and device in video live broadcast and storage medium |
CN112135160A (en) * | 2020-09-24 | 2020-12-25 | 广州博冠信息科技有限公司 | Virtual object control method and device in live broadcast, storage medium and electronic equipment |
CN112533017A (en) * | 2020-12-01 | 2021-03-19 | 广州繁星互娱信息科技有限公司 | Live broadcast method, device, terminal and storage medium |
CN112601100A (en) * | 2020-12-11 | 2021-04-02 | 北京字跳网络技术有限公司 | Live broadcast interaction method, device, equipment and medium |
CN112616063A (en) * | 2020-12-11 | 2021-04-06 | 北京字跳网络技术有限公司 | Live broadcast interaction method, device, equipment and medium |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109618177B (en) * | 2018-12-26 | 2020-02-28 | 北京微播视界科技有限公司 | Video processing method and device, electronic equipment and computer readable storage medium |
SG11202111640RA (en) * | 2019-04-30 | 2021-11-29 | Guangzhou Huya Information Technology Co Ltd | Virtual image control method, apparatus, electronic device and storage medium |
CN110634483B (en) * | 2019-09-03 | 2021-06-18 | 北京达佳互联信息技术有限公司 | Man-machine interaction method and device, electronic equipment and storage medium |
CN112135154B (en) * | 2020-09-08 | 2022-08-05 | 网易(杭州)网络有限公司 | Live broadcast room interaction method, electronic equipment and storage medium |
CN112330819B (en) * | 2020-11-04 | 2024-02-06 | 腾讯科技(深圳)有限公司 | Interaction method and device based on virtual article and storage medium |
CN113115061B (en) * | 2021-04-07 | 2023-03-10 | 北京字跳网络技术有限公司 | Live broadcast interaction method and device, electronic equipment and storage medium |
- 2021-04-07: CN application CN202110370982.3A filed; granted as CN113115061B (status: Active)
- 2022-02-17: PCT application PCT/CN2022/076542 filed (published as WO2022213727A1, Application Filing)
Also Published As
Publication number | Publication date |
---|---|
CN113115061A (en) | 2021-07-13 |
WO2022213727A1 (en) | 2022-10-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113115061B (en) | Live broadcast interaction method and device, electronic equipment and storage medium | |
CN112334886B (en) | Content distribution system, content distribution method, and recording medium | |
CN107294838B (en) | Animation generation method, device and system for social application and terminal | |
KR102701209B1 (en) | Selecting virtual objects in a three-dimensional space | |
CN105210373B (en) | Provide a user the method and system of personalized channels guide | |
CN107633441A (en) | Commodity in track identification video image and the method and apparatus for showing merchandise news | |
CN108230428B (en) | E-book rendering method, electronic equipment and storage medium based on augmented reality | |
CN117980962A (en) | Apparatus, method and graphical user interface for content application | |
CN111242682B (en) | Article display method | |
JP2013513304A (en) | System and method for determining proximity of media objects in a 3D media environment | |
CN106464773A (en) | Augmented reality apparatus and method | |
CN113518264A (en) | Interaction method, device, terminal and storage medium | |
CN107635153A (en) | A kind of exchange method and system based on image data | |
CN112261481A (en) | Interactive video creating method, device and equipment and readable storage medium | |
KR102200239B1 (en) | Real-time computer graphics video broadcasting service system | |
CN116596611A (en) | Commodity object information display method and electronic equipment | |
CN112823528B (en) | Information processing device, information processing method, and information processing program | |
CN114173173A (en) | Barrage information display method and device, storage medium and electronic equipment | |
JP2015220651A (en) | Try-on simulation experience system, control method for the same, and computer program | |
JP6684306B2 (en) | Terminal device, video distribution device, program | |
JP6568246B2 (en) | GAME PROGRAM, METHOD, AND INFORMATION PROCESSING DEVICE | |
CN113194329B (en) | Live interaction method, device, terminal and storage medium | |
JP6453500B1 (en) | GAME PROGRAM, METHOD, AND INFORMATION PROCESSING DEVICE | |
CN111145088A (en) | Projection style rendering method and system suitable for viewing space | |
JP7335536B2 (en) | Computer program, information processing device and information processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||