CN113423017A - Live broadcast picture display method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN113423017A
CN113423017A (application CN202110687210.2A; granted publication CN113423017B)
Authority
CN
China
Prior art keywords
user
live broadcast
interaction
virtual scene
control
Prior art date
Legal status
Granted
Application number
CN202110687210.2A
Other languages
Chinese (zh)
Other versions
CN113423017B (en
Inventor
胡珏
陈明华
杨晗
朱春林
韩龙
许兆博
初明洋
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202110687210.2A priority Critical patent/CN113423017B/en
Publication of CN113423017A publication Critical patent/CN113423017A/en
Application granted granted Critical
Publication of CN113423017B publication Critical patent/CN113423017B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    All under H04N (Pictorial communication, e.g. television), H04N21/00 (Selective content distribution, e.g. interactive television or video on demand):
    • H04N21/4781 Supplemental services: games
    • H04N21/2187 Source of audio or video content: live feed
    • H04N21/2541 Management at additional data server: rights management
    • H04N21/25875 Management of end-user data involving end-user authentication
    • H04N21/4758 End-user interface for inputting end-user data for providing answers, e.g. voting
    • H04N21/4788 Supplemental services communicating with other users, e.g. chatting
    • H04N21/4884 Data services, e.g. news ticker, for displaying subtitles

Abstract

The present application relates to a live broadcast picture display method and apparatus, a computer device, and a storage medium, in the technical field of live broadcasting. The method comprises the following steps: displaying a first live broadcast picture in a live broadcast interface, the first live broadcast picture being a scene picture of a virtual scene updated based on a user operation instruction issued by a first user in the virtual scene; in response to the first user triggering a live broadcast interaction in the virtual scene, displaying an interaction control overlaid on the first live broadcast picture; and displaying a second live broadcast picture in the live broadcast interface, the second live broadcast picture being a scene picture of the virtual scene updated based on an interaction instruction corresponding to operations of at least one second user on the interaction control. In this way, a second user, while watching the anchor's live broadcast, can intervene in the selection of event options within the first user's live content. This expands the modes of interaction available to users while watching a live broadcast, increases audience participation in the live content, and thereby improves the interaction effect during the broadcast.

Description

Live broadcast picture display method and device, computer equipment and storage medium
Technical Field
The embodiments of the present application relate to the technical field of live broadcasting, and in particular to a live broadcast picture display method and apparatus, a computer device, and a storage medium.
Background
With the popularization of mobile terminal applications and the development of Internet technologies, interaction during live broadcasts has become a new demand for live streaming services.
In the related art, a viewer watching a live broadcast can interact with the anchor by sending bullet-screen comments, liking, or posting in the comment area.
However, with these interaction methods, the viewer's experience is largely limited to watching the anchor's operations, and interaction with the anchor is confined to functions the platform provides outside the live content itself. As a result, viewer participation in the live content is low and the interaction effect is poor.
Disclosure of Invention
The embodiments of the present application provide a live broadcast picture display method and apparatus, a computer device, and a storage medium, which can improve the interaction effect during live broadcasting. The technical scheme is as follows:
in one aspect, a method for displaying a live view is provided, where the method includes:
displaying a first live broadcast picture in a live broadcast interface, wherein the first live broadcast picture is a scene picture of a virtual scene updated based on a user operation instruction; the user operation instruction is an operation instruction corresponding to user operation executed by a first user in the virtual scene; the first user is a user who live broadcasts the virtual scene;
in response to the first user triggering a live broadcast interaction in the virtual scene, displaying an interaction control overlaid on the first live broadcast picture; the interaction control is used for indicating selection of an event option in the virtual scene, the event options being at least two options existing at a specified event node; the specified event node is an event node corresponding to the live broadcast interaction in the virtual scene;
displaying a second live broadcast picture in the live broadcast interface, wherein the second live broadcast picture is a scene picture of the virtual scene updated based on an interaction instruction, and the interaction instruction is an operation instruction determined based on the interaction operation of at least one second user on the interaction control; the second user comprises a user watching a live broadcast picture corresponding to the virtual scene.
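For illustration only, the three display steps above can be sketched as a minimal viewer-side state holder. All class, method, and string names below are hypothetical and are not part of the disclosure; the sketch only mirrors the order of operations described in this aspect.

```python
from dataclasses import dataclass, field


@dataclass
class LiveInterface:
    """Minimal sketch of the viewer-side display flow described above."""
    current_picture: str = ""
    overlay_controls: list = field(default_factory=list)

    def show_first_picture(self, user_operation: str) -> None:
        # Step 1: the scene picture follows the first user's (anchor's) operations.
        self.current_picture = f"scene updated by anchor operation: {user_operation}"

    def on_live_interaction_triggered(self, event_options: list) -> None:
        # Step 2: overlay an interaction control listing the event options;
        # at least two options exist at the specified event node.
        assert len(event_options) >= 2
        self.overlay_controls = list(event_options)

    def show_second_picture(self, interaction_instruction: str) -> None:
        # Step 3: the scene picture is updated based on the interaction
        # instruction determined from the second users' operations.
        self.overlay_controls = []
        self.current_picture = f"scene updated by viewers: {interaction_instruction}"


ui = LiveInterface()
ui.show_first_picture("move forward")
ui.on_live_interaction_triggered(["enter building A", "enter building B"])
ui.show_second_picture("enter building A")
```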
In another aspect, a method for displaying a live view is provided, the method including:
displaying a scene interface corresponding to the virtual scene;
displaying a first scene picture in the scene interface in response to receiving a user operation instruction; the first scene picture is a scene picture of the virtual scene updated based on the user operation instruction; the user operation instruction is an operation instruction corresponding to user operation executed by a first user in the virtual scene; the first user is a user who live broadcasts the virtual scene;
responding to a live broadcast interaction triggered by the first user in the virtual scene, displaying a second scene picture in the scene interface, wherein the second scene picture is a scene picture of the virtual scene updated based on an interaction instruction, the interaction instruction is an operation instruction determined based on an interaction operation of at least one second user on an interaction control, the interaction control is a control displayed in the live broadcast interface displayed by a terminal of the second user, the interaction control is used for indicating selection of an event option in the virtual scene, and the event option is at least two options existing in a specified event node; the specified event node is an event node corresponding to the live broadcast interaction in the virtual scene; the second user comprises a user watching the live broadcast picture corresponding to the virtual scene.
In another aspect, there is provided a live view display apparatus, the apparatus including:
the first picture display module is used for displaying a first live broadcast picture in a live broadcast interface, wherein the first live broadcast picture is a scene picture of a virtual scene updated based on a user operation instruction; the user operation instruction is an operation instruction corresponding to a user operation executed by a first user in the virtual scene; the first user is a user who live broadcasts the virtual scene;
the interactive control display module is used for, in response to the first user triggering a live broadcast interaction in the virtual scene, displaying an interaction control overlaid on the first live broadcast picture; the interaction control is used for indicating selection of an event option in the virtual scene, the event options being at least two options existing at a specified event node; the specified event node is an event node corresponding to the live broadcast interaction in the virtual scene;
the second picture display module is used for displaying a second live broadcast picture in the live broadcast interface, the second live broadcast picture is a scene picture of the virtual scene updated based on an interaction instruction, and the interaction instruction is an operation instruction determined based on the interaction operation of at least one second user on the interaction control; the second user comprises a user watching a live broadcast picture corresponding to the virtual scene.
In a possible implementation manner, the second picture display module is configured to display the second live broadcast picture in the live broadcast interface in response to the display duration of the interactive control reaching a duration threshold.
In one possible implementation, the apparatus further includes:
and a duration information display module, used for, in response to the first user triggering a live broadcast interaction in the virtual scene, displaying duration information overlaid on the first live broadcast picture, wherein the duration information is used for indicating the display duration of the interaction control.
In a possible implementation manner, the duration information display module is configured to cancel displaying the interactive control in response to the display duration of the interactive control reaching the duration threshold.
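A possible realization of this duration-threshold behavior is sketched below. The class name, the threshold value, and the time source are all assumptions for illustration; the disclosure specifies only that the control is cancelled once its display duration reaches the threshold.

```python
class InteractionControlTimer:
    """Sketch: show the control, track its display duration, and cancel
    it once the duration threshold is reached (assumed names)."""

    def __init__(self, duration_threshold: float):
        self.duration_threshold = duration_threshold
        self.shown_at = None
        self.visible = False

    def show(self, now: float) -> None:
        # Overlay the interaction control and start timing.
        self.shown_at = now
        self.visible = True

    def remaining(self, now: float) -> float:
        # Duration information that can be overlaid on the first live picture.
        return max(0.0, self.duration_threshold - (now - self.shown_at))

    def tick(self, now: float) -> None:
        # Cancel displaying the control when the threshold is reached.
        if self.visible and now - self.shown_at >= self.duration_threshold:
            self.visible = False


timer = InteractionControlTimer(duration_threshold=30.0)
timer.show(now=0.0)
timer.tick(now=10.0)   # still within the threshold: control stays visible
timer.tick(now=30.0)   # threshold reached: control display is cancelled
```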
In one possible implementation manner, the interactive control comprises at least two interactive sub-controls;
the interaction instruction is an operation instruction determined based on a target interaction sub-control, and the target interaction sub-control is an interaction sub-control determined based on the selection operation of at least one second user in the at least two interaction sub-controls.
In a possible implementation manner, the target interactive sub-control is obtained based on a candidate interactive sub-control, and the candidate interactive sub-control is a sub-control with the largest number of received selection operations in the at least two interactive sub-controls.
In a possible implementation manner, the interactive control display module is configured to, in response to a trigger of a specified event in the virtual scene based on a user operation performed by the first user in the virtual scene, display the interactive control on the first live-view screen in an overlapping manner.
In a possible implementation manner, the second user includes a user having a control right for the virtual scene among users watching the live broadcast.
In a possible implementation manner, when the method is executed by a viewing terminal, i.e. the terminal of a user watching the live broadcast picture corresponding to the virtual scene, the apparatus further includes:
the permission application control display module is used for displaying a permission application control in the live broadcast interface;
the interactive control display module is used for, in response to a permission application instruction being approved and the first user triggering a live broadcast interaction in the virtual scene, displaying the interaction control in an operable state overlaid on the first live broadcast picture; the permission application instruction is an application instruction sent in response to receiving a selection operation on the permission application control.
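One way this permission gate could behave on the viewing terminal is sketched below. All names are hypothetical, and the "read_only" state for viewers without permission is an assumption added for illustration; the disclosure only states that the control is operable after the application is approved.

```python
class PermissionGate:
    """Sketch: the interaction control becomes operable only after the
    viewer's permission application has been approved."""

    def __init__(self):
        self.approved = False
        self.control_state = "hidden"

    def apply_for_permission(self, server_decision: bool) -> None:
        # The application instruction is sent when the viewer selects
        # the permission application control; the server decides.
        self.approved = server_decision

    def on_live_interaction(self) -> None:
        # When the anchor triggers a live interaction, overlay the
        # control; it is operable only for approved viewers.
        self.control_state = "operable" if self.approved else "read_only"


gate = PermissionGate()
gate.on_live_interaction()
assert gate.control_state == "read_only"
gate.apply_for_permission(server_decision=True)
gate.on_live_interaction()
assert gate.control_state == "operable"
```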
In one possible implementation, in response to the method being performed by a terminal, the apparatus further includes:
and the confirmation information sending module is used for responding to the received selection operation based on the at least two interactive sub-controls and sending selection confirmation information to the server so that the server determines the target interactive sub-control based on the selection confirmation information.
In a possible implementation manner, the acknowledgment information sending module includes:
the input box display sub-module is used for responding to the received selection operation based on the at least two interactive sub-controls and displaying an information input box in the live broadcast interface;
and the confirmation information sending submodule is used for responding to the received information input completion operation and sending the selection confirmation information containing the text information to the server so that the server synchronously displays the text information to the live broadcast picture corresponding to the virtual scene.
In another aspect, there is provided a live view display apparatus, the apparatus including:
the scene interface display module is used for displaying a scene interface corresponding to the virtual scene;
the third picture display module is used for responding to the received user operation instruction and displaying the first scene picture in the scene interface; the first scene picture is a scene picture of the virtual scene updated based on the user operation instruction; the user operation instruction is an operation instruction corresponding to user operation executed by a first user in the virtual scene; the first user is a user who live broadcasts the virtual scene;
a fourth picture display module, configured to, in response to a live broadcast interaction being triggered in the virtual scene by the first user, display a second scene picture in the scene interface, where the second scene picture is a scene picture of the virtual scene updated based on an interaction instruction, the interaction instruction is an operation instruction determined based on an interaction operation of at least one second user on an interaction control, the interaction control is a control displayed in the live broadcast interface displayed at a terminal of the second user, and the interaction control is used to instruct to select an event option in the virtual scene, where the event option is at least two options existing in a specified event node; the specified event node is an event node corresponding to the live broadcast interaction in the virtual scene; the second user comprises a user watching the live broadcast picture corresponding to the virtual scene.
In another aspect, a computer device is provided, which includes a processor and a memory, wherein at least one computer program is stored in the memory, and the at least one computer program is loaded by the processor and executed to implement the live view display method as described above.
In another aspect, a computer-readable storage medium is provided, in which at least one computer program is stored, the computer program being loaded and executed by a processor to implement the live view display method as described above.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device executes the live view display method provided in the above-mentioned various optional implementations.
The technical scheme provided by the application can comprise the following beneficial effects:
during the first user's live broadcast, in response to a live broadcast interaction being triggered, the interaction control is displayed overlaid on the current live broadcast picture corresponding to the virtual scene, so that users watching the live broadcast can intervene, through the interaction control, in the selection of event options in the first user's live content. This expands the modes of interaction available to users while watching a live broadcast, increases audience participation in the live content, and thereby improves the interaction effect during live broadcasting.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic diagram illustrating a live system provided by an exemplary embodiment of the present application;
fig. 2 is a schematic diagram illustrating a structure of a terminal according to an exemplary embodiment of the present application;
FIG. 3 is a flow chart illustrating a live view display method provided by an exemplary embodiment of the present application;
fig. 4 illustrates a flow chart of a live view display method according to an exemplary embodiment of the present application;
FIG. 5 is a diagram illustrating a display of an interactive child control according to an exemplary embodiment of the present application;
fig. 6 is a diagram illustrating a live screen display method in a cloud game mode according to an exemplary embodiment of the present application;
fig. 7 is a flowchart illustrating a live view display method according to an exemplary embodiment of the present application;
FIG. 8 is a diagram illustrating a live view in accordance with an exemplary embodiment of the present application;
FIG. 9 illustrates a schematic diagram of a voting process shown in an exemplary embodiment of the present application;
FIG. 10 shows a schematic diagram of a voting process shown in an exemplary embodiment of the present application;
fig. 11 is a flowchart illustrating a live view display method according to an exemplary embodiment of the present application;
fig. 12 is a block diagram illustrating a live view display apparatus according to an exemplary embodiment of the present application;
fig. 13 is a block diagram illustrating a live view display apparatus according to an exemplary embodiment of the present application;
FIG. 14 is a block diagram illustrating the structure of a computer device in accordance with an exemplary embodiment;
FIG. 15 is a block diagram illustrating the structure of a computer device according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
Fig. 1 shows a schematic diagram of a live system provided in an exemplary embodiment of the present application, and as shown in fig. 1, the live system 100 includes a server 110, a terminal 120, and a terminal 130.
The server 110 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, a middleware service, a domain name service, a security service, a CDN (Content Delivery Network), a big data and artificial intelligence platform, and the like.
The terminal 120 runs an application supporting the live broadcast function, and each user can access the server of that application through the terminal 120 to perform live broadcasting. Further, the terminal 120 may be a terminal device having a network connection function and an interface display function; for example, the terminal 120 may be a smartphone, a tablet computer, an e-book reader, smart glasses, a smart watch, a smart television, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, a desktop computer, and the like.
The terminal 130 runs an application supporting the live-viewing function, and each user can access the server of that application through the terminal 130 to watch live broadcasts. Further, the terminal 130 may be a terminal device having a network connection function and an interface display function; for example, the terminal 130 may be a smartphone, a tablet computer, an e-book reader, smart glasses, a smart watch, a smart television, an MP3 player, an MP4 player, a laptop computer, a desktop computer, and the like.
The terminal 120 and the terminal 130 may be the same terminal device; that is, the same terminal may run both an application supporting the live broadcast function and an application supporting the live-viewing function. Optionally, the application supporting the live broadcast function and the application supporting the live-viewing function may be the same application. Alternatively, the terminal 120 and the terminal 130 are different terminal devices, and optionally the application supporting the live broadcast function and the application supporting the live-viewing function may be different applications; this is not limited in the present application.
Fig. 2 illustrates a schematic structural diagram of a terminal according to an exemplary embodiment of the present application. The terminal may be implemented as the terminal 120 or the terminal 130 shown in fig. 1, and as shown in fig. 2, the terminal includes a main board 131, an external input/output device 132, a memory 133, an external interface 134, a capacitive touch system 135, and a power supply 136.
The main board 131 is integrated with a processor, a controller, and other processing elements.
The external input/output devices 132 may include a display component (e.g., a display screen), a sound playing component (e.g., a speaker), a sound collecting component (e.g., a microphone), and various keys.
The memory 133 has program codes and data stored therein.
The external interface 134 may include a headset interface, a charging interface, a data interface, and the like.
The capacitive touch system 135 may be integrated in a display component or a key of the external input/output device 132, and the capacitive touch system 135 is used to detect a touch operation performed by a user on the display component or the key.
The power supply 136 is used to power various other components in the terminal.
The terminal 120 and the server 110 are connected through a communication network. Optionally, the communication network is a wired network or a wireless network.
Optionally, the system may further include a management device (not shown in fig. 1), which is connected to the server 110 through a communication network. Optionally, the communication network is a wired network or a wireless network.
Optionally, the wireless network or wired network described above uses standard communication techniques and/or protocols. The Network is typically the Internet, but may be any Network including, but not limited to, a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a mobile, wireline or wireless Network, a private Network, or any combination of virtual private networks. In some embodiments, data exchanged over a network is represented using techniques and/or formats including Hypertext Mark-up Language (HTML), Extensible Markup Language (XML), and the like. All or some of the links may also be encrypted using conventional encryption techniques such as Secure Socket Layer (SSL), Transport Layer Security (TLS), Virtual Private Network (VPN), Internet Protocol Security (IPsec). In other embodiments, custom and/or dedicated data communication techniques may also be used in place of, or in addition to, the data communication techniques described above.
Fig. 3 is a flowchart illustrating a live broadcast picture display method according to an exemplary embodiment of the present application. The method may be executed by a computer device, which may be implemented as the terminal or as the server shown in fig. 1. Taking execution by a terminal as an example, as shown in fig. 3, the live broadcast picture display method includes:
step 310, displaying a first live-broadcast picture in a live-broadcast interface, wherein the first live-broadcast picture is a scene picture of a virtual scene updated based on a user operation instruction; the user operation instruction is an operation instruction corresponding to user operation executed by a first user in a virtual scene, and the first user is a user for live broadcasting the virtual scene.
In this embodiment, the virtual scene used by the anchor (i.e., the first user) for live broadcasting may be the scene displayed (or provided) by an application running on the terminal or on a cloud server. The virtual scene may be a simulated environment of the real world, a semi-simulated and semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, and a three-dimensional virtual scene. Virtual scenes are typically generated by an application in a computer device, such as a terminal, and are presented based on hardware (e.g., a screen) in the terminal.
The live interface may be the interface displayed on the anchor's terminal while the anchor is broadcasting; alternatively, the live interface may be the interface displayed on the terminal of a user who is watching the broadcast while the anchor is broadcasting.
In this embodiment of the present application, the first live broadcast picture is a scene picture of the virtual scene updated based on the operation instruction corresponding to a user operation executed by the first user in the virtual scene. Here, the first user refers to the anchor who initiates the live broadcast; that is, when the virtual scene changes because of the anchor's user operations, the live broadcast picture in the current live interface is the first live broadcast picture.
In this embodiment of the present application, the virtual scene may be a virtual scene that changes based on a user operation of a user. At a designated node, the virtual scene may correspond to at least two outcomes based on different user operations. Illustratively, designated nodes include, but are not limited to, opening boxes in the virtual scene, selecting props, selecting virtual character attributes/talents/skills/gain or debuff BUFFs, selecting trading articles, selecting plot options, selecting role interaction content, and the like. For example, taking control of a virtual character as an example, movement of the virtual object controlled by the first user in the virtual scene may cause the scene picture of the virtual scene displayed in the live broadcast interface to change. When the virtual object controlled by the first user moves to a specified position (reaches a designated node), two buildings are displayed in the virtual scene, and entering different buildings leads to different virtual scene pictures: when the virtual object is controlled to enter building A, the virtual scene picture inside building A is displayed, and when the virtual object is controlled to enter building B, the virtual scene picture inside building B is displayed. In this case, the live broadcast interaction may be set so that users watching the live broadcast can participate in the movement control of the virtual object, to determine whether the virtual object enters building A or building B.
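The notion of a designated node that offers at least two selectable outcomes can be sketched as follows (a minimal illustration only; `EventNode` and `triggers_live_interaction` are invented names, not terms from the patent):

```python
from dataclasses import dataclass

@dataclass
class EventNode:
    """A designated node in the virtual scene with selectable event options."""
    name: str
    options: list  # e.g. ["building A", "building B"]

def triggers_live_interaction(node: EventNode) -> bool:
    """Live broadcast interaction is triggered only when the node offers at least two options."""
    return len(node.options) >= 2

fork = EventNode("building entrance", ["building A", "building B"])
corridor = EventNode("corridor", ["walk forward"])
print(triggers_live_interaction(fork))      # True: at least two event options
print(triggers_live_interaction(corridor))  # False: a single path, no interaction
```

On a single-path node the scene simply follows the anchor's operation; only multi-option nodes open an interaction window for the audience.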
Step 320, in response to a live broadcast interaction triggered by a first user in a virtual scene, displaying an interaction control on a first live broadcast picture in an overlapping manner, wherein the interaction control is used for indicating selection of an event option in the virtual scene, and the event option is at least two options existing in a specified event node; the designated event node is an event node corresponding to live interaction in the virtual scene.
The live broadcast interaction can be triggered based on a user operation of the first user in the virtual scene; illustratively, the live broadcast interaction is triggered when it is determined, based on the user operation of the anchor in the virtual scene, that the virtual scene has at least two possible changes.
In the embodiment of the application, the interactive control is used to instruct selection of an event option in the virtual scene; that is, one or more of the event options at a specified event node in the virtual scene are selected as the chosen event option to affect the change of the virtual scene, so that a user watching the live broadcast picture corresponding to the virtual scene can intervene in the determination of event options in the virtual scene based on the interaction result of the interactive control.
The interactive content corresponding to the interactive control can be set by relevant personnel for a given interaction mode. Illustratively, the interactive content of the interactive control can be a mini game, such as coin tossing, box pushing, and the like; alternatively, the interactive content of the interactive control can be interaction with the anchor, such as liking, coin tossing, gifting, and the like; alternatively, the interactive content of the interactive control can raise a designated attribute of the anchor, such as a point value, a popularity value, and the like; alternatively, the interactive content of the interactive control can be event selection, such as voting and the like. The interactive content corresponding to the interactive control can be set differently for different virtual scenes, and the present application does not limit the interactive content of the interactive control or the virtual scenes corresponding to the various interactive contents.
Correspondingly, different interactive contents of the interactive control entail different interactive operations by the second user. Illustratively, when the interactive content corresponding to the interactive control is a mini game, the interactive operation of the second user is to complete the mini game; when the interactive content corresponding to the interactive control is event selection, the interactive operation of the second user is to select an option. Illustratively, the scenarios (specified event nodes) triggering event selection include, but are not limited to, opening a box, selecting a prop, selecting a virtual character attribute/talent/skill/gain BUFF, selecting a trading item, selecting a plot option, selecting role interaction content, and the like; correspondingly, the event options may include, but are not limited to, a plurality of boxes, a plurality of props, a plurality of selectable virtual character attributes/talents/skills/gain BUFFs, a plurality of tradeable items, a plurality of plot options, a plurality of role interaction contents, and the like.
In a possible implementation manner, the live broadcast interaction may also be triggered jointly by the first user based on the virtual scene and the live broadcast interface. Illustratively, a live broadcast interaction control is displayed in the live broadcast interface of the terminal corresponding to the anchor; in response to the virtual scene reaching a designated node, whether the live broadcast interaction is triggered is determined based on the anchor's selection operation on the live broadcast interaction control. In response to the live broadcast interaction control being in a selected state, the live broadcast interaction is triggered and the interaction control is displayed in an overlapping manner on the first live broadcast picture; in response to the live broadcast interaction control not being selected, the live broadcast interaction is not triggered, and the first live broadcast picture, namely the scene picture of the virtual scene determined based on the anchor's user operation, is still displayed in the live broadcast interface.
Step 330, displaying a second live broadcast picture in the live broadcast interface, wherein the second live broadcast picture is a scene picture of a virtual scene updated based on an interaction instruction, and the interaction instruction is an operation instruction determined based on the interaction operation of at least one second user on an interaction control; the second user comprises a user watching a live broadcast picture corresponding to the virtual scene.
The virtual scene corresponding to the second live broadcast picture is a virtual scene updated based on the interaction instruction, and the second live broadcast picture is a scene picture corresponding to the updated virtual scene.
The second user comprises a user watching the live broadcast picture corresponding to the virtual scene. In a possible implementation manner, the second user is a user watching the live broadcast picture, that is, the second live broadcast picture is determined by the result of the interactive operation on the interactive control by users watching the live broadcast picture corresponding to the virtual scene; alternatively, the second user comprises both the users watching the live broadcast picture corresponding to the virtual scene and the first user, that is, the second live broadcast picture is determined by the audience and the anchor together based on the result of the interactive operation on the interactive control. In this way, the second user can participate in the anchor's control of the virtual scene while watching the live broadcast, thereby improving the interactive effect in the live broadcast process.
When the computer device is implemented as a server, the above steps 310 to 330 may be implemented as:
the server indicates the terminal to display a first live broadcast picture in a live broadcast interface, wherein the first live broadcast picture is a scene picture of a virtual scene updated based on a user operation instruction; the user operation instruction is an operation instruction corresponding to user operation executed by a first user in a virtual scene; the first user is a user who carries out live broadcast on a virtual scene;
responding to the fact that a first user triggers live broadcast interaction in a virtual scene, and indicating a terminal to superpose and display an interaction control on a first live broadcast picture by a server; the interactive control is used for indicating that event options in the virtual scene are selected, and the event options are at least two options existing in a specified event node; the appointed event node is an event node corresponding to live broadcast interaction in a virtual scene;
the server instructs the terminal to display a second live broadcast picture in the live broadcast interface, the second live broadcast picture is a scene picture of a virtual scene updated based on an interaction instruction, and the interaction instruction is an operation instruction determined based on the interaction operation of at least one second user on the interaction control; the second user comprises a user watching a live broadcast picture corresponding to the virtual scene.
To sum up, in the live broadcast picture display method provided by the embodiment of the application, during live broadcasting by the first user, in response to the triggering of the live broadcast interaction, the interactive control is displayed in an overlapping manner on the current live broadcast picture corresponding to the virtual scene, so that a user watching the live broadcast can intervene, based on the interactive control, in the selection of event options in the first user's live broadcast content. This expands the ways users can interact while watching a live broadcast, increases the audience's participation in the live broadcast content, and further improves the interactive effect in the live broadcast process.
The interactive content corresponding to the interactive control can be set based on different types of virtual scenes, and the implementation form of the interactive content can also be set based on actual requirements. Illustratively, the interactive content can be time-limited or non-time-limited, and the present application does not limit this. Taking the interactive content corresponding to the interactive control being a time-limited event selection as an example, fig. 4 shows a flowchart of a live broadcast picture display method according to an exemplary embodiment of the present application. The live broadcast picture display method may be executed by a computer device, which may be the terminal shown in fig. 1 or may be implemented as the server shown in fig. 1. Taking the live broadcast picture display method being executed by a terminal as an example, as shown in fig. 4, the method includes:
step 410, displaying a first live-broadcast picture in a live-broadcast interface, wherein the first live-broadcast picture is a scene picture of a virtual scene updated based on a user operation instruction; the user operation instruction is an operation instruction corresponding to user operation executed by a first user in a virtual scene; the first user is a user who is live broadcasting a virtual scene.
Taking the live broadcast interaction being triggered based on a user operation of the first user in the virtual scene as an example: in one possible implementation manner, in response to a specified event in the virtual scene being triggered by a user operation executed by the first user in the virtual scene, it is confirmed that the live broadcast interaction is triggered, and the interaction control is displayed in an overlapping manner on the first live broadcast picture. The specified event is used to indicate that at least two event options in the virtual scene are triggered, and the at least two event options respectively correspond to different changes of the virtual scene. Taking a card game as an example, when a card-flipping round is entered based on a user operation of the first user, it is confirmed that the live broadcast interaction is triggered; in this scenario, the specified event is entering the card-flipping round, and at least two candidate cards (event options) are displayed in the virtual scene for the first user to select from, determining the virtual card chosen from the plurality of cards. Alternatively, taking a plot-driven game as an example, when a user operation of the first user triggers interaction with a virtual character in the virtual scene and the interactive content with the virtual object needs to be determined from a plurality of interaction options, or a plot branch node is entered and the plot direction needs to be determined from a plurality of possible directions, it is confirmed that the live broadcast interaction is triggered; in this scenario, at least two candidate event options are displayed in the virtual scene for the first user to select, determining the interactive content with the virtual character or the next direction of the plot.
Step 420, in response to a live broadcast interaction triggered by a first user in a virtual scene, displaying an interaction control and duration information in an overlapping manner on a first live broadcast picture, wherein the duration information is used for indicating the display duration of the interaction control; the interactive control is used for indicating that event options in the virtual scene are selected, and the event options are at least two options existing at a specified event node; the specified event node is an event node corresponding to the live broadcast interaction in the virtual scene.
In a scenario where the live broadcast interaction has a time-limit attribute, a timer is provided in the computer device to record how long the interaction control has been displayed in an overlapping manner on the first live broadcast picture. To make the display duration of the interaction control visible, duration information is displayed in an overlapping manner on the first live broadcast picture. The duration information can be displayed as a countdown indicating the elapsed display duration and the remaining display duration of the interaction control, where the remaining display duration equals the difference between a duration threshold and the elapsed display duration, and the duration threshold indicates the effective duration of the interaction control; when the duration information is a countdown, its displayed value gradually decreases over time. Alternatively, the duration information may be displayed as a count-up timer to remind users that the interaction control is being displayed.
To prevent overly long interactive operations based on the interaction control from degrading the operating and viewing experience, in one possible implementation manner, the display of the interaction control is canceled in response to the display duration of the interaction control reaching the duration threshold, where canceling the display may be implemented by hiding or deleting the interaction control.
Optionally, the display state of the interactive control may be set to a locked state (i.e., a non-selectable state), so that the second user cannot operate the interactive control, and further cannot affect the change of the virtual scene picture.
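The duration-threshold behavior described above — a countdown equal to the threshold minus the elapsed display time, plus a locked (non-selectable) state once the threshold is reached — can be sketched as follows (a hypothetical illustration; the class and method names are assumptions, not part of the patent):

```python
class InteractionControl:
    """Sketch of a time-limited interaction control (names are illustrative)."""

    def __init__(self, duration_threshold: float):
        self.duration_threshold = duration_threshold  # effective duration of the control
        self.elapsed = 0.0
        self.locked = False  # locked == non-selectable state

    def tick(self, dt: float) -> None:
        """Advance the timer; lock the control once the threshold is reached."""
        if self.locked:
            return
        self.elapsed = min(self.elapsed + dt, self.duration_threshold)
        if self.elapsed >= self.duration_threshold:
            self.locked = True

    @property
    def remaining(self) -> float:
        """Countdown shown to viewers: duration threshold minus elapsed display time."""
        return self.duration_threshold - self.elapsed

ctrl = InteractionControl(duration_threshold=30.0)
ctrl.tick(12.0)
print(ctrl.remaining, ctrl.locked)  # 18.0 False
ctrl.tick(20.0)
print(ctrl.remaining, ctrl.locked)  # 0.0 True — control is now non-selectable
```

Whether a locked control is then hidden, deleted, or kept visible in a greyed-out state is a presentation choice; the text allows any of these.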
In the event selection scenario, optionally, the interactive control includes at least two interactive sub-controls, and each interactive sub-control corresponds to a different event option in the virtual scenario.
The control content of the at least two interactive sub-controls and the number of the interactive sub-controls both correspond to at least two event options in the virtual scene, and illustratively, when the at least two interactive sub-controls are implemented as at least two options in the voting assembly, the number and content of the options in the voting assembly are consistent with the at least two event options in the virtual scene.
Further, the display order of the at least two interactive sub-controls is consistent with the display order of the at least two event options in the virtual scene. Fig. 5 shows a schematic diagram of a display manner of interactive sub-controls according to an exemplary embodiment of the present application, taking a card game interface as an example. As shown in fig. 5, when the virtual scene enters a card selection page based on a user operation of the first user, that is, when the virtual scene picture is displayed as the card selection page, the live broadcast interaction is triggered, and at least two interactive sub-controls are additionally displayed on the scene picture (the first live broadcast picture) of the current virtual scene. As shown in fig. 5, three selectable virtual cards 510 are displayed in the virtual scene picture, and correspondingly three selectable interactive sub-controls 520 are additionally displayed on the current virtual scene picture; the three interactive sub-controls respectively correspond to the three virtual cards displayed in the virtual scene picture, so that the second user can affect the card selection result in the virtual scene based on the interactive sub-controls.
Step 430, in response to the display duration of the interactive control reaching the duration threshold, displaying a second live broadcast picture in the live broadcast interface.
The second live broadcast picture is a scene picture of a virtual scene updated based on an interaction instruction, the interaction instruction is an operation instruction determined based on interaction operation of at least one second user on the interaction control, and the second user comprises a user watching the live broadcast picture corresponding to the virtual scene.
And when the interactive control comprises at least two interactive sub-controls, the interactive instruction is an operation instruction determined based on a target interactive sub-control, and the target interactive sub-control is an interactive sub-control determined based on the selection operation of at least one second user in the at least two interactive sub-controls. The number of the target interactive sub-controls can be one or more, that is, the computer device can determine one or more target interactive sub-controls based on the selection operation of at least one second user, correspondingly, the interactive sub-controls can be selected singly or in multiple ways based on the setting of the virtual scene, and the number of the target interactive sub-controls is not limited in the application.
In a time-limited event selection scenario, when the display duration of the interactive control reaches the specified duration threshold, it is determined that each user's operation on the virtual environment is complete, and the target interactive sub-control is determined based on the second users' selection operations on the respective selection controls. Each second user may select one or more of the interactive sub-controls in the interactive control, and the present application does not limit how many interactive sub-controls each user may select. The second user may comprise only audience users, or may comprise both audience users and the anchor.
Taking the number of target interactive sub-controls being 1 as an example: optionally, the target interactive sub-control is obtained from candidate interactive sub-controls, where a candidate interactive sub-control is the sub-control that received the largest number of selection operations among the at least two interactive sub-controls; or, a candidate interactive sub-control is the sub-control whose received selection operations account for the largest proportion of the total number of selection operations among the at least two interactive sub-controls.
In response to the number of candidate interactive sub-controls among the at least two interactive sub-controls being 1, the candidate interactive sub-control is the target interactive sub-control;
in response to the number of candidate interactive sub-controls among the at least two interactive sub-controls being greater than 1, the target interactive sub-control is one of the candidate interactive sub-controls chosen at random.
That is, when only one sub-control with the largest number of selections exists among the at least two interactive sub-controls, that sub-control is taken as the target interactive sub-control. When multiple sub-controls are tied for the largest number of selections, the computer device may randomly select one of them as the target interactive sub-control, or may select one according to the display order of the sub-controls; illustratively, among the sub-controls with the largest number of selections, the one displayed first may be selected as the target interactive sub-control, or the one displayed last may be selected. Alternatively, one of the multiple tied sub-controls (candidate sub-controls) may be selected as the target interactive sub-control based on a selection operation of the first user.
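The tally-and-tie-break rule above can be sketched as follows (an illustrative sketch only; the function name and data shapes are assumptions, and the random tie-break matches one of the variants the text describes):

```python
import random
from collections import Counter

def pick_target_sub_control(votes, display_order, rng=random):
    """Tally selection operations per sub-control and return the target one.

    votes: iterable of sub-control ids chosen by the second users.
    display_order: sub-control ids in on-screen order (kept here so a
    deterministic first/last tie-break could be substituted if desired).
    """
    counts = Counter(votes)
    if not counts:
        return None
    top = max(counts.values())
    candidates = [c for c in display_order if counts.get(c, 0) == top]
    if len(candidates) == 1:
        return candidates[0]  # a unique maximum: this is the target sub-control
    # several sub-controls tie for the largest count: choose one at random
    return rng.choice(candidates)

print(pick_target_sub_control(["A", "B", "B", "C"], ["A", "B", "C"]))  # B
```

Swapping `rng.choice(candidates)` for `candidates[0]` or `candidates[-1]` gives the first-displayed or last-displayed tie-break variants mentioned in the text.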
When the live broadcast picture display method is executed by the terminal, in response to receiving selection operation based on at least two interactive sub-controls, sending selection confirmation information to the server, so that the server determines a target interactive sub-control based on the selection confirmation information. The server is used for receiving, verifying and counting selection confirmation information so as to collect the selection conditions of the sub-controls and further determine the target interactive sub-control from at least two sub-controls.
In one possible implementation manner, in response to receiving a selection operation based on at least two interactive sub-controls, an information input box is displayed in a live broadcast interface;
and in response to receiving the information input completion operation, sending selection confirmation information containing the text information to the server so that the server synchronously displays the text information into a live broadcast picture corresponding to the virtual scene.
Illustratively, taking the card game scene shown in fig. 5 as an example, when a user selects the second of the at least two interactive sub-controls, an information input box is displayed in the live broadcast interface; the user may input the reason (text information) for selecting the second sub-control in the information input box, and after the input is confirmed to be complete, that is, after the terminal receives the information input completion operation, selection confirmation information containing the text information is sent to the server, so that the server synchronously displays the text information in the live broadcast picture corresponding to the virtual scene in each terminal. Optionally, the text information is displayed together with the selection confirmation information; for example, the control selected by the user and the corresponding text information may be displayed in comment form in the comment area of the live broadcast interface, or may be displayed in bullet-screen form in the live broadcast picture.
Illustratively, in a card scene, when user A has selected among multiple cards, decides to select the second card, and hopes that most users select the same card, user A may input the text content "the second card works wonders in a later round" in the information input box displayed after selecting the second card. After receiving user A's information input completion operation, the terminal sends selection confirmation information carrying "user A selects the second card: the second card works wonders in a later round" to the server, so that the server shares the selection confirmation information with the terminals of users watching the live broadcast, and the message "user A selects the second card: the second card works wonders in a later round" is displayed in bullet-screen form in the live broadcast picture of those terminals. In this way, users who see the bullet-screen content can select a control based on it, achieving the purpose of canvassing votes.
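A selection confirmation message carrying optional text information, and its rendering as a bullet-screen comment, might look like the following (a hypothetical payload; the field names and JSON encoding are invented for illustration and not specified by the patent):

```python
import json

def build_selection_confirmation(user_id, sub_control_id, text=None):
    """Payload a viewer terminal might send to the server on selection."""
    msg = {"type": "selection_confirmation",
           "user_id": user_id,
           "sub_control": sub_control_id}
    if text:
        msg["text"] = text  # optional reason, later shown as a bullet-screen comment
    return json.dumps(msg, ensure_ascii=False)

def render_bullet_screen(raw):
    """How a viewing terminal might format the shared message."""
    msg = json.loads(raw)
    label = f"user {msg['user_id']} selects sub-control {msg['sub_control']}"
    return f"{label}: {msg['text']}" if "text" in msg else label

raw = build_selection_confirmation("A", 2, "the second card works wonders in a later round")
print(render_bullet_screen(raw))
# user A selects sub-control 2: the second card works wonders in a later round
```

The server's role here is to verify and count such messages before fanning the text out to the other viewers' terminals.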
The second user comprises a user watching a live broadcast picture corresponding to the virtual scene; optionally, the second user includes a user who has a control right for the virtual scene in a live view corresponding to the virtual scene.
The second user can obtain interactive operation permission for the virtual scene by sending a permission application instruction, and the step of determining whether a permission application sent by a second user passes can be executed by the server or by the terminal corresponding to the first user. Illustratively, when this step is executed by the server, the server may record the order in which second users send permission applications; in response to the number of received permission applications being smaller than a number threshold, the server determines that the users who sent those applications have control permission for the virtual scene picture, and in response to the number of received permission applications being larger than the number threshold, the server determines that users whose applications were received after the threshold was exceeded do not have control permission for the virtual scene picture. When this step is executed by the anchor terminal corresponding to the first user, the server can forward the received permission applications to the anchor terminal, so that the first user can decide, based on subjective intention, whether to grant a given viewer control permission for the virtual scene; the server then determines, based on the permission-approval instruction fed back by the anchor terminal, which of the users watching the live broadcast have permission to control the virtual scene picture and which do not;
for a playback terminal corresponding to a user watching the live broadcast, a permission application control is displayed in the live broadcast interface. The permission application control can be displayed continuously in the live broadcast interface, or it can be displayed in response to the first user triggering the live broadcast interaction in the virtual scene, with whether the user has permission to control the virtual scene picture determined based on the received selection operation on the permission application control. As shown in fig. 5, the live broadcast interface includes a permission application control 530 configured to receive a selection operation of a user watching the live broadcast, and to generate and send a permission application instruction based on the selection operation.
In response to the permission application instruction passing and the first user triggering the live broadcast interaction in the virtual scene, the interaction control in an operable state is displayed in an overlapping manner on the first live broadcast picture; the permission application instruction is an application instruction sent in response to receiving a selection operation on the permission application control.
Optionally, in response to the permission application instruction failing, when the first user triggers the live broadcast interaction in the virtual scene, the interaction control is not displayed in an overlapping manner on the first live broadcast picture, or the interaction control in an inoperable state is displayed in an overlapping manner on the first live broadcast picture.
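The server-side variant — granting control permission only to applications received before a number threshold is reached — can be sketched as follows (`PermissionGate` is an invented name for illustration; the patent does not prescribe this interface):

```python
class PermissionGate:
    """Grant control permission to at most `limit` viewers, in arrival order."""

    def __init__(self, limit: int):
        self.limit = limit        # the number threshold
        self.granted = []         # user ids whose applications passed

    def apply(self, user_id: str) -> bool:
        """Process one permission application instruction."""
        if len(self.granted) < self.limit:
            self.granted.append(user_id)
            return True   # application passes: user can operate the interaction control
        return False      # threshold exceeded: no control permission

gate = PermissionGate(limit=2)
print(gate.apply("u1"), gate.apply("u2"), gate.apply("u3"))  # True True False
```

In the anchor-decided variant, `apply` would instead forward the application to the anchor terminal and return whatever approval the first user feeds back.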
In one possible implementation, the at least one second user includes the first user; that is, the target interactive sub-control may be determined based on the result of a vote, or of control selections, shared by the first user and the second users.
To prevent viewers from missing virtual scene information during the live broadcast interaction, in one possible implementation, the updating of the virtual scene is suspended in response to the first user triggering the live broadcast interaction in the virtual scene.
In response to the display duration of the interactive control reaching the duration threshold, the updating of the virtual scene is resumed, and the second live broadcast picture is displayed in the live broadcast interface.
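The pause/resume behavior — suspending virtual scene updates while the interactive control is displayed and resuming once the duration threshold is reached — can be sketched as follows (hypothetical names; a minimal sketch rather than the actual implementation):

```python
class SceneUpdater:
    """Suspend virtual-scene updates while the interaction window is open."""

    def __init__(self):
        self.paused = False
        self.frame = 0  # stands in for the evolving virtual scene state

    def on_interaction_triggered(self) -> None:
        self.paused = True    # live interaction triggered: stop advancing the scene

    def on_duration_threshold_reached(self) -> None:
        self.paused = False   # window closed: resume updates, show the second picture

    def tick(self) -> None:
        if not self.paused:
            self.frame += 1   # only advance the scene while not paused

u = SceneUpdater()
u.tick()                          # scene advances normally
u.on_interaction_triggered()
u.tick()                          # frozen: viewers miss nothing while voting
u.on_duration_threshold_reached()
u.tick()                          # scene resumes
print(u.frame)  # 2
```

Freezing the scene during the vote means the interaction result, not elapsed wall time, decides what the second live broadcast picture shows.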
To sum up, in the live broadcast picture display method provided by the embodiment of the application, during live broadcasting by the first user, in response to the triggering of the live broadcast interaction, the interactive control is displayed in an overlapping manner on the current live broadcast picture corresponding to the virtual scene, so that a user watching the live broadcast can intervene, based on the interactive control, in the selection of event options in the first user's live broadcast content. This expands the ways users can interact while watching a live broadcast, increases the audience's participation in the live broadcast content, and further improves the interactive effect in the live broadcast process.
In a possible implementation manner, the live broadcast picture display method provided by the present application may be applied to a cloud game scenario. Cloud Gaming, which may also be referred to as Gaming on Demand, is an online gaming approach based on cloud computing. In the operation mode of a cloud game, the game runs entirely on the server; the server compresses the rendered game picture and transmits it to the client through the network, and the client, after receiving the compressed game picture, decompresses the video of the game picture and displays the decompressed game picture on the client interface.
In the cloud game mode, a user can directly pull up a game interface within the live broadcast interface through cloud game technology and have the related game experience without downloading the game, which lowers the threshold for users to enter the game and reduces the occupation of user terminal resources. Taking the live broadcast picture in the live broadcast interface being a game picture of the first user in the cloud game mode as an example, fig. 6 shows a schematic diagram of a live broadcast picture display method in the cloud game mode according to an exemplary embodiment of the present application. As shown in fig. 6, the process includes:
1) the anchor plays the game through the cloud game client 610; the cloud game client 610 is integrated with a cloud game interaction SDK (Software Development Kit) 620, and the cloud game client 610 implements logics such as audio and video data acquisition, game data synchronization (the game data includes information on completing a level, event execution progress and the like) and game event processing (such as voting and game results) through each module in the cloud game interaction SDK 620.
In this process, the audio/video module 621 captures textures and audio in the game to form an audio/video push stream. Schematically, the audio/video module 621 includes an audio module and a video module, which access at least two in-game cameras and an audio listener respectively to capture the corresponding textures and audio and form the push stream. The audio/video module 621 may sample the accessed cameras and audio listeners at a certain frequency, for example capturing a texture at the end of each rendered frame; the frequency may be customized by relevant personnel according to actual needs.
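Schematically, the "capture a texture from each camera at the end of every rendered frame" behaviour can be modelled as below. The class and method names (`AVModule`, `on_frame_rendered`) are hypothetical and not the real cloud game interaction SDK API; cameras are modelled as plain callables.

```python
from dataclasses import dataclass, field

@dataclass
class AVModule:
    """Toy model of the audio/video module: it polls every attached camera
    once per rendered frame and appends the captured textures to the
    outgoing push stream. Names are illustrative, not a real SDK API."""
    cameras: list                                   # one callable per camera
    push_stream: list = field(default_factory=list)

    def on_frame_rendered(self, frame_index: int) -> None:
        # Capture a texture from every accessed camera at end-of-frame.
        for cam_id, capture in enumerate(self.cameras):
            self.push_stream.append(
                {"frame": frame_index, "camera": cam_id,
                 "texture": capture(frame_index)}
            )

module = AVModule(cameras=[lambda i: f"tex-a-{i}", lambda i: f"tex-b-{i}"])
for i in range(3):                  # three rendered frames
    module.on_frame_rendered(i)
assert len(module.push_stream) == 6  # 3 frames x 2 cameras
```

A lower capture frequency would simply call `on_frame_rendered` on a subset of frames, mirroring the customizable frequency mentioned above.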
The game data module 622 is used to collect an additional game content data stream, which may contain any data type. Illustratively, the game content data stream carries the game data to be transmitted to the spectators according to content needs, for example information such as a game character ID (Identification) and its real-time location. The game data stream is a unidirectional push stream, that is, it is pushed only from the anchor side to the audience side. Different games may use different push frequencies to refresh the data stream.
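A minimal sketch of one tick of this unidirectional data stream is given below. The field names (`character_id`, `position`) follow the example in the text but the exact wire format is an assumption; JSON is used here only for readability.

```python
import json

def make_game_data_message(character_id: str, position: tuple) -> str:
    """Serialize one tick of the unidirectional game-content data stream
    (anchor side -> audience side). Field names are illustrative; the
    stream may carry any data type according to content needs."""
    return json.dumps({
        "character_id": character_id,
        "position": {"x": position[0], "y": position[1]},
    })

msg = make_game_data_message("hero_01", (12.5, -3.0))
decoded = json.loads(msg)
assert decoded["character_id"] == "hero_01"
assert decoded["position"]["x"] == 12.5
```

Because the stream is push-only, no reply path is modelled; a different game would simply emit these messages at its own refresh frequency.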
The game event module 623 is configured to receive server events and user control events, and to send game content trigger events and the like. A user control event refers to any interaction behavior between a user and the game, such as an operation or a vote; user control events may be customized by relevant personnel according to the game type, which is not limited in the present application. The game event module supports bidirectional communication between the game and the outside, that is, it collects user operation events from both the anchor side and the audience side and sends the collected game events to the cloud server.
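The user control events described above can be sketched as a simple record type plus a collector, as follows. The event kinds ("operate", "vote") are taken from the examples in the text; the class names are hypothetical, not the real game event module API.

```python
from dataclasses import dataclass

@dataclass
class ControlEvent:
    """One user control event, as the game event module might model it.
    Event kinds are game-defined; 'operate' and 'vote' are just examples."""
    user_id: str
    kind: str
    payload: dict

class GameEventModule:
    """Collects control events from both the anchor side and the audience
    side; the outbox stands in for the send path to the cloud server."""
    def __init__(self):
        self.outbox = []

    def collect(self, event: ControlEvent) -> None:
        self.outbox.append(event)

module = GameEventModule()
module.collect(ControlEvent("anchor", "operate", {"key": "W"}))
module.collect(ControlEvent("viewer_7", "vote", {"option": 2}))
assert [e.kind for e in module.outbox] == ["operate", "vote"]
```

The reverse direction (server events delivered into the game) would be a symmetric receive path on the same module.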
2) The cloud game interaction SDK synchronizes the audio and video data, the game data, and the game event to the cloud server 630, so as to send the audio and video data, the game data, and the game event to the playing client 640 corresponding to the spectator;
the cloud server 630 may include an audio and video processing service 631, a game data processing service 632, and an event processing service 633;
the audio/video processing service 631 is configured to receive audio/video data sent by the cloud game interaction SDK620, and distribute the audio/video data to the streaming media service 650, so as to send the audio/video data to the playing client 640 corresponding to the audience, where the streaming media service 650 may be implemented as a CDN (Content Delivery Network) or each live broadcast platform; or, the audio/video processing service 631 may also directly send the audio/video data to the playing client 640 corresponding to the viewer side, and the distribution manner of the audio/video data is not limited in the present application. In the process of audio and video distribution, the audio and video processing service 621 is used to maintain multiple audio and video data streams and provide the capability of switching different audio and video data streams.
The game data processing service 632 is configured to receive and process the game data sent by the cloud game interaction SDK 620, generate various events according to the game data, and send the events to the Overlay SDK (overlay software development kit) 641 embedded in the playing client 640 to generate information prompt content. The information prompt content may be implemented in text form or picture form, schematically as a mark on a user or a virtual object in the game, or as an attribute of the virtual object.
The event processing service 633 is used to receive control events from the clients and to process various events in the game. In this process, the event processing service 633 acquires the control information of the different clients, checks and summarizes it to generate a control flow, and inputs the control flow into the cloud game client 610 corresponding to the anchor. Meanwhile, the event processing service 633 processes the game events received through the cloud game interaction SDK and, based on the control events of the respective clients, invokes the corresponding game response through the SDK.
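The "check, summarize, and generate a control flow" step can be sketched as below. The validation rule (non-empty user and a known event kind) and the ordering by timestamp are assumptions made for illustration; the real event processing service may apply different checks.

```python
def build_control_flow(client_events: list) -> list:
    """Check and merge control events from different clients into one
    ordered control flow for the anchor's cloud game client.
    The validity rule here is a placeholder, not the real service logic."""
    valid_kinds = {"operate", "vote"}
    checked = [e for e in client_events
               if e.get("user") and e.get("kind") in valid_kinds]
    # Summarize: order by arrival timestamp before feeding the game.
    return sorted(checked, key=lambda e: e["ts"])

events = [
    {"user": "v1", "kind": "vote", "ts": 2},
    {"user": "",   "kind": "vote", "ts": 1},      # rejected: no user
    {"user": "v2", "kind": "operate", "ts": 0},
]
flow = build_control_flow(events)
assert [e["user"] for e in flow] == ["v2", "v1"]
```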
3) The audience watches the live broadcast content through the playing client 640, which has an audio and video playing function; the live picture played in the playing client 640 is rendered from the video data the client receives. The playing client 640 integrates an Overlay SDK (overlay software development kit) 641 of the cloud game, which is configured to receive the game data, display an interaction control superimposed on the playing picture according to the game data (for example, displaying a voting selection interface containing at least two interactive sub-controls when a level is reached or an event is completed), and send game events (for example, a user's selection operation on an interaction control) to the cloud server 630. After completing the statistics of the game events, the cloud server 630 returns the statistical result to the playing client 640; the playing client 640 then displays the statistical result, for example receiving and displaying a voting result, and at the same time controls the change of the virtual scene picture based on the statistical result.
Based on the live broadcast picture display method in the cloud game mode shown in fig. 6, taking live broadcast interaction as an example of voting for a user, fig. 7 shows a flowchart of a live broadcast picture display method shown in an exemplary embodiment of the present application, where the method may be executed by a computer device, where the computer device includes a terminal and a server shown in fig. 1, and as shown in fig. 7, the process includes:
Step 710, the anchor performs user operations on the cloud game and live broadcasts the cloud game picture.
Optionally, the anchor performs user operation on the cloud game through the cloud game client, and sends audio and video data in the cloud game process to the playing client of the audience through the cloud game interaction SDK embedded in the cloud game client, so as to achieve the purpose of live broadcasting of the cloud game picture.
Step 720, in response to live broadcast interaction being triggered based on the anchor's user operation, a voting component is displayed superimposed on the cloud game picture, wherein the voting component includes at least two interactive sub-controls.
The voting component is used to receive the interactive operation of a second user, where the second user includes a user watching the live picture corresponding to the cloud game; schematically, the second user is a user watching the live picture corresponding to the cloud game, or the second user is both such a user and the anchor live broadcasting the cloud game. Fig. 8 is a schematic diagram of a live view shown in an exemplary embodiment of the present application. Taking the live broadcast interaction as time-limited voting as an example, as shown in fig. 8, in response to a time-limited live broadcast interaction being triggered while the anchor is live broadcasting the cloud game, a voting component (interaction control) is displayed superimposed on the cloud game picture. The voting component includes a functional introduction of the component and effective duration information 810; as shown in fig. 8, the effective duration of the voting component is 15 s. After a user's selection operation on the participation control is received, the user is determined to participate in the vote and a voting picture 820 is displayed, which includes at least two operable interactive sub-controls 830 corresponding to at least two event options in the cloud game picture; that is, the live broadcast interaction is triggered in response to at least two event options appearing in the cloud game picture.
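A minimal sketch of such a voting component, assuming the 15 s effective duration from the figure and at least two sub-controls, is shown below. The class and method names are hypothetical; an injectable clock is used so the expiry behaviour can be demonstrated without real waiting.

```python
import time

class VotingComponent:
    """Toy voting component: valid for a fixed effective duration, offers
    at least two interactive sub-controls, and ignores selections after
    expiry. Names and API are illustrative, not a real SDK."""
    def __init__(self, options, duration_s=15.0, now=time.monotonic):
        assert len(options) >= 2, "component needs at least two sub-controls"
        self._now = now
        self.deadline = now() + duration_s
        self.votes = {opt: 0 for opt in options}

    def select(self, option: str) -> bool:
        """Register one selection; returns False after the duration ends
        or for an unknown option."""
        if self._now() > self.deadline or option not in self.votes:
            return False
        self.votes[option] += 1
        return True

clock = [0.0]
comp = VotingComponent(["card 1", "card 2"], duration_s=15.0,
                       now=lambda: clock[0])
assert comp.select("card 2")        # within the effective duration
clock[0] = 16.0                     # past the 15 s effective duration
assert not comp.select("card 1")    # late selections are rejected
```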
Optionally, within the effective duration, a user may perform a single or multiple selection among the at least two interactive sub-controls; the users selecting the interactive sub-controls may include the anchor and the audience, or only the audience. After a user completes a selection operation, operation prompt information 840 may be displayed superimposed on the cloud game picture to indicate whether the current selection operation succeeded or failed.
In a possible implementation manner, during the voting process, the interactive operation of each user may be broadcast in the form of a bullet screen prompt or a comment prompt. Schematically, as shown in fig. 8, in response to receiving the interactive operation of each user, corresponding bullet screen prompt information 850 is generated and displayed, so as to show each user's selection result.
Optionally, in the voting process, the current voting result may also be updated in real time, so as to remind the user of the selection condition of each current sub-control.
In a possible implementation manner, the step of counting the voting data is performed by an event processing server. Fig. 9 shows a schematic diagram of a voting process according to an exemplary embodiment of the present application. As shown in fig. 9, when voting starts, the Overlay SDK 910 built into the playing client acquires and packages the voting events and sends them to the event processing module 920 of the server, and the event processing module 920 counts the voting events to obtain the voting data. Meanwhile, the anchor client pushes the real-time data in the game scene (game content 950) to the game data module 930 of the server at a certain frequency; after obtaining the voting data pushed by the event processing module 920, the game data module 930 pushes the voting data together with the game data to the playing client (the client containing the Overlay SDK 910), so that the voting result is updated in real time. When the voting ends, that is, when the effective duration of the voting component expires, the event processing module 920 sends the game event corresponding to the voting result to the cloud game interaction SDK 940 built into the cloud game client; after receiving the game event, the cloud game interaction SDK 940 executes it and changes the game content 950 in the anchor client. After the game content has changed, the anchor client returns the execution result to the server, which synchronizes it to the playing client.
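The server-side tallying path (vote events in, running result pushed back out with the game-data refresh) can be sketched as follows. This is a toy model of the flow in fig. 9; class and attribute names are assumptions, and the push to clients is represented by a plain list.

```python
class EventProcessingModule:
    """Sketch of the server-side voting path: vote events arrive from the
    Overlay SDK, are tallied here, and the running result is pushed back
    to the playing clients with each refresh. Names are illustrative."""
    def __init__(self, options):
        self.tally = {opt: 0 for opt in options}
        self.pushed_results = []   # stand-in for the push to playing clients

    def on_vote_event(self, option: str) -> None:
        self.tally[option] += 1
        # Snapshot the running result for the next real-time update.
        self.pushed_results.append(dict(self.tally))

module = EventProcessingModule(["card 1", "card 2"])
for option in ["card 2", "card 1", "card 2"]:
    module.on_vote_event(option)
assert module.tally == {"card 1": 1, "card 2": 2}
assert module.pushed_results[-1]["card 2"] == 2
```

In the alternative flow of fig. 10, the same tallying logic would run inside the anchor client rather than on the server.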
Alternatively, in another possible implementation manner, the step of counting the voting data is performed by the anchor client corresponding to the cloud game. Fig. 10 shows a schematic diagram of a voting process according to an exemplary embodiment of the present application. As shown in fig. 10, when voting starts, the Overlay SDK 910 built into the playing client acquires and packages the voting events and sends them to the event processing module 920 of the server, and the event processing module 920 passes the voting events directly to the game content 950 of the anchor client through the cloud game interaction SDK 940, so that vote counting and the related display are performed in the game. When the voting ends, the anchor client invokes the game event corresponding to the voting result to change the game content 950; after the game content has changed, the anchor client returns the execution result to the server, which synchronizes it to the playing client.
Step 730, in response to the end of the effective duration of the voting component, determining a target sub-control of the at least two interactive sub-controls based on the interactive operation of the second user.
As shown in fig. 8, during the voting process, the number of users selecting the second card is the largest, that is, it accounts for the largest proportion of the total ballot, so the second of the at least two interactive sub-controls is determined as the target sub-control.
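The selection rule above (the option with the largest vote count, equivalently the largest share of the total ballot) reduces to a maximum over the tally, as sketched here with hypothetical names:

```python
def pick_target_sub_control(tally: dict) -> str:
    """Choose the target sub-control: the option with the largest vote
    count, which is also the option with the largest share of the total
    ballot. Tie-breaking is unspecified here."""
    return max(tally, key=tally.get)

# The "second card" wins with the most votes, as in the fig. 8 example.
assert pick_target_sub_control({"card 1": 4, "card 2": 9}) == "card 2"
```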
Step 740, displaying, in the cloud game picture, the virtual scene picture determined based on the target interactive sub-control.
As shown in fig. 8, the virtual scene picture determined based on the second sub-control is displayed in the cloud game picture, that is, the second card is retained.
To sum up, in the live broadcast picture display method provided by the embodiments of the present application, during live broadcast by the first user, the interaction control is displayed superimposed on the current live broadcast picture corresponding to the virtual scene in response to the triggering of live broadcast interaction. A user watching the live broadcast can thereby influence, through the interaction control, which event option is selected in the first user's live broadcast content. This expands the ways in which users can interact while watching a live broadcast, increases audience participation in the live broadcast content, and further improves the interactive effect of the live broadcast process.
In a specified scene, the live broadcast picture display method provided by the present application may be implemented in combination with blockchain technology. Illustratively, in a live broadcast voting scene, the voting data may be stored on a blockchain to ensure the authenticity and reliability of the data.
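Illustratively, the tamper-evidence that blockchain storage provides for voting data can be sketched with a minimal hash chain: each block commits to the previous block's hash, so an altered vote record breaks the chain. This is a toy model, not a full blockchain (no consensus, no distribution), and all names are illustrative.

```python
import hashlib
import json

def append_vote_block(chain: list, vote_record: dict) -> list:
    """Append one vote record to a minimal hash chain. Each block stores
    the previous block's hash, so stored voting data cannot be altered
    without invalidating every later block."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"prev": prev_hash, "vote": vote_record},
                      sort_keys=True)
    chain.append({
        "prev": prev_hash,
        "vote": vote_record,
        "hash": hashlib.sha256(body.encode()).hexdigest(),
    })
    return chain

chain = []
append_vote_block(chain, {"user": "v1", "option": "card 2"})
append_vote_block(chain, {"user": "v2", "option": "card 1"})
assert chain[1]["prev"] == chain[0]["hash"]   # blocks are linked
```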
In this embodiment of the application, the live interface may be displayed at both the anchor terminal (the terminal corresponding to the first user) and the viewer terminal (the terminal corresponding to the user viewing the live frame corresponding to the virtual scene); or, the anchor terminal only displays a scene interface corresponding to the virtual scene to provide audio and video streams of a live broadcast picture, and the live broadcast interface is only displayed at the audience terminal.
Taking the example that the anchor terminal only displays the scene picture, fig. 11 shows a flowchart of a live view display method shown in an exemplary embodiment of the present application, where the method may be executed by a computer device, where the computer device includes the terminal 120 shown in fig. 1, and as shown in fig. 11, the live view display method may include the following steps:
Step 1110, displaying a scene interface corresponding to the virtual scene.
The scene interface corresponding to the virtual scene may be a scene interface in which the first user performs user operation in the cloud game mode, and the scene interface includes a scene picture of the virtual scene.
Step 1120, responding to the received user operation instruction, and displaying a first scene picture in a scene interface; the first scene picture is a scene picture of a virtual scene updated based on a user operation instruction; the user operation instruction is an operation instruction corresponding to user operation executed by a first user in a virtual scene; the first user is a user who is live broadcasting a virtual scene.
Step 1130, in response to a live broadcast interaction triggered by a first user in a virtual scene, displaying a second scene picture in a scene interface, where the second scene picture is a scene picture of the virtual scene updated based on an interaction instruction, the interaction instruction is an operation instruction determined based on an interaction operation of at least one second user on an interaction control, the interaction control is a control displayed in the live broadcast interface displayed by a terminal of the second user, and the interaction control is used for indicating selection of an event option in the virtual scene, where the event option is at least two options existing in a specified event node; the appointed event node is an event node corresponding to live broadcast interaction in a virtual scene; the second user comprises a user watching a live broadcast picture corresponding to the virtual scene.
In one possible implementation manner, in response to a live interaction triggered by a first user in a virtual scene, the interaction control is displayed in the scene interface, so that the interaction control can receive a selection operation of the first user, that is, the anchor and the audience can jointly control selection of an event option in the virtual scene through the selection operation of the interaction control.
In summary, in the live broadcast picture display method provided by the present application, during live broadcast by the first user, the interaction control is displayed superimposed on the current live broadcast picture corresponding to the virtual scene in response to the triggering of live broadcast interaction. A user watching the live broadcast can thereby influence, through the interaction control, which event option is selected in the first user's live broadcast content. This expands the ways in which users can interact while watching a live broadcast, increases audience participation in the live broadcast content, and further improves the interactive effect of the live broadcast process.
Fig. 12 is a block diagram illustrating a live view display apparatus according to an exemplary embodiment of the present application, which includes, as shown in fig. 12:
a first screen display module 1210, configured to display a first live view in a live view interface, where the first live view is a scene view of a virtual scene updated based on a user operation instruction; the user operation instruction is an operation instruction corresponding to user operation executed by a first user in the virtual scene; the first user is a user who live broadcasts the virtual scene;
the interaction control display module 1220 is configured to respond to that the first user triggers live broadcast interaction in the virtual scene, and superimpose and display an interaction control on the first live broadcast picture; the interactive control is used for indicating that event options in the virtual scene are selected, and the event options are at least two options existing in a specified event node; the appointed event node is an event node corresponding to live broadcast interaction in the virtual scene;
a second frame display module 1230, configured to display a second live frame in the live interface, where the second live frame is a scene frame of the virtual scene updated based on an interaction instruction, and the interaction instruction is an operation instruction determined based on an interaction operation of at least one second user on the interaction control; the second user comprises a user watching a live broadcast picture corresponding to the virtual scene.
In a possible implementation manner, the second frame display module 1230 is configured to display the second live frame in the live broadcast interface in response to the display duration of the interactive control reaching a duration threshold.
In one possible implementation, the apparatus further includes:
and the duration information display module is used for responding to the fact that the first user triggers live broadcast interaction in the virtual scene, and displaying duration information in an overlapping mode on the first live broadcast picture, wherein the duration information is used for indicating the display duration of the interaction control.
In a possible implementation manner, the duration information display module is configured to cancel displaying the interactive control in response to the display duration of the interactive control reaching the duration threshold.
In one possible implementation manner, the interactive control comprises at least two interactive sub-controls;
the interaction instruction is an operation instruction determined based on a target interaction sub-control, and the target interaction sub-control is an interaction sub-control determined based on the selection operation of at least one second user in the at least two interaction sub-controls.
In a possible implementation manner, the target interactive sub-control is obtained based on a candidate interactive sub-control, and the candidate interactive sub-control is a sub-control with the largest number of received selection operations in the at least two interactive sub-controls.
In a possible implementation manner, the interactive control display module 1220 is configured to, in response to a trigger of a specified event in the virtual scene based on a user operation performed by the first user in the virtual scene, display the interactive control on the first live view in an overlapping manner.
In a possible implementation manner, the second user includes a user having a control right for the virtual scene among users watching the live broadcast.
In a possible implementation manner, when the method is executed by a playing terminal, where the playing terminal is the terminal of a user watching the live picture corresponding to the virtual scene, the apparatus further includes:
the permission application control display module is used for displaying a permission application control in the live broadcast interface;
the interactive control display module 1220 is configured to respond to a permission application instruction, trigger live broadcast interaction in the virtual scene by the first user, and superimpose and display the interactive control in an operable state on the first live broadcast picture; the permission application instruction is an application instruction sent in response to receiving the selection operation based on the permission application control.
In one possible implementation, in response to the method being performed by a terminal, the apparatus further includes:
and the confirmation information sending module is used for responding to the received selection operation based on the at least two interactive sub-controls and sending selection confirmation information to the server so that the server determines the target interactive sub-control based on the selection confirmation information.
In a possible implementation manner, the acknowledgment information sending module includes:
the input box display sub-module is used for responding to the received selection operation based on the at least two interactive sub-controls and displaying an information input box in the live broadcast interface;
and the confirmation information sending submodule is used for responding to the received information input completion operation and sending the selection confirmation information containing the text information to the server so that the server synchronously displays the text information to the live broadcast picture corresponding to the virtual scene.
To sum up, the live broadcast picture display device provided by the embodiments of the present application displays, during live broadcast by the first user and in response to the triggering of live broadcast interaction, the interaction control superimposed on the current live broadcast picture corresponding to the virtual scene, so that a user watching the live broadcast can influence, through the interaction control, which event option is selected in the first user's live broadcast content. This expands the ways in which users can interact while watching a live broadcast, increases audience participation in the live broadcast content, and further improves the interactive effect of the live broadcast process.
Fig. 13 is a block diagram illustrating a live view display apparatus according to an exemplary embodiment of the present application, which includes, as shown in fig. 13:
a scene interface display module 1310, configured to display a scene interface corresponding to the virtual scene;
a third screen display module 1320, configured to display a first scene screen in the scene interface in response to receiving a user operation instruction; the first scene picture is a scene picture of the virtual scene updated based on the user operation instruction; the user operation instruction is an operation instruction corresponding to user operation executed by a first user in the virtual scene; the first user is a user who live broadcasts the virtual scene;
a fourth screen displaying module 1330, configured to, in response to the first user triggering a live broadcast interaction in the virtual scene, display a second scene screen in the scene interface, where the second scene screen is a scene screen of the virtual scene updated based on an interaction instruction, the interaction instruction is an operation instruction determined based on an interaction operation of at least one second user on an interaction control, the interaction control is a control displayed in the live broadcast interface displayed at a terminal of the second user, and the interaction control is used to instruct to select an event option in the virtual scene, where the event option is at least two options existing in a designated event node; the specified event node is an event node corresponding to the live broadcast interaction in the virtual scene; the second user comprises a user watching the live broadcast picture corresponding to the virtual scene.
To sum up, the live broadcast picture display device provided by the embodiments of the present application displays, during live broadcast by the first user and in response to the triggering of live broadcast interaction, the interaction control superimposed on the current live broadcast picture corresponding to the virtual scene, so that a user watching the live broadcast can influence, through the interaction control, which event option is selected in the first user's live broadcast content. This expands the ways in which users can interact while watching a live broadcast, increases audience participation in the live broadcast content, and further improves the interactive effect of the live broadcast process.
FIG. 14 is a block diagram illustrating the structure of a computer device 1400 in accordance with an exemplary embodiment. The computer device may be implemented as a server in the above-mentioned aspects of the present application. The computer apparatus 1400 includes a Central Processing Unit (CPU) 1401, a system Memory 1404 including a Random Access Memory (RAM) 1402 and a Read-Only Memory (ROM) 1403, and a system bus 1405 connecting the system Memory 1404 and the Central Processing Unit 1401. The computer device 1400 also includes a mass storage device 1406 for storing an operating system 1409, application programs 1410, and other program modules 1411.
The mass storage device 1406 is connected to the central processing unit 1401 by a mass storage controller (not shown) connected to the system bus 1405. The mass storage device 1406 and its associated computer-readable media provide non-volatile storage for the computer device 1400. That is, the mass storage device 1406 may include a computer-readable medium (not shown) such as a hard disk or Compact Disc-Only Memory (CD-ROM) drive.
Without loss of generality, the computer-readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash Memory or other solid state Memory technology, CD-ROM, Digital Versatile Disks (DVD), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. Of course, those skilled in the art will appreciate that the computer storage media is not limited to the foregoing. The system memory 1404 and mass storage device 1406 described above may be collectively referred to as memory.
According to various embodiments of the present disclosure, the computer device 1400 may also operate through a remote computer connected to a network such as the Internet. That is, the computer device 1400 may be connected to the network 1408 through the network interface unit 1407 coupled to the system bus 1405, or the network interface unit 1407 may be used to connect to other types of networks or remote computer systems (not shown).
The memory further includes at least one instruction, at least one program, a code set, or a set of instructions, which is stored in the memory, and the central processing unit 1401 implements all or part of the steps of the live view display method shown in each of the above embodiments by executing the at least one instruction, the at least one program, the code set, or the set of instructions.
FIG. 15 is a block diagram illustrating the structure of a computer device 1500 according to an example embodiment. The computer device 1500 may be the terminal 120 or the terminal 130 shown in fig. 1 or the terminal shown in fig. 2.
Generally, computer device 1500 includes: a processor 1501 and memory 1502.
Processor 1501 may include one or more processing cores, such as a 4-core processor, an 8-core processor, or the like. The processor 1501 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 1501 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, processor 1501 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
The memory 1502 may include one or more computer-readable storage media, which may be non-transitory. The memory 1502 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1502 is used to store at least one instruction for execution by processor 1501 to implement the methods provided by the method embodiments herein.
In some embodiments, computer device 1500 may also optionally include: a peripheral interface 1503 and at least one peripheral. The processor 1501, memory 1502, and peripheral interface 1503 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1503 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of a radio frequency circuit 1504, a display 1505, a camera assembly 1506, an audio circuit 1507, a positioning assembly 1508, and a power supply 1509.
The peripheral interface 1503 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, memory 1502, and peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral interface 1503 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The radio frequency circuit 1504 is used to receive and transmit RF (Radio Frequency) signals, also referred to as electromagnetic signals. The radio frequency circuit 1504 communicates with communication networks and other communication devices via electromagnetic signals, converting an electrical signal into an electromagnetic signal for transmission, or converting a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1504 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 1504 can communicate with other terminals via at least one wireless communication protocol, over networks including but not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display screen 1505 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1505 is a touch display screen, the display screen 1505 also has the ability to capture touch signals on or over its surface. The touch signal may be input to the processor 1501 as a control signal for processing. In this case, the display screen 1505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1505, arranged on the front panel of the computer device 1500; in other embodiments, there may be at least two display screens 1505, each disposed on a different surface of the computer device 1500 or in a folded design; in still other embodiments, the display screen 1505 may be a flexible display disposed on a curved or folded surface of the computer device 1500. The display screen 1505 may even be configured in a non-rectangular irregular pattern, i.e., an irregularly-shaped screen. The display screen 1505 can be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting, VR (Virtual Reality) shooting, or other fusion shooting functions. In some embodiments, the camera assembly 1506 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash combines a warm-light flash and a cold-light flash, and can be used for light compensation at different color temperatures.
The audio circuit 1507 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment, convert the sound waves into electrical signals, and input them to the processor 1501 for processing or to the radio frequency circuit 1504 to realize voice communication. For stereo capture or noise reduction purposes, there may be multiple microphones located at different positions on the computer device 1500. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 1501 or the radio frequency circuit 1504 into sound waves. The speaker may be a conventional membrane speaker or a piezoelectric ceramic speaker. A piezoelectric ceramic speaker can not only convert an electrical signal into sound waves audible to humans, but also convert an electrical signal into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuit 1507 may also include a headphone jack.
The positioning component 1508 is used to locate the current geographic location of the computer device 1500 to implement navigation or LBS (Location Based Service). The positioning component 1508 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1509 is used to supply power to the various components in the computer device 1500. The power supply 1509 may use alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 1509 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery, charged through a wired line, or a wireless rechargeable battery, charged through a wireless coil. The rechargeable battery may also support fast-charging technology.
In some embodiments, the computer device 1500 also includes one or more sensors 1510. The one or more sensors 1510 include, but are not limited to: acceleration sensor 1511, gyro sensor 1512, pressure sensor 1513, fingerprint sensor 1514, optical sensor 1515, and proximity sensor 1516.
The acceleration sensor 1511 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established with the computer device 1500. For example, the acceleration sensor 1511 may be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 1501 may control the touch display screen 1505 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1511. The acceleration sensor 1511 may also be used to collect game or user motion data.
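The orientation decision described above can be sketched as follows. This is an illustrative example only; the function name, axis convention, and units are assumptions and not part of the patent:

```python
def choose_orientation(ax: float, ay: float) -> str:
    """Choose a UI orientation from the gravity components (m/s^2)
    along the device's x axis (short edge) and y axis (long edge).

    When gravity acts mostly along the long edge, the device is being
    held upright, so the UI is drawn in portrait; otherwise landscape.
    """
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

# Device held upright: gravity lies mostly along the y axis.
print(choose_orientation(0.5, 9.7))   # portrait
# Device turned on its side: gravity shifts to the x axis.
print(choose_orientation(9.7, 0.5))   # landscape
```

A production implementation would additionally low-pass filter the raw readings and apply hysteresis so the UI does not flip at the 45° boundary.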
The gyro sensor 1512 may detect a body direction and a rotation angle of the computer device 1500, and the gyro sensor 1512 and the acceleration sensor 1511 cooperate to collect a 3D motion of the user on the computer device 1500. The processor 1501 may implement the following functions according to the data collected by the gyro sensor 1512: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 1513 may be disposed on the side frame of the computer device 1500 and/or at the lower layer of the touch display screen 1505. When the pressure sensor 1513 is disposed on the side frame, it can detect the user's grip signal on the computer device 1500, and the processor 1501 performs left/right-hand recognition or a shortcut operation according to the grip signal collected by the pressure sensor 1513. When the pressure sensor 1513 is disposed at the lower layer of the touch display screen 1505, the processor 1501 controls an operability control on the UI according to the user's pressure operation on the touch display screen 1505. The operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1514 is used to collect the user's fingerprint, and the processor 1501 identifies the user's identity from the fingerprint collected by the fingerprint sensor 1514, or the fingerprint sensor 1514 itself identifies the user's identity from the collected fingerprint. Upon recognizing the user's identity as trusted, the processor 1501 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, paying, changing settings, and the like. The fingerprint sensor 1514 may be disposed on the front, back, or side of the computer device 1500. When a physical button or a vendor logo is provided on the computer device 1500, the fingerprint sensor 1514 may be integrated with the physical button or the vendor logo.
The optical sensor 1515 is used to collect ambient light intensity. In one embodiment, the processor 1501 may control the display brightness of the touch display screen 1505 based on the ambient light intensity collected by the optical sensor 1515: when the ambient light intensity is high, the display brightness of the touch display screen 1505 is turned up; when the ambient light intensity is low, the display brightness is turned down. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 based on the ambient light intensity collected by the optical sensor 1515.
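The ambient-light-to-brightness mapping could be realized as a simple clamped linear mapping like the sketch below. The working range (0–1000 lux) and brightness levels (10–255) are hypothetical values chosen for illustration; they do not come from the patent:

```python
def display_brightness(ambient_lux: float,
                       min_level: int = 10,
                       max_level: int = 255) -> int:
    """Map ambient light intensity to a display brightness level.

    Brighter surroundings yield a brighter screen; the input is
    clamped to an assumed 0-1000 lux working range so the output
    always stays within [min_level, max_level].
    """
    lux = max(0.0, min(ambient_lux, 1000.0))
    return round(min_level + (max_level - min_level) * lux / 1000.0)

print(display_brightness(0))     # 10  (dark room: dimmest setting)
print(display_brightness(1000))  # 255 (bright daylight: brightest)
```

In practice the mapping is usually nonlinear (human brightness perception is roughly logarithmic), but the clamped-linear form shows the control loop the paragraph describes.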
The proximity sensor 1516, also known as a distance sensor, is typically disposed on the front panel of the computer device 1500. The proximity sensor 1516 is used to capture the distance between the user and the front face of the computer device 1500. In one embodiment, when the proximity sensor 1516 detects that the distance between the user and the front face of the computer device 1500 gradually decreases, the processor 1501 controls the touch display screen 1505 to switch from the bright-screen state to the dark-screen state; when the proximity sensor 1516 detects that the distance gradually increases, the processor 1501 controls the touch display screen 1505 to switch from the dark-screen state to the bright-screen state.
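The proximity-driven screen switching amounts to a small state transition on the distance trend. The following sketch is illustrative only (the function, state names, and centimetre units are assumptions, not part of the patent):

```python
def next_screen_state(current: str, prev_dist: float, dist: float) -> str:
    """Update the screen state from two successive proximity readings.

    An approaching user (decreasing distance, e.g. phone raised to the
    ear) darkens the screen; a receding user brightens it again; an
    unchanged distance leaves the state as-is.
    """
    if dist < prev_dist:       # user moving closer to the front panel
        return "dark"
    if dist > prev_dist:       # user moving away from the front panel
        return "bright"
    return current             # no change in distance

print(next_screen_state("bright", 10.0, 4.0))  # dark
print(next_screen_state("dark", 4.0, 10.0))    # bright
```

A real driver would compare against an absolute near/far threshold with debouncing rather than react to every sample, but the trend-based rule matches the behavior the paragraph describes.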
Those skilled in the art will appreciate that the architecture shown in FIG. 15 is not intended to be limiting of the computer device 1500, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided, for example a memory storing at least one instruction, at least one program, a code set, or an instruction set, which may be executed by a processor to perform all or part of the steps of the method shown in any of the embodiments of fig. 3, 4, 7, or 11 described above. For example, the non-transitory computer-readable storage medium may be a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program product or a computer program is also provided, which comprises computer instructions, which are stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to cause the computer device to perform all or part of the steps of the method shown in any one of the embodiments of fig. 3, fig. 4, fig. 7 or fig. 11.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (15)

1. A method for displaying a live view, the method comprising:
displaying a first live broadcast picture in a live broadcast interface, wherein the first live broadcast picture is a scene picture of a virtual scene updated based on a user operation instruction; the user operation instruction is an operation instruction corresponding to user operation executed by a first user in the virtual scene; the first user is a user who live broadcasts the virtual scene;
in response to the first user triggering live broadcast interaction in the virtual scene, overlaying and displaying an interaction control on the first live broadcast picture; the interaction control is used for indicating selection of an event option in the virtual scene, the event options being at least two options existing at a specified event node; the specified event node is an event node corresponding to live broadcast interaction in the virtual scene;
displaying a second live broadcast picture in the live broadcast interface, wherein the second live broadcast picture is a scene picture of the virtual scene updated based on an interaction instruction, and the interaction instruction is an operation instruction determined based on the interaction operation of at least one second user on the interaction control; the second user comprises a user watching a live broadcast picture corresponding to the virtual scene.
2. The method of claim 1, wherein displaying the second live view in the live interface comprises:
and in response to the display duration of the interaction control reaching a duration threshold, displaying the second live broadcast picture in the live broadcast interface.
3. The method of claim 2, further comprising:
in response to the first user triggering live broadcast interaction in the virtual scene, overlaying and displaying duration information on the first live broadcast picture, wherein the duration information is used for indicating the display duration of the interaction control.
4. The method of claim 3, further comprising:
and in response to the display duration of the interaction control reaching the duration threshold, canceling display of the interaction control.
5. The method of claim 1, wherein the interactive control comprises at least two interactive sub-controls;
the interaction instruction is an operation instruction determined based on a target interaction sub-control, and the target interaction sub-control is an interaction sub-control determined based on the selection operation of at least one second user in the at least two interaction sub-controls.
6. The method of claim 2, wherein the target interactive sub-control is obtained based on a candidate interactive sub-control, and wherein the candidate interactive sub-control is a sub-control of the at least two interactive sub-controls that receives a largest number of selection operations.
7. The method of claim 1, wherein the displaying an interaction control in superimposition on the first live view in response to the first user triggering a live interaction in the virtual scene comprises:
in response to a specified event in the virtual scene being triggered based on the user operation executed by the first user in the virtual scene, overlaying and displaying the interaction control on the first live broadcast picture.
8. The method according to claim 1, wherein the second user includes a user having a control right to the virtual scene among users viewing the live view.
9. The method of claim 8, wherein, in response to the method being performed by an audience terminal, the audience terminal being a terminal of a user viewing the live broadcast picture corresponding to the virtual scene, the method further comprises:
displaying an authority application control in the live broadcast interface;
the responding to the first user triggering the live broadcast interaction in the virtual scene, and overlaying and displaying an interaction control on the first live broadcast picture comprises:
in response to a permission application instruction having been sent and the first user triggering live broadcast interaction in the virtual scene, overlaying and displaying the interaction control in an operable state on the first live broadcast picture; the permission application instruction is an application instruction sent in response to receiving a selection operation on the permission application control.
10. The method of claim 5, wherein in response to the method being performed by a terminal, the method further comprises:
in response to receiving the selection operation based on the at least two interactive sub-controls, sending selection confirmation information to a server, so that the server determines the target interactive sub-control based on the selection confirmation information.
11. The method of claim 10, wherein sending a selection confirmation message to the server in response to receiving the selection operation based on the at least two interactive child controls comprises:
in response to receiving a selection operation based on the at least two interactive sub-controls, displaying an information input box in the live interface;
and in response to receiving an information input completion operation, sending the selection confirmation information containing text information to a server so that the server synchronously displays the text information to a live broadcast picture corresponding to the virtual scene.
12. A method for displaying a live view, the method comprising:
displaying a scene interface corresponding to the virtual scene;
displaying a first scene picture in the scene interface in response to receiving a user operation instruction; the first scene picture is a scene picture of the virtual scene updated based on the user operation instruction; the user operation instruction is an operation instruction corresponding to user operation executed by a first user in the virtual scene; the first user is a user who live broadcasts the virtual scene;
responding to a live broadcast interaction triggered by the first user in the virtual scene, displaying a second scene picture in the scene interface, wherein the second scene picture is a scene picture of the virtual scene updated based on an interaction instruction, the interaction instruction is an operation instruction determined based on an interaction operation of at least one second user on an interaction control, the interaction control is a control displayed in the live broadcast interface displayed by a terminal of the second user, the interaction control is used for indicating selection of an event option in the virtual scene, and the event option is at least two options existing in a specified event node; the specified event node is an event node corresponding to the live broadcast interaction in the virtual scene; the second user comprises a user watching the live broadcast picture corresponding to the virtual scene.
13. A live view display apparatus, comprising:
the first picture display module is used for displaying a first live broadcast picture in a live broadcast interface, wherein the first live broadcast picture is a scene picture of a virtual scene updated based on a user operation instruction; the user operation instruction is an operation instruction corresponding to user operation executed by a first user in the virtual scene; the first user is a user who live broadcasts the virtual scene;
the interaction control display module is used for overlaying and displaying an interaction control on the first live broadcast picture in response to the first user triggering live broadcast interaction in the virtual scene; the interaction control is used for indicating selection of an event option in the virtual scene, the event options being at least two options existing at a specified event node; the specified event node is an event node corresponding to live broadcast interaction in the virtual scene;
the second picture display module is used for displaying a second live broadcast picture in the live broadcast interface, the second live broadcast picture is a scene picture of the virtual scene updated based on an interaction instruction, and the interaction instruction is an operation instruction determined based on the interaction operation of at least one second user on the interaction control; the second user comprises a user watching a live broadcast picture corresponding to the virtual scene.
14. A computer device comprising a processor and a memory, the memory storing at least one computer program that is loaded and executed by the processor to implement a live view display method as claimed in any one of claims 1 to 12.
15. A computer-readable storage medium, in which at least one computer program is stored, the computer program being loaded and executed by a processor to implement the live view display method according to any one of claims 1 to 12.
CN202110687210.2A 2021-06-21 2021-06-21 Live broadcast picture display method and device, computer equipment and storage medium Active CN113423017B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110687210.2A CN113423017B (en) 2021-06-21 2021-06-21 Live broadcast picture display method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110687210.2A CN113423017B (en) 2021-06-21 2021-06-21 Live broadcast picture display method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113423017A true CN113423017A (en) 2021-09-21
CN113423017B CN113423017B (en) 2023-03-24

Family

ID=77789583

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110687210.2A Active CN113423017B (en) 2021-06-21 2021-06-21 Live broadcast picture display method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113423017B (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160261917A1 (en) * 2015-03-03 2016-09-08 Google Inc. Systems and methods for broadcast audience interaction and participation
US20170111418A1 (en) * 2015-10-16 2017-04-20 Microsoft Technology Licensing, Llc Two-way interactive streaming media
CN106792229A (en) * 2016-12-19 2017-05-31 广州虎牙信息科技有限公司 Ballot exchange method and its device based on direct broadcasting room video flowing barrage
WO2019092590A1 (en) * 2017-11-09 2019-05-16 ГИОРГАДЗЕ, Анико Тенгизовна User interaction in a communication system with the aid of multiple live streaming of augmented reality data
CN110519611A (en) * 2019-08-23 2019-11-29 腾讯科技(深圳)有限公司 Living broadcast interactive method, apparatus, electronic equipment and storage medium
CN111711831A (en) * 2020-06-28 2020-09-25 腾讯科技(深圳)有限公司 Data processing method and device based on interactive behavior and storage medium
CN111787407A (en) * 2020-07-24 2020-10-16 腾讯科技(深圳)有限公司 Interactive video playing method and device, computer equipment and storage medium
CN112561576A (en) * 2020-12-08 2021-03-26 广州繁星互娱信息科技有限公司 Interface display method, device and equipment for live application and storage medium
CN112973120A (en) * 2021-04-16 2021-06-18 腾讯科技(深圳)有限公司 Interaction method, device and equipment based on cloud game and storage medium


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023109037A1 (en) * 2021-12-14 2023-06-22 北京达佳互联信息技术有限公司 Interaction method based on live-streaming room, and electronic device
CN114363711A (en) * 2021-12-20 2022-04-15 上海哔哩哔哩科技有限公司 Live broadcast interaction method, device and system
CN114390360A (en) * 2022-01-12 2022-04-22 北京字跳网络技术有限公司 Live broadcast voting method, apparatus, electronic device, storage medium, and program product
CN114390360B (en) * 2022-01-12 2024-01-30 北京字跳网络技术有限公司 Live voting method and device, electronic equipment and storage medium
CN114125569A (en) * 2022-01-27 2022-03-01 阿里巴巴(中国)有限公司 Live broadcast processing method and device
CN115150634A (en) * 2022-07-06 2022-10-04 广州博冠信息科技有限公司 Live broadcast room information processing method and device, storage medium and electronic equipment
CN114885180A (en) * 2022-07-11 2022-08-09 成都秦川物联网科技股份有限公司 Smart city public landscape live broadcast control method and Internet of things system
CN115225949A (en) * 2022-07-11 2022-10-21 广州博冠信息科技有限公司 Live broadcast interaction method and device, computer storage medium and electronic equipment
US11765403B1 (en) 2022-07-11 2023-09-19 Chengdu Qinchuan Iot Technology Co., Ltd. Methods and internet of things systems for controlling public landscape live broadcast in smart cities
WO2024046357A1 (en) * 2022-08-30 2024-03-07 北京字跳网络技术有限公司 Information display method and apparatus, and electronic device and computer-readable medium

Also Published As

Publication number Publication date
CN113423017B (en) 2023-03-24

Similar Documents

Publication Publication Date Title
CN113423017B (en) Live broadcast picture display method and device, computer equipment and storage medium
CN108810576B (en) Live wheat-connecting method and device and storage medium
CN109600678B (en) Information display method, device and system, server, terminal and storage medium
CN109982102B (en) Interface display method and system for live broadcast room, live broadcast server and anchor terminal
CN109803154B (en) Live broadcast method, equipment and storage medium for chess game
CN111432266A (en) Interactive information display method, device, terminal and storage medium
CN112468831B (en) Multi-user live broadcast method, device, terminal, server and storage medium
CN111083516B (en) Live broadcast processing method and device
CN111355974A (en) Method, apparatus, system, device and storage medium for virtual gift giving processing
CN111918090B (en) Live broadcast picture display method and device, terminal and storage medium
CN112929687A (en) Interaction method, device and equipment based on live video and storage medium
CN111327918B (en) Interaction method and device for live webcast room and storage medium
CN111787407B (en) Interactive video playing method and device, computer equipment and storage medium
CN113230655B (en) Virtual object control method, device, equipment, system and readable storage medium
CN112118477A (en) Virtual gift display method, device, equipment and storage medium
CN111327916B (en) Live broadcast management method, device and equipment based on geographic object and storage medium
CN111246236A (en) Interactive data playing method, device, terminal, server and storage medium
CN111669640B (en) Virtual article transfer special effect display method, device, terminal and storage medium
CN111787347A (en) Live broadcast time length calculation method, live broadcast display method, device and equipment
CN110958464A (en) Live broadcast data processing method and device, server, terminal and storage medium
CN114245221A (en) Interaction method and device based on live broadcast room, electronic equipment and storage medium
CN114040219B (en) Game live broadcast method, device, system, equipment and computer readable storage medium
CN114845129A (en) Interaction method, device, terminal and storage medium in virtual space
CN108184130B (en) Simulator system, live broadcast method, device and storage medium
CN111954018B (en) Live broadcast room management method, system, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40051846

Country of ref document: HK

GR01 Patent grant
GR01 Patent grant