CN114430494B - Interface display method, device, equipment and storage medium - Google Patents

Interface display method, device, equipment and storage medium

Publication number: CN114430494B
Application number: CN202011184663.5A
Authority: CN (China)
Original language: Chinese (zh)
Other versions: CN114430494A (application publication)
Inventor: 唐艾妮
Assignee: Tencent Technology (Shenzhen) Co Ltd
Legal status: Active (granted)

Classifications

    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD], in particular:
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4438 Window management, e.g. event handling following interaction with the user interface
    • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/8545 Content authoring for generating interactive applications

Abstract

The embodiment of the invention discloses an interface display method, device, terminal and storage medium. The method may comprise the following steps: displaying a playing interface of a shared video in a shared room of a first application program, wherein the shared room comprises at least one audience user; when the playing of the shared video meets an interaction condition, displaying at least one interaction option on the playing interface; and when any interaction option of the at least one interaction option is triggered by any audience user of the at least one audience user, outputting an interactive response corresponding to that interaction option. By adopting the embodiment of the invention, multiple users can watch the same video together over a connected (mic-linked) session and interact while the video is played.

Description

Interface display method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an interface display method, apparatus, device, and storage medium.
Background
Zoos are among the most popular destinations for many people. However, under the influence of the COVID-19 epidemic, people have reduced offline gatherings, and zoos have begun to shift to online live streaming, in other words, showing videos of the animals' activities to users through live broadcasts so that the zoo can be browsed online. But the current form of online zoo touring lacks interaction between the user and the animals, and user engagement is low. Moreover, in a scenario where multiple users browse together, there is no way to ensure that the users watch the animal videos at a unified progress and communicate in real time. How to provide a better online touring experience has therefore become a hot research topic.
Disclosure of Invention
The embodiment of the invention provides an interface display method, device, equipment and storage medium, which enable multiple users to watch the same video together over a connected session and to interact while the video is played.
In one aspect, an embodiment of the present invention provides an interface display method, including:
displaying a playing interface of the shared video in the shared room of the first application program; the shared room includes at least one spectator user therein;
when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface;
outputting an interactive response corresponding to any one interaction option of the at least one interaction option when that interaction option is triggered by any one audience user of the at least one audience user.
In one aspect, an embodiment of the present invention provides an interface display device, including:
the display unit is used for displaying a playing interface of the shared video in the shared room of the first application program; the shared room includes at least one spectator user therein;
the display unit is further used for displaying at least one interaction option in the playing interface when the playing of the shared video meets the interaction condition;
And the output unit is used for outputting an interaction response corresponding to any interaction option when any interaction option in the at least one interaction option is triggered by any audience user in the at least one audience user.
In one aspect, an embodiment of the present invention provides a terminal, including:
a processor adapted to implement one or more instructions; and
a computer storage medium storing one or more instructions adapted to be loaded by the processor and to perform the following steps:
displaying a playing interface of the shared video in the shared room of the first application program; the shared room includes at least one spectator user therein;
when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface;
outputting an interactive response corresponding to any one interaction option of the at least one interaction option when that interaction option is triggered by any one audience user of the at least one audience user.
In one aspect, an embodiment of the present invention provides a computer storage medium, wherein computer program instructions are stored in the computer storage medium, and when the computer program instructions are executed by a processor, the computer program instructions are configured to perform the following steps:
Displaying a playing interface of the shared video in the shared room of the first application program; the shared room includes at least one spectator user therein;
when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface;
outputting an interactive response corresponding to any one interaction option of the at least one interaction option when that interaction option is triggered by any one audience user of the at least one audience user.
In one aspect, embodiments of the present invention provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium; a processor of a terminal reads the computer instructions from the computer storage medium, the processor executing the computer instructions to perform:
displaying a playing interface of the shared video in the shared room of the first application program; the shared room includes at least one spectator user therein;
when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface;
outputting an interactive response corresponding to any one interaction option of the at least one interaction option when that interaction option is triggered by any one audience user of the at least one audience user.
In the embodiment of the invention, the playing interface of the shared video in the shared room of the first application program is displayed, and the shared room comprises at least one audience user, that is, a plurality of audience users in the shared room watch the same shared video, so that the synchronism of watching the video by a plurality of people is realized; if the playing of the shared video meets the interaction condition, displaying one or more interaction options in a playing interface; further, according to the triggering of any audience user to any interaction option, the interaction response corresponding to any interaction option is output, so that the interaction between the user and the video or the objects in the video is realized, and the interaction option can be selected by any user, so that the social property and the interaction property are improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of a shared video management system according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of an interface display method according to an embodiment of the present invention;
FIG. 3a is a schematic diagram of a playback interface according to an embodiment of the present invention;
FIG. 3b is a schematic diagram of another playback interface according to an embodiment of the present invention;
fig. 3c is a schematic diagram of displaying invitation information in a friend user terminal according to an embodiment of the present invention;
FIG. 3d is a schematic diagram of a display room setup interface provided by an embodiment of the present invention;
FIG. 3e is a schematic diagram of inviting friend users to join a shared room according to an embodiment of the present invention;
FIG. 3f is a schematic diagram of a friend user joining a shared room according to an embodiment of the present invention;
FIG. 4a is a schematic diagram showing an interactive option according to an embodiment of the present invention;
FIG. 4b is a schematic diagram illustrating a group photo interaction according to an embodiment of the present invention;
FIG. 4c is a schematic diagram of another video playing option according to an embodiment of the present invention;
FIG. 5 is a flowchart of another interface display method according to an embodiment of the present invention;
FIG. 6a is a schematic diagram showing an interactive option according to an embodiment of the present invention;
FIG. 6b is a schematic diagram of a welcome window display provided by an embodiment of the present invention;
FIG. 6c is a schematic diagram of a photo special effect selection provided by an embodiment of the present invention;
FIG. 6d is a schematic diagram of a display interface according to an embodiment of the present invention;
FIG. 6e is a schematic diagram of another photographing arrangement according to an embodiment of the present invention;
FIG. 7a is a schematic diagram of displaying a historical browsing map according to an embodiment of the present invention;
fig. 7b is a playing interface displayed in a host user terminal according to an embodiment of the present invention;
FIG. 8a is a network topology diagram of a shared video management system according to an embodiment of the present invention;
FIG. 8b is a block diagram of a shared video management system according to an embodiment of the present invention;
FIG. 8c is an interaction diagram provided by an embodiment of the present invention;
FIG. 8d is a flowchart of a method for implementing interface display in a shared video management system according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of an interface display device according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention.
The embodiment of the invention provides an interface display scheme, which is used for displaying a playing interface of a shared video in a shared room of a first application program in a terminal, wherein the shared room can comprise one or more audience users; when the playing of the shared video meets the interaction condition, for example, the shared video is played to a certain designated position, or a designated object is included in a currently played playing picture, at least one interaction option, for example, a group photo option with a target object, other shared video options and the like are displayed in the playing interface; and outputting an interactive response corresponding to any one of the at least one interactive option if any one of the at least one interactive option is triggered by any one of the at least one viewer user. The synchronization of watching videos by multiple people is realized, each audience user in the multiple audience users can select interaction options, and social property and interaction property are improved.
Based on the above-mentioned interface display scheme, the embodiment of the present invention provides a shared video management system, please refer to fig. 1, which is a schematic structural diagram of the shared video management system provided by the embodiment of the present invention. The shared video management system shown in fig. 1 enables multiple audience users to watch the shared video together in real time over a connected (mic-linked) session, and during this process any audience user can select an interaction mode in the playing interface of the shared video, so that interaction among the audience users, or between the audience users and target objects in the shared video, can be completed according to the indication of the interaction mode.
Optionally, the shared video management system shown in fig. 1 may include a terminal 101 corresponding to each of the at least one viewer user and a server 102. The terminal 101 may include a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, a smart car, a smart television, etc.; the server 102 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs, basic cloud computing services such as big data and artificial intelligence platforms, and the like.
In one embodiment, the terminal 101 is configured to run a first application program and provide an interface UI presentation function for a corresponding viewer user, such as presenting a play interface, presenting at least one interactive option, presenting an interactive response, and the like. The server 102 provides support for the running of the first application in the terminal 101.
In a specific implementation, any audience user may perform an operation of creating a shared room through the terminal 101 used by the audience user, where the any audience user may be referred to as a creator of the shared room, and may also be referred to as a master user in the shared room; upon detecting the operation of creating the shared room, the terminal 101 transmits a creation request to the server 102, and the server 102 creates the shared room in the terminal 101 of the master user (in the following description, the terminal of the master user may be simply referred to as a master user terminal) after the verification of the creation request.
Further, the master state user may invite other audience users to join the shared room through the terminal 101 of the master state user, and the invited audience users may be referred to as friend state users in the shared room. Optionally, when detecting an invitation operation that the master user invites other audience users to join the shared room in the terminal 101, the master user terminal 101 sends an invitation request to the server 102, where the invitation request is used to instruct the server 102 to send an invitation notification to the terminal of the invited friend user (in the following description, the terminal of the friend user may be simply referred to as a friend user terminal). After the invited friend users accept the invitation and join the shared room, the server 102 may store user information of individual spectator users joining the shared room.
In one embodiment, the server 102 further stores video data of a plurality of shared videos, the terminal 101 of the host user may send a play request for playing a certain shared video to the server 102, and the server 102 returns the video data of the shared video requested by the play request; the terminal 101 displays a playing interface of the shared video in the shared room; it can be understood that the terminal 101 of the friend user added to the shared room also displays a playing interface of the shared video in the shared room.
With playing of the shared video in the shared room, if the terminal 101 or the server 102 of any audience user detects that the playing of the shared video meets the interaction condition, for example, playing to a certain designated playing screen or playing to a designated duration, the terminal 101 of each audience user in the shared room is notified to display one or more interaction options in the playing interface, so that any audience user can trigger any interaction option to interact through the corresponding terminal 101.
Optionally, when any one of the plurality of interactive options is triggered by any one of the viewer users, the terminal 101 of any one of the viewer users outputs an interactive response corresponding to any one of the interactive options. The interactive response may be generated by the server 102 according to any interactive option and then sent to each terminal 101 for display; alternatively, the interactive response may be generated by the terminal 101 of the viewer who triggers any interactive option, and sent to the server 102, and then sent to the terminals 101 of other viewer users by the server 102.
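To make the first of these alternatives concrete, the following is a minimal TypeScript sketch (not part of the patent) of the fan-out step: the server generates one interactive response when an option is triggered and delivers it to every audience user terminal in the shared room. All type and method names here are hypothetical.

```typescript
// Hypothetical message types for the shared-room flow described above.
interface InteractionTriggered {
  roomId: string;
  userId: string;          // the audience user who triggered the option
  optionId: string;        // e.g. "group-photo" or "watch-other-video"
}

interface InteractionResponse {
  roomId: string;
  optionId: string;
  payload: unknown;        // e.g. a group-photo image URL or a new video URL
}

// Server side: when any audience user triggers an option, generate the
// response once and fan it out to every terminal in the shared room.
class SharedRoomServer {
  private members = new Map<string, Set<(msg: InteractionResponse) => void>>();

  join(roomId: string, deliver: (msg: InteractionResponse) => void): void {
    if (!this.members.has(roomId)) this.members.set(roomId, new Set());
    this.members.get(roomId)!.add(deliver);
  }

  onInteractionTriggered(msg: InteractionTriggered): void {
    const response: InteractionResponse = {
      roomId: msg.roomId,
      optionId: msg.optionId,
      payload: { triggeredBy: msg.userId },
    };
    // Broadcast to every audience user terminal in the room, including the sender.
    for (const deliver of this.members.get(msg.roomId) ?? []) {
      deliver(response);
    }
  }
}
```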
In the shared video management system shown in fig. 1, synchronization of watching videos by multiple people is realized, interaction options with the videos or objects in the videos can be selected by any user of multiple audience users, and social property and interactivity are improved.
Based on the interface display scheme and the shared video management system, the embodiment of the invention provides an interface display method. Referring to fig. 2, a flow chart of an interface display method according to an embodiment of the present invention is shown. The interface display method shown in fig. 2 may be performed by the target terminal, and in particular, may be performed by a processor of the target terminal. The target terminal may refer to a terminal of any audience user in the shared room, for example, the audience user in the shared room includes a host user and a friend user, and the target terminal may refer to any one of the host user terminal and the friend user terminal. The interface display method in fig. 2 may include the following steps:
step S201, displaying a playing interface of a shared video in a shared room of the first application program, where the shared room includes at least one audience user.
The first application may refer to any application running in the target terminal that supports creation of a shared room, such as an online zoo application, a small video application, and so on. The shared room is a virtual online room, and a plurality of audience users in the same shared room can synchronously or asynchronously watch the shared video played in the shared room and synchronously or asynchronously browse other information in the shared room. The shared video may refer to any one of a plurality of videos that may be provided by the first application program.
In one embodiment, when the target terminal is in the landscape state, the playing interface is displayed in the target terminal in the landscape state, as shown in fig. 3 a; when the target terminal is in the portrait state, the playing interface is displayed in the target terminal in the portrait state, as shown in fig. 3 b.
Optionally, at least one viewer user may be included in the shared room, where the at least one viewer user may include a creator of the shared room (abbreviated as a host user in the following description) and one or more non-creators (abbreviated as friend users in the following description). The target terminal may refer to any one of a host user terminal and a friend user terminal. Optionally, the friend state user is a contact user of the host state user in a second application program, and the second application program can be any social application program.
Optionally, the playing interface may include identification information of the audience users in the shared room, and the identification information of any audience user may include any one or more of a video frame of that audience user, the audience user's avatar in the second application program, and so on. Optionally, the identification information of the host user and the identification information of the friend users among the plurality of audience users may be displayed at any position of the playing interface. In particular implementations, the display positions may depend on whether the playing interface is displayed in the landscape state or the portrait state.
For example, assume that the identification information of the host user is denoted as 001, the identification information of the friend user 1 is denoted as 002, and the identification information of the friend user 2 is denoted as 003. The display positions of the identification information of the host state user and the identification information of the friend state user in the playing interface are shown in fig. 3a, the playing interface is displayed in a horizontal screen state in fig. 3a, at this time, the identification information 001 of the host state user is displayed at the left lower part of the playing interface, and the identification information 002 and 003 of the friend state user are displayed at the right part of the playing interface.
For another example, the positions of the identification information of the host user and the identification information of the friend user in the playing interface may also be as shown in fig. 3b, where the playing interface is displayed in a vertical screen state in fig. 3b, and at this time, the identification information 001 of the host user may be displayed at the lower left side of the playing interface, and the identification information 002 and 003 of the friend user may be displayed at the upper side of the playing interface.
It should be understood that the foregoing merely gives two examples of display positions for the identification information of the host user and the friend users in the playing interface. In practical applications, whether the playing interface is displayed in the landscape state or the portrait state, the identification information of the host user and of the friend users may be displayed at the same horizontal position or the same vertical position; alternatively, the identification information of the host user may be displayed at the top of the playing interface and the identification information of the friend users at the bottom of the playing interface.
In one embodiment, the friend state user in the shared room can be added to the shared room by inviting the friend state user. In a specific implementation, assuming that the target terminal is a master state user terminal, the master state user inviting the friend state user to join in the shared room includes: when the friend inviting option is triggered, a friend inviting window is displayed, wherein the friend inviting window comprises an application program identifier of the second application program; selecting the application program identifier, and displaying a friend state user selection window, wherein the friend state user selection window comprises user identifiers of a plurality of friend state users; and selecting a target user identifier in the user identifiers of the plurality of friend users, triggering to send invitation information to the friend users indicated by the target user identifier, wherein the invitation information is used for indicating the friend users indicated by the target user identifier to join in the shared room.
The number of the second applications may be one or more, so the number of the application identifiers of the second applications in the friend invitation window is one or more, the application identifiers may refer to any one or more of icons and names of the second applications, for example, the second applications are WeChat applications, and the application identifiers of the second applications may be: "WeChat icon+WeChat".
In an embodiment, the implementation manner of displaying the friend user selection window may be: when the application program identification of the second application program is triggered, the host terminal sends request information for displaying the friend user selection window to the server, and the server generates the friend user selection window according to the friend user in the second application program and sends the generated friend user selection window to the host terminal; and the host state user terminal displays the friend state user selection window in a first application program, or the host state user terminal displays the friend state user selection window in a second application program.
In an embodiment, the invitation information sent to the friend user indicated by the target user identifier may be a session message in the second application, where the session message is displayed in a session interface of the friend user terminal indicated by the target user identifier, and the session interface is used for a session between the host user and the friend user. Optionally, the invitation information may include prompt information for joining the shared room and a join determination option, where the prompt information may take a form such as "your friend XXX invites you to join the XXX room to browse pandas together; click the join option below to open the first application and enter the XXX room".
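As a rough illustration of the invitation information described above, the sketch below (TypeScript; the field names and the deep-link format are hypothetical, not taken from the patent) assembles the session message shown to the invited friend user.

```typescript
// Hypothetical structure of the invitation shown in the second application's
// session interface (see the example wording above).
interface Invitation {
  fromUser: string;        // host user's nickname in the second application
  roomId: string;          // identifier of the shared room
  videoTitle: string;      // e.g. "pandas"
  promptText: string;      // human-readable prompt
  joinLink: string;        // link that opens the first application
}

function buildInvitation(fromUser: string, roomId: string, videoTitle: string): Invitation {
  return {
    fromUser,
    roomId,
    videoTitle,
    promptText:
      `Your friend ${fromUser} invites you to join room ${roomId} ` +
      `to browse ${videoTitle} together. Tap Join to open the app and enter the room.`,
    // Hypothetical deep-link scheme; the patent only says the message opens
    // the first application and enters the shared room.
    joinLink: `firstapp://shared-room/${encodeURIComponent(roomId)}`,
  };
}
```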
For example, referring to fig. 3c, a schematic diagram of displaying invitation information in a friend user according to an embodiment of the present invention is shown. 3A represents a session interface displayed in the friend-state user terminal indicated by the target user identifier, 3B represents the invitation information, and 3C represents the confirm joining button. After clicking the confirm joining button 3C, the friend user terminal jumps from running the second application to running the first application, and displays a friend room setting interface of the shared room in the first application as shown in fig. 3C at 31. The friend-state room setting interface 31 may display identification information of the host-state user and identification information of other friend-state users who have joined in the shared room. For example, the identification information of the host user may be expressed as "a WeChat friend user XXX of the homeowner you"; alternatively, it may be represented as an avatar of the host user in the WeChat application; or if the host user starts the camera, the identification information of the host user can be expressed as a video picture of the host user.
In one embodiment, the friend inviting option may be displayed in either or both of the following ways: displayed in the playing interface when the number of friend users included in the shared room is smaller than a number threshold; and displayed in the room setting interface of the shared room, where the room setting interface is displayed after the host user creates and enters the shared room, that is to say, the room setting interface refers to the host-state room setting interface.
In this manner of displaying the inviting friend options on the playing interface, the inviting friend options may be displayed at a position adjacent to the identification information of the friend user, as shown in 301 in fig. 3 a; alternatively, as shown at 310 in FIG. 3 b. The number threshold is the number of friend users which can be accommodated in a preset shared room, and the number threshold can be set by a default of a master user terminal or can be set by the master user when the shared room is created. If the number of friend users included in the current shared room is smaller than the number threshold, indicating that the shared room can also be added with other friend users; otherwise, if the number of friend users included in the current shared room is greater than or equal to the number threshold, the number of people in the shared room reaches the upper limit, and other friend users are not allowed to be added.
Thus, if the number of friend users included in the shared room is less than the number threshold, an invite friend option may be displayed in the play interface, so that the host user can invite friend users to join the shared room by triggering the invite friend option. For example, assuming that the number threshold is 3, the playing interface shown in fig. 3a may include 2 friend users who have already joined the shared room.
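The display rule in this paragraph reduces to a comparison against the number threshold; a small illustrative sketch (hypothetical function name, not from the patent) follows.

```typescript
// Show the invite-friend option only while the room can still accept members.
function shouldShowInviteFriendOption(friendUserCount: number, numberThreshold: number): boolean {
  // Below the threshold: more friend users may still join.
  // At or above the threshold: the room is full and the option is not shown.
  return friendUserCount < numberThreshold;
}

// Example: with a threshold of 3 and 2 friend users already joined,
// the option is still displayed.
console.log(shouldShowInviteFriendOption(2, 3)); // true
```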
In the display mode that the friend inviting options are displayed on the room setting interface of the shared room, the room setting interface is displayed in the host user terminal after the host user creates the shared room and enters the shared room. In the room setting interface, the identification information of the host-state user may be included, and when the target terminal is placed on the vertical screen, the identification information of the host-state user may be displayed at the bottom end of the room setting interface, specifically, at the lower left corner of the room setting interface, as shown in 302 of fig. 3 d.
In one embodiment, the invitation friend options may be displayed at the same horizontal position as the identification information of the host user, as in fig. 3d, 311, 322, and 333 represent the invitation friend options, and the visible 311, 322, and 333 are displayed at the same horizontal position as the identification information 302 of the host user, which is the bottommost end of the room setting interface. It should be understood that the foregoing is merely a display position of an inviting friend option in the room setting interface, and in practical application, it may be set that the inviting friend option is displayed at any position of the room setting interface, such as a position in the upper left corner, a position in the upper right corner, and so on.
Regardless of the invitation friend options displayed in any of the above display modes, when the invitation friend options are triggered, a friend invitation window is displayed. How the master user invites the friend user to join the shared room is specifically described below with reference to fig. 3d and 3e, taking the example in which the invitation friend option is displayed on the room setting interface.
In a specific implementation, after the host user clicks the friend inviting option 311 in fig. 3d, a friend invitation window is displayed as shown at 303 in fig. 3e; the friend invitation window 303 includes the icon and name of the WeChat application and the icon and name of the QQ application. When the host user triggers the icon of the QQ application program, a friend user selection window 304 is displayed; 304 includes a user identifier A of friend user A, a user identifier B corresponding to friend user B, and a user identifier C corresponding to friend user C. When user identifier B is triggered, the host user terminal sends invitation information to friend user B, and the invitation information is assumed to be as shown at 3B in fig. 3c. If friend user B clicks the confirm joining button 3C, friend user B joins the shared room. After friend user B joins the shared room, the room setting interface may display the identification information of friend user B, as shown at 306 in fig. 3e.
In other embodiments, a friend user may actively join the shared room using the room identification code of the shared room. In a specific implementation, any friend user terminal runs the first application program and opens an application interface for applying to join a shared room, where the application interface may include a room identification code filling area and a confirm joining button; when the room identification code of the shared room is filled in the room identification code filling area and the confirm joining button is triggered, that friend user joins the shared room.
Optionally, the room identification code of the shared room may be automatically obtained by the target terminal through the session interface between the host user and the friend user in the second application program and automatically filled into the room identification code filling area, or it may be filled in manually by the friend user. In other alternatives, the host user may tell any friend user the room identification code face to face, and that friend user then manually fills it into the room identification code filling area.
For example, referring to fig. 3f, a schematic diagram of a friend user joining a shared room according to an embodiment of the present invention is shown in fig. 3f, 31A shows an application interface for applying to join the shared room in a first application program displayed by a friend user terminal, 31B shows a session interface for a friend user and a host user to perform a session in a second application program displayed by the friend user terminal, and 32B shows a session message including a shared room identifier sent by the host user to the friend user; 32A denotes a room identification code filling area, and 33A denotes a confirm joining button. The friend user inputs the room identification code included in 32B in 32A and clicks 33A, and at this time, the friend user terminal displays a friend room setting interface of the shared room as shown in 31C.
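A minimal sketch of the join-by-code path described above, with hypothetical names; the patent does not prescribe a concrete API, so the join request is passed in as a callback.

```typescript
// Hypothetical client-side handler for the application interface shown in
// fig. 3f: a room-code input area plus a confirm-join button.
interface JoinResult {
  ok: boolean;
  reason?: string;
}

async function confirmJoin(
  roomCodeInput: string,
  requestJoin: (roomId: string) => Promise<JoinResult>
): Promise<JoinResult> {
  const roomId = roomCodeInput.trim();
  if (roomId.length === 0) {
    return { ok: false, reason: "room identification code is empty" };
  }
  // The code may have been auto-filled from the session message in the
  // second application, or typed manually by the friend user.
  return requestJoin(roomId);
}
```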
Step S202, when the playing of the shared video meets the interaction condition, at least one interaction option is displayed on the playing interface.
In one embodiment, the interaction condition includes a play progress condition, and the playing of the shared video satisfies the interaction condition when the playing progress of the shared video at the current moment is equal to the target playing progress indicated by the playing progress condition. The target playing progress may be, for example, 30 seconds before the end of playing, that is, the playing of the video is about to end. It should be understood that, in the embodiment of the present invention, displaying, near the end of playing the shared video, at least one interaction option that can be triggered by any audience user in the shared room lets the audience users decide what is played or displayed next in the shared room; in colloquial terms, any audience user decides the next step of the shared room, thereby improving interactivity.
Alternatively, the target playing progress may refer to one half, one third, or four fifths of the video having been played, and so on. Displaying the interaction options during the playing of the shared video can relieve the fatigue of watching the video for a long time, increase the audience users' interest in the shared video through appropriate interaction, and increase the attention paid to the first application program.
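One possible way to encode the play progress condition is sketched below (illustrative TypeScript; the condition kinds and threshold values are assumptions, not taken from the patent).

```typescript
// The play-progress condition can be either an absolute offset from the end
// (e.g. 30 seconds before playback ends) or a fraction of the video played
// (e.g. 1/2, 1/3, 4/5).
type ProgressCondition =
  | { kind: "secondsBeforeEnd"; seconds: number }
  | { kind: "fractionPlayed"; fraction: number };

function progressConditionMet(
  positionSec: number,
  durationSec: number,
  cond: ProgressCondition
): boolean {
  switch (cond.kind) {
    case "secondsBeforeEnd":
      return durationSec - positionSec <= cond.seconds;
    case "fractionPlayed":
      return positionSec / durationSec >= cond.fraction;
  }
}

// Example: show options 30 seconds before the shared video ends.
progressConditionMet(570, 600, { kind: "secondsBeforeEnd", seconds: 30 }); // true
```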
In another embodiment, the interactive condition includes a play frame condition, in colloquial terms, the play frame condition refers to a condition that needs to be met by a play frame currently played by the shared video, the play frame condition is used to indicate a target frame, and the play meeting interactive condition of the shared video refers to that the play frame in the shared video at the current moment is the target frame indicated by the play frame condition. The target picture may be pre-designated, and the target picture may be any one of multi-frame playing pictures included in the shared video, for example, the target picture refers to a playing picture including a target object, or the target picture refers to a playing picture including a target object, where a gesture of the target object is a preset gesture. The preset gesture may refer to any one or more of a gesture suitable for group photo, a gesture suitable for feeding, or a gesture suitable for stroking. In the playing process of the shared video, a plurality of playing pictures suitable for interaction are selected for audience users to interact, so that the interestingness of watching the shared video can be improved.
As an optional implementation manner, if the target picture refers to any one frame of playing pictures in multi-frame playing pictures included in the shared video, an implementation manner in which the target terminal detects whether playing of the shared video meets the interaction condition may be: the target terminal adds a mark for a target picture in the shared video; and in the playing process of the shared video, if the mark exists in the playing picture played at the current moment, determining that the playing of the shared video meets the interaction condition.
As another optional implementation manner, if the target picture refers to a playing picture including a target object, the implementation manner of detecting whether playing of the shared video by the target terminal meets the interaction condition may further be that: during the playing process of the shared video, identifying a playing picture of which each frame comprises an object; and if the object included in the playing picture played at the current moment is identified as the target object, determining that the playing of the shared video meets the interaction condition.
As still another optional implementation manner, if the target picture includes a target object and the gesture of the target object is a preset gesture, the implementation manner that the target terminal detects whether the playing of the shared video meets the interaction condition may be: during the playing process of the shared video, identifying a playing picture of each frame including a target object; and identifying the gesture of the target object in the playing picture; and if the gesture of the target object is equal to the preset gesture, determining that the playing of the shared video meets the interaction condition.
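The three detection alternatives above (pre-placed marker, object recognition, object plus pose recognition) can be expressed with one predicate; the sketch below is illustrative only, and the marker, object and pose recognizers are left abstract because the patent does not fix a particular recognition method.

```typescript
// Per-frame information produced by whatever recognition step is used.
interface FrameInfo {
  hasMarker: boolean;           // alternative 1: this frame was pre-marked
  detectedObjects: string[];    // alternative 2: e.g. ["panda"]
  detectedPose?: string;        // alternative 3: e.g. "suitable-for-group-photo"
}

type FrameCondition =
  | { kind: "marker" }
  | { kind: "object"; target: string }
  | { kind: "objectWithPose"; target: string; pose: string };

function frameConditionMet(frame: FrameInfo, cond: FrameCondition): boolean {
  switch (cond.kind) {
    case "marker":
      return frame.hasMarker;
    case "object":
      return frame.detectedObjects.includes(cond.target);
    case "objectWithPose":
      return frame.detectedObjects.includes(cond.target) &&
             frame.detectedPose === cond.pose;
  }
}
```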
After detecting that the playing of the shared video meets the interaction condition, the target terminal can display at least one interaction option in a playing interface, wherein each interaction option corresponds to an interaction operation, and each interaction option can be understood as a scene.
In one embodiment, the at least one interaction option may include each of at least one interaction option to be displayed, where the at least one interaction option to be displayed is a preset interaction option related to the shared video. The at least one interactive option to be displayed may include: options for interacting with the target object and options for viewing other shared videos; the option of interacting with the target object includes any one or more of a group photo option with the target object and an option of inputting a voice containing the name of the target object (i.e., an option of calling the name of the target object). The other shared video may include a shared video related to the target object, or may refer to a video including other objects.
It should be understood that the embodiments of the present invention just enumerate some possible interaction options to be displayed related to the shared video, and in practical applications, more interaction options to be displayed may be set according to different object types to which the target object belongs. For example, if the target object is a plant, the interaction options to be displayed for interacting with the target object may further include watering the target object, pruning branches and leaves for the target object, and the like; for another example, if the target object is a person, the option to be displayed for interacting with the target object may further include handshake with the target object, hug with the target object, and so on.
In popular terms, the target terminal may preset a plurality of interaction options related to the shared video, and these preset interaction options may be referred to as interaction options to be displayed. When the playing of the shared video is detected to meet the interaction condition, the target terminal can display preset interaction options to be displayed in a playing interface.
In other embodiments, the at least one interaction option may be an interaction option that matches the interaction condition, selected from the at least one interaction option to be displayed. For example, if the interaction condition is a playing progress condition, the at least one interaction option refers to the interaction option, among the at least one interaction option to be displayed, that matches the playing progress condition; for instance, if the target playing progress indicated by the playing progress condition is 30 seconds before the end of playing, the matching interaction option may be the option for watching other shared videos. If the interaction condition is a play frame condition, the at least one interaction option refers to the interaction options, among the at least one interaction option to be displayed, that match the play frame condition, such as the group photo option with the target object and the option for inputting a voice containing the name of the target object.
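A toy sketch of matching options to the satisfied condition follows; the option identifiers and the mapping itself are hypothetical examples, not a fixed list from the patent.

```typescript
// Illustrative mapping from the kind of interaction condition that was
// satisfied to the subset of to-be-displayed options that match it.
type ConditionKind = "progress" | "frame";

const OPTIONS_TO_DISPLAY: ReadonlyArray<{ id: string; matches: ConditionKind[] }> = [
  { id: "feed-target",       matches: ["frame"] },
  { id: "group-photo",       matches: ["frame"] },
  { id: "call-target-name",  matches: ["frame"] },
  { id: "watch-other-video", matches: ["progress"] },
];

function selectInteractionOptions(conditionKind: ConditionKind): string[] {
  return OPTIONS_TO_DISPLAY
    .filter(o => o.matches.includes(conditionKind))
    .map(o => o.id);
}

// Near the end of playback (progress condition) only the option for watching
// other shared videos is shown; on a matching play frame, the feeding,
// group photo and name-calling options appear.
```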
In one embodiment, the at least one interaction option may be displayed in an interaction window of the playing interface. For example, referring to fig. 4a, which is a schematic diagram showing interaction options provided in an embodiment of the present invention, assume that the first application is an online zoo, the shared video is an activity video of a panda, the target object included in the shared video is the panda, and 401 represents the playing interface of the shared video in the shared room. If the target terminal detects that the playing progress of the shared video is equal to the target playing progress indicated by the interaction condition, an interaction window is displayed in the playing interface as shown at 402 in fig. 4a. Several interaction options are displayed in the interaction window, such as: an option for feeding the target object ("feed bamboo to the panda"), a group photo option with the target object, an option for inputting a voice containing the name of the target object ("call the panda's name"), and an option for watching other shared videos.
Step S203, when any one of the at least one interactive option is triggered by any one of the at least one viewer user, outputting an interactive response corresponding to the any one interactive option.
As can be seen from the foregoing, the at least one interaction option is displayed in an interaction window of the playing interface, and the interaction window may further include prompt information for triggering an interaction option by voice. Any interaction option may be triggered in any one or more of the following ways: a touch mode, which means selecting the interaction option by any touch operation such as clicking, long pressing, or double clicking; and a voice mode, which means outputting a voice that includes the interaction option, in colloquial terms, any audience user inputs, through that audience user's terminal, a voice containing the interaction option. For example, in the interaction window 402 of fig. 4a, the prompt for triggering an interaction option by voice may appear as "try turning on the microphone and speak your selection", as shown at 411 in fig. 4a.
It should be understood that triggering any interaction option in a voice manner enriches the form of participation of any audience user in the interaction, so that both hands of the audience user can be liberated, convenience is provided for the audience user, and the enthusiasm of the audience user for participation in the interaction can be improved.
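A minimal sketch of the voice trigger path: a recognized utterance selects the option whose label it contains. The speech-to-text step is left abstract, and the names below are hypothetical rather than taken from the patent.

```typescript
// Match a recognized utterance against the displayed interaction options.
interface InteractionOption {
  id: string;
  label: string;   // e.g. "feed bamboo to the panda"
}

function matchOptionByVoice(
  transcript: string,
  options: InteractionOption[]
): InteractionOption | undefined {
  const spoken = transcript.toLowerCase();
  // The option is triggered when the spoken sentence contains its label.
  return options.find(o => spoken.includes(o.label.toLowerCase()));
}

// e.g. matchOptionByVoice("I want to feed bamboo to the panda", options)
```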
In one embodiment, if any of the interactive options triggered by the at least one viewer user refers to a group photo option with the target object, the interactive response may refer to a group photo image, and the outputting the interactive response corresponding to any of the interactive options includes: when the group photo options with the target object are triggered, group photo prompt animation is played on the playing interface; and after the prompt animation is played, displaying a group photo completion window, wherein the group photo completion window comprises a group photo image.
In one embodiment, the group photo prompt animation is used for reminding the audience users that the group photo image is being generated, and it may include prompt information for generating the group photo image, such as "group photo countdown, please strike a pose". The group photo prompt animation may be a countdown animation, and the countdown start time may be preset by the target terminal. Alternatively, the target terminal may set the countdown time according to the time required to generate the group photo image.
After the countdown is finished, the target terminal displays the group photo image in a group photo completion window. Optionally, the individual audience users in the shared room may conduct an audio-video session.
In one embodiment, the group photo image is generated according to a target image and a viewer user image of at least one viewer user, wherein the target image refers to a preset playing picture in the shared video, or the target image refers to a playing picture in the shared video when the any interaction option is triggered; the audience user image may refer to an image captured by an audience user during the process of playing the group photo reminder animation.
Optionally, after detecting that the playing of the shared video meets the interaction condition and displaying the at least one interaction option, the target terminal may stop playing the shared video; in that case the time at which any interaction option is triggered corresponds to the time at which the at least one interaction option is displayed. Alternatively, the target terminal may continue playing the shared video while the at least one interaction option is displayed until any interaction option is triggered; in that case the interaction option is triggered later than the time at which the at least one interaction option was displayed.
As an alternative embodiment, the purpose of the group photo is to commemorate that a plurality of audience users watched a shared video including the target object together, so it is preferable that each audience user image includes the face of the audience user. In a specific implementation, each audience user terminal may send the captured image to the server, and the server selects the audience user images that include a face as valid images, that is, as audience user images that may be used to generate the group photo image. For an invalid audience user image, the server may request the corresponding audience terminal to capture the image again, or may ignore it.
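The selection of valid audience user images and the composition of the group photo image might look roughly like the sketch below; this is illustrative only, and the face detector and compositor are placeholders for whatever the implementation actually uses.

```typescript
// Server-side selection of valid audience user images (those containing a
// face) and composition with the target frame.
interface CapturedImage {
  userId: string;
  data: Uint8Array;
}

function selectValidImages(
  images: CapturedImage[],
  containsFace: (img: CapturedImage) => boolean
): CapturedImage[] {
  // Invalid images (no face) are ignored, or re-requested from the terminal.
  return images.filter(containsFace);
}

function composeGroupPhoto(
  targetFrame: Uint8Array,
  viewerImages: CapturedImage[],
  compose: (base: Uint8Array, overlays: Uint8Array[]) => Uint8Array
): Uint8Array {
  // The target image is either a preset frame of the shared video or the
  // frame being played when the group photo option was triggered.
  return compose(targetFrame, viewerImages.map(i => i.data));
}
```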
For example, an embodiment of the present invention provides a schematic diagram for displaying a group photo image, as shown in fig. 4b. If any audience user selects the group photo option ("take a group photo with the panda") in the interaction window 402, the target terminal displays a group photo prompt animation as shown at 41. After the group photo prompt animation finishes playing, the target terminal displays a group photo window 42 in the playing interface, where the group photo window 42 includes an image of the panda and the audience user images of two audience users.
Optionally, the group photo window may also include a group photo collection option, shown as "recall" at 421 in the group photo window 42 of fig. 4b. When the group photo collection option is triggered by any audience user, the group photo image is saved in that audience user's terminal.
In other embodiments, if the interaction option triggered by any audience user refers to the option for watching other shared videos, the interactive response is a playing interface of the other shared video. In this case, when any one of the at least one interaction option is triggered by any one of the at least one audience user, outputting the interactive response corresponding to that interaction option includes: when the option for watching a video related to the target object is triggered, switching from the playing interface of the shared video to the playing interface of the other shared video in the shared room of the first application program.
For example, fig. 4c is a schematic diagram of outputting an interactive response according to an embodiment of the present invention. Assuming that the playing interface of the shared video displayed by the target terminal is shown at 401 in fig. 4a, if any audience user in the shared room clicks the "feed bamboo to the panda" option in the interaction window 402 of fig. 4a, the target terminal displays the video of feeding bamboo to the panda, as shown at 45 in fig. 4c.
In the embodiment of the invention, the playing interface of the shared video in the shared room of the first application program is displayed, and the shared room comprises at least one audience user, that is, a plurality of audience users in the shared room watch the same shared video, so that the synchronism of watching the video by a plurality of people is realized; if the playing of the shared video meets the interaction condition, displaying one or more interaction options in a playing interface; further, according to the triggering of any audience user to any interaction option, the interaction response corresponding to any interaction option is output, the interaction between the user and the video or the objects in the video is realized, the interaction option can be selected by any user, and the social property and the interactivity are improved.
Based on the interface display method above, another interface display method is provided in an embodiment of the invention. Referring to fig. 5, a flowchart of another interface display method according to an embodiment of the present invention is shown. The flow shown in fig. 5 may be executed by the target terminal, and in particular by a processor of the target terminal. The shared room includes at least one audience user, including a host user and friend users, and the target terminal may refer to any one of the host user terminal and a friend user terminal. The interface display method shown in fig. 5 may include the following steps:
in step S501, if there is a trigger event for playing the shared video, a playing interface of the shared video in a shared room of the first application program is displayed, where the shared room includes at least one audience user.
In one embodiment, the triggering event for playing the shared video may refer to the management user of the at least one viewer user inputting a playing operation for triggering the shared video. Wherein the management user refers to a viewer user having management authority for the shared room among at least one viewer user. From the foregoing, the at least one audience user includes a host user and a friend user, where the host user is a creator of the shared room and has management authority for the shared room. Optionally, the master state user can transfer the management authority to any friend state user; alternatively, the master mode user may designate one or more friend mode users as management users to manage the shared room together.
Assuming that the management user is the master state user, as an alternative implementation, the trigger event for playing the shared video may be that the master state user triggers an open control for starting playback of the shared video. Optionally, the open control may be displayed in a room setting interface, which may be displayed after the master state user creates and enters the shared room.
For example, referring to fig. 6a, a schematic diagram of a room setting interface according to an embodiment of the present invention is provided, where 601 represents the room setting interface and 61 represents the open control for playing the shared video. When the master state user clicks the open control 61, a playing interface is displayed in the master state user terminal, as shown at 602 in fig. 6a.
Optionally, after the host user enters the room setting interface of the shared room, the target terminal may display a welcome window in the room setting interface. The welcome window may include some prompt information; assuming the first application is an online zoo, the prompt information in the welcome window may be used to indicate which animals can be viewed in the shared room and how to view them. For example, the prompt in the welcome window may read: "Hi, welcome to the panda's home. You can invite family or friends to join you over a voice and video connection, tour the zoo together, watch interesting animal performances, and interact closely with the small animals."
Optionally, the welcome window may include a determine option and a close button. If the master state user selects either the determine option or the close button, the welcome window is closed. For example, referring to fig. 6b, a schematic diagram showing a welcome window is provided for an embodiment of the present invention, where 603 represents a room setting interface and 62 represents the welcome window. 6A represents the determine option and 6B represents the close button; when the determine option 6A is triggered, the welcome window in the room setting interface is closed.
As another possible implementation, if the master state user selects the close button or the determine option, the display of a friend invitation window in the room setting interface is triggered, so that the master state user can invite friend state users to join the shared room through the friend invitation window. For example, if the master state user clicks 6A in fig. 6b, the display of the friend invitation window in the room setting interface may be triggered. The specific implementation of how to invite a friend state user to join the shared room through the friend invitation window is described in detail in step S201 of the embodiment of fig. 2 and is not repeated here.
In one embodiment, the target terminal may provide a photographing setting function, so that the host user or the friend user can set their own photographing special effect. If the photographing special effect selected by the host user or the friend user is confirmed, the image of the host user or of the friend user may then be collected based on the selected photographing special effect.
In a specific implementation, the target terminal displays a photographing special effect selection interface, which includes an image preview area and a special effect selection area; the special effect selection area includes a plurality of special effect identifiers and a determination control. When any special effect identifier of the plurality of special effect identifiers is selected and the determination control is not triggered, a target preview image is displayed in the image preview area, where the target preview image is generated based on the user image collected at the current moment and the special effect indicated by the selected special effect identifier. When any special effect identifier of the plurality of special effect identifiers is selected and the determination control is triggered, it is determined that the photographing special effect selection is completed.
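For illustration only, the following TypeScript sketch outlines this selection flow; the helper functions and effect identifiers are assumptions and do not come from the original text.

```typescript
type EffectId = "panda-decoration" | "filter-1" | "filter-2";

interface EffectSelectionDeps {
  captureCurrentFrame(): ImageData;                            // user image collected at the current moment
  applyEffect(frame: ImageData, effect: EffectId): ImageData;  // filter or beautification
  renderPreview(frame: ImageData): void;                       // draws into the image preview area
}

interface EffectSelectionState {
  selectedEffect: EffectId | null;
  completed: boolean;
}

// A special effect identifier is selected while the determination control is not triggered:
// show the target preview image in the image preview area.
function onEffectSelected(state: EffectSelectionState, deps: EffectSelectionDeps, effect: EffectId): void {
  state.selectedEffect = effect;
  deps.renderPreview(deps.applyEffect(deps.captureCurrentFrame(), effect));
}

// The determination control is triggered with an effect selected:
// the photographing special effect selection is completed.
function onDeterminationControlTriggered(state: EffectSelectionState): void {
  if (state.selectedEffect !== null) {
    state.completed = true;
  }
}
```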
Optionally, the photographing special effect selection interface is displayed when a photographing setting trigger event exists. The photographing setting trigger event includes: the open control for starting playback of the shared video, which is included in the room setting interface of the master state user terminal, being triggered.
That is, when the open control in the room setting interface is selected, the target terminal displays the photographing special effect selection interface. In this case, the trigger event for playing the shared video may be the completion of the photographing special effect selection. The special effect indicated by each of the plurality of special effect identifiers may include a filter and a beautification effect, where the filter is used to modify an image through virtual decoration, and the beautification effect is used to process a face in the image by certain means, such as face slimming and skin whitening. Optionally, the special effect selection area may include a first type of special effect selection item and a second type of special effect selection item, and the plurality of special effect identifiers are displayed under different special effect selection items according to the special effect category to which each identifier belongs; for example, special effect identifiers belonging to filters are displayed under the first type of special effect selection item, and special effect identifiers belonging to beautification are displayed under the second type of special effect selection item.
For example, taking the target terminal as the master state user terminal, referring to fig. 6c, a schematic diagram of photographing special effect selection according to an embodiment of the present invention is provided, where 61A represents the room setting interface, and 61B represents the open control in the room setting interface for starting playback of the shared video. When 61B is triggered, the master state user terminal displays a photographing special effect selection interface 604. The photographing special effect selection interface 604 includes an image preview area 64 and a special effect selection area 65; the special effect identifiers included in the special effect selection area 65 may be expressed as: panda decoration 611, filter 1, and filter 2. The special effect selection area 65 also includes a determination control 66. When the panda decoration 611 is selected but the determination control 66 is not triggered, the target preview image 622 is displayed in the image preview area 64.
In one embodiment, when it is detected that the photographing special effect selection is completed, that is, when the determination control in the photographing special effect selection interface is triggered, the display of the playing interface of the shared video may be triggered. For example, assuming that filter 1 is selected and the determination control 66 is triggered in fig. 6c, the master state user terminal may display the playing interface of the shared video, as shown by 67 in fig. 6d.
In other embodiments, the photographing setting trigger event may further include a photographing setting option in the playing interface being triggered. Referring to fig. 6e, a schematic diagram of another photographing setting according to an embodiment of the present invention is provided, where 611 represents the playing interface and 622 represents the photographing setting option, which may be displayed at the lower right of the playing interface when the playing interface is displayed in a vertical screen state. If the photographing setting option 622 is triggered, the target terminal displays a photographing special effect selection interface 633.
Step S502, when the playing of the shared video meets the interaction condition, at least one interaction option is displayed in the playing interface.
In step S503, when any one of the at least one interactive option is triggered by any one of the audience users in a voice triggering manner, outputting an interactive response corresponding to the any one of the interactive options.
In an embodiment, for some possible implementations of step S502 and step S503, reference may be made to the related descriptions of step S202 and step S203 in the embodiment of fig. 2, which are not repeated here.
Step S504, if the viewing option for history browsing in the playing interface is triggered, a history browsing window is displayed, where the history browsing window includes a history browsing map, and the history browsing map is generated according to the interaction options selected in the playing interface within a historical time period.
In one embodiment, multiple interactions may have been performed or multiple shared videos may have been played in the shared room over time. Therefore, to make it convenient for each viewer user to browse the interactions performed in the shared room or the shared videos played in the shared room, the target terminal may set a viewing option for history browsing in the playing interface, through which the viewer users can view the browsing history of the shared room. Optionally, when the playing interface is displayed in a horizontal screen state, the viewing option for history browsing may be displayed at the bottom of the playing interface, below the position of the identification information of the host user, as shown by 131 in fig. 3a; when the playing interface is displayed in a vertical screen state, the viewing option for history browsing may be displayed at the bottom of the playing interface, at the same horizontal position as the identification information of the host user, as shown by 132 in fig. 3b.
When the viewing option for history browsing in the playing interface is triggered, the target terminal displays a history browsing window, where the history browsing window includes a history browsing map generated according to the interaction options selected in the shared room within a historical time period. For example, the history browsing map may be formed by connecting the selected interaction options in sequence according to the time at which each option was selected within the historical time period.
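A minimal TypeScript sketch of building such a map from selection records is given below; the record shape and node structure are assumptions for illustration.

```typescript
interface SelectionRecord {
  option: string;      // e.g. "feed the panda"
  selectedAt: number;  // epoch milliseconds at which the option was selected
}

interface MapNode {
  label: string;
  next: MapNode | null;
}

// Connect the selected interaction options in the order they were chosen during the historical period.
function buildHistoryBrowsingMap(records: SelectionRecord[]): MapNode | null {
  const ordered = [...records].sort((a, b) => a.selectedAt - b.selectedAt);
  let head: MapNode | null = null;
  let tail: MapNode | null = null;
  for (const record of ordered) {
    const node: MapNode = { label: record.option, next: null };
    if (tail) {
      tail.next = node;
    } else {
      head = node;
    }
    tail = node;
  }
  return head; // traversing from head yields the browsing footprint in selection order
}
```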
For example, referring to fig. 7a, a schematic diagram of displaying a history browsing map according to an embodiment of the present invention is provided, where 701 represents a playing interface of the shared video when the target terminal is in a landscape state and 71 represents the viewing option for history browsing. When the viewing option for history browsing is triggered, a history browsing window 702 is displayed, and 702 includes a history browsing map 72. The interaction options selected in the shared room over the historical time period are included in 72, for example: "visit the panda's family", "feed the panda", "take a photo with the panda", and "look around".
In one embodiment, a camera control may also be provided in the playing interface. The camera control has an on state and an off state: when the camera control is in the on state, the target terminal can collect a video picture; when the camera control is in the off state, the target terminal cannot collect a video picture. Optionally, the camera control may be displayed at the same horizontal position as the viewing option for history browsing. Specifically, when the playing interface is displayed in a horizontal screen state, the camera control may be displayed at the bottom of the playing interface, below the identification information of the host user, as shown by 133 in fig. 3a; when the playing interface is displayed in a vertical screen state, the camera control may be displayed at the very bottom of the playing interface, at the same horizontal position as the identification information of the host user, as shown by 144 in fig. 3b.
Optionally, the playing interface may further include identification information of a target audience user and identification information of remaining audience users, where the target audience user is an audience user who uses a target terminal in at least one audience user of the shared room, and the remaining audience user is another audience user except for the target audience user in at least one audience user of the shared room. For example, if at least one audience user includes a host user and a friend user, and the target terminal is referred to as a host user terminal, then the target audience user is referred to as a host user, and the remaining audience users are referred to as friend users.
If the camera control in the playing interface is in an on state, the identification information of the target audience user is a video picture of the target user; if the camera control in the playing interface is in a closed state, the identification information of the target audience user is the user identification of the target user in the second application program;
if a camera control in a playing interface for playing the shared video, which is displayed by the residual audience user terminal, is in an on state, the identification information of the residual audience user is a video picture of the residual audience user; and if the camera control in the playing interface for playing the shared video, which is displayed by the residual audience user terminal, is in a closed state, the identification information of the residual audience user is the user identification of the residual audience user in the second application program.
Taking the target terminal as the host state user terminal as an example, the target audience user is the host state user and the remaining audience users are friend state users. In plain terms, if the host state user sets the camera control to the on state, a video picture of the host state user can be displayed in the playing interface; otherwise, the user identification of the host user in the second application program, such as an avatar or nickname, is displayed in the playing interface. Similarly, if a friend state user sets the camera control to the on state in the friend state user terminal, a video picture of that friend state user can be displayed in the playing interface displayed by the master state user terminal; otherwise, the user identification of that friend state user is displayed in the playing interface displayed by the master state user terminal.
In short, a camera control is displayed both in the playing interface displayed by the master state user terminal and in the playing interface displayed by the friend state user terminal, and the camera control has an on state and an off state. When the camera control is in the on state, the real-time picture of the corresponding audience user can be seen by the other audience users in the shared room; conversely, when the camera control is in the off state, the real-time picture of that audience user is not visible to the other audience users, who can only see that audience user's user identification in the second application program, such as an avatar.
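The following TypeScript sketch illustrates this rule (types and field names are assumptions): the identification information shown for a user resolves to a live video picture when that user's camera control is on, and to the user identification in the second application program otherwise.

```typescript
interface ViewerPresence {
  userId: string;
  cameraOn: boolean;           // state of the camera control in that user's playing interface
  liveFrameUrl?: string;       // real-time picture, only available while the camera is on
  secondAppAvatarUrl: string;  // user identification in the second application program
}

type IdentificationInfo =
  | { kind: "video"; src: string }
  | { kind: "avatar"; src: string };

function identificationInfoFor(presence: ViewerPresence): IdentificationInfo {
  if (presence.cameraOn && presence.liveFrameUrl) {
    return { kind: "video", src: presence.liveFrameUrl };
  }
  return { kind: "avatar", src: presence.secondAppAvatarUrl };
}
```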
For example, assuming that the target audience user is the host user and the remaining audience users are friend users, referring to fig. 7b, which shows a playing interface displayed in the host user terminal according to an embodiment of the present invention, 703 represents the identification information of the host user, 75 and 76 respectively represent the identification information of friend users, and 73 represents the camera control, which is in the on state; in this case, the identification information of the host user is a video picture of the host user. If the friend users set the camera control to the on state in their respective playing interfaces, the identification information of each friend user is a video picture of that friend user.
When the camera control in fig. 7b is in the off state, the identification information of the host user is the user identification of the host user in the second application program, such as the host user's avatar, as shown by 7031 in fig. 7b; if the friend users set the camera control to the off state in their respective playing interfaces, the identification information of each friend user is that friend user's user identification in the second application program, such as an avatar, as shown by 751 and 761 in fig. 7b.
In one embodiment, a microphone control may also be included in the playing interface. Optionally, the microphone control may be displayed at the same horizontal position as the camera control and the viewing option for history browsing described above. When the microphone control is on, the target user can input speech through the microphone; otherwise, the target user cannot input speech. The microphone control is shown as 7A in fig. 7b.
In one embodiment, a play control may also be included in the playing interface. Optionally, the play control may be displayed at the same horizontal position as the microphone control, the camera control, and the viewing option for history browsing described above. The play control is used to pause and resume playback of the shared video, and is shown as 7B in fig. 7b.
In one embodiment, a cross-screen playing control may also be included in the playing interface. Optionally, the cross-screen playing control may be displayed at the same horizontal position as the above controls. When the cross-screen playing control is triggered, the playing interface is switched from being displayed in the target terminal to being displayed in a screen-casting terminal; 7C in fig. 7b shows the cross-screen playing control. For example, if the current target terminal is a mobile phone, its screen is small, and the target user's eyes may become tired from watching the shared video on the mobile phone for a long time; at this time, the target user can trigger the cross-screen playing control to cast the shared video to a television for playing.
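As a hedged illustration (the casting interface below is hypothetical, not an API named in the original), triggering the cross-screen playing control can be understood as handing playback off to the casting terminal at the current position and stopping local rendering:

```typescript
interface CastTarget {
  play(videoId: string, positionSeconds: number): void; // continue playback on the larger screen
}

interface LocalPlayer {
  videoId: string;
  currentTime(): number;
  pause(): void;
}

function onCrossScreenControlTriggered(player: LocalPlayer, castTarget: CastTarget): void {
  const position = player.currentTime();
  player.pause();                            // stop rendering on the mobile phone
  castTarget.play(player.videoId, position); // resume on the screen-casting terminal from the same point
}
```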
In one embodiment, the playing interface may further include a photographing special effect setting option, as shown by 7D in fig. 7b. When the photographing special effect setting option is triggered, a photographing special effect selection interface, such as 604 in fig. 6c, may be displayed for the target user to select a photographing special effect.
As described above, when the target terminal is in the horizontal screen state or the vertical screen state, the display forms of the identification information of the host user, the identification information of the friend users, and the controls in the playing interface are different. Fig. 7b shows the display form of the above identification information and controls in the playing interface when the target terminal is in the vertical screen state: the identification information of the target audience user is displayed at the lower left corner of the playing interface, the identification information of the friend state users is displayed at the top of the playing interface, and the camera control, the microphone control, the cross-screen playing control, the play control, and the photographing special effect setting option can be displayed at the bottom of the playing interface in parallel with the identification information of the host state user. Referring to fig. 3a, when the target terminal is in the landscape state, the controls are displayed on the playing interface as follows: the camera control, the microphone control, the cross-screen playing control, the play control, and the photographing special effect setting option are displayed in parallel at the bottom of the playing interface, the identification information of the host user is displayed above the position of the camera control, and the identification information of the friend state users is displayed on the right side of the playing interface.
The above is only one possible display form of the controls on the playing interface when the target terminal is in the vertical screen or horizontal screen state; in practical applications, other display forms can be set according to the size of the target terminal screen. For example, when the target terminal is in the vertical screen state, the camera control and the microphone control may be displayed on the right side of the playing interface, and the cross-screen playing control and the photographing special effect setting option on the left side; for another example, when the target terminal is in the horizontal screen state, the identification information of the friend users and the other controls may be displayed in parallel at the bottom of the playing interface, and so on.
In the embodiment of the invention, when a trigger event for playing the shared video exists, the target terminal displays the playing interface of the shared video in the shared room of the first application program, where the shared room includes at least one audience user; that is, a plurality of audience users in the shared room watch the same shared video, so that synchronous video watching by multiple people is realized. If the playing of the shared video meets the interaction condition, one or more interaction options are displayed in the playing interface. Further, any audience user in the shared room can trigger any interaction option by voice, which enriches the interaction modes, improves each audience user's participation in the interaction, and improves interactivity; the target terminal then outputs the interaction response corresponding to the triggered interaction option. Further, if the viewing option for history browsing in the playing interface is triggered, a history browsing window including a history browsing map is displayed, so that any audience user can view the browsing history of the shared room, which improves the user experience.
Based on the above interface display method embodiments, an embodiment of the invention further provides a shared video management system. Referring to fig. 8a, a network topology diagram of the shared video management system according to an embodiment of the present invention is provided. The shared video management system shown in fig. 8a may include a terminal 801, a server 802, and a third-party cloud service.
In one embodiment, the terminal 801 may be used by any audience user in the shared room, and the main functions of the terminal 801 are: 1) Interface UI presentation: for example, presenting the various interfaces and windows shown in the embodiments of fig. 2 and fig. 5, such as the playing interface of the shared video, the interaction window, and the interaction options (each interaction option may correspond to a scene); 2) Camera data acquisition and connected-video display: in short, if the audience user corresponding to the terminal sets the camera control to the on state in the playing interface of the shared video, the terminal collects the real-time video picture of that audience user through the camera and displays the real-time video pictures of the other audience users; 3) Initiating voice input recognition requests: that is, if the viewer user corresponding to the terminal triggers any one of the at least one interaction option in the interaction window by voice, the terminal may request the server 802 to recognize the input voice, so as to determine which interaction option is selected from the at least one interaction option; 4) Receiving the data returned by the server 802 and displaying it to the user.
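Function 3) can be sketched as follows in TypeScript; the endpoint, payload fields, and response shape are assumptions made for illustration only.

```typescript
// The terminal uploads the microphone audio recorded while the interaction window is shown
// and asks the server which of the displayed interaction options was spoken.
async function requestVoiceOptionSelection(
  audio: Blob,
  displayedOptions: string[],
  recognitionUrl: string,
): Promise<string | null> {
  const form = new FormData();
  form.append("audio", audio);
  form.append("options", JSON.stringify(displayedOptions));

  const response = await fetch(recognitionUrl, { method: "POST", body: form });
  if (!response.ok) return null;

  const result: { selectedOption: string | null } = await response.json();
  return result.selectedOption; // the interaction option matched by the server, if any
}
```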
In one embodiment, the server 802 may include three major components: a data processing cluster, a voice processing cluster, and a video stream processing cluster. The data processing cluster is used to receive data from the terminal, preprocess the received data, pass the audio data that requires voice recognition to the voice processing cluster, and pass the video stream data to the video stream processing cluster. After processing by the recognition algorithm, the voice processing cluster returns the recognition result to the data processing cluster; the video stream processing cluster processes the audio and video data of the multiple audience users in the shared room (these audience users may be called connected users) and returns the processed data to the data processing cluster. Finally, the data processing cluster returns the voice recognition result and the video-stream-related data to the terminal 801.
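A simplified sketch of this dispatch is shown below; the cluster interfaces are assumptions standing in for the voice processing and video stream processing components.

```typescript
interface VoiceCluster {
  recognize(audio: Uint8Array): Promise<string>;    // speech recognition
}

interface VideoStreamCluster {
  mix(streams: Uint8Array[]): Promise<Uint8Array>;  // processes connected users' streams
}

interface TerminalUpload {
  audio?: Uint8Array;
  viewerStreams?: Uint8Array[];
}

// The data processing cluster preprocesses the upload, routes each part to the right cluster,
// and returns both results to the terminal.
async function handleTerminalData(
  upload: TerminalUpload,
  voice: VoiceCluster,
  video: VideoStreamCluster,
): Promise<{ recognizedText?: string; mixedStream?: Uint8Array }> {
  const result: { recognizedText?: string; mixedStream?: Uint8Array } = {};
  if (upload.audio) {
    result.recognizedText = await voice.recognize(upload.audio);
  }
  if (upload.viewerStreams && upload.viewerStreams.length > 0) {
    result.mixedStream = await video.mix(upload.viewerStreams);
  }
  return result;
}
```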
In one embodiment, the third-party cloud service may provide a powerful natural language recognition service and an audio/video streaming call service for the server 802.
Based on the above description, a module architecture diagram corresponding to the shared video management system in the embodiment of the present invention may be obtained. Referring to fig. 8b, a module architecture diagram of a shared video management system according to an embodiment of the present invention is provided. The shared video management system shown in fig. 8b may include a presentation layer 81, a logic layer 82, and a service layer 83.
In one embodiment, the presentation layer 81 runs on the terminal and is mainly responsible for the UI presentation of shared video playback and the connected audio-video call, the UI presentation of the history browsing footprint (i.e., the history browsing map described above), the UI presentation of the group photo image, and the interaction with any viewer user. According to the operation of any viewer user in the playing interface, the corresponding instruction is passed to the logic layer 82, and the result returned by the logic layer 82 is updated to the UI for presentation.
In one embodiment, the logic layer 82 is primarily responsible for business logic and non-presentation logic processing, such as network requests, data persistence, shared video playback control, managing the audience users in the shared room, voice recognition input control, group photo image generation, and the like. For example, any audience user can select any interaction option by voice; at this time, the voice recognition processing logic is triggered, the audio data collected by the microphone is sent to the server 802 for voice recognition processing, and the scene corresponding to the matched interaction option is selected according to the data returned by the server 802. During this time, the logic layer 82 synchronizes the selected interaction option to the server 802 for recording, so that a browsing footprint map can be generated from the server 802's records. If the selected interaction option is the group photo option with the target object, the terminal 801 needs to acquire the video stream images of all audience users in the shared room at a specific moment and the image data in the shared video, and then synthesizes them into a group photo image through an API and stores the group photo image in a local file.
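The group photo synthesis step can be sketched with the Canvas API as follows; the layout (shared-video frame as background, viewer frames tiled along the bottom) is an assumption for illustration, not the layout mandated by the original.

```typescript
function composeGroupPhoto(
  sharedVideoFrame: CanvasImageSource,   // play picture containing the target object
  viewerFrames: CanvasImageSource[],     // one frame per audience user at group-photo time
  width = 1280,
  height = 720,
): HTMLCanvasElement {
  const canvas = document.createElement("canvas");
  canvas.width = width;
  canvas.height = height;
  const ctx = canvas.getContext("2d")!;

  // background: the shared-video frame (e.g. the panda) at the moment of the group photo
  ctx.drawImage(sharedVideoFrame, 0, 0, width, height);

  // foreground: viewer images tiled along the bottom edge
  const thumbWidth = width / Math.max(viewerFrames.length, 1);
  const thumbHeight = height / 4;
  viewerFrames.forEach((frame, i) => {
    ctx.drawImage(frame, i * thumbWidth, height - thumbHeight, thumbWidth, thumbHeight);
  });

  return canvas; // canvas.toDataURL() can then be saved to a local file
}
```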
In one embodiment, the service layer 83 primarily provides the audio-video call capability and the voice recognition capability. For example, the service layer 83 may provide data interaction with the terminal through a network interface service, such as performing voice recognition processing when the service layer 83 receives audio data input by voice and returning the recognition result to the terminal 801; for another example, when the service layer 83 receives the interaction option selection data synchronized by the terminal 801, the selection data is persisted to a database, and a browsing footprint map is generated and returned to the terminal 801 for display. As another example, the audio-video call capability provided by the service layer 83 may supply the terminal 801 with video image data of the viewer users, such as a video frame at a certain moment, and image data in the shared video being watched, such as an image of the target object at a certain moment, so that the terminal 801 can use these data to generate a group photo.
In the following, a specific description will be given of how the interface display method shown in fig. 2 and fig. 5 is implemented in the shared video management system shown in fig. 8a and fig. 8 b. FIG. 8c is an interaction diagram provided by an embodiment of the present invention, showing interactions between presentation layer 81, logical layer 82, and service layer 83; fig. 8d is a schematic flow chart of a method for implementing interface display in a shared video management system according to an embodiment of the present invention. It should be appreciated that the terminals described in fig. 8c and 8d are examples of master state user terminals. In the specific implementation:
(1) The terminal enters the shared room, that is, the creator of the shared room (hereinafter referred to as the master state user) enters the shared room, and initiates an invitation through the presentation layer 81, inviting friends to enter the shared room to watch the shared video together;
(2) The logic layer 82 processes the instructions for video management and connection management among audience users, sends a request for establishing network communication to the service layer 83, and sends data to the service layer 83;
(3) The service layer 83 calls the audio and video call service of the third party according to the received data of the logic layer 82, manages the shared room data, and returns the processed data to the terminal 801;
(4) The logic layer 82 processes the received audio-video and shared room data, and then sends the processed data to the presentation layer 81 to display the corresponding data UI;
(5) When the presentation layer 81 enters an interaction, it notifies the logic layer 82 to start acquiring the microphone audio data of the terminal 801 and to upload the audio data to the service layer 83 to request a response;
(6) The service layer 83 calls a third party server to perform voice recognition processing after receiving the audio data, and returns a voice recognition result to the terminal 801;
(7) The logic layer 82 matches the received voice recognition result against each interaction option, determines the selected interaction option, and then performs the processing corresponding to that interaction option. For example, if the selected interaction option is the group photo option with the target object, the logic layer 82 acquires, through the service layer 83, the image data of the audience users in the shared room at that moment and the image data of the shared video, synthesizes the group photo image through a local API, saves the group photo image, and then sends it to the presentation layer 81 for display.
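A minimal matching sketch is shown below; the normalization strategy (case-insensitive substring match) is an assumption, since the original does not specify how the recognized text is compared with the option labels.

```typescript
interface InteractionOption {
  label: string;     // e.g. "take a group photo with the panda"
  execute(): void;   // e.g. fetch viewer frames and synthesize the group photo image
}

function matchAndRun(recognizedText: string, options: InteractionOption[]): boolean {
  const spoken = recognizedText.trim().toLowerCase();
  for (const option of options) {
    if (spoken.includes(option.label.toLowerCase())) {
      option.execute();
      return true;   // the selected interaction option has been performed
    }
  }
  return false;      // no option matched; the utterance is ignored
}
```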
In the embodiment of the invention, the same shared video is watched together in real time through multi-user video connection, which meets the need for multi-user remote interaction and improves the social property and interactivity. Compared with text, pictures, and pure voice connection, multi-user video connection is better suited to online interaction among friends and gives a richer user experience. While watching the shared video, interaction options are provided; any online user can select the next browsing scene by directly speaking an interaction option, and the system automatically recognizes the voice and enters the next scene, which increases the interest and participation of users watching the video. In addition, the method can take group photos of the online users together with the animals, leaving souvenirs for the users and making online sightseeing more personable.
Based on the method embodiment, the embodiment of the invention also provides an interface display device. Referring to fig. 9, a schematic structural diagram of an interface display device according to an embodiment of the present invention is provided. The interface display device shown in fig. 9 may operate as follows:
a display unit 901, configured to display a playing interface of a shared video in a shared room of a first application program; the shared room includes at least one spectator user therein;
The display unit 901 is further configured to display at least one interaction option in the playing interface when the playing of the shared video meets an interaction condition;
and an output unit 902, configured to output an interaction response corresponding to any one of the at least one interaction option when the interaction option is triggered by any one of the at least one viewer user.
In one embodiment, the interaction conditions include any one or more of the following: playing progress conditions and playing picture conditions; if the interaction condition comprises a playing progress condition, the playing of the shared video meets the interaction condition, namely that the played progress in the shared video is equal to the target playing progress indicated by the playing progress condition; if the interaction condition includes a play picture condition, the play of the shared video satisfying the interaction condition means that the play picture in the shared video at the current moment is a target picture indicated by the play picture condition.
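As an illustrative sketch (types are assumptions, not part of the original disclosure), the two kinds of interaction condition can be checked as follows:

```typescript
type InteractionCondition =
  | { kind: "progress"; targetSeconds: number }  // playing progress condition
  | { kind: "picture"; targetFrameId: string };  // playing picture condition

interface PlaybackState {
  progressSeconds: number;  // played progress of the shared video
  currentFrameId: string;   // identifier of the play picture at the current moment
}

function meetsInteractionCondition(
  state: PlaybackState,
  conditions: InteractionCondition[],
): boolean {
  return conditions.some((condition) =>
    condition.kind === "progress"
      ? Math.floor(state.progressSeconds) === condition.targetSeconds
      : state.currentFrameId === condition.targetFrameId,
  );
}
```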
In one embodiment, the at least one interactive option is displayed in an interactive window of the playing interface, and the shared video includes a target object; the at least one interaction option comprises each interaction option in at least one interaction option to be displayed, wherein the at least one interaction option to be displayed refers to a preset interaction option related to the shared video; or the at least one interaction option comprises interaction options which are screened from the at least one interaction option to be displayed and matched with the interaction conditions;
The at least one interactive option to be displayed includes any one or more of: options for interacting with the target object and viewing other shared video options; the option of interacting with the target object includes any one or more of a group photo option with the target object and an option of inputting a name voice containing the target object.
In one embodiment, if the any one of the interactive options is a group photo option with the target object, the interactive response is a group photo image;
the output unit 902 performs the following steps when any one of the at least one interactive option is triggered by any one of the at least one viewer user, and outputs an interactive response corresponding to the any one interactive option: when the group photo options with the target object are triggered, group photo prompt animation is played on the playing interface;
displaying a group photo completion window after the prompt animation is played, wherein the group photo completion window comprises a group photo image; the group photo image is generated from the target image and an audience user image of at least one audience user; the target image refers to a preset playing picture in the shared video, the preset playing picture comprises the target object, or the target image refers to the playing picture in the shared video when any interaction option is triggered; the audience user image refers to an image collected by an audience user terminal in the process of playing the group photo prompt animation.
In one embodiment, if the any one of the interactive options refers to watching other shared video options, the interactive response is a playing interface of the other shared video;
the output unit 902 performs the following steps when any one of the at least one interactive option is triggered by any one of the at least one viewer user, and outputs an interactive response corresponding to the any one interactive option: when the option of watching another shared video is triggered, switching from the playing interface of the currently displayed shared video to the playing interface of the other shared video in the shared room of the first application program.
In one embodiment, the interactive window includes prompt information for triggering interactive options by voice; any interaction option is triggered by any one or more of the following triggering modes: the touch control mode and the voice mode refer to inputting voice comprising any interaction option.
In one embodiment, the at least one audience user included in the shared room includes a host user and a friend user, the host user is a creator of the shared room, the friend user is a contact user of the host user in the second application program, the playing interface is an interface for playing the shared video displayed in a target terminal, and the target terminal is any one of the host user terminal and the friend user terminal.
In one embodiment, the display unit 901 is further configured to display a photo special effect selection interface, where the photo special effect selection interface includes an image preview area and a special effect selection area, and the special effect selection area includes a plurality of special effect identifiers and determination controls; when any special effect identifier of the plurality of special effect identifiers is selected and the determination control is not triggered, displaying a target preview image in the image preview area, wherein the target preview image is generated based on the user image acquired at the current moment and the special effect indicated by the selected special effect identifier;
the interface display device further includes a processing unit 903, where the processing unit 903 is configured to determine that the selection of the shooting special effect is completed when any special effect identifier of the plurality of special effect identifiers is selected and the determination control is triggered.
In one embodiment, the photo special effect selection interface is displayed when a photo setting trigger event exists, the photo setting trigger event including: an opening control for opening playing the shared video, which is included in a room setting interface in the host state user terminal, is triggered; or a photographing setting option in the playing interface is triggered, and the room setting interface is displayed when the master state user enters the shared room.
In one embodiment, the playing interface includes a camera control, identification information of a target audience user, and identification information of remaining audience users, wherein the target audience user is an audience user using the target terminal in at least one audience user of the shared room, and the remaining audience user is another audience user except the target audience user in at least one audience user of the shared room;
the display unit 901 is further configured to: if the camera control in the playing interface is in an on state, the identification information of the target audience user is a video picture of the target user; if the camera control in the playing interface is in a closed state, the identification information of the target audience user is the user identification of the target user in the second application program;
if the camera control in the playing interface for playing the shared video, which is displayed by the remaining audience user terminal, is in an on state, the identification information of the remaining audience user is the video picture of the remaining audience user; and if the camera control in the playing interface for playing the shared video, which is displayed by the remaining audience user terminal, is in a closed state, the identification information of the remaining audience user is the user identification of the remaining audience user in the second application program.
In one embodiment, the interface display device further includes a sending unit 904, where the friend user of the at least one audience user joins the shared room by inviting the master user; the target terminal is a master state user terminal;
the display unit 901 is further configured to display a friend invitation window when the friend invitation option is triggered, where the friend invitation window includes an application identifier of the second application;
the display unit 901 is further configured to, when the application identifier is selected, display a friend state user selection window, where the friend state user selection window includes user identifiers of multiple friend state users;
the sending unit 904 is configured to, when a target user identifier is selected from the user identifiers of the plurality of friend state users, send invitation information to the friend state user indicated by the target user identifier, where the invitation information is used to instruct the friend state user indicated by the target user identifier to join the shared room.
In one embodiment, the display mode of the friend inviting option is any one or more of the following: the friend inviting option is displayed in the playing interface when the number of friend state users included in the shared room is smaller than a number threshold; and the friend inviting option is displayed in the room setting interface of the shared room, where the room setting interface is displayed after the master state user enters the shared room.
In one embodiment, the play interface includes a view option for historical browsing, and when the view option is triggered, a historical browsing window is displayed, the historical browsing window including a historical browsing map generated from selected interactive options in the shared room at a historical time.
According to one embodiment of the present invention, the steps involved in the interface display methods shown in fig. 2 and fig. 5 may be performed by the respective units in the interface display device shown in fig. 9. For example, steps S201 to S202 described in fig. 2 may be performed by the display unit 901 in the interface display device shown in fig. 9, and step S203 may be performed by the output unit 902 in the interface display device shown in fig. 9; for another example, in the interface display method shown in fig. 5, steps S501 to S502 may be performed by the display unit 901 in the interface display device shown in fig. 9, step S503 may be performed by the output unit 902 in the interface display device shown in fig. 9, and step S504 may be performed by the display unit 901 in the interface display device shown in fig. 9.
According to another embodiment of the present invention, the units in the interface display device shown in fig. 9 may be separately or completely combined into one or several other units, or one or more of them may be further split into a plurality of functionally smaller units, which can achieve the same operation without affecting the realization of the technical effects of the embodiments of the present invention. The above units are divided based on logical functions; in practical applications, the function of one unit may be implemented by a plurality of units, or the functions of a plurality of units may be implemented by one unit. In other embodiments of the present invention, the interface display device may also include other units, and in practical applications these functions may also be implemented with the assistance of other units and through the cooperation of a plurality of units.
According to another embodiment of the present invention, an interface display apparatus as shown in fig. 9 may be constructed by running a computer program (including program code) capable of executing the steps involved in the respective methods as shown in fig. 2 and 5 on a general-purpose computing device such as a computer including a processing element such as a Central Processing Unit (CPU), a random access storage medium (RAM), a read only storage medium (ROM), and the like, and a storage element, and implementing the interface display method of the embodiment of the present invention. The computer program may be recorded on, for example, a computer readable storage medium, and loaded into and executed by the computing device described above.
In the embodiment of the invention, the playing interface of the shared video in the shared room of the first application program is displayed, and the shared room includes at least one audience user; that is, a plurality of audience users in the shared room watch the same shared video, so that synchronous video watching by multiple people is realized. If the playing of the shared video meets the interaction condition, one or more interaction options are displayed in the playing interface. Further, when any audience user triggers any interaction option, the interaction response corresponding to that interaction option is output, so that interaction between the user and the video or the objects in the video is realized; since the interaction option can be selected by any user, the social property and the interactivity are improved.
Based on the method and the device embodiments, the embodiment of the invention provides a terminal. Referring to fig. 10, a schematic structural diagram of a terminal according to an embodiment of the present invention is provided. The terminal shown in fig. 10 includes at least a processor 1001, an input interface 1002, an output interface 1003, and a computer storage medium 1004. Wherein the processor 1001, input interface 1002, output interface 1003, and computer storage medium 1004 may be connected by a bus or other means.
The computer storage medium 1004 may be stored in a memory of the terminal; the computer storage medium 1004 is used for storing a computer program, the computer program includes program instructions, and the processor 1001 is used for executing the program instructions stored in the computer storage medium 1004. The processor 1001 (or CPU (Central Processing Unit)) is the computing core and control core of the terminal, adapted to implement one or more instructions, in particular to load and execute:
displaying a playing interface of the shared video in the shared room of the first application program; the shared room includes at least one spectator user therein; when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface; outputting an interactive response corresponding to any one of the at least one interactive option when the interactive option is triggered by any one of the at least one viewer-user.
The embodiment of the invention also provides a computer storage medium (Memory), which is a Memory device in the terminal and is used for storing programs and data. It will be appreciated that the computer storage media herein may include both built-in storage media in the terminal and extended storage media supported by the terminal. The computer storage medium provides a storage space that stores an operating system of the terminal. Also stored in this memory space are one or more instructions, which may be one or more computer programs (including program code), adapted to be loaded and executed by the processor 1001. The computer storage medium herein may be a high-speed RAM memory or a non-volatile memory (non-volatile memory), such as at least one magnetic disk memory; optionally, at least one computer storage medium remote from the processor may be present.
In one embodiment, the computer storage media may be loaded by the processor 1001 and execute one or more instructions stored in the computer storage media to implement the corresponding steps described above with respect to the interface display methods shown in fig. 2 and 5. In particular implementations, one or more instructions in a computer storage medium are loaded by the processor 1001 and perform the steps of:
Displaying a playing interface of the shared video in the shared room of the first application program; the shared room includes at least one spectator user therein; when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface; outputting an interactive response corresponding to any one of the at least one interactive option when the interactive option is triggered by any one of the at least one viewer-user.
In one embodiment, the interaction conditions include any one or more of the following: playing progress conditions and playing picture conditions; if the interaction condition comprises a playing progress condition, the playing of the shared video meets the interaction condition, namely that the played progress in the shared video is equal to the target playing progress indicated by the playing progress condition; if the interaction condition includes a play picture condition, the play of the shared video satisfying the interaction condition means that the play picture in the shared video at the current moment is a target picture indicated by the play picture condition.
In one embodiment, the at least one interactive option is displayed in an interactive window of the playing interface, and the shared video includes a target object;
The at least one interaction option comprises each interaction option in at least one interaction option to be displayed, wherein the at least one interaction option to be displayed refers to a preset interaction option related to the shared video; or, the at least one interaction option comprises an interaction option matched with the interaction condition and screened from the at least one interaction option to be displayed; the at least one interactive option to be displayed includes any one or more of: options for interacting with the target object and viewing other shared video options; the option of interacting with the target object includes any one or more of a group photo option with the target object and an option of inputting a name voice containing the target object.
In one embodiment, if the any one of the interactive options is a group photo option with the target object, the interactive response is a group photo image; the processor 1001 performs the following steps when outputting an interactive response corresponding to any one of the at least one interactive option when the interactive option is triggered by any one of the at least one viewer user:
When the group photo options with the target object are triggered, group photo prompt animation is played on the playing interface; displaying a group photo completion window after the prompt animation is played, wherein the group photo completion window comprises a group photo image; the group photo image is generated from the target image and an audience user image of at least one audience user; the target image refers to a preset playing picture in the shared video, the preset playing picture comprises the target object, or the target image refers to the playing picture in the shared video when any interaction option is triggered; the audience user image refers to an image collected by an audience user terminal in the process of playing the group photo prompt animation.
In one embodiment, if the any one of the interactive options refers to watching other shared video options, the interactive response is a playing interface of the other shared video;
the processor 1001 performs the following steps when outputting an interactive response corresponding to any one of the at least one interactive option when the interactive option is triggered by any one of the at least one viewer user: and when the option of watching other shared videos is triggered, switching to a playing interface of the other shared videos in the shared room of the first application program by the playing interface of the displayed shared videos.
In one embodiment, the interactive window includes prompt information for triggering interactive options by voice; any interaction option is triggered by any one or more of the following triggering modes: the touch control mode and the voice mode refer to inputting voice comprising any interaction option.
In one embodiment, the at least one audience user included in the shared room includes a host user and a friend user, the host user is a creator of the shared room, the friend user is a contact user of the host user in the second application program, the playing interface is an interface for playing the shared video displayed in a target terminal, and the target terminal is any one of the host user terminal and the friend user terminal.
In one embodiment, the processor 1001 is further configured to perform:
displaying a shooting special effect selection interface, wherein the shooting special effect selection interface comprises an image preview area and a special effect selection area, and the special effect selection area comprises a plurality of special effect identifications and determination controls; when any special effect identifier of the plurality of special effect identifiers is selected and the determination control is not triggered, displaying a target preview image in the image preview area, wherein the target preview image is generated based on the user image acquired at the current moment and the special effect indicated by the selected special effect identifier; and when any special effect identifier in the plurality of special effect identifiers is selected and the determination control is triggered, determining that the shooting special effect selection is completed.
In one embodiment, the photo special effect selection interface is displayed when a photo setting trigger event exists, the photo setting trigger event including: an opening control for opening playing the shared video, which is included in a room setting interface in the host state user terminal, is triggered; or a photographing setting option in the playing interface is triggered, and the room setting interface is displayed when the master state user enters the shared room.
In one embodiment, the playing interface includes a camera control, identification information of a target audience user, and identification information of remaining audience users, wherein the target audience user is an audience user using the target terminal in at least one audience user of the shared room, and the remaining audience user is another audience user except the target audience user in at least one audience user of the shared room;
if the camera control in the playing interface is in an on state, the identification information of the target audience user is a video picture of the target user; if the camera control in the playing interface is in a closed state, the identification information of the target audience user is the user identification of the target user in the second application program;
If a camera control in a playing interface for playing the shared video, which is displayed by the residual audience user terminal, is in an on state, the identification information of the residual audience user is a video picture of the residual audience user; and if the camera control in the playing interface for playing the shared video, which is displayed by the residual audience user terminal, is in a closed state, the identification information of the residual audience user is the user identification of the residual audience user in the second application program.
In one embodiment, the friend user of the at least one viewer user joins the shared room by way of the master user invitation; the target terminal refers to a master state user terminal, and the processor 1001 is further configured to perform:
when the friend inviting option is triggered, displaying a friend inviting window, where the friend inviting window includes an application program identifier of the second application program; when the application program identifier is selected, displaying a friend state user selection window, where the friend state user selection window includes user identifiers of a plurality of friend state users; and when a target user identifier is selected from the user identifiers of the plurality of friend state users, sending invitation information to the friend state user indicated by the target user identifier, where the invitation information is used to instruct the friend state user indicated by the target user identifier to join the shared room.
In one embodiment, the friend invitation option is displayed in any one or more of the following ways: when the number of friend users included in the shared room is smaller than a number threshold, the friend invitation option is displayed in the playing interface; and the friend invitation option is displayed in a room setting interface of the shared room, the room setting interface being displayed after the host user enters the shared room.
In one embodiment, the playing interface includes a view option for historical browsing, and when the view option is triggered, a historical browsing window is displayed, the historical browsing window including a historical browsing map generated from the interaction options selected in the shared room at historical moments.
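A minimal Kotlin sketch of how such a historical browsing map could be assembled from previously selected interaction options is given below; the record structure is an assumption made for illustration.

```kotlin
// Hypothetical sketch: build the content of a historical browsing window from the
// interaction options selected in the shared room at historical moments.
import java.time.Instant

data class InteractionRecord(
    val optionName: String,  // e.g. "group photo with the target object"
    val selectedBy: String,  // audience user who triggered the option
    val selectedAt: Instant  // historical moment of selection
)

// The "historical browsing map" is modelled here simply as records grouped per option,
// ordered by time, which is enough to render a browsing history view.
fun buildHistoricalBrowsingMap(records: List<InteractionRecord>): Map<String, List<InteractionRecord>> =
    records.sortedBy { it.selectedAt }.groupBy { it.optionName }
```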
In the embodiment of the invention, the playing interface of the shared video in the shared room of the first application program is displayed, and the shared room includes at least one audience user; that is, a plurality of audience users in the shared room watch the same shared video, so that synchronous viewing of the video by a plurality of users is achieved. If the playing of the shared video satisfies the interaction condition, one or more interaction options are displayed in the playing interface. Further, in response to any audience user triggering any interaction option, the interaction response corresponding to that interaction option is output, so that interaction between the user and the video, or an object in the video, is achieved; and since the interaction option can be selected by any user, the social and interactive properties are improved.
According to one aspect of the present application, embodiments of the present invention also provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. The processor 1001 reads the computer instructions from the computer-readable storage medium, and the processor 1001 executes the computer instructions to cause the interface display device to execute the interface display method shown in fig. 2 and fig. 5, specifically: displaying a playing interface of the shared video in the shared room of the first application program, the shared room including at least one audience user; when the playing of the shared video satisfies the interaction condition, displaying at least one interaction option in the playing interface; and when any one of the at least one interaction option is triggered by any one of the at least one audience user, outputting an interaction response corresponding to that interaction option.
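Purely as an illustrative sketch of the method recited above, the following Kotlin fragment models the playing-progress condition, the display of interaction options, and the output of an interaction response; the simplified types and the equality check on progress are assumptions rather than the actual product code.

```kotlin
// Hypothetical sketch of the overall interface display method:
// show the playing interface, display interaction options when the interaction
// condition is met, and output the response when any audience user triggers an option.
data class InteractionOption(val name: String, val response: String)

class SharedRoomPlayer(
    private val targetProgressSeconds: Int,                // playing progress condition
    private val optionsToDisplay: List<InteractionOption>  // preset options related to the shared video
) {
    private var displayedOptions: List<InteractionOption> = emptyList()

    // Called as playback advances; when the progress equals the target progress,
    // the matching interaction options are displayed in the playing interface.
    fun onPlaybackProgress(progressSeconds: Int) {
        if (progressSeconds == targetProgressSeconds) {
            displayedOptions = optionsToDisplay
            println("Displaying options: ${displayedOptions.map { it.name }}")
        }
    }

    // Any audience user may trigger any displayed option; the corresponding
    // interaction response is then output on that user's terminal.
    fun onOptionTriggered(user: String, optionName: String): String? =
        displayedOptions.firstOrNull { it.name == optionName }
            ?.let { "[$user] ${it.response}" }
}
```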

Claims (15)

1. An interface display method, comprising:
displaying a playing interface of a shared video in a shared room of a first application program; the shared room comprises at least one audience user, and the shared video comprises a target object;
when the playing of the shared video satisfies an interaction condition, screening at least one interaction option matching the interaction condition from at least one preset interaction option to be displayed that is related to the shared video, wherein the at least one interaction option to be displayed comprises any one or more of the following: an option for interacting with the target object and an option for watching other shared videos, wherein the option for interacting with the target object comprises a group photo option with the target object; the interaction condition comprises a playing progress condition, and the playing of the shared video satisfying the interaction condition comprises that the playing progress of the shared video is equal to a target playing progress indicated by the playing progress condition;
displaying the at least one interaction option in the playing interface;
and when any one interaction option of the at least one interaction option is triggered by any one audience user of the at least one audience user, the terminal of the any one audience user outputs an interaction response corresponding to the any one interaction option, wherein:
if the any one interaction option refers to the option for watching other shared videos, the interaction response is a playing interface of the other shared videos, and when the option for watching other shared videos is triggered, the terminal of the any one audience user is switched from the playing interface of the shared video to the playing interface of the other shared videos in the shared room of the first application program;
if the any one interaction option refers to the group photo option with the target object, the interaction response is a group photo image, the group photo image is generated according to a target image and an audience user image of the at least one audience user, the target image is a preset playing picture in the shared video, the preset playing picture comprises the target object, or the target image is the playing picture in the shared video at the moment when the group photo option with the target object is triggered, and the audience user image is an image acquired through the audience user terminal of each of the at least one audience user.
2. The method of claim 1, wherein the interaction condition further comprises a playing picture condition;
if the interaction condition comprises the playing picture condition, the playing of the shared video satisfying the interaction condition means that the playing picture in the shared video at the current moment is a target picture indicated by the playing picture condition.
3. The method of claim 1, wherein the at least one interaction option is displayed in an interaction window of the playing interface; the option for interacting with the target object further comprises an option for inputting a voice containing the name of the target object; and the method further comprises:
when the playing of the shared video satisfies the interaction condition, displaying each of the interaction options to be displayed in the interaction window.
4. The method of claim 1, wherein when the group photo option with the target object is triggered, a group photo prompt animation is played in the playing interface, and the audience user image is an image acquired through the audience user terminal during the playing of the group photo prompt animation;
and after the group photo prompt animation is played, a group photo completion window is displayed, wherein the group photo completion window comprises the group photo image.
5. The method of claim 3, wherein the interaction window comprises a prompt for triggering an interaction option by voice; and the any one interaction option is triggered in any one or more of the following triggering modes: a touch mode and a voice mode, wherein the voice mode refers to inputting a voice comprising the any one interaction option.
6. The method of claim 1, wherein the at least one audience user included in the shared room comprises a host user and a friend user, the host user being a creator of the shared room, the friend user being a contact user of the host user in a second application program, the playing interface being an interface for playing the shared video displayed in a target terminal, and the target terminal being any one of the host user terminal and the friend user terminal.
7. The method of claim 6, wherein the method further comprises:
displaying a shooting special effect selection interface, wherein the shooting special effect selection interface comprises an image preview area and a special effect selection area, and the special effect selection area comprises a plurality of special effect identifiers and a determination control;
when any one of the plurality of special effect identifiers is selected and the determination control is not triggered, displaying a target preview image in the image preview area, wherein the target preview image is generated based on the user image acquired at the current moment and the special effect indicated by the selected special effect identifier;
and when any one of the plurality of special effect identifiers is selected and the determination control is triggered, determining that the shooting special effect selection is completed.
8. The method of claim 7, wherein the shooting special effect selection interface is displayed when a shooting setting trigger event exists, the shooting setting trigger event comprising: an opening control, included in a room setting interface in the host user terminal, for starting playing of the shared video is triggered; or a shooting setting option in the playing interface is triggered, wherein the room setting interface is displayed when the host user enters the shared room.
9. The method of claim 6, wherein the playing interface comprises a camera control, identification information of a target audience user, and identification information of remaining audience users, the target audience user being the audience user, among the at least one audience user of the shared room, who uses the target terminal, and the remaining audience users being the audience users, among the at least one audience user of the shared room, other than the target audience user;
if the camera control in the playing interface is in an on state, the identification information of the target audience user is a video picture of the target audience user; if the camera control in the playing interface is in an off state, the identification information of the target audience user is the user identification of the target audience user in the second application program;
if a camera control in a playing interface for playing the shared video displayed by a remaining audience user terminal is in an on state, the identification information of the remaining audience user is a video picture of the remaining audience user; and if the camera control in the playing interface for playing the shared video displayed by the remaining audience user terminal is in an off state, the identification information of the remaining audience user is the user identification of the remaining audience user in the second application program.
10. The method of claim 6, wherein the friend user among the at least one audience user joins the shared room by way of an invitation from the host user; the target terminal is the host user terminal, and the method further comprises:
when a friend invitation option is triggered, displaying a friend invitation window, wherein the friend invitation window comprises an application program identifier of the second application program;
when the application program identifier is selected, displaying a friend user selection window, wherein the friend user selection window comprises user identifiers of a plurality of friend users;
and when a target user identifier among the user identifiers of the plurality of friend users is selected, sending invitation information to the friend user indicated by the target user identifier, wherein the invitation information is used for instructing the friend user indicated by the target user identifier to join the shared room.
11. The method of claim 10, wherein the friend invitation option is displayed in any one or more of the following ways: when the number of friend users included in the shared room is smaller than a number threshold, the friend invitation option is displayed in the playing interface; and the friend invitation option is displayed in a room setting interface of the shared room, the room setting interface being displayed after the host user enters the shared room.
12. The method of claim 1, wherein the playing interface comprises a view option for historical browsing, and when the view option is triggered, a historical browsing window is displayed, the historical browsing window comprising a historical browsing map generated from the interaction options selected in the shared room at historical moments.
13. An interface display device, comprising:
a display unit, configured to display a playing interface of a shared video in a shared room of a first application program, wherein the shared room comprises at least one audience user, and the shared video comprises a target object;
the display unit is further configured to: when the playing of the shared video satisfies an interaction condition, screen at least one interaction option matching the interaction condition from at least one preset interaction option to be displayed that is related to the shared video, and display the at least one interaction option in the playing interface, wherein the at least one interaction option to be displayed comprises any one or more of the following: an option for interacting with the target object and an option for watching other shared videos, wherein the option for interacting with the target object comprises a group photo option with the target object; the interaction condition comprises a playing progress condition, and the playing of the shared video satisfying the interaction condition comprises that the playing progress of the shared video is equal to a target playing progress indicated by the playing progress condition;
an output unit, configured to output, when any one interaction option of the at least one interaction option is triggered by any one audience user of the at least one audience user, an interaction response corresponding to the any one interaction option by the terminal of the any one audience user, wherein if the any one interaction option refers to the option for watching other shared videos, the interaction response is a playing interface of the other shared videos; if the any one interaction option refers to the group photo option with the target object, the interaction response is a group photo image, the group photo image is generated according to a target image and an audience user image of the at least one audience user, the target image is a preset playing picture in the shared video, the preset playing picture comprises the target object, or the target image is the playing picture in the shared video at the moment when the group photo option with the target object is triggered, and the audience user image is an image acquired through the audience user terminal of each of the at least one audience user;
the output unit is specifically configured to: when the option for watching other shared videos is triggered, switch the terminal of the any one audience user from the playing interface of the shared video to the playing interface of the other shared videos in the shared room of the first application program.
14. A terminal, comprising:
a processor adapted to implement one or more instructions, and
a computer storage medium storing one or more instructions adapted to be loaded by the processor and to perform the interface display method of any one of claims 1-12.
15. A computer storage medium having stored therein computer program instructions for performing the interface display method of any of claims 1-12 when executed by a processor.
CN202011184663.5A 2020-10-29 2020-10-29 Interface display method, device, equipment and storage medium Active CN114430494B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011184663.5A CN114430494B (en) 2020-10-29 2020-10-29 Interface display method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011184663.5A CN114430494B (en) 2020-10-29 2020-10-29 Interface display method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114430494A CN114430494A (en) 2022-05-03
CN114430494B true CN114430494B (en) 2024-04-09

Family

ID=81308879

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011184663.5A Active CN114430494B (en) 2020-10-29 2020-10-29 Interface display method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114430494B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115225602A (en) * 2022-06-29 2022-10-21 赤子城网络技术(北京)有限公司 Social application processing method and system

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104822090A (en) * 2014-04-25 2015-08-05 腾讯科技(北京)有限公司 Video playing method, device and system
CN106411687A (en) * 2015-07-31 2017-02-15 腾讯科技(深圳)有限公司 Method and apparatus for interaction between network access device and bound user
CN105608715A (en) * 2015-12-17 2016-05-25 广州华多网络科技有限公司 Online group shot method and system
CN105959207A (en) * 2016-05-17 2016-09-21 广州酷狗计算机科技有限公司 Audio and video sharing method and device
CN106210757A (en) * 2016-07-28 2016-12-07 北京小米移动软件有限公司 Live broadcasting method, live broadcast device and live broadcast system
CN106534953A (en) * 2016-12-09 2017-03-22 北京小米移动软件有限公司 Video rebroadcasting method for live streaming application and control terminal
CN106533924A (en) * 2016-12-19 2017-03-22 广州华多网络科技有限公司 Instant messaging method and device
CN107333167A (en) * 2017-05-22 2017-11-07 武汉斗鱼网络科技有限公司 A kind of processing method, device and the electronic equipment of video-see record
CN107465937A (en) * 2017-06-30 2017-12-12 武汉斗鱼网络科技有限公司 A kind of processing method, device and the electronic equipment of video-see record
CN108111918A (en) * 2017-12-08 2018-06-01 深圳岚锋创视网络科技有限公司 Interactive approach, device and live streaming client during a kind of panoramic video live streaming
CN110166799A (en) * 2018-07-02 2019-08-23 腾讯科技(深圳)有限公司 Living broadcast interactive method, apparatus and storage medium
CN109688480A (en) * 2019-01-14 2019-04-26 广州虎牙信息科技有限公司 A kind of live broadcasting method, terminal device and storage medium
CN111314773A (en) * 2020-01-22 2020-06-19 广州虎牙科技有限公司 Screen recording method and device, electronic equipment and computer readable storage medium
CN111385632A (en) * 2020-03-06 2020-07-07 腾讯科技(深圳)有限公司 Multimedia interaction method
CN111698566A (en) * 2020-06-04 2020-09-22 北京奇艺世纪科技有限公司 Video playing method and device, electronic equipment and storage medium
CN111654730A (en) * 2020-06-05 2020-09-11 腾讯科技(深圳)有限公司 Video playing method, data processing method, related device and medium
CN111836114A (en) * 2020-07-08 2020-10-27 北京达佳互联信息技术有限公司 Video interaction method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN114430494A (en) 2022-05-03

Similar Documents

Publication Publication Date Title
US10491859B2 (en) Communication event
RU2527199C2 (en) Avatar integrated shared media selection
CN106791893B (en) Video live broadcasting method and device
CN113395533B (en) Virtual gift special effect display method and device, computer equipment and storage medium
WO2022087920A1 (en) Video playing method and apparatus, and terminal and storage medium
US20210281909A1 (en) Method and apparatus for sharing video, and storage medium
EP2887686A1 (en) Sharing content on devices with reduced user actions
CN109691054A (en) Animation user identifier
US10148911B2 (en) Communication event
CN109068081A (en) Video generation method, device, electronic equipment and storage medium
CN112905074B (en) Interactive interface display method, interactive interface generation method and device and electronic equipment
CN112261481B (en) Interactive video creating method, device and equipment and readable storage medium
CN114025189B (en) Virtual object generation method, device, equipment and storage medium
CN111314730A (en) Virtual resource searching method, device, equipment and storage medium for live video
WO2022142944A1 (en) Live-streaming interaction method and apparatus
WO2023098011A1 (en) Video playing method and electronic device
CN114430494B (en) Interface display method, device, equipment and storage medium
CN109788327B (en) Multi-screen interaction method and device and electronic equipment
CN109819341B (en) Video playing method and device, computing equipment and storage medium
CN110446090A (en) A kind of virtual auditorium spectators bus connection method, system, device and storage medium
CN112188223B (en) Live video playing method, device, equipment and medium
CN112988315A (en) Method, system and readable storage medium for personalized viewing of shared desktop
CN112947819A (en) Message display method, device, storage medium and equipment for interactive narrative work
EP2629512A1 (en) Method and arrangement for generating and updating A composed video conversation
CN114760520A (en) Live small and medium video shooting interaction method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40070392; Country of ref document: HK)
GR01 Patent grant