CN114430494A - Interface display method, device, equipment and storage medium - Google Patents

Interface display method, device, equipment and storage medium Download PDF

Info

Publication number
CN114430494A
CN114430494A (application CN202011184663.5A; granted publication CN114430494B)
Authority
CN
China
Prior art keywords
user
shared
playing
option
interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011184663.5A
Other languages
Chinese (zh)
Other versions
CN114430494B (en)
Inventor
唐艾妮 (Tang Aini)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202011184663.5A priority Critical patent/CN114430494B/en
Publication of CN114430494A publication Critical patent/CN114430494A/en
Application granted granted Critical
Publication of CN114430494B publication Critical patent/CN114430494B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/443 OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N 21/4438 Window management, e.g. event handling following interaction with the user interface
    • H04N 21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/85 Assembly of content; Generation of multimedia applications
    • H04N 21/854 Content authoring
    • H04N 21/8545 Content authoring for generating interactive applications

Abstract

The embodiments of the invention disclose an interface display method, apparatus, terminal and storage medium. The method comprises: displaying a playing interface of a shared video in a shared room of a first application program, wherein the shared room comprises at least one audience user; when the playing of the shared video meets an interaction condition, displaying at least one interaction option on the playing interface; and when any one of the at least one interaction option is triggered by any one of the at least one audience user, outputting an interactive response corresponding to the triggered interaction option. The embodiments of the invention thereby enable multiple users to watch the same video together online and to interact with one another while the video plays.

Description

Interface display method, device, equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to an interface display method, apparatus, device, and storage medium.
Background
Zoos are among the most popular attractions for many visitors. Under the COVID-19 epidemic, however, people have reduced offline gatherings, and zoos have begun shifting to online live streaming: videos of each animal's activities are streamed to users, so that zoo visits can take place online. The current online-visit format, however, lacks interaction between users and the animals, so user engagement is low. Moreover, when several users tour together, there is no way to keep their viewing progress synchronized or to let them communicate in real time. How to provide a better online visiting experience has therefore become a topical research question.
Disclosure of Invention
The embodiments of the invention provide an interface display method, apparatus, device and storage medium that enable multiple users to watch the same video together online and to interact while the video is played.
In one aspect, an embodiment of the present invention provides an interface display method, including:
displaying a playing interface of a shared video in a shared room of a first application program; the shared room comprises at least one audience user;
when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface;
and when any interactive option of the at least one interactive option is triggered by any audience user of the at least one audience user, outputting an interactive response corresponding to the triggered interactive option.
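As an illustrative sketch only (the viewer names, option labels, and the 30-second trigger position below are invented for the example, not taken from the patent), the three claimed steps can be modeled in a few lines of Python:

```python
from dataclasses import dataclass, field

@dataclass
class SharedRoom:
    """Hypothetical model of the shared room described in the claims."""
    viewers: list                                    # at least one audience user
    interaction_options: list = field(default_factory=list)

def on_playback_progress(room, position_s, trigger_position_s=30.0):
    """Step 2: when playback meets the interaction condition
    (here: reaching an assumed preset position), show interaction options."""
    if position_s >= trigger_position_s and not room.interaction_options:
        room.interaction_options = ["group_photo", "play_related_video"]
    return room.interaction_options

def on_option_triggered(room, viewer, option):
    """Step 3: any audience user may trigger any displayed option;
    the client then outputs the matching interactive response."""
    if viewer in room.viewers and option in room.interaction_options:
        return f"response:{option}"
    return None

room = SharedRoom(viewers=["host", "friend1"])
on_playback_progress(room, 31.0)                     # condition met at 31 s
print(on_option_triggered(room, "friend1", "group_photo"))  # response:group_photo
```

The sketch shows only the control flow of the claim; the actual rendering of options and responses belongs to the UI layer.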
In one aspect, an embodiment of the present invention provides an interface display apparatus, including:
the display unit is used for displaying a playing interface of a shared video in a shared room of the first application program; the shared room comprises at least one audience user;
the display unit is further used for displaying at least one interaction option in the playing interface when the playing of the shared video meets the interaction condition;
and the output unit is configured to output an interactive response corresponding to the triggered interactive option when any interactive option of the at least one interactive option is triggered by any audience user of the at least one audience user.
In one aspect, an embodiment of the present invention provides a terminal, including:
a processor adapted to implement one or more instructions; and
a computer storage medium storing one or more instructions adapted to be loaded by the processor and to perform the steps of:
displaying a playing interface of a shared video in a shared room of a first application program; the shared room comprises at least one audience user;
when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface;
and when any interactive option of the at least one interactive option is triggered by any audience user of the at least one audience user, outputting an interactive response corresponding to the triggered interactive option.
In one aspect, an embodiment of the present invention provides a computer storage medium, where computer program instructions are stored in the computer storage medium, and when executed by a processor, the computer program instructions are configured to perform the following steps:
displaying a playing interface of a shared video in a shared room of a first application program; the shared room comprises at least one audience user;
when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface;
and when any interactive option of the at least one interactive option is triggered by any audience user of the at least one audience user, outputting an interactive response corresponding to the triggered interactive option.
In one aspect, an embodiment of the present invention provides a computer program product or a computer program, where the computer program product or the computer program includes computer instructions stored in a computer-readable storage medium; a processor of the terminal reads the computer instructions from the computer storage medium, and executes the computer instructions to perform:
displaying a playing interface of a shared video in a shared room of a first application program; the shared room comprises at least one audience user;
when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface;
and when any interactive option of the at least one interactive option is triggered by any audience user of the at least one audience user, outputting an interactive response corresponding to the triggered interactive option.
In the embodiments of the invention, a playing interface of a shared video in a shared room of a first application program is displayed, where the shared room comprises at least one audience user; because all audience users in the shared room watch the same shared video, multi-user viewing stays synchronized. If the playing of the shared video meets the interaction condition, one or more interaction options are displayed in the playing interface. Further, when any audience user triggers any interaction option, an interactive response corresponding to that option is output, realizing interaction between users and the video or an object in the video; and since any user may select an interaction option, social engagement and interactivity are improved.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below show some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of a shared video management system according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of an interface display method according to an embodiment of the present invention;
fig. 3a is a schematic diagram of a playing interface according to an embodiment of the present invention;
FIG. 3b is a diagram of another playback interface provided by an embodiment of the present invention;
fig. 3c is a schematic diagram of invitation information displayed to a friend user according to an embodiment of the present invention;
FIG. 3d is a schematic diagram of a room setting interface display provided by an embodiment of the invention;
fig. 3e is a schematic diagram of a friend user being invited to join a shared room according to an embodiment of the present invention;
fig. 3f is a schematic diagram of a friend user joining a shared room according to an embodiment of the present invention;
FIG. 4a is a diagram illustrating an interactive option display according to an embodiment of the present invention;
FIG. 4b is a schematic diagram illustrating a group photo interaction according to an embodiment of the present invention;
FIG. 4c is a diagram illustrating another video option according to an embodiment of the present invention;
FIG. 5 is a schematic flow chart diagram illustrating another interface display method according to an embodiment of the present invention;
FIG. 6a is a diagram illustrating an interactive option display according to an embodiment of the present invention;
FIG. 6b is a diagram illustrating a welcome window according to an embodiment of the present invention;
FIG. 6c is a diagram illustrating a selection of a special effect of photographing according to an embodiment of the present invention;
FIG. 6d is a diagram illustrating a display interface according to an embodiment of the present invention;
FIG. 6e is a schematic diagram of another shooting setup provided by an embodiment of the present invention;
FIG. 7a is a diagram of a history browsing map display according to an embodiment of the present invention;
fig. 7b is a playing interface displayed on the master user terminal according to an embodiment of the present invention;
fig. 8a is a network topology diagram of a shared video management system according to an embodiment of the present invention;
FIG. 8b is a block diagram of a shared video management system according to an embodiment of the present invention;
FIG. 8c is an interaction diagram provided by an embodiment of the invention;
FIG. 8d is a flowchart illustrating a method for implementing an interface display in the shared video management system according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of an interface display device according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention.
The embodiment of the invention provides an interface display scheme: a terminal displays a playing interface of a shared video in a shared room of a first application program, where the shared room may comprise one or more audience users. When the playing of the shared video meets an interaction condition, for example the shared video has reached a specified position, or the currently played picture contains a specified object, at least one interaction option is displayed in the playing interface, such as an option to take a group photo with a target object or options to play other shared videos. If any of the at least one interaction option is triggered by any audience user, an interactive response corresponding to the triggered option is output. Multi-user viewing thus stays synchronized, every audience user can select interaction options, and social engagement and interactivity are improved.
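The two example interaction conditions named above, reaching a specified playing position and a specified object appearing in the current picture, could be checked by a predicate like the following; the trigger positions, the half-second tolerance, and the object labels are assumed values, not taken from the patent:

```python
def interaction_condition_met(position_s, objects_in_frame,
                              trigger_positions=(30.0, 90.0),
                              trigger_objects=("panda",)):
    """Condition A: playback has reached a specified position (within tolerance).
    Condition B: the current picture contains a specified object.
    Either condition triggers the display of interaction options."""
    at_position = any(abs(position_s - t) < 0.5 for t in trigger_positions)
    has_object = any(o in trigger_objects for o in objects_in_frame)
    return at_position or has_object
```

In a real system, condition B would be fed by the content pipeline (e.g. frame annotations shipped with the video) rather than computed on the client.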
Based on the above interface display scheme, an embodiment of the present invention provides a shared video management system; please refer to fig. 1, a schematic structural diagram of a shared video management system according to an embodiment of the present invention. The system shown in fig. 1 enables multiple users to watch the shared video online in real time; during playback, any audience user can select an interaction mode in the playing interface of the shared video, so as to complete, as the interaction mode indicates, interaction among the audience users or between the audience users and a target object in the shared video.
Optionally, the shared video management system shown in fig. 1 may include a terminal 101 and a server 102 corresponding to each of at least one viewer user. The terminal 101 may include devices such as a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, a smart car, and a smart television; the server 102 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a network service, cloud communication, a middleware service, a domain name service, a security service, a CDN, a big data and artificial intelligence platform, and the like.
In one embodiment, the terminal 101 is configured to run a first application program and provide an interface UI display function for a corresponding viewer user, such as displaying a playing interface, displaying at least one interaction option, displaying an interaction response, and the like. The server 102 provides support for the running of the first application in the terminal 101.
In a specific implementation, any audience user may perform an operation of creating a shared room through the terminal 101 that the user uses; this user may be referred to as the creator of the shared room, or as the master user of the shared room. Upon detecting the room-creation operation, the terminal 101 sends a creation request to the server 102; the server 102 verifies the request and creates the shared room for the master user's terminal (in the following description, simply the master user terminal).
Further, the master user may invite other audience users to join the shared room through the master user terminal; the invited audience users may be referred to as friend users of the shared room. Optionally, when the terminal 101 detects an invitation operation by which the master user invites other audience users to join the shared room, it sends an invitation request to the server 102; the invitation request instructs the server 102 to send an invitation notification to the terminal of each invited friend user (in the following description, simply the friend user terminal). After an invited friend user accepts the invitation and joins the shared room, the server 102 may store user information for each audience user in the room.
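A minimal sketch of the create/invite/join exchange described in the two paragraphs above, with invented class, method and field names (the patent does not prescribe an API), and a default room capacity assumed for the example:

```python
import uuid

class SharedRoomServer:
    """Hypothetical server-side handling of create, invite and join requests."""

    def __init__(self):
        self.rooms = {}

    def create_room(self, creator, capacity=3):
        # Verify the creation request, then create the room for the master user.
        room_id = uuid.uuid4().hex[:8]
        self.rooms[room_id] = {"master": creator, "friends": [],
                               "capacity": capacity}
        return room_id

    def invite(self, room_id, sender, invitee):
        # Only the master user may trigger an invitation notification.
        room = self.rooms[room_id]
        if sender != room["master"]:
            raise PermissionError("only the master user may invite")
        return {"type": "invitation", "to": invitee, "room_id": room_id}

    def accept(self, room_id, invitee):
        # Store the joining friend user's information if the room has space.
        room = self.rooms[room_id]
        if len(room["friends"]) >= room["capacity"]:
            return False
        room["friends"].append(invitee)
        return True
```

The sketch keeps state in memory; the text implies the server also persists user information, which is omitted here.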
In one embodiment, the server 102 also stores video data for a plurality of shared videos. The master user terminal may send the server 102 a play request for a particular shared video, and the server 102 returns the video data requested; the terminal 101 then displays the playing interface of the shared video in the shared room. Understandably, the terminal of every friend user who has joined the shared room likewise displays the playing interface of that shared video.
As the shared video plays in the shared room, if the terminal 101 of any audience user, or the server 102, detects that the playing meets the interaction condition, such as reaching a specified playing picture or a specified duration, the terminal 101 of each audience user in the shared room is notified to display one or more interaction options in the playing interface, so that any audience user can trigger any interaction option through the corresponding terminal 101.
Optionally, when any of the interaction options is triggered by any audience user, the terminal 101 of each audience user outputs an interactive response corresponding to that option. The interactive response may be generated by the server 102 according to the triggered option and sent to each terminal 101 for display; alternatively, it may be generated by the terminal 101 of the audience user who triggered the option and sent to the server 102, which then forwards it to the terminals 101 of the other audience users.
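The two alternative delivery paths for the interactive response can be sketched as follows; the terminal names and the payload format are illustrative assumptions:

```python
def route_interactive_response(option, triggering_terminal, all_terminals,
                               generated_by="server"):
    """Path 1 (generated_by="server"): the server builds the response and
    pushes it to every terminal in the room.
    Path 2 (generated_by="client"): the triggering terminal builds the
    response locally, so the server relays it only to the other terminals."""
    response = {"option": option, "payload": f"effect:{option}"}
    if generated_by == "server":
        recipients = list(all_terminals)
    else:
        recipients = [t for t in all_terminals if t != triggering_terminal]
    return response, recipients
```

Path 2 saves the triggering client a round trip at the cost of trusting clients to generate responses, which is one plausible reason the text offers both variants.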
The shared video management system shown in fig. 1 keeps multi-user viewing synchronized, and any of the audience users can select an option for interacting with the video or with an object in the video, improving social engagement and interactivity.
Based on the interface display scheme and the shared video management system, the embodiment of the invention provides an interface display method. Referring to fig. 2, a schematic flow chart of an interface display method according to an embodiment of the present invention is shown. The interface display method shown in fig. 2 may be executed by a target terminal, specifically by the processor of the target terminal. The target terminal may be the terminal of any audience user in the shared room; for example, if the audience users in the shared room include a master user and friend users, the target terminal may be either the master user terminal or any friend user terminal. The interface display method in fig. 2 may include the following steps:
step S201, displaying a playing interface of a shared video in a shared room of a first application program, wherein the shared room comprises at least one audience user.
The first application program may be any application running in the target terminal that supports creating a shared room, such as an online-zoo application or a short-video application. A shared room is a virtual online room: multiple audience users in the same shared room can watch the shared video played in the room, and browse other information in the room, synchronously or asynchronously. The shared video may be any one of a plurality of videos provided by the first application program.
In one embodiment, when the target terminal is in the landscape state, the playing interface is displayed in the target terminal in the landscape state, as shown in fig. 3 a; when the target terminal is in the vertical screen state, the playing interface is displayed in the target terminal in the vertical screen state, as shown in fig. 3 b.
Optionally, the shared room may include at least one audience user, which may include the creator of the shared room (hereinafter the master user) and one or more non-creators (hereinafter friend users). The target terminal may be either the master user terminal or a friend user terminal. Optionally, a friend user is a contact of the master user in a second application program, and the second application program may be any social application program.
Optionally, the playing interface may include identification information of the audience users in the shared room; the identification information of an audience user may include any one or more of that user's video picture, the user's avatar in the second application program, and the like. Optionally, the identification information of the master user and of the friend users may be displayed at any position of the playing interface; in a specific implementation, the positions may depend on whether the playing interface is displayed in landscape or portrait mode.
For example, assume the identification information of the master user is denoted 001, that of friend user 1 is denoted 002, and that of friend user 2 is denoted 003. Fig. 3a shows their display positions when the playing interface is in landscape mode: the identification information 001 of the master user is displayed at the lower left of the playing interface, and the identification information 002 and 003 of the friend users is displayed at the right.
As another example, the positions may be as shown in fig. 3b, where the playing interface is displayed in portrait mode: the identification information 001 of the master user may be displayed at the lower left of the playing interface, and the identification information 002 and 003 of the friend users may be displayed at the top.
It should be understood that the above merely lists example display positions for the identification information of the master user and of the friend users. In practice, whether the playing interface is shown in landscape or portrait mode, their identification information may be shown at the same horizontal or vertical position; alternatively, the identification information of the master user may be displayed at the top of the playing interface and that of the friend users at the bottom.
In one embodiment, friend users can join the shared room upon invitation by the master user. In a specific implementation, assuming the target terminal is the master user terminal, the master user invites a friend user to join the shared room as follows: when the friend invitation option is triggered, a friend invitation window is displayed, which includes an application program identifier of the second application program; when that application program identifier is selected, a friend selection window is displayed, which includes the user identifiers of a plurality of friend users; and when a target user identifier among them is selected, invitation information is sent to the friend user indicated by the target user identifier, instructing that friend user to join the shared room.
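The three-step invitation flow above can be condensed into a single hypothetical function; the two windows are modeled as plain lists, and the function and key names are assumptions for illustration:

```python
def build_invitation(second_apps, contacts_by_app, chosen_app, chosen_friend):
    """Step 1: the friend-invitation window lists second-application identifiers.
    Step 2: selecting one opens a selection window with that app's contacts.
    Step 3: selecting a target user identifier emits the invitation message."""
    if chosen_app not in second_apps:
        return None                       # identifier not in the invitation window
    selection_window = contacts_by_app.get(chosen_app, [])
    if chosen_friend not in selection_window:
        return None                       # identifier not in the selection window
    return {"to": chosen_friend, "via": chosen_app,
            "action": "join_shared_room"}
```

In the text, step 2 involves a server round trip to build the selection window; here it is flattened into a dictionary lookup for brevity.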
There may be one or more second application programs, so the friend invitation window may contain one or more application program identifiers. An application program identifier may be any one or more of the icon and the name of the second application program; for example, if the second application program is the WeChat application, its identifier may be "WeChat icon + WeChat".
In an embodiment, the friend selection window may be displayed as follows: when the application program identifier of the second application program is triggered, the master user terminal sends the server a request to display the friend selection window; the server generates the window from the master user's friends in the second application program and returns it to the master user terminal; and the master user terminal displays the friend selection window in the first application program, or alternatively in the second application program.
In an embodiment, the invitation information sent to the friend user indicated by the target user identifier may be a session message in the second application program, displayed in a session interface of that friend user's terminal, where the session interface is used for conversation between the master user and the friend user. Optionally, the invitation information may include a prompt for joining the shared room and a join-confirmation option; the prompt may take a form such as "Your friend XXX invites you to join room XXX to watch pandas together; tap the join option below to open the first application and enter room XXX".
For example, refer to fig. 3c, a schematic diagram of invitation information displayed to a friend user according to an embodiment of the present invention. 3A denotes the session interface displayed on the friend user terminal indicated by the target user identifier, 3B denotes the invitation information, and 3C denotes the join-confirmation button. After the friend user taps the join-confirmation button 3C, the friend user terminal switches from the second application program to the first application program and displays a room setting interface of the shared room in the first application program, shown as 31 in fig. 3c. The room setting interface 31 may display the identification information of the master user and of the other friend users who have already joined the shared room. For example, the identification information of the master user may appear as "room owner: your WeChat friend user XXX"; or as the master user's avatar in the WeChat application; or, if the master user has turned the camera on, as the master user's video picture.
In an embodiment, the invite-friend option may be displayed in any one or more of the following manners: displayed in the playing interface when the number of friend-state users included in the shared room is smaller than a number threshold; or displayed in a room setting interface of the shared room, where the room setting interface is displayed after the host-state user creates and enters the shared room, that is, the room setting interface is the host-state room setting interface.
When the invite-friend option is displayed in the playing interface, it may be displayed at a position adjacent to the identification information of the friend-state users, as shown at 301 in fig. 3a, or as shown at 310 in fig. 3b. The number threshold refers to the preset number of friend-state users that the shared room can accommodate; it may be set by default by the host-state user terminal or set by the host-state user when the shared room is created. If the number of friend-state users in the current shared room is smaller than the number threshold, other friend-state users may still join the shared room; conversely, if the number of friend-state users included in the current shared room is greater than or equal to the number threshold, the number of people in the shared room has reached its upper limit, and no further friend-state users are allowed to join.
Therefore, if the number of friend-state users included in the shared room is smaller than the number threshold, an invite-friend option may be displayed in the playing interface, so that the host-state user can invite friend-state users to join the shared room by triggering the option. For example, assuming the number threshold is 3, and 2 friend-state users have already joined the shared room in the playing interface shown in fig. 3a, the invite-friend option 301 may be included in the playing interface.
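The capacity check described above can be sketched as follows; the default threshold of 3 and the function name are illustrative assumptions, not part of the patent.

```python
NUMBER_THRESHOLD = 3  # hypothetical default capacity for friend-state users

def should_show_invite_option(current_friend_users: int,
                              threshold: int = NUMBER_THRESHOLD) -> bool:
    """Show the invite-friend option only while the shared room
    still has space for more friend-state users."""
    return current_friend_users < threshold

# With 2 of 3 slots filled the option is still shown; at capacity it is not.
print(should_show_invite_option(2))  # True
print(should_show_invite_option(3))  # False
```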
When the invite-friend option is displayed in the room setting interface of the shared room, the room setting interface is displayed in the host-state user terminal after the host-state user creates and enters the shared room. The room setting interface may include the identification information of the host-state user; when the target terminal is in portrait mode, this identification information may be displayed at the bottom of the room setting interface, specifically in its lower-left corner, as shown at 302 in fig. 3d.
In one embodiment, the invite-friend option may be displayed at the same horizontal position as the identification information of the host-state user. As shown in fig. 3d, 311, 322, and 333 represent invite-friend options, and it can be seen that 311, 322, and 333 are displayed at the same horizontal position as the identification information 302 of the host-state user, namely at the bottom of the room setting interface. It should be understood that this is only one display position of the invite-friend option in the room setting interface given in the embodiment of the present invention; in practical applications, the invite-friend option may be displayed at any position of the room setting interface, for example, in the upper-left corner or the upper-right corner.
Regardless of the display manner used, when the invite-friend option is triggered, a friend invitation window is displayed. Taking the case where the invite-friend option is displayed in the room setting interface as an example, how the host-state user invites a friend-state user to join the shared room is described below with reference to fig. 3d and fig. 3e.
In a specific implementation, after the host-state user clicks the invite-friend option 311 in fig. 3d, the host-state user terminal displays a friend invitation window, as shown at 303 in fig. 3e. The friend invitation window 303 includes the icon and name of the WeChat application and the icon and name of the QQ application. When the host-state user triggers the icon of the QQ application, a friend-state user selection window 304 is displayed. The window 304 includes a user identifier A of friend-state user A, a user identifier B corresponding to friend-state user B, and a user identifier C corresponding to friend-state user C. When the user identifier B is triggered, the host-state user terminal sends invitation information to friend-state user B, for example the invitation information shown at 3B in fig. 3c. If friend-state user B clicks the join confirmation button 3C, friend-state user B joins the shared room. After friend-state user B joins the shared room, the identification information of friend-state user B may be displayed in the room setting interface, as shown at 306 in fig. 3e.
In other embodiments, a friend-state user may actively join the shared room by using the room identification code of the shared room. In a specific implementation, any friend-state user terminal runs the first application program and opens an application interface for applying to join the shared room, where the application interface may include a room identification code filling area and a join confirmation button; when the room identification code of the shared room is filled into the filling area and the join confirmation button is triggered, the friend-state user joins the shared room.
Optionally, the room identification code of the shared room may be automatically obtained by the target terminal from the session interface of the session between the host-state user and the friend-state user in the second application program and automatically filled into the room identification code filling area, or it may be filled in manually by the friend-state user. In another alternative, the host-state user tells the friend-state user the room identification code face to face, and the friend-state user manually fills it into the room identification code filling area.
For example, fig. 3f is a schematic diagram of a friend-state user joining a shared room according to an embodiment of the present invention. 31A represents the application interface for applying to join the shared room in the first application program displayed by the friend-state user terminal; 31B represents the session interface in the second application program in which the friend-state user converses with the host-state user; 32B represents a session message, sent by the host-state user to the friend-state user, that includes the shared room identification code; 32A represents the room identification code filling area; and 33A represents the join confirmation button. The friend-state user inputs the room identification code included in the session message 32B into the filling area 32A and clicks the join confirmation button 33A; the friend-state user terminal then displays the friend-state room setting interface of the shared room, as shown at 31C.
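The join-by-code flow above can be sketched roughly as follows (the class, method names, and capacity default are hypothetical): the terminal submits the code entered into the filling area, and the room admits the user only when the code matches and capacity allows.

```python
class SharedRoom:
    """Minimal model of a shared room joined via its identification code."""

    def __init__(self, room_code: str, capacity: int = 3):
        self.room_code = room_code        # code distributed by the host-state user
        self.capacity = capacity          # number threshold of friend-state users
        self.friend_users: list[str] = []

    def join(self, user_id: str, entered_code: str) -> bool:
        """Admit the user only if the entered code matches and space remains."""
        if entered_code != self.room_code or len(self.friend_users) >= self.capacity:
            return False
        self.friend_users.append(user_id)
        return True

room = SharedRoom("PANDA-42")
print(room.join("friend_B", "PANDA-42"))  # True: correct code, room has space
print(room.join("friend_C", "WRONG"))     # False: code does not match
```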
Step S202, when the playing of the shared video meets the interaction condition, at least one interaction option is displayed on the playing interface.
In one embodiment, the interaction condition includes a playing progress condition. The playing of the shared video meets the interaction condition when the playing progress of the shared video at the current moment equals the target playing progress indicated by the playing progress condition. The target playing progress may be, for example, 30 seconds before the end of playback, that is, the moment when playback of the video is about to end. It should be understood that, in the embodiment of the present invention, at least one interaction option that can be triggered by any viewer user in the shared room is displayed when the playing of the shared video is about to end, so that the viewer users in the shared room can determine what is played or displayed next; colloquially, any viewer user determines where the shared room goes next, which improves interactivity.
Alternatively, the target playing progress may be 1/2, 1/3, 4/5, or the like of the video played. Displaying interaction options during the playing of the shared video can prevent viewer users from becoming fatigued when watching for a long time; appropriate interaction increases the viewer users' interest in the shared video and helps raise engagement with the first application program.
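Both flavors of the playing progress condition (an absolute "N seconds before the end" marker and a played fraction such as 1/2 or 4/5) can be checked with one small helper; the function and its parameter names are illustrative, not from the patent.

```python
def meets_progress_condition(position_s: float, duration_s: float,
                             seconds_before_end: float = None,
                             fraction_played: float = None) -> bool:
    """Return True when the current playing progress reaches the target
    progress indicated by the playing progress condition."""
    if seconds_before_end is not None:
        return duration_s - position_s <= seconds_before_end
    if fraction_played is not None:
        return position_s >= duration_s * fraction_played
    return False

# A 600 s video, 570 s in: within the final 30 s, so the condition is met.
print(meets_progress_condition(570, 600, seconds_before_end=30))  # True
# The same moment is also past the halfway (1/2) mark.
print(meets_progress_condition(570, 600, fraction_played=1/2))    # True
```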
In another embodiment, the interaction condition includes a playing picture condition. Colloquially, the playing picture condition is a condition that the picture currently being played in the shared video needs to meet: the condition indicates a target picture, and the playing of the shared video meets the interaction condition when the picture played at the current moment is the target picture indicated by the condition. The target picture may be specified in advance and may be any one of the multiple frames included in the shared video; for example, the target picture may be a playing picture that includes a target object, or a playing picture in which the target object has a preset posture. The preset posture may be any one or more of a posture suitable for a group photo, a posture suitable for feeding, or a posture suitable for petting. Selecting playing pictures suitable for interaction during playback of the shared video and letting viewer users interact with them can increase the interest of watching the shared video.
As an optional implementation, if the target picture is a pre-specified frame among the multiple playing pictures included in the shared video, the target terminal may detect whether the playing of the shared video meets the interaction condition as follows: the target terminal adds marks to the target pictures in the shared video; if, during playback, a mark is detected in the picture being played at the current moment, it is determined that the playing of the shared video meets the interaction condition.
As another optional implementation, if the target picture is a playing picture that includes a target object, the target terminal may detect whether the playing of the shared video meets the interaction condition as follows: during playback of the shared video, recognize the object included in each frame; if the object in the picture being played at the current moment is recognized as the target object, determine that the playing of the shared video meets the interaction condition.
As yet another optional implementation, if the target picture includes the target object in a preset posture, the target terminal may detect whether the playing of the shared video meets the interaction condition as follows: during playback of the shared video, recognize the frames that include the target object; recognize the posture of the target object in the playing picture; and if the posture of the target object matches the preset posture, determine that the playing of the shared video meets the interaction condition.
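The three detection alternatives above (pre-marked frames, object recognition, and object-plus-posture recognition) can be folded into one check; the per-frame annotation dict below stands in for a real recognizer and is purely hypothetical.

```python
def meets_picture_condition(frame: dict,
                            target_object: str = None,
                            target_posture: str = None) -> bool:
    """frame is a hypothetical per-frame annotation such as
    {"marked": bool, "object": str, "posture": str}."""
    if frame.get("marked"):  # alternative 1: target picture was marked in advance
        return True
    if target_object is not None and frame.get("object") == target_object:
        if target_posture is None:  # alternative 2: the target object is enough
            return True
        # alternative 3: the object must also hold the preset posture
        return frame.get("posture") == target_posture
    return False

frame = {"object": "panda", "posture": "group-photo"}
print(meets_picture_condition(frame, "panda"))             # True
print(meets_picture_condition(frame, "panda", "feeding"))  # False
```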
After detecting that the playing of the shared video meets the interaction condition, the target terminal may display at least one interaction option in the playing interface. Each interaction option corresponds to one interactive operation and can also be understood as one interaction scenario.
In one embodiment, the at least one displayed interaction option may include every one of at least one interaction option to be displayed, where the interaction options to be displayed are preset interaction options related to the shared video. The at least one interaction option to be displayed may include an option for interacting with the target object and an option for viewing other shared videos. The options for interacting with the target object include any one or more of an option for taking a group photo with the target object and an option for inputting a voice containing the target object's name (that is, an option for calling the target object's name). The other shared videos may include shared videos related to the target object, or videos that include other objects.
It should be understood that the embodiment of the present invention only lists some possible interaction options to be displayed that are related to the shared video; in practical applications, more interaction options may be set according to the type of object to which the target object belongs. For example, if the target object is a plant, the options for interacting with it may further include watering the target object, trimming its branches and leaves, and the like; if the target object is a person, the options may further include shaking hands with the target object, hugging the target object, and the like.
Colloquially, the target terminal may preset multiple interaction options related to the shared video, and these preset options may be referred to as interaction options to be displayed. When it is detected that the playing of the shared video meets the interaction condition, the target terminal may display the preset interaction options to be displayed in the playing interface.
In other embodiments, the at least one interaction option may be an interaction option, screened from the at least one interaction option to be displayed, that matches the interaction condition. For example, if the interaction condition is a playing progress condition, the at least one interaction option is the subset of options to be displayed that match the playing progress condition: for instance, if the target playing progress indicated by the condition is 30 seconds before the end of playback, the matching option may be the option for viewing other shared videos. If the interaction condition is a playing picture condition, the at least one interaction option is the subset of options to be displayed that match the playing picture condition, such as the option for taking a group photo with the target object and the option for inputting a voice containing the target object's name.
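The screening step can be sketched as a lookup from condition type to the matching subset of options to be displayed; the mapping below merely mirrors the examples in the text and is an assumption, not an exhaustive list.

```python
# Hypothetical mapping: which condition types each preset option matches.
OPTIONS_TO_DISPLAY = {
    "group photo with the target object": {"picture"},
    "call the target object's name":      {"picture"},
    "view other shared videos":           {"progress"},
}

def screen_options(condition_type: str) -> list:
    """Keep only the options to be displayed that match the met condition."""
    return [opt for opt, kinds in OPTIONS_TO_DISPLAY.items()
            if condition_type in kinds]

print(screen_options("progress"))  # ['view other shared videos']
```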
In one embodiment, the at least one interaction option may be displayed in an interactive window of the playing interface. For example, fig. 4a is a schematic diagram of displaying interaction options according to an embodiment of the present invention. Assume that the first application program is an online zoo, the shared video is an activity video of a panda, the target object included in the shared video is the panda, and 401 represents the playing interface of the shared video in the shared room. If the target terminal detects that the playing progress of the shared video equals the target playing progress indicated by the interaction condition, an interactive window is displayed in the playing interface, as shown at 402 in fig. 4a. Multiple interaction options are displayed in the interactive window, such as: an option for feeding the target object (expressed as "feed the panda bamboo"), an option for taking a group photo with the target object (expressed as "group photo with the panda"), an option for inputting a voice containing the target object's name (expressed as "call the panda's name"), and an option for viewing other shared videos (expressed as "go for a stroll").
Step S203, when any interaction option of the at least one interaction option is triggered by any viewer user of the at least one viewer user, outputting an interactive response corresponding to that interaction option.
As can be seen from the foregoing, the at least one interaction option is displayed in an interactive window of the playing interface, and the interactive window may further include prompt information for triggering an interaction option by voice. Any interaction option may be triggered in any one or more of the following manners: a touch manner or a voice manner. The touch manner means selecting an interaction option through contact such as tapping, long-pressing, or double-tapping; the voice manner means outputting speech that includes the interaction option — colloquially, a viewer user inputs, through the viewer user terminal, a voice containing the interaction option. For example, in the interactive window 402 of fig. 4a, the prompt for triggering an interaction option by voice may read "try turning on the microphone and saying your selection", as shown at 411 in fig. 4a.
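A deliberately simple way to resolve the voice trigger is substring matching between the recognized transcript and the displayed option labels; a real system would use proper speech recognition plus fuzzy matching, and everything here is illustrative.

```python
from typing import Optional

def match_voice_to_option(transcript: str, options: list) -> Optional[str]:
    """Return the first displayed option whose label occurs in the
    recognized speech, or None if nothing matches."""
    text = transcript.lower()
    for option in options:
        if option.lower() in text:
            return option
    return None

options = ["group photo with the panda", "go for a stroll"]
print(match_voice_to_option("I'd like a group photo with the panda!", options))
# group photo with the panda
print(match_voice_to_option("turn up the volume", options))  # None
```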
It should be understood that triggering an interaction option by voice enriches the ways in which viewer users can participate in the interaction, frees the viewer users' hands, and provides convenience, which can also increase viewer users' enthusiasm for participating in the interaction.
In one embodiment, if the interaction option triggered by the at least one viewer user is the option for taking a group photo with the target object, the interactive response may be a group photo image, and outputting the interactive response includes: when the group photo option is triggered, playing a group photo prompt animation in the playing interface; and after the prompt animation finishes, displaying a group photo completion window that includes the group photo image.
In one embodiment, the group photo prompt animation is used to prompt each viewer user that the group photo image is currently being generated, and may include prompt information for generating the group photo image, such as "group photo countdown — strike a pose". The group photo prompt animation may be a countdown animation, and the countdown start time may be preset by the target terminal. Optionally, the target terminal may set the countdown duration according to the time required to generate the group photo image.
After the countdown ends, the target terminal displays the group photo image in the group photo completion window. Optionally, the viewer users in the shared room may conduct an audio-video session.
In one embodiment, the group photo image is generated from a target image and the viewer user images of the at least one viewer user. The target image is a preset playing picture in the shared video, or the playing picture of the shared video at the moment the interaction option was triggered; a viewer user image may be an image captured of the viewer user while the group photo prompt animation is playing.
Optionally, after detecting that the playing of the shared video meets the interaction condition, the target terminal may stop playing the shared video while displaying the at least one interaction option; in that case, the moment any interaction option is triggered corresponds to the moment the options are displayed. Alternatively, the target terminal may continue playing the shared video while displaying the at least one interaction option until an interaction option is triggered; in that case, the moment the option is triggered is later than the moment the options were displayed.
As an optional implementation, the purpose of the group photo is to commemorate multiple viewer users watching a shared video that includes the target object together; therefore, a viewer user image should preferably include the viewer user's face. In a specific implementation, each viewer user terminal may send its captured image to the server, and the server selects the viewer user images that include a face as valid images, that is, images that can be used to generate the group photo. For invalid viewer user images, the server may ask the corresponding viewer terminal to re-capture, or simply ignore them.
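The server-side filtering and composition step might look like the following sketch; the `has_face` flag stands in for an actual face detector, and the composition structure is a placeholder rather than real image processing.

```python
def select_valid_images(viewer_images: list) -> list:
    """Keep only viewer user images in which a face was detected;
    invalid images are simply ignored here (re-capture is the other choice)."""
    return [img for img in viewer_images if img.get("has_face")]

def compose_group_photo(target_image: str, viewer_images: list) -> dict:
    """Combine the target playing picture with the valid viewer images."""
    return {"background": target_image,
            "overlays": [img["user"] for img in select_valid_images(viewer_images)]}

photo = compose_group_photo(
    "panda_frame.png",
    [{"user": "A", "has_face": True}, {"user": "B", "has_face": False}],
)
print(photo)  # {'background': 'panda_frame.png', 'overlays': ['A']}
```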
For example, an embodiment of the present invention provides a schematic diagram of displaying a group photo image, as shown in fig. 4b. If any viewer user selects the group photo option in the interactive window 402, the target terminal displays a group photo prompt animation, as shown at 41. When the animation finishes, the target terminal displays a group photo window 42 in the playing interface; the group photo window 42 includes the target image and the viewer user images of two viewer users.
Optionally, the group photo window may also include an option for saving the group photo image, shown at 421 in the group photo window 42 of fig. 4b. When this option is triggered by any viewer user, the group photo image is saved in that viewer user's terminal.
In other embodiments, if the interaction option triggered by any viewer user is the option for viewing other shared videos, the interactive response is the playing interface of those other shared videos. In this case, outputting the interactive response corresponding to the triggered option includes: when the option for viewing other shared videos is triggered, switching from displaying the playing interface of the shared video to displaying the playing interface of the other shared videos in the shared room of the first application program.
For example, fig. 4c is a schematic diagram of outputting an interactive response according to an embodiment of the present invention. Assume that the playing interface of the shared video displayed by the target terminal is as shown at 401 in fig. 4a. If any viewer user in the shared room clicks the feeding option in the interactive window 402 of fig. 4a, the target terminal displays the feeding video, as shown at 45 in fig. 4c.
In the embodiment of the present invention, a playing interface of a shared video is displayed in a shared room of a first application program, and the shared room includes at least one viewer user, so multiple viewer users in the shared room watch the same shared video, achieving synchronized viewing by multiple people. If the playing of the shared video meets the interaction condition, one or more interaction options are displayed in the playing interface; further, according to any viewer user's triggering of any interaction option, an interactive response corresponding to that option is output. This realizes interaction between users and the video, or objects in the video, and because any user can select the interaction options, both sociability and interactivity are improved.
Based on the interface display method above, an embodiment of the present invention provides another interface display method. Fig. 5 is a schematic flowchart of this interface display method. The flow shown in fig. 5 may be executed by the target terminal, specifically by a processor of the target terminal. The target terminal may be the terminal of any of the at least one viewer user included in the shared room, that is, either the host-state user terminal or a friend-state user terminal. The interface display method shown in fig. 5 may include the following steps:
Step S501, if a trigger event for playing the shared video exists, displaying a playing interface of the shared video in a shared room of the first application program, where the shared room includes at least one viewer user.
In one embodiment, the trigger event for playing the shared video may be a management user among the at least one viewer user triggering a playing operation of the shared video. The management user is a viewer user who has management authority over the shared room. As can be seen from the foregoing, the at least one viewer user includes the host-state user and friend-state users, and the host-state user, as the creator of the shared room, has management authority over it. Optionally, the host-state user may transfer the management authority to any friend-state user; alternatively, the host-state user may designate one or more friend-state users as management users to manage the shared room together.
Assuming the management user is the host-state user, as an alternative, the trigger event for playing the shared video may be the host-state user triggering an opening control for starting playback of the shared video. Optionally, the opening control may be displayed in the room setting interface, which is displayed after the host-state user creates and enters the shared room.
For example, fig. 6a is a schematic diagram of a room setting interface according to an embodiment of the present invention, where 601 represents the room setting interface and 61 represents the opening control for starting playback of the shared video. When the host-state user clicks the opening control 61, a playing interface is displayed in the host-state user terminal, as shown at 602 in fig. 6a.
Optionally, after the host-state user enters the room setting interface of the shared room, the target terminal may display a welcome window on the room setting interface, and the welcome window may include some prompt information. Assuming the first application program is an online zoo, the prompt information in the welcome window may indicate which animal can be watched in the shared room and how to browse. For example, the prompt in the welcome window may read "Hi, welcome to the panda house! Here you can invite family or friends to join you on mic to tour the zoo, watch interesting animal shows, and interact with the small animals."
Optionally, a confirmation option and a close button may be included in the welcome window. If the host-state user selects either the confirmation option or the close button, the welcome window is closed. For example, fig. 6b is a schematic diagram of displaying a welcome window according to an embodiment of the present invention: 603 represents the room setting interface, 62 represents the welcome window, 6A represents the confirmation option, and 6B represents the close button. When the confirmation option 6A is triggered, the welcome window in the room setting interface is closed.
As another possible implementation, if the host-state user selects the close button or the confirmation option, this also triggers display of a friend invitation window in the room setting interface, so that the host-state user can invite friend-state users to join the shared room through the friend invitation window. For example, if the host-state user clicks 6A in fig. 6b, display of a friend invitation window in the room setting interface may be triggered. How to invite a friend-state user to join the shared room through the friend invitation window is described in detail in step S201 of the embodiment of fig. 2 and is not repeated here.
In one embodiment, the target terminal may provide a photographing setting function for the host-state user or a friend-state user to set that user's photographing special effect. Subsequently, if the group photo option with the target object is triggered, the image of the host-state user or friend-state user may be captured with the photographing special effect that user selected.
In a specific implementation, the target terminal displays a photographing special effect selection interface, which includes an image preview area and a special effect selection area, where the special effect selection area includes multiple special effect identifiers and a confirmation control. When any one of the special effect identifiers is selected but the confirmation control is not yet triggered, a target preview image is displayed in the image preview area; the target preview image is generated from the user image captured at the current moment and the special effect indicated by the selected identifier. When a special effect identifier is selected and the confirmation control is triggered, the photographing special effect selection is determined to be complete.
Optionally, the photographing special effect selection interface is displayed when a photographing setting trigger event exists, where the photographing setting trigger event includes: the opening control for starting playback of the shared video, included in the room setting interface of the host-state user terminal, is triggered.
That is to say, after the opening control in the room setting interface is selected, the target terminal displays the photographing special effect selection interface. In this case, the trigger event for playing the shared video may be the completion of the photographing special effect selection. The special effect indicated by each identifier may include a filter and beautification: a filter modifies the image with virtual decoration, while beautification processes the face in the image by certain means, such as face slimming and skin whitening. Optionally, the special effect selection area may include a first-type special effect selection item and a second-type special effect selection item, with the special effect identifiers displayed under different selection items according to the special effect category they belong to; for example, identifiers belonging to filters are displayed under the first-type selection item, and identifiers belonging to beautification are displayed under the second-type selection item.
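The select-then-confirm behavior of the special effect interface can be modeled as a tiny state machine; the names are illustrative, and the "preview" string is only a tag standing in for real image processing.

```python
class EffectSelector:
    """Selecting an identifier refreshes the preview; only the
    confirmation control finalizes the choice."""

    def __init__(self):
        self.selected = None
        self.confirmed = False

    def select(self, effect_id: str) -> str:
        self.selected = effect_id
        self.confirmed = False           # re-selecting reopens the choice
        return f"preview:{effect_id}"    # target preview image placeholder

    def confirm(self) -> bool:
        if self.selected is not None:
            self.confirmed = True
        return self.confirmed

s = EffectSelector()
print(s.select("filter_1"))  # preview:filter_1  (shown in the preview area)
print(s.confirm())           # True: selection complete, playback may start
```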
For example, taking the target terminal as the host-state user terminal, fig. 6c is a schematic diagram of photographing special effect selection according to an embodiment of the present invention, where 61A represents the room setting interface and 61B represents the opening control, in the room setting interface, for starting playback of the shared video. When 61B is triggered, the host-state user terminal displays the photographing special effect selection interface 604. The interface 604 includes the image preview area 64 and the special effect selection area 65; the special effect identifiers included in the area 65 may be expressed as: panda decoration 611, filter 1, and filter 2. The special effect selection area 65 also includes a confirmation control 66. When the panda decoration 611 is selected but the confirmation control 66 is not triggered, the target preview image 622 is displayed in the image preview area 64.
In one embodiment, when it is detected that the shooting special effect selection is complete, that is, when the determination control in the shooting special effect selection interface is triggered, display of the playing interface of the shared video may be triggered. For example, if filter 1 is selected in fig. 6c and the determination control 66 is triggered, the host user terminal may display the playing interface of the shared video, as shown at 67 in fig. 6d.
In other embodiments, the shooting setting trigger event may further include the shooting setting option in the playing interface being triggered. Referring to fig. 6e, a schematic diagram of another shooting setting provided by an embodiment of the present invention, 611 shows the playing interface of the shared video and 622 shows the shooting setting option; as can be seen, when the playing interface is displayed in the portrait state, the shooting setting option may be displayed at the lower right of the playing interface. If the shooting setting option 622 is triggered, the target terminal displays the shooting special effect selection interface 633.
Step S502, when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface.
Step S503, when any interactive option in the at least one interactive option is triggered by any audience user in a voice triggering mode, outputting an interactive response corresponding to any interactive option.
In an embodiment, for possible implementations of step S502 and step S503, refer to the related descriptions of step S202 and step S203 in fig. 2; details are not repeated here.
Step S504, if the viewing option for history browsing in the playing interface is triggered, displaying a history browsing window, where the history browsing window includes a history browsing map generated according to the interaction options selected in the playing interface within a historical time period.
In one embodiment, multiple interactions may have taken place or multiple shared videos may have been played in the shared room over time. To make it convenient for each audience user to browse the shared videos that have been played in the shared room, or the interactions that have taken place there, the target terminal may provide a viewing option for history browsing in the playing interface, so that audience users can view the browsing history of the shared room. Optionally, when the playing interface is displayed in the landscape state, the viewing option for history browsing may be displayed at the bottom of the playing interface, below the position of the identification information of the host user, as shown at 131 in fig. 3a; when the playing interface is displayed in the portrait state, the viewing option may be displayed at the bottom of the playing interface, at the same horizontal position as the identification information of the host user, as shown at 132 in fig. 3b.
When the viewing option for history browsing in the playing interface is triggered, the target terminal displays a history browsing window, where the history browsing window includes a history browsing map generated according to the interaction options selected in the shared room within a historical time period. For example, the interaction options are connected in sequence, according to the time at which each was selected, to form the history browsing map.
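A minimal sketch of generating such a footprint map, assuming each record stores the option name and a timestamp of when it was selected (the record shape and option names are illustrative, not from the patent):

```python
def build_browse_map(selections):
    """Connect selected interaction options in chronological order,
    yielding the node sequence of the history browsing map."""
    ordered = sorted(selections, key=lambda s: s["selected_at"])
    return [s["option"] for s in ordered]

# Hypothetical selection records for one shared room.
history = [
    {"option": "feed the panda", "selected_at": 1604050000},
    {"option": "watch the panda", "selected_at": 1604040000},
    {"option": "group photo", "selected_at": 1604060000},
]
```

The UI can then render the returned sequence as connected nodes, in selection order.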
For example, referring to fig. 7a, a schematic diagram of displaying a history browsing map provided by an embodiment of the present invention: 701 represents the playing interface of the shared video when the target terminal is in the landscape state, and 71 represents the viewing option for history browsing. When the viewing option for history browsing is triggered, a history browsing window 702 is displayed in the playing interface, and the window 702 includes the history browsing map 72. The interaction options selected in the shared room within the historical time period, shown in 72, are: "family expected", "feeding expected", "and group expected", and "circle finding".
In one embodiment, a camera control may further be provided in the playing interface. The camera control has an open state and a closed state: when the camera control is in the open state, the target terminal can capture a video picture; when it is in the closed state, the target terminal cannot capture a video picture. Optionally, the camera control may be displayed at the same horizontal position as the viewing option for history browsing. Specifically, when the playing interface is displayed in the landscape state, the camera control may be displayed at the bottom of the playing interface, below the identification information of the host user, as shown at 133 in fig. 3a; when the playing interface is displayed in the portrait state, the camera control may be displayed at the bottom of the playing interface, at the same horizontal position as the identification information of the host user, as shown at 144 in fig. 3b.
Optionally, the playing interface may further include identification information of a target audience user and identification information of remaining audience users, where the target audience user is the audience user using the target terminal among the at least one audience user in the shared room, and the remaining audience users are the audience users in the shared room other than the target audience user. For example, if the at least one audience user includes a host user and a friend user, and the target terminal is the host user terminal, then the target audience user is the host user and the remaining audience users are the friend users.
If the camera control in the playing interface is in an open state, the identification information of the target audience user is a video picture of the target user; if the camera control in the playing interface is in a closed state, the identification information of the target audience user is the user identification of the target user in the second application program;
if a camera control in a playing interface which is displayed by the remaining audience user terminals and used for playing the shared video is in an open state, the identification information of the remaining audience users is the video pictures of the remaining audience users; and if the camera control in the playing interface which is displayed by the remaining audience user terminals and used for playing the shared video is in a closed state, the identification information of the remaining audience users is the user identification of the remaining audience users in the second application program.
Taking the target terminal as the host user terminal as an example, the target audience user is the host user and the remaining audience users are friend users. In plain terms, if the host user sets the camera control to the open state, the video picture of the host user can be displayed in the playing interface; otherwise, the user identification of the host user in the second application program, such as an avatar or a nickname, is displayed in the playing interface. Similarly, if a friend user sets the camera control to the open state in the friend user terminal, the video picture of that friend user can be displayed in the playing interface shown by the host user terminal; otherwise, the user identification of that friend user is displayed in the playing interface shown by the host user terminal.
Put simply, a camera control is displayed both in the playing interface shown by the host user terminal and in the playing interface shown by each friend user terminal. The camera control has an open state and a closed state: when it is in the open state, the real-time picture of the corresponding audience user can be seen by the other audience users in the shared room; conversely, when it is in the closed state, the real-time picture of the corresponding audience user is not visible to the other audience users, who can only see that user's identifier in the second application program, such as an avatar.
For example, assuming the target audience user is the host user and the remaining audience users are friend users, refer to fig. 7b, a playing interface displayed in the host user terminal provided by an embodiment of the present invention: 703 represents the identification information of the host user, 75 and 76 respectively represent the identification information of the friend users, and 73 represents the camera control, which is in the open state. At this time, the identification information of the host user is the video picture of the host user; and if the friend users set the camera control to the open state in their respective playing interfaces, the identification information of each friend user is the video picture of that friend user.
When the camera control in fig. 7b is in the closed state, the identification information of the host user is the user identification of the host user in the second application program, such as the avatar of the host user, as shown at 7031 in fig. 7b; if the friend users set the camera control in their respective playing interfaces to the closed state, the identification information of each friend user is that user's identification in the second application program, such as an avatar, as shown at 751 and 761 in fig. 7b.
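The camera-state rule above can be sketched as a small display function (the field names are illustrative assumptions):

```python
def identification_info(user, camera_open):
    """Decide what the other audience users in the shared room see
    as this audience user's identification information."""
    if camera_open:
        # Open state: show the live video picture captured by the terminal.
        return {"kind": "video", "value": user["live_frame"]}
    # Closed state: fall back to the user identification
    # in the second application program (e.g. the avatar).
    return {"kind": "avatar", "value": user["avatar"]}
```

Each terminal applies the same rule, so the displayed grid mixes live frames and avatars depending on each user's own camera setting.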
In one embodiment, a microphone control may also be included in the playing interface. Optionally, the microphone control may be displayed at the same horizontal position as the camera control and the viewing option for history browsing. When the microphone control is turned on, the target user can input speech through the microphone; otherwise, the target user cannot input speech. The microphone control is shown at 7A in fig. 7b.
In one embodiment, a play control may also be included in the playing interface. Optionally, the play control may be displayed at the same horizontal position as the above-mentioned microphone control, camera control, and viewing option for history browsing. The play control is used to pause and resume playback of the shared video, as shown at 7B in fig. 7b.
In one embodiment, a cross-screen playing control may further be included in the playing interface. Optionally, the cross-screen playing control may be displayed at the same horizontal position as the above-mentioned controls. When the cross-screen playing control is triggered, the playing interface switches from being displayed in the target terminal to being displayed in a screen-projection terminal; as shown in fig. 7b, 7C represents the cross-screen playing control. For example, suppose the current target terminal is a mobile phone: because the phone screen is small, the target user's eyes tire after watching the shared video on the phone for a long time, and the user can then trigger the cross-screen playing control to project the shared video onto a television for playback.
In one embodiment, a shooting special effect setting option may further be included in the playing interface, as shown at 7D in fig. 7b. When the shooting special effect setting option is triggered, the shooting special effect selection interface, as shown at 604 in fig. 6c, may be displayed for the target user to make the shooting special effect selection.
It can thus be seen that the display forms of the identification information of the host user, the identification information of the friend users, and the controls in the playing interface differ depending on whether the target terminal is in the landscape or portrait state. Fig. 7b shows the display form when the target terminal is in the portrait state: the identification information of the target audience user is displayed at the lower left corner of the playing interface, the identification information of the friend users is displayed at the top of the playing interface, and the camera control, microphone control, cross-screen playing control, play control, and shooting special effect setting option can be displayed at the bottom of the playing interface, in parallel with the identification information of the host user. Referring to fig. 3a, which shows the display form of each control when the target terminal is in the landscape state: the camera control, microphone control, cross-screen playing control, play control, and shooting special effect setting option are still displayed in parallel at the bottom of the playing interface; the identification information of the host user is displayed above the position of the camera control; and the identification information of the friend users is displayed on the right side of the playing interface.
The above is only one feasible display form of the controls in the playing interface when the target terminal is in the portrait or landscape state; in practical applications, other display forms may be set according to the screen size of the target terminal. For example, when the target terminal is in the portrait state, the camera control and microphone control may be displayed on the right side of the playing interface and the cross-screen playing control and shooting special effect setting option on the left; for another example, when the target terminal is in the landscape state, the identification information of the friend users may be displayed at the bottom of the playing interface in parallel with the other controls, and so on.
In the embodiment of the present invention, when there is a trigger event for playing the shared video, the target terminal displays the playing interface of the shared video in the shared room of the first application program; the shared room includes at least one audience user, that is, multiple audience users in the shared room watch the same shared video, realizing synchronized viewing by multiple people. If the playing of the shared video meets the interaction condition, one or more interaction options are displayed in the playing interface; further, any audience user in the shared room can trigger any interaction option by voice, which enriches the interaction modes and increases each audience user's participation, thereby improving interactivity; the target terminal then outputs the interactive response corresponding to the triggered interaction option. Furthermore, if the viewing option for history browsing in the playing interface is triggered, a history browsing window including a history browsing map is displayed, so that any audience user can view the browsing history of the shared room, improving the user experience.
Based on the above embodiments of the interface display method and the shared video management system, an embodiment of the present invention provides another shared video management system. Referring to fig. 8a, a network topology diagram of a shared video management system provided by an embodiment of the present invention, the shared video management system shown in fig. 8a may include a terminal 801, a server 802, and a third-party cloud service.
In one embodiment, the terminal 801 may be used by any audience user in the shared room, and its main functions are: 1) interface UI display: showing the various interfaces and windows in the embodiments of fig. 2 and fig. 5, such as the playing interface of the shared video and the interaction window with its interaction options (each interaction option may correspond to a scene); 2) camera data collection and video-call display: in brief, if the audience user corresponding to the terminal sets the camera control to the open state in the playing interface of the shared video, the terminal collects that user's real-time video picture through the camera and displays the real-time video pictures of the other audience users; 3) initiating voice input recognition requests: if the audience user corresponding to the terminal triggers any of the at least one interaction option in the interaction window by voice, the terminal may request the server 802 to recognize the input voice, so as to determine which interaction option was selected; 4) receiving the data returned by the server 802 and displaying it to the user.
In one embodiment, the server 802 may include three processor groups: a data processor group, a voice processor group, and a video stream processor group. The data processor group preprocesses data from the terminal, forwards audio data related to voice recognition to the voice processor group, and forwards video stream data to the video stream processor group. After processing with the recognition algorithm, the voice processor group returns the recognition result to the data processor group; the video stream processor group processes the videos and video data of the multiple audience users in the shared room (these audience users may be called online users) and returns the processed data to the data processor group. Finally, the data processor group returns the voice recognition result and the video-stream-related data to the terminal 801.
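The routing role of the data processor group might be sketched as follows (the packet shapes and handler behaviors are illustrative stand-ins, not the patent's actual algorithms):

```python
def recognize_speech(audio):
    """Stand-in for the voice processor group's recognition algorithm."""
    return {"recognized": audio.upper()}

def process_video_stream(frames):
    """Stand-in for the video stream processor group."""
    return {"processed_frames": len(frames)}

def route(packet):
    """Data processor group: dispatch a preprocessed packet by payload type
    and return the processed result for the terminal."""
    if packet["type"] == "audio":
        return recognize_speech(packet["payload"])
    if packet["type"] == "video":
        return process_video_stream(packet["payload"])
    raise ValueError(f"unknown packet type: {packet['type']}")
```

The real processor groups would delegate to the third-party cloud services described below rather than compute results locally.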
In one embodiment, the third-party cloud service may provide natural language recognition services and audio/video streaming services for the server 802.
Based on the above description, a module architecture diagram corresponding to the shared video management system in the embodiment of the present invention can be obtained. Referring to fig. 8b, a block architecture diagram of a shared video management system according to an embodiment of the present invention is shown. The shared video management system shown in fig. 8b may include a presentation layer 81, a logical layer 82, and a service layer 83.
In one embodiment, the presentation layer 81 runs on the terminal and is mainly responsible for playing the shared video, displaying the audio/video call UI, displaying the history browsing footprint UI (namely the history browsing map), displaying the group photo UI, and handling interaction with any audience user: according to an audience user's operation in the playing interface, an instruction is passed to the logic layer 82, and the result returned by the logic layer 82 is rendered in the UI.
In one embodiment, the logic layer 82 is mainly responsible for business logic and non-presentation logic, such as network requests, data persistence, shared video playback control, managing the audience users in the shared room, voice recognition input control, and group photo image generation. For example, any audience user may select any interaction option by voice; this triggers the voice recognition processing logic, which sends the audio data collected by the microphone to the server 802 for voice recognition and, according to the data returned by the server 802, performs the scene selection corresponding to the matched interaction option. During this process, the logic layer 82 also synchronizes the selected interaction option to the server 802 for recording, so that a browsing footprint map can later be generated from the server 802's records. If the selected interaction option is the option of a group photo with the target object, the terminal 801 further needs to obtain the video stream images of all audience users in the shared room at that specific moment and the image data in the shared video, synthesize them into a group photo image through the local API, and save the image to a local file.
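Matching the recognition result against the displayed interaction options, as described above, could be sketched like this (a simple substring match; the patent does not specify the actual matching algorithm, and the option texts are illustrative):

```python
def match_option(recognized_text, options):
    """Return the first interaction option whose text appears
    in the recognized speech, or None if nothing matches."""
    for option in options:
        if option in recognized_text:
            return option
    return None  # no option matched; the utterance is ignored
```

A production system would likely use fuzzier matching (pinyin distance, keyword sets), but the control flow is the same: recognized text in, selected scene out.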
In one embodiment, the service layer 83 mainly provides audio/video call capability and voice recognition capability. For example, the service layer 83 may exchange data with the terminal through a network interface service: when the service layer 83 receives audio data from voice input, it performs voice recognition and returns the recognition result to the terminal 801. For another example, when the service layer 83 receives the interaction-option selection data synchronized by the terminal 801, it persists the data to the database, generates a browsing footprint map, and returns the map to the terminal 801 for display. As another example, in providing the audio/video call capability, the service layer 83 may supply the terminal 801 with the audience users' video image data, such as the video pictures at a certain moment, and the image data in the shared video being watched, such as an image of the target object at a certain moment, so that the terminal 801 can use this data to generate the group photo image.
In the following, fig. 8c and 8d describe in detail how the interface display methods shown in fig. 2 and fig. 5 are implemented in the shared video management system shown in fig. 8a and 8b. Fig. 8c is an interaction diagram provided by an embodiment of the present invention, illustrating the interaction among the presentation layer 81, the logic layer 82, and the service layer 83; fig. 8d is a flowchart of implementing the interface display method in the shared video management system according to an embodiment of the present invention. It should be understood that the terminals described in fig. 8c and 8d are both taken to be the host user terminal as an example. In a specific implementation:
(1) the terminal enters the shared room, that is, the creator of the shared room (namely the host user) enters the shared room, and initiates an invitation through the presentation layer 81 to invite friends to enter the shared room and watch the shared video;
(2) the logic layer 82 processes the video management and call-connection management instructions between audience users, sends a request to establish network communication to the service layer 83, and sends data to the service layer 83;
(3) the service layer 83 calls the third-party audio/video call service according to the data received from the logic layer 82, manages the shared room data, and returns the processed data to the terminal 801;
(4) the logic layer 82 receives the corresponding audio/video and shared room data, processes it, and then passes it to the presentation layer 81 to display the corresponding data UI;
(5) when the presentation layer 81 enters the interaction, it notifies the logic layer 82 to start collecting the microphone audio data of the terminal 801, upload the audio data to the service layer 83, and request a response;
(6) after receiving the audio data, the service layer 83 calls the third-party service to perform voice recognition and returns the voice recognition result to the terminal 801;
(7) the logic layer 82 matches the received voice recognition result against each interaction option, determines the selected interaction option, and then executes the action corresponding to that option. For example, if the selected interaction option is the group photo option with the target object, the logic layer 82 obtains, through the service layer 83, the image data of the audience users in the shared room at that moment and the image data of the shared video, synthesizes and saves the group photo image through the local API, and then passes the group photo image to the presentation layer 81 for display.
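One plausible piece of the group photo synthesis in step (7) is laying out the audience users' frames over the shared-video frame; the bottom-strip layout below is an assumption for illustration (the patent only states that the frames are synthesized through a local API):

```python
def layout_group_photo(canvas_width, viewer_count, thumb_width=160, margin=10):
    """Compute x offsets for audience-user thumbnails laid out in a strip
    along the bottom of the shared-video frame, centered horizontally."""
    total = viewer_count * thumb_width + (viewer_count - 1) * margin
    start = (canvas_width - total) // 2
    return [start + i * (thumb_width + margin) for i in range(viewer_count)]
```

An imaging library would then paste each thumbnail at the computed offset onto the shared-video frame before saving the composite to a local file.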
In the embodiment of the present invention, multiple people in different places watch the same shared video together in real time through a multi-person video call, which meets the need for remote group interaction and improves sociality and interactivity. Compared with call forms based on text, pictures, or pure voice, the multi-person video call form is better suited to online interaction among friends and gives a richer user experience. While the shared video is being watched, interaction options are provided: any online user can select the next browsing scene simply by speaking the interaction option aloud, and the system automatically performs voice recognition and enters the next scene, which increases the fun and sense of participation in watching the video. In addition, a group photo of the online users and the animal can be taken, leaving a memento for the users and making the online visit more personable.
Based on the above method embodiment, the embodiment of the invention also provides an interface display device. Fig. 9 is a schematic structural diagram of an interface display device according to an embodiment of the present invention. The interface display apparatus shown in fig. 9 may operate as follows:
a display unit 901 configured to display a play interface of a shared video in a shared room of a first application; the shared room comprises at least one audience user;
the display unit 901 is further configured to display at least one interaction option in the play interface when the play of the shared video meets an interaction condition;
an output unit 902, configured to output an interactive response corresponding to any one of the at least one interactive option when the any one of the at least one interactive option is triggered by any one of the at least one viewer user.
In one embodiment, the interaction conditions include any one or more of: playing progress conditions and playing picture conditions; if the interaction condition comprises a playing progress condition, the fact that the playing of the shared video meets the interaction condition means that the played progress in the shared video is equal to the target playing progress indicated by the playing progress condition; if the interaction condition comprises a playing picture condition, the fact that the playing of the shared video meets the interaction condition means that the playing picture in the shared video at the current moment is the target picture indicated by the playing picture condition.
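The two condition checks above can be sketched directly (the state and condition field names are illustrative assumptions):

```python
def meets_interaction_condition(state, condition):
    """Check the playback state against a progress condition
    and/or a playing picture condition."""
    # Playing progress condition: played progress equals the target progress.
    if "target_progress" in condition and state["progress"] == condition["target_progress"]:
        return True
    # Playing picture condition: the current picture is the target picture.
    if "target_picture" in condition and state["picture_id"] == condition["target_picture"]:
        return True
    return False
```

When this check returns true, the terminal proceeds to display the at least one interaction option in the playing interface.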
In one embodiment, the at least one interactive option is displayed in an interactive window of the playing interface, and the shared video comprises a target object; the at least one interactive option comprises each interactive option in at least one interactive option to be displayed, and the at least one interactive option to be displayed refers to a preset interactive option related to the shared video; or the at least one interactive option comprises an interactive option which is screened from the at least one interactive option to be displayed and is matched with the interactive condition;
the at least one interactive option to be displayed includes any one or more of: options for interacting with the target object and options for watching other shared videos; the options for interacting with the target object include any one or more of an option for a group photo with the target object and an option for inputting voice containing the name of the target object.
In one embodiment, if any of the interaction options is a group photo option with the target object, the interaction response is a group photo image;
when any interactive option in the at least one interactive option is triggered by any viewer user in the at least one viewer user, the output unit 902 performs the following steps when an interactive response corresponding to any interactive option is output: when the group photo option with the target object is triggered, playing a group photo prompt animation on the playing interface;
after the prompt animation is played, displaying a group photo finishing window, wherein the group photo finishing window comprises a group photo image; the group image is generated from a target image and an audience user image of at least one audience user; the target image is a preset playing picture in the shared video, and the preset playing picture comprises the target object, or the target image is a playing picture in the shared video when any one interactive option is triggered; the audience user image is an image acquired by an audience user terminal in the process of playing the group photo prompt animation.
In one embodiment, if any of the interaction options refers to an option for watching other shared videos, the interaction response is a playing interface of the other shared videos;
when any interactive option in the at least one interactive option is triggered by any viewer user in the at least one viewer user, the output unit 902 performs the following steps when an interactive response corresponding to any interactive option is output: and when the option for watching other shared videos is triggered, switching to the playing interface for displaying other shared videos in the shared room of the first application program from the playing interface for displaying the shared videos.
In one embodiment, the interactive window includes prompt information for triggering an interactive option by voice; any interactive option may be triggered in any one or more of the following ways: a touch mode and a voice mode, where the voice mode refers to inputting voice that includes the content of the interactive option.
In an embodiment, the at least one audience user included in the shared room includes a host user and a friend user; the host user is the creator of the shared room, and the friend user is a contact of the host user in the second application program. The playing interface is the interface displayed in the target terminal for playing the shared video, and the target terminal is any one of the host user terminal and the friend user terminal.
In one embodiment, the display unit 901 is further configured to display a shooting special effect selection interface, where the shooting special effect selection interface includes an image preview area and a special effect selection area, and the special effect selection area includes a plurality of special effect identifiers and determination controls; when any one of the plurality of special effect identifications is selected and the determination control is not triggered, displaying a target preview image in the image preview area, wherein the target preview image is generated based on the user image acquired at the current moment and the special effect indicated by the selected special effect identification;
the interface display device further comprises a processing unit 903, and the processing unit 903 is configured to determine that the shooting special effect selection is completed when any one of the plurality of special effect identifiers is selected and the determination control is triggered.
In one embodiment, the shooting special effect selection interface is displayed when a shooting setting trigger event exists, the shooting setting trigger event including: an opening control for starting the playing of the shared video is triggered, where the opening control is included in a room setting interface in the master mode user's terminal; or a shooting setting option in the playing interface is triggered, where the room setting interface is displayed when the master mode user enters the shared room.
In one embodiment, the playing interface includes a camera control, identification information of a target audience user and identification information of remaining audience users, where the target audience user is an audience user using the target terminal among at least one audience user in the shared room, and the remaining audience users are other audience users except the target audience user among the at least one audience user in the shared room;
the display unit 901 is further configured to: when the camera control in the playing interface is in an open state, display the identification information of the target audience user as a video picture of the target audience user; and when the camera control in the playing interface is in a closed state, display the identification information of the target audience user as the user identifier of the target audience user in the second application program;
when the camera control in the playing interface displayed by a remaining audience user's terminal for playing the shared video is in an open state, the identification information of that remaining audience user is a video picture of that remaining audience user; and when that camera control is in a closed state, the identification information of that remaining audience user is the user identifier of that remaining audience user in the second application program.
In one embodiment, the interface display apparatus further includes a sending unit 904, where a friend mode user among the at least one audience user joins the shared room by way of an invitation from the master mode user, and the target terminal is the master mode user's terminal;
the display unit 901 is further configured to display a friend invitation window when a friend invitation option is triggered, where the friend invitation window includes an application program identifier of the second application program;
the display unit 901 is further configured to display a friend mode user selection window when the application program identifier is selected, where the friend mode user selection window includes user identifiers of a plurality of friend mode users;
the sending unit 904 is configured to, when a target user identifier is selected from the user identifiers of the plurality of friend mode users, send invitation information to the friend mode user indicated by the target user identifier, where the invitation information is used to instruct the friend mode user indicated by the target user identifier to join the shared room.
In one embodiment, the friend invitation option is displayed in any one or more of the following manners: displayed in the playing interface when the number of friend mode users included in the shared room is smaller than a number threshold; and displayed in a room setting interface of the shared room, where the room setting interface is displayed after the master mode user enters the shared room.
In one embodiment, the playing interface includes a view option for history browsing; when the view option is triggered, a history browsing window is displayed, the history browsing window including a history browsing map generated according to the historical times at which interactive options were selected in the shared room.
According to an embodiment of the present invention, the steps involved in the interface display methods shown in fig. 2 and 5 may be performed by the units in the interface display apparatus shown in fig. 9. For example, steps S201 to S202 described in fig. 2 may be performed by the display unit 901 in the interface display apparatus shown in fig. 9, and step S203 may be performed by the output unit 902 in the interface display apparatus shown in fig. 9; as another example, steps S501 to S502 in the interface display method shown in fig. 5 may be performed by the display unit 901 in the interface display apparatus shown in fig. 9, step S503 may be performed by the output unit 902 in the interface display apparatus shown in fig. 9, and step S504 may be performed by the display unit 901 in the interface display apparatus shown in fig. 9.
According to another embodiment of the present invention, the units in the interface display apparatus shown in fig. 9 may be separately or entirely combined into one or several other units to form the interface display apparatus, or one or more of the units may be further split into multiple functionally smaller units to form the interface display apparatus, which can achieve the same operation without affecting the technical effects of the embodiment of the present invention. The above units are divided based on logical functions; in practical applications, the function of one unit may be realized by a plurality of units, or the functions of a plurality of units may be realized by one unit. In other embodiments of the present invention, the interface display apparatus may also include other units, and in practical applications these functions may be implemented with the assistance of other units or through the cooperation of a plurality of units.
According to another embodiment of the present invention, the interface display apparatus shown in fig. 9 may be constructed by running a computer program (including program code) capable of executing the steps involved in the methods shown in fig. 2 and 5 on a general-purpose computing device, such as a computer that includes processing elements and storage elements such as a central processing unit (CPU), a random access memory (RAM), and a read-only memory (ROM), thereby implementing the interface display method according to the embodiment of the present invention. The computer program may be recorded on, for example, a computer-readable storage medium, loaded into the above-described computing device via the computer-readable storage medium, and executed therein.
In the embodiment of the invention, a playing interface of a shared video in a shared room of a first application program is displayed, where the shared room includes at least one audience user; that is, a plurality of audience users in the shared room watch the same shared video, realizing synchronous viewing by multiple people. If the playing of the shared video meets the interaction condition, one or more interaction options are displayed in the playing interface. Further, according to any audience user's triggering of any interaction option, an interaction response corresponding to that interaction option is output, realizing interaction between the user and the video or an object in the video; since the interaction option can be selected by any user, sociality and interactivity are improved.
Based on the method and the device embodiment, the embodiment of the invention provides a terminal. Referring to fig. 10, a schematic structural diagram of a terminal according to an embodiment of the present invention is provided. The terminal shown in fig. 10 includes at least a processor 1001, an input interface 1002, an output interface 1003, and a computer storage medium 1004. The processor 1001, the input interface 1002, the output interface 1003, and the computer storage medium 1004 may be connected by a bus or other means.
The computer storage medium 1004 may be stored in a memory of the terminal and is adapted to store a computer program comprising program instructions; the processor 1001 is adapted to execute the program instructions stored in the computer storage medium 1004. The processor 1001 (or Central Processing Unit, CPU) is the computing core and control core of the terminal, adapted to implement one or more instructions, and specifically adapted to load and execute:
displaying a playing interface of a shared video in a shared room of a first application program, the shared room comprising at least one audience user; when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface; and when any interactive option in the at least one interactive option is triggered by any audience user in the at least one audience user, outputting an interactive response corresponding to that interactive option.
The embodiment of the invention also provides a computer storage medium (Memory), which is a Memory device in the terminal and is used for storing programs and data. It is understood that the computer storage medium herein may include a built-in storage medium in the terminal, and may also include an extended storage medium supported by the terminal. The computer storage medium provides a storage space that stores an operating system of the terminal. Also stored in this memory space are one or more instructions, which may be one or more computer programs (including program code), suitable for loading and execution by processor 1001. The computer storage medium may be a high-speed RAM memory, or may be a non-volatile memory (non-volatile memory), such as at least one disk memory; and optionally at least one computer storage medium located remotely from the processor.
In one embodiment, the computer storage medium may be loaded with one or more instructions and executed by processor 1001 to implement the corresponding steps described above with respect to the interface display methods shown in fig. 2 and 5. In particular implementations, one or more instructions in the computer storage medium are loaded by the processor 1001 and perform the following steps:
displaying a playing interface of a shared video in a shared room of a first application program, the shared room comprising at least one audience user; when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface; and when any interactive option in the at least one interactive option is triggered by any audience user in the at least one audience user, outputting an interactive response corresponding to that interactive option.
In one embodiment, the interaction conditions include any one or more of: playing progress conditions and playing picture conditions; if the interaction condition comprises a playing progress condition, the fact that the playing of the shared video meets the interaction condition means that the played progress in the shared video is equal to the target playing progress indicated by the playing progress condition; if the interaction condition comprises a playing picture condition, the fact that the playing of the shared video meets the interaction condition means that the playing picture in the shared video at the current moment is the target picture indicated by the playing picture condition.
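As an illustrative sketch only (the disclosure does not prescribe an implementation), the two condition types above can be checked as follows; the class, field names, and frame identifiers are assumptions introduced for this example:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InteractionCondition:
    """Illustrative condition record; field names are assumptions for this sketch."""
    target_progress: Optional[float] = None  # playing progress condition (seconds)
    target_frame_id: Optional[str] = None    # playing picture condition

def playback_meets_condition(cond: InteractionCondition,
                             played_progress: float,
                             current_frame_id: str) -> bool:
    """Return True when playback satisfies every condition type that is set."""
    if cond.target_progress is None and cond.target_frame_id is None:
        return False  # no condition configured, nothing can be met
    if cond.target_progress is not None and played_progress != cond.target_progress:
        return False
    if cond.target_frame_id is not None and current_frame_id != cond.target_frame_id:
        return False
    return True

# A condition that fires when 90 seconds of the video have been played
cond = InteractionCondition(target_progress=90.0)
print(playback_meets_condition(cond, 90.0, "frame-0"))  # True
print(playback_meets_condition(cond, 45.0, "frame-0"))  # False
```

When both condition types are set, the sketch requires both to hold before the interactive options are displayed, mirroring the "any one or more" wording above.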
In one embodiment, the at least one interactive option is displayed in an interactive window of the playing interface, and the shared video comprises a target object;
the at least one interactive option comprises each interactive option in at least one interactive option to be displayed, where the at least one interactive option to be displayed refers to preset interactive options related to the shared video; or, the at least one interactive option comprises an interactive option selected from the at least one interactive option to be displayed that matches the interaction condition; the at least one interactive option to be displayed comprises any one or more of: options for interacting with the target object and an option for watching other shared videos; the options for interacting with the target object comprise any one or more of an option for taking a group photo with the target object and an option for inputting a voice containing the name of the target object.
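The two selection strategies above (show every preset option, or only those matching the triggered condition) can be sketched as follows; the dict layout and condition tags are assumptions, not part of the disclosure:

```python
def options_to_display(preset_options, matched_condition=None):
    """Select the interactive options shown in the interactive window.

    preset_options: list of dicts like {"name": ..., "condition": ...}
    associated with the shared video. When matched_condition is None, every
    preset option is displayed; otherwise only the options tagged with the
    triggered condition are displayed.
    """
    if matched_condition is None:
        return list(preset_options)
    return [o for o in preset_options if o["condition"] == matched_condition]

presets = [
    {"name": "take a group photo with the target object", "condition": "picture"},
    {"name": "watch other shared videos", "condition": "progress"},
]
print([o["name"] for o in options_to_display(presets, "picture")])
# ['take a group photo with the target object']
```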
In one embodiment, if the triggered interactive option is the option for taking a group photo with the target object, the interactive response is a group photo image; when any interactive option in the at least one interactive option is triggered by any audience user in the at least one audience user, the processor 1001 outputs the interactive response corresponding to that interactive option by executing the following steps:
when the option for taking a group photo with the target object is triggered, playing a group photo prompt animation on the playing interface; after the group photo prompt animation finishes playing, displaying a group photo completion window, where the group photo completion window comprises a group photo image. The group photo image is generated from a target image and an audience user image of at least one audience user; the target image is a preset playing picture in the shared video that contains the target object, or the target image is the playing picture of the shared video at the moment the interactive option is triggered; an audience user image is an image acquired by an audience user's terminal while the group photo prompt animation is playing.
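A minimal sketch of assembling the group image from the target frame and the captured audience images; images are represented as plain dicts here (a real client would composite pixel buffers captured while the prompt animation plays), so every structure below is an assumption:

```python
def build_group_photo(target_frame, audience_images):
    """Assemble the group image from the target frame and audience user images.

    target_frame: the preset playing picture containing the target object, or
    the frame on screen when the option was triggered.
    audience_images: one capture per audience user, taken by each user's
    terminal during the group photo prompt animation.
    """
    return {
        "background": target_frame,         # frame containing the target object
        "overlays": list(audience_images),  # one capture per audience user
        "count": len(audience_images),
    }

photo = build_group_photo({"frame": "frame-512"},
                          [{"user": "alice"}, {"user": "bob"}])
print(photo["count"])  # 2
```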
In one embodiment, if any of the interaction options refers to an option for watching other shared videos, the interaction response is a playing interface of the other shared videos;
when any interactive option in the at least one interactive option is triggered by any audience user in the at least one audience user, the processor 1001 outputs the interactive response corresponding to that interactive option by performing the following step: when the option for watching other shared videos is triggered, switching from the playing interface displaying the shared video to a playing interface displaying the other shared videos in the shared room of the first application program.
In one embodiment, the interactive window comprises prompt information for triggering an interactive option through voice; any one of the interactive options is triggered by any one or more of the following triggering modes: a touch mode and a voice mode, where the voice mode refers to inputting speech that includes the name of the interactive option.
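The two triggering modes can be resolved with a simple dispatcher; this is a sketch under the assumption that speech has already been converted to text by some recognizer, and all names are illustrative:

```python
def resolve_trigger(option_names, touch_target=None, recognized_speech=None):
    """Resolve which interactive option was triggered.

    Touch mode: touch_target names the tapped option directly.
    Voice mode: an option is triggered when the recognized speech contains
    that option's name, matching the voice mode described above.
    """
    if touch_target in option_names:
        return touch_target
    if recognized_speech:
        for name in option_names:
            if name in recognized_speech:
                return name
    return None

names = ["group photo", "watch other shared videos"]
print(resolve_trigger(names, recognized_speech="let's take a group photo now"))
# group photo
```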
In one embodiment, the at least one audience user included in the shared room includes a master mode user and a friend mode user; the master mode user is the creator of the shared room, and the friend mode user is a contact user of the master mode user in a second application program. The playing interface is an interface displayed in a target terminal for playing the shared video, and the target terminal is any one of the master mode user's terminal and a friend mode user's terminal.
In one embodiment, the processor 1001 is further configured to perform:
displaying a shooting special effect selection interface, wherein the shooting special effect selection interface comprises an image preview area and a special effect selection area, and the special effect selection area comprises a plurality of special effect identifications and a determination control; when any one of the plurality of special effect identifications is selected and the determination control is not triggered, displaying a target preview image in the image preview area, wherein the target preview image is generated based on the user image acquired at the current moment and the special effect indicated by the selected special effect identification; and when any one of the plurality of special effect identifications is selected and the determination control is triggered, determining that the shooting special effect selection is completed.
In one embodiment, the shooting special effect selection interface is displayed when a shooting setting trigger event exists, the shooting setting trigger event including: an opening control for starting the playing of the shared video is triggered, where the opening control is included in a room setting interface in the master mode user's terminal; or a shooting setting option in the playing interface is triggered, where the room setting interface is displayed when the master mode user enters the shared room.
In one embodiment, the playing interface includes a camera control, identification information of a target audience user and identification information of remaining audience users, where the target audience user is an audience user using the target terminal among at least one audience user in the shared room, and the remaining audience users are other audience users except the target audience user among the at least one audience user in the shared room;
if the camera control in the playing interface is in an open state, the identification information of the target audience user is a video picture of the target audience user; if the camera control in the playing interface is in a closed state, the identification information of the target audience user is the user identifier of the target audience user in the second application program;
if a camera control in a playing interface which is displayed by the remaining audience user terminals and used for playing the shared video is in an open state, the identification information of the remaining audience users is the video pictures of the remaining audience users; and if the camera control in the playing interface which is displayed by the remaining audience user terminals and used for playing the shared video is in a closed state, the identification information of the remaining audience users is the user identification of the remaining audience users in the second application program.
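The camera-state rule above reduces to one selection function applied uniformly to the target audience user and each remaining audience user; a minimal sketch with illustrative placeholder values:

```python
def identification_info(camera_on, video_frame, user_id):
    """Pick the identification information shown for an audience user.

    Camera control in the open state -> the user's live video picture;
    closed state -> the user's identifier in the second application program.
    """
    return video_frame if camera_on else user_id

print(identification_info(True, "<live video frame>", "user-42"))   # <live video frame>
print(identification_info(False, "<live video frame>", "user-42"))  # user-42
```

For remaining audience users, the same function is evaluated against the camera state of each remaining user's own terminal rather than the local one.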
In one embodiment, a friend mode user among the at least one audience user joins the shared room by way of an invitation from the master mode user; the target terminal is the master mode user's terminal, and the processor 1001 is further configured to perform:
when a friend invitation option is triggered, displaying a friend invitation window, where the friend invitation window comprises an application program identifier of the second application program; when the application program identifier is selected, displaying a friend mode user selection window, where the friend mode user selection window comprises user identifiers of a plurality of friend mode users; and when a target user identifier is selected from the user identifiers of the plurality of friend mode users, sending invitation information to the friend mode user indicated by the target user identifier, where the invitation information is used to instruct that friend mode user to join the shared room.
In one embodiment, the friend invitation option is displayed in any one or more of the following manners: displayed in the playing interface when the number of friend mode users included in the shared room is smaller than a number threshold; and displayed in a room setting interface of the shared room, where the room setting interface is displayed after the master mode user enters the shared room.
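The display rule for the friend invitation option can be sketched as a pure function; the location strings and parameter names are assumptions for illustration:

```python
def invite_option_locations(friend_count, threshold, room_settings_open):
    """Determine where the friend invitation option is displayed.

    Per the rule above, it appears in the playing interface while the number
    of friend mode users is below the threshold, and in the room setting
    interface whenever that interface is shown.
    """
    locations = []
    if friend_count < threshold:
        locations.append("playing_interface")
    if room_settings_open:
        locations.append("room_setting_interface")
    return locations

print(invite_option_locations(friend_count=3, threshold=8, room_settings_open=True))
# ['playing_interface', 'room_setting_interface']
```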
In one embodiment, the playing interface includes a view option for history browsing; when the view option is triggered, a history browsing window is displayed, the history browsing window including a history browsing map generated according to the historical times at which interactive options were selected in the shared room.
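Generating the history browsing map amounts to ordering the room's selection records by time; a minimal sketch, with the record shape assumed for illustration:

```python
def build_history_map(selection_records):
    """Build a chronological history map from (timestamp, option_name)
    records collected in the shared room."""
    ordered = sorted(selection_records, key=lambda rec: rec[0])
    return [{"time": t, "option": name} for t, name in ordered]

records = [(120.0, "group photo"), (30.0, "watch other shared videos")]
print([entry["option"] for entry in build_history_map(records)])
# ['watch other shared videos', 'group photo']
```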
In the embodiment of the invention, a playing interface of a shared video in a shared room of a first application program is displayed, where the shared room includes at least one audience user; that is, a plurality of audience users in the shared room watch the same shared video, realizing synchronous viewing by multiple people. If the playing of the shared video meets the interaction condition, one or more interaction options are displayed in the playing interface. Further, according to any audience user's triggering of any interaction option, an interaction response corresponding to that interaction option is output, realizing interaction between the user and the video or an object in the video; since the interaction option can be selected by any user, sociality and interactivity are improved.
According to an aspect of the present application, an embodiment of the present invention also provides a computer program product or a computer program, which includes computer instructions stored in a computer-readable storage medium. The processor 1001 reads the computer instructions from the computer-readable storage medium and executes them, so that the terminal performs the interface display methods shown in fig. 2 and 5, specifically: displaying a playing interface of a shared video in a shared room of a first application program, the shared room comprising at least one audience user; when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface; and when any interactive option in the at least one interactive option is triggered by any audience user in the at least one audience user, outputting an interactive response corresponding to that interactive option.
In the embodiment of the invention, a playing interface of a shared video in a shared room of a first application program is displayed, where the shared room includes at least one audience user; that is, a plurality of audience users in the shared room watch the same shared video, realizing synchronous viewing by multiple people. If the playing of the shared video meets the interaction condition, one or more interaction options are displayed in the playing interface. Further, according to any audience user's triggering of any interaction option, an interaction response corresponding to that interaction option is output, realizing interaction between the user and the video or an object in the video; since the interaction option can be selected by any user, sociality and interactivity are improved.

Claims (15)

1. An interface display method, comprising:
displaying a playing interface of a shared video in a shared room of a first application program; the shared room comprises at least one audience user;
when the playing of the shared video meets the interaction condition, displaying at least one interaction option in the playing interface;
and when any interactive option in the at least one interactive option is triggered by any audience user in the at least one audience user, outputting an interactive response corresponding to that interactive option.
2. The method of claim 1, wherein the interaction conditions include any one or more of: playing progress conditions and playing picture conditions;
if the interaction condition comprises a playing progress condition, the fact that the playing of the shared video meets the interaction condition means that the playing progress of the shared video is equal to the target playing progress indicated by the playing progress condition;
if the interaction condition comprises a playing picture condition, the fact that the playing of the shared video meets the interaction condition means that the playing picture in the shared video at the current moment is the target picture indicated by the playing picture condition.
3. The method of claim 1, wherein the at least one interactive option is displayed in an interactive window of the playback interface, and the shared video includes a target object;
the at least one interactive option comprises each interactive option in at least one interactive option to be displayed, and the at least one interactive option to be displayed refers to a preset interactive option related to the shared video; or, the at least one interactive option comprises an interactive option which is selected from the at least one interactive option to be displayed and is matched with the interactive condition;
the at least one interactive option to be displayed comprises any one or more of: options for interacting with the target object and an option for watching other shared videos; the options for interacting with the target object comprise any one or more of an option for taking a group photo with the target object and an option for inputting a voice containing the name of the target object.
4. The method of claim 3, wherein if any of the interaction options is a group photo option with the target object, the interaction response is a group photo image;
when any interactive option in the at least one interactive option is triggered by any audience user in the at least one audience user, outputting an interactive response corresponding to that interactive option comprises:
when the group photo option with the target object is triggered, playing a group photo prompt animation on the playing interface;
after the group photo prompt animation finishes playing, displaying a group photo completion window, wherein the group photo completion window comprises a group photo image;
the group image is generated from a target image and an audience user image of at least one audience user;
the target image refers to a preset playing picture in the shared video, and the preset playing picture comprises the target object; or the target image refers to a playing picture in the shared video when any interactive option is triggered;
the audience user image is an image acquired by an audience user terminal in the process of playing the group photo prompt animation.
5. The method of claim 3, wherein if any of the interactive options is an option to view other shared videos, the interactive response is a play interface of the other shared videos;
when any interactive option in the at least one interactive option is triggered by any audience user in the at least one audience user, outputting an interactive response corresponding to that interactive option comprises:
and when the option for watching other shared videos is triggered, switching from the playing interface displaying the shared video to a playing interface displaying the other shared videos in the shared room of the first application program.
6. The method of claim 3, wherein the interactive window includes prompt information for triggering an interactive option through voice; any one of the interactive options is triggered by any one or more of the following triggering modes: a touch mode and a voice mode, wherein the voice mode refers to inputting speech that includes the name of the interactive option.
7. The method of claim 1, wherein the at least one audience user included in the shared room includes a master mode user and a friend mode user, the master mode user is the creator of the shared room, the friend mode user is a contact user of the master mode user in a second application program, the playing interface is an interface displayed in a target terminal for playing the shared video, and the target terminal is any one of the master mode user's terminal and a friend mode user's terminal.
8. The method of claim 7, wherein the method further comprises:
displaying a shooting special effect selection interface, wherein the shooting special effect selection interface comprises an image preview area and a special effect selection area, and the special effect selection area comprises a plurality of special effect identifications and a determination control;
when any one of the plurality of special effect identifications is selected and the determination control is not triggered, displaying a target preview image in the image preview area, wherein the target preview image is generated based on the user image acquired at the current moment and the special effect indicated by the selected special effect identification;
and when any one of the plurality of special effect identifications is selected and the determination control is triggered, determining that the shooting special effect selection is completed.
9. The method of claim 8, wherein the shooting special effect selection interface is displayed when a shooting setting trigger event exists, the shooting setting trigger event comprising: an opening control for starting the playing of the shared video is triggered, wherein the opening control is included in a room setting interface in the master mode user's terminal; or a shooting setting option in the playing interface is triggered, wherein the room setting interface is displayed when the master mode user enters the shared room.
10. The method of claim 7, wherein the playback interface includes a camera control, identification information of a target audience user that is an audience user using the target terminal among the at least one audience user of the shared room, and identification information of remaining audience users that are other than the target audience user among the at least one audience user of the shared room;
if the camera control in the playing interface is in an open state, the identification information of the target audience user is a video picture of the target audience user; if the camera control in the playing interface is in a closed state, the identification information of the target audience user is the user identifier of the target audience user in the second application program;
if a camera control in a playing interface which is displayed by the remaining audience user terminals and used for playing the shared video is in an open state, the identification information of the remaining audience users is the video pictures of the remaining audience users; and if the camera control in the playing interface which is displayed by the remaining audience user terminals and used for playing the shared video is in a closed state, the identification information of the remaining audience users is the user identification of the remaining audience users in the second application program.
11. The method of claim 7, wherein the friend mode user of the at least one audience user joins the shared room by way of an invitation from the master mode user; the target terminal is the master mode user's terminal, and the method further comprises:
when the friend invitation option is triggered, displaying a friend invitation window, wherein the friend invitation window comprises an application program identifier of the second application program;
selecting the application program identifier, and displaying a friend mode user selection window, wherein the friend mode user selection window comprises user identifiers of a plurality of friend mode users;
and selecting a target user identifier from the user identifiers of the plurality of friend mode users, and sending invitation information to the friend mode user indicated by the target user identifier, wherein the invitation information is used for instructing the friend mode user indicated by the target user identifier to join the shared room.
12. The method of claim 11, wherein the friend invitation option is displayed in any one or more of the following manners: displayed in the playing interface when the number of friend mode users included in the shared room is smaller than a number threshold; and displayed in a room setting interface of the shared room, wherein the room setting interface is displayed after the master mode user enters the shared room.
13. The method of claim 1, wherein the playing interface comprises a view option for browsing history; when the view option is triggered, a browsing history window is displayed, wherein the browsing history window comprises a browsing history map, and the browsing history map is generated based on the historical times at which interaction options were selected in the shared room.
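The browsing-history map of claim 13 is generated from the times at which interaction options were selected; one way to sketch that aggregation (the data shapes and names are assumptions):

```python
from collections import OrderedDict

def build_history_map(selections):
    """Group interaction-option selections by option, keeping selection times.

    `selections` is a list of (timestamp, option) pairs; the result maps each
    option to the ordered list of times it was selected in the shared room.
    (Structure is illustrative, not from the patent.)
    """
    history = OrderedDict()
    for timestamp, option in sorted(selections):
        history.setdefault(option, []).append(timestamp)
    return history

history = build_history_map([(120, "cheer"), (45, "vote"), (200, "cheer")])
```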
14. An interface display device, comprising:
a display unit, used for displaying a playing interface of a shared video in a shared room of a first application program, wherein the shared room comprises at least one audience user;
the display unit is further used for displaying at least one interaction option in the playing interface when the playing of the shared video meets an interaction condition;
and an output unit, used for outputting an interactive response corresponding to any interaction option of the at least one interaction option when the interaction option is triggered by any audience user of the at least one audience user.
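The device of claim 14 pairs a display unit with an output unit; a minimal sketch of that structure, in which the method names and the interaction-condition check are assumptions rather than part of the patent:

```python
class InterfaceDisplayDevice:
    """Sketch of claim 14's device: a display unit that shows the playing
    interface (and interaction options once the interaction condition is met),
    and an output unit that emits a response when an option is triggered."""

    def __init__(self, interaction_options):
        self.interaction_options = interaction_options
        self.shown = []  # what the display unit has rendered so far

    def display_playing_interface(self, video_id, interaction_condition_met):
        # Display unit: playing interface, then the options if the condition holds.
        self.shown.append(f"playing:{video_id}")
        if interaction_condition_met:
            self.shown.extend(f"option:{o}" for o in self.interaction_options)

    def output_response(self, option):
        # Output unit: interactive response for a triggered option.
        if option not in self.interaction_options:
            raise ValueError("unknown interaction option")
        return f"response:{option}"

device = InterfaceDisplayDevice(["cheer", "vote"])
device.display_playing_interface("shared-video-1", interaction_condition_met=True)
response = device.output_response("vote")
```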
15. A terminal, comprising:
a processor adapted to implement one or more instructions; and
a computer storage medium having stored thereon one or more instructions adapted to be loaded by the processor to perform the interface display method of any one of claims 1 to 13.
CN202011184663.5A 2020-10-29 2020-10-29 Interface display method, device, equipment and storage medium Active CN114430494B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011184663.5A CN114430494B (en) 2020-10-29 2020-10-29 Interface display method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114430494A true CN114430494A (en) 2022-05-03
CN114430494B CN114430494B (en) 2024-04-09

Family

ID=81308879

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011184663.5A Active CN114430494B (en) 2020-10-29 2020-10-29 Interface display method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114430494B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115225602A (en) * 2022-06-29 2022-10-21 赤子城网络技术(北京)有限公司 Social application processing method and system

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104822090A (en) * 2014-04-25 2015-08-05 腾讯科技(北京)有限公司 Video playing method, device and system
CN105608715A (en) * 2015-12-17 2016-05-25 广州华多网络科技有限公司 Online group shot method and system
CN105959207A (en) * 2016-05-17 2016-09-21 广州酷狗计算机科技有限公司 Audio and video sharing method and device
CN106210757A (en) * 2016-07-28 2016-12-07 北京小米移动软件有限公司 Live broadcasting method, live broadcast device and live broadcast system
CN106411687A (en) * 2015-07-31 2017-02-15 腾讯科技(深圳)有限公司 Method and apparatus for interaction between network access device and bound user
CN106534953A (en) * 2016-12-09 2017-03-22 北京小米移动软件有限公司 Video rebroadcasting method for live streaming application and control terminal
CN106533924A (en) * 2016-12-19 2017-03-22 广州华多网络科技有限公司 Instant messaging method and device
CN107333167A (en) * 2017-05-22 2017-11-07 武汉斗鱼网络科技有限公司 A kind of processing method, device and the electronic equipment of video-see record
CN107465937A (en) * 2017-06-30 2017-12-12 武汉斗鱼网络科技有限公司 A kind of processing method, device and the electronic equipment of video-see record
CN108111918A (en) * 2017-12-08 2018-06-01 深圳岚锋创视网络科技有限公司 Interactive approach, device and live streaming client during a kind of panoramic video live streaming
CN109688480A (en) * 2019-01-14 2019-04-26 广州虎牙信息科技有限公司 A kind of live broadcasting method, terminal device and storage medium
CN110166799A (en) * 2018-07-02 2019-08-23 腾讯科技(深圳)有限公司 Living broadcast interactive method, apparatus and storage medium
CN111314773A (en) * 2020-01-22 2020-06-19 广州虎牙科技有限公司 Screen recording method and device, electronic equipment and computer readable storage medium
CN111385632A (en) * 2020-03-06 2020-07-07 腾讯科技(深圳)有限公司 Multimedia interaction method
CN111654730A (en) * 2020-06-05 2020-09-11 腾讯科技(深圳)有限公司 Video playing method, data processing method, related device and medium
CN111698566A (en) * 2020-06-04 2020-09-22 北京奇艺世纪科技有限公司 Video playing method and device, electronic equipment and storage medium
CN111836114A (en) * 2020-07-08 2020-10-27 北京达佳互联信息技术有限公司 Video interaction method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
RU2527199C2 (en) Avatar integrated shared media selection
CN113395533B (en) Virtual gift special effect display method and device, computer equipment and storage medium
US10856035B2 (en) Message sharing method, client, and computer storage medium
WO2022087920A1 (en) Video playing method and apparatus, and terminal and storage medium
CN112905074B (en) Interactive interface display method, interactive interface generation method and device and electronic equipment
CN113518264A (en) Interaction method, device, terminal and storage medium
JP2023538958A (en) Photography methods, equipment, electronic equipment and computer-readable storage media
CN112261481B (en) Interactive video creating method, device and equipment and readable storage medium
CN109195003B (en) Interaction method, system, terminal and device for playing game based on live broadcast
CN113411656B (en) Information processing method, information processing device, computer equipment and storage medium
CN113457123A (en) Interaction method and device based on cloud game, electronic equipment and readable storage medium
WO2022142944A1 (en) Live-streaming interaction method and apparatus
CN114466209A (en) Live broadcast interaction method and device, electronic equipment, storage medium and program product
JPWO2018074516A1 (en) Information processing system
US20200349749A1 (en) Virtual reality equipment and method for controlling thereof
WO2023093698A1 (en) Interaction method for game live-streaming, and storage medium, program product and electronic device
WO2023098011A1 (en) Video playing method and electronic device
CN113518240A (en) Live broadcast interaction method, virtual resource configuration method, virtual resource processing method and device
CN109788327B (en) Multi-screen interaction method and device and electronic equipment
CN110446090A (en) A kind of virtual auditorium spectators bus connection method, system, device and storage medium
CN114430494B (en) Interface display method, device, equipment and storage medium
CN112188223B (en) Live video playing method, device, equipment and medium
CN109688347A (en) Multi-screen interaction method, device and electronic equipment
CN112261482B (en) Interactive video playing method, device and equipment and readable storage medium
CN114760520A (en) Live small and medium video shooting interaction method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40070392

Country of ref document: HK

GR01 Patent grant