WO2019228120A1 - Video interaction method and device, terminal, and storage medium


Info

Publication number
WO2019228120A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
interactive
control
interface
game
Prior art date
Application number
PCT/CN2019/084930
Other languages
English (en)
Chinese (zh)
Inventor
崔凌睿
张然
江会福
郑尚镇
林福源
钟雨
王方晓
张震云
吴歆婉
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Publication of WO2019228120A1
Priority to US16/920,863 (US11178471B2)

Classifications

    • H04N21/8545 Content authoring for generating interactive applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4312 Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4438 Window management, e.g. event handling following interaction with the user interface
    • H04N21/47205 End-user interface for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H04N21/4725 End-user interface for requesting additional data associated with the content using interactive regions of the image, e.g. hot spots
    • H04N21/4781 Games
    • H04N21/8173 End-user applications, e.g. Web browser, game
    • H04N21/8541 Content authoring involving branching, e.g. to different story endings

Definitions

  • the embodiments of the present application relate to the field of multimedia, and in particular, to a video interaction method, device, terminal, and storage medium.
  • Short videos are a means of spreading Internet content. Generally, short videos are videos of no more than 5 minutes spread on short video platforms. Through the short video program corresponding to a short video platform, a user can watch short videos uploaded by other users, and can also shoot and upload short videos.
  • On the short video platform, a first user can watch short videos uploaded by a second user whom the first user follows, or watch recommended short videos in the short video recommendation area.
  • Optionally, the recommended short video may be determined by the server according to the playback count and/or forwarding count of the short video.
  • the first user may interact with the second user who uploaded the short video with respect to the content of the viewed short video, such as: liking the short video, forwarding the short video, or commenting on the short video.
  • a video interaction method, device, terminal, and storage medium are provided.
  • a video interaction method which includes:
  • Playing an interactive video in a video playing interface, where the playing interface is an interface for playing a video, and the interactive video includes a target story node;
  • In another aspect, a video interaction device is provided, which includes:
  • a playing module configured to play an interactive video in a video playing interface, where the playing interface is an interface for playing a video, and the interactive video includes a target story node;
  • a display module configured to display an interactive control on a target video picture of the interactive video when the interactive video is played to the target story node;
  • a receiving module configured to receive a trigger operation on the interactive control;
  • the display module is further configured to display the interactive content corresponding to the interactive control according to the trigger operation.
  • In another aspect, a terminal is provided, including a memory and a processor. The memory stores computer-readable instructions, and when the computer-readable instructions are executed by the processor, the processor is caused to perform the following steps:
  • Playing an interactive video in a video playing interface, where the playing interface is an interface for playing a video, and the interactive video includes a target story node;
  • In another aspect, a computer-readable storage medium is provided, storing computer-readable instructions. When the computer-readable instructions are executed by a processor, the processor is caused to perform the following steps:
  • Playing an interactive video in a video playing interface, where the playing interface is an interface for playing a video, and the interactive video includes a target story node;
  • FIG. 1 is a schematic diagram of a video playback system provided by an exemplary embodiment of the present application.
  • FIG. 2 is a flowchart of a video interaction method provided by an exemplary embodiment of the present application.
  • FIG. 3 is a schematic diagram of a video playing interface provided by an exemplary embodiment of the present application.
  • FIG. 4 is a schematic diagram of configuration parameters for configuring an interactive video provided by an exemplary embodiment of the present application.
  • FIG. 5 is a flowchart of a video interaction method according to another exemplary embodiment of the present application.
  • FIG. 6 is a schematic diagram of a video playback interface provided by another exemplary embodiment of the present application.
  • FIG. 7 is a schematic diagram of interactive content provided by an exemplary embodiment of the present application.
  • FIG. 8 is a schematic diagram of a video playing interface provided by another exemplary embodiment of the present application.
  • FIG. 9 is a schematic diagram of a video playing interface provided by another exemplary embodiment of the present application.
  • FIG. 10 is a flowchart of a video interaction method according to another exemplary embodiment of the present application.
  • FIG. 11 is a structural block diagram of a video interactive device provided by an exemplary embodiment of the present application.
  • FIG. 12 is a structural block diagram of a video interactive apparatus according to another exemplary embodiment of the present application.
  • FIG. 13 is a structural block diagram of a device for interacting in a short video program according to another exemplary embodiment of the present application.
  • FIG. 14 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • FIG. 15 is a structural block diagram of a server provided by an exemplary embodiment of the present application.
  • FIG. 16 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • Short video is a means of disseminating Internet content.
  • Generally, short videos are video content disseminated within a short video program with a duration within a preset length (for example, 5 minutes, 20 seconds, or 10 seconds), and most short videos are user-generated content.
  • the user can watch short videos uploaded by other users through the short video program, and can also shoot and upload short videos through the short video program.
  • Interactive short video: a short video with interactive functions, or a short video on which interactive controls are superimposed to achieve interactive functions.
  • the interactive short video is a short video played in an interactive storyline.
  • the interactive short video includes at least one story node, and the at least one story node includes a target story node.
  • When the interactive short video is played to the target story node, interactive controls are superimposed and displayed on the current video picture.
  • the terminal displays interactive content, and the interactive content is content that can perform multiple forms of interaction with the user.
  • the interactive short video may be referred to as a short feed video in the embodiments of the present application.
  • A feed is an information outlet in a standard format that continuously updates itself in some form; the interactive short video can be understood as a post in the short video community, that is, a video in the dynamic stream.
  • The interactive short video is played in the playback interface in the form of a normal short video; that is, interactive short videos are interspersed among the ordinary short videos played in the terminal.
  • Video stream: multiple short videos played one after another in a short video program.
  • Interactive storyline: the storyline of the interactive short video, that is, the various events in the interactive short video connected into story nodes according to time (or place).
  • The interactive storyline includes at least one story node. For example: in the first second, video character A gets up; in the third second, video character A pours water; in the fifth second, video character A drinks water. Then the first second corresponds to a story node, the third second corresponds to a story node, and the fifth second corresponds to a story node, and the three story nodes are connected in sequence to form the interactive storyline of the interactive short video.
  • Story node: used to represent a landmark event point and/or a plot change point in the story development of the short video.
  • In the above example, the three story nodes are: video character A gets up, video character A pours water, and video character A drinks water. The first story node indicates that the state of video character A changed from sitting to standing up; the second story node indicates that the state of video character A changed from standing to pouring water; the third story node indicates that the state of video character A changed from holding water to drinking water.
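The storyline and story nodes described above can be sketched as a timestamped list. This is a minimal illustration; the class and function names are assumptions, not part of this application.

```python
from dataclasses import dataclass

@dataclass
class StoryNode:
    time_s: int   # second of the storyline at which the node occurs
    event: str    # landmark event / plot change at this node

# The three example nodes from the text, connected in time order.
storyline = [
    StoryNode(1, "video character A gets up"),
    StoryNode(3, "video character A pours water"),
    StoryNode(5, "video character A drinks water"),
]

def node_at(nodes, t):
    """Return the story node occurring at playback second t, if any."""
    return next((n for n in nodes if n.time_s == t), None)
```

Checking the current playback second against such a list is one simple way a player could decide when a target story node has been reached.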
  • Electronic red envelope: an electronic carrier used to send virtual items through a network.
  • the virtual items include at least one of virtual currency, virtual points, virtual tickets, virtual pets, and virtual props.
  • FIG. 1 is a schematic diagram of a video playback system provided by an exemplary embodiment of the present application. As shown in FIG. 1, the system includes a video upload terminal 11, a video viewing terminal 12, a server 13, and a communication network 14.
  • the video upload terminal 11 is configured to upload related information of the ordinary video and the interactive video to the server 13 through the communication network 14.
  • the video upload terminal 11 may upload the video used as the material and the display configuration information of the interactive controls to generate an interactive video in the server 13.
  • operations to be performed include at least one of a shooting operation, a configuration operation, and an adjustment operation.
  • After configuring the interactive controls and adjusting the video parameters of the interactive video, the video upload terminal 11 uploads the video and the display configuration information of the interactive controls to the server 13 through the communication network 14. The video parameters include video duration and video clarity.
  • Alternatively, a previously shot video may be obtained as the material, and after interactive controls are added to the video, the video and the display configuration information of the interactive controls are uploaded to the server 13.
  • The configuration of the interactive control includes at least one of: the display time of the interactive control, the display position of the interactive control, the display elements of the interactive control, the interactive content corresponding to the interactive control, and the number of interactive controls.
  • the display configuration information of the interactive control is sent to the server 13 for storage.
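The display configuration information listed above might be serialized along the following lines. All field names and the URL are illustrative assumptions; this application does not specify a schema.

```python
# Hypothetical display configuration info for one interactive control,
# covering the configurable items named in the text: display time,
# display position, display elements, interactive content, and the
# number of controls.
display_config = {
    "controls": [
        {
            "timestamp_ms": 90_000,              # display time (target story node)
            "position": {"x": 0.42, "y": 0.71},  # display position, normalized screen coords
            "element": "red_envelope_icon",      # display element of the control
            "interaction": {                     # corresponding interactive content
                "type": "virtual_item",
                "jump_link": "https://example.com/red-envelope",  # placeholder link
            },
        }
    ],
}

def control_count(config):
    """Number of interactive controls configured for this video."""
    return len(config["controls"])
```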
  • the video upload terminal 11 may be a mobile terminal such as a mobile phone, a tablet computer, or a smart watch, or a terminal such as a desktop computer or a portable laptop computer.
  • Optionally, the video upload terminal 11 may be a terminal used by a manager or by operation and maintenance personnel.
  • the video viewing terminal 12 is configured to obtain a video from the server 13 for viewing through the communication network 14.
  • the video obtained by the video viewing terminal 12 from the server 13 includes related information of the interactive video.
  • the related information of the interactive video includes: video and display configuration information of interactive controls that need to be displayed in the video.
  • the playback interface of the video viewing terminal 12 includes interactive controls corresponding to the interactive video.
  • When the interactive control is triggered in the playback interface of the video viewing terminal 12, the interactive content corresponding to the interactive control is displayed on the playback interface.
  • the video viewing terminal 12 may be a mobile terminal such as a cell phone, a tablet computer, a smart watch, or a terminal such as a desktop computer or a portable laptop computer.
  • the server 13 is configured to receive a video uploaded by the video upload terminal 11 through the communication network 14.
  • the server 13 is further configured to store display configuration information of the interactive controls in the interactive video.
  • the server 13 is further configured to send the video to the video viewing terminal 12.
  • When the server 13 sends the interactive video to the video viewing terminal 12 and the interactive video is played to the target story node on the video viewing terminal 12, the video viewing terminal 12 displays interactive controls according to the display configuration information, where the display configuration information is included in the interactive video sent to the video viewing terminal 12.
  • the server 13 may be an independent server or a group of servers.
  • the server 13 may be a physical server or a cloud server, which is not limited in the embodiment of the present application.
  • Optionally, the server 13 includes a shooting server 131, a virtual item receiving server 132, a game server 133, and an element storage server 134. The shooting server 131 is configured to store interactive videos that include a start-shooting control and send them to the video viewing terminal 12; the virtual item receiving server 132 is configured to store interactive videos that include a receive-virtual-item control and send them to the video viewing terminal 12; the game server 133 is configured to store interactive videos that include a start-game control and send them to the video viewing terminal 12; and the element storage server 134 is configured to store the display elements of the interactive content corresponding to the interactive controls in the interactive videos. For example, when the interactive content is a shooting interface, the display elements are the shooting elements in the shooting interface; when the user triggers the interactive control, the video viewing terminal 12 obtains the shooting elements from the element storage server 134 for video shooting.
  • the communication network 14 may be a wired communication network or a wireless communication network.
  • FIG. 2 is a flowchart of a video interaction method provided by an exemplary embodiment of the present application.
  • The method is described using its application to the video viewing terminal 12 shown in FIG. 1 as an example. As shown in FIG. 2, the method includes:
  • Step 201 Play an interactive video on a video playing interface.
  • Optionally, the video playback interface may be a playback interface in a video program, where the video program is an application program with a video playback capability.
  • Optionally, the video program also has at least one of a video download function, a video capture function, a video upload function, a user account registration and login function, a like function, a comment function, a friend-adding function, a function of following other users, and an instant chat function.
  • The video program may also be a short video program, and the video played in the short video program may be a short video or an interactive short video.
  • the playback interface is an interface for playing a video in a video program, and the interactive video includes a target story node.
  • the playback interface occupies the entire area or most of the screen area. That is, when the interactive video is played on the playback interface of the video program, the interactive video may be displayed on the entire screen in the playback interface of the terminal, or the interactive video may be displayed in a local area of the playback interface of the terminal.
  • the interactive video includes at least one story node, and the at least one story node includes the target story node.
  • The story node is used to indicate a landmark event point and/or a plot change point in the plot of the video.
  • For details, please refer to the introduction of the above terms.
  • Step 202 When the interactive video is played to the target story node, an interactive control is displayed on the target video picture of the interactive video.
  • The target video picture may be a single-frame picture, a continuous multi-frame picture (for example, the pictures from 1:30 to 1:32), or a discontinuous multi-frame picture (for example, the pictures from 1:30 to 1:32 together with the pictures from 1:35 to 1:40).
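Whether the current playback time falls inside a possibly discontinuous target video picture can be checked with a simple interval test. This is a sketch; the ranges below restate the 1:30-1:32 and 1:35-1:40 example converted to seconds.

```python
def in_target_picture(ranges, t):
    """True if playback time t (in seconds) falls within any of the
    target video picture ranges; the ranges may be discontinuous."""
    return any(start <= t <= end for start, end in ranges)

# 1:30-1:32 -> 90-92 s, 1:35-1:40 -> 95-100 s
target_ranges = [(90, 92), (95, 100)]
```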
  • When the terminal displays the interactive control on the target video picture, the interactive control may be displayed at a random position on the target video picture of the interactive short video, or a local area may be determined in the target video picture of the interactive short video and the interactive control displayed in that local area, where the local area is a position that has an interactive relationship with a video element in the video picture.
  • The existence of an interactive relationship means that the local area where the interactive control is displayed is associated with the state of a video element in the video picture.
  • an interactive short video is played in the playback interface 31, and the interactive short video includes a video character 33.
  • the video character 33 is saying "provide a red envelope to everyone".
  • An interactive control 32 is displayed in a local area, at the hand position of the video character 33, to indicate the action of the video character 33 sending an electronic red envelope.
  • the interactive short video includes display configuration information of the interactive control, and the display configuration information is used to determine at least one of a display time, a display position, a display element, and corresponding interactive content of the interactive control in the interactive short video.
  • the interactive content is in the form of a web page.
  • the interactive content is represented by a web page link.
  • a schema is used to represent data information of the interactive content.
  • When the terminal displays the interactive control, the display configuration information of the interactive control is first obtained. The display configuration information includes timestamp information corresponding to the target story node (which determines the display time of the interactive control in the interactive short video), coordinate information of the local position (which determines the display position of the interactive control in the interactive short video), and the control elements of the interactive control. The target video picture of the interactive short video is determined according to the timestamp information, where the target video picture is the video picture corresponding to the target story node; the local position on the target video picture is determined according to the coordinate information, and the interactive control is displayed at the local position according to the control elements.
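The lookup described above (timestamp information selects the target video picture, coordinate information selects the local position) can be sketched as follows. The field names follow a hypothetical configuration layout and are not defined in this application.

```python
# Hypothetical display configuration (illustrative field names).
display_config = {
    "controls": [
        {
            "timestamp_ms": 90_000,              # target story node time
            "position": {"x": 0.42, "y": 0.71},  # local position, normalized
            "element": "red_envelope_icon",      # control element to render
        },
    ],
}

def control_to_display(config, playhead_ms, tolerance_ms=100):
    """Return (x, y, element) for a control whose configured timestamp
    matches the current playhead within tolerance_ms, else None."""
    for ctrl in config["controls"]:
        if abs(ctrl["timestamp_ms"] - playhead_ms) <= tolerance_ms:
            pos = ctrl["position"]
            return pos["x"], pos["y"], ctrl["element"]
    return None
```

A player loop would call such a lookup on each playback tick and draw the control element at the returned coordinates while a match exists.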
  • Step 203 Receive a trigger operation on the interactive control.
  • the trigger operation is used to display the interactive content corresponding to the interactive control.
  • the interactive content is an interactive user interface different from the playback interface, and / or, the interactive content is an interface element displayed superimposed on the playback interface, and / or, the interactive content is an interactive video played in the playback interface.
  • the user can trigger the interactive control in the playback interface.
  • When the terminal is a mobile terminal such as a mobile phone or a tablet computer, the user may press and hold the interactive control in the playback interface as the trigger operation on the interactive control; when the display screen of the terminal is a pressure-sensitive touch screen, the user may also force-touch the interactive control as the trigger operation.
  • When the terminal is a desktop computer or a portable laptop computer, the user can trigger the interactive control through input from an external device. For example, the user can click the interactive control with a mouse to complete the trigger operation, or press a keyboard shortcut to complete the trigger operation on the interactive control.
  • Step 204 Display the interactive content corresponding to the interactive control according to the trigger operation.
  • Optionally, the interactive control includes a jump link; the terminal determines the jump link corresponding to the interactive control according to the trigger operation, where the jump link is a link for displaying the interactive content.
  • the interactive content includes at least one of an interactive user interface, interface elements superimposed and displayed in a playback interface, and an interactive video played in the playback interface.
  • an interactive user interface is displayed according to the jump link.
  • the interactive user interface includes at least one of a shooting interface, a virtual item receiving interface, and a game interface.
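Dispatching a trigger operation to one of these interactive user interfaces could be sketched as a simple mapping. The control-type strings and interface names here are illustrative assumptions, not identifiers from this application.

```python
# Map from a hypothetical control type to the interactive user
# interface it jumps to when triggered.
INTERFACE_FOR_CONTROL = {
    "start_shooting": "shooting_interface",
    "receive_virtual_item": "virtual_item_receiving_interface",
    "start_game": "game_interface",
}

def on_trigger(control_type):
    """Resolve which interactive user interface to open for a control."""
    try:
        return INTERFACE_FOR_CONTROL[control_type]
    except KeyError:
        raise ValueError(f"unknown interactive control: {control_type}")
```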
  • In summary, in the video interaction method provided in this embodiment, an interactive control is added to the playback interface when the interactive video is played to the target story node, so that the correspondence between the video content of the interactive video and the interactive control guides the user to trigger the interactive control.
  • When the interactive video is played to the target story node, the interactive control corresponding to the target story node is displayed, which attracts the user to trigger the interactive control and display the interactive content corresponding to the interactive control.
  • In this way, a user can not only perform actions such as watching or liking a single video, but can also interact with the video through the interactive controls in the video, which increases the interest of the video and avoids the problem of a relatively monotonous form of video interaction.
  • In the following embodiments, the interactive video is described as an interactive short video, the video program is a short video application, and the interactive content is an interactive user interface.
  • The configuration of the interactive short video involves at least three considerations. As shown in FIG. 4, the considerations of the interactive short video include:
  • The position in the video stream: that is, at which position in the video stream the interactive short video is configured to appear, and its validity period. For example, the interactive short video is configured to appear at the third position of the video stream, with a validity period from May 20, 2018 to May 30, 2018. In one embodiment, the interactive short video may also be targeted by specified gender, age, region, device model, and version number.
  • The position in the video storyline: that is, at which second of the interactive video the interactive control appears, and the number of interactive controls configured in the same interactive short video.
  • The interaction behavior: that is, the interactive user interface jumped to when the interactive control is triggered.
  • the interactive user interface includes at least the following four situations:
  • FIG. 5 is a flowchart of a method for interacting in a short video program according to another exemplary embodiment of the present application. The method is described using its application to the short video viewing terminal 12 shown in FIG. 1 as an example. As shown in FIG. 5, the method includes:
  • Step 501 Play an interactive short video on a playback interface of a short video program.
  • the playback interface is an interface for playing a short video
  • the interactive short video includes a target story node.
  • the interactive short video includes at least one story node, and the at least one story node includes the target story node.
  • the short video program plays the short videos in sequence according to the arrangement order of the short videos in the video stream.
  • For the playback of the interactive short video, at least one of the following situations is included:
  • the video stream includes at least two short videos arranged in sequence; during the playback of the video stream on the playback interface, determine the short video that has been played on the playback interface, Play the interactive short video in the playback interface when the next short video playback position is the playback position of the interactive short video;
• the video stream includes at least two short videos arranged in sequence; when the playback position of the target short video is the i-th playback position in the video stream, the interactive short video is inserted between the i-th and (i+1)-th playback positions in the video stream; or, when the target short video is at the i-th playback position in the video stream, the playback position of the interactive short video is determined as the (i+1)-th playback position in the video stream, and the short videos in the video stream are played in turn on the playback interface.
• for example, the playback position of the target short video a in the video stream is the 8th playback position, so the interactive short video A is inserted between the 8th and 9th playback positions, or the 9th playback position in the video stream is determined as the playback position of the interactive short video A.
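The insertion logic above can be sketched as a simple list operation. This is an illustrative, non-limiting sketch; the function and variable names (`schedule_interactive_video`, `stream`, `target_id`) are invented here rather than taken from the embodiment:

```python
def schedule_interactive_video(stream, interactive_video, target_id):
    """Place the interactive short video immediately after the target short
    video, so that it occupies the (i+1)-th playback position."""
    result = list(stream)
    for i, video_id in enumerate(result):
        if video_id == target_id:
            # Insert between the i-th and (i+1)-th playback positions.
            result.insert(i + 1, interactive_video)
            break
    return result
```

For example, if target short video a sits at the 8th playback position, the interactive short video A ends up at the 9th.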
  • step 502 when the interactive short video is played to the target story node, an interactive control is displayed on the target video screen of the interactive short video.
  • the target video picture may be a one-frame picture, or may be a continuous multi-frame picture, or may be a discontinuous multi-frame picture.
• when the interactive control is displayed, it may be displayed at a random position of the target video frame of the interactive short video, or a local area may be determined in the target video frame of the interactive short video and the interactive control displayed at that local position, where the local area is a position that has an interactive relationship with a video element in the video picture.
  • the existence of an interactive relationship means that a local area where an interactive control is displayed is associated with a state of a video element in a video picture.
  • the interactive short video includes display configuration information of the interactive control, and the display configuration information is used to determine at least one of a display time, a display position, a display element, and corresponding interactive content of the interactive control in the interactive short video.
  • the display configuration information is used to determine at least one of a display time, a display position, a display element, and corresponding interactive content of the interactive control in the interactive short video.
• when an interactive control is displayed at a local position of a target video frame of an interactive short video, the display configuration information of the interactive control is first obtained. The display configuration information includes time stamp information corresponding to the target story node (that is, determining the display time of the interactive control in the interactive short video), coordinate information of the local position (that is, determining the display position of the interactive control in the interactive short video), and the control elements of the interactive control. The target video frame of the interactive short video is determined based on the time stamp information, where the target video frame is the video frame corresponding to the target story node; the local position on the target video frame is determined according to the coordinate information, and the interactive control is displayed at the local position according to the control elements.
• the struct stWSInteractiveFeed is used to indicate the configuration of the interactive short video
• 0 to 9 are the serial numbers of the fields
• optional string and similar qualifiers indicate the format of each parameter
  • feedId is the video ID of the interactive short video
  • feedType is the type of the interactive short video.
  • the types of interactive short videos include at least one of star interaction, commercial interaction, and gamification interaction.
• rectX represents the horizontal distance between the display area of the interactive control and the upper left corner of the interactive short video. The value is a percentage of the video width of the interactive short video.
• rectY represents the vertical distance between the display area of the interactive control and the upper left corner of the interactive short video. The value is a percentage of the video height of the interactive short video.
• rectWidth indicates the width of the display area of the interactive control
  • rectHeight indicates the height of the display area of the interactive control
  • buttonSrc indicates the picture displayed in the display area of the interactive control. It is transparent by default.
• appearanceTime indicates the moment at which the interactive control appears during the playback of the interactive short video
  • durationTime indicates the duration of the interactive control's continuous display
  • actionScheme indicates the page to which the interactive control jumps.
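As an illustrative, non-limiting sketch, the stWSInteractiveFeed configuration could be modeled as follows. The field names follow the list above; the `display_rect` helper, and the assumption that rectWidth/rectHeight are also percentages of the video size, are additions for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WSInteractiveFeed:
    """Sketch of the stWSInteractiveFeed configuration; every field is
    optional, mirroring the 'optional' qualifier in the description."""
    feedId: Optional[str] = None          # video ID of the interactive short video
    feedType: Optional[str] = None        # star / commercial / gamification interaction
    rectX: Optional[float] = None         # left offset, % of video width
    rectY: Optional[float] = None         # top offset, % of video height
    rectWidth: Optional[float] = None     # control width (assumed % of width)
    rectHeight: Optional[float] = None    # control height (assumed % of height)
    buttonSrc: Optional[str] = None       # image shown in the control; transparent by default
    appearanceTime: Optional[int] = None  # moment the control appears
    durationTime: Optional[int] = None    # how long the control stays displayed
    actionScheme: Optional[str] = None    # page the control jumps to

def display_rect(feed, video_w, video_h):
    """Convert the percentage-based fields to a pixel rectangle (x, y, w, h)."""
    return (round(feed.rectX / 100 * video_w),
            round(feed.rectY / 100 * video_h),
            round(feed.rectWidth / 100 * video_w),
            round(feed.rectHeight / 100 * video_h))
```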
  • Step 503 Receive a trigger operation on the interactive control.
  • the trigger operation is used to jump to the interactive user interface corresponding to the interactive control.
  • the user can trigger the interactive control in the playback interface.
• when the terminal is a mobile terminal such as a mobile phone or a tablet computer, the user may press and hold the interactive control in the playback interface as the trigger operation for the interactive control; when the display screen of the terminal is a pressure-sensitive touch screen, the user can also apply pressure touch to the interactive control as the trigger operation for the interactive control.
• when the terminal is a desktop computer or a portable laptop, the user can complete the trigger operation on the interactive control through input from an external device.
• Step 504 Display the shooting interface corresponding to the start shooting control according to the trigger operation.
  • the interactive control includes a start shooting control, and a shooting interface corresponding to the start shooting control is displayed according to the trigger operation, and the shooting interface is used for short video shooting in combination with the shooting material corresponding to the target story node.
  • the shooting control and the target story node include at least one of the following situations:
  • the target story node is a node that makes a call with a character in the target video screen, and the control elements that start the shooting control include an answer call icon.
• for example, an interactive short video is played on the playback interface 61; when it is played to the target story node of the interactive short video (making a call with star A in the target video screen 62), an answer call icon 63 is displayed. After the user clicks the answer call icon 63, a shooting interface 64 is displayed.
  • the shooting interface 64 includes an image 65 collected by a camera.
• the shooting interface 64 is used for short video shooting in combination with the shooting material 66 corresponding to the target story node (that is, star A in the target video picture 62).
  • the target story node is the node where the music video playback ends.
  • the control elements that start the shooting control include a music icon.
  • the music video may be a music video (MV) of a song, or a video of music introduction.
  • MV music video
  • the interactive short video is a video that plays a music video.
  • a start shooting control including a music icon is displayed in the playback interface.
• when the start shooting control is clicked, a shooting interface is displayed, and the shooting interface is used for short video shooting with the music corresponding to the music video as the background music.
  • the target story node is a node that displays the video material in the target video picture.
  • the control elements that start the shooting control include icons corresponding to the camera.
  • the video material "crown” is displayed in the target video screen of the interactive short video.
  • a shooting interface is displayed.
• the shooting interface is used for short video shooting with "crown" as the shooting material, for example, identifying a face and displaying the crown on top of the face for short video shooting.
  • Step 505 Obtain a short video shot through the shooting interface.
• after the user performs short video shooting through the shooting interface, the terminal automatically generates the short video obtained by shooting.
  • Step 506 Publish the short video to a network platform corresponding to the short video program.
  • the terminal may automatically publish the short video to a network platform corresponding to the short video program, or may publish the short video to the network platform corresponding to the short video program after the user performs a publishing operation.
  • the short video is published to the network platform corresponding to the short video program
  • other users can view the short video in other terminals or in the terminal through the short video program or other applications that can open the short video.
  • a short video obtained by shooting is obtained.
  • the short video will be published to the web platform corresponding to the short video program.
• the user can also enter the title of the short video using the title input control 73.
• the title of the short video can be the default title corresponding to the shooting material of the short video. For example, when the shooting material is "Crown", the default title can be "Crown Show".
  • Step 507 Publish the short video to a contact or information interaction platform corresponding to the social application.
• the user can share the short video with a contact or information interaction platform corresponding to a social application.
• when the user shares the short video with a contact corresponding to a social application, the short video may be uploaded by default to the web platform corresponding to the short video program with its visibility set to publisher-only, so that other accounts cannot view the short video; however, after the publisher shares the short video with a contact or an information interaction platform of the social application, other users who have established a contact relationship with the publisher in the social application can view the short video through the social application.
  • the information interaction platform refers to a network architecture that connects people with each other through social relationships and / or common interests (or common interests).
• different users can establish social relationships through mutual confirmation, such as adding friends or following each other.
• when two users establish a social relationship, they become each other's social network contacts.
• when user A posts a message on the information interaction platform, his network contacts can view the message through the information interaction platform.
  • a network contact of the user can watch the short video through the information interaction platform.
• step 506 and step 507 are two parallel steps, that is, only step 506, only step 507, or both step 506 and step 507 may be performed. This is not limited in this embodiment of the present application.
  • Step 508 Display a virtual item receiving interface corresponding to the virtual item control according to the trigger operation.
• the interactive control includes a receive virtual item control, and a virtual item receiving interface corresponding to the receive virtual item control is displayed according to the trigger operation; the virtual item receiving interface is used to display the received virtual item.
  • the above-mentioned receiving virtual item control and the target story node include at least one of the following situations:
  • the target story node is a node in which the character in the target video screen sends an electronic red envelope, and the control elements that receive the virtual item control include the icon corresponding to the electronic red envelope.
  • an interactive short video is played on the playback interface 81.
• when the interactive short video is played to the node where the character in the target video frame 82 sends an electronic red envelope (that is, the video character says "give you a gift"),
• the receive virtual item control 83, which includes an icon corresponding to an electronic red envelope, is displayed on the playback interface.
• after the user clicks the receive virtual item control 83, the terminal jumps to the virtual item collection interface 84.
  • the virtual item collection interface 84 displays the received virtual item 85.
• the icon corresponding to the electronic red envelope in FIG. 8 is only schematic, and the form of the icon corresponding to the electronic red envelope is not limited in the embodiments of the present application.
  • the target story node is a node where the lottery carousel starts to rotate.
• the control elements of the receive virtual item control include an icon for stopping the lottery carousel from rotating.
  • the interactive short video is played and an icon for stopping the rotation of the lottery carousel is displayed when it reaches the node where the lottery carousel starts to rotate.
• the icon for stopping the lottery carousel may also be displayed in the playback interface in synchronization with the lottery carousel.
  • the playing interface 91 displays a rotating lottery dial.
• when the user clicks the icon 93 in the playback interface 91 to stop the lottery dial from rotating, the dial slows to a stop at the prize drawn by the user.
• the terminal jumps to display the virtual item receiving interface 94, and the virtual item receiving interface 94 displays the prize 95 drawn by the user.
  • Step 509 Display a game interface corresponding to the start game control according to the trigger operation.
  • the interactive control includes a start game control, and a game interface corresponding to the start game control is displayed according to the trigger operation, and the game interface includes game elements.
  • the display of the game interface corresponding to the game control according to the trigger operation includes at least the following two cases:
• the first is to jump to the game interface corresponding to the start game control according to the trigger operation, where the game interface includes game elements, that is, to jump directly to the display interface;
• the second is to superimpose the game elements on the playback interface according to the trigger operation and determine the playback interface with the superimposed game elements as the game interface, that is, the playback interface is still displayed as the background and the game elements are superimposed on it.
  • the above-mentioned startup game control and the target story node include at least one of the following situations:
  • the target story node is a node that determines the game result in the target video screen.
  • the control elements of the game start control include icons corresponding to the game start.
  • an interactive short video is played in the playback interface.
  • the video content in the interactive short video is the game process of game A.
  • the control element of the start-up game control includes the icon corresponding to the start-up game.
• the start game control includes the words "start game".
  • the target story node is a node where the characters in the target video screen invite the game to play, and the control elements that start the game control include icons corresponding to the game.
  • an interactive short video is played in the playback interface.
  • the video content in the interactive short video is a character's introduction or recommendation to the game.
• when the interactive short video is played to the node in the target video screen where the character invites the user to play the game, the start game control is displayed, and the control element of the start game control includes an icon corresponding to the recommended game.
  • the target story node is the node where personality testing is started.
  • the control elements that start the game controls include test questions corresponding to personality testing.
• the playback interface may display the next test question after the user selects an answer to the current test question, until the answers to all associated test questions have been selected; alternatively, the test result may be obtained directly after the user selects the answer to a single test question.
  • an interactive short video is played in the playback interface.
  • the video content in the interactive short video is to analyze the character of the character.
• the start game control is displayed.
• the start game control is a personality test question.
• after the user selects the answer to the personality test question, the next associated test question can be displayed or the personality test result can be obtained.
• a series of related test questions can be considered one game process.
  • the target story node is the node that starts the Q & A interaction
  • the control elements that start the game controls include questions corresponding to the Q & A interaction.
  • an interactive short video is played in the playback interface.
• the video content in the interactive short video is a quick question-and-answer segment between two video characters.
• the start game control is displayed.
• the start game control is a question corresponding to the question-and-answer interaction.
• after the user answers a question, the next related question can be displayed, or the question-and-answer result can be obtained.
• a series of related questions can be considered one game process.
• the personality test questions in the third case and the questions in the fourth case can be displayed superimposed on the playback interface, or the playback interface can jump to an H5 page for the personality test or Q&A interaction.
  • Step 510 Obtain a game result obtained through a game interface.
  • the game result is the result obtained by the user after playing the game on the game interface.
  • the game result is a personality test result.
  • the game result is a quiz result.
  • Step 511 Jump to a result analysis page corresponding to the game result according to the game result.
  • the result analysis page is used to display a user's game result, and an extension analysis based on the game result.
  • an extension analysis based on the game result.
  • extended analysis results of the user's game ability, game adaptability, and reaction ability can be obtained.
  • the result analysis page is used to display the user's personality analysis; when the game is a question-and-answer interaction and the game result is a question-and-answer result,
  • the result analysis page is used to display the user's question and answer result analysis, such as: question and answer scores, and response ability.
  • Step 512 Display a shooting interface corresponding to the game result according to the game result.
  • the shooting interface is used for shooting short videos in combination with game elements corresponding to game results.
• when the game result is, for example, the title "strongest king", the displayed shooting interface includes the game element corresponding to that title, such as a crown displaying "strongest king"; the shooting interface then performs face recognition on the user and displays the game element on the user's head for short video shooting.
  • steps 511 and 512 are parallel steps, and either step 511 or step 512 can be performed.
  • Step 513 Play the branch video corresponding to the story branch selection control in the playback interface according to the trigger operation.
  • the story branch selection control is used for selecting and playing in at least two branch videos.
• the target story node is a node in the target video screen where an optional plot development occurs
• the control elements of the story branch selection control include an introduction to the optional plot development
  • an interactive short video is played in the playback interface.
  • the video content in the interactive short video is that two video characters are talking.
  • the interactive short video is played to the node where the video character A receives a call and can choose whether to answer the call.
• two story branch selection controls are displayed in the playback interface, one displaying "Answer" and the other displaying "Do Not Answer"; when the user selects the "Answer" control, the branch video corresponding to the "Answer" control is played in the playback interface, that is, video character A answers the call.
• steps 504 to 507, step 508, steps 509 to 512, and step 513 are also parallel; that is, according to the interactive controls included in the interactive short video, only steps 504 to 507, only step 508, only steps 509 to 512, or only step 513 may be performed.
• alternatively, any combination of these four groups of steps, or all four groups, may be performed; the specific execution is determined by the type of the interactive control in the interactive short video.
• in the method provided in this embodiment, an interactive control is added to the playback interface when the interactive short video is played to the target story node, and the correspondence between the video content of the interactive short video and the interactive control guides the user to trigger the interactive control.
• the interactive control corresponding to the target story node is displayed, which attracts the user to trigger the interactive control, and the interactive content corresponding to the interactive control is displayed.
• users can not only perform actions such as watching or liking a single short video, but can also interact with the short video through the interactive controls in the short video, which increases the fun of the short video and avoids the problem that the interactive form of the short video program is monotonous.
• the method provided in this embodiment combines the interactive short video with the application scenario of short video shooting by adding a start shooting control to the interactive short video; when the user triggers the interactive control, a shooting interface is displayed to guide the user to shoot a short video with the shooting parameters corresponding to the interactive short video, which increases the fun of the interactive short video.
• the method provided in this embodiment combines the interactive short video with the application scenario of receiving virtual items by adding an electronic red envelope control to the interactive short video; when the user triggers the interactive control, a virtual item receiving interface is displayed to guide the user to receive the virtual item through the interactive short video.
• when the interactive short video is an advertising short video, the virtual item can attract users, achieving the purpose of advertising and increasing the interest of the interactive short video.
• the method provided in this embodiment combines the interactive short video with a game application scenario by adding a start game control to the interactive short video; when the user triggers the interactive control, a game interface is displayed to guide the user to play the game through the interactive short video, which increases the fun of the interactive short video and avoids the problem of a monotonous interactive form in short video programs.
• the method provided in this embodiment combines the interactive short video with story branch selection; the user can select different story branch controls to control the development of the plot, and is guided to watch the different endings produced by different plot developments, which increases the fun of the interactive short video and avoids the problem of a monotonous interactive form in short video programs.
• the short video obtained by shooting is acquired and sent to a network platform or to a contact or information interaction platform of a social application, so that the short video is propagated within a certain range; through this secondary transmission, the attractiveness of the interactive short video corresponding to the short video is improved.
  • both the short video and the interactive short video are sent by the server to the terminal.
  • FIG. 10 is a flowchart of a video interaction method provided by another exemplary embodiment of the present application. The method is applied in FIG. 1
  • the interactive video is an interactive short video
  • the video program is a short video program
  • the interactive content is an interactive user interface. As shown in FIG. 10, the method includes:
  • Step 1001 The server receives the display configuration information of the short video and the interactive control.
  • the short video is a video material for generating an interactive short video.
  • the short video includes a target story node.
• the display configuration information includes time stamp information corresponding to the target story node (that is, determining the display time of the interactive control in the interactive short video), coordinate information of the local position at which the interactive control is displayed (that is, determining the display position of the interactive control in the interactive short video), and the control elements of the interactive control.
  • Step 1002 The server generates an interactive short video on the basis of the short video according to the configuration information.
  • the interactive short video includes a target story node.
• the interactive short video is used to superimpose and display the interactive control on the target video screen when the interactive short video is played to the target story node by the terminal.
  • the interactive short video includes: a correspondence between the short video and the display configuration information.
  • Step 1003 The server sends an interactive short video to the terminal.
• before the server sends the interactive short video to the terminal, the server receives a target account information request.
• the target account information request is uploaded to the server by the configuration terminal.
• the target account information request is used to determine, based on account information, the requirements a terminal must meet to receive the interactive short video.
  • the terminal first sends account information to the server.
  • the account information includes at least one of the user's gender, user age, the region where the terminal is located, the terminal model, and the version number of the short video program.
• for example, the account information obtained from the terminal includes: user gender (male), user age (18), region where the terminal is located (Jiangsu), terminal model (phone), and version number of the short video program (10.2.1). If the target account information requires the user's gender to be male, the interactive short video is sent to the terminal; when the target account information requires the user's gender to be female, the interactive short video is not sent to the terminal.
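The filtering described above can be sketched as a simple matcher. `matches_target` and the dictionary representation are illustrative assumptions, not the actual server protocol:

```python
def matches_target(account, requirements):
    """True when every configured requirement matches the account.

    Both arguments are plain dicts whose keys mirror the account fields
    above (gender, age, region, model, version); an empty requirements
    dict means every terminal qualifies."""
    return all(account.get(key) == value for key, value in requirements.items())
```

The server would send the interactive short video only to terminals for which `matches_target` returns true.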
  • the server sends the interactive short video to the terminal, that is, the server sends the video data of the short video and the display configuration information of the interactive control to the terminal.
• before the server sends the interactive short video, the server receives a first configuration operation for the interactive short video, where the first configuration operation is used to configure a playback position of the interactive short video in the video stream, and the video stream includes at least two short videos arranged in sequence; the server determines the short videos that have been sent to the terminal, and sends the interactive short video to the terminal when the next playback position is the playback position of the interactive short video.
• before the server sends the interactive short video, the server receives a second configuration operation for the interactive short video, where the second configuration operation is used to configure a binding relationship between the interactive short video and a target short video; the server first obtains the playback position of the target short video in the video stream, where the video stream includes at least two short videos arranged in sequence.
• when the playback position of the target short video is the i-th playback position in the video stream,
• the playback position of the interactive short video is determined as the (i+1)-th playback position in the video stream, i ≥ 1; the short videos in the video stream are sequentially transmitted to the terminal.
• since the terminal plays the short videos one by one, when the terminal obtains the i-th short video from the server for playback, it may obtain the (i+1)-th short video from the server in advance to load and cache.
  • Step 1004 The terminal receives the interactive short video sent by the server.
  • the terminal receives the video data of the short video and the display configuration information of the interactive control sent by the server.
• the terminal plays the short videos one by one when playing the video stream
• when the terminal obtains the i-th short video from the server for playback, it can obtain the (i+1)-th short video from the server in advance to load into the cache.
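The play-one, prefetch-next behavior can be sketched as follows. This sequential sketch ignores the concurrency a real terminal would use, and `fetch`/`play` are placeholders for the terminal's real download and playback routines:

```python
def play_stream(stream, fetch, play):
    """Play short videos one by one, prefetching the next into a cache.

    fetch(video_id) downloads a video and play(data) renders it."""
    cache = {}
    for i, video_id in enumerate(stream):
        data = cache.pop(video_id, None) or fetch(video_id)
        if i + 1 < len(stream):
            # Obtain the (i+1)-th short video in advance to load and cache.
            cache[stream[i + 1]] = fetch(stream[i + 1])
        play(data)
```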
  • Step 1005 The terminal plays the interactive short video in the playback interface of the short video program.
• the terminal may sequentially play the short videos in the video stream sent by the server, and the terminal may also control the playback position in the video stream locally. Playing the interactive short video includes at least one of the following situations:
  • the video stream includes at least two short videos arranged in sequence; during the playback of the video stream on the playback interface, determine the short video that has been played on the playback interface, Play the interactive short video in the playback interface when the next short video playback position is the playback position of the interactive short video;
• the video stream includes at least two short videos arranged in sequence; when the playback position of the target short video is the i-th playback position in the video stream, the interactive short video is inserted between the i-th and (i+1)-th playback positions in the video stream; or, when the target short video is at the i-th playback position in the video stream, the playback position of the interactive short video is determined as the (i+1)-th playback position in the video stream, and the short videos in the video stream are played in turn on the playback interface.
  • Step 1006 When the interactive short video is played to the target story node, the interactive control is superimposed and displayed on the target video screen of the interactive short video.
• the display configuration information of the interactive control is first obtained; the display configuration information includes time stamp information corresponding to the target story node (that is, determining the display time of the interactive control in the interactive short video), the coordinate information of the local position (that is, determining the display position of the interactive control in the interactive short video), and the control elements of the interactive control; the target video frame of the interactive short video is determined according to the time stamp information
  • the target video picture is a video picture corresponding to the target story node.
  • the local position on the target video picture is determined according to the coordinate information, and the interactive control is superimposed and displayed on the local position according to the control element.
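The display configuration described above (a time stamp for the target story node, coordinates of the local position, and a control element) can be sketched as follows. The field names and the normalized-coordinate convention are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class DisplayConfig:
    timestamp_ms: int   # time stamp of the target story node
    x: float            # local position, normalized to [0, 1]
    y: float
    element: str        # control element, e.g. an icon name

def overlay_at_node(config, position_ms, frame_size):
    """If playback has reached the story node, return the pixel position at
    which to superimpose the control element, else None."""
    if position_ms < config.timestamp_ms:
        return None     # target video picture not reached yet
    w, h = frame_size
    return (config.element, int(config.x * w), int(config.y * h))

cfg = DisplayConfig(timestamp_ms=12_000, x=0.5, y=0.8, element="answer_call_icon")
print(overlay_at_node(cfg, 11_000, (1080, 1920)))  # None: node not reached yet
print(overlay_at_node(cfg, 12_000, (1080, 1920)))  # ('answer_call_icon', 540, 1536)
```

Normalized coordinates keep the local position tied to the video element regardless of the terminal's screen resolution.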
  • Step 1007 The terminal receives a trigger operation on the interactive control.
  • the trigger operation is used to display the interactive content corresponding to the interactive control.
  • Step 1008 The terminal displays interactive content corresponding to the interactive control.
  • the interactive control includes a jump link
  • the terminal determines a jump link corresponding to the interactive control according to the trigger operation.
• the jump link is a link for jumping to the interactive user interface, and the interactive user interface is displayed according to the jump link.
  • Interactive user interface includes at least one of a shooting interface, a virtual item receiving interface, and a game interface.
• when the terminal displays the interactive user interface corresponding to the interactive control, it may first obtain the interface elements from the server according to the jump link corresponding to the interactive control and display the interface elements, thereby displaying the interactive user interface; the terminal may also jump directly to the H5 page of the corresponding application according to the jump link corresponding to the interactive control.
  • the interactive control is the startup shooting control
  • the shooting interface corresponding to the interactive control has a startup path.
  • the startup shooting control includes the startup path.
• the shooting interface is displayed according to the startup path corresponding to the startup shooting control, and the code of the shooting interface includes calling code for invoking the camera.
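The jump-link dispatch in the steps above (a shooting interface, a virtual item receiving interface, or a game interface, with an H5 page as the direct-jump fallback) might be sketched like this; the link scheme and handler strings are purely illustrative assumptions:

```python
# Map a jump link's scheme to the interactive user interface it opens.
HANDLERS = {
    "shoot": lambda link: f"open shooting interface via {link} (camera invoked)",
    "redpacket": lambda link: f"open virtual item receiving interface via {link}",
    "game": lambda link: f"open game interface via {link}",
}

def on_trigger(jump_link):
    """Route a trigger operation on an interactive control to its interface."""
    kind = jump_link.split("://", 1)[0]
    handler = HANDLERS.get(kind)
    if handler is None:
        # No registered interface: jump directly to the H5 page instead.
        return f"open H5 page {jump_link}"
    return handler(jump_link)

print(on_trigger("shoot://duet/42"))
print(on_trigger("https://example.com/activity"))
```

The fallback branch mirrors the patent's alternative of jumping straight to an H5 page when no dedicated interface is fetched from the server.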
• the method for interacting in a short video program superimposes and displays an interactive control on the playback interface when the interactive short video is played to the target story node, and guides the user to trigger the interactive control through the correspondence between the video content of the short video and the interactive control.
• the interactive control corresponding to the target story node is displayed to attract the user to trigger the interactive control, and the interactive content corresponding to the interactive control is then displayed.
• the user can thus not only perform actions such as watching or liking a single short video, but can also interact with the short video through the interactive controls in it, which adds to the fun of short videos and avoids the problem of short video programs having only a single form of interaction.
• the method for interacting in a short video program obtains the account information of the short video program on the terminal through the server, and sends the interactive short video to the terminal when the account information meets the requirements of the target account information.
• when the interactive short video is a short video for advertising purposes, the placement of the interactive short video is therefore more targeted and more accurate.
  • the interactive short videos involved in the above embodiments can be implemented as ordinary interactive videos
  • the short video programs can be implemented as ordinary video programs
  • the interactive user interface can be implemented as other interactive content.
  • the device includes a playback module 1110, a display module 1120, and a receiving module 1130.
  • a playing module 1110 configured to play an interactive video in a video playing interface, where the playing interface is an interface for playing a video, and the interactive video includes a target story node;
  • a display module 1120 configured to display an interactive control on a target video screen of the interactive video when the interactive video is played to the target story node;
  • a receiving module 1130 configured to receive a trigger operation on the interactive control
  • the display module 1120 is further configured to display interactive content corresponding to the interactive control according to the trigger operation.
• the display module 1120 is further configured to display the interactive control on a local position of a target video picture of the interactive video, where the local position is a location having an interaction relationship with a video element in the video picture.
  • FIG. 12 is a structural block diagram of a device for interacting in a short video program according to an exemplary embodiment of the present application.
• the display module 1120 includes: an acquisition submodule 1121, a determination submodule 1122, and a display submodule 1123; the device further includes: an obtaining module 1140, a publishing module 1150, and a result obtaining module 1160;
  • the display module 1120 includes:
  • An acquisition submodule 1121 configured to acquire display configuration information of the interactive control, where the display configuration information includes time stamp information corresponding to the target story node, coordinate information of the local position, and control elements of the interactive control;
  • a determining submodule 1122 configured to determine the target video picture of the interactive video according to the timestamp information
  • the determining sub-module 1122 is further configured to determine the local position on the target video screen according to the coordinate information
  • a display sub-module 1123 is configured to display the interactive control on the local position according to the control element.
  • the interactive control includes a start shooting control
  • the display module 1120 is further configured to display a shooting interface corresponding to the startup shooting control according to the trigger operation, and the shooting interface is used for video shooting in combination with shooting materials corresponding to the target story node.
  • the target story node is a node that makes a call with a character in the target video frame, and the control element that starts the shooting control includes an answer call icon;
  • the target story node is a node where music video playback ends, and the control element that starts the shooting control includes a music icon;
  • the target story node is a node that displays video material in the target video picture, and the control element that starts the shooting control includes an icon corresponding to the camera.
  • the apparatus further includes:
  • An obtaining module 1140 configured to obtain a video captured through the shooting interface
  • a publishing module 1150 is configured to publish the video to a network platform corresponding to the video program, and / or publish the video to a contact or information interaction platform corresponding to a social application.
  • the interactive control includes a virtual item receiving control
  • the display module 1120 is further configured to display a virtual item receiving interface corresponding to the virtual item control for receiving the virtual item according to the trigger operation, and the virtual item receiving interface is used to display the received virtual item.
  • the target story node is a node in which the character in the target video frame sends an electronic red envelope, and the control element for receiving the virtual item control includes an icon corresponding to the electronic red envelope;
  • the target story node is a node at which the lottery carousel starts to rotate, and the control element for receiving the virtual item control includes an icon that stops the lottery carousel from rotating.
  • the interactive control includes a start game control
  • the display module 1120 is further configured to display a game interface corresponding to the startup game control according to the trigger operation, where the game interface includes game elements.
  • the display module 1120 is further configured to jump to the game interface corresponding to the startup game control according to the trigger operation, where the game interface includes the game element;
  • the display module 1120 is further configured to display the game element on the playback interface according to the trigger operation, and determine the playback interface on which the game element is displayed as the game interface.
  • the target story node is a node that determines a game result in the target video screen, and the control element that starts the game control includes an icon corresponding to the start of the game;
  • the target story node is a node in which a character in the target video screen invites a game, and a control element that starts a game control includes an icon corresponding to the game;
  • the target story node is a node that starts a character test, and the control elements that start a game control include a test question corresponding to the character test;
  • the target story node is a node that starts a question-and-answer interaction
  • the control element that starts the game control includes a question corresponding to the question-and-answer interaction
  • the apparatus further includes:
  • a result acquisition module 1160 configured to acquire a game result obtained through the game interface
• the display module 1120 is further configured to jump to a result analysis page corresponding to the game result according to the game result, and/or display a shooting interface corresponding to the game result according to the game result, where the shooting interface is configured to perform video shooting in combination with the game element corresponding to the game result.
  • the interactive control includes a story branch selection control, and the story branch selection control is used for selecting and playing in at least two branch videos;
  • the display module 1120 is further configured to play a branch video corresponding to the story branch selection control on the playback interface according to the trigger operation.
  • the target story node is a node that generates an optional story development in the target video picture
  • the control element of the story branch selection control includes a brief introduction of the optional story development
• the playback module 1110 is further configured to obtain a playback position of the interactive video in a video stream, where the video stream includes at least two videos arranged in sequence, and to play the videos in the video stream in sequence on the playback interface.
• the playback module 1110 is further configured to determine the video that has been played in the playback interface, and to play the interactive video in the playback interface when the next playback position after the played video is the playback position of the interactive video.
• the playback module 1110 is further configured to obtain a target video bound to the interactive video; obtain a playback position of the target video in a video stream, where the video stream includes at least two videos arranged in sequence; when the playback position of the target video is the i-th playback position in the video stream, insert the interactive video between the (i-1)-th playback position and the i-th playback position in the video stream; or, when the playback position of the target video is the i-th playback position in the video stream, determine the playback position of the interactive video as the (i+1)-th playback position in the video stream; and play the videos in the video stream in sequence in the playback interface.
• the display module 1120 is further configured to determine a jump link corresponding to the interactive control according to the trigger operation, where the jump link is a link corresponding to the interactive content, and to jump to the interactive user interface according to the jump link.
  • the receiving module 1130 is further configured to receive the interactive video sent by a server, and the server is configured to generate the interactive video according to the received video and display configuration information of the interactive control, and Sending the interactive video to the terminal.
• the receiving module 1130 is further configured to send account information to the server, and the server is configured to determine whether the account information meets the requirements of the target account information, and to send the interactive video to the terminal when the account information meets the requirements of the target account information; the account information includes at least one of user gender, user age, region where the terminal is located, terminal model, and version number of the video program.
  • the device includes a video receiving module 1310, a video generating module 1320, and a sending module 1330.
  • a video receiving module 1310 configured to receive display configuration information of a short video and an interactive control
  • a video generating module 1320 configured to generate an interactive short video on the basis of the short video according to the display configuration information, where the interactive short video includes a target story node;
  • a sending module 1330 is configured to send the interactive short video to a terminal, where the interactive short video is used to superimpose and display the interactive control on a target video screen when being played by the terminal to the target story node.
  • the display configuration information includes time stamp information corresponding to the target story node, coordinate information showing a local area of the interactive control, and control elements of the interactive control.
  • the apparatus further includes:
  • An information receiving module configured to receive a target account information request, where the target account information request is used to determine a terminal that receives the interactive short video;
• the sending module 1330 is further configured to obtain account information of the short video program corresponding to the terminal, where the account information includes at least one of user gender, user age, region where the terminal is located, terminal model, and version number of the short video program, and to send the interactive short video to the terminal when the account information meets the requirements of the target account information.
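The account-targeting check above can be sketched as a simple server-side filter: the interactive short video is sent only when the account information satisfies every requirement of the target account information. The field names and constraint shapes below are assumptions for illustration:

```python
def matches_target(account, target):
    """True if every constraint in `target` is satisfied by `account`.

    A constraint value is a set of allowed values, except for "age",
    which is an inclusive (lo, hi) range.
    """
    for key, constraint in target.items():
        value = account.get(key)
        if key == "age":
            lo, hi = constraint
            if value is None or not (lo <= value <= hi):
                return False
        elif value not in constraint:
            return False
    return True

account = {"gender": "f", "age": 24, "region": "CN-GD", "model": "X1", "version": "7.2"}
target = {"gender": {"f"}, "age": (18, 30), "region": {"CN-GD", "CN-BJ"}}
print(matches_target(account, target))  # True: the interactive short video is sent
```

Fields the target does not constrain (here, terminal model and version number) are ignored, so an advertiser can target on any subset of the account information.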
  • the apparatus further includes:
  • An operation receiving module configured to receive a first configuration operation on the interactive short video, where the first configuration operation is used to configure a playback position of the interactive short video in a video stream, and the video stream includes a sequence Arranged at least two short videos;
• the sending module 1330 is further configured to determine the short video that has been sent to the terminal, and to send the interactive short video to the terminal when the next playback position after the short video that has been sent to the terminal is the playback position of the interactive short video.
  • the apparatus further includes:
  • An operation receiving module configured to receive a second configuration operation on the interactive short video, where the second configuration operation is used to configure a binding relationship between the interactive short video and a target short video;
• the sending module 1330 is further configured to obtain a playback position of the target short video in the video stream, where the video stream includes at least two short videos arranged in sequence; when the playback position of the target short video is the i-th playback position in the video stream, insert the interactive short video between the (i-1)-th playback position and the i-th playback position in the video stream; or, when the playback position of the target short video is the i-th playback position in the video stream, determine the playback position of the interactive short video as the (i+1)-th playback position in the video stream, i ≥ 1; and send the short videos in the video stream to the terminal in sequence.
  • FIG. 14 shows a structural block diagram of a terminal 1400 provided by an exemplary embodiment of the present invention.
• the terminal 1400 may be a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, or a desktop computer.
  • the terminal 1400 may also be called other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and the like.
  • the terminal 1400 includes a processor 1401 and a memory 1402.
  • the processor 1401 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like.
• the processor 1401 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1401 may also include a main processor and a coprocessor.
  • the main processor is a processor for processing data in the awake state, also called a CPU (Central Processing Unit).
• the coprocessor is a low-power processor for processing data in the standby state.
  • the processor 1401 may be integrated with a GPU (Graphics Processing Unit).
  • the GPU is responsible for rendering and drawing content needed to be displayed on the display screen.
  • the processor 1401 may further include an AI (Artificial Intelligence) processor, and the AI processor is configured to process computing operations related to machine learning.
  • the memory 1402 may include one or more computer-readable storage media, which may be non-transitory.
  • the memory 1402 may further include a high-speed random access memory, and a non-volatile memory, such as one or more disk storage devices, flash storage devices.
  • the non-transitory computer-readable storage medium in the memory 1402 is used to store at least one instruction, and the at least one instruction is used to be executed by the processor 1401 to implement the video interaction provided by the method embodiment in this application. method.
  • the terminal 1400 may optionally include a peripheral device interface 1403 and at least one peripheral device.
  • the processor 1401, the memory 1402, and the peripheral device interface 1403 may be connected through a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 1403 through a bus, a signal line, or a circuit board.
  • the peripheral device includes at least one of a radio frequency circuit 1404, a touch display screen 1405, a camera 1406, an audio circuit 1407, a positioning component 1408, and a power supply 1409.
  • the peripheral device interface 1403 may be used to connect at least one peripheral device related to I / O (Input / Output) to the processor 1401 and the memory 1402.
• in some embodiments, the processor 1401, the memory 1402, and the peripheral device interface 1403 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1401, the memory 1402, and the peripheral device interface 1403 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the radio frequency circuit 1404 is used to receive and transmit an RF (Radio Frequency) signal, also called an electromagnetic signal.
  • the radio frequency circuit 1404 communicates with a communication network and other communication devices through electromagnetic signals.
  • the radio frequency circuit 1404 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
  • the radio frequency circuit 1404 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and the like.
  • the radio frequency circuit 1404 can communicate with other terminals through at least one wireless communication protocol.
  • the wireless communication protocols include, but are not limited to, the World Wide Web, metropolitan area networks, intranets, mobile communication networks of all generations (2G, 3G, 4G, and 5G), wireless local area networks, and / or WiFi (Wireless Fidelity) networks.
  • the radio frequency circuit 1404 may further include circuits related to Near Field Communication (NFC), which is not limited in this application.
  • the display screen 1405 is used to display a UI (User Interface).
  • the UI may include graphics, text, icons, videos, and any combination thereof.
  • the display screen 1405 also has the ability to collect touch signals on or above the surface of the display screen 1405.
  • the touch signal can be input to the processor 1401 as a control signal for processing.
  • the display screen 1405 may also be used to provide a virtual button and / or a virtual keyboard, which is also called a soft button and / or a soft keyboard.
• in some embodiments, one display screen 1405 may be provided, disposed on the front panel of the terminal 1400; in other embodiments, at least two display screens 1405 may be provided, respectively disposed on different surfaces of the terminal 1400 or in a folded design; in still other embodiments, the display screen 1405 may be a flexible display screen disposed on a curved surface or a folded surface of the terminal 1400. Moreover, the display screen 1405 can also be set as a non-rectangular irregular figure, that is, a special-shaped screen.
  • the display 1405 can be made of materials such as LCD (Liquid Crystal Display) and OLED (Organic Light-Emitting Diode).
  • the camera component 1406 is used to capture images or videos.
  • the camera assembly 1406 includes a front camera and a rear camera.
  • the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the back of the terminal.
  • the camera assembly 1406 may further include a flash.
• the flash can be a single color temperature flash or a dual color temperature flash.
  • a dual color temperature flash is a combination of a warm light flash and a cold light flash, which can be used for light compensation at different color temperatures.
  • the audio circuit 1407 may include a microphone and a speaker.
  • the microphone is used to collect the sound waves of the user and the environment, and convert the sound waves into electrical signals and input them to the processor 1401 for processing, or input to the radio frequency circuit 1404 to implement voice communication.
  • the microphone can also be an array microphone or an omnidirectional acquisition microphone.
  • the speaker is used to convert electrical signals from the processor 1401 or the radio frequency circuit 1404 into sound waves.
  • the speaker can be a traditional film speaker or a piezoelectric ceramic speaker.
• When the speaker is a piezoelectric ceramic speaker, it can not only convert electrical signals into sound waves audible to humans, but can also convert electrical signals into sound waves inaudible to humans for ranging purposes.
  • the audio circuit 1407 may further include a headphone jack.
  • the positioning component 1408 is used to locate the current geographic position of the terminal 1400 to implement navigation or LBS (Location Based Service).
• the positioning component 1408 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
  • the power supply 1409 is used to power various components in the terminal 1400.
  • the power source 1409 may be an alternating current, a direct current, a disposable battery, or a rechargeable battery.
  • the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery.
  • the wired rechargeable battery is a battery charged through a wired line
  • the wireless rechargeable battery is a battery charged through a wireless coil.
  • the rechargeable battery can also be used to support fast charging technology.
  • the terminal 1400 further includes one or more sensors 1410.
  • the one or more sensors 1410 include, but are not limited to, an acceleration sensor 1411, a gyro sensor 1412, a pressure sensor 1413, a fingerprint sensor 1414, an optical sensor 1415, and a proximity sensor 1416.
  • the acceleration sensor 1411 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the terminal 1400.
  • the acceleration sensor 1411 may be used to detect components of the acceleration of gravity on three coordinate axes.
  • the processor 1401 may control the touch display screen 1405 to display the user interface in a horizontal view or a vertical view according to the gravity acceleration signal collected by the acceleration sensor 1411.
  • the acceleration sensor 1411 may also be used for collecting motion data of a game or a user.
  • the gyro sensor 1412 can detect the body direction and rotation angle of the terminal 1400, and the gyro sensor 1412 can cooperate with the acceleration sensor 1411 to collect 3D actions of the user on the terminal 1400. Based on the data collected by the gyro sensor 1412, the processor 1401 can implement the following functions: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
  • the pressure sensor 1413 may be disposed on a side frame of the terminal 1400 and / or a lower layer of the touch display screen 1405.
• when the pressure sensor 1413 is disposed on the side frame of the terminal 1400, a user's grip signal on the terminal 1400 can be detected, and the processor 1401 can perform left-right hand recognition or a quick operation according to the grip signal collected by the pressure sensor 1413.
• when the pressure sensor 1413 is disposed on the lower layer of the touch display screen 1405, the processor 1401 controls the operable controls on the UI according to the user's pressure operation on the touch display screen 1405.
  • the operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
• the fingerprint sensor 1414 is used to collect a user's fingerprint, and the processor 1401 identifies the user's identity based on the fingerprint collected by the fingerprint sensor 1414, or the fingerprint sensor 1414 itself identifies the user's identity based on the collected fingerprint. When the user's identity is identified as a trusted identity, the processor 1401 authorizes the user to perform related sensitive operations, such as unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings.
  • the fingerprint sensor 1414 may be provided on the front, back, or side of the terminal 1400. When a physical button or a manufacturer's logo is set on the terminal 1400, the fingerprint sensor 1414 can be integrated with the physical button or the manufacturer's logo.
  • the optical sensor 1415 is used to collect ambient light intensity.
  • the processor 1401 may control the display brightness of the touch display screen 1405 according to the ambient light intensity collected by the optical sensor 1415. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1405 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1405 is reduced.
  • the processor 1401 may also dynamically adjust the shooting parameters of the camera component 1406 according to the ambient light intensity collected by the optical sensor 1415.
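The ambient-light behavior described above (raise the display brightness when the ambient light intensity is high, lower it when the intensity is low) can be sketched as a simple mapping. The thresholds and linear interpolation below are illustrative assumptions; the patent does not specify values:

```python
def display_brightness(ambient_lux, lo=10.0, hi=10_000.0):
    """Map ambient light intensity (lux) to a display brightness in [0.1, 1.0]."""
    if ambient_lux <= lo:
        return 0.1      # dark surroundings: reduce the display brightness
    if ambient_lux >= hi:
        return 1.0      # bright surroundings: increase the display brightness
    # Interpolate linearly between the two thresholds.
    frac = (ambient_lux - lo) / (hi - lo)
    return round(0.1 + 0.9 * frac, 3)

print(display_brightness(5))       # 0.1 (dim room)
print(display_brightness(10_000))  # 1.0 (direct sunlight)
```

The same measured intensity could likewise drive the camera component's shooting parameters, as the preceding line suggests.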
• the proximity sensor 1416, also called a distance sensor, is usually disposed on the front panel of the terminal 1400.
  • the proximity sensor 1416 is used to collect the distance between the user and the front side of the terminal 1400.
• when the proximity sensor 1416 detects that the distance between the user and the front of the terminal 1400 gradually decreases, the processor 1401 controls the touch display screen 1405 to switch from the screen-on state to the screen-off state; when the proximity sensor 1416 detects that the distance between the user and the front of the terminal 1400 gradually increases, the processor 1401 controls the touch display screen 1405 to switch from the screen-off state to the screen-on state.
• the structure shown in FIG. 14 does not constitute a limitation on the terminal 1400, which may include more or fewer components than shown, combine certain components, or adopt a different component arrangement.
  • the server includes a processor and a memory.
• the memory stores at least one instruction, and the at least one instruction is loaded and executed by the processor to implement the method for interacting in a short video program provided by the foregoing method embodiments. It should be noted that the server may be the server shown in FIG. 15 below.
  • FIG. 15 is a schematic structural diagram of a server provided by an exemplary embodiment of the present application.
• the server 1500 includes a central processing unit (CPU) 1501, a system memory 1504 including a random access memory (RAM) 1502 and a read-only memory (ROM) 1503, and a system bus 1505 connecting the system memory 1504 to the central processing unit 1501.
• the server 1500 also includes a basic input/output system (I/O system) 1506 that helps transfer information between various devices in the computer, and a mass storage device 1507 for storing an operating system 1513, application programs 1514, and other program modules 1515.
  • the basic input / output system 1506 includes a display 1508 for displaying information and an input device 1509 such as a mouse, a keyboard, or the like for a user to input information.
  • the display 1508 and the input device 1509 are both connected to the central processing unit 1501 through an input-output controller 1510 connected to the system bus 1505.
  • the basic input / output system 1506 may further include an input / output controller 1510 for receiving and processing input from a plurality of other devices such as a keyboard, a mouse, or an electronic stylus.
  • the input-output controller 1510 also provides output to a display screen, printer, or other type of output device.
  • the mass storage device 1507 is connected to the central processing unit 1501 through a mass storage controller (not shown) connected to the system bus 1505.
• the mass storage device 1507 and its associated computer storage medium provide non-volatile storage for the server 1500. That is, the mass storage device 1507 may include a computer storage medium (not shown) such as a hard disk or a CD-ROM drive.
  • the computer storage medium may include a computer storage medium and a communication medium.
  • Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data.
  • Computer storage media include RAM, ROM, EPROM, EEPROM, flash memory, or other solid-state storage technologies, CD-ROM, DVD or other optical storage, tape cartridges, magnetic tape, disk storage, or other magnetic storage devices.
  • the above-mentioned system memory 1504 and mass storage device 1507 may be collectively referred to as a memory.
  • the memory stores one or more programs, which are configured to be executed by one or more central processing units 1501.
  • the one or more programs contain instructions for implementing the method for interacting in a short video program described above.
  • the central processing unit 1501 executes the one or more programs to implement the method for interacting in a short video program provided by the foregoing method embodiments.
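The arrangement above, in which stored programs executed by the CPU serve interaction data for a short video, can be pictured with a small sketch. Everything here is hypothetical: the story-node table, the field names, and the handler are invented for illustration and do not come from the specification.

```python
# Hypothetical sketch: a server-side program, loaded from memory and executed
# by the CPU, that returns the interactive control configured for a target
# story node of an interactive video. Names and data are illustrative only.

STORY_NODES = {
    # playback position (seconds) -> interactive control shown at that node
    15.0: {"control": "game_entry", "content": "mini_game_a"},
    42.0: {"control": "vote", "content": "ending_choice"},
}

def handle_node_request(position: float) -> dict:
    """Return the interaction configured for a target story node, or an
    empty response when the position is not a story node."""
    node = STORY_NODES.get(position)
    return node if node is not None else {}

print(handle_node_request(15.0))  # interaction served at a target story node
print(handle_node_request(3.0))   # no interaction at an ordinary position
```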
  • the server 1500 may also run through a remote computer connected to a network, such as the Internet. That is, the server 1500 can be connected to the network 1512 through the network interface unit 1511 connected to the system bus 1505, and the network interface unit 1511 can also be used to connect to other types of networks or remote computer systems (not shown).
  • the memory further includes one or more programs stored in the memory, and the one or more programs include instructions for performing the steps, executed by the server, of the method for interacting in a short video program provided by the embodiments of the present invention.
  • An embodiment of the present application further provides a computer-readable storage medium, where the storage medium stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor 1510 to implement the video interaction method described in any one of FIGS. 1 to 9.
  • the present application also provides a computer program product.
  • when the computer program product runs on a computer, it causes the computer to execute the video interaction method provided by the foregoing method embodiments.
  • FIG. 16 shows a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • the terminal includes a processor, a memory, a network interface, an input device, a camera device, and a display screen connected through a system bus.
  • the memory includes a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium of the terminal stores an operating system and may also store computer-readable instructions which, when executed by the processor, cause the processor to implement the video interaction method.
  • the internal memory may also store computer-readable instructions which, when executed by the processor, cause the processor to execute the video interaction method.
  • the camera device of the terminal is a camera for collecting images.
  • the display screen of the terminal can be a liquid crystal display or an electronic ink display.
  • the input device of the terminal can be a touch layer covering the display screen, a button, trackball, or touchpad provided on the terminal housing, or an external keyboard, trackpad, or mouse.
  • FIG. 16 is only a block diagram of part of the structure related to the solution of the present application and does not constitute a limitation on the terminal to which the solution is applied.
  • a specific terminal may include more or fewer components than shown in the figure, combine some components, or have a different component arrangement.
  • the video interactive device provided in this application may be implemented in the form of a computer program, and the computer program may be run on a terminal as shown in FIG. 16.
  • the memory of the terminal may store various program modules constituting the video interactive device, such as a playback module, a display module, and a receiving module shown in FIG. 11.
  • the computer program constituted by these program modules causes the processor to execute the steps of the video interaction method in the embodiments of the present application described in this specification.
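The three modules named above (playback, display, and receiving, as in FIG. 11) can be pictured, very loosely, as cooperating program modules. The sketch below is illustrative only: the class and method names are invented, and it merely mirrors the flow of the method (play to a target story node, display an interactive control, receive a trigger operation, show the corresponding content).

```python
# Illustrative sketch of the device in FIG. 11: playback, display, and
# receiving modules cooperating on an interactive video. Names are invented.

class PlaybackModule:
    def __init__(self, story_nodes):
        self.story_nodes = set(story_nodes)  # target story nodes (timestamps)

    def at_story_node(self, position):
        return position in self.story_nodes

class DisplayModule:
    def __init__(self):
        self.shown = []

    def show_control(self, control):
        # overlay the interactive control on the target video frame
        self.shown.append(control)

class ReceivingModule:
    def receive_trigger(self, control):
        # a trigger operation on the control selects its interaction content
        return f"content-for-{control}"

# wiring the modules together for one story node
playback = PlaybackModule({12.0})
display = DisplayModule()
receiving = ReceivingModule()

if playback.at_story_node(12.0):
    display.show_control("interactive-control")
content = receiving.receive_trigger(display.shown[-1])
print(content)  # prints "content-for-interactive-control"
```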
  • steps in the embodiments of the present application are not necessarily performed sequentially in the order indicated by the step numbers. Unless explicitly stated in this document, the execution order of these steps is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in each embodiment may include multiple sub-steps or multiple stages. These sub-steps or stages are not necessarily completed at the same time, but may be performed at different times; nor are they necessarily performed sequentially, but may be performed in turn or alternately with other steps, or with at least part of the sub-steps or stages of other steps.
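The point about interleaved sub-steps can be made concrete with a small hypothetical example (not from the specification): two multi-stage steps whose stages are executed alternately, rather than one step running to completion before the other begins.

```python
# Illustrative only: two steps, each split into stages, executed alternately
# rather than strictly sequentially, as the paragraph above permits.

def step(name, stages):
    # a step broken into numbered sub-steps, yielded one at a time
    for i in range(stages):
        yield f"{name}-stage{i + 1}"

def interleave(*steps):
    # round-robin over the steps until every stage of every step has run
    order = []
    pending = list(steps)
    while pending:
        for s in pending[:]:
            try:
                order.append(next(s))
            except StopIteration:
                pending.remove(s)
    return order

print(interleave(step("A", 2), step("B", 2)))
# -> ['A-stage1', 'B-stage1', 'A-stage2', 'B-stage2']
```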
  • Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory can include random access memory (RAM) or external cache memory.
  • RAM is available in various forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present invention relates to the field of multimedia, and provides a video interaction method and apparatus, a terminal, and a storage medium. The method includes the following steps: a terminal plays an interactive video in a playback interface of a video, the interactive video including a target story node; when the interactive video is played to the target story node, the terminal displays an interactive control on a target video frame of the interactive video; the terminal receives a trigger operation on the interactive control; and the terminal displays, according to the trigger operation, interactive content corresponding to the interactive control.
PCT/CN2019/084930 2018-06-01 2019-04-29 Procédé et dispositif d'interaction vidéo, terminal, et support de stockage WO2019228120A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/920,863 US11178471B2 (en) 2018-06-01 2020-07-06 Video interaction method, terminal, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810555639.4 2018-06-01
CN201810555639.4A CN108769814B (zh) 2018-06-01 2018-06-01 视频互动方法、装置、终端及可读存储介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/920,863 Continuation US11178471B2 (en) 2018-06-01 2020-07-06 Video interaction method, terminal, and storage medium

Publications (1)

Publication Number Publication Date
WO2019228120A1 true WO2019228120A1 (fr) 2019-12-05

Family

ID=64001720

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/084930 WO2019228120A1 (fr) 2018-06-01 2019-04-29 Procédé et dispositif d'interaction vidéo, terminal, et support de stockage

Country Status (3)

Country Link
US (1) US11178471B2 (fr)
CN (1) CN108769814B (fr)
WO (1) WO2019228120A1 (fr)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111400014A (zh) * 2020-03-23 2020-07-10 Oppo广东移动通信有限公司 终端进程切换方法、终端及存储介质
CN113411657A (zh) * 2021-06-16 2021-09-17 湖南快乐阳光互动娱乐传媒有限公司 一种视频播放控制方法、装置及电子设备
CN114125498A (zh) * 2021-11-24 2022-03-01 北京百度网讯科技有限公司 视频数据处理方法、装置、设备以及存储介质
WO2022057722A1 (fr) * 2020-09-15 2022-03-24 腾讯科技(深圳)有限公司 Procédé, système et appareil d'essai de programme, dispositif et support
CN115103232A (zh) * 2022-07-07 2022-09-23 北京字跳网络技术有限公司 一种视频播放方法、装置、设备和存储介质
CN115515014A (zh) * 2022-09-26 2022-12-23 北京字跳网络技术有限公司 媒体内容的分享方法、装置、电子设备和存储介质
CN116304133A (zh) * 2023-05-23 2023-06-23 深圳市人马互动科技有限公司 一种图片生成方法及相关装置
EP4170475A4 (fr) * 2020-09-30 2023-12-20 Beijing Zitiao Network Technology Co., Ltd. Procédé, appareil et dispositif pour une interaction basée sur une vidéo, et support d'enregistrement
EP4175309A4 (fr) * 2020-09-30 2024-03-13 Beijing Zitiao Network Technology Co Ltd Procédés de traitement vidéo et d'interaction basée sur une vidéo, appareil, dispositif et support d'enregistrement

Families Citing this family (66)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108769814B (zh) 2018-06-01 2022-02-01 腾讯科技(深圳)有限公司 视频互动方法、装置、终端及可读存储介质
CN111277866B (zh) * 2018-12-04 2022-05-10 华为技术有限公司 一种控制vr视频播放的方法及相关装置
CN109660855B (zh) * 2018-12-19 2021-11-02 北京达佳互联信息技术有限公司 贴纸显示方法、装置、终端及存储介质
CN109794064B (zh) * 2018-12-29 2020-07-03 腾讯科技(深圳)有限公司 互动剧情实现方法、装置、终端和存储介质
CN111698547A (zh) * 2019-03-11 2020-09-22 腾讯科技(深圳)有限公司 视频互动方法、装置、存储介质和计算机设备
CN109977303A (zh) * 2019-03-13 2019-07-05 北京达佳互联信息技术有限公司 多媒体信息的交互方法、装置及存储介质
CN109992187B (zh) * 2019-04-11 2021-12-10 北京字节跳动网络技术有限公司 一种控制方法、装置、设备及存储介质
CN110062270B (zh) * 2019-04-24 2022-08-12 北京豆萌信息技术有限公司 广告展示方法和装置
CN110062290A (zh) * 2019-04-30 2019-07-26 北京儒博科技有限公司 视频互动内容生成方法、装置、设备和介质
CN110162366B (zh) * 2019-05-07 2023-03-10 北京达佳互联信息技术有限公司 一种信息显示方法、装置、电子设备及存储介质
CN110597581A (zh) * 2019-08-02 2019-12-20 北京奇艺世纪科技有限公司 动态ui系统对外交互方法、装置、电子设备及存储介质
CN110784772A (zh) * 2019-09-10 2020-02-11 上海道浮于海科技有限公司 一种短视频答题系统及方法
CN112584248B (zh) * 2019-09-27 2022-09-20 腾讯科技(深圳)有限公司 互动影视的实现方法及装置、计算机存储介质和电子设备
CN112637640B (zh) * 2019-10-09 2022-07-08 腾讯科技(深圳)有限公司 视频互动方法和装置
CN110784753B (zh) * 2019-10-15 2023-01-17 腾讯科技(深圳)有限公司 互动视频播放方法及装置、存储介质、电子设备
CN110750161A (zh) * 2019-10-25 2020-02-04 郑子龙 一种交互系统、方法、移动设备及计算机可读介质
CN112969093B (zh) * 2019-12-13 2023-09-08 腾讯科技(北京)有限公司 互动业务处理方法、装置、设备及存储介质
CN112995774A (zh) * 2019-12-13 2021-06-18 阿里巴巴集团控股有限公司 一种视频播放方法、装置、终端及存储介质
CN111031373A (zh) * 2019-12-23 2020-04-17 北京百度网讯科技有限公司 视频播放方法、装置、电子设备及计算机可读存储介质
CN113132808B (zh) * 2019-12-30 2022-07-29 腾讯科技(深圳)有限公司 视频生成方法、装置及计算机可读存储介质
CN111225292B (zh) * 2020-01-15 2022-05-06 北京奇艺世纪科技有限公司 信息的展示方法和装置、存储介质和电子装置
CN113157169A (zh) * 2020-01-22 2021-07-23 阿里巴巴集团控股有限公司 互动媒体内容的交互方法、装置和电子设备
CN111346376B (zh) * 2020-02-25 2021-12-21 腾讯科技(深圳)有限公司 基于多媒体资源的互动方法、装置、电子设备及存储介质
CN111359209B (zh) * 2020-02-28 2022-03-29 腾讯科技(深圳)有限公司 视频播放方法、装置和终端
CN113301402B (zh) * 2020-03-26 2023-04-25 阿里巴巴集团控股有限公司 交互方法和视频播放设备
CN111522614A (zh) * 2020-04-20 2020-08-11 北京三快在线科技有限公司 图像编辑信息的展示方法、装置、计算机设备及存储介质
CN111629240B (zh) * 2020-05-06 2021-08-10 上海幻电信息科技有限公司 多屏互动显示方法及装置
CN111669639A (zh) * 2020-06-15 2020-09-15 北京字节跳动网络技术有限公司 一种活动入口的展示方法、装置、电子设备及存储介质
CN113301436A (zh) * 2020-06-17 2021-08-24 阿里巴巴集团控股有限公司 播放控制方法、装置及计算机可读存储介质
CN111818371B (zh) * 2020-07-17 2021-12-24 腾讯科技(深圳)有限公司 一种互动视频的管理方法以及相关装置
CN111787415B (zh) * 2020-07-23 2021-08-17 北京字节跳动网络技术有限公司 视频互动方法、装置、电子设备和存储介质
CN111787407B (zh) * 2020-07-24 2021-10-29 腾讯科技(深圳)有限公司 互动视频播放方法、装置、计算机设备及存储介质
CN112044061B (zh) * 2020-08-11 2022-05-06 腾讯科技(深圳)有限公司 游戏画面处理方法、装置、电子设备以及存储介质
CN114225364B (zh) * 2020-09-14 2023-02-28 成都拟合未来科技有限公司 一种实时互动方法、系统
CN113301361B (zh) * 2020-09-15 2023-08-11 阿里巴巴华南技术有限公司 人机交互、控制以及直播方法、设备及存储介质
CN112199553A (zh) * 2020-09-24 2021-01-08 北京达佳互联信息技术有限公司 一种信息资源的处理方法、装置、设备及存储介质
CN112333478A (zh) * 2020-10-26 2021-02-05 深圳创维-Rgb电子有限公司 视频推荐方法、终端设备以及存储介质
CN112351203B (zh) * 2020-10-26 2022-04-08 北京达佳互联信息技术有限公司 视频拍摄方法、装置、电子设备及存储介质
CN112333473B (zh) * 2020-10-30 2022-08-23 北京字跳网络技术有限公司 一种交互方法、装置以及计算机存储介质
CN112402954B (zh) * 2020-11-09 2023-11-14 北京达佳互联信息技术有限公司 视频数据处理方法、装置及系统
CN113297065A (zh) * 2020-11-16 2021-08-24 阿里巴巴集团控股有限公司 数据处理方法、基于游戏的处理方法、装置和电子设备
CN112616086A (zh) * 2020-12-16 2021-04-06 北京有竹居网络技术有限公司 一种互动视频生成方法及装置
CN113891134A (zh) * 2021-01-29 2022-01-04 北京字跳网络技术有限公司 红包互动方法、装置、计算机设备、可读存储介质
CN113014989A (zh) * 2021-02-26 2021-06-22 拉扎斯网络科技(上海)有限公司 视频互动方法、电子设备和计算机可读存储介质
CN115145507A (zh) * 2021-03-15 2022-10-04 华为技术有限公司 基于多设备的在线互动方法、芯片、电子设备及存储介质
CN113099275A (zh) * 2021-03-16 2021-07-09 互影科技(北京)有限公司 互动视频的用户行为统计方法、装置及设备
CN113038236A (zh) * 2021-03-17 2021-06-25 北京字跳网络技术有限公司 一种视频处理方法、装置、电子设备及存储介质
CN113101646B (zh) * 2021-04-09 2023-11-28 北京达佳互联信息技术有限公司 视频处理方法、装置及系统
US20220335977A1 (en) * 2021-04-20 2022-10-20 Beijing Dajia Internet Information Technology Co., Ltd. Method and apparatus for editing object, electronic device and storage medium
CN113518253A (zh) * 2021-04-29 2021-10-19 广州酷狗计算机科技有限公司 歌曲播放方法、装置、终端设备及存储介质
CN113286159B (zh) * 2021-05-14 2022-05-31 北京字跳网络技术有限公司 应用程序的页面显示方法、装置和设备
CN113301441B (zh) * 2021-05-21 2023-02-03 北京字跳网络技术有限公司 应用程序的交互方法、装置和电子设备
CN113490004B (zh) * 2021-06-29 2022-07-05 腾讯科技(深圳)有限公司 一种直播互动方法及相关装置
CN113407744A (zh) * 2021-07-15 2021-09-17 北京达佳互联信息技术有限公司 资源展示方法、装置、计算机设备及介质
CN113542853A (zh) * 2021-07-20 2021-10-22 北京字跳网络技术有限公司 视频互动方法、装置、电子设备和存储介质
CN113784213A (zh) * 2021-10-08 2021-12-10 智令互动(深圳)科技有限公司 基于非线编模式的互动视频编辑器的互动控件实现方法
CN114125501A (zh) * 2021-10-30 2022-03-01 杭州当虹科技股份有限公司 互动视频生成方法及其播放方法和装置
CN114125566B (zh) * 2021-12-29 2024-03-08 阿里巴巴(中国)有限公司 互动方法、系统及电子设备
CN114327214A (zh) * 2022-01-05 2022-04-12 北京有竹居网络技术有限公司 交互方法、装置、电子设备、存储介质及计算机程序产品
CN114501101B (zh) * 2022-01-19 2024-01-02 北京达佳互联信息技术有限公司 视频互动方法、装置、电子设备及计算机程序产品
CN114610198B (zh) * 2022-03-10 2024-06-11 抖音视界有限公司 基于虚拟资源的交互方法、装置、设备和存储介质
CN117014648A (zh) * 2022-04-28 2023-11-07 北京字跳网络技术有限公司 一种视频处理方法、装置、设备及存储介质
CN115278334A (zh) * 2022-07-12 2022-11-01 阿里巴巴(中国)有限公司 视频互动方法、装置、设备及存储介质
CN115348478B (zh) * 2022-07-25 2023-09-19 深圳市九洲电器有限公司 设备交互显示方法、装置、电子设备及可读存储介质
CN115297272B (zh) * 2022-08-01 2024-03-15 北京字跳网络技术有限公司 一种视频处理方法、装置、设备及存储介质
CN116226446B (zh) * 2023-05-06 2023-07-18 深圳市人马互动科技有限公司 一种互动项目的互动方法及相关装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160067609A1 (en) * 2012-03-15 2016-03-10 Game Complex. Inc. Novel real time physical reality immersive experiences having gamification of actions taken in physical reality
CN106341695A (zh) * 2016-08-31 2017-01-18 腾讯数码(天津)有限公司 直播间互动方法、装置及系统
CN106534941A (zh) * 2016-10-31 2017-03-22 腾讯科技(深圳)有限公司 实现视频互动的方法和装置
CN108769814A (zh) * 2018-06-01 2018-11-06 腾讯科技(深圳)有限公司 视频互动方法、装置及可读介质

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6845485B1 (en) * 1999-07-15 2005-01-18 Hotv, Inc. Method and apparatus for indicating story-line changes by mining closed-caption-text
US6580437B1 (en) * 2000-06-26 2003-06-17 Siemens Corporate Research, Inc. System for organizing videos based on closed-caption information
US7725910B2 (en) * 2001-05-03 2010-05-25 Sony Corporation Interactive broadcast system and method with different content displayed to different viewers
US20050108773A1 (en) * 2003-10-04 2005-05-19 Samsung Electronics Co., Ltd. Information storage medium with AV data including non-multiplexed streams recorded thereon, and method of and apparatus for recording and reproducing the same
US20070300273A1 (en) * 2006-06-21 2007-12-27 Gary Turner Interactive television application and content enhancement
US8595781B2 (en) * 2009-05-29 2013-11-26 Cognitive Media Networks, Inc. Methods for identifying video segments and displaying contextual targeted content on a connected television
US8930985B2 (en) * 2009-12-30 2015-01-06 Verizon Patent And Licensing Inc. Trigger-based transactional advertising for television
US20110300916A1 (en) * 2010-06-07 2011-12-08 Patchen Jeffery Allen Multi-Level Competition/Game, Talent, and Award Show Productions Systems, Methods and Apparatus
WO2012051585A1 (fr) * 2010-10-14 2012-04-19 Fixmaster, Inc. Système et procédé pour créer et analyser des expériences interactives
US20120167145A1 (en) * 2010-12-28 2012-06-28 White Square Media, LLC Method and apparatus for providing or utilizing interactive video with tagged objects
US9082092B1 (en) * 2012-10-01 2015-07-14 Google Inc. Interactive digital media items with multiple storylines
US9986307B2 (en) * 2013-07-19 2018-05-29 Bottle Rocket LLC Interactive video viewing
AU2013273829A1 (en) * 2013-12-23 2015-07-09 Canon Kabushiki Kaisha Time constrained augmented reality
CN103873945A (zh) * 2014-02-21 2014-06-18 周良文 与视频节目中对象进行社交的系统、方法
US9930405B2 (en) * 2014-09-30 2018-03-27 Rovi Guides, Inc. Systems and methods for presenting user selected scenes
US20160217136A1 (en) * 2015-01-22 2016-07-28 Itagit Technologies Fz-Llc Systems and methods for provision of content data
US9837124B2 (en) * 2015-06-30 2017-12-05 Microsoft Technology Licensing, Llc Layered interactive video platform for interactive video experiences
CA3216076A1 (fr) * 2015-07-16 2017-01-19 Inscape Data, Inc. Detection de segments multimedias communs
US20170034237A1 (en) * 2015-07-28 2017-02-02 Giga Entertainment Media Inc. Interactive Content Streaming Over Live Media Content
CN105451086A (zh) * 2015-09-22 2016-03-30 合一网络技术(北京)有限公司 一种实现视频互动的方法及装置
US10003853B2 (en) * 2016-04-14 2018-06-19 One Gold Tooth, Llc System and methods for verifying and displaying a video segment via an online platform
CN107222788A (zh) * 2016-08-31 2017-09-29 北京正阳天马信息技术有限公司 一种基于视频播放过程的交互问答系统实现方法
CN106534993A (zh) * 2016-09-27 2017-03-22 乐视控股(北京)有限公司 一种信息交互方法及装置
TWI630822B (zh) * 2017-03-14 2018-07-21 王公誠 互動影片發送系統及裝置
EP3639261B1 (fr) * 2017-05-05 2023-08-30 Unity IPR APS Applications contextuelles dans un environnement de réalité mixte
US10057310B1 (en) * 2017-06-12 2018-08-21 Facebook, Inc. Interactive spectating interface for live videos
CN107231568A (zh) * 2017-08-01 2017-10-03 腾讯科技(深圳)有限公司 一种媒体播放方法、服务器及终端设备
US10636252B2 (en) * 2017-10-02 2020-04-28 Everi Games, Inc. Gaming machine and method having bonus game trigger adjustments based on supplemental data
CN107945596A (zh) * 2017-12-25 2018-04-20 成都福润得科技有限责任公司 一种便于灵活教学的交互式教学方法
US20190214055A1 (en) * 2018-01-09 2019-07-11 Splic, Inc. Methods and systems for creating seamless interactive video content
US11671670B2 (en) * 2018-02-13 2023-06-06 Hq Trivia Llc System and interfaces for providing an interactive system
CN108989692A (zh) * 2018-10-19 2018-12-11 北京微播视界科技有限公司 视频拍摄方法、装置、电子设备及计算机可读存储介质
USD947233S1 (en) * 2018-12-21 2022-03-29 Streamlayer, Inc. Display screen or portion thereof with transitional graphical user interface

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160067609A1 (en) * 2012-03-15 2016-03-10 Game Complex. Inc. Novel real time physical reality immersive experiences having gamification of actions taken in physical reality
CN106341695A (zh) * 2016-08-31 2017-01-18 腾讯数码(天津)有限公司 直播间互动方法、装置及系统
CN106534941A (zh) * 2016-10-31 2017-03-22 腾讯科技(深圳)有限公司 实现视频互动的方法和装置
CN108769814A (zh) * 2018-06-01 2018-11-06 腾讯科技(深圳)有限公司 视频互动方法、装置及可读介质

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111400014B (zh) * 2020-03-23 2023-10-13 Oppo广东移动通信有限公司 终端进程切换方法、终端及存储介质
CN111400014A (zh) * 2020-03-23 2020-07-10 Oppo广东移动通信有限公司 终端进程切换方法、终端及存储介质
WO2022057722A1 (fr) * 2020-09-15 2022-03-24 腾讯科技(深圳)有限公司 Procédé, système et appareil d'essai de programme, dispositif et support
EP4175309A4 (fr) * 2020-09-30 2024-03-13 Beijing Zitiao Network Technology Co Ltd Procédés de traitement vidéo et d'interaction basée sur une vidéo, appareil, dispositif et support d'enregistrement
EP4170475A4 (fr) * 2020-09-30 2023-12-20 Beijing Zitiao Network Technology Co., Ltd. Procédé, appareil et dispositif pour une interaction basée sur une vidéo, et support d'enregistrement
CN113411657A (zh) * 2021-06-16 2021-09-17 湖南快乐阳光互动娱乐传媒有限公司 一种视频播放控制方法、装置及电子设备
CN114125498A (zh) * 2021-11-24 2022-03-01 北京百度网讯科技有限公司 视频数据处理方法、装置、设备以及存储介质
CN114125498B (zh) * 2021-11-24 2024-02-27 北京百度网讯科技有限公司 视频数据处理方法、装置、设备以及存储介质
CN115103232B (zh) * 2022-07-07 2023-12-08 北京字跳网络技术有限公司 一种视频播放方法、装置、设备和存储介质
CN115103232A (zh) * 2022-07-07 2022-09-23 北京字跳网络技术有限公司 一种视频播放方法、装置、设备和存储介质
CN115515014B (zh) * 2022-09-26 2024-01-26 北京字跳网络技术有限公司 媒体内容的分享方法、装置、电子设备和存储介质
CN115515014A (zh) * 2022-09-26 2022-12-23 北京字跳网络技术有限公司 媒体内容的分享方法、装置、电子设备和存储介质
CN116304133B (zh) * 2023-05-23 2023-07-25 深圳市人马互动科技有限公司 一种图片生成方法及相关装置
CN116304133A (zh) * 2023-05-23 2023-06-23 深圳市人马互动科技有限公司 一种图片生成方法及相关装置

Also Published As

Publication number Publication date
CN108769814A (zh) 2018-11-06
US20200336804A1 (en) 2020-10-22
US11178471B2 (en) 2021-11-16
CN108769814B (zh) 2022-02-01

Similar Documents

Publication Publication Date Title
WO2019228120A1 (fr) Procédé et dispositif d'interaction vidéo, terminal, et support de stockage
US20210306700A1 (en) Method for displaying interaction information, and terminal
CN109286852B (zh) 直播间的竞赛方法及装置
CN109600678B (zh) 信息展示方法、装置及系统、服务器、终端、存储介质
CN112929687B (zh) 基于直播视频的互动方法、装置、设备及存储介质
CN110198484B (zh) 消息推送方法、装置及设备
US11516303B2 (en) Method for displaying media resources and terminal
CN109729372B (zh) 直播间切换方法、装置、终端、服务器及存储介质
CN112118477B (zh) 虚拟礼物展示方法、装置、设备以及存储介质
CN111901658B (zh) 评论信息显示方法、装置、终端及存储介质
US20220191557A1 (en) Method for displaying interaction data and electronic device
CN112258241A (zh) 页面展示方法、装置、终端以及存储介质
CN109660855A (zh) 贴纸显示方法、装置、终端及存储介质
CN112261481B (zh) 互动视频的创建方法、装置、设备及可读存储介质
CN113490010B (zh) 基于直播视频的互动方法、装置、设备及存储介质
CN114205324A (zh) 消息显示方法、装置、终端、服务器及存储介质
CN112492339A (zh) 直播方法、装置、服务器、终端以及存储介质
CN111327916B (zh) 基于地理对象的直播管理方法、装置、设备及存储介质
CN112995759A (zh) 互动业务处理方法、系统、装置、设备及存储介质
CN113395566B (zh) 视频播放方法、装置、电子设备及计算机可读存储介质
WO2019170118A1 (fr) Procédé, dispositif et appareil de lecture vidéo
CN114116053A (zh) 资源展示方法、装置、计算机设备及介质
CN112969093A (zh) 互动业务处理方法、装置、设备及存储介质
CN113230655A (zh) 虚拟对象的控制方法、装置、设备、系统及可读存储介质
CN113204671A (zh) 资源展示方法、装置、终端、服务器、介质及产品

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19810584

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19810584

Country of ref document: EP

Kind code of ref document: A1