CN106998486B - Video playing method and device


Info

Publication number
CN106998486B
CN106998486B (application CN201610045719.6A)
Authority
CN
China
Prior art keywords
video
option
time node
playing
interaction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610045719.6A
Other languages
Chinese (zh)
Other versions
CN106998486A (en)
Inventor
苏德仁
刘孝园
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaoxiong Bowang Technology Co., Ltd.
Original Assignee
Baidu Online Network Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Baidu Online Network Technology Beijing Co Ltd filed Critical Baidu Online Network Technology Beijing Co Ltd
Priority to CN201610045719.6A priority Critical patent/CN106998486B/en
Publication of CN106998486A publication Critical patent/CN106998486A/en
Application granted granted Critical
Publication of CN106998486B publication Critical patent/CN106998486B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47202End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting content on demand, e.g. video on demand
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4126The peripheral being portable, e.g. PDAs or mobile phones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44222Analytics of user selections, e.g. selection of programs or purchase activity
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • H04N21/8456Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments

Abstract

The application discloses a video playing method and device. The video comprises at least one video segment. One implementation of the method comprises the following steps: in response to the video being played to a preset time node, presenting at least one interaction option on the playing interface; detecting the user's operation on the interaction options to determine the selected interaction option; determining, from the at least one video segment, the video segment to be played that corresponds to the selected interaction option; and playing that video segment. This implementation can provide more targeted video content, simplifies the implementation of video interaction, and reduces development cost.

Description

Video playing method and device
Technical Field
The present application relates to the field of computer technology, in particular to the field of video data processing technology, and more particularly to a video playing method and apparatus.
Background
During video playing, a user can manually select a playing node and pause or resume playback, but cannot interact with the video content. From a video automatically pushed by the terminal's background, the user can acquire preset information. Because only this preset information can be obtained from the pushed video, the pushed content is limited and poorly targeted; the user may not be interested in it, and the pushing effect is poor.
At present, HTML5-based video interaction technology has a high development cost, and user interaction during video playing is difficult to realize, so a simpler interactive video playing method is needed.
Disclosure of Invention
In view of the above, it is desirable to provide a more targeted video playing method. In order to solve the technical problem, the application provides a video playing method and device.
In one aspect, the present application provides a method for playing a video, where the video includes at least one video segment, and the method includes: in response to the video being played to a preset time node, presenting at least one interaction option on the playing interface; detecting the user's operation on the interaction options to determine the selected interaction option; determining, from the at least one video segment, the video segment to be played that corresponds to the selected interaction option; and playing the video segment to be played.
In some embodiments, the method further comprises: acquiring a time node of the at least one video clip; and determining a video clip to be played corresponding to the selected interactive option from the at least one video clip, including: determining a time node corresponding to the selected interactive option; and determining the video clip to be played based on the time node of the at least one video clip and the time node corresponding to the selected interactive option.
In some embodiments, the time node of the at least one video segment comprises at least a start time node of the at least one video segment; and the determining of the video clip to be played based on the time node of the at least one video clip and the time node corresponding to the selected interactive option comprises: and finding out the video clip with the starting time node consistent with the time node corresponding to the selected interactive option from the at least one video clip as the video clip to be played.
In some embodiments, the method further comprises: and configuring a corresponding time node for each video clip.
In some embodiments, the method further comprises: and associating the interaction options with the time nodes of the video segments to establish the corresponding relation between the interaction options and the time nodes of the video segments.
In some embodiments, detecting the user's operation on the interaction options to determine the selected interaction option includes: detecting whether the user selects an interaction option; in response to detecting that the user selects an interaction option, determining the selected interaction option; and in response to detecting that the user does not select an interaction option within a preset time period, automatically selecting one interaction option as the selected interaction option.
In some embodiments, the manner of presenting at least one interactive option in the playing interface of the video includes: floating display and pop-up window display.
In some embodiments, the method further comprises: counting the playing quantity of each video clip in the video; determining interest point information of the user based on the playing quantity of each video clip; and pushing the video to the user based on the interest point information of the user.
In a second aspect, the present application provides a video playing apparatus, where the video comprises at least one video segment. The apparatus comprises: a presentation unit for presenting at least one interaction option on the playing interface in response to the video being played to a preset time node; a detection unit for detecting the user's operation on the interaction options to determine the selected interaction option; a determining unit for determining, from the at least one video segment, the video segment to be played corresponding to the selected interaction option; and a playing unit for playing the video segment to be played.
In some embodiments, the apparatus further comprises: an obtaining unit, configured to obtain a time node of the at least one video segment; and the determining unit is further configured to determine the video segment to be played corresponding to the selected interactive option as follows: determining a time node corresponding to the selected interactive option; and determining the video clip to be played based on the time node of the at least one video clip and the time node corresponding to the selected interactive option.
In some embodiments, the time node of the at least one video segment comprises at least a start time node of the at least one video segment; and the determining unit is further used for determining the video clip to be played as follows: and finding out the video clip with the starting time node consistent with the time node corresponding to the selected interactive option from the at least one video clip as the video clip to be played.
In some embodiments, the apparatus further comprises: and the configuration unit is used for configuring a corresponding time node for each video clip.
In some embodiments, the apparatus further comprises: and the association unit is used for associating the interaction options with the time nodes of the video clips so as to establish the corresponding relation between the interaction options and the time nodes of the video clips.
In some embodiments, the detection unit is further configured to: detecting whether a user selects the interaction option; in response to detecting that a user selects an interactive option, determining the selected interactive option; and in response to the fact that the user does not select the interaction option within a preset time period, automatically selecting one interaction option as the selected interaction option.
In some embodiments, the presentation unit is configured to present at least one interactive option in a playback interface of the video as follows: floating display and pop-up window display.
In some embodiments, the apparatus further comprises a pushing unit for: counting the playing quantity of each video clip in the video; determining interest point information of the user based on the playing quantity of each video clip; and pushing the video to the user based on the interest point information of the user.
According to the video playing method and device, the interaction options are provided when the video is played, and the video is played by jumping to the corresponding video clip based on the selected interaction options, so that more targeted video content can be provided, the video interaction implementation process is simplified, and the development cost is reduced.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a video playback method according to the present application;
fig. 3 is a schematic effect diagram of an application scenario of a video playing method according to the present application;
FIG. 4 is a flow diagram of another embodiment of a video playback method according to the present application;
FIG. 5 is a scene schematic diagram of a specific implementation of an embodiment of a video playback method according to the present application;
FIG. 6 is a schematic diagram of configuring a time node for a video segment in the scene shown in FIG. 5;
FIG. 7 is a block diagram of an embodiment of a video playback device according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a computer system suitable for implementing the terminal device or the server according to the embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the video playback method or video playback apparatus of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user 110 may use the terminal devices 101, 102, 103 to interact with the server 105 over the network 104 to receive or send messages or the like. Various video playing applications may be installed on the terminal devices 101, 102, 103.
The terminal devices 101, 102, 103 may be various electronic devices that have a display screen and support video playing, including but not limited to smart phones, tablet computers, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop portable computers, desktop computers, and the like.
The server 105 may be a server providing various services, such as a background video server providing support for video playing on the terminal devices 101, 102, 103. The background video server may analyze and perform other processing on the received video playing request, and feed back a processing result (e.g., video data) to the terminal device.
It should be noted that the video playing method provided in the embodiment of the present application is generally executed by the terminal devices 101, 102, and 103, and accordingly, the video playing apparatus is generally disposed in the terminal devices 101, 102, and 103.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a video playback method according to the present application is shown. In this embodiment, the video includes at least one video clip. The video playing method comprises the following steps:
step 201, responding to the video playing to the preset time node, and presenting at least one interaction option on the playing interface.
In this embodiment, the electronic device on which the video playing method is running may receive a video playing request of a user, and play the video data on the electronic device after acquiring the corresponding video data from the background video server. The video to be played acquired from the background video server may be a video requested to be played by a user, or may also be video data configured in advance in the background video server, for example, an advertisement video associated with a video selected to be played by the user in an online playing platform. The video to be played may be stored in the background video server in the form of a plurality of video clips.
When the video is played to the preset time node, the electronic device can present at least one interaction option on the playing interface. The preset time node may be set according to the time node corresponding to each video segment of the video to be played. The video to be played can be divided into a plurality of video segments, each corresponding to a different time node; alternatively, a plurality of video segments may be spliced into the video to be played, with a corresponding time node set for each segment. The time node corresponding to a video segment can therefore be set as the preset time node, so that when playback reaches that time node, the interaction options can be presented on the playing interface.
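As a minimal sketch only (the field names and the TypeScript form below are assumptions, not the disclosed data format), the segment-and-time-node arrangement described above could be configured as follows, with a segment whose end is a preset time node carrying the interaction options to present there:

```typescript
// Hypothetical configuration structure for a video split into segments with time nodes.
interface TimeNode {
  start: number; // start time node, in seconds on the full video's time axis
  end: number;   // end time node, in seconds
}

interface InteractionOption {
  label: string;           // text shown on the playing interface, e.g. "shopping mall"
  targetSegmentId: string; // video segment associated with this option
}

interface VideoSegment {
  id: string;
  node: TimeNode;
  // options presented when playback reaches this segment's end (a preset time node)
  options?: InteractionOption[];
}

// Example: segment "0" ends at a preset time node offering two branches (labels taken
// from the Fig. 3 example; the durations are made up for illustration).
const segments: VideoSegment[] = [
  {
    id: "0",
    node: { start: 0, end: 50 },
    options: [
      { label: "drive home", targetSegmentId: "11" },
      { label: "go to the show", targetSegmentId: "21" },
    ],
  },
  { id: "11", node: { start: 50, end: 110 } },
  { id: "21", node: { start: 110, end: 170 } },
];
```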
The interaction options are used by the user to select the next video segment to be played. In this embodiment, at least one interaction option may be presented on the playing interface, and each interaction option is associated with a video segment. In some optional implementations, the content of an interaction option is consistent with the content of its associated video segment and may be a summary or a keyword of that content; for example, when the content of the video segment is shopping in a mall, the content of the interaction option associated with that segment may be "shopping mall". In further implementations, the content of an interaction option is related to the content of the currently playing video segment and may be a keyword determined according to a prediction based on that content; for example, when the currently playing segment is an advertisement for a certain product, the interaction options may include purchasing that brand of product, trying other products of the same brand, and the like.
In some optional implementations of this embodiment, when the video is played to the preset time node, the interaction options may be presented after playback is paused. The interaction options may be presented in various manners: for example, a mask layer may be created over the video playing interface and the interaction options presented in it as a pop-up window; the interaction options may also be presented directly on the video playing interface as a floating display; and, optionally, if the electronic device supports voice recognition, the interaction options may also be presented by voice prompt.
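For an HTML5-style player, the "play to a preset time node, pause, present options" behaviour described above could look roughly like the sketch below; it uses standard browser APIs, and the presentOptions callback (pop-up window, floating layer, or voice prompt) is left abstract as an assumption:

```typescript
// Watch playback progress and present interaction options at a preset time node.
function watchForTimeNode(
  video: HTMLVideoElement,
  presetNodeSec: number,        // preset time node, in seconds
  presentOptions: () => void,   // e.g. render a pop-up window or floating layer of options
): void {
  let presented = false;
  video.addEventListener("timeupdate", () => {
    if (!presented && video.currentTime >= presetNodeSec) {
      presented = true;
      video.pause();            // optionally pause before presenting the options
      presentOptions();
    }
  });
}
```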
Step 202, detecting the operation of the user on the interaction option to determine the selected interaction option.
In this embodiment, the electronic device may detect a selection operation performed by the user on an interaction option. In some implementations, an interface for selecting an interaction option can be configured when the interaction options are presented in step 201. The user may select the interaction option of interest through this interface, and from the parameters returned by the interface it can be determined whether the user has performed a selection operation and which interaction option was selected. Further, a selection instruction can be generated according to the user's selection operation.
In some optional implementations of this embodiment, detecting the user's operation on the interaction options to determine the selected interaction option may be performed as follows: detecting whether the user selects an interaction option; in response to detecting that the user selects an interaction option, determining the selected interaction option; and in response to detecting that the user has not selected an interaction option within a preset time period, automatically selecting one interaction option as the selected interaction option. If the electronic device does not detect a selection operation within the preset time period, an interaction option can be selected randomly or according to a preset rule. The preset rule may be a ranking of the user's interests determined from behavior log information such as the user's search records, web browsing records and video playing records, from the user's attribute information, and the like.
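A hedged sketch of this detection-with-fallback step is shown below; the ten-second timeout and the random fallback are assumptions, and, as noted above, a real implementation might instead rank options by the user's behavior log or attribute information:

```typescript
// Wait for the user to select an option; auto-select one if the preset period expires.
function awaitSelection<Option>(
  options: Option[],
  onSelect: (chosen: Option) => void,
  timeoutMs = 10_000,           // preset time period (assumed value)
): (picked: Option) => void {
  let done = false;
  const timer = setTimeout(() => {
    if (done) return;
    done = true;
    // no selection detected within the preset period: pick one option automatically
    onSelect(options[Math.floor(Math.random() * options.length)]);
  }, timeoutMs);

  // the returned handler is wired to the option controls' click events
  return (picked: Option) => {
    if (done) return;
    done = true;
    clearTimeout(timer);
    onSelect(picked);
  };
}
```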
Step 203, determining a video segment to be played corresponding to the selected interactive option from at least one video segment.
In this embodiment, the electronic device on which the video playing method operates may determine, from at least one video clip of the video to be played, a video clip to be played corresponding to the selected interactive option. The electronic equipment can associate the interactive options and the video clips in a one-to-one correspondence mode in advance. After the selected interaction option is determined, the electronic device may determine, according to the association relationship between the interaction option and the video segment, a video segment to be played corresponding to the selected interaction option. In some optional implementation manners of this embodiment, the electronic device may analyze a correlation between the content of the selected interaction option and the content of the at least one video clip, and use the video clip with the highest correlation as the video clip to be played corresponding to the selected interaction option.
In an actual scenario, the user may select an interested option from the interactive options presented in the playing interface, and then select the interested option through an interface provided by the electronic device (e.g., clickable link, voice input interface). At this time, the electronic device may find the video segment corresponding to the interested option selected by the user according to the correspondence table between the interactive option and the video segment.
And step 204, playing the video clip to be played.
After the video clip to be played is determined, the electronic device can read the video data of the video clip to be played and directly jump to the initial playing node of the video clip to be played for playing.
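Steps 203 and 204 amount to a lookup in the option-to-segment correspondence followed by a seek. The sketch below assumes a Map keyed by segment id and a standard HTMLVideoElement; the helper names are illustrative only:

```typescript
// Look up the segment associated with the selected option and jump to its start node.
interface Segment {
  id: string;
  start: number;                 // start playing node, in seconds
}

function playSelectedSegment(
  video: HTMLVideoElement,
  segmentsById: Map<string, Segment>,
  selectedSegmentId: string,
): void {
  const segment = segmentsById.get(selectedSegmentId);
  if (!segment) return;              // unknown option: leave playback unchanged
  video.currentTime = segment.start; // jump directly to the segment's start playing node
  void video.play();
}
```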
In an actual scenario, video content scripts for multiple development directions can be configured and the video segmented into multiple video segments, with each direction corresponding to one interaction option and one video segment. After one video segment finishes playing, it is judged whether a preset time node has been reached; if so, the interaction options are presented for the user to select; if not, the next video segment can be played in time order.
In some optional implementations of this embodiment, the video playing method may further include the following steps: counting the number of plays of each video segment in the video; determining the user's interest point information based on those play counts; and pushing videos to the user based on that interest point information. If the video is played repeatedly, the electronic device can identify the video segments with higher play counts, analyze their content, and extract the information the user pays more attention to, such as topics and keywords, as interest point information; videos related to this interest point information can then be found in the background video server and pushed to the user.
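The optional push flow can be pictured as a per-segment play counter plus a ranking step, as in the sketch below. The keyword source (keywordsBySegment) is an assumed input; how keywords are actually extracted from segment content is not specified here:

```typescript
// Count plays per segment and derive interest-point keywords from the most-played ones.
const playCounts = new Map<string, number>();          // segment id -> play count

function recordPlay(segmentId: string): void {
  playCounts.set(segmentId, (playCounts.get(segmentId) ?? 0) + 1);
}

function interestPoints(
  keywordsBySegment: Map<string, string[]>,            // assumed: keywords per segment
  topN = 3,
): string[] {
  return [...playCounts.entries()]
    .sort((a, b) => b[1] - a[1])                       // most-played segments first
    .slice(0, topN)
    .flatMap(([segmentId]) => keywordsBySegment.get(segmentId) ?? []);
}
```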
Please refer to fig. 3, which shows an effect diagram of an application scenario of the video playing method according to the present application. As shown in fig. 3, the time axis 301 of the currently played video has a plurality of nodes, which divide the video into a plurality of video segments. When the video is played to the preset time node 310, playback is paused and the interaction options 302 and 303 are presented on the playing interface 31, where interaction option 302 corresponds to option A, "drive home", and interaction option 303 corresponds to option B, "go to the show". The user's selection operation may then be detected. If it is detected that the user selects option A, the electronic device on which the video playing method runs can find the video segment corresponding to interaction option 302, jump to the start node 3A of that segment, and continue playing the video; if it is detected that the user selects option B in the playing interface 31, the electronic device can find the video segment corresponding to interaction option 303, jump to the start node 3B of that segment, and continue playing the video. Thus, during video playing, interaction options can be presented for the user to select whenever the playing of any one or more video segments finishes, which improves the user's degree of participation and allows the video to be played in a targeted manner.
According to the video playing method provided by this embodiment of the application, interaction options are provided during video playing and the corresponding video segments are played according to the interaction options selected by the user. The user can obtain more of the information they are interested in from the played video, realizing richer and more targeted video playing.
With further reference to fig. 4, a flow diagram of another embodiment of a video playback method according to the present application is shown. In this embodiment, the video to be played includes a plurality of video clips. As shown in fig. 4, the video playing method 400 includes the following steps:
step 401, obtaining a time node of at least one video segment.
In this embodiment, when the electronic device on which the video playing method runs acquires the video segments from the background video server, the time node of each video segment may be acquired as well. The time node of a video segment may be the time period that the segment occupies on the time axis of the entire video; in some optional implementations, it may include a start time node and an end time node. For example, if the total duration of a video is 5 minutes and the start and end time nodes of its first video segment are 0 minutes 0 seconds and 0 minutes 50 seconds respectively, the time node of the first segment can be represented as [0:0, 0:50].
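The bracketed notation is simply minutes:seconds on the video's time axis. A tiny helper (an assumption for illustration, not part of the disclosure) makes the example concrete:

```typescript
// Convert the "m:ss" notation used in the text into seconds so that time nodes can be
// compared against the player's current position.
function toSeconds(node: string): number {
  const [minutes, seconds] = node.split(":").map(Number);
  return minutes * 60 + seconds;
}

// The first segment of the 5-minute example video: [0:0, 0:50] -> { start: 0, end: 50 }
const firstSegment = { start: toSeconds("0:0"), end: toSeconds("0:50") };
```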
In some embodiments, the video playing method may further include the step of configuring a time node for the video segment. The video can be divided into a plurality of video segments according to the configured time nodes, or the time nodes of the video segments can be configured according to the playing sequence of the video segments. The configured time nodes may include a start time node and an end time node.
Step 402, responding to the video playing to the preset time node, and presenting at least one interaction option on the playing interface.
The preset time node may be the end time node of a video segment, so when the video is played to the preset time node, one video segment has just finished playing. At this point, playback may be paused and the interaction options presented on the playing interface. The interaction options presented may be pre-configured options associated with other video segments. In this embodiment, the electronic device may monitor the current playing progress to determine whether a preset time node has been reached, and if so, present the pre-configured interaction options on the playing interface for the user watching the video to select.
And 403, detecting the operation of the user on the interaction option to determine the selected interaction option.
In this embodiment, the electronic device may detect a selection operation performed by a user on an interactive option. In some implementations, the interface for selecting the interaction option can be configured while the interaction option is presented in step 201. Whether the selected operation is performed by the user can be judged by detecting the data returned by the interface, so that the selected interaction option is determined. Further, the selected instruction can be generated according to the selected operation of the user. The selected instruction may include information related to the selected interactive option, such as a time node or a video segment corresponding to the interactive option.
In some optional implementations of this embodiment, after the interaction options are presented, whether the user performs the selected operation may be continuously monitored, and if the user's selected operation is not detected within a preset time period, one interaction option may be selected randomly or based on a behavior log of the user and the selected instruction may be generated.
The steps 402 and 403 are the same as the steps 201 and 202 in the previous embodiment, and are not described again here.
Step 404, determining a time node corresponding to the selected interactive option.
In this embodiment, a time node may be configured for the interaction option. When a user selects an interaction option through an interface provided by the electronic device, the electronic device can determine a target time node according to the interaction option selected by the user. The target time node may be a start time node of the video segment to be played.
In some embodiments, the video playing method may further include the step of associating the interaction options with the time nodes of the video segments to establish a correspondence between them. That is, the interaction options may be associated with the time nodes of the video segments in advance, and in step 404 the electronic device may determine the time node associated with the selected interaction option according to this pre-established correspondence.
Step 405, determining the video segment to be played based on the time node of at least one video segment and the time node corresponding to the selected interactive option.
In this embodiment, if the time node of a video segment is the time period that the segment occupies on the time axis of the video, the video segment whose time period contains the time node corresponding to the selected interaction option can be found and used as the video segment to be played.
In some optional implementation manners of this embodiment, if the video segment is configured with a start time node, and the time nodes of the video segment obtained in step 401 at least include the start time node, the electronic device may find, from at least one video segment, a video segment whose start time node is consistent with the time node corresponding to the selected interactive option, as the video segment to be played.
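The matching rule in this implementation is an equality check between the selected option's time node and the segments' start time nodes. The sketch below adds a small tolerance because playback times are floating-point; the tolerance itself is an assumption:

```typescript
// Find the segment whose start time node matches the selected option's time node.
interface SegmentNode {
  id: string;
  start: number;                 // start time node, in seconds
  end: number;                   // end time node, in seconds
}

function findSegmentToPlay(
  segments: SegmentNode[],
  optionTimeNode: number,        // time node associated with the selected option
  toleranceSec = 0.01,           // assumed tolerance for floating-point times
): SegmentNode | undefined {
  return segments.find((s) => Math.abs(s.start - optionTimeNode) < toleranceSec);
}
```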
Step 406, playing the video clip to be played.
After the video clip to be played is determined, the electronic device can read the video data of the video clip to be played and directly jump to the initial playing node of the video clip to be played for playing.
In an actual application scenario, a plurality of time nodes may be set in a video as selection points, and each selection point may correspond to a plurality of video clips to be played. Fig. 5 is a scene diagram illustrating a specific implementation of an embodiment of a video playing method according to the present application. In fig. 5, the video to be played includes 11 video segments: video segment 0, video segment 11, video segment 21, video segment 111, video segment 112, video segment 12, video segment 13, video segment 211, video segment 212, video segment 22, and video segment 23. And 7 preset time nodes are configured: selection point a, selection point B, selection point C, selection point D, selection point E, selection point F, and selection point G. When the corresponding preset time node is played, the corresponding interaction option can be presented on the playing interface, and the corresponding video clip is played according to the selection of the user. For example, when the video segment 0 is played, the first preset time node, that is, the selection point a is reached, at this time, two interaction options are presented on the playing interface, which respectively correspond to the video segment 11 and the video segment 21, and if the user selects the interaction option corresponding to the video segment 11, the video segment 11 continues to be played. Then, when the video is played to a second preset time node, that is, the selection point B, two interaction options are presented to the user, which correspond to the video segment 111 and the video segment 112, respectively. The video segment 111 or the video segment 112 continues to be played according to the user's selection. If the user selects the interaction option corresponding to the video segment 111, when the user plays to a third preset time node, that is, the selection point D, two interaction options are presented to the user, which respectively correspond to the video segment 12 and the video segment 13, and if the user selects the interaction option corresponding to the video segment 12, the video segment 12 is played. In the process, only the video segment 0, the video segment 11 corresponding to the interactive option selected by the user, the video segment 111 and the video segment 12 are played, and other video segments are not played.
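The Fig. 5 scenario is essentially a branching table: each selection point (preset time node) lists the candidate segments it offers. A sketch of that table, using the segment numbering from the text (the table form itself is an assumption):

```typescript
// Branching table for the Fig. 5 scenario: selection point -> candidate segments.
const selectionPoints: Record<string, { afterSegment: string; candidates: string[] }> = {
  A: { afterSegment: "0", candidates: ["11", "21"] },
  B: { afterSegment: "11", candidates: ["111", "112"] },
  D: { afterSegment: "111", candidates: ["12", "13"] },
  // selection points C, E, F and G branch analogously along the other paths
};

// The path actually played in the example: 0 -> 11 -> 111 -> 12; all other segments
// remain unplayed.
const playedPath = ["0", "11", "111", "12"];
```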
With further reference to fig. 6, a schematic diagram of configuring a time node for a video segment in the scene shown in fig. 5 is shown. As shown in fig. 6, the playback time axis of the video is divided into a plurality of time segments, each of which corresponds to one video clip. The duration of each time period is equal to the playing duration of the corresponding video clip. The time nodes configured for the video segments may coincide with preset time nodes (including selection point a, selection point B, selection point C, selection point D, selection point E, selection point F, and selection point G). Thus, when the playing of a video clip is finished, the interactive options can be presented on the playing interface.
With continuing reference to fig. 7, as an implementation of the method shown in the above-mentioned figures, the present application provides an embodiment of a video playing apparatus, which corresponds to the embodiment of the method shown in fig. 2, and which can be applied to various electronic devices.
In this embodiment, the video includes at least one video clip. As shown in fig. 7, the video playback device 700 according to this embodiment includes: a presentation unit 701, a detection unit 702, a determination unit 703, and a playback unit 704. The presentation unit 701 is configured to present at least one interaction option on a play interface in response to the video being played to a preset time node; the detection unit 702 is configured to detect an operation of a user on an interaction option to determine a selected interaction option; the determining unit 703 is configured to determine, from at least one video segment, a video segment to be played corresponding to the selected interaction option; the playing unit 704 is used for playing the video clip to be played.
In this embodiment, the video playing apparatus 700 may detect whether the video is played to the preset time node, and if so, the presenting unit 701 may present at least one interaction option on the playing interface in a pop-up window or floating display manner. Wherein each interactive option may be associated with a video clip. The content of the interaction option may be consistent with the content of the associated video segment, such as a summary or keywords of the content of the associated video segment.
The detection unit 702 may detect the user's selection operation on an interaction option. In some implementations, the video playing apparatus can configure an interface for performing selection operations on the interaction options. The user may select the interaction option of interest through this interface, and from the parameters returned by the interface it can be determined whether the user has performed a selection operation and which interaction option was selected. The detection unit 702 may further detect whether the user performs a selection operation on the interaction options, determine the selected interaction option in response to detecting such an operation, and automatically select one interaction option as the selected interaction option in response to detecting that the user has not selected any interaction option within a preset time period. The automatic selection may be random or follow a preset rule; the preset rule may be a ranking of the user's level of interest in the interaction options obtained by analyzing the user's behavior log.
The determining unit 703 may search for a video to be played corresponding to the interaction option selected by the user. In some embodiments, the video playing apparatus 700 may pre-store a corresponding relationship table between the interaction options and the video segments, and at this time, the determining unit 703 may search for the video to be played corresponding to the interaction option selected by the user in the corresponding relationship table. In other embodiments, if time nodes are configured for the video segments and the interaction options in advance, the video segments of the time nodes configured with the selected interaction options can be used as the video segments to be played.
The playing unit 704 can read the video data of the video segment to be played, and directly jump to the starting playing node of the video segment to be played for playing.
In some embodiments, the video playback device 700 may further include a configuration unit, an association unit, and an acquisition unit (not shown). The configuration unit may be configured to configure a corresponding time node for each video segment, and the obtaining unit may be configured to obtain the time node of at least one video segment. The association unit may be configured to associate the interaction option with a time node of the video segment, so as to establish a correspondence relationship between the interaction option and the time node of the video segment. At this time, the determining unit 703 may determine a time node corresponding to the selected interactive option, and determine the video segment to be played according to the time node of at least one video segment and the time node corresponding to the selected interactive option. Further, if the time nodes configured for the video segments by the configuration unit at least include a start time node, the determination unit 703 may find out, from at least one video segment, a video segment whose start time node is consistent with the time node corresponding to the selected interactive option, as the video segment to be played.
In some embodiments, the apparatus 700 may further comprise a pushing unit (not shown) for pushing videos to the user as follows: counting the number of plays of each video segment in the video, determining the user's interest point information based on those play counts, and pushing videos to the user based on the interest point information. The pushing unit can analyze the user's preferences according to the content of the video segments with a large number of plays, so that videos the user may be interested in can be pushed according to those preferences, thereby achieving more targeted video pushing.
Those skilled in the art will appreciate that the video playback device 700 described above also includes some other well-known structures, such as processors, memories, etc., which are not shown in fig. 7 in order to not unnecessarily obscure embodiments of the present disclosure.
It should be understood that the elements recited in apparatus 700 correspond to various steps in the method described with reference to fig. 2. Thus, the operations and features described above for the video playing method are also applicable to the apparatus 700 and the units included therein, and are not described herein again. Corresponding elements in the apparatus 700 may cooperate with elements in the terminal device and/or the server to implement aspects of embodiments of the present application.
The video playing method provided by the embodiment of the application can simplify the realization of video interaction, reduce the development cost of interactive video and provide video content with rich pertinence.
Referring now to FIG. 8, shown is a block diagram of a computer system 800 suitable for use in implementing a terminal device or server of an embodiment of the present application.
As shown in fig. 8, the computer system 800 includes a Central Processing Unit (CPU) 801 that can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 802 or a program loaded from a storage section 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data necessary for the operation of the system 800 are also stored. The CPU 801, ROM 802, and RAM 803 are connected to each other via a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
The following components are connected to the I/O interface 805: an input portion 806 including a keyboard, a mouse, and the like; an output section 807 including a display such as a cathode ray tube (CRT) or a liquid crystal display (LCD), a speaker, and the like; a storage portion 808 including a hard disk and the like; and a communication section 809 including a network interface card such as a LAN card or a modem. The communication section 809 performs communication processing via a network such as the Internet. A drive 810 is also connected to the I/O interface 805 as needed. A removable medium 811, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 810 as needed, so that a computer program read from it can be installed into the storage section 808 as needed.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 809 and/or installed from the removable medium 811. The computer program performs the above-described functions defined in the method of the present application when executed by the Central Processing Unit (CPU) 801.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or by hardware. The described units may also be provided in a processor, which may be described as: a processor including a presentation unit, a detection unit, a determination unit, a playback unit, and a generation unit. In some cases the names of the units do not constitute a limitation on the units themselves; for example, the presentation unit may also be described as "a unit that presents at least one interaction option on the playing interface in response to the video being played to a preset time node".
As another aspect, the present application also provides a non-volatile computer storage medium, which may be the non-volatile computer storage medium included in the apparatus in the above-described embodiments; or it may be a non-volatile computer storage medium that exists separately and is not incorporated into the terminal. The non-transitory computer storage medium stores one or more programs that, when executed by a device, cause the device to: responding to the video playing to a preset time node, and presenting at least one interaction option on a playing interface; detecting the operation of a user on the interaction option to determine the selected interaction option; determining a video clip to be played corresponding to the selected interaction option from the at least one video clip; and playing the video clip to be played.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by a person skilled in the art that the scope of the invention as referred to in the present application is not limited to the embodiments with a specific combination of the above-mentioned features, but also covers other embodiments with any combination of the above-mentioned features or their equivalents without departing from the inventive concept. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (12)

1. A method for playing a video, wherein the video comprises at least one video segment, the method comprising:
responding to the video playing to a preset time node, and presenting at least one interaction option on a playing interface;
detecting the operation of a user on the interaction option;
if the user is detected to select the interaction option, determining the selected interaction option, and determining a video clip to be played corresponding to the selected interaction option from the at least one video clip;
playing the video clip to be played;
if the user is not detected to select the interaction option within a preset time period, automatically selecting one interaction option as the selected interaction option;
wherein automatically selecting one of the interaction options as the selected interaction option comprises:
counting the playing quantity of each video clip in the video;
determining interest point information of a user based on the playing quantity of each video clip in the video;
and pushing the video to the user based on the interest point information of the user.
2. The method of claim 1, further comprising:
acquiring a time node of the at least one video clip; and
the determining the video segment to be played corresponding to the selected interactive option from the at least one video segment includes:
determining a time node corresponding to the selected interactive option;
and determining the video clip to be played based on the time node of the at least one video clip and the time node corresponding to the selected interactive option.
3. The method of claim 2, wherein the time node of the at least one video segment comprises at least a start time node of the at least one video segment; and
the determining a video clip to be played based on the time node of the at least one video clip and the time node corresponding to the selected interactive option includes:
and finding out the video clip with the starting time node consistent with the time node corresponding to the selected interactive option from the at least one video clip as the video clip to be played.
4. The method of claim 1, further comprising:
and configuring a corresponding time node for each video clip.
5. The method of claim 4, further comprising:
and associating the interaction options with the time nodes of the video segments to establish the corresponding relation between the interaction options and the time nodes of the video segments.
6. The method of claim 1, wherein presenting at least one interactive option in a video playback interface comprises: floating display and pop-up window display.
7. A video playback apparatus, wherein the video includes at least one video segment, the apparatus comprising:
the presentation unit is used for responding to the video playing to the preset time node and presenting at least one interaction option on the playing interface;
the detection unit is used for detecting whether a user selects the interaction option; in response to detecting that the user performs selection operation on the interaction option, determining the selected interaction option; responding to the fact that the user does not select the interaction option within a preset time period, and automatically selecting one interaction option as the selected interaction option;
the determining unit is used for determining a video clip to be played corresponding to the selected interaction option from the at least one video clip;
the playing unit is used for playing the video clip to be played;
the pushing unit is used for counting the playing quantity of each video clip in the video when the user is not detected to select the interaction option within a preset time period; determining interest point information of a user based on the playing quantity of each video clip in the video; and pushing the video to the user based on the interest point information of the user.
8. The apparatus of claim 7, further comprising:
an obtaining unit, configured to obtain a time node of the at least one video segment; and
the determining unit is further configured to determine a to-be-played video segment corresponding to the selected interactive option as follows:
determining a time node corresponding to the selected interactive option;
and determining the video clip to be played based on the time node of the at least one video clip and the time node corresponding to the selected interactive option.
9. The apparatus of claim 8, wherein the time node of the at least one video segment comprises at least a start time node of the at least one video segment; and
the determining unit is further configured to determine a video segment to be played as follows:
and finding out the video clip with the starting time node consistent with the time node corresponding to the selected interactive option from the at least one video clip as the video clip to be played.
10. The apparatus of claim 7, further comprising:
and the configuration unit is used for configuring a corresponding time node for each video clip.
11. The apparatus of claim 10, further comprising:
and the association unit is used for associating the interaction options with the time nodes of the video clips so as to establish the corresponding relation between the interaction options and the time nodes of the video clips.
12. The apparatus according to claim 7, wherein the presentation unit is configured to present at least one interactive option in a playback interface of the video as follows: floating display and pop-up window display.
CN201610045719.6A 2016-01-22 2016-01-22 Video playing method and device Active CN106998486B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610045719.6A CN106998486B (en) 2016-01-22 2016-01-22 Video playing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610045719.6A CN106998486B (en) 2016-01-22 2016-01-22 Video playing method and device

Publications (2)

Publication Number Publication Date
CN106998486A CN106998486A (en) 2017-08-01
CN106998486B true CN106998486B (en) 2020-03-10

Family

ID=59428386

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610045719.6A Active CN106998486B (en) 2016-01-22 2016-01-22 Video playing method and device

Country Status (1)

Country Link
CN (1) CN106998486B (en)

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107786903B (en) * 2016-08-31 2020-04-17 北京国双科技有限公司 Video interaction method and device
CN109816411B (en) * 2017-11-21 2022-12-20 腾讯科技(深圳)有限公司 Resource popularization information processing method, device and system and computer equipment
CN109963202A (en) * 2017-12-22 2019-07-02 上海全土豆文化传播有限公司 Video broadcasting method and device
CN109982142A (en) * 2017-12-28 2019-07-05 优酷网络技术(北京)有限公司 Video broadcasting method and device
CN110198484B (en) * 2018-02-27 2021-09-14 腾讯科技(深圳)有限公司 Message pushing method, device and equipment
CN108810637A (en) * 2018-06-12 2018-11-13 优视科技有限公司 Video broadcasting method, device and terminal device
CN108966016B (en) * 2018-08-29 2021-10-26 北京奇艺世纪科技有限公司 Video clip rebroadcasting method and device and terminal equipment
CN109167950B (en) 2018-10-25 2020-05-12 腾讯科技(深圳)有限公司 Video recording method, video playing method, device, equipment and storage medium
CN112104921A (en) * 2019-06-18 2020-12-18 上海哔哩哔哩科技有限公司 Video playing method and device and computer equipment
CN112104920A (en) * 2019-06-18 2020-12-18 上海哔哩哔哩科技有限公司 Interactive video file generation method and device, computer equipment and storage medium
CN112104908A (en) * 2019-06-18 2020-12-18 上海哔哩哔哩科技有限公司 Audio and video file playing method and device, computer equipment and readable storage medium
CN112637657A (en) * 2019-09-24 2021-04-09 广州虎牙科技有限公司 Interactive video playing control method, device and system
CN110784752B (en) * 2019-09-27 2022-01-11 腾讯科技(深圳)有限公司 Video interaction method and device, computer equipment and storage medium
CN112584218A (en) * 2019-09-27 2021-03-30 腾讯科技(深圳)有限公司 Video playing method and device, computer equipment and storage medium
CN110719530A (en) * 2019-10-21 2020-01-21 北京达佳互联信息技术有限公司 Video playing method and device, electronic equipment and storage medium
CN112887787A (en) * 2019-11-29 2021-06-01 阿里巴巴集团控股有限公司 Interactive video generation method and device, object generation method and device and electronic equipment
CN112969098A (en) * 2019-12-13 2021-06-15 阿里巴巴集团控股有限公司 Engine architecture and apparatus for interactive video
CN112995764B (en) * 2019-12-18 2023-03-31 北京奇艺世纪科技有限公司 Video playing method and device, electronic equipment and computer readable storage medium
CN111031395A (en) * 2019-12-19 2020-04-17 北京奇艺世纪科技有限公司 Video playing method, device, terminal and storage medium
CN111031379B (en) * 2019-12-19 2022-04-12 北京奇艺世纪科技有限公司 Video playing method, device, terminal and storage medium
CN111225292B (en) * 2020-01-15 2022-05-06 北京奇艺世纪科技有限公司 Information display method and device, storage medium and electronic device
CN111556370B (en) * 2020-04-02 2022-10-25 北京奇艺世纪科技有限公司 Interactive video interaction method, device, system and storage medium
CN111669639A (en) * 2020-06-15 2020-09-15 北京字节跳动网络技术有限公司 Display method and device of movable entrance, electronic equipment and storage medium
CN113031842B (en) * 2021-04-12 2023-02-28 北京有竹居网络技术有限公司 Video-based interaction method and device, storage medium and electronic equipment
CN113573129B (en) * 2021-06-11 2023-10-13 阿里巴巴(中国)网络技术有限公司 Commodity object display video processing method and device
CN113438510A (en) * 2021-06-24 2021-09-24 湖南快乐阳光互动娱乐传媒有限公司 Method and playing system for realizing interactive video watching by multiple persons
CN113542844A (en) * 2021-07-28 2021-10-22 北京优酷科技有限公司 Video data processing method, device and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102917258A (en) * 2012-10-12 2013-02-06 深圳Tcl新技术有限公司 Video playing method, terminal and system based on video contents
CN103888847A (en) * 2014-03-27 2014-06-25 西安电子科技大学 Vehicular ad hoc network video transmission method based on overlay structure
WO2015032342A1 (en) * 2013-09-06 2015-03-12 乐视致新电子科技(天津)有限公司 Information displaying method and apparatus
CN104469508A (en) * 2013-09-13 2015-03-25 中国电信股份有限公司 Method, server and system for performing video positioning based on bullet screen information content
CN104794179A (en) * 2015-04-07 2015-07-22 无锡天脉聚源传媒科技有限公司 Video quick indexing method and device based on knowledge tree

Also Published As

Publication number Publication date
CN106998486A (en) 2017-08-01

Similar Documents

Publication Publication Date Title
CN106998486B (en) Video playing method and device
CN109413483B (en) Live content preview method, device, equipment and medium
WO2021244205A1 (en) Interaction scenario start up method, apparatus, storage medium, client end, and server
US8990328B1 (en) Facilitating media streaming with social interaction
CN109640129B (en) Video recommendation method and device, client device, server and storage medium
EP3193509B1 (en) Video advertisement filtering method, device and equipment
US20150172787A1 (en) Customized movie trailers
CN102298947A (en) Method for carrying out playing switching among multimedia players and equipment
CN107896334B (en) live broadcast method and device
US9635337B1 (en) Dynamically generated media trailers
KR101991188B1 (en) Promotion information processing method, device, and apparatus, and non-volatile computer storage medium
CN109862100B (en) Method and device for pushing information
CN110958470A (en) Multimedia content processing method, device, medium and electronic equipment
CN114071179B (en) Live broadcast preview method, device, equipment and medium
CN110913135A (en) Video shooting method and device, electronic equipment and storage medium
US20170171334A1 (en) Single-account multiple-preference recommendation method for video website and electronic device
CN110784751A (en) Information display method and device
CN111163348A (en) Searching method and device based on video playing
WO2022134555A1 (en) Video processing method and terminal
US20170272793A1 (en) Media content recommendation method and device
CN108600780B (en) Method for pushing information, electronic device and computer readable medium
CN111046292A (en) Live broadcast recommendation method and device, computer-readable storage medium and electronic device
CN112333463A (en) Program recommendation method, system, device and readable storage medium
CN113556568B (en) Cloud application running method, system, device and storage medium
KR20190101914A (en) Apparatus and method for streaming video

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20181226

Address after: 100000 Room 708, 7th Floor, Building 10, No. 30, Shixing Street, Shijingshan District, Beijing

Applicant after: Beijing Xiaoxiong Bowang Technology Co., Ltd.

Address before: 100085 Baidu Building, 10 Shangdi Tenth Street, Haidian District, Beijing

Applicant before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.

GR01 Patent grant