CN110708571A - Video clip playing control method and related product - Google Patents

Video clip playing control method and related product

Info

Publication number
CN110708571A
Authority
CN
China
Prior art keywords
video
video clip
bullet screen
playing
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910994135.7A
Other languages
Chinese (zh)
Other versions
CN110708571B (en)
Inventor
孔凡阳 (Kong Fanyang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Co., Ltd.
Priority to CN201910994135.7A
Publication of CN110708571A
Application granted
Publication of CN110708571B
Legal status: Active (current)
Anticipated expiration


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/23614Multiplexing of additional data and video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/239Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests
    • H04N21/2393Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests involving handling client requests

Abstract

The application discloses a video clip playing control method and a related product. The method comprises the following steps: when a server receives a video playing request from a playing device, it first determines the video clip the request asks to play, obtains the bullet screen information generated by the original video in the time period corresponding to the video clip (the video clip having been cut from the original video), and then sends the video clip and the bullet screen information to the playing device. While playing the video clip, the playing device can therefore display the bullet screen information of the corresponding segment of the original video, which increases the number of bullet screens displayed during playback of the clip and improves the user's viewing experience.

Description

Video clip playing control method and related product
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method for controlling playback of a video clip and a related product.
Background
Many video playing platforms allow users to send bullet screens (barrages) while watching a video to share their viewing impressions. The bullet screens sent at each time point are displayed over the currently played video, so a viewer can see as many of the comments sent by other users as possible while watching. This improves the viewing experience, encourages users to discuss the video being played, and increases the video's viewing rate.
However, when an operator cuts a large number of highlight segments out of a video and publishes the clips on an operation page for standalone viewing, such a video clip displays few or no bullet screens during playback. The clip may be too short for viewers to have time to send bullet screens, or the clip may show so few bullet screens that users are not prompted to add their own to an existing discussion. The scarcity of bullet screens displayed while a video clip is playing hurts the viewing experience and may even reduce the video viewing rate.
Disclosure of Invention
The application provides a video clip playing control method and a related product, which increase the number of bullet screens displayed while a video clip is playing, thereby improving the user's viewing experience and the video viewing rate.
In a first aspect, a method for controlling playback of a video segment is provided, including: receiving a playing request sent by playing equipment, and determining a video clip requested to be played by the playing request; determining a bullet screen storage position, wherein the bullet screen storage position is used for storing bullet screen information of an original video, and the video clip is intercepted from the original video; acquiring first bullet screen information from the bullet screen storage position, wherein the first bullet screen information comprises bullet screen information generated in a time period corresponding to the video clip; and sending the video clip and the first barrage information to the playing equipment.
In the embodiment of the application, when the server receives a video playing request from a playing device, it determines the video clip requested to be played, obtains the first barrage information generated by the original video in the time period corresponding to the video clip (the video clip having been cut from the original video), and then sends the video clip and the first barrage information to the playing device. While playing the video clip, the playing device can therefore display the barrage information of the corresponding time period of the original video, which increases the number of barrages displayed during playback of the clip and improves the user's viewing experience.
In an optional implementation manner, before the sending the video segment and the first barrage information to the playing device, the method further includes: determining second barrage information generated by the video clip; the sending the video clip and the first barrage information to the playing device includes: and sending the video clip, the first barrage information and the second barrage information to the playing equipment.
In this implementation, when the server receives a playing device's request to play a video clip, it sends the playing device the video clip, the first barrage information generated by the original video in the time period corresponding to the clip, and the second barrage information generated by the clip itself, which further increases the number of barrages displayed during playback of the clip and improves the user's viewing experience.
In an optional implementation manner, after the sending the video clip and the first barrage information to the playback device, the method further includes: receiving third barrage information sent for the video clip by the playback device and/or playback devices other than the playback device; and sending the third barrage information to the playback device.
In this implementation, after the server sends the video clip and the first barrage information to the playing device, other playing devices may play the same video clip during the same time period while the playing device plays it. The server may therefore receive third barrage information sent for the clip by the playing device and/or by playing devices other than it, and forward that third barrage information to the playing device, so that both the first and third barrage information can be displayed while the clip is playing, which increases the number of barrages displayed and improves the user's viewing experience.
In an optional implementation manner, after the sending the video clip and the first barrage information to the playback device, the method further includes: acquiring fourth barrage information from the barrage storage position, wherein the fourth barrage information comprises newly added barrage information generated in a time period corresponding to the video clip; and sending the fourth bullet screen information to the playing equipment.
In this implementation, since the barrage information of the original video is continuously updated, the original video may generate newly added fourth barrage information in the time period corresponding to the video clip after the server has sent the clip and the first barrage information to the playing device. The server sends this fourth barrage information to the playing device, so that both the first and fourth barrage information can be displayed while the clip is playing, which increases the number of barrages displayed and improves the user's viewing experience.
In an optional implementation manner, after the sending the video clip, the first barrage information, and the second barrage information to the playback device, the method further includes: receiving third barrage information sent for the video clip by the playback device and/or playback devices other than the playback device; and sending the third barrage information to the playback device.
In this implementation, as described above, the server sends the playing device the video clip, the first barrage information (the barrage information generated by the original video in the time period corresponding to the clip), and the second barrage information (the barrage information generated by the clip). Because the clip's barrage information is continuously updated, the server may, while the playing device plays the clip, receive third barrage information sent for the clip by the playing device and/or by other playing devices and forward it to the playing device, so that the first, second, and third barrage information can all be displayed while the clip is playing, which increases the number of barrages displayed and improves the user's viewing experience.
In an optional implementation manner, after the sending the video clip, the first barrage information, and the second barrage information to the playback device, the method further includes: acquiring fourth barrage information from the barrage storage position, wherein the fourth barrage information comprises newly added barrage information generated in a time period corresponding to the video clip; and sending the fourth bullet screen information to the playing equipment.
In this implementation, as described above, the server sends the playing device the video clip, the first barrage information, and the second barrage information, that is, the clip, the barrage information generated by the original video in the time period corresponding to the clip, and the barrage information generated by the clip. Because the barrage information of the original video is continuously updated, the segment of the original video corresponding to the clip may generate newly added fourth barrage information while the playing device plays the clip; the server sends this fourth barrage information to the playing device, so that the first, second, and fourth barrage information can all be displayed while the clip is playing, which increases the number of barrages displayed and improves the user's viewing experience.
In an optional implementation manner, the determining a bullet screen storage location, where the bullet screen storage location is used to store bullet screen information of an original video, includes: inquiring the associated identification code of the video clip, wherein the associated identification code comprises a video identification code; and determining the bullet screen storage position of the original video corresponding to the video identification code.
In an optional implementation manner, the acquiring first bullet screen information from the bullet screen storage location includes: inquiring a first association time point and a second association time point of the video clip; and acquiring first bullet screen information generated by the video segments from the first associated time point to the second associated time point in the original video from the bullet screen storage position.
In a second aspect, a server is provided, including: the device comprises a determining module, a playing module and a playing module, wherein the determining module is used for determining a video clip requested to be played by a playing request when the playing request sent by playing equipment is received; the determining module is further configured to determine a bullet screen storage location, where the bullet screen storage location is used to store bullet screen information of an original video, and the video clip is captured from the original video; the acquisition module is used for acquiring first bullet screen information from the bullet screen storage position, wherein the first bullet screen information comprises bullet screen information generated in a time period corresponding to the video clip; and the sending module is used for sending the video clip and the first barrage information to the playing equipment.
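As a rough illustration of the modular split described in this second aspect (a determining module, an obtaining module, and a sending module), the Python sketch below uses hypothetical class, method, and field names; it only pictures the division of responsibilities and is not an implementation taken from this application.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Barrage:
    """One bullet screen entry (hypothetical structure)."""
    offset_s: float  # playback offset within the video, in seconds
    text: str


class ClipPlaybackServer:
    """Sketch of the three modules: determining, obtaining, sending."""

    def __init__(self, clip_index, barrage_store, network):
        self.clip_index = clip_index        # used by the determining module
        self.barrage_store = barrage_store  # used by the obtaining module
        self.network = network              # used by the sending module

    def handle_play_request(self, device_id: str, clip_id: str) -> None:
        # Determining module: resolve the requested clip and the bullet screen
        # storage location of the original video it was cut from.
        clip = self.clip_index.lookup(clip_id)
        location = self.clip_index.barrage_location(clip.source_video_id)

        # Obtaining module: first bullet screen information = the original
        # video's bullet screens within the clip's time window.
        first_barrage: List[Barrage] = self.barrage_store.query(
            location, clip.start_s, clip.end_s)

        # Sending module: deliver the clip and the bullet screens to the device.
        self.network.send(device_id, clip, first_barrage)
```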
In an optional implementation manner, the determining module is further configured to determine second barrage information generated by the video segment before the video segment and the first barrage information are sent to the playing device; the sending module is specifically configured to send the video clip, the first barrage information, and the second barrage information to the playback device.
In an optional implementation manner, the server further includes a receiving module, configured to receive, after the video clip and the first barrage information are sent to the playing device, third barrage information sent to the video clip by the playing device and/or a playing device other than the playing device; the sending module is further configured to send the third bullet screen information to the playing device.
In an optional implementation manner, the obtaining module is further configured to obtain fourth bullet screen information from the bullet screen storage location after the video clip and the first bullet screen information are sent to the playing device, where the fourth bullet screen information includes new bullet screen information generated in a time period corresponding to the video clip; the sending module is further configured to send the fourth bullet screen information to the playing device.
In an optional implementation manner, the receiving module is further configured to receive, after the video clip, the first barrage information, and the second barrage information are sent to the playing device, third barrage information sent to the video clip by the playing device and/or a playing device other than the playing device; the sending module is further configured to send the third bullet screen information to the playing device.
In an optional implementation manner, the obtaining module is further configured to obtain fourth bullet screen information from the bullet screen storage location after the video clip, the first bullet screen information, and the second bullet screen information are sent to the playing device, where the fourth bullet screen information includes new bullet screen information generated in a time period corresponding to the video clip; the sending module is further configured to send the fourth bullet screen information to the playing device.
In an alternative implementation manner, the determining module is specifically configured to query an association identification code of the video segment, where the association identification code includes a video identification code; and is used for determining the bullet screen storage position of the original video corresponding to the video identification code.
In an optional implementation manner, the obtaining module is specifically configured to query a first associated time point and a second associated time point of the video clip, and to acquire, from the bullet screen storage location, the first bullet screen information generated by the original video between the first associated time point and the second associated time point.
In a third aspect, an electronic device is provided, including: a processor, a memory; the processor is configured to support the electronic device to perform corresponding functions in the method of the first aspect and any optional implementation manner thereof. The memory stores programs (instructions) and data necessary for the electronic device. Optionally, the electronic device may further include an input/output interface for supporting communication between the electronic device and other apparatuses.
In a fourth aspect, there is provided a computer-readable storage medium having stored therein instructions, which, when run on a computer, cause the computer to perform the method of the first aspect and any one of its optional implementations described above.
In a fifth aspect, there is provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the method of the first aspect and any of its alternative implementations.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present application, the drawings required to be used in the embodiments or the background art of the present application will be described below.
Fig. 1a is a screenshot of a video playing process according to an embodiment of the present application;
fig. 1b is a screenshot of another video playing process according to an embodiment of the present application;
fig. 2 is a schematic diagram of a network architecture according to an embodiment of the present application;
fig. 3 is a schematic flowchart of a video segment playing control method according to an embodiment of the present application;
fig. 4 is a screenshot of a bullet screen list of a video according to an embodiment of the present application;
fig. 5 is a schematic interface diagram when a video segment is intercepted according to an embodiment of the present application;
fig. 6 is a schematic diagram of a video capture dialog box according to an embodiment of the present application;
fig. 7 is a schematic flowchart of another video segment playing control method according to an embodiment of the present application;
fig. 8 is a schematic flowchart of another video segment playing control method according to an embodiment of the present application;
fig. 9 is a schematic flowchart of another video segment playing control method according to an embodiment of the present application;
fig. 10a is a schematic flowchart of another video segment playing control method according to an embodiment of the present application;
fig. 10b is a schematic view of a scene during playing of a video clip according to an embodiment of the present application;
fig. 10c is a schematic diagram of receiving a bullet screen when a playing device plays a video clip according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a server according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of another server according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. The described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
In order to describe the scheme of the present application more clearly, the following first introduces some video clip playing situations.
The video clips mentioned in the embodiments of the present application are all clips cut from an original video. Taking the animated movie "Monkey King: Hero Is Back" as an example of an original video, bullet screens can be seen drifting across the screen at almost every time point in the movie, as shown in fig. 1 a; fig. 1a is a screenshot of the movie played on the Tencent Video client, and as can be seen from fig. 1a, the complete duration of the movie is 1 hour 25 minutes 18 seconds, and multiple bullet screens appear on the screen at the 21 minute 56 second mark. However, a video clip with a duration of 1 minute 32 seconds cut from the movie shows almost no bullet screens when played, as shown in fig. 1 b; fig. 1b is a screenshot of that clip being played on the Tencent Video client, and as can be seen from fig. 1b, almost no bullet screen drifts across the screen. That is, bullet screens can be seen on the screen at almost every time point when the original video is played, but hardly any can be seen when a clip cut from the original video is played; even if many bullet screens appear in a time period of the original video, a clip cut from that same time period still shows almost no bullet screens during playback. To address this problem, the present application provides a video clip playing control method that increases the number of bullet screens displayed while a video clip is playing.
Fig. 2 is a schematic diagram of a network architecture according to an embodiment of the present application. As shown in fig. 2, the network architecture includes a server and a plurality of playback devices; the playing device 1, the playing device 2, …, and the playing device N may be respectively connected to a server via a network, so as to interact with the server and receive the video clips and the barrage information sent by the server. The playing device can be a device which can play videos and display barrage, such as a mobile phone, a tablet computer, a notebook computer, a desktop computer and the like. The server may receive, at the same time, the playing requests of the multiple playing devices for the same video segment, and the multiple playing devices may also play the same video segment at the same time in the same time period.
In the embodiment of the present application, an example of selecting one playback device from the multiple playback devices is first described, and operations performed by the server when the playback device interacts with the server are described. Referring to fig. 3, fig. 3 is a schematic flowchart illustrating a video clip playing control method according to an embodiment of the present application.
301. Receiving a playing request sent by playing equipment, and determining a video clip requested to be played by the playing request.
The server receives a playing request sent by a playing device and determines the video clip the request asks to play; here the server is the one that stores and manages the video clip (unless otherwise stated, "the server" below refers to the server that maintains the video clip). The server may be the server in fig. 2, and the playing device may be any one of the playing devices in fig. 2. Specifically, when a user initiates a video clip playing request on the playing device, the playing device sends a request to the server asking for the relevant information of the clip; the server responds with script information, which includes clip-related information such as the file size, playing duration, and download address of the video clip; the playing device then initiates a resource download request to the server using the download address in the script information, and the server determines from the download address in that request which video data of the clip to send to the playing device.
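A minimal sketch of this two-step exchange (script information first, then the media download), assuming hypothetical field and helper names; the actual format of the script information is not specified here.

```python
def handle_play_request(server, clip_id: str) -> dict:
    """Step one: answer the play request with the clip's script information."""
    clip = server.lookup_clip(clip_id)      # hypothetical lookup by clip ID
    return {
        "file_size": clip.file_size,        # size of the video clip file
        "duration_s": clip.duration_s,      # playing time length
        "download_url": clip.download_url,  # address the device will fetch from
    }


def handle_download_request(server, download_url: str):
    """Step two: map the download address in the request to the clip's video data."""
    clip = server.lookup_clip_by_url(download_url)
    return server.read_video_data(clip)
```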
302. Determining the bullet screen storage location.
The bullet screen storage location is used to store the bullet screen information of an original video, and the video clip is cut from that original video. Specifically, the server determines the bullet screen storage location where the original video's bullet screen information is stored, the video clip determined in 301 to be sent to the playing device having been cut from that original video. For example, taking the movie "Monkey King: Hero Is Back" shown in fig. 1a and fig. 1b and its clip as an example, the original video is the movie with a duration of 1 hour 25 minutes 18 seconds, and the video clip is the 1 minute 32 second segment cut from the movie; when the server determines in 301 that the 1 minute 32 second clip is to be sent to the playing device, it then needs to determine in 302 the bullet screen storage location of the 1 hour 25 minute 18 second movie.
In an alternative implementation, the server assigns each video a distinguishing video identification code when managing video resources. For a video clip cut from an original video, the server assigns the clip its own video identification code and also assigns it an associated identification code, and the associated identification code of the video clip contains the identification code information of the corresponding original video. The identification code may be a character string combining numbers, letters, and special symbols; for example, the server configures "XYJ000" for the movie "Monkey King: Hero Is Back" and configures "ID-XYJ000", "XYJ000-01", or simply "XYJ000" for a video clip cut from the movie. After the server determines the video clip to be sent to the playing device, it may query the associated identification code of the video clip, extract the video identification code of the corresponding original video from the associated identification code, and then determine the bullet screen storage location of the original video corresponding to that video identification code.
Further, in an optional implementation, when the server manages video resources and their bullet screens, it establishes a mapping relationship between video identification codes and bullet screen storage addresses, so that once a video identification code is determined, the bullet screen storage address of that video can be obtained from the mapping. After the server extracts the video identification code of the corresponding original video from the associated identification code of the video clip, it can therefore query the bullet screen storage address of the original video through this mapping. In another optional implementation, the server manages video resources by category, for example storing video content, video information, and the bullet screens generated by a video separately, where the video information includes the video file size, file type, playing duration, video content download address, bullet screen download address, and other related information, and the server also establishes a one-to-one correspondence between video identification codes and video information; in this case, after extracting the video identification code of the corresponding original video from the associated identification code of the video clip, the server first determines the video information corresponding to that identification code and then extracts the bullet screen download address from the video information. It should be noted that, when managing a video resource and its bullet screens, the server may establish the correspondence between them in a direct or indirect manner, so that once a video resource is determined (for example, by its video identification code), the bullet screen storage address can be found through the correspondence; the correspondence may take many specific forms.
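Both variants described above reduce to following a chain from the clip's associated identification code to the original video's bullet screen storage address. The sketch below uses in-memory dictionaries as a hypothetical stand-in for whatever index the server actually keeps; the "XYJ000"/"XYJ000-01" codes reuse the example above.

```python
# Hypothetical indexes kept by the server (the real storage form is not specified).
clip_associated_id = {"XYJ000-01": "XYJ000"}  # clip ID -> original video ID
video_info = {
    "XYJ000": {
        "duration": "1:25:18",
        "barrage_address": "barrage-db://videos/XYJ000",  # bullet screen storage
    },
}


def barrage_location_for_clip(clip_id: str) -> str:
    """Resolve a clip to the bullet screen storage address of its original video."""
    original_id = clip_associated_id[clip_id]          # query the associated ID
    return video_info[original_id]["barrage_address"]  # mapping or video-info lookup


print(barrage_location_for_clip("XYJ000-01"))  # -> barrage-db://videos/XYJ000
```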
303. Acquiring first bullet screen information from the bullet screen storage location.
The first bullet screen information includes the bullet screen information generated in the time period corresponding to the video clip. After determining the bullet screen storage location of the original video as described in 302, the server may obtain the first bullet screen information from that location, where the first bullet screen information refers to the bullet screen information generated by the original video in the time period corresponding to the video clip. For example, still taking the movie "Monkey King: Hero Is Back" shown in fig. 1a and fig. 1b and its clip as an example, the original video is the movie with a duration of 1 hour 25 minutes 18 seconds and the video clip is the 1 minute 32 second segment cut from it; assuming the clip was cut from the movie's content between 21 minutes 10 seconds and 22 minutes 42 seconds, the first bullet screen information is the movie's bullet screen information within the period from 21 minutes 10 seconds to 22 minutes 42 seconds. As shown in fig. 4, which is a screenshot of the movie's bullet screen list for the period from 21 minutes 16 seconds to 21 minutes 25 seconds, the server maintains all the bullet screen information in the chronological order in which the bullet screens appear in the movie.
In an optional implementation, for a video clip cut from an original video, the server (here, the server managing the original video, which may be the same server that maintains the video clip or a different one; the same applies below and is not repeated) records and stores the clipping time points of the video clip within the original video as the associated time point information of the video clip. For example, when cutting from the original video, the user may manually enter the clipping time points. Taking the movie "Monkey King: Hero Is Back" as an example, the user may click the "create a feature" button in the menu options below the playing interface while watching the movie, as shown in fig. 5; the interface then pops up a video capture dialog box, as shown in fig. 6, in which the original video title line displays the movie name, the video clip title line displays a custom name entered by the user for the clip to be captured, and the time input boxes below the title lines allow the user to enter the start time point and end time point of the clip. Finally, the client sends the start and end time points entered by the user to the server managing the original video, which cuts the original video according to those time points to obtain the video clip and stores the time points as part of the video information of the clip. As another example, when cutting from the original video, the user may select the clipping time points through a pop-up box: still taking the same movie as an example, when the user clicks the "create a feature" button in the interface shown in fig. 5, the interface pops up a video capture dialog box that, unlike the previous case, provides a time selector unit, so the user can select the start and end time points of the clip with the selector rather than typing them; likewise, the client sends the selected start and end time points to the server managing the original video, which performs the capture and stores the time point information.
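A rough sketch of what the server managing the original video might do with the start and end time points submitted from the capture dialog; the helper calls and field names are assumptions for illustration only.

```python
def create_clip(server, original_video_id: str, clip_name: str,
                start_point: str, end_point: str) -> dict:
    """Cut a clip and store its clipping time points as associated time points."""
    clip = server.cut_video(original_video_id, start_point, end_point)
    clip_info = {
        "clip_id": clip.id,
        "name": clip_name,
        "associated_id": original_video_id,     # identification code of the original video
        "first_associated_point": start_point,  # e.g. "21:10"
        "second_associated_point": end_point,   # e.g. "22:42"
    }
    server.store_video_info(clip.id, clip_info)  # later used to find the barrage window
    return clip_info
```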
Whichever way the user cuts from the original video, the server managing the original video ultimately obtains and stores the clipping time points of the video clip within the original video; if that server distributes the cut video clip to other servers for maintenance, it also distributes the video clip information containing the clipping time points to the corresponding server. For example, when the video content of the movie from 21 minutes 10 seconds to 22 minutes 42 seconds is cut into a video clip, the server managing the original video may record and save "21 minutes 10 seconds" as the first associated time point and "22 minutes 42 seconds" as the second associated time point in the video information of the clip. The server maintaining the video clip can then determine the first and second associated time points by querying the clip's information, and obtain from the bullet screen storage location determined in 302 the bullet screen information of the original video between the first associated time point and the second associated time point, which is the first bullet screen information.
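A minimal sketch of this time-window query, assuming each stored bullet screen carries a playback offset in seconds; the 21 min 10 s to 22 min 42 s window reuses the example above, and everything else is hypothetical.

```python
from typing import List, Tuple

Barrage = Tuple[int, str]  # (playback offset in seconds, text)


def get_first_barrage(store: List[Barrage], first_point_s: int,
                      second_point_s: int) -> List[Barrage]:
    """Original video's bullet screens between the two associated time points."""
    return [b for b in store if first_point_s <= b[0] <= second_point_s]


# Example window from the text above: clip cut from 21:10 to 22:42 of the movie.
first_point = 21 * 60 + 10    # 1270 s
second_point = 22 * 60 + 42   # 1362 s
movie_barrage = [(1276, "..."), (1350, "..."), (5000, "...")]
print(get_first_barrage(movie_barrage, first_point, second_point))  # first two entries
```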
The bullet screen storage location of the original video may be located in a database or a memory inside the server, or may be located in a storage device outside the server.
If the bullet screen storage location is inside the server, one possible scenario is that the server not only manages the video clip cut from the original video but also stores and manages the original video and its bullet screen information. For example, the Tencent Video platform holds the screening rights of the movie "Monkey King: Hero Is Back"; an operator cuts a highlight segment from the movie and publishes it on the Tencent Video platform, and a server in the Tencent Video backend maintains both the movie and its highlight clip, as well as the bullet screen information generated by the movie.
If the bullet screen storage location is outside the server, one possible scenario is that the server manages only the video clip cut from the original video while another server stores and manages the original video and its bullet screen information. For example, the Tencent Video platform holds the screening rights of the movie "Monkey King: Hero Is Back", a server in the Tencent Video backend maintains the movie and its bullet screen information, and after an operator cuts a highlight segment from the movie it is published on another video playing platform, whose server manages the highlight clip. Another possible scenario is that the server manages only the video clips of the original video while the bullet screen information of the original video is stored and managed by a separate bullet screen server or device, that is, the original video and its bullet screens are stored separately on two devices. For example, the Tencent Video platform holds the screening rights of the movie, a video server in the Tencent Video backend maintains the movie's video content, and a bullet screen server maintains the movie's bullet screen information.
When the bullet screen storage location is inside the server maintaining the video clip, the server can determine the first and second associated time points by querying the clip's information and then directly obtain, from its own database at the bullet screen storage address determined in 302, the original video's bullet screen information between the first and second associated time points. When the bullet screen storage location is on a bullet screen management device outside the server, the server needs to send the bullet screen management device a request containing the bullet screen storage address and the first and second associated time points of the video clip determined in 302; the bullet screen management device responds to the request by sending the server the original video's bullet screen information between the first and second associated time points.
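The two deployment cases can be folded into one hedged fetch routine: a local database query when the storage location sits inside the clip server, otherwise a request carrying the storage address and both associated time points sent to the external bullet screen management device. All names below are illustrative, not prescribed by this application.

```python
def fetch_first_barrage(local_store, barrage_device, address,
                        first_point_s, second_point_s):
    """Fetch first bullet screen information from a local or an external store."""
    if local_store is not None and local_store.holds(address):
        # Case 1: the storage location is inside the server maintaining the clip,
        # so the time-window query runs directly against the local database.
        return local_store.query(address, first_point_s, second_point_s)
    # Case 2: the storage location is on an external bullet screen management
    # device, so the server sends it a request carrying the address and both
    # associated time points and receives the bullet screens in that window.
    request = {
        "address": address,
        "first_associated_point": first_point_s,
        "second_associated_point": second_point_s,
    }
    return barrage_device.request_barrage(request)
```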
304. Sending the video clip and the first barrage information to the playing device.
As described in 301 the server determines the video clip, and as described in 302 and 303 it obtains the first barrage information; it then sends the video clip and the first barrage information to the playing device. For example, again taking the movie "Monkey King: Hero Is Back" shown in fig. 1a and fig. 1b and its clip as an example, the original video is the movie with a duration of 1 hour 25 minutes 18 seconds and the video clip is the 1 minute 32 second segment cut from the movie's content between 21 minutes 10 seconds and 22 minutes 42 seconds; the server sends the playing device the 1 minute 32 second clip together with the movie's barrage information for the period from 21 minutes 10 seconds to 22 minutes 42 seconds.
In the embodiment of the application, when the server receives a video playing request from a playing device, it determines the video clip requested to be played, obtains the first barrage information generated by the original video in the time period corresponding to the video clip (the video clip having been cut from the original video), and then sends the video clip and the first barrage information to the playing device. While playing the video clip, the playing device can therefore display the barrage information of the corresponding time period of the original video, which increases the number of barrages displayed during playback of the clip and improves the user's viewing experience.
Further, the server may also store the barrage information generated by the video clip itself. When the server receives a playing device's request to play the video clip, if it sends the playing device the video clip, the barrage information generated by the clip, and the original video's barrage information for the time period corresponding to the clip, the number of barrages displayed during playback is increased further compared with the embodiment shown in fig. 3. The specific process is shown in fig. 7. Referring to fig. 7, fig. 7 is a flowchart illustrating another video segment playing control method according to an embodiment of the present application.
701. Receiving a playing request sent by playing equipment, and determining a video clip requested to be played by the playing request.
702. Determining the bullet screen storage location.
703. Acquiring first bullet screen information from the bullet screen storage location.
704. Determining second bullet screen information generated by the video clip.
As described in 302, when the server manages the video resources and the barrage thereof, the server establishes a corresponding relationship between the video resources and the barrage thereof in a direct or indirect manner, so that the server can query the barrage information thereof through the corresponding relationship after determining a video resource, and the corresponding relationship may have a variety of specific forms in practical applications.
In an optional mode, when the server manages the video resources and the barrage thereof, a mapping relation between the video identification code and the barrage storage address is established; after the server determines the video identification code, the bullet screen storage address of the video can be obtained based on the mapping relation. After determining the video clip requested to be played by the play request, the server may acquire the identification code by querying the video information of the video clip, determine the bullet screen storage address of the video clip according to the identification code and the mapping relationship between the video identification code and the bullet screen storage address, and acquire the bullet screen information generated by the video clip, that is, the second bullet screen information, from the address.
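Determining the second bullet screen information is the same lookup pattern applied to the clip itself rather than to the original video; a hypothetical sketch, with all helper names assumed:

```python
def get_second_barrage(server, clip_id: str):
    """Bullet screens generated on the video clip itself (second barrage info)."""
    clip_code = server.query_identification_code(clip_id)  # from the clip's video info
    address = server.barrage_address_for(clip_code)        # ID -> storage address mapping
    return server.read_barrage(address)                    # all bullet screens of the clip
```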
705. Sending the video clip, the first barrage information, and the second barrage information to the playing device.
As described above, the first barrage information is the barrage information generated by the original video in the time period corresponding to the video clip, and the second barrage information is the barrage information generated by the video clip itself; the server sends the video clip, the first barrage information, and the second barrage information to the playing device, which further increases the number of barrages displayed when the playing device plays the clip. For example, again taking the movie "Monkey King: Hero Is Back" shown in fig. 1a and fig. 1b and its clip as an example, the original video is the movie with a duration of 1 hour 25 minutes 18 seconds, and the video clip is the 1 minute 32 second segment cut from the movie's content between 21 minutes 10 seconds and 22 minutes 42 seconds. Assuming that, by the time the server receives the playing device's request for the clip, the movie has already generated 100 barrages in the segment from 21 minutes 10 seconds to 22 minutes 42 seconds and the 1 minute 32 second clip itself has generated 5 barrages, the server sends the clip, the 100 barrages, and the 5 barrages to the playing device, and the playing device can display 105 barrages when playing the clip.
Further, after the server sends the video clip and the corresponding barrage information to the playing device, the clip may continue to generate new barrage information; the server sends this new barrage information to the playing device, which further increases the number of barrages displayed while the clip is played. The specific process is shown in fig. 8. The server may perform the steps shown in fig. 8 after performing the steps shown in fig. 3 or fig. 7.
Referring to fig. 8, fig. 8 is a schematic flowchart illustrating another video segment playing control method according to an embodiment of the present application.
801. Receiving third barrage information sent for the video clip by the playing device and/or playing devices other than the playing device.
As described above, the server may receive, at the same time point, the playing requests of the multiple playing devices for the same video segment, and the multiple playing devices may also play the same video segment at the same time in the same time period; further, after the server sends the video clip to the playing device, the user may input a bullet screen when watching the video clip on the playing device, and during this time, other users may also input a bullet screen when watching the video clip through other playing devices.
One possible situation is that the server receives the third barrage information sent by the playing device to the video segment. For example, after the server sends the video clip to the playing device, no other playing device plays the video clip in the same time period as the playing device, and the user inputs the barrage when watching the video clip on the playing device, so that the server can only receive the barrage information sent by the playing device for the video clip during this period, that is, the third barrage information. For another example, after the server sends the video segment to the playing device, there are other playing devices that play the video segment in the same time period as the playing device, but no user inputs a barrage for the video segment on the other playing devices, and a barrage is input when the user watches the video segment on the playing device, so similarly, the server can only receive the barrage information sent by the playing device for the video segment during this period, that is, the third barrage information.
Another situation that may occur is that the server receives third barrage information sent to the video segment by a playing device other than the playing device. For example, after the server sends the video segment to the playing device, there are other playing devices that play the video segment in the same time slot as the playing device, and there is a barrage that is input by the user when the user watches the video segment through other playing devices, but there is no barrage that is input by the user when the user watches the video segment on the playing device, so the server can only receive the barrage information that is sent by the playing devices other than the playing device for the video segment during this time, that is, the third barrage information.
Still another situation may occur in which the server receives third barrage information sent to the video segment by the playback device and a playback device other than the playback device. For example, after the server sends the video clip to the playing device, there are other playing devices that play the video clip in the same time slot as the playing device, and both the user inputs the barrage when watching the video clip through other playing devices and the user inputs the barrage when watching the video clip on the playing device, during this period, the server may receive the barrage information that is sent by the playing device and the playing devices other than the playing device for the video clip, that is, the third barrage information.
802. Sending the third bullet screen information to the playing device.
After the server sends the video clip to the playing device, it may receive third barrage information in any of the three situations described in 801, and it then sends that third barrage information to the playing device.
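One way to picture this forwarding step is a relay that tracks which devices are currently playing each clip and pushes any newly received barrage for that clip to all of them, including the requesting device. The sketch below is an assumption about bookkeeping that the application does not prescribe.

```python
from collections import defaultdict


class BarrageRelay:
    """Relay third barrage information among devices playing the same clip."""

    def __init__(self):
        self.viewers = defaultdict(set)  # clip_id -> device IDs currently playing it

    def on_play(self, clip_id: str, device_id: str) -> None:
        self.viewers[clip_id].add(device_id)

    def on_new_barrage(self, clip_id: str, barrage, send) -> None:
        # Third barrage info may come from the playing device itself and/or from
        # other devices; either way it is forwarded to every current viewer.
        for device_id in self.viewers[clip_id]:
            send(device_id, barrage)
```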
In one possible implementation, the server performs the steps shown in fig. 8 after performing the steps shown in fig. 3. That is, after the server sends the video clip and the first barrage information to the playback device, if the server receives the third barrage information, the server sends the third barrage information to the playback device. The embodiment of the application increases the number of bullet screen displays of the video clips during playing on the basis of the embodiment shown in fig. 3.
In another possible implementation, the server performs the steps shown in fig. 8 after performing the steps shown in fig. 7. That is, after the server sends the video clip, the first barrage information, and the second barrage information to the playback device, if the server receives the third barrage information, the server sends third barrage information to the playback device. The embodiment of the application increases the number of bullet screen displays of the video clip during playing on the basis of the embodiment shown in fig. 7.
Further, after the server sends the video clip and the corresponding barrage information to the playing device, the original video may continue to generate new barrage information in the time period corresponding to the clip; the server sends this new barrage information to the playing device, which also increases the number of barrages displayed while the clip is played. The specific process is shown in fig. 9. After performing the steps shown in fig. 3 or fig. 7, the server may optionally perform the steps shown in fig. 8, and may also optionally perform the steps shown in fig. 9.
Referring to fig. 9, fig. 9 is a schematic flowchart illustrating another video segment playing control method according to an embodiment of the present application.
901. Acquiring fourth bullet screen information from the bullet screen storage location.
The fourth barrage information comprises newly added barrage information generated in a time period corresponding to the video clip. As described above, the barrage storage location is used to store barrage information of an original video, and the video clip is captured from the original video. That is to say, after the server sends the video clip to the playing device, the server may further obtain, from the bullet screen storage location determined in 302, new bullet screen information generated by the original video in the time period corresponding to the video clip. For example, as described in 303, the server obtains first bullet screen information from the bullet screen storage location, where the first bullet screen information includes 150 bullet screens, after the server sends the video segment to the playing device, 10 new bullet screens are generated in the time period corresponding to the video segment in the original video, and the server obtains the 10 bullet screens from the bullet screen storage location, that is, the fourth bullet screen information.
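The incremental fetch can be sketched by remembering what was already delivered and keeping only the entries added since then; the `delivered_ids` bookkeeping below is an assumption, not something the application specifies.

```python
def fetch_fourth_barrage(barrage_store, address, first_point_s, second_point_s,
                         delivered_ids):
    """Newly added bullet screens in the clip's window since the first delivery."""
    current = barrage_store.query(address, first_point_s, second_point_s)
    # Fourth barrage info = entries not yet sent to the playing device, e.g. the
    # 10 bullet screens added after the original 150 in the example above.
    return [b for b in current if b.id not in delivered_ids]
```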
902. Sending the fourth bullet screen information to the playing device.
After the server sends the video clip to the playing device, the server obtains, from the bullet screen storage location, the fourth bullet screen information, which is the newly added bullet screen information that may be generated in the time period corresponding to the video clip in the original video, and then sends the fourth bullet screen information to the playing device.
In one possible implementation, the server performs the steps shown in fig. 9 after performing the steps shown in fig. 3. That is, after the server sends the video clip and the first barrage information to the playback device, if the server obtains the fourth barrage information, the server sends the fourth barrage information to the playback device. The embodiment of the application increases the number of bullet screen displays of the video clips during playing on the basis of the embodiment shown in fig. 3.
In another possible implementation, the server performs the steps shown in fig. 9 after performing the steps shown in fig. 7. That is, after the server sends the video clip, the first barrage information, and the second barrage information to the playback device, if the server obtains the fourth barrage information, the server sends the fourth barrage information to the playback device. The embodiment of the application increases the number of bullet screen displays of the video clip during playing on the basis of the embodiment shown in fig. 7.
In yet another possible implementation, after performing the steps shown in fig. 3, the server performs both the steps shown in fig. 8 and the steps shown in fig. 9. That is, after the server sends the video clip and the first barrage information to the playing device, it receives the third barrage information and sends it to the playing device, and it also obtains the fourth barrage information and sends it to the playing device. This embodiment increases the number of barrages displayed during playback of the video clip compared with the embodiment combining fig. 3 and fig. 8.
In yet another possible implementation, after performing the steps shown in fig. 7, the server performs not only the steps shown in fig. 8 but also the steps shown in fig. 9. That is to say, after the server sends the video clip, the first barrage information, and the second barrage information to the playing device, the server receives the third barrage information and sends the third barrage information to the playing device, and the server also obtains the fourth barrage information and sends the fourth barrage information to the playing device. The embodiment of the present application further increases the number of bullet screen displays of the video clip during playing based on the embodiment combined with fig. 7 and fig. 8. It should be noted that, in this embodiment, the server sends the first barrage information, the second barrage information, the third barrage information, and the fourth barrage information to the playing device, and in all the embodiments, the implementation mode maximizes the number of displayed barrages of the video clips during playing.
To illustrate the last implementation described in 902 more clearly, its specific process is further described with reference to fig. 10a: after performing the steps shown in fig. 7, the server performs both the steps shown in fig. 8 and the steps shown in fig. 9.
1001. Receiving a playing request sent by playing equipment, and determining the video clip requested to be played by the playing request.
1002. And determining the bullet screen storage position.
1003. And acquiring first bullet screen information from the bullet screen storage position.
1004. And determining second bullet screen information generated by the video clip.
1005. And sending the video clip, the first barrage information and the second barrage information to the playing equipment.
1006. And receiving third barrage information sent for the video clip by the playing equipment and/or playing equipment other than the playing equipment.
1007. And sending the third bullet screen information to the playing equipment.
1008. And acquiring fourth bullet screen information from the bullet screen storage position.
1009. And sending the fourth bullet screen information to the playing equipment.
The application scenario may be as follows. When the server receives a video clip playing request sent by a playing device, it first determines the video clip requested to be played and queries the video information of the clip to obtain an associated identification code, a first associated time point, and a second associated time point (the associated identification code is the identification code of the original video from which the clip was captured, and the first and second associated time points are respectively the start and end of the capture interval of the clip in the original video). The server then acquires the first barrage information generated on the video corresponding to the associated identification code between the first and second associated time points; in addition, if the clip is not being played for the first time, the clip itself may already have generated some barrages, so the server also acquires the second barrage information generated by the clip. The server then sends the video clip, the first barrage information, and the second barrage information to the playing device. Finally, while the playing device plays the clip, the barrage information of the clip is updated in real time, so the server can forward the newly received third barrage information generated for the clip to the playing device; similarly, the barrage information of the original video is also updated in real time, so the server can also send the fourth barrage information newly generated on the original video within the time period corresponding to the clip to the playing device.

For example, as shown in fig. 10b, still taking the movie "mahogany of western book" and its clip as an example, playing devices 1 and 2 are playing a clip of the movie, and playing device 3 is playing the movie itself. When the server receives a play request sent by playing device 1, it determines the movie clip requested to be played, acquires the first barrages generated by the movie in the time period corresponding to the clip, acquires the second barrages already generated on the clip, and then sends the clip, the first barrages, and the second barrages to playing device 1. While playing device 1 plays the clip, playing device 2 is also playing the same clip, and playing device 3 happens to be playing the segment of the movie that corresponds to the clip. The user of playing device 1 and the user of playing device 2 may each send barrages while watching the clip; when the server receives the third barrages sent for the clip by playing device 1 and/or playing device 2, it also sends them to playing device 1. Similarly, the user of playing device 3 may send barrages while watching the corresponding segment of the movie, and the server also sends the fourth barrages newly generated in that segment of the movie to playing device 1. The barrages finally received by playing device 1 are shown in fig. 10c.
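Putting steps 1001 to 1009 and the scenario together, the following Python sketch outlines one possible end-to-end handling of a clip play request. Every object and method name (the video index, barrage store, and device interfaces) is an assumption introduced for illustration; the sketch mirrors the described flow rather than the patented implementation.

```python
def handle_clip_play_request(request, video_index, store, device):
    """Illustrative sketch of steps 1001-1009 for one playback device.

    `request`, `video_index`, `store`, and `device` are assumed objects;
    their attributes and methods are hypothetical.
    """
    # 1001: determine the video clip requested by the play request.
    clip = video_index.lookup_clip(request.clip_id)

    # 1002: determine the bullet screen storage location from the clip's
    # associated identification code (the original video's identifier).
    location = store.location_for(clip.associated_video_id)
    start_ts, end_ts = clip.first_time_point, clip.second_time_point

    # 1003: first barrage info = barrages of the original video in [start, end].
    first_info = store.fetch_barrages(location, start_ts, end_ts)

    # 1004: second barrage info = barrages previously posted on the clip itself.
    second_info = store.fetch_clip_barrages(clip.clip_id)

    # 1005: send the clip together with the first and second barrage info.
    device.send_clip(clip, first_info + second_info)

    # 1006-1009: while the clip plays, forward barrages newly posted on the
    # clip (third info) and barrages newly posted on the original video
    # inside the clip's window (fourth info).
    while device.is_playing():
        third_info = store.poll_new_clip_barrages(clip.clip_id)
        if third_info:
            device.push_barrages(third_info)     # 1006-1007
        fourth_info = store.poll_new_barrages(location, start_ts, end_ts)
        if fourth_info:
            device.push_barrages(fourth_info)    # 1008-1009
```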
Based on the description of the above video clip playing control method embodiments, an embodiment of the present application further discloses a server that can run a computer program (including program code). Referring to fig. 11, the server may run the following modules:
a determining module 1101, configured to, when a play request sent by a playing device is received, determine a video segment requested to be played by the play request;
the determining module 1101 is further configured to determine a bullet screen storage location, where the bullet screen storage location is used to store bullet screen information of an original video, and the video segment is captured from the original video;
an obtaining module 1102, configured to obtain first bullet screen information from the bullet screen storage location, where the first bullet screen information includes bullet screen information generated in a time period corresponding to the video segment;
a sending module 1103, configured to send the video clip and the first barrage information to the playback device.
In an embodiment, the determining module 1101 is further configured to determine second barrage information generated by the video segment before the video segment and the first barrage information are sent to the playing device; the sending module 1103 is specifically configured to send the video clip, the first barrage information, and the second barrage information to the playing device.
In another embodiment, the server further includes a receiving module 1104, configured to receive, after the video segment and the first barrage information are sent to the playing device, third barrage information sent to the video segment by the playing device and/or a playing device other than the playing device; the sending module 1103 is further configured to send the third bullet screen information to the playing device.
In another embodiment, the obtaining module 1102 is further configured to obtain fourth bullet screen information from the bullet screen storage location after the video segment and the first bullet screen information are sent to the playing device, where the fourth bullet screen information includes new bullet screen information generated in a time period corresponding to the video segment; the sending module 1103 is further configured to send the fourth bullet screen information to the playing device.
In another embodiment, the receiving module 1104 is further configured to receive, after the video segment, the first barrage information, and the second barrage information are sent to the playing device, third barrage information sent to the video segment by the playing device and/or a playing device other than the playing device; the sending module 1103 is further configured to send the third bullet screen information to the playing device.
In another embodiment, the obtaining module 1102 is further configured to obtain fourth bullet screen information from the bullet screen storage location after the video segment, the first bullet screen information, and the second bullet screen information are sent to the playing device, where the fourth bullet screen information includes new bullet screen information generated in a time period corresponding to the video segment; the sending module 1103 is further configured to send the fourth bullet screen information to the playing device.
In another embodiment, the determining module 1101 is specifically configured to query the associated identification codes of the video segments, where the associated identification codes include video identification codes; and is used for determining the bullet screen storage position of the original video corresponding to the video identification code.
In another embodiment, the obtaining module 1102 is specifically configured to query a first associated time point and a second associated time point of the video segment, and to acquire, from the bullet screen storage location, the first bullet screen information generated on the original video between the first associated time point and the second associated time point.
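To visualize the module division of fig. 11, the sketch below expresses the determining, obtaining, sending, and receiving modules as minimal Python classes. The class and method names are hypothetical; the sketch only mirrors the responsibilities described above and is not the disclosed implementation.

```python
from dataclasses import dataclass

@dataclass
class Barrage:
    barrage_id: int
    timestamp: float
    text: str

class DeterminingModule:
    """Module 1101: resolves the requested clip and its barrage storage location."""
    def __init__(self, video_index):
        self.video_index = video_index

    def resolve(self, play_request):
        clip = self.video_index.lookup_clip(play_request.clip_id)
        # The associated identification code is the original video's identifier.
        location = self.video_index.storage_location(clip.associated_video_id)
        return clip, location

class ObtainingModule:
    """Module 1102: fetches first/fourth barrage info from the storage location."""
    def __init__(self, store):
        self.store = store

    def first_barrages(self, location, clip):
        return self.store.fetch_barrages(
            location, clip.first_time_point, clip.second_time_point)

class SendingModule:
    """Module 1103: delivers the clip and barrage info to the playback device."""
    def send(self, device, clip, barrages):
        device.send_clip(clip, barrages)

class ReceivingModule:
    """Module 1104: receives third barrage info posted on the clip by devices."""
    def receive(self, device_message):
        return [Barrage(**b) for b in device_message.get("barrages", [])]
```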
Based on the above description of the method embodiments and the apparatus embodiment, the present application further provides a schematic diagram of a server structure. The server 1200 may vary considerably in configuration and performance, and may include one or more central processing units (CPUs) 1222 (e.g., one or more processors), a memory 1232, and one or more storage media 1230 (e.g., one or more mass storage devices) storing an application program 1242 or data 1244. The memory 1232 and the storage medium 1230 may be transient storage or persistent storage. The program stored in the storage medium 1230 may include one or more modules (not shown), and each module may include a series of instruction operations on the server. Furthermore, the central processing unit 1222 may be configured to communicate with the storage medium 1230 and execute, on the server 1200, the series of instruction operations in the storage medium 1230. The server 1200 may be the server provided by the present application.
The server 1200 may also include one or more power supplies 1226, one or more wired or wireless network interfaces 1250, one or more input/output interfaces 1258, and/or one or more operating systems 1241, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
The steps performed by the server in the above embodiments may be based on the server structure shown in fig. 12. Specifically, the central processing unit 1222 may implement the functions of the determining module 1101 and the obtaining module 1102, and the input/output interface 1258 may implement the functions of the sending module 1103 and the receiving module 1104.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in or transmitted over a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a Digital Versatile Disc (DVD)), or a semiconductor medium (e.g., a Solid State Drive (SSD)), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing related hardware; the program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The aforementioned storage medium includes various media that can store program code, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.

Claims (10)

1. A method for controlling playing of a video clip, comprising:
receiving a playing request sent by playing equipment, and determining a video clip requested to be played by the playing request;
determining a bullet screen storage position, wherein the bullet screen storage position is used for storing bullet screen information of an original video, and the video clip is intercepted from the original video;
acquiring first bullet screen information from the bullet screen storage position, wherein the first bullet screen information comprises bullet screen information generated in a time period corresponding to the video clip;
and sending the video clip and the first barrage information to the playing equipment.
2. The method of claim 1, wherein before the sending the video clip and the first barrage information to the playback device, the method further comprises:
determining second barrage information generated by the video clip;
the sending the video clip and the first barrage information to the playing device includes:
and sending the video clip, the first barrage information and the second barrage information to the playing equipment.
3. The method of claim 1, wherein after the sending the video clip and the first barrage information to the playback device, the method further comprises:
receiving third barrage information sent for the video clip by the playing equipment and/or playing equipment other than the playing equipment;
and sending the third bullet screen information to the playing equipment.
4. The method according to claim 1 or 3, wherein after the sending the video clip and the first barrage information to the playback device, the method further comprises:
acquiring fourth barrage information from the barrage storage position, wherein the fourth barrage information comprises newly added barrage information generated in a time period corresponding to the video clip;
and sending the fourth bullet screen information to the playing equipment.
5. The method of claim 2, wherein after the sending the video clip, the first barrage information, and the second barrage information to the playback device, the method further comprises:
receiving third barrage information sent for the video clip by the playing equipment and/or playing equipment other than the playing equipment;
and sending the third bullet screen information to the playing equipment.
6. The method according to claim 2 or 5, wherein after the sending the video clip, the first barrage information and the second barrage information to the playback device, the method further comprises:
acquiring fourth barrage information from the barrage storage position, wherein the fourth barrage information comprises newly added barrage information generated in a time period corresponding to the video clip;
and sending the fourth bullet screen information to the playing equipment.
7. The method according to any one of claims 1 to 3 and 5, wherein the determining a bullet screen storage position, which is used for storing bullet screen information of an original video, comprises:
inquiring the associated identification code of the video clip, wherein the associated identification code comprises a video identification code;
and determining the bullet screen storage position of the original video corresponding to the video identification code.
8. A server, comprising:
the device comprises a determining module, a playing module and a playing module, wherein the determining module is used for determining a video clip requested to be played by a playing request when the playing request sent by playing equipment is received;
the determining module is further configured to determine a bullet screen storage location, where the bullet screen storage location is used to store bullet screen information of an original video, and the video clip is captured from the original video;
the acquisition module is used for acquiring first bullet screen information from the bullet screen storage position, wherein the first bullet screen information comprises bullet screen information generated in a time period corresponding to the video clip;
and the sending module is used for sending the video clip and the first barrage information to the playing equipment.
9. An electronic device, comprising: a processor and a memory, wherein the memory stores program instructions that, when executed by the processor, cause the processor to perform the method of any of claims 1 to 7.
10. A computer-readable storage medium having stored therein instructions which, when run on a computer, cause the computer to perform the method of any one of claims 1 to 7.
CN201910994135.7A 2019-10-18 2019-10-18 Video clip playing control method and related product Active CN110708571B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910994135.7A CN110708571B (en) 2019-10-18 2019-10-18 Video clip playing control method and related product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910994135.7A CN110708571B (en) 2019-10-18 2019-10-18 Video clip playing control method and related product

Publications (2)

Publication Number Publication Date
CN110708571A true CN110708571A (en) 2020-01-17
CN110708571B CN110708571B (en) 2020-12-08

Family

ID=69200577

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910994135.7A Active CN110708571B (en) 2019-10-18 2019-10-18 Video clip playing control method and related product

Country Status (1)

Country Link
CN (1) CN110708571B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140153652A1 (en) * 2012-12-03 2014-06-05 Home Box Office, Inc. Package Essence Analysis Kit
CN105979288A (en) * 2016-06-17 2016-09-28 乐视控股(北京)有限公司 Video interception method and device
CN105979348A (en) * 2016-06-28 2016-09-28 武汉斗鱼网络科技有限公司 Matching method and device based on video cutting and live commenting
CN106060644A (en) * 2016-06-28 2016-10-26 武汉斗鱼网络科技有限公司 Live broadcast video clipping method and device associated with bullet screens
CN110121083A (en) * 2018-02-06 2019-08-13 上海全土豆文化传播有限公司 The generation method and device of barrage

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111405344A (en) * 2020-03-18 2020-07-10 腾讯科技(深圳)有限公司 Bullet screen processing method and device
CN112565877A (en) * 2020-12-10 2021-03-26 北京奇艺世纪科技有限公司 Screen projection method and system, electronic equipment and storage medium
CN112565877B (en) * 2020-12-10 2022-10-18 北京奇艺世纪科技有限公司 Screen projection method and system, electronic equipment and storage medium
CN113422998A (en) * 2021-05-21 2021-09-21 北京奇艺世纪科技有限公司 Method, device, equipment and storage medium for generating short video and note content
CN115190369A (en) * 2022-09-09 2022-10-14 北京达佳互联信息技术有限公司 Video generation method, video generation device, electronic apparatus, medium, and product

Also Published As

Publication number Publication date
CN110708571B (en) 2020-12-08

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40020925; Country of ref document: HK)
GR01 Patent grant