CN112565911A - Bullet screen display method, bullet screen generation device, bullet screen equipment and storage medium - Google Patents


Info

Publication number
CN112565911A
CN112565911A (application CN202011370558.0A; granted as CN112565911B)
Authority
CN
China
Prior art keywords
video
bullet screen
playing
game
barrage
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011370558.0A
Other languages
Chinese (zh)
Other versions
CN112565911B (en)
Inventor
张云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202011370558.0A priority Critical patent/CN112565911B/en
Publication of CN112565911A publication Critical patent/CN112565911A/en
Application granted granted Critical
Publication of CN112565911B publication Critical patent/CN112565911B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/488Data services, e.g. news ticker
    • H04N21/4884Data services, e.g. news ticker for displaying subtitles
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781Games
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • H04N21/8456Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a bullet screen display method, a bullet screen generation method, a device, equipment and a storage medium, belonging to the field of multimedia playing. The method comprises: displaying a video playing interface used for playing a main video; when the main video plays to a bullet screen moment, additionally displaying a video bullet screen control on the video playing interface, where the video bullet screen control is used to trigger playing of a bullet screen video in video form and the bullet screen moment is the play time associated with the bullet screen video; and in response to a trigger operation on the video bullet screen control, playing the main video and the bullet screen video simultaneously on the video playing interface. The method enables the user to watch the main video and the barrage video at the same time on the same user interface, compare the differences between them, and obtain useful information from the barrage video in a more intuitive and efficient way.

Description

Bullet screen display method, bullet screen generation device, bullet screen equipment and storage medium
Technical Field
The embodiments of the application relate to the field of multimedia playing, and in particular to a bullet screen display method, a bullet screen generation method, a device, equipment and a storage medium.
Background
A bullet screen (barrage, also known as danmaku) is a commentary subtitle that pops up on screen while a video is being watched.
The related art provides a bullet screen display method for game videos, in which a game video is displayed on a video playing interface. Play times of the game video are associated with text bullet screens; when playback reaches a play time that has an associated text bullet screen, that text bullet screen is displayed over the video playing interface.
Some text bullet screens describe other players' play style or item builds, but it is difficult for a user to understand these intuitively from text alone; that is, the content and type of information a text bullet screen can express are limited.
Disclosure of Invention
The application provides a bullet screen display method, a bullet screen generation method, a device, equipment and a medium, offering a bullet screen scheme in video form: a user can play a main video and a video bullet screen simultaneously to compare similar video information. The technical scheme is as follows:
according to an aspect of the present application, there is provided a bullet screen display method, the method including:
displaying a video playing interface, wherein the video playing interface is used for playing a main video;
when the main video is played to the bullet screen moment, a video bullet screen control is additionally displayed on the video playing interface, wherein the video bullet screen control is used for triggering playing of a bullet screen video in a video form, and the bullet screen moment is the playing moment associated with the bullet screen video;
and responding to the triggering operation on the video bullet screen control, and simultaneously playing the main video and the bullet screen video on the video playing interface.
According to another aspect of the present application, there is provided a bullet screen video generation method based on a cloud game, the method including:
acquiring a game video generated in the running process of the cloud game;
determining a start timestamp and an end timestamp in response to an intercept operation;
intercepting a game video segment from the game video according to the starting time stamp and the ending time stamp;
generating a bullet screen video according to the game video clip;
and taking the starting timestamp as a bullet screen moment, and storing the bullet screen video to a cloud game in an associated manner.
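The generation steps above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the frame list, the `fps` parameter, and the in-memory archive are hypothetical stand-ins for the recorded game video stream and the cloud game's storage layer.

```python
from dataclasses import dataclass, field


@dataclass
class BarrageVideo:
    """A bullet screen video clipped from a recorded game video."""
    frames: list           # frames between the start and end timestamps
    barrage_moment: float  # bullet screen moment: the start timestamp, in seconds


@dataclass
class CloudGameArchive:
    """Stores barrage videos in association with a cloud game."""
    barrage_videos: list = field(default_factory=list)


def generate_barrage_video(game_frames, fps, start_ts, end_ts, archive):
    """Intercept the segment [start_ts, end_ts) from the game video,
    wrap it as a barrage video, and store it with the start timestamp
    as the bullet screen moment."""
    start_i, end_i = int(start_ts * fps), int(end_ts * fps)
    clip = game_frames[start_i:end_i]
    video = BarrageVideo(frames=clip, barrage_moment=start_ts)
    archive.barrage_videos.append(video)
    return video
```

In a real system the clip would be re-encoded as a standalone video file and the association persisted on the cloud game server rather than in memory.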
According to another aspect of the present application, there is provided a bullet screen display device, the device including:
the display module is used for displaying a video playing interface, and the video playing interface is used for playing a main video;
the display module is used for additionally displaying a video bullet screen control on the video playing interface when the main video is played to a bullet screen time, wherein the video bullet screen control is used for triggering playing of a bullet screen video in a video form, and the bullet screen time is the playing time associated with the bullet screen video;
and the interaction module is used for responding to the triggering operation on the video bullet screen control and simultaneously playing the main video and the bullet screen video on the video playing interface.
According to another aspect of the present application, there is provided a bullet screen video generation apparatus based on a cloud game, the apparatus including:
the acquisition module is used for acquiring a game video generated in the running process of the cloud game;
a response module to determine a start timestamp and an end timestamp in response to the intercept operation;
the intercepting module is used for intercepting a game video clip from the game video according to the starting time stamp and the ending time stamp;
the generating module is used for generating a barrage video according to the game video clip;
and the storage module is used for storing the bullet screen video to the cloud game in a correlation manner by taking the starting timestamp as the bullet screen moment.
According to another aspect of the present application, there is provided a computer device comprising: a processor and a memory storing a computer program loaded and executed by the processor to implement the bullet screen display method or the cloud game based bullet screen video generation method as described above.
According to another aspect of the present application, there is provided a computer-readable storage medium storing a computer program that is loaded and executed by a processor to implement the bullet screen display method or the cloud game-based bullet screen video generation method described above.
According to another aspect of the present application, there is provided a computer program product comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to implement the bullet screen display method or the cloud game-based bullet screen video generation method described above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
By providing a barrage video in video form, a video barrage control is additionally displayed on the video playing interface when the main video plays to the barrage moment. When the user clicks the video barrage control, the main video and the barrage video are played simultaneously on the video playing interface, so the user can watch both on the same user interface, compare the differences between them, and obtain useful information from the barrage video in a more intuitive and efficient way.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 illustrates an interface schematic diagram of a video bullet screen provided by an exemplary embodiment of the present application;
FIG. 2 illustrates a block diagram of a computer system provided in another exemplary embodiment of the present application;
fig. 3 is a flowchart illustrating a bullet screen display method according to another exemplary embodiment of the present application;
fig. 4 is a flowchart illustrating a bullet screen display method according to an exemplary embodiment of the present application;
FIG. 5 is a schematic interface diagram illustrating a bullet screen display method according to an exemplary embodiment of the present application;
FIG. 6 is a schematic interface diagram illustrating a bullet screen display method according to an exemplary embodiment of the present application;
FIG. 7 is a schematic interface diagram illustrating a bullet screen display method according to an exemplary embodiment of the present application;
FIG. 8 is a schematic interface diagram illustrating a bullet screen display method according to an exemplary embodiment of the present application;
fig. 9 is an interface diagram illustrating a bullet screen display method according to an exemplary embodiment of the present application;
FIG. 10 is a schematic interface diagram illustrating a bullet screen display method according to an exemplary embodiment of the present application;
fig. 11 is an interface diagram illustrating a bullet screen display method according to an exemplary embodiment of the present application;
FIG. 12 is a schematic interface diagram illustrating a bullet screen display method according to an exemplary embodiment of the present application;
fig. 13 is an interface diagram illustrating a bullet screen display method according to an exemplary embodiment of the present application;
fig. 14 is a schematic interface diagram illustrating a bullet screen display method according to an exemplary embodiment of the present application;
fig. 15 is a flowchart illustrating a bullet screen generating method based on a cloud game according to an exemplary embodiment of the present application;
fig. 16 is a flowchart illustrating a bullet screen generating method based on a cloud game according to an exemplary embodiment of the present application;
FIG. 17 is a schematic diagram illustrating an interface for initiating a cloud game via a cloud game video according to an exemplary embodiment of the present application;
FIG. 18 illustrates a storage diagram of game video and archived data provided by an exemplary embodiment of the present application;
FIG. 19 illustrates a cut-out schematic view of a barrage video provided by an exemplary embodiment of the present application;
fig. 20 is a block diagram illustrating a bullet screen display device provided in an exemplary embodiment of the present application;
fig. 21 is a block diagram illustrating a cloud game based bullet screen generating apparatus according to an exemplary embodiment of the present application;
FIG. 22 illustrates a block diagram of a computer device provided by an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, cloud computing and cloud gaming, which the embodiments of the present application build on, are briefly described.
1. Cloud Computing (Cloud Computing).
Cloud computing is a computing model that distributes computing tasks over a resource pool formed by large numbers of computers, enabling application systems to obtain computing power, storage space, and information services as needed. The network that provides the resources is referred to as the "cloud". To users, resources in the "cloud" appear infinitely expandable: they can be obtained at any time, used on demand, expanded at any time, and paid for according to usage.
As a basic capability provider of cloud computing, a cloud computing resource pool (referred to as a "cloud platform", and generally an IaaS (Infrastructure as a Service) platform) is established, and multiple types of virtual resources are deployed in the pool for external clients to select and use. The cloud computing resource pool mainly comprises computing devices (virtualized machines that include operating systems), storage devices, and network devices.
Divided by logical function, a PaaS (Platform as a Service) layer can be deployed on the IaaS layer, with a SaaS (Software as a Service) layer deployed on top of it; the SaaS layer can also be deployed directly on the IaaS layer. PaaS is a platform on which software runs, such as databases and Web containers. SaaS comprises various kinds of business software, such as Web portals and SMS mass-messaging tools. Generally speaking, SaaS and PaaS are upper layers relative to IaaS.
2. Cloud gaming (Cloud gaming).
Cloud gaming, which may also be referred to as gaming on demand, is an online gaming technology based on cloud computing. Cloud gaming enables thin clients with relatively limited graphics processing and data computing capability to run high-quality games. In a cloud game scenario, the game runs not on the user's game terminal but on a cloud server, which renders the game scene into video and audio streams and transmits them to the user's game terminal over the network. The user's game terminal does not need strong graphics or data-processing capability; it only needs basic streaming-media playback capability and the ability to capture user input instructions and send them to the cloud server.
The application provides a barrage in video form. As shown in fig. 1, a main video 14 is played on the video playing interface 12; the main video 14 may be a cloud game video. When the main video 14 plays to the bullet screen moment, three video bullet screen controls 16 are displayed, overlaid, in the lower-left corner of the main video 14. When the user clicks one of the video bullet screen controls 16, the main video 14 and the bullet screen video 18 are played simultaneously on the video playing interface 12. In one example, both the main video 14 and the barrage video 18 show a game scene of the same game: the main video 14 shows the game progress of player 1, and the barrage video 18 shows the game progress of player 2. The user can compare the two players' gameplay to see differences in play style, item builds, and the like.
Fig. 2 is a schematic diagram illustrating a cloud game management system according to an embodiment of the present application. The cloud game management system includes: a first terminal 10, a second terminal 20, and a cloud game server 30.
The first terminal 10 and the second terminal 20 may be electronic devices such as a mobile phone, a tablet computer, a game console, an e-book reader, a multimedia playing device, a wearable device, or a PC (Personal Computer). Optionally, a client 50 that runs the cloud game application (i.e., the cloud game above) is installed in the first terminal 10 and the second terminal 20; alternatively, no such client is installed, and after receiving a start instruction for the cloud game application, the terminal acquires the application's data from the cloud game server 30 and loads it. The embodiment of the application does not limit how the cloud game application is packaged. Optionally, the cloud game application is an independent application, such as an APP (Application), that runs directly in the operating system without depending on other applications; alternatively, it is a sub-application of a target application, such as an applet, that must be executed within the target application.
When the cloud game application is a sub-application (also referred to as an applet) of a host program, a user can open it directly by scanning its two-dimensional code or searching for its name, and it can be conveniently acquired and spread within the target application. The target application is the application that hosts the cloud game application and provides the environment in which it runs. The host program is a native application that runs directly on the operating system. The embodiment of the present application does not limit the specific type of the host program; optionally, the host program may be a social application, a dedicated application specially supporting sub-applications, a file management application, an email application, a game application, or the like. Social applications include instant messaging applications, SNS (Social Network Service) applications, and live-streaming applications.
The first terminal 10 refers to a terminal held by a user who records or edits a cloud game video. A user who records or edits the cloud game video may start the cloud game application through a client installed in the first terminal 10 and operate the cloud game application, and control the first terminal 10 to record a game screen of the cloud game application in an operation process of the cloud game application, so as to form the cloud game video. In addition, the first terminal 10 may also record data involved in the running of the cloud game application and form a cloud game archive. Thereafter, the first terminal 10 may transmit the cloud game video and game archive to the cloud game server 30 for storage or the like.
The second terminal 20 is a terminal held by a user who watches the cloud game video. Optionally, the user may watch the cloud game video in the cloud game application, in other applications such as live-streaming, social, or video applications, or in a web page; the embodiment of the present application does not limit this. In addition, the second terminal 20 may also receive a rendered game screen from the cloud game server 30, load and display it, and receive operation instructions input by the user.
In one example, the first terminal 10 may also be used to play a cloud game video, and the second terminal 20 may also be used to record or edit one; that is, either terminal may both record or edit cloud game videos and play them. Fig. 2 distinguishes the first terminal 10, which records or edits a cloud game video, from the second terminal 20, which plays it, only for convenience of description, and should not be understood as limiting the technical solution of the present application.
The cloud game server 30 is configured to provide background services for the clients running the cloud game application in the first terminal 10 and the second terminal 20. For example, the cloud game server 30 may be a backend server of the cloud game application described above. The cloud game server 30 may be one server, a server cluster composed of multiple servers, or a cloud computing service center. Optionally, the cloud game server 30 provides background services for cloud game applications in multiple terminals simultaneously.
Optionally, the first terminal 10 and the cloud game server 30, and likewise the second terminal 20 and the cloud game server 30, communicate through the network 40. The network may be wired or wireless, which the embodiments of the present application do not limit. In one example, the first terminal 10 transmits the recorded cloud game video and/or the cloud game data recorded during operation of the cloud game application to the cloud game server 30; the cloud game server 30 renders a game screen of the cloud game from the cloud game data and transmits it to the second terminal 20; and the second terminal 20 starts the cloud game application, loads the rendered game screen, and receives the user's operations.
Fig. 3 shows a flowchart of a bullet screen display method according to an exemplary embodiment of the present application. For illustration, the method is described as executed by a terminal, which may be the first terminal or the second terminal shown in fig. 2. The method includes:
step 302: displaying a video playing interface, wherein the video playing interface is used for playing a main video;
the user starts an application program on the terminal, and the application program can be a video playing program, a live application program, a cloud game program or other programs with video playing capability. Illustratively, the application also supports bullet-screen playback.
And after the application program is started, the terminal displays a video playing interface. The main video is played on the video playing interface. Optionally, the main video is a video provided by a background server corresponding to the application program, or is a video stored locally. In this embodiment, an example is given in which the main video is an online video.
When the application program adopts a full-screen playing mode, the main video adopts the full-screen playing mode; when the application program adopts a window playing mode, the main video adopts the window playing mode.
Step 304: when the main video is played to the bullet screen moment, a video bullet screen control is additionally displayed on a video playing interface;
the video bullet screen control is used for triggering the bullet screen video in the video playing mode. Optionally, the video bullet screen control is a start play button or a preview of the bullet screen video. Different video barrage controls correspond to different barrage videos. The barrage video is a barrage in the form of a video associated with the main video. Different barrage videos can be provided by the same generator or different generators.
The bullet screen time is the playing time associated with the bullet screen video. Any play time of the main video can be associated with the bullet screen video. Multiple bullet screen videos can be associated with the same main video.
And when the main video is played to the bullet screen moment, additionally displaying a video bullet screen control on the video playing interface. The video bullet screen control can be displayed in any area of the main video, for example, the video bullet screen control is displayed in at least one of a central area, an upper area, a lower area, a left area, a right area, an upper left corner area, a lower left corner area, an upper right corner area and a lower right corner area.
In one example, the display position of the video bullet screen control is fixed. In another example, the video bullet screen control is scrolled and displayed on the video playing interface, and the scrolling direction is at least one of left to right, right to left, top to bottom, and bottom to top.
In one example, the duration of the display of the video bullet screen control is within a certain duration, where the certain duration is a preset value, or the certain duration is equal to the duration of one circle of rolling of the video bullet screen control in the bullet screen area.
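The timing logic of step 304 can be sketched as follows; the dictionary shape and the fixed display window are assumptions made for illustration, not part of the disclosure.

```python
def visible_barrage_controls(barrages, play_time, display_duration=5.0):
    """Return the barrage videos whose bullet screen moment has been
    reached and whose control is still inside its display window."""
    return [
        b for b in barrages
        if b["moment"] <= play_time < b["moment"] + display_duration
    ]
```

Called once per playback tick, this yields the set of video bullet screen controls to overlay on the video playing interface at the current play time.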
Step 306: and responding to the triggering operation on the video bullet screen control, and simultaneously playing the main video and the bullet screen video on the video playing interface.
In response to the trigger operation on the video bullet screen control, the main video and the bullet screen video are played simultaneously on the video playing interface, either superimposed or side by side.
In one example, in response to a trigger operation on the video bullet screen control, the main video continues playing on the video playing interface at its original speed, and the bullet screen video starts playing from the trigger time of the trigger operation; in another example, in response to a trigger operation on the video bullet screen control, the main video jumps back to the bullet screen moment at the trigger time, and the bullet screen video starts playing from the trigger time.
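The two trigger behaviors (continue the main video, or rewind it to the bullet screen moment) can be sketched as a seek decision. The function name and the position-pair return shape are illustrative assumptions.

```python
def on_barrage_triggered(main_pos, barrage_moment, rewind_main=False):
    """Return (main video position, barrage video position) after the
    trigger operation. With rewind_main=True the main video jumps back
    to the bullet screen moment; otherwise it keeps playing from the
    trigger time. The barrage video always starts from its beginning."""
    new_main_pos = barrage_moment if rewind_main else main_pos
    return new_main_pos, 0.0
```

Rewinding aligns both videos to the same game event, which makes a side-by-side comparison of the two players easier.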
In one example, in the process of simultaneously playing a main video and a barrage video, a first audio corresponding to the main video and a second audio corresponding to the barrage video are mixed and played; in another example, in the process of simultaneously playing the main video and the barrage video, only the first audio corresponding to the main video is played, and the second audio corresponding to the barrage video is not played; in another example, in the process of simultaneously playing the main video and the barrage video, the first audio corresponding to the main video is not played, and the second audio corresponding to the barrage video is played.
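The three audio options above can be sketched as a simple per-sample mixer; the sample lists and mode names are illustrative assumptions rather than part of the disclosure.

```python
def mix_audio(main_samples, barrage_samples, mode="mix"):
    """Combine the first audio (main video) and the second audio
    (barrage video) according to the chosen option."""
    if mode == "mix":            # play both, summed sample by sample
        return [m + b for m, b in zip(main_samples, barrage_samples)]
    if mode == "main_only":      # play only the main video's audio
        return list(main_samples)
    if mode == "barrage_only":   # play only the barrage video's audio
        return list(barrage_samples)
    raise ValueError(f"unknown audio mode: {mode}")
```

A real player would mix decoded PCM buffers (with clipping or normalization) rather than plain lists, but the mode selection is the same.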
In summary, in the method provided in this embodiment, by providing a bullet screen video in a video form, when the main video is played to the moment of the bullet screen, a video bullet screen control is additionally displayed on the video playing interface. When the user clicks the video barrage control, the main video and the barrage video are played simultaneously on the video playing interface, so that the user can watch the main video and the barrage video simultaneously on the same user interface, different differences in the main video and the barrage video are compared, and effective information is obtained from the barrage video in a more intuitive and efficient mode.
Fig. 4 shows a flowchart of a bullet screen display method according to an exemplary embodiment of the present application. The method can be executed by a terminal, and comprises the following steps:
step 402: displaying a video playing interface, wherein the video playing interface is used for playing a main video;
the user starts an application program on the terminal, and the application program can be a video playing program, a live application program, a cloud game program or other programs with video playing capability. Illustratively, the application also supports bullet-screen playback.
And after the application program is started, the terminal displays a video playing interface. The main video is played on the video playing interface. Optionally, the main video is a video provided by a background server corresponding to the application program, or is a video stored locally. In this embodiment, an example is given in which the main video is an online video.
When the application program adopts a full-screen playing mode, the main video adopts the full-screen playing mode; when the application program adopts a window playing mode, the main video adopts the window playing mode.
Step 404: when the main video is played to the bullet screen moment, adding a video bullet screen control and associated information on a video playing interface;
the video bullet screen control is used for triggering the bullet screen video in the video playing mode. Optionally, the video bullet screen control is a start play button or a preview of the bullet screen video. Different video barrage controls correspond to different barrage videos. The barrage video is a barrage in the form of a video associated with the main video. Different barrage videos can be provided by the same generator or different generators.
The bullet screen time is the playing time associated with the bullet screen video. Any play time of the main video can be associated with the bullet screen video. Multiple bullet screen videos can be associated with the same main video.
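The association described above — any playback moment of the main video may carry one or more barrage videos — can be sketched as a small index keyed by the bullet screen moment. A minimal illustration in Python; the class and method names are hypothetical.

```python
from collections import defaultdict

class BarrageIndex:
    """Index of barrage videos keyed by the bullet screen moment, i.e. the
    playback time of the main video at which each barrage video is shown.
    Multiple barrage videos may share one bullet screen moment."""

    def __init__(self):
        self._by_time = defaultdict(list)  # seconds -> list of barrage ids

    def associate(self, bullet_time_s, barrage_id):
        """Associate a barrage video with a playback moment of the main video."""
        self._by_time[bullet_time_s].append(barrage_id)

    def due(self, playback_time_s):
        """Return the barrage videos whose bullet screen moment is reached."""
        return list(self._by_time.get(playback_time_s, []))
```

The player would call `due()` as playback advances and add one video bullet screen control per returned barrage video.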
And when the main video is played to the bullet screen moment, additionally displaying a video bullet screen control on the video playing interface. The video bullet screen control can be displayed in any area of the main video, for example, the video bullet screen control is displayed in at least one of a central area, an upper area, a lower area, a left area, a right area, an upper left corner area, a lower left corner area, an upper right corner area and a lower right corner area.
Wherein the associated information includes: at least one of an account number of the bullet screen sender, a head portrait of the bullet screen sender, a nickname of the bullet screen sender and a text bullet screen associated with the video bullet screen.
As shown in fig. 5, for the same bullet screen video, a head portrait 11 of a bullet screen sender, a video bullet screen control 16 and a text bullet screen 15 associated with the bullet screen video are added to the lower left corner area of the video playing interface. The bullet screen sender's avatar 11, video bullet screen controls 16, and the text bullet screen 15 associated with the bullet screen video may be displayed in a left-to-right direction. Optionally, the video duration of the barrage video is also displayed in the upper right corner of the video barrage control 16.
In one example, the display position of the video bullet screen control is fixed. In another example, the video bullet screen control is scrolled and displayed on the video playing interface, and the scrolling direction is at least one of left to right, right to left, top to bottom, and bottom to top.
In one example, the video bullet screen control is displayed for a certain duration, where the duration is a preset value, or equals the time taken for the video bullet screen control to scroll once across the bullet screen area.
Because the bullet screen time and the display duration of each video bullet screen control are different, a plurality of video bullet screen controls may be simultaneously displayed on the video playing interface. In the case that at least two video bullet screen controls need to be displayed, the display can be performed in any one of the following two ways:
First manner: at least two video bullet screen controls are arranged transversely on the video playing interface;
as shown in fig. 6, when there are a plurality of video bullet screen controls 16 to be displayed, the plurality of video bullet screen controls are displayed in a left-to-right arrangement on the video playing interface. The video bullet screen control with the earlier bullet screen moment is displayed on the left side; the video bullet screen control with the bullet screen at a later moment is displayed on the right side.
Second manner: at least two video bullet screen controls are arranged longitudinally on the video playing interface.
As shown in fig. 7, when there are multiple video bullet screen controls to be displayed, the multiple video bullet screen controls are displayed in an arrangement from top to bottom on the video playing interface. The video bullet screen control with the earlier bullet screen moment is displayed above; and the video bullet screen control with the later bullet screen time is displayed below.
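The two arrangement manners share one ordering rule: the earlier the bullet screen moment, the earlier the slot (leftmost in the transverse arrangement, topmost in the longitudinal one). A minimal sketch with hypothetical field names:

```python
def arrange_controls(controls):
    """Order simultaneously displayed video bullet screen controls by their
    bullet screen moment. Returns (control id, slot) pairs; slot 0 is the
    leftmost slot in the transverse arrangement or the topmost slot in the
    longitudinal arrangement."""
    ordered = sorted(controls, key=lambda c: c["bullet_time"])
    return [(c["id"], slot) for slot, c in enumerate(ordered)]
```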
Step 406: responding to the triggering operation on the video bullet screen control, and simultaneously playing a main video and a bullet screen video on a video playing interface;
the triggering operation is a man-machine interaction operation triggered on the video bullet screen control. Triggering operations include, but are not limited to: at least one of a single-click operation, a double-click operation, a floating touch operation, a pressure touch operation, and a sliding operation.
Optionally, the mode of simultaneously playing the main video and the barrage video on the video playing interface adopts at least one of the following modes:
First manner: in response to the triggering operation on the video bullet screen control, the main video is played in the video playing interface, and the barrage video is played superimposed on the video picture of the main video.
As shown in fig. 8, the barrage video may be superimposed on the main video in the form of a small floating window 80, i.e. a picture-in-picture display mode. In order not to block the main display content of the main video, the display position of the barrage video can be located in the peripheral area of the video playing interface.
In one example, the display position of the bullet screen video is changed in the video playing interface in response to a drag operation on the bullet screen video. That is, the user can change the playing position of the barrage video on the video playing interface to the desired position by dragging the floating window. In another example, a closing control is further displayed on the playing interface of the barrage video, and the playing of the barrage video is cancelled on the video playing interface in response to a triggering operation on the closing control.
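The drag behavior described above can be sketched as clamping the floating window's drop point to the bounds of the video playing interface, so the window never leaves the screen. An illustrative Python sketch; the dictionary representation of the window is an assumption.

```python
def drag_floating_window(win, drop_x, drop_y, iface_w, iface_h):
    """Move the picture-in-picture barrage window to the drag release point,
    clamped so the whole window stays inside the video playing interface.
    `win` holds the window size ("w", "h") and position ("x", "y")."""
    x = max(0, min(drop_x, iface_w - win["w"]))
    y = max(0, min(drop_y, iface_h - win["h"]))
    return dict(win, x=x, y=y)  # return a moved copy; original is untouched
```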
Second manner: in response to the triggering operation on the video bullet screen control, the main video is played in a first playing area in the video playing interface, and the barrage video is played in a second playing area in the video playing interface.
Illustratively, the first play area and the second play area are two play areas in parallel. The area of the first playing area is larger than or equal to the area of the second playing area, namely a split-screen display mode.
As shown in fig. 9, the video playing interface is split into a left playing area and a right playing area, and the playing areas of the left playing area and the right playing area are the same. And playing the main video in the left playing area, and playing the barrage video in the right playing area.
Third manner: the video bullet screen control comprises a first video bullet screen control and a second video bullet screen control.
The number of the video bullet screen controls displayed on the video playing interface can be multiple, and the first video bullet screen control and the second video bullet screen control are any two of the multiple video bullet screen controls. Responding to the triggering operation on the first video barrage control, playing the main video in a first playing area in the video playing interface, and playing the first barrage video corresponding to the first video barrage control in a second playing area of the video playing interface.
As shown in fig. 9, the video playing interface is split into a left playing area and a right playing area, and the playing areas of the left playing area and the right playing area are the same. And playing the main video in the left playing area, and playing the first barrage video in the right playing area.
In response to a triggering operation on the second video bullet screen control, the main video is played in the first playing area in the video playing interface, the first barrage video corresponding to the first video bullet screen control is played in the second playing area of the video playing interface, and the second barrage video corresponding to the second video bullet screen control is played in a third playing area of the video playing interface.
The first playing area, the second playing area and the third playing area are three parallel playing areas, the area of the first playing area is larger than that of the second playing area, and the area of the first playing area is larger than that of the third playing area.
As shown in fig. 10, the video playing interface is split into a left playing area 91, an upper right playing area 92 and a lower right playing area 93, the playing areas of the upper right playing area 92 and the lower right playing area 93 are the same, and the playing area of the left playing area 91 is larger than the playing area of the upper right playing area 92. The main video 12 is played in the left-hand playing area 91, the first barrage video 18 is played in the upper-right playing area 92, and the second barrage video 18 is played in the lower-right playing area 93.
In another example, the first barrage video 18 is played in the left playback area 91, the main video 12 is played in the upper right playback area 92, and the second barrage video 18 is played in the lower right playback area 93, as shown in fig. 11.
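The three-area split shown in figs. 10 and 11 can be sketched as a layout computation in which the left area spans the full height and each right-hand area takes a quarter of the interface, so the left area is larger than either right-hand area. Illustrative only; the (x, y, width, height) tuple convention is an assumption.

```python
def split_three_areas(w, h):
    """Compute the left, upper-right and lower-right playing areas of a
    w-by-h video playing interface as (x, y, width, height) tuples. The two
    right-hand areas are equal; the left area is larger than either."""
    left = (0, 0, w // 2, h)
    upper_right = (w // 2, 0, w - w // 2, h // 2)
    lower_right = (w // 2, h // 2, w - w // 2, h - h // 2)
    return left, upper_right, lower_right
```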
Optionally, the display contents in the respective playing areas may be transposed. In response to a position switching operation on the main video, the playing areas of the main video and the barrage video are switched; or, in response to a position switching operation on the barrage video, the playing areas of the main video and the barrage video are switched.
For example, the position switching operation is a first sliding operation, and when a start point of the first sliding operation is located in a playing area of the main video and an end point of the first sliding operation is located in a playing area of the barrage video, the playing areas of the main video and the barrage video are switched. For another example, the position switching operation is a second sliding operation, and when a start point of the second sliding operation is located in a playing area of the first barrage video and an end point of the second sliding operation is located in a playing area of the second barrage video, the playing areas of the first barrage video and the second barrage video are switched.
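The sliding-based position switch can be sketched in two steps: hit-test the sliding operation's start and end points against the playing areas, then swap the videos assigned to the two areas. An illustrative Python sketch with hypothetical names:

```python
def area_at(areas, point):
    """Return the name of the playing area containing the point, or None."""
    px, py = point
    for name, (x, y, w, h) in areas.items():
        if x <= px < x + w and y <= py < y + h:
            return name
    return None

def handle_position_switch(areas, assignment, start, end):
    """Swap the videos shown in the two areas containing the sliding
    operation's start point and end point. `assignment` maps area name to
    the video currently played there; a new mapping is returned."""
    a, b = area_at(areas, start), area_at(areas, end)
    if a and b and a != b:
        assignment = dict(assignment)
        assignment[a], assignment[b] = assignment[b], assignment[a]
    return assignment
```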
In one example, first archived data is stored in association with the main video, and second archived data is stored in association with the bullet screen video. For example, the main video is a cloud game video, and the first archived data stores cloud game archive data corresponding to the video picture of the main video; the cloud game archive data includes data such as play style, operations, item builds, level, and equipment.
Step 408: reading first attribute information of a first display element from first archived data in response to a selection operation of the first display element in the main video; displaying first attribute information on the periphery side of the first display element;
The first display element is a display element appearing in a video picture of the main video, such as a person, animal, machine, equipment, prop, or item. The selection operation is a human-computer interaction operation for selecting the first display element, and includes at least one of a click operation, a double-click operation, a floating touch operation, a pressure touch operation, a sliding operation, and an eye gaze operation.
The terminal acquires the trigger coordinate of the selection operation and detects whether the trigger coordinate is located within the pre-stored coordinate range of the first display element. If the trigger coordinate is located within the pre-stored coordinate range of the first display element, it is determined that the first display element is selected.
Since the first archived data includes the first attribute information of the first display element, the terminal reads the first attribute information from the first archived data and displays it on the peripheral side of the first display element. For example, the first display element is a game character, and the first attribute information is the level and item build of the game character; for another example, the first display element is a weapon prop, and the first attribute information is at least one of the skill, level, attack power, and attack characteristic of the weapon prop.
As shown in fig. 12, an exemplary video screen of main video 12 includes game character soldier 1 and soldier 2, and when a user selects game character soldier 1, a rank 20 of soldier 1 and equipment "scar gun" are displayed in the video screen of main video 12.
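Steps 408 and 410 share the same mechanism: hit-test the trigger coordinate against the pre-stored coordinate range of each display element, then read that element's attribute information from the archived data for display on the element's peripheral side. A minimal sketch under the assumption that the archived data stores each element's bounds and attributes; all field names are hypothetical.

```python
def select_display_element(archived_data, tap):
    """Return the attribute information of the display element whose
    pre-stored coordinate range contains the tap coordinate, or None when no
    element is hit."""
    tx, ty = tap
    for element in archived_data["elements"]:
        x, y, w, h = element["bounds"]
        if x <= tx < x + w and y <= ty < y + h:
            return element["attributes"]
    return None
```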
Step 410: reading second attribute information of a second display element from second archived data in response to a selection operation of the second display element in the bullet screen video; displaying second attribute information on the periphery side of the second display element;
The second display element is a display element appearing in the video picture of the bullet screen video, such as a person, animal, machine, equipment, prop, or item. The selection operation is a human-computer interaction operation for selecting the second display element, and includes at least one of a click operation, a double-click operation, a floating touch operation, a pressure touch operation, a sliding operation, and an eye gaze operation.
The terminal acquires the trigger coordinate of the selection operation and detects whether the trigger coordinate is located within the pre-stored coordinate range of the second display element. If the trigger coordinate is located within the pre-stored coordinate range of the second display element, it is determined that the second display element is selected.
Since the second archived data includes the second attribute information of the second display element, the terminal reads the second attribute information from the second archived data and displays it on the peripheral side of the second display element. For example, the second display element is a game character, and the second attribute information is the level and item build of the game character; for another example, the second display element is a weapon prop, and the second attribute information is at least one of the skill, level, attack power, and attack characteristic of the weapon prop.
As shown in fig. 13, the video screen of bullet screen video 18 includes game character soldier 1 and soldier 2, and when the user selects game character soldier 2, the rank 21 of soldier 2 and the equipment "AK 47 gun" are displayed in the video screen of bullet screen video 18.
Since the same third display element may appear in the main video and the barrage video, this embodiment optionally further includes the following steps:
step 412: in response to a selection operation of a third display element in the main video or the bullet screen video, displaying third attribute information on a peripheral side of the third display element in the main video; and displaying fourth attribute information on the periphery side of the third display element in the bullet screen video;
wherein the third display element is the same display element that appears in both the main video and the barrage video.
The third display element is a display element that appears simultaneously in the video pictures of the main video and the bullet screen video, such as a person, animal, machine, equipment, prop, or item. The selection operation is a human-computer interaction operation for selecting the third display element, and includes at least one of a click operation, a double-click operation, a floating touch operation, a pressure touch operation, a sliding operation, and an eye gaze operation.
The terminal acquires the trigger coordinate of the selection operation and detects whether the trigger coordinate is located within the pre-stored coordinate range of the third display element. If the trigger coordinate is located within the pre-stored coordinate range of the third display element, it is determined that the third display element is selected.
The terminal displays the third attribute information on the peripheral side of the third display element in the main video. For example, the third display element is a game character, and the third attribute information is the level and item build of the game character; for another example, the third display element is a weapon prop, and the third attribute information is at least one of the skill, level, attack power, and attack characteristic of the weapon prop.
The terminal displays the fourth attribute information on the peripheral side of the third display element in the bullet screen video. For example, the third display element is a game character, and the fourth attribute information is the level and item build of the game character; for another example, the third display element is a weapon prop, and the fourth attribute information is at least one of the skill, level, attack power, and attack characteristic of the weapon prop.
As shown in fig. 14, the video frames of main video 12 and bullet screen video 18 both include game character soldier 1 and soldier 2. When the user selects game character soldier 1 in main video 12 or bullet screen video 18, the grade 20 of soldier 1 and the equipment "scar gun" are displayed in the video frame of main video 12, and the grade 20 of soldier 1 and the equipment "AK 47 gun" are displayed in the video frame of bullet screen video 18. Since main video 12 and barrage video 18 are cloud game videos provided by different players, the user can compare how the two players play soldier 1.
In summary, in the method provided in this embodiment, by providing a bullet screen in video form, a video bullet screen control is additionally displayed on the video playing interface when the main video is played to the bullet screen moment. When the user clicks the video bullet screen control, the main video and the barrage video are played simultaneously on the video playing interface, so that the user can watch both videos on the same user interface, compare the differences between the main video and the barrage video, and obtain effective information from the barrage video in a more intuitive and efficient manner.
The method provided by this embodiment further provides a way to view the attribute information of display elements in the video pictures of the main video and/or the barrage video, so that during the playing of the barrage video the user can compare in detail the attribute information of the same or different display elements in the main video and the barrage video, yielding a more intuitive and efficient way to compare information.
Fig. 15 shows a flowchart of a cloud game based barrage video generation method according to an exemplary embodiment of the present application. The embodiment is exemplified by the method applied to the terminal. The method comprises the following steps:
step 1502: acquiring a game video generated in the running process of the cloud game;
Illustratively, the game video is automatically recorded by the terminal during the running of the cloud game. In the process of the player trial-playing or normally playing the cloud game, the terminal automatically records the game picture according to default settings to obtain the game video. In other embodiments, the game video is recorded manually by the player, which is not limited in this embodiment.
Step 1504: determining a start timestamp and an end timestamp in response to the intercept operation;
after the game video is automatically recorded, the player can intercept the game video on the user interface. In response to the intercepting operation of the player, the terminal determines a start time stamp and an end time stamp of a clip to be intercepted.
Step 1506: intercepting a game video clip from the game video according to the start time stamp and the end time stamp;
step 1508: generating a bullet screen video according to the game video clip;
step 1510: and taking the starting timestamp as the bullet screen moment, and storing the bullet screen video to the cloud game in a correlation manner.
The server of the cloud game stores the game video of the cloud game, and when the game video is the main video in the above embodiment, the barrage video is stored in association with at least one cloud game video corresponding to the cloud game. The barrage video is displayed on the cloud game video as a barrage in a video form.
Optionally, the terminal further acquires archived data generated in the running process of the cloud game, and the archived data and the game video have a corresponding relationship. Intercepting archived data segments from the archived data according to the start timestamp and the end timestamp; and storing the archived data segment and the barrage video in an associated manner.
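Steps 1504 through 1510, together with the optional archived-data interception, can be sketched as cutting both sequences with the same pair of timestamps and storing them in association, with the start timestamp as the bullet screen moment. Illustrative Python only; the frame and archive records are assumed to carry a `ts` field.

```python
def intercept_barrage_clip(frames, archived_data, start_ts, end_ts):
    """Cut the game video clip and the matching archived data segment with the
    same start/end timestamps. The start timestamp becomes the bullet screen
    moment under which the clip is stored in association with the cloud game."""
    clip = [f for f in frames if start_ts <= f["ts"] <= end_ts]
    segment = [d for d in archived_data if start_ts <= d["ts"] <= end_ts]
    return {"bullet_time": start_ts, "clip": clip, "archive": segment}
```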
In summary, the method provided in this embodiment uses the game video of a cloud game as material and captures a game video clip from it as a barrage video. The production process of the barrage video is simple, so that a user can produce a corresponding barrage video with simple human-computer operations, which improves human-computer interaction efficiency.
Fig. 16 shows a flowchart of a bullet screen generating method according to an exemplary embodiment of the present application. The embodiment is exemplified by the method applied to the terminal. The method comprises the following steps:
step 1: starting a cloud game through a cloud game video;
the cloud game video is a video recorded based on a cloud game. The cloud game video is short for game video corresponding to the cloud game application program. The game video corresponding to the cloud game application refers to a game video related to the cloud game application, for example, a game video recorded in a process that a user operates the cloud game application, a game video for introducing the cloud game application (for example, a game video for introducing an operation mode of the cloud game application (such as automatic running, jumping, and the like), a game video for introducing various operation controls (such as buttons, sliders, and the like) in the cloud game application, a game video for introducing a use method of a game operation (such as release skills and the like), a virtual article (such as a virtual missile, a virtual aircraft, and the like) and the like in the cloud game application, and a game video for introducing a figure of a virtual figure (such as dressing of the virtual figure and the like) and the like in the cloud game application).
The terminal can play a game video corresponding to the cloud game application program, the method for playing the game video by the terminal is not limited in the embodiment of the application, optionally, the terminal plays the game video in a webpage, for example, a user uploads the game video recorded in the process of operating the cloud game application program to a website, and then the terminal can play the game video in the webpage by opening the website; or, the terminal plays the game video in the cloud game application, for example, the cloud game application may provide a video playing function, and the user may view the game video in the cloud game application through the video playing function; alternatively, the terminal plays the game video in other applications that can provide video playing functions, such as a live application, a social application, a video application, and the like.
In order to meet the requirement of a user for controlling playing and the like of a game video in the process of watching the game video, in the embodiment of the application, an operation control and indication information, for example, one or more of an exit control, a pause control, an acceleration control, a zoom control, a progress adjustment control, start time and end time indication information, playing progress indication information, and the like, may be displayed in a playing interface of the game video of the cloud game application. The display styles of the operation control and the indication information are not limited in the embodiment of the application, and optionally, the operation control comprises one or more of a button, a slider, a text box and the like; optionally, the indication information comprises one or more of text indication information, color indication information, shape indication information, and the like. Of course, the user may also trigger the terminal to execute the functions that can be achieved by the operation control in other ways, for example, triggering the terminal to pause, accelerate, exit, zoom in and out, and the like by gestures, voice, and the like.
As shown in fig. 17, (a) of fig. 17 is a playing interface of a game video provided in an embodiment of the present application. The playing interface 510 of the game video includes an exit control 511, a pause control 512, start time indication information 513, playing progress indication information 514, a progress adjustment control 515, end time indication information 516, and a zoom control 517.
In the process of playing the game video corresponding to the cloud game application, the terminal may receive a start instruction for the cloud game application. The method for triggering the starting instruction is not limited, optionally, a playing interface of the game video comprises a starting control of the cloud game application program, and a user touches the starting control to trigger the terminal to receive the starting instruction aiming at the cloud game application program, so that the terminal can start the cloud game application program; or, in the process of playing the game video, the terminal receives an operation or instruction input in a playing interface of the game video, and receives a starting instruction for the cloud game application program under the condition that the input operation or instruction is matched with a preset operation or instruction, wherein the preset operation or instruction is a predefined operation or instruction for triggering the cloud game application program; or in the process of playing the game video, the user touches a playing interface of the game video to trigger the terminal to display a starting control of the cloud game application program, and the user touches the starting control to trigger the terminal to receive a starting instruction for the cloud game application program; or, in the process of playing the game video, the user triggers the terminal to receive a start instruction for the cloud game application program through operations such as voice, gesture, AI (Artificial Intelligence), and the like. For further description of the start command, please refer to the following embodiments, which are not repeated herein.
When receiving a start instruction for the cloud game application program, the terminal can start and run the cloud game application program and display a display interface of the cloud game application program. In the embodiment of the application, the game picture content displayed on the initial display interface after the cloud game application program is started is the same as the game picture content displayed in the picture frame of the game video at the moment the start instruction is received.
As shown in fig. 17, when the game video is played to the screen frame shown in fig. 17 (a), the terminal receives a start instruction for the cloud game application, and displays a loading interface of the cloud game application shown in fig. 17 (b), where the loading interface includes the waiting duration prompt information and the cancel control. If the terminal does not receive the touch operation for the cancel control or receives other cancel operations for canceling the loading of the cloud game application in the process of displaying the loading interface, the initial display interface of the cloud game application shown in (c) of fig. 17 is displayed after the cloud game application is loaded. The game screen content shown in the screen frame of the game video shown in fig. 17 (a) is the same as the game screen content shown in the initial display interface of the cloud game application shown in fig. 17 (c) after the start.
Optionally, the cloud game video stores corresponding archived data. The archived data includes: a correspondence between the time stamp and the game data. And when the terminal determines that the starting instruction is received, determining a target timestamp. The target timestamp is a timestamp corresponding to a frame of the game video when the start instruction is received. The method comprises the steps that a terminal obtains archived data associated with cloud game videos; determining a game progress node corresponding to the target timestamp based on the archived data; and displaying an initial display interface of the cloud game application program according to the game progress node corresponding to the target timestamp.
The archived data is an archive for recording game data and timestamps involved in the process of the user operating the cloud game application. In this embodiment of the application, the archived data includes archived data of at least one game progress node, the archived data includes a timestamp and game data corresponding to the game progress node, and the game data refers to data related to the cloud game application program in the game progress process, for example, the game data includes one or more of operation data, path data of a virtual object controlled by a user, data of virtual resources (such as virtual dress, virtual articles, and the like) owned by a virtual object controlled by the user, and the like.
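Determining the game progress node corresponding to the target timestamp amounts to finding the latest archived node at or before that timestamp, which is a sorted-list lookup. A sketch using Python's `bisect`; the (timestamp, game data) pair representation is an assumption.

```python
import bisect

def progress_node_for(archive_nodes, target_ts):
    """archive_nodes: (timestamp, game data) pairs sorted by timestamp.
    Return the game data of the latest progress node whose timestamp is at or
    before target_ts, or None when the target precedes the first node."""
    times = [ts for ts, _ in archive_nodes]
    i = bisect.bisect_right(times, target_ts)  # nodes with ts <= target_ts
    return archive_nodes[i - 1][1] if i else None
```

The terminal would restore this node's game data to build the initial display interface matching the paused video frame.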
Step 2: automatically recording a game video in the process of the cloud game trial playing;
In the process of the user operating the cloud game application program, the terminal can record the game picture of the cloud game application program during the game progress to obtain a game video corresponding to the cloud game application program. In the embodiment of the application, the terminal can collect game pictures at preset intervals during the game progress; the collected game pictures are in the form of picture frames, so the game video recorded by the terminal comprises at least one collected picture frame. Optionally, each picture frame has a corresponding timestamp, so as to facilitate subsequent association of the game video corresponding to the cloud game application with the cloud game archive corresponding to the cloud game application.
And step 3: in the process of the cloud game trial playing, game data are stored;
in the process of operating the cloud game application program by the user, on one hand, the terminal records game pictures in the game progress process to obtain game videos, and on the other hand, the terminal records game data to obtain archived data. The game data refers to data involved in the progress of the game, and for example, the game data includes one or more of operation data, path data of a virtual object controlled by a user, data of a virtual resource (such as virtual makeup, virtual article, and the like) possessed by the virtual object controlled by the user, and the like.
In one example, in the game progress process, game data corresponding to at least one game progress node is obtained; generating archived data of at least one game progress node to obtain the archived data.
The game progress nodes are used to divide the game progress of the cloud game application; dividing by game progress nodes facilitates storing, classifying, calling, and acquiring the game data involved in the game progress. The embodiment of the application does not limit the manner of dividing the game progress nodes. Optionally, one game progress node is divided every preset time length, for example, every 0.5 second; or, the game progress nodes are divided according to user operations, for example, one game progress node is divided each time an operation of the user is received; or, the game progress nodes are divided according to the path of the virtual object controlled by the user, for example, one game progress node is divided each time the path of the virtual object changes; or, multiple dividing manners are combined, for example, on the basis of dividing one game progress node every 0.5 second, if an operation of the user is received within that interval, a corresponding game progress node is also divided according to the user operation.
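A minimal sketch of the combined division strategy described above (fixed 0.5-second intervals plus an extra node for each user operation) might look as follows; the function name and event format are assumptions for illustration:

```python
def divide_progress_nodes(duration: float, operations: list, interval: float = 0.5) -> list:
    """Return sorted node timestamps: one every `interval` seconds,
    plus one for each user-operation timestamp."""
    n = int(duration / interval)
    nodes = {round(i * interval, 3) for i in range(n + 1)}
    nodes.update(round(t, 3) for t in operations)  # extra node per user operation
    return sorted(nodes)

# A 2-second session with user operations at 0.7 s and 1.5 s:
# 0.7 is inserted between the fixed 0.5 and 1.0 nodes; 1.5 coincides with a fixed node.
assert divide_progress_nodes(2.0, [0.7, 1.5]) == [0.0, 0.5, 0.7, 1.0, 1.5, 2.0]
```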
In the embodiment of the application, the game progress nodes can be determined after all the game data in the game progress process has been acquired, or determined in real time according to game data acquired in real time during the game progress. After the game progress nodes are divided, the game data of each game progress node can be acquired and the archived data of that node generated, where the archived data of a game progress node includes the timestamp and the game data corresponding to the node. Optionally, the archived data of a game progress node is generated in real time while the game data of the node is acquired, for example, each time a game progress node is divided, its game data is acquired and its archived data generated; or, the archived data of the game progress nodes is generated after the game data of all game progress nodes has been acquired. The generation manner of the archived data of the game progress nodes is not limited in the embodiment of the application.
After the archived data of at least one game progress node is generated, the archived data of the whole game video can be obtained; that is, the archived data of the whole game video includes the archived data of at least one game progress node. Because the archived data of each game progress node includes the timestamp and the game data corresponding to that game progress node, in the embodiment of the application there is a correspondence between timestamps and game data, and the corresponding game data can be located through a timestamp. Optionally, the game data corresponding to a target timestamp includes operation data corresponding to the target timestamp, and the operation data is used to indicate the game control operation received at the time of the target timestamp. For an introductory description of the operation data and game control operations, please refer to the above embodiments; details are not repeated herein. The game data may include, for example, time information, operations, paths, level, character, and the like.
Step 4: associating and binding the game data to the recorded game video through the time information;
in the embodiment of the application, the game video and the archived data corresponding to the cloud game application can be generated synchronously; alternatively, they are not generated synchronously, for example, during the game progress the game video is obtained by recording the game screen while the game data is recorded separately.
After the game video and the archived data are respectively obtained, an association relationship between the game video and the archived data can be established, so as to associate the two and obtain the playing data corresponding to the cloud game application. Optionally, in this embodiment of the present application, the game video and the archived data establish the association relationship through timestamps. Each picture frame of the game video has a corresponding timestamp, and the archived data also includes the timestamp and game data corresponding to each game progress node, so the game data can be associated to the game video through the timestamps.
As shown in fig. 17, an exemplary relationship between game video 1010 and archived data 1020 is shown. From the time stamp (e.g., 06:13) of the game progress node included in the archived data 1020, the picture frame in the game video 1010 corresponding to the time stamp (e.g., 06:13) can be located, and the game data corresponding to the time stamp (e.g., 06:13) can be associated with the game video 1010, e.g., the game data corresponding to the time stamp (e.g., 06:13) can be associated with the picture frame in the game video 1010 corresponding to the time stamp (e.g., 06: 13).
It should be noted that, since the picture frame acquisition manner may not be consistent with the division manner of the game progress nodes, the timestamp of a certain game progress node may not correspond exactly to a picture frame in the game video. For example, the timestamp of the game progress node includes 08:00, while the timestamps of the picture frames of the game video do not include 08:00, but include 07:58 and 08:01. In this case, the game data corresponding to the node's timestamp may be associated to the game video through the frame timestamp closest to that timestamp; in this example, the game data corresponding to the 08:00 timestamp may be associated to the game video through the 08:01 timestamp.
If, among the timestamps corresponding to the picture frames of the game video, there are multiple timestamps equally closest to the timestamp of a game progress node, the game data of the game progress node may be associated to the game video through the timestamp before the node, or through the timestamp after the node, which is not limited in the embodiment of the present application. For example, the timestamp of the game progress node includes 08:00 while the timestamps corresponding to the picture frames of the game video include 07:59 and 08:01; in this case, the game data corresponding to the 08:00 timestamp may be associated to the game video through the 08:01 timestamp, or through the 07:59 timestamp.
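The nearest-timestamp matching described in the two paragraphs above can be sketched as follows; when two frame timestamps are equally close, this sketch arbitrarily prefers the later one, a choice the embodiment explicitly leaves open:

```python
def nearest_frame_timestamp(node_ts: float, frame_timestamps: list) -> float:
    """Pick the frame timestamp closest to a progress-node timestamp.
    Ties are broken toward the later frame (either choice is allowed)."""
    return min(frame_timestamps, key=lambda ft: (abs(ft - node_ts), -ft))

# Node at 08:00 (480 s), frames at 07:58 (478 s) and 08:01 (481 s): 08:01 is closest.
assert nearest_frame_timestamp(480.0, [478.0, 481.0]) == 481.0
# Tie: frames at 07:59 (479 s) and 08:01 (481 s) are equally close; the later one wins here.
assert nearest_frame_timestamp(480.0, [479.0, 481.0]) == 481.0
```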
Step 5: generating a cloud game video according to the game video and the archived data;
after the cloud game video corresponding to the cloud game application is generated according to the game video and the archived data, the cloud game video can be stored for subsequent calling or acquisition. Optionally, the terminal may also upload the cloud game video to the server, so as to reduce the storage pressure on the terminal and make it convenient for other terminals to call the cloud game video. For example, when a client of the cloud game application generates the cloud game video, the terminal uploads it to the server of the cloud game application; when another application (such as a live streaming application, a social application, or a video playing application) generates the cloud game video, the terminal uploads it to the server of that application.
The user can start a game through the cloud game video, and can also cut the cloud game video into short videos and publish them onto the original game video in the form of video barrages, so that while watching the video, viewers can intuitively compare the differences in play, operations, and equipment between the original game video and the video barrage.
Step 6: intercepting and generating a barrage video;
the video bullet screen contains a portion of the cloud game video and a corresponding cloud game archive, as shown in fig. 19. Illustratively, the terminal acquires the game video and the archived data generated during the running of the cloud game. In response to the intercepting operation, a start timestamp t1 and an end timestamp t2 are determined. A game video segment 1012 is intercepted from the game video and an archived data segment 1022 is intercepted from the archived data according to the start timestamp t1 and the end timestamp t2. The terminal generates the barrage video according to the game video segment 1012, and stores the archived data segment 1022 and the barrage video in an associated manner.
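Step 6 above can be sketched as follows, assuming both the picture frames and the archive nodes are lists of timestamped records (the record format is an assumption for illustration):

```python
def clip_segment(records: list, t1: float, t2: float) -> list:
    """Keep the records whose timestamps fall within [t1, t2]."""
    return [(ts, data) for ts, data in records if t1 <= ts <= t2]

frames  = [(0.0, "f0"), (5.0, "f1"), (10.0, "f2"), (15.0, "f3")]
archive = [(0.0, {"op": "start"}), (6.0, {"op": "jump"}), (12.0, {"op": "shoot"})]

# Intercept operation determined t1 = 5.0 and t2 = 12.0:
video_segment   = clip_segment(frames, 5.0, 12.0)   # frames for the barrage video
archive_segment = clip_segment(archive, 5.0, 12.0)  # associated archived data segment
assert video_segment == [(5.0, "f1"), (10.0, "f2")]
assert archive_segment == [(6.0, {"op": "jump"}), (12.0, {"op": "shoot"})]
```

The same `[t1, t2]` window is applied to both streams, which is what keeps the intercepted archived data segment aligned with the intercepted video segment.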
Step 7: releasing the bullet screen video to the cloud game video corresponding to the game.
The terminal uploads the barrage video to a server for storage.
In summary, the method provided in this embodiment takes the game video of a cloud game as material and intercepts a game video segment from it as a barrage video. The production process of the barrage video is simple, so that a user can produce a corresponding barrage video with simple human-computer operations, which improves human-computer interaction efficiency.
Fig. 20 is a block diagram illustrating a bullet screen display device according to an exemplary embodiment of the present application. The device comprises:
a display module 1220, configured to display a video playing interface, where the video playing interface is used to play a main video;
the display module 1220 is configured to additionally display a video bullet screen control on the video playing interface when the main video is played to a bullet screen time, where the video bullet screen control is a control for triggering playing of a bullet screen video in video form, and the bullet screen time is a playing time associated with the bullet screen video;
and the interaction module 1240 is configured to respond to a trigger operation on the video barrage control, and play the main video and the barrage video simultaneously on the video playing interface.
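A minimal sketch of the logic these modules implement — revealing the video barrage control once playback reaches the barrage time, and responding to a trigger operation on it — might look like this (the class and method names are assumptions for illustration):

```python
class BarrageOverlay:
    """Shows video-barrage controls once playback reaches their barrage times."""

    def __init__(self, barrages: list):
        # barrages: list of (barrage_time_seconds, barrage_video_id)
        self.barrages = sorted(barrages)
        self.visible_controls = []

    def on_playback_tick(self, position: float):
        """Called as the main video plays; adds controls whose barrage time has arrived."""
        for barrage_time, video_id in self.barrages:
            if barrage_time <= position and video_id not in self.visible_controls:
                self.visible_controls.append(video_id)  # additionally display the control

    def on_control_tap(self, video_id: str) -> str:
        """Trigger operation: play the barrage video alongside the main video."""
        return f"playing barrage video {video_id} with main video"

overlay = BarrageOverlay([(373.0, "clip_a"), (480.0, "clip_b")])
overlay.on_playback_tick(400.0)       # main video has reached 06:40
assert overlay.visible_controls == ["clip_a"]
```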
In an optional implementation of the present application, the display module 1220 is configured to add the video bullet screen control and the associated information to the video playing interface when the main video is played to the bullet screen time;
wherein the associated information comprises: at least one of an account number of a bullet screen sender, a head portrait of the bullet screen sender, a nickname of the bullet screen sender, and a text bullet screen associated with the bullet screen video.
In an optional implementation of the present application, at least two of the video bullet screen controls are arranged in a horizontal direction on the video playing interface; or at least two video bullet screen controls are arranged on the video playing interface along the longitudinal direction.
In an optional implementation of the present application, the interaction module 1240 is configured to, in response to a trigger operation on the video bullet screen control, play the main video in the video playing interface, and play the bullet screen video in a manner of being superimposed on a video frame of the main video; or responding to a trigger operation on the video bullet screen control, playing the main video in a first playing area in the video playing interface, and playing the bullet screen video in a second playing area in the video playing interface, wherein the first playing area and the second playing area are two parallel playing areas.
In an optional implementation of the present application, the video bullet screen control includes a first video bullet screen control and a second video bullet screen control;
the interaction module 1240 is configured to, in response to a trigger operation on the first video bullet screen control, play the main video in a first play area in the video play interface, and play a first bullet screen video corresponding to the first video bullet screen control in a second play area of the video play interface; and in response to a trigger operation on the second video bullet screen control, play the main video in the first play area in the video play interface, play the first bullet screen video corresponding to the first video bullet screen control in the second play area of the video play interface, and play a second bullet screen video corresponding to the second video bullet screen control in a third play area of the video play interface;
the first playing area, the second playing area and the third playing area are three parallel playing areas, the area of the first playing area is larger than that of the second playing area, and the area of the first playing area is larger than that of the third playing area.
In an optional implementation of the present application, the main video is associated with first archived data, and the bullet screen video is associated with second archived data;
the interaction module 1240 is configured to, in response to a selection operation on a first display element in the main video, read first attribute information of the first display element from the first archived data; displaying the first attribute information on the peripheral side of the first display element; or, in response to a selection operation of a second display element in the main video, reading second attribute information of the second display element from the second archived data; displaying the second attribute information on a peripheral side of the second display element.
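The attribute lookup described for the interaction module can be sketched as follows; the idea is that selecting a display element in the main video reads from the first archived data, while selecting one in the barrage video reads from the second. The dictionary format and variable names are assumptions for illustration:

```python
def read_attribute_info(archived_data: dict, element_id: str) -> dict:
    """Read a display element's attribute information from archived data.
    archived_data maps element ids to attribute dicts (format assumed)."""
    return archived_data.get(element_id, {})

first_archive  = {"hero": {"level": 12, "hp": 340}}   # associated with the main video
second_archive = {"hero": {"level": 15, "hp": 420}}   # associated with the barrage video

# Selecting the same element in each video reads from that video's own archive,
# so the displayed attribute information can differ between the two videos:
assert read_attribute_info(first_archive, "hero") == {"level": 12, "hp": 340}
assert read_attribute_info(second_archive, "hero") == {"level": 15, "hp": 420}
```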
In an optional implementation of the present application, the main video is stored with third attribute information of a third display element in association, and the bullet screen video is stored with fourth attribute information of the third display element in association;
the interaction module 1240 is configured to display the third attribute information on the periphery side of the third display element in the main video in response to a selection operation on the third display element in the main video or the bullet screen video; and displaying the fourth attribute information on the periphery side of a third display element in the bullet screen video;
wherein the third display element is the same display element that appears in both the main video and the bullet screen video.
Fig. 21 is a block diagram illustrating a barrage video generating apparatus based on a cloud game according to an exemplary embodiment of the present application, where the apparatus includes:
an obtaining module 1310, configured to obtain a game video generated during the running process of the cloud game;
a response module 1320 for determining a start timestamp and an end timestamp in response to the intercept operation;
an intercepting module 1330, configured to intercept a game video segment from the game video according to the start timestamp and the end timestamp;
the generating module 1340 is configured to generate a bullet screen video according to the game video segment;
the storage module 1350 is configured to store the barrage video to the cloud game in an associated manner by using the start timestamp as a barrage time.
In an optional example of this embodiment, the obtaining module 1310 is further configured to obtain archived data generated during the running process of the cloud game; the intercepting module 1330 is further configured to intercept archived data segments from the archived data according to the start timestamp and the end timestamp; the storage module 1350 is further configured to associate and store the archived data segment and the barrage video.
Fig. 22 shows a block diagram of a computer device 2200 provided in an exemplary embodiment of the present application. The computer device 2200 may be: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. Computer device 2200 may also be referred to by other names such as user device, portable computer device, laptop computer device, desktop computer device, and the like.
Generally, computer device 2200 includes: a processor 2201 and a memory 2202.
The processor 2201 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 2201 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 2201 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 2201 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, the processor 2201 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 2202 may include one or more computer-readable storage media, which may be non-transitory. Memory 2202 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 2202 is configured to store at least one instruction for execution by the processor 2201 to implement a bullet screen display method provided by method embodiments herein.
In some embodiments, the computer device 2200 may further optionally include: a peripheral interface 2203 and at least one peripheral. The processor 2201, memory 2202, and peripheral interface 2203 may be connected by a bus or signal line. Various peripheral devices may be connected to peripheral interface 2203 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 2204, a display 2205, a camera assembly 2206, an audio circuit 2207, a positioning assembly 2208, and a power source 2209.
The peripheral interface 2203 may be used to connect at least one peripheral associated with I/O (Input/Output) to the processor 2201 and the memory 2202. In some embodiments, the processor 2201, memory 2202, and peripheral interface 2203 are integrated on the same chip or circuit board; in some other embodiments, any one or both of the processor 2201, the memory 2202, and the peripheral device interface 2203 may be implemented on separate chips or circuit boards, which are not limited in this embodiment.
The radio frequency circuit 2204 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 2204 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 2204 converts an electric signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electric signal. Optionally, the radio frequency circuit 2204 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 2204 can communicate with other computer devices via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the World Wide Web, metropolitan area networks, intranets, successive generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or Wi-Fi (Wireless Fidelity) networks. In some embodiments, the rf circuit 2204 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display 2205 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 2205 is a touch display screen, the display screen 2205 also has the ability to acquire touch signals on or over the surface of the display screen 2205. The touch signal may be input to the processor 2201 as a control signal for processing. At this point, the display 2205 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 2205 may be one, providing a front panel of the computer device 2200; in other embodiments, the display 2205 can be at least two, respectively disposed on different surfaces of the computer device 2200 or in a folded design; in still other embodiments, the display 2205 may be a flexible display disposed on a curved surface or on a folded surface of the computer device 2200. Even more, the display 2205 can be arranged in a non-rectangular irregular figure, i.e. a shaped screen. The Display screen 2205 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 2206 is used to capture images or video. Optionally, camera assembly 2206 includes a front camera and a rear camera. Generally, a front camera is disposed on a front panel of a computer apparatus, and a rear camera is disposed on a rear surface of the computer apparatus. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 2206 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 2207 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals into the processor 2201 for processing or inputting the electric signals into the radio frequency circuit 2204 for realizing voice communication. For stereo capture or noise reduction purposes, the microphones may be multiple and located at different locations on the computer device 2200. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 2201 or the radio frequency circuit 2204 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 2207 may also include a headphone jack.
The positioning component 2208 is configured to locate the current geographic location of the computer device 2200 for navigation or LBS (Location Based Service). The positioning component 2208 may be based on the GPS (Global Positioning System) of the United States, the BeiDou System of China, the GLONASS System of Russia, or the Galileo System of the European Union.
The power supply 2209 is used to supply power to various components in the computer device 2200. The power source 2209 can be alternating current, direct current, disposable batteries, or rechargeable batteries. When the power source 2209 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, computer device 2200 also includes one or more sensors 2210. The one or more sensors 2210 include, but are not limited to: acceleration sensor 2211, gyro sensor 2212, pressure sensor 2213, fingerprint sensor 2214, optical sensor 2215 and proximity sensor 2216.
The acceleration sensor 2211 can detect the magnitude of acceleration on three coordinate axes of the coordinate system established with the computer apparatus 2200. For example, the acceleration sensor 2211 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 2201 may control the touch display 2205 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 2211. The acceleration sensor 2211 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 2212 may detect a body direction and a rotation angle of the computer device 2200, and the gyro sensor 2212 may cooperate with the acceleration sensor 2211 to acquire a 3D motion of the user on the computer device 2200. The processor 2201 may implement the following functions according to the data collected by the gyro sensor 2212: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensors 2213 may be disposed on the side bezel of the computer device 2200 and/or on the lower layer of the touch display 2205. When the pressure sensor 2213 is arranged on the side frame of the computer device 2200, the holding signal of the user to the computer device 2200 can be detected, and the processor 2201 performs left-right hand identification or quick operation according to the holding signal acquired by the pressure sensor 2213. When the pressure sensor 2213 is arranged at the lower layer of the touch display screen 2205, the processor 2201 controls the operability control on the UI interface according to the pressure operation of the user on the touch display screen 2205. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 2214 is used for collecting the user's fingerprint, and the processor 2201 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 2214, or the fingerprint sensor 2214 itself identifies the user's identity according to the collected fingerprint. Upon recognizing the user's identity as a trusted identity, the processor 2201 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 2214 may be disposed on the front, back, or side of the computer device 2200. When a physical key or vendor logo is provided on the computer device 2200, the fingerprint sensor 2214 may be integrated with the physical key or vendor logo.
The optical sensor 2215 is used to collect the ambient light intensity. In one embodiment, the processor 2201 may control the display brightness of the touch display 2205 according to the ambient light intensity collected by the optical sensor 2215. Specifically, when the ambient light intensity is high, the display brightness of the touch display 2205 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 2205 is turned down. In another embodiment, the processor 2201 may also dynamically adjust the shooting parameters of the camera assembly 2206 according to the intensity of the ambient light collected by the optical sensor 2215.
A proximity sensor 2216, also known as a distance sensor, is typically disposed on the front panel of the computer device 2200. The proximity sensor 2216 is used to capture the distance between the user and the front of the computer device 2200. In one embodiment, when the proximity sensor 2216 detects that the distance between the user and the front face of the computer device 2200 gradually decreases, the processor 2201 controls the touch display screen 2205 to switch from the bright screen state to the dark screen state; when the proximity sensor 2216 detects that the distance between the user and the front surface of the computer device 2200 gradually increases, the processor 2201 controls the touch display 2205 to switch from the dark screen state to the bright screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 22 is not intended to be limiting of the computer device 2200, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
The present application further provides a computer-readable storage medium, where at least one instruction, at least one program, a code set, or an instruction set is stored in the storage medium, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the bullet screen display method or the bullet screen video generation method based on the cloud game provided in the foregoing method embodiment.
Optionally, the present application further provides a computer program product containing instructions, which when run on a computer device, causes the computer device to execute the bullet screen display method or the bullet screen video generation method based on the cloud game in the above aspects.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (14)

1. A bullet screen display method is characterized by comprising the following steps:
displaying a video playing interface, wherein the video playing interface is used for playing a main video;
when the main video is played to the bullet screen moment, a video bullet screen control is additionally displayed on the video playing interface, wherein the video bullet screen control is used for triggering playing of a bullet screen video in a video form, and the bullet screen moment is the playing moment associated with the bullet screen video;
and responding to the triggering operation on the video bullet screen control, and simultaneously playing the main video and the bullet screen video on the video playing interface.
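As a purely illustrative, non-limiting sketch of the flow in claim 1 — the class and function names, and the fixed visibility window, are assumptions added for the example, not part of the claim — the check that reveals a video bullet screen control once playback reaches its bullet screen moment could look like:

```python
from dataclasses import dataclass

@dataclass
class BarrageVideo:
    video_id: str
    barrage_moment: float  # playback time (seconds) the clip is associated with

def controls_to_show(barrages, playback_time, window=5.0):
    """Return the bullet screen videos whose control should be displayed now.

    A control appears when the main video reaches the bullet screen moment
    and stays visible for `window` seconds (an assumed policy; the claim
    only requires display at the bullet screen moment).
    """
    return [
        b for b in barrages
        if b.barrage_moment <= playback_time < b.barrage_moment + window
    ]

barrages = [BarrageVideo("clip-a", 12.0), BarrageVideo("clip-b", 30.0)]
print([b.video_id for b in controls_to_show(barrages, 13.5)])  # ['clip-a']
```

Triggering one of the returned controls would then start simultaneous playback of the main video and the selected bullet screen video, as recited in the claim.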
2. The method of claim 1, wherein, when the main video is played to the bullet screen moment, additionally displaying a video bullet screen control on the video playing interface comprises:
when the main video is played to the bullet screen moment, adding the video bullet screen control and associated information on the video playing interface;
wherein the associated information comprises at least one of: an account number of a bullet screen sender, a head portrait of the bullet screen sender, a nickname of the bullet screen sender, and a text bullet screen associated with the bullet screen video.
3. The method of claim 2,
at least two video bullet screen controls are arranged on the video playing interface along the transverse direction;
or,
and at least two video bullet screen controls are arranged on the video playing interface along the longitudinal direction.
4. The method according to any one of claims 1 to 3, wherein the simultaneously playing the main video and the bullet screen video on the video playing interface in response to a trigger operation on the video bullet screen control comprises:
responding to a trigger operation on the video bullet screen control, playing the main video in the video playing interface, and overlapping and playing the bullet screen video on a video picture of the main video;
or,
responding to a trigger operation on the video bullet screen control, playing the main video in a first playing area in the video playing interface, and playing the bullet screen video in a second playing area in the video playing interface, wherein the first playing area and the second playing area are two parallel playing areas.
5. The method of any of claims 1 to 3, wherein the video bullet screen controls comprise a first video bullet screen control and a second video bullet screen control;
the responding to the triggering operation on the video bullet screen control, and simultaneously playing the main video and the bullet screen video on the video playing interface, comprises:
in response to a trigger operation on the first video bullet screen control, playing the main video in a first playing area in the video playing interface, and playing a first bullet screen video corresponding to the first video bullet screen control in a second playing area of the video playing interface;
and in response to a trigger operation on the second video bullet screen control, playing the main video in the first playing area in the video playing interface, playing the first bullet screen video corresponding to the first video bullet screen control in the second playing area of the video playing interface, and playing a second bullet screen video corresponding to the second video bullet screen control in a third playing area of the video playing interface;
the first playing area, the second playing area and the third playing area are three parallel playing areas, the area of the first playing area is larger than that of the second playing area, and the area of the first playing area is larger than that of the third playing area.
6. The method according to any one of claims 1 to 3, wherein first archived data is stored in association with the main video, and second archived data is stored in association with the bullet screen video;
the method further comprises the following steps:
reading first attribute information of a first display element in the main video from the first archived data in response to a selection operation of the first display element; displaying the first attribute information on the peripheral side of the first display element;
or,
reading second attribute information of a second display element in the main video from the second archived data in response to a selection operation of the second display element; displaying the second attribute information on a peripheral side of the second display element.
7. The method according to any one of claims 1 to 3, wherein third attribute information of a third display element is stored in association with the main video, and fourth attribute information of the third display element is stored in association with the bullet screen video;
the method further comprises the following steps:
in response to a selection operation of the third display element in the main video or the bullet screen video, displaying the third attribute information on a peripheral side of the third display element in the main video; and displaying the fourth attribute information on the periphery side of a third display element in the bullet screen video;
wherein the third display element is the same display element that appears in both the main video and the bullet screen video.
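An illustrative sketch of the attribute lookup in claims 6 and 7 — the archive layout, element identifiers, and attribute values are all assumptions invented for the example — each video carries its own archived data, so selecting the same display element in the main video and in the bullet screen video can surface different attributes:

```python
# Hypothetical archived data stored in association with each video: a map
# from display-element id to that element's attribute information.
main_archive = {"hero_1": {"level": 30, "hp": 2400}}      # first archived data
barrage_archive = {"hero_1": {"level": 31, "hp": 2550}}   # second archived data

def attributes_for(element_id, archive):
    """Read the attribute information of a selected display element."""
    return archive.get(element_id, {})

# The same element appears in both videos but reads from separate archives,
# so each video shows the attributes captured at its own point in time.
print(attributes_for("hero_1", main_archive)["level"])     # 30
print(attributes_for("hero_1", barrage_archive)["level"])  # 31
```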
8. A bullet screen video generation method based on a cloud game is characterized by comprising the following steps:
acquiring a game video generated in the running process of the cloud game;
determining a start timestamp and an end timestamp in response to an intercept operation;
intercepting a game video segment from the game video according to the start timestamp and the end timestamp;
generating a bullet screen video from the game video segment;
and storing the bullet screen video in association with the cloud game, with the start timestamp as the bullet screen moment.
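A minimal sketch of the claim-8 pipeline, with the game video modeled as a timestamped frame sequence purely for illustration — a real system would cut an encoded stream rather than a Python list, and the store layout is an assumption:

```python
def clip_segment(frames, start_ts, end_ts):
    """Keep the frames whose timestamp falls in [start_ts, end_ts]."""
    return [(ts, f) for ts, f in frames if start_ts <= ts <= end_ts]

# Game video generated while the cloud game runs: one frame every 10 time units.
game_video = [(t, f"frame-{t}") for t in range(0, 100, 10)]
segment = clip_segment(game_video, 20, 50)

# Store the bullet screen video in association with the cloud game, keyed by
# the bullet screen moment (the start timestamp), as the claim specifies.
barrage_store = {}
barrage_store[("my-cloud-game", 20)] = segment
print(len(segment))  # 4 frames: timestamps 20, 30, 40, 50
```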
9. The method of claim 8, further comprising:
acquiring archived data generated in the running process of the cloud game;
intercepting an archived data segment from the archived data according to the start timestamp and the end timestamp;
and storing the archived data segment in association with the bullet screen video.
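Claim 9 cuts the archive stream with the same timestamps as the video and ties the resulting segment to the bullet screen video. In this sketch the record shapes and identifiers are assumptions for illustration only:

```python
def cut(records, start_ts, end_ts):
    """Keep archive records whose timestamp falls in [start_ts, end_ts]."""
    return [r for r in records if start_ts <= r[0] <= end_ts]

# Archived data generated while the cloud game runs: one record every 5 ticks.
archive = [(t, {"tick": t}) for t in range(0, 100, 5)]
archive_segment = cut(archive, 20, 50)

# One record associates the bullet screen video with its archive segment, so
# attribute lookups (claims 6-7) can later read from the matching archive.
association = {"barrage_video_id": "clip-7", "archive": archive_segment}
print(len(association["archive"]))  # timestamps 20, 25, ..., 50 -> 7 records
```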
10. A bullet screen display device, characterized in that the device comprises:
the display module is used for displaying a video playing interface, and the video playing interface is used for playing a main video;
the display module is used for additionally displaying a video bullet screen control on the video playing interface when the main video is played to a bullet screen moment, wherein the video bullet screen control is used for triggering playing of a bullet screen video in a video form, and the bullet screen moment is the playing moment associated with the bullet screen video;
and the interaction module is used for responding to the triggering operation on the video bullet screen control and simultaneously playing the main video and the bullet screen video on the video playing interface.
11. A bullet screen video generation device based on a cloud game, which is characterized by comprising:
the acquisition module is used for acquiring a game video generated in the running process of the cloud game;
a response module to determine a start timestamp and an end timestamp in response to the intercept operation;
the intercepting module is used for intercepting a game video segment from the game video according to the start timestamp and the end timestamp;
the generating module is used for generating a bullet screen video from the game video segment;
and the storage module is used for storing the bullet screen video in association with the cloud game, with the start timestamp as the bullet screen moment.
12. The apparatus of claim 11,
the acquisition module is also used for acquiring archived data generated in the running process of the cloud game;
the intercepting module is further configured to intercept an archived data segment from the archived data according to the start timestamp and the end timestamp;
the storage module is further configured to store the archived data segment in association with the bullet screen video.
13. A computer device, characterized in that the computer device comprises: a processor and a memory, the memory storing a computer program loaded and executed by the processor to implement the bullet screen display method according to any one of claims 1 to 7, or the cloud game based bullet screen video generation method according to claim 8 or 9.
14. A computer-readable storage medium storing a computer program which is loaded and executed by a processor to implement the bullet screen display method according to any one of claims 1 to 7 or the cloud game-based bullet screen video generation method according to claim 8 or 9.
CN202011370558.0A 2020-11-30 2020-11-30 Bullet screen display method, bullet screen generation device, bullet screen equipment and storage medium Active CN112565911B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011370558.0A CN112565911B (en) 2020-11-30 2020-11-30 Bullet screen display method, bullet screen generation device, bullet screen equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112565911A true CN112565911A (en) 2021-03-26
CN112565911B CN112565911B (en) 2021-10-08

Family

ID=75045347

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011370558.0A Active CN112565911B (en) 2020-11-30 2020-11-30 Bullet screen display method, bullet screen generation device, bullet screen equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112565911B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140089800A1 (en) * 2012-09-25 2014-03-27 Avermedia Technologies, Inc. Multimedia comment system and multimedia comment method
US20140323213A1 (en) * 2013-04-30 2014-10-30 Kabam, Inc. System and method for enhanced video of game playback
CN111294662A (en) * 2020-03-02 2020-06-16 腾讯科技(深圳)有限公司 Bullet screen generation method, device, equipment and storage medium
CN111314724A (en) * 2020-02-18 2020-06-19 华为技术有限公司 Cloud game live broadcasting method and device
CN111405344A (en) * 2020-03-18 2020-07-10 腾讯科技(深圳)有限公司 Bullet screen processing method and device
CN111601150A (en) * 2020-06-05 2020-08-28 百度在线网络技术(北京)有限公司 Video processing method and device
CN111632373A (en) * 2020-05-30 2020-09-08 腾讯科技(深圳)有限公司 Method and device for starting game and computer readable storage medium

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113099285A (en) * 2021-03-30 2021-07-09 上海哔哩哔哩科技有限公司 Display method and device
CN113613053A (en) * 2021-07-26 2021-11-05 北京达佳互联信息技术有限公司 Video recommendation method and device, electronic equipment and storage medium
EP4125274A1 (en) * 2021-07-26 2023-02-01 Beijing Dajia Internet Information Technology Co., Ltd. Method and apparatus for playing videos
CN114666648A (en) * 2022-03-30 2022-06-24 阿里巴巴(中国)有限公司 Video playing method and electronic equipment
CN115278346A (en) * 2022-07-28 2022-11-01 北京字跳网络技术有限公司 Method for sending comments and receiving comments in live broadcast room and related equipment
WO2024022473A1 (en) * 2022-07-28 2024-02-01 北京字跳网络技术有限公司 Method for sending comment in live-streaming room, method for receiving comment in live-streaming room, and related device

Also Published As

Publication number Publication date
CN112565911B (en) 2021-10-08

Similar Documents

Publication Publication Date Title
CN112565911B (en) Bullet screen display method, bullet screen generation device, bullet screen equipment and storage medium
CN109729411B (en) Live broadcast interaction method and device
CN109874312B (en) Method and device for playing audio data
EP3902278B1 (en) Music playing method, device, terminal and storage medium
CN109646944B (en) Control information processing method, control information processing device, electronic equipment and storage medium
CN110300274B (en) Video file recording method, device and storage medium
CN110533585B (en) Image face changing method, device, system, equipment and storage medium
CN109922356B (en) Video recommendation method and device and computer-readable storage medium
CN112261481B (en) Interactive video creating method, device and equipment and readable storage medium
CN113411680B (en) Multimedia resource playing method, device, terminal and storage medium
CN109275013B (en) Method, device and equipment for displaying virtual article and storage medium
CN114116053B (en) Resource display method, device, computer equipment and medium
CN110868636B (en) Video material intercepting method and device, storage medium and terminal
CN109982129B (en) Short video playing control method and device and storage medium
CN114546227B (en) Virtual lens control method, device, computer equipment and medium
CN111711838B (en) Video switching method, device, terminal, server and storage medium
CN111565338A (en) Method, device, system, equipment and storage medium for playing video
WO2022227581A1 (en) Resource display method and computer device
CN113032590B (en) Special effect display method, device, computer equipment and computer readable storage medium
CN112417180A (en) Method, apparatus, device and medium for generating album video
CN112822544B (en) Video material file generation method, video synthesis method, device and medium
CN112023403A (en) Battle process display method and device based on image-text information
EP4125274A1 (en) Method and apparatus for playing videos
CN113485596B (en) Virtual model processing method and device, electronic equipment and storage medium
CN110942426B (en) Image processing method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40040525

Country of ref document: HK

GR01 Patent grant