CN115190369A - Video generation method, video generation device, electronic apparatus, medium, and product - Google Patents

Video generation method, video generation device, electronic apparatus, medium, and product

Info

Publication number
CN115190369A
Authority
CN
China
Prior art keywords
video, bullet screen, original, barrage, parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211099685.0A
Other languages
Chinese (zh)
Inventor
肖潇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202211099685.0A
Publication of CN115190369A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/816 Monomedia components thereof involving special video data, e.g. 3D video
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/488 Data services, e.g. news ticker
    • H04N21/4884 Data services, e.g. news ticker for displaying subtitles

Abstract

The present disclosure relates to the field of computer technologies, and in particular, to a video generation method, a video generation apparatus, an electronic device, a medium, and a computer program product. The method includes: acquiring a second video, wherein the second video comprises an original bullet screen; acquiring, in response to a bullet screen parameter acquisition instruction, bullet screen parameters corresponding to the original bullet screen in the second video, the barrage parameters being used for indicating related information of the original barrage in the second video; and deleting the original barrage in the second video to obtain a first video, and generating a target video according to the first video. The technical solutions of the embodiments of the present disclosure can solve the problem of poor video production efficiency in the related art.

Description

Video generation method, video generation device, electronic apparatus, medium, and product
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a video generation method, a video generation apparatus, an electronic device, a computer-readable storage medium, and a computer program product.
Background
With the rapid development of the internet, various video platforms have emerged. On some video platforms, a video creator may produce a video and upload it for platform users to watch. When producing a video, the creator may draw on other video material, and that material may carry a bullet screen. Some users do not want to see the bullet screen in the video material, while other users feel that the bullet screen adds to the entertainment effect and do want to see it.
However, in the related art, only videos in which the bullet screen of the video material is not displayed are usually produced, so the needs of users who want to see the bullet screen in the video material cannot be met; and if videos were produced separately for such users, a large amount of manpower and material resources would be consumed, resulting in poor video production efficiency.
Disclosure of Invention
The present disclosure provides a video generation method, a video generation apparatus, an electronic device, a computer-readable storage medium, and a computer program product, to at least solve the problem of poor video production efficiency in the related art. The technical scheme of the disclosure is as follows:
according to a first aspect of the embodiments of the present disclosure, there is provided a video generation method, including: acquiring a second video; wherein the second video comprises an original bullet screen; responding to a bullet screen parameter acquisition instruction, and acquiring bullet screen parameters corresponding to an original bullet screen in a second video; the barrage parameter is used for indicating related information of an original barrage in the second video; and deleting the original bullet screen in the second video to obtain a first video, and generating a target video according to the first video.
Optionally, before acquiring the barrage parameter corresponding to the original barrage in the second video, the method further includes: an original bullet screen in the second video is identified.
Optionally, identifying an original bullet screen in the second video includes: when a second video is input into a video editing system, identifying an original bullet screen in the second video; the video editing system is used for editing input video to generate target video.
Optionally, the bullet screen parameters corresponding to the original bullet screen in the second video include one or more of a bullet screen position, bullet screen content, a bullet screen color, a bullet screen presentation mode, bullet screen appearance time, and bullet screen disappearance time.
Optionally, the method further includes: playing the target video in a graphical user interface in response to a video playing instruction; and displaying, in response to a bullet screen display instruction, the original bullet screen in the target video according to the bullet screen parameters corresponding to the original bullet screen in the second video.
Optionally, in response to the bullet screen closing instruction, stopping displaying the original bullet screen in the target video.
Optionally, the display size of the second video is smaller than the display size of the target video, and displaying the original barrage in the target video according to the barrage parameters corresponding to the original barrage in the second video includes: determining a parameter conversion ratio according to the display size of the second video and the display size of the target video; converting the barrage parameters into mapped barrage parameters according to the parameter conversion ratio, the mapped barrage parameters being parameters used for displaying the original barrage in the target video in a manner adapted to the display size of the target video; and displaying the original barrage in the target video according to the mapped barrage parameters.
According to a second aspect of the embodiments of the present disclosure, there is provided a video generating apparatus including: a second video acquisition unit configured to acquire a second video, wherein the second video comprises an original bullet screen; an acquisition instruction response unit configured to acquire, in response to a bullet screen parameter acquisition instruction, bullet screen parameters corresponding to the original bullet screen in the second video, the barrage parameters being used for indicating related information of the original barrage in the second video; and a target video generating unit configured to delete the original barrage in the second video to obtain a first video and generate a target video according to the first video.
Optionally, before acquiring the barrage parameter corresponding to the original barrage in the second video, the apparatus further includes: an original bullet screen identification unit configured to perform identification of an original bullet screen in the second video.
Optionally, for identifying the original barrage in the second video, the apparatus further includes: an editing system input unit configured to identify the original bullet screen in the second video when the second video is input into a video editing system; the video editing system is used for editing the input video to generate the target video.
Optionally, the bullet screen parameters corresponding to the original bullet screen in the second video include one or more of a bullet screen position, bullet screen content, a bullet screen color, a bullet screen presentation mode, bullet screen appearance time, and bullet screen disappearance time.
Optionally, the apparatus further comprises: a play instruction response unit configured to play the target video in a graphical user interface in response to a video play instruction; and a display instruction response unit configured to display, in response to a bullet screen display instruction, the original bullet screen in the target video according to the bullet screen parameters corresponding to the original bullet screen in the second video.
Optionally, the apparatus further comprises: and a stopping display unit configured to execute stopping displaying the original bullet screen in the target video in response to the bullet screen closing instruction.
Optionally, the display size of the second video is smaller than the display size of the target video, the original bullet screen is displayed in the target video according to the bullet screen parameters corresponding to the original bullet screen in the second video, and the apparatus further includes: a conversion ratio determination unit configured to determine a parameter conversion ratio according to the display size of the second video and the display size of the target video; a mapped barrage parameter acquiring unit configured to convert the barrage parameters into mapped barrage parameters according to the parameter conversion ratio, the mapped barrage parameters being parameters used for displaying the original barrage in the target video in a manner adapted to the display size of the target video; and a mapping display unit configured to display the original bullet screen in the target video according to the mapped bullet screen parameters.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the executable instructions to implement the video generation method as described above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium, wherein instructions, when executed by a processor of an electronic device, enable the electronic device to perform the above-described video generation method.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer program product comprising a computer program/instructions, wherein the computer program/instructions, when executed by a processor, implement the above-described video generation method.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
in a video generation method provided by an embodiment of the present disclosure, a second video may be acquired, wherein the second video comprises an original bullet screen; in response to a bullet screen parameter acquisition instruction, bullet screen parameters corresponding to the original bullet screen in the second video are acquired, the bullet screen parameters being used for indicating related information of the original bullet screen in the second video; the original bullet screen in the second video is deleted to obtain a first video, and a target video is generated according to the first video. On the one hand, the bullet screen parameters of the bullet screen in the video material can be acquired, and the bullet screen can be displayed in the target video according to those parameters, so that the needs of users who want to see the bullet screen in the video material are met, and the users' usage of the video platform is increased; on the other hand, there is no need to produce a separate video for users who want to see the bullet screen in the video material, which avoids consuming manpower and material resources and thus improves video production efficiency.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
Fig. 1 is a schematic diagram of an exemplary system architecture in which a video generation method according to an exemplary embodiment of the present disclosure may be implemented;
FIG. 2 is a flow diagram illustrating a video generation method in accordance with an exemplary embodiment;
fig. 3 is a flowchart illustrating the display of an original bullet screen in a target video according to bullet screen parameters corresponding to the original bullet screen in a second video in response to a bullet screen display instruction according to an exemplary embodiment;
FIG. 4 is a flowchart illustrating the display of an original bullet screen in a target video according to mapped bullet screen parameters, according to an exemplary embodiment;
FIG. 5 is a diagram illustrating the display of an original bullet screen in a target video and the cessation of the display of the original bullet screen in the target video, in accordance with an exemplary embodiment;
fig. 6 is a flowchart illustrating displaying an original bullet screen in a target video according to bullet screen parameters corresponding to the original bullet screen in a second video according to an exemplary embodiment;
FIG. 7 is a schematic diagram illustrating the components of a video generation apparatus in accordance with an exemplary embodiment;
FIG. 8 is a block diagram illustrating an electronic device suitable for use in implementing exemplary embodiments of the present disclosure, according to an exemplary embodiment.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the foregoing drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in other sequences than those illustrated or described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in the form of software, in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 shows a schematic diagram of an exemplary system architecture to which the video generation method of the embodiments of the present disclosure may be applied.
As shown in fig. 1, the system architecture 1000 may include one or more of terminal devices 1001, 1002, 1003, a network 1004 and a server 1005. The network 1004 is a medium used to provide communication links between the terminal devices 1001, 1002, 1003 and the server 1005. Network 1004 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, the server 1005 may be a server cluster composed of a plurality of servers.
A user may use the terminal devices 1001, 1002, 1003 to interact with a server 1005 via a network 1004 to receive or transmit messages or the like. The terminal devices 1001, 1002, 1003 may be various electronic devices having a display screen, including but not limited to smart phones, tablet computers, portable computers, desktop computers, and the like. In addition, the server 1005 may be a server that provides various services.
In one embodiment, the execution subject of the video generation method of the present disclosure may be the server 1005. The server 1005 may acquire a second video sent by the terminal devices 1001, 1002, and 1003, where the second video includes an original barrage; acquire, in response to a barrage parameter acquisition instruction, the barrage parameters corresponding to the original barrage in the second video, where the barrage parameters indicate related information of the original barrage in the second video; delete the original barrage in the second video to obtain a first video; generate a target video according to the first video; and then return the target video to the terminal devices 1001, 1002, and 1003. Alternatively, the video generation method of the present disclosure may be executed by the terminal devices 1001, 1002, 1003, and the like, which likewise acquire the second video including the original bullet screen, acquire the bullet screen parameters corresponding to the original bullet screen in response to a bullet screen parameter acquisition instruction, delete the original bullet screen in the second video to obtain the first video, and generate the target video according to the first video.
In addition, the implementation process of the video generation method of the present disclosure may also be implemented by the terminal devices 1001, 1002, 1003 and the server 1005 together. For example, the terminal devices 1001, 1002, and 1003 may acquire the second video, and acquire, in response to the barrage parameter acquisition instruction, a barrage parameter corresponding to an original barrage in the second video, where the barrage parameter is used to indicate related information of the original barrage in the second video, and the server 1005 may delete the original barrage in the second video to obtain the first video, and generate the target video according to the first video.
With the rapid development of the internet, various video platforms have emerged. On some video platforms, a video creator may produce a video and upload it for platform users to watch. When producing a video, the creator may draw on other video material, and that material may carry a bullet screen. Some users do not want to see the bullet screen in the video material, while other users feel that the bullet screen adds to the entertainment effect and do want to see it.
However, in the related art, only videos in which the bullet screen of the video material is not displayed are usually produced, so the needs of users who want to see the bullet screen in the video material cannot be met; and if videos were produced separately for such users, a large amount of manpower and material resources would be consumed, resulting in poor video production efficiency.
Fig. 2 is a flowchart illustrating a video generation method according to an exemplary embodiment; as shown in fig. 2, the method includes the following steps.
In step S210, a second video is acquired; wherein the second video comprises an original bullet screen;
in step S220, in response to the barrage parameter obtaining instruction, obtaining barrage parameters corresponding to the original barrage in the second video; the barrage parameter is used for indicating related information of an original barrage in the second video;
in step S230, the original barrage in the second video is deleted to obtain the first video, and the target video is generated according to the first video.
In a video generation method provided by an embodiment of the present disclosure, a second video may be acquired, wherein the second video comprises an original bullet screen; in response to a bullet screen parameter acquisition instruction, bullet screen parameters corresponding to the original bullet screen in the second video are acquired, the bullet screen parameters being used for indicating related information of the original bullet screen in the second video; the original bullet screen in the second video is deleted to obtain a first video, and a target video is generated according to the first video. On the one hand, the bullet screen parameters of the bullet screen in the video material can be acquired, and the bullet screen can be displayed in the target video according to those parameters, so that the needs of users who want to see the bullet screen in the video material are met, and the users' usage of the video platform is increased; on the other hand, there is no need to produce a separate video for users who want to see the bullet screen in the video material, which avoids consuming manpower and material resources and thus improves video production efficiency.
Next, the steps S210 to S230 of the video generation method in the present exemplary embodiment will be described in more detail with reference to fig. 2 and the embodiment.
Step S210, acquiring a second video; wherein the second video comprises an original bullet screen;
in an example embodiment of the present disclosure, a second video may be acquired. Wherein, the second video comprises the original barrage. Specifically, the second video includes an original barrage, where the barrage refers to a commentary subtitle popped up when the video is viewed on the network. For example, the second video is a live video, and when the live video is recorded, a barrage on a live picture is recorded at the same time, and at this time, the second video includes an original barrage; for another example, videos provided in some video platforms may be watched and simultaneously sent as a bullet screen, and at this time, the videos may be recorded, and during recording, the bullet screen displayed on the videos may be recorded at the same time.
It should be noted that the present disclosure does not specifically limit the form of the second video.
Step S220, responding to a bullet screen parameter obtaining instruction, and obtaining bullet screen parameters corresponding to an original bullet screen in a second video;
in an example embodiment of the present disclosure, after the second video is acquired through the above steps, the barrage parameter corresponding to the original barrage in the second video may be acquired in response to the barrage parameter acquiring instruction. And the bullet screen parameter is used for indicating the related information of the original bullet screen in the second video. Specifically, the bullet screen parameter acquiring instruction may be used to acquire bullet screen parameters corresponding to the original bullet screen in the second video.
For example, the related information of the original barrage may include a barrage position of the original barrage displayed in the second video, a content of the barrage when the original barrage is displayed in the second video, a color of the barrage displayed in the second video, a barrage presenting manner of the original barrage in the second video, an appearance time of the barrage in the second video, and a barrage disappearance time of the original barrage in the second video.
It should be noted that, the present disclosure does not make any special limitation on the specific type of the bullet screen parameter corresponding to the original bullet screen in the second video.
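For illustration only, the bullet screen parameters described above could be organised as a simple record. The minimal Python sketch below is an assumption of this description, not part of the disclosed method; the field names and types are illustrative.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class BulletScreenParams:
    """Illustrative container for the bullet screen parameters of one original bullet screen."""
    content: str                    # bullet screen content (the comment text)
    color: str                      # bullet screen color, e.g. "#FFFF00"
    position: Tuple[float, float]   # bullet screen position in the second video's display coordinates
    presentation: str               # bullet screen presentation mode, e.g. "scroll_left_to_right"
    appear_time: float              # bullet screen appearance time, in seconds from the start of the second video
    disappear_time: float           # bullet screen disappearance time, in seconds from the start of the second video
```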
The bullet screen parameter acquisition instruction can be triggered by a bullet screen parameter acquisition operation. For example, a bullet screen parameter acquisition control can be provided in the graphical user interface, and when a bullet screen parameter acquisition operation on that control is received, the bullet screen parameters corresponding to the original bullet screen in the second video can be acquired. The bullet screen parameter acquisition operation can be a key operation, a touch operation, a voice operation, or the like. It should be noted that the present disclosure does not limit the specific form of the bullet screen parameter acquisition operation.
Specifically, a bullet screen parameter acquisition model can be constructed, the bullet screen parameter acquisition model can be used for acquiring original bullet screens in the second video and bullet screen parameters of the original bullet screens, the bullet screen parameter acquisition model is a model established for completing a bullet screen parameter acquisition task, and the bullet screen parameter acquisition model can be obtained by training a basic model so as to complete the bullet screen parameter acquisition task. It should be noted that, the specific structure of the bullet screen parameter obtaining model is not particularly limited in this disclosure.
In an example embodiment of the present disclosure, after the barrage parameter acquiring instruction and the second video are received through the above steps, the second video may be input into the barrage parameter acquiring model, and the barrage parameter corresponding to the original barrage in the second video is obtained through the barrage parameter acquiring model.
It should be noted that, in the present disclosure, a specific manner of obtaining the barrage parameter corresponding to the original barrage in the second video is not particularly limited.
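Although the disclosure leaves the structure of the bullet screen parameter acquisition model open, one possible sketch, offered purely as an assumption, is to run a text detector over sampled frames and track each detected bullet screen across frames to estimate its appearance time, disappearance time, position, and presentation mode. The detect_text function below is a hypothetical stand-in for such a detector, and BulletScreenParams reuses the illustrative container from the earlier sketch.

```python
def acquire_bullet_screen_params(frames, fps, detect_text):
    """Sketch: estimate bullet screen parameters by tracking per-frame text detections.

    `frames` is a sequence of decoded frames of the second video, `fps` its frame rate,
    and `detect_text(frame)` a hypothetical model returning [(text, (x, y), color), ...].
    """
    tracks = {}
    for idx, frame in enumerate(frames):
        for text, pos, color in detect_text(frame):
            track = tracks.setdefault(text, {"first": idx, "positions": [], "color": color})
            track["last"] = idx
            track["positions"].append(pos)

    params = []
    for text, track in tracks.items():
        xs = [x for x, _ in track["positions"]]
        params.append(BulletScreenParams(
            content=text,
            color=track["color"],
            position=track["positions"][0],
            # crude heuristic: a decreasing x coordinate suggests right-to-left scrolling
            presentation="scroll_right_to_left" if xs[-1] < xs[0] else "static",
            appear_time=track["first"] / fps,
            disappear_time=track["last"] / fps,
        ))
    return params
```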
In an example embodiment of the present disclosure, before acquiring the bullet screen parameters corresponding to the original bullet screen in the second video, the original bullet screen in the second video may be identified. Specifically, after the second video is acquired through the steps, the second video is detected to identify the original bullet screen in the second video. Specifically, a bullet screen recognition model can be constructed, the bullet screen recognition model can be used for detecting an original bullet screen in the second video, the bullet screen recognition model refers to a model established for completing a bullet screen recognition task, and the bullet screen recognition model can be obtained by training a basic model so as to complete the bullet screen recognition task. It should be noted that, the specific structure of the bullet screen recognition model is not particularly limited in this disclosure.
In an example embodiment of the present disclosure, the target video may be generated in a video editing system, and the original barrage in the second video is identified when the second video is input into the video editing system. The video editing system is used for editing the input video to generate the target video. Specifically, the video editing system is a system for performing nonlinear editing on a video source: it remixes the video source with added materials such as pictures, background music, special effects and scenes, cuts and merges the video source, and generates, through secondary encoding, new videos with different expressiveness. The video editing system may include video editing software.
Specifically, after the second video is obtained through the above steps, the second video may be input to a video editing system as a video source, and when the second video is input to the video editing system, the second video is detected to identify an original bullet screen in the second video.
It should be noted that, the present disclosure is not limited to a specific manner of identifying the original barrage in the second video and a specific type of the video editing system.
And step S230, deleting the original barrage in the second video to obtain a first video, and generating a target video according to the first video.
In an example embodiment of the present disclosure, after the second video is obtained through the above steps and the barrage parameter corresponding to the original barrage in the second video is obtained, the original barrage in the second video may be deleted to obtain the first video, and the target video is generated according to the first video. Specifically, the original barrage in the second video can be deleted through the barrage deletion model to obtain the first video.
Specifically, a bullet screen deletion model can be constructed, the bullet screen deletion model can be used for deleting an original bullet screen in a second video to obtain a first video, the bullet screen deletion model is a model established for completing a bullet screen deletion task, and the bullet screen deletion model can be obtained by training a basic model to complete the bullet screen deletion task. It should be noted that, the specific structure of the bullet screen deletion model is not particularly limited in this disclosure.
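The bullet screen deletion step could, for example, also be realised with a conventional inpainting routine rather than a trained model. The sketch below is an assumption of this description: it supposes the bullet screen regions of a frame are already known as pixel rectangles and uses OpenCV's inpainting to fill them from the surrounding picture.

```python
import cv2
import numpy as np

def remove_bullet_screens(frame, text_boxes, radius=3):
    """Sketch: erase bullet screen regions from one frame by masking and inpainting.

    `frame` is a BGR image (numpy array) and `text_boxes` a list of (x, y, w, h)
    pixel rectangles covering the detected original bullet screens.
    """
    mask = np.zeros(frame.shape[:2], dtype=np.uint8)
    for x, y, w, h in text_boxes:
        mask[y:y + h, x:x + w] = 255  # mark the bullet screen area to be reconstructed
    # fill the masked area from neighbouring pixels (Telea's method)
    return cv2.inpaint(frame, mask, radius, cv2.INPAINT_TELEA)
```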
In an example embodiment of the present disclosure, after the first video is obtained through the above steps, the target video may be generated from the first video. Specifically, an editing operation may be performed with respect to the first video. For example, special effects may be added, text may be added, logos may be added, background music may be added, etc.
It should be noted that the first video may occupy the entire duration of the target video, or the first video may occupy a portion of the duration of the target video.
Note that the present disclosure is not particularly limited to a specific manner of generating the target video from the first video.
In an example embodiment of the present disclosure, a target video may be played in a graphical user interface in response to a video playing instruction, and an original bullet screen may be displayed in the target video according to bullet screen parameters corresponding to the original bullet screen in the second video in response to a bullet screen displaying instruction. Referring to fig. 3, in response to a bullet screen display instruction, displaying an original bullet screen in a target video according to bullet screen parameters corresponding to the original bullet screen in a second video, which may include the following steps S310 to S320:
step S310, responding to a video playing instruction, and playing a target video in a graphical user interface; the target video comprises a first video, and the first video is obtained by deleting an original bullet screen in the second video;
in an example embodiment of the present disclosure, a target video may be played in a graphical user interface in response to a video play instruction. The target video comprises a first video, and the first video is obtained by deleting an original bullet screen in the second video. Specifically, the target video includes a first video, which means that the first video is added to the target video as a video material when the target video is edited.
The display duration of the first video may be the same as the display duration of the target video, or the display duration of the first video may be shorter than the display duration of the target video; in addition, the display size of the first video may be smaller than the display size of the target video, or the display size of the first video may be equal to the display size of the target video.
Further, the first video may be a partial segment of the video material, or may be a full segment of the video material.
In an example embodiment of the present disclosure, the first video is a video obtained by deleting an original bullet screen in the second video. Specifically, the second video includes an original bullet screen, where the bullet screen refers to a comment subtitle popped up when the video is viewed on the network. For example, the second video is a live video, and when the live video is recorded, a barrage on a live picture is recorded at the same time, and at this time, the second video includes an original barrage; for another example, videos provided in some video platforms may be watched and simultaneously a bullet screen may be sent, and at this time, the videos may be recorded, and at the time of recording, the bullet screen displayed on the videos may also be recorded at the same time.
It should be noted that the present disclosure does not specifically limit the form of the second video.
In an example embodiment of the present disclosure, after obtaining the second video, the original barrage in the second video may be deleted to obtain the first video. Specifically, the second video may be processed through a video processing algorithm, and the original bullet screen therein is deleted. For example, the original barrages in the second video may be identified and deleted.
It should be noted that, the manner of deleting the original barrage in the second video is not particularly limited in the present disclosure.
In an example embodiment of the present disclosure, a target video may be played in a graphical user interface in response to a video play instruction. Specifically, the video playing instruction may be used to play the target video. The video playing instruction may include playing the target video from an initial point of the target video, or playing the target video from an intermediate point of the target video. It should be noted that, the playing position of the target video is not particularly limited in the present disclosure.
Specifically, the video playing instruction may be triggered by a video playing operation. For example, a video play control may be provided in the graphical user interface, and when a video play operation is received for the video play control, the target video may be played. The video playing operation may be a key operation, a touch operation, a voice operation, or the like. It should be noted that, the present disclosure is not limited to a specific form of the video playing operation.
Step S320, responding to the bullet screen display instruction, and displaying an original bullet screen in the target video according to the bullet screen parameters corresponding to the original bullet screen in the second video; and the bullet screen parameter is used for indicating the related information of the original bullet screen in the second video.
In an example embodiment of the present disclosure, after the target video is played in the graphical user interface through the above steps, a bullet screen display instruction may be received, and the original bullet screen is displayed in the target video according to the bullet screen parameters corresponding to the original bullet screen in the second video. And the bullet screen parameter is used for indicating the related information of the original bullet screen in the second video. Specifically, the related information of the original barrage in the second video refers to related information when the original barrage is displayed in the second video. For example, the position of the bullet screen displayed by the original bullet screen in the second video, the content of the bullet screen when the original bullet screen is displayed in the second video, the color of the bullet screen displayed by the original bullet screen in the second video, the presentation mode of the bullet screen of the original bullet screen in the second video, the appearance time of the bullet screen of the original bullet screen in the second video, and the disappearance time of the bullet screen of the original bullet screen in the second video.
For example, an original bullet screen in the second video may have the following bullet screen parameters: bullet screen content: "this is a wonderful game"; bullet screen color: yellow; bullet screen appearance time: 00; bullet screen disappearance time: 36; bullet screen appearance position: (0.1, 0.7); bullet screen presentation mode: from left to right.
It should be noted that the disclosure is not limited to specific types of bullet screen parameters.
In an example embodiment of the present disclosure, in response to a bullet screen display instruction, an original bullet screen is displayed in a target video according to bullet screen parameters corresponding to the original bullet screen in a second video. Specifically, the bullet screen display instruction can be triggered through bullet screen display operation. For example, a bullet screen display control may be provided in the graphical user interface, and when a bullet screen display operation for the bullet screen display control is received, the original bullet screen may be displayed in the target video. The bullet screen display operation can be a key operation, a touch operation or a voice operation. It should be noted that the present disclosure is not limited to the specific form of the bullet screen display operation.
Alternatively, the bullet screen display command may be triggered by the system. For example, after the target video is played in the graphical user interface through the above steps, the bullet screen display instruction may be triggered, and the original bullet screen is displayed in the target video according to the bullet screen parameters corresponding to the original bullet screen in the second video.
Note that the form of the bullet screen display command is not particularly limited in the present disclosure.
Further, a user may send a first bullet screen for the target video. When the bullet screen display instruction is received and the original bullet screen is displayed in the target video according to the bullet screen parameters corresponding to the original bullet screen in the second video, the first bullet screen and the original bullet screen may be displayed in the target video at the same time, or only the original bullet screen may be displayed in the target video.
It should be noted that, the display priority of the first bullet screen and the original bullet screen is not particularly limited in this disclosure.
Through the steps S310-S320, the target video can be played in the graphical user interface in response to the video playing instruction, and the original bullet screen is displayed in the target video according to the bullet screen parameters corresponding to the original bullet screen in the second video in response to the bullet screen displaying instruction.
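As an illustration of steps S310 to S320, a player might filter the stored bullet screen parameters by the current playback time and only draw them while the bullet screen switch is on. The sketch below assumes such a setup, with draw_text standing in for whatever rendering call the player actually provides.

```python
def visible_bullet_screens(params, playback_time, show_bullet_screens):
    """Return the original bullet screens that should be shown at `playback_time` (seconds)."""
    if not show_bullet_screens:
        return []
    return [p for p in params if p.appear_time <= playback_time <= p.disappear_time]

def render_frame(frame, params, playback_time, show_bullet_screens, draw_text):
    """Sketch: overlay every currently visible original bullet screen on the frame.

    `draw_text(frame, text, position, color)` is a hypothetical rendering helper.
    """
    for p in visible_bullet_screens(params, playback_time, show_bullet_screens):
        draw_text(frame, p.content, p.position, p.color)
    return frame
```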
In an example embodiment of the present disclosure, after the original bullet screen is displayed in the target video according to the bullet screen parameter corresponding to the original bullet screen in the second video through the above steps, the original bullet screen may be stopped from being displayed in the target video in response to the bullet screen closing instruction. Specifically, the bullet screen closing command can be used to stop displaying the original bullet screen. For example, a bullet screen closing control may be provided in the graphical user interface, and when a bullet screen closing operation for the bullet screen closing control is received, the original bullet screen may be stopped from being displayed in the target video. The bullet screen closing operation can be a key operation, a touch operation or a voice operation. It should be noted that the present disclosure is not limited to a specific form of the bullet screen closing operation.
Furthermore, the graphic user interface may include a bullet screen control, the bullet screen control may be configured to trigger the bullet screen display instruction and the bullet screen closing instruction, when the bullet screen display instruction is triggered, the original bullet screen may be displayed in the target video according to the bullet screen parameter corresponding to the original bullet screen in the second video, and when the bullet screen closing instruction is triggered, the original bullet screen is stopped being displayed in the target video.
It should be noted that, the specific form of the bullet screen control is not particularly limited in the present disclosure.
Further, stopping the display of the original bullet screen in the target video after the bullet screen closing instruction is received may mean that the original bullet screens currently shown on the video interface finish displaying and no further original bullet screens are displayed.
In an example embodiment of the present disclosure, a parameter conversion ratio may be determined according to a display size of the second video and a display size of the target video, the barrage parameter may be converted into a mapped barrage parameter according to the parameter conversion ratio, and the original barrage may be displayed in the target video according to the mapped barrage parameter. Referring to fig. 4, displaying an original bullet screen in a target video according to the mapped bullet screen parameters may include the following steps S410 to S430:
step S410, determining a parameter conversion ratio according to the display size of the second video and the display size of the target video;
step S420, converting the barrage parameters into mapped barrage parameters according to the parameter conversion ratio; the mapped barrage parameters are parameters used for displaying the original barrage in the target video in a manner adapted to the display size of the target video;
and step S430, displaying the original bullet screen in the target video according to the mapping bullet screen parameters.
In an example embodiment of the present disclosure, the parameter conversion ratio may be determined according to the display size of the second video and the display size of the target video. Specifically, the display size of a video refers to the size at which the video is displayed on the display module of the terminal device; when the display size of the second video is smaller than the display size of the target video, the parameter conversion ratio may be determined from the two display sizes. The parameter conversion ratio is used for converting the barrage parameters into mapped barrage parameters, and the mapped barrage parameters are parameters used for displaying the original barrage in the target video in a manner adapted to the display size of the target video.
Specifically, the parameter of displaying the original barrage according to the display size of the target video means that the original barrage is displayed in a display size matched with the display size of the target video.
For example, if the display size of the target video is 4 × 4 and the display size of the second video is 2 × 2, the parameter conversion ratio between the barrage parameters and the mapped barrage parameters is 1:2; that is, the size-related barrage parameters are scaled up by a factor of 2 when the original barrage is displayed in the target video.
It should be noted that, the present disclosure is not limited to a specific manner of determining the parameter conversion ratio according to the display size of the second video and the display size of the target video.
Through the steps S410 to S430, a parameter conversion ratio can be determined according to the display size of the second video and the display size of the target video, the barrage parameter is converted into a mapping barrage parameter according to the parameter conversion ratio, and the original barrage is displayed in the target video according to the mapping barrage parameter.
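A worked sketch of this mapping under the example above: a 2 × 2 second video placed in a 4 × 4 target video gives a conversion factor of 2, by which the position-related bullet screen parameters are scaled while the timing parameters stay unchanged. The proportional mapping and the reuse of the illustrative BulletScreenParams container are assumptions of this description.

```python
def map_bullet_screen_params(params, second_size, target_size):
    """Sketch: scale position-related bullet screen parameters to the target video's display size."""
    ratio = target_size[0] / second_size[0]  # e.g. 4 / 2 = 2 for the example in the text
    return [
        BulletScreenParams(
            content=p.content,
            color=p.color,
            position=(p.position[0] * ratio, p.position[1] * ratio),
            presentation=p.presentation,
            appear_time=p.appear_time,        # timing is unaffected by display size
            disappear_time=p.disappear_time,
        )
        for p in params
    ]
```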
In an example embodiment of the present disclosure, as shown in fig. 5 (a), a target video may be played in a graphical user interface 501, and in response to a bullet screen display instruction of a bullet screen control 502, an original bullet screen 503 is displayed in the target video according to a bullet screen parameter corresponding to the original bullet screen in the second video; as shown in fig. 5 (b), in response to the bullet screen closing instruction for the bullet screen control 502, the original bullet screen stops being displayed in the target video.
In an example embodiment of the present disclosure, the second video may be input into a video editing system as a video source. After the second video is input into the video editing system, the original bullet screen in the second video is identified; in response to a bullet screen parameter acquisition instruction, the bullet screen parameters corresponding to the original bullet screen in the second video are acquired; the original bullet screen in the second video is deleted to obtain a first video, and a target video is generated according to the first video; and the bullet screen parameters corresponding to the original bullet screen in the second video are stored in a bullet screen server. When the target video is played in the graphical user interface, the bullet screen parameters corresponding to the original bullet screen in the second video may be acquired from the bullet screen server in response to a bullet screen display instruction, and the original bullet screen is displayed in the target video according to those parameters; in addition, the display of the original bullet screen in the target video may be stopped in response to a bullet screen closing instruction. Referring to fig. 6, displaying the original bullet screen in the target video according to the bullet screen parameters corresponding to the original bullet screen in the second video may include the following steps S601 to S608:
step S601, inputting a second video serving as a video source into a video editing system; step S602, identifying an original bullet screen in a second video; step S603, receiving a bullet screen parameter acquisition instruction; step S604, acquiring bullet screen parameters corresponding to the original bullet screen in the second video; step S605, deleting the original barrage in the second video to obtain a first video, and generating a target video according to the first video; step S606, storing bullet screen parameters corresponding to the original bullet screen in the second video in a bullet screen server; step S607, when the target video is played in the graphical user interface, responding to the bullet screen display instruction, acquiring bullet screen parameters corresponding to the original bullet screen in the second video in the bullet screen server, and displaying the original bullet screen in the target video according to the bullet screen parameters corresponding to the original bullet screen in the second video; step S608, in response to the bullet screen closing instruction, stops displaying the original bullet screen in the target video.
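The flow of steps S601 to S608 could be tied together roughly as below. The editor, player, and barrage_server objects and their methods are hypothetical placeholders for the video editing system, the playback client, and the bullet screen server; they are not interfaces defined by this disclosure.

```python
def build_target_video(second_video, editor, barrage_server):
    """Sketch of steps S601-S606: produce the target video and archive the original bullet screens."""
    editor.load(second_video)                               # S601: input the second video as a video source
    original = editor.identify_bullet_screens()             # S602: identify the original bullet screens
    params = editor.acquire_bullet_screen_params(original)  # S603/S604: acquire their parameters
    first_video = editor.delete_bullet_screens(original)    # S605: delete them to obtain the first video
    target_video = editor.compose(first_video)              # S605: generate the target video
    barrage_server.store(target_video.id, params)           # S606: store the parameters on the bullet screen server
    return target_video

def play_target_video(player, target_video, barrage_server):
    """Sketch of steps S607-S608: replay the target video with the original bullet screens toggled on or off."""
    params = barrage_server.fetch(target_video.id)          # S607: fetch the stored parameters
    player.play(target_video)
    player.on_bullet_screen_toggle(                         # S607/S608: show or hide the original bullet screens
        lambda enabled: player.set_overlay(params if enabled else []))
```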
In a video generation method provided by an embodiment of the present disclosure, a second video may be acquired, wherein the second video comprises an original bullet screen; in response to a bullet screen parameter acquisition instruction, bullet screen parameters corresponding to the original bullet screen in the second video are acquired, the bullet screen parameters being used for indicating related information of the original bullet screen in the second video; the original bullet screen in the second video is deleted to obtain a first video, and a target video is generated according to the first video. On the one hand, the bullet screen parameters of the bullet screen in the video material can be acquired, and the bullet screen can be displayed in the target video according to those parameters, so that the needs of users who want to see the bullet screen in the video material are met, and the users' usage of the video platform is increased; on the other hand, there is no need to produce a separate video for users who want to see the bullet screen in the video material, which avoids consuming manpower and material resources and thus improves video production efficiency.
Fig. 7 is a block diagram illustrating a video generation apparatus according to an example embodiment. Referring to fig. 7, the video generating apparatus 700 includes a second video acquiring unit 710, an acquisition instruction responding unit 720, and a target video generating unit 730.
Wherein the second video acquisition unit is configured to acquire a second video, wherein the second video comprises an original bullet screen; the acquisition instruction response unit is configured to acquire, in response to a bullet screen parameter acquisition instruction, the bullet screen parameters corresponding to the original bullet screen in the second video, the barrage parameters being used for indicating related information of the original barrage in the second video; and the target video generating unit is configured to delete the original barrage in the second video to obtain a first video and generate a target video according to the first video.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, before obtaining the barrage parameter corresponding to the original barrage in the second video, the apparatus further includes: an original bullet screen identification unit configured to perform identification of an original bullet screen in the second video.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, for identifying the original barrage in the second video, the apparatus further includes: an editing system input unit configured to identify the original bullet screen in the second video when the second video is input into the video editing system; the video editing system is used for editing the input video to generate the target video.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the bullet screen parameters corresponding to the original bullet screen in the second video include one or more of a bullet screen position, a bullet screen content, a bullet screen color, a bullet screen presentation mode, a bullet screen appearance time, and a bullet screen disappearance time.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the apparatus further includes: a play instruction response unit configured to play the target video in the graphical user interface in response to the video play instruction; and a display instruction response unit configured to display, in response to the bullet screen display instruction, the original bullet screen in the target video according to the bullet screen parameters corresponding to the original bullet screen in the second video.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the apparatus further includes: and a stopping display unit configured to execute stopping displaying the original bullet screen in the target video in response to the bullet screen closing instruction.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the display size of the second video is smaller than the display size of the target video, the original barrage is displayed in the target video according to the barrage parameters corresponding to the original barrage in the second video, and the apparatus further includes: a conversion ratio determination unit configured to determine a parameter conversion ratio according to the display size of the second video and the display size of the target video; a mapped barrage parameter acquiring unit configured to convert the barrage parameters into mapped barrage parameters according to the parameter conversion ratio, the mapped barrage parameters being parameters used for displaying the original barrage in the target video in a manner adapted to the display size of the target video; and a mapping display unit configured to display the original bullet screen in the target video according to the mapped bullet screen parameters.
With regard to the video generation apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
An electronic device 800 according to such an embodiment of the disclosure is described below with reference to fig. 8. The electronic device 800 shown in fig. 8 is only an example and should not bring any limitations to the functionality and scope of use of the embodiments of the present disclosure.
As shown in fig. 8, electronic device 800 is in the form of a general purpose computing device. The components of the electronic device 800 may include, but are not limited to: the at least one processing unit 810, the at least one memory unit 820, a bus 830 connecting different system components (including the memory unit 820 and the processing unit 810), and a display unit 840.
Where the memory unit stores program code, the program code may be executed by the processing unit 810 to cause the processing unit 810 to perform steps according to various exemplary embodiments of the present disclosure as described in the "exemplary methods" section above in this specification. For example, the processing unit 810 may execute step S210 as shown in fig. 2, acquiring a second video; wherein the second video comprises an original bullet screen; step S220, responding to a bullet screen parameter obtaining instruction, and obtaining bullet screen parameters corresponding to an original bullet screen in a second video; the barrage parameter is used for indicating related information of an original barrage in the second video; and step S230, deleting the original barrage in the second video to obtain a first video, and generating a target video according to the first video.
The storage unit 820 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 821 and/or a cache memory unit 822, and may further include a read only memory unit (ROM) 823.
Storage unit 820 may also include a program/utility 824 having a set (at least one) of program modules 825, such program modules 825 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 830 may be any of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 800 may also communicate with one or more external devices 870 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 800, and/or with any device (e.g., router, modem, etc.) that enables the electronic device 800 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 850. Also, the electronic device 800 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 860. As shown, the network adapter 860 communicates with the other modules of the electronic device 800 via the bus 830. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 800, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, or the like) or on a network, and which includes several instructions for enabling a computing device (which may be a personal computer, a server, a terminal device, a network device, or the like) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment, a computer-readable storage medium comprising instructions, such as a memory comprising instructions, executable by a processor of an apparatus to perform the above-described method is also provided. Alternatively, the computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, there is also provided a computer program product comprising a computer program/instructions which, when executed by a processor, implement the video generation method in the above-described embodiments.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements that have been described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (11)

1. A method of video generation, comprising:
acquiring a second video; wherein the second video comprises an original bullet screen;
responding to a bullet screen parameter acquisition instruction, and acquiring bullet screen parameters corresponding to an original bullet screen in the second video; wherein the barrage parameter is used for indicating related information of an original barrage in the second video;
and deleting the original bullet screen in the second video to obtain a first video, and generating a target video according to the first video.
2. The video generation method according to claim 1, wherein before the obtaining of the barrage parameter corresponding to the original barrage in the second video, the method further comprises:
identifying an original bullet screen in the second video.
3. The method of claim 2, wherein the identifying the original barrage in the second video comprises:
when the second video is input into a video editing system, identifying the original bullet screen in the second video; wherein the video editing system is used for editing an input video to generate the target video.
4. The video generation method according to claim 1, wherein the barrage parameters corresponding to the original barrage in the second video include one or more of a barrage position, a barrage content, a barrage color, a barrage presentation mode, a barrage appearance time, and a barrage disappearance time.
5. The video generation method of claim 1, wherein the method further comprises:
responding to a video playing instruction, and playing the target video in a graphical user interface;
and responding to a bullet screen display instruction, and displaying the original bullet screen in the target video according to the bullet screen parameters corresponding to the original bullet screen in the second video.
6. The video generation method of claim 5, wherein the method further comprises:
and responding to a bullet screen closing instruction, and stopping displaying the original bullet screen in the target video.
7. The method according to claim 5, wherein a display size of the second video is smaller than a display size of the target video, and the displaying of the original barrage in the target video according to the barrage parameter corresponding to the original barrage in the second video comprises:
determining a parameter conversion ratio according to the display size of the second video and the display size of the target video;
converting the barrage parameters into mapping barrage parameters according to the parameter conversion ratio; wherein the mapping barrage parameters are parameters used for displaying the original barrage in the target video in a manner adapted to the display size of the target video;
and displaying the original bullet screen in the target video according to the mapping bullet screen parameters.
8. A video generation apparatus, comprising:
a second video acquisition unit configured to perform acquisition of a second video; wherein the second video comprises an original bullet screen;
an acquisition instruction response unit configured to acquire, in response to a bullet screen parameter acquisition instruction, a bullet screen parameter corresponding to an original bullet screen in the second video; wherein the barrage parameter is used for indicating related information of the original barrage in the second video;
and a target video generation unit configured to delete the original barrage in the second video to obtain a first video, and generate a target video according to the first video.
9. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the executable instructions to implement the video generation method of any of claims 1 to 7.
10. A computer-readable storage medium whose instructions, when executed by a processor of an electronic device, enable the electronic device to perform the video generation method of any of claims 1 to 7.
11. A computer program product comprising computer programs/instructions, characterized in that the computer programs/instructions, when executed by a processor, implement the video generation method of any of claims 1 to 7.
CN202211099685.0A 2022-09-09 2022-09-09 Video generation method, video generation device, electronic apparatus, medium, and product Pending CN115190369A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211099685.0A CN115190369A (en) 2022-09-09 2022-09-09 Video generation method, video generation device, electronic apparatus, medium, and product

Publications (1)

Publication Number Publication Date
CN115190369A true CN115190369A (en) 2022-10-14

Family

ID=83524725

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211099685.0A Pending CN115190369A (en) 2022-09-09 2022-09-09 Video generation method, video generation device, electronic apparatus, medium, and product

Country Status (1)

Country Link
CN (1) CN115190369A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170048533A1 (en) * 2015-08-12 2017-02-16 Le Holdings (Beijing) Co., Ltd. Video transcoding method and device
CN105979288A (en) * 2016-06-17 2016-09-28 乐视控股(北京)有限公司 Video interception method and device
CN107743264A (en) * 2017-10-26 2018-02-27 上海哔哩哔哩科技有限公司 Video recording method and equipment
CN110708571A (en) * 2019-10-18 2020-01-17 腾讯科技(深圳)有限公司 Video clip playing control method and related product
CN111310757A (en) * 2020-02-07 2020-06-19 北方工业大学 Video bullet screen detection and identification method and device
CN114449334A (en) * 2022-01-29 2022-05-06 上海哔哩哔哩科技有限公司 Video recording method, video recording device, electronic equipment and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20221014