CN116193195A - Video processing method, device, processing equipment and storage medium - Google Patents

Video processing method, device, processing equipment and storage medium Download PDF

Info

Publication number
CN116193195A
CN116193195A CN202310173881.6A
Authority
CN
China
Prior art keywords
video
editing
target version
bullet screen
processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310173881.6A
Other languages
Chinese (zh)
Inventor
崔超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing QIYI Century Science and Technology Co Ltd
Original Assignee
Beijing QIYI Century Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing QIYI Century Science and Technology Co Ltd filed Critical Beijing QIYI Century Science and Technology Co Ltd
Priority to CN202310173881.6A priority Critical patent/CN116193195A/en
Publication of CN116193195A publication Critical patent/CN116193195A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44008Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • H04N21/8456Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Television Signal Processing For Recording (AREA)
  • Management Or Editing Of Information On Record Carriers (AREA)

Abstract

The embodiment of the invention relates to a video processing method, a device, processing equipment and a storage medium, wherein the method comprises the following steps: acquiring an editing record of a target version video, wherein the editing record comprises an editing time point and an editing duration; acquiring bullet screen information corresponding to the target version video; and performing realignment processing on the bullet screen information in the target version video by using the editing time point and the editing duration to obtain the updated target version video; thereby achieving the technical effect of aligning the bullet screen with the video.

Description

Video processing method, device, processing equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of video processing, in particular to a video processing method, a device, processing equipment and a storage medium.
Background
With the continuous updating of video applications (APPs), more and more platforms on which video can be watched are provided. In recent years, when a user watches an online video, the bullet screen (danmaku) function is often enabled, and the user's experience is improved by allowing the user to post comments on what they are watching while they watch. A user can send bullet screens in real time according to the plot being watched, and other users watching the same scene can see the bullet screen information posted by others about that plot, which helps them better understand the story.
Typically, bullet screens are stored and indexed by video ID and time point. However, a video may inevitably be edited for various reasons during its run, including operations such as inserting advertisements or deleting video clips. Such edits change the time points of certain scenes in the new version, so when bullet screens from a historical version are played on the new version, they become detached from the corresponding scenes, appearing ahead of or lagging behind the plot and degrading the user's experience.
Therefore, how to keep the bullet screen aligned with the video has become an urgent problem to be solved.
Disclosure of Invention
In view of this, in order to solve the technical problem of alignment of the bullet screen and the video, embodiments of the present invention provide a video processing method, device, processing apparatus and storage medium.
In a first aspect, an embodiment of the present invention provides a method for processing a video, including:
acquiring an editing record of a target version video, wherein the editing record comprises an editing time point and an editing duration;
acquiring bullet screen information corresponding to the target version video;
and executing realignment processing on the barrage information in the target version video by utilizing the editing time point and the editing time length to obtain the updated target version video.
In one possible implementation, the types of the edit records include a delete class and an insert class;
wherein when the type of the edit record is the deletion type, the edit record includes: deleting the video starting time and deleting the video duration;
when the type of the edit record is the insertion type, the edit record includes: an insert video start time and an insert video duration.
In one possible implementation manner, after the editing record of the target version video is obtained, the method further includes:
acquiring a target version number of the target version video;
and carrying out association processing on the editing record and the target version number.
In one possible embodiment, the method further comprises:
acquiring receiving information of the barrage information, wherein the receiving information comprises a sender identifier and a sending moment of the barrage information;
and binding the received information with the target version video by using the target version number.
In one possible embodiment, the method further comprises:
when the type of the editing record is the deletion type, determining whether the bullet screen information is deleted or not according to the start time of the deleted video and the deleted video duration;
the executing the realignment processing on the barrage information in the target version video by using the editing time point and the editing time length includes:
and when the bullet screen information is not deleted, performing forward moving processing on the display of the bullet screen information after the start time of deleting the video according to the deleting video duration.
In one possible implementation manner, the performing the realignment process on the bullet screen information in the target version video by using the editing time point and the editing duration further includes:
and when the bullet screen information is deleted, determining the replacement time of the deleted bullet screen information from the edited target version video, displaying the deleted bullet screen information at the replacement time, and performing forward moving processing on the display of the bullet screen information after the start time of the deleted video according to the deleted video duration.
In one possible embodiment, the method further comprises:
and when the type of the editing record is the insertion type, performing backward movement processing on the display of the bullet screen information after the start time of the insertion video according to the duration of the insertion video.
In a second aspect, an embodiment of the present invention provides a video processing apparatus, including:
the acquisition module is used for acquiring an editing record of the target version video, wherein the editing record comprises an editing time point and an editing duration;
the acquisition module is also used for acquiring bullet screen information corresponding to the target version video;
and the alignment module is used for executing the realignment processing on the bullet screen information in the target version video by utilizing the editing time point and the editing time length to obtain the updated target version video.
In a third aspect, an embodiment of the present invention provides a processing apparatus, including: the video processing device comprises a processor and a memory, wherein the processor is used for executing a video processing program stored in the memory so as to realize the video processing method in any one of the first aspect.
In a fourth aspect, an embodiment of the present invention provides a storage medium, where one or more programs are stored, where the one or more programs are executable by one or more processors to implement the method for processing video according to any one of the first aspects.
According to the video processing scheme provided by the embodiment of the invention, the editing record of the target version video is obtained first, wherein the editing record comprises an editing time point and an editing duration; bullet screen information corresponding to the target version video is obtained; finally, realignment processing is performed on the bullet screen information in the target version video by using the editing time point and the editing duration to obtain the updated target version video. In other words, the editing record of the current version video, which generally comprises editing time points and editing durations, is obtained; the bullet screen data from before the modification is obtained; the bullet screens are adjusted according to the time points and durations of the video changes; and the adjusted bullet screens are realigned with the video and stored as the next video version. This completes the alignment operation between the video and the bullet screen and achieves the technical effect of aligning the bullet screen with the video.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a flow chart of a video processing method according to an embodiment of the present invention;
fig. 2 is a flowchart of another video processing method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a video and barrage alignment method according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a video processing device according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a processing apparatus according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terms "comprising" and "having" in embodiments of the present invention are used to mean including open and mean that there may be additional elements/components/etc. in addition to the listed elements/components/etc.; the terms "first" and "second" and the like are used merely as labels, and are not intended to limit the number of their objects. Furthermore, the various elements and regions in the figures are only schematically illustrated and thus the present invention is not limited to the dimensions or distances illustrated in the figures.
For the purpose of facilitating an understanding of the embodiments of the present invention, reference will now be made to the following description of specific embodiments, taken in conjunction with the accompanying drawings, which are not intended to limit the embodiments of the invention.
Fig. 1 is a flow chart of a video processing method according to an embodiment of the present invention. The execution subject of the invention is a video processing program, and the video processing program is used for adjusting the alignment relation between the edited video and the barrage. As shown in the diagram provided in fig. 1, the video processing method specifically includes:
s101, acquiring an editing record of a target version video, wherein the editing record comprises an editing time point and an editing duration.
The execution subject of the method is a video processing program, which achieves the goal of aligning the modified video with the bullet screen by recording the modification record of each video version and correspondingly adjusting the bullet screen playing time points.
The target version video is understood herein to be an unmodified original video. An edit record as referred to herein is understood to be a record modified or adapted for a target version of video, and may be, but is not limited to, a new video being deleted or added. The editing time point is herein understood to be a specific time point of modification in the total video playing length of the target version video, and the editing time period is herein understood to be a time length of increasing or decreasing the total video time length in one video modification.
Further, when the modification record exists in the target version video, a specific modification record aiming at the target version video is obtained and is used as an editing record for statistics, wherein the statistics comprises statistics of information such as specific editing time points, editing time duration and the like of each editing record, and a reference basis is provided for video alignment.
S102, bullet screen information corresponding to the target version video is obtained.
The barrage information is understood herein to be the current version of barrage information for the target version video playback.
Further, the bullet screen information under the current version is retrieved according to the target version video, so that the target version video and the bullet screen information belong to the same version. Because each version of the video may have been edited, when an edited video is encountered, the bullet screens corresponding to that version are determined; the bullet screen information corresponding to the target version video can be determined using the video ID or the video editing time points as a reference, so that each edited video is paired with its own bullet screens and the consistency between the video and the bullet screen plot is ensured.
S103, performing realignment processing on bullet screen information in the target version video by utilizing the editing time point and the editing time length to obtain the updated target version video.
The alignment process can be understood as follows: when a modification record exists for the target version video, the correspondingly modified bullet screen information is realigned with the plot of the modified target version video by means of the editing time point and the editing duration. The updated target version video is understood to be the new version video obtained after the alignment processing. The version number of the new version video is larger than that of the target version video.
Further, after the editing time point and editing duration information for the target version video are obtained, the corresponding bullet screen information is obtained. The time points and durations of the bullet screen information to be modified are determined according to the editing time points and editing durations of the target version video, the bullet screen information is edited according to the type of the editing record of the target version video, and the edited bullet screen information is bound to the edited video, thereby realizing the alignment processing of the video and the bullet screen information and obtaining the updated target version video. The version number of the updated target version video is then modified to realize the version upgrade; after the editing is finished, a new version number is generated and stored in association with the editing record, preparing for the editing of the next video version and achieving the technical effect of aligning the video with the bullet screen.
According to the video processing method provided by the embodiment of the invention, the editing record of the target version video is obtained first, wherein the editing record comprises an editing time point and an editing duration; bullet screen information corresponding to the target version video is obtained; finally, realignment processing is performed on the bullet screen information in the target version video by using the editing time point and the editing duration to obtain the updated target version video. In other words, the editing record of the current version video, which generally comprises editing time points and editing durations, is obtained; the bullet screen data from before the modification is obtained; the bullet screens are adjusted according to the time points and durations of the video changes; and the adjusted bullet screens are realigned with the video and stored as the next video version, completing the alignment operation between the video and the bullet screen and achieving the technical effect of aligning the bullet screen with the video.
Fig. 2 is a flowchart of another video processing method according to an embodiment of the present invention. Fig. 2 is presented on the basis of the above embodiment. As shown in the diagram provided in fig. 2, the video processing method specifically further includes:
s201, acquiring an editing record of the target version video, wherein the editing record comprises an editing time point and an editing duration.
The execution subject of the method is a video processing program, which achieves the goal of aligning the modified video with the bullet screen by recording the modification record of each video version and correspondingly adjusting the bullet screen playing time points.
The target version video is understood herein to be an unmodified original video. An edit record as referred to herein is understood to be a record modified or adapted for a target version of video, and may be, but is not limited to, a new video being deleted or added. The editing time point is herein understood to be a specific time point of modification in the total video playing length of the target version video, and the editing time period is herein understood to be a time length of increasing or decreasing the total video time length in one video modification.
Further, when the modification record exists in the target version video, a specific modification record aiming at the target version video is obtained and is used as an editing record for statistics, wherein the statistics comprises statistics of information such as specific editing time points, editing time duration and the like of each editing record, and a reference basis is provided for video alignment.
Wherein, the types of the editing records comprise a deletion class and an insertion class; when the type of the edit record is a delete class, the edit record includes: deleting the video starting time and deleting the video duration; when the type of the edit record is an insertion type, the edit record includes: an insert video start time and an insert video duration.
The deletion class is understood herein as an editing record in which deletion processing is performed on the video, for example deleting a segment of a certain duration at a specified time point during video playback because of a plot change or other requirement. The main key information of the deletion operation is the deletion video start time and the deletion video duration. The insertion class is understood herein as an editing record in which insertion processing is performed on the video, for example inserting an advertisement of a certain duration at a specified time point during video playback. The main key information of the insertion operation is the insertion video start time and the insertion video duration.
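As an illustration only (the patent does not prescribe any particular data structure; the field and type names below are assumptions), the two record types could be modeled in Python roughly as follows:

from dataclasses import dataclass
from enum import Enum

class EditType(Enum):
    DELETE = "delete"   # deletion class: a clip is removed from the video
    INSERT = "insert"   # insertion class: e.g. an advertisement is inserted

@dataclass
class EditRecord:
    edit_type: EditType
    start: float      # editing time point: deletion/insertion video start time (seconds)
    duration: float   # editing duration: deleted/inserted video duration (seconds)

# Example: delete 10 s starting at 120 s, then insert a 15 s advertisement at 300 s
records = [
    EditRecord(EditType.DELETE, start=120.0, duration=10.0),
    EditRecord(EditType.INSERT, start=300.0, duration=15.0),
]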
S202, acquiring a target version number of the target version video.
The target version number is understood to be the unique identification version number of the target version video; one version of the video corresponds to one version number, and whether the version number of the video needs to be updated is determined according to whether the video has been edited. When the target version video has no modification record, the current version number is kept unchanged; when the target version video has been modified, a video editing record exists, and the target version number is updated correspondingly to distinguish the two videos before and after editing.
Further, according to the target version video and the corresponding editing record, the introduction information of the target version video, including the current target version number information, is found through system searching.
S203, associating the editing record with the target version number.
The association process is understood herein to be a mapping or binding process. Each version number of the video after the association processing corresponds to all editing records contained in the associated video.
Further, the obtained editing records of the target version video are associated with the retrieved target version number, so that the editing records are stored under their corresponding target version number and all associated editing records can be obtained by looking up the version number.
In one possible example scenario, the video playing software on a computer detects that a certain video has been modified. By calling the corresponding query function module, the version number of the current video is first detected, and all modification records of the video are obtained, for example a deletion record and a record of an advertisement having been inserted into the current video. All the retrieved modification records are then bound to the video version number, completing the association between the video version number and all the modification records.
For example, according to the relation between the target version number and the edit record, the edit record corresponding to each version can be obtained. Table 1 below shows a table of the relationship between each version number and the corresponding edit record:
TABLE 1
Editing serial number | Version number | Editing time point | Editing type | Duration of editing
1 | 2 | t1 | Insertion class | d1
2 | 2 | t2 | Deletion class | d2
3 | 3 | t3 | Insertion class | d3
4 | 3 | t4 | Deletion class | d4
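For example, the association in Table 1 could be kept as a simple mapping from version number to its editing records; the following Python sketch mirrors the table rows, with t1–t4 and d1–d4 left as symbolic placeholders (the mapping layout is an assumption, not something specified by the patent):

# Table 1 as a mapping from version number to its associated editing records.
# Each record is (editing serial number, editing time point, editing type, duration of editing);
# the time points and durations are kept symbolic to match the table.
EDITS_BY_VERSION = {
    2: [(1, "t1", "insert", "d1"), (2, "t2", "delete", "d2")],
    3: [(3, "t3", "insert", "d3"), (4, "t4", "delete", "d4")],
}

def edits_for(version: int):
    """Return all editing records associated with a given target version number."""
    return EDITS_BY_VERSION.get(version, [])

print(edits_for(2))  # -> the two edits associated with version 2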
S204, receiving information of the barrage information is obtained, wherein the receiving information comprises a sender identification and a sending moment of the barrage information.
The receiving information is understood herein to be the creation time point and author information of each bullet screen. The creation records of all bullet screens can be obtained from the receiving information. The sender identifier is understood herein to be the creator of each bullet screen; the identifier may be, but is not limited to, the user name or nickname of the creator, or information such as the device ID or network IP address used by the sender when sending the bullet screen.
Further, after the editing record and the target version number of the target version video are acquired, bullet screen information aiming at the target version video is acquired. The method comprises the steps of obtaining receiving information corresponding to each piece of barrage information, and distinguishing information such as sender identification and sending time of each barrage through the receiving information of the barrage information.
S205, binding the received information with the target version video by using the target version number.
Further, after the target version number corresponding to the target version video is obtained, the bullet screen information corresponding to the target version video is obtained, and the receiving information of that bullet screen information is obtained at the same time. By binding the receiving information characterizing the bullet screens to the target version number characterizing the editing records, the relation between all editing records of the video under one version and all receiving information of the corresponding bullet screens is determined.
S206, bullet screen information corresponding to the target version video is obtained.
Further, the target version number of the target version video is bound with the receiving information of the barrage information, so that the target version video is associated with the barrage information, and all barrage information associated with the target version video can be correspondingly acquired by acquiring the target version video.
In one possible real-world scenario, a video playing option is obtained, and all corresponding bullet screens can be retrieved from the video content, including the user name, the specific bullet screen content, the comment time point, and other information of each bullet screen comment. Binding this information to the video version number makes it possible to obtain all bullet screen information about the video through its version number, and thus all bullet screen information for each version.
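A minimal sketch of binding the receiving information (sender identifier and sending moment) of each bullet screen to the target version number, as described in S204–S205, might look like the following; the key layout and field names are assumptions for illustration:

# Bind bullet screen receiving information (sender identifier + sending moment)
# to the (video ID, target version number) it was posted against.
bullet_screens_by_version: dict[tuple[str, int], list[dict]] = {}

def record_bullet_screen(video_id: str, version: int,
                         sender_id: str, send_time: float, text: str) -> None:
    key = (video_id, version)
    bullet_screens_by_version.setdefault(key, []).append({
        "sender_id": sender_id,   # sender identifier
        "send_time": send_time,   # sending moment within the video (seconds)
        "text": text,
    })

record_bullet_screen("vid-1", 2, sender_id="user_42", send_time=95.5, text="great plot twist")
print(bullet_screens_by_version[("vid-1", 2)])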
S207, when the type of the edit record is a deletion type, determining whether the bullet screen information is deleted according to the start time of deleting the video and the duration of deleting the video.
Further, when a deletion-class editing record of the target version video is obtained, all bullet screen information of the target version video is obtained first, and the bullet screen information is readjusted according to the video editing record. According to the deletion video start time, the bullet screen information at the same moment is located; from that moment, all bullet screen information within the deletion video duration is to be removed, so it is first determined whether any bullet screen information falls within the deletion video duration. If deleted bullet screen information exists, the bullet screen information is edited so as to be aligned with the edited video; if no deleted bullet screen information exists, no bullet screens need to be deleted, and only the blank bullet screen span within the deleted video duration is removed.
S208, when the bullet screen information is not deleted, performing forward movement processing on the display of the bullet screen information after the start time of deleting the video according to the deleting video duration.
The forward-moving process is understood to be a process of playing the content of a certain moment earlier by a certain period of time, i.e., advancing it by a predetermined time.
Further, when the type of the editing record of the target version video is the deletion class, it is determined that a record of deleted video content exists in the target version video, while the bullet screen information is still in an unedited state, which causes the edited target version video and the unedited bullet screen information to be misaligned in time. In order to realign the edited target version video with the bullet screen information, the bullet screen information falling within the deletion video duration after the deletion video start time should be deleted; however, no bullet screens may appear in that period, so it is first determined whether bullet screen information exists in the deletion period. When the video software detects that no corresponding bullet screen information exists in the deletion period, no specific bullet screen content is removed by the deletion edit of the target version video. In that case, when realigning the edited target version video with the bullet screen information, it is only necessary to locate the blank bullet screen span of the deletion video duration starting from the deletion video start time, and to move forward the display of the bullet screen information after the deletion video start time by the deletion video duration, completing the alignment of the edited target version video and the edited bullet screen information.
S209, when the bullet screen information is deleted, determining the replacement time of the deleted bullet screen information from the edited target version video, displaying the deleted bullet screen information at the replacement time, and performing forward moving processing on the display of the bullet screen information after the start time of deleting the video according to the deleting video duration.
Deleted bullet screen information is understood herein to mean bullet screen information whose specific content is removed by the deletion. The replacement time can be understood as the time in the edited video, corresponding to the deletion video start time, at which the deleted bullet screen information is displayed again.
Further, when the type of the editing record of the target version video is the deletion class, it is determined that a record of deleted video content exists in the target version video, while the bullet screen information is still in an unedited state, which causes the edited target version video and the unedited bullet screen information to be misaligned in time. In order to realign the edited target version video with the bullet screen information, the bullet screen information falling within the deletion video duration after the deletion video start time should be deleted; however, no bullet screens may appear in that period, so it is first determined whether bullet screen information exists in the deletion period. When the video software detects that corresponding bullet screen information does exist in the deletion period, the deletion edit of the target version video affects the display content of the bullet screen information. The edited target version video is then realigned with the bullet screen information, which also avoids the situation in which a user watching the edited target version video sees bullet screens about plot that has been deleted, spoiling the story and harming the viewing experience. Specifically, the bullet screen information within the deletion video duration starting from the deletion video start time is located, the corresponding bullet screen information in that period is removed from its original time point (and displayed at the replacement time determined above), and the display of the bullet screen information after the deletion video start time is moved forward by the deletion video duration, completing the alignment processing of the edited target version video and the edited bullet screen information.
And S210, when the type of the edit record is an insertion type, performing backward movement processing on the display of the barrage information after the start time of the inserted video according to the duration of the inserted video.
The backward-moving process is understood to be a process of playing the content of a certain moment later by a certain period of time, i.e., delaying it by a predetermined time.
Further, when the type of the editing record of the target version video is the insertion class, it is determined that a record of inserted video content exists in the target version video, while the bullet screen information is still in an unedited state, which causes the edited target version video and the unedited bullet screen information to be misaligned in time. In order to realign the edited target version video with the bullet screen information, a blank bullet screen span corresponding to the insertion video duration is added starting from the insertion video start time; that is, a blank bullet screen period is inserted into the bullet screen information at the insertion period corresponding to the insertion operation on the target version video, and the display of the bullet screen information after the insertion video start time is moved backward by the insertion video duration, completing the alignment processing of the edited target version video and the edited bullet screen information.
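Putting S207–S210 together, a realignment routine could look roughly like the following Python sketch. It assumes bullet screens are plain dictionaries with a time field in seconds, that each edit's start time is expressed in the timeline produced by the previous edits, and that a bullet screen falling inside a deleted segment is re-displayed at the replacement time (the deletion start time) rather than discarded:

def realign_bullet_screens(screens: list[dict], edits: list[dict]) -> list[dict]:
    """Realign bullet screen times against a list of editing records.

    Each edit is {"type": "delete"|"insert", "start": float, "duration": float}.
    Edits are applied in order, so each edit's start time is expressed in the
    timeline that results from the previous edits.
    """
    result = [dict(s) for s in screens]
    for edit in edits:
        start, dur = edit["start"], edit["duration"]
        if edit["type"] == "delete":
            for s in result:
                if s["time"] >= start + dur:
                    s["time"] -= dur          # S208: move later bullet screens forward
                elif s["time"] >= start:
                    s["time"] = start         # S209: show deleted ones at the replacement time
        elif edit["type"] == "insert":
            for s in result:
                if s["time"] >= start:
                    s["time"] += dur          # S210: move later bullet screens backward
    return result

# Usage: insert a 15 s ad at t=30 s, then delete 10 s starting at t=60 s (edited timeline)
screens = [{"time": 20.0, "text": "a"}, {"time": 40.0, "text": "b"}, {"time": 80.0, "text": "c"}]
edits = [{"type": "insert", "start": 30.0, "duration": 15.0},
         {"type": "delete", "start": 60.0, "duration": 10.0}]
print(realign_bullet_screens(screens, edits))
# "a" stays at 20.0; "b" moves back to 55.0; "c" moves back to 95.0, then forward to 85.0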
Optionally, when the videos corresponding to multiple versions share the same video ID but have no corresponding editing records, the differences between the versions can be analyzed through an AI algorithm in order to distinguish them.
In one possible example scenario, fig. 3 is a schematic diagram of a video and bullet screen alignment method according to an embodiment of the present invention. Fig. 3 specifically illustrates the alignment processing of the bullet screen with the video after deletion editing and insertion editing, taking the version-1 video referenced in Table 1 as an example. In fig. 3, D1 represents the total length of the first version of the video; t1 represents the insertion time point of the insertion operation performed on the first version, and d1 represents the insertion duration of that operation; t2 represents the deletion time point of the deletion operation performed on the first version, where t2 > t1, and d2 represents the deletion duration of that operation; the edited result is saved as the second version of the video. D2 represents the total length of the second version of the video obtained after the first version has been edited. t3 represents the time point in the edited video corresponding to the deletion operation after the insertion operation has been applied to the first version (i.e., t3 = t2 + d1).
According to the diagram provided in fig. 3, the main calculation steps of video and bullet screen alignment include:
step 1: according to the length of the first version video being D1, inserting the video with the length of D1 at the time t1, deleting the video with the length of D2 at the time t2, and obtaining the total length of the second version video as follows: d2 =d1+d1-D2.
Step 2: a point in time is determined at which the first version of barrage data corresponds to the second version.
The bullet screen data in the time period [0, t1] of the first version corresponds to the time period [0, t1] of the second version bullet screen data;
the bullet screen data in the time period [t1, t2] of the first version corresponds to the time period [t1+d1, t2+d1] of the second version bullet screen data;
the bullet screen data in the time period [t2, D1] of the first version corresponds to the time period [t2+d1-d2, D1+d1-d2] of the second version bullet screen data.
Step 3: and determining a time point corresponding to the second version video corresponding to the second version barrage.
Binding the video in the time period [0, t1] of the second version video with the bullet screen data in the time period [0, t1] of the second version bullet screen time, completing the alignment processing;
binding the video in the time period [t1+d1, t2+d1] of the second version video with the bullet screen data in the time period [t1+d1, t2+d1] of the second version bullet screen time, completing the alignment processing;
binding the video in the time period [t2+d1-d2, D1+d1-d2] of the second version video with the bullet screen data in the time period [t2+d1-d2, D1+d1-d2] of the second version bullet screen time, completing the alignment processing.
Through the above steps, the corresponding second version video and second version bullet screen data obtained after editing the first version video and the first version bullet screen data are aligned in time, completing the alignment processing of the video and the bullet screen.
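The three steps above can be checked numerically; the following Python sketch plugs arbitrary example values into the formulas for D1, t1, d1, t2 and d2 (the concrete numbers are illustrative, not taken from the patent):

# Example values (seconds): first-version length D1, insertion at t1 for d1,
# deletion at t2 (t2 > t1) for d2. These numbers are arbitrary illustrations.
D1, t1, d1, t2, d2 = 600.0, 100.0, 30.0, 400.0, 20.0

# Step 1: total length of the second version
D2 = D1 + d1 - d2
print("D2 =", D2)  # 610.0

# Step 2: map first-version bullet screen intervals onto the second version
mappings = [
    ((0, t1),  (0, t1)),                        # [0, t1]  -> [0, t1]
    ((t1, t2), (t1 + d1, t2 + d1)),             # [t1, t2] -> [t1+d1, t2+d1]
    ((t2, D1), (t2 + d1 - d2, D1 + d1 - d2)),   # [t2, D1] -> [t2+d1-d2, D1+d1-d2]
]

# Step 3: bind each second-version video interval to the bullet screens in the
# corresponding second-version interval (here we simply print the correspondence).
for src, dst in mappings:
    print(f"version-1 bullet screens in {src} align with version-2 interval {dst}")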
When the edited second version video is played, the video ID and the current version number corresponding to the second version video are first obtained, and a query function is called with the video ID and the current version number. The editing records are determined from the version number, the bullet screen data corresponding to all videos under those editing records is retrieved, and by querying all version numbers of the videos corresponding to the current version number and their associated editing records, the current version video is played after the alignment processing of the video and the bullet screen information. The user thus watches the edited version of the video together with bullet screen data that is aligned with the plot, which improves the viewing experience and realizes the technical effect of the video and bullet screen alignment processing.
According to the video processing method provided by the embodiment of the invention, the editing record of the target version video is obtained, the video version number is bound to the receiving information of the corresponding bullet screen information, and the bullet screen information corresponding to the target version video is obtained. When the type of the editing record of the target version video is determined to be the insertion class, the display of the bullet screen information after the insertion video start time is moved backward by the corresponding insertion period, so that the edited target version video and the edited bullet screen information complete the alignment operation. When the editing type of the target version video is determined to be the deletion class, the bullet screen information within the deletion video duration starting from the deletion video start time is deleted, and the bullet screen information after the deletion is moved forward, so that the edited target version video and the edited bullet screen information are aligned. The technical effect of the video and bullet screen alignment processing is thereby realized.
Fig. 4 is a schematic structural diagram of a video processing apparatus according to an embodiment of the present invention. According to the diagram provided in fig. 4, the video processing apparatus specifically includes:
the acquiring module 41 is configured to acquire an editing record of the target version video, where the editing record includes an editing time point and an editing duration;
the acquiring module 41 is further configured to acquire bullet screen information corresponding to the target version video;
and the alignment module 42 is configured to perform realignment processing on the barrage information in the target version video by using the editing time point and the editing time length, so as to obtain an updated target version video.
The video processing apparatus provided in this embodiment may be the video processing apparatus shown in fig. 4, and may perform all the steps of the video processing method shown in fig. 1-3, so as to achieve the technical effects of the video processing method shown in fig. 1-3, and the detailed description will be omitted herein for brevity.
Fig. 5 is a schematic structural diagram of a processing apparatus according to an embodiment of the present invention, and the processing apparatus 500 shown in fig. 5 includes: at least one processor 501, memory 502, at least one network interface 504, and other user interfaces 503. The various components in processing device 500 are coupled together by bus system 505. It is understood that bus system 505 is used to enable connected communications between these components. The bus system 505 includes a power bus, a control bus, and a status signal bus in addition to a data bus. But for clarity of illustration the various buses are labeled as bus system 505 in fig. 5.
The user interface 503 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, a trackball, a touch pad, or a touch screen, etc.).
It will be appreciated that the memory 502 in embodiments of the invention can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be Random Access Memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 502 described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some implementations, the memory 502 stores the following elements, executable units or data structures, or a subset thereof, or an extended set thereof: an operating system 5021 and application programs 5022.
The operating system 5021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, for implementing various basic services and processing hardware-based tasks. The application 5022 includes various application programs such as a Media Player (Media Player), a Browser (Browser), and the like for realizing various application services. A program for implementing the method according to the embodiment of the present invention may be included in the application 5022.
In the embodiment of the present invention, the processor 501 is configured to execute the method steps provided by the method embodiments by calling a program or an instruction stored in the memory 502, specifically, a program or an instruction stored in the application 5022, for example, including:
acquiring an editing record of a target version video, wherein the editing record comprises an editing time point and an editing time length; acquiring bullet screen information corresponding to the target version video; and executing realignment processing on the barrage information in the target version video by utilizing the editing time point and the editing time length to obtain the updated target version video.
The method disclosed in the above embodiment of the present invention may be applied to the processor 501 or implemented by the processor 501. The processor 501 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuitry in hardware or instructions in software in the processor 501. The processor 501 may be a general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), an off-the-shelf programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components. The disclosed methods, steps, and logic blocks in the embodiments of the present invention may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be embodied directly in the execution of a hardware decoding processor, or in the execution of a combination of hardware and software elements in a decoding processor. The software elements may be located in a random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in a memory 502, and the processor 501 reads information in the memory 502 and, in combination with its hardware, performs the steps of the method described above.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented by means of units that perform the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
The processing device provided in this embodiment may be the processing device shown in fig. 5, and may perform all steps of the video processing method shown in fig. 1-3, so as to achieve the technical effects of the video processing method shown in fig. 1-3, and the detailed description will be omitted herein for brevity.
The embodiment of the invention also provides a storage medium (computer readable storage medium). The storage medium here stores one or more programs. Wherein the storage medium may comprise volatile memory, such as random access memory; the memory may also include non-volatile memory, such as read-only memory, flash memory, hard disk, or solid state disk; the memory may also comprise a combination of the above types of memories.
When one or more programs in the storage medium are executable by one or more processors, the above-described video processing method performed on the processing device side is implemented.
The processor is configured to execute a processing program of the video stored in the memory, so as to implement the following steps of a video processing method executed on a processing device side:
acquiring an editing record of a target version video, wherein the editing record comprises an editing time point and an editing time length; acquiring bullet screen information corresponding to the target version video; and executing realignment processing on the barrage information in the target version video by utilizing the editing time point and the editing time length to obtain the updated target version video.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative elements and steps are described above generally in terms of function in order to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. The software modules may be disposed in Random Access Memory (RAM), memory, read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The foregoing description of the embodiments has been provided for the purpose of illustrating the general principles of the invention and is not intended to limit the invention to the particular embodiments; any modifications, equivalents, improvements, etc. that fall within the spirit and principles of the invention are intended to be included within the scope of the invention.

Claims (10)

1. A method for processing video, comprising:
acquiring an editing record of a target version video, wherein the editing record comprises an editing time point and an editing duration;
acquiring bullet screen information corresponding to the target version video;
and executing realignment processing on the barrage information in the target version video by utilizing the editing time point and the editing time length to obtain the updated target version video.
2. The method of claim 1, wherein the types of edit records include a delete class and an insert class;
wherein when the type of the edit record is the deletion type, the edit record includes: deleting the video starting time and deleting the video duration;
when the type of the edit record is the insertion type, the edit record includes: an insert video start time and an insert video duration.
3. The method of claim 1, wherein after the obtaining of the editing record of the target version video is performed, the method further comprises:
acquiring a target version number of the target version video;
and carrying out association processing on the editing record and the target version number.
4. A method according to claim 3, characterized in that the method further comprises:
acquiring receiving information of the barrage information, wherein the receiving information comprises a sender identifier and a sending moment of the barrage information;
and binding the received information with the target version video by using the target version number.
5. The method according to claim 2, characterized in that the method further comprises:
when the type of the editing record is the deletion type, determining whether the bullet screen information is deleted or not according to the start time of the deleted video and the deleted video duration;
the executing the realignment processing on the barrage information in the target version video by using the editing time point and the editing time length includes:
and when the bullet screen information is not deleted, performing forward moving processing on the display of the bullet screen information after the start time of deleting the video according to the deleting video duration.
6. The method of claim 5, wherein said performing a realignment process on said bullet screen information in said target version video using said edit point in time and said edit duration further comprises:
and when the bullet screen information is deleted, determining the replacement time of the deleted bullet screen information from the edited target version video, displaying the deleted bullet screen information at the replacement time, and performing forward moving processing on the display of the bullet screen information after the start time of the deleted video according to the deleted video duration.
7. The method according to claim 2, characterized in that the method further comprises:
and when the type of the editing record is the insertion type, performing backward movement processing on the display of the bullet screen information after the start time of the insertion video according to the duration of the insertion video.
8. A video processing apparatus, comprising:
an acquisition module, configured to acquire an editing record of a target version video, wherein the editing record comprises an editing time point and an editing duration;
wherein the acquisition module is further configured to acquire bullet screen information corresponding to the target version video; and
an alignment module, configured to perform realignment processing on the bullet screen information in the target version video by using the editing time point and the editing duration, to obtain an updated target version video.
9. A processing device, comprising a processor and a memory, wherein the processor is configured to execute a video processing program stored in the memory to implement the video processing method according to any one of claims 1 to 7.
10. A storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement the video processing method according to any one of claims 1 to 7.
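
Claims 1 to 4 above describe the data that drives the realignment: an editing record carrying an editing time point and duration, a type of deletion or insertion, a target version number associated with those records, and per-comment receiving information (sender identifier and sending time). The following is a minimal Python sketch of such structures; every name (EditRecord, BulletScreen, TargetVersionVideo) and the exact field layout are illustrative assumptions, not definitions taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Literal


@dataclass
class EditRecord:
    # Editing record of claims 1-2: a deletion or insertion, with its
    # start time (editing time point) and duration, both in seconds.
    kind: Literal["delete", "insert"]
    start: float        # deleted-video / inserted-video start time
    duration: float     # deleted-video / inserted-video duration


@dataclass
class BulletScreen:
    # One bullet screen comment, with the receiving information of claim 4.
    sender_id: str      # sender identifier
    sent_at: str        # sending time (kept as plain text in this sketch)
    display_time: float # playback position at which the comment is shown
    text: str


@dataclass
class TargetVersionVideo:
    # Claim 3 associates the editing records with a target version number;
    # claim 4 binds the bullet screen receiving information to that version.
    version: str
    edits: List[EditRecord] = field(default_factory=list)
    bullets: List[BulletScreen] = field(default_factory=list)
```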
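Claims 5 to 7 spell out the realignment rules: for a deletion, comments whose moment falls inside the removed segment are re-timed to a replacement time and comments after the segment move forward by the deleted duration; for an insertion, comments after the insertion point move backward by the inserted duration. The sketch below, building on the structures above, is one plausible reading of those rules; the claims do not say how the replacement time is chosen, so snapping it to the deletion start point is purely an assumption.

```python
def realign(video: TargetVersionVideo) -> TargetVersionVideo:
    """Apply every editing record to the bullet screen display times (sketch only)."""
    for edit in video.edits:
        for bullet in video.bullets:
            if edit.kind == "delete":
                cut_end = edit.start + edit.duration
                if edit.start <= bullet.display_time < cut_end:
                    # Claim 6: the comment's original moment was removed, so it is
                    # shown at a replacement time in the edited video instead.
                    # Snapping to the cut point is an assumption of this sketch.
                    bullet.display_time = edit.start
                elif bullet.display_time >= cut_end:
                    # Claim 5: comments after the deleted segment move forward
                    # by the deleted-video duration.
                    bullet.display_time -= edit.duration
            else:  # "insert"
                if bullet.display_time >= edit.start:
                    # Claim 7: comments after the insertion point move backward
                    # by the inserted-video duration.
                    bullet.display_time += edit.duration
    return video
```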
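As a quick illustration under the same assumptions: cutting 10 seconds starting at 00:30 pulls a comment originally shown at 00:50 back to 00:40, a comment at 00:31 loses its original moment and is re-timed to the cut point, and a later 5-second insertion at 00:20 pushes both out again.

```python
video = TargetVersionVideo(
    version="v2",
    edits=[
        EditRecord(kind="delete", start=30.0, duration=10.0),
        EditRecord(kind="insert", start=20.0, duration=5.0),
    ],
    bullets=[
        BulletScreen("u1", "2023-02-23T12:00:00", 50.0, "nice scene"),
        BulletScreen("u2", "2023-02-23T12:01:00", 31.0, "wait what"),
    ],
)

realign(video)
for b in video.bullets:
    print(b.sender_id, b.display_time)
# u1: 50.0 -> 40.0 after the deletion, then 45.0 after the insertion
# u2: 31.0 falls inside the cut, re-timed to 30.0, then 35.0 after the insertion
```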
CN202310173881.6A 2023-02-23 2023-02-23 Video processing method, device, processing equipment and storage medium Pending CN116193195A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310173881.6A CN116193195A (en) 2023-02-23 2023-02-23 Video processing method, device, processing equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310173881.6A CN116193195A (en) 2023-02-23 2023-02-23 Video processing method, device, processing equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116193195A (en) 2023-05-30

Family

ID=86450347

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310173881.6A Pending CN116193195A (en) 2023-02-23 2023-02-23 Video processing method, device, processing equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116193195A (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070263979A1 (en) * 2003-10-28 2007-11-15 Sony Corporation File Recording Apparatus and Editing Method for Video Effect
US20130094836A1 (en) * 2011-04-07 2013-04-18 Christopher Scott Gifford Embedded ancillary data processing method and system with program duration alteration
CN104104986A (en) * 2014-07-29 2014-10-15 小米科技有限责任公司 Audio frequency and subtitle synchronizing method and device
CN105828216A (en) * 2016-03-31 2016-08-03 北京奇艺世纪科技有限公司 Live broadcast video subtitle synthesis system and method
CN106060644A (en) * 2016-06-28 2016-10-26 武汉斗鱼网络科技有限公司 Live broadcast video clipping method and device associated with bullet screens
US10423660B1 (en) * 2017-12-07 2019-09-24 Amazon Technologies, Inc. System for detecting non-synchronization between audio and subtitle
CN110740387A (en) * 2019-10-30 2020-01-31 深圳Tcl数字技术有限公司 bullet screen editing method, intelligent terminal and storage medium
CN114157823A (en) * 2020-08-17 2022-03-08 富士胶片商业创新有限公司 Information processing apparatus, information processing method, and computer-readable medium
CN112929730A (en) * 2021-01-22 2021-06-08 北京奇艺世纪科技有限公司 Bullet screen processing method and device, electronic equipment, storage medium and system
CN112954434A (en) * 2021-02-26 2021-06-11 北京奇艺世纪科技有限公司 Subtitle processing method, system, electronic device and storage medium
CN113259776A (en) * 2021-04-14 2021-08-13 北京达佳互联信息技术有限公司 Binding method and device of caption and sound source
CN115529497A (en) * 2021-06-25 2022-12-27 腾讯科技(深圳)有限公司 Bullet screen playing method and device

Similar Documents

Publication Publication Date Title
US8990195B2 (en) Systems and methods for searching media content based on an editing file
US7287222B2 (en) Information processing apparatus and method that determines effectiveness of metadata for editing information content
RU2379771C2 (en) Recording medium, recording device, playback device, recording method and playback method
CN111447505B (en) Video clipping method, network device, and computer-readable storage medium
EP1083567A2 (en) System and method for editing source metadata to produce an edited metadata sequence
US20080250080A1 (en) Annotating the dramatic content of segments in media work
US20050234858A1 (en) Recording and reproducing apparatus, reproducing apparatus, recording and reproducing method, reproducing method, program and recording medium
CN1653416A (en) System and method for reproducing information stored on a data recording medium in an interactive networked environment
CN102340699B (en) Preserve TV user history and use the method and system of information
JP2007074724A (en) Method and apparatus for synchronizing epg information between server and client in digital broadcasting network
CN103902664A (en) Page image rendering method and information providing method and device
CN109299352B (en) Method and device for updating website data in search engine and search engine
CN103217171A (en) Navigation equipment and method for editing map data by user
CN112947853B (en) Data storage method, device, server, medium and program product
CN116193195A (en) Video processing method, device, processing equipment and storage medium
JP4376218B2 (en) Content storage device, content storage method, and program recording medium
CN101562037A (en) Multimedia file playing method and device thereof
CN104135628A (en) Video editing method and terminal
JP2008084443A (en) Program structuring apparatus, program structuring method, and program
CN105049934A (en) Programmed video review processing method and device
CN112036133B (en) File storage method and device, electronic equipment and storage medium
JP5469974B2 (en) RECORDING DEVICE, RECORDING METHOD, AND RECORDING PROGRAM
CN102957948B (en) A kind of processing method of the broadcast content timeline data for television program assessment
CN111625508A (en) Information processing method and device
US20030233359A1 (en) Link resolving mechanism

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination