CN113645482A - Video processing method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113645482A
Authority
CN
China
Prior art keywords
video
editing
interface
target
account
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010345562.5A
Other languages
Chinese (zh)
Inventor
杨怀渊
江运柱
卢昳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN202010345562.5A priority Critical patent/CN113645482A/en
Publication of CN113645482A publication Critical patent/CN113645482A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

An embodiment of the invention provides a video processing method and apparatus, an electronic device, and a storage medium. The video processing method comprises the following steps: acquiring a playing video stream of a target video; entering a reference editing interface in response to a reference trigger operation on the playing video stream; and editing the playing video stream in the reference editing interface to generate a reference of the target video. In the scheme of the embodiment of the invention, the playing video stream of a video is edited and a corresponding reference of the video is generated, so that users can perform secondary creation on the video and the sharing effect is improved.

Description

Video processing method and device, electronic equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to a video processing method and device, electronic equipment and a storage medium.
Background
In social scenarios such as sharing, users process video on mobile terminals more and more frequently. The video files stored on a mobile terminal include videos shot by the user with a camera application, videos sent by other users, videos downloaded from the Internet, and the like. The meaning and value of the content in these videos differ from user to user, and users also differ in which videos interest them.
For example, in a social network, when different users are interested in the content of the same video, one party typically shares the video by uploading, and the other party forwards or downloads the video. However, there is room for improvement in this way of sharing.
Disclosure of Invention
Embodiments of the present invention provide a video processing method, an apparatus, an electronic device, and a storage medium to solve or alleviate the above problems.
According to a first aspect of the embodiments of the present invention, there is provided a video processing method, including: acquiring a playing video stream of a target video; entering a reference editing interface according to reference triggering operation of the played video stream; and editing the played video stream in the reference editing interface to generate the reference of the target video.
According to a second aspect of the embodiments of the present invention, there is provided a video processing method applied to a client, the method including: acquiring a reference of a second account to a target video of a first account, wherein the reference of the target video is generated according to editing of a playing video stream of the target video; and generating a release video of the second account based on the reference of the target video.
According to a third aspect of the embodiments of the present invention, there is provided a video processing method applied to a server, the method including: transmitting a play video stream of a target video so as to generate a reference of the target video by editing the play video stream; and receiving and saving the reference of the target video.
According to a fourth aspect of the embodiments of the present invention, there is provided a video processing method, including: acquiring a playing video stream of a target commodity video; generating a reference of the target commodity video by editing the played video stream; and performing related display on the reference of the target commodity video and the target commodity video.
According to a fifth aspect of the embodiments of the present invention, there is provided a video processing method, including: acquiring a real-time video stream from a video display interface; generating at least one playback video reference for a real-time video stream by editing the real-time video stream; publishing the at least one playback video reference into the video presentation interface.
According to a sixth aspect of the embodiments of the present invention, there is provided a video processing apparatus including: an acquisition module that acquires a playing video stream of a target video; a control module that enters a reference editing interface according to a reference trigger operation on the playing video stream; and an editing module that edits the playing video stream in the reference editing interface to generate a reference of the target video.
According to a seventh aspect of the embodiments of the present invention, there is provided a video processing apparatus, applied to a client, the apparatus including: an acquisition module that acquires a reference of a second account to a target video of a first account, wherein the reference of the target video is generated according to editing of a playing video stream of the target video; and a generating module that generates a published video of the second account based on the reference of the target video.
According to an eighth aspect of the embodiments of the present invention, there is provided a video processing apparatus, applied to a server, the apparatus including: a transmission module that transmits a playing video stream of a target video so that a reference of the target video is generated by editing the playing video stream; and a storage module that receives and stores the reference of the target video.
According to a ninth aspect of embodiments of the present invention, there is provided an electronic apparatus, including: one or more processors; a computer readable medium configured to store one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the method of any of the first to fifth aspects.
According to a tenth aspect of embodiments of the present invention, there is provided a computer readable medium, on which a computer program is stored, which when executed by a processor, implements the method according to any one of the first to fifth aspects.
In the scheme of the embodiment of the invention, the played video stream of the video is edited, and the reference of the video is correspondingly generated, so that the video can be secondarily created by a user, and the sharing effect is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some of the embodiments of the present invention; a person skilled in the art can also obtain other drawings based on these drawings.
Fig. 1 is a schematic diagram of a network data processing mechanism to which a video processing method according to an embodiment of the present invention is applied;
FIG. 2A is a schematic flow chart of a video processing method according to another embodiment of the invention;
FIG. 2B is a schematic diagram of a method according to another embodiment of the present invention;
FIG. 2C is a schematic diagram of a method according to another embodiment of the present invention;
FIG. 2D is a schematic diagram of a method according to another embodiment of the invention;
FIG. 2E is a schematic flow chart of a video processing method according to another embodiment of the invention;
FIG. 2F is a schematic flow chart of a video processing method according to another embodiment of the invention;
FIG. 3A is a schematic flow chart of a video processing method according to another embodiment of the invention;
FIG. 3B is a schematic interaction diagram of a video processing method according to another embodiment of the invention;
FIG. 4A is a schematic flow chart of a video processing method according to another embodiment of the invention;
FIG. 4B is a schematic interaction diagram of a video processing method according to another embodiment of the invention;
FIG. 5A is a schematic diagram illustrating an interface change of a video processing method according to another embodiment of the invention;
FIG. 5B is a schematic diagram illustrating an interface change of a video processing method according to another embodiment of the invention;
FIG. 6 is a schematic block diagram of a video processing device of another embodiment of the present invention;
FIG. 7 is a schematic block diagram of a video processing device of another embodiment of the present invention;
FIG. 8 is a schematic block diagram of a video processing device of another embodiment of the present invention;
FIG. 9 is a schematic block diagram of an electronic device of another embodiment of the present invention;
fig. 10 is a hardware configuration of an electronic device according to another embodiment of the present invention.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the embodiments of the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the drawings. It is obvious that the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention shall fall within the scope of protection of the embodiments of the present invention.
The following further describes specific implementation of the embodiments of the present invention with reference to the drawings. Fig. 1 is a schematic diagram of a network data processing mechanism to which the video processing method according to an embodiment of the present invention is applied. The network data processing mechanism may be implemented as a Feed stream (a data stream distribution mechanism). The Feed stream can also be called an information stream for connecting information producers and information consumers, thereby improving the information transmission and sharing efficiency.
Generally, Feed stream scenarios include, but are not limited to, e-commerce applications, live video applications, social networking applications such as instant messaging and social media, search engine applications, information recommendation applications such as audio, video, and image-and-text recommendations, short video applications, news and comment applications, content production platforms, content distribution platforms, education applications, and the like.
For example, in an e-commerce application scenario, merchandise information may be presented to a consumer. For example, a consumer may be recommended a product that may be of interest, a new product from a store the consumer follows, or a new offering related to a product of interest.
For example, in a social networking application scenario, the user may be pushed the latest updates or relevant topics of the subscription accounts the user follows. For example, status messages between two parties based on a private-domain connection relationship may be pushed. Status messages of subscribers based on public-domain connections, such as news and comments, may also be pushed.
In one example, distribution, pushing, or recommendation of data such as videos may be implemented between a message producer and a message consumer through the data processing mechanism shown in FIG. 1. First, a user obtains a Feed identification list in an application such as those described above. The list may include all new messages (e.g., messages from other parties the user follows or subscribes to), or only part of them. The user may open all messages in the list or only a portion of them. For example, each message in the list may be split into two parts, a body and an index, where the index contains the message ID and metadata. In addition, there are storage and query services for the relationships.
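The body/index split described above can be sketched as a minimal data model. The field names (`message_id`, `author_id`, `metadata`) are illustrative assumptions; the patent does not prescribe a schema:

```python
from dataclasses import dataclass, field
import time
import uuid

@dataclass
class MessageBody:
    """The message body: the actual content payload (text, video, etc.)."""
    message_id: str
    content: bytes

@dataclass
class MessageIndex:
    """The index record: message ID plus metadata, as described above."""
    message_id: str          # points at the corresponding body
    author_id: str
    created_at: float
    metadata: dict = field(default_factory=dict)

def publish(author_id: str, content: bytes, metadata: dict):
    """Split a new Feed message into its body and index parts."""
    mid = uuid.uuid4().hex
    body = MessageBody(message_id=mid, content=content)
    index = MessageIndex(message_id=mid, author_id=author_id,
                         created_at=time.time(), metadata=metadata)
    return body, index
```

In such a design, Feed lists and queries touch only the lightweight index records, while the heavy body (e.g., video data) is fetched separately by `message_id`.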
It should be understood that the above-described message may be considered a type of semi-structured data (such as text, pictures, short video, audio, and metadata). The read-write throughput of such data is high, and the specific read-write ratio depends on the scenario. Furthermore, due to the large storage space requirements, very good scalability is required to support the growth of specific services (e.g., growth in the number of videos produced). In addition, such as in a social networking application scenario, a message may be updated one or more times, for example through content modification, or changes to the view count, like count, forward count, and tags. Generally, messages may be deleted after a period of time: for a news-like application scenario, after a year; for a social networking application scenario, after several years.
Further, Feed stream processing mechanisms may include a push mode and a pull mode. In push mode, the message is copied N times and delivered to the inboxes of the N message consumers, and each consumer reads messages directly from its own inbox. In pull mode, the message producer writes the message once into its own outbox, and a message consumer collects messages from the M outboxes it follows when it wants to view them.
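The two modes can be sketched side by side. This is an illustrative in-memory model only; class and method names are assumptions, not part of the patent:

```python
from collections import defaultdict

class FeedService:
    """Minimal sketch of the push and pull Feed delivery modes."""

    def __init__(self):
        self.inboxes = defaultdict(list)   # consumer_id -> delivered messages
        self.outboxes = defaultdict(list)  # producer_id -> published messages
        self.followers = defaultdict(set)  # producer_id -> consumer ids

    def publish_push(self, producer_id, message):
        """Push mode: copy the message N times, once per follower's inbox."""
        for consumer_id in self.followers[producer_id]:
            self.inboxes[consumer_id].append(message)

    def publish_pull(self, producer_id, message):
        """Pull mode: the producer writes once into its own outbox."""
        self.outboxes[producer_id].append(message)

    def read_pull(self, consumer_id, followed_producers):
        """Pull mode read: gather from the M outboxes the consumer follows."""
        feed = []
        for producer_id in followed_producers:
            feed.extend(self.outboxes[producer_id])
        return feed
```

Push mode trades write amplification (N copies per message) for cheap reads; pull mode trades cheap writes for a fan-in read across M outboxes, which is why real systems often mix the two.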
For video Feed streams, for example in social networking applications, a video producer may publish a video message, and a video consumer may forward or download it. Other video consumers may then forward or download the video message again. A data processing back end, such as a server, may record the number of forwards of a particular video and present that number, associated with the video or not, in the user interface of a video producer or video consumer, or it may calculate a heat indicator for the video and push the video to other consumers based on that indicator.
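The patent does not specify how the heat indicator is computed; one common approach, shown here purely as an assumed illustration, weights engagement counters and damps the score by the age of the video:

```python
import math

def heat_indicator(forward_count: int, view_count: int, like_count: int,
                   age_hours: float) -> float:
    """Hypothetical heat score: weighted engagement damped by video age.

    The weights (1.0 / 0.5 / 0.1) and the time-decay exponent are
    arbitrary choices for illustration, not values from the patent.
    """
    engagement = 1.0 * forward_count + 0.5 * like_count + 0.1 * view_count
    return engagement / math.pow(age_hours + 2.0, 1.5)
```

Under such a score, more-forwarded videos rank higher, and a newer video outranks an equally engaged older one, which matches the push-by-heat behavior described above.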
Fig. 2A is a schematic flow chart of a video reference generation method according to another embodiment of the present invention. The video processing method of fig. 2A, comprising:
210: Acquire the playing video stream of the target video.
It should be understood that obtaining the playing video stream of the target video may be implemented by establishing streaming media transmission with the server. For example, a video stream acquisition request may be generated in response to a play trigger of the target video, and the request may be sent to the server. For example, the acquisition request may be generated in response to a trigger on a video card of the target video, in response to a trigger on a content introduction tag of the target video, or in response to a trigger on a play trigger control of a play interface of the target video. In addition, as to the playing video stream of the target video, a second account may acquire the playing video stream of a target video of a first account. In one example, the first account may be the same account as the second account; in other words, the second account or the first account may view videos it published itself. In another example, the first account and the second account are different accounts. For example, the first account may be a subscription account of the second account, or an account followed by the second account. For example, the first account and the second account may be located in the same social group. The first account and the second account may be in a direct follow or subscription relationship, or in an indirect subscription or follow relationship.
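The several play triggers mentioned above all lead to the same kind of acquisition request. A minimal sketch, with entirely hypothetical field and trigger names (the patent does not define a request format):

```python
# Hypothetical trigger sources, matching the examples in the text above.
VALID_TRIGGERS = {"video_card", "intro_tag", "play_control"}

def build_stream_request(target_video_id: str, account_id: str,
                         trigger: str) -> dict:
    """Build a play-video-stream acquisition request for the server.

    `account_id` plays the role of the second account requesting the
    first account's target video.
    """
    if trigger not in VALID_TRIGGERS:
        raise ValueError(f"unknown trigger: {trigger}")
    return {
        "type": "acquire_play_stream",
        "video_id": target_video_id,
        "requesting_account": account_id,
        "trigger": trigger,
    }
```

Whichever control the user touches, the client sends the same request shape to the server, which then starts the streaming-media transmission.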
In addition, for a B/S (browser/server) web service mode, the playing video stream of the target video is acquired, and a trigger operation on the target video can be acquired in a browser, for example on a web page. For a C/S (client/server) web service mode, a trigger operation on the target video may be acquired in a specific application on the client. It should be understood that, for either mode, embodiments of the present invention may be applied to scenarios including, but not limited to, video blogs (vlogs), live video, and the like, for example e-commerce applications, live video applications, social networking applications such as instant messaging and social media, search engine applications, information recommendation applications such as audio, video, and image-and-text recommendations, short video applications, news and comment applications, content production platforms, content distribution platforms, and education applications. For example, in an e-commerce application scenario, merchandise information may be presented to a consumer: a consumer may be recommended a product that may be of interest, a new product from a store the consumer follows, or a new offering related to a product of interest. For example, in a social networking application scenario, the user may be pushed the latest updates or relevant topics of the subscription accounts the user follows; status messages between two parties based on a private-domain connection relationship, or status messages of subscribers based on public-domain connections, such as news and comments, may also be pushed.
In addition, the obtaining of the playing video stream of the target video may be obtaining the playing video stream in a video publishing space or a browsing space. The video publishing space or browsing space may be, for example, a web page of a social network, a browsing interface of an application, a discovery page (or interface), a push page (or interface), a Feed stream page, or the like. For example, the video card or the video introduction tab described above may be provided in the video publishing space or browsing space, and a video playing interface of the target video may also be provided there.
220: Enter a reference editing interface according to a reference trigger operation on the playing video stream.
It should be appreciated that entering the reference editing interface may mean switching from a video playback interface to a separate reference editing interface; alternatively, the video playback interface itself may enter a reference editing mode and serve as the reference editing interface. In one example, such as when the interface of a video card or video introduction tag is the same interface as the video playback interface, the reference trigger operation on the playing video stream may be located on the same interface as the play trigger operation described above. In another example, when the interface of the video card or video introduction tag is different from the video playing interface, in other words, when the video playing interface is entered from that interface by a trigger operation on the video card or video introduction tag, the reference trigger operation on the playing video stream may be located on a different interface from the play trigger operation.
230: Edit the playing video stream in the reference editing interface to generate a reference of the target video.
It should be understood that reference editing here may be secondary creation based on the video, for example editing operations such as clipping, adding video elements, and special-effect processing. The reference of the target video may be a truncated or clipped segment of the target video, or a video clip generated by clipping the target video, adding video elements, performing special-effect processing, and the like. The target video may be associated with its references. A particular target video may have one or more references, and these references may belong to the same account as the target video or to different accounts. For example, the target video is published by a first account, and a reference of the target video is generated by a second account. The second account may save the generated reference locally, or send it to the server; for example, the second account may send the reference of the target video to the server to be saved in the private storage space of the second account. In addition, multiple references of the target video may come from the same account or from multiple accounts; for example, the references are generated by multiple accounts at their respective clients.
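A reference as described here is essentially an edited segment tied back to its source video. The following sketch uses assumed field names (clip in/out points in seconds, an `effects` list); the patent does not define a concrete data structure:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VideoReference:
    """A reference: an edited segment of a target video (illustrative)."""
    target_video_id: str   # identifier associating the reference with its source
    author_account: str    # e.g., the second account that performed the editing
    start_s: float         # clip in-point, in seconds of the playing stream
    end_s: float           # clip out-point, in seconds
    effects: tuple = ()    # optional added elements / special effects

def make_reference(target_video_id, author_account, start_s, end_s, effects=()):
    """Create a reference from an editing session; validates the clip range."""
    if not (0 <= start_s < end_s):
        raise ValueError("invalid clip range")
    return VideoReference(target_video_id, author_account,
                          start_s, end_s, tuple(effects))
```

Storing only the source ID, the in/out points, and the edit operations (rather than re-encoded video) is one plausible way to keep references lightweight and always traceable to the original, which also supports the copyright controls discussed below.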
In addition, the first account may set the target video to allow references, thereby permitting secondary creation by other accounts such as the second account. The first account may also set the target video to disallow references, thereby protecting the copyright of the target video.
The target video has an identifier that associates it with its references. For example, the server may associate the target video with its references based on this identifier.
It should also be understood that generating the reference of the target video may be performed in the reference editing interface, or the interface may switch from the reference editing interface to a reference generation interface where generation is performed. When generation is performed in the reference editing interface, the interface may switch from the reference editing mode to a reference generation mode and perform generation in that mode.
In the scheme of the embodiment of the invention, the played video stream of the video is edited, and the reference of the video is correspondingly generated, so that the video can be secondarily created by a user, and the sharing effect is improved.
In addition, for a video producer, the scheme of the embodiment of the invention simplifies the production flow of secondary creation and can be implemented on mobile devices such as mobile phones.
In addition, the scheme of the embodiment of the invention can realize secondary creation such as reference, so that high-quality video clips or video materials can be marked or referenced.
In addition, based on the reference mechanism of the embodiment of the invention, a video consumer can simultaneously act as a video producer, and a video producer can simultaneously act as a video consumer, which enhances the interactivity of the video content.
In addition, the reference mechanism of the embodiment of the invention enhances the interaction between the original creation account and the secondary creation account. Meanwhile, a video reference can serve both as a sharing method and as a commenting method.
For example, in a video blog application scenario, the second account may obtain a video stream of a video blog by triggering a play trigger control of the video blog (VLOG) of the first account. The second account may enter a reference editing interface of the video blog by triggering a reference triggering control of the video blog, for example, entering a reference editing mode from a play mode. When editing is completed, for example, when cropping is completed, the second account may enter the video reference generation interface from the reference editing interface, or enter the video reference generation mode from a reference editing mode in the reference editing interface.
In addition, in one example, the reference trigger control and the play trigger control may be disposed in the same interface, and the reference trigger control and the play trigger control may control entry into the play mode and the reference edit mode, respectively. When reference editing is completed, entry into the reference generation mode may be automatically triggered. In another example, the reference trigger control and the play trigger control may be disposed in the play interface and the video card interface (or the video content introduction interface), respectively. For example, a play trigger control may be used to switch from a video card interface to a video play interface.
In a live-streaming scenario, the second account can acquire the video stream of the live video of the first account by triggering the play trigger control of the live video. The second account may enter a reference editing interface of the live video by triggering a reference trigger control of the live video, for example, entering a reference editing mode from a play mode. When editing is completed, for example when clipping is completed, the second account may enter the video reference generation interface from the reference editing interface, or enter the video reference generation mode from the reference editing mode in the reference editing interface.
In a video blog application scenario, the second account may belong to the same topic group or the same information publishing group as the first account, or the second account may be in the same category of video blog group as the first account, for example, a food video category, a handicraft video category, an organizing video category, a travel video category, and the like. The first account may be a subscription account or a followed account of the second account; the second account may likewise be a subscription account or a followed account of the first account; or the first account and the second account may follow each other.
When the second account publishes the reference video, the server may push a message to the first account to notify it that another account has referenced its video. For example, the reference by the second account may be shown on the first account's video in the form of a video comment. The server may also push the number of times the video has been referenced to the first account in real time. Alternatively, the server may not push reference counts at all; instead, the first account may view, in its personal settings, the total number of times its videos have been referenced, the number of times each individual video has been referenced, the total length of time for which its videos have been referenced, and a popularity ranking of its videos by reference count. The same applies to the second account: its published videos may be referenced by other accounts, and it may in turn receive corresponding notifications.
In a live video scene, the second account may belong to the same topic group or the same information publishing group as the first account, or the second account may be in a live group of the same category as the first account, for example, a food live category, a handcraft live category, an organizing live category, a travel live category, an e-commerce live category, an education live category, and the like. It should be understood that the first account may be a subscription account or a followed account of the second account, and vice versa, or the two accounts may follow each other. In this example, however, for a particular live session, the first account is the live-streaming account and the second account is a viewing account.
Specifically, in an e-commerce live broadcast scene, the second account may reference a video clip of a specific commodity from the live broadcast and publish the clip on the platform to recommend the commodity. Accordingly, the server may push the reference of the video clip to other accounts having a follow relationship with the second account, and those accounts may in turn forward or re-reference (e.g., perform secondary authoring on) the video clip.
In another implementation manner of the present invention, acquiring a play video stream of a target video includes: and acquiring a playing video stream of the target video from the server according to the triggering operation of the second account on the video card of the first account.
In another implementation manner of the invention, the video card of the first account is generated by uploading the target video to the server.
In another implementation manner of the present invention, acquiring a play video stream of a target video includes: and acquiring a playing video stream of the target video according to the triggering operation of the second account on the video introduction interface of the first account.
In another implementation manner of the present invention, in a reference editing interface, editing a playing video stream includes: in a reference editing interface, editing the played video stream in a reference editing mode; and responding to the editing result in the reference editing mode, entering a reference generation mode, and starting to generate the reference of the target video.
In another implementation manner of the present invention, in the reference editing interface, editing the playing video stream further includes: if the generation of the reference to the target video is completed, saving the reference to the target video and exiting the reference editing interface.
In another implementation manner of the present invention, in the reference editing interface, editing the playing video stream further includes: and if the reference generation of the target video fails, returning to the reference editing mode from the reference generation mode for re-editing.
In another implementation manner of the present invention, entering the reference editing interface according to a reference triggering operation on the playing video stream includes: entering the reference editing interface from the video playing interface according to the reference triggering operation on the playing video stream, wherein the method further comprises: if the reference generation of the target video fails, returning from the reference editing interface to the video playing interface.
In another implementation manner of the present invention, in a reference editing interface, editing a play video stream to generate a reference of a target video includes: in a reference editing interface, cutting a played video stream to obtain a cut video; and transcoding the cut video to generate the reference of the target video.
In another implementation of the invention, the method further comprises: and uploading the reference of the second account to the target video to the server so as to store the reference of the target video in a video reference library of the second account.
In another implementation manner of the present invention, as an example, in a reference editing interface, editing processing is performed on a play video stream, and the editing processing includes: determining an editing time period for playing a video clip to be clipped of a video stream, and displaying the editing time period through an editing time axis; and generating an edited reference video according to the trigger operation on the editing time axis.
As another example, the method further comprises: responding to a reference triggering operation of the playing video stream, and determining a video segment to be clipped of the playing video stream, wherein the determining of the editing time interval of the video segment to be clipped of the playing video stream comprises the following steps: determining a current video frame corresponding to the quoting triggering operation in the played video stream; based on the current video frame, an editing period is determined such that a point in time at which the current video frame is located is in the editing period.
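The window placement just described, in which the editing period is chosen so that the trigger-time frame lies inside it, can be sketched in Python as follows. This is a minimal illustration; the function name `editing_period` and the symmetric 15-seconds-before/15-seconds-after defaults are assumptions for the example, not part of the claimed method:

```python
def editing_period(trigger_time: float, video_duration: float,
                   before: float = 15.0, after: float = 15.0):
    """Return (start, end) of an editing period that contains the
    video frame at trigger_time, clamped to the video bounds."""
    start = trigger_time - before
    end = trigger_time + after
    # If the window runs past either end of the video, shift the
    # missing time to the other side so the total length is kept.
    if start < 0:
        end = min(video_duration, end - start)  # extend later
        start = 0.0
    if end > video_duration:
        start = max(0.0, start - (end - video_duration))  # extend earlier
        end = video_duration
    return start, end
```

With a trigger deep inside a long video this yields the 15 seconds on either side of the current frame; near either end of the video the missing time is shifted to the other side, and a video shorter than the window is taken in full.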
As another example, a crop box control for referencing a video is shown on an editing timeline, where in the referencing editing interface, editing processing is performed on a playing video stream, including: determining a cropping interval of the reference video by adjusting the cropping frame control on the editing timeline; and editing the playing video stream based on the clipping time interval.
As another example, in the reference editing interface, the editing process is performed on the play video stream, and includes: displaying a cropping frame control of a reference video at an editing time axis; and responding to the adjustment process of the cropping frame control, and editing and processing the played video stream.
As an example, in a reference editing interface, editing a playing video stream includes: displaying at least part of the played video stream in a quote editing interface as a video to be edited; and editing the video to be edited through the editing time axis of the video to be edited.
In another implementation of the invention, the at least part of the playing video stream includes an unplayed portion of the video stream, and the method further comprises: after editing is completed, generating the reference of the target video by downloading the unplayed portion.
In another implementation manner of the present invention, a cropping frame control for referring to a video is displayed on an editing timeline, where editing a video to be edited through the editing timeline of the video to be edited includes: determining a cropping interval of the reference video by adjusting the cropping frame control on the editing timeline; and performing clipping processing on the playing video stream based on the clipping time interval.
In addition, when a reference is generated in the reference generation interface, generation may either succeed or fail. For example, the video clip to be edited may be determined based on the current video frame, which may itself be determined in response to the trigger timing in the video playback interface. The video clip to be edited may include a played portion and/or an unplayed portion; preferably, it includes both. The unplayed portion may include a downloaded portion and/or an un-downloaded portion. Editing processing such as cropping may be performed on the video segment to be edited to obtain an edited video segment, which may then be transcoded to generate the reference. In one example, transcoding may occur while downloading, e.g., while a first video frame is being downloaded, a second video frame preceding it is transcoded. Alternatively, unified transcoding may be performed after the edited video segment has been fully downloaded.
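The transcode-while-downloading variant can be sketched as a small pipeline in which transcoding of one frame overlaps the download of the next. This is an illustrative sketch only; `download` and `transcode` are hypothetical placeholders for a real segment fetcher and encoder:

```python
from concurrent.futures import ThreadPoolExecutor

def download(frame_id):
    # Placeholder: fetch one frame of the unplayed portion.
    return f"raw-{frame_id}"

def transcode(raw_frame):
    # Placeholder: re-encode a frame into the reference format.
    return raw_frame.replace("raw", "enc")

def transcode_while_downloading(frame_ids):
    """Overlap transcoding of frame N with downloading of frame N+1."""
    encoded = []
    with ThreadPoolExecutor(max_workers=1) as pool:
        pending = pool.submit(download, frame_ids[0])
        for next_id in frame_ids[1:]:
            raw = pending.result()
            pending = pool.submit(download, next_id)  # fetch next frame
            encoded.append(transcode(raw))            # transcode current one
        encoded.append(transcode(pending.result()))
    return encoded
```

The unified-transcoding alternative would simply download all frames first and then map `transcode` over the result.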
A reference generation time threshold may be set, and when generation of a reference is not completed within the reference generation time threshold, reference generation fails. For example, a prompt window may be presented and a reference edit interface returned. If the generation of the reference is completed within the reference generation time threshold, the reference generation is successful, and a prompt may be presented that the reference generation was successful, or a reference library viewing interface may be entered.
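The time-threshold behaviour can be sketched as follows; a minimal sketch in which `generate_step` is a hypothetical callback standing in for one unit of reference-generation work, returning the finished reference or `None` if more work remains:

```python
import time

def generate_reference(generate_step, time_threshold: float):
    """Run generation steps until done; fail once the threshold elapses.

    Returns ("success", reference) or ("failed", None); the caller can
    then enter the library view or prompt and return to editing.
    """
    deadline = time.monotonic() + time_threshold
    while time.monotonic() < deadline:
        result = generate_step()
        if result is not None:
            return "success", result
    return "failed", None
```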
In addition, the triggering manner herein includes, but is not limited to, an operation having a certain trajectory direction such as a slide gesture, an operation having a certain duration such as a long press, an operation having a certain number of touches within a predetermined duration such as a single click or a double click, and the like. The trigger operation may also be a gesture, a voice trigger, or another sensor-based trigger manner; this example does not limit the trigger manner.
In one particular example, a user (e.g., the second account) may click a reference button to open a new interface, i.e., the reference editing interface, in which reference-editing operations are performed. For example, the range of the video segment to be intercepted may be selected by sliding an edit bar (a kind of crop box control). It should be understood that in this example, the edit bar may have a default total editing duration (e.g., 30 seconds), consisting of a first period and a second period around the original frame (the 15 seconds before it plus the 15 seconds after it). This total duration may be configured as modifiable or as fixed. A default selection period, for example 10 seconds, may be set for the edit bar; in other words, the 10 seconds following the original frame are selected by default. The selection period may be modified by a sliding operation or may be fixed. After entering the video editing state, playback may start automatically from the original frame, and in one example the selected period may be played in a loop. Further, limits on the final selected period may be set, for example a longest duration of 30 s and a shortest duration of 3 s.
For the operation of sliding the edit bar, in another specific example, an editable total duration may be set. For example, when the reference is triggered, if less than the first period (e.g., 15 seconds) precedes the original frame, the window is extended later in time so that the total duration to be edited (the sum of the first and second periods, e.g., 30 seconds) is preserved. Conversely, if less than 15 seconds follow the trigger video frame (the original frame), the window may be extended earlier. If the entire original video is shorter than the duration to be edited, the whole video may be intercepted. As for the selection period, if fewer than 10 seconds follow the original frame when the reference is triggered, the remainder may be selected in full by default.
For the operation of sliding the edit bar, in another specific example, if the user clicks the reference button before playback has started, the first frame may be treated as the trigger video frame and handled as described above.
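The selection-period limits given in these examples (a 3 s minimum, a 30 s maximum, and clamping at the end of the video) can be sketched as follows; the function name and signature are assumptions for illustration:

```python
def clamp_selection(start: float, requested_len: float, video_duration: float,
                    min_len: float = 3.0, max_len: float = 30.0):
    """Clamp a selected clip to the allowed length range (3-30 s in the
    example) and to the end of the video; returns (start, end)."""
    length = max(min_len, min(max_len, requested_len))
    end = min(video_duration, start + length)  # select all if short
    return start, end
```

For instance, a 50-second drag is cut back to the 30-second maximum, a 1-second drag is widened to the 3-second minimum, and a selection near the end of the video stops at the video's end.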
In addition, after the user clicks a confirmation, the client performs the cropping, encoding, and conversion, and stores the result locally or uploads it to a server (for example, a cloud server). In one example, since the generation process is time-consuming and its duration must be estimated, a progress prompt may be shown temporarily on the current page; if generation fails, the current editing page is retained. In another example, on success the editing page is exited and completion is prompted with a window such as a pop-up box. If the user quits midway, a prompt window may be displayed so that the user can choose whether to abandon the current operation.
In another implementation of the invention, the method further comprises: and displaying the reference of the target video and the target video in association.
In another implementation manner of the present invention, displaying the reference of the target video in association with the target video includes: displaying, in the playing interface of the target video, an entry to a playing interface of the reference of the target video.
In another implementation manner of the present invention, displaying the reference of the target video in association with the target video includes: displaying, in the playing interface of the target video, an entry to a display interface of a reference list, wherein the reference list includes the reference of the target video.
In another implementation of the invention, the method further comprises: entering the display interface according to the triggering operation of the entry of the display interface in the playing interface of the target video; and displaying the reference list by arranging video reference playing areas in the display interface.
In another implementation of the invention, the method further comprises: and displaying a playing interface inlet of the target video in the display interface.
Fig. 2C is a schematic diagram of a method according to another embodiment of the invention. As shown, the left side view is a topic interface (or topic page) showing the target video, as well as the presentation entry for the video reference. The right side of the figure is a reference interface (or reference page) showing the target video and the video references. In one example, the target video may be played in the subject interface and the reference interface may be entered in response to a triggering operation of a presentation entry referenced by the video. In another example, the target video cannot be played in the theme interface, and in the theme interface, the playing interface of the target video can be entered in response to a trigger for the area where the target video is located. For example, the playing interface may be a reference interface, or may be another interface. In one example, the target video may be played in the reference interface and the video reference may be played in the reference interface. In another example, the target video cannot be played and the reference video can be played in the reference interface.
Preferably, the target video can be played in the topic interface and cannot be played in the reference interface.
Fig. 2D is a schematic diagram of a method according to another embodiment of the invention. As shown, the left side diagram is a topic interface showing the target video, and the presentation entry for the video reference. The middle diagram is a reference interface showing the target video and the video references. The right-hand diagram is another reference interface (or reference page). In one example, the target video may be played in the subject interface and the reference interface may be entered in response to a triggering operation of a presentation entry referenced by the video. In another example, the target video cannot be played in the theme interface, and in the theme interface, the playing interface of the target video can be entered in response to a trigger for the area where the target video is located. For example, the playing interface may be a reference interface, or may be another interface. In one example, the target video may be played in the reference interface and the video reference may be played in the reference interface. In another example, the target video cannot be played and the reference video can be played in the reference interface. In addition, in the middle and right diagrams, three video references are shown in addition to the target video, but this is merely an example, and the embodiment of the present invention does not limit this. In addition, video reference 1, video reference 2, and video reference 3 are presented in association with the target video. For example, three video references may belong to the same topic as the target video.
Further, this example illustrates an interface transition from one reference interface to another. For example, in this example, video reference 1 is itself re-referenced by at least one other video reference, and the target video may also have a reference relationship with video reference 1. As shown in the right-hand diagram, video reference 1 is shown at the top of the interface, indicating that it is the referenced video, with the videos below it indicating their references to it. Accordingly, if in the middle diagram the target video is in a non-directly-playable state (i.e., played by triggering entry into another interface), then in the right-hand diagram video reference 1 is also in a non-directly-playable state. In addition, preferably, each video referencing the referenced video can be played directly in the reference interface. It should also be understood that the above implementation is only exemplary; in other examples, whether a video is directly playable, the number of referencing videos, and the number of referenced videos may be arbitrary, which is not limited by the embodiment of the present invention.
Fig. 2E is a schematic flow chart of a video processing method according to another embodiment of the invention. The video processing method of fig. 2E, comprising:
240: and acquiring a playing video stream of the target commodity video.
250: and editing the played video stream to generate the reference of the target commodity video.
260: and performing related display on the reference of the target commodity video and the target commodity video.
In the embodiment of the invention, because the reference of the target commodity video is generated by editing the played video stream, the reference and the target commodity video are displayed in a correlation mode, and the interactivity of the content based on the target commodity video is improved.
In another implementation manner of the present invention, acquiring a play video stream of a target product video includes: and acquiring a playing video stream of the target commodity video from the commodity video display interface. The method for displaying the reference of the target commodity video and the target commodity video in a correlation mode comprises the following steps: and issuing the reference of the target commodity video to the commodity video display interface.
In this example, since the associated display is performed based on the commodity video display interface, in addition to improving the interactivity between the referenced publisher and the publisher of the target commodity video, the interactivity of other visitors who access the commodity video display interface can also be improved.
Fig. 2F is a schematic flow chart of a video processing method according to another embodiment of the invention. The video processing method of fig. 2F, comprising:
270: and acquiring the real-time video stream from the video display interface.
280: at least one playback video reference for the real-time video stream is generated by editing the real-time video stream.
290: publishing at least one playback video reference into the video presentation interface.
In the embodiment of the invention, since the at least one playback video reference is generated by editing the real-time video stream, and the at least one playback video reference is published into the video display interface, a conversion from live broadcasting to on-demand broadcasting based on the video content is realized. For example, when a visitor accesses the video at a different point in time from the real-time video stream, efficient access can still be achieved.
In another implementation of the present invention, generating at least one playback video reference of the real-time video stream by editing the real-time video stream includes: editing the real-time video stream into at least one video segment; and transcoding the at least one video segment to obtain the at least one playback video reference of the real-time video stream.
For example, the real-time video stream adopts a first coding mode with low time delay, the playback video adopts a second coding mode with high definition, and the transcoding process is to transcode from the first coding mode to the second coding mode.
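As a sketch of this transcoding step, the command below re-encodes a live segment into a higher-definition on-demand format. The specific FFmpeg codec and quality flags are illustrative assumptions, since the embodiment does not prescribe concrete first and second coding modes:

```python
def transcode_cmd(segment_path: str, out_path: str) -> list[str]:
    """Build an FFmpeg command that re-encodes a low-latency live
    segment into a higher-definition on-demand encoding."""
    return [
        "ffmpeg", "-i", segment_path,
        "-c:v", "libx264",    # second (high-definition) coding mode
        "-preset", "slow",    # favor quality over encoding speed
        "-crf", "18",         # high visual quality
        "-c:a", "aac",
        out_path,
    ]
```

The command list could then be run with `subprocess.run(transcode_cmd("seg.ts", "vod.mp4"))` on a host with FFmpeg installed.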
Fig. 3A is a schematic flow chart of a video processing method according to another embodiment of the invention. The video processing method of fig. 3A may be performed by any suitable electronic device having data processing capabilities, including but not limited to: a server, a mobile terminal (such as a mobile phone, a PAD, etc.) and a client of a PC, etc. The method comprises the following steps:
310: and acquiring the reference of the second account to the target video of the first account, wherein the reference of the target video is generated according to the editing of the playing video stream of the target video.
320: and generating a release video of the second account based on the reference of the target video.
Fig. 3B is a schematic interaction diagram of a video processing method according to another embodiment of the invention. As shown, in step 301, the first account client uploads a target video to the server. For example, the target video may be an original video of the first account or a locally uploaded video, and the target video may itself be a reference to another video. Further, when the target video is a reference to another video, it may have been generated by editing, such as cropping, based on that video's stream, and clipping may likewise be performed on the target video itself.
In step 302, the second account client obtains a video stream of the target video from the server and generates a reference to the target video based on the video stream. For example, the second account acquires the video stream of the target video by triggering play control of the video stream of the target video. For example, the client may generate an acquisition request for playing a video stream based on a trigger of play control, and send the acquisition request to the server. For example, the second account client may edit the video stream to generate a reference to the target video. For example, the second account client may edit all or part of the video stream and generate a reference to the target video.
For example, in a video blog application scenario, the second account may obtain a video stream of a video blog by triggering a play trigger control of the video blog (VLOG) of the first account. The second account may enter a reference editing interface of the video blog by triggering a reference triggering control of the video blog, for example, entering a reference editing mode from a play mode. When editing is completed, for example, when cropping is completed, the second account may enter the video reference generation interface from the reference editing interface, or enter the video reference generation mode from a reference editing mode in the reference editing interface.
In addition, in one example, the reference trigger control and the play trigger control may be disposed in the same interface, and may control entry into the reference editing mode and the play mode, respectively. When reference editing is completed, entry into the reference generation mode may be automatically triggered. In another example, the reference trigger control and the play trigger control may be disposed in the play interface and the video card interface (or the video content introduction interface), respectively. For example, the play trigger control may be used to switch from the video card interface to the video play interface.
In a live scene, the second account may acquire the video stream of the live video of the first account by triggering the play trigger control of the live video. The second account may enter a reference editing interface of the live video by triggering a reference trigger control of the live video, for example, entering a reference editing mode from a play mode. When editing is completed, for example, when cropping is completed, the second account may enter the video reference generation interface from the reference editing interface, or enter the video reference generation mode from the reference editing mode in the reference editing interface.
In step 303, the second account client uploads the reference to the target video to the server. For example, the reference generated in step 302 may be uploaded directly to the server in response to its generation. Alternatively, when the reference to the target video is generated, the second account client uploads it to the server according to an upload-reference triggering operation of the user. Although not shown, the second account client may also save the reference to the target video locally. In addition, on the server side, the videos served by the video playing module may be stored separately from the reference library that stores video references. For example, videos of different accounts may be stored in different storage objects, and references of different accounts may likewise be stored in different reference libraries.
In step 304, the second account client queries the server for a reference to the target video. For example, the second account client may query its own reference library to obtain at least one video reference in the references. For example, the client may generate a video to be published based on the at least one reference. For example, the client may generate a video to be published based on the reference to the target video.
In step 305, the second account client uploads the published video to the server. It should be understood that the published video may in turn be referenced by other accounts. For example, the first account or another account may generate a video reference based on the published video by obtaining a video stream of the published video and editing that video stream.
In a video blog application scenario, the second account may belong to the same topic group or the same information publishing group as the first account, or the second account may be in a video blog group of the same category as the first account, for example, a food video category, a handcraft video category, an organizing video category, a travel video category, and the like. The first account may be a subscription account or a followed account of the second account; the second account may likewise be a subscription account or a followed account of the first account. The first account and the second account may also be accounts that follow each other.
When the second account publishes the reference video, the server may push a message to the first account to notify it that another account has referenced its video. For example, the reference by the second account may be shown on the first account's video in the form of a video comment. The server may also push the number of times the video has been referenced to the first account in real time. Alternatively, the server may not push reference counts at all; instead, the first account may view, in its personal settings, the total number of times its videos have been referenced, the number of times each individual video has been referenced, the total length of time for which its videos have been referenced, and a popularity ranking of its videos by reference count. The same applies to the second account: its published videos may be referenced by other accounts, and it may in turn receive corresponding notifications.
In a live video scene, the second account may belong to the same topic group or the same information publishing group as the first account, or the second account may be in a live group of the same category as the first account, for example, a food live category, a handcraft live category, an organizing live category, a travel live category, an e-commerce live category, an education live category, and the like. It should be understood that the first account may be a subscription account or a followed account of the second account, and vice versa, or the two accounts may follow each other. In this example, however, for a particular live session, the first account is the live-streaming account and the second account is a viewing account.
Specifically, in an e-commerce live broadcast scene, the second account may reference a video clip of a specific commodity from the live broadcast and publish the clip on the platform to recommend the commodity. Accordingly, the server may push the reference of the video clip to other accounts having a follow relationship with the second account, and those accounts may in turn forward or re-reference (e.g., perform secondary authoring on) the video clip.
Fig. 4A is a schematic flow chart of a video processing method according to another embodiment of the invention. The video processing method of fig. 4A may be performed by any suitable electronic device having data processing capabilities, including but not limited to: a server, a mobile terminal (such as a mobile phone, a PAD, etc.), a PC, and the like. For example, the method of the present embodiment may be performed by a server or a device having service capabilities, specifically a cloud server such as a public cloud, a private cloud, or a hybrid cloud. The method comprises the following steps:
410: transmitting a play video stream of the target video so as to generate a reference of the target video by editing the play video stream;
420: a reference to the target video is received and saved.
In another implementation of the present invention, transmitting the play video stream of the target video includes: receiving a transmission request generated according to a triggering operation by a second account on a video card of a first account; and transmitting the play video stream of the target video to the second account according to the transmission request.
In another implementation of the present invention, receiving and saving a reference to a target video includes: and storing the reference of the target video in a video reference library of the second account.
In another implementation of the invention, the method further comprises: determining the current reference times of the target video; and responding to the reference of the target video sent by the client, and updating the current reference times.
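The bookkeeping described in the two implementations above (saving a reference in the second account's video reference library, determining the current reference count of the target video, and updating it in response to a reference sent by the client) might look like the following minimal sketch; all names are illustrative assumptions:

```python
# Hypothetical server-side reference store, combining the reference library
# and the per-video reference counter described in the text.
class ReferenceStore:
    def __init__(self):
        self.references = {}   # account id -> list of saved references (the reference library)
        self.counts = {}       # video id -> current reference count

    def current_count(self, video_id):
        # Determine the current reference count of the target video.
        return self.counts.get(video_id, 0)

    def save_reference(self, account_id, video_id, reference):
        # Store the reference in the account's video reference library...
        self.references.setdefault(account_id, []).append(reference)
        # ...and update the current reference count in response.
        self.counts[video_id] = self.current_count(video_id) + 1

store = ReferenceStore()
store.save_reference("second_account", "video_1", {"clip": (0.0, 15.0)})
store.save_reference("third_account", "video_1", {"clip": (5.0, 9.0)})
print(store.current_count("video_1"))  # 2
```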
Fig. 4B is a schematic interaction diagram of a video processing method according to another embodiment of the invention. As shown in the figure, in step 401, the second account client obtains a video stream of the target video from the server. In step 402, the second account client generates a reference to the target video.
For example, in a video blog application scenario, the second account may obtain a video stream of a video blog by triggering a play trigger control of the video blog (VLOG) of the first account. The second account may enter a reference editing interface of the video blog by triggering a reference triggering control of the video blog, for example, entering a reference editing mode from a play mode. When editing is completed, for example, when cropping is completed, the second account may enter the video reference generation interface from the reference editing interface, or enter the video reference generation mode from a reference editing mode in the reference editing interface.
In addition, in one example, the reference trigger control and the play trigger control may be disposed in the same interface and may control entry into the reference editing mode and the play mode, respectively. When reference editing is completed, entry into the reference generation mode may be triggered automatically. In another example, the reference trigger control and the play trigger control may be disposed in the play interface and the video card interface (or the video content introduction interface), respectively. For example, the play trigger control may be used to switch from the video card interface to the video play interface.
In a live scene, the second account may acquire the video stream of the live video of the first account by triggering the play trigger control of the live video. The second account may enter a reference editing interface of the live video by triggering the reference trigger control of the live video, for example, entering a reference editing mode from a play mode. When editing is completed, for example, when cropping is completed, the second account may enter the video reference generation interface from the reference editing interface, or enter the video reference generation mode from the reference editing mode in the reference editing interface.
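The mode transitions described in the examples above (play mode to reference editing mode via the reference trigger control, then to reference generation mode when editing completes, with a return path on failure) can be modeled as a small state machine. The mode and event names below are assumptions for illustration only:

```python
# Hypothetical state machine for the interface/mode flow described above.
class ReferenceFlow:
    TRANSITIONS = {
        ("play", "reference_trigger"): "reference_editing",
        ("reference_editing", "editing_complete"): "reference_generation",
        ("reference_generation", "generation_failed"): "reference_editing",
        ("reference_generation", "generation_succeeded"): "play",
    }

    def __init__(self):
        self.mode = "play"

    def handle(self, event):
        # Ignore events that are not valid in the current mode.
        next_mode = self.TRANSITIONS.get((self.mode, event))
        if next_mode is not None:
            self.mode = next_mode
        return self.mode

flow = ReferenceFlow()
print(flow.handle("reference_trigger"))   # reference_editing
print(flow.handle("editing_complete"))    # reference_generation
```

The failure transition back to `reference_editing` matches the re-editing behavior described later for the case where reference generation fails.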
In step 403, the second account client sends the reference of the target video to the server. For example, the server stores the reference of the target video in the reference library of the second account.
Fig. 5A is a schematic diagram of an interface change of a video processing method according to another embodiment of the invention. As shown, the left diagram shows a video playback interface (or video playback mode), i.e., the current video frame played from the playback video stream of the target video. The middle diagram shows a reference editing interface (or reference editing mode), i.e., a video segment determined from the current video frame (the video segment to be edited) is edited using a slidable edit bar. The right diagram shows a reference generation interface (or reference generation mode), i.e., generation processing is performed based on the video segment obtained by cropping the video segment to be edited.
In one example, when generation of a reference is performed in the reference generation interface, the generation may fail or succeed. For example, the video segment to be edited may be determined based on the current video frame, and the current video frame may be determined in response to a trigger timing in the video playback interface. The video segment to be edited may include a played portion and/or an unplayed portion; preferably, it includes both a played portion and an unplayed portion. The unplayed portion may include a downloaded portion and/or an un-downloaded portion. Editing processing such as cropping may be performed on the video segment to be edited, resulting in an edited video segment. The edited video segment may then be transcoded to generate the reference. In one example, transcoding may occur while downloading, e.g., a second video frame that precedes a first video frame may be transcoded while the first video frame is being downloaded. Alternatively, unified transcoding may be performed after the edited video segment has been fully downloaded.
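The transcode-while-downloading idea in this paragraph can be sketched as an interleaved loop in which, each time a frame is downloaded, the previously downloaded frame is transcoded. The `download` and `transcode` functions below are stand-in stubs, not real media operations; everything here is an illustrative assumption:

```python
# Hypothetical interleaving of download and transcode for the edited segment.
def download(frame_index):
    # Stub: pretend to fetch one frame of the un-downloaded portion.
    return f"frame-{frame_index}"

def transcode(frame):
    # Stub: pretend to transcode a single frame.
    return frame.upper()

def generate_reference(start, end):
    transcoded = []
    pending = None
    for i in range(start, end):
        frame = download(i)
        if pending is not None:
            # Transcode the previous frame while the next one downloads,
            # i.e., an earlier frame is transcoded during a later download.
            transcoded.append(transcode(pending))
        pending = frame
    if pending is not None:
        transcoded.append(transcode(pending))  # final frame after the last download
    return transcoded

print(generate_reference(0, 3))  # ['FRAME-0', 'FRAME-1', 'FRAME-2']
```

The "unified transcoding" alternative would instead collect all downloaded frames first and transcode them in one pass after the download completes.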
A reference generation time threshold may be set, and when generation of a reference is not completed within the reference generation time threshold, reference generation fails. For example, a prompt window may be presented and a reference edit interface returned. If the generation of the reference is completed within the reference generation time threshold, the reference generation is successful, and a prompt may be presented that the reference generation was successful, or a reference library viewing interface may be entered.
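The reference generation time threshold described above can be sketched as a simple deadline check around a polled generation step. The threshold value, function names, and returned status strings are all assumptions for the sketch:

```python
# Hypothetical deadline check for reference generation.
import time

GENERATION_TIMEOUT_S = 2.0  # illustrative value for the reference generation time threshold

def generate_with_threshold(generate_step, deadline=GENERATION_TIMEOUT_S):
    """Poll generate_step() until it reports completion or the threshold passes."""
    start = time.monotonic()
    while not generate_step():
        if time.monotonic() - start >= deadline:
            # Generation did not finish within the threshold: the client would
            # present a prompt window and return to the reference editing interface.
            return "generation_failed"
    # Generation finished in time: the client would present a success prompt
    # or enter the reference library viewing interface.
    return "generation_succeeded"

# Simulated generation that completes on the third polling step.
steps = iter([False, False, True])
print(generate_with_threshold(lambda: next(steps)))  # generation_succeeded
```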
Fig. 5B is a schematic diagram of an interface change of a video processing method according to another embodiment of the invention. As shown, the left diagram illustrates a video card presentation interface (or video card presentation mode), i.e., a page or interface that a user (e.g., the second account) browses or interacts with in a browser or video application. The state of the middle diagram may be entered based on a trigger operation on a particular video card (e.g., a second video card). The middle diagram shows the video playback interface (or video playback mode), i.e., the current video frame played from the playback video stream of the target video. The right diagram shows a reference editing interface (or reference editing mode), i.e., a video segment determined from the current video frame (the video segment to be edited) is edited using a slidable edit bar. It should be understood that, although not shown, this example may also include a reference generation interface (or reference generation mode), i.e., generation processing performed based on the video segment obtained by cropping the video segment to be edited. It should also be understood that the video cards described above are merely examples. In addition, although this example shows interface switching between the interface where the video card is located and the video playing interface, in other examples the target video may be played within the video card, and the interface where the video card is located may be switched directly to the reference editing interface. The embodiment of the present invention is not limited thereto.
FIG. 6 is a schematic block diagram of a video processing device of another embodiment of the present invention. The video processing apparatus of fig. 6 includes:
the obtaining module 610 obtains a playing video stream of the target video.
And the control module 620 enters a reference editing interface according to the reference triggering operation of the played video stream.
The editing module 630 edits the played video stream in the reference editing interface to generate a reference of the target video.
In the scheme of the embodiment of the invention, the played video stream of a video is edited and a reference of the video is generated accordingly, so that a user can perform secondary creation on the video, which improves the sharing effect.
In another implementation manner of the present invention, the obtaining module is specifically configured to: and acquiring a playing video stream of the target video from the server according to the triggering operation of the second account on the video card of the first account.
In another implementation manner of the invention, the video card of the first account is generated by uploading the target video to the server.
In another implementation manner of the present invention, the obtaining module is specifically configured to: and acquiring a playing video stream of the target video according to the triggering operation of the second account on the video introduction interface of the first account.
In another implementation manner of the present invention, the editing module is specifically configured to: in a reference editing interface, editing the played video stream in a reference editing mode; and responding to the editing result in the reference editing mode, entering a reference generation mode, and starting to generate the reference of the target video.
In another implementation manner of the present invention, the editing module is further configured to: if the generation of the reference of the target video is completed, save the reference of the target video and exit the reference editing interface.
In another implementation manner of the present invention, the editing module is further configured to: and if the reference generation of the target video fails, returning to the reference editing mode from the reference generation mode for re-editing.
In another implementation manner of the present invention, the control module is specifically configured to: enter the reference editing interface from the video playing interface according to the reference triggering operation on the played video stream, wherein the control module is further configured to: if the reference generation of the target video fails, return to the video playing interface from the reference editing interface.
In another implementation manner of the present invention, the editing module is specifically configured to: in a reference editing interface, cutting a played video stream to obtain a cut video; and transcoding the cut video to generate the reference of the target video.
In another implementation manner of the present invention, the apparatus further includes an upload module: and uploading the reference of the second account to the target video to the server so as to store the reference of the target video in a video reference library of the second account.
In another implementation manner of the present invention, the editing module is specifically configured to: display at least part of the played video stream in the reference editing interface as a video to be edited; and edit the video to be edited through the editing time axis of the video to be edited.
In another implementation of the present invention, the at least part of the played video stream includes an unplayed portion of the played video stream, and the apparatus further comprises: a reference generation module that generates the reference of the target video by downloading the unplayed portion after the editing is finished.
In another implementation manner of the present invention, a crop box control referencing a video is displayed on an editing timeline, wherein the editing module is specifically configured to: determining a cropping interval of the reference video by adjusting the cropping frame control on the editing timeline; and performing clipping processing on the playing video stream based on the clipping time interval.
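The crop box control on the editing timeline can be sketched as a mapping from the control's position to a cropping period on the video, followed by filtering frames to that period. The pixel positions, timeline width, duration, and frame rate below are illustrative assumptions:

```python
# Hypothetical mapping from a crop-box control on the editing timeline
# to a cropping period, followed by the cropping of frames.
def crop_period(box_left_px, box_right_px, timeline_width_px, video_duration_s):
    """Convert the crop box's position on the timeline into start/end times."""
    start_s = (box_left_px / timeline_width_px) * video_duration_s
    end_s = (box_right_px / timeline_width_px) * video_duration_s
    return start_s, end_s

def crop(frames_per_second, frames, period):
    """Keep only the frames whose timestamps fall inside the cropping period."""
    start_s, end_s = period
    return [f for i, f in enumerate(frames)
            if start_s <= i / frames_per_second < end_s]

# A 400 px timeline representing a 60 s video; the box covers 100-300 px.
period = crop_period(box_left_px=100, box_right_px=300,
                     timeline_width_px=400, video_duration_s=60.0)
print(period)  # (15.0, 45.0)
```

Adjusting either edge of the crop box simply changes `box_left_px` or `box_right_px`, which re-derives the cropping period used for the clipping processing.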
The apparatus of this embodiment is used to implement the corresponding method in the foregoing method embodiments, and has the beneficial effects of the corresponding method embodiments, which are not described herein again. In addition, the functional implementation of each module in the apparatus of this embodiment can refer to the description of the corresponding part in the foregoing method embodiment, and is not described herein again.
FIG. 7 is a schematic block diagram of a video processing device of another embodiment of the present invention. The video processing apparatus of fig. 7, comprising:
the obtaining module 710 obtains a reference of a target video of the first account by the second account, where the reference of the target video is generated according to editing of a playing video stream of the target video;
the generating module 720 generates a release video of the second account based on the reference of the target video.
The apparatus of this embodiment is used to implement the corresponding method in the foregoing method embodiments, and has the beneficial effects of the corresponding method embodiments, which are not described herein again. In addition, the functional implementation of each module in the apparatus of this embodiment can refer to the description of the corresponding part in the foregoing method embodiment, and is not described herein again.
FIG. 8 is a schematic block diagram of a video processing device of another embodiment of the present invention. The video processing apparatus of fig. 8 includes:
a transmission module 810 that transmits a play video stream of the target video so as to generate a reference of the target video by editing the play video stream;
a save module 820 receives and saves the reference to the target video.
In another implementation manner of the present invention, the transmission module is specifically configured to: receive a transmission request generated according to a triggering operation by a second account on a video card of a first account; and transmit the playing video stream of the target video to the client of the second account according to the transmission request.
In another implementation manner of the present invention, the saving module is specifically configured to: and storing the reference of the target video in a video reference library of the second account.
In another implementation of the present invention, the apparatus further comprises: the updating module is used for determining the current reference times of the target video; and responding to the reference of the target video sent by the client, and updating the current reference times.
The apparatus of this embodiment is used to implement the corresponding method in the foregoing method embodiments, and has the beneficial effects of the corresponding method embodiments, which are not described herein again. In addition, the functional implementation of each module in the apparatus of this embodiment can refer to the description of the corresponding part in the foregoing method embodiment, and is not described herein again.
Fig. 9 is a schematic structural diagram of an electronic device according to another embodiment of the invention; the electronic device may include:
one or more processors 901;
a computer-readable medium 902, which may be configured to store one or more programs,
which, when executed by the one or more processors, cause the one or more processors to implement the methods described in the embodiments above.
Fig. 10 is a hardware configuration of an electronic apparatus according to another embodiment of the present invention; as shown in fig. 10, the hardware structure of the electronic device may include: a processor 1001, a communication interface 1002, a computer-readable medium 1003, and a communication bus 1004;
wherein the processor 1001, the communication interface 1002, and the computer readable medium 1003 complete communication with each other through the communication bus 1004;
alternatively, the communication interface 1002 may be an interface of a communication module;
the processor 1001 may be specifically configured to: acquiring a playing video stream of a target video; entering a reference editing interface according to reference triggering operation of the played video stream; editing the playing video stream in the reference editing interface to generate a reference of the target video, or,
acquiring a reference of a second account to a target video of a first account, wherein the reference of the target video is generated according to editing of a playing video stream of the target video; generating a release video of the second account based on the reference of the target video, or,
transmitting a play video stream of a target video so as to generate a reference of the target video by editing the play video stream; a reference to the target video is received and saved, or,
acquiring a playing video stream of a target commodity video; generating a reference of the target commodity video by editing the played video stream; and the reference of the target commodity video is displayed in a correlation mode with the target commodity video, or,
acquiring a real-time video stream from a video display interface; generating at least one playback video reference for a real-time video stream by editing the real-time video stream; publishing the at least one playback video reference into the video presentation interface.
The processor 1001 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The various methods, steps, and logic blocks disclosed in the embodiments of the present invention may be implemented or performed by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
The computer-readable medium 1003 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code configured to perform the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section, and/or installed from a removable medium. The computer program performs the above-described functions defined in the method of the present invention when executed by a Central Processing Unit (CPU). It should be noted that the computer readable medium of the present invention may be a computer readable signal medium, a computer readable storage medium, or any combination of the two. The computer readable medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code configured to carry out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions configured to implement the specified logical function(s). In the above embodiments, specific precedence relationships are provided, but these precedence relationships are only exemplary, and in particular implementations, the steps may be fewer, more, or the execution order may be modified. That is, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present invention may be implemented by software or hardware. The names of these modules do not in some cases constitute a limitation of the module itself.
As another aspect, the present invention also provides a computer-readable medium on which a computer program is stored, which when executed by a processor implements the method as described in the above embodiments.
As another aspect, the present invention also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments; or may be present separately and not assembled into the device. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: acquiring a playing video stream of a target video; entering a reference editing interface according to reference triggering operation of the played video stream; editing the playing video stream in the reference editing interface to generate a reference of the target video, or,
acquiring a reference of a second account to a target video of a first account, wherein the reference of the target video is generated according to editing of a playing video stream of the target video; generating a release video of the second account based on the reference of the target video, or,
transmitting a play video stream of a target video so as to generate a reference of the target video by editing the play video stream; a reference to the target video is received and saved, or,
acquiring a playing video stream of a target commodity video; generating a reference of the target commodity video by editing the played video stream; and the reference of the target commodity video is displayed in a correlation mode with the target commodity video, or,
acquiring a real-time video stream from a video display interface; generating at least one playback video reference for a real-time video stream by editing the real-time video stream; publishing the at least one playback video reference into the video presentation interface.
The expressions "first", "second", "said first", or "said second" used in various embodiments of the present disclosure may modify various components regardless of order and/or importance, but these expressions do not limit the respective components. They are used only to distinguish one element from another. For example, a first user equipment and a second user equipment represent different user equipment, although both are user equipment. A first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the present disclosure.
When an element (e.g., a first element) is referred to as being "(operably or communicatively) coupled" or "connected" to another element (e.g., a second element), it should be understood that the element may be connected to the other element directly, or indirectly via yet another element (e.g., a third element). In contrast, when an element (e.g., a first element) is referred to as being "directly connected" or "directly coupled" to another element (e.g., a second element), no element (e.g., a third element) is interposed between them.
The foregoing description is only exemplary of the preferred embodiments of the invention and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention is not limited to the specific combination of the above-mentioned features, and also encompasses other embodiments formed by any combination of the above-mentioned features or their equivalents without departing from the scope of the invention as defined by the appended claims, for example, embodiments formed by mutually replacing the above features with (but not limited to) features having similar functions disclosed in the present invention.

Claims (26)

1. A video processing method, comprising:
acquiring a playing video stream of a target video;
entering a reference editing interface according to reference triggering operation of the played video stream;
and editing the played video stream in the reference editing interface to generate the reference of the target video.
2. The method of claim 1, wherein the method further comprises:
and displaying the reference of the target video and the target video in association.
3. The method of claim 2, wherein said displaying the reference of the target video and the target video in association comprises:
and displaying the referenced playing interface entry of the target video in the playing interface of the target video.
4. The method of claim 2, wherein said displaying the reference of the target video and the target video in association comprises:
and displaying an entry of a display interface of a reference list in the playing interface of the target video, wherein the reference list comprises the reference of the target video.
5. The method of claim 4, wherein the method further comprises:
entering the display interface according to the triggering operation of the entry of the display interface in the playing interface of the target video;
and displaying the reference list by arranging video reference playing areas in the display interface.
6. The method of claim 5, wherein the method further comprises:
and displaying a playing interface inlet of the target video in the display interface.
7. The method of claim 1, wherein said editing the playback video stream in the reference editing interface comprises:
in the reference editing interface, editing the playing video stream in a reference editing mode;
and responding to the editing result in the reference editing mode, entering a reference generation mode, and starting to generate the reference of the target video.
8. The method of claim 7, wherein said editing the playback video stream in the reference editing interface further comprises:
if the generation of the reference of the target video is finished, saving the reference of the target video and exiting from the reference editing interface.
9. The method of claim 7, wherein said editing the playback video stream in the reference editing interface further comprises:
and if the reference generation of the target video fails, returning to the reference editing mode from the reference generation mode for re-editing.
10. The method of claim 1, wherein the entering a reference editing interface according to a reference triggering operation on the playing video stream comprises:
according to the reference triggering operation of the played video stream, the video playing interface enters a reference editing interface, wherein the method further comprises the following steps:
and if the reference generation of the target video fails, returning to the video playing interface from the reference editing interface.
11. The method of claim 1, wherein said editing the play video stream in the reference editing interface to generate the reference to the target video comprises:
in the reference editing interface, clipping the played video stream to obtain a clipped video;
and transcoding the cut video to generate the reference of the target video.
12. The method of claim 1, wherein said editing the playback video stream in the reference editing interface comprises:
displaying at least part of the played video stream in the reference editing interface as a video to be edited;
and editing the video to be edited through the editing time axis of the video to be edited.
13. The method of claim 12, wherein the at least a portion of the playback video stream comprises an unplayed portion of the playback video stream, the method further comprising:
after the editing is completed, generating a reference to the target video by downloading the unplayed portion.
14. The method of claim 12, wherein a crop box control of the reference video is exposed on the editing timeline, wherein,
the editing the video to be edited through the editing time axis of the video to be edited includes:
determining a cropping period of the reference video by adjusting a cropping frame control on the editing timeline;
and performing clipping processing on the playing video stream based on the clipping time interval.
15. A video processing method, comprising:
acquiring a reference of a second account to a target video of a first account, wherein the reference to the target video is generated by editing a playing video stream of the target video;
and generating a publishing video of the second account based on the reference to the target video.
16. A video processing method, comprising:
transmitting a playing video stream of a target video, so that a reference to the target video is generated by editing the playing video stream;
and receiving and saving the reference to the target video.
17. The method of claim 16, wherein transmitting the playing video stream of the target video comprises:
receiving a transmission request generated by a triggering operation of a second account on a video card of a first account;
and transmitting the playing video stream of the target video to the second account according to the transmission request.
18. The method of claim 17, wherein receiving and saving the reference to the target video comprises:
storing the reference to the target video in a video reference library of the second account.
19. A video processing method, comprising:
acquiring a playing video stream of a target commodity video;
generating a reference to the target commodity video by editing the playing video stream;
and displaying the reference to the target commodity video in association with the target commodity video.
20. A video processing method, comprising:
acquiring a real-time video stream from a video presentation interface;
generating at least one playback video reference for the real-time video stream by editing the real-time video stream;
and publishing the at least one playback video reference to the video presentation interface.
21. The method of claim 20, wherein generating the at least one playback video reference for the real-time video stream by editing the real-time video stream comprises:
editing the real-time video stream into at least one video segment;
and transcoding the at least one video segment to obtain at least one playback video of the real-time video stream.
22. A video processing apparatus, comprising:
an acquisition module configured to acquire a playing video stream of a target video;
a control module configured to enter a reference editing interface according to a reference triggering operation on the playing video stream;
and an editing module configured to edit the playing video stream in the reference editing interface to generate a reference to the target video.
23. A video processing apparatus applied to a client, comprising:
an acquisition module configured to acquire, from a server, a reference of a second account to a target video of a first account, wherein the reference to the target video is generated by editing a playing video stream of the target video;
and a generation module configured to generate a publishing video of the second account based on the reference to the target video.
24. A video processing apparatus applied to a server, comprising:
a transmission module configured to transmit a playing video stream of a target video to a client, so that the client edits the playing video stream to generate a reference to the target video;
and a storage module configured to store the reference to the target video sent by the client.
25. An electronic device, comprising:
one or more processors;
and a computer-readable medium configured to store one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1 to 21.
26. A computer-readable medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of claims 1 to 21.
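Claims 12 and 14 above describe editing through an editing timeline on which a crop box control determines a cropping period that is then used to clip the playing video stream. A minimal sketch of that handle-to-timestamp mapping follows; it is illustrative only, and all names (`crop_period`, `timeline_width_px`, the `(timestamp, payload)` frame tuples) are assumptions of this sketch, not terminology from the patent:

```python
def crop_period(handle_left_px, handle_right_px, timeline_width_px, video_duration_s):
    """Map the crop box handle positions on a pixel-based editing
    timeline (claim 14) to a cropping period (start_s, end_s)."""
    if not (0 <= handle_left_px <= handle_right_px <= timeline_width_px):
        raise ValueError("handles must lie inside the timeline, left <= right")
    scale = video_duration_s / timeline_width_px  # seconds per pixel
    return handle_left_px * scale, handle_right_px * scale


def clip_stream(frames, period):
    """Keep only the frames whose timestamp falls inside the cropping
    period. `frames` is a list of (timestamp_s, payload) tuples standing
    in for the buffered playing video stream."""
    start, end = period
    return [frame for frame in frames if start <= frame[0] <= end]


if __name__ == "__main__":
    # A 480-px timeline representing a 120 s video, handles at 96 px and 240 px.
    period = crop_period(96, 240, 480, 120.0)
    print(period)  # (24.0, 60.0)
    frames = [(t, None) for t in range(0, 121, 10)]
    print(len(clip_stream(frames, period)))  # frames at 30, 40, 50, 60 s -> 4
```

The mapping is a simple linear scale from pixels to seconds; in claim 11 the clipped result would then be transcoded to produce the reference to the target video, a step omitted here.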
CN202010345562.5A 2020-04-27 2020-04-27 Video processing method and device, electronic equipment and storage medium Pending CN113645482A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010345562.5A CN113645482A (en) 2020-04-27 2020-04-27 Video processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113645482A (en) 2021-11-12

Family

ID=78415073

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010345562.5A Pending CN113645482A (en) 2020-04-27 2020-04-27 Video processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113645482A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012166014A2 (en) * 2011-06-03 2012-12-06 Rawllin International Inc. Generating, editing, and sharing movie quotes
CN105519095A (en) * 2014-12-14 2016-04-20 深圳市大疆创新科技有限公司 Video processing processing method, apparatus and playing device
CN106303723A (en) * 2016-08-11 2017-01-04 网易(杭州)网络有限公司 Method for processing video frequency and device

Similar Documents

Publication Publication Date Title
US11671645B2 (en) System and method for creating customized, multi-platform video programming
US11615131B2 (en) Method and system for storytelling on a computing device via social media
WO2022028126A1 (en) Live streaming processing method and apparatus, and electronic device and computer readable storage medium
CN112383566B (en) Streaming media presentation system
US9639254B2 (en) Systems and methods for content aggregation, editing and delivery
US10452250B2 (en) Method for associating media files with additional content
US7769819B2 (en) Video editing with timeline representations
US7809802B2 (en) Browser based video editing
US8156176B2 (en) Browser based multi-clip video editing
CN104516892B (en) It is associated with dissemination method, system and the terminal of the user-generated content of rich media information
CN107920274B (en) Video processing method, client and server
US20140188997A1 (en) Creating and Sharing Inline Media Commentary Within a Network
US20140245334A1 (en) Personal videos aggregation
US20140068677A1 (en) System and method for generating content channels
WO2012031143A1 (en) Method and apparatus for embedding media programs having custom user selectable thumbnails
CN108737903B (en) Multimedia processing system and multimedia processing method
KR101982853B1 (en) Method for providing interaction based mobile streaming service using realtime customized platform
CN113645482A (en) Video processing method and device, electronic equipment and storage medium
KR102303309B1 (en) Method and system for sharing the time link of multimedia
CN116484031A (en) Multimedia interaction method, device, storage medium and equipment
WO2023073382A1 (en) Method for tracking distribution of a shared digital media file
CN117828109A (en) Data processing method, device, storage medium and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination