CN112702656A - Video editing method and video editing device - Google Patents

Video editing method and video editing device

Info

Publication number
CN112702656A
Authority
CN
China
Prior art keywords
animation effect
transition animation
video
duration
clip control
Prior art date
Legal status
Pending
Application number
CN202011518892.6A
Other languages
Chinese (zh)
Inventor
谢松伦
韩乔
Current Assignee
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202011518892.6A priority Critical patent/CN112702656A/en
Publication of CN112702656A publication Critical patent/CN112702656A/en
Priority to PCT/CN2021/104096 priority patent/WO2022134524A1/en

Classifications

    • H04N 21/472 — End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/4312 — Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/44016 — Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for substituting a video clip

Abstract

The present disclosure relates to a video editing method and a video editing apparatus. The video editing method may include the steps of: acquiring and displaying at least two video materials to be spliced; receiving a selection of a transition animation effect for splicing two adjacent video materials; in response to the selection, displaying a clip control for changing a duration of the transition animation effect; changing the duration of the transition animation effect based on a change in the length of the clip control; and splicing the two adjacent video materials according to the changed transition animation effect to obtain a merged video material. The method and apparatus allow the duration of the transition animation effect to be changed more flexibly.

Description

Video editing method and video editing device
Technical Field
The present disclosure relates to the field of video processing, and in particular, to a video editing method and a video editing apparatus for adjusting transition time.
Background
With the continued popularization and development of video (especially short video), more and more users are making and sharing videos on video platforms. When making a video, a user can splice two video clips into a complete video. To make the two video clips transition naturally during playback, transition processing needs to be performed at the joint between the videos when multiple video clips are spliced. At present, transition processing during video splicing is simple and mostly relies on plain splicing or on inserting material such as video clips and pictures as the transition.
Disclosure of Invention
The present disclosure provides a video editing method and a video editing apparatus to at least solve the problem in the related art that adjusting the transition time is cumbersome. The technical solution of the present disclosure is as follows:
according to a first aspect of the embodiments of the present disclosure, there is provided a video editing method, which may include the steps of: acquiring and displaying at least two video materials to be spliced; receiving a selection of a transition animation effect for splicing two adjacent video materials; in response to the selection, displaying a clip control for changing a duration of the transition animation effect; changing a duration of the transition animation effect based on a change in a length of the clip control; and splicing the two adjacent video materials according to the changed transition animation effect to obtain a combined video material.
Optionally, the clip control may be displayed above or below the two adjacent video materials.
Optionally, the vertical centerline of the clip control may be aligned with the splice point of the two adjacent video materials.
Optionally, the length of the clip control and the duration of the transition animation effect may be displayed in proportion.
Optionally, a progress bar corresponding to the length of time that the two adjacent video materials are covered by the transition animation effect may be displayed in the clip control.
Optionally, the step of changing the duration of the transition animation effect based on the change in the length of the clip control may include: receiving a user input for dragging at least one of a left end and a right end of the clip control; and changing the length of the clip control to change the duration of the transition animation effect in accordance with the user input.
Optionally, the step of changing the length of the clip control to change the duration of the transition animation effect in accordance with the user input may comprise: changing the duration of the first half of the transition animation effect based on the left end of the clip control being dragged, wherein the first half corresponds to the first video material of the two adjacent video materials; and/or changing the duration of the second half of the transition animation effect based on the right end of the clip control being dragged, wherein the second half corresponds to the second video material of the two adjacent video materials.
Optionally, when the left end of the clip control is dragged, a change of the progress bar in the clip control corresponding to the duration of the first half of the transition animation effect may be correspondingly displayed; when the right end of the clip control is dragged, a change of the progress bar in the clip control corresponding to the duration of the second half of the transition animation effect may be correspondingly displayed.
Optionally, the user input may comprise one of: a hovering input, a touch input, and an input generated by an input tool.
According to a second aspect of the embodiments of the present disclosure, there is provided a video editing apparatus, which may include: the acquisition module is configured to acquire and display at least two video materials to be spliced; a receiving module configured to receive a selection of a transition animation effect for stitching two adjacent video materials; and a processing module configured to: in response to the selection, displaying a clip control for changing a duration of the transition animation effect; changing a duration of the transition animation effect based on a change in a length of the clip control; and splicing the two adjacent video materials according to the changed transition animation effect to obtain a combined video material.
Optionally, the clip control may be displayed above or below the two adjacent video materials.
Optionally, the vertical centerline of the clip control may be aligned with the splice point of the two adjacent video materials.
Optionally, the length of the clip control and the duration of the transition animation effect may be displayed in proportion.
Optionally, a progress bar corresponding to the length of time that the two adjacent video materials are covered by the transition animation effect may be displayed in the clip control.
Optionally, the receiving module may be configured to receive a user input for dragging at least one of a left end and a right end of the clip control.
Optionally, the processing module may be configured to change a length of the clip control to change a duration of the transition animation effect in accordance with the user input.
Optionally, the processing module may be configured to: change the duration of the first half of the transition animation effect based on the left end of the clip control being dragged, wherein the first half corresponds to the first video material of the two adjacent video materials; and/or change the duration of the second half of the transition animation effect based on the right end of the clip control being dragged, wherein the second half corresponds to the second video material of the two adjacent video materials.
Optionally, when the left end of the clip control is dragged, a change of the progress bar in the clip control corresponding to the duration of the first half of the transition animation effect may be correspondingly displayed, and when the right end of the clip control is dragged, a change of the progress bar in the clip control corresponding to the duration of the second half of the transition animation effect may be correspondingly displayed.
Optionally, the user input comprises one of: a hovering input, a touch input, and an input generated by an input tool.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus, which may include: at least one processor; at least one memory storing computer-executable instructions, wherein the computer-executable instructions, when executed by the at least one processor, cause the at least one processor to perform the video editing method as described above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium storing instructions that, when executed by at least one processor, cause the at least one processor to perform the video editing method as described above.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer program product, instructions of which are executed by at least one processor in an electronic device to perform the video editing method as described above.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
with the method and apparatus, the transition time can be adjusted more conveniently and flexibly, the proportion of the duration occupied by the front and rear parts of the transition animation effect can be adjusted, and the user can thus edit the video more accurately.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
Fig. 1 is a user interface for transition processing in the related art.
Fig. 2 is a diagram illustrating a window for setting a transition animation effect duration in the related art.
Fig. 3 is a flowchart illustrating a video editing method according to an exemplary embodiment of the present disclosure.
Fig. 4 is a user interface for splicing two video materials, shown in accordance with an exemplary embodiment of the present disclosure.
Fig. 5 is a schematic structural diagram illustrating a video editing apparatus according to an exemplary embodiment of the present disclosure.
Fig. 6 is a block diagram illustrating a video editing apparatus according to an exemplary embodiment of the present disclosure.
Fig. 7 is a block diagram illustrating an electronic device according to an exemplary embodiment of the present disclosure.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of the embodiments of the disclosure as defined by the claims and their equivalents. Various specific details are included to aid understanding, but these are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to the written meaning, but are used only by the inventors to achieve a clear and consistent understanding of the disclosure. Accordingly, it should be apparent to those skilled in the art that the following descriptions of the various embodiments of the present disclosure are provided for illustration only and not for the purpose of limiting the disclosure as defined by the appended claims and their equivalents.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Each paragraph (the smallest unit constituting a video film is a shot, and connecting individual shots together forms a sequence of shots) has a single and relatively complete meaning, such as representing an action process, a correlation, or an idea. It is a complete narrative layer in a television film, like a scene in a drama or a chapter in a novel, and the individual paragraphs connected together form a complete television film. Paragraphs are therefore the most basic structural form of a video film, and the structural hierarchy of its content is expressed through paragraphs. The switch between paragraphs, or between scenes, is called a transition.
The overall transition time of a transition can be adjusted. Usually, when transition processing is performed, the application's transition feature may display a corresponding adjustment interface to the user; the user can set the duration of the transition through this interface, and after the setting is completed, the application completes the transition according to the newly set total transition duration. For example, a user interface for the transition process is shown in fig. 1.
Fig. 1 is a user interface for transition processing in the related art.
Referring to fig. 1, a user may select a style of transition animation effect on the user interface and then drag the transition control corresponding to the displayed transition animation effect into the middle of two video clips. When the duration of the transition animation effect needs to be adjusted, the transition control can be clicked, at which point a setting window pops up, as shown in fig. 2. The user can input the duration of the transition animation effect in the setting window to complete the setting.
Although the duration of the transition animation effect can be set using the user interface of fig. 1, the adjustment operation is cumbersome, and the durations of the front and rear portions of the transition animation effect cannot be adjusted individually.
According to the embodiment of the disclosure, the duration of the transition animation effect can be set by adding the clipping control and dragging the two ends of the clipping control, and the durations of the parts before and after the transition can be adjusted more flexibly. Hereinafter, a method, an apparatus, and a device of the present disclosure will be described in detail with reference to the accompanying drawings, according to various embodiments of the present disclosure.
Fig. 3 is a flowchart illustrating a video editing method according to an exemplary embodiment. As shown in fig. 3, the method may be used for transition processing between at least two video materials/segments. The method shown in fig. 3 may be performed by any electronic device having an image processing function. The electronic device may be, for example, at least one of: a smart phone, a tablet personal computer (PC), a mobile phone, a video phone, an electronic book reader (e-book reader), a desktop PC, a laptop PC, a netbook computer, a workstation, a server, a personal digital assistant (PDA), a portable multimedia player (PMP), a camera, a wearable device, and the like.
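As a rough illustration, the flow of steps S11 to S15 can be modeled independently of any UI framework. The class and method names below are illustrative assumptions, not part of the disclosed implementation, and the splice step assumes the transition is overlaid on existing frames (it "covers" them rather than replacing them), so the merged duration equals the sum of the clip durations:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Transition:
    style: str
    first_half: float = 0.5   # seconds of effect over the tail of the first clip
    second_half: float = 0.5  # seconds of effect over the head of the second clip

    @property
    def duration(self) -> float:
        return self.first_half + self.second_half

@dataclass
class Editor:
    clips: List[float] = field(default_factory=list)  # S11: durations (s) of acquired materials
    transition: Optional[Transition] = None

    def select_transition(self, style: str) -> Transition:
        # S12/S13: selecting a style creates the model behind the clip control
        self.transition = Transition(style)
        return self.transition

    def splice(self) -> float:
        # S15: under the overlay assumption above, the merged duration is
        # simply the sum of the clip durations
        assert self.transition is not None, "select a transition first (S12)"
        return sum(self.clips)

editor = Editor(clips=[10.0, 8.0])
editor.select_transition("fade")
```

Steps S14's drag handling would then mutate `first_half` and `second_half` on the `Transition` model, which is sketched separately below.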
Referring to fig. 3, in step S11, at least two video materials to be spliced are acquired and displayed. For example, when a user inputs two or more video clips (video material) to the electronic device, a user interface for editing the video clips may be displayed, such as shown in fig. 4. Fig. 4 is a user interface for splicing two video materials, shown in accordance with an exemplary embodiment of the present disclosure.
In fig. 4, the user interface may display style buttons for the transition animation effects and the video clips to be edited.
In step S12, a selection of a transition animation effect for splicing two adjacent video materials is received. The user may select the style of transition animation effect that the user wants to add in the user interface of fig. 4. The user may select a favorite transition animation effect style by touching a style button, or the like.
In step S13, in response to the selection of the transition animation effect, a clip control for changing the duration of the transition animation effect is displayed.
The clip control may be a newly added track after the transition animation effect is selected, and the clip control may be a draggable control.
After the user selects the transition animation effect style, a clip control may be displayed above or below the adjacent two video materials to be edited, as shown in fig. 4. In this way, the user can more conveniently adjust the duration of the transition animation effect with reference to the video.
Alternatively, when a user clips a video and a transition animation effect needs to be added between two video materials, a transition control can be dragged between the two video materials. For example, after the transition control is added, a clip track can be added above or below the currently edited video for placing and displaying the clip control.
According to the embodiment of the disclosure, the clip control can be set to be rectangular and presented in a semi-transparent manner, with a handle on each of its left and right sides. However, the above example is merely exemplary, and the form and shape of the clip control may be set differently; the clip control may also have no handles on its left and right sides.
The vertical centerline of the clip control may be aligned with the splice point of the two adjacent video materials to be edited, as shown in fig. 4. However, the above position is merely exemplary, and the present disclosure is not limited thereto.
The length of the clip control may be displayed proportionally to the duration of the transition animation effect.
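This proportional relationship can be sketched as a simple linear mapping between on-screen length and effect duration; the scale factor below is an assumed example value, not one specified by the disclosure:

```python
PIXELS_PER_SECOND = 60  # assumed display scale: 60 px of control length per second

def control_length(duration_s: float) -> float:
    """On-screen length of the clip control for a given effect duration."""
    return duration_s * PIXELS_PER_SECOND

def duration_from_length(length_px: float) -> float:
    """Inverse mapping: effect duration implied by the control's length."""
    return length_px / PIXELS_PER_SECOND
```

Because the mapping is linear and invertible, stretching the control by some number of pixels always changes the duration by the same fixed amount, which is what makes the control's length a faithful visual proxy for the duration.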
To enhance the visualization of the transition animation effect, a progress bar corresponding to the length of time that the two adjacent video materials are covered by the transition animation effect may be displayed in the clip control. That is, the clip control may display the portions of the two adjacent video materials that are overlaid by the transition animation effect. For example, the black bold bar in the clip control in fig. 4 is the progress bar. The progress bar reflects the real duration of the transition animation effect, that is, the duration for which the two adjacent videos are covered.
In addition, a label or icon for the user to select the transition animation effect may also be displayed in the clip control, so that the user may more conveniently view the selected transition animation effect.
The user interface shown in fig. 4 is merely exemplary and may be differently laid out and arranged according to design requirements.
In step S14, the duration of the transition animation effect is changed based on the change in the length of the clip control. The length of the clip control may be changed according to user input. For example, the user may individually change the length of the clip control by dragging its left or right end.
After receiving a user input for dragging at least one of the left and right ends of the clip control, the length of the clip control may be changed to change the duration of the transition animation effect according to the user input. Specifically, the duration of the first half of the transition animation effect, which corresponds to the first of the two adjacent video materials, may be changed based on the left end of the clip control being dragged, and the duration of the second half of the transition animation effect, which corresponds to the second of the two adjacent video materials, may be changed based on the right end of the clip control being dragged.
To adjust the duration of the transition animation effect more intuitively and conveniently, the length of the clip control can be divided into two parts: one part corresponds to the first half of the transition animation effect and the other to the second half. Correspondingly, the first half of the transition animation effect can correspond to the first video material (e.g., the earlier of the two video clips), and the second half corresponds to the second video material (e.g., the later of the two video clips).
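The independent adjustment of the two halves can be sketched as follows; the pixel-to-seconds scale and the clamping bounds are illustrative assumptions:

```python
PIXELS_PER_SECOND = 60  # assumed display scale

def drag_handle(halves, end, delta_px, min_s=0.1, max_s=5.0):
    """Return updated (first_half, second_half) durations after a drag.

    halves   -- current (first_half_s, second_half_s) in seconds
    end      -- 'left' or 'right' handle of the clip control
    delta_px -- outward drag in pixels (negative shrinks that half)

    Only the dragged half changes; the other half is untouched, which
    is what lets the front and rear portions of the transition be
    adjusted independently. Results are clamped to assumed bounds.
    """
    first, second = halves
    delta_s = delta_px / PIXELS_PER_SECOND
    if end == "left":
        first = min(max(first + delta_s, min_s), max_s)
    else:
        second = min(max(second + delta_s, min_s), max_s)
    return first, second
```

For example, dragging the left handle 30 px outward lengthens only the first half by half a second, while an inward drag of the right handle shortens only the second half, stopping at the lower clamp.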
By way of example, when the duration of the transition effect needs to be adjusted, two ends of the clip control can be dragged, and the duration of the transition animation effect is correspondingly adjusted according to the change of the length of the dragged clip control. When the transition animation effect includes two animation effects (e.g., a front animation effect and a rear animation effect), if the left side of the clip control is dragged, the duration of the front animation effect may be adjusted, and if the right side of the clip control is dragged, the duration of the rear animation effect may be adjusted.
When the left end of the clip control is dragged, a change of the progress bar in the clip control corresponding to the duration of the first half of the transition animation effect is correspondingly displayed. When the right end of the clip control is dragged, a change of the progress bar in the clip control corresponding to the duration of the second half of the transition animation effect is correspondingly displayed. That is, the timeline in the clip control corresponding to the transition animation effect changes accordingly as the clip control is stretched left or right.
For example, in the case where the transition animation effect includes a first animation effect and a second animation effect, when the user drags the clip control to the left, the duration of the first animation effect becomes longer, and the progress bar in the clip control corresponding to the first animation effect also correspondingly stretches to the left. When the user drags the clip control to the right, the duration of the second animation effect is extended, and the progress bar in the clip control corresponding to the second animation effect correspondingly stretches to the right. The above examples merely illustrate the display effect, and the present disclosure is not limited thereto.
During the transition, the video may be completely occluded for a period of time. Accordingly, the duration for which the video is completely occluded can be displayed in the clip control, so that when the user adjusts the clip control by dragging, the user can also see the corresponding change in duration and tune the effect more accurately.
In addition, the user may also change the previously selected transition animation effect through the clip control. For example, each time the user touches an icon of a transition animation effect in the clip control, the current transition animation effect may be changed once in the order of transition animation styles in the user interface. However, the above examples are merely exemplary, and the present disclosure is not limited thereto.
In step S15, two adjacent video materials are spliced according to the changed transition animation effect to obtain a merged video material.
After the transition animation effect is adjusted, the transition animation effect can be inserted between two adjacent video segments to obtain a combined complete video.
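Assuming the first half of the effect covers the tail of the first clip and the second half covers the head of the second clip, the time ranges of the merged video covered by the transition can be derived as follows (an illustrative helper, not the disclosed implementation):

```python
def covered_ranges(first_clip_s, first_half_s, second_half_s):
    """Time ranges of the merged video covered by the transition.

    The splice point sits at the end of the first clip; the first half
    of the effect covers the frames just before it and the second half
    covers the frames just after it.
    """
    splice = first_clip_s
    return (splice - first_half_s, splice), (splice, splice + second_half_s)
```

These are the same two ranges the progress bar in the clip control visualizes, so the on-screen bar and the rendered output stay consistent by construction.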
According to the embodiment of the disclosure, the duration of the transition animation effect can be set by dragging the clipping control, and the durations of the front part and the rear part of the transition animation effect can be independently adjusted, so that a user can adjust the transition animation effect more conveniently and accurately.
Fig. 5 is a schematic structural diagram of a video editing apparatus of a hardware operating environment according to an embodiment of the present disclosure.
As shown in fig. 5, the video editing apparatus 500 may include: a processing component 501, a communication bus 502, a network interface 503, an input-output interface 504, a memory 505, and a power component 506. The communication bus 502 is used to enable communication among these components. The input-output interface 504 may include a video display (such as a liquid crystal display), a microphone and speakers, and a user-interaction interface (such as a keyboard, mouse, or touch-input device); optionally, the input-output interface 504 may also include a standard wired interface and a wireless interface. The network interface 503 may optionally include a standard wired interface and a wireless interface (e.g., a wireless fidelity interface). The memory 505 may be a high-speed random access memory or a stable non-volatile memory. The memory 505 may alternatively be a storage device separate from the processing component 501 described previously.
Those skilled in the art will appreciate that the configuration shown in fig. 5 does not constitute a limitation of the video editing apparatus 500 and may include more or fewer components than those shown, or some components in combination, or a different arrangement of components.
As shown in fig. 5, the memory 505, which is a storage medium, may include therein an operating system, a data storage module, a network communication module, a user interface module, a video editing program, and a database.
In the video editing apparatus 500 shown in fig. 5, the network interface 503 is mainly used for data communication with an external apparatus/terminal; the input/output interface 504 is mainly used for data interaction with a user; and the video editing apparatus 500 executes the video editing method provided by the embodiments of the present disclosure when the processing component 501 calls the video editing program stored in the memory 505.
The processing component 501 may include at least one processor, and the memory 505 has stored therein a set of computer-executable instructions that, when executed by the at least one processor, perform a video editing method according to an embodiment of the present disclosure. Further, the processing component 501 may perform encoding operations and decoding operations, among others. However, the above examples are merely exemplary, and the present disclosure is not limited thereto.
The processing component 501 may control the input output interface 504 to obtain and display at least two video materials to be stitched.
The processing component 501 may control the input-output interface 504 to receive a selection of a transition animation effect for splicing two adjacent video materials.
In response to selection of the transition animation effect, the processing component 501 may control the input-output interface 504 to display a clip control for changing the duration of the transition animation effect. A progress bar corresponding to the length of time that two adjacent video materials are covered by the transition animation effect may be displayed in the clip control.
The processing component 501 may control the input output interface 504 to display a clip control above or below two adjacent video materials. The vertical centerline of the clip control may be aligned with the splice of two adjacent video materials.
The processing component 501 may control the input output interface 504 to display the length of the clip control in proportion to the duration of the transition animation effect.
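The patent does not give an implementation of this proportional display; a minimal sketch of the mapping between the transition's duration and the clip control's on-screen length, assuming an illustrative fixed pixels-per-second scale (the names `PIXELS_PER_SECOND`, `clip_control_length`, and `transition_duration` are hypothetical, not from the disclosure), might look like:

```python
# Illustrative proportional mapping between transition duration and the
# on-screen length of the clip control. The scale factor is an assumption.

PIXELS_PER_SECOND = 60.0  # assumed display scale: 60 px per second


def clip_control_length(transition_duration_s: float) -> float:
    """Return the clip control's length in pixels, proportional to duration."""
    return transition_duration_s * PIXELS_PER_SECOND


def transition_duration(control_length_px: float) -> float:
    """Inverse mapping: recover the transition duration from the control's length."""
    return control_length_px / PIXELS_PER_SECOND
```

Because the mapping is linear, stretching the control by dragging its ends translates directly into a longer or shorter transition, which is the behavior the following paragraphs describe.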
The processing component 501 may control the input-output interface 504 to receive a user input for dragging at least one of the left and right ends of the clip control. The user input may be, for example, one of the following: hovering inputs, touch inputs and inputs generated by input tools.
The processing component 501 may change the duration of the transition animation effect based on the change in the length of the clip control, and splice the two adjacent video materials according to the changed transition animation effect to obtain a merged video material.
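The disclosure describes a transition whose first half covers the tail of the first material and whose second half covers the head of the second material. A hypothetical model of such a splice, assuming a cross-fade-style overlap in which the two clips overlap for the full transition (all names are illustrative, not from the patent), could be sketched as:

```python
from dataclasses import dataclass


# Illustrative data model: a transition split into two halves, one per
# adjacent material. The overlap-based merged-duration formula is an
# assumption about one common splicing model, not the patent's method.
@dataclass
class Transition:
    first_half_s: float   # portion covering the first material's tail
    second_half_s: float  # portion covering the second material's head

    @property
    def duration_s(self) -> float:
        return self.first_half_s + self.second_half_s


def merged_duration(dur_a_s: float, dur_b_s: float, t: Transition) -> float:
    """Duration of the merged material when the two clips overlap for the
    full transition (cross-fade model)."""
    return dur_a_s + dur_b_s - t.duration_s
```

Under this model, lengthening either half of the transition shortens the merged material accordingly, since more of the two source clips is consumed by the overlap.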
The processing component 501 may change the length of the clip control to change the duration of the transition animation effect based on user input.
The processing component 501 may change the duration of the first half animation effect of the transition animation effect based on the left end of the clip control being dragged, where the first half corresponds to the first of the two adjacent video materials.
The processing component 501 may change the duration of the second half of the transition animation effect based on the right end of the clip control being dragged, where the second half corresponds to the second of the two adjacent video materials.
When the left end of the clip control is dragged, the change in the progress bar corresponding to the duration of the first half of the transition animation effect may be displayed in the clip control accordingly; when the right end of the clip control is dragged, the change in the progress bar corresponding to the duration of the second half of the transition animation effect may be displayed accordingly.
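The left-end/right-end drag behavior described above can be sketched as a small handler. This is a hypothetical illustration, not the patent's implementation: the clamping bounds, the pixels-per-second scale, and the sign convention (dragging an end outward lengthens the corresponding half) are all assumptions.

```python
# Hypothetical drag handler: the left end resizes the first-half duration,
# the right end resizes the second-half duration. delta_px > 0 means the
# end was dragged rightward. Bounds and scale are illustrative.

MIN_HALF_S, MAX_HALF_S = 0.1, 5.0
PIXELS_PER_SECOND = 60.0


def apply_drag(first_half_s: float, second_half_s: float,
               end: str, delta_px: float) -> tuple[float, float]:
    """Return the updated (first_half_s, second_half_s) after a drag."""
    delta_s = delta_px / PIXELS_PER_SECOND
    if end == "left":
        # Dragging the left edge leftward (negative delta) lengthens the first half.
        first_half_s = min(MAX_HALF_S, max(MIN_HALF_S, first_half_s - delta_s))
    elif end == "right":
        # Dragging the right edge rightward (positive delta) lengthens the second half.
        second_half_s = min(MAX_HALF_S, max(MIN_HALF_S, second_half_s + delta_s))
    return first_half_s, second_half_s
```

Each drag updates only the half of the transition on the dragged side, matching the asymmetric adjustment the disclosure describes; the progress bar in the control would then be redrawn from the updated half-durations.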
The video editing apparatus 500 may receive or output video or images via the input-output interface 504. For example, a user may input video or images to the processing component 501 via the input-output interface 504, or a user may display processed video or images via the input-output interface 504.
By way of example, the video editing device 500 may be a PC computer, tablet device, personal digital assistant, smartphone, or other device capable of executing the set of instructions described above. Here, the video editing apparatus 500 need not be a single electronic device, but may be any apparatus or collection of circuits that can individually or jointly execute the above-described instructions (or instruction sets). The video editing device 500 may also be part of an integrated control system or system manager, or may be configured as a portable electronic device that interfaces locally or remotely (e.g., via wireless transmission).
In the video editing apparatus 500, the processing component 501 may include a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a programmable logic device, a dedicated processor system, a microcontroller, or a microprocessor. By way of example, and not limitation, processing component 501 may also include an analog processor, a digital processor, a microprocessor, a multi-core processor, a processor array, a network processor, and the like.
The processing component 501 may execute instructions or code stored in a memory, wherein the memory 505 may also store data. Instructions and data may also be sent and received over a network via the network interface 503, where the network interface 503 may employ any known transmission protocol.
The memory 505 may be integral to the processor, e.g., having RAM or flash memory disposed within an integrated circuit microprocessor or the like. Further, memory 505 may comprise a stand-alone device, such as an external disk drive, storage array, or any other storage device that may be used by a database system. The memory and the processor may be operatively coupled or may communicate with each other, such as through an I/O port, a network connection, etc., so that the processor can read files stored in the memory.
Fig. 6 is a block diagram illustrating a video editing apparatus according to an exemplary embodiment. Referring to fig. 6, the video editing apparatus 600 may include an acquisition module 601, a reception module 602, and a processing module 603. Each module in the video editing apparatus 600 may be implemented by one or more modules, and the name of the corresponding module may vary according to the type of the module. In various embodiments, some modules in the video editing apparatus 600 may be omitted, or additional modules may also be included. Furthermore, modules/elements according to various embodiments of the present disclosure may be combined to form a single entity, and thus may equivalently perform the functions of the respective modules/elements prior to combination.
Referring to fig. 6, the obtaining module 601 may obtain and display at least two video materials to be spliced.
The receiving module 602 may receive a selection of a transition animation effect for splicing two adjacent video materials.
In response to the selection of the transition animation effect, the processing module 603 may display a clip control for changing a duration of the transition animation effect, change the duration of the transition animation effect based on a change in a length of the clip control, and splice two adjacent video materials according to the changed transition animation effect to obtain a merged video material.
The clip control can be displayed above or below two adjacent video materials. The vertical centerline of the clip control may be aligned with the splice of two adjacent video materials.
The length of the clip control may be displayed in proportion to the duration of the transition animation effect.
A progress bar corresponding to the length of time that two adjacent video materials are covered by the transition animation effect may be displayed in the clip control.
The receiving module 602 may receive a user input for dragging at least one of the left and right ends of the clip control. The user input may include one of: hovering inputs, touch inputs and inputs generated by input tools. The input means may include a mouse, keyboard, electronic pen, etc.
The processing module 603 may change the length of the clip control to change the duration of the transition animation effect based on user input.
The processing module 603 may change the duration of the first half animation effect of the transition animation effect based on the left end of the clip control being dragged, where the first half corresponds to the first of the two adjacent video materials.
The processing module 603 may change a duration of a second half animation effect of the transition animation effects based on the right end of the clip control being dragged, wherein the second half corresponds to a second video material of the two adjacent video materials.
When the left end of the clip control is dragged, the change in the progress bar corresponding to the duration of the first half of the transition animation effect may be displayed in the clip control accordingly; when the right end of the clip control is dragged, the change in the progress bar corresponding to the duration of the second half of the transition animation effect may be displayed accordingly.
According to the embodiments of the present disclosure, the method and apparatus can be applied to scenarios in which the durations occupied by the first half and the second half of the transition animation effect need to be adjusted flexibly and independently.
According to an embodiment of the present disclosure, an electronic device may be provided. Fig. 7 is a block diagram of an electronic device 700 according to an exemplary embodiment. The electronic device 700 may include at least one memory 702 and at least one processor 701, where the at least one memory 702 stores a set of computer-executable instructions that, when executed by the at least one processor 701, perform a video editing method according to embodiments of the present disclosure.
The processor 701 may include a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a programmable logic device, a special-purpose processor system, a microcontroller, or a microprocessor. By way of example, and not limitation, processor 701 may also include an analog processor, a digital processor, a microprocessor, a multi-core processor, a processor array, a network processor, or the like.
The memory 702, which is a storage medium, may include an operating system, a data storage module, a network communication module, a user interface module, a video editing program, and a database.
The memory 702 may be integrated with the processor 701, for example, RAM or flash memory may be disposed within an integrated circuit microprocessor or the like. Further, memory 702 may comprise a stand-alone device, such as an external disk drive, storage array, or any other storage device usable by a database system. The memory 702 and the processor 701 may be operatively coupled or may communicate with each other, such as through I/O ports, network connections, etc., so that the processor 701 can read files stored in the memory 702.
In addition, the electronic device 700 may also include a video display (such as a liquid crystal display) and a user interaction interface (such as a keyboard, mouse, touch input device, etc.). All components of the electronic device 700 may be connected to each other via a bus and/or a network.
By way of example, the electronic device 700 may be a PC computer, tablet device, personal digital assistant, smartphone, or other device capable of executing the set of instructions described above. Here, the electronic device 700 need not be a single electronic device, but can be any collection of devices or circuits that can execute the above instructions (or sets of instructions) either individually or jointly. The electronic device 700 may also be part of an integrated control system or system manager, or may be configured as a portable electronic device that interfaces locally or remotely (e.g., via wireless transmission).
Those skilled in the art will appreciate that the configuration shown in FIG. 7 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
According to an embodiment of the present disclosure, there may also be provided a computer-readable storage medium storing instructions that, when executed by at least one processor, cause the at least one processor to perform a video editing method according to the present disclosure. Examples of the computer-readable storage medium herein include: read-only memory (ROM), programmable read-only memory (PROM), electrically erasable programmable read-only memory (EEPROM), random-access memory (RAM), dynamic random-access memory (DRAM), static random-access memory (SRAM), flash memory, non-volatile memory, CD-ROM, CD-R, CD+R, CD-RW, CD+RW, DVD-ROM, DVD-R, DVD+R, DVD-RW, DVD+RW, DVD-RAM, BD-ROM, BD-R, BD-R LTH, BD-RE, Blu-ray or optical disc storage, hard disk drive (HDD), solid-state drive (SSD), card-type memory (such as a multimedia card, a Secure Digital (SD) card, or an eXtreme Digital (XD) card), magnetic tape, a floppy disk, a magneto-optical data storage device, an optical data storage device, and any other device configured to store a computer program and any associated data, data files, and data structures in a non-transitory manner and provide them to a processor or computer so that the processor or computer can execute the computer program. The computer program in the computer-readable storage medium described above can run in an environment deployed in a computer apparatus, such as a client, a host, a proxy device, or a server. Further, in one example, the computer program and any associated data, data files, and data structures may be distributed across a networked computer system so that they are stored, accessed, and executed in a distributed fashion by one or more processors or computers.
According to an embodiment of the present disclosure, there may also be provided a computer program product, in which instructions are executable by a processor of a computer device to perform the above-mentioned video editing method.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A video editing method, comprising:
acquiring and displaying at least two video materials to be spliced;
receiving a selection of a transition animation effect for splicing two adjacent video materials;
in response to the selection, displaying a clip control for changing a duration of the transition animation effect;
changing a duration of the transition animation effect based on a change in a length of the clip control; and
splicing the two adjacent video materials according to the changed transition animation effect to obtain a combined video material.
2. The video editing method of claim 1, wherein the clip control is displayed above or below the two adjacent video materials.
3. The video editing method of claim 1, wherein the length of the clip control is displayed in proportion to the duration of the transition animation effect.
4. The video editing method according to claim 1, wherein a progress bar corresponding to the length of time for which the two adjacent video materials are covered by the transition animation effect is displayed in the clip control.
5. The video editing method of claim 1, wherein the step of changing the duration of the transition animation effect based on the change in the length of the clip control comprises:
receiving a user input for dragging at least one of a left end and a right end of the clip control;
changing the length of the clip control according to the user input so as to change the duration of the transition animation effect.
6. The video editing method of claim 5, wherein the step of changing the length of the clip control to change the duration of the transition animation effect according to the user input comprises:
changing the duration of a first half animation effect of the transition animation effect based on the left end of the clip control being dragged, wherein the first half corresponds to a first video material of the two adjacent video materials; and/or
changing the duration of a second half animation effect of the transition animation effect based on the right end of the clip control being dragged, wherein the second half corresponds to a second video material of the two adjacent video materials,
wherein, when the left end of the clip control is dragged, a change of a progress bar in the clip control corresponding to the duration of the first half animation effect is correspondingly displayed, and
when the right end of the clip control is dragged, a change of the progress bar in the clip control corresponding to the duration of the second half animation effect is correspondingly displayed.
7. A video editing apparatus, comprising:
the acquisition module is configured to acquire and display at least two video materials to be spliced;
a receiving module configured to receive a selection of a transition animation effect for stitching two adjacent video materials;
a processing module configured to:
in response to the selection, displaying a clip control for changing a duration of the transition animation effect;
changing a duration of the transition animation effect based on a change in a length of the clip control; and
splicing the two adjacent video materials according to the changed transition animation effect to obtain a combined video material.
8. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the video editing method of any one of claims 1 to 6.
9. A computer readable storage medium whose instructions, when executed by a processor of an electronic device, enable the electronic device to perform the video editing method of any of claims 1 to 6.
10. A computer program product comprising computer instructions, characterized in that the computer instructions, when executed by a processor, implement the video editing method of any of claims 1 to 6.
CN202011518892.6A 2020-12-21 2020-12-21 Video editing method and video editing device Pending CN112702656A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011518892.6A CN112702656A (en) 2020-12-21 2020-12-21 Video editing method and video editing device
PCT/CN2021/104096 WO2022134524A1 (en) 2020-12-21 2021-07-01 Video editing method and video editing apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011518892.6A CN112702656A (en) 2020-12-21 2020-12-21 Video editing method and video editing device

Publications (1)

Publication Number Publication Date
CN112702656A true CN112702656A (en) 2021-04-23

Family

ID=75509419

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011518892.6A Pending CN112702656A (en) 2020-12-21 2020-12-21 Video editing method and video editing device

Country Status (2)

Country Link
CN (1) CN112702656A (en)
WO (1) WO2022134524A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113315883A (en) * 2021-05-27 2021-08-27 北京达佳互联信息技术有限公司 Method and device for adjusting video combined material
CN113825017A (en) * 2021-09-22 2021-12-21 北京达佳互联信息技术有限公司 Video editing method and video editing device
CN114268741A (en) * 2022-02-24 2022-04-01 荣耀终端有限公司 Transition dynamic effect generation method, electronic device, and storage medium
WO2022134524A1 (en) * 2020-12-21 2022-06-30 北京达佳互联信息技术有限公司 Video editing method and video editing apparatus
WO2024007898A1 (en) * 2022-07-08 2024-01-11 脸萌有限公司 Video processing method and apparatus, and electronic device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115515006B (en) * 2022-08-19 2023-10-17 北京达佳互联信息技术有限公司 Video processing method, device, electronic equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030002851A1 (en) * 2001-06-28 2003-01-02 Kenny Hsiao Video editing method and device for editing a video project
CN107770457A (en) * 2017-10-27 2018-03-06 维沃移动通信有限公司 A kind of video creating method and mobile terminal
CN108322837A (en) * 2018-01-10 2018-07-24 链家网(北京)科技有限公司 Video generation method based on picture and device
CN110162343A (en) * 2019-04-10 2019-08-23 北京梧桐车联科技有限责任公司 Using starting method and device, electronic equipment and storage medium
CN110612721A (en) * 2018-01-19 2019-12-24 深圳市大疆创新科技有限公司 Video processing method and terminal equipment
CN110868631A (en) * 2018-08-28 2020-03-06 腾讯科技(深圳)有限公司 Video editing method, device, terminal and storage medium
CN112004136A (en) * 2020-08-25 2020-11-27 广州市百果园信息技术有限公司 Method, device, equipment and storage medium for video clipping

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000028543A1 (en) * 1998-11-10 2000-05-18 Sony Corporation Edit data creating device and edit data creating method
CN104184960A (en) * 2014-08-19 2014-12-03 厦门美图之家科技有限公司 Method for carrying out special effect processing on video file
CN111083526B (en) * 2019-12-31 2021-12-10 广州酷狗计算机科技有限公司 Video transition method and device, computer equipment and storage medium
CN111787395B (en) * 2020-05-27 2023-04-18 北京达佳互联信息技术有限公司 Video generation method and device, electronic equipment and storage medium
CN111669623B (en) * 2020-06-28 2023-10-13 腾讯科技(深圳)有限公司 Video special effect processing method and device and electronic equipment
CN112702656A (en) * 2020-12-21 2021-04-23 北京达佳互联信息技术有限公司 Video editing method and video editing device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030002851A1 (en) * 2001-06-28 2003-01-02 Kenny Hsiao Video editing method and device for editing a video project
CN107770457A (en) * 2017-10-27 2018-03-06 维沃移动通信有限公司 A kind of video creating method and mobile terminal
CN108322837A (en) * 2018-01-10 2018-07-24 链家网(北京)科技有限公司 Video generation method based on picture and device
CN110612721A (en) * 2018-01-19 2019-12-24 深圳市大疆创新科技有限公司 Video processing method and terminal equipment
CN110868631A (en) * 2018-08-28 2020-03-06 腾讯科技(深圳)有限公司 Video editing method, device, terminal and storage medium
CN110162343A (en) * 2019-04-10 2019-08-23 北京梧桐车联科技有限责任公司 Using starting method and device, electronic equipment and storage medium
CN112004136A (en) * 2020-08-25 2020-11-27 广州市百果园信息技术有限公司 Method, device, equipment and storage medium for video clipping

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022134524A1 (en) * 2020-12-21 2022-06-30 北京达佳互联信息技术有限公司 Video editing method and video editing apparatus
CN113315883A (en) * 2021-05-27 2021-08-27 北京达佳互联信息技术有限公司 Method and device for adjusting video combined material
CN113315883B (en) * 2021-05-27 2023-01-20 北京达佳互联信息技术有限公司 Method and device for adjusting video combined material
CN113825017A (en) * 2021-09-22 2021-12-21 北京达佳互联信息技术有限公司 Video editing method and video editing device
CN114268741A (en) * 2022-02-24 2022-04-01 荣耀终端有限公司 Transition dynamic effect generation method, electronic device, and storage medium
WO2024007898A1 (en) * 2022-07-08 2024-01-11 脸萌有限公司 Video processing method and apparatus, and electronic device

Also Published As

Publication number Publication date
WO2022134524A1 (en) 2022-06-30

Similar Documents

Publication Publication Date Title
CN112702656A (en) Video editing method and video editing device
JP6986186B2 (en) Visualization editing methods, devices, devices and storage media
US10068613B2 (en) Intelligent selection of scene transitions
RU2488157C2 (en) Presentation sections with user-defined properties
US11417367B2 (en) Systems and methods for reviewing video content
CN108833787B (en) Method and apparatus for generating short video
US20110161802A1 (en) Methods, processes and systems for centralized rich media content creation, custimization, and distributed presentation
US20130097552A1 (en) Constructing an animation timeline via direct manipulation
BRPI0501951B1 (en) colorized template previews
WO2020220773A1 (en) Method and apparatus for displaying picture preview information, electronic device and computer-readable storage medium
US20150355801A1 (en) Recorded history feature in operating system windowing system
EP4273808A1 (en) Method and apparatus for publishing video, device, and medium
WO2017008646A1 (en) Method of selecting a plurality targets on touch control terminal and equipment utilizing same
CN113038034A (en) Video editing method and video editing device
CN114154000A (en) Multimedia resource publishing method and device
CN114153347A (en) Conference recording method, conference recording device, electronic equipment and computer-readable storage medium
US10595086B2 (en) Selection and display of differentiating key frames for similar videos
CN113419655A (en) Application screen capturing method and device for electronic terminal
CN112434494A (en) Text editing method, device, terminal and storage medium
CN110109591B (en) Picture editing method and device
EP4050605A1 (en) Method and device for editing video
US11481088B2 (en) Dynamic data density display
CN114299198A (en) Animation generation method, animation generation device, electronic equipment, media and computer program product
US11928078B2 (en) Creating effect assets while avoiding size inflation
CN113825017A (en) Video editing method and video editing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210423