CN113811948A - User interface for video editing system - Google Patents
- Publication number
- CN113811948A CN202080035266.3A
- Authority
- CN
- China
- Prior art keywords
- timeline
- playhead
- scale
- software product
- moving
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
Abstract
The present invention relates to a method, a system, and a software product for a video editing system. The software product comprises a computer-readable medium storing instructions that, when executed by a processor, cause the processor to: display a graphical user interface for a video editing system, the graphical user interface comprising first and second spaced-apart timelines; and, in response to detecting a user interaction with one of the timelines, apply the corresponding interaction to the other timeline.
Description
Technical Field
The present invention generally relates to video editing software applications. More particularly, the present invention relates to a graphical user interface for a video editing software application that, at least in preferred embodiments, provides a more efficient workflow environment for a user.
Background
The reference to any prior art in this specification is not an acknowledgement or suggestion that this prior art forms part of the common general knowledge in any jurisdiction, or that this prior art could reasonably be expected to be understood by, regarded as relevant by, and/or combined with other pieces of prior art by a person skilled in the art.
Typically, movies and/or video works are created on a video editing system by assembling the work from a collection of constituent elements. The video editing system allows these constituent elements (which include video clips, audiovisual clips, audio clips, and associated metadata) to be imported and edited separately before being incorporated into the final work. Modern video editing systems (particularly those used professionally in the film and television industries) include sophisticated video editing software applications. The Applicant's DaVinci Resolve® is an example of a modern video editing system that is widely used in professional environments. DaVinci Resolve is divided into a number of individual pages (each with its own graphical user interface) that are organized in the order of a typical workflow. The DaVinci Resolve pages include: "Media" (for media management and clip organization); "Edit" (a non-linear video editor); "Color" (for color correction and grading); "Fairlight" (a digital audio workstation); and "Deliver" (for final rendering or output).
Like other non-linear video editors, the user interface of the "Edit" page of DaVinci Resolve includes a timeline, which is a graphical representation of the project being edited. The timeline includes a plurality of linearly spaced timecode markers that extend along the length of the user interface window, typically in a horizontal direction. The timeline allows the constituent elements of the project to be arranged in a desired chronological order by positioning the elements relative to the time markers of the timeline. Once placed in the timeline, the elements may be edited by launching an editing tool to perform operations such as trimming, splitting, inserting, merging, and moving clips to a desired location.
The present invention is directed to a graphical user interface for a video editing system that, at least in preferred embodiments, provides a more optimized video editing workflow environment for a user.
Disclosure of Invention
According to a first aspect of the invention, there is provided a software product comprising a computer-readable medium storing instructions that, when executed by a processor, cause the processor to:
display a graphical user interface for a video editing system, the graphical user interface comprising first and second spaced-apart timelines; and
in response to detecting a user interaction with one of the timelines, apply the corresponding interaction to the other timeline.
A graphical user interface for a video editing system thus includes a first linked timeline and a second linked timeline, where actions taken on one of the timelines are automatically applied to the other. This provides a more flexible, or at least an alternative, workflow environment in which video editing and production take place.
According to one embodiment, the first timeline includes a plurality of linearly spaced time markers displayed according to a first scale, and the second timeline includes a plurality of linearly spaced time markers displayed according to a second scale, the second scale being different from the first scale.
The present invention is capable of detecting and processing a wide variety of user interactions with one of the timelines. For example, the user interaction may be a movement of the playhead relative to one of the timelines, in which case the playhead of the other timeline is moved automatically. Preferably, the playhead of the other timeline is moved such that it and the playhead of the first timeline are aligned with the same time marker. In other words, the playheads of the two timelines refer to the same point in time both before and after the move.
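The aligned-playhead behaviour described above can be sketched in a few lines. The sketch below is illustrative only: it assumes a simplified model in which each timeline stores its scale as seconds per pixel and its playhead position in pixels, and none of the names come from the patent.

```python
class Timeline:
    """Simplified timeline model: scale in seconds per pixel, playhead in pixels."""

    def __init__(self, seconds_per_pixel):
        self.seconds_per_pixel = seconds_per_pixel
        self.playhead_px = 0.0

    def playhead_time(self):
        # Time value (in seconds) that the playhead currently refers to.
        return self.playhead_px * self.seconds_per_pixel

    def set_playhead_time(self, seconds):
        # Position the playhead so that it refers to the given time value.
        self.playhead_px = seconds / self.seconds_per_pixel


def sync_playheads(moved, other):
    # Align the other playhead with the same time marker, so that both
    # playheads refer to the same point in time after the move.
    other.set_playhead_time(moved.playhead_time())


detail = Timeline(seconds_per_pixel=0.05)    # fine scale, like timeline 120
overview = Timeline(seconds_per_pixel=1.0)   # coarse scale, like timeline 140

detail.playhead_px = 480                     # user drags the detail playhead
sync_playheads(detail, overview)
print(overview.playhead_time())              # 24.0 — same time on both timelines
```

The same `sync_playheads` call works in either direction, which matches the symmetry of the linked timelines.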
In another embodiment, the user interaction is moving the playhead relative to one timeline and the corresponding interaction applied is moving another timeline relative to a second playhead. Preferably, in this case, the further timeline moves simultaneously with the playhead.
The user interaction may also involve changing the scale (or "zooming in or out") of one of the timelines. In this case, the scale of the other timeline may be automatically changed. Typically, the scale of the other timeline is changed in proportion to the change in scale of the first timeline in order to preserve the ratio of the two scales.
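A minimal sketch of this proportional zoom, assuming each timeline's scale is represented as seconds per pixel (all names are illustrative):

```python
def zoom_linked(scale_a, scale_b, factor):
    # Apply a zoom factor to one timeline's scale and propagate the same
    # factor to the other, preserving the ratio between the two scales.
    return scale_a * factor, scale_b * factor


a, b = 0.05, 1.0                  # seconds per pixel of the two timelines
ratio_before = b / a

a, b = zoom_linked(a, b, 2.0)     # user zooms out 2x on either timeline
ratio_after = b / a

print(ratio_before, ratio_after)  # 20.0 20.0 — the ratio is preserved
```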
Because the two timelines have different scales, it may be advantageous to depict the constituent elements of the work (such as video, audio, and audiovisual clips) placed in the two timelines in different ways. For example, the timeline with the larger scale (that is, the one in which the same linear distance represents a longer time interval) may depict a video or audiovisual clip schematically, while the other timeline may depict the same clip using its constituent frames.
According to another aspect of the present invention, there is provided a method for providing a graphical user interface for a video editing system, the method comprising:
displaying a graphical user interface for a video editing system, the graphical user interface comprising first and second spaced apart timelines; and
in response to detecting a user interaction with one of the timelines, applying the corresponding interaction to the other timeline.
The invention also provides a video editing system comprising a processor and a software product according to the first aspect of the invention.
As used herein, unless the context requires otherwise, the term "comprise" and variations of the term, such as "comprises", "comprising", and "included", are not intended to exclude further additives, components, integers or steps.
Drawings
Other aspects of the invention and further embodiments of the aspects described in the preceding paragraphs will become apparent from the following description, by way of example, and with reference to the accompanying drawings, in which:
FIG. 1 is an illustration of a first view of a user interface according to an embodiment of the present invention;
FIG. 2 is an enlarged view of a portion of the user interface shown in FIG. 1;
FIG. 3 is an illustration of a second view of a user interface according to an embodiment of the invention;
FIG. 4 is an enlarged view of a portion of the user interface shown in FIG. 3;
FIG. 5 is a flowchart illustration of an event loop suitable for implementing graphical user interface features, according to an embodiment of the present invention; and
FIG. 6 is a schematic illustration of a hardware environment suitable for implementing a graphical user interface, according to an embodiment of the present invention.
Detailed Description
A user interface 100 for a video editing system that comprises a software application is shown in FIG. 1. The video editing software application allows projects to be created from constituent elements of source media that are imported into the system and displayed in the media bin region 101. Six constituent elements are shown in FIG. 1, comprising an audio track 102A and five audiovisual clips 102B-102F (i.e., video clips, each with a recorded audio track). As understood by those skilled in the art, the video editing application also stores and manages metadata for each constituent element as well as for the project as a whole.
The user interface 100 is a simplified interface that displays only a subset of the user interface elements ordinarily displayed on the "Edit" page of DaVinci Resolve. For example, the user interface 100 does not include separate viewer windows for viewing the video clips and audiovisual clips in the media bin region 101. The simplified user interface displays the core tools required for typical projects, for example for importing media, editing, trimming, adding transitions and titles, automatically matching colors, and mixing audio.
The user interface 100 also includes a viewer window 110 for viewing the selected segment (in this example segment 102F), a first timeline 120, and a second timeline 140 disposed in parallel with the first timeline 120. In this example, the second timeline 140 is located above the first timeline 120, but may be located below in other embodiments. The first timeline 120 is a timeline familiar from video editing software, that is, a timeline that allows for the creation of projects through the insertion, editing, and placement of source media. In this regard, the various source media elements 102A-102F may be dragged or otherwise inserted from the media bin region 101 or the viewer window 110 into the first timeline 120. Once in the timeline, the source media elements may be edited and arranged as appropriate. In accordance with the present invention, source media elements may also be inserted into the second timeline 140 as appropriate so that they automatically appear in the first timeline 120.
The second timeline 140 shows a timeline for the entire project, with the timeline 120 showing a portion of the project in an enlarged form.
Each of the timelines 120 and 140 includes a respective playhead 145 and 125, which is a graphical line marking the temporal position, in the first timeline 120 and the second timeline 140 respectively, of the frame currently being played in the viewer window 110.
As shown more clearly in FIG. 2, the first timeline 120 is divided into regions of uniform length and marked with timecode in units of hours, minutes, seconds, and frames. At the zoom level (or scale) shown in FIG. 2, each region of the timeline 120 spans two seconds and eight frames of the segment being played, the segment comprising 24 frames per second.
Timecode is marked on the second timeline 140 at a different scale: each region of the second timeline 140 is shorter in linear distance than a region of the first timeline 120, yet spans a longer time interval of the segment being played. In this regard, as shown in FIG. 2, each (shorter) region of the second timeline 140 spans 51 seconds and eight frames.
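The arithmetic behind these region spans can be checked directly; the snippet below simply converts the quoted seconds-and-frames spans into frame counts at the stated 24 frames per second.

```python
FPS = 24  # frame rate of the segment, as stated above

def to_frames(seconds, frames):
    # Express a seconds-and-frames timecode span as a total frame count.
    return seconds * FPS + frames

fine_region = to_frames(2, 8)       # one region of timeline 120
coarse_region = to_frames(51, 8)    # one region of timeline 140

print(fine_region, coarse_region)   # 56 1232
```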
By depicting a longer time interval per unit of linear length, the second timeline 140 is able to show the entire project within a single window of the user interface 100. This provides a high-level view of the overall structure of the project. The finer scale of the first timeline 120, on the other hand, is particularly suited to viewing the details of individual clips and performing edits on them. Depicting the two timelines 120 and 140 (each with a different scale) simultaneously allows the overall project structure to be viewed alongside the details of the current element.
In an exemplary embodiment, the graphical user interface 100 depicts the constituent elements present in each of the timelines 120 and 140 in a different manner. This can be advantageous because the two timelines 120 and 140 have different scales. More specifically, as described above, the timeline 140 has a larger scale than the timeline 120 in terms of the length of the time interval represented per unit of linear distance. The timeline 140 is thus better suited to depicting the overall structure of the project, with the details of its individual clips being of secondary importance.
This approach is realized in the graphical user interface 100 by depicting the elements present in the timeline 140 schematically, using solid rectangular shapes 155. By contrast, the same elements are depicted in the timeline 120 using a linear series 135 of the individual frames that make up each element. In both timelines 120 and 140, the elements are stacked vertically in the same order; in the illustrated embodiment, this comprises two audiovisual tracks stacked above two audio tracks.
Returning to FIG. 1, the graphical user interface 100 allows a user to interact with either of the timelines 120 and 140 in a variety of ways. One such interaction is moving one of the playheads 125 and 145 to a different position, which has the effect of advancing or rewinding to a different temporal position in the project. Such a movement is illustrated in FIGS. 3 and 4, which show the positions of the playheads after moving from the positions illustrated in FIGS. 1 and 2. The playhead 125 (located in the timeline 140) is now in the region bounded by the timecode markers 01:03:25:08 and 01:04:16:16. The playhead 145 (located in the timeline 120) is now in the region bounded by the timecode markers 01:03:34:16 and 01:03:37:00.
The graphical user interface 100 is programmed such that a user interaction with one of the timelines 120 and 140 is automatically applied to the other timeline. In the case of playhead movement, the movement of one playhead is automatically reflected in the position of the other playhead relative to its own timeline. This automatic propagation of user interactions from one timeline to the other works synergistically with the different scales of the two timelines 120 and 140, allowing the user to manipulate whichever timeline best suits the editing workflow. For example, the user may move the playhead 125 to quickly navigate to a different location in the project, whereupon the playhead 145 automatically moves to the clip located at the new position in the timeline 120. Conversely, manipulating the playhead 145 allows the user to make finer temporal adjustments, the results of which can immediately be viewed in the context of the whole project by examining the new position of the playhead 125 in the timeline 140.
In another embodiment, the user may move the playhead 125 to navigate to different locations in the project while the playhead 145 remains at a fixed location, for example in the middle of the timeline 120. To reflect the change in temporal position, the timeline 120 itself moves relative to the (stationary) playhead 145. The movement of the timeline 120 occurs simultaneously with the movement of the playhead 125, which ensures that the playheads 145 and 125 point to the same time value both during the movement and when the movement is complete.
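This stationary-playhead embodiment amounts to scrolling the timeline's visible offset rather than moving the playhead itself. A hypothetical sketch follows; the class and field names are assumptions for illustration, not the patent's implementation.

```python
class ScrollingTimeline:
    """Timeline whose playhead is fixed on screen; the view offset moves instead."""

    def __init__(self, seconds_per_pixel, playhead_window_px):
        self.seconds_per_pixel = seconds_per_pixel
        self.playhead_window_px = playhead_window_px  # fixed screen position
        self.offset_seconds = 0.0                     # time at the view's left edge

    def playhead_time(self):
        # Time value under the stationary playhead.
        return self.offset_seconds + self.playhead_window_px * self.seconds_per_pixel

    def scroll_to(self, seconds):
        # Move the timeline under the stationary playhead so that the
        # playhead refers to the given time value.
        self.offset_seconds = seconds - self.playhead_window_px * self.seconds_per_pixel


detail = ScrollingTimeline(seconds_per_pixel=0.05, playhead_window_px=400)
detail.scroll_to(90.0)         # the other playhead was moved to t = 90 s
print(detail.playhead_time())  # 90.0 — same time value, playhead unmoved on screen
```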
Another interaction with the timelines 120 and 140 that the present invention can facilitate is changing the scale of (or "zooming in" or "zooming out" on) one of the timelines. This change is made by manipulating an appropriate graphical user interface element (such as a slider) or by pressing a keyboard shortcut provided by, or mapped in, the video editing system. The user interface may be configured so that a change in the scale of one of the timelines does not affect the other timeline. Other embodiments apply the change in zoom level to the other timeline, for example scaling it proportionally so as to preserve the ratio of the two scales.
The user interface according to the present invention provides an enhanced workflow for operations that involve moving clips from one location in a timeline to another. For example, the user may wish to move the clip being edited to the end of the project. Previously, such an operation involved selecting the clip, zooming the timeline out to make the desired location visible, and then dragging and dropping (or copying and pasting) the clip to that location. According to the present invention, the same result is achieved simply by selecting the clip in the timeline 120 and dragging it up to the desired location in the timeline 140, where the desired location is already visible owing to the larger, "zoomed-out" scale.
The two timelines 120 and 140 provide alternative views of the same project. Accordingly, the user interface 100 is programmed to detect the movement of a clip from one of the timelines to the other and to apply the appropriate action to the timeline from which the clip was moved. For example, if a user moves a clip from the timeline 120 to a particular temporal location on the timeline 140, the user interface 100 places a copy of the clip (or a reference to it) at the same temporal location in the timeline 120. However, due to the smaller scale of the timeline 120, the clip may not be visible to the user until the playhead 145 is moved to that temporal position. As described above, this is easily done by moving the playhead 125 to the desired temporal position, whereupon the playhead 145 automatically moves onto the clip in the timeline 120.
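One way to realize this behaviour is to back both timelines with a single project model, so that a drop on either view is a single insertion and the finer-scale view simply renders the clip far off-screen. The sketch below is a hypothetical illustration; the data model and all names are assumptions, not the patent's implementation.

```python
project = {"clips": []}   # single shared model behind both timeline views

def drop_clip(project, name, start_seconds):
    # A drop on either timeline is recorded once in the shared project;
    # each view then draws the clip at its own scale.
    project["clips"].append({"name": name, "start": start_seconds})

def pixel_position(start_seconds, seconds_per_pixel):
    # Horizontal position at which a given view draws the clip.
    return start_seconds / seconds_per_pixel

drop_clip(project, "clip_dropped_on_overview", 312.5)

# The same clip sits at very different pixel positions in the two views,
# which is why it may be off-screen in the finer-scale timeline until the
# playhead is moved to its temporal position.
print(pixel_position(312.5, 1.0), pixel_position(312.5, 0.05))
```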
FIG. 5 illustrates an event loop suitable for implementing the graphical user interface (GUI) features of the present invention. The process begins at step 500, where an event listener attached to one of the first timeline 120 and the second timeline 140 receives, from the underlying GUI framework, a notification of an event representing a user interaction with that timeline.
At step 510, the video editing software application examines the event and determines the particular type of user interaction that has occurred. As described above, examples of user interactions include moving the playhead associated with a timeline and changing the scale of a timeline.
At step 520, the video editing software application applies the corresponding interaction to another timeline using the method set forth above.
At step 530, a determination is made as to whether the event listener attached to the timeline has been terminated. If so, the event loop stops iterating; otherwise, the loop returns to step 500 to listen for subsequent events occurring on the timeline.
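The event loop of steps 500-530 can be sketched as follows. The queue-based event delivery and all names here are assumptions standing in for whatever the underlying GUI framework actually provides; the step numbers in the comments correspond to the flowchart of FIG. 5.

```python
from queue import Queue, Empty

def run_event_loop(events, apply_to_other_timeline, listener_alive):
    while listener_alive():                  # step 530: has the listener terminated?
        try:
            event = events.get(timeout=0.1)  # step 500: receive an event notification
        except Empty:
            continue                         # no event yet; keep listening
        kind = event["type"]                 # step 510: determine the interaction type
        if kind in ("move_playhead", "change_scale"):
            apply_to_other_timeline(event)   # step 520: mirror it on the other timeline


# Example run with a listener that stays alive for exactly one iteration:
applied = []
queue = Queue()
queue.put({"type": "move_playhead", "time": 42.0})
alive = iter([True, False])
run_event_loop(queue, applied.append, lambda: next(alive))
print(applied)  # [{'type': 'move_playhead', 'time': 42.0}]
```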
FIG. 6 provides a block diagram illustrating one example of a computer system 1000 by which embodiments of the invention may be implemented. Computer system 1000 includes a bus 1002 or other communication mechanism for communicating information, and a hardware processor 1004 coupled with bus 1002 for processing information. Hardware processor 1004 may be, for example, a general purpose microprocessor, a graphics processing unit, other types of processing units, or a combination thereof.
According to one embodiment, the techniques herein are implemented by computer system 1000 in response to processor 1004 executing one or more sequences of one or more instructions contained in main memory 1006. Such instructions may be read into main memory 1006 from another storage medium, such as a remote database. Execution of the sequences of instructions contained in main memory 1006 causes processor 1004 to perform the method steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
The term "storage medium" or "storage media" as used herein refers to any non-transitory medium that stores data and/or instructions that cause a machine to operate in a specific manner. Such storage media may include non-volatile media and/or volatile media. Non-volatile media include, for example, optical or magnetic disks, such as storage device 1010. Volatile media include dynamic memory, such as main memory 1006. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid state drive, magnetic tape or any other magnetic data storage medium, a CD-ROM or any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
It will be understood that the invention disclosed and defined in this specification extends to all alternative combinations of two or more of the individual features mentioned or evident from the text or drawings. All of these different combinations constitute various alternative aspects of the present invention.
Claims (19)
1. A software product comprising a computer-readable medium storing instructions that, when executed by a processor, cause the processor to:
display a graphical user interface for a video editing system, the graphical user interface comprising first and second spaced-apart timelines; and
in response to detecting a user interaction with one of the timelines, apply the corresponding interaction to the other timeline.
2. The software product of claim 1, wherein the first timeline includes a plurality of linearly spaced time markers displayed according to a first scale and the second timeline includes a plurality of linearly spaced time markers displayed according to a second scale, the second scale being different from the first scale.
3. The software product of claim 2, wherein the first timeline and the second timeline visually display elements respectively placed thereon in different ways.
4. A software product according to claim 3, wherein the timeline with the larger scale visually displays the media elements placed thereon in a schematic manner.
5. A software product according to any preceding claim, wherein the user interaction is moving a playhead relative to one timeline and the corresponding interaction applied is moving a second playhead relative to another timeline.
6. The software product of claim 5, wherein the second playhead is moved a distance such that it is aligned with the time marker with which the first playhead is aligned.
7. The software product of any of claims 1 to 4, wherein the user interaction is moving a playhead relative to one timeline and the corresponding interaction applied is moving another timeline relative to a second playhead.
8. The software product of claim 7, wherein the other timeline moves concurrently with the playhead.
9. A software product according to any preceding claim, wherein the user interaction is moving a media element from one timeline to a temporal position on another timeline, and the corresponding interaction applied is placing a copy of the media element or a reference to the media element at the corresponding temporal position on the one timeline.
10. A method for providing a graphical user interface for a video editing system, comprising:
displaying a graphical user interface comprising a first timeline and a second timeline that are spaced apart; and
in response to detecting a user interaction with one of the timelines, applying the corresponding interaction to the other timeline.
11. The method of claim 10, wherein the first timeline includes a plurality of linearly spaced time markers displayed according to a first scale and the second timeline includes a plurality of linearly spaced time markers displayed according to a second scale, the second scale being different from the first scale.
12. The method of claim 11, wherein the first timeline and the second timeline visually display elements respectively placed thereon in different manners.
13. The method of claim 12, wherein the timeline having the larger scale visually displays the media elements placed thereon in a schematic manner.
14. The method of any of claims 10-13, wherein the user interaction is moving a playhead relative to one timeline and the applied corresponding interaction is moving a second playhead relative to another timeline.
15. The method of claim 14, wherein the second playhead is moved a distance such that it is aligned with the time marker with which the first playhead is aligned.
16. The method of any of claims 10-13, wherein the user interaction is moving a playhead relative to one timeline and the applied corresponding interaction is moving another timeline relative to a second playhead.
17. The method of claim 16, wherein the other timeline moves concurrently with the playhead.
18. The method of any of claims 10-13, wherein the user interaction is moving a media element from one timeline to a temporal location on another timeline, and the applied corresponding interaction is placing a copy of the media element or a reference to the media element at the respective temporal location on the one timeline.
19. A video editing system, comprising:
a processor; and
the software product of any one of claims 1 to 9.
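The dual-timeline behaviour recited in claims 10-17 amounts to a coordinate mapping between two timelines rendered at different scales: a screen position on one timeline is converted to a timestamp, and that timestamp is converted back to a screen position (or scroll offset) on the other. The sketch below is illustrative only and is not taken from the patent; the `Timeline` class, the pixels-per-second scale unit, and the function names are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Timeline:
    scale: float          # assumed unit: pixels per second
    offset: float = 0.0   # horizontal scroll offset, in pixels

    def time_to_x(self, t: float) -> float:
        """Screen x coordinate of timestamp t on this timeline."""
        return t * self.scale - self.offset

    def x_to_time(self, x: float) -> float:
        """Timestamp under screen x coordinate x on this timeline."""
        return (x + self.offset) / self.scale

def sync_playhead(source: Timeline, target: Timeline, source_x: float) -> float:
    """Claims 14-15 style: return the x position on `target` whose
    timestamp value matches the playhead at `source_x` on `source`."""
    return target.time_to_x(source.x_to_time(source_x))

def scroll_timeline(source: Timeline, target: Timeline,
                    source_x: float, fixed_x: float) -> None:
    """Claims 16-17 style: keep the second playhead fixed at `fixed_x`
    and scroll the other timeline underneath it instead."""
    t = source.x_to_time(source_x)
    target.offset = t * target.scale - fixed_x
```

For example, with a source timeline at 10 px/s and a target at 2 px/s, a playhead at x = 50 on the source (timestamp 5 s) maps to x = 10 on the target; the same timestamp can instead be brought under a playhead fixed at x = 0 by scrolling the target's offset to 10 px.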
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2019901099A AU2019901099A0 (en) | 2019-04-01 | User interface for video editing system | |
AU2019901099 | 2019-04-01 | ||
PCT/AU2020/050320 WO2020198792A1 (en) | 2019-04-01 | 2020-04-01 | User interface for video editing system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113811948A true CN113811948A (en) | 2021-12-17 |
CN113811948B CN113811948B (en) | 2023-12-15 |
Family
ID=72664326
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202080035266.3A Active CN113811948B (en) | 2019-04-01 | 2020-04-01 | User interface for video editing system |
Country Status (5)
Country | Link |
---|---|
US (1) | US20220189511A1 (en) |
EP (1) | EP3948503A4 (en) |
JP (1) | JP2022528858A (en) |
CN (1) | CN113811948B (en) |
WO (1) | WO2020198792A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2022203064B1 (en) * | 2022-05-06 | 2023-06-29 | Canva Pty Ltd | Systems, methods, and user interfaces for editing digital assets |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101107667A (en) * | 2004-12-17 | 2008-01-16 | 诺基亚公司 | Method and apparatus for video editing on small screen with minimal input device |
CN101290787A (en) * | 2007-04-16 | 2008-10-22 | 奥多比公司 | Changing video frequency playback speed ratio |
CN101313319A (en) * | 2005-03-04 | 2008-11-26 | 夸德拉特公司 | User interface for appointment scheduling system showing appointment solutions within a day |
CN101657814A (en) * | 2007-04-13 | 2010-02-24 | 汤姆逊许可证公司 | Systems and methods for specifying frame-accurate images for media asset management |
US20100281372A1 (en) * | 2009-04-30 | 2010-11-04 | Charles Lyons | Tool for Navigating a Composite Presentation |
US20130104042A1 (en) * | 2011-02-16 | 2013-04-25 | Apple Inc. | Anchor Override for a Media-Editing Application with an Anchored Timeline |
JP2013156867A (en) * | 2012-01-31 | 2013-08-15 | Nk Works Kk | Image reproduction program and image reproduction device |
US20140143671A1 (en) * | 2012-11-19 | 2014-05-22 | Avid Technology, Inc. | Dual format and dual screen editing environment |
CN107111620A (en) * | 2014-10-10 | 2017-08-29 | 三星电子株式会社 | Video editing using context data and the content discovery using group |
US20180358049A1 (en) * | 2011-09-26 | 2018-12-13 | University Of North Carolina At Charlotte | Multi-modal collaborative web-based video annotation system |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0526064B1 (en) * | 1991-08-02 | 1997-09-10 | The Grass Valley Group, Inc. | Video editing system operator interface for visualization and interactive control of video material |
US7805678B1 (en) * | 2004-04-16 | 2010-09-28 | Apple Inc. | Editing within single timeline |
US8126312B2 (en) * | 2005-03-31 | 2012-02-28 | Apple Inc. | Use of multiple related timelines |
US7434155B2 (en) * | 2005-04-04 | 2008-10-07 | Leitch Technology, Inc. | Icon bar display for video editing system |
US8363055B1 (en) * | 2008-11-18 | 2013-01-29 | Pixar | Multiple time scales in computer graphics |
2020
- 2020-04-01 JP JP2021557984A patent/JP2022528858A/en active Pending
- 2020-04-01 WO PCT/AU2020/050320 patent/WO2020198792A1/en unknown
- 2020-04-01 CN CN202080035266.3A patent/CN113811948B/en active Active
- 2020-04-01 EP EP20784664.3A patent/EP3948503A4/en active Pending
- 2020-04-01 US US17/600,966 patent/US20220189511A1/en active Pending
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101107667A (en) * | 2004-12-17 | 2008-01-16 | 诺基亚公司 | Method and apparatus for video editing on small screen with minimal input device |
CN101313319A (en) * | 2005-03-04 | 2008-11-26 | 夸德拉特公司 | User interface for appointment scheduling system showing appointment solutions within a day |
CN101657814A (en) * | 2007-04-13 | 2010-02-24 | 汤姆逊许可证公司 | Systems and methods for specifying frame-accurate images for media asset management |
CN101290787A (en) * | 2007-04-16 | 2008-10-22 | 奥多比公司 | Changing video frequency playback speed ratio |
US20100281372A1 (en) * | 2009-04-30 | 2010-11-04 | Charles Lyons | Tool for Navigating a Composite Presentation |
US20130104042A1 (en) * | 2011-02-16 | 2013-04-25 | Apple Inc. | Anchor Override for a Media-Editing Application with an Anchored Timeline |
US8966367B2 (en) * | 2011-02-16 | 2015-02-24 | Apple Inc. | Anchor override for a media-editing application with an anchored timeline |
US20180358049A1 (en) * | 2011-09-26 | 2018-12-13 | University Of North Carolina At Charlotte | Multi-modal collaborative web-based video annotation system |
JP2013156867A (en) * | 2012-01-31 | 2013-08-15 | Nk Works Kk | Image reproduction program and image reproduction device |
US20140143671A1 (en) * | 2012-11-19 | 2014-05-22 | Avid Technology, Inc. | Dual format and dual screen editing environment |
CN107111620A (en) * | 2014-10-10 | 2017-08-29 | 三星电子株式会社 | Video editing using context data and the content discovery using group |
Non-Patent Citations (2)
Title |
---|
"Intro to Adobe Premiere Pro CC-06 Source Monitor, Program Monitor and timeline" * |
"The Program Monitor-Adobe Premiere Pro 2.0 Studio Techniques", pages 1 - 5 * |
Also Published As
Publication number | Publication date |
---|---|
CN113811948B (en) | 2023-12-15 |
WO2020198792A1 (en) | 2020-10-08 |
EP3948503A4 (en) | 2023-01-04 |
JP2022528858A (en) | 2022-06-16 |
US20220189511A1 (en) | 2022-06-16 |
EP3948503A1 (en) | 2022-02-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7930418B2 (en) | Collaborative computer-based production system including annotation, versioning and remote interaction | |
US8555170B2 (en) | Tool for presenting and editing a storyboard representation of a composite presentation | |
US10404959B2 (en) | Logging events in media files | |
US7623755B2 (en) | Techniques for positioning audio and video clips | |
RU2530342C2 (en) | Interaction with multimedia timeline | |
US6400378B1 (en) | Home movie maker | |
US8091039B2 (en) | Authoring interface which distributes composited elements about the display | |
US20120210222A1 (en) | Media-Editing Application with Novel Editing Tools | |
CN102099860A (en) | User interfaces for editing video clips | |
US20080263433A1 (en) | Multiple version merge for media production | |
JP2007533271A (en) | Audio-visual work and corresponding text editing system for television news | |
US11721365B2 (en) | Video editing or media management system | |
US20080256448A1 (en) | Multi-Frame Video Display Method and Apparatus | |
CN114450935A (en) | Video editing system, method and user interface | |
CN113811948B (en) | User interface for video editing system | |
US11942117B2 (en) | Media management system | |
US10692536B1 (en) | Generation and use of multiclips in video editing | |
CN113811843B (en) | Media management system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||