AU2004201179B2 - Editable Titling - Google Patents

Editable Titling

Info

Publication number
AU2004201179B2
Authority
AU
Australia
Prior art keywords
media
editing
media file
production copy
text
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2004201179A
Other versions
AU2004201179A1 (en)
Inventor
Choi Chi Evelene Ma
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Information Systems Research Australia Pty Ltd
Original Assignee
Canon Information Systems Research Australia Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2003901315A external-priority patent/AU2003901315A0/en
Application filed by Canon Information Systems Research Australia Pty Ltd filed Critical Canon Information Systems Research Australia Pty Ltd
Priority to AU2004201179A priority Critical patent/AU2004201179B2/en
Publication of AU2004201179A1 publication Critical patent/AU2004201179A1/en
Application granted granted Critical
Publication of AU2004201179B2 publication Critical patent/AU2004201179B2/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Description

S&F Ref: 670220
AUSTRALIA
PATENTS ACT 1990
COMPLETE SPECIFICATION FOR A STANDARD PATENT

Name and Address of Applicant: Canon Information Systems Research Australia Pty Ltd, 1 Thomas Holt Drive, North Ryde, New South Wales 2113, Australia
Actual Inventor(s): Choi Chi Evelene Ma
Address for Service: Spruson & Ferguson, St Martins Tower, Level 31 Market Street, Sydney NSW 2000 (CCN 3710000177)
Invention Title: Editable Titling

ASSOCIATED PROVISIONAL APPLICATION DETAILS: [33] Country: AU; [31] Applic. No(s): 2003901315; [32] Application Date: 21 Mar 2003

The following statement is a full description of this invention, including the best method of performing it known to me/us:

EDITABLE TITLING

Field of the Invention

The current invention relates to video editing and in particular to the editing of titling and other text fields in-place on a video display.
Background

Video editing refers to the process of arranging video footage in a time-line, trimming video clips, cutting clips out from raw footage, inserting video transitions, scene mattes and background music, tuning voice-over, and so on. When the process is carried out manually by users, it is referred to as "manual" editing. Usually, this process is time-consuming and demands high skill sets. On the other hand, when the process is executed by software, with little or no user involvement, it is called "automatic" editing.
Nowadays, as automatic editing software is becoming more popular and widely used, video editing is becoming a lighter and easier job.
In most automatic editing software, users are asked to select their preferred movie theme. When a selected movie theme is applied, a production can be produced in a few seconds. This typically involves cutting out video clips according to rules associated with the movie theme, and inserting scene mattes, transitions, effects and background music automatically. In contrast to manual editing, the drawback of automatic editing is the uniformity of the output productions. Although different video footage can be input, productions using the same movie theme are typically similar in style, rhythm and visual effects.
In order to make customised productions using automatic editing, there are a few different options and approaches. One such option is the use of editable titling, which allows users to insert their own individual movie titles in title and scene mattes. Title mattes and scene mattes are short video clips made up of graphic objects and/or text. They are placed at the beginning and/or end of a movie, and prior to the commencement of new scenes. They are used as an effective means of marking different sections of movies.
The authoring process for title and scene mattes, i.e. the process by which they are created, is referred to as matte authoring.
In much available video editing software, users can insert their titles into title and scene mattes. There are a few common approaches to providing this capability. For instance, a dialog box can be provided in which users can type in titles. The titles are then placed in the mattes and displayed in a preview window when the software re-renders the content in the preview window. The disadvantage of this approach is that users are unable to insert text "in place" directly on the displayed content.
Another approach is to let users type in titles in place over title/scene mattes that are displayed in a preview window. The editing software superimposes the resultant titles onto the mattes upon re-rendering the contents at the preview window. In more sophisticated video editing software implemented using this approach, users can load in their own authored title/scene mattes, thereby providing the users with enhanced flexibility and creativity. In particular circumstances, authors of title/scene mattes may prefer to create default titles with constrained styles to match the graphics of authored mattes during matte authoring. Unfortunately, default titles with constrained styles cannot typically be edited using this approach, which only works with overlay titles rather than editable titles in title/scene mattes.
Media file formats such as the Macromedia Flash animation format (SWF) version 4.0 or greater (hereinafter referred to as "SWF 4.0+") support editing of displayed text in place. Many readily available "third party" media players support interactive playback of media files in the SWF format, and allow users to edit text fields in place. One example of such a media player is QuickTime™. Changes that are made to editable text fields in third party media players are not persistent, however, and are thus not stored to disk or other persistent memory by the media player after the playback and/or editing session is completed. Although there is a wide variety of third party media players available, most, if not all, of them use non-persistent memory for displaying and operating on media files.
In a movie, there can be multiple identical scene mattes that refer to the same file on disk. It is important to ensure that when changes are made to one of them, the rest remain unaffected.
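One way of picturing this requirement is a copy-on-edit policy: editing a matte first gives that timeline slot its own private copy of the shared file before the edit is applied. The sketch below is illustrative only; nothing in it is mandated by the specification, and the dictionary keys and file-naming scheme are assumptions.

```python
def edit_shared_matte(edl, slot, new_title):
    """Copy-on-edit sketch: give the edited slot a private copy of a shared
    matte so that other slots referencing the same file are unaffected."""
    matte = edl[slot]
    # Count how many slots share this matte's underlying file.
    sharers = [s for s, m in edl.items() if m["file"] == matte["file"]]
    if len(sharers) > 1:
        # Detach this slot onto its own copy before applying the edit
        # (the "@slotN" suffix is a purely illustrative naming scheme).
        matte = dict(matte, file=f"{matte['file']}@slot{slot}")
    edl[slot] = dict(matte, title=new_title)
    return edl[slot]
```

With two slots referencing `matte.swf`, editing slot 0 leaves slot 1 untouched and pointing at the original file.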
Summary of the Invention

It is an object of the present invention to substantially overcome, or at least ameliorate, one or more disadvantages of existing arrangements.
Disclosed are arrangements which seek to address the above problems by providing a framework allowing the user to permanently edit titles and other text fields in media content files using the SWF format by extracting edited text fields from the third party media player using the player API, and storing these altered text fields in an external persistent memory.
According to a first aspect of the present invention, there is provided a method of editing a base media file in a format supporting editing of titles, the method comprising the steps of:
creating, by a third party multi-media player, a present production copy of the base media file;
storing, by the multi-media player, said present production copy in non-persistent storage;
displaying, by the multi-media player, said present production copy;
editing text in-place on the displayed present production copy to thereby form an altered present production copy having changed text;
extracting, via the API of the multi-media player, said changed text from the altered present production copy; and
incorporating the extracted changed text into a current media file corresponding to the altered present production copy.
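The claimed steps can be sketched end to end in Python. This is a minimal illustration, not the patented implementation: the `Player` class stands in for a third party multi-media player, and its method names (`load`, `edit_text`, `get_variable`) are hypothetical, not a real player API.

```python
import copy

class Player:
    """Stand-in for a third party multi-media player; the method names
    below are hypothetical, not a real player API."""
    def __init__(self):
        self._copy = None  # present production copy (non-persistent)

    def load(self, base_media):
        # Create a present production copy in non-persistent storage;
        # the base media file itself is never modified.
        self._copy = copy.deepcopy(base_media)

    def edit_text(self, field, new_text):
        # In-place editing forms an altered present production copy.
        self._copy["text_fields"][field] = new_text

    def get_variable(self, field):
        # Extraction of the changed text via the player's API.
        return self._copy["text_fields"][field]

def edit_title(base_media, field, new_text, persistent_store):
    """Walk the claimed steps: create/store a present production copy
    (display omitted), edit a text field in place, extract the changed
    text, and incorporate it into a current media file held persistently."""
    player = Player()
    player.load(base_media)               # create + store the copy
    player.edit_text(field, new_text)     # in-place text editing
    changed = player.get_variable(field)  # extract via the player API
    current = persistent_store.setdefault(base_media["name"], dict(base_media))
    current["text_fields"] = dict(current["text_fields"], **{field: changed})
    return current
```

Because the player operates only on a deep copy and the extracted text is written into a separate persistent store, the edit survives the session while the base media item stays intact.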
According to another aspect of the present invention, there is provided an apparatus for editing a base media file in a format supporting editing of titles, using a third party multi-media player for creating a present production copy of the base media file, the apparatus comprising:
non-persistent storage means for storing, by the multi-media player, said present production copy;
display means for displaying, by the multi-media player, said present production copy;
editing means for editing text in-place on the displayed present production copy to thereby form an altered present production copy having changed text;
extracting means for extracting, via the API of the multi-media player, said changed text from the altered present production copy; and
incorporating means for incorporating the extracted changed text into a current media file corresponding to the altered present production copy.
According to another aspect of the present invention, there is provided a computer program for directing a processor to execute a method of editing a base media file in a format supporting editing of titles, using a third party multi-media player for creating a present production copy of the base media file, said multi-media player being adapted to store the present production copy in non-persistent storage, display the present production copy, and support editing text in-place on the displayed present production copy to thereby form an altered present production copy having changed text, the program comprising:
code for extracting, via the API of the multi-media player, said changed text from the altered present production copy; and
code for incorporating the extracted changed text into a current media file corresponding to the altered present production copy.
Other aspects of the invention are also disclosed.
Brief Description of the Drawings

One or more embodiments of the present invention will now be described with reference to the drawings, in which:
Fig. 1 depicts a functional overview of an editing process, and sets out some terminology;
Fig. 2 shows a functional representation of the disclosed title editing arrangement;
Fig. 3 shows a schematic block diagram of a general-purpose computer upon which the described method for editing titling can be practiced;
Fig. 4 depicts part of a Graphical User Interface associated with the disclosed title editing arrangement;
Fig. 5 shows a process for switching from "movie mode" to "clip mode" in order to commence a title editing session;
Fig. 6 shows a process for switching from clip mode to movie mode at the conclusion of a title editing session;
Fig. 7 shows a functional block representation of the render engine in Fig. 2;
Fig. 8 depicts a data structure of an EDL for a production suitable for title editing;
Fig. 9 shows an auto-editing process;
Fig. 10 shows the process of Fig. 9 augmented with title editing capability;
Fig. 11 shows a media parsing process;
Fig. 12 shows a process for commencing the clip mode;
Figs. 13A and 13B show a process for terminating the clip mode after a title editing session; and
Figs. 14A and 14B show a process for saving title editing changes into the EDL of an edited production.
Detailed Description including Best Mode

Before proceeding with a description of the embodiments, a glossary of terms is presented.
GLOSSARY
base media item: a media item, such as video, audio, or a still image, used to form a production
current EDL: an Edit Display List (EDL) describing the current production
current media file: a copy of those base media items referred to in the current EDL which contain editable titles
current production: that production defined by the current EDL
EDL: a data structure describing a sequence of base media items used in a production; an EDL is made up of EDL elements
EDL element: that part of the EDL that relates to an individual media item as it is used in a production; also referred to as a clip structure
GUI: a Graphical User Interface
media structure: those parameters in an EDL element relating to media type and dimensions for a media item
non-persistent storage: a storage mechanism which retains data dependent upon the state of an editing session; i.e. data is erased when an editing session terminates
persistent storage: a storage mechanism which retains data irrespective of the state of an editing session
present production copy: a copy of the media item presently being operated upon by the multi-media player, the copy being stored in non-persistent memory by the multi-media player
text field variable names: the names of those editable text fields in a SWF 4.0+ media file that supports editable titles in place
timeline: a GUI construct into which media items can be placed to form a production
title: a generic term used to indicate editable text, unless the context clearly indicates another meaning
titled media structure: a media structure including additional parameters referring to a current media file and a list of titles and associated variable names for the titles
slot: a position in the timeline to which a particular EDL element refers
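The EDL-related glossary terms can be pictured as nested data structures. The sketch below is purely illustrative: the patent does not prescribe an in-memory or on-disk layout, and every field name here is an assumption.

```python
from dataclasses import dataclass, field

@dataclass
class MediaStructure:
    """'media structure': media type and dimensions for a media item."""
    media_type: str  # e.g. "video", "audio", or "still"
    width: int
    height: int

@dataclass
class TitledMediaStructure(MediaStructure):
    """'titled media structure': a media structure with additional
    parameters referring to a current media file and the titles it
    contains, keyed by their SWF text-field variable names."""
    current_media_file: str = ""
    titles: dict = field(default_factory=dict)  # variable name -> title text

@dataclass
class EDLElement:
    """'EDL element' / clip structure: one media item as used in a slot."""
    slot: int
    file_reference: str
    media: MediaStructure

@dataclass
class EDL:
    """'EDL': a sequence of EDL elements describing a production."""
    elements: list = field(default_factory=list)
```

A titled media structure extends the plain media structure, mirroring the glossary's distinction between clips with and without editable titles.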
DESCRIPTION
It is to be noted that the discussions contained in the "Background" section relating to prior art arrangements relate to discussions of documents or devices which form public knowledge through their respective publication and/or use. Such should not be interpreted as a representation that such documents or devices in any way form part of the common general knowledge in the art.
Where reference is made in any one or more of the accompanying drawings to steps and/or features, which have the same reference numerals, those steps and/or features have for the purposes of this description the same function(s) or operation(s), unless the contrary intention appears.
Some portions of the description which follows are explicitly or implicitly presented in terms of algorithms and symbolic representations of operations on data within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
It should be borne in mind, however, that the above and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise, and as apparent from the following, it will be appreciated that throughout the present specification, discussions utilizing terms such as "scanning", "calculating", "determining", "replacing", "generating", "initializing", "outputting", or the like, refer to the action and processes of a computer system, or similar electronic device, that manipulates and transforms data represented as physical (electronic) quantities within the registers and memories of the computer system into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
The present specification also discloses apparatus for performing the operations of the methods. Such apparatus may be specially constructed for the required purposes, or may comprise a general purpose computer or other device selectively activated or reconfigured by a computer program stored in the computer. The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus.
Various general purpose machines may be used with programs in accordance with the teachings herein. Alternatively, the construction of more specialized apparatus to perform the required method steps may be appropriate. The structure of a conventional general purpose computer will appear from the description below.
In addition, the present specification also discloses a computer readable medium comprising a computer program for performing the operations of the methods. The computer readable medium is taken herein to include any transmission medium for communicating the computer program between a source and a destination. The transmission medium may include storage devices such as magnetic or optical disks, memory chips, or other storage devices suitable for interfacing with a general purpose computer. The transmission medium may also include a hard-wired medium such as exemplified in the Internet system, or a wireless medium such as exemplified in the GSM mobile telephone system. The computer program is not intended to be limited to any particular programming language and implementation thereof. It will be appreciated that a variety of programming languages and coding thereof may be used to implement the teachings of the disclosure contained herein.
Fig. 1 depicts a functional overview of an editing process, and sets out some terminology. Base media items 101 form the material from which a production is to be formed. To form a production, base media items are placed, using a manual editor for example, into a time-line 104 having a series of time line slots 103. Each time line slot is associated, as depicted by an arrow 106, with a current EDL element 105. The series of current EDL elements 105 form a current EDL 107. With formation of the current EDL 107, copies of the base media items 101 which have editable text fields therein and which form part of the production defined by the current EDL 107 are stored, as depicted by an arrow 302, as a current media file, into a persistent system store 301 (see Fig. 2). The system store 301 is implemented using a dynamic memory 806 in the computer system 800 (see Fig. 3).
To play back the production, parameters from an EDL element such as 105 are loaded, as depicted by an arrow 116, into a multimedia player 316. These parameters include a file reference to the relevant media item referenced by the EDL element 105. If the media item being considered has editable text fields, then the file reference (see 1115 in Fig. 8) loaded into the player 316 refers to a media item (referred to as a current media file) in the current production copy stored in the persistent system store 301. If the media item being considered does not have editable text fields, then the file reference (see 1114 in Fig. 8) loaded into the player 316 refers to a base media item. The player 316 loads, as depicted by an arrow 319, a respective media item from the current production copy 112, or, as depicted by an arrow 115, a respective media item from the base media items 101, and stores the media item, as depicted by an arrow 319', as a present production in non-persistent storage. The present production is stored in a non-persistent manner by the multi-media player 316 because this is the specific manner in which the player 316 is designed to operate. The player 316 displays the present production on a video display 814 in the computer system 800 (see Fig. 3). In Fig. 2, a preview module 322 in the GUI module 312 provides, as depicted by an arrow 317, the player 316 with control commands as necessary. The multi-media player 316 enables in-place text editing of scene mattes, title mattes and so on. Each time text fields in a media item played by the player 316 are edited, the player 316 updates the present production.
Fig. 2 is a functional block diagram of the disclosed editing arrangement 300.
Base media items 101 (see Fig. 1), such as captured video and audio clips, are stored in a base media content store 303. Authored media items such as scene mattes and title mattes containing editable titles are stored in an authored content store 305 using SWF 4.0+ or an equivalent format. The base media and authored media stores 303 and 305 are implemented using a hard disk drive 810 in a computer system 800 (see Fig. 3). Using a core editor 309 in manual edit mode, base and authored media items are extracted as depicted by respective arrows 304, 306 and inserted into a time-line 104 (see Fig. 1) and consequently an associated current EDL 107 is formed. If the core editor 309 is optionally used in auto-edit mode, a template is selected, as depicted by an arrow 310, from a template store 311 which is implemented on the hard disk 810 on the computer system 800 (see Fig. 3). The selected template embodies a desired movie theme. The core editor 309 applies rules in the selected template to the current EDL 107 thereby producing an auto-edited current EDL. A current production, formed by applying the current EDL 107 to the referenced base media items 101 and/or the current media items, is played by a multi-media player 316 by passing relevant file references from a GUI module 312 to a render engine 313 as depicted by an arrow 324, and to the multi-media player 316 as depicted by an arrow 315. The render engine 313 acts as an interface to the third party multi-media player 316. Third party players that can be used include QuickTime™ and Macromedia Flash™, which are produced respectively by Apple™ and Macromedia™.
The render engine 313 processes the current EDL element provided by the GUI module 312 and provides, as depicted by an arrow 315, parameters from the EDL element to the multimedia player 316 in a form suitable for use by the player.
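The render engine's translation step might be sketched as follows. This is an assumption-laden illustration: the EDL element is modelled as a plain dict, and all key and parameter names are invented for the example, not taken from the specification.

```python
def edl_element_to_player_params(element):
    """Translate an EDL element (here a plain dict; keys are illustrative)
    into a flat parameter set suitable for handing to a multi-media player.
    A media item with editable titles plays from its current media file in
    the persistent system store; otherwise the base media item is used."""
    current = element.get("current_media_file")
    return {
        "source": current if current else element["base_media_file"],
        "editable": bool(current),
        "in_point": element.get("in_point", 0.0),
        "out_point": element.get("out_point"),
    }
```

An element carrying a `current_media_file` reference is routed to the editable copy; one without it plays straight from the base media item.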
The user of the system can interact with the system by providing, as depicted by an arrow 323, commands to the GUI module 312. The GUI module interacts with the core editor 309 and the render engine 313 as depicted by arrows 308 and 324 respectively.
Fig. 3 depicts the general-purpose computer system 800 which is a suitable platform for implementing the disclosed title editing system described in relation to Fig. 2. The processes described in relation to Figs. 5, 6 and 9-14 may be implemented as software, such as an application program executing within the computer system 800. In particular, the method steps for editing titles are effected by instructions in the software that are carried out by the computer. The instructions may be formed as one or more code modules, each for performing one or more particular tasks. The code modules can be organised along functional lines as depicted in Fig. 2. The software may also be divided into two separate parts, in which a first part performs the title editing methods and a second part manages the user interface between the first part and the user. The software may be stored in a computer readable medium, including the storage devices described below, for example. The software is loaded into the computer from the computer readable medium, and then executed by the computer. A computer readable medium having such software or computer program recorded on it is a computer program product.
The use of the computer program product in the computer preferably effects an advantageous apparatus for editing titles.
The computer system 800 is formed by a computer module 801, input devices such as a keyboard 802 and mouse 803, and output devices including a printer 815, the display device 814 and loudspeakers 817. A Modulator-Demodulator (Modem) transceiver device 816 is used by the computer module 801 for communicating to and from a communications network 820, for example connectable via a telephone line 821 or other functional medium. The modem 816 can be used to obtain access to the Internet, and other network systems, such as a Local Area Network (LAN) or a Wide Area Network (WAN), and may be incorporated into the computer module 801 in some implementations.
The computer module 801 typically includes at least one processor unit 805, and the memory unit 806, for example formed from semiconductor random access memory (RAM) and read only memory (ROM). The module 801 also includes a number of input/output interfaces including an audio-video interface 807 that couples to the video display 814 and loudspeakers 817, an I/O interface 813 for the keyboard 802 and mouse 803 and optionally a joystick (not illustrated), and an interface 808 for the modem 816 and printer 815. In some implementations, the modem 816 may be incorporated within the computer module 801, for example within the interface 808. A storage device 809 is provided and typically includes the hard disk drive 810 and a floppy disk drive 811. A magnetic tape drive (not illustrated) may also be used. The CD-ROM drive 812 is typically provided as a non-volatile source of data. The components 805 to 813 of the computer module 801 typically communicate via an interconnected bus 804 and in a manner which results in a conventional mode of operation of the computer system 800 known to those in the relevant art. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Sun Sparcstations or like computer systems evolved therefrom.
Typically, the application program is resident on the hard disk drive 810 and read and controlled in its execution by the processor 805. Intermediate storage of the program and any data fetched from the network 820 may be accomplished using the semiconductor memory 806, possibly in concert with the hard disk drive 810. In some instances, the application program may be supplied to the user encoded on a CD-ROM or floppy disk and read via the corresponding drive 812 or 811, or alternatively may be read by the user from the network 820 via the modem device 816. Still further, the software can also be loaded into the computer system 800 from other computer readable media.
The term "computer readable medium" as used herein refers to any storage or transmission medium that participates in providing instructions and/or data to the computer system 800 for execution and/or processing. Examples of storage media include floppy disks, magnetic tape, CD-ROM, a hard disk drive, a ROM or integrated circuit, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the computer module 801.
Examples of transmission media include radio or infra-red transmission channels as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like.
The method of editing titles may alternatively be implemented in dedicated hardware such as one or more integrated circuits performing the functions or sub-functions of editing titles. Such dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories.
Fig. 4 shows part of a GUI which is displayed on the video display 814 in Fig. 3 when implementing the arrangement 300 of Fig. 2 on the general purpose computer system 800 in Fig. 3. The GUI 200 is encompassed by a GUI view frame 212. A preview window 320 enables the output from the player 316 in Fig. 2 to be viewed by the user. A mouse cursor 214 that is controlled by the mouse 803 in Fig. 3 can be used to select text for editing, in a manner that is known in common text editors used for document editing. The mouse cursor 214 has the depicted shape when used in relation to text fields in the preview window 320. The mouse cursor 214 changes shape to an "arrow" format (not shown) when the cursor 214 is moved out of the preview window in order to operate GUI controls. The aforementioned arrow form of the cursor 214 can be used as a control to operate a play control button 206, a fast-forward control button 207, and a rewind control button 205. Other controls including "stop", "pause", and so on are not explicitly shown. The multi-media production comprises a series of clips 1101-1104 as will be described in more detail with reference to Fig. 8. These clips are depicted as alternating dark and light bars (208 and 209 respectively) on a movie time-line 210 in the GUI 200. A movie cursor 204 shows the time instant corresponding to the scene shown in the preview window 320 in relation to the movie depicted in the movie time-line 210. The movie cursor 204 is shown adjacent to a clip 203 in the movie timeline 210, and the clip 203 is shown in expanded form 202 in a clip time-line 213. A clip cursor 201 is shown adjacent to the expanded clip representation 202, indicating the time instant of the material being shown in the preview window 320. A mode control 216 is used to switch between "clip mode" and "movie mode", these two modes being used in controlling the text editing system 300 (see Fig. 2).
The mode control 216 shows that the present mode of operation of the GUI 200 is the clip mode 215.
170304 670220.doc 170304 670220.doc The automatic editing arrangement depicted in Fig. 2 supports title editing using a framework which can identify when a user starts and/or finishes a title editing session.
This framework is referred to as the "dual preview" arrangement.
In the dual preview arrangement, there is a single timeline structure, which holds video, audio and title clips of a movie in order. The system in Fig. 2 supports two different preview modes. They are referred to as the "clip mode" and the "movie mode".
In the clip mode, only a single media clip is shown in the preview window 320 (see Fig. 4). The disclosed arrangement allows editable text in "static clips" such as title mattes or scene mattes to be edited. However, the clip mode can be invoked in regard to any media item in the production being considered.
In the movie mode, all clips in the movie are played. The component in the arrangement 300 which manages the dual preview framework is referred to as the preview module 322, which in the example shown in Fig. 2 is implemented within the GUI module 312.
Fig. 5 shows a process 400 for basic operation of the dual preview arrangement.
The process 400 commences with a block 401 indicating that the system 300 is in the movie mode. In a subsequent step 402, a mode change command, indicating a switch from the movie mode to the clip mode, is received. This command is received while the movie is being played in the movie mode, and the command can be initiated by the user giving a mouse-click, using the mouse 803 (see Fig. 3), in the preview window 320 (see Fig. 4). Alternatively, the mouse click can be directed to the mode control 216 in Fig. 4.
The preview module 322 (see Fig. 2) understands from the mouse click that the user wants to change to the clip mode. The user can also initiate the change by mouse clicking on the clip time-line 213 (see Fig. 4). In a following step 403, the preview module 322 looks for any unsaved changes made to the current production. If changes are found, then the process 400 is directed in accordance with a "yes" arrow to a step 404 in which the preview module 322 saves the changes into the current EDL before leaving the movie mode and proceeding to a following step 407. Returning to the testing step 403, if no changes are found, then the process 400 is directed according to a "no" arrow to the step 407. In the step 407, the preview module 322 fetches the media clip, which is shown in the preview window when the user initiates the mode change, from the (updated) current EDL. In a following step 408, the preview module 322 refreshes the preview window 320 with the updated clip by sending parameters and a reference to a media item to the multi-media player. If there are editable text fields in the selected clip, then editing commences when the user clicks in editable text fields to invoke a text cursor.
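The steps of process 400 can be summarised as a small handler. The sketch is illustrative only: `preview` is a plain dict standing in for the preview module 322, and all of its keys are invented for the example.

```python
def switch_to_clip_mode(preview, current_edl):
    """Sketch of process 400 (movie mode -> clip mode). `preview` is a
    dict stand-in for the preview module 322; `current_edl` maps timeline
    slots to clips. All key names are illustrative assumptions."""
    # Steps 403-404: save any unsaved changes into the current EDL
    # before leaving the movie mode.
    if preview["unsaved_changes"]:
        current_edl.update(preview["pending_changes"])
        preview["unsaved_changes"] = False
        preview["pending_changes"] = {}
    # Step 407: fetch the clip shown when the mode change was requested,
    # from the (updated) current EDL.
    clip = current_edl[preview["visible_slot"]]
    # Step 408: refresh the preview window with the single updated clip.
    preview["mode"] = "clip"
    preview["now_playing"] = clip
    return clip
```

The save-before-switch check mirrors the "yes"/"no" branch at step 403: pending edits are folded into the EDL first, so the clip fetched at step 407 is always current.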
Fig. 6 shows a process 500 for changing from the clip mode to the movie mode.
The process 500 commences with a block 501 indicating that the system 300 is in the clip mode. In a next step 502, a mode change command, from the clip mode to the movie mode, is received via the mode control 216 (see Fig. 4) or when the user clicks on the movie time-line 210. This command indicates, if there are editable text fields in the clip, that the text editing session is concluded. In a following step 503, the preview module 322 checks if any changes were made to the clip being displayed in the preview window 320 while the system was in the clip mode. If changes are found, the process is directed according to a "yes" arrow to a step 504 in which the preview module 322 saves the changes into the current EDL and quits the clip mode. In a following step 505, in the movie mode, the preview module 322 fetches the current (updated) EDL, and in a following step 506, the preview module 322 refreshes the preview window 320 with the entire current EDL for playback by sending relevant references to files and parameters to the multi-media player.
Returning to the step 503, if no changes were found, then the process 500 is directed according to a "no" arrow to the step 505.
Returning to Fig. 2, further details of the system 300 are now described. The GUI module 312 operates the GUI 200 on the display 814 of the computer system 800 (see Fig. 3). The GUI module 312 also receives inputs 323 from the user via the GUI 200. This user input 323 is typically notification of keyboard or mouse input from the operating system software operating on the computer module 801. The GUI module 312 responds to the input 323 from the user and initiates operations within the software accordingly.
Within the GUI module 312 is a preview module 322. The GUI module 312 controls the rendering of the movie onto the computer system display 814.
The core editor 309, which can operate in manual and/or auto-edit modes, provides high-level manipulation of a production EDL. If the auto-editor supports manual editing capabilities, then the editor 309 supports simple operations such as adding and removing video clips in the timeline, and adjusting in and out points of media items.
When performing operations on the timeline, the core editor 309 uses the render engine 313 to build a low level representation of the movie timeline that can be rendered by the multi-media player 316. The low level representation referred to comprises EDL elements, which are in a format that is understandable by the multimedia player 316.
A template engine 318, that is a module within the core editor 309 if the editor 309 has auto-edit capability, is used to apply the template selected from the template store 311 to the current EDL 107 which references base and authored media items drawn from the respective stores 303, 305 and also references current media files 112. The template application to the EDL 107 builds a new EDL.
The template is a set of parameterised data and, possibly, additional media items, that are accessed by the template engine 318 based on the particular template selected.
The template engine 318 implements the template rules for the processing of the EDL elements for the finished movie. These template rules with the parameterised data determine how to cut the media items referenced by the current EDL, what effects to apply and what additional material to add. The template engine 318 uses the core editor 309 to perform the actual manipulations of the EDL.
The render engine 313 is used as a separation and simplification layer between the complex third party multimedia player 316 and the rest of the system 300. When components in the system 300 request operations by the multimedia player 316, the render engine 313 typically translates these requests into a format understood by the player 316 and passes the translated requests, as depicted by an arrow 315, to the multimedia player 316. The types of requests supported by the player 316 include loading media from the base media items 101 and/or the current media items 112, constructing low-level representations of movies from loaded media and playing representations of movies.
The multimedia player 316 provides numerous features for access, modification and display of multimedia data. The GUI module 312 uses the multimedia player 316, via the render engine 313, to load files containing media data and determine the type of the media, the duration and so on. The core editor 309 uses the multimedia player 316, via the render engine 313, to create low-level representations of the movie timeline. The low-level representation can be played by the media player 316 on the computer display 814 under the control of the preview module 322.
Within the multimedia player 316 is an animation renderer 321 for playing and manipulating media in flash SWF 4.0+ or other equivalent formats. The SWF format contains a description of graphic objects and descriptions of transformations of those objects over time. The graphical objects supported by the animation renderer 321 include text that can be edited.
Fig. 9 shows a process 4000 for automatic editing of media to form a video production. In a first step 4010, an automatic editing software application is launched. By default, the software provides an empty workspace in the preview window 320 (see Fig. 4) for the user to start a new session of video editing. In a following step 4020, video footage comprising base and authored media items from the respective stores 303 and 305 in Fig. 2 is loaded into the time-line 104. In a following step 4030, a movie theme is applied by the core editor 309 operating in auto-edit mode, by applying the template selected from the template store 311 to the current EDL 107 associated with media loaded into the time-line 104 in the step 4020. In a following step 4040, the auto-edited current movie production and the associated current EDL 107, produced according to the selected theme, are stored into a project file in the system store 301.
Fig. 10 shows a process 5000 which augments the process 4000 of Fig. 9 by incorporating text editing functionality. Steps 5010 to 5030 parallel the steps 4010 to 4030 in Fig. 9. In the step 5030, a movie theme is applied using a selected template, and this applies the template to the current EDL 107 which refers in general to a selected sequence of base input video from the store 303, to title/scene mattes with editable titles that are loaded in from the store 305, and to media items from the current production copy 112.
In a following step 5040, the system is switched to the clip mode. The user can perform title editing by entering into the clip mode with an editable title/scene matte.
This step will be further discussed with reference to Fig. 12. In the step 5050 titles are edited. Once the user gets into the clip mode with a title/scene matte containing editable titles in the preview window 320, the user can modify titles by typing in place over the titles using techniques known in the art of word processing and text editing. In a following step 5060, the system is switched to the movie mode thereby terminating the text editing session.
Title editing is terminated once the system leaves the clip mode. All changes made during the title editing session are saved and reflected in the display during the subsequent movie mode. This step will be further discussed with reference to Figs. 13A and 13B. A following step 5080 tests whether further text editing is required. If this is the case, then the process 5000 is directed according to a "yes" arrow to the step 5040. If no further text editing is required, then the process 5000 is directed according to a "no" arrow to a step 5070 which saves the edited production and the current EDL 107 into a project file in the system store 301.
Fig. 7 shows a functional block diagram 900 of modules concerned with loading media items into the timeline 104, and with storing title changes resulting from editing of editable titles. The modules can be distributed between the core editor 309 and the GUI module 312 in Fig. 2.
A media loader 901 is responsible for loading specified media items, such as base media items and/or authored media items, from the respective stores 303 and 305 on the hard disk 810 (see Fig. 3) into the timeline 104 when building the EDL 107. An instance of a current EDL underlying the timeline 104 is described with reference to Fig. 8. The media loading process involves identifying the media types of the loaded media, and constructing appropriate media structures for holding references to the media items.
Fig. 8 shows a current EDL 1100 which contains a sequence of EDL elements 1101-1104. Each EDL element such as 1101 is a descriptive data structure (in memory) associated, as depicted by a dashed arrow 1117, with a corresponding media item 1118 in the timeline 104. The media item 1118 can be used more than once in the production represented by the EDL 1100, and each such instance would have a corresponding EDL element in the EDL 1100.
Each EDL element such as 1101 comprises, as depicted by an arrow 1105, parameters for the corresponding media item 1118 as it is utilised in the timeline slot under consideration. These parameters include editable editing parameters, including in and out points 1106, audio volume 1107, and play speed 1108. The parameters also include a media structure field 1109 which references the structure of the media item 1118 as utilised in the production slot being considered. The editable parameters 1106-1108 together with the media structure 1109 comprise the EDL element describing how the media item 1118 is utilised in the production.
The media structure 1109 comprises, as depicted by an arrow 1110, metadata relating to the media item 1118 as shown in the time line slot in question. This metadata includes parameters such as media type 1111, dimensions 1112 (ie height and width) of the media item, and a reference 1114 (as depicted by an arrow 1121) to the base media file 1118. A media structure 1109 having only the aforementioned media structure elements 1120 is referred to simply as a "media structure". If the media item has editable text fields, then a reference 1115 to the current media file is added to the media structure 1109, as well as a list of titles and their associated variable names (ie 1116). This augmented media structure, which has all the fields 1119 plus 1120, is referred to as a "titled media structure". For a media item having editable text fields, and thus a titled media structure, a current media item 112 is made.
The titled media structure consisting of the field groups 1119 and 1120 also has the field 1116 which lists the editable titles (ie text) in the media item in question, along with the SWF text field variable names associated with those titles.
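The EDL element and the plain versus "titled" media structures of Fig. 8 can be sketched as Python data classes. The field names mirror the description (in and out points 1106, audio volume 1107, play speed 1108, media structure 1109); the classes themselves are an illustrative assumption:

```python
# Illustrative data-structure sketch of Fig. 8: an EDL element wrapping
# either a plain media structure (fields 1120) or a titled media
# structure (fields 1119 plus 1120).
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MediaStructure:            # fields 1120
    media_type: str              # 1111: media type
    width: int                   # 1112: dimensions (width, height)
    height: int
    base_media_ref: str          # 1114: reference to the base media file

@dataclass
class TitledMediaStructure(MediaStructure):  # fields 1119 + 1120
    current_media_ref: str = ""              # 1115: current media file
    titles: dict = field(default_factory=dict)  # 1116: variable name -> title

@dataclass
class EDLElement:                # element 1101
    in_point: float              # 1106: in and out points
    out_point: float
    audio_volume: float = 1.0    # 1107
    play_speed: float = 1.0      # 1108
    media: Optional[MediaStructure] = None   # 1109: media structure field

matte = TitledMediaStructure("swf", 720, 576, "matte.swf",
                             current_media_ref="matte_current.swf",
                             titles={"title1": "My Holiday"})
element = EDLElement(in_point=0.0, out_point=5.0, media=matte)
```

A titled media structure is thus a strict superset of a plain media structure, which matches the text: items without editable text simply never populate the extra fields.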
The EDL 1100 is general enough to describe media files of different types, including video, audio and still images, including those having editable titles. From a terminology perspective, the media file initially passed to the media loader 901 to be loaded is called the "base media file" and the media file created as a result of finding editable titles in the base media file is called the current media file. When a titled media item is played in the preview window 320, what is played is the current media file.
Returning to Fig. 7, when building the time line 104, the media loader 901 is provided with a base media item (1118 in Fig. 8), as depicted by an arrow 916. The media loader 901 checks the media type of the base media item by passing, as depicted by an arrow 902, the base media item to a title parser 904. If the title parser 904 finds editable titles in the base media item, the title parser 904 extracts the title contents and corresponding "text field variable names" and passes them, as depicted by an arrow 903, back to the media loader 901. The text field variable names are the names of those editable text fields in an SWF media item which support the ability to edit titles in place according to the disclosed arrangement.
The media loader 901 duplicates the base media item 1118 to create a current media item 1123, and constructs a corresponding titled media structure (1119 1120) for the current media item 1123. The titled media structure has references to both the base media item 1118 and the associated current media item 1123. The titled media structure also includes the list of titles found in the base media item 1118 together with the corresponding variable names for those titles. If the title parser 904 returns an empty list, indicating that there are no editable titles or text fields in the base media element being considered, then the media loader 901 constructs a media structure 1120 for the given base media file 1118, and a titled media structure is not required.
The media loader 901 passes, as depicted by an arrow 905, the media structure/titled media structure to a timeline manager 906. The timeline manager 906 wraps the media structure/titled media structure in an EDL element and loads the clip structure into the corresponding EDL element 1101. The timeline manager 906 is responsible for the management and manipulation of the EDL. After building all the EDL elements associated with the media items input into the time line 104, the timeline manager 906 notifies, as depicted by an arrow 908, the preview module 322 which reloads the EDL for future playback.
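The loader's branch between a plain and a titled media structure can be sketched as follows. The title parser is stubbed out, and the function names and dict-based structures are assumptions made for this sketch, not part of the specification:

```python
# Minimal sketch of the media loader decision in Fig. 7: if the title
# parser finds editable titles, duplicate the base media item to create
# a current media item and build a titled media structure; otherwise
# build a plain media structure.
import os
import shutil
import tempfile

def parse_titles(base_path):
    """Stub for the title parser 904: return {variable_name: title_text}."""
    if base_path.endswith(".swf"):
        return {"title1": "Opening Title"}   # pretend we parsed the SWF
    return {}

def load_media_item(base_path):
    titles = parse_titles(base_path)
    if titles:                                # editable titles were found
        current_path = base_path + ".current" # duplicate base -> current
        shutil.copyfile(base_path, current_path)
        return {"base": base_path, "current": current_path,
                "titles": titles, "titled": True}
    return {"base": base_path, "titled": False}

# Demonstrate with a throwaway file standing in for an SWF matte.
tmp = tempfile.mkdtemp()
base = os.path.join(tmp, "matte.swf")
with open(base, "wb") as f:
    f.write(b"FWS")                           # placeholder content only
structure = load_media_item(base)
```

The returned dict plays the role of the (titled) media structure that the timeline manager 906 then wraps in an EDL element.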
Fig. 11 shows a process 8000 running in the title parser 904 (see Fig. 7) for parsing a media item. In a first step 8010, given a media item, the title parser 904 opens the file, and in a following step 8020 determines the file version of the media item. If the version is not SWF, then the process 8000 is directed according to a "no" arrow to a terminating step 8070. Otherwise, a following step 8030 checks whether the media item being considered has another "tag block".
Media files in the SWF 4.0+ format are made up of tag blocks. Each tag block has an identifier which indicates the type of the tag block. For instance, a tag block can be a static text, an animation object or a graphic object. Thus, all tag blocks which are editable text fields should have the same identifier.
If the step 8030 determines that there are no further tag blocks in the media item in question, then the process 8000 is directed according to a "no" arrow to the terminating step 8070. Otherwise, the process 8000 is directed to a testing step 8040. In the step 8040, if a fetched tag block does not contain an editable text field, then the process 8000 is directed according to a "no" arrow back to the step 8030. Otherwise, the process 8000 is directed according to a "yes" arrow to a step 8050 which parses the tag block.
A tag block, depending on its type, consists of fields storing specific data according to the specification of the media format. In a tag block of an editable text field, there is a field for storing the variable name of the text field, and a field for the text. In the step 8050 during parsing, the variable name and the text, referred to generically as a "title", are retrieved. The variable name and the text are stored in a list in a following step 8060, and then the process 8000 is directed according to an arrow 8080 back to the step 8030.
After scanning through the whole media item, the title parser 904 returns, as depicted by an arrow 903 (see Fig. 7), a list of titles with variable names to the media loader 901.
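Process 8000 can be sketched schematically as follows. Real SWF tag blocks are binary records; here each tag block is modelled as a (tag_id, payload) tuple, and the EDIT_TEXT_TAG value is a stand-in assumption for the identifier that marks editable text fields:

```python
# Schematic version of steps 8030-8060: scan every tag block and collect
# a (variable_name, text) pair for each editable text field.
EDIT_TEXT_TAG = 37   # assumption: identifier of an editable-text tag block

def parse_titles(tag_blocks):
    titles = []
    for tag_id, payload in tag_blocks:       # step 8030: another tag block?
        if tag_id != EDIT_TEXT_TAG:          # step 8040: editable text field?
            continue
        var_name, text = payload             # step 8050: parse the tag block
        titles.append((var_name, text))      # step 8060: store in the list
    return titles                            # returned to the media loader

clip = [
    (1, b"show frame"),
    (EDIT_TEXT_TAG, ("title1", "My Holiday")),
    (2, b"define shape"),
    (EDIT_TEXT_TAG, ("subtitle1", "Summer 2003")),
]
found = parse_titles(clip)
```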
Once all media files have been loaded by the media loader 901 and processed by the title parser 904 to thereby construct the current EDL, the production defined by the EDL is ready for previewing and editing.
In order for a user to preview what has been loaded into the timeline 104, the timeline manager 906 passes, as depicted by an arrow 908, the current EDL to the preview module 322. The preview module 322 manages the dual preview mode framework described in relation to Fig. 2.
The multimedia player 316 (see Fig. 2) is the data manipulation engine behind the preview module 322 that supports the playback and in-place title editing feature provided by the arrangement 300 in Fig. 2.
In order to set up the playback in regard to a selected media item in the clip mode, the preview module 322 passes, as depicted by an arrow 913 and based upon the EDL received from the timeline manager, the handle of the associated preview window 320 to the multimedia player 316 to take control of. The handle depicted by the arrow 913 actually passes through the render engine 313, however the render engine is omitted for the sake of explanation. The handle tells the multi-media player 316 where on the display 814 to display the selected media item. The preview module 322 also passes, as depicted by an arrow 914, a reference to the media item. This reference will be a reference to the current media item 1123 or to the base media item 1118, depending on whether the selected media item has editable titles or not. The multimedia player 316 loads a present production copy 114 of the referenced media file into the memory 806 (see Fig. 3) for non-persistent storage, and displays the selected media item in the preview window 320 (see Fig. 4).
When the user edits titles in the preview window 320, all the changes resulting from the editing are saved in the present production copy 114 held in the memory managed by the multimedia player 316. After the editing session is completed, either by switching to another clip or by switching back to the movie mode, the present production copy 114, including the changes resulting from title editing, is deleted by the multimedia player 316. However, prior to the deletion of the present production copy, the preview module 322 requests, as depicted by an arrow 915, the multimedia player 316 to extract all titles in the present production copy 114 stored in the non-persistent memory. The preview module 322 makes this request by providing variable names, one at a time, stored in the list of titles 1116 stored in the titled media structure of the selected media item. The multi-media player provides titles one at a time in response to the requests from the preview module. The preview module, once having received all the titles, compiles a list of the updated titles as depicted by an arrow 912. All the extracted titles are then sent, as depicted by an arrow 911, to a title updater 907. The title updater 907 generates a new current media copy 1123 using the corresponding base media item 1118 and stores the updated current media copy 1123 as depicted by an arrow 307.
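The title extraction step above can be sketched as follows. FakePlayer stands in for the third party multimedia player API; the class and function names are assumptions for illustration only:

```python
# Sketch of the extraction protocol: before the present production copy
# is discarded, the preview module asks the player for each variable
# name in the titled media structure (list 1116) and collects the
# possibly-edited text, one title per request (arrow 915).
class FakePlayer:
    def __init__(self, present_copy_titles):
        self._titles = present_copy_titles   # non-persistent copy 114

    def get_title(self, variable_name):      # one title per request
        return self._titles[variable_name]

def extract_edited_titles(player, titled_structure):
    return {name: player.get_title(name)
            for name in titled_structure["titles"]}

structure = {"titles": {"title1": "My Holiday", "subtitle1": "Summer"}}
player = FakePlayer({"title1": "Our Holiday", "subtitle1": "Summer"})
updated = extract_edited_titles(player, structure)
# Titles that differ from the stored list are the user's edits.
changed = {k: v for k, v in updated.items()
           if structure["titles"][k] != v}
```

The `changed` dict corresponds to the updated title list that is then handed to the title updater 907.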
Returning to Fig. 10, in order to start title editing, a user identifies a title/scene matte consisting of editable titles in the preview window and initiates, if title editing is desired, a switch from the movie mode to the clip mode by clicking in the preview window. This corresponds to the step 5040 in Fig. 10. Fig. 12 shows a more detailed process 10000 for editing titles in the clip mode.
The process 10000 commences with a block 10070 indicating that the system 300 (see Fig. 2) is in the movie mode. In a following step 10010, a user initiates entry into the clip mode. Thereafter in a step 10020, the preview module 322 (see Fig. 7) responds by fetching the EDL element 1101 of the selected media item from the EDL 1100. In a following step 10030, the reference 1115 to the current media file 1123 is extracted by the preview module 322 and passed, as depicted by the arrow 914, to the multimedia player 316. The multimedia player 316 loads the current media file 1123 into non-persistent memory as the present production copy 114 (see Fig. 1) and refreshes the preview window 320 (see Fig. 4) with the present production copy 114.
In a following step 10040, the preview module 322 checks the media type of the displayed clip, using the field 1111 in Fig. 8 of the titled media structure of the present media item, to determine if the media item has editable titles. If no editable titles are present, then the process 10000 is directed according to a "no" arrow to a terminating step 10080. Otherwise, in the step 10040, if the media type indicates a titled media, the preview module 322 sets, in a step 10050, the focus on the preview window 320 (see Fig. 4). Setting of the "focus" indicates that the system will look for inputs provided to the preview window 320. In a following step 10060, the preview module 322 generates a fake mouse click event in the preview window 320, so that the multimedia player 316 brings up a cursor in the preview window 320 for the user to use to edit titles.
Figs. 13A-13B show a process 11000 by which the title editing session is concluded and file updating is performed. The process 11000 commences in Fig. 13A with a block 11110 indicating that the system 300 in Fig. 2 is in the clip mode. When the user finishes the title editing session, the user initiates, in a step 11010, a change from the clip mode to the movie mode. Thereafter a step 11120 checks if the selected media item contains editable titles. If not, then the process 11000 follows a "no" arrow to a step 11080. Otherwise, the process 11000 follows a "yes" arrow to a step 11020 in which the preview module 322 performs a number of tasks to save edited title information before leaving the clip mode. Firstly, in the step 11020, the preview module 322 gets the list of titles with corresponding variable names from the titled media structure of the currently edited title/scene matte. In a following step 11030, the preview module 322 passes the variable names, one by one, to the multimedia player 316 and asks for the corresponding present titles that are stored in the present production copy 114 in non-persistent memory in 301.
In a following step 11040, after fetching all the titles from the multimedia player 316, the preview module 322 compares the titles from the player 316 to the list 1116 in the current titled media structure 1119. If there are no differences, the preview module 322 switches out of the clip mode immediately, and the process 11000 is directed in accordance with a "no" arrow to a step 11080 in Fig. 13B which refreshes the preview window 320 (see Fig. 4) with the current EDL 1100. Otherwise, the process 11000 is directed according to a "yes" arrow to a step 11050. In the step 11050, the preview module 322 passes, as depicted by an arrow 911 in Fig. 7, the most updated title list, which has been derived from the present production copy from the player 316, together with corresponding variable names, to the title updater 907. The preview module 322 also passes, as depicted by an arrow 910, the reference 1121 to the corresponding base media item 1118 to the title updater 907. The title updater 907 generates an updated current media item (ie creates an updated version of the current media item 1123) from the base media item 1118 and the updated present title list, and passes, as depicted by an arrow 909, the updated current media item back to the preview module 322.
In a following step 11060 on Fig. 13B, when the preview module 322 receives the new current media item (ie the updated version of 1123), the preview module 322 creates a new current titled media structure (ie an updated version of 1119 1120). The new titled media structure refers to the new current media, and stores the most updated list of titles and their associated variable names. The preview module 322 copies all metadata and the reference to the base media item 1118 from the old titled media structure to the new one. In a following step 11070, the preview module 322 updates the current EDL element 1101 for the edited title/scene matte with the new titled structure.
This ensures that other EDL elements in the current EDL that refer to the base media item from which the (changed) current media item is derived, but which do not include the text changes made to the current media item, are not affected by those text changes.
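The update-isolation behaviour of steps 11060-11070 can be sketched as follows: a new titled media structure is built that points at the updated current media item, carries the updated title list, and copies the metadata and base-media reference from the old structure; it is then swapped into the edited EDL element only. The dict-based structures and function name are assumptions made for this sketch:

```python
# Sketch of steps 11060-11070: swap a fresh titled media structure into
# the edited EDL element only, leaving other elements that share the
# same base media item untouched.
import copy

def make_updated_structure(old_structure, new_current_ref, new_titles):
    new_structure = copy.deepcopy(old_structure)   # keep metadata + base ref
    new_structure["current_media_ref"] = new_current_ref
    new_structure["titles"] = dict(new_titles)
    return new_structure

old = {"media_type": "swf", "base_media_ref": "matte.swf",
       "current_media_ref": "matte_old.swf", "titles": {"title1": "Old"}}
# Two EDL elements share the old titled media structure.
edl = [{"media": old}, {"media": old}]

new = make_updated_structure(old, "matte_new.swf", {"title1": "New"})
edl[0]["media"] = new    # step 11070: update the edited element only
```

Because the second element still references the old structure, its titles are unchanged, which is exactly the isolation property described above.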
In the final step 11080, the changes made to the current EDL 1100 are reflected when the timeline structure is played in the movie mode. The old (previous current) titled media structure is only deleted when there are no current EDL elements in the current timeline referring to it.
Figs. 14A-14C show a process 12000 for saving title changes in a persistent manner, and show the importance of the title updater 907 in this regard. The process 12000 commences with a block 12150 in Fig. 14A indicating availability, from the preview module 322, of the base media item 1118 (via 910 in Fig. 7) and the list of updated editable titles with associated variable names (via 911 in Fig. 7). Thereafter in a step 12010 the title updater 907 opens the base media file 1118 and, in a following step 12020, validates the file format. If the media file is not authored in SWF 4.0+ format, which supports editable titles, the process 12000 is directed according to a "no" arrow to a termination step 12160. Otherwise, the process creates, using the title updater 907 in a step 12030, a new blank current media file.
In a subsequent step 12040, the title updater 907 looks for a (next) tag block from the base media item. If this fails, ie if the end of the file is reached or the content is corrupted, the process 12000 is directed according to a "no" arrow to a step 12050 which checks if the new current media item is blank. If this is the case, then the process 12000 is directed according to a "yes" arrow to a step 12060 which deletes the new current media item, after which the process 12000 is directed to a step 12070 in Fig. 14B.
Otherwise, if the new current media item is not blank, then the process 12000 is directed according to a "no" arrow from the step 12050 to the step 12070 in Fig. 14B which updates the file size of the new current file after which a step 12080 closes opened files before exiting at a step 12180.
Returning to the step 12040, if a tag block is found, then the process 12000 is directed according to a "yes" arrow to a step 12090 in Fig. 14B which checks the associated tag block identifier to see if the tag block has an editable text field. If not, then the process 12000 is directed according to a "no" arrow to a step 12100 which copies the tag block into the new current media file and adds the size of the copied tag block to a counter. The process 12000 is then directed back to the step 12040 to fetch another tag block. The counter stores the size of the new file so far by accumulating the sizes of all copied tag blocks. At the end, ie when no more tag blocks can be found, the counter value is used to update the size of the new file.
Returning to the step 12090 in Fig. 14B, if an editable text field is found, then the process 12000 is directed according to a "yes" arrow to a step 12110 in which the title updater 907 parses the tag block to retrieve the title and the associated variable name. A following step 12120 uses the retrieved variable name to retrieve the title from the given current list of titles.
A subsequent test step 12130 compares the title from the tag block to the title in the updated list. If differences are not found, then the process 12000 is directed according to a "no" arrow to the step 12100 which copies the tag block into the new file. Otherwise, the process 12000 is directed according to a "yes" arrow to a step 12140 which replaces the title in the tag block with the title from the list, updates the tag block size, and copies the modified tag block into the new current file in the step 12100.
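The copy-and-substitute loop of process 12000 can be sketched schematically. Tag blocks are again modelled as (tag_id, payload) tuples rather than raw SWF records, and EDIT_TEXT_TAG is an assumed identifier, so the sizing is illustrative rather than a faithful SWF byte count:

```python
# Schematic version of process 12000: copy each tag block from the base
# media file into a new current file, substituting updated title text in
# editable-text blocks (steps 12110-12140) and accumulating the output
# size in a counter (step 12100).
EDIT_TEXT_TAG = 37   # assumed identifier for editable text fields

def block_size(block):
    tag_id, payload = block
    if tag_id == EDIT_TEXT_TAG:
        var_name, text = payload
        return len(var_name) + len(text)
    return len(payload)

def build_current_file(base_blocks, updated_titles):
    new_blocks, size = [], 0
    for tag_id, payload in base_blocks:          # step 12040: next block
        if tag_id == EDIT_TEXT_TAG:              # step 12090: editable?
            var_name, text = payload             # step 12110: parse block
            # steps 12120-12140: look up and substitute the updated title
            new_text = updated_titles.get(var_name, text)
            payload = (var_name, new_text)
        block = (tag_id, payload)
        new_blocks.append(block)                 # step 12100: copy block
        size += block_size(block)                # running size counter
    return new_blocks, size

base = [(1, b"frame"), (EDIT_TEXT_TAG, ("title1", "Old Title"))]
current, total = build_current_file(base, {"title1": "New Title"})
```

Unchanged blocks pass through verbatim, so a base file with no editable text fields yields a byte-identical current file, consistent with the "blank file" check of step 12050.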
In summary, in order to overcome the problem of saving in-place changes made to editable titles in SWF 4.0+ files, persistent temporary files are created to save changes that are otherwise held in a non-persistent manner by the multimedia player. The disclosed method works not only on titles but on all defined editable text, which can be captions, subtitles and other text used in the particular media files.
Industrial Applicability It is apparent from the above that the arrangements described are applicable to the data processing industries.
The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiments being illustrative and not restrictive.
In the context of this specification, the word "comprising" means "including principally but not necessarily solely" or "having" or "including", and not "consisting only of". Variations of the word "comprising", such as "comprise" and "comprises", have correspondingly varied meanings.
170304 670220.doc

Claims (10)

2. A method according to claim 1, wherein the format is SWF.
3. A method according to claim 1 or claim 2, wherein the third party media player is one of (a) QuickTime™ and (b) Macromedia Flash™.
4. A method according to claim 1, wherein the creating step comprises the steps of:
copying, if the base media file contains editable text, the base media file to a current media file; and
if the base media file contains said editable text, creating the present production copy from the current media file being an instance of the base media file; and
if the base media file does not contain editable text, creating the present production copy from the base media file being another instance of the base media file.

5. An apparatus for editing a base media file in a format supporting editing of titles, using a third party media player for creating a present production copy from the base media file, the apparatus comprising:
non-persistent storage means for storing, by the media player, said present production copy;
display means for displaying, by the media player, said present production copy;
editing means for editing text in-place on the displayed present production copy to thereby form an altered present production copy having changed text;
extracting means for extracting, via the API of the media player, said changed text from the altered present production copy;
incorporating means for incorporating the extracted changed text into a current media file corresponding to the altered present production copy; and
persistent storage means for storing the current media file.
6. An apparatus according to claim 5, wherein the format is SWF.
7. An apparatus according to claim 5, wherein the third party media player is one of (a) QuickTime™ and (b) Macromedia Flash™.
8. A computer program for directing a processor to execute a method of editing a base media file in a format supporting editing of titles, using a third party media player for creating a present production copy from the base media file, said media player being adapted to store the present production copy in non-persistent storage, display the present production copy, and support editing text in-place on the displayed present production copy to thereby form an altered present production copy having changed text, the program comprising:
code for extracting, via the API of the media player, said changed text from the altered present production copy;
code for incorporating the extracted changed text into a current media file corresponding to the altered present production copy; and
code for storing the current media file in persistent storage.
9. A program according to claim 8, wherein the format is SWF.

10. A program according to claim 8, wherein the third party media player is one of (a) QuickTime™ and (b) Macromedia Flash™.
  11. A method substantially as described herein with reference to the accompanying figures.
  12. An apparatus substantially as described herein with reference to the accompanying figures.
  13. A computer program substantially as described herein with reference to the accompanying figures.

DATED this 27th Day of July, 2007
Canon Information Systems Research Australia Pty Ltd
Patent Attorneys for the Applicant
SPRUSON & FERGUSON
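The claims above describe a round-trip workflow: create an in-memory (non-persistent) production copy of the base media file, let the user edit text in-place on the displayed copy, extract the changed text, incorporate it into a current media file, and persist that file. The following minimal Python sketch illustrates that flow with a plain in-memory stand-in for the media file; all names here (`load_base_media`, `make_production_copy`, etc.) are hypothetical illustrations, not the patented implementation or any real media-player API such as QuickTime's or Flash's.

```python
import copy
import json
import os
import tempfile

def load_base_media():
    # Hypothetical stand-in for a base media file whose format (e.g. SWF)
    # carries editable text fields alongside opaque media data.
    return {"format": "SWF", "media": "<frames>", "text_fields": {"title": "My Holiday"}}

def make_production_copy(base):
    # Non-persistent, in-memory copy that the player would display;
    # edits happen on this copy, never on the base file itself.
    return copy.deepcopy(base)

def edit_text_in_place(production_copy, field, new_text):
    # Simulates the user editing a title in-place on the displayed copy.
    production_copy["text_fields"][field] = new_text

def extract_changed_text(altered_copy, base):
    # Simulates extracting only the changed text via the player's API.
    return {k: v for k, v in altered_copy["text_fields"].items()
            if base["text_fields"].get(k) != v}

def incorporate_and_persist(base, changed_text, out_path):
    # Builds the current media file with the changed text and stores it
    # in persistent storage (here, a JSON file on disk).
    current = copy.deepcopy(base)
    current["text_fields"].update(changed_text)
    with open(out_path, "w") as f:
        json.dump(current, f)
    return current

base = load_base_media()
prod = make_production_copy(base)
edit_text_in_place(prod, "title", "Summer 2003")
changes = extract_changed_text(prod, base)
out_path = os.path.join(tempfile.gettempdir(), "current_media.json")
current = incorporate_and_persist(base, changes, out_path)
print(current["text_fields"]["title"])  # → Summer 2003; base file is unchanged
```

Note the deliberate separation mirrored from the claims: the base file is never mutated, only the non-persistent production copy is edited, and persistence happens only for the assembled current media file.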
AU2004201179A 2003-03-21 2004-03-19 Editable Titling Ceased AU2004201179B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2004201179A AU2004201179B2 (en) 2003-03-21 2004-03-19 Editable Titling

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
AU2003901315 2003-03-21
AU2003901315A AU2003901315A0 (en) 2003-03-21 2003-03-21 Editable titling
AU2004201179A AU2004201179B2 (en) 2003-03-21 2004-03-19 Editable Titling

Publications (2)

Publication Number Publication Date
AU2004201179A1 AU2004201179A1 (en) 2004-10-07
AU2004201179B2 true AU2004201179B2 (en) 2007-08-16

Family

ID=34378451

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2004201179A Ceased AU2004201179B2 (en) 2003-03-21 2004-03-19 Editable Titling

Country Status (1)

Country Link
AU (1) AU2004201179B2 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030026592A1 (en) * 2000-12-28 2003-02-06 Minoru Kawahara Content creating device and method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030026592A1 (en) * 2000-12-28 2003-02-06 Minoru Kawahara Content creating device and method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
QuickTime™ media player *
SWF 4.0 format media player *

Also Published As

Publication number Publication date
AU2004201179A1 (en) 2004-10-07

Similar Documents

Publication Publication Date Title
US7836389B2 (en) Editing system for audiovisual works and corresponding text for television news
US5640560A (en) CD-ROM content repurposing
US6369835B1 (en) Method and system for generating a movie file from a slide show presentation
US8196032B2 (en) Template-based multimedia authoring and sharing
US7352952B2 (en) System and method for improved video editing
US8769421B2 (en) Graphical user interface for a media-editing application with a segmented timeline
US20010033296A1 (en) Method and apparatus for delivery and presentation of data
US20080247726A1 (en) Video editor and method of editing videos
US20060204214A1 (en) Picture line audio augmentation
US20050268279A1 (en) Automated multimedia object models
US20030215214A1 (en) Dual mode timeline interface
EP2110818A1 (en) Methods and apparatus for creation, distribution and presentation of polymorphic media
GB2400530A (en) Enabling an application program running on an electronic device to provide media manipulation capabilities
US20090064005A1 (en) In-place upload and editing application for editing media assets
JP2008152907A (en) Editing time-based media with enhanced content
KR20040023595A (en) Interactive media authoring without access to original source material
US20140255009A1 (en) Theme-based effects multimedia editor systems and methods
US20170025153A1 (en) Theme-based effects multimedia editor
US20140317506A1 (en) Multimedia editor systems and methods based on multidimensional cues
AU2004201179B2 (en) Editable Titling
US20090044115A1 (en) Previewing audio data
EP2075732A2 (en) Media editing system using digital rights management metadata to limit import, editing and export operations performed on temporal media
WO2006037162A1 (en) Method and system for preloading
CN113556576B (en) Video generation method and device
WO2008139469A1 (en) Method and device for accessing data in signage systems

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)
MK14 Patent ceased section 143(a) (annual fees not paid) or expired