GB2494437A - The handling and management of media files - Google Patents

The handling and management of media files

Info

Publication number
GB2494437A
GB2494437A GB1115544.7A GB201115544A GB2494437A GB 2494437 A GB2494437 A GB 2494437A GB 201115544 A GB201115544 A GB 201115544A GB 2494437 A GB2494437 A GB 2494437A
Authority
GB
United Kingdom
Prior art keywords
file
media
user terminal
files
project
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1115544.7A
Other versions
GB201115544D0 (en)
Inventor
James Ferguson Mellor
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hogarth Worldwide Ltd
Original Assignee
Hogarth Worldwide Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hogarth Worldwide Ltd filed Critical Hogarth Worldwide Ltd
Priority to GB1115544.7A priority Critical patent/GB2494437A/en
Publication of GB201115544D0 publication Critical patent/GB201115544D0/en
Priority to PCT/GB2012/052203 priority patent/WO2013034922A2/en
Publication of GB2494437A publication Critical patent/GB2494437A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10File systems; File servers
    • G06F16/16File or folder operations, e.g. details of user interfaces specifically adapted to file systems
    • G06F16/164File meta data generation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10File systems; File servers
    • G06F16/18File system types
    • G06F16/182Distributed file systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/48Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60Information retrieval; Database structures therefor; File system structures therefor of audio data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/11Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H04N21/2353Processing of additional data, e.g. scrambling of additional data or processing content descriptors specifically adapted to content descriptors, e.g. coding, compressing or processing of metadata
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/266Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
    • H04N21/26603Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel for automatically generating descriptors from content, e.g. when it is not made available by its provider, using content analysis techniques
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/27Server based end-user applications
    • H04N21/274Storing end-user multimedia data in response to end-user request, e.g. network recorder
    • H04N21/2743Video hosting of uploaded data from client
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/4621Controlling the complexity of the content stream or additional data, e.g. lowering the resolution or bit-rate of the video stream for a mobile client with a small screen

Abstract

A method comprises uploading a media file (such as an image, music/audio file or video) from a user terminal 16 to a remote server 10; analysing the uploaded media file; associating metadata with the media file based on the analysis; generating a reduced-data copy (such as a video or image of poorer quality or lower resolution); and storing the media file, associated metadata and the reduced-data copy. This method allows different users to locate, access and edit files without having to transfer the full-data copy. Another method comprises receiving a project file (in or converted to canonical form), the project file comprising links to media files; and compiling a list of the media files to which the project file contains a link. A further method comprises receiving a request to copy one or more files from a first computer to a second computer, and only copying those files that are not already stored on the second computer to avoid redundant or duplicate copying.

Description

INTELLECTUAL PROPERTY OFFICE
Application No. GB1115544.7 RTM Date: 18 December 2012
The following terms are registered trademarks and should be read as such (possibly by using RTM) wherever they occur in this document (pages 8, 12 and 14): Adobe Premier, Avid, Apple Final Cut Pro. Intellectual Property Office is an operating name of the Patent Office. www.ipo.gov.uk
Handling Media Files
Field
The invention relates to the handling of media files.
Background
Collaboration by many users on the creation of a media project, such as an audio-visual advert, is difficult. There are many reasons for this, but one is that the working copy of the media project can only be stored on one user's terminal at any one time. As such, in order for more than one person to work on the media project, the working copy must be transferred between users. This requires a large amount of data transfer, particularly when high resolution video files are used.
The invention was made within this context.
Summary
In a first aspect, this specification describes a method comprising uploading a media file from a user terminal to a remote server, analysing the uploaded media file, associating metadata with the media file based on the analysis, generating a reduced-data copy of the media file, and storing the uploaded media file, the associated metadata and the reduced data copy.
In a second aspect, the specification describes apparatus comprising means for uploading a media file from a user terminal to a remote server, means for analysing the uploaded media file, means for associating metadata with the media file based on the analysis, means for generating a reduced-data copy of the media file, and means for storing the uploaded media file, the associated metadata and the reduced data copy.
In a third aspect, this specification describes a method comprising receiving a project file, the project file comprising one or more links to one or more media files, analysing the project file, and compiling a list of the one or more media files to which the project file contains a link.
In a fourth aspect, this specification describes apparatus comprising means for receiving a project file, the project file comprising one or more links to one or more media files, means for analysing the project file, and means for compiling a list of the one or more media files to which the project file contains a link.
In a fifth aspect, this specification describes apparatus comprising means for receiving a request to copy a collection of one or more files from first computing apparatus to second, remote computing apparatus, means for identifying one or more files of the collection that are already stored on the second computing apparatus, and means for copying from the first apparatus to the second, remote apparatus only those files of the collection that are not already stored on the second computing apparatus.
In a sixth aspect, this specification describes a method comprising receiving a request to copy a collection of one or more files from first computing apparatus to second, remote computing apparatus, identifying one or more files of the collection that are already stored on the second computing apparatus, and copying from the first apparatus to the second, remote apparatus only those files of the collection that are not already stored on the second computing apparatus.
In a seventh aspect, this specification describes apparatus comprising at least one processor and at least one memory, the at least one memory having stored thereon computer-readable instructions which, when executed by the at least one processor, cause the at least one processor to perform a method according to any of the first, third or fifth aspects.
In an eighth aspect, this specification describes apparatus configured to perform a method according to any of the first, third or fifth aspects.
Computer-readable instructions which, when executed by at least one processor, cause the at least one processor to perform a method according to any of the first, third or fifth aspects.
A non-transitory computer readable medium having stored thereon computer-readable code which, when executed by at least one processor, causes the at least one processor to perform a method according to any of the first, third or fifth aspects.
Brief Description of the Figures
For a more complete understanding of example embodiments of the present invention, reference is now made to the following description taken in connection with the accompanying drawings, in which: Figure 1 is an illustrative schematic of a media file management system; Figure 2 is an alternative schematic illustration of the MFM server, the storage management server, and the MFM storage shown in Figure 1; Figure 3 is an example operation that can be performed by the system of Figure 1; Figure 4 is an example operation that can be performed by the system of Figure 1; and Figure 5 is an example operation that can be performed by the system of Figure 1.
Detailed Description
In the description and drawings, like reference numerals refer to like elements throughout.
Figure 1 is an illustrative schematic of a media file management system 1 according to embodiments of the invention. The system 1 allows the management of media files, such as video, audio and image files, throughout production, from initial recording of the media files to final distribution to one or more intended recipients.
The media file management (MFM) system 1 is a "cloud-based" system. As such, the media files are stored in MFM storage 12 located "in the cloud" 10. One or more users can access and manage the media files, or copies thereof, by accessing an MFM application 142A running on an MFM server 14 located in the cloud 10. The MFM application 142A can be accessed via a browser application 26A running on a user terminal 16.
The MFM system 1 comprises "the cloud" 10 and one or more user terminals 16 which are in communication with the cloud 10. Located in the cloud 10 are the MFM server 14, MFM storage 12 and at least one transceiver 18. The MFM server 14 is host to the MFM application 142A, which is accessible by remote user terminals 16 via a browser 26A (e.g. a web browser) running thereon. The MFM storage 12 is operable to store the media files 12A that are to be managed by the MFM server 14. The at least one transceiver 18 enables communication between the user terminals 16 and the MFM server 14.
The MFM server 14, which may be provided in one location or may be distributed over multiple locations, comprises at least one controller 140 which controls the operation of the MFM server 14. The one or more controllers 140 comprise at least one processor 140A which is operable to execute the MFM application 142A. The MFM application may be stored in the form of computer-readable code in memory 142. The memory 142 comprises one or more distinct memory media, such as ROM, RAM or any other suitable type of memory. The at least one controller 140 is operable under the control of the MFM application 142A to allow the users to manage the media files 12A stored in the MFM storage 12, through a browser application 26A running on the user terminals 16.
The MFM server 14 is operable to cause media files to be stored in the MFM storage 12. The media files are uploaded from a user terminal 16 by the MFM application 142A through the browser application 26A. The MFM server 14 is also operable to cause media files 12A, or copies of media files, to be retrieved from the MFM storage 12 for consumption by users through their user terminal 16.
A storage management server 20 may reside between the MFM server 14 and the MFM storage 12. This storage management server 20 is operable to manage the act of storing the media files in the MFM storage 12 so as to ensure the security of the media files. As such, the MFM server 14 is operable to pass media files 12A received from user terminals 16 to the storage management server 20, which stores the media files 12A in one or more suitable locations within the MFM storage 12.
Similarly, when a user requests retrieval of a media file 12A, or a copy of a media file, the MFM server 14 requests the retrieval of the media file 12A, or copy thereof, by the storage management server 20 from the MFM storage 12.
The user terminal 16 comprises a transceiver 22 for allowing communication between the user terminal 16 and the cloud 10. The user terminal 16 also comprises at least one controller 24 and at least one memory 26. The at least one controller 24 is operable, under the control of computer-readable code 26B stored in the at least one memory 26, to control the other components of the user terminal 16. These other components may include a display 28 and a user input interface 30. The user input interface 30 is operable to receive user inputs and to pass signals indicative thereof to the at least one controller 24. The user input interface 30 may comprise, for example, a mouse, a keyboard, a keypad, a touchscreen, a touch pad or any other suitable type of user input interface 30. Stored in the memory 26 is the browser application 26A, which, when executed by the at least one controller 24, allows the user to access the MFM application 142A.
The user terminal may also include a local MFM application 26C. The local MFM application 26C, when executed by the at least one processor, may communicate with the MFM server 14 and may perform media file management functions locally on the user terminal 16. The local MFM application 26C may operate in the background (i.e. may not be apparent to the user). Functionality that may be provided by the local MFM application 26C will be understood from the following description, particularly the description relating to Figure 3.
The user terminal 16 may comprise, for example, a desktop computer, a laptop computer, a tablet computer, a smart phone, a personal digital assistant system or any other type of user terminal that is operable to execute a browser application.
The user terminals 16 are operable to communicate directly with the MFM server 14. In some cases, user terminals may also be operable to communicate with the MFM server 14 via a local network server 32. The local network server 32 may include local storage 32A.
Figure 2 is an alternative schematic illustration of the MFM server 14, the storage management server 20, and the MFM storage 12 shown in Figure 1.
As can be seen in Figure 2, the MFM server 14 may be said to include different modules, each providing a different functionality. This is illustrative only. The skilled person would know that the different functionalities may not necessarily be provided by entirely distinct software modules within the MFM application 142A.
In Figure 2, the MFM server 14 is shown to comprise three modules, each providing a different functionality. These are the ingestion module 40, the management module 42 and the distribution module 44.
The ingestion module 40 provides the functionality that enables users to upload media files 12A from their user terminal 16 to the MFM server 14. The uploaded media files are passed from the MFM server 14 to the storage management server 20, which stores the media files in one or more suitable locations within the MFM storage 12. Media files that have been uploaded to the MFM server 14 may hereafter be referred to as "media assets". As can be seen in Figure 2, the MFM storage 12 may comprise multiple different storage area networks or network attached storage (SAN or NAS) 46, 48, 50. These may be located in various different geographic locations. The storage management server 20 may be operable to cause media files to be replicated over different SANs in different geographic locations, thereby to ensure the security of the media files.
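A minimal sketch, assuming hypothetical mount points for the storage units 46, 48 and 50 and a simple copy-everything-everywhere policy (neither of which is prescribed above), of how such replication might be implemented:

```python
# Illustrative sketch only: one way a storage management process might replicate
# an ingested media file across several storage locations. The mount points and
# the replication policy below are assumptions made for the example.
import shutil
from pathlib import Path

STORAGE_LOCATIONS = [          # hypothetical mount points for the SAN/NAS units
    Path("/mnt/san_eu_west"),
    Path("/mnt/san_us_east"),
    Path("/mnt/nas_ap_south"),
]

def store_with_replication(media_file: Path) -> list[Path]:
    """Copy the uploaded media file to every configured storage location."""
    stored_copies = []
    for location in STORAGE_LOCATIONS:
        destination = location / media_file.name
        destination.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(media_file, destination)   # preserves timestamps and permissions
        stored_copies.append(destination)
    return stored_copies
```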
Following upload (or ingestion) of the media file, the MFM server 14 is operable to produce a proxy version of the asset. This may be performed by the ingestion module 40 or may be performed by a different part of the MFM application 142A. The proxy version of the asset is a copy of the asset that contains less data than the original asset. The proxy version of the asset may also be referred to as a reduced-data copy of the asset (or uploaded media file), or simply a proxy file. If we consider an example wherein the media file is a high resolution video data file, the proxy version may be, for example, an H.264 MPEG-4 AVC file.
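A minimal sketch of such proxy generation, assuming FFmpeg is available on the server and using illustrative encoding settings (no particular tool or settings are prescribed above):

```python
# Sketch of generating a reduced-data (proxy) copy of a high resolution video.
# FFmpeg is used here purely as an example tool; the encoding settings below
# (480-line scale, CRF 28, 96 kbit/s AAC audio) are assumptions.
import subprocess
from pathlib import Path

def make_proxy(original: Path, proxy_dir: Path) -> Path:
    """Transcode a high-resolution video into a small H.264/AAC MP4 proxy."""
    proxy_dir.mkdir(parents=True, exist_ok=True)
    proxy_path = proxy_dir / (original.stem + "_proxy.mp4")
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", str(original),
            "-vf", "scale=-2:480",                       # reduce resolution to 480 lines
            "-c:v", "libx264", "-crf", "28", "-preset", "veryfast",
            "-c:a", "aac", "-b:a", "96k",
            str(proxy_path),
        ],
        check=True,
    )
    return proxy_path
```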
The proxy version is stored in the MFM storage 12 in association with the original asset 12A for later retrieval for delivery to one or more user terminals 16. The use of a proxy file allows a user of a user terminal to view the asset through their browser, without needing to stream or download the entire "data-heavy" version.
The functionality provided by the management module 42 allows the user to perform a number of operations in respect of the uploaded media file (also referred to as a media asset or, simply, an asset). These operations may include browsing or searching through stored assets, editing or viewing metadata associated with the asset, viewing the proxy version of the asset, downloading the proxy version or a copy of the original asset, sharing assets with other users, transcoding assets into different formats, deleting or removing assets, and, when the asset is a video file, generating one or more still images from the video file.
The functionality provided by the distribution module 44 allows the user to distribute assets simultaneously to a plurality of different recipients. Where the asset is, for example, a video for a television advert, the intended recipients may include broadcasters.
The MFM system 1 according to the present invention allows a user to upload a media file which can then be downloaded by other users for editing or manipulation on their user terminal. The edited or manipulated version of the media file can then be uploaded back to the MFM server 14 and, once the asset is in its final form, can be distributed from the MFM storage 12 to its intended final recipients. At any time from ingestion of an original un-edited media file to distribution of the final version of the asset, users are able to access and view proxy versions. As such, the system provides an environment wherein a plurality of users are able to collaborate on the editing and manipulation of a single project.
Often a project, such as a television advert, will comprise many different media files. For example, a television advert might include parts of one or more high resolution video files, one or more audio files, one or more animations and one or more still images. Most common non-linear editing (NLE) software applications, instead of creating a single file comprised of the relevant parts of all the different media files, create what is known as a "project file". A project file is a data file that comprises a plurality of links (e.g. URLs) to the media files used in the project. In addition, the project file may contain a timestamp denoting the time within the project at which a particular media file is used, timestamps or other data items identifying the part of the media file that is used in the project, and any transitions or visual effects. As such, when a project is viewed, the project file may be used such that the NLE software application accesses and displays/outputs the relevant parts of the linked media assets when they are required.
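Purely to make the structure of such a project file concrete, the linked-media and timing information it carries might be modelled as in the following sketch; the field names are assumptions, since each NLE application defines its own schema:

```python
# Hypothetical in-memory representation of the information a project file carries:
# links to media files plus in/out points locating the used portion on the
# project timeline. Field names are illustrative only, not a real NLE schema.
from dataclasses import dataclass, field

@dataclass
class ClipReference:
    media_url: str         # link (e.g. URL or path) to the underlying media file
    project_start: float   # time within the project at which the clip appears (seconds)
    source_in: float       # start of the used portion within the media file (seconds)
    source_out: float      # end of the used portion within the media file (seconds)
    transition: str = ""   # optional transition or visual effect applied

@dataclass
class ProjectFile:
    name: str
    clips: list[ClipReference] = field(default_factory=list)
```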
There are many different NLE software applications available. These include Adobe Premier, Avid Media Composer, Apple Final Cut Pro 7 and Final Cut Pro X. The format of project files created and utilised by each of these applications is, in many cases, different.
Figure 3 is a flowchart of an operation according to example embodiments of the invention. More specifically, the operation of Figure 3 relates to the ingestion functionality provided by the MFM server 14.
In step S3.1, a user makes a media file available to their user terminal 16. In some examples, this may include connecting to the user terminal 16 a removable storage medium upon which the media file is stored. For the purpose of this example, the media file is taken to be a high resolution video file. However, it will be understood that the operation also applies to different types of media file.
In step S3.2, the local background process MFM application 26C running on the user terminal 16 examines the media file made available to the user terminal 16. If the local MFM application 26C does not recognise the type of the media file, it ignores the media file. If, however, the application 26C does recognise the type of the media file, which is in this instance a video file, the operation proceeds to step S3.3.
In step S3.3, the local MFM application 26C responds to recognition of the media file by authenticating, and thereby identifying, the user, or the user terminal, to the MFM server 14. This may be performed, for example, using a key that is shared between the local MFM application 26C and the MFM server 14.
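A minimal sketch of one way such shared-key authentication could work, assuming a simple challenge-response exchange using an HMAC (only the use of a shared key is described above; the scheme itself is an assumption):

```python
# Sketch of a shared-key challenge/response between the local MFM application
# (client side) and the MFM server (server side). The key value and the use of
# HMAC-SHA256 are illustrative assumptions.
import hashlib
import hmac
import os

SHARED_KEY = b"example-shared-key"   # hypothetical key provisioned on both sides

def server_issue_challenge() -> bytes:
    """Server generates a fresh random challenge for the client to sign."""
    return os.urandom(32)

def client_sign_challenge(challenge: bytes) -> bytes:
    """Client proves possession of the shared key by signing the challenge."""
    return hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()

def server_verify(challenge: bytes, response: bytes) -> bool:
    """Server recomputes the HMAC and compares in constant time."""
    expected = hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```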
In step S3.4, once the user has been authenticated, uploading of the media file to the MFM server 14 is initiated. The upload may be instigated by either of the local MFM application 26C and the MFM server 14. This uploading, or ingestion, of the media file may be performed using UDP accelerated file transfer. The uploading may occur in the background.
In step S3.5, subsequent to the MFM server 14 receiving the uploaded media file, the uploaded media file (or media asset) is provided with metadata. The metadata may be embedded within the media asset or may be a separate file that is associated with the media asset in some way.
The metadata includes a plurality of metadata fields which include information relating to the media asset. This information may include one or more of a media file name, a media file type, a country of origin, a duration, a resolution and one or more identification numbers or codes. Other metadata which specifically relates to a use of the media file may also be included. For example, if the asset is a high resolution video file that is to be used as part of an advertising campaign, the metadata information may also include an advertiser, an advertising agency, a brand, a product, a campaign name, an agency producer, a music composer, a music title, a production company, a post-production company and various other different fields.
The metadata may be automatically determined from an analysis of the media asset by the MFM server 14. The metadata may also (or alternatively) be determined based on the identity of the user or user terminal as authenticated in step S3.3. The metadata may also (or alternatively) be provided by the user via the browser application 26A.
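As an illustrative sketch only, combining the three metadata sources just described might look like the following; the field names and the dictionary layout are assumptions:

```python
# Sketch of assembling a metadata record for an ingested asset from three sources:
# automatic analysis of the file, the authenticated user/terminal, and fields
# entered through the browser. All field names here are illustrative.
from pathlib import Path

def build_metadata(media_file: Path, authenticated_user: dict, user_supplied: dict) -> dict:
    metadata = {
        # derived automatically from the uploaded file itself
        "file_name": media_file.name,
        "file_type": media_file.suffix.lstrip(".").lower(),
        "file_size_bytes": media_file.stat().st_size,
        # derived from the identity established during authentication (step S3.3)
        "uploaded_by": authenticated_user.get("user_id"),
        "agency": authenticated_user.get("agency"),
    }
    # fields the user entered via the browser (advertiser, brand, campaign name, ...)
    metadata.update(user_supplied)
    return metadata
```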
Next, in step S3.6, the MFM application 142A generates the proxy version of the uploaded media file.
Subsequently, in step S3.7, the media asset, the metadata and the proxy file are passed to the storage management server 20 for storing in the MFM storage 12.
As described above, users are able to view, and to comment on, the proxy file using their browser application 26A in conjunction with the MFM application 142A.
The operation described with reference to Figure 3 could be said to be an automatic method for the encrypted and accelerated transfer of large quantities of high resolution video data (often many gigabytes in size), along with descriptive metadata, to a cloud-based digital video file review and management application (i.e. the MFM server 14).
As described above, during the creation of a project, a project file may be created by an NLE application. Figure 4 depicts an example of the way in which projects may be handled by the MFM server 14.
In step S4.1, a project is created by the user on their user terminal 16 using any suitable NLE application. As described above, the project includes a project file and at least one media file. In step S4.2, the project file is saved in the non-linear editing application's native format or is exported as an XML file.
Next, in step S4.3, the user uses the MFM application 142A via their browser application 26A to indicate that they wish to upload the saved project file to the MFM server 14. This may occur, for example, by dragging an icon representative of the project file into an appropriate region of the browser window.
In step S4.4, the MFM application 142A analyses the project file and extracts identifiers of the media files to which the project file links. In some types of project file, these identifiers may comprise universal resource locators (URLs).
Subsequently, in step S4.5, the MFM application 142A compiles and displays on the display 28 of the user terminal 16 a list, or summary, of the media files to which links are provided in the project file.
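A minimal sketch of steps S4.4 and S4.5, assuming a simple XML-exported project file in which each clip element carries a "src" attribute (real NLE project-file schemas differ and are not specified here):

```python
# Sketch of scanning an XML project file for the media files it links to and
# compiling a de-duplicated list. The element name "clip" and the attribute
# "src" are illustrative assumptions, not a real NLE schema.
import xml.etree.ElementTree as ET

def list_linked_media(project_xml_path: str) -> list[str]:
    """Return the distinct media-file links found in the project file."""
    tree = ET.parse(project_xml_path)
    links: list[str] = []
    for clip in tree.iter("clip"):          # hypothetical element name
        src = clip.get("src")               # hypothetical attribute holding the URL
        if src and src not in links:
            links.append(src)
    return links
```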
In response to the displayed summary of linked media files, in step S4.6, the user locates the listed media files and marks them for uploading to the MFM server 14.
This may be carried out in any suitable way. For example, the user may drag icons representative of the media files into a particular region of the browser window.
Next, in step S4.7, once location and marking of the linked media files is complete, the marked media files are uploaded to the MFM server 14. This may be carried out using UDP accelerated file transfer. Once the media files are uploaded to the MFM server 14 they may be referred to as media assets. The initiation of the uploading may be in response to a user input received via the user input interface 30 of the user terminal 16. The user input may be indicative of all the linked files having been located and marked. Alternatively, the uploading may be initiated following an automatic determination by the MFM server 14 that all of the media files listed in step S4.5 have been marked for uploading by the user.
Next, in step S4.8, the uploaded assets are provided with metadata. This may be similar to step S3.5 as described with reference to Figure 3.
In step S4.9, the project file is uploaded from the user terminal 16 to the MFM server 14 and is converted into a canonical format. A canonical format is a platform-agnostic description format, one that allows a file to be translated easily into any different format.
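As a sketch only, a canonical representation could be as simple as a neutral JSON structure from which any supported project-file format is later re-generated; the schema identifier and field names shown here are assumptions:

```python
# Sketch of converting a parsed project into a simple canonical, platform-agnostic
# representation (JSON). The structure is an illustrative assumption chosen so the
# same data could later be re-emitted in any supported NLE format.
import json

def to_canonical(project_name: str, clips: list[dict]) -> str:
    canonical = {
        "schema": "mfm-canonical-project/1",   # hypothetical schema identifier
        "name": project_name,
        "clips": [
            {
                "media_url": c["media_url"],
                "project_start": c["project_start"],
                "source_in": c["source_in"],
                "source_out": c["source_out"],
            }
            for c in clips
        ],
    }
    return json.dumps(canonical, indent=2)
```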
In step S4.10, the canonically formatted project file is used in conjunction with the media files linked and 'conformed' therein to produce a reduced-data version (or a proxy version) of the project as a video file. The reduced-data version of the project can be streamed to user terminals 16 on request.
Finally, in step S4.11, the canonically formatted project file and the linked assets are stored as part of a project collection object. The project file and the assets that make up a project collection object may be referred to as "elements" of the collection.
Certain aspects of the method described in Figure 4 allow users to upload content created and referenced by NLE applications (such as Adobe Premier, Apple Final Cut Pro XML and Apple Final Cut Pro X, or Avid Media Composer) using an encrypted and accelerated transfer method to send large quantities of high resolution video data (often many gigabytes), along with descriptive metadata, to a web-based digital video file review and management application (i.e. the MFM application).
Figure 5 illustrates a method that may occur when a user wishes to download and edit a project collection object such as was created in the operation of Figure 4.
First, in step S5.1, a user authenticates themselves to the MFM server 14. This may be carried out, for example, by submitting a username and password to the MFM server 14 via their browser application 26A running on the user terminal 16.
Next, in step S5.2, the user browses or searches through the project collection objects, which are stored in the MFM storage 12 and to which the user has access, to identify a collection object upon which they wish to work. Searching may be done based on the metadata associated with the media assets that form part of the project collection object, or based on metadata that is associated with the project collection object in some other way.
In step S5.3, once the user has selected a project collection object, the user indicates a format into which the canonically formatted project file that forms part of the selected project collection object is to be converted. The format may be indicated by the user identifying an NLE application with which they wish to edit the project.
Alternatively, the user may simply select a format from a list.
In step S5.4, the MFM application 142A responds to the indication of the desired format by converting the canonically formatted project file from the canonical format into the desired format.
In step S5.5, the MFM application 142A determines which elements of the selected project collection object are already stored in storage local to the user terminal 16.
This may be determined by a web-plugin background process running on the user terminal 16 using, for example, a checksum algorithm to validate the uniqueness of each individual media element.
In step S5.6, the MFM application 142A causes only those elements that were not found already to be stored in the local storage to be downloaded to the user terminal 16. The elements that were found already to be present on the local storage are not downloaded. The local storage may be part of the user terminal 16 or may instead be available via a local network server 32.
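A minimal sketch of steps S5.5 and S5.6, assuming the server publishes an expected checksum for each element and that SHA-256 is the checksum algorithm (no particular algorithm is specified above):

```python
# Sketch of using checksums to decide which elements of a collection are already
# present locally, so that only the missing or changed ones need to be transferred.
# SHA-256 is used here as an example checksum algorithm.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):   # read in 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest()

def elements_to_download(collection: dict[str, str], local_dir: Path) -> list[str]:
    """collection maps element name -> expected checksum as recorded by the server."""
    missing = []
    for name, expected_checksum in collection.items():
        local_copy = local_dir / name
        if not local_copy.exists() or sha256_of(local_copy) != expected_checksum:
            missing.append(name)
    return missing
```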
Following step S5.6, the converted project file and all of the media files linked therein are present in the local storage of the user terminal 16. In step S5.7, the user edits the project using an NLE application running on their terminal 16.
Once the user has finished editing the project, in step S5.8, he indicates to the MFM application 142A (that is accessible through the browser of his user terminal 16) that he wishes to upload the edited project to the MFM server 14. Step S5.8 may be similar to step S4.3 described with reference to Figure 4.
In response to this indication, in step S5.9, the MFM server 14 analyses the project file and subsequently identifies which elements of the edited project are not currently stored in the MFM storage 12. The elements that are not stored may include the edited project file, and any edited or newly added media files. In some examples, similarly to step S4.5 of Figure 4, the MFM application 142A may cause a list of the elements that are not currently stored in the MFM storage 12 to be displayed on the display 28. The user may then locate those elements and mark them for uploading to the MFM server 14.
Finally, in step S5.10, the MFM application 142A uploads from the user terminal 16 to the MFM server 14 those elements that were not previously stored in the MFM storage 12.
The operation described with reference to Figure 5 allows users to browse and search uploaded collections of data that were created using a variety of desktop NLE software package formats, including Adobe Premier, Apple Final Cut Pro XML and Apple Final Cut Pro X, or Avid Media Composer. The users are subsequently able to convert project files into any alternative supported desktop NLE software package format and to download the converted project file, along with referenced high resolution video material that has not already been stored in the target location, using encrypted and accelerated transfer methods. By identifying and only transferring those elements that are not already present at the destination, the amount of required data transfer is reduced. This is particularly significant when some of the media files linked in a project are high resolution video files.
It will be appreciated that the operations described with reference to Figures 3 to 5 are only examples and that variations to these operations are also possible. In some embodiments the order of the steps shown in the flow charts may be different or certain steps may be omitted. For example, step S4.7 of Figure 4 may be performed concurrently with or subsequent to step S4.9.
It should be realized that the foregoing embodiments should not be construed as limiting. Other variations and modifications will be apparent to persons skilled in the art upon reading the present application. Moreover, the disclosure of the present application should be understood to include any novel features or any novel combination of features either explicitly or implicitly disclosed herein or any generalization thereof and, during the prosecution of the present application or of any application derived therefrom, new claims may be formulated to cover any such features and/or combination of such features.

Claims (1)

1. A method comprising: uploading a media file from a user terminal to a remote server; analysing the uploaded media file; associating metadata with the media file based on the analysis; generating a reduced-data copy of the media file; and storing the uploaded media file, the associated metadata and the reduced data copy.
2. The method of claim 1, comprising: detecting availability of the media file at the user terminal; and uploading the media file to the remote server in response to the detection.
3. The method of claim 1 or claim 2, comprising: receiving a request from a remote user terminal to view the content of the uploaded media file; and responding to the request by communicating the reduced data copy of the media file to the user terminal from the remote server.
4. The method of any preceding claim, comprising: in response to detecting the availability of the media file and prior to uploading the media file, authenticating the user terminal or the user of the user terminal to the remote server.
5. The method of claim 4, comprising: associating metadata with the media file based on information associated with the authenticated user or user terminal.
6. Apparatus comprising: means for uploading a media file from a user terminal to a remote server; means for analysing the uploaded media file; means for associating metadata with the media file based on the analysis; means for generating a reduced-data copy of the media file; and means for storing the uploaded media file, the associated metadata and the reduced data copy.
7. The apparatus of claim 6, comprising: means for detecting availability of the media file at the user terminal, wherein the means for uploading uploads the media file to the remote server in response to a detection of the availability of the media file.
8. The apparatus of claim 6 or claim 7, comprising: means for receiving a request from a remote user terminal to view the content of the uploaded media file; and means for responding to the request by communicating the reduced data copy of the media file to the user terminal from the remote server.
9. The apparatus of any of claims 6 to 8, comprising: means for responding to detection of the availability of the media file by authenticating the user terminal or the user of the user terminal to the remote server, prior to uploading the media file.
10. The apparatus of claim 9, comprising: means for associating metadata with the media file based on information associated with the authenticated user or user terminal.
11. A method comprising: receiving a project file, the project file comprising one or more links to one or more media files; analysing the project file; and compiling a list of the one or more media files to which the project file contains a link.
12. The method of claim 11, comprising: receiving the one or more media files that were included in the compiled list; uploading the project file and the one or more media files from a user terminal to a web server; and storing the one or more media files and the project file in association with one another.
13. The method of claim 12, comprising: generating reduced-data copies of the one or more media files; and storing the reduced-data copies for subsequent review via one or more remote user terminals.
14. The method of claim 12 or claim 13, comprising converting the project file into a canonical format, wherein storing the project file comprises storing the converted project file.
15. The method of any of claims 11 to 14, comprising: analysing the uploaded one or more media files; associating metadata with the one or more media files based on the analysis; and storing the metadata in association with the one or more media files.
16. The method of any of claims 11 to 15, comprising: receiving user-provided metadata via the user terminal; and storing the user-provided metadata in association with the one or more media files.
17. Apparatus comprising: means for receiving a project file, the project file comprising one or more links to one or more media files; means for analysing the project file; and means for compiling a list of the one or more media files to which the project file contains a link.
18. The apparatus of claim 17, comprising: means for receiving the one or more media files that were included in the compiled list; means for uploading the project file and the one or more media files from a user terminal to a web server; and means for storing the one or more media files and the project file in association with one another.
19. The apparatus of claim 18, comprising: means for generating reduced-data copies of the one or more media files; and means for storing the reduced-data copies for subsequent review via one or more remote user terminals.
20. The apparatus of claim 18 or claim 19, comprising means for converting the project file into a canonical format, wherein the means for storing the project file comprises means for storing the converted project file.
21. The apparatus of any of claims 17 to 20, comprising: means for analysing the uploaded one or more media files; means for associating metadata with the one or more media files based on the analysis; and means for storing the metadata in association with the one or more media files.
22. The apparatus of any of claims 17 to 21, comprising: means for receiving user-provided metadata via the user terminal; and means for storing the user-provided metadata in association with the one or more media files.
23. A method comprising: receiving a request to copy a collection of one or more files from first computing apparatus to second, remote computing apparatus; identifying one or more files of the collection that are already stored on the second computing apparatus; and copying from the first apparatus to the second, remote apparatus only those files of the collection that are not already stored on the second computing apparatus.
24. The method of claim 23, wherein the second computing apparatus is a user terminal, the method comprising: the user terminal editing the collection; subsequently receiving an instruction to copy the edited collection from the user terminal to the first computing apparatus; identifying one or more files of the edited collection that are already stored on the first computing apparatus; and copying from the user terminal to the first computing apparatus only those files of the edited collection that are not already stored on the first computing apparatus.
25. The method of claim 23 or claim 24, wherein the collection comprises one or more media files and a project file in a canonical format, the project file containing one or more links to the one or more media files, the method comprising: converting the project file into another format; and copying the converted project file to the second apparatus with one or more media files of the collection that are not already stored at the second apparatus.
26. The method of claim 25, wherein the format into which the project file is converted is determined based on a user instruction received via the second apparatus.
27. The method of claim 25, wherein the format into which the project file is converted is determined based on an application selected by the user of the second apparatus.
28. Apparatus comprising: means for receiving a request to copy a collection of one or more files from first computing apparatus to second, remote computing apparatus; means for identifying one or more files of the collection that are already stored on the second computing apparatus; and means for copying from the first apparatus to the second, remote apparatus only those files of the collection that are not already stored on the second computing apparatus.
29. The apparatus of claim 28, wherein the second computing apparatus is a user terminal, wherein the user terminal comprises means for editing the collection, the apparatus further comprising: means for subsequently receiving an instruction to copy the edited collection from the user terminal to the first computing apparatus; means for identifying one or more files of the edited collection that are already stored on the first computing apparatus; and means for copying from the user terminal to the first computing apparatus only those files of the edited collection that are not already stored on the first computing apparatus.
30. The apparatus of claim 28 or claim 29, wherein the collection comprises one or more media files and a project file in a canonical format, the project file containing one or more links to the one or more media files, the apparatus comprising: means for converting the project file into another format; and means for copying the converted project file to the second apparatus with one or more media files of the collection that are not already stored at the second apparatus.
31. The apparatus of claim 30, wherein the format into which the project file is converted is determined based on a user instruction received via the second apparatus.
32. The apparatus of claim 30, wherein the format into which the project file is converted is determined based on an application selected by the user of the second apparatus.
33. Apparatus configured to perform the method of any of claims 1 to 5, 11 to 16 and 23 to 27.
34. Apparatus comprising: at least one processor; and at least one memory, the at least one memory having stored thereon computer-readable instructions which, when executed by the at least one processor, cause the at least one processor to perform the method of any one of claims 1 to 5, 11 to 16 and 23 to 27.
35. Computer-readable instructions which, when executed by at least one processor, cause the at least one processor to perform the method of any of claims 1 to 5, 11 to 16 and 23 to 27.
36. A non-transitory computer readable medium having stored thereon computer-readable code which, when executed by at least one processor, causes the at least one processor to perform the method of any one of claims 1 to 5, 11 to 16 and 23 to 27.
GB1115544.7A 2011-09-08 2011-09-08 The handling and management of media files Withdrawn GB2494437A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1115544.7A GB2494437A (en) 2011-09-08 2011-09-08 The handling and management of media files
PCT/GB2012/052203 WO2013034922A2 (en) 2011-09-08 2012-09-07 Handling media files

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1115544.7A GB2494437A (en) 2011-09-08 2011-09-08 The handling and management of media files

Publications (2)

Publication Number Publication Date
GB201115544D0 GB201115544D0 (en) 2011-10-26
GB2494437A true GB2494437A (en) 2013-03-13

Family

ID=44908270

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1115544.7A Withdrawn GB2494437A (en) 2011-09-08 2011-09-08 The handling and management of media files

Country Status (2)

Country Link
GB (1) GB2494437A (en)
WO (1) WO2013034922A2 (en)


Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6430576B1 (en) * 1999-05-10 2002-08-06 Patrick Gates Distributing and synchronizing objects
US20020145622A1 (en) * 2001-04-09 2002-10-10 International Business Machines Corporation Proxy content editing system
US7761574B2 (en) * 2004-03-18 2010-07-20 Andrew Liebman Media file access and storage solution for multi-workstation/multi-platform non-linear video editing systems
US9552843B2 (en) * 2008-06-19 2017-01-24 Andrew Liebman Media file access and storage solution for multi-workstation/multi-platform non-linear video editing systems
FR2933226B1 (en) * 2008-06-27 2013-03-01 Auvitec Post Production METHOD AND SYSTEM FOR PRODUCING AUDIOVISUAL WORKS
FR2940481B1 (en) * 2008-12-23 2011-07-29 Thales Sa METHOD, DEVICE AND SYSTEM FOR EDITING ENRICHED MEDIA
WO2011014772A1 (en) * 2009-07-31 2011-02-03 Citizenglobal Inc. Systems and methods for content aggregation, editing and delivery

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030093790A1 (en) * 2000-03-28 2003-05-15 Logan James D. Audio and video program recording, editing and playback systems using metadata
US20090157688A1 (en) * 2004-11-24 2009-06-18 Koninklijke Philips Electronics, N.V. Usage history based content exchange between a base system and a mobile system
US20060265425A1 (en) * 2005-05-17 2006-11-23 Raff Karl C Ii Media management for a computing device
US20100169786A1 (en) * 2006-03-29 2010-07-01 O'brien Christopher J system, method, and apparatus for visual browsing, deep tagging, and synchronized commenting
US20080016390A1 (en) * 2006-07-13 2008-01-17 David Maxwell Cannon Apparatus, system, and method for concurrent storage pool migration and backup
US20080077630A1 (en) * 2006-09-22 2008-03-27 Keith Robert O Accelerated data transfer using common prior data segments
US20080123976A1 (en) * 2006-09-22 2008-05-29 Reuters Limited Remote Picture Editing
WO2008080140A2 (en) * 2006-12-22 2008-07-03 Commvault Systems, Inc. System and method for storing redundant information
WO2010036754A1 (en) * 2008-09-26 2010-04-01 Commvault Systems, Inc. Systems and methods for managing single instancing data
US20100332958A1 (en) * 2009-06-24 2010-12-30 Yahoo! Inc. Context Aware Image Representation

Also Published As

Publication number Publication date
GB201115544D0 (en) 2011-10-26
WO2013034922A3 (en) 2013-05-10
WO2013034922A2 (en) 2013-03-14


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)