WO2004088664A2 - System and method for media management


Info

Publication number
WO2004088664A2
WO2004088664A2, PCT/GB2004/001493
Authority
WO
WIPO (PCT)
Prior art keywords
media
metadata
item
items
attribute
Application number
PCT/GB2004/001493
Other languages
English (en)
Other versions
WO2004088664A3 (fr)
Inventor
Robin Bettridge
Simon Andrewes
Stuart Murray
Neville Conway
Mark Jones
Original Assignee
BBC Technology Holdings Limited
Application filed by BBC Technology Holdings Limited
Publication of WO2004088664A2
Publication of WO2004088664A3

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 - Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/25 - Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N 21/262 - Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists
    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/02 - Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B 27/031 - Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/34 - Indicating arrangements
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04H - BROADCAST COMMUNICATION
    • H04H 20/00 - Arrangements for broadcast or for distribution combined with broadcast
    • H04H 20/42 - Arrangements for resource management
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04H - BROADCAST COMMUNICATION
    • H04H 60/00 - Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H 60/02 - Arrangements for generating broadcast information; Arrangements for generating broadcast-related information with a direct linking to broadcast information or to broadcast space-time; Arrangements for simultaneous generation of broadcast information and broadcast-related information
    • H04H 60/06 - Arrangements for scheduling broadcast services or broadcast-related services

Definitions

  • This invention relates to media management and editing, and particularly, but not exclusively to the control of media and related metadata in a non-linear media editing system.
  • Non-linear editing is used to refer to editing of digital media in which direct access to media portions is possible, as opposed to linear editing which operates on sequentially stored formats such as tape.
  • Non-linear editing provides many advantages over linear editing such as speed of access to media segments or portions, and the ability to preview and correct each edit decision without having to go to tape or disk first. It also enables more than one editor to access a common media segment simultaneously. This allows the editor or editors a greater degree of creative freedom in the editing process.
  • A typical editing sequence will start with one or more input media items being input into an editing system. These may be ingested from a digital capture or storage device, or may be digitised analogue material, and are sometimes referred to as Non-Production Media Items. One or more editors can then use segments of the input items and, by performing various operations such as cuts, fades and colour correction, create a new media item, which can be referred to as a Production Media Item.
  • Auxiliary information, or metadata, is included with media items. In this way the media essence is effectively tagged for use in database-like operations such as searching, storage tracking and retrieval.
  • the present invention consists in one aspect of:
  • a method for controlling metadata associated with media items in a media management system in which a plurality of media items are related in parent-child relationships, comprising the steps of defining at least one metadata attribute; defining a first and a second propagation property; assigning either said first or second propagation property to said at least one metadata attribute; assigning or modifying a metadata attribute component of a first media item; and, responsive to the assignment or modification, identifying media items in the system related to said first media item and, based on the propagation property of the selected attribute, selectively assigning or modifying a metadata attribute component of the identified related media items accordingly.
  • the first propagation property is a parent to child inheritance property, whereby a metadata attribute component is assigned or modified for identified related media items which are derived from said first item if the propagation property of said attribute is said first propagation property.
  • the second propagation property is a bi-directional property, whereby a metadata attribute component is assigned or modified for identified related media items which are derived from said first item and for identified related media items from which said first item is derived if the propagation property of said attribute is said second propagation property.
  • a metadata component attribute is assigned to a specified media segment of a media item, and related media items are defined as those media items within the system which include media corresponding to at least a portion of said media segment. It is then advantageous for the modification or assignment which occurs due to propagation to apply to a metadata attribute component of a related media item at a segment corresponding at least partially to said specified media segment. Using this method, propagated metadata can be assigned to particular segments or sequences of media, in a frame accurate fashion if required.
  • components are assigned start and end times corresponding to the specified media segment which they accompany. It should be understood that components need not necessarily be defined by start and end times, but should be defined by information which allows start and end times to be derived, for example a start time and a duration.
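As a purely illustrative sketch of the ideas above, the following Java fragment models a metadata attribute carrying one of the two propagation properties, and a timed component defined by a start time and a duration from which the end time is derived. All type and field names here are assumptions introduced for this example; they do not appear in the specification.

```java
// Illustrative sketch only: one way to model attributes, propagation
// properties and timed components as described above. Names are assumed.
enum PropagationProperty {
    PARENT_TO_CHILD,   // first propagation property: inheritance to derived (child) items
    BIDIRECTIONAL      // second propagation property: propagates to both parents and children
}

final class MetadataAttribute {
    final String name;                       // e.g. "Copyright Holder" (example value only)
    final PropagationProperty propagation;   // governs how components of this attribute propagate

    MetadataAttribute(String name, PropagationProperty propagation) {
        this.name = name;
        this.propagation = propagation;
    }
}

final class MetadataComponent {
    final MetadataAttribute attribute;
    final String value;
    final long startFrame;       // start of the media segment the component annotates
    final long durationFrames;   // duration; the end time is derived rather than stored
    final long assignedAtMillis; // time of assignment/modification, used to pick the most recent value

    MetadataComponent(MetadataAttribute attribute, String value,
                      long startFrame, long durationFrames, long assignedAtMillis) {
        this.attribute = attribute;
        this.value = value;
        this.startFrame = startFrame;
        this.durationFrames = durationFrames;
        this.assignedAtMillis = assignedAtMillis;
    }

    long endFrame() {
        return startFrame + durationFrames; // end time derived from start time and duration
    }
}
```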
  • a metadata component attribute of an item is propagated to all related media items within the system.
  • metadata which will be of relevance to all corresponding media instances can be propagated to all related items accordingly, while other metadata which may not be of such widespread relevance is propagated only to a subset of items.
  • Values of existing components, if any, are preferably overwritten by propagated components. More preferably propagated metadata carries with it a time value indicating when that metadata was assigned or modified, and for any given item the most relevant metadata can be derived using these time values.
  • a further aspect of the invention provides:
  • a media editing system in which a plurality of media items are related in parent-child relationships, said system comprising a tree information database adapted to store information determining the relationships between media items; and a user interface to allow a user to assign a metadata attribute component to a media segment; wherein one or more attributes are governed by a bi-directional propagation rule, such that on allocation or modification of such an attribute component to a first media segment, the database is searched to identify both parent-related and child-related media items having at least a portion of media corresponding to that media segment, and an attribute component at those identified portions in those related media items is allocated or modified accordingly.
  • Propagation rules are preferably stored and maintained in a dedicated rules engine. Searching of the database can advantageously be performed by the rules engine.
  • the user interface preferably includes a graphical timeline representative of a media item or a segment of a media item to allow attribute components to be assigned simply and intuitively.
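The following sketch suggests one way a rules engine might traverse the tree information database when an attribute component is allocated or modified: it always propagates down to child items, and, for bi-directionally propagating attributes, also up to parent items and onwards through the chain, clipping the component to the overlapping media at each step. The TreeDatabase interface and all names are assumptions standing in for the database and rules engine described above, not an actual API of the system.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Illustrative rules-engine sketch: propagating an attribute component through the
// tree of parent-child relationships. TreeDatabase stands in for the tree information
// database described above; all names are assumptions.
interface TreeDatabase {
    List<MediaItemRef> parentsOf(MediaItemRef item);    // items this item was derived from
    List<MediaItemRef> childrenOf(MediaItemRef item);   // items derived from this item

    /** Maps a frame range of 'from' onto the corresponding range of 'to'; null if no media is shared. */
    FrameRange correspondingRange(MediaItemRef from, FrameRange range, MediaItemRef to);
}

record MediaItemRef(String id) {}
record FrameRange(long start, long end) {}

final class PropagationRulesEngine {
    private record Step(MediaItemRef item, FrameRange range) {}

    private final TreeDatabase tree;

    PropagationRulesEngine(TreeDatabase tree) { this.tree = tree; }

    /** Called when a component is assigned or modified on 'origin' over 'range'. */
    void onComponentAssigned(MediaItemRef origin, FrameRange range,
                             String attribute, String value, boolean bidirectional) {
        Deque<Step> queue = new ArrayDeque<>();
        Set<String> visited = new HashSet<>();
        visited.add(origin.id());
        queue.add(new Step(origin, range));

        while (!queue.isEmpty()) {
            Step step = queue.poll();
            List<MediaItemRef> related = new ArrayList<>(tree.childrenOf(step.item()));
            if (bidirectional) {
                related.addAll(tree.parentsOf(step.item()));   // bi-directional: also walk upwards
            }
            for (MediaItemRef other : related) {
                if (!visited.add(other.id())) continue;        // already handled via another path
                FrameRange overlap = tree.correspondingRange(step.item(), step.range(), other);
                if (overlap == null) continue;                 // no shared media for this segment
                assignComponent(other, overlap, attribute, value);
                queue.add(new Step(other, overlap));           // continue through the chain of related items
            }
        }
    }

    private void assignComponent(MediaItemRef item, FrameRange range, String attribute, String value) {
        // In the described system this would update the metadata store and notify user interfaces.
        System.out.printf("assign %s=%s to %s frames %d-%d%n",
                attribute, value, item.id(), range.start(), range.end());
    }
}
```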
  • a still further aspect of the invention comprises:
  • Metadata for accompanying media items in a media management system in which a plurality of media items are related in parent-child relationships by including corresponding media segments, which metadata is propagated between parent and child media items; said metadata comprising a metadata attribute component associated with a segment of a media item; wherein said metadata attribute component has one of at least two propagation properties indicative of how that component propagates to related media items.
  • one propagation property is a parent to child inheritance property, indicating that such a metadata component associated with an item is automatically associated also with child items derived from that item.
  • one propagation property is a bi-directional propagation property, indicating that such a metadata component associated with an item is automatically associated also with child items derived from that item, and parent items from which that item is derived.
  • the metadata accompanies a specified segment of a media item, and association is only to related items including at least a portion of that specified media segment. This allows more accurate control of the propagation to identifiable segments of media.
  • bi-directional metadata propagation continues throughout the system so that bi-directional metadata associates with all related items within the media system which include at least a portion of that media segment.
  • Metadata is classified into attributes, and all metadata for a given attribute has the same propagation property.
  • values assigned or updated for those attributes will automatically be governed by the appropriate propagation property.
  • the metadata may include an indication of the time of amendment or assignment of that metadata. From this time it is possible to derive the most recently updated value.
  • Another aspect of the invention provides:
  • a method for managing metadata associated with a media item comprising maintaining a timeline associated with said media item, wherein attribute components are assignable to said media item at defined times; maintaining a database of attribute components for each media item, each component having an attribute value, a duration along said timeline, and a value representative of the time at which the component was assigned to said media item; processing the attribute components in said database to derive, for a given attribute, the most recently assigned value at each time along the attribute timeline; and displaying at each time on an attribute timeline the most recently assigned attribute value.
  • an initial attribute value is assignable to an item and, for any time on said attribute timeline not having an assigned component, that initial value is displayed. This has the same or a similar effect as assigning an initial component along the length of the timeline, which may be used in an alternative method.
  • Advantageously components can be amended, and the value representative of the time at which the component was assigned is updated to the time of the amendment each time a component is amended.
  • the method is operated in a media management system, so that multiple users can assign or update attribute values to items within the system via multiple user interfaces and, more preferably, each time an attribute value is assigned or amended, all user interfaces within the system are updated to reflect the assigned or amended value. This allows users to benefit from one another's annotation in a timely fashion.
  • components of a media item in the system may be assigned or updated automatically by propagation from another related media item.
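By way of illustration, a minimal sketch of the 'most recently assigned value wins' derivation described above: each component carries its assignment time, and the value displayed at any point on the attribute timeline is that of the latest-assigned component covering that point, falling back to the initial value elsewhere. The class and method names are assumptions made for this example.

```java
import java.util.List;

// Illustrative sketch of deriving the displayed attribute value at a point on the
// timeline: the most recently assigned component covering that time wins, and the
// initial value is shown where no component applies. Names are assumptions.
final class AttributeTimeline {
    record Component(String value, long start, long end, long assignedAtMillis) {}

    private final String initialValue;
    private final List<Component> components;

    AttributeTimeline(String initialValue, List<Component> components) {
        this.initialValue = initialValue;
        this.components = components;
    }

    String valueAt(long time) {
        Component best = null;
        for (Component c : components) {
            if (time < c.start() || time >= c.end()) continue;     // does not cover this time
            if (best == null || c.assignedAtMillis() > best.assignedAtMillis()) {
                best = c;                                          // most recently assigned wins
            }
        }
        return best != null ? best.value() : initialValue;         // fall back to the initial value
    }
}
```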
  • Yet another aspect of the invention comprises:
  • a method for monitoring media usage in a media editing system comprising: maintaining a database of media items; defining a timeline associated with a media item, and a usage value representing the usage of said media item which varies along said timeline; searching said database and identifying portions of said media item occurring in derived media items; processing said identified portions and, for each identified occurrence, updating the usage value for the corresponding portion of the timeline to reflect the usage of said segment; and displaying said usage value on said timeline.
  • the resulting display will, in many cases, provide an indication of the usefulness of the different segments of the media item. This is ingeniously achieved without requiring any dedicated annotation, but by measuring the number of times each segment has been selected and used in other items. By logging the results of editing processes, a value-added media item including a usage density measure is created.
  • the status of the media items in which portions of said media item occur is used in deriving the usage value.
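A minimal sketch of the usage measure described in this aspect, assuming a per-frame counter over the source item's timeline: each occurrence of a portion of the item in a derived item increments the usage value over that portion, optionally weighted by the status of the derived item. The representation is an assumption; the specification does not prescribe a data structure.

```java
// Illustrative sketch of a usage density measure along a source item's timeline.
// The per-frame array and the status weighting are assumptions made for this example;
// weighting by the status of the derived item is the optional refinement mentioned above.
final class UsageProfile {
    private final int[] usage;   // one usage counter per frame of the source item

    UsageProfile(int durationFrames) {
        usage = new int[durationFrames];
    }

    /** Records that frames [startFrame, endFrame) of the source item occur in a derived item. */
    void addOccurrence(int startFrame, int endFrame, int statusWeight) {
        for (int f = Math.max(0, startFrame); f < Math.min(usage.length, endFrame); f++) {
            usage[f] += statusWeight;   // e.g. weight broadcast items more heavily than rough cuts
        }
    }

    int usageAt(int frame) {
        return usage[frame];
    }
}
```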
  • Still another aspect of the invention comprises: A method for managing a restriction status of a media item in an editing system comprising setting an initial restriction status for the media item from one of a set of predetermined values; assigning one or more components of a restriction attribute to segments of said media item, which components may take one of the set of predetermined values; and deriving an overall restriction status for the media item based on the default status and the one or more usage restriction components.
  • the overall restriction status is the result of a Boolean function of the default value and the component values. Also, preferably the overall restriction status varies according to the production status of the media item.
  • the possible values are colour values.
  • the overall restriction status can be displayed extremely easily with a mark of a certain colour, or by altering the colour of an existing feature of a display.
  • a user may optionally manually override the derived status, but preferably only to a more restrictive status, thus ensuring a minimum restriction threshold which is automatically derived.
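The following sketch illustrates one possible derivation of the overall restriction status, assuming traffic-light colour values and a 'most restrictive wins' combination of the default status and the segment components, with a manual override accepted only when it does not relax the derived status. The specification leaves the exact Boolean function open, so this is just one plausible instance; all names are assumptions.

```java
// Illustrative sketch of deriving an overall (traffic-light) restriction status from a
// default status and per-segment restriction components, and of an override that is only
// accepted if it is at least as restrictive. Ordering and names are assumptions.
enum RestrictionStatus {
    GREEN, AMBER, RED;   // declared in increasing order of restriction

    boolean atLeastAsRestrictiveAs(RestrictionStatus other) {
        return this.ordinal() >= other.ordinal();
    }
}

final class RestrictionSummary {
    static RestrictionStatus derive(RestrictionStatus defaultStatus,
                                    Iterable<RestrictionStatus> componentValues) {
        RestrictionStatus overall = defaultStatus;
        for (RestrictionStatus c : componentValues) {
            if (c.atLeastAsRestrictiveAs(overall)) overall = c;   // most restrictive value wins
        }
        return overall;
    }

    static RestrictionStatus applyOverride(RestrictionStatus derived, RestrictionStatus override) {
        // A manual override is only honoured when it does not relax the derived status,
        // preserving the automatically derived minimum restriction threshold.
        return override.atLeastAsRestrictiveAs(derived) ? override : derived;
    }
}
```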
  • a further aspect of the invention provides:
  • a method for managing a media system comprising the steps of storing in a non-linear media system one or more media instances of a first resolution; storing in a database in said non-linear media system, metadata associated with said media instances; storing in a rules engine in said non-linear media system, one or more archive rules; automatically identifying one or more media instances of said first resolution to be archived by querying metadata stored in said database according to said one or more rules, and returning metadata identifying one or more media instances; and copying said identified first media instances to a media archive.
  • archive rules comprise one or more archive terms, each term specifying a metadata field name, an operator, and a value to be satisfied by that operator. It is desirable that archive rules are executed periodically to identify media instances to be archived, each rule having a predefined periodic interval. In this way rules designed to identify regular media items eg. the One O'clock News, can be scripted and executed each day at, say, half past two in the afternoon, when it is known that the item will have been completed and not yet deleted.
  • each archive rule has an assigned priority, and archive rules are executed according to priority status, to handle situations where a resource conflict arises.
  • the instances are preferably deleted from said non-linear media store, to maintain available on-line storage space.
  • the non-linear media system additionally includes a plurality of corresponding media instances at a second, lower resolution, such that a user of the non-linear media system can view the second lower resolution version of archived media instances after the higher resolution instances have been deleted.
  • the first resolution is preferably broadcast quality resolution and the second resolution is preferably web quality resolution.
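As an illustration of the archiving aspect, the sketch below represents an archive rule as a list of terms (metadata field name, operator, value) with a period and a priority, and runs the rules that are due in priority order, copying the identified instances to the archive and then deleting them from the on-line store. The MetadataQuery and Archiver interfaces are assumptions standing in for the metadata store and archive service.

```java
import java.util.Comparator;
import java.util.List;

// Illustrative sketch of archive rules and their periodic, priority-ordered execution.
// All names and interfaces are assumptions introduced for this example.
record ArchiveTerm(String fieldName, String operator, String value) {}   // e.g. ("Story Name", "equal to", "One O'Clock News")

record ArchiveRule(String name, List<ArchiveTerm> terms, long periodMillis, int priority) {}
// periodMillis: the predefined interval after which the rule becomes due again

interface MetadataQuery {
    List<String> findInstanceIds(List<ArchiveTerm> terms);   // identifiers of matching media instances
}

interface Archiver {
    void copyToArchive(String instanceId);          // copy the first-resolution instance to the archive
    void deleteFromOnlineStore(String instanceId);  // free on-line storage once safely archived
}

final class ArchiveRuleRunner {
    private final MetadataQuery metadata;
    private final Archiver archiver;

    ArchiveRuleRunner(MetadataQuery metadata, Archiver archiver) {
        this.metadata = metadata;
        this.archiver = archiver;
    }

    /** Runs the rules that are currently due, highest priority first. */
    void runDueRules(List<ArchiveRule> dueRules) {
        List<ArchiveRule> ordered = dueRules.stream()
                .sorted(Comparator.comparingInt(ArchiveRule::priority).reversed())
                .toList();
        for (ArchiveRule rule : ordered) {
            for (String id : metadata.findInstanceIds(rule.terms())) {
                archiver.copyToArchive(id);
                archiver.deleteFromOnlineStore(id);   // optional deletion described in the text
            }
        }
    }
}
```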
  • a still further aspect of the invention provides:
  • a media archive system comprising a non-linear media store adapted to store a plurality of media segments; a linear media archive; a database adapted to store metadata associated with said plurality of media segments; and a rules engine adapted to store one or more archiving rules, wherein the database is adapted to be queried periodically using said one or more archive rules to identify media segments meeting archive rule criteria, those identified segments being copied to said linear media archive.
  • the invention also provides a computer program and a computer program product for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein, and a computer readable medium having stored thereon a program for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein.
  • the invention also provides a signal embodying a computer program for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein, a method of transmitting such a signal, and a computer product having an operating system which supports a computer program for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein.
  • The invention extends to methods and/or apparatus substantially as herein described with reference to the accompanying drawings.
  • the methods and apparatus described herein may be implemented in conjunction with media input, editing and transmission systems, aspects of which are described in the applicant's co-pending patent applications.
  • aspects of a system for managing data for transmission are described in the applicant's co-pending patent application entitled “System and Method for Media Management”, Attorney Reference No. IK/26522WO, filed on 5 April 2004, the disclosure of which is hereby incorporated by reference in its entirety.
  • aspects of a system and method for media data storage and retrieval are described in the applicant's co-pending patent application entitled “Data Storage and Retrieval System and Method”, Attorney Reference No. IK/26523WO, filed on 5 April 2004, the disclosure of which is hereby incorporated by reference in its entirety.
  • Figure 1 illustrates a media item structure
  • Figure 2 is a schematic representation of a media management system
  • Figure 3 shows the form of component metadata
  • Figures 4 to 6 show metadata inheritance according to an embodiment of the invention
  • Figure 7 is an example of summary traffic light metadata according to an embodiment of the invention.
  • Figures 8a to 8d illustrate component annotation and inheritance according to an embodiment of the invention
  • Figures 9a and 9b also illustrate component annotation and inheritance
  • Figure 10 is an example of application server structure for a media management system
  • Figures 11 and 12 illustrate component metadata displays
  • Figures 13a to 13d show component timeline views according to an embodiment of the present invention
  • Figure 14 is an item usage view according to an embodiment of the present invention
  • Figures 15a and 15b are exemplary display screens
  • Figure 16 illustrates archive recommendation and deletion processes in a media management system
  • Figure 17 illustrates the implementation of a media management system
  • Figure 18 illustrates a server-client relationship in an exemplary media management system
  • Figure 19 shows the structure of a rule engine in an exemplary media management system.
  • Figure 1 illustrates an exemplary structure for a media item which might be handled in a media management system in accordance with an embodiment of the present invention.
  • the media item 100 is represented along a time axis extending horizontally across the page.
  • the media item comprises three separate media objects or tracks, 102, 104 and 106.
  • track 102 is video
  • tracks 104 and 106 are audio.
  • the tracks can be referred to as media essence.
  • Each media object or track can be divided in time into segments or portions. It should be noted that segments may have boundaries aligned across all tracks, such as segment 108; however, a single segment in one track may span two or more segments in another track, such as segments 110 and 112.
  • Media items additionally comprise metadata, which describes attributes associated with a media item and which is used to assist in processing of the media essence within the system, eg. storing, tracking, editing, archiving etc. These attributes may apply to the whole media item, eg. item duration, or may be specific to segments within the media item, eg. copyright holder (this will be discussed in greater detail below). A media item can therefore be said to be made up of media essence (tracks) and associated metadata (attributes).
  • Table 1 describes a number of metadata attributes which can be associated with a media item:
  • composition of media items and metadata will be explained in greater detail below.
  • a metacore 200 is at the centre of the system, and comprises a metadata store 201 and a media store 202.
  • Media intake for example from video feeds, agencies, newsgathering teams etc. can be received via an edit matrix 206 which is controlled by a network control system 208.
  • incoming media has associated metadata values which are stored in the metadata store.
  • Media intake can also be received from viewing and editing services 210 and Archive service 212.
  • the metadata values may be imported with the incoming media, may be assigned values by a system operator or may be assigned default values.
  • the associated media is then stored in the media store 202.
  • Users of the system can use viewing and editing services 210 to view and edit media managed by the system, and can search the system by metadata attributes to find relevant media. Once the relevant metadata describing the desired media has been found, the system can retrieve the associated media from the media store (if it exists there) for use by the user. Users can create new media items from existing essence, but with new metadata (which may be derived from existing metadata as will be explained below) to be input into the metacore.
  • the media store is an online store, and media held within it can be accessed and manipulated directly via devices networked to the metacore.
  • the practical constraints of media storage dictate that only a certain volume of media can be maintained online in this way, and as new media is constantly fed into the system, existing media must be removed. This is particularly true of the media essence, and less so of the metadata. If it is determined that the media is important and cannot simply be deleted, it must be stored offline, or archived. Both the process of selecting material to be archived, and the process of archiving it require considerable resources.
  • An archive service 212 is therefore linked to the metacore.
  • the archive service is in turn linked to one or more VTRs 214.
  • the archive service identifies media, via its metadata, to be taken from the media store and recorded to tape (offline).
  • the archive service can also act to re-ingest into the (online) media store tape based media.
  • the metacore is connected to transmission servers 216. These transmission servers can accept media items which are ready to be broadcast on transmission system 218.
  • the system also supports web based output, and the metacore is further linked to a post processor 220 which in turn feeds web hosting 222.
  • the routing of video, audio, and communication signals between various media systems and facilities, both internal and external, can be referred to in terms of 'Bookings'.
  • a Booking may simply specify a media feed from one location that is to be routed to another location, internal or external to the media management system. Bookings may also include recording of the media being routed. For example, a booking may be made to enable an on-air news presenter to interview someone live at a remote location. Bookings can also be communications only bookings, enabling staff from various locations to communicate via a dedicated communication link.
  • Bookings may be divided into Arrival bookings that are scheduled recordings, Departure bookings for media items that are to be played out from the system to another destination, and Archive bookings that represent requests for media to be moved to or from offline storage. Any tasks that require recording or playout of media can be tracked as a booking.
  • all media items and media essence contained in the system are either the result of recording from the edit matrix, or from the archive service, or imported from the Editing services.
  • a variety of recording methods are supported by the system, including methods for media sources that must be recorded live from within the news facility, from a tape source, or an Agency feed. These are managed by a centralized facility (an organisation unit) which is referred to as the 'Mediaport'. The Mediaport is also responsible for managing the subsequent accessing of recordings.
  • All of the media items recorded into the system are assigned metadata.
  • the different types of metadata assigned to media items may vary according to the particular item. This metadata is assigned by the Mediaport or by Mediaport staff.
  • the metadata may be automatically imported from an external source, assigned by the Mediaport, or automatically assigned a default value.
  • Metadata added automatically when a media item is created as a part of recording includes:
  • the system may usefully define 3 levels of restriction for metadata fields:
  • For each data value entered or altered, the system records the following information:
  • Crash Recording is employed.
  • the goal of Crash Recording is to provide a very rapid method for recording important breaking news.
  • the steps required to initiate the Crash Recording must be kept minimal. All required metadata fields will be filled with default values.
  • Clip name will be simplified to a default value of <CRASH RECORD/User Name/Date and Time> where 'user name' is derived from user login, and 'date and time' represents the time recording is started.
  • Media Status will be set as 'Raw'.
  • 'Item Identifier' will be set by default using a counter algorithm, with PR (production item) or MP (non-production item) as a suffix. Start timecode for the media item shall begin at 00:00:00:00. Recorded/Created By will be derived from the user's login, and outlet will default to the last outlet the user had selected. The default usage traffic light value is green, default usage restrictions are set to 'restricted to 24 hours without additional modification', and story name and description are defaulted to 'Crash Record'.
  • the crash record scenario can also be used where a system user has a tape that needs to be ingested into the system. This creates a fast and simple way to get media into the Mediaport, while minimizing the time required on the recording station. All required metadata can be updated afterwards at a viewing/editing/logging station.
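A small illustrative sketch of populating the crash-record defaults listed above. The field names and default values follow the text; the map-based representation and the simple counter are assumptions made for this example.

```java
import java.time.LocalDateTime;
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch only: populating the default metadata values described for a
// crash recording. The map representation and counter are assumptions.
final class CrashRecordDefaults {
    private static long itemCounter = 0;   // stands in for the system's counter algorithm

    static Map<String, String> forUser(String userName, String lastOutlet, boolean productionItem) {
        Map<String, String> m = new HashMap<>();
        m.put("Clip Name", "CRASH RECORD/" + userName + "/" + LocalDateTime.now());
        m.put("Media Status", "Raw");
        m.put("Item Identifier", (++itemCounter) + (productionItem ? "PR" : "MP"));
        m.put("Start Timecode", "00:00:00:00");
        m.put("Recorded/Created By", userName);   // derived from the user's login
        m.put("Outlet", lastOutlet);              // defaults to the last outlet the user selected
        m.put("Usage Traffic Light", "Green");
        m.put("Usage Restrictions", "Restricted to 24 hours without additional modification");
        m.put("Story Name", "Crash Record");
        m.put("Description", "Crash Record");
        return m;
    }
}
```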
  • a metadata field may have restricted access or not be visible to certain users or user groups.
  • a specific metadata field may be 'Mandatory' for a Media Coordinator, but 'Not Visible' for a general BBC user.
  • Data transfers can be received via a push process from a variety of external sources.
  • Data transfer may have Mediaport bookings representing anticipated data transfers. Mediaport bookings should be used for anticipated data transfers so that users may track data transfers, and so that collection and dissemination of related metadata can begin prior to arrival.
  • the metacore is intended to serve as a short-term repository of currently relevant media. As time goes on, more and more media items contained in the metacore will represent archived assets. When users need access to archived media items, they will request them be recorded back into online storage. This initiates a chain of events that result in the simple recording of tape-based media from the archive into the Mediaport.
  • the type of automated recording is specified when the booking is created.
  • the complexities of automated bookings lie primarily in creating the automated booking functionality.
  • the recording itself will simply proceed in an automated fashion, without user interaction or intervention.
  • Mediaport staff will monitor the recordings for errors or failures as a part of their normal duties. In certain applications the media system will be integrated into an existing or legacy system. It may be desired to use two systems cooperatively or for one system to replace another.
  • CBIS, the Central Bookings and Information Service, is an example of a legacy organization and software application that manages and coordinates resources and bookings.
  • the Mediaport is adapted to import bookings from CBIS.
  • Imported information should include:
  • the system can differentiate between Internal and External bookings.
  • External bookings are any bookings with a CBIS reference number, inferring that the bookings have been or will be imported to the system from CBIS.
  • Internal bookings are not imported from CBIS and will not have a CBIS reference number.
  • transcoding capabilities are provided within the system to enable the system to create additional media essence instances at different resolutions.
  • This supports multiformat editing, and makes efficient use of available storage facilities and archiving capabilities, whilst maintaining searching and viewing functionality.
  • the system will be able to create new media essence instances when these media resolutions are needed. It is most preferable to encode and store multiple resolutions concurrently while recording. Alternatively multiple resolution instances can be encoded from the primary media input, as an automated process, during ingest. It is also desirable to be able to re-encode media for long-term media maintenance reasons. As encoding of media is in progress, a list of items to be transcoded can be viewed by Mediaport staff to monitor progress. An example of the different resolutions used in an embodiment of the system is shown in Table 2:
  • Broadcast Quality: format is DVCAM media, BetaSP or equivalent.
  • Web Quality: information rate approximately 56Kbps to 300Kbps depending on user requirements and network capacity; not provided to frame-level accuracy; format to be RealMedia or equivalent. (Table 2)
  • It should be noted that audio tracks will also often need to be supported. It is preferable that two audio tracks be provided for each video track.
  • a particularly suitable application of the present invention is news production, and the development of news stories for broadcast.
  • News stories are the reporting of details and background for regional and world events.
  • News stories are very dynamic.
  • News staff always have a large number of stories that are actively being reported, covering a range of categories.
  • the relative importance of a story changes continuously. Depending on other current events, a single story can rise or fall in relative importance. This is reflected in the amount of broadcast coverage the story receives.
  • the story is an item of metadata; an attribute of bookings, media items, and grouped stories.
  • the story also has attributes including Story Description, Story Valid From Date/Time, Story Expiry Date/Time, Created By, Creation Date, Story Status Code, Top Story Indicator, Dominant Story Indicator, Story Group, and Outlet.
  • the story represents a news event that is being covered.
  • the story name is a short hand name or handle for the broadcast coverage that a news event is receiving.
  • Bookings, media items and production media items are all assigned a story name so that they can easily be identified as relating to a specific news event.
  • stories can be grouped into collections of story names that are related.
  • Story groups are a collection of stories and have a set of stories as attributes.
  • Story names are intended to be short-term identifiers. Media items that are to be archived should have more substantive descriptive metadata to facilitate better search results and improved utilization. Story names provide a simple way to reference media items that have short-term value, may not receive much additional descriptive metadata, and will not be archived. Stories may have a variety of production items related to them.
  • Bookings can exist without a story association, however users should be encouraged to add story associations whenever possible to maximise the usefulness of the system.
  • media items and component media items also can exist without a story association, but again they will not be as recognizable to users.
  • Story associations can be added and changed for both bookings and media items at anytime after a media item is created.
  • stories can exist without any associated bookings or media items.
  • stories can be created within Jupiter simply to serve as the dominant story in a story group to organize the story picklist.
  • a complex event may be covered though a series of related stories, with no production media items ever being created for the dominant story itself.
  • stories remain in the system until they are manually deleted. Story deletion and house cleaning will be a Mediaport user responsibility. Dominant stories must have their dominant status removed before they can be deleted.
  • all stories will have 'Valid From Date' and 'Expiry Date' metadata attributes. This allows recognition of stories that are current. From this information Mediaport users can maintain current 'picklists' containing only stories that are current. This functionality also enables users to be able to create and identify stories that will receive coverage at some time in the future. The system will also store stories that received coverage in the past, together with all story metadata, including description and the 'Top Story' and 'Dominant Story' indicators, for future, current and expired stories. This facility will enable a variety of options that will be discussed further in the description of search facilities and capabilities below.
  • Story names play a very important role in enabling users to identify relevant bookings and media items. They can also form a valuable bridge between the system and legacy systems.
  • the first characters of the legacy booking identify the story name, and this will automatically import into the Mediaport.
  • the clip naming convention will enable the users to recognize the same names across new and old systems.
  • the user interface will assist the user in selecting appropriate clip names by providing a selection of possible entries.
  • a default value can be selected for the story name, so that even crash recording can be simply accomplished.
  • Journalists are encouraged to view the day's pick list of stories and use an existing story name whenever they: • Create or modify bookings in the system
  • the story picklist would quickly become unmanageable without the ability to group and associate stories.
  • related stories can be grouped.
  • An evolving story can be associated with other related or sub-stories and new story names.
  • a new story name may eclipse or complement the original.
  • Associating stories creates a story hierarchy. However, the hierarchy may only have one sub level.
  • a story name can act as parent to a group of related stories.
  • one story name of the group can be selected as dominant representing the parent of the group. It is also possible to ungroup or disassociate stories as necessary. It is preferred that only one story can be the dominant story for a group.
  • Each story should have a 'description' to help prevent stories that are very different from being accidentally selected or grouped.
  • the Story description is optional, and can be updated by other Jupiter users.
  • Table 3 shows examples of metadata attributes of a story that can be added:
  • Some metadata will be specified as mandatory, some as optional. Default values can be assigned so that some of these fields can be automatically populated. Also users may be restricted from adding, modifying or even viewing some metadata.
  • stories must have an 'Expiry Date'. This defines the duration of time the story will be valid within the system. Only current stories appear in the story picklist. Stories that have passed their 'Expiry Date' are considered expired. If an 'Expiry Date' is not assigned at story creation, a default value will be assigned. Expiring stories is a way to remove the story from the pick list without losing the related metadata from the system. This can be done by changing the 'Expiry Date' metadata. The most common reasons for modifying a story will be to extend or expire story coverage. It may be useful to add user interface functions such as 'expire story' or 'extend 24 hours' to simplify these common tasks. Deleting a story not only removes it from the pick list, it totally removes it from the system. Before a story can be deleted, any associated media items must be associated with a different story.
  • the top story indicator is either set to 'yes' or 'no', with 'no' the default value.
  • Users of the system can perform various different types of searches by querying metadata stored online in the metacore, for various types of user activity. Different search facilities will be appropriate for different user groups.
  • Users will be able to search the system both for online media items, and also offline media items which have associated metadata in the metacore. This will preferably be items which were online in the system at some point in time. If desired, users can request that offline media items be moved online (as will be explained below in greater detail).
  • Story search is geared to the needs of Mediaport users and will facilitate Mediaport workflow and interoperability with other Mediaport applications and tools.
  • users will be concerned with searching/viewing stories by expiry date (so to extend as necessary), searching/viewing stories by status (so as to check 'unchecked' stories), and searching/viewing in the current pick list order (so as to review and if necessary change story associations).
  • Users will additionally be able to use the system's search facilities to search for Bookings. This enables users to create and save views for all bookings, internal and external, departures and arrivals using a simple search interface.
  • the search spans a predefined set of metadata fields
  • Simple search enables users to search for media (via on-line metadata) using a simple user interface similar to common Web search engines such as GoogleTM.
  • the user specifies a single word or phrase but does not specify the metadata fields to search across.
  • the system searches a predefined set of metadata fields.
  • Intermediate search is similar to the Simple Search but has additional control over the scope and filtering of the search criteria.
  • the user may select to search for online media or to perform an archive search, which would present a different set of search criteria depending on the search type.
  • search for current media items might include:
  • the Intermediate Search capabilities will enable the user to Search for expired, current and future stories by all story metadata.
  • Advanced Search enables a user to search for media by all available metadata.
  • Each search may consist of one or more search terms.
  • Each term consists of a metadata field name (e.g. Story Name), operator (e.g. equal to) and search value (e.g. "War"). Search terms are joined together by Boolean operators (e.g. AND) that in turn make complex search criteria with the capability of spanning multiple metadata fields.
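To illustrate the structure of an advanced search, the sketch below composes terms (field name, operator, value) joined by Boolean operators into a simple query string. The builder API and the rendering of the query are assumptions; in the described system the criteria would be evaluated against the metadata held in the metacore.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of composing an advanced search from terms joined by Boolean
// operators. The builder methods and output format are assumptions for this example.
final class AdvancedSearch {
    record Term(String field, String operator, String value) {}

    private final List<String> parts = new ArrayList<>();

    AdvancedSearch where(Term term) {
        parts.add(term.field() + " " + term.operator() + " '" + term.value() + "'");
        return this;
    }

    AdvancedSearch and(Term term) { parts.add("AND"); return where(term); }
    AdvancedSearch or(Term term)  { parts.add("OR");  return where(term); }

    String toQueryString() { return String.join(" ", parts); }

    public static void main(String[] args) {
        // e.g. a search spanning two metadata fields, as in the Story Name example above
        String q = new AdvancedSearch()
                .where(new Term("Story Name", "=", "War"))
                .and(new Term("Media Status", "=", "Raw"))
                .toQueryString();
        System.out.println(q);   // Story Name = 'War' AND Media Status = 'Raw'
    }
}
```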
  • Predefined searches can be accessed using either shortcuts or selections from a longer preset list of searches.
  • Mediaport users will typically have specific search requirements including searching/viewing bookings for CBIS clean-up, record/playout decision, traffic light/copyright/usage restriction settings, and search by status.
  • the results of these predefined searches will be geared towards the logged-in user's role, workflow and interoperability with other applications and tools. Searches and search results lists are persistent and roving. That is to say they travel with the user from machine to machine. Users can save searches and share saved searches, allowing team members to share collections of media items.
  • the results of a future search, which may not yet have entered the system, are notified to the user when bookings or media items that meet the user's criteria are found (enter the system).
  • the future search may be for bookings, media items or both. The user must enter an expiration date when saving a future search, after which results are no longer returned.
  • the system advantageously allows users to view media items on-line from any enabled system terminal.
  • this will preferably be any PC configured to use a news production system, for example ENPS (Electronic News Production Service) sold by Associated Press Inc.
  • the viewing and logging tools may be available integrated within ENPS or as a stand alone tool.
  • Media can be played out and viewed from the system at any time, even as it is being captured. Users will also be able to view the media item's essence data file as it is being written. This differs from watching the feed or package because playback is 'on demand' from the system. Whenever the user selects the item for playback, they can start play from the beginning of the clip, regardless of what time the recording began.
  • the system provides users with the ability to use the full set of media viewing functions such as fast forward, reverse, jog, shuttle and scrub.
  • Viewing will preferably use desktop resolution media instances, however in certain applications it may be desirable or even essential to use instances of higher or lower resolution. It is therefore not relevant for viewing purposes whether or where a broadcast quality equivalent of the viewed media essence is also available on-line. In this way, archive media can be viewed quickly and easily to allow a user to make decisions based on a desktop representation of that media.
  • There are two main editing facilities provided by the media system. The first is a desktop editor. The majority of editorial work on media will be done using the desktop editor application, running on any ENPS capable desktop PC. Whether as a tool integrated within ENPS, or as a stand alone application, users will be able to complete editorial tasks from simple 'Tops and Tails' editing - the process of selecting the beginning and end of a single shot for broadcast - to the creation of EDLs that will be automatically conformed by the system for broadcast.
  • Desktop editing work will begin by searching and selecting media items in the system as described above. Selected media items are moved into the Desktop Editor application from the system by adding the items to a Desktop Edit Clip List. Each user has an individual Desktop Edit Clip List, which makes the media item essence and metadata available to the user in the Desktop Editor. Likewise, the user is able to delete media from the Desktop Edit Clip List. Deleting from the Desktop Edit Clip List does not delete the media item from the system. In addition to the Desktop Edit Clip List, users can also view the last 10 media items that they have logged or viewed.
  • the package can be shared in two ways.
  • the system enables the viewing of the edit as an EDL, a simple tab delimited list of events representing the edit.
  • the EDL can also be shared with other applications or systems as a Private Text Document.
  • media may need to be moved to the local workstation.
  • Production media items that are produced using the Desktop Editor are published to the system.
  • the system does not have any version control capabilities, so each revision of a production media item published to Jupiter will be new and unique. There will be no association between revisions other than that manually implied when choosing the media item name.
  • the Desktop Editor only publishes media item EDLs, and at this stage, no media essence.
  • the EDL is automatically conformed by the system.
  • the publish process initiates the creation of a new media item; an automatic EDL conform process creates a broadcast quality media instance, as well as desktop and Web resolutions.
  • when the production media item is published, the user will be notified of the estimated time of completion.
  • the queue list of expected conforms can also be viewed.
  • Once the automatic conform is complete the media item will work just as any other media item in the system. All searching/logging/viewing capabilities apply.
  • In a news application, a journalist often begins work on a news story, focusing on story content, producing a draft edit. This draft edit will be further refined for broadcast by an editor who addresses the more stylistic elements creating a polished production item. This process requires the decisions made by the journalist, referencing desktop resolution media, to be communicated to the Craft Editor, and media references redirected to broadcast resolution media instances within the system. The user will then use the Craft Editor application to take the draft edit and complete the refinement process.
  • the process of publishing from the Desktop Editor will create new media items within the system. These new media items can be used as source media for a Craft Edit.
  • the Desktop Editor can be used to do a first pass edit of source materials that can be selected for longer feeds and clips, creating very usable new media items that may be very appropriate for Best Media and Archive Recommendation designation. Such items may be given a status of 'Rough cut'. No EDL metadata is necessary for the sharing of these assets. They will be available through normal search and view capabilities.
  • the Craft Editor uses broadcast quality media created within the system. However it will be necessary to export media items to the Craft Editor for working.
  • the export process will enable the Craft Editor to access media items available from the system. All the viewing / logging / searching capabilities are available to users while using the Craft Editor.
  • the user will also be able to use the History Clip-List to simplify the moving of media items to the Craft Editor.
  • Projects in process on a Craft Editor are not trackable within Jupiter unless a Placeholder media item is manually created.
  • the Placeholder can be annotated to enable news users to track the progress of the Craft Edit.
  • Production media item EDLs are published to the system and then conformed.
  • the queue of Production Media Items to be conformed can be viewed to monitor status. Users are notified of the expected duration of the conform process.
  • Production media items will be exported by the Craft Editor using the Advanced Authoring Format (AAF). Working with a news production system and the media system, users may view online media items, identify and request that offline media items be moved online, view media items, annotate and add to metadata, modify or delete information for media items, copy annotation between media items, create component media items, add bookmarks within media, create and add simple EDLs and production media items into the system, develop scripts and write captions, all in a very flexible, easy to use environment.
  • System application views are presented as non-overlapping, sizeable windows as selected by the user. The system will provide the necessary menus to select and display views. Shortcuts will feature heavily so users can quickly and easily access the views they need for their day-to-day work - their role will dictate how each view is presented. Where necessary to aid workflow, combinations of views will be arranged suitably.
  • the definition of the view presentation and combination will be configurable and stored as scenes.
  • Each view will have a required set of permissions that have to be satisfied in order for the view to be displayed.
  • Each view in itself will have various menus/buttons etc. that are also subject to the user's permissions and will be disabled and enabled as appropriate.
  • each of the views will support cut, paste, drag and drop between themselves, other views and ENPS.
  • the metacore 1702 includes a client side applications group 1704, a media service 1706 including a media store, an applications server 1708 and a metadata store 1710.
  • Client applications are written in C++ and communicate using J2EE (Java 2 Platform Enterprise Edition) component level communications (JNI-RMI invocations) and J2EE messaging and queuing (JMS via Active JMS).
  • Server applications including system management applications could be written in either Java or C++.
  • Media storage, transfer and editing will typically be provided by a third party media system and associated protocol running on a gigabit Ethernet.
  • the components of the metacore all run off a media gigabit Ethernet.
  • the metacore is linked to the transmissions domain 1712 by a transmission gateway 1714.
  • the transmission gateway will communicate with the Transmission domains using the appropriate MOS protocol.
  • the transmission gateway and domain is discussed in greater detail in our co-pending application filed on 5 April 2004 and bearing attorney reference No. IK/26522WO to which reference is directed.
  • SCAR (Spur Central Apparatus Room)
  • the edit matrix features a filter comprising a dual redundant pair of PCs managing, filtering and auditing control requests from the system and transmission domains.
  • Both the SCAR matrix and edit matrix are controlled by Broadcast Network Control System (BNCS) routers, which are in turn controlled by the metacore using Fabian.
  • the CBIS will be configured to replicate to the Metadata Core on a regular basis (typically ⁇ 1min).
  • the Metadata Core will then update any application screens reliant on data that has changed.
  • system clients will be implemented as Win32 native clients 1802. As such, a mechanism must be provided to allow the clients to communicate with the J2EE application server 1804. The client server communications will be facilitated via use of a Java-C++ bridge 1806.
  • the C++-Java Bridge allows C++ proxy stubs to be generated from Java classes. This allows any C++ client to behave exactly as a standard Java client. A thin C++ wrapper will be provided (generated) around the required J2EE client APIs (Application Program Interfaces) to allow the client to access components on the application server.
  • the C++-Java Bridge will be used to generate C++ proxy stubs for the EJB (Enterprise Java Bean) remote and home interfaces, thereby, allowing the client to perform interaction with the application server in the same way as any Java client would.
  • Certain client views are required to receive events from the application server (e.g. notification on booking status changes). These will be sent to the clients in the form of JMS messages via JMS Service 1808 from the application server.
  • the C++-Java Bridge will convert the message into an event and the appropriate action can then be taken by the client application.
  • the client may register interest with any number of event topics. This will allow the client to receive events that represent actions performed by various metacore services.
  • the payload of the message will vary depending on the type of event fired by the system service and will include all the information required by the client to perform the required action.
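For illustration, the fragment below shows a subscriber registering interest in an event topic using only the standard JMS API, in the spirit of the event notification described above. The JNDI names used for the connection factory and topic are assumptions, and in the described system a C++ client would reach equivalent functionality through the C++-Java Bridge rather than directly in Java.

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.Session;
import javax.jms.Topic;
import javax.naming.InitialContext;

// Illustrative sketch of a client registering interest in an event topic via standard JMS.
// The JNDI names below are assumptions made for this example.
final class EventSubscriber {
    public static void main(String[] args) throws Exception {
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
        Topic topic = (Topic) ctx.lookup("jms/BookingStatusEvents");

        Connection connection = factory.createConnection();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        MessageConsumer consumer = session.createConsumer(topic);

        // The payload varies with the type of event fired by the metacore service.
        consumer.setMessageListener((Message message) -> {
            System.out.println("Received event: " + message);
            // ...convert to a client-side event and update the relevant view...
        });

        connection.start();   // begin delivery of events to the listener
    }
}
```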
  • a production media item can be considered to be the 'child' of one or more 'parent' media items from which it is derived.
  • Production media items exist in the system in their own right, and have associated with them their own metadata. While certain of this metadata will need to be entered into the system at or after the time of creation (eg. creation time and duration), other metadata fields can be, and indeed desirably are, made consistent with the parent media items from which the production item is made. It should be noted that in certain embodiments, production media items that are in process (being edited) are not identifiable within the system. Placeholder media items can be created so that such items in process can be tracked within the system.
  • Metadata typically comprises a number of attributes or fields.
  • each attribute or field associated with a media item can take varying forms.
  • a metadata field can comprise only a single value associated with a media item eg. duration.
  • a metadata field can take the form of 'components'.
  • Figure 3 shows a schematic representation of a media item 302 with the time axis extending horizontally across the page.
  • Component metadata has an associated timeline illustrated as 304. In other words it is information which relates to a particular time segment of a media item.
  • the metadata value for a particular attribute for a particular media item can therefore vary in time.
  • component fields may also have a default value 310, which is used to provide a value for a component field at all time instances along a media item for which no other value has been assigned.
  • component values have been assigned for time segments 306 and 308, and the remainder of the component timeline takes the default value.
  • Certain component fields also have a summary value 312 and optionally an override value 314.
  • the summary value like the default value is an overall value for the whole media item, and is derived from the component values along the media item. This can for example be performed using an arithmetic or Boolean operation on the individual component values.
  • the summary value may optionally be replaced by a user input override value.
  • there may be rules defining the manner in which the summary value may be overridden, eg. the summary value may only be made greater, or more restrictive. If the override is removed, or if the summary value changes such that the override no longer satisfies the rules, then the summary will be shown again.
  • Metadata fields may exhibit propagation properties.
  • metadata propagation is supported both from parent to child media items and also from child to parent media items.
  • Inheritance propagation refers to metadata from an item being automatically associated also with one or more child items.
  • Bi-directional propagation refers to metadata from an item being automatically associated with both parent item and child item(s).
  • Figure 4 shows an example of inheritance propagation of a particular metadata field from parent to child media items.
  • a child media item 402 is created from two parent media items 404 and 406.
  • the child item is defined by selecting a segment of item 404 between times t1 and t2, and by selecting a segment of item 406 from time t3 to t4, and by cutting these segments together.
  • This editing operation can be performed using the media system editing facilities described above.
  • the inherited metadata field in question is a component field.
  • Child item components are automatically inherited from the appropriate parent metadata, i.e. child metadata component 414 between T1 and T2 is derived from parent item 404 between times t1 and t2, and child metadata component 416 between T2 and T3 is derived from parent item 406 between times t3 and t4.
  • For component 416, no components were assigned to the parent media item and so the component value taken is the default value of the parent component.
  • the parent item has had components 420 and 422 assigned at particular times. Since these assigned components overlap with the segment used for creating the child item (t1 to t2), child component 414 inherits components 426 and 428 having corresponding values at the corresponding times. It should be noted that these components are only inherited in part, since only part of the relevant media is used in the child item. Components of metadata fields with inherited propagation will continue down a chain of inheritance (in part or as a whole) in a similar fashion.
  • Figure 5 shows the same media items as in Figure 4, but at a later time a new component 502 (of a metadata field having an inheritance propagation type) has been added to parent item 406. This new component has been added to the portion of the parent which is included in the child media item, and therefore results in a corresponding new component 504 being added to the corresponding time portion of child item 402.
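  • Purely as a sketch of the inheritance behaviour described for Figures 4 and 5 (the names and frame-based representation are assumptions, not the patent's data model), clipping a parent component onto a child's timeline might be implemented as follows:

        import java.util.*;

        /** Sketch of inheritance propagation: copying the overlapping part of a parent
         *  component onto the child's timeline when a child is cut from a parent segment. */
        public class InheritancePropagation {

            record Component(long in, long out, String value) {}   // [in, out) in frames

            /**
             * The child uses parent frames [srcIn, srcOut); that range starts at
             * childOffset on the child's own timeline.
             */
            static List<Component> inherit(List<Component> parentComponents,
                                           long srcIn, long srcOut, long childOffset) {
                List<Component> inherited = new ArrayList<>();
                for (Component c : parentComponents) {
                    long overlapIn = Math.max(c.in(), srcIn);
                    long overlapOut = Math.min(c.out(), srcOut);
                    if (overlapIn < overlapOut) {                   // partial inheritance is allowed
                        inherited.add(new Component(overlapIn - srcIn + childOffset,
                                                    overlapOut - srcIn + childOffset,
                                                    c.value()));
                    }
                }
                return inherited;
            }

            public static void main(String[] args) {
                // Parent marked amber over frames [100, 400); child cut from parent frames [250, 500).
                List<Component> parent = List.of(new Component(100, 400, "AMBER"));
                System.out.println(inherit(parent, 250, 500, 0));   // -> [Component[in=0, out=150, value=AMBER]]
            }
        }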
  • Figure 6 illustrates a metadata component having bi-directional propagation.
  • Figure 6 again shows the media items of Figure 4, but at a later stage a new component 602 having bi-directional propagation is assigned to child item 402.
  • This new component has been added to a portion of the child item which was derived from parent item 406, and therefore results in a corresponding new component 604 being added to the corresponding time portion of parent item 406.
  • Bi-directional propagation additionally functions in the same way as inherited propagation and therefore component 602 will propagate downwards to any media items which include any portion of child item 402 to which component 602 has been assigned (ie a 'grandchild' item for the purposes of Fig 6).
  • component 604 will therefore be propagated to any other child items which include any portion of parent item 406 to which component 604 has been assigned. It will be understood that by creating a metadata field with a bi-directional propagation property, a component of that field will, when assigned to a media item, be propagated (in part or as a whole) to all other media items in the system which include any of the media portion to which that component was assigned.
  • Edits can be performed on a frame accurate basis, and components can be propagated in part or as a whole.
  • the relationships between parent and child items within the system are managed via metadata in the metadata store of the system.
  • This is a database which stores metadata associated with each media item as XML object models.
  • the database supports a relational data model (without necessarily decomposing XML into its relational equivalent) allowing the metadata for each item to exist in a tree structure indicating parent-child relationships.
  • the database can be queried at an item level to return parent and child related items.
  • compositional structure of the object models used is extensible and able to support the Advanced Authoring Format (AAF).
  • Some media items may be composed in a craft editor and be output as EDLs.
  • the object model structure is therefore also capable of reading and writing such items as text files (CXM) via an adaptor.
  • a system user eg performing an edit function
  • Media item metadata includes one or more pointers to the relevant segment or segments of essence, which may be stored in the media core or offline. Where two or more media items are related by a common media essence segment, they will each include a pointer to the same essence segment.
  • the metadata management will be provided by interfaces that allow users and the system to add, update and remove metadata. All of these operations will be performed via the application server.
  • the service will also validate all metadata entered to ensure that it adheres to the metadata rules as defined by system users.
  • the service will also provide the ability to import and export metadata information to and from the Jupiter system in an XML format.
  • Figure 10 shows the components of the metadata management service. Metadata requested by the clients will be extracted from the database and maintained as model objects in the application server 1002. The model objects will be responsible for extracting the data from the database and populating its values correctly. The model objects will also be responsible for implementing the persistence mechanisms used to store to the database.
  • the workflow objects will use a business object 1006 to perform the relevant metadata changes for the business method called.
  • the business object will make use of services provided by the metadata service. This will allow it to perform any get, update or delete operations.
  • value objects 1008 are simple objects that encapsulate the business data and provide methods to access that data. The contents of the value object will vary depending on the current user role
  • the value object factory is responsible for creating the value objects that are returned to the client. It will also contain methods that allow it to convert model objects to and from value objects. It will also make use of the user management service to obtain details about which fields the user is allowed to see in their current role. It will use this information to only populate the correct fields in the value objects created.
  • Updates from clients will also only contain fields that have been altered by the user.
  • the metadata service will ensure that only the relevant fields are updated.
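  • A minimal sketch of the role-filtered value object creation described above; the role names, field names and class shapes are assumptions and not the actual implementation:

        import java.util.*;

        /** Sketch of a value object factory that only populates the metadata fields
         *  visible to the user's current role. */
        public class ValueObjectFactory {

            /** Simple value object: field name -> field value, as exposed to the client. */
            public static final class MediaItemValueObject {
                private final Map<String, Object> fields = new HashMap<>();
                public void set(String name, Object value) { fields.put(name, value); }
                public Object get(String name)             { return fields.get(name); }
                public Map<String, Object> asMap()         { return Collections.unmodifiableMap(fields); }
            }

            /** Stand-in for the user management service: which fields may this role see? */
            private final Map<String, Set<String>> visibleFieldsByRole;

            public ValueObjectFactory(Map<String, Set<String>> visibleFieldsByRole) {
                this.visibleFieldsByRole = visibleFieldsByRole;
            }

            /** Copies only the permitted fields from the server-side model into the value object. */
            public MediaItemValueObject create(Map<String, Object> modelFields, String role) {
                Set<String> visible = visibleFieldsByRole.getOrDefault(role, Set.of());
                MediaItemValueObject vo = new MediaItemValueObject();
                for (Map.Entry<String, Object> e : modelFields.entrySet()) {
                    if (visible.contains(e.getKey())) {
                        vo.set(e.getKey(), e.getValue());
                    }
                }
                return vo;
            }

            public static void main(String[] args) {
                ValueObjectFactory factory = new ValueObjectFactory(
                        Map.of("journalist", Set.of("title", "duration"),
                               "archivist",  Set.of("title", "duration", "tapeId")));
                Map<String, Object> model = Map.of("title", "Feed 802", "duration", 1200, "tapeId", "T-0042");
                System.out.println(factory.create(model, "journalist").asMap());  // tapeId is withheld
            }
        }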
  • metadata attributes can be defined as one of 4 types:
  • a) Single value, e.g. item duration
    b) Component (no propagation), e.g. picture format, best media
    c) Component (Inheritance propagation only), e.g. archive flag, agency flag
    d) Component (Inheritance and bi-directional propagation), e.g. traffic light, usage restriction, copyright holder
  • Type (b) component attributes therefore do not inherit values from their parent item timelines. For all sections which are not otherwise marked-up with locally applied components, the timeline will be set with the default value.
  • Type (c) & (d) component attributes inherit from their parent items' corresponding portion of the timeline (taking into account any components applied on the parent). Locally applied components are also taken into account. If the item has no parents (i.e. it is not a production item), then it takes a default value from a default attribute as if it were a type (b).
  • a production item may be broken away from its parent items, severing the relationships and the inheritance of values. In this case it will be as though the item ceased to be a production item and becomes treated as if it were a non-production original item.
  • Type (c) attributes are inherited down to child items only, whereas Type (d) attribute annotation is required to appear consistent for all items using the same portion of media.
  • Type (d) attributes are those where annotation is required to propagate to all occurrences, from parent to child and from child to parent, wherever there is a relationship which identifies that the same portions of media are being used.
  • Type (c) annotation is simple to picture, with annotation on any item being inherited into further edits made from that item.
  • Type (d) is more technically complex. In order to achieve our design requirement of annotation propagating to all related media, it is necessary to ensure that, no matter where the annotation is added by the user, the system will add the information at the corresponding point(s) from which all items that share the same media inherit their information. We take advantage of the same inheritance described for type (c) to distribute the information, but type (d) modifies the process of adding the data in the first instance. This will be explained further in the following description of the Traffic Light attribute.
  • the 'Traffic Light' is an attribute which can take one of three values; red, amber or green.
  • Embargoed news (committee reports, crime figures etc.).
  • Green signifies less restrictive or unrestricted use
  • Red signifies more restrictive or prohibited use of the associated media.
  • Different parts of BBC News operate with differing degrees of freedom to use other organisations' pictures, which results in a traffic light which has to be interpreted with local knowledge. Where there is doubt over the suitability of material it will be Amber. Where the material is very likely to be unusable it will be Red: probability is a factor in choosing the colour.
  • Figure 7 shows rules for rough cut and raw traffic light summary derivation. It can be seen that for finished and rough cut items, the summary 702 is red if any part of the item has a traffic light of red. The summary is green only when all of the item has a traffic light of green, and for all other cases the summary traffic light is amber. For Raw items, the summary 704 is green when all of the item has a traffic light of green, amber if some but not all of the item has a traffic light of green, and red (not shown) only if all of the item has a traffic light of red.
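  • The Figure 7 derivation rules can be expressed compactly; the following Java sketch is illustrative only (the enum and method names are assumptions), and treats the timeline as a collection of per-segment traffic light values:

        import java.util.*;

        /** Sketch of the Figure 7 summary derivation rules. */
        public class TrafficLightSummary {

            enum Light { GREEN, AMBER, RED }

            /** Finished and rough cut items: RED if any segment is red,
             *  GREEN only if every segment is green, otherwise AMBER. */
            static Light summariseFinishedOrRoughCut(Collection<Light> timeline) {
                if (timeline.contains(Light.RED)) return Light.RED;
                if (timeline.stream().allMatch(l -> l == Light.GREEN)) return Light.GREEN;
                return Light.AMBER;
            }

            /** Raw items: GREEN if every segment is green, RED only if every segment is red,
             *  otherwise AMBER (covering the "some but not all green" case in the text). */
            static Light summariseRaw(Collection<Light> timeline) {
                if (timeline.stream().allMatch(l -> l == Light.GREEN)) return Light.GREEN;
                if (timeline.stream().allMatch(l -> l == Light.RED)) return Light.RED;
                return Light.AMBER;
            }

            public static void main(String[] args) {
                List<Light> mixed = List.of(Light.GREEN, Light.AMBER, Light.GREEN);
                System.out.println(summariseFinishedOrRoughCut(mixed));  // AMBER
                System.out.println(summariseRaw(mixed));                 // AMBER
            }
        }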
  • the Summary Item Traffic Light provides an at a glance summary view of the usability of an item, in the context of it either being for air or for further editing.
  • the summary does not affect the timeline view (which is described later) for production edits - this is derived from the related media, taking account of components on this item - neither does it affect edits derived from this Item (the relevant timeline value is inherited).
  • the summary may be manually changed to be more restrictive for editorial reasons and may be returned to the original derived value.
  • A simple example of traffic light usage is shown in Figure 8a.
  • a feed 802 containing a desired package (and other packages) is received.
  • the recording of the feed is, in effect, a master - it does not inherit from any parent media.
  • These original master recordings are required to take their default timeline value from a user's default setting 803 (in the absence of a parent). This avoids it being necessary to mark a full-length component with an amber traffic light (although this has a very similar effect for propagation purposes). Since it is a pool feed the default traffic light is set to amber.
  • Figure 8b shows Mediaport starting to “top and tail” the desired package within this recorded feed to produce a first edit 804. Simultaneously, a news production team are doing a quick turnaround headline edit or OOV 806, derived from the feed.
  • the two edits display a derived rights timeline, in this case built from the matching periods in the parent.
  • the parent is all amber, so the two edits appear amber (as are the item summary values 808 and 810).
  • the system derives a new Item-level traffic light summary 815 for item 804 based on the mix of the individual segments of the timeline.
  • the timeline has been fully annotated with components; this need not be the case. Items may have only been given components against some portions of the timeline. Where there is no overriding component traffic light, that portion of the timeline will take on the value (or values) of the corresponding portions referred to in the parent. The value in the parent will itself have to be derived by applying the same rules. Two things are being derived: one is a time-varying view of the traffic light, the other is a single overall value for the Item traffic light summary.
  • Figure 8d shows the effect of annotation of item 804.
  • the traffic light component is propagated upwards, as shown by arrow 816 and will locate and mark against the master items at the corresponding timecodes.
  • the appropriate parts of these traffic light components are then propagated down to News24 headline 806 as shown by arrow 818.
  • the media essence for the master recording may be deleted to save storage space without affecting the propagation of components to headline 818.
  • A more complex traffic light example, illustrated in Figure 9, will be briefly explained.
  • a number of media items are shown having parent child relationships indicated by thin solid arrows as in previous figures.
  • components 902 have been added to two items 904 and 906 specifying green from start to finish.
  • the system locates the master in each instance and places a green traffic light component against the appropriate time portions so that all other items using the same media also have appropriate components added.
  • Figure 9a shows components 902 being applied directly to the master; in some implementations upwards propagation of components may be performed step by step, updating a single relationship at a time.
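  • As a sketch of the upwards placement just described (add the annotation against the master items at the corresponding timecodes, then let ordinary inheritance distribute it downwards), the following Java fragment is illustrative only; the link structure and all names are assumptions:

        import java.util.*;

        /** Sketch of pushing a type (d) annotation up to the master recording(s). */
        public class BiDirectionalAnnotation {

            /** A parent link: child frames [childIn, childOut) come from parent frames starting at parentIn. */
            record ParentLink(MediaItem parent, long childIn, long childOut, long parentIn) {}

            static final class MediaItem {
                final String name;
                final List<ParentLink> parentLinks = new ArrayList<>();
                final List<long[]> components = new ArrayList<>();   // {in, out} of annotated segments

                MediaItem(String name) { this.name = name; }
            }

            /** Adds the annotation at the corresponding timecodes of every master item. */
            static void annotate(MediaItem item, long in, long out) {
                if (item.parentLinks.isEmpty()) {                    // reached a master recording
                    item.components.add(new long[] { in, out });
                    return;
                }
                for (ParentLink link : item.parentLinks) {
                    long overlapIn = Math.max(in, link.childIn());
                    long overlapOut = Math.min(out, link.childOut());
                    if (overlapIn < overlapOut) {
                        long offset = link.parentIn() - link.childIn();
                        annotate(link.parent(), overlapIn + offset, overlapOut + offset);
                    }
                }
            }

            public static void main(String[] args) {
                MediaItem master = new MediaItem("feed 802");
                MediaItem edit = new MediaItem("edit 804");
                edit.parentLinks.add(new ParentLink(master, 0, 300, 1000));  // edit frames 0-300 = master 1000-1300
                annotate(edit, 50, 100);                                     // user marks the edit...
                System.out.println(Arrays.toString(master.components.get(0))); // ...master gets [1050, 1100]
            }
        }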
  • Usage restriction operates in a very similar fashion to the traffic light described above in that an item will by default take the default usage restriction, but component usage restrictions can also be added. Usage restrictions propagate up and down in the inheritance structure as described above.
  • a wide variety of metadata having different types of properties can exist with media items.
  • the system can present this data to a user in a number of ways. Taking the example of Fig 11 , there is shown on an annotated timeline a media item 1102 with a default traffic light of amber and two annotated components, the first 1104 with a green traffic light, and the second 1106 with a red traffic light. To display this information to a user a text view of the component data 1108 can be displayed.
  • a timeline for the media item as shown in Figure 12.
  • the system derives and displays, for the whole length of the item the relevant traffic light status 1202, whether it be from an assigned component, an inherited component or a default value. This can be thought of as a flattened or 'end on' timeline with the most recent components shown on top and obscuring previous components or items behind.
  • a textual representation 1204 of the flattened timeline is also shown. Although the textual representation does not show as much information as the component data view 1108, by moving the cursor over a particular point on timeline 1202, full component information for that point can be displayed.
  • annotation timeline is inherently linked with the example of metadata inheritance using components as described above, since when a segment of a media item is used in another item, and metadata from a component field is to be propagated, the values taken for that segment are the annotation timeline values which would be displayed for that segment.
  • Figure 13a shows an annotation timeline of a media item 1302 and components 1304, 1306 and 1308, along with a flattened timeline view 1310. It will be seen that since item 1302 and component 1304 have the same traffic light value, no distinction is made between them in the flattened timeline view, and they are effectively merged.
  • component 1304 is modified to change the traffic light from green to amber. This causes it to move down the timeline to position 1312, and the timeline is modified accordingly with the traffic light value of component 1312 now obscuring or overriding the value of component 1306.
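  • A minimal sketch of deriving the flattened timeline view, in which later-added components obscure earlier ones and unannotated frames fall back to the default; the names and frame-based representation are assumptions:

        import java.util.*;

        /** Sketch of deriving the flattened ('end on') timeline for display. */
        public class FlattenedTimeline {

            record Component(long in, long out, String value) {}          // [in, out)
            record Segment(long in, long out, String value) {}

            static List<Segment> flatten(long duration, String defaultValue,
                                         List<Component> componentsInAddOrder) {
                // Collect every boundary at which the visible value might change.
                TreeSet<Long> boundaries = new TreeSet<>(List.of(0L, duration));
                for (Component c : componentsInAddOrder) { boundaries.add(c.in()); boundaries.add(c.out()); }

                List<Segment> flattened = new ArrayList<>();
                Long prev = null;
                for (long b : boundaries) {
                    if (prev != null && prev < b) {
                        String value = defaultValue;
                        for (Component c : componentsInAddOrder) {        // later components obscure earlier ones
                            if (c.in() <= prev && b <= c.out()) value = c.value();
                        }
                        // Merge with the previous segment if the value is unchanged.
                        if (!flattened.isEmpty() && flattened.get(flattened.size() - 1).value().equals(value)) {
                            Segment last = flattened.remove(flattened.size() - 1);
                            flattened.add(new Segment(last.in(), b, value));
                        } else {
                            flattened.add(new Segment(prev, b, value));
                        }
                    }
                    prev = b;
                }
                return flattened;
            }

            public static void main(String[] args) {
                // Default amber, then a green and a later red component (values in the style of Fig. 11).
                System.out.println(flatten(1000, "AMBER",
                        List.of(new Component(100, 400, "GREEN"), new Component(300, 600, "RED"))));
                // -> [0,100) AMBER, [100,300) GREEN, [300,600) RED, [600,1000) AMBER
            }
        }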
  • Views of components can apply either to single attributes or to groups of attributes.
  • for single attributes the system needs to know the last date/time the attribute was modified.
  • for groups of attributes the system needs to know the last date/time when any of the attributes in the group was modified (it does not need to know which attribute in the group it was).
  • the Rights group includes the attributes:
  • the timeline display as described can be embedded in third party editing systems. This will show compositional events, their type and duration. It provides not only event information from the system but also the ability to display metadata properties from the metacore. This will take the form of an area of screen real estate that the system user will control directly. This control will be self-contained in that it will have its own keyboard and mouse events that will obtain property information from the system, such as copyright information.
  • a usage value is indicative of how many times any particular portion of a media item occurs in another, related media item. This value can then be displayed in a timeline view of the media item in a fashion similar to the rights timeline described above.
  • An example of a usage timeline is shown in Figure 14. Portions of the media item which are not shared with any other items in the system are indicated as a light colour or clear as shown at timeline portion 1400. Portions 1402 and 1404 have a first usage value indicating that these portions are included in one other media item in the system, and are shown darker. Portions 1406 and 1408 have a second usage value indicating that these portions are included in two other items within the system. It can be seen that the usage timeline provides a usage density display with darker areas being more heavily used.
  • any production media item will automatically have at least one instance of corresponding media for its entire duration; namely the parent item or items it was derived from, assuming these have not been deleted. In certain cases therefore where production items are being considered, it may be desirable to assign the lowest usage value to media portions having a single occurrence of corresponding media. Alternatively 'raw' items could be excluded from the derivation of the usage value.
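  • For illustration, the usage density could be derived with a simple sweep over the portions of the item reused by related items; the representation below is an assumption, not the specification's data model:

        import java.util.*;

        /** Sketch of deriving a usage timeline: for each segment, how many related items use it. */
        public class UsageTimeline {

            record Usage(long in, long out) {}                 // a portion [in, out) used by one related item

            /** Returns usage counts keyed by segment start, e.g. for shading a timeline. */
            static SortedMap<Long, Integer> usageCounts(long duration, List<Usage> usagesByRelatedItems) {
                // Classic sweep: +1 at each usage start, -1 at each usage end.
                SortedMap<Long, Integer> delta = new TreeMap<>();
                delta.put(0L, 0);
                delta.put(duration, 0);
                for (Usage u : usagesByRelatedItems) {
                    delta.merge(u.in(), +1, Integer::sum);
                    delta.merge(u.out(), -1, Integer::sum);
                }
                SortedMap<Long, Integer> counts = new TreeMap<>();   // segment start -> usage value
                int running = 0;
                for (Map.Entry<Long, Integer> e : delta.entrySet()) {
                    running += e.getValue();
                    counts.put(e.getKey(), running);
                }
                return counts;
            }

            public static void main(String[] args) {
                // Two related items each reuse part of a 1000-frame item; the overlap is used twice.
                System.out.println(usageCounts(1000, List.of(new Usage(100, 500), new Usage(300, 700))));
                // -> {0=0, 100=1, 300=2, 500=1, 700=0, 1000=0}
            }
        }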
  • Figures 15a and 15b show user displays of an exemplary media management system.
  • Figure 15a includes a traffic light summary indicator 1502 and a timeline view 1504 for a media item being worked on. There are also a number of media viewing windows 1505.
  • Figure 15b includes two traffic light summary indicators and also a timeline view 1508.
  • media item essence exists at different qualities within the system e.g. Desktop, broadcast etc.
  • Media Items will be ingested into the system at Broadcast Quality.
  • further Media Item Instances will automatically be created. These are web quality, desktop quality and keyframe quality, and therefore, normally, four Media Item Instances will exist on the system for a Media Item.
  • the non-broadcast quality Media Item Instances are for viewing purposes within the system only.
  • Archiving a Media Item means that a broadcast quality copy of the Media Item essence is made on an offline storage item (e.g. tape). All essence is then deleted from the online and offline archive stores with the exception of the web quality essence and keyframe essence.
  • the metadata for the Media Item, and the metadata for the associated keyframes, Components and Bookmarks are kept online. All operations from the system to Offline Archive are copy actions. A move action is performed by first copying the item to the relevant store and then later by the system deleting the essence from the current online store.
  • the metadata will additionally include a tape ID to facilitate a user requesting that items on tape be brought back online.
  • web hosting caters for users on low bandwidth networks with a requirement to access media. For example, bureau users have 56Kb or ISDN connections. Such low bandwidth availability precludes working with 1.5 Mbps Desktop media. However, they will still be able to access and update metadata, whilst also being able to view Web media.
  • Any system user with the appropriate access privileges can recommend a media item for archiving from a terminal networked to the metacore by manually setting the archive flag. This causes the media item to be reviewed at the archive service to determine whether it should be recorded offline to tape, or deleted. An item may be deleted at this stage for a variety of reasons including resource constraints. This review is typically performed manually by a member of archive staff.
  • the system also has the facility automatically to make archive recommendations based on Archive/Keep Rules.
  • the automatic rules are configurable, adaptable and periodic.
  • the periodic rate of the search can be configured as well.
  • archive rules define various metadata fields, and values of those fields, that the media item must match in order to be selected by the rules. Examples of metadata fields used in archive rules are:
  • Each rule will consist of one or more archive terms.
  • Each archive term consists of a metadata field name, an operator, and a value.
  • the operators allowed by the system will vary depending on the type of the metadata field (i.e. in some cases only an equivalence will be allowed).
  • the archive terms will be linked together using Boolean operators (AND/OR). This will allow archivists to create complex archive criteria that will match certain sets of media items (provided they have been marked up correctly).
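  • A minimal sketch of archive terms and their Boolean combination, evaluated against an item's metadata; the field names, operators and classes here are assumptions rather than the actual implementation:

        import java.util.*;

        /** Sketch of archive rule evaluation. */
        public class ArchiveRules {

            enum Operator { EQUALS, NOT_EQUALS, MATCHES }          // MATCHES allows '*' wildcards
            enum Combiner { AND, OR }

            /** One archive term: metadata field name, operator and comparison value. */
            record ArchiveTerm(String field, Operator op, String value) {
                boolean matches(Map<String, String> metadata) {
                    String actual = metadata.get(field);
                    if (actual == null) return false;
                    return switch (op) {
                        case EQUALS     -> actual.equals(value);
                        case NOT_EQUALS -> !actual.equals(value);
                        // Simplistic wildcard translation: '*' becomes '.*' in a regular expression.
                        case MATCHES    -> actual.matches(value.replace("*", ".*"));
                    };
                }
            }

            /** A rule is one or more terms linked by a Boolean combiner. */
            record ArchiveRule(Combiner combiner, List<ArchiveTerm> terms) {
                boolean matches(Map<String, String> metadata) {
                    return combiner == Combiner.AND
                            ? terms.stream().allMatch(t -> t.matches(metadata))
                            : terms.stream().anyMatch(t -> t.matches(metadata));
                }
            }

            public static void main(String[] args) {
                Map<String, String> item = Map.of("programme", "One O'Clock News",
                                                  "archiveFlag", "true");
                ArchiveRule rule = new ArchiveRule(Combiner.AND, List.of(
                        new ArchiveTerm("programme", Operator.EQUALS, "One O'Clock News"),
                        new ArchiveTerm("archiveFlag", Operator.EQUALS, "true")));
                System.out.println(rule.matches(item));   // -> true
            }
        }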
  • Rules matching service will provide the ability to identify business objects within the system whose metadata matches certain criteria.
  • the criteria for matching will be complex and may involve interaction with other system business objects.
  • the outcome of a rule match will also vary depending on the rule being evaluated.
  • a scripting environment will be provided to allow users to create complex rules. It is assumed that only technically able and responsible users will be able to create rules.
  • the service will provide the ability for rules to be defined using a defined syntax. For some metadata fields the comparison value may consist of wildcard characters.
  • the system will also allow system wide pre-defined rules to be implemented to enable users with lesser privileges to make use of the rule matching
  • the RuleFactory 1902 will obtain all rules that have previously been persisted to the rule store 1904.
  • the rule factory will be able to read rules from file and a defined JDBC connection.
  • each rule is maintained within one of a number of RuleSets 1906 that is used to identify it as pertaining to a type of rule e.g. an archive rule. This allows the RuleEngine to identify all rules required to perform evaluation on a specific group of rules.
  • Rules are evaluated by a call to the rules evaluation engine 1910. This call passes a collection of business objects 1912 to be evaluated and the name of a RuleSet containing the rules to be used.
  • the rule engine is able to perform the action defined by the rule immediately or defer it to a later date/time.
  • the decision to defer is made through the use of a priority defined by the rule when it is created. If the action is deferred until a later date/time a match object is constructed and placed in a match queue 1916 until processed by the rule action processor 1914. This processor is run as a low priority thread and will only action the rules when system resources allow.
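  • The deferral mechanism could be sketched as a match queue drained by a low-priority background thread, for example (illustrative only; the Match shape and the threading details are assumptions):

        import java.util.concurrent.*;

        /** Sketch of deferring rule matches to a low-priority processor thread. */
        public class RuleActionProcessor {

            record Match(String itemId, Runnable action) {}

            private final BlockingQueue<Match> matchQueue = new LinkedBlockingQueue<>();

            public void defer(Match match) { matchQueue.add(match); }

            /** Starts the processor as a low-priority daemon thread. */
            public void start() {
                Thread processor = new Thread(() -> {
                    try {
                        while (true) {
                            Match match = matchQueue.take();       // blocks until a deferred match arrives
                            match.action().run();                  // e.g. set the archive recommendation
                        }
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();        // shut down quietly
                    }
                }, "rule-action-processor");
                processor.setPriority(Thread.MIN_PRIORITY);
                processor.setDaemon(true);
                processor.start();
            }

            public static void main(String[] args) throws InterruptedException {
                RuleActionProcessor p = new RuleActionProcessor();
                p.start();
                p.defer(new Match("item-42", () -> System.out.println("recommend item-42 for archive")));
                Thread.sleep(200);                                 // give the background thread time to run
            }
        }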
  • Each rule contains:
  • Condition script - This script is used to declare how matching is to be performed and returns true if a match occurs.
  • Action script - This script is used to perform any operations that are required as a result of a match. This script may simply pass handling of the match over to another system service for processing.
  • Group action script (optional) - This script is used if the rule is set up to allow group matches. See below for details on grouping of matches.
  • Item object type - This is the business object type that will be exposed to the script in order to evaluate the match.
  • Match object type (optional) - This is the object that may be used if the script passes on handling of the match to a pre-defined component.
  • the rules can be set up to perform a number of different tasks on the media item when a match occurs. For instance when the rule matches a media item the system may update the media item archive decision code to "recommend for archive". Alternatively the rule may be set up to automatically create a new departure booking and assign the matching media item(s) to it. For instance, this will allow rules to be set up that collect up all media items that were used in the one o'clock news and automatically create an archive departure booking to record them to tape. When rules match a media item the system will update the policy match code to indicate that the media item has been matched by a rule. This is required as some rules may wish to filter out media items that have already been matched by a rule. This also implies that the order in which rules are checked is very important especially where media items are likely to be matched by more than one rule.
  • an archive rule might be defined as:
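  • The specification leaves the concrete example open here; purely as an illustration, and reusing the hypothetical ArchiveTerm/ArchiveRule classes from the sketch above, such a rule might be expressed as:

        // Hypothetical rule only - the field names and values are illustrative, not from the specification.
        ArchiveRule rule = new ArchiveRule(Combiner.AND, List.of(
                new ArchiveTerm("itemType", Operator.EQUALS, "Finished"),
                new ArchiveTerm("agencyFlag", Operator.EQUALS, "false"),
                new ArchiveTerm("title", Operator.MATCHES, "*election*")));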
  • a user can view, modify, add, delete and review archive rules.
  • the user will be able to prioritise how the list is viewed.
  • Media Items will be viewed in a list prioritised by 'archive decision'.
  • Subscription option will display a list of all automatic archive recommendations the user is subscribed to.
  • Storage Items will also be displayed in a list with right click menu for options such as Add, Maintain, details and Associate storage with media item.
  • the interface will also allow users to adjust the priority or order in which the rules are exercised against the media items.
  • the system will search all available bookings, media items, components, and keyframes for content that may be appropriate for archiving.
  • the actual rule evaluation will be executed either as a one off operation or as a set of 'batched' rules.
  • the archive recommendations list can contain recommendations for media items previously archived offline.
  • the system can eliminate previously archived media items from being entered into the list.
  • the user is able to make an archive recommendation for media items already archived but the system will warn the user when they attempt to do so.
  • the system will attempt to remove all media items for which the Planned Deletion Date/Time has expired. When the deletion date is reached, therefore, the system first checks whether the media item has been previously archived. If it has been archived, only the broadcast and desktop qualities are deleted. This leaves the web quality and keyframe qualities online to allow a user to perform search and viewing functions. If it has not been previously archived, the system will delete all essence and metadata for the Media Item (this includes component metadata, bookmark metadata, keyframe essence and metadata). The system will not, however, remove items that are currently on a list of items recommended for archive awaiting review or which form part of an archive departure booking awaiting copying to tape (archive queue), even if their Planned Deletion Date/Time has passed. This ensures that media items that have been marked for archive are not deleted by the system until the playout to offline archive has been completed successfully.
  • the system will prevent a deletion from occurring if the Media Item is being used in an unconformed Production Media Item or if there is a planned departure booking for the Media Item. If the Media Item is being used in a conformed Production Media Item, the system will allow deletion of all the Media Item's essence but will prevent deletion of the Media Item's metadata; this metadata will only be available by reference from the Production Media item - i.e. it will never appear in search results.
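  • The deletion checks described in the two preceding paragraphs could be summarised, as a sketch only (the flags, names and the ordering of the checks are assumptions), as:

        import java.time.Instant;

        /** Sketch of the planned-deletion decision. */
        public class DeletionPolicy {

            static final class MediaItem {
                Instant plannedDeletion;
                boolean previouslyArchived;
                boolean onArchiveQueue;              // awaiting review or an archive departure booking
                boolean usedInUnconformedProduction;
                boolean usedInConformedProduction;
                boolean hasPlannedDepartureBooking;
            }

            enum Decision { KEEP, DELETE_BROADCAST_AND_DESKTOP_ONLY, DELETE_ESSENCE_KEEP_METADATA, DELETE_ALL }

            static Decision decide(MediaItem item, Instant now) {
                if (item.plannedDeletion == null || now.isBefore(item.plannedDeletion)) return Decision.KEEP;
                if (item.onArchiveQueue) return Decision.KEEP;                     // wait for archive playout
                if (item.usedInUnconformedProduction || item.hasPlannedDepartureBooking) return Decision.KEEP;
                // Previously archived items keep web quality, keyframes and metadata online for searching.
                if (item.previouslyArchived) return Decision.DELETE_BROADCAST_AND_DESKTOP_ONLY;
                // Items used in a conformed production item lose their essence but keep their metadata.
                if (item.usedInConformedProduction) return Decision.DELETE_ESSENCE_KEEP_METADATA;
                return Decision.DELETE_ALL;
            }

            public static void main(String[] args) {
                MediaItem item = new MediaItem();
                item.plannedDeletion = Instant.parse("2004-04-05T00:00:00Z");
                item.previouslyArchived = true;
                System.out.println(decide(item, Instant.now()));   // DELETE_BROADCAST_AND_DESKTOP_ONLY
            }
        }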
  • the first type 1602 is for those items which are not recommended for archive either manually or as the result of an automated rule. These are deleted when their deletion date is reached (subject to the item already being held offline in which case only the essence is deleted as explained above) as indicated at 1610.
  • the second type 1604 consists of items which do not satisfy an automated rule, but which have been manually recommended for archive. These are passed for archive review at 1612. This results in one of three actions.
  • the item may be deleted as shown at 1610.
  • the item may alternatively be kept online until a later date as shown at 1614 (which may involve changing the planned deletion date) for further review or for deletion. If deletion is selected after the selected date, but a new recommendation is made in the meantime, then the item is again reviewed.
  • Lastly the item may be Archived as shown at 1616. It should be noted that actions 1614 and 1616 may both take place for a particular item. If the item is to be archived and not also maintained online, the online essence is deleted at the planned deletion time (excluding metadata, web resolution and bookmarks).
  • the third type 1606, comprises items which have not been manually recommended for archive, but which satisfy an archive rule. These are treated according to the rule outcome as shown at 1618. These outcomes are early deletion as shown at 1620 (subject to checking that item is not held offline), or manual review 1612, keep online 1614, and/or archive 1616 as shown.
  • the last type of item 1608 is where a manual recommendation has been made and an archive rule has been satisfied. This is passed for review at 1622, where it is decided whether to proceed with archive rule outcome or amend according to journalist recommendation. All media is stored using a storage item.
  • a storage item may be videotape, a data format tape, or a logical device like a partition on a hard drive or an optical worm drive. The system keeps track of where media items are by tracking which storage item each is stored on. When a videotape of media is recorded for the archive, a new storage item must be created, and selected for the recording. This creates a media item instance on the new storage device. Users can view the details or attributes of a storage item such as:
  • Storage items are modified whenever they are moved, more media is added to the storage item, or if the name, ID or tape ID must be changed. If practical, a new storage item should be created if a tape storage item is duplicated creating duplicate media item instances of everything on the tape. Storage items can also be deleted.
  • Requests for items from the offline archive are handled using the booking processes.
  • a user will submit a request from archive when they require access to archived media items.
  • Archivists will locate the relevant tape(s) and insert them into a system enabled VTR. The user will then enter the tape ID(s) and the VTR location into the system.
  • the system will automatically queue the tape to the correct timecode and begin ingest of the media items. This ingest will be performed using record tools. The system will prompt the user when/if other tapes are required.
  • the system also allows 'legacy' archive tapes to be ingested into the system. That is, archive tapes which are not formatted according to the present system.
  • the tape ID will not be a recognized system archive tape ID.
  • the system will create a new media item with default values (similar to crash recording). This may be exactly the same as crash recording except the archivist is allowed to manually set the archive flag.
  • the archivist will be responsible for queuing the tape to the correct location and beginning the recording. Once recorded the archivist can then use the standard modify media item tools to update any metadata from the existing archive systems. If multiple media items are required from a single tape then the archivist must perform separate ingests. The system will continue to track all details about this tape in the same way as for system archive tapes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Databases & Information Systems (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Television Signal Processing For Recording (AREA)
  • Management Or Editing Of Information On Record Carriers (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)
  • Selective Calling Equipment (AREA)
  • Small-Scale Networks (AREA)
  • Multi Processors (AREA)

Abstract

The invention relates to a method of managing metadata associated with media items in a media production system comprising a plurality of media items related by parent-child relationships, in which metadata is propagated from a media item to a related media item in different ways according to a propagation characteristic, e.g. only in the parent-to-child direction, or bi-directionally, i.e. also in the child-to-parent direction. To manage the metadata, a timeline scheme is maintained which provides, at any point in time, the attribute value most recently assigned to a media item. Archiving of media items can be managed automatically by means of archive rules, and the metadata can advantageously be used to simplify the searching and playback of archived material.
PCT/GB2004/001493 2003-04-04 2004-04-05 Systeme et procede de gestion de multimedia WO2004088664A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US46064903P 2003-04-04 2003-04-04
US60/460,649 2003-04-04

Publications (2)

Publication Number Publication Date
WO2004088664A2 true WO2004088664A2 (fr) 2004-10-14
WO2004088664A3 WO2004088664A3 (fr) 2005-03-17

Family

ID=33131929

Family Applications (7)

Application Number Title Priority Date Filing Date
PCT/US2004/010766 WO2004090677A2 (fr) 2003-04-04 2004-04-05 Systeme et procede de traitement de contenu multimedia
PCT/GB2004/001506 WO2004088990A2 (fr) 2003-04-04 2004-04-05 Commande de stockage de contenu multimedia
PCT/GB2004/001505 WO2004088553A2 (fr) 2003-04-04 2004-04-05 Procede et appareil permettant de commander dynamiquement un systeme de production de contenu multimedia diffuse
PCT/GB2004/001468 WO2004088663A2 (fr) 2003-04-04 2004-04-05 Processeur de media
PCT/GB2004/001493 WO2004088664A2 (fr) 2003-04-04 2004-04-05 Systeme et procede de gestion de multimedia
PCT/GB2004/001492 WO2004088984A1 (fr) 2003-04-04 2004-04-05 Systeme et procede de stockage et de recherche de donnees video avec conversion de la resolution
PCT/GB2004/001481 WO2004088887A2 (fr) 2003-04-04 2004-04-05 Systeme et procede de gestion multimedia

Family Applications Before (4)

Application Number Title Priority Date Filing Date
PCT/US2004/010766 WO2004090677A2 (fr) 2003-04-04 2004-04-05 Systeme et procede de traitement de contenu multimedia
PCT/GB2004/001506 WO2004088990A2 (fr) 2003-04-04 2004-04-05 Commande de stockage de contenu multimedia
PCT/GB2004/001505 WO2004088553A2 (fr) 2003-04-04 2004-04-05 Procede et appareil permettant de commander dynamiquement un systeme de production de contenu multimedia diffuse
PCT/GB2004/001468 WO2004088663A2 (fr) 2003-04-04 2004-04-05 Processeur de media

Family Applications After (2)

Application Number Title Priority Date Filing Date
PCT/GB2004/001492 WO2004088984A1 (fr) 2003-04-04 2004-04-05 Systeme et procede de stockage et de recherche de donnees video avec conversion de la resolution
PCT/GB2004/001481 WO2004088887A2 (fr) 2003-04-04 2004-04-05 Systeme et procede de gestion multimedia

Country Status (1)

Country Link
WO (7) WO2004090677A2 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7587507B2 (en) 2005-07-22 2009-09-08 Microsoft Corporation Media recording functions in a streaming media server
GB2522296B (en) * 2014-10-08 2016-11-02 Deluxe Broadcast Services Ltd Broadcasting Apparatus
CN111508468B (zh) * 2020-04-17 2021-01-01 北京灵伴即时智能科技有限公司 录音编辑管理方法及系统

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA1322422C (fr) * 1988-07-18 1993-09-21 James P. Emmond Fichier indexe a cle unique pour gisements de file d'attente de traitement transactionnel
US4989191A (en) * 1989-01-03 1991-01-29 Frank Sheafen Kuo Data processing system with mixed media memory packs
JP3248380B2 (ja) * 1994-12-15 2002-01-21 ソニー株式会社 データ復号化装置およびデータ復号化方法
JPH0981497A (ja) * 1995-09-12 1997-03-28 Toshiba Corp 実時間ストリームサーバ並びに実時間ストリームデータの格納方法および転送方法
CA2251225C (fr) * 1996-04-12 2009-12-29 Avid Technology, Inc. Systeme multimedia avec des mecanismes ameliores de gestion des donnees
US5826081A (en) * 1996-05-06 1998-10-20 Sun Microsystems, Inc. Real time thread dispatcher for multiprocessor applications
US5964849A (en) * 1997-04-01 1999-10-12 Sony Corporation Controlling video devices
JP3741299B2 (ja) * 1997-04-06 2006-02-01 ソニー株式会社 映像信号処理装置及び映像信号処理方法
JP4150083B2 (ja) * 1997-09-25 2008-09-17 ソニー株式会社 符号化ストリーム生成装置及び方法、ならびに編集システム及び方法
US6070228A (en) * 1997-09-30 2000-05-30 International Business Machines Corp. Multimedia data storage system and method for operating a media server as a cache device and controlling a volume of data in the media server based on user-defined parameters
US20030001880A1 (en) * 2001-04-18 2003-01-02 Parkervision, Inc. Method, system, and computer program product for producing and distributing enhanced media
US6766357B1 (en) * 1999-04-15 2004-07-20 Avid Technology, Inc. Apparatus and method for efficient transfer of multimedia data for playback
US6792615B1 (en) * 1999-05-19 2004-09-14 New Horizons Telecasting, Inc. Encapsulated, streaming media automation and distribution system
WO2000072574A2 (fr) * 1999-05-21 2000-11-30 Quokka Sports, Inc. Architecture de commande du flux et de la transformation de donnees multimedia
WO2001028238A2 (fr) * 1999-10-08 2001-04-19 Sarnoff Corporation Procede et appareil permettant d'ameliorer et d'indexer des signaux video et audio
AU2603401A (en) * 1999-12-23 2001-07-03 Michael Moynihan Personal video channel system
US20010037379A1 (en) * 2000-03-31 2001-11-01 Noam Livnat System and method for secure storage of information and grant of controlled access to same
US6954795B2 (en) * 2000-04-05 2005-10-11 Matsushita Electric Industrial Co., Ltd. Transmission/reception system and method for data broadcast, and transmission apparatus for data broadcast
US6882793B1 (en) * 2000-06-16 2005-04-19 Yesvideo, Inc. Video processing system
US20020089602A1 (en) * 2000-10-18 2002-07-11 Sullivan Gary J. Compressed timing indicators for media samples
WO2003001788A2 (fr) * 2001-06-25 2003-01-03 Redhawk Vision Inc. Appareil et procede de capture, de stockage et de traitement d'evenements video
EP1483909B1 (fr) * 2002-03-13 2010-04-28 Imax Corporation SYSTEMES ET PROCEDES PERMETTANT la REMASTERISATION OU la MODIFICATION NUMERIQUE DE FILMS CINEMATOGRAPHIQUES OU AUTRES DONNEES DE SEQUENCES D'IMAGES

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5428737A (en) * 1991-10-16 1995-06-27 International Business Machines Corporation Comprehensive bilateral translation between SQL and graphically depicted queries
US20010033295A1 (en) * 1998-04-03 2001-10-25 Phillips Michael E. System, method, and product for resolution-independent image translation
US6539163B1 (en) * 1999-04-16 2003-03-25 Avid Technology, Inc. Non-linear editing system and method employing reference clips in edit sequences

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006045767A1 (fr) 2004-10-25 2006-05-04 International Business Machines Corporation Systeme et procede de gestion de donnees configurable basee sur des entites
US7493350B2 (en) 2004-10-25 2009-02-17 International Business Machines Corporation Entity based configurable data management system and method
EP1758398A1 (fr) 2005-08-23 2007-02-28 Syneola SA Moyens d'interface pour metadata et utilisateur basés sur une semiotique à plusieurs niveaux et une logique floue pour un système interactif multimédia ayant une capacité d'adaptation par acquisition de connaissances
US8280827B2 (en) 2005-08-23 2012-10-02 Syneola Luxembourg Sa Multilevel semiotic and fuzzy logic user and metadata interface means for interactive multimedia system having cognitive adaptive capability
US8019155B2 (en) 2007-03-26 2011-09-13 Eastman Kodak Company Digital object information via category-based histograms
US10169389B2 (en) * 2007-10-26 2019-01-01 Microsoft Technology Licensing, Llc Metadata driven reporting and editing of databases
US11693827B2 (en) 2016-12-29 2023-07-04 Microsoft Technology Licensing, Llc Syncing and propagation of metadata changes across multiple endpoints

Also Published As

Publication number Publication date
WO2004088887A3 (fr) 2005-06-30
WO2004088663A2 (fr) 2004-10-14
WO2004088553A2 (fr) 2004-10-14
WO2004090677A9 (fr) 2005-03-31
WO2004090677A2 (fr) 2004-10-21
WO2004088553A3 (fr) 2004-12-09
WO2004088990A2 (fr) 2004-10-14
WO2004090677A3 (fr) 2007-05-10
WO2004088984A1 (fr) 2004-10-14
WO2004088664A3 (fr) 2005-03-17
WO2004088990A3 (fr) 2004-11-18
WO2004088663A3 (fr) 2004-12-02
WO2004088887A2 (fr) 2004-10-14

Similar Documents

Publication Publication Date Title
US20230325445A1 (en) Methods and apparatuses for assisting the production of media works and the like
US10592075B1 (en) System and method for media content collaboration throughout a media production process
CA2600207C (fr) Procede et systeme d'edition et de stockage distribues de supports numeriques via un reseau
US7743037B2 (en) Information processing apparatus and method and program
US8644679B2 (en) Method and system for dynamic control of digital media content playback and advertisement delivery
Miller et al. News on-demand for multimedia networks
US8977108B2 (en) Digital media asset management system and method for supporting multiple users
US20090217352A1 (en) Web managed multimedia asset management method and system
US20060047698A1 (en) Method and system for creating, tracking, casting and reporting on moving image projects
CN100546368C (zh) 信息处理装置及信息处理方法
US20070113184A1 (en) Method and system for providing remote digital media ingest with centralized editorial control
US20140129563A1 (en) Media catalog system, method and computer program product useful for cataloging video clips
US20050165840A1 (en) Method and apparatus for improved access to a compacted motion picture asset archive
US20030105743A1 (en) Use of database queries for manipulation of media content
WO2004088664A2 (fr) Systeme et procede de gestion de multimedia
CN107526747A (zh) 一种多媒体编目方法及系统
EP3518120B1 (fr) Indexation d'agrégats de contenu multimédia dans un environnement à bases de données multiples
US20150006540A1 (en) Dynamic media directories
CN100559857C (zh) 信息处理装置和信息处理方法
Addis et al. Digital preservation of audiovisual files within PrestoPRIME
Geller Apple Pro Training Series: Getting Started with Final Cut Guide

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase