US20030187820A1 - Media management system and process - Google Patents

Media management system and process

Info

Publication number
US20030187820A1
Authority
US
Grant status
Application
Legal status
Abandoned
Application number
US10109798
Inventor
Michael Kohut
Larry Goodman
Mark Koffman
Jim Mercs
Current Assignee
Sony Corp
Sony Pictures Entertainment Inc
Original Assignee
Sony Corp
Sony Pictures Entertainment Inc
Priority date
Filing date
Publication date

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/30 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F17/30067 - File systems; File servers
    • G06F17/30091 - File storage and access structures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/30 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F17/30244 - Information retrieval; Database structures therefor; File system structures therefor in image databases
    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L - SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L17/00 - Speaker identification or verification

Abstract

A system and method for managing a plurality of content items employs an ingest station which may digitize the content items, generate metadata corresponding to each of the content items, and store the metadata and the content items in one or more storage media. The content items may be audio content, video content, or audio/video content. The system and method may provide an association between the metadata and a corresponding content item to facilitate retrieval of the content item. The content items may be retrieved, at least in part, by searching the metadata and employing the association between the metadata and a corresponding content item to locate the desired content item.

Description

    RELATED APPLICATION
  • The present disclosure relates to co-pending U.S. patent application Ser. No. ______, titled “Media Storage And Management System And Process” (Attorney Docket No. 041892-0222), which is incorporated herein by reference.[0001]
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0002]
  • The invention relates to a system and a method for managing media and, in particular embodiments, to a system and method for managing content items, such as, for example, audio content, by generating metadata related to a content item, storing the content item and the related metadata, and maintaining a virtual link between the content item and the related metadata so as to facilitate retrieval of the content item. [0003]
  • 2. Related Art [0004]
  • Audio and visual content items may be stored in a variety of physical media including film, optical (CDs and DVDs), and magnetic. As used herein, the expression “content items” broadly and generally refers to information stored on a physical media. Examples of content items may include audio data, such as music, dialog, movie soundtracks, sound effects, or other like audio data. Examples of content items also may include visual data, such as video, including, but not limited to, movies, television programs, animation, home video, and the like. Examples of visual data may also include still images, such as pictures, a writing, charts, or other like images. Other like content items, including combinations of content items (e.g. audio and video data associated with a movie) may also be included. There are a variety of problems associated with conventional means for storing content items, including the possibility of (1) lost or misplaced content items; (2) deterioration of content items; (3) difficulty in locating and/or retrieving content items; and (4) information loss during production. [0005]
  • Content items may be lost or misplaced. Content items stored on a physical media may be difficult to locate after some time has passed. For example, in the case of movie audio, large amounts of audio data may be generated that ultimately is not mixed into a movie soundtrack. The audio data may be stored on magnetic tape, film, or computer disks. These physical media may be labeled and stored. However, the physical media may also be misplaced, mislabeled, or otherwise difficult to locate after some time has passed. [0006]
  • Content items stored on a physical media also may be subject to deterioration over time. This may be particularly problematic for content items stored in an analog format. For example, content items stored on magnetic tape or on film may be subject to deterioration. As such, someone desiring, for example, audio tracks from a movie that was created several years ago, may discover that the sound quality of the magnetic recordings is no longer satisfactory. [0007]
  • In addition, there may be difficulty in locating and accessing desired content items. For example, even if audio and/or video content items are stored on a physical media and archived, it may be difficult to locate or access the specific content items desired. Content items may be created and archived by multiple people, groups, or companies. As such, it may be difficult to identify where the desired content item is archived or stored. Also, once a physical media containing the desired content item is located, it may still need to be sent or delivered to an individual who desires access to the content item. Sending or delivering a physical media may be time intensive and inefficient. [0008]
  • Finally, content items may not be adequately archived or stored during the production of the content items. For example, in the case of movie audio, there may be multiple phases or stages of production of the audio that are never stored or archived. Actors may record their lines multiple times and on different tracks. These types of content items may be recorded over or discarded once the movie has been completed. Similarly, sound effects and background music may be added to a movie soundtrack in multiple phases. These content items may never be archived during production of the soundtrack, which may render the content unavailable in the future. Thus, many content items may never be adequately archived. Also, much information may be lost when it is not recorded during production. After a movie has been completed, for example, it may be difficult to discern who is speaking in a particular scene, how a particular sound effect was created, or other like information that may be readily available during the production of the movie audio data. [0009]
  • SUMMARY OF THE DISCLOSURE
  • Therefore, it is an advantage of embodiments of the invention that a system and method for managing media may result in media content being stored in conjunction with related metadata such that retrieval of the stored content may be facilitated. [0010]
  • It is an advantage of embodiments of the system and method for managing media that content items may be stored in a centralized location such that the content items may be less likely to be lost or misplaced. Embodiments of the invention may provide for the generation of metadata related to the content items, which may further facilitate locating a particular content item after some time has passed. [0011]
  • It is a further advantage of embodiments of the invention that content items stored on a physical media may be digitized by an ingest station and input into the system for storage. Digitized versions of the content items may not be as susceptible to deterioration over time as, for example, analog versions of media content. Thus, embodiments of the system and method advantageously solve a problem of deterioration of media content that may be associated with some physical media. [0012]
  • It is a further advantage of embodiments of the invention that there may be an improved ability to locate and retrieve content items. Metadata may be generated that is related to each content item managed. As used herein, “metadata” refers to data associated with a content item, such as, for example, an audio content item, that defines or describes information about the content item. In some embodiments, metadata may include a plurality of metadata fields. Each of the plurality of metadata fields may define a separate informational item about a content item. The metadata fields may contain information about the format of the content item, such as sample rate, bit depth, sync source, and the like. The metadata fields may also contain information about the content, such as movie title, actor information, genre, and the like. An association may be created to virtually link a content item to related metadata. Therefore, metadata or metadata files may be searched in order to locate a desired content item. A plurality of metadata fields may be employed to facilitate a search and retrieval process. Also, retrieval of the content items may be facilitated because each of the content items may have been digitized and stored electronically. As such, content items may be delivered electronically in a digital format, as opposed to a time intensive process of delivering content stored on a physical media. [0013]
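The relationship described above, between a metadata record with format and content fields and the content item it describes, can be sketched briefly. The field names and the `find_items` helper below are illustrative assumptions; the patent names example fields (sample rate, bit depth, sync source, movie title, actor information, genre) without fixing a schema.

```python
from dataclasses import dataclass, field

@dataclass
class Metadata:
    """One metadata record describing a stored content item.

    Format fields (sample_rate_hz, bit_depth, sync_source) and content
    fields (movie_title, actors, genre) follow the examples in the text;
    the exact schema is an assumption for this sketch.
    """
    sample_rate_hz: int
    bit_depth: int
    sync_source: str
    movie_title: str
    actors: list = field(default_factory=list)
    genre: str = ""

# The association (virtual link): each metadata record is paired with
# the name of the content item it describes.
catalog = {
    "Metadata-1": ("clip_001.wav",
                   Metadata(48000, 24, "internal", "Jerry Maguire",
                            ["Tom Cruise"], "drama")),
}

def find_items(catalog, **criteria):
    """Search metadata records; return the linked content item names."""
    hits = []
    for _, (content_file, md) in catalog.items():
        if all(getattr(md, k) == v for k, v in criteria.items()):
            hits.append(content_file)
    return hits
```

A search such as `find_items(catalog, movie_title="Jerry Maguire")` locates the content item without inspecting the audio itself, which is the retrieval path the text describes.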
  • It is yet another advantage of embodiments of the invention that media content may be archived during the production of the content. In embodiments of the invention, content items may be archived, and related metadata generated, during the production of the content. Thus, more content may be available. For example, in embodiments of the invention concerning movie audio, the movie audio data may be archived at regular intervals during its production so that multiple versions of the audio data may be available. Also, because the metadata may be generated during the production of the content items, the metadata may be highly accurate and it may contain more information than if it was created after production of the content. For example, when movie audio data is archived after a movie has been completed, some information about the audio data may no longer be available. After a movie has been completed, for example, it may be difficult to discern who is speaking in a particular scene, how a particular sound effect was created, or other like information that may be readily available during the production of the audio data. [0014]
  • Embodiments of the invention may comprise, for example, one or more ingest stations connected to a server with a database installed on it, and a connected storage medium controlled by storage software. The ingest stations may be employed to input and/or digitize content items and store them in the storage medium. The ingest stations may also be employed for the generation of metadata related to each content item stored in the storage medium. The metadata may be generated manually or automatically, and it may be stored in the database installed on the server. Virtual links between each stored content item and related metadata may also be stored in the database installed on the server. [0015]
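The arrangement just described, content items in a storage medium with metadata and virtual links held in a server database, can be sketched as a minimal in-memory model. The class and method names are illustrative, not taken from the patent.

```python
class MediaManagementSystem:
    """Minimal sketch: content items go to a storage medium, while
    metadata and virtual links go to a database, as described above."""

    def __init__(self):
        self.storage = {}    # storage medium: asset name -> content bytes
        self.database = {}   # database: link code -> (asset name, metadata)
        self._next_code = 1

    def ingest(self, asset_name, content, metadata):
        """Store a content item and record a virtual link to its metadata."""
        code = self._next_code
        self._next_code += 1
        self.storage[asset_name] = content
        self.database[code] = (asset_name, metadata)
        return code

    def retrieve(self, code):
        """Follow the virtual link back to the stored content item."""
        asset_name, metadata = self.database[code]
        return self.storage[asset_name], metadata
```

In a real deployment the two dictionaries would correspond to the storage medium under storage-software control and the database installed on the server, respectively.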
  • In embodiments of the invention, ingest stations may also be employed for the retrieval of content items stored in the system. An ingest station may be employed to search metadata files for desired attributes, such as a particular movie, actor, quote, or other desired attributes. If metadata is located which contains the desired attributes, then the virtual links may be employed to locate and retrieve an associated content item. [0016]
  • These and other objects, features, and advantages of embodiments of the invention will be apparent to those skilled in the art from the following detailed description of embodiments of the invention, when read with the drawings and appended claims.[0017]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In the following, embodiments of the invention will be explained in further detail with reference to the drawings, in which: [0018]
  • FIG. 1 is a block diagram illustrating an example hardware environment for embodiments of the invention. [0019]
  • FIG. 2 is a representative view of an example ingest station that may be employed in embodiments of the invention. [0020]
  • FIG. 3 is a diagram showing a portion of a function of an ingest station according to embodiments of the invention. [0021]
  • FIG. 4 shows a table that includes a plurality of example metadata fields that may be employed in embodiments of the invention. [0022]
  • FIG. 5 shows a table which may be employed in embodiments of the invention to manage virtual links between digital media and metadata.[0023]
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • The present invention provides a system and a method for managing and cataloging media, such as content items. In embodiments of the invention, metadata may be generated that corresponds to a particular content item. The metadata may include information about the corresponding content, such as subject matter, format, length, or other like information. The metadata may be generated when a content item is produced or created, or after a content item has been created. When metadata is generated in conjunction with the development or creation of a content item, the metadata may be highly accurate and may include information about the development of the particular content item. [0024]
  • In embodiments of the invention, the generation of metadata in conjunction with the development of content items may facilitate archiving and retrieval of the content items. Once a content item and related metadata have been generated, a coding system may be employed to relate a content item to specific metadata, or to a specific file of metadata. A content item may then be stored either with related metadata or apart from related metadata. A content item may then be retrieved by referencing or searching related metadata. In embodiments of the invention, content items may be stored in a centralized location so that they may be accessed by multiple users or from multiple remote locations, such as for example, by users in different cities or countries. [0025]
  • In some embodiments of the invention, the content items may be audio tracks or audio files associated with recorded video, such as, for example, audio associated with a movie. For example, a movie soundtrack may be treated as a single content item. In such embodiments, the system and process may be employed to manage a plurality of movie soundtracks. In other embodiments, a movie soundtrack may be divided into a plurality of smaller content items, or a plurality of audio files. A movie soundtrack may be divided, for example, based on scenes in the movie, lines of particular characters, fixed time intervals, or based on other manners of division. [0026]
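Of the manners of division mentioned above, division at fixed time intervals is the simplest to sketch. The function below is a hypothetical illustration; division by scene or by character line would use the same pattern with different boundaries.

```python
def divide_soundtrack(samples, sample_rate_hz, interval_s):
    """Divide one long audio sequence into fixed-length content items.

    Each returned chunk would become a separate content item with its
    own metadata; the final chunk may be shorter than the interval.
    """
    chunk = sample_rate_hz * interval_s
    return [samples[i:i + chunk] for i in range(0, len(samples), chunk)]
```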
  • In embodiments concerning audio files, metadata which relates to an audio file may be generated during the development of the audio file (i.e. during the development of a movie soundtrack). In such embodiments, audio files may be archived at various stages during their production, from pre-production to production to post-production. When a content item is archived, metadata may be generated which may include, for example, the title of a movie to which an audio file relates, the language of dialog in an audio file, the run-time of the audio file, version information, or other like attributes. Metadata related to a particular audio file may be stored in a file or table. An audio file may be virtually linked to a file or table containing related metadata by employing a coding mechanism. [0027]
  • In embodiments of the invention, by employing a coding mechanism, an audio file may be retrieved by reference to metadata related to the audio file. For example, in embodiments of the invention concerning movie audio, metadata may include information about a scene in a movie from which a particular audio file originated. In such embodiments, an audio file may be retrieved by searching metadata files for a particular movie scene. In other embodiments, metadata may include, for example, information about characters or actors whose voices are included in a particular audio file. In these embodiments, audio files that contain dialog by a particular movie character may be retrieved by searching the metadata for references to the particular character. [0028]
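The character-based retrieval described above amounts to a membership search over a metadata field. The `characters` field name and the example file names below are assumptions for illustration.

```python
def find_audio_by_character(metadata_files, character):
    """Return audio files whose metadata lists dialog by `character`.

    `metadata_files` maps an audio file name to its metadata dict;
    the 'characters' key is an assumed example field.
    """
    return [audio for audio, md in metadata_files.items()
            if character in md.get("characters", [])]

# Hypothetical metadata for two archived audio files.
example_metadata = {
    "scene12_dialog.wav": {"characters": ["Jerry Maguire"]},
    "scene12_music.wav":  {"characters": []},
}
```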
  • In other embodiments, the content items may be video content, such as video tracks, video files, or other video content. The content items may also be graphics, still photos, animation, or other visual content. [0029]
  • An example hardware environment of embodiments of the invention is illustrated in FIG. 1. The example hardware environment in FIG. 1 includes a plurality of ingest stations 2, 4, and 6; a server 8 with a database 10; and a storage medium 12 with storage software 14. Ingest stations 2-6 may be employed to obtain and digitize content items, such as, for example, audio data. Ingest stations 2-6 may also be employed for the addition or generation of metadata associated with particular content items. [0030]
  • An example hardware environment may employ a TeleScope Enterprise application and implementation provided by North Plains Systems Corporation, located in Mississauga, ON, Canada. In such an embodiment, a TeleScope Database & Application Server may comprise a Sun E420R server, Oracle 8i software, and ADIC AMASS software. In this embodiment, the storage medium 12 may comprise an ADIC Scalar 1000. An ingest station 2-6 may comprise a TeleScope Windows Client with an Oracle ODBC driver and a Fibre NIC card. This example hardware environment is merely an example provided for illustrative purposes. Other suitable hardware environments may be employed with different servers (or without a server), different software applications, and different storage mediums. Indeed, suitable hardware environments may range in scale from a single personal computer to a large, distributed network, and the like. [0031]
  • In the example hardware environment of FIG. 1, ingest stations 2-6 are connected to a server 8, on which a database 10 is installed. In other embodiments, the database 10 may be located remote from the server 8 and be accessible by the server 8 through a suitable communication link. The database 10 may store information in a manner which provides virtual links between content items and related metadata or files of metadata. The relationships between content items and metadata may be stored, for example, in a look-up table format, or the relationships may be stored in another suitable manner. [0032]
  • In the example hardware environment in FIG. 1, the server 8 is connected to a storage medium 12, which is under the control of storage software 14. The storage medium 12 may store content items. The storage software 14 may control the retrieval of the content items stored in the storage medium 12. [0033]
  • The example hardware environment shown in FIG. 1, and the elements included therein, may be varied without straying from the scope or spirit of the invention. The ingest stations 2-6 may comprise workstations, terminals, personal computers, or other suitable devices that may be employed to obtain, and in some embodiments digitize, content items. An example ingest station is illustrated in FIG. 2. [0034]
  • The example ingest station 20 illustrated in FIG. 2 comprises a workstation 21 having a communication link 22 to other parts of the system (such as the server 8 shown in FIG. 1). The workstation shown in FIG. 2 includes an optical drive 24, a floppy drive 25, and a media input device 26, any of which may be employed to obtain, input, and/or digitize content items. The media input device 26 may comprise one or more media playing devices (not shown), such as an audio tape player, a videotape player, a film player/projector, or another suitable device or devices, connected to the workstation for inputting content items into the system. An ingest station may also include other input devices, such as a mouse 27, a keyboard 28, a scanner 29, and other suitable input devices, which an operator may employ to input metadata. [0035]
  • In FIG. 1, the server 8 allows for multiple ingest stations to be part of a single media management system. Other embodiments, however, may employ a single ingest station without a server. In such embodiments, a database may be installed, for example, in the ingest station. In further embodiments, multiple servers and multiple databases may be employed, either in a single location or in multiple locations. [0036]
  • In the example hardware environment in FIG. 1, the storage medium 12 stores the content items. The storage medium 12 may be any suitable storage device or devices, including, but not limited to, disk storage, tape storage, digital cartridge storage, or the like. The storage medium 12 may include a plurality of storage devices of the same type or of different types. In some embodiments, the storage medium 12 may comprise a mass storage facility comprising an array of storage media, such as disks or tapes. The storage medium 12 may be controlled by storage software 14. Storage software 14 may be employed for retrieving content items from the storage medium 12. [0037]
  • Next, a method for managing media according to embodiments of the invention will be described. An embodiment of the invention includes generating metadata during the production or creation of a content item (such as, for example, an audio content item), associating the generated metadata with the related content item, storing the content item and the related metadata, and retrieving the stored content item based at least in part on the metadata related to that content item. In an example embodiment of the invention, the content items being managed may include audio data from a movie or movies. However, embodiments of the invention may include other types of media or content, or audio content from sources other than movies, including, but not limited to, content from such sources as television programs, documentaries, educational or training programs, animated programs, graphical presentations, or the like. [0038]
  • In an example embodiment of the invention, metadata related to movie audio data may be generated during the production of the audio data. In embodiments of the invention, movie audio data may be produced or created in a conventional manner. For example, audio may be developed and recorded in either an analog or a digital format. Audio may be recorded on, for example, a CD/DVD, film, videotape or audiotape, a computer disk, or other suitable physical recording media. Also, audio data may be developed in multiple phases. For example, in the case of movie audio data, actors' voices may be added in a single phase or in multiple phases, sound effects may be added in a separate phase, and background music may be added in yet another phase. In addition, multiple voice tracks may be developed so that a movie may be released in multiple languages. Additional phases may also be employed. It is an advantage of embodiments of the invention that metadata may be generated that is related to audio that may be archived during multiple phases of development of a movie soundtrack. [0039]
  • For example, at selected intervals during the production or creation of movie audio data, audio data may be delivered to an ingest station for archiving and for metadata generation. The selected intervals may vary according to embodiments of the invention. For example, audio content may be archived daily, weekly, at the end of a recording session, at the time of completion of a particular scene or portion of a movie, or at another suitable or desired interval or combination of intervals. By selecting a more frequent interval, more information about the production of audio may be captured. [0040]
  • FIG. 3 illustrates a portion of a method, according to embodiments of the invention, for generating metadata during the production of, for example, audio for a movie. FIG. 3 shows that audio data may be recorded on a CD/DVD 30, film 31, a videotape or audiotape 32, a computer disk 33, or another suitable physical media. The audio data may have been recorded from a live presentation, copied from one or more previous recordings, or the like. At a selected interval, audio content may be presented to an ingest station in a process 34 in FIG. 3. When audio content is produced in a digital format, then the audio content may be input into an ingest station without being digitized. When audio content is created in an analog format, then an ingest station may be employed to digitize the audio content. Ultimately, a content item may be archived in the system as a digitized media item 35 and related metadata 36. [0041]
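The ingest branch just described, digital content taken as-is and analog content digitized first, can be sketched as follows. The `digitize` placeholder merely stands in for real analog-to-digital conversion hardware, and the item structure is an assumption.

```python
def ingest_content(item):
    """Sketch of the ingest step: digital content is input directly,
    while analog content is digitized by the ingest station first."""
    def digitize(analog_signal):
        # Placeholder for A/D conversion: quantize sample values.
        return [round(x) for x in analog_signal]

    if item["format"] == "digital":
        return item["data"]
    return digitize(item["data"])
```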
  • For example, an operator may employ an ingest station to digitize an audio file which is stored on a magnetic audiotape. In such an embodiment, an ingest station may include a personal computer connected to an audiotape player, through a suitable interface. In other embodiments, if an audio file is produced in a digital format, such as on a computer disk, then an operator may employ an ingest station to input the audio data into the system without a further step of digitizing the file. In further embodiments, ingest stations may automatically obtain content items, such as audio files, without the assistance of an operator. For example, an ingest station may include a workstation connected to the Internet or to a suitable network, which automatically obtains or receives audio files submitted or sent over the Internet or over a suitable network. [0042]
  • In an example embodiment illustrated in FIG. 3, an ingest station may be employed to both obtain audio content items and to generate related metadata. As discussed herein, “metadata” refers to data associated with a content item, such as, for example, an audio content item, which defines or describes information about the content item. In an example embodiment, metadata may include a plurality of metadata fields. Each of the plurality of metadata fields may define a separate informational item about a content item. [0043]
  • FIG. 4 shows an example metadata matrix including a plurality of metadata fields employed in one representative embodiment of the invention. The example metadata matrix shown in FIG. 4 includes data relating to a physical asset, data relating to a digital asset, and data relating to hard drive management. A “physical asset” refers to the physical media on which a content item (i.e. audio) was originally recorded and which may be stored (in addition to the digitized content being stored in the system). The “digital asset” fields include detailed information about a digitized version of a content item. These metadata fields may contain important format information such as sample rate, sync source, bit depth, and the like. The “hard drive management” fields may be employed in embodiments in which content items are archived on hard drives. Embodiments of the invention may employ suitable metadata fields other than or in addition to the metadata fields shown in FIG. 4. Example metadata fields for movie audio may include a movie title, a movie scene, a character, an actor, and other like data that may be descriptive of the audio content item. [0044]
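The three groupings of the FIG. 4 matrix can be represented as a nested record. The concrete field names below are examples consistent with the text (sample rate, sync source, bit depth), not the patent's full list.

```python
# Sketch of the FIG. 4 metadata matrix: three groups of fields.
# Field names within each group are illustrative assumptions.
metadata_matrix = {
    "physical_asset": {          # the original recording medium
        "media_type": "audiotape",
        "label": "Reel 12, left channel",
    },
    "digital_asset": {           # the digitized version of the item
        "sample_rate_hz": 48000,
        "bit_depth": 24,
        "sync_source": "internal",
    },
    "hard_drive_management": {   # where the digitized item is archived
        "volume": "archive-01",
        "path": "/audio/reel12.wav",
    },
}
```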
  • In embodiments of the invention, an operator may input metadata into an ingest station or an ingest station may generate metadata automatically. For example, an operator may employ an ingest station as shown in FIG. 2 to manually fill in the metadata fields shown in FIG. 4. An operator may obtain information to place in the metadata fields by, in the case of audio content, for example, listening to the audio. An operator may also input information from a physical media on which an audio content item is recorded. For example, an audiotape may have a label with identifying information or format information that an operator may input as metadata. An operator may also scan such a label into the system so that an image of the label may be included as metadata. A label may describe, for example, the audio data format and whether an audio track is left channel, right channel, surround, or the like. Moreover, in embodiments in which metadata is generated during the production of the content items, an operator may query individuals involved in the production of the content about information that may be included as metadata. Other sources of information may also be available. [0045]
  • In further embodiments, an ingest station may generate metadata automatically. For example, an ingest station may divide a large audio file into a plurality of smaller audio files. In such embodiments, an ingest station may automatically populate certain metadata fields with information that describes the relationship of each of the smaller audio files to the original audio file. In some embodiments, an ingest station may recognize different voices and divide an audio file into a plurality of audio files, with each file containing dialog by a particular speaker. [0046]
  • In such embodiments, an ingest station may include voice-recognition software to determine where to divide a file. An ingest station may then populate metadata fields with appropriate data regarding the speaker, automatically. In such embodiments, an ingest station may run a program that causes the ingest station to perform these operations. For example, a program may cause an ingest station to scroll through an audio file, employing voice recognition software, and pause each time a new voice is recognized. The program may then cause an ingest station to divide the audio file into two files, and then continue to scroll through the remainder of the original file. Each time an audio file is split, the portion which has been scrolled through may be stored as a separate file and metadata fields corresponding to the separate file may be populated with information ascertained by, for example, voice recognition software. In other embodiments, an operator may be prompted to enter metadata at appropriate times. [0047]
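The scroll-and-split procedure above can be sketched as a single pass over audio frames. Here `identify_speaker` stands in for the voice-recognition software the text refers to; the frame representation and return structure are assumptions.

```python
def split_by_speaker(frames, identify_speaker):
    """Scroll through audio frames, starting a new file each time a
    new voice is recognized, and populate a speaker metadata field
    for each resulting file."""
    files = []
    current_speaker = None
    current_frames = []
    for frame in frames:
        speaker = identify_speaker(frame)
        if speaker != current_speaker and current_frames:
            # A new voice was recognized: store the scrolled-through
            # portion as a separate file with its speaker metadata.
            files.append({"speaker": current_speaker,
                          "frames": current_frames})
            current_frames = []
        current_speaker = speaker
        current_frames.append(frame)
    if current_frames:
        files.append({"speaker": current_speaker, "frames": current_frames})
    return files
```

With a toy recognizer that reads the speaker label off each frame, a sequence with two voices splits into one file per contiguous run of a single speaker.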
  • In other embodiments, an ingest station may automatically insert a time and date of archive into metadata fields. In such embodiments, an ingest station may have a program that causes the ingest station to read time and date data from an internal clock and input that information into automatically generated metadata fields. Other types of metadata may be generated automatically by an ingest station without deviating from the scope and spirit of the invention. [0048]
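Automatically populating the time and date of archive from the station's clock might look like the following; the field names are illustrative.

```python
from datetime import datetime, timezone

def archive_timestamp_fields(now=None):
    """Populate automatically generated date/time-of-archive metadata
    fields from the ingest station's internal clock."""
    now = now or datetime.now(timezone.utc)
    return {
        "archive_date": now.strftime("%Y-%m-%d"),
        "archive_time": now.strftime("%H:%M:%S"),
    }
```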
  • After an ingest station has been employed to input and/or digitize a content item, and related metadata has been generated, then a virtual link may be established between the metadata and the content item. An example of a virtual link is shown in FIG. 5. FIG. 5 shows a three-column table which includes a column 50 for a sort code, a column 52 for a name of a digital asset, and a column 54 for a name of a metadata file. The sort code virtually links a content item to a related metadata file. Thus, the sort code facilitates retrieval of a content item with reference to the related metadata file. For example, referring to FIG. 5, if a search for a particular sound effect reveals a description of that sound effect in metadata file Metadata-2, then the sort code (2) virtually links the metadata file Metadata-2 to digital media file SoundEffect1. Further examples of locating content items by searching metadata files are provided below. Embodiments of the invention may employ other means of virtually linking stored content items to metadata files. In further embodiments, a virtual link may not be employed at all where, for example, content items are stored together with related metadata files. [0049]
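The three-column table of FIG. 5 and the Metadata-2 to SoundEffect1 example can be sketched directly. The first and third rows are hypothetical filler; only the SoundEffect1 row comes from the text.

```python
# FIG. 5 as rows of (sort code, digital asset, metadata file).
# Sort code 2 links Metadata-2 to SoundEffect1, per the example above;
# the other rows are hypothetical.
link_table = [
    (1, "Dialog1",      "Metadata-1"),
    (2, "SoundEffect1", "Metadata-2"),
    (3, "Score1",       "Metadata-3"),
]

def asset_for_metadata_file(table, metadata_file):
    """Resolve a metadata file to its digital asset via the sort code row."""
    for sort_code, asset, md_file in table:
        if md_file == metadata_file:
            return asset
    return None
```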
  • In embodiments of the invention, a media management system stores content items, metadata related to content items, and virtual links between them. In an example hardware environment shown in FIG. 1, content items may be stored in a storage medium 12. Metadata files may be stored in a database 10. Virtual links between content items and metadata files may also be stored in the database 10. In further embodiments, metadata and content items may be stored together in a database or in a storage medium, or another suitable storage arrangement may be employed. [0050]
  • In embodiments of the invention, once a content item has been digitized and stored and a virtual link has been established between the content item and related metadata, then retrieval of the content item may be facilitated. In embodiments of the invention, stored content items may be retrieved by employing an ingest station or content items may be retrieved from a remote location. In either embodiment, retrieval of a content item may be accomplished by searching metadata files. The complexity of the search available may be dependent on the complexity of the metadata. [0051]
  • For example, in embodiments of the invention, content items may include entire movie soundtracks. In such embodiments, metadata files may include information about the movie soundtracks, such as the name of the movie, actors appearing in the movie, the movie genre, the year in which the movie was made or released, or other like information. In such embodiments, a search may be conducted for a movie soundtrack that includes a particular actor appearing in a movie in a particular year. For example, a metadata file related to the movie soundtrack for “Jerry Maguire” may include the following example fields: [0052]
    Movie Title        Jerry Maguire
    Year Movie Made    1996
    Directed By        Cameron Crowe
    Actresses Include  Renee Zellweger; Kelly Preston
    Actors Include     Tom Cruise; Cuba Gooding, Jr.; Jonathan Lipnicki
    Characters         Jerry Maguire; Rod Tidwell; Dorothy Boyd; Avery Bishop; Ray Boyd
    Subjects           Sports; Football; Romantic Comedy
    Audio Format       5.1 Discrete; 8-Channel Audio Discrete; 2-Track Stereo; Mono
    Notable Quotes     “Show me the money.”; “You had me at ‘hello.’”
  • In this example embodiment, a search may include a single search parameter of ‘Movie Title=Jerry Maguire’ to return the soundtrack for the movie “Jerry Maguire.” One could also search for the “Jerry Maguire” soundtrack recorded in a particular format by employing multiple search parameters, such as ‘Movie Title=Jerry Maguire’ and ‘Audio Format=2-Track Stereo.’ [0053]
  • Similarly, in the above example, a search including the search parameters ‘Year Movie Made=1996’ and ‘Actors Include=Tom Cruise’ may also yield the soundtrack for the movie “Jerry Maguire.” Also, a search for a movie soundtrack which includes search parameters of ‘Actors Include=Cuba Gooding, Jr.’ and ‘Notable Quotes=“Show me the money.”’ may yield the soundtrack for the movie “Jerry Maguire.” Accordingly, by relating metadata to a content item, such as a movie soundtrack, retrieval of the content item may be facilitated. In the example above, one may retrieve the soundtrack for the movie “Jerry Maguire” even if one's only recollection of the movie is that a character played by Cuba Gooding, Jr. said “Show me the money” or that a character played by Renee Zellweger said, “You had me at ‘hello.’” [0054]
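The multi-parameter searches described above can be sketched as a filter over metadata records. The "Jerry Maguire" record follows the example fields shown earlier; the second record and all identifiers are illustrative assumptions.

```python
# Sketch: retrieval by requiring every search parameter to match a metadata
# record; list-valued fields (actors, quotes) match by membership.
catalog = {
    "Soundtrack-JerryMaguire": {
        "Movie Title": "Jerry Maguire",
        "Year Movie Made": 1996,
        "Actors Include": ["Tom Cruise", "Cuba Gooding, Jr.", "Jonathan Lipnicki"],
        "Notable Quotes": ["Show me the money.", "You had me at 'hello.'"],
    },
    "Soundtrack-Other": {                 # assumed second archived item
        "Movie Title": "Some Other Movie",
        "Year Movie Made": 1996,
        "Actors Include": ["Somebody Else"],
    },
}

def matches(metadata, params):
    for field, wanted in params.items():
        value = metadata.get(field)
        if isinstance(value, list):
            if wanted not in value:       # membership test for multi-value fields
                return False
        elif value != wanted:
            return False
    return True

def search(params):
    return [name for name, md in catalog.items() if matches(md, params)]

print(search({"Year Movie Made": 1996, "Actors Include": "Tom Cruise"}))
# → ['Soundtrack-JerryMaguire']
```

Adding parameters narrows the result, mirroring the ‘Year Movie Made=1996’ and ‘Actors Include=Tom Cruise’ example in the text.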
  • The example metadata fields shown above relating to “Jerry Maguire” are merely examples included for illustrative purposes. Embodiments of the invention may include fewer, different, or additional metadata fields. The richer and more complex the generated metadata is, the more flexible embodiments of a retrieval function may be. Generally, a retrieval function in embodiments of the invention may be made more powerful by including greater amounts of metadata with respect to each stored content item. [0055]
  • In further embodiments of the invention, the content items may include portions of a movie soundtrack. In such embodiments, metadata associated with the portions of a movie soundtrack (content items) may include information pertaining to where the portions of the soundtrack fit into the movie, the subject matter of the particular content item, or other like information. For example, a content item may include only the portion of the “Jerry Maguire” soundtrack in which the character played by Cuba Gooding, Jr. utters the phrase, “Show me the money.” Metadata associated with this content item may identify from which movie scene the content item originated, other characters present during the scene, other dialog that is part of this content item, or other like information. In such embodiments, a soundtrack of a movie, for example, may be archived as a plurality of smaller content items, with each content item having related metadata. [0056]
  • For example, in some embodiments, each sentence or phrase of dialog in a movie may be archived as a separate content item. Similarly, each sound effect in a movie may be archived as a separate content item. In such embodiments, one may locate a particular content item more quickly than in an embodiment in which an entire movie soundtrack is archived as a single content item. One may retrieve, for example, a single quote as discussed above. However, one may also retrieve multiple portions of a movie soundtrack with an appropriate search. In embodiments in which, for example, the soundtrack for “Jerry Maguire” is stored as a plurality of audio content items, a search with parameters of ‘Actors Include=Tom Cruise’ and ‘Movie Title=Jerry Maguire’ may yield content items that include all of the dialog spoken by Tom Cruise in the movie Jerry Maguire. Similarly, an appropriate search may yield a plurality of content items that include an entire conversation from a movie or the dialog of an entire movie scene. [0057]
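The finer-grained archive described above can be sketched as below: each line of dialog is its own content item with its own metadata, so one search returns every matching item rather than a single soundtrack. Item ids, scene numbers, and field names are illustrative assumptions.

```python
# Sketch: per-line content items; a search may yield all dialog spoken by an
# actor in a movie, or every item belonging to a single scene.
items = [
    {"id": "JM-001", "Movie Title": "Jerry Maguire", "Actor": "Tom Cruise", "Scene": 12},
    {"id": "JM-002", "Movie Title": "Jerry Maguire", "Actor": "Tom Cruise", "Scene": 40},
    {"id": "JM-003", "Movie Title": "Jerry Maguire", "Actor": "Cuba Gooding, Jr.", "Scene": 40},
    {"id": "XX-001", "Movie Title": "Another Movie", "Actor": "Tom Cruise", "Scene": 3},
]

def find(criteria):
    """Return the ids of every content item whose metadata matches all criteria."""
    return [item["id"] for item in items
            if all(item.get(k) == v for k, v in criteria.items())]

# All dialog spoken by Tom Cruise in "Jerry Maguire":
print(find({"Movie Title": "Jerry Maguire", "Actor": "Tom Cruise"}))  # → ['JM-001', 'JM-002']
# An entire scene, every speaker included:
print(find({"Movie Title": "Jerry Maguire", "Scene": 40}))            # → ['JM-002', 'JM-003']
```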
  • In further embodiments of the invention, because metadata may be generated during the production of the content items, the metadata may also include information about the production of the content itself. In an example embodiment in which the content items include movie audio data, multiple versions of the audio data may be archived with metadata containing information about the multiple versions. For example, during the production of the audio for the movie “Jerry Maguire,” the expression “Show me the money” may have been recorded several times for a particular scene even though only one version ultimately was mixed into the movie. During production, each of the versions may be archived separately, with related metadata generated to describe the various versions. In such an embodiment, a rich collection of audio data may be available such that a movie or a portion of a movie may be re-mixed. Other embodiments of the invention may archive different versions of audio data for a movie so that the audio data may be re-mixed to omit profanity, to provide dialog in a different language, or for other like purposes. [0058]
  • Also, in embodiments of the invention in which metadata may be generated during the production of a content item, such as during the production of movie audio, the metadata may be more detailed and more accurate than metadata generated after production has been completed. For example, each actor, or any other person in a movie scene (e.g., an extra), may be observed or queried for information such as name, age, height, eye color, and other like information. Similarly, those producing the audio may be observed or queried for information about the production, such as, for example, how a particular sound effect was generated, or other like information. When metadata is generated after production of a content item has been completed, such as, for example, by listening to movie audio or watching a movie, some of these sources of information may be unavailable. [0059]
  • Although example embodiments discussed above concern audio data from the movie “Jerry Maguire” for purposes of illustration, embodiments of the invention may include content items, such as audio data, from a variety of sources, such as other movies, television programs, pilots, commercials, or other like sources of audio data. In such embodiments, a search of all metadata files for information about a particular actor may yield audio content from movies in which the actor has appeared, as well as from television programs, commercials, and other sources of archived audio data pertaining to the particular actor. Similarly, depending on the richness of the metadata generated, a search of all metadata files for information about ‘teacher’ or ‘doctor’ characters, for example, may yield audio data from a plurality of movies, television shows, and the like, of teacher or doctor characters. Such a collection of audio data may be of great value to someone who is casting, for example, a teacher or a doctor. [0060]
  • Further embodiments of the invention may include other types of content items. For example, although example embodiments discussed above include audio data as the content items, video data or a combination of audio data and video data may comprise the content items. In an example embodiment, a feature film may be divided into a plurality of video content items, with or without associated audio content. In such an embodiment, metadata may be generated that describes the production or content of a particular video content item. Accordingly, metadata may be searched to locate video content items showing, for example, city scenes. In further embodiments, more detailed metadata may be searched to locate video content items showing a particular city, such as New York City, or a particular actor in a particular city, such as Tom Cruise in New York City. Further embodiments may include other types of video content or other combinations of audio/video content. [0061]
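The video-content searches described above can be sketched in the same style: video content items carry location and cast metadata, so a search can narrow from all city scenes to a particular actor in a particular city. All clip ids, field names, and values are illustrative assumptions.

```python
# Sketch: searching video content items by setting, city, and actor.
clips = [
    {"id": "clip-1", "setting": "city",  "city": "New York City", "actors": ["Tom Cruise"]},
    {"id": "clip-2", "setting": "city",  "city": "Los Angeles",   "actors": ["Someone Else"]},
    {"id": "clip-3", "setting": "rural", "city": None,            "actors": ["Tom Cruise"]},
]

def city_scenes(city=None, actor=None):
    """Return ids of city-scene clips, optionally narrowed by city and actor."""
    results = []
    for clip in clips:
        if clip["setting"] != "city":
            continue
        if city is not None and clip["city"] != city:
            continue
        if actor is not None and actor not in clip["actors"]:
            continue
        results.append(clip["id"])
    return results

print(city_scenes())                                          # → ['clip-1', 'clip-2']
print(city_scenes(city="New York City", actor="Tom Cruise"))  # → ['clip-1']
```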
  • While particular embodiments of the present invention have been disclosed, it is to be understood that various different modifications and combinations are possible and are contemplated within the true spirit and scope of the appended claims. There is no intention, therefore, to limit the invention to the exact abstract and disclosure herein presented. [0062]

Claims (32)

    What is claimed is:
  1. A method for managing a plurality of content items comprising:
    generating metadata corresponding to each of the plurality of content items when each of the plurality of content items is generated;
    storing the plurality of content items;
    storing the metadata corresponding to each of the plurality of content items;
    associating the metadata with corresponding content items; and
    retrieving stored content items based at least in part on the metadata associated with the content items.
  2. The method for managing a plurality of content items according to claim 1, wherein the metadata corresponding to each of the plurality of content items is a file including a plurality of metadata fields.
  3. The method for managing a plurality of content items according to claim 1, wherein the metadata comprises a plurality of fields of information related to a corresponding content item.
  4. The method for managing a plurality of content items according to claim 1, wherein the plurality of content items includes audio content items and wherein the metadata corresponding to a particular audio content item is generated when the particular audio content item is produced.
  5. The method for managing a plurality of content items according to claim 1, wherein the plurality of content items includes multiple versions of at least one of the plurality of content items.
  6. The method for managing a plurality of content items according to claim 5, wherein metadata is generated for each of the multiple versions of the at least one of the plurality of content items.
  7. The method for managing a plurality of content items according to claim 1, wherein the plurality of content items includes content items generated at selected intervals during production of said content items.
  8. The method for managing a plurality of content items according to claim 1, wherein the plurality of content items includes audio data.
  9. The method for managing a plurality of content items according to claim 1, wherein the plurality of content items includes video data.
  10. The method for managing a plurality of content items according to claim 1, wherein storing the plurality of content items includes digitizing a content item and storing the digitized data.
  11. The method for managing a plurality of content items according to claim 1, wherein generating metadata corresponding to each of the plurality of content items comprises automatically generating metadata.
  12. The method for managing a plurality of content items according to claim 1, wherein associating the metadata with corresponding content items comprises employing a sort code.
  13. The method for managing a plurality of content items according to claim 1, wherein retrieving stored content items comprises:
    searching the metadata; and
    retrieving at least one stored content item based on the results of searching the metadata.
  14. The method for managing a plurality of content items according to claim 1, wherein the metadata corresponding to a particular content item is generated when the particular content item is initially recorded.
  15. The method for managing a plurality of content items according to claim 1, wherein the metadata corresponding to a particular content item is generated in connection with a live recording of the particular content item.
  16. A system for managing a plurality of content items comprising:
    at least one ingest station for inputting a plurality of content items and for generating metadata corresponding to each of the plurality of content items when each of the plurality of content items is generated;
    a first storing medium for storing the plurality of content items;
    a second storing medium for storing the metadata; and
    a device for retrieving one of the plurality of content items based at least in part on the metadata.
  17. The system for managing a plurality of content items according to claim 16, wherein the at least one ingest station comprises:
    a digitizing device for digitizing content items; and
    an inputting device for inputting content items into the system.
  18. The system for managing a plurality of content items according to claim 16, wherein the at least one ingest station further comprises a device for inputting metadata into the system.
  19. The system for managing a plurality of content items according to claim 16, wherein the at least one ingest station comprises the device for retrieving the one of the plurality of content items.
  20. The system for managing a plurality of content items according to claim 16, wherein the metadata corresponding to each of the plurality of content items is stored in a separate file.
  21. The system for managing a plurality of content items according to claim 16, wherein the metadata comprises a plurality of fields of information related to a corresponding content item.
  22. The system for managing a plurality of content items according to claim 16, wherein the plurality of content items includes audio content and wherein the metadata is generated when the audio content is produced.
  23. The system for managing a plurality of content items according to claim 16, wherein the plurality of content items includes multiple versions of at least one of the plurality of content items.
  24. The system for managing a plurality of content items according to claim 23, wherein metadata is generated for each of the multiple versions of the at least one of the plurality of content items.
  25. The system for managing a plurality of content items according to claim 16, wherein the plurality of content items includes content items generated at selected intervals during production of said content items.
  26. The system for managing a plurality of content items according to claim 16, wherein the plurality of content items includes audio data.
  27. The system for managing a plurality of content items according to claim 16, wherein the plurality of content items includes video data.
  28. The system for managing a plurality of content items according to claim 16, wherein the ingest station automatically generates the metadata corresponding to each of the plurality of content items.
  29. The system for managing a plurality of content items according to claim 16, wherein the metadata is associated with the corresponding content items by employing a sort code.
  30. The system for managing a plurality of content items according to claim 16, wherein the device for retrieving one of the plurality of content items based at least in part on the metadata retrieves the one of the plurality of content items by searching the metadata and retrieving at least one stored content item based on the results of searching the metadata.
  31. The system for managing a plurality of content items according to claim 16, wherein the metadata corresponding to a particular content item is generated when the particular content item is initially recorded.
  32. The system for managing a plurality of content items according to claim 16, wherein the metadata corresponding to a particular content item is generated in connection with a live recording of the particular content item.
US10109798 2002-03-29 2002-03-29 Media management system and process Abandoned US20030187820A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10109798 2002-03-29 2002-03-29 Media management system and process

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14227516 (Division, granted as US9348829B2, Active) Media management system and process 2002-03-29 2014-03-27

Publications (1)

Publication Number Publication Date
US20030187820A1 (en) 2003-10-02

Family ID: 28453173

Family Cites Families (106)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4716458A (en) 1987-03-06 1987-12-29 Heitzman Edward F Driver-vehicle behavior display apparatus
US4987552A (en) 1988-02-08 1991-01-22 Fumiko Nakamura Automatic video editing system and method
US4970666A (en) 1988-03-30 1990-11-13 Land Development Laboratory, Inc. Computerized video imaging system for creating a realistic depiction of a simulated object in an actual environment
US5253275A (en) 1991-01-07 1993-10-12 H. Lee Browne Audio and video transmission and receiving system
US7448063B2 (en) 1991-11-25 2008-11-04 Actv, Inc. Digital interactive system for providing full interactivity with live programming events
US5610653A (en) 1992-02-07 1997-03-11 Abecassis; Max Method and system for automatically tracking a zoomed video image
US5823786A (en) 1993-08-24 1998-10-20 Easterbrook; Norman John System for instruction of a pupil
US5815411A (en) 1993-09-10 1998-09-29 Criticom Corporation Electro-optic vision system which exploits position and attitude
US6037936A (en) 1993-09-10 2000-03-14 Criticom Corp. Computer vision system with a graphic user interface and remote camera control
US5398075A (en) 1993-11-19 1995-03-14 Intel Corporation Analog chroma keying on color data
US5577188A (en) 1994-05-31 1996-11-19 Future Labs, Inc. Method to provide for virtual screen overlay
US5966132A (en) 1994-06-17 1999-10-12 Namco Ltd. Three-dimensional image synthesis which represents images differently in multiple three dimensional spaces
US5835667A (en) 1994-10-14 1998-11-10 Carnegie Mellon University Method and apparatus for creating a searchable digital video library and a system and method of using such a library
US5600368A (en) 1994-11-09 1997-02-04 Microsoft Corporation Interactive television system and method for viewer control of multiple camera viewpoints in broadcast programming
US5714997A (en) 1995-01-06 1998-02-03 Anderson; David P. Virtual reality television system
US5930741A (en) 1995-02-28 1999-07-27 Virtual Technologies, Inc. Accurate, rapid, reliable position sensing using multiple sensing technologies
US6061056A (en) 1996-03-04 2000-05-09 Telexis Corporation Television monitoring system with automatic selection of program material of interest and subsequent display under user control
US5689442A (en) 1995-03-22 1997-11-18 Witness Systems, Inc. Event surveillance system
US5729471A (en) 1995-03-31 1998-03-17 The Regents Of The University Of California Machine dynamic selection of one video camera/image of a scene from multiple video cameras/images of the scene in accordance with a particular perspective on the scene, an object in the scene, or an event in the scene
US5673401A (en) 1995-07-31 1997-09-30 Microsoft Corporation Systems and methods for a customizable sprite-based graphical user interface
GB9601101D0 (en) 1995-09-08 1996-03-20 Orad Hi Tech Systems Limited Method and apparatus for automatic electronic replacement of billboards in a video image
US5865624A (en) 1995-11-09 1999-02-02 Hayashigawa; Larry Reactive ride simulator apparatus and method
US6193610B1 (en) 1996-01-05 2001-02-27 William Junkin Trust Interactive television system and methodology
US5860862A (en) 1996-01-05 1999-01-19 William W. Junkin Trust Interactive system allowing real time participation
DE69734496D1 (en) 1996-04-12 2005-12-08 Sony Corp Data decoder and method for decoding data
US5850232A (en) 1996-04-25 1998-12-15 Microsoft Corporation Method and system for flipping images in a window using overlays
US5838310A (en) 1996-05-17 1998-11-17 Matsushita Electric Industrial Co., Ltd. Chroma-key signal generator
DE69736622T2 (en) 1996-07-03 2007-09-13 Hitachi, Ltd. System for motion detection
US6151009A (en) 1996-08-21 2000-11-21 Carnegie Mellon University Method and apparatus for merging real and synthetic images
WO1998011494A1 (en) 1996-09-16 1998-03-19 Advanced Research Solutions, Llc Data correlation and analysis tool
US5878174A (en) 1996-11-12 1999-03-02 Ford Global Technologies, Inc. Method for lens distortion correction of photographic images for texture mapping
US6159016A (en) 1996-12-20 2000-12-12 Lubell; Alan Method and system for producing personal golf lesson video
US6080063A (en) 1997-01-06 2000-06-27 Khosla; Vinod Simulated real time game play with live event
US6178007B1 (en) 1997-01-21 2001-01-23 Xerox Corporation Method for continuous incremental color calibration for color document output terminals
US5900868A (en) 1997-04-01 1999-05-04 Ati International Method and apparatus for multiple channel display
US6044397A (en) 1997-04-07 2000-03-28 At&T Corp System and method for generation and interfacing of bitstreams representing MPEG-coded audiovisual objects
JP3724117B2 (en) 1997-05-23 2005-12-07 ソニー株式会社 Image generating device and image generating method
US6353461B1 (en) 1997-06-13 2002-03-05 Panavision, Inc. Multiple camera video assist control system
US6072504A (en) 1997-06-20 2000-06-06 Lucent Technologies Inc. Method and apparatus for tracking, storing, and synthesizing an animated version of object motion
US6330486B1 (en) 1997-07-16 2001-12-11 Silicon Graphics, Inc. Acoustic perspective in a virtual three-dimensional environment
US6961954B1 (en) 1997-10-27 2005-11-01 The Mitre Corporation Automated segmentation, information extraction, summarization, and presentation of broadcast news
JPH11146325A (en) 1997-11-10 1999-05-28 Hitachi Ltd Video retrieval method, device therefor, video information generating method and storage medium storing its processing program
US6571054B1 (en) 1997-11-10 2003-05-27 Nippon Telegraph And Telephone Corporation Method for creating and utilizing electronic image book and recording medium having recorded therein a program for implementing the method
US20050028194A1 (en) 1998-01-13 2005-02-03 Elenbaas Jan Hermanus Personalized news retrieval system
US6750919B1 (en) 1998-01-23 2004-06-15 Princeton Video Image, Inc. Event linked insertion of indicia into video
US6449540B1 (en) 1998-02-09 2002-09-10 I-Witness, Inc. Vehicle operator performance recorder triggered by detection of external waves
GB9803551D0 (en) 1998-02-20 1998-04-15 Discreet Logic Inc Generating registration data for a virtual set
US6438165B2 (en) 1998-03-09 2002-08-20 Lg Electronics Method and apparatus for advanced encoder system
JP3657424B2 (en) 1998-03-20 2005-06-08 松下電器産業株式会社 Center device and the terminal device to broadcast the program information
US6545705B1 (en) 1998-04-10 2003-04-08 Lynx System Developers, Inc. Camera with object recognition/data output
JP3745117B2 (en) 1998-05-08 2006-02-15 キヤノン株式会社 Image processing apparatus and image processing method
US6674461B1 (en) 1998-07-07 2004-01-06 Matthew H. Klapman Extended view morphing
US6714909B1 (en) 1998-08-13 2004-03-30 At&T Corp. System and method for automated multimedia content indexing and retrieval
US6144375A (en) 1998-08-14 2000-11-07 Praja Inc. Multi-perspective viewer for content-based interactivity
US6833865B1 (en) 1998-09-01 2004-12-21 Virage, Inc. Embedded metadata engines in digital capture devices
US6266100B1 (en) 1998-09-04 2001-07-24 Sportvision, Inc. System for enhancing a video presentation of a live event
US6229550B1 (en) 1998-09-04 2001-05-08 Sportvision, Inc. Blending a graphic
US6133962A (en) 1998-10-30 2000-10-17 Sony Corporation Electronic program guide having different modes for viewing
US6466205B2 (en) 1998-11-19 2002-10-15 Push Entertainment, Inc. System and method for creating 3D models from 2D sequential image data
US6525780B1 (en) 1998-12-18 2003-02-25 Symah Vision, Sa “Midlink” virtual insertion system
US6760916B2 (en) 2000-01-14 2004-07-06 Parkervision, Inc. Method, system and computer program product for producing and distributing enhanced media downstreams
US7211000B2 (en) 1998-12-22 2007-05-01 Intel Corporation Gaming utilizing actual telemetry data
US6720990B1 (en) 1998-12-28 2004-04-13 Walker Digital, Llc Internet surveillance system and method
US6282317B1 (en) 1998-12-31 2001-08-28 Eastman Kodak Company Method for automatic determination of main subjects in photographic images
US6825875B1 (en) 1999-01-05 2004-11-30 Interval Research Corporation Hybrid recording unit including portable video recorder and auxillary device
US6593936B1 (en) 1999-02-01 2003-07-15 At&T Corp. Synthetic audiovisual description scheme, method and system for MPEG-7
US6236395B1 (en) 1999-02-01 2001-05-22 Sharp Laboratories Of America, Inc. Audiovisual information management system
US6545601B1 (en) 1999-02-25 2003-04-08 David A. Monroe Ground based security surveillance system for aircraft and other commercial vehicles
US6295115B1 (en) 1999-03-01 2001-09-25 Hewlett-Packard Company Producing an optimized color image from a negative image and its developed print image
CN101242515B (en) 1999-03-30 2013-03-13 TiVo Inc. Multimedia program bookmarking system and method
US6466275B1 (en) 1999-04-16 2002-10-15 Sportvision, Inc. Enhancing a video of an event at a remote location using data acquired at the event
US6378132B1 (en) 1999-05-20 2002-04-23 Avid Sports, Llc Signal capture and distribution system
US7313808B1 (en) 1999-07-08 2007-12-25 Microsoft Corporation Browsing continuous multimedia content
US6707456B1 (en) 1999-08-03 2004-03-16 Sony Corporation Declarative markup for scoring multiple time-based assets and events within a scene composition system
US6408257B1 (en) 1999-08-31 2002-06-18 Xerox Corporation Augmented-reality display method and system
GB9921235D0 (en) 1999-09-08 1999-11-10 Sony UK Ltd A system and method for navigating through source content
US7369130B2 (en) 1999-10-29 2008-05-06 Hitachi Kokusai Electric Inc. Method and apparatus for editing image data, and computer program product of editing image data
US7000245B1 (en) 1999-10-29 2006-02-14 Opentv, Inc. System and method for recording pushed data
US7016540B1 (en) 1999-11-24 2006-03-21 Nec Corporation Method and system for segmentation, classification, and summarization of video images
US7478108B2 (en) 1999-12-06 2009-01-13 Micro Strain, Inc. Data collection using sensing units and separate control units with all power derived from the control units
US7015978B2 (en) 1999-12-13 2006-03-21 Princeton Video Image, Inc. System and method for real time insertion into video with occlusion on areas containing multiple colors
US6868440B1 (en) 2000-02-04 2005-03-15 Microsoft Corporation Multi-level skimming of multimedia content using playlists
US6792321B2 (en) 2000-03-02 2004-09-14 Electro Standards Laboratories Remote web-based control
JP4095227B2 (en) 2000-03-13 2008-06-04 Konami Digital Entertainment Co., Ltd. Video game device, method for setting background sound output in a video game, and computer-readable recording medium storing a background sound output setting program
US6771272B2 (en) 2000-03-17 2004-08-03 Sun Microsystems, Inc. Graphics system having a super-sampled sample buffer with hot spot correction
US6535114B1 (en) 2000-03-22 2003-03-18 Toyota Jidosha Kabushiki Kaisha Method and apparatus for environment recognition
US20020016971A1 (en) 2000-03-31 2002-02-07 Berezowski David M. Personal video recording system with home surveillance feed
US20020010928A1 (en) 2000-04-24 2002-01-24 Ranjit Sahota Method and system for integrating internet advertising with television commercials
US6359585B1 (en) 2000-05-22 2002-03-19 Rockwell Collins, Inc. Apparatus and method of determining an orientation of a GPS receiver
US6882793B1 (en) 2000-06-16 2005-04-19 Yesvideo, Inc. Video processing system
US6810397B1 (en) 2000-06-29 2004-10-26 Intel Corporation Collecting event data and describing events
US7624337B2 (en) * 2000-07-24 2009-11-24 Vmark, Inc. System and method for indexing, searching, identifying, and editing portions of electronic multimedia files
US6850250B2 (en) 2000-08-29 2005-02-01 Sony Corporation Method and apparatus for a declarative representation of distortion correction for add-on graphics in broadcast video
US20020152462A1 (en) 2000-08-29 2002-10-17 Michael Hoch Method and apparatus for a frame work for structured overlay of real time graphics
US6791574B2 (en) 2000-08-29 2004-09-14 Sony Electronics Inc. Method and apparatus for optimized distortion correction for add-on graphics for real time video
JP3479685B2 (en) 2000-10-24 2003-12-15 President of Niigata University Anisotropy analysis method and anisotropy analysis apparatus
US20020064764A1 (en) 2000-11-29 2002-05-30 Fishman Lewis R. Multimedia analysis system and method of use therefor
US6537076B2 (en) 2001-02-16 2003-03-25 Golftec Enterprises Llc Method and system for presenting information for physical motion analysis
US20020115047A1 (en) 2001-02-16 2002-08-22 Golftec, Inc. Method and system for marking content for physical motion analysis
US20020170068A1 (en) 2001-03-19 2002-11-14 Rafey Richter A. Virtual and condensed television programs
JP4620889B2 (en) 2001-03-22 2011-01-26 Mitsubishi Electric Corporation Power semiconductor device
US6810146B2 (en) 2001-06-01 2004-10-26 Eastman Kodak Company Method and system for segmenting and identifying events in images using spoken annotations
US7203693B2 (en) 2001-06-12 2007-04-10 Lucent Technologies Inc. Instantly indexed databases for multimedia content analysis and retrieval
US6990681B2 (en) 2001-08-09 2006-01-24 Sony Corporation Enhancing broadcast of an event with synthetic scene using a depth map
US6860806B2 (en) 2001-10-23 2005-03-01 Teletech Co., Ltd. Virtual horseracing system
US6778085B2 (en) 2002-07-08 2004-08-17 James Otis Faulkner Security system and method with realtime imagery

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5469560A (en) * 1991-04-10 1995-11-21 International Business Machines Corporation Prioritizing pending read requests in an automated storage library
US6415280B1 (en) * 1995-04-11 2002-07-02 Kinetech, Inc. Identifying and requesting data in network using identifiers which are based on contents of data
US6366914B1 (en) * 1997-08-08 2002-04-02 Qorvis Media Group, Inc. Audiovisual content distribution system
US6260040B1 (en) * 1998-01-05 2001-07-10 International Business Machines Corporation Shared file system for digital content
US6330572B1 (en) * 1998-07-15 2001-12-11 Imation Corp. Hierarchical data storage management
US6341290B1 (en) * 1999-05-28 2002-01-22 Electronic Data Systems Corporation Method and system for automating the communication of business information
US6778978B1 (en) * 1999-09-17 2004-08-17 International Business Machines Corporation Determining a workbasket identification for an item in a data store
US6782394B1 (en) * 1999-11-22 2004-08-24 Oracle International Corporation Representing object metadata in a relational database system
US7305384B2 (en) * 1999-12-16 2007-12-04 Microsoft Corporation Live presentation searching
US6484199B2 (en) * 2000-01-24 2002-11-19 Friskit Inc. Streaming media search and playback system for continuous playback of media resources through a network
US6760721B1 (en) * 2000-04-14 2004-07-06 Realnetworks, Inc. System and method of managing metadata data
US6640217B1 (en) * 2000-09-19 2003-10-28 Bocada, Inc. Method for extracting and storing records of data backup activity from a plurality of backup devices
US6647396B2 (en) * 2000-12-28 2003-11-11 Trilogy Development Group, Inc. Classification based content management system
US20040015441A1 (en) * 2001-03-23 2004-01-22 Munetake Ebihara Information processing apparatus
US6728849B2 (en) * 2001-12-14 2004-04-27 Hitachi, Ltd. Remote storage system and method
US20030163457A1 (en) * 2002-02-28 2003-08-28 Hitachi, Ltd. Storage system

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7797331B2 (en) * 2002-12-20 2010-09-14 Nokia Corporation Method and device for organizing user provided information with meta-information
US8612473B2 (en) * 2002-12-20 2013-12-17 Nokia Corporation Method and device for organizing user provided information with meta-information
US20060217990A1 (en) * 2002-12-20 2006-09-28 Wolfgang Theimer Method and device for organizing user provided information with meta-information
US20110060754A1 (en) * 2002-12-20 2011-03-10 Wolfgang Theimer Method and device for organizing user provided information with meta-information
US20050105374A1 (en) * 2003-11-17 2005-05-19 Nokia Corporation Media diary application for use with digital device
US7109848B2 (en) 2003-11-17 2006-09-19 Nokia Corporation Applications and methods for providing a reminder or an alert to a digital media capture device
US20050105396A1 (en) * 2003-11-17 2005-05-19 Nokia Corporation Applications and methods for providing a reminder or an alert to a digital media capture device
US8990255B2 (en) 2003-11-17 2015-03-24 Nokia Corporation Time bar navigation in a media diary application
US20050108234A1 (en) * 2003-11-17 2005-05-19 Nokia Corporation Speed browsing of media items in a media diary application
US8010579B2 (en) 2003-11-17 2011-08-30 Nokia Corporation Bookmarking and annotating in a media diary application
US7774718B2 (en) 2003-12-17 2010-08-10 Nokia Corporation Time handle in a media diary application for accessing media files
US20110173196A1 (en) * 2005-09-02 2011-07-14 Thomson Licensing Inc. Automatic metadata extraction and metadata controlled production process
US9420231B2 (en) * 2005-09-02 2016-08-16 Gvbb Holdings S.A.R.L. Automatic metadata extraction and metadata controlled production process
EP1932391A2 (en) * 2005-09-16 2008-06-18 Sony Electronics, Inc. Method and apparatus for audio data analysis in an audio player
EP1932391A4 (en) * 2005-09-16 2011-04-20 Sony Electronics Inc Method and apparatus for audio data analysis in an audio player
US20100286806A1 (en) * 2005-09-16 2010-11-11 Sony Corporation, A Japanese Corporation Device and methods for audio data analysis in an audio player
US8184260B2 (en) * 2006-02-15 2012-05-22 Thomson Licensing Non-linear, digital dailies
US20090079864A1 (en) * 2006-02-15 2009-03-26 Terry Scott Brown Non-Linear, Digital Dailies
EP1985115A2 (en) * 2006-02-15 2008-10-29 Thomson Licensing Non-linear, digital dailies
US20080046240A1 (en) * 2006-08-17 2008-02-21 Anchorfree, Inc. Software web crowler and method therefor
US7693872B2 (en) * 2006-08-17 2010-04-06 Anchorfree, Inc. Software web crowler and method therefor
US20110122527A1 (en) * 2009-11-25 2011-05-26 International Business Machines Corporation Storing and Locating a Self-Describing Storage Cartridge
US20150310894A1 (en) * 2014-04-23 2015-10-29 Daniel Stieglitz Automated video logging methods and systems
US9583149B2 (en) * 2014-04-23 2017-02-28 Daniel Stieglitz Automated video logging methods and systems

Also Published As

Publication number Publication date Type
US9348829B2 (en) 2016-05-24 grant
US20140214907A1 (en) 2014-07-31 application

Similar Documents

Publication Publication Date Title
US5715400A (en) System and method for providing merchant information and establishing links to merchants while presenting a movie
US6011895A (en) Keyword responsive variable content video program
US6243725B1 (en) List building system
US7096234B2 (en) Methods and systems for providing playlists
US7220910B2 (en) Methods and systems for per persona processing media content-associated metadata
US6941324B2 (en) Methods and systems for processing playlists
US7054547B1 (en) Disc having a segment code for prohibiting a play control function during a playing of a video segment
US6061758A (en) System and method for managing storage and retrieval of media data including dynamic linkage of media data files to clips of the media data
Christel et al. Evolving video skims into useful multimedia abstractions
US6760884B1 (en) Interactive memory archive
Mauthe et al. Professional content management systems: handling digital media assets
Dimitrova et al. Applications of video-content analysis and retrieval
US5559949A (en) Computer program product and program storage device for linking and presenting movies with their underlying source information
Tseng et al. Using MPEG-7 and MPEG-21 for personalizing video
Rowe et al. Indexes for user access to large video databases
US20020108127A1 (en) Low bandwidth transmission
US7506262B2 (en) User interface for creating viewing and temporally positioning annotations for media content
Ponceleon et al. Key to effective video retrieval: effective cataloging and browsing
US6944611B2 (en) Method and apparatus for digital media management, retrieval, and collaboration
US20050015713A1 (en) Aggregating metadata for media content from multiple devices
US20080147558A1 (en) Method and system for providing prospective licensees and/or purchasers with access to licensable media content
US5781730A (en) System and method for enabling the creation of personalized movie presentations and personalized movie collections
Brown et al. Automatic content-based retrieval of broadcast news
US5852435A (en) Digital multimedia editing and data management system
US5664227A (en) System and method for skimming digital audio/video data

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY PICTURES ENTERTAINMENT, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KOHUT, MICHAEL;GOODMAN, LARRY;KOFFMAN, MARK;AND OTHERS;REEL/FRAME:014902/0547;SIGNING DATES FROM 20020604 TO 20040113

Owner name: SONY CORPORATION (TOKYO, JAPAN), CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KOHUT, MICHAEL;GOODMAN, LARRY;KOFFMAN, MARK;AND OTHERS;REEL/FRAME:014902/0547;SIGNING DATES FROM 20020604 TO 20040113