US7636090B2 - Apparatus and method for storing a movie within a movie - Google Patents

Apparatus and method for storing a movie within a movie

Info

Publication number
US7636090B2
US7636090B2 · US11/497,183 · US49718306A
Authority
US
United States
Prior art keywords
media
movie
sequence
sequences
media sequences
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US11/497,183
Other versions
US20060262122A1 (en)
Inventor
Peter Hoddie
James D. Batson
Sean Michael Callahan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc
Priority to US11/497,183
Publication of US20060262122A1
Assigned to Apple Inc. (change of name; see document for details). Assignor: Apple Computer, Inc., a California corporation
Application granted
Publication of US7636090B2
Adjusted expiration
Legal status: Expired - Fee Related

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34Indicating arrangements 
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234318Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by decomposing into objects, e.g. MPEG-4 objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/23608Remultiplexing multiplex streams, e.g. involving modifying time stamps or remapping the packet identifiers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43072Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4344Remultiplexing of multiplex streams, e.g. by modifying time stamps or remapping the packet identifiers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/44029Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display for generating different versions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • H04N21/8453Structuring of content, e.g. decomposing content into time segments by locking or enabling a set of features, e.g. optional functionalities in an executable program

Definitions

  • the present invention relates to a hierarchical movie structure, and more specifically, to a structure for embedding a movie within another movie.
  • a movie generally consists of a set of tracks slaved to a movie clock.
  • the array of tracks includes a video track and an audio track.
  • a video track consists of a sequence of samples of video data.
  • An audio track is a sequence of samples of audio data.
  • movies may also include tracks which store other types of information.
  • a movie may also include a text track that contains text for subtitles, a music track separate from the main audio track, and a time code track.
  • FIG. 1 illustrates a typical movie 100 .
  • Movie 100 includes a video track 122 that includes a sequence of video samples 102 , 104 , 106 , 108 and 110 .
  • Movie 100 also includes a sound track 124 that includes a sequence of audio samples 112 , 114 , 116 , 118 and 120 .
  • playback of all of the tracks of the movie 100 is synchronized based on a movie clock. For example, at a time T 1 of the movie clock, video sample 102 is being displayed and audio sample 112 is being played. At a time T 2 of the movie clock, video sample 104 is being displayed and audio sample 114 is being played. The audio and video samples that are being played at any given time on the movie clock remain the same for each performance of movie 100 .
  • When one track of a movie is edited, some of the other tracks of the movie may also have to be edited.
  • For example, consider a movie in which a character is giving a speech with the national anthem playing in the background. If one wishes to delete a portion of the speech from the movie, the corresponding sequence of video must be cut from the video track, the corresponding sequence of audio must be cut from the audio track, and the corresponding text sequence must be cut from the text track.
  • However, the corresponding music track should not be cut, because the background music should continue to play through the edit.
  • a hierarchical movie is provided.
  • a hierarchical movie is a movie that contains one or more embedded movies. Embedded movies may themselves contain embedded movies. Each movie contains zero or more media sequences. Within a hierarchical movie, media sequences that should be edited together may be grouped together using embedded movies. The media sequences of a hierarchical movie may be sequenced during playback based on a different time coordinate system than the time coordinate system that governs any embedded movies. This allows a movie to contain both time-based and time-independent media sequences. Also, the relative timing of events in the movie may vary from performance to performance.
  • the hierarchical movie structure allows movies to be used as user interface controls, and even as field-sensitive databases.
  • a hierarchical media container includes a first set of media sequences.
  • One media sequence in the first set of media sequences contains an embedded media container including a second set of media sequences.
  • the hierarchical media container may or may not have the same relationship to time as the embedded media container.
  • both the hierarchical media container and the embedded media container are time-based media containers.
  • the hierarchical media container is a time-independent media container and the embedded media container is a time-based media container.
  • the hierarchical media container is a time-based media container and the embedded media container is a time-independent media container.
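The container/sequence/sample relationships described above can be sketched as a small data model. This is an illustrative Python sketch of the concept only, not the patent's actual implementation; all class and field names are invented:

```python
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class Sample:
    # A sample holds either raw media data or an embedded media container.
    payload: Union[bytes, "MediaContainer"]

    def is_embedded_movie(self) -> bool:
        return isinstance(self.payload, MediaContainer)

@dataclass
class MediaSequence:
    # An ordered sequence of samples (e.g. a video, sound, or text track).
    name: str
    samples: List[Sample] = field(default_factory=list)

@dataclass
class MediaContainer:
    # A movie: zero or more media sequences sharing a time coordinate system.
    sequences: List[MediaSequence] = field(default_factory=list)
```

Mirroring FIG. 3 a, a container holding ordinary tracks plus a track whose sample embeds another container would be built by nesting `MediaContainer` instances.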
  • a method for providing a “control movie” is provided.
  • a user may select a parameter value by interacting with a movie.
  • the method may be used in a computer system that includes a display device.
  • the method includes a step for providing a media container that includes a media sequence of visual data.
  • the media sequence includes a plurality of samples.
  • the plurality of samples includes a set of samples. Each sample in the set is associated with a value.
  • the method also includes steps for determining a current sample from the set of samples, establishing the value associated with the current sample as the parameter value, and displaying on the display device the image represented by the current sample.
  • the method also includes the steps of receiving input from the user specifying a sequencing direction, determining a next sample, the next sample being a sample of the set of samples located in the sequencing direction relative to the current sample, establishing the value associated with the next sample as the parameter value, and displaying on the display device the image represented by the next sample.
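The control-movie steps above can be sketched as follows. This hedged Python illustration assumes each sample pairs a displayed image with a value; the class and its names are hypothetical, not from the patent:

```python
class ControlMovie:
    """Sketch of a control movie: each sample pairs a displayed image
    with a value, and the parameter value tracks the current sample."""

    def __init__(self, samples):
        self.samples = samples  # list of (image, value) pairs
        self.index = 0          # the current sample

    @property
    def parameter_value(self):
        # The value associated with the current sample is the parameter value.
        return self.samples[self.index][1]

    def current_image(self):
        # The image that would be shown on the display device.
        return self.samples[self.index][0]

    def step(self, direction):
        # direction: +1 for "forward", -1 for "backward"; clamps at the ends.
        self.index = max(0, min(len(self.samples) - 1, self.index + direction))
        return self.parameter_value
```

A volume knob rendered as a movie, for instance, would map each frame of the knob at a given position to the corresponding volume value.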
  • a method for editing a movie is provided.
  • a first plurality of media sequences of the movie is stored in a first container.
  • a second plurality of media sequences of the movie is stored in a second container.
  • One of the first container and the second container is embedded in the other of the first container and the second container.
  • An edit of a media sequence of the second plurality of media sequences is received from a user. All media sequences of the second plurality of media sequences are automatically edited responsive to receiving the edit.
  • a method for playing a movie is provided. According to the method, a first plurality of samples of a first media sequence are sequentially played. The first media sequence is stored in a first media container. A second plurality of samples of a second media sequence is also sequentially played. The second media sequence is stored in a second media container. The second media container is embedded within the first media container.
  • the step of sequentially playing the first plurality of samples may be performed responsive to a first time coordinate system, while the step of sequentially playing the second plurality of samples is performed responsive to a second time coordinate system, where the first time coordinate system is different from the second time coordinate system.
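The two-clock playback just described can be illustrated with a small sketch that maps master-clock time into a track's local time coordinate system. The function name and the `rate` and `offset` parameters are assumptions for illustration, not terms from the patent:

```python
def sample_at(master_time, fps, rate=1.0, offset=0.0):
    """Index of the sample active at master_time for a track whose local
    clock runs at `rate` times the master clock, starting at `offset`
    seconds of master time. Returns None before the track starts."""
    local_time = (master_time - offset) * rate
    if local_time < 0:
        return None  # the track's active interval has not begun
    return int(local_time * fps)
```

A containing movie's track would use `rate=1.0`, while an embedded movie slaved to a clock running twice as fast would use `rate=2.0` and so reach later samples at the same master time.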
  • FIG. 1 illustrates a movie structure used in the prior art
  • FIG. 2 illustrates a computer system upon which the present invention may be implemented
  • FIG. 3 a illustrates a hierarchical movie structure according to an embodiment of the invention
  • FIG. 3 b illustrates a hierarchical movie structure that has more complex hierarchical relationships than that illustrated in FIG. 3 a;
  • FIG. 4 illustrates a movie generated based on data stored in a hierarchical movie structure where a time-based movie is embedded in a time-based movie;
  • FIG. 5 illustrates a movie generated based on data stored in a hierarchical movie structure where a time-independent movie is embedded within a time-based movie
  • FIG. 6 illustrates a time-based movie generated based on a hierarchical movie-structure where a time-based movie is embedded in a time-independent movie
  • FIG. 7 illustrates a movie sequence for a control movie according to an embodiment of the invention
  • FIG. 8 illustrates a movie generated based on a media container with an embedded control movie
  • FIG. 9 a illustrates a media container for a field-insensitive database movie
  • FIG. 9 b illustrates a database movie generated responsive to the media container shown in FIG. 9 a;
  • FIG. 10 a illustrates a media container for a field-sensitive database movie
  • FIG. 10 b illustrates a database movie generated responsive to the media container shown in FIG. 10 a.
  • Computer system 200 comprises a bus or other communication means 201 for communicating information, and a processing means 202 coupled with bus 201 for processing information.
  • System 200 further comprises a random access memory (RAM) or other dynamic storage device 204 (referred to as main memory), coupled to bus 201 for storing information and instructions to be executed by processor 202 .
  • Main memory 204 also may be used for storing temporary variables or other intermediate information during execution of instructions by processor 202 .
  • Computer system 200 also comprises a read only memory (ROM) and/or other static storage device 206 coupled to bus 201 for storing static information and instructions for processor 202 .
  • a data storage device 207 such as a magnetic disk or optical disk and its corresponding disk drive can be coupled to computer system 200 .
  • Computer system 200 can also be coupled via bus 201 to a display device 221 , such as a cathode ray tube (CRT), for displaying information to a computer user.
  • An alphanumeric input device 222 is typically coupled to bus 201 for communicating information and command selections to processor 202 .
  • Another type of user input device is cursor control 223 , such as a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to processor 202 and for controlling cursor movement on display device 221 .
  • This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), which allows the device to specify positions in a plane.
  • a stylus or pen can be used to interact with the display.
  • a displayed object on a computer screen can be selected by using a stylus or pen to touch the displayed object.
  • the computer detects the selection by implementing a touch sensitive screen.
  • a light pen and a light sensitive screen can be used for selecting a displayed object.
  • Such devices may thus detect the selection position and the selection itself as a single operation, instead of the two-step “point and click” of a system incorporating a mouse or trackball.
  • Stylus and pen based input devices as well as touch and light sensitive screens are well known in the art.
  • Such a system may also lack a keyboard such as 222 , wherein all interaction is provided via the stylus as a writing instrument (like a pen) and the written text is interpreted using optical character recognition (OCR) techniques.
  • Hard copy device 224 may be used for printing instructions, data, or other information on a medium such as paper, film, or similar types of media.
  • computer system 200 can be coupled to a device for audio playback 225 such as a speaker. Further, the device may include a speaker which is coupled to a digital to analog (D/A) converter for playing back the digitized sounds.
  • computer system 200 can be a terminal in a computer network (e.g., a LAN).
  • computer system 200 is one of the Macintosh® family of personal computers such as the Macintosh® II manufactured by Apple® Computer, Inc. of Cupertino, Calif. (Apple and Macintosh are registered trademarks of Apple Computer, Inc.).
  • the present invention is related to the use of a computer system 200 to create, store, and play back movies that contain other movies.
  • a video track for example, is a media sequence in which each sample contains video data representing an image.
  • a sound track is a media sequence in which each sample contains audio data representing sound.
  • a QuickTime movie is a media container in that it stores multiple media sequences, such as video tracks, audio tracks, sound tracks, text tracks, etc. QuickTime movies are described in detail in Inside Macintosh: QuickTime by Apple Computer Inc., published by Addison-Wesley Publishing Company (1993). All of the media sequences that belong to a media container are sequenced according to a common time coordinate system.
  • The term “embedded movie” refers to a media container that is a logical component of another media container.
  • The term “containing movie” refers to the media container of which an embedded movie is a logical component.
  • the media sequences of an embedded movie are not necessarily sequenced according to the same time coordinate system as the media sequences that belong to the containing movie.
  • the embedded relationship is a logical relationship, not a physical relationship. Therefore, the data that represents an embedded movie is not necessarily located in the same physical file or even on the same physical storage device as other data “contained in” the containing movie. For example, a movie A stored in a file A located on a storage medium A and a movie B that is stored in a file B located on a storage medium B may both be embedded in a movie C that also includes tracks D, E and F that are stored in a file G on a storage medium H.
  • control data structures are used to reflect the logical relationship between files.
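The logical (non-physical) embedding described above can be sketched as a reference-and-catalog scheme, in which the containing movie records only where an embedded movie's data lives. The `MovieReference` type, its fields, and the resolver below are hypothetical illustrations, not the patent's control data structures:

```python
from dataclasses import dataclass
from typing import Dict

@dataclass(frozen=True)
class MovieReference:
    # Where an embedded movie's data physically lives.
    storage: str  # e.g. a volume or device identifier
    path: str     # location of the movie data on that storage

def resolve(ref: MovieReference, catalog: Dict[MovieReference, bytes]) -> bytes:
    # Fetch the embedded movie's data from wherever it is stored; the
    # containing movie itself holds only the reference.
    return catalog[ref]
```

Under this sketch, movies A and B stored on different media can both be embedded in movie C, which holds two `MovieReference` values rather than the movie data itself.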
  • media sequences progress from sample to sample based on the occurrence of an event.
  • When the event that drives sequencing is the passage of time in a time coordinate system, the media sequences are referred to as “time-based” media sequences.
  • a video track is an example of a time-based media sequence.
  • a sample in a video media sequence is displayed for a set time interval. After the time interval expires, the next sample in the video media sequence is displayed. This process continues until all of the samples in the video media sequence have been displayed.
  • the time interval may be modified to speed up or slow down playback, but the playback timing is still driven by the passage of time.
  • When the event that drives sequencing is something other than the passage of time, the media sequence is referred to as a “time-independent” media sequence.
  • For example, consider a media sequence in which each sample contains the text of a page in a novel. During playback, the page represented in a sample should be displayed until the reader has completed reading the page. Since reading speeds vary greatly, the playback mechanism should not display the page associated with the next sample until the reader indicates a desire to turn the page. Thus, a mechanism may be provided to the user through which the user may initiate an event to move to the next page. For example, a user may operate a mouse or other pointing device to click on a “Turn Page” button to cause the playback mechanism to sequence to the next sample.
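The page-in-a-novel example can be sketched as follows. This is an illustrative Python model of a time-independent sequence, with invented names, not the patent's implementation:

```python
class PageTrack:
    """Sketch of a time-independent media sequence: one sample per page,
    advanced only by an explicit reader event, never by a clock."""

    def __init__(self, pages):
        self.pages = pages  # one text sample per page
        self.index = 0

    def current_page(self):
        return self.pages[self.index]

    def turn_page(self):
        # Sequence to the next sample only when the reader asks; the
        # last page simply stays displayed.
        if self.index < len(self.pages) - 1:
            self.index += 1
        return self.current_page()
```

No time interval appears anywhere in the model: however long the reader lingers, the displayed sample changes only on the `turn_page` event.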
  • a media container is “slaved” to a clock if the clock determines when the media sequences that belong to the media container progress from one sample to the next. All of the media sequences in a typical movie are slaved to the same clock (the “movie clock”) to ensure that the media sequences remain synchronized during playback.
  • a media container is “independent” of a clock if the media sequences within the media container are sequenced based on an event other than the passage of time on the clock. For example, a media container is independent of a clock if playback of the media sequences within the media container may be slowed without slowing the clock.
  • a sequencing direction is the direction in which a media sequence is played relative to the order of the samples. Because media sequences are “ordered”, all media sequences have at least two possible sequencing directions. For the purposes of discussion, these two sequencing directions will be referred to as “forward” and “backward”. However, it should be understood that “forward” does not necessarily mean the “normal” or “typical” direction, since some applications may process media sequences in one direction, other applications may process media sequences in the other direction, and yet other applications may process sequences in either or both directions.
  • the “active interval” of a sample is the time interval during which the sample may be played.
  • the active interval for the first video sample in a 30-frame per second movie is the first 1/30 second of playback.
  • the mechanism for playing a movie is implemented through a series of instructions executed on processor 202 .
  • the series of instructions may be stored on storage device 207 .
  • the instructions are copied from storage device 207 into memory 204 , and then accessed and executed by processor 202 .
  • the samples of the media sequences of a movie are processed by processor 202 responsive to the series of instructions. Specifically, processor 202 causes the samples to be “played”.
  • the particular steps for playing a sample depend on the nature of the data within the sample. For example, a sample of video data is “played” by causing the image represented in the sample to be displayed on display device 221 . Samples containing audio data are played by generating the sound represented in the audio sample. Sound may be generated, for example, on speaker 225 .
  • Processor 202 sequences through the movie responsive to the series of instructions.
  • the series of instructions may cause processor 202 to sequence through the movie responsive to the passage of time and/or the occurrence of another type of event.
  • An event which causes processor 202 to sequence to the next sample in a media sequence may be a user-actuated event, such as the selection of a key on keyboard 222 , or the operation of a user-interface control through actuation of mouse 223 .
  • Typical movies include a plurality of time-based media sequences that are played back based on a common time coordinate system.
  • a media container format is provided in which media sequences may contain samples that are themselves media containers. Such media containers may be stored, for example, on storage device 207 .
  • FIG. 3 a illustrates a media container 300 that contains four media sequences 302 , 304 , 306 and 311 .
  • Media sequences 302 , 304 and 306 are typical media sequences, such as sound, video or text tracks.
  • Media sequence 311 contains a sample 309 that contains another media container 308 .
  • Media container 308 includes two media sequences 310 and 312 .
  • the active interval of sample 309 is between times T 1 and T 2 . Consequently, media container 308 may only be played between times T 1 and T 2 . These times are determined by the mechanism used to sequence media container 300 .
  • Because media container 308 is a component of media sequence 311 in another media container 300 , a hierarchy exists between the media containers.
  • the data structure that establishes this hierarchy is referred to herein as a hierarchical media container.
  • the “contained in” relationship is logical, not necessarily physical.
  • the data for each of the various media sequences shown in FIG. 3 a may be stored in separate files on separate storage devices.
  • Although FIG. 3 a illustrates media container 300 with a single embedded media container 308 , the hierarchical media container structure allows media containers to have any number of embedded media containers.
  • a single media container may contain multiple media sequences that contain embedded media containers.
  • a single media sequence may have multiple samples, each of which contains its own embedded media container.
  • embedded media containers may themselves contain media sequences that contain embedded media containers. As a result, the structure of media containers may be tailored to particular applications.
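Because embedded movies may themselves embed movies, the container hierarchy is a tree that can be walked recursively. The sketch below (container structure modeled as nested dicts purely for illustration) counts containers at each depth:

```python
def count_containers(container, depth=0, counts=None):
    """Count media containers at each depth of the hierarchy.
    A container is modeled as {'sequences': [[sample, ...], ...]},
    where a sample is either opaque data or another container dict."""
    if counts is None:
        counts = {}
    counts[depth] = counts.get(depth, 0) + 1
    for sequence in container["sequences"]:
        for sample in sequence:
            if isinstance(sample, dict):  # an embedded media container
                count_containers(sample, depth + 1, counts)
    return counts
```

Applied to a structure shaped like FIG. 3 b (a top-level container embedding three containers, one of which embeds two more), the walk reports one container at depth 0, three at depth 1, and two at depth 2.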
  • FIG. 3 b illustrates a media container 320 that has a more complicated hierarchical structure than media container 300 in FIG. 3 a .
  • Media container 320 contains two media sequences 322 and 336 , each of which contains embedded media containers.
  • Media sequence 322 contains two movie samples. The first movie sample in media sequence 322 contains a media container 324 , and the second movie sample contains a media container 326 .
  • Media container 324 contains two media sequences 328 and 330 , and media container 326 contains two media sequences 332 and 334 .
  • Media sequence 336 contains one movie sample.
  • the movie sample contained in media sequence 336 contains a media container 338 that has three media sequences 340 , 342 and 344 .
  • Media sequence 344 has two movie samples which contain media container 346 and media container 352 respectively.
  • Media container 346 includes media sequences 348 and 350 , and media container 352 contains media sequences 354 and 356 .
  • Thus, media containers 346 and 352 are embedded in a media container 338 that is itself embedded in media container 320 .
  • At least two significant benefits result from the use of hierarchical media containers.
  • First, the logical relationship between related media sequences may be reflected in the structure of the media container itself.
  • Second, different media sequences within a single movie may be driven by different time coordinate systems.
  • In one embodiment, all media sequences are time-based and the time coordinate system that applies to the embedded movie is slaved to the time coordinate system that applies to the containing movie.
  • media sequences 302 , 304 , 306 , 310 , 311 and 312 are all time-based media sequences. Because media sequences 302 , 304 , 306 and 311 belong to media container 300 , media sequences 302 , 304 , 306 and 311 will sequence based on a common time coordinate system during playback.
  • Similarly, media sequences 310 and 312 will sequence based on a common time coordinate system during playback.
  • If the time coordinate system used to sequence the media sequences in media container 300 is the same time coordinate system used to sequence the media sequences in media container 308 , then media sequences 310 and 312 will be played back as if they were simply two more media sequences contained in media container 300 .
  • Alternatively, the time coordinate system that applies to an embedded movie need not be slaved to the time coordinate system that applies to the containing movie.
  • For example, the time coordinate system for media container 300 may be a different time coordinate system than that used to sequence media container 308.
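The two timing regimes above can be sketched as a pair of clock models. This is a hypothetical illustration (the `Clock` and `SlavedClock` names are invented), not the patent's actual implementation:

```python
# Sketch of slaved vs. independent time coordinate systems for an
# embedded media container. Names are illustrative, not from the patent.

class Clock:
    """An independent time coordinate system advancing at its own rate."""
    def __init__(self, rate=1.0):
        self.rate = rate
        self.time = 0.0

    def tick(self, real_seconds):
        self.time += real_seconds * self.rate


class SlavedClock:
    """A time coordinate system derived from a master clock, as when an
    embedded movie is slaved to its containing movie."""
    def __init__(self, master, rate=1.0, offset=0.0):
        self.master = master      # the containing movie's clock
        self.rate = rate          # embedded time units per master time unit
        self.offset = offset      # master time at which the embedded movie starts

    @property
    def time(self):
        return (self.master.time - self.offset) * self.rate


master = Clock()
slaved = SlavedClock(master, rate=2.0)   # runs twice as fast as the master
master.tick(3.0)
print(master.time)   # 3.0
print(slaved.time)   # 6.0
```

With independent `Clock` instances for media containers 300 and 308, the two containers advance at unrelated rates; with a `SlavedClock`, the embedded container's time is derived entirely from the containing movie's clock.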
  • Because media containers embedded in a given media container may have different time coordinate systems than the given media container, various multimedia effects are possible. For example, a user may be able to speed up or slow down certain aspects of a movie relative to other aspects of the movie.
  • For example, assume that media sequences 302, 304 and 306 represent the sound and image of a helicopter 402 as it flies from a point A to a point B, as shown in FIG. 4.
  • Media sequences 310 and 312 may represent the sound and image of a car 404 as it travels from a point C to a point D. Because media container 308 is not slaved to the clock of media container 300 , playback of the media sequences associated with car 404 relative to playback of the media sequences of helicopter 402 may vary from performance to performance.
  • For example, helicopter 402 may begin to move before car 404, or car 404 may begin to move before helicopter 402. Similarly, helicopter 402 may move faster than car 404, or car 404 may move faster than helicopter 402.
  • The relative playback starting times and playback rates may be based on user input. Thus, users may operate controls to cause helicopter 402 and car 404 to race across the screen, where the outcome of the race is not predetermined.
  • When an embedded movie is not slaved to the clock of the containing movie, it is possible for the playback of the embedded movie to be completed before the end of the active interval of the sample in which the embedded movie is contained. For example, if media container 308 is slaved to a clock that is running twice as fast as the clock associated with media container 300, then media container 308 may be played twice between T1 and T2. Under some circumstances it is desirable for the embedded movie to play in a continuous loop during the active interval with which it is associated. Under other circumstances, it is desirable for the embedded movie to play once and then stop, even if the active interval for the sample in which it is contained has not ended. Under yet other circumstances, it is desirable for the embedded movie to play up to N times, and then stop, where N is some specified number.
  • Consequently, one embodiment of the invention allows "stop data" to be stored for each embedded movie.
  • The stop data specifies a condition under which the playback mechanism is to stop playing the embedded movie.
  • For example, the stop data may indicate that the playback mechanism is to stop playing an embedded movie after it has been repeated ten times.
  • The playback mechanism reads the stop data and stops playback of the embedded movie when one of the following events occurs: (1) the termination condition specified in the stop data is satisfied, or (2) the active interval associated with the embedded movie ends.
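The stop-data behavior can be sketched as follows; the function name, parameters, and the repeat-count condition are illustrative only:

```python
# Sketch of "stop data" controlling an embedded movie's playback. The
# embedded movie stops when either its termination condition (here, a
# maximum repeat count) is met or the containing sample's active
# interval ends. Names are hypothetical, not from the patent.

def play_embedded(duration, active_interval, max_repeats=None):
    """Return the number of complete plays of an embedded movie of the
    given duration within the containing sample's active interval."""
    plays = 0
    elapsed = 0.0
    while elapsed + duration <= active_interval:
        if max_repeats is not None and plays >= max_repeats:
            break                       # stop-data condition satisfied
        plays += 1
        elapsed += duration
    return plays

# An embedded movie running twice as fast fits twice in the interval,
# unless stop data limits it to a single play.
print(play_embedded(duration=5.0, active_interval=10.0))                 # 2
print(play_embedded(duration=5.0, active_interval=10.0, max_repeats=1))  # 1
```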
  • Hierarchical media containers may be used to mix time-based media sequences with time-independent media sequences.
  • For example, media sequences 302, 304 and 306 may be time-based media sequences while media sequences 310 and 312 are time-independent sequences.
  • FIG. 5 illustrates one application of a time-independent movie embedded in a time-based movie.
  • In FIG. 5, media sequences 302, 304 and 306 provide the video and sound for a helicopter 502 flying across a screen 508. While the helicopter 502 is flying across the screen 508, a user may browse through a book 504. The text of book 504 may be stored in media sequence 310, and the sound of turning pages may be stored in media sequence 312. When a user selects the upper portion 506 of a page, the media sequences 310 and 312 are advanced (the text of the next page is shown, and the sound of a page turning is generated). The rate at which the user turns pages has no effect on the rate at which helicopter 502 moves across the screen 508.
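The page-turning behavior can be sketched as a media sequence advanced by user events rather than by a clock; the class and sample text below are hypothetical:

```python
# Sketch of a time-independent media sequence advanced by user events,
# like the book pages in FIG. 5. Illustrative names; not the patent's API.

class TimeIndependentSequence:
    def __init__(self, samples):
        self.samples = samples
        self.index = 0

    def current(self):
        return self.samples[self.index]

    def advance(self):
        """Move to the next sample in response to user input (e.g. a
        click on the upper portion of a page); clamp at the last sample."""
        if self.index < len(self.samples) - 1:
            self.index += 1
        return self.current()

pages = TimeIndependentSequence(["Page 1 text", "Page 2 text", "Page 3 text"])
print(pages.current())    # Page 1 text
print(pages.advance())    # Page 2 text
```

The helicopter's time-based sequences would meanwhile advance on their own clock, unaffected by calls to `advance()`.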
  • A media container storing time-based sequences may also be embedded in a media container that stores time-independent sequences.
  • For example, media sequences 302, 304 and 306 may be time-independent media sequences, while media sequences 310 and 312 are time-based sequences.
  • In this case, media sequences 302, 304 and 306 may correspond to a series of static scenes.
  • The user may move from one scene to the next by entering user input to cause media sequences 302, 304 and 306 to sequence to subsequent information samples.
  • One of the static scenes 600 may include the image of a television 602 .
  • Media sequences 310 and 312 may store video and audio that is played on the television 602 .
  • The rate at which media sequences 310 and 312 are played is unrelated to the rate at which a user moves from one scene to the next.
  • A media container storing time-independent sequences may also be embedded in a media container that stores time-independent sequences.
  • For example, each sample in the media sequences of the containing media container may correspond to a chapter of a book.
  • One of the media sequences of the containing media container may contain embedded movies.
  • Each embedded movie may contain the text for each page in a chapter of the book.
  • One button may be provided for sequencing the containing movie (to move from chapter to chapter).
  • A second button may be provided for sequencing the embedded movies (to move from page to page within a chapter).
  • Editing operations are complicated by the fact that edits to some tracks in a movie may require edits to some but not all other tracks in the movie.
  • Editing operations may be simplified by using the hierarchical media container structure of the present invention to reflect relationships between media sequences. Specifically, related media sequences may be stored in the same media container, while unrelated media sequences are assigned to different containers in the hierarchical structure.
  • Consider, for example, a movie that includes a video track, an audio track and a subtitle track, as well as separate sound tracks for background music. Edits to the video track should be reflected in the audio track and the subtitle track.
  • However, video edits should not affect the sound tracks.
  • In this case, the media sequences of the video, audio and subtitle tracks can be stored in a first media container, and the media sequences of the sound tracks can be stored in a second media container.
  • The second media container can be embedded in the first media container.
  • Editing utilities may then be configured to respond to edits by automatically editing all media sequences that belong to the same media container as the edited media sequence, and to leave all other media sequences intact.
  • Similarly, all of the clips from the same source can be stored in the same embedded media container.
  • When a media sequence is modified, the editor need only look to the other media sequences in the same media container to determine whether other media sequences must be modified responsive to the modification. Consequently, embedded containers provide to editors the ability to maintain logical media sequence groupings. This ability, in turn, makes complex editing operations more manageable.
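The container-scoped editing rule can be sketched as follows, using a made-up dictionary representation of containers (the structure and field names are not from the patent):

```python
# Sketch of container-scoped editing: an edit (here, deleting a time
# range) is applied to every media sequence in the same container as
# the edited sequence, and to no sequences in embedded containers.

def delete_range(container, start, end):
    """Delete samples in [start, end) from every sequence in
    `container`, leaving embedded containers untouched."""
    for name, seq in container["sequences"].items():
        container["sequences"][name] = [
            s for s in seq if not (start <= s["t"] < end)
        ]

dialogue = {  # video and subtitles are edited together
    "sequences": {
        "video":     [{"t": 0}, {"t": 1}, {"t": 2}],
        "subtitles": [{"t": 0}, {"t": 1}, {"t": 2}],
    },
    "embedded": {  # background music container, left intact
        "sequences": {"music": [{"t": 0}, {"t": 1}, {"t": 2}]},
    },
}
delete_range(dialogue, 1, 2)
print(len(dialogue["sequences"]["video"]))               # 2
print(len(dialogue["embedded"]["sequences"]["music"]))   # 3
```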
  • Hierarchical media containers may also reduce the size of some movies. For example, consider a twenty minute movie in which a four minute musical theme is repeated five times. Using the prior art movie structure, the movie would contain a sound track covering the full twenty minutes of music. Using a hierarchical media container structure, a four minute media sequence could be stored in a first media container separate from a second media container that stores the rest of the movie. The first media container may then be embedded in the second media container. Attributes of the first media container may be set so that playback of the first media container begins with the playback of the second media container and continuously repeats until the end of playback of the second media container. During playback, the movie will appear and sound the same, but the sound data of the hierarchical media container will take up approximately one fifth as much storage space as the twenty minute sound track.
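A quick check of the arithmetic in this example, with the durations as stated:

```python
# Back-of-the-envelope check on the storage example: a four-minute
# theme, looped for a twenty-minute movie, stores one fifth of the
# sound data a flat twenty-minute track would need.

flat_minutes = 20      # prior-art flat sound track
looped_minutes = 4     # embedded container, repeated during playback
print(looped_minutes / flat_minutes)   # 0.2
```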
  • Many applications provide controls through which a user may designate operational parameters. Examples of such controls include scroll bars, check boxes and radio buttons. Controls typically allow a user to select one value from a predetermined range of values. For example, a user may designate the value "checked" or the value "unchecked" by interacting with a checkbox. A user may select one of a range of values by interacting with a scroll bar. In general, the more complicated the control, the more difficult it is to display and manage the control.
  • According to one aspect of the invention, a control movie is provided.
  • a “control movie” is a movie that performs the traditional functions of a control. More specifically, a control movie is a media container that contains at least one media sequence, where (1) samples in the media sequence are associated with parameter values and (2) the current sample of the media sequence determines the value of a parameter.
  • Media sequence 700 is a video media sequence and includes an ordered series of samples 702, 704, 706, 708, 710 and 712 of video data. Each sample of video data in media sequence 700 represents an image. Each sample in media sequence 700 is also associated with a value for a parameter. Specifically, sample 702 is associated with the value "1", sample 704 is associated with the value "2", sample 706 is associated with the value "3", sample 708 is associated with the value "4", sample 710 is associated with the value "5" and sample 712 is associated with the value "6".
  • To use media sequence 700 as a control, a media container that includes media sequence 700 may be embedded into a movie.
  • A movie in which a helicopter 802 flies from a point E to a point F on a screen 804 is illustrated in FIG. 8.
  • Embedded in the movie is a media container that includes media sequence 700 .
  • An image 806 corresponding to a sample in media sequence 700 is displayed on screen 804 during playback of the movie.
  • Because media sequence 700 is contained in an embedded movie, it does not sequence responsive to the time coordinate system that controls the containing movie. However, the image 806 generated by the media sequence may be used to determine a parameter associated with the containing movie, such as the rate at which the containing movie is played back.
  • When media sequence 700 is used as a control for the value of "speed", the image associated with the default value of "speed" is initially displayed on screen 804.
  • For example, if the default value of "speed" is "5", the image associated with sample 710, which is the sample associated with the value "5", would be displayed.
  • This image would not change (i.e. media sequence 700 would not be sequenced) responsive to the passage of time in the time coordinate system associated with the containing movie. Rather, some other event, such as user interaction with the image 806 , would trigger the sequencing of media sequence 700 .
  • Media sequence 700 is sequenced responsive to the selection of arrows 810 and 812 on image 806.
  • The selection of arrow 810 will cause media sequence 700 to sequence "backward".
  • The image 806 will then reflect the image associated with sample 708.
  • The value of "speed" will be updated to the value associated with the currently-displayed sample.
  • The currently-displayed sample will be sample 708, which is associated with the parameter value "4". Therefore the value of "speed" will be changed to "4", and the rate of playback of the movie associated with helicopter 802 will decrease.
  • The selection of arrow 812 will cause media sequence 700 to be sequenced "forward".
  • The image 806 will then reflect the image associated with sample 712.
  • The value of "speed" will be updated to the value associated with the currently-displayed sample.
  • The currently-displayed sample will be sample 712, which is associated with the value "6". Therefore the value of "speed" will be changed to "6", and the rate of playback of the movie associated with helicopter 802 will increase.
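The control-movie behavior walked through above can be sketched as follows; the class name and the (image, value) sample representation are illustrative, not the patent's API:

```python
# Sketch of a "control movie": a media sequence whose samples carry
# parameter values, stepped backward or forward by user input. The
# current sample's value becomes the parameter (here, playback speed).

class ControlMovie:
    def __init__(self, samples, start_value):
        self.samples = samples                    # (image_name, value) pairs
        self.index = next(i for i, (_, v) in enumerate(samples)
                          if v == start_value)

    @property
    def value(self):
        return self.samples[self.index][1]

    def step(self, direction):                    # -1 = backward, +1 = forward
        self.index = max(0, min(len(self.samples) - 1,
                                self.index + direction))
        return self.value

speed = ControlMovie([("img1", 1), ("img2", 2), ("img3", 3),
                      ("img4", 4), ("img5", 5), ("img6", 6)],
                     start_value=5)
print(speed.step(-1))   # 4  (arrow 810: sequence backward, playback slows)
print(speed.step(+1))   # 5
print(speed.step(+1))   # 6  (arrow 812: sequence forward, playback speeds up)
```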
  • The sequencing of the control movie containing media sequence 700 is performed by selection of arrows 810 and 812.
  • This selection may be performed, for example, by operating mouse 223 to position a cursor over one of arrows 810 or 812 and clicking a button on the mouse 223 .
  • Other user actuated sequencing mechanisms may also be used.
  • For example, the sequencing of the control movie may alternatively be triggered by the selection of screen regions outside of image 806, or by pressing certain keys on keyboard 222.
  • Control movies have the benefit that once they are created, they may easily be embedded in other movies to provide a graphical user interface for parameter control. Because control movies are movies, they may provide visually sophisticated controls that would otherwise be difficult to display and manage.
  • In the embodiment described above, each sample of media sequence 700 is associated with a parameter value.
  • However, other variations are possible. For example, every fifth sample may be associated with a parameter value.
  • In that case, media sequence 700 may be played in the designated direction until arriving at the next sample associated with a parameter value.
  • Consequently, the image 806 displayed responsive to the control movie will appear animated during the parameter change operation.
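The sparse-value variation can be sketched like this; the frame names and the `None` markers for value-less samples are invented for illustration:

```python
# Sketch of a control movie where only some samples carry parameter
# values: stepping plays the intermediate samples (animating image 806)
# until the next valued sample is reached.

samples = [("f0", 1), ("f1", None), ("f2", None),
           ("f3", 2), ("f4", None), ("f5", 3)]

def step_to_next_value(index, direction):
    """Advance frame by frame in `direction` until a sample with a
    parameter value is found; return (new_index, played_frames)."""
    played = []
    i = index + direction
    while 0 <= i < len(samples):
        played.append(samples[i][0])
        if samples[i][1] is not None:
            return i, played
        i += direction
    return index, played            # no further valued sample; stay put

i, frames = step_to_next_value(0, +1)
print(i)       # 3
print(frames)  # ['f1', 'f2', 'f3']  (intermediate frames appear animated)
```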
  • FIG. 9 a illustrates a media container 901 that contains a media sequence 900 in which each sample 902 , 904 , 906 , 908 and 910 includes the name and age of a person.
  • A media container storing media sequence 900 may be played as shown in FIG. 9 b.
  • An image 920 corresponding to the currently-played sample of media container 901 is displayed on a screen 924.
  • A control panel 922 for sequencing media container 901 is also displayed.
  • Control panel 922 includes control arrows 926 and 928 for sequencing media container 901 backward and forward, respectively.
  • Control panel 922 also includes a text box 930 into which a user may enter search terms. Entry of search terms into text box 930 initiates a search for terms in the media container 901. If the terms are found, then the media container 901 is sequenced until the sample containing the search terms is displayed.
  • Thus, media container 901 may be used as a rudimentary database.
  • However, the database provided by media container 901 has the disadvantage that searches are performed on all of the text in the media container 901.
  • The hierarchical media container format described herein may be used to segregate a movie database into fields.
  • Referring to FIG. 10 a, it illustrates a media container 1000 that includes a plurality of media sequences 1016, 1017, 1019, and 1021.
  • Media sequences 1017 , 1019 and 1021 respectively contain media containers 1004 , 1006 and 1008 .
  • Each of the embedded media containers 1004 , 1006 and 1008 includes a text media sequence 1010 , 1012 and 1014 , respectively.
  • Media sequence 1016 is a media sequence that stores the names of database fields, such as “Name”, “Age” and “Birthday”. Each sample of media sequence 1010 stores text that indicates the name of an individual. Each sample of media sequence 1012 stores text that indicates the age of an individual. Each sample of media sequence 1014 stores text that indicates the birthday of an individual. Media sequences 1010 , 1012 and 1014 are ordered such that at any given sequence location, all three media sequences represent data from the same individual. For example, samples 1018 , 1020 and 1022 , which are all located at the first sequence position, respectively store the name, age and birthday of the same individual.
  • Media container 1000 may be "played" to display an image 1050 on a screen 1052, as shown in FIG. 10 b.
  • The image 1050 includes a region 1054 in which the current sample of media sequence 1016 is displayed, a region 1056 in which the current sample of media sequence 1010 is displayed, a region 1058 in which the current sample of media sequence 1012 is displayed, and a region 1060 in which the current sample of media sequence 1014 is displayed.
  • Screen 1052 also contains a control panel 1062 that contains control arrows 1064 and 1066 analogous to control arrows 926 and 928 of FIG. 9 b , and a text box 1068 analogous to text box 930 of FIG. 9 b .
  • Control panel 1062 also contains controls 1070 , 1072 and 1074 that allow a user to choose one or more of the available fields. Because the data for each field is contained in a separate media container, searches may be performed on a field-by-field basis. For example, if a user selects control 1072 and enters “10” into text box 1068 , the search is limited to the contents of media container 1006 . If a match is found, media container 1000 is sequenced until the sample of sequence 1012 in which the match occurred is displayed.
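The field-by-field search can be sketched with each field's embedded container modeled as a list of text samples; the names and data below are made up:

```python
# Sketch of the field-sensitive database search of FIG. 10: each field
# lives in its own embedded container, so a search can be limited to
# one field's media sequence. The sample index where the match occurs
# drives the sequencing of the whole containing movie.

database = {
    "Name":     ["Alice", "Bob", "Carol"],
    "Age":      ["9", "10", "12"],
    "Birthday": ["Jan 1", "Feb 2", "Mar 3"],
}

def search(field, term):
    """Return the sample index of the first match within one field's
    embedded container, or None if the term is absent."""
    for i, sample in enumerate(database[field]):
        if term in sample:
            return i
    return None

idx = search("Age", "10")
print(idx)                      # 1
print(database["Name"][idx])    # Bob  (all sequences show the same record)
```

Because the sequences are ordered so that a given index refers to the same individual in every field, sequencing the containing movie to the matching index displays the complete record.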
  • The hierarchical movie structure described herein allows movies to be applied to applications that have previously required complex, customized programming.
  • A single hierarchical movie, in the form of a media container that contains embedded media containers, can contain an entire multimedia application. Further, all or some of the embedded movies may be time-independent. Thus, the timing of one segment or aspect of the resulting movie may vary from performance to performance relative to other segments or aspects of the movie.
  • In addition, the ability to group related media sequences into containers simplifies the movie editing process.


Abstract

A hierarchical movie is provided. A hierarchical movie is a movie that contains one or more embedded movies. Embedded movies may themselves contain embedded movies. Each movie contains zero or more media sequences. Within a hierarchical movie, media sequences that should be edited together may be grouped together using embedded movies. The media sequences of a hierarchical movie may be sequenced during playback based on a different time coordinate system than the time coordinate system that governs any embedded movies. This allows a movie to contain both time-based and time-independent media sequences. Also, the relative timing of events in the movie may vary from performance to performance. The hierarchical movie structure allows movies to be used as user interface controls, and even as field-sensitive databases.

Description

This application is a continuation application of U.S. patent application Ser. No. 10/638,037, filed Aug. 8, 2003 now U.S. Pat. No. 7,102,644, which is a continuation of U.S. patent application Ser. No. 09/911,946, filed Jul. 23, 2001, now issued as U.S. Pat. No. 6,630,934, which is a continuation of U.S. patent application Ser. No. 09/049,715, filed Mar. 27, 1998, now issued as U.S. Pat. No. 6,297,830, which is a divisional of U.S. patent application Ser. No. 08/570,542, filed Dec. 11, 1995, now issued as U.S. Pat. No. 5,751,281.
FIELD OF THE INVENTION
The present invention relates to a hierarchical movie structure, and more specifically, to a structure for embedding a movie within another movie.
BACKGROUND OF THE INVENTION
A movie generally consists of a set of tracks slaved to a movie clock. In a typical movie, the array of tracks includes a video track and an audio track. A video track consists of a sequence of samples of video data. An audio track is a sequence of samples of audio data.
Besides video and audio tracks, movies may also include tracks which store other types of information. For example, a movie may also include a text track that contains text for subtitles, a music track separate from the main audio track, and a time code track.
FIG. 1 illustrates a typical movie 100. Movie 100 includes a video track 122 that includes a sequence of video samples 102, 104, 106, 108 and 110. Movie 100 also includes a sound track 124 that includes a sequence of audio samples 112, 114, 116, 118 and 120. When movie 100 is played, playback of all of the tracks of the movie 100 is synchronized based on a movie clock. For example, at a time T1 of the movie clock, video sample 102 is being displayed and audio sample 112 is being played. At a time T2 of the movie clock, the video sample 104 is being displayed and audio sample 114 is being played. The audio and video samples that are being played at any given time on the movie clock remain the same for each performance of movie 100.
If one track of a movie is edited, some of the other tracks of the movie may also have to be edited. Consider, for example, a movie in which a character is giving a speech with the national anthem playing in the background. If one wishes to delete a portion of the speech from the movie, the corresponding sequence of video must be cut from the video track, the corresponding sequence of audio must be cut from the audio track, and the corresponding text sequence must be cut from the text track. However, to maintain the continuity and integrity of the national anthem, the corresponding music track should not be cut.
Flat movie formats provide no mechanism for keeping track of relationships between tracks. Because the editing of one track of a movie may require the editing of some but not all of the other tracks in the movie, movie editing can quickly become a difficult and complex task. Complex editing operations are even more complicated. For example, during an operation in which one movie is created by splicing together tracks of other movies, it may be virtually impossible to keep track of which tracks should and should not be edited together.
Based on the foregoing, it is desirable to simplify the movie editing process. It is further desirable to expand the application of movies beyond simple deterministic time-based media playback applications.
SUMMARY OF THE INVENTION
A hierarchical movie is provided. A hierarchical movie is a movie that contains one or more embedded movies. Embedded movies may themselves contain embedded movies. Each movie contains zero or more media sequences. Within a hierarchical movie, media sequences that should be edited together may be grouped together using embedded movies. The media sequences of a hierarchical movie may be sequenced during playback based on a different time coordinate system than the time coordinate system that governs any embedded movies. This allows a movie to contain both time-based and time-independent media sequences. Also, the relative timing of events in the movie may vary from performance to performance. The hierarchical movie structure allows movies to be used as user interface controls, and even as field-sensitive databases.
According to one embodiment of the invention, a hierarchical media container is provided. The hierarchical media container includes a first set of media sequences. One media sequence in the first set of media sequences contains an embedded media container including a second set of media sequences.
The hierarchical media container may or may not have the same relationship to time as the embedded media container. For example, in one embodiment both the hierarchical media container and the embedded media container are time-based media containers. In another embodiment, the hierarchical media container is a time-independent media container and the embedded media container is a time-based media container. In yet another embodiment, the hierarchical media container is a time-based media container and the embedded media container is a time-independent media container.
According to another aspect of the invention, a method for providing a “control movie” is provided. According to the method, a user may select a parameter value by interacting with a movie. The method may be used in a computer system that includes a display device.
The method includes a step for providing a media container that includes a media sequence of visual data. The media sequence includes a plurality of samples. The plurality of samples includes a set of samples. Each sample in the set is associated with a value. The method also includes steps for determining a current sample from the set of samples, establishing the value associated with the current sample as the parameter value, and displaying on the display device the image represented by the current sample.
The method also includes the steps of receiving input from the user specifying a sequencing direction, determining a next sample, the next sample being a sample of the set of samples located in the sequencing direction relative to the current sample, establishing the value associated with the next sample as the parameter value, and displaying on the display device the image represented by the next sample.
According to another aspect of the invention, a method for editing a movie is provided. According to the method, a first plurality of media sequences of the movie is stored in a first container. A second plurality of media sequences of the movie is stored in a second container. One of the first container and the second container are embedded in the other of the first container and the second container. An edit of a media sequence of the second plurality of media sequences is received from a user. All media sequences of the second plurality of media sequences are automatically edited responsive to receiving the edit.
According to another embodiment of the invention, a method for playing a movie is provided. According to the method, a first plurality of samples of a first media sequence are sequentially played. The first media sequence is stored in a first media container. A second plurality of samples of a second media sequence is also sequentially played. The second media sequence is stored in a second media container. The second media container is embedded within the first media container.
Optionally, the step of sequentially playing the first plurality of samples may be performed responsive to a first time coordinate system, while the step of sequentially playing the second plurality of samples is performed responsive to a second time coordinate system, where the first time coordinate system is different from the second time coordinate system.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:
FIG. 1 illustrates a movie structure used in the prior art;
FIG. 2 illustrates a computer system upon which the present invention may be implemented;
FIG. 3 a illustrates a hierarchical movie structure according to an embodiment of the invention;
FIG. 3 b illustrates a hierarchical movie structure that has more complex hierarchical relationships than that illustrated in FIG. 3 a;
FIG. 4 illustrates a movie generated based on data stored in a hierarchical movie structure where a time-based movie is embedded in a time-based movie;
FIG. 5 illustrates a movie generated based on data stored in a hierarchical movie structure where a time-independent movie is embedded within a time-based movie;
FIG. 6 illustrates a movie generated based on a hierarchical movie structure where a time-based movie is embedded in a time-independent movie;
FIG. 7 illustrates a movie sequence for a control movie according to an embodiment of the invention;
FIG. 8 illustrates a movie generated based on a media container with an embedded control movie;
FIG. 9 a illustrates a media container for a field-insensitive database movie;
FIG. 9 b illustrates a database movie generated responsive to the media container shown in FIG. 9 a;
FIG. 10 a illustrates a media container for a field-sensitive database movie; and
FIG. 10 b illustrates a database movie generated responsive to the media container shown in FIG. 10 a.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
A method and apparatus for creating and using a movie within a movie is described. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present invention.
Referring to FIG. 2, it illustrates a computer system 200 upon which the preferred embodiment of the present invention can be implemented. Computer system 200 comprises a bus or other communication means 201 for communicating information, and a processing means 202 coupled with bus 201 for processing information. System 200 further comprises a random access memory (RAM) or other dynamic storage device 204 (referred to as main memory), coupled to bus 201 for storing information and instructions to be executed by processor 202. Main memory 204 also may be used for storing temporary variables or other intermediate information during execution of instructions by processor 202. Computer system 200 also comprises a read only memory (ROM) and/or other static storage device 206 coupled to bus 201 for storing static information and instructions for processor 202.
Furthermore, a data storage device 207 such as a magnetic disk or optical disk and its corresponding disk drive can be coupled to computer system 200. Computer system 200 can also be coupled via bus 201 to a display device 221, such as a cathode ray tube (CRT), for displaying information to a computer user. An alphanumeric input device 222, including alphanumeric and other keys, is typically coupled to bus 201 for communicating information and command selections to processor 202. Another type of user input device is cursor control 223, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 202 and for controlling cursor movement on display device 221. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), which allows the device to specify positions in a plane.
Alternatively, other input devices such as a stylus or pen can be used to interact with the display. A displayed object on a computer screen can be selected by using a stylus or pen to touch the displayed object. The computer detects the selection by implementing a touch sensitive screen. Similarly, a light pen and a light sensitive screen can be used for selecting a displayed object. Such devices may thus detect selection position and the selection as a single operation instead of the “point and click,” as in a system incorporating a mouse or trackball. Stylus and pen based input devices as well as touch and light sensitive screens are well known in the art. Such a system may also lack a keyboard such as 222 wherein all interface is provided via the stylus as a writing instrument (like a pen) and the written text is interpreted using optical character recognition (OCR) techniques.
Another device which may be coupled to bus 201 is hard copy device 224. Hard copy device 224 may be used for printing instructions, data, or other information on a medium such as paper, film, or similar types of media. Additionally, computer system 200 can be coupled to a device for audio playback 225 such as a speaker. Further, the device may include a speaker which is coupled to a digital to analog (D/A) converter for playing back the digitized sounds. Finally, computer system 200 can be a terminal in a computer network (e.g., a LAN).
In the currently preferred embodiment, computer system 200 is one of the Macintosh® family of personal computers such as the Macintosh® II manufactured by Apple® Computer, Inc. of Cupertino, Calif. (Apple and Macintosh are registered trademarks of Apple Computer, Inc.). In the currently preferred embodiment, the present invention is related to the use of a computer system 200 to create, store, and play back movies that contain other movies.
Terms
In the following discussion, the term “media sequence” refers to a plurality of ordered samples. A video track, for example, is a media sequence in which each sample contains video data representing an image. Similarly, a sound track is a media sequence in which each sample contains audio data representing sound.
The term “media container” refers to a data structure that includes zero or more media sequences. A QuickTime movie is a media container in that it stores multiple media sequences, such as video tracks, audio tracks, sound tracks, text tracks, etc. QuickTime movies are described in detail in Inside Macintosh: QuickTime by Apple Computer Inc., published by Addison-Wesley Publishing Company (1993). All of the media sequences that belong to a media container are sequenced according to a common time coordinate system.
The terms “embedded movie”, “contained movie”, “embedded media container” and “contained media container” refer to a media container that is a logical component of another movie container. The terms “containing movie” and “containing media container” refer to the media container of which an embedded movie is a logical component. As shall be explained below, the media sequences of an embedded movie are not necessarily sequenced according to the same time coordinate system as the media sequences that belong to the containing movie.
Significantly, the embedded relationship is a logical relationship, not a physical relationship. Therefore, the data that represents an embedded movie is not necessarily located in the same physical file or even on the same physical storage device as other data “contained in” the containing movie. For example, a movie A stored in a file A located on a storage medium A and a movie B that is stored in a file B located on a storage medium B may both be embedded in a movie C that also includes tracks D, E and F that are stored in a file G on a storage medium H. In embodiments where the embedded relationship is not reflected in the actual physical location of data, control data structures are used to reflect the logical relationship between files.
During playback, all media sequences progress from sample to sample based on the occurrence of an event. When the event is the passage of time in a time coordinate system, the media sequences are referred to as “time-based” media sequences. A video track is an example of a time-based media sequence. During playback, a sample in a video media sequence is displayed for a set time interval. After the time interval expires, the next sample in the video media sequence is displayed. This process continues until all of the samples in the video media sequence have been displayed. The time interval may be modified to speed up or slow down playback, but the playback timing is still driven by the passage of time.
If the event that causes a media sequence to progress from one sample to the next is anything other than the passage of time in a time coordinate system, the media sequence is referred to as a “time-independent” media sequence. For example, consider a media sequence in which each sample contains the text of a page in a novel. During playback, the page represented in a sample should be displayed until the reader has completed reading the page. Since reading speeds vary greatly, the playback mechanism should not display the page associated with the next sample until the reader indicates a desire to turn the page. Thus, a mechanism may be provided to the user through which the user may initiate an event to move to the next page. For example, a user may operate a mouse or other pointing device to click on a “Turn Page” button to cause the playback mechanism to sequence to the next sample.
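The distinction between time-based and time-independent sequencing can be sketched as follows. This is a minimal illustration, not the patented implementation; the `MediaSequence` class and function names are assumptions introduced here for clarity.

```python
class MediaSequence:
    """An ordered list of samples with a playback cursor (illustrative)."""
    def __init__(self, samples):
        self.samples = list(samples)
        self.index = 0

    def current(self):
        return self.samples[self.index]

    def advance(self):
        """Step to the next sample; returns False at the end of the sequence."""
        if self.index + 1 < len(self.samples):
            self.index += 1
            return True
        return False


def play_time_based(seq, frame_interval, elapsed):
    """Time-based sequencing: the current sample is determined purely by
    how much time has passed in the time coordinate system."""
    seq.index = min(int(elapsed / frame_interval), len(seq.samples) - 1)
    return seq.current()


def play_time_independent(seq, events):
    """Time-independent sequencing: each event (e.g. a 'Turn Page' click)
    advances the cursor by one sample, regardless of elapsed time."""
    for _ in range(events):
        if not seq.advance():
            break
    return seq.current()
```

A video track would use the first function, driven by a clock; the novel-page example would use the second, driven by user events.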
A media container is “slaved” to a clock if the clock determines when the media sequences that belong to the media container progress from one sample to the next. All of the media sequences in a typical movie are slaved to the same clock (the “movie clock”) to ensure that the media sequences remain synchronized during playback. A media container is “independent” of a clock if the media sequences within the media container are sequenced based on an event other than the passage of time on the clock. For example, a media container is independent of a clock if playback of the media sequences within the media container may be slowed without slowing the clock.
A sequencing direction is the direction in which a media sequence is played relative to the order of the samples. Because media sequences are “ordered”, all media sequences have at least two possible sequencing directions. For the purposes of discussion, these two sequencing directions will be referred to as “forward” and “backward”. However, it should be understood that “forward” does not necessarily mean the “normal” or “typical” direction, since some applications may process media sequences in one direction, other applications may process media sequences in the other direction, and yet other applications may process sequences in either or both directions.
The “active interval” of a sample is the time interval during which the sample may be played. For example, the active interval for the first video sample in a 30-frame per second movie is the first 1/30 second of playback.
Playback Mechanism
In the preferred embodiment, the mechanism for playing a movie is implemented through a series of instructions executed on processor 202. Initially, the series of instructions may be stored on storage device 207. When the playback mechanism is invoked, the instructions are copied from storage device 207 into memory 204, and then accessed and executed by processor 202.
During execution of the series of instructions, the samples of the media sequences of a movie are processed by processor 202 responsive to the series of instructions. Specifically, processor 202 causes the samples to be “played”. The particular steps for playing a sample depend on the nature of the data within the sample. For example, a sample of video data is “played” by causing the image represented in the sample to be displayed on display device 221. Samples containing audio data are played by generating the sound represented in the audio sample. Sound may be generated, for example, on speaker 225.
Processor 202 sequences through the movie responsive to the series of instructions. The series of instructions may cause processor 202 to sequence through the movie responsive to the passage of time and/or the occurrence of another type of event. An event which causes processor 202 to sequence to the next sample in a media sequence may be a user-actuated event, such as the selection of a key on keyboard 222, or the operation of a user-interface control through actuation of mouse 223.
Hierarchical Media Containers
Typical movies include a plurality of time-based media sequences that are played back based on a common time coordinate system. According to one aspect of the present invention, a media container format is provided in which media sequences may contain samples that are themselves media containers. Such media containers may be stored, for example, on storage device 207.
Referring to FIG. 3 a, it illustrates a media container 300 that contains four media sequences 302, 304, 306 and 311. Media sequences 302, 304 and 306 are typical media sequences, such as sound, video or text tracks. Media sequence 311 contains a sample 309 that contains another media container 308. Media container 308 includes two media sequences 310 and 312.
The active interval of sample 309 is between times T1 and T2. Consequently, media container 308 may only be played between times T1 and T2. These times are determined by the mechanism used to sequence media container 300.
Because one media container 308 is a component of a media sequence 311 in another media container 300, a hierarchy exists between the media containers. The data structure that establishes this hierarchy is referred to herein as a hierarchical media container. As mentioned above, the “contained in” relationship is logical, not necessarily physical. Thus, the data for each of the various media sequences shown in FIG. 3 a may be stored in separate files on separate storage devices.
While FIG. 3 a illustrates media container 300 with a single embedded media container 308, the hierarchical media container structure allows media containers to have any number of embedded media containers. Specifically, a single media container may contain multiple media sequences that contain embedded media containers. In addition, a single media sequence may have multiple samples, each of which contains its own embedded media container. Further, embedded media containers may themselves contain media sequences that contain embedded media containers. As a result, the structure of media containers may be tailored to particular applications.
FIG. 3 b illustrates a media container 320 that has a more complicated hierarchical structure than media container 300 in FIG. 3 a. Media container 320 contains two media sequences 322 and 336, each of which contains one or more embedded media containers. Media sequence 322 contains two movie samples. The first movie sample in media sequence 322 contains a media container 324, and the second movie sample contains a media container 326. Media container 324 contains two media sequences 328 and 330, and media container 326 contains two media sequences 332 and 334.
Media sequence 336 contains one movie sample. The movie sample contained in media sequence 336 contains a media container 338 that has three media sequences 340, 342 and 344. Media sequence 344 has two movie samples which contain media container 346 and media container 352 respectively. Media container 346 includes media sequences 348 and 350, while media container 352 contains media sequences 354 and 356. Thus, media containers 346 and 352 are embedded in a media container 338 that is itself embedded in a media container 320.
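The nesting of FIG. 3 b can be modeled with a pair of simple classes. This is a hedged sketch: `MediaContainer` and `Sequence` are hypothetical names, and movie samples are represented directly as containers for brevity.

```python
class MediaContainer:
    """A container holding zero or more media sequences (illustrative)."""
    def __init__(self, name, sequences=None):
        self.name = name
        self.sequences = sequences or []


class Sequence:
    """An ordered media sequence; a sample may itself be a MediaContainer."""
    def __init__(self, name, samples=None):
        self.name = name
        self.samples = samples or []


def depth(container):
    """Length of the longest chain of embedded containers, counting this one."""
    deepest = 0
    for seq in container.sequences:
        for sample in seq.samples:
            if isinstance(sample, MediaContainer):
                deepest = max(deepest, depth(sample))
    return deepest + 1


# Reconstructing FIG. 3 b: containers 346 and 352 are embedded in 338,
# which is embedded (via sequence 336) in the top-level container 320.
c346 = MediaContainer("346", [Sequence("348"), Sequence("350")])
c352 = MediaContainer("352", [Sequence("354"), Sequence("356")])
c338 = MediaContainer("338", [Sequence("340"), Sequence("342"),
                              Sequence("344", [c346, c352])])
c324 = MediaContainer("324", [Sequence("328"), Sequence("330")])
c326 = MediaContainer("326", [Sequence("332"), Sequence("334")])
c320 = MediaContainer("320", [Sequence("322", [c324, c326]),
                              Sequence("336", [c338])])
```

Here `depth(c320)` is 3, reflecting the three-level hierarchy described in the text.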
At least two significant benefits result from the use of hierarchical media containers. First, the logical relationship between related media sequences may be reflected in the structure of the media container itself. Second, different media sequences within a single movie may be driven by different time coordinate systems.
These benefits make it possible to use movies for applications that previously required complex, custom-designed objects. In addition, the task of editing and combining movies is simplified. Various applications made possible by the ability to embed media containers within media containers shall now be described in greater detail.
Synchronously-Played Embedded Movies
In the simplest example of an embedded movie, all media sequences are time-based and the time coordinate system that applies to the embedded movie is slaved to the time coordinate system that applies to the containing movie. For example, assume that media sequences 302, 304, 306, 310, 311 and 312 are all time-based media sequences. Because media sequences 302, 304, 306 and 311 belong to media container 300, media sequences 302, 304, 306 and 311 will sequence based on a common time coordinate system during playback. Similarly, because media sequences 310 and 312 belong to media container 308, media sequences 310 and 312 will sequence based on a common time coordinate system during playback. In the simplest situation, the time coordinate system used to sequence the media sequences in media container 300 is the same time coordinate system used to sequence the media sequences in media container 308. In this situation, media sequences 310 and 312 will be played back as if they were simply two more media sequences contained in media container 300.
Asynchronously-Played Embedded Movies
The time coordinate system that applies to an embedded movie need not be slaved to the time coordinate system that applies to the containing movie. Thus, the time coordinate system for media container 300 may be a different time coordinate system than that used to sequence media container 308.
Because media containers embedded in a given media container may have different time coordinate systems than the given media container, various multimedia effects are possible. For example, a user may be able to speed up or slow down certain aspects of a movie relative to other aspects of a movie. Consider the situation in which media sequences 302, 304 and 306 represent the sound and image of a helicopter 402 as it flies from a point A to a point B, as shown in FIG. 4. Media sequences 310 and 312 may represent the sound and image of a car 404 as it travels from a point C to a point D. Because media container 308 is not slaved to the clock of media container 300, playback of the media sequences associated with car 404 relative to playback of the media sequences of helicopter 402 may vary from performance to performance.
For example, during one performance of media container 300, helicopter 402 may begin to move before car 404. During another performance, car 404 may begin to move before helicopter 402. Similarly, during one performance, helicopter 402 may move faster than car 404. During another performance, car 404 may move faster than helicopter 402. The relative playback starting times and playback rates may be based on user input. Thus, users may operate controls to cause helicopter 402 and car 404 to race across the screen, where the outcome of the race is not predetermined.
Stop Data
When an embedded movie is not slaved to the clock of the containing movie, it is possible for the playback of the embedded movie to be completed before the end of the active interval of the sample in which the embedded movie is contained. For example, if media container 308 is slaved to a clock that is running twice as fast as the clock associated with media container 300, then media container 308 may be played twice between T1 and T2. Under some circumstances it is desirable for the embedded movie to play in a continuous loop during the active interval with which it is associated. Under other circumstances, it is desirable for the embedded movie to play once and then stop, even if the active interval for the sample in which it is contained has not ended. Under yet other circumstances, it is desirable for the embedded movie to play up to N times, and then stop, where N is some specified number.
Based on the foregoing, one embodiment of the invention allows “stop data” to be stored for each embedded movie. The stop data specifies a condition under which the playback mechanism is to stop playing the embedded movie. For example, the stop data may indicate that the playback mechanism is to stop playing an embedded movie after it has been repeated ten times. The playback mechanism reads the stop data and stops playback of the embedded movie when one of the following events occurs: (1) the termination condition specified in the stop data is satisfied, or (2) the active interval associated with the embedded movie ends.
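The stop-data rule can be sketched as a small playback loop. This is an illustrative simplification (the function name and the mapping of the embedded clock onto the containing time line are assumptions): the embedded movie repeats until either its termination condition is met or its active interval ends, whichever comes first.

```python
def play_embedded(duration, active_start, active_end, max_repeats=None):
    """Return the number of complete passes of the embedded movie played.

    duration      -- one pass of the embedded movie, expressed on the
                     containing movie's time line (illustrative assumption)
    active_start,
    active_end    -- the active interval (T1, T2) of the containing sample
    max_repeats   -- the termination condition from the stop data, or
                     None to loop for the whole active interval
    """
    repeats = 0
    t = active_start
    while t + duration <= active_end:
        # stop condition from the stop data takes effect even if the
        # active interval has not yet ended
        if max_repeats is not None and repeats >= max_repeats:
            break
        t += duration
        repeats += 1
    return repeats
```

With an embedded clock running twice as fast (one pass lasting half the active interval), the movie plays twice with no stop data, but only once if the stop data limits it to a single repetition.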
Time-Independent Within Time-Based
Hierarchical media containers may be used to mix time-based media sequences with time-independent media sequences. For example, media sequences 302, 304 and 306 may be time-based media sequences while media sequences 310 and 312 are time-independent sequences.
FIG. 5 illustrates one application of a time-independent movie embedded in a time-based movie. In FIG. 5, media sequences 302, 304 and 306 provide the video and sound for a helicopter 502 flying across a screen 508. While the helicopter 502 is flying across the screen 508, a user may browse through a book 504. The text of book 504 may be stored in media sequence 310, and the sound of turning pages may be stored in media sequence 312. When a user selects the upper portion 506 of a page, the media sequences 310 and 312 are advanced (text of the next page is shown, and the sound of a page turning is generated). The rate at which the user turns pages has no effect on the rate at which helicopter 502 moves across the screen 508.
Time-Based Within Time-Independent
A media container storing time-based sequences may also be embedded in a media container that stores time-independent sequences. For example, media sequences 302, 304 and 306 may be time-independent media sequences, while media sequences 310 and 312 are time-dependent sequences.
Referring to FIG. 6, media sequences 302, 304 and 306 may correspond to a series of static scenes. The user may move from one scene to the next by entering user input to cause media sequences 302, 304 and 306 to sequence to subsequent information samples. One of the static scenes 600 may include the image of a television 602. Media sequences 310 and 312 may store video and audio that is played on the television 602. The rate at which media sequences 310 and 312 are played is unrelated to the rate at which a user moves from one scene to the next.
Time-Independent Within Time-Independent
A media container storing time-independent sequences may be embedded in a media container that stores time-independent sequences. For example, each sample in the media sequences of the containing media container may correspond to a chapter of a book. One of the media sequences of the containing media container may contain embedded movies. Each embedded movie may contain the text for each page in a chapter of the book. One button may be provided for sequencing the containing movie (to move from chapter to chapter). A second button may be provided for sequencing the embedded movies (to move from page to page within a chapter).
Editing Operations
With the movie structures of the prior art, editing operations are complicated by the fact that edits to some tracks in a movie may require edits to some but not all other tracks in the movie. Editing operations may be simplified by using the hierarchical media container structure of the present invention to reflect relationships between media sequences. Specifically, related media sequences may be stored in the same media container, while unrelated media sequences are assigned to different containers in the hierarchical structure.
Consider the example of a speech made during a performance of the national anthem. Edits to the video track should be reflected in the audio track and the subtitle track. However, video edits should not affect the sound tracks. Using a hierarchical media container structure, the media sequences of the video, audio and subtitle tracks can be stored in a first media container, and the media sequences of the sound tracks can be stored in a second media container. The second media container can be embedded in the first media container. Editing utilities may then be configured to respond to edits by automatically editing all media sequences that belong to the same media container as the edited media sequence, and to leave all other media sequences intact.
The process of incorporating clips from many movies into a single movie is also simplified through the use of embedded movies. Consider, for example, a movie with sixty tracks created by combining tracks from twenty different sources. In the prior art movie structure, there is no indication of relationship between tracks. Therefore, the inadvertent destruction of synchronization between related tracks is difficult to avoid. Using embedded media containers, the relationship between related tracks may be maintained.
Specifically, all of the clips from the same source can be stored in the same embedded media container. Thus, when a media sequence is modified, the editor need only look to the other media sequences in the same media container to determine whether other media sequences must be modified responsive to the modification. Consequently, embedded containers provide to editors the ability to maintain logical media sequence groupings. This ability, in turn, makes complex editing operations more manageable.
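The editing rule described above can be sketched as a simple propagation function. The container layout and the `apply_edit` signature are assumptions introduced here for illustration; the point is only that an edit reaches the target's siblings within the same container and nothing else.

```python
def apply_edit(containers, target_seq, edit):
    """Propagate an edit to all sequences in the edited sequence's container.

    containers -- dict mapping container name -> list of sequence names
    Returns a dict of {sequence name: edit} covering the target and its
    siblings in the same container; other containers are left intact.
    """
    for name, sequences in containers.items():
        if target_seq in sequences:
            return {seq: edit for seq in sequences}
    raise KeyError(target_seq)


# The national-anthem example: the speech tracks share one container,
# while the anthem sound tracks live in an embedded container.
containers = {
    "speech": ["video", "speech_audio", "subtitles"],
    "anthem": ["anthem_left", "anthem_right"],
}
```

An edit to the video track then touches the speech audio and subtitle tracks, but leaves the anthem tracks untouched.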
Space Savings
Use of hierarchical media containers may also reduce the size of some movies. For example, consider a twenty minute movie in which a four minute musical theme is repeated five times. Using the prior art movie structure, the movie would contain a sound track covering the full twenty minutes of music. Using a hierarchical media container structure, a four minute media sequence could be stored in a first media container separate from a second media container that stores the rest of the movie. The first media container may then be embedded in the second media container. Attributes of the first media container may be set so that playback of the first media container begins with the playback of the second media container and continuously repeats until the end of playback of the second media container. During playback, the movie will appear and sound the same, but the sound data of the hierarchical media container will take up approximately one fifth as much storage space as the twenty minute sound track.
Animated Controls
Many computer applications display “controls” through which a user may designate operational parameters. Examples of such controls include scroll bars, check boxes and radio buttons. Controls typically allow a user to select one value from a predetermined range of values. For example, a user may designate the value “checked” or the value “unchecked” by interacting with a checkbox. A user may select one of a range of values by interacting with a scroll bar. In general, the more complicated the control, the more difficult it is to display and manage the control.
According to one aspect of the present invention, a “control movie” is provided. A “control movie” is a movie that performs the traditional functions of a control. More specifically, a control movie is a media container that contains at least one media sequence, where (1) samples in the media sequence are associated with parameter values and (2) the current sample of the media sequence determines the value of a parameter.
Referring to FIG. 7, a media sequence 700 of a control movie is illustrated. Media sequence 700 is a video media sequence and includes an ordered series of samples 702, 704, 706, 708 and 710 of video data. Each sample of video data in media sequence 700 represents an image. Each sample in media sequence 700 is also associated with a value for a parameter. Specifically, sample 702 is associated with the value “1”, sample 704 is associated with the value “2”, sample 706 is associated with the value “3”, sample 708 is associated with the value “4”, sample 710 is associated with the value “5” and sample 712 is associated with the value “6”.
To provide a control interface during the playback of a movie, a media container that includes media sequence 700 may be embedded into the movie. For example, a movie in which a helicopter 802 flies from a point E to a point F on a screen 804 is illustrated in FIG. 8. Embedded in the movie is a media container that includes media sequence 700. An image 806 corresponding to a sample in media sequence 700 is displayed on screen 804 during playback of the movie.
Because media sequence 700 is contained in an embedded movie, it does not sequence responsive to the time coordinate system that controls the containing movie. However, the image 806 generated by the media sequence may be used to determine a parameter associated with the containing movie, such as the rate at which the containing movie is played back.
Assume, for example, that the movie associated with helicopter 802 is played back at a rate determined by the parameter “speed”. The higher the value of “speed”, the faster the movie is sequenced during playback. Assume also that the default value for “speed” is “5”.
To use image sequence 700 as a control for the value of “speed”, the image associated with the default value of “speed” is initially displayed on screen 804. In the present example, the image associated with sample 710, which is the sample associated with the value “5”, would be displayed. This image would not change (i.e. media sequence 700 would not be sequenced) responsive to the passage of time in the time coordinate system associated with the containing movie. Rather, some other event, such as user interaction with the image 806, would trigger the sequencing of media sequence 700.
In one embodiment, media sequence 700 is sequenced responsive to the selection of arrows 810 and 812 on image 806. For example, the selection of arrow 810 will cause media sequence 700 to sequence “backward”. As a result, the image 806 will reflect the image associated with sample 708. Also, the value of “speed” will be updated to the value associated with the currently-displayed sample. The currently-displayed sample will be sample 708, which is associated with the parameter value “4”. Therefore the value of “speed” will be changed to “4”, and the rate of playback of the movie associated with helicopter 802 will decrease.
Conversely, the selection of arrow 812 will cause media sequence 700 to sequence "forward". As a result, the image 806 will reflect the image associated with sample 712. Also, the value of "speed" will be updated to the value associated with the currently-displayed sample. The currently-displayed sample will be sample 712, which is associated with the value "6". Therefore the value of "speed" will be changed to "6", and the rate of playback of the movie associated with helicopter 802 will increase.
In the embodiment described above, the sequencing of the control movie containing media sequence 700 is performed by selection of arrows 810 and 812. This selection may be performed, for example, by operating mouse 223 to position a cursor over one of arrows 810 or 812 and clicking a button on the mouse 223. Other user actuated sequencing mechanisms may also be used. For example, the sequencing of the control movie may alternatively be triggered by the selection of screen regions outside of image 806, or by pressing certain keys on keyboard 222.
Control movies have the benefit that once they are created, they may easily be embedded in other movies to provide a graphical user interface for parameter control. Because control movies are movies, they may provide visually sophisticated controls that would otherwise be difficult to display and manage.
While each sample of media sequence 700 is associated with a parameter value, other variations are possible. For example, every fifth sample may be associated with a parameter value. Upon receipt of user input designating a sequence direction, the media sequence 700 may be played in the designated direction until arriving at the next sample associated with a parameter value. Thus, the image 806 displayed responsive to the control movie will appear animated during the parameter change operation.
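The core of a control movie, as described above, is a mapping from samples to parameter values plus a user-driven sequencing operation. The following is a hedged sketch under that description; the `ControlMovie` class is a hypothetical name, not the patented API.

```python
class ControlMovie:
    """An embedded movie whose samples carry parameter values (illustrative)."""
    def __init__(self, values, start_index):
        self.values = list(values)   # one parameter value per sample
        self.index = start_index     # sample for the parameter's default value

    def sequence(self, direction):
        """Sequence the control movie: direction is +1 ('forward', arrow 812)
        or -1 ('backward', arrow 810). Returns the new parameter value."""
        self.index = max(0, min(len(self.values) - 1, self.index + direction))
        return self.values[self.index]


# The "speed" control of FIG. 8: samples 702-712 carry values 1-6,
# with the default value "5" displayed initially.
speed_control = ControlMovie(values=[1, 2, 3, 4, 5, 6], start_index=4)
```

Selecting the backward arrow would then drop "speed" from 5 to 4, and repeated forward selections would raise it toward 6, saturating at the last sample.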
Database Movies
It is possible to search the text track of movies to locate specific words or word patterns. A movie that consists of only a text track is analogous to a text document in which each sample of the text track corresponds to a different page. FIG. 9 a illustrates a media container 901 that contains a media sequence 900 in which each sample 902, 904, 906, 908 and 910 includes the name and age of a person. A movie container storing media sequence 900 may be played as shown in FIG. 9 b.
Referring to FIG. 9 b, an image 920 corresponding to the currently-played sample of media container 901 is displayed on a screen 924. Along with image 920, a control panel 922 for sequencing media container 901 is also displayed. Control panel 922 includes control arrows 926 and 928 for sequencing media container 901 backward and forward, respectively. In addition, control panel 922 includes a text box 930 into which a user may enter search terms. Entry of search terms into text box 930 initiates a search for terms in the media container 901. If the terms are found, then the media container 901 is sequenced until the sample containing the search terms is displayed.
As described above, media container 901 may be used as a rudimentary database. However, the database provided by media container 901 has the disadvantage that searches are performed on all of the text in the media container 901. For most database applications, it is desirable to limit searches to specified fields. For example, one may want to search for someone born on the tenth of a month, but not someone who is ten years old. If the birth date field cannot be searched separately from the age field, then a search for “10” will match all instances of “10”, including people who are “10” years old.
The hierarchical media container format described herein may be used to segregate a movie database into fields. Referring to FIG. 10 a, it illustrates a media container 1000 that includes a plurality of media sequences 1016, 1017, 1019, and 1021. Media sequences 1017, 1019 and 1021 respectively contain media containers 1004, 1006 and 1008. Each of the embedded media containers 1004, 1006 and 1008 includes a text media sequence 1010, 1012 and 1014, respectively.
Media sequence 1016 is a media sequence that stores the names of database fields, such as “Name”, “Age” and “Birthday”. Each sample of media sequence 1010 stores text that indicates the name of an individual. Each sample of media sequence 1012 stores text that indicates the age of an individual. Each sample of media sequence 1014 stores text that indicates the birthday of an individual. Media sequences 1010, 1012 and 1014 are ordered such that at any given sequence location, all three media sequences represent data from the same individual. For example, samples 1018, 1020 and 1022, which are all located at the first sequence position, respectively store the name, age and birthday of the same individual.
Referring to FIG. 10 b, media container 1000 may be “played” to display an image 1050 on a screen 1052. The image 1050 includes a region 1054 in which the current sample of media sequence 1016 is displayed, a region 1056 in which the current sample of media sequence 1010 is displayed, a region 1058 in which the current sample of media sequence 1012 is displayed, and a region 1060 in which the current sample of media sequence 1014 is displayed.
Screen 1052 also contains a control panel 1062 that contains control arrows 1064 and 1066 analogous to control arrows 926 and 928 of FIG. 9 b, and a text box 1068 analogous to text box 930 of FIG. 9 b. Control panel 1062 also contains controls 1070, 1072 and 1074 that allow a user to choose one or more of the available fields. Because the data for each field is contained in a separate media container, searches may be performed on a field-by-field basis. For example, if a user selects control 1072 and enters “10” into text box 1068, the search is limited to the contents of media container 1006. If a match is found, media container 1000 is sequenced until the sample of sequence 1012 in which the match occurred is displayed.
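The field-by-field search enabled by the structure of FIG. 10 can be sketched as follows. This is an illustrative assumption of how such a search might work: each field is represented by the text sequence of its embedded container, and all fields share sequence positions, so a hit in one field positions every sequence in the containing movie.

```python
def search(fields, field_name, term):
    """Search one field's text sequence for `term`.

    fields -- dict mapping field name -> ordered list of text samples
              (each list stands for the text sequence of one embedded
              media container, e.g. 1010, 1012, 1014)
    Returns the sequence position of the first matching sample, or None.
    """
    for position, sample in enumerate(fields[field_name]):
        if term in sample:
            return position
    return None


# Sample data following the Name/Age/Birthday example in the text.
fields = {
    "Name":     ["Alice", "Bob", "Carol"],
    "Age":      ["10", "34", "27"],
    "Birthday": ["3/4", "6/10", "12/1"],
}
```

This captures the distinction drawn above: searching "10" in the Birthday field finds the person born on the tenth, while the same term in the Age field finds the ten-year-old, and the two never collide.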
As is evident by the foregoing, the hierarchical movie structure described herein allows movies to be applied to applications that have previously required complex, customized programming. A single hierarchical movie, in the form of a media container that contains embedded media containers, can contain an entire multimedia application. Further, all or some of the embedded movies may be time-independent. Thus, the timing of one segment or aspect of the resulting movie may vary from performance to performance relative to other segments or aspects of the movie. In addition, the ability to group related media sequences into containers simplifies the movie editing process.
While specific embodiments of the present invention have been described, various modifications and substitutions will become apparent to one skilled in the art by this disclosure. Such modifications and substitutions are within the scope of the present invention, and are intended to be covered by the following claims.

Claims (19)

1. A system for playing a movie comprising:
a first storage area to store a first set of media sequences, the first set including at least one media sequence;
a second storage area to store a second set of media sequences, the second set including at least one other media sequence, the second set having an embed relationship with the first set wherein the second set does not inherit the temporal frame of reference environment of the first set; and
a display device coupled to a processor to sequentially play a media sequence from the first set and to sequentially play a media sequence from the second set according to the embed relationship, the processor coupled to the first and second storage areas.
2. The system of claim 1, wherein the embed relationship specifies that one of the sets is logically embedded in the other of the sets.
3. The system of claim 1, wherein one of the sets of media sequences comprises at least one video sequence, and the other of the sets of media sequences comprises at least one audio sequence corresponding to the at least one video sequence.
4. The system of claim 1, wherein one of the sets of media sequences comprises at least one time-based sequence, and the other of the sets of media sequences comprises at least one time-independent sequence.
5. The system of claim 1, wherein the first set of media sequences is driven by a first time coordinate system, and the second set of media sequences is driven by a second time coordinate system that is different from the first time coordinate system.
6. The system of claim 1, wherein the first set of media sequences is driven by a first time coordinate system, and the second set of media sequences is driven by a second time coordinate system, and wherein one of the time coordinate systems is slaved to the other of the time coordinate systems.
7. The system of claim 1, wherein the first set of media sequences is driven by a first time coordinate system, and the second set of media sequences is driven by a second time coordinate system, and wherein the time coordinate systems are independent from one another.
8. The system of claim 1, wherein the processor causes the display device to play the media sequence from the second set in a continuous loop during a specified active interval.
9. The system of claim 8, wherein upon reaching the end of the active interval, the processor causes the display device to stop playing the media sequence from the second set.
10. The system of claim 8, wherein the specified active interval corresponds to the duration of the sequential playing of the media sequence from the first set.
11. The system of claim 1, wherein the processor causes the display device to play the media from the second set a predefined number of times.
12. The system of claim 1, wherein the media sequence from the first set has a first duration and the media sequence from the second set has a second duration different from the first duration.
13. The system of claim 1, further comprising a third storage area, coupled to the processor, to store at least one condition under which the processor is to cause the display device to stop playing a specified media sequence, and wherein, responsive to a specified condition being met, the processor causes the display device to stop playing the specified media sequence.
14. The system of claim 13, wherein the specified condition comprises expiry of a predefined period of time.
15. The system of claim 13, wherein the specified condition comprises playback of the media sequence a predefined number of times.
16. A system for storing media sequences, comprising:
i) one or more storage resources to store:
a top level media container having at least one of the media sequences;
a lower level media container within one of the media sequences and having another one of the media sequences wherein the one of the media sequences has a different temporal frame of reference environment than the another one of the media sequences;
wherein, each of the media sequences include a plurality of samples;
ii) a display;
iii) a processor; and
iv) a presentation of keys to be touched by a user.
17. The system of claim 16, wherein at least one of the media sequences comprises at least one database field name.
18. The system of claim 16, wherein the display, keys and processor operate to accept a navigation command from a user; and present a selected sample responsive to the navigation command.
19. The system of claim 16, wherein the display, keys and processor operate to accept a search command from a user; and responsive to the search command, perform the search specified in the search command to obtain at least one search result; and display at least one search result.
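The slaved and independent time coordinate systems recited in claims 5 through 7 can be illustrated with a small sketch. This is a reading of the claim language, not the claimed implementation; the `Clock` class and its linear rate/offset relationship are assumptions made for illustration.

```python
# Hypothetical sketch: a "slaved" time coordinate system derives its
# time from a master clock, while an "independent" one advances on
# its own local time.

class Clock:
    def __init__(self, rate=1.0, offset=0.0, master=None):
        self.rate = rate
        self.offset = offset
        self.master = master  # if set, this clock is slaved (claim 6)
        self._local = 0.0

    def tick(self, dt):
        """Advance this clock's own local time."""
        self._local += dt

    def now(self):
        if self.master is not None:
            # Slaved: time is a linear function of the master's time.
            return self.master.now() * self.rate + self.offset
        return self._local


master = Clock()
slave = Clock(rate=0.5, master=master)  # slaved to the master (claim 6)
independent = Clock()                   # independent clock (claim 7)

master.tick(8.0)
independent.tick(3.0)
```

Here advancing the master moves the slaved clock automatically, while the independent clock is unaffected, matching the distinction the claims draw between the two timing relationships.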
US11/497,183 1995-12-11 2006-07-31 Apparatus and method for storing a movie within a movie Expired - Fee Related US7636090B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/497,183 US7636090B2 (en) 1995-12-11 2006-07-31 Apparatus and method for storing a movie within a movie

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US08/570,542 US5751281A (en) 1995-12-11 1995-12-11 Apparatus and method for storing a movie within a movie
US09/049,715 US6297830B1 (en) 1995-12-11 1998-03-27 Apparatus and method for storing a movie within a movie
US09/911,946 US6630934B2 (en) 1995-12-11 2001-07-23 Apparatus and method for storing a movie within a movie
US10/638,037 US7102644B2 (en) 1995-12-11 2003-08-08 Apparatus and method for storing a movie within a movie
US11/497,183 US7636090B2 (en) 1995-12-11 2006-07-31 Apparatus and method for storing a movie within a movie

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/638,037 Continuation US7102644B2 (en) 1995-12-11 2003-08-08 Apparatus and method for storing a movie within a movie

Publications (2)

Publication Number Publication Date
US20060262122A1 US20060262122A1 (en) 2006-11-23
US7636090B2 true US7636090B2 (en) 2009-12-22

Family

ID=24280058

Family Applications (5)

Application Number Title Priority Date Filing Date
US08/570,542 Expired - Lifetime US5751281A (en) 1995-12-11 1995-12-11 Apparatus and method for storing a movie within a movie
US09/049,715 Expired - Lifetime US6297830B1 (en) 1995-12-11 1998-03-27 Apparatus and method for storing a movie within a movie
US09/911,946 Expired - Lifetime US6630934B2 (en) 1995-12-11 2001-07-23 Apparatus and method for storing a movie within a movie
US10/638,037 Expired - Fee Related US7102644B2 (en) 1995-12-11 2003-08-08 Apparatus and method for storing a movie within a movie
US11/497,183 Expired - Fee Related US7636090B2 (en) 1995-12-11 2006-07-31 Apparatus and method for storing a movie within a movie

Family Applications Before (4)

Application Number Title Priority Date Filing Date
US08/570,542 Expired - Lifetime US5751281A (en) 1995-12-11 1995-12-11 Apparatus and method for storing a movie within a movie
US09/049,715 Expired - Lifetime US6297830B1 (en) 1995-12-11 1998-03-27 Apparatus and method for storing a movie within a movie
US09/911,946 Expired - Lifetime US6630934B2 (en) 1995-12-11 2001-07-23 Apparatus and method for storing a movie within a movie
US10/638,037 Expired - Fee Related US7102644B2 (en) 1995-12-11 2003-08-08 Apparatus and method for storing a movie within a movie

Country Status (1)

Country Link
US (5) US5751281A (en)

Families Citing this family (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5751281A (en) * 1995-12-11 1998-05-12 Apple Computer, Inc. Apparatus and method for storing a movie within a movie
JPH11506250A (en) * 1996-03-04 1999-06-02 フィリップス エレクトロニクス ネムローゼ フェンノートシャップ User-oriented multimedia presentation system
US6340978B1 (en) * 1997-01-31 2002-01-22 Making Everlasting Memories, Ltd. Method and apparatus for recording and presenting life stories
US7657835B2 (en) * 1997-01-31 2010-02-02 Making Everlasting Memories, L.L.C. Method and system for creating a commemorative presentation
KR100647448B1 (en) * 1997-10-30 2006-11-23 코닌클리케 필립스 일렉트로닉스 엔.브이. Method for coding a presentation
JPH11203782A (en) * 1998-01-06 1999-07-30 Sony Corp Information recording and reproducing device and control method therefor
US6788292B1 (en) * 1998-02-25 2004-09-07 Sharp Kabushiki Kaisha Display device
US7139970B2 (en) 1998-04-10 2006-11-21 Adobe Systems Incorporated Assigning a hot spot in an electronic artwork
US6268864B1 (en) 1998-06-11 2001-07-31 Presenter.Com, Inc. Linking a video and an animation
US6307550B1 (en) 1998-06-11 2001-10-23 Presenter.Com, Inc. Extracting photographic images from video
EP1097568A2 (en) * 1998-06-11 2001-05-09 Presenter.Com Creating animation from a video
US6081278A (en) * 1998-06-11 2000-06-27 Chen; Shenchang Eric Animation object having multiple resolution format
US7233321B1 (en) 1998-12-15 2007-06-19 Intel Corporation Pointing device with integrated audio input
US6751604B2 (en) * 1999-01-06 2004-06-15 Hewlett-Packard Development Company, L.P. Method of displaying temporal and storage media relationships of file names protected on removable storage media
US6847373B1 (en) * 1999-04-16 2005-01-25 Avid Technology, Inc. Natural color matching in a video editing system
EP1054321A3 (en) * 1999-05-21 2002-06-19 Sony Corporation Information processing method and apparatus
US6404441B1 (en) * 1999-07-16 2002-06-11 Jet Software, Inc. System for creating media presentations of computer software application programs
US20050128220A1 (en) * 1999-08-03 2005-06-16 Marrin Christopher F. Methods and apparatuses for adjusting a frame rate when displaying continuous time-based content
WO2004042552A1 (en) * 1999-08-03 2004-05-21 Sony Electronics Inc. Declarative markup for scoring multiple time-based assets and events within a scene composition system
US7330186B2 (en) * 1999-08-03 2008-02-12 Sony Corporation Methods and systems for scoring multiple time-based assets and events
US6707456B1 (en) 1999-08-03 2004-03-16 Sony Corporation Declarative markup for scoring multiple time-based assets and events within a scene composition system
US6856322B1 (en) 1999-08-03 2005-02-15 Sony Corporation Unified surface model for image based and geometric scene composition
JP4629173B2 (en) * 1999-09-17 2011-02-09 ソニー株式会社 Recording apparatus, recording method, and recording medium
US6522333B1 (en) * 1999-10-08 2003-02-18 Electronic Arts Inc. Remote communication through visual representations
US6976032B1 (en) 1999-11-17 2005-12-13 Ricoh Company, Ltd. Networked peripheral for visitor greeting, identification, biographical lookup and tracking
US7653925B2 (en) * 1999-11-17 2010-01-26 Ricoh Company, Ltd. Techniques for receiving information during multimedia presentations and communicating the information
US7299405B1 (en) * 2000-03-08 2007-11-20 Ricoh Company, Ltd. Method and system for information management to facilitate the exchange of ideas during a collaborative effort
US6249281B1 (en) 2000-02-28 2001-06-19 Presenter.Com On-demand presentation graphical user interface
US6922702B1 (en) 2000-08-31 2005-07-26 Interactive Video Technologies, Inc. System and method for assembling discrete data files into an executable file and for processing the executable file
US20020026521A1 (en) * 2000-08-31 2002-02-28 Sharfman Joshua Dov Joseph System and method for managing and distributing associated assets in various formats
US6839059B1 (en) 2000-08-31 2005-01-04 Interactive Video Technologies, Inc. System and method for manipulation and interaction of time-based mixed media formats
US20020091840A1 (en) * 2000-11-28 2002-07-11 Gregory Pulier Real-time optimization of streaming media from a plurality of media sources
US20020180803A1 (en) * 2001-03-29 2002-12-05 Smartdisk Corporation Systems, methods and computer program products for managing multimedia content
US7683903B2 (en) 2001-12-11 2010-03-23 Enounce, Inc. Management of presentation time in a digital media presentation system with variable rate presentation capability
AU2002231289A1 (en) * 2000-12-19 2002-07-01 Coolernet, Inc. System and method for multimedia authoring and playback
US20060129933A1 (en) * 2000-12-19 2006-06-15 Sparkpoint Software, Inc. System and method for multimedia authoring and playback
US20030009485A1 (en) * 2001-06-25 2003-01-09 Jonni Turner System and method for recombinant media
JP3695581B2 (en) * 2001-08-08 2005-09-14 ソニー株式会社 Recording apparatus, recording method, recording medium, and electronic camera
US20040205116A1 (en) * 2001-08-09 2004-10-14 Greg Pulier Computer-based multimedia creation, management, and deployment platform
US7203380B2 (en) * 2001-11-16 2007-04-10 Fuji Xerox Co., Ltd. Video production and compaction with collage picture frame user interface
US6612418B2 (en) * 2002-01-14 2003-09-02 General Mills, Inc. System for use in an assembly line
US7113183B1 (en) 2002-04-25 2006-09-26 Anark Corporation Methods and systems for real-time, interactive image composition
US20050142371A1 (en) * 2002-05-31 2005-06-30 Swain Stuart D. Phosphorescent sheets or films having protective topcoat and methods of making the same
EP1556753A4 (en) * 2002-11-01 2008-03-05 Sony Electronics Inc Declarative markup for scoring multiple time-based assets and events within a scene composition system
US7082572B2 (en) * 2002-12-30 2006-07-25 The Board Of Trustees Of The Leland Stanford Junior University Methods and apparatus for interactive map-based analysis of digital video content
US7385613B1 (en) * 2002-12-31 2008-06-10 Adobe Systems Incorporated Collecting scripts in a distributed scripting environment
US20040165012A1 (en) * 2003-02-20 2004-08-26 International Business Machines Corp. Cascading menu with selectable offset
US20050088458A1 (en) * 2003-07-31 2005-04-28 Marrin Christopher F. Unified surface model for image based and geometric scene composition
US7689712B2 (en) 2003-11-26 2010-03-30 Ricoh Company, Ltd. Techniques for integrating note-taking and multimedia information
US7720924B2 (en) * 2003-12-12 2010-05-18 Syniverse Icx Corporation System providing methodology for the restoration of original media quality in messaging environments
US20050268249A1 (en) * 2004-05-21 2005-12-01 Paulo Colaco-Dias System and method for multiple document interface
US20060200745A1 (en) * 2005-02-15 2006-09-07 Christopher Furmanski Method and apparatus for producing re-customizable multi-media
US8805929B2 (en) * 2005-06-20 2014-08-12 Ricoh Company, Ltd. Event-driven annotation techniques
US7554576B2 (en) * 2005-06-20 2009-06-30 Ricoh Company, Ltd. Information capture and recording system for controlling capture devices
US8077179B2 (en) * 2005-07-11 2011-12-13 Pandoodle Corp. System and method for creating animated video with personalized elements
KR101187787B1 (en) * 2006-02-18 2012-10-05 삼성전자주식회사 Method and apparatus for searching moving picture using key frame
US7823056B1 (en) * 2006-03-15 2010-10-26 Adobe Systems Incorporated Multiple-camera video recording
US8006189B2 (en) * 2006-06-22 2011-08-23 Dachs Eric B System and method for web based collaboration using digital media
US20090024922A1 (en) * 2006-07-31 2009-01-22 David Markowitz Method and system for synchronizing media files
US7623755B2 (en) 2006-08-17 2009-11-24 Adobe Systems Incorporated Techniques for positioning audio and video clips
US20090297120A1 (en) * 2006-09-20 2009-12-03 Claudio Ingrosso Methods and apparatus for creation and presentation of polymorphic media
US20090297121A1 (en) * 2006-09-20 2009-12-03 Claudio Ingrosso Methods and apparatus for creation, distribution and presentation of polymorphic media
EP2068323A3 (en) * 2006-09-20 2009-07-01 John W Hannay & Company Limited Methods and apparatus for creation, distribution and presentation of polymorphic media
US7685154B2 (en) * 2006-10-13 2010-03-23 Motorola, Inc. Method and system for generating a play tree for selecting and playing media content
US8643653B2 (en) * 2007-06-08 2014-02-04 Apple Inc. Web-based animation
US20100077289A1 (en) 2008-09-08 2010-03-25 Eastman Kodak Company Method and Interface for Indexing Related Media From Multiple Sources
US8380866B2 (en) * 2009-03-20 2013-02-19 Ricoh Company, Ltd. Techniques for facilitating annotations
US8407596B2 (en) * 2009-04-22 2013-03-26 Microsoft Corporation Media timeline interaction
US20100318571A1 (en) * 2009-06-16 2010-12-16 Leah Pearlman Selective Content Accessibility in a Social Network
US20110161166A1 (en) * 2009-12-30 2011-06-30 Mindrum G Scott System and method for capturing, processing, and presenting information
KR20120053609A (en) * 2010-11-18 2012-05-29 삼성전자주식회사 Method and apparatus for providing electronic book function in portable terminal
US10158847B2 (en) * 2014-06-19 2018-12-18 Vefxi Corporation Real—time stereo 3D and autostereoscopic 3D video and image editing
US11341725B2 (en) 2018-09-27 2022-05-24 Apple Inc. Intermediary emergent content

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4538188A (en) * 1982-12-22 1985-08-27 Montage Computer Corporation Video composition method and apparatus
US5109482A (en) * 1989-01-11 1992-04-28 David Bohrman Interactive video control system for displaying user-selectable clips
US5202961A (en) * 1990-06-08 1993-04-13 Apple Computer, Inc. Sequential information controller
US5237648A (en) * 1990-06-08 1993-08-17 Apple Computer, Inc. Apparatus and method for editing a video recording by selecting and displaying video clips
US5339393A (en) * 1993-04-15 1994-08-16 Sony Electronics, Inc. Graphical user interface for displaying available source material for editing
US5359712A (en) 1991-05-06 1994-10-25 Apple Computer, Inc. Method and apparatus for transitioning between sequences of digital information
US5388197A (en) * 1991-08-02 1995-02-07 The Grass Valley Group, Inc. Video editing system operator interface for visualization and interactive control of video material
US5442744A (en) * 1992-04-03 1995-08-15 Sun Microsystems, Inc. Methods and apparatus for displaying and editing multimedia information
US5592609A (en) * 1994-10-31 1997-01-07 Nintendo Co., Ltd. Video game/videographics program fabricating system and method with unit based program processing
US5592602A (en) * 1994-05-17 1997-01-07 Macromedia, Inc. User interface and method for controlling and displaying multimedia motion, visual, and sound effects of an object on a display
US5606655A (en) * 1994-03-31 1997-02-25 Siemens Corporate Research, Inc. Method for representing contents of a single video shot using frames
US5680619A (en) * 1995-04-03 1997-10-21 Mfactory, Inc. Hierarchical encapsulation of instantiated objects in a multimedia authoring system
US5751281A (en) * 1995-12-11 1998-05-12 Apple Computer, Inc. Apparatus and method for storing a movie within a movie
US5831617A (en) 1995-11-27 1998-11-03 Bhukhanwala; Saumil A. Browsing and manipulating objects using movie like icons
US6463206B1 (en) * 1993-07-29 2002-10-08 Gemstar Development Corporation Television and video cassette recorder system with an electronic program guide


Non-Patent Citations (9)

* Cited by examiner, † Cited by third party
Title
A Structure for Transportable, Dynamic Multimedia Documents, by Dick C. A. Bulterman et al., USENIX Summer '91, Nashville, TN, Jun. 1991. *
Adobe Premiere 3.0 User Guide, pp. 5-6, 22, 28, 220-223, c. 1994. *
Adobe Premiere 4.0 User Guide, pp. 49, 264, May 1994. *
Adobe Premiere User Guide, Adobe Systems Incorporated, May 1994. *
Adobe Premiere 3.0 User Guide, Chapters 1 and 2, 1993. *
CCWS: A Computer-Based Multimedia Information System, by A. Poggio et al., IEEE, Jan. 1985. *
Object Composition and Playback Models for Handling Multimedia Data, by Rei Hamakawa et al., National Conference on Multimedia, Anaheim, California, U.S.A., Jan. 1993. *
RCA Model VR673HF User's Guide, pp. 12, 17, 29, 1994. *
The O2 System, by O. Deux et al., Communications of the ACM, vol. 34, no. 10, Oct. 1991. *

Also Published As

Publication number Publication date
US7102644B2 (en) 2006-09-05
US5751281A (en) 1998-05-12
US20010043218A1 (en) 2001-11-22
US20040047208A1 (en) 2004-03-11
US6297830B1 (en) 2001-10-02
US6630934B2 (en) 2003-10-07
US20060262122A1 (en) 2006-11-23

Similar Documents

Publication Publication Date Title
US7636090B2 (en) Apparatus and method for storing a movie within a movie
US11467706B2 (en) Multipurpose media players
US5524193A (en) Interactive multimedia annotation method and apparatus
US5680639A (en) Multimedia control system
JP4064489B2 (en) Method and system for multimedia application development sequence editor using time event specification function
US5600775A (en) Method and apparatus for annotating full motion video and other indexed data structures
US5596696A (en) Method and apparatus for synchronizing graphical presentations
US5537528A (en) System and method for inputting scene information
US5553222A (en) Multimedia synchronization system
JP3943636B2 (en) Computer controlled display system
JP3943635B2 (en) Method for controlling session playback points in a computer controlled display system
US5717869A (en) Computer controlled display system using a timeline to control playback of temporal data representing collaborative activities
US5717879A (en) System for the capture and replay of temporal data representing collaborative activities
JP3378759B2 (en) Method and system for multimedia application development sequence editor using spacer tool
US20050069225A1 (en) Binding interactive multichannel digital document system and authoring tool
US20050071736A1 (en) Comprehensive and intuitive media collection and management tool
KR101661772B1 (en) Copying of animation effects from a source object to at least one target object
US20010056434A1 (en) Systems, methods and computer program products for managing multimedia content
JPH1031662A (en) Method and system for multimedia application development sequence editor using synchronous tool
WO1994027235A1 (en) Midi synchronization system
JPH0232473A (en) Moving image retrieving/editing system
JP2713147B2 (en) How to edit multimedia presentations
US20230032115A1 (en) Multipurpose media players
JPH06282426A (en) Interactive program generation assisting device
JPH1031661A (en) Method and system for multimedia application development sequence editor

Legal Events

Date Code Title Description
AS Assignment

Owner name: APPLE INC.,CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:APPLE COMPUTER, INC., A CALIFORNIA CORPORATION;REEL/FRAME:019279/0140

Effective date: 20070109

Owner name: APPLE INC., CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:APPLE COMPUTER, INC., A CALIFORNIA CORPORATION;REEL/FRAME:019279/0140

Effective date: 20070109

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.)

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20171222