US20140244607A1 - System and Method for Real-Time Media Presentation Using Metadata Clips - Google Patents
- Publication number
- US20140244607A1 (application US 14/267,092)
- Authority
- US
- United States
- Prior art keywords
- media
- media file
- metadata
- clip
- network location
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F17/30053
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/40—Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
- G06F16/43—Querying
- G06F16/438—Presentation of query results
- G06F16/4387—Presentation of query results by the use of playlists
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
Definitions
- The present invention relates generally to media playback. More particularly, the present invention relates to processing and presentation of media content.
- On-demand streaming media has become one of the most popular methods of media consumption. By offering users the ability to choose amongst a wide variety of user generated and commercially produced content, and by offering the convenience of anytime, anywhere access with an available Internet connection, on-demand media streaming has solidified its position as a killer application for today's wired media age. However, precisely because streaming media offers such a wealth of choices to the user, it may be difficult to navigate to relevant or interesting portions of the vast amount of media content available.
- One way of manageably presenting from such a large catalog of media is to present only clips, or smaller subsections of larger media. This way, the user can selectively view only the segments of interest.
- However, to provide such clip viewing functionality, many services require that a completely separate media clip be generated from the original media, which results in inefficient use of server resources through redundant content. Due to scarce storage, bandwidth, and processing resources, users may find themselves limited in the amount of media content they can provide and view.
- Besides clip segments, user variations or alternative versions of original media content can provide further options for tailoring content to user interests. For example, parodies using fake captions or replaced audio tracks can enhance the entertainment value of content, and also enable users to participate in the content creation process. Remixing and juxtaposing clips may also be a desirable feature, as well as adding intelligent playback behaviors.
- However, existing streaming video applications provide only a limited amount of user input for the manipulation of media streaming behavior, with acceptable user input typically limited to merely playback manipulation of a single media source without any further creative input. Users must therefore learn complex video editing software to achieve a desired result, a technical barrier that many users cannot surmount.
- Moreover, the resulting edited media requires its own hosting, storage, and distribution, which may be difficult to secure, particularly if a user is a prolific media content editor.
- FIG. 1 presents a block diagram of a media database for supporting real-time media presentation using metadata clips, according to one embodiment of the present invention.
- FIG. 2 presents a block diagram of a system for real-time media presentation using metadata clips, according to one embodiment of the present invention.
- FIG. 3 presents a block diagram of a media device for providing one or more media files for a display using metadata clips, according to one embodiment of the present invention.
- FIG. 4 shows a flowchart describing the steps, according to one embodiment of the present invention, by which a media device can provide one or more media files for a display using metadata clips.
- The present application is directed to a system and method for real-time media presentation using metadata clips.
- The following description contains specific information pertaining to the implementation of the present invention.
- One skilled in the art will recognize that the present invention may be implemented in a manner different from that specifically discussed in the present application. Moreover, some of the specific details of the invention are not discussed in order not to obscure the invention. The specific details not described in the present application are within the knowledge of a person of ordinary skill in the art.
- The drawings in the present application and their accompanying detailed description are directed to merely exemplary embodiments of the invention. To maintain brevity, other embodiments of the invention, which use the principles of the present invention, are not specifically described in the present application and are not specifically illustrated by the present drawings.
- FIG. 1 presents a block diagram of a media database for supporting real-time media presentation using metadata clips, according to one embodiment of the present invention.
- Environment 100 of FIG. 1 includes media database 110 , web server 120 , client 125 , and media servers 130 a - 130 b .
- Media database 110 includes records 111 a - 111 c and metadata clips 115 a - 115 c .
- Record 111 a contains tags 112 a and attributes 113 a
- record 111 b contains tags 112 b and attributes 113 b
- record 111 c contains tags 112 c and attributes 113 c .
- Web server 120 includes processor 123 and memory 122 storing media player service 121 .
- Client 125 includes processor 128 and memory 127 storing web browser 126 .
- Media server 130 a includes media file 135 a , which includes audio stream 136 a and video stream 137 a .
- Media server 130 b includes media file 135 b , which includes audio stream 136 b and video stream 137 b.
- Client 125 may comprise, for example, a personal computer, a set-top box, a videogame console, a mobile device, or some other device in communication with web server 120 over a communication network, such as the Internet.
- Processor 128 may execute web browser 126 from memory 127 to establish a connection from client 125 to web server 120 .
- Processor 123 of web server 120 may be executing media player service 121 from memory 122 , which may comprise a hosted application that interfaces with media database 110 for presenting a media player interface to client 125 using Flash, Java, Hypertext Markup Language (HTML), and other web technologies interpretable by web browser 126 .
- Alternatively, a different web server may remotely access media player service 121 through an Application Program Interface (API) or through web standard protocols such as Simple Object Access Protocol (SOAP), Extensible Markup Language (XML), and Hypertext Transfer Protocol (HTTP).
- This may allow, for example, embedding a player widget on a personal website, a social networking site, a message board, or any other web destination, or extending a player widget with additional functionality using other web services.
- Although web-centric protocols are used for environment 100, alternative protocols may also be used in other embodiments.
- For example, a standalone media player and service executable could be distributed to client 125, allowing client 125 to interface directly with media database 110.
- Alternatively, client 125 might comprise a tightly integrated embedded system with built-in media player capabilities.
- Thus, a web server may not always be used as an intermediary, but may be included in many usage scenarios.
- Within media database 110, a plurality of metadata clips is stored and associated with records containing further information for the metadata clips.
- Metadata clips 115a-115c are each respectively associated with records 111a-111c.
- Media published on the Internet, such as video clips, images, and audio, is often "tagged" with descriptive tags by moderators or users, which can be stored in tags 112a-112c. Search algorithms may use these tags to find media keyed to the user's interests.
- For example, tags might indicate media genres, authorship, media content, categories, ratings, time periods, and other information that may be helpful for sorting through large media catalogues.
- Attributes 113 a - 113 c may also indicate additional information regarding metadata clips 115 a - 115 c besides the information already provided through tags 112 a - 112 c .
- For example, attributes 113a-113c may indicate source quality information, such as video format, compression codecs and bit-rates, video resolution, audio sampling rate, number of audio channels, and other information regarding source media quality.
- Additionally, attributes 113a-113c may include metadata needed to support digital rights management (DRM), if implemented, and other access and permission schemes.
- Attributes 113 a - 113 c may also include statistical information, such as access logs or user ratings, to help gauge popularity for organizing search results.
- Using this information, media player service 121 may serve the most appropriate content to client 125.
- web browser 126 may report to media player service 121 the capabilities of client 125 , such as screen resolution, video window size, processor speed, sound output configuration, and other details. If, for example, multiple quality variations are available for metadata clips referencing the same content, media player service 121 can read attributes 113 a - 113 c to choose a metadata clip providing optimal quality, considering hardware constraints at client 125 . Additionally, media player service 121 may also test a connection speed with client 125 , allowing the matching of metadata clips of varying bandwidth requirements to varying network conditions, maintaining a high quality user experience even during network congestion.
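The capability-matching logic described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the variant records, field names, and thresholds are hypothetical stand-ins for the quality information stored in attributes 113a-113c.

```python
def choose_variant(variants, max_height, bandwidth_kbps):
    """Pick the highest-quality metadata clip variant the client can
    handle, given its reported screen size and measured connection speed."""
    playable = [v for v in variants
                if v["height"] <= max_height and v["kbps"] <= bandwidth_kbps]
    if not playable:
        # Nothing fits the constraints; fall back to the lightest variant.
        return min(variants, key=lambda v: v["kbps"])
    return max(playable, key=lambda v: (v["height"], v["kbps"]))

# Hypothetical quality variations of the same referenced content.
variants = [
    {"clip": "115a-hd", "height": 1080, "kbps": 5000},
    {"clip": "115a-sd", "height": 480,  "kbps": 1200},
]

print(choose_variant(variants, max_height=1080, bandwidth_kbps=3000)["clip"])
```

On a congested connection the measured bandwidth drops and the lower-bitrate variant is selected; on a fast connection the HD variant wins.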
- Access logs can also provide some information about the popularity of particular metadata clips, and media player service 121 could be configured to present the most popular metadata clips first, as determined by total access counts, user ratings, or some other metric. Since other information besides access counts and user ratings can be stored in media database 110, media searches can scale to arbitrary complexity and can match against any number of different attributes.
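A tag search with popularity ranking like the one described might look like the following sketch; the record layout and the view-count attribute are assumptions for illustration only.

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    """Loose analogue of records 111a-111c: tags plus attributes."""
    clip_id: str
    tags: set = field(default_factory=set)
    attributes: dict = field(default_factory=dict)

def search_clips(records, tag):
    """Match a user-selected tag, then rank matches by view count."""
    matches = [r for r in records if tag in r.tags]
    return sorted(matches,
                  key=lambda r: r.attributes.get("views", 0),
                  reverse=True)

records = [
    Record("115a", {"Animals", "Cats"}, {"views": 500}),
    Record("115b", {"Animals", "Birds"}, {"views": 100}),
    Record("115c", {"Sports"}, {"views": 900}),
]

# "Animals" matches 115a and 115b; the more popular clip is listed first.
print([r.clip_id for r in search_clips(records, "Animals")])
```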
- Metadata clips 115a-115c stored in media database 110 only provide references to the actual media files to be served to client 125. Metadata clips may reference the same source media, or may reference different source media. As shown in FIG. 1, metadata clips 115a-115b refer to different segments of the same media file 135a on media server 130a, whereas metadata clip 115c refers to a segment of media file 135b on media server 130b. By accessing metadata clips from media database 110, media player service 121 is enabled to flexibly play back portions of media files from any media server.
- Thus, media player service 121 can flexibly stream any arbitrary media file segments in real-time to web browser 126 by using metadata clips provided by media database 110, without the need to generate separate media files.
- As shown in FIG. 1, metadata clip 115a includes information for media player service 121 to locate the actual referenced source media file, with a filename of "Cats.avi." In alternative embodiments, a hash value and a file database may be used instead of accessing by filename directly. Additionally, metadata clip 115a includes location information for retrieving the actual referenced source media file from a network, in the form of a uniform resource locator (URL) pointing to media server 130a via an Internet Protocol (IP) address. Media server 130a may, for example, reside on a local intranet shared with web server 120, preventing outside third parties from directly accessing media server 130a. Media server 130b may also be present on the same local intranet. With the above location information available to media player service 121, media file 135a can be located and retrieved.
- Since metadata clip 115a specifies the portion of media file 135a that will be demanded, media player service 121 can retrieve only the portion that is necessary for streaming to web browser 126.
- Metadata clip 115a specifies that the clip has a referenced start position of 00:01:30.000, or one minute thirty seconds, and a referenced end position of 00:01:45.000, or one minute forty-five seconds.
- Alternatively, metadata clip 115a may instead specify a length or duration of playback, from which the referenced end position can be derived.
- Thus, media player service 121 might retrieve only the portion of media file 135a necessary to support playback from 00:01:30.000 to 00:01:45.000. Additionally, if metadata clip 115a specified only a video or audio portion, then transfer of audio stream 136a or video stream 137a, respectively, could be omitted. If media file 135a is a particularly long media file, this can help conserve network and processing resources, reducing costs of operation and providing a better end user experience.
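The clip fields just described (source filename, network location, and referenced start/end positions, or alternatively a duration) can be modeled roughly as below. The field names, the example URL, and the millisecond conversion are illustrative assumptions, not the patent's actual data format.

```python
from dataclasses import dataclass
from typing import Optional

def to_ms(timestamp: str) -> int:
    """Convert an 'HH:MM:SS.mmm' position string to milliseconds."""
    hms, millis = timestamp.split(".")
    h, m, s = (int(part) for part in hms.split(":"))
    return ((h * 60 + m) * 60 + s) * 1000 + int(millis)

@dataclass
class MetadataClip:
    filename: str                      # e.g. "Cats.avi"
    url: str                           # network location of the media server
    start: str                         # referenced start position
    end: Optional[str] = None          # referenced end position, or...
    duration_ms: Optional[int] = None  # ...a duration to derive it from

    def end_ms(self) -> int:
        """Referenced end position in ms, derived from duration if needed."""
        if self.end is not None:
            return to_ms(self.end)
        return to_ms(self.start) + self.duration_ms

clip = MetadataClip("Cats.avi", "http://192.0.2.1/media",
                    "00:01:30.000", end="00:01:45.000")
print(clip.end_ms() - to_ms(clip.start))  # length of the requested segment in ms
```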
- In one example use case, web browser 126 may access media player service 121, which presents a player interface allowing a user to search for videos.
- The player interface might present a list of broad topics to choose from as a list of tags, from which the user clicks the "Animals" tag.
- Media player service 121 then queries media database 110 to provide all available metadata clips matching the "Animals" tag. Records 111a-111c of media database 110 are thus searched, and tags 112a-112c can be matched against the "Animals" tag, which may be present for record 111a and record 111b.
- Attributes 113a might indicate that metadata clip 115a has been viewed 500 times, whereas attributes 113b might indicate that metadata clip 115b has been viewed 100 times.
- When media player service 121 presents the results, it can show metadata clip 115a, the more popular clip, first, and metadata clip 115b second.
- Media player service 121 may present the resulting clips for individual selection, or may play them sequentially in a playlist.
- FIG. 2 presents a block diagram of a system for real-time media presentation using metadata clips, according to one embodiment of the present invention.
- Environment 200 includes web server 220 , client 225 , playlists 240 a - 240 b , and media server 230 .
- Web server 220 includes processor 223 and memory 222 storing media player service 221 .
- Client 225 includes processor 228 and memory 227 storing web browser 226 .
- Media server 230 includes media files 235 a - 235 d .
- Media file 235 a includes audio stream 236 a and video stream 237 a .
- Media file 235 b includes audio stream 236 b and video stream 237 b .
- Media file 235 c includes audio stream 236 c and video stream 237 c .
- Media file 235 d includes audio stream 236 d .
- Playlist 240 a includes metadata clips 215 a - 215 d and playback script 241 a .
- With respect to FIG. 1, web server 220 corresponds to web server 120, client 225 corresponds to client 125, and media server 230 corresponds to media servers 130a-130b.
- Playlist 240 a shows an example of how multiple metadata clips can be arranged together to form one continuous sequential media stream viewable by a user.
- Playback script 241 a gives media player service 221 some guidance on how to interpret metadata clips 215 a - 215 d .
- For example, playback script 241a may specify that processor 223 of web server 220, executing media player service 221 from memory 222, should stream to processor 228 of client 225, executing web browser 226 from memory 227, the referenced video from metadata clips 215a, 215b, and 215c, in order, accompanied by the referenced audio from metadata clip 215d.
- In this manner, disparate video clips can be unified with a continuous background soundtrack, rather than being interrupted by individual audio streams.
- For example, media files 235a-235c may each be set to a different song, but playlist 240a can specify an external soundtrack, media file 235d, to maintain audio consistency throughout playlist 240a.
- Multiple audio tracks might be specified to play concurrently at varying volume levels to support, for example, audio commentary mixed with an original audio track playing softly in the background.
- Media player service 221 therefore needs to access only video stream 237a, video stream 237b, video stream 237c, and audio stream 236d.
- Audio streams 236a-236c can be left untouched, as can unused portions of video streams 237a-237c, allowing web server 220 to stream content more efficiently.
- Storage usage can be optimized as well, since a new media file does not need to be generated and stored to accommodate playlist 240a.
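The selective retrieval described above can be sketched as a simple fetch plan: the server walks the playlist and lists only the streams the playback script actually calls for. The dictionary layout is a hypothetical stand-in for playlist 240a, and the start positions of media files 235a-235b are invented for illustration (only the 00:02:30.000 video and 00:02:51.000 audio end positions come from the source).

```python
def plan_streams(playlist):
    """List only the (file, stream, start, end) segments the server must
    retrieve; everything else is left untouched on the media server."""
    plan = []
    for clip in playlist["clips"]:
        for stream in clip["use"]:
            plan.append((clip["file"], stream, clip["start"], clip["end"]))
    return plan

playlist_240a = {
    "clips": [
        {"file": "235a", "use": ["video"], "start": "00:00:00.000", "end": "00:00:30.000"},
        {"file": "235b", "use": ["video"], "start": "00:00:00.000", "end": "00:00:20.000"},
        {"file": "235c", "use": ["video"], "start": "00:01:30.000", "end": "00:02:30.000"},
        {"file": "235d", "use": ["audio"], "start": "00:00:00.000", "end": "00:02:51.000"},
    ],
}

plan = plan_streams(playlist_240a)
# Audio streams 236a-236c never appear in the plan, so they are never fetched.
print(len(plan))  # 4 segments: three video streams plus the soundtrack
```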
- Playback script 241 a may contain other details besides the playback ordering of metadata clips. For example, visual effects such as transition effects can be specified, where a two second cross fading effect is chosen for video transitions between video clips in playlist 240 a . References to captions, subtitles, image overlays, commentary, external web links, and other related content could also be included in playback script 241 a.
- Event behaviors can also be implemented during playback, to provide more robust playback behaviors than simply playing media from start to finish. Playback start, playback end, rewinding, muting, pausing, and other events may trigger these playback behaviors.
- In playback script 241a, for example, one such event behavior specifies that after playlist 240a finishes playback, playback should continue to a separate, externally referenced playlist 240b.
- Thus, processor 223 executing media player service 221 can automatically continue playback using externally referenced playlist 240b, whose contents are omitted from FIG. 2 for brevity.
- This ability to reference external playlists can be extended to implement a hierarchical playlist structure, with inherited behaviors and properties.
- For example, a master playlist could reference several sub-playlists, with global behaviors defined in the master playlist that are inherited by the sub-playlists.
- The sub-playlists can include independent behaviors besides the inherited behaviors, or themselves reference other hierarchical playlists.
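Such a hierarchical playlist with inherited behaviors could be sketched as follows; the behavior dictionary and its keys are hypothetical, chosen only to show how a sub-playlist inherits and overrides master-playlist properties.

```python
class Playlist:
    """Sub-playlists inherit global behaviors from a master playlist
    and may override or extend them with their own."""
    def __init__(self, name, behaviors=None, parent=None):
        self.name = name
        self.own_behaviors = behaviors or {}
        self.parent = parent

    def effective_behaviors(self):
        # Walk up the hierarchy; locally defined entries win over inherited ones.
        inherited = self.parent.effective_behaviors() if self.parent else {}
        return {**inherited, **self.own_behaviors}

master = Playlist("master", {"transition": "crossfade", "on_end": "loop"})
sub = Playlist("sub", {"on_end": "stop"}, parent=master)

print(sub.effective_behaviors())
```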
- Alternatively, playback script 241a may specify continuing playback after the processing of playlist 240a, where the continuing playback begins using the last referenced source file and the last referenced end position as a new start position.
- In that case, playback would continue at video stream 237c and audio stream 236d, using the referenced 00:02:30.000 video end position as a new video start position and the referenced 00:02:51.000 audio end position as a new audio start position.
- Playback may continue until video stream 237c or audio stream 236d ends. If video stream 237c and audio stream 236d do not end concurrently, then a black screen may be shown for missing video, and silence may be substituted for missing audio.
- Moreover, various conditions can be attached to these event behaviors. During playback, these conditions can be monitored for completion before implementing the associated event behaviors. For example, instead of unconditionally jumping to playlist 240b at the end of playlist 240a, a button might be overlaid on top of video playback on web browser 226, with the message "Click here to watch my favorite funny home videos!" If the user clicks the button, then media player service 221 may process that click event to jump directly to playlist 240b, even during the middle of playback. If playlist 240a instead comprised a collection of music videos for an artist, playback script 241a could include event behaviors where clicks on the video window take the user to an e-commerce site selling the associated album or other merchandise.
- Another event behavior could be included where pausing the video playback brings up an informational pane providing links to tour dates and an official website for the artist. Yet another event behavior might automatically display subtitles if the user mutes the audio. These event behaviors can enhance the user's viewing experience by reacting intelligently to user actions, providing relevant or related information, and accessing related media.
- Event behaviors may also be utilized to limit media access, to enforce user registration and login restrictions, to implement trial periods, or for other purposes. For example, a teaser selection of a longer media playlist might be presented to the user, where after a predetermined time of media playback, processing of the playlist is terminated and the user is prompted to provide login credentials or sign up as a new member to resume playback of the playlist. Alternatively, playback may continue, but the event behaviors may specify randomly interspersed prompts pausing playback and exhorting the user to register or login.
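Conditional event behaviors of this kind amount to an event dispatcher that checks an attached condition before running an action. The following is an illustrative sketch, not the patent's implementation; the event names and login flag are assumptions.

```python
class Player:
    """Dispatch playback events to registered (condition, action) pairs."""
    def __init__(self):
        self.handlers = {}   # event name -> list of (condition, action)
        self.log = []
        self.logged_in = False

    def on(self, event, action, condition=lambda player: True):
        self.handlers.setdefault(event, []).append((condition, action))

    def fire(self, event):
        for condition, action in self.handlers.get(event, []):
            if condition(self):
                action(self)

player = Player()
# At the end of the playlist, prompt unregistered users to log in.
player.on("playlist_end",
          lambda p: p.log.append("prompt_login"),
          condition=lambda p: not p.logged_in)
# Muting the audio automatically displays subtitles.
player.on("mute", lambda p: p.log.append("show_subtitles"))

player.fire("playlist_end")
player.fire("mute")
print(player.log)
```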
- FIG. 3 presents a block diagram of a media device for providing one or more media files for a display using metadata clips, according to one embodiment of the present invention.
- Environment 300 includes web server 320 , client 325 , input device 329 , media server 330 , playlist 340 , and display 350 .
- Web server 320 includes processor 323 and memory 322 storing media player service 321 .
- Client 325 includes processor 328 and memory 327 storing web browser 326 .
- Display 350 includes video window 360 , seek bar 370 , position 371 , start position 375 , end position 376 , and playlist window 380 .
- Comparing with FIG. 2, it should be noted that web server 320 corresponds to web server 220, that client 325 corresponds to client 225, that media server 330 corresponds to media server 230, and that playlist 340 corresponds to playlist 240a.
- processor 328 of client 325 executing web browser 326 from memory 327 and processor 323 of web server 320 executing media player service 321 from memory 322 provide a client-server web based model for providing media files on display 350 .
- Web server 320 acts as the primary media device for handling media retrieval and logic processing,
- while client 325, via web browser 326, interprets a player interface provided by media player service 321 to output to display 350.
- However, other models could be adopted, such as an integrated system on a single device, where client 325 and web server 320 are combined into one media device.
- Display 350 shows a simplified player interface example that may be presented by web browser 326 through media player service 321 .
- For example, the components of display 350 might be contained within a larger browser window, with video window 360 displaying a video of cats to the upper-left, seek bar 370 updated below, and playlist window 380 displayed to the right.
- Playlist window 380 shows a sequential listing of metadata clips contained in the currently playing playlist, corresponding to playlist 240a of FIG. 2, with the presently playing metadata clip highlighted by a thicker border.
- Position 371 may be updated as media playback advances to reflect a present position within the presently playing metadata clip.
- Start position 375, shown as a right-facing triangle, indicates the start position corresponding to the currently playing metadata clip.
- End position 376, shown as a left-facing triangle, indicates the end position corresponding to the currently playing metadata clip.
- Selection 377, filled in black, shows the segment of playback time defined by start position 375 and end position 376.
- Although only a single set of position indicators and one selection is shown on seek bar 370, multiple position indicators and selections may be possible where several metadata clips reference the same source media file. For example, if playlist 340 instead contained metadata clip 115a and metadata clip 115b from FIG. 1, then two sets of start positions, end positions, and selections might be shown on seek bar 370, since both metadata clips 115a-115b refer to the same media file 135a, but at different positions. However, since only one selection may be playing back at a time, the inactive selection might be grayed out to visually indicate that it is not the presently active selection.
- The player interface depicted in display 350 could also support, via user input read from input device 329, the creation and editing of playlists and metadata clips.
- Input device 329 may comprise, for example, a keyboard, a pointing device such as a mouse, a pointer for use with a touchscreen, or some other input device providing user input.
- Processor 328 executing web browser 326 on client 325 may then detect the user input from input device 329 . This user input may then be forwarded to media player service 321 executing on processor 323 of web server 320 , depending on whether the user input may be handled solely on the client side via web browser 326 , or whether the server side driven by media player service 321 is also necessary.
- For example, web browser 326 may be able to solely handle local events such as moving a cursor across display 350, but user input affecting external data such as playlist 340 may need to be sent to media player service 321 for further processing.
- For example, start position 375 and end position 376 might be clicked and dragged across seek bar 370 via input device 329 to adjust the start and end positions, with selection 377 updated accordingly.
- Correspondingly, the underlying referenced start position and referenced end position may also be updated in the relevant metadata clip in playlist 340.
- Video window 360 might be clicked and dragged into playlist window 380 using input device 329 to add the presently playing metadata clip to playlist 340 , or items from playlist window 380 might be clicked and dragged into a trashcan icon using input device 329 to remove items from playlist 340 .
- Additional player interface elements might be added to support adding subtitles, referencing outside audio sources, linking to other playlists, adding user-defined custom event behaviors, and other editing and creation facilities.
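Dragging a start or end marker ultimately means writing a clamped value back into the referenced metadata clip, roughly as in the sketch below; the millisecond fields and the one-millisecond clamping rule are illustrative assumptions.

```python
def drag_marker(clip, marker, new_pos_ms):
    """Move a seek-bar marker, keeping the selection non-empty, and write
    the result back into the referenced metadata clip's positions."""
    if marker == "start":
        clip["start_ms"] = min(new_pos_ms, clip["end_ms"] - 1)
    else:
        clip["end_ms"] = max(new_pos_ms, clip["start_ms"] + 1)
    return clip

clip = {"start_ms": 90_000, "end_ms": 105_000}  # 00:01:30.000 - 00:01:45.000
drag_marker(clip, "end", 120_000)    # extend the selection to 00:02:00.000
drag_marker(clip, "start", 150_000)  # overshoot: clamped just before the end
print(clip)
```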
- FIG. 4 shows a flowchart describing the steps, according to one embodiment of the present invention, by which a media device can provide one or more media files for a display using metadata clips.
- Certain details and features have been left out of flowchart 400 that are apparent to a person of ordinary skill in the art.
- For example, a step may comprise one or more substeps or may involve specialized equipment or materials, as known in the art.
- While steps 410 through 440 indicated in flowchart 400 are sufficient to describe one embodiment of the present invention, other embodiments of the invention may utilize steps different from those shown in flowchart 400.
- Step 410 of flowchart 400 comprises web server 320 generating one or more of a plurality of metadata clips from user input of input device 329 received from a player interface displayed by web browser 326 on display 350.
- For example, a user at client 325 might search various media files from media server 330 to assemble playlist 340, providing start position 375 and end position 376 for metadata clips referencing those media files.
- Using input device 329, which may comprise a keyboard and a pointing device such as a mouse, the user can manipulate the player interface shown on display 350. In this manner, the user can move start position 375 and end position 376 to create selection 377 for each media file referenced in playlist window 380.
- Afterwards, web browser 326 executing on processor 328 of client 325 can notify media player service 321 executing on processor 323 of web server 320 to store the newly created metadata clips in playlist 340.
- Previously generated metadata clips may also be added to playlist 340.
- As shown in FIG. 1, media database 110 already contains metadata clips 115a-115c, which may have been previously generated by fellow users or moderators.
- Processor 123 of web server 120 may then execute a search query through media player service 121 to retrieve metadata clips from media database 110 matching certain user specified criteria, such as genre or author. Matching metadata clips may then be added to playlist 340 , in addition to any metadata clips newly generated by the user.
- Step 420 of flowchart 400 comprises web server 320 determining playlist 340, including a listing of at least one metadata clip from the plurality of metadata clips of step 410.
- As previously described, the user may use input device 329 to add and remove metadata clips in playlist window 380, which may correspond to a listing of metadata clips in playlist 340.
- Alternatively, other users or moderators might assemble predetermined playlists, such as playlists 240a-240b of FIG. 2, for browsing and selection by the user of client 325.
- For the purposes of this example, step 420 may determine playlist 340 to be the same as playlist 240a of FIG. 2.
- Step 430 of flowchart 400 comprises web server 220 processing playlist 240a to retrieve metadata clips 215a-215d for playback at the referenced start and end positions.
- Thus, media player service 221 may stream to web browser 226 the video and audio streams indicated by playback script 241a.
- Audio stream 236d of media file 235d, or "Soundtrack.mp3," is streamed to web browser 226. Concurrently and in sequential order, three video streams are also sent to web browser 226.
- For example, the segment from position 00:01:30.000 to 00:02:30.000 of video stream 237c of media file 235c, or "Penguins.avi," is streamed to web browser 226.
- Since web server 220 can intelligently stream the above referenced media files, separate media files do not need to be generated and stored, optimizing server resource usage.
- Step 440 of flowchart 400 comprises web server 320 beginning a playback on display 350 according to the processed playlist of step 430.
- Continuing with the example above, media player service 221 of web server 220 may direct client 225 to play back on display 350, via web browser 226, video streams 237a-237c sequentially at the indicated start and end positions, while concurrently playing back audio stream 236d through an audio output device attached to client 225, which may, for example, be integrated into display 350.
- In this manner, a user at client 325 can view, edit, and enjoy selections of media from a wide variety of different source media on display 350.
- Moreover, various event behaviors can also be supported to provide flexible behaviors beyond standard start-to-finish playback.
Abstract
There is provided a system and method for real-time media presentation using metadata clips. There is provided a media device for providing one or more media files for a display, the media device including a memory and a processor. The memory includes a plurality of metadata clips, wherein each of the plurality of metadata clips references a referenced source media from the media files, a referenced start position, and a referenced end position. The processor can determine a playlist including some of the plurality of metadata clips for streaming segments of media files as defined by their referenced start and end positions. Flexible behaviors can also be added to enhance playback logic or to enforce access restrictions, and metadata clips and playlists may also be user generated.
Description
- 1. Field of the Invention
- The present invention relates generally to media playback. More particularly, the present invention relates to processing and presentation of media content.
- 2. Background Art
- On-demand streaming media has become one of the most popular methods of media consumption. By offering users the ability to choose amongst a wide variety of user generated and commercially produced content, and by offering the convenience of anytime, anywhere access with an available Internet connection, on-demand media streaming has solidified its position as a killer application for today's wired media age. However, precisely because streaming media offers such a wealth of choices to the user, it may be difficult to navigate to relevant or interesting portions of the vast amount of media content available.
- One way of manageably presenting media from such a large catalog is to present only clips, or smaller subsections of larger media. This way, the user can selectively view only the segments of interest. However, to provide such clip viewing functionality, many services require that a completely separate media clip be generated from the original media, which results in inefficient use of server resources through redundant content. Due to scarce storage, bandwidth, and processing resources, users may find themselves limited in the amount of media content they can provide and view.
- Besides clip segments, user variations or alternative versions of original media content can provide further options for tailoring content to user interests. For example, parodies using fake captions or replaced audio tracks can enhance the entertainment value of content, and also enable users to participate in the content creation process. Remixing and juxtaposing clips may also be a desirable feature, as well as adding intelligent playback behaviors. However, existing streaming video applications provide only a limited amount of user input for the manipulation of media streaming behavior, with acceptable user input typically limited to mere playback manipulation of a single media source without any further creative input. Users must therefore learn complex video editing software to achieve a desired result, a technical barrier that many users cannot surmount. Moreover, the resulting edited media requires its own hosting, storage, and distribution, which may be difficult to secure, particularly if a user is a prolific media content editor.
- Accordingly, there is a need to overcome the drawbacks and deficiencies in the art by providing a way to present and modify streaming media in a flexible manner optimized for efficient resource usage.
- There are provided systems and methods for real-time media presentation using metadata clips, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.
- The features and advantages of the present invention will become more readily apparent to those ordinarily skilled in the art after reviewing the following detailed description and accompanying drawings, wherein:
-
FIG. 1 presents a block diagram of a media database for supporting real-time media presentation using metadata clips, according to one embodiment of the present invention;
-
FIG. 2 presents a block diagram of a system for real-time media presentation using metadata clips, according to one embodiment of the present invention;
-
FIG. 3 presents a block diagram of a media device for providing one or more media files for a display using metadata clips, according to one embodiment of the present invention; and
-
FIG. 4 shows a flowchart describing the steps, according to one embodiment of the present invention, by which a media device can provide one or more media files for a display using metadata clips.
- The present application is directed to a system and method for real-time media presentation using metadata clips. The following description contains specific information pertaining to the implementation of the present invention. One skilled in the art will recognize that the present invention may be implemented in a manner different from that specifically discussed in the present application. Moreover, some of the specific details of the invention are not discussed in order not to obscure the invention. The specific details not described in the present application are within the knowledge of a person of ordinary skill in the art. The drawings in the present application and their accompanying detailed description are directed to merely exemplary embodiments of the invention. To maintain brevity, other embodiments of the invention, which use the principles of the present invention, are not specifically described in the present application and are not specifically illustrated by the present drawings.
-
FIG. 1 presents a block diagram of a media database for supporting real-time media presentation using metadata clips, according to one embodiment of the present invention. Environment 100 of FIG. 1 includes media database 110, web server 120, client 125, and media servers 130a-130b. Media database 110 includes records 111a-111c and metadata clips 115a-115c. Record 111a contains tags 112a and attributes 113a, record 111b contains tags 112b and attributes 113b, and record 111c contains tags 112c and attributes 113c. Web server 120 includes processor 123 and memory 122 storing media player service 121. Client 125 includes processor 128 and memory 127 storing web browser 126. Media server 130a includes media file 135a, which includes audio stream 136a and video stream 137a. Media server 130b includes media file 135b, which includes audio stream 136b and video stream 137b.
-
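For purposes of illustration only, the relationship between metadata clips and their associated records described above may be sketched as follows. This is a minimal sketch, not part of the disclosed embodiments: all field names, tag values, and the network address are hypothetical, chosen merely to mirror the elements of FIG. 1.

```python
from dataclasses import dataclass, field

@dataclass
class MetadataClip:
    # A metadata clip only references source media; it stores no media data itself.
    filename: str   # referenced source media, e.g. "Cats.avi"
    url: str        # network location of the hosting media server
    start: str      # referenced start position, "HH:MM:SS.mmm"
    end: str        # referenced end position, "HH:MM:SS.mmm"

@dataclass
class Record:
    # A record associates a clip with searchable tags and attributes.
    clip: MetadataClip
    tags: set = field(default_factory=set)          # e.g. {"Animals", "Cats"}
    attributes: dict = field(default_factory=dict)  # e.g. {"views": 500, "codec": "h264"}

# Hypothetical instances mirroring metadata clip 115a and record 111a.
clip_115a = MetadataClip("Cats.avi", "http://192.0.2.10/media", "00:01:30.000", "00:01:45.000")
record_111a = Record(clip_115a, {"Animals", "Cats"}, {"views": 500})
print(record_111a.clip.filename)  # Cats.avi
```

Because a record carries only references and metadata, many records can point at the same underlying media file without duplicating any media content.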
Environment 100 presents an overview of various components to support media streaming to a user at client 125. Client 125 may comprise, for example, a personal computer, a set-top box, a videogame console, a mobile device, or some other device in communication with web server 120 over a communication network, such as the Internet. Processor 128 may execute web browser 126 from memory 127 to establish a connection from client 125 to web server 120. Processor 123 of web server 120 may be executing media player service 121 from memory 122, which may comprise a hosted application that interfaces with media database 110 for presenting a media player interface to client 125 using Flash, Java, Hypertext Markup Language (HTML), and other web technologies interpretable by web browser 126. Although only a single client and a single web server are shown for simplicity in environment 100, alternative embodiments may serve multiple clients and utilize multiple web servers for load balancing, availability, or for other reasons. Additionally, although only three metadata clips and two media servers are shown for simplicity in FIG. 1, a commercial implementation may include thousands or even millions of metadata clips to reference a similarly large library of media files hosted on several different media servers.
- In alternative embodiments, a different web server may remotely access media player service 121 through an Application Program Interface (API) or through web standard protocols such as Simple Object Access Protocol (SOAP), Extensible Markup Language (XML), and HTTP (Hypertext Transfer Protocol). This may allow, for example, embedding a player widget on a personal website, a social networking site, a message board, or any other web destination, or extending a player widget with additional functionality using other web services. Although web-centric protocols are used for environment 100, alternative protocols may also be used in other embodiments. For example, a standalone media player and service executable could be distributed to client 125, allowing client 125 to interface directly with media database 110. In alternative embodiments, client 125 might comprise a tightly integrated embedded system with built-in media player capabilities. Thus, a web server may not always be used as an intermediary, but may be included in many usage scenarios.
- In
media database 110, a plurality of metadata clips are stored and associated with records containing further information for the metadata clips. In FIG. 1 specifically, metadata clips 115a-115c are each respectively associated with records 111a-111c. This allows each metadata clip to be associated with additional metadata, information, references to other data, and other helpful data for providing more robust search, navigation, and player functionality for media player service 121. For example, media published on the Internet, such as video clips, images, and audio, are often "tagged" with descriptive tags by moderators or users, which can be stored in tags 112a-112c. Search algorithms may use these tags to find media keyed to the user's interests. For example, tags might indicate media genres, authorship, media content, categories, ratings, time periods, and other information that may be helpful for sorting through large media catalogues. Attributes 113a-113c may also indicate additional information regarding metadata clips 115a-115c besides the information already provided through tags 112a-112c. For example, attributes 113a-113c may indicate source quality information, such as video format, compression codecs and bit-rates, video resolution, audio sampling rate, number of audio channels, and other information regarding source media quality. Additionally, attributes 113a-113c may include metadata needed to support digital rights management (DRM), if implemented, and other access and permission schemes. Attributes 113a-113c may also include statistical information, such as access logs or user ratings, to help gauge popularity for organizing search results.
- These attributes may be helpful for media player service 121 to serve the most appropriate content to client 125. For example, web browser 126 may report to media player service 121 the capabilities of client 125, such as screen resolution, video window size, processor speed, sound output configuration, and other details. If, for example, multiple quality variations are available for metadata clips referencing the same content, media player service 121 can read attributes 113a-113c to choose a metadata clip providing optimal quality, considering hardware constraints at client 125. Additionally, media player service 121 may also test a connection speed with client 125, allowing the matching of metadata clips of varying bandwidth requirements to varying network conditions, maintaining a high quality user experience even during network congestion. As previously mentioned, access logs can also provide some information about the popularity of particular metadata clips, and media player service 121 could be configured to present the most popular metadata clips first, as determined by total access counts, user ratings, or some other metric. Since other information besides access counts and user ratings can be stored in media database 110, media searches can scale to arbitrary complexity and can match against any number of different attributes.
- Metadata clips 115a-115c, stored in
media database 110, only provide a reference to the actual media files to be served to client 125. Metadata clips may reference the same referenced source media, or may reference different referenced source media. As shown in FIG. 1, metadata clips 115a-115b refer to different segments of the same media file 135a on media server 130a, whereas metadata clip 115c refers to a segment of media file 135b on media server 130b. By accessing metadata clips from media database 110, media player service 121 is enabled to flexibly play back portions of media files from any media server. Conventionally, to provide clip segments from a longer media file, or to combine multiple clip segments into a continuous sequential stream, a completely separate media file needs to be generated, as typical media player services and applications are only configured to play back media files from start to completion. However, media player service 121 can flexibly stream in real-time to web browser 126 from any arbitrary media file segments by using metadata clips provided by media database 110, without the need to generate separate media files.
- For example, assume media player service 121 is requested to stream metadata clip 115a from media database 110. As shown in FIG. 1, metadata clip 115a includes information for media player service 121 to locate the actual referenced source media file, with a filename of "Cats.avi." In alternative embodiments, a hash value and a file database may be used instead of accessing by filename directly. Additionally, metadata clip 115a includes location information for retrieving the actual referenced source media file from a network, in the form of a uniform resource locator (URL) pointing to media server 130a via an Internet Protocol (IP) address. Media server 130a may, for example, reside on a local intranet shared with web server 120, preventing outside third parties from directly accessing media server 130a. Media server 130b may also be present on the same local intranet. With the above location information available to media player service 121, media file 135a can be located and retrieved.
- Additionally, since
metadata clip 115a specifies the portion of media file 135a that will be demanded, media player service 121 can retrieve only the portion that is necessary for streaming to web browser 126. Metadata clip 115a specifies the clip has a referenced start position of 00:01:30.000, or one minute thirty seconds, and a referenced end position of 00:01:45.000, or one minute forty-five seconds. In alternative embodiments, metadata clip 115a may instead specify a length or duration of playback, from which the referenced end position can be derived.
- Thus, rather than retrieving the entire contents of media file 135a, media player service 121 might retrieve only a portion of media file 135a necessary to support playback from 00:01:30.000 to 00:01:45.000. Additionally, if metadata clip 115a only specified video or audio portions, then transfer of audio stream 136a or video stream 137a could also be omitted. If media file 135a is a particularly long media file, this can help conserve network and processing resources, reducing costs of operation and providing a better end user experience.
- For example,
web browser 126 may access media player service 121, which presents a player interface allowing a user to search for videos. The player interface might present a list of broad topics to choose as a list of tags, from which the user clicks the "Animals" tag. Media player service 121 then queries media database 110 to provide all available metadata clips matching the "Animals" tag. Records 111a-111c of media database 110 are thus searched, and tags 112a-112c can be matched against the "Animals" tag, which may be present for record 111a and record 111b. Attributes 113a might indicate that metadata clip 115a has been viewed 500 times, whereas attributes 113b might indicate that metadata clip 115b has been viewed 100 times. Thus, when media player service 121 presents the results, it can show metadata clip 115a, the more popular clip, first, and metadata clip 115b second. Media player service 121 may present the resulting clips for individual selection, or may play them sequentially in a playlist.
-
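The tag matching and popularity ordering described in this example may be sketched as follows. This is an illustrative sketch only; the flat record layout and the view counts are assumptions loosely mirroring records 111a-111c, not a required database schema.

```python
# Hypothetical records: each maps a clip identifier to its tags and view count.
records = {
    "115a": {"tags": {"Animals", "Cats"}, "views": 500},
    "115b": {"tags": {"Animals", "Dogs"}, "views": 100},
    "115c": {"tags": {"Birds"},           "views": 300},
}

def search_clips(records, tag):
    """Return clip ids matching a tag, most popular (highest view count) first."""
    matches = [cid for cid, rec in records.items() if tag in rec["tags"]]
    return sorted(matches, key=lambda cid: records[cid]["views"], reverse=True)

print(search_clips(records, "Animals"))  # ['115a', '115b'] — 115a is more popular
```

Since records may hold arbitrary attributes, the sort key could equally be a user rating or any other stored metric, which is how searches can scale to match against any number of attributes.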
FIG. 2 presents a block diagram of a system for real-time media presentation using metadata clips, according to one embodiment of the present invention. Environment 200 includes web server 220, client 225, playlists 240a-240b, and media server 230. Web server 220 includes processor 223 and memory 222 storing media player service 221. Client 225 includes processor 228 and memory 227 storing web browser 226. Media server 230 includes media files 235a-235d. Media file 235a includes audio stream 236a and video stream 237a. Media file 235b includes audio stream 236b and video stream 237b. Media file 235c includes audio stream 236c and video stream 237c. Media file 235d includes audio stream 236d. Playlist 240a includes metadata clips 215a-215d and playback script 241a. With respect to FIG. 2, it should be noted that web server 220 corresponds to web server 120 from FIG. 1, that client 225 corresponds to client 125 from FIG. 1, and that media server 230 corresponds to media servers 130a-130b.
-
Playlist 240a shows an example of how multiple metadata clips can be arranged together to form one continuous sequential media stream viewable by a user. Playback script 241a gives media player service 221 some guidance on how to interpret metadata clips 215a-215d. Examining the contents of playback script 241a, processor 223 of web server 220 executing media player service 221 from memory 222 should stream, to processor 228 of client 225 executing web browser 226 from memory 227, the referenced video from metadata clips 215a-215c sequentially, while concurrently streaming the referenced audio from metadata clip 215d. By choosing a separate audio stream, disparate video clips can be unified with a continuous background soundtrack, rather than being interrupted by individual audio streams. For example, media files 235a-235c may each be set to a different song, but playlist 240a can specify an external soundtrack, media file 235d, to maintain audio consistency through playlist 240a. Multiple audio tracks might be specified to play concurrently at varying volume levels to support, for example, audio commentary mixed with an original audio track playing softly in the background.
- As shown in
FIG. 2, media player service 221 therefore only needs to access video stream 237a, video stream 237b, video stream 237c, and audio stream 236d. Audio streams 236a-236c can be left alone, and unused portions of video streams 237a-237c can also be left alone, allowing web server 220 to stream content more efficiently. Additionally, since only references to media files 235a-235c are needed, storage usage can be optimized as well, since a new media file does not need to be generated and stored for accommodating playlist 240a.
-
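The selective streaming just described, fetching only the referenced video ranges plus the single external audio track, may be sketched as follows. This is a simplified illustration of playlist 240a; the dictionary layout and the helper name `stream_requests` are hypothetical.

```python
# Video metadata clips mirroring 215a-215c, plus the external soundtrack clip 215d.
video_clips = [
    {"file": "Cats.avi",     "start": "00:00:20.000", "end": "00:01:15.000"},
    {"file": "Dogs.avi",     "start": "00:02:00.000", "end": "00:03:00.000"},
    {"file": "Penguins.avi", "start": "00:01:30.000", "end": "00:02:30.000"},
]
soundtrack = {"file": "Soundtrack.mp3", "start": "00:00:00.000", "end": "00:02:51.000"}

def stream_requests(video_clips, soundtrack):
    """Yield (stream, file, start, end) requests. Only the referenced ranges are
    fetched from the media server, so no separate edited media file is ever
    generated or stored."""
    for clip in video_clips:
        yield ("video", clip["file"], clip["start"], clip["end"])
    # The external audio track plays concurrently across all video segments.
    yield ("audio", soundtrack["file"], soundtrack["start"], soundtrack["end"])

for req in stream_requests(video_clips, soundtrack):
    print(req)
```

Unused portions of the source files, including audio streams 236a-236c, simply never appear in the request list.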
Playback script 241a may contain other details besides the playback ordering of metadata clips. For example, visual effects such as transition effects can be specified, where a two-second cross-fading effect is chosen for video transitions between video clips in playlist 240a. References to captions, subtitles, image overlays, commentary, external web links, and other related content could also be included in playback script 241a.
- Event behaviors can also be implemented during playback, to provide more robust playback behaviors than simply playing media from start to finish. Playback start, playback end, rewinding, muting, pausing, and other events may trigger these playback behaviors. In
playback script 241a, for example, one such event behavior is that after playlist 240a finishes playback, playback should continue onto a separate, externally referenced playlist 240b. Thus, processor 223 executing media player service 221 can automatically continue playback using externally referenced playlist 240b, whose contents are omitted from FIG. 2 for brevity.
- This ability to reference external playlists can be extended to implement a hierarchical playlist structure, with inherited behaviors and properties. For example, a master playlist could reference several sub-playlists, with global behaviors defined in the master playlist that are inherited by the sub-playlists. Additionally, the sub-playlists can include independent behaviors besides the inherited behaviors, or themselves reference other hierarchical playlists. With processor 223 processing these external playlists recursively through media player service 221, these hierarchical playlist structures can be readily traversed for any arbitrary number of hierarchical levels.
- In another example,
playback script 241a may specify a continuing playback after the processing of playlist 240a, where the continuing playback begins using the last referenced source file and the last referenced end position as a new start position. In this case, playback would continue at video stream 237c and audio stream 236d, using the referenced 00:02:30.000 video end position as a new video start position and the referenced 00:02:51.000 audio end position as a new audio start position. Playback may continue until video stream 237c or audio stream 236d ends. If video stream 237c and audio stream 236d do not end concurrently, then a black screen may be shown for missing video, and silence may be substituted for missing audio.
- In alternative embodiments, various conditions can be attached to these event behaviors. During playback, these conditions can be monitored for completion before implementing the associated event behaviors. For example, instead of unconditionally jumping to
playlist 240b at the end of playlist 240a, a button might be overlaid on top of video playback on web browser 226, with the message "Click here to watch my favorite funny home videos!" If the user clicks the button, then media player service 221 may process that click event to jump directly to playlist 240b, even during the middle of playback. If playlist 240a instead comprised a collection of music videos for an artist, playback script 241a could include event behaviors where clicks on the video window take the user to an e-commerce site selling the associated album or other merchandise. Another event behavior could be included where pausing the video playback brings up an informational pane providing links to tour dates and an official website for the artist. Yet another event behavior might automatically display subtitles if the user mutes the audio. These event behaviors can enhance the user's viewing experience by reacting intelligently to user actions, providing relevant or related information, and accessing related media.
- Additionally, these event behaviors may also be utilized to limit media access, to enforce user registration and login restrictions, to implement trial periods, or for other purposes. For example, a teaser selection of a longer media playlist might be presented to the user, where after a predetermined time of media playback, processing of the playlist is terminated and the user is prompted to provide login credentials or sign up as a new member to resume playback of the playlist. Alternatively, playback may continue, but the event behaviors may specify randomly interspersed prompts pausing playback and exhorting the user to register or log in.
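One way to picture event behaviors with attached conditions is a table mapping each event to a (condition, action) pair, where the action runs only if the condition holds. The sketch below is purely illustrative; the event names, the `logged_in` flag, and the action strings are hypothetical stand-ins for the behaviors described above.

```python
def jump_to_playlist(pid):
    """Hypothetical action: continue playback on another playlist."""
    return f"jump:{pid}"

# event name -> (condition over player state, action to run)
behaviors = {
    "playlist_end": (lambda state: True, lambda state: jump_to_playlist("240b")),
    "button_click": (lambda state: True, lambda state: jump_to_playlist("240b")),
    "pause":        (lambda state: True, lambda state: "show_info_pane"),
    "mute":         (lambda state: True, lambda state: "enable_subtitles"),
    # Trial-period style restriction: only fires when the user is not logged in.
    "timer":        (lambda state: not state["logged_in"], lambda state: "prompt_login"),
}

def handle_event(event, state):
    """Run the action for an event only if its attached condition holds."""
    if event not in behaviors:
        return None
    condition, action = behaviors[event]
    return action(state) if condition(state) else None

state = {"logged_in": False}
print(handle_event("mute", state))   # enable_subtitles
print(handle_event("timer", state))  # prompt_login
```

A logged-in user would see the "timer" condition fail, so playback continues uninterrupted, which matches the access-restriction behavior described above.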
-
FIG. 3 presents a block diagram of a media device for providing one or more media files for a display using metadata clips, according to one embodiment of the present invention. Environment 300 includes web server 320, client 325, input device 329, media server 330, playlist 340, and display 350. Web server 320 includes processor 323 and memory 322 storing media player service 321. Client 325 includes processor 328 and memory 327 storing web browser 326. Display 350 includes video window 360, seek bar 370, position 371, start position 375, end position 376, and playlist window 380. With respect to FIG. 3, it should be noted that web server 320 corresponds to web server 220 from FIG. 2, that client 325 corresponds to client 225, that media server 330 corresponds to media server 230, and that playlist 340 corresponds to playlist 240a.
- In FIG. 3, processor 328 of client 325 executing web browser 326 from memory 327 and processor 323 of web server 320 executing media player service 321 from memory 322 provide a client-server web-based model for providing media files on display 350. In this model, web server 320 acts as the primary media device for handling media retrieval and logic processing, whereas client 325 via web browser 326 interprets a player interface provided by media player service 321 to output to display 350. As previously discussed, other models could be adopted, such as an integrated system on a single device, where client 325 and web server 320 are combined into one media device.
-
Display 350 shows a simplified player interface example that may be presented by web browser 326 through media player service 321. The components of display 350 might be contained within a larger browser window, with video window 360 displaying a video of cats to the upper-left, seek bar 370 updated below, and playlist window 380 displayed to the right. Playlist window 380 shows a sequential listing of metadata clips contained in the currently playing playlist, corresponding to playlist 240a of FIG. 2, with a presently playing metadata clip highlighted by a thicker border. As shown in FIG. 3, the title description "Funny Cats!" applies to metadata clip 215a, or the referenced media "Cats.avi," whereas the title description "Dog Tricks!" applies to metadata clip 215b, or the referenced media "Dogs.avi," and the title description "Lovable Penguins!" applies to metadata clip 215c, or the referenced media "Penguins.avi." These title descriptions might be embedded within records of a database, as shown in FIG. 1.
- Within seek
bar 370, various components are shown to give a visual indication of various location offsets and segments. Staggered time markings are provided on seek bar 370 to give a visual indication of time versus display width, and time markings are also provided directly for start position 375 and end position 376. Position 371 may be updated as media playback advances to reflect a present position within the presently playing metadata clip. Start position 375, shown as a right-facing triangle, shows the start position corresponding to the currently playing metadata clip. End position 376, shown as a left-facing triangle, shows the end position corresponding to the currently playing metadata clip. Selection 377, filled in black, shows the segment of playback time defined by start position 375 and end position 376.
- Although only a single set of position indicators and one selection is shown on seek
bar 370, multiple position indicators and selections may be possible where several metadata clips reference the same source media file. For example, if playlist 340 instead contained metadata clip 115a and metadata clip 115b from FIG. 1, then two sets of start positions, end positions, and selections may be shown on seek bar 370, since both metadata clips 115a-115b refer to the same media file 135a, but at different positions. However, since only one selection may be playing back at a time, the inactive selection might be grayed out to visually indicate that it is not the presently active selection.
- Besides supporting playlist playback, the player interface depicted in
display 350 could also support, via user input read from input device 329, the creation and editing of playlists and metadata clips. Input device 329 may comprise, for example, a keyboard, a pointing device such as a mouse, a pointer for use with a touchscreen, or some other input device providing user input. Processor 328 executing web browser 326 on client 325 may then detect the user input from input device 329. This user input may then be forwarded to media player service 321 executing on processor 323 of web server 320, depending on whether the user input may be handled solely on the client side via web browser 326, or whether the server side driven by media player service 321 is also necessary. For example, web browser 326 may be able to solely handle local events such as moving a cursor across display 350, but user input affecting external data such as playlist 340 may need to be sent to media player service 321 for further processing.
- For example, start
position 375 and end position 376 might be clicked and dragged across seek bar 370 via input device 329 to define start position 375 and end position 376, with selection 377 updated accordingly. The underlying referenced start position and referenced end position may also be updated in the relevant referenced metadata clip in playlist 340. Video window 360 might be clicked and dragged into playlist window 380 using input device 329 to add the presently playing metadata clip to playlist 340, or items from playlist window 380 might be clicked and dragged into a trashcan icon using input device 329 to remove items from playlist 340. Additional player interface elements might be added to support adding subtitles, referencing outside audio sources, linking to other playlists, adding user-defined custom event behaviors, and other editing and creation facilities. By utilizing metadata clips to support these features and event behaviors, server resource usage can be optimized without the need to generate and store separate media file copies.
-
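Translating a drag on the seek bar into an updated referenced start position can be pictured with a small sketch. This is an assumption-laden illustration, not a disclosed implementation: the pixel-to-time mapping, the clamping rule, and all numeric values are hypothetical.

```python
def pixel_to_seconds(x, bar_width, media_duration):
    """Map a horizontal pixel offset on the seek bar to a media time,
    clamped to the valid [0, media_duration] range."""
    return max(0.0, min(media_duration, x / bar_width * media_duration))

def drag_start_marker(clip, x, bar_width, media_duration):
    """Update the clip's referenced start position, never past its end position.
    Only the metadata clip changes; the source media file is untouched."""
    t = pixel_to_seconds(x, bar_width, media_duration)
    clip["start"] = min(t, clip["end"])
    return clip

# Hypothetical clip: 90s-105s selection within a 10-minute source file.
clip = {"file": "Cats.avi", "start": 90.0, "end": 105.0}
drag_start_marker(clip, x=80, bar_width=600, media_duration=600.0)
print(clip["start"])  # 80.0
```

Dragging the end marker would work symmetrically, clamping the new end position so it never precedes the start, and the updated positions would then be stored back into the playlist's metadata clip.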
FIG. 4 shows a flowchart describing the steps, according to one embodiment of the present invention, by which a media device can provide one or more media files for a display using metadata clips. Certain details and features have been left out of flowchart 400 that are apparent to a person of ordinary skill in the art. For example, a step may comprise one or more substeps or may involve specialized equipment or materials, as known in the art. While steps 410 through 440 indicated in flowchart 400 are sufficient to describe one embodiment of the present invention, other embodiments of the invention may utilize steps different from those shown in flowchart 400.
- Referring to step 410 of
flowchart 400 in FIG. 4 and environment 300 of FIG. 3, step 410 of flowchart 400 comprises web server 320 generating one or more of a plurality of metadata clips from user input of input device 329 received from a player interface displayed by web browser 326 on display 350. A user at client 325 might search various media files from media server 330 to assemble playlist 340, providing start position 375 and end position 376 for metadata clips referencing those media files. By using input device 329, which may comprise a keyboard and a pointing device such as a mouse, the user can manipulate the player interface shown on display 350. In this manner, the user can move start position 375 and end position 376 to create selection 377 for each media file referenced in playlist window 380. In turn, web browser 326 executing on processor 328 of client 325 can notify media player service 321 executing on processor 323 of web server 320 to store the newly created metadata clips in playlist 340.
- Besides generating one or more metadata clips for
playlist 340, existing metadata clips may also be added to playlist 340. For example, examining FIG. 1, media database 110 already contains metadata clips 115a-115c, which may have been previously generated by fellow users or moderators. Processor 123 of web server 120 may then execute a search query through media player service 121 to retrieve metadata clips from media database 110 matching certain user-specified criteria, such as genre or author. Matching metadata clips may then be added to playlist 340, in addition to any metadata clips newly generated by the user.
- Referring to step 420 of
flowchart 400 in FIG. 4 and environment 300 of FIG. 3, step 420 of flowchart 400 comprises web server 320 determining playlist 340, including a listing of at least one metadata clip from the plurality of metadata clips of step 410. For example, the user may use input device 329 to add and remove metadata clips in playlist window 380, which may correspond to a listing of metadata clips in playlist 340. Alternatively, other users or moderators might assemble predetermined playlists, such as playlists 240a-240b of FIG. 2, for browsing and selection by the user of client 325. For the purposes of the present example, step 420 may determine playlist 340 to be the same as playlist 240a of FIG. 2.
- Referring to step 430 of
flowchart 400 in FIG. 4 and environment 200 of FIG. 2, step 430 of flowchart 400 comprises web server 220 processing playlist 240a to retrieve metadata clips 215a-215d for playback at the referenced start and end positions. Thus, media player service 221 may stream to web browser 226 the video and audio streams indicated by playback script 241a. Audio stream 236d of media file 235d, or "Soundtrack.mp3," is streamed to web browser 226. Concurrently and in sequential order, three video streams are also sent to web browser 226. First, position 00:00:20.000 to 00:01:15.000 from video stream 237a of media file 235a, or "Cats.avi," is streamed to web browser 226. Second, position 00:02:00.000 to 00:03:00.000 from video stream 237b of media file 235b, or "Dogs.avi," is streamed to web browser 226. Third, position 00:01:30.000 to 00:02:30.000 from video stream 237c of media file 235c, or "Penguins.avi," is streamed to web browser 226. As web server 220 can intelligently stream the above referenced media files, separate media files do not need to be generated and stored, optimizing server resource usage.
- Referring to step 440 of
flowchart 400 in FIG. 4, environment 200 of FIG. 2, and environment 300 of FIG. 3, step 440 of flowchart 400 comprises web server 320 beginning a playback on display 350 according to the processed playlist of step 430. Thus, media player service 221 of web server 220 may direct client 225 to play back on display 350, via web browser 226, video streams 237a-237c sequentially at the indicated start and end positions, while concurrently playing back audio stream 236d through an audio output device attached to client 225, which may, for example, be integrated into display 350. In this manner, the user of client 325 can view, edit, and enjoy selections of media from a wide variety of different source media on display 350. Additionally, depending on the settings of playlist 340, various event behaviors can also be supported to provide flexible behaviors beyond standard start-to-finish playback. - From the above description of the invention it is manifest that various techniques can be used for implementing the concepts of the present invention without departing from its scope. Moreover, while the invention has been described with specific reference to certain embodiments, a person of ordinary skill in the art would recognize that changes can be made in form and detail without departing from the spirit and the scope of the invention. As such, the described embodiments are to be considered in all respects as illustrative and not restrictive. It should also be understood that the invention is not limited to the particular embodiments described herein, but is capable of many rearrangements, modifications, and substitutions without departing from the scope of the invention.
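The segment streaming of steps 430 and 440 can be sketched in code. This is a minimal illustration only: the script layout, field names, and helper functions below are assumptions, since the patent does not specify a serialization format; the file names and positions mirror playback script 241a.

```python
# Hypothetical playback script mirroring playback script 241a: one audio
# track streamed for the whole presentation, plus video segments played
# back-to-back at their referenced start and end positions.
script = {
    "audio": {"file": "Soundtrack.mp3"},
    "video": [
        {"file": "Cats.avi",     "start": "00:00:20.000", "end": "00:01:15.000"},
        {"file": "Dogs.avi",     "start": "00:02:00.000", "end": "00:03:00.000"},
        {"file": "Penguins.avi", "start": "00:01:30.000", "end": "00:02:30.000"},
    ],
}

def to_seconds(timestamp):
    """Convert an HH:MM:SS.mmm position into seconds."""
    hours, minutes, seconds = timestamp.split(":")
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

def schedule(script):
    """Map each video segment to its offset on the presentation timeline,
    so the referenced spans can be streamed sequentially without generating
    or storing any intermediate media files."""
    timeline, offset = [], 0.0
    for segment in script["video"]:
        duration = to_seconds(segment["end"]) - to_seconds(segment["start"])
        timeline.append((segment["file"], offset, offset + duration))
        offset += duration
    return timeline

timeline = schedule(script)
# Cats.avi occupies the first 55 s of the timeline, followed by
# 60 s each of Dogs.avi and Penguins.avi; the audio track spans all three.
```

Because only references and offsets are computed, the server never materializes a combined media file, which is the resource-usage point made in step 430.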
Claims (17)
1-20. (canceled)
21. A media device comprising:
a memory storing a database including a playlist having a plurality of metadata clips, wherein each of the plurality of metadata clips references a network location of a source media file, a referenced start position in the referenced source media, and a referenced end position in the referenced source media, wherein the plurality of metadata clips includes a first metadata clip and a second metadata clip, and wherein the first metadata clip references a first network location of a first media file, a first start position and a first end position, and the second metadata clip references a second network location of a second media file, a second start position and a second end position, and wherein the first network location is different from the second network location, and the first media file is different from the second media file; and
a processor configured to access the database to play the playlist using the plurality of metadata clips by:
retrieving from the first metadata clip, the first network location of the first media file, the first start position and the first end position;
playing the first media file located at the first network location from the first start position to the first end position;
retrieving from the second metadata clip, the second network location of the second media file, the second start position and the second end position; and
playing the second media file located at the second network location from the second start position to the second end position.
22. The media device of claim 21, wherein the database includes attributes for each of the plurality of metadata clips.
23. The media device of claim 21, wherein the first network location is a first web address for a first media server, and the second network location is a second web address for a second media server.
24. The media device of claim 21, wherein the first media file is a video clip and the second media file is an audio clip.
25. The media device of claim 21, wherein the first media file is a first video clip and the second media file is a second video clip.
26. The media device of claim 21, wherein the plurality of metadata clips includes a third metadata clip referencing a third media file, wherein the first media file is a first video clip, the second media file is a second video clip, and the third media file is an audio file.
27. The media device of claim 26, wherein the processor is configured to retrieve and play the first media file, the second media file and the third media file contemporaneously.
28. The media device of claim 21, wherein the processor is configured to retrieve and play the first media file and the second media file contemporaneously.
29. A method for use by a media device having a processor and a memory, the method comprising:
accessing a database in the memory, by the processor, for playing a playlist using a plurality of metadata clips of the playlist stored in the database, wherein each of the plurality of metadata clips references a network location of a source media file, a referenced start position in the referenced source media, and a referenced end position in the referenced source media, wherein the plurality of metadata clips includes a first metadata clip and a second metadata clip, and wherein the first metadata clip references a first network location of a first media file, a first start position and a first end position, and the second metadata clip references a second network location of a second media file, a second start position and a second end position, and wherein the first network location is different from the second network location, and the first media file is different from the second media file;
wherein the playing of the playlist includes:
retrieving from the first metadata clip, the first network location of the first media file, the first start position and the first end position;
playing the first media file located at the first network location from the first start position to the first end position;
retrieving from the second metadata clip, the second network location of the second media file, the second start position and the second end position; and
playing the second media file located at the second network location from the second start position to the second end position.
30. The method of claim 29, wherein the database includes attributes for each of the plurality of metadata clips.
31. The method of claim 29, wherein the first network location is a first web address for a first media server, and the second network location is a second web address for a second media server.
32. The method of claim 29, wherein the first media file is a video clip and the second media file is an audio clip.
33. The method of claim 29, wherein the first media file is a first video clip and the second media file is a second video clip.
34. The method of claim 29, wherein the plurality of metadata clips includes a third metadata clip referencing a third media file, wherein the first media file is a first video clip, the second media file is a second video clip, and the third media file is an audio file.
35. The method of claim 34, wherein the retrieving and playing the first media file, the second media file and the third media file occur contemporaneously.
36. The method of claim 29, wherein the retrieving and playing the first media file and the second media file occur contemporaneously.
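The playback recited in claims 21 and 29 amounts to a loop over a playlist's metadata clips: retrieve each clip's network location and start/end positions, then play that span of the remote media file. The sketch below illustrates this under stated assumptions: the dictionary keys and the `play_segment` backend are hypothetical names, not part of the claims, and the URLs are placeholders for two distinct network locations.

```python
def play_playlist(playlist, play_segment):
    """For each metadata clip, retrieve the referenced network location and
    the referenced start/end positions, then hand them to a streaming backend.
    Only references to remote media are processed; no local copies are made."""
    for clip in playlist:
        # Retrieve the network location and positions from the metadata clip.
        location = clip["url"]
        start, end = clip["start"], clip["end"]
        # Play the media file located at that network location from start to end.
        play_segment(location, start, end)

# A stand-in backend that merely records what would be streamed to the client.
streamed = []
play_playlist(
    [
        {"url": "http://a.example/Cats.avi",
         "start": "00:00:20.000", "end": "00:01:15.000"},
        {"url": "http://b.example/Dogs.avi",
         "start": "00:02:00.000", "end": "00:03:00.000"},
    ],
    lambda url, start, end: streamed.append((url, start, end)),
)
```

A real backend would issue ranged requests against each network location; the claims leave that transport detail open, so the lambda above simply logs the references in playback order.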
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/267,092 US20140244607A1 (en) | 2009-04-14 | 2014-05-01 | System and Method for Real-Time Media Presentation Using Metadata Clips |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/386,198 US8793282B2 (en) | 2009-04-14 | 2009-04-14 | Real-time media presentation using metadata clips |
US14/267,092 US20140244607A1 (en) | 2009-04-14 | 2014-05-01 | System and Method for Real-Time Media Presentation Using Metadata Clips |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/386,198 Continuation US8793282B2 (en) | 2009-04-14 | 2009-04-14 | Real-time media presentation using metadata clips |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140244607A1 true US20140244607A1 (en) | 2014-08-28 |
Family
ID=42935176
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/386,198 Active 2029-10-19 US8793282B2 (en) | 2009-04-14 | 2009-04-14 | Real-time media presentation using metadata clips |
US14/267,092 Abandoned US20140244607A1 (en) | 2009-04-14 | 2014-05-01 | System and Method for Real-Time Media Presentation Using Metadata Clips |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/386,198 Active 2029-10-19 US8793282B2 (en) | 2009-04-14 | 2009-04-14 | Real-time media presentation using metadata clips |
Country Status (1)
Country | Link |
---|---|
US (2) | US8793282B2 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140207473A1 (en) * | 2013-01-24 | 2014-07-24 | Google Inc. | Rearrangement and rate allocation for compressing multichannel audio |
US20150032769A1 (en) * | 2013-07-25 | 2015-01-29 | Google Inc. | Generating Playlists Using Calendar, Location And Event Data |
US20150046645A1 (en) * | 2013-08-12 | 2015-02-12 | International Business Machines Corporation | Method, Storage System, and Program for Spanning Single File Across Plurality of Tape Media |
US20150279427A1 (en) * | 2012-12-12 | 2015-10-01 | Smule, Inc. | Coordinated Audiovisual Montage from Selected Crowd-Sourced Content with Alignment to Audio Baseline |
US20160094600A1 (en) * | 2014-09-30 | 2016-03-31 | The Nielsen Company (Us), Llc | Methods and apparatus to measure exposure to streaming media |
Families Citing this family (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8793282B2 (en) * | 2009-04-14 | 2014-07-29 | Disney Enterprises, Inc. | Real-time media presentation using metadata clips |
US9191639B2 (en) | 2010-04-12 | 2015-11-17 | Adobe Systems Incorporated | Method and apparatus for generating video descriptions |
KR20120034550A (en) | 2010-07-20 | 2012-04-12 | 한국전자통신연구원 | Apparatus and method for providing streaming contents |
US9467493B2 (en) | 2010-09-06 | 2016-10-11 | Electronics And Telecommunication Research Institute | Apparatus and method for providing streaming content |
KR101206698B1 (en) * | 2010-10-06 | 2012-11-30 | 한국항공대학교산학협력단 | Apparatus and method for providing streaming contents |
US9369512B2 (en) | 2010-10-06 | 2016-06-14 | Electronics And Telecommunications Research Institute | Apparatus and method for providing streaming content |
US9313535B2 (en) | 2011-02-03 | 2016-04-12 | Ericsson Ab | Generating montages of video segments responsive to viewing preferences associated with a video terminal |
KR20140016357A (en) * | 2011-06-08 | 2014-02-07 | 코닌클리즈케 케이피엔 엔.브이. | Spatially-segmented content delivery |
US20130067346A1 (en) * | 2011-09-09 | 2013-03-14 | Microsoft Corporation | Content User Experience |
US9100245B1 (en) * | 2012-02-08 | 2015-08-04 | Amazon Technologies, Inc. | Identifying protected media files |
US20140130182A1 (en) * | 2012-11-02 | 2014-05-08 | Genesismedia Llc | Controlled Grant Of Access To Media Content |
US9894022B2 (en) | 2013-07-19 | 2018-02-13 | Ambient Consulting, LLC | Image with audio conversation system and method |
US10057731B2 (en) | 2013-10-01 | 2018-08-21 | Ambient Consulting, LLC | Image and message integration system and method |
US9977591B2 (en) * | 2013-10-01 | 2018-05-22 | Ambient Consulting, LLC | Image with audio conversation system and method |
US10180776B2 (en) | 2013-10-01 | 2019-01-15 | Ambient Consulting, LLC | Image grouping with audio commentaries system and method |
US10324733B2 (en) | 2014-07-30 | 2019-06-18 | Microsoft Technology Licensing, Llc | Shutdown notifications |
US10592080B2 (en) | 2014-07-31 | 2020-03-17 | Microsoft Technology Licensing, Llc | Assisted presentation of application windows |
US10678412B2 (en) | 2014-07-31 | 2020-06-09 | Microsoft Technology Licensing, Llc | Dynamic joint dividers for application windows |
US9836464B2 (en) | 2014-07-31 | 2017-12-05 | Microsoft Technology Licensing, Llc | Curating media from social connections |
US9787576B2 (en) | 2014-07-31 | 2017-10-10 | Microsoft Technology Licensing, Llc | Propagating routing awareness for autonomous networks |
US10254942B2 (en) | 2014-07-31 | 2019-04-09 | Microsoft Technology Licensing, Llc | Adaptive sizing and positioning of application windows |
US11182431B2 (en) * | 2014-10-03 | 2021-11-23 | Disney Enterprises, Inc. | Voice searching metadata through media content |
EP3216220A4 (en) * | 2014-11-07 | 2018-07-11 | H4 Engineering, Inc. | Editing systems |
WO2016118519A1 (en) * | 2015-01-19 | 2016-07-28 | Berman Matthew | System and methods for facile, instant, and minimally disruptive playback of media files |
US9827209B2 (en) | 2015-02-09 | 2017-11-28 | Microsoft Technology Licensing, Llc | Display system |
US11086216B2 (en) | 2015-02-09 | 2021-08-10 | Microsoft Technology Licensing, Llc | Generating electronic components |
US10018844B2 (en) | 2015-02-09 | 2018-07-10 | Microsoft Technology Licensing, Llc | Wearable image display system |
JP6930421B2 (en) * | 2015-08-03 | 2021-09-01 | ソニーグループ株式会社 | Information processing systems, information processing methods, recording media, and programs |
CN105915996A (en) * | 2015-12-15 | 2016-08-31 | 乐视网信息技术(北京)股份有限公司 | Multipath stream media playing method and equipment |
GB2549324A (en) | 2016-04-15 | 2017-10-18 | Quantel Ltd | Media file systems and methods of storing media files in a media file system |
CN106909603A (en) * | 2016-08-31 | 2017-06-30 | 阿里巴巴集团控股有限公司 | Search information processing method and device |
US20180217964A1 (en) * | 2017-02-02 | 2018-08-02 | Futurewei Technologies, Inc. | Content-aware energy savings for web browsing utilizing selective loading priority |
CN107172500B (en) * | 2017-06-26 | 2020-06-26 | 北京金山安全软件有限公司 | Method and device for playing videos in webpage |
AU2018102058B4 (en) * | 2018-06-15 | 2019-07-18 | Fetch Tv Pty Ltd | Systems and methods for creating and managing virtual channels |
US11233838B2 (en) * | 2019-06-21 | 2022-01-25 | Grass Valley Limited | System and method of web streaming media content |
US20240064362A1 (en) * | 2021-02-01 | 2024-02-22 | Google Llc | Playing media content in response to triggers |
US11475058B1 (en) * | 2021-10-19 | 2022-10-18 | Rovi Guides, Inc. | Systems and methods for generating a dynamic timeline of related media content based on tagged content |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030093790A1 (en) * | 2000-03-28 | 2003-05-15 | Logan James D. | Audio and video program recording, editing and playback systems using metadata |
US20050005308A1 (en) * | 2002-01-29 | 2005-01-06 | Gotuit Video, Inc. | Methods and apparatus for recording and replaying sports broadcasts |
US20050037740A1 (en) * | 2003-07-25 | 2005-02-17 | Smith Sunny P. | System and method for delivery of multimedia content into end-user devices |
US20070043766A1 (en) * | 2005-08-18 | 2007-02-22 | Nicholas Frank C | Method and System for the Creating, Managing, and Delivery of Feed Formatted Content |
US20080147711A1 (en) * | 2006-12-19 | 2008-06-19 | Yahoo! Inc. | Method and system for providing playlist recommendations |
US20080155627A1 (en) * | 2006-12-04 | 2008-06-26 | O'connor Daniel | Systems and methods of searching for and presenting video and audio |
US20080313227A1 (en) * | 2007-06-14 | 2008-12-18 | Yahoo! Inc. | Method and system for media-based event generation |
US20080319856A1 (en) * | 2007-06-12 | 2008-12-25 | Anthony Zito | Desktop Extension for Readily-Sharable and Accessible Media Playlist and Media |
US20100110200A1 (en) * | 2008-07-31 | 2010-05-06 | Kim Lau | Generation and use of user-selected scenes playlist from distributed digital content |
US7725829B1 (en) * | 2002-01-23 | 2010-05-25 | Microsoft Corporation | Media authoring and presentation |
US8209609B2 (en) * | 2008-12-23 | 2012-06-26 | Intel Corporation | Audio-visual search and browse interface (AVSBI) |
US8793282B2 (en) * | 2009-04-14 | 2014-07-29 | Disney Enterprises, Inc. | Real-time media presentation using metadata clips |
- 2009-04-14: US application 12/386,198, patent US 8,793,282 B2, status active
- 2014-05-01: US application 14/267,092, publication US 2014/0244607 A1, status abandoned
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030093790A1 (en) * | 2000-03-28 | 2003-05-15 | Logan James D. | Audio and video program recording, editing and playback systems using metadata |
US7725829B1 (en) * | 2002-01-23 | 2010-05-25 | Microsoft Corporation | Media authoring and presentation |
US20050005308A1 (en) * | 2002-01-29 | 2005-01-06 | Gotuit Video, Inc. | Methods and apparatus for recording and replaying sports broadcasts |
US20050037740A1 (en) * | 2003-07-25 | 2005-02-17 | Smith Sunny P. | System and method for delivery of multimedia content into end-user devices |
US20070043766A1 (en) * | 2005-08-18 | 2007-02-22 | Nicholas Frank C | Method and System for the Creating, Managing, and Delivery of Feed Formatted Content |
US20080155627A1 (en) * | 2006-12-04 | 2008-06-26 | O'connor Daniel | Systems and methods of searching for and presenting video and audio |
US20080147711A1 (en) * | 2006-12-19 | 2008-06-19 | Yahoo! Inc. | Method and system for providing playlist recommendations |
US20080319856A1 (en) * | 2007-06-12 | 2008-12-25 | Anthony Zito | Desktop Extension for Readily-Sharable and Accessible Media Playlist and Media |
US20080313227A1 (en) * | 2007-06-14 | 2008-12-18 | Yahoo! Inc. | Method and system for media-based event generation |
US20100110200A1 (en) * | 2008-07-31 | 2010-05-06 | Kim Lau | Generation and use of user-selected scenes playlist from distributed digital content |
US8209609B2 (en) * | 2008-12-23 | 2012-06-26 | Intel Corporation | Audio-visual search and browse interface (AVSBI) |
US8793282B2 (en) * | 2009-04-14 | 2014-07-29 | Disney Enterprises, Inc. | Real-time media presentation using metadata clips |
Non-Patent Citations (3)
Title |
---|
Rene Kaiser et al., "Metadata-driven Interactive Web Video Assembly", Springer Science + Business Media, published online 30 October 2008, pages 437-467. * |
Ronen Brafman & Doron Friedman, "Adaptive Rich Media Presentations via Preference-Based Constrained Optimization", 2006, Dagstuhl Seminar Proceedings 04271, pages 1-6. *
Ryan Shaw & Patrick Schmitz, "Community Annotation and Remix: A Research Platform and Pilot Deployment", ACM HCM, 27 October 2006, pages 89-98. * |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150279427A1 (en) * | 2012-12-12 | 2015-10-01 | Smule, Inc. | Coordinated Audiovisual Montage from Selected Crowd-Sourced Content with Alignment to Audio Baseline |
US10971191B2 (en) * | 2012-12-12 | 2021-04-06 | Smule, Inc. | Coordinated audiovisual montage from selected crowd-sourced content with alignment to audio baseline |
US9336791B2 (en) * | 2013-01-24 | 2016-05-10 | Google Inc. | Rearrangement and rate allocation for compressing multichannel audio |
US20140207473A1 (en) * | 2013-01-24 | 2014-07-24 | Google Inc. | Rearrangement and rate allocation for compressing multichannel audio |
US10176179B2 (en) * | 2013-07-25 | 2019-01-08 | Google Llc | Generating playlists using calendar, location and event data |
US20150032769A1 (en) * | 2013-07-25 | 2015-01-29 | Google Inc. | Generating Playlists Using Calendar, Location And Event Data |
US11151190B2 (en) | 2013-07-25 | 2021-10-19 | Google Llc | Generating playlists using calendar, location and event data |
US9684454B2 (en) * | 2013-08-12 | 2017-06-20 | International Business Machines Corporation | Method, storage system, and program for spanning single file across plurality of tape media |
US20150046645A1 (en) * | 2013-08-12 | 2015-02-12 | International Business Machines Corporation | Method, Storage System, and Program for Spanning Single File Across Plurality of Tape Media |
US20180352052A1 (en) * | 2014-09-30 | 2018-12-06 | The Nielsen Company (Us), Llc | Methods and apparatus to measure exposure to streaming media |
US10681174B2 (en) * | 2014-09-30 | 2020-06-09 | The Nielsen Company (US) | Methods and apparatus to measure exposure to streaming media using media watermarks |
US20160094600A1 (en) * | 2014-09-30 | 2016-03-31 | The Nielsen Company (Us), Llc | Methods and apparatus to measure exposure to streaming media |
US11240341B2 (en) * | 2014-09-30 | 2022-02-01 | The Nielsen Company (Us), Llc | Methods and apparatus to measure exposure to streaming media using media watermarks |
US11902399B2 (en) | 2014-09-30 | 2024-02-13 | The Nielsen Company (Us), Llc | Methods and apparatus to measure exposure to streaming media |
Also Published As
Publication number | Publication date |
---|---|
US20100262618A1 (en) | 2010-10-14 |
US8793282B2 (en) | 2014-07-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8793282B2 (en) | Real-time media presentation using metadata clips | |
US9467504B2 (en) | Techniques and systems for supporting podcasting | |
JP5586647B2 (en) | Obtain, manage and synchronize podcasting | |
US9396193B2 (en) | Method and system for managing playlists | |
US8798777B2 (en) | System and method for using a list of audio media to create a list of audiovisual media | |
US10282425B2 (en) | Identifying popular segments of media objects | |
US20110191684A1 (en) | Method of Internet Video Access and Management | |
US20140052770A1 (en) | System and method for managing media content using a dynamic playlist | |
JP2007036830A (en) | Moving picture management system, moving picture managing method, client, and program | |
EP3192258A1 (en) | Storage and editing of video of activities using sensor and tag data of participants and spectators | |
WO2007130472A2 (en) | Methods and systems for providing media assets over a network | |
WO2007064715A2 (en) | Systems, methods, and computer program products for the creation, monetization, distribution, and consumption of metacontent | |
WO2015121456A1 (en) | Delivering media content based on analysis of user's behaviour | |
JP5306555B1 (en) | System capable of providing a plurality of digital contents and method using the same | |
JP6234080B2 (en) | System capable of providing a plurality of digital contents and method using the same | |
Fricke et al. | Work Package 5: LinkedTV platform |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: DISNEY ENTERPRISES, INC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HEDINSSON, SKARPHEDINN;SIDI, ARIFF;WATSON, DAVID;AND OTHERS;SIGNING DATES FROM 20090407 TO 20090410;REEL/FRAME:032803/0172 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |