US20180332094A1 - Systems, Methods, and Media for Streaming Media Content - Google Patents

Systems, Methods, and Media for Streaming Media Content

Info

Publication number
US20180332094A1
Authority
US
United States
Prior art keywords
file
media content
segment
index
user device
Legal status
Abandoned
Application number
US15/972,841
Inventor
Jason A. Braness
Current Assignee
Divx LLC
Original Assignee
Divx LLC
Application filed by Divx LLC
Priority to US15/972,841
Publication of US20180332094A1
Assigned to SONIC IP, INC. (assignment of assignors interest; assignor: BRANESS, JASON A.)
Assigned to DIVX CF HOLDINGS LLC (assignment of assignors interest; assignor: SONIC IP, INC.)
Assigned to DIVX, LLC (change of name from DIVX CF HOLDINGS LLC)
Legal status: Abandoned

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/70Media network packetisation
    • H04L65/607
    • H04L65/4084
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/61Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
    • H04L65/612Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for unicast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/75Media network packet handling
    • H04L65/752Media network packet handling adapting media to network capabilities
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/75Media network packet handling
    • H04L65/756Media network packet handling adapting media to device capabilities

Definitions

  • Methods, systems, and media for streaming media content are provided. More particularly, the disclosed subject matter relates to adaptive bitrate streaming.
  • media content can be encoded at multiple bit rates.
  • the encoded media content can then be transmitted using a suitable protocol, such as the Hypertext Transfer Protocol (HTTP), the Real-time Transport Protocol (RTP), the Real Time Streaming Protocol (RTSP), etc.
  • conventional approaches do not provide users with the capabilities to stream, store, and playback media content at variable bitrates.
  • methods for streaming media content comprising: receiving top level index data from a server; caching the top level index data in an index file; receiving header data associated with a first media content file from the server; caching the header data in a header file; receiving a first segment of the first media content file based at least in part on the index file; caching the first segment of the first media content file in a first file; updating the index file to include information about the first file; and causing the first fragment to be displayed based at least in part on the index file and the header file.
  • systems for streaming media content comprising at least one hardware processor that is configured to: receive top level index data from a server; cache the top level index data in an index file; receive header data associated with a first media content file from the server; cache the header data in a header file; receive a first segment of the first media content file based at least in part on the index file; cache the first segment of the first media content file in a first file; update the index file to include information about the first file; and cause the first fragment to be displayed based at least in part on the index file and the header file.
  • non-transitory media containing computer-executable instructions that, when executed by a hardware processor, cause the hardware processor to perform a method for streaming media content are provided, the method comprising: receiving top level index data from a server; caching the top level index data in an index file; receiving header data associated with a first media content file from the server; caching the header data in a header file; receiving a first segment of the first media content file based at least in part on the index file; caching the first segment of the first media content file in a first file; updating the index file to include information about the first file; and causing the first fragment to be displayed based at least in part on the index file and the header file.
  • FIG. 1 shows a generalized block diagram of an example of an architecture of hardware that can be used to stream media content in accordance with some embodiments of the invention
  • FIG. 2 shows examples of a top level index file and Matroska container files in accordance with some embodiments of the invention
  • FIG. 3 shows an example of a structure of a Matroska container file in accordance with some embodiments of the invention
  • FIG. 4 shows a flow chart of an example of a process for streaming media content in accordance with some embodiments of the invention
  • FIG. 5 shows a flow chart of an example of a process for rendering media content in accordance with some embodiments of the invention
  • FIG. 6 shows an example of a top level index file in accordance with some embodiments of the invention.
  • FIG. 7 shows an example of Matroska container files containing cached media content in accordance with some embodiments of the invention.
  • This invention generally relates to mechanisms (which can be systems, methods, media, etc.) for streaming media content.
  • the mechanisms can be used in many applications.
  • the mechanisms can be used to stream, store, and/or playback media content having different versions (e.g., such as video content encoded at multiple bit rates, resolutions, frame rates, etc.).
  • media content can be stored in one or more Matroska container files on a server.
  • the Matroska container is a media container developed as an open standard project by the Matroska non-profit organization of Aussonne, France.
  • the Matroska specification (which can be retrieved from the Internet at http://matroska.org/technical/specs/index.html) is hereby incorporated by reference herein in its entirety.
  • the server can store multiple Matroska container files containing encoded video content having different bit rates, resolutions, frame rates, etc.
  • a user device can request a top level index file from the server.
  • the user device can send one or more requests containing information relating to resources that can provide the top level index file under a suitable protocol (e.g., such as a Hypertext Transfer Protocol (HTTP), a Transmission Control Protocol (TCP), etc.).
  • the user device can request the top level index file via one or more HTTP requests containing one or more Uniform Resource Identifiers (URI) associated with the top level index file.
  • the user device can receive the requested top level index file via one or more responses sent by the server.
  • the top level index file can be received via one or more HTTP responses corresponding to the HTTP requests.
  • the top level index file can be received in any suitable format.
  • the top level index file can be received as a Synchronized Multimedia Integration Language (SMIL) file, an Extensible Markup Language (XML) file, etc.
  • the user device, upon receiving the top level index file, can cache the top level index file in a suitable manner.
  • the top level index file can be cached in the form of one or more SMIL files, XML files, etc.
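  • As a purely illustrative sketch (the URI and file names below are assumptions, and the patent does not prescribe any particular implementation), the request-and-cache steps above might look roughly like the following Python:
```python
# Hypothetical sketch: request a top level index file over HTTP and cache it locally.
# The index URI and cache location are illustrative assumptions, not taken from the patent.
import urllib.request
from pathlib import Path

INDEX_URI = "http://example.com/media/movie/index.smil"  # assumed server URI
CACHE_DIR = Path("cache")

def fetch_and_cache_top_level_index(uri: str = INDEX_URI) -> Path:
    CACHE_DIR.mkdir(exist_ok=True)
    # One HTTP request/response pair; the description above also allows several requests.
    with urllib.request.urlopen(uri) as response:
        index_data = response.read()
    cached_index = CACHE_DIR / "top_level_index.smil"  # cache as a SMIL/XML file
    cached_index.write_bytes(index_data)
    return cached_index
```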
  • the user device can request one or more headers associated with one or more Matroska container files based on the cached top level index file. For example, the user device can parse the cached top level index file and obtain one or more URIs corresponding to the headers. The user device can then request the headers by sending one or more requests containing the URIs to the server and/or another server.
  • the user device can receive the requested headers through one or more responses that are sent by the server in response to the requests.
  • the user device can also cache the received headers in a suitable manner.
  • each of the headers can be cached as an Extensible Binary Meta Language (EBML) file.
  • the user device can request one or more media content fragments from the server.
  • the user device can request one or more cluster elements of one or more Matroska container files stored on the server.
  • the user device can request a cluster element of a Matroska container file (e.g., a video file containing suitable video data) based on the streaming conditions (e.g., such as the bandwidth, the hardware capacity, etc. that can be utilized to stream media content) that are experienced by the user device.
  • the user device, upon receiving the requested media content fragments from the server, can cache the media content fragments. For example, the user device can cache each media content fragment in an EBML file upon receiving the media content fragment.
  • the user device can also update the cached top level index file to include information about the cached media content fragments. For example, the cached top level index file can be updated to include one or more URIs corresponding to each EBML file that stores a cached media content fragment.
  • the user device can cause the cached media content to be rendered.
  • the user device can cause cached video content, audio content, subtitles, etc. to be rendered based on the cached top level index file, the cached headers, and/or any other suitable information.
  • the user device can retrieve multiple EBML files that store the cached media content fragments based on the top level index file (e.g., using the URIs corresponding to each EBML file). The user device can then extract the media content stored in the EBML files, decode the media content, and cause the decoded media content to be rendered.
  • the cached media content can be rendered at any suitable time.
  • the cached media content can be rendered when the user device is streaming and/or downloading media content from the server.
  • the cached media content can be rendered after the user device has finished streaming and/or caching media content from the server.
  • the user device can cause the cached media content to be rendered in response to a user requesting playback of part or all of the cached media content at any time, with or without a live communication connection with the server.
  • architecture 100 can include a media content source 102 , one or more servers 104 , a communications network 106 , one or more user devices 108 , and communications paths 110 , 112 , 114 , and 116 .
  • Media content source 102 can include any suitable device that can provide media content.
  • media content source 102 can include any suitable circuitry that is capable of encoding media content, such as one or more suitable video encoders, audio encoders, video decoders, audio decoders, etc.
  • media content source 102 can include one or more suitable video encoders that are capable of encoding video content into different versions, each of which can have a particular bit rate, a particular resolution, a particular frame rate, a particular bit depth, etc.
  • media content source 102 can include one or more types of content distribution equipment for distributing any suitable media content, including television distribution facility equipment, cable system head-end equipment, satellite distribution facility equipment, programming source equipment (e.g., equipment of television broadcasters, such as NBC, ABC, HBO, etc.), intermediate distribution facility equipment, Internet provider equipment, on-demand media server equipment, and/or any other suitable media content provider equipment.
  • NBC is a trademark owned by the National Broadcasting Company, Inc.
  • ABC is a trademark owned by the ABC, INC.
  • HBO is a trademark owned by the Home Box Office, Inc.
  • Media content source 102 may be operated by the originator of content (e.g., a television broadcaster, a Webcast provider, etc.) or may be operated by a party other than the originator of content (e.g., an on-demand content provider, an Internet provider of content of broadcast programs for downloading, etc.).
  • Media content source 102 may be operated by cable providers, satellite providers, on-demand providers, Internet providers, providers of over-the-top content, and/or any other suitable provider(s) of content.
  • Media content source 102 may include a remote media server used to store different types of content (including video content selected by a user) in a location remote from any of the user equipment devices.
  • media content source 102 can include one or more content delivery networks (CDN).
  • the term “media content” or “content” should be understood to mean one or more electronically consumable media assets, such as television programs, pay-per-view programs, on-demand programs (e.g., as provided in video-on-demand (VOD) systems), Internet content (e.g., streaming content, downloadable content, Webcasts, etc.), movies, films, video clips, audio, audio books, and/or any other media or multimedia and/or combination of the same.
  • the term “multimedia” should be understood to mean media content that utilizes at least two different content forms described above, for example, text, audio, images, video, or interactivity content forms.
  • Media content may be recorded, played, displayed or accessed by user equipment devices, but can also be part of a live performance.
  • media content can include over-the-top (OTT) content.
  • OTT content providers include YOUTUBE, NETFLIX, and HULU, which provide audio and video via IP packets.
  • YOUTUBE a trademark owned by Google Inc.
  • Netflix is a trademark owned by Netflix Inc.
  • Hulu is a trademark owned by Hulu, LLC.
  • Media content can be provided from any suitable source in some embodiments.
  • media content can be electronically delivered to a user's location from a remote location.
  • media content such as a Video-On-Demand movie
  • media content can be delivered to a user's home from a cable system server.
  • media content such as a television program, can be delivered to a user's home from a streaming media provider over the Internet.
  • Server(s) 104 can be and/or include any suitable device that is capable of receiving, storing, processing, and/or delivering media content, and/or communicating with one or more user devices and/or other components of architecture 100 under one or more suitable protocols.
  • server(s) 104 can include any suitable circuitry that can receive requests, process requests, send responses, and/or perform other functions under a Hypertext Transfer Protocol (HTTP), a Transmission Control Protocol (TCP), etc.
  • server(s) 104 can store media content that can be delivered to one or more components of architecture 100 in a suitable manner.
  • the media content can be stored in one or more suitable multimedia containers, such as Matroska media containers, Audio Video Interleaved (AVI) media containers, MPEG-4 Part 14 (MP4) media containers, etc.
  • server(s) 104 can store one or more Matroska container files 210 and one or more top level index files 220 .
  • Matroska container files 210 can include any suitable files containing data about suitable media content, such as video content, audio content, subtitles, etc.
  • Matroska container files 210 can include one or more MKV files that can include data about video content, audio content, subtitles, etc.
  • Matroska container files 210 can include one or more MKA files that can include audio data.
  • Matroska container files 210 can include one or more MKS files that can include data about subtitles.
  • Matroska container files 210 can include one or more MK3D files that can include data about stereoscopic video content.
  • Matroska container files 210 can include one or more video files, such as video files 212 and 214 as illustrated in FIG. 2 .
  • each of video files 212 and 214 can include data about video content having a particular bit rate, a particular resolution, a particular frame rate, etc.
  • each of video files 212 and 214 can contain a version of particular video content. More particularly, for example, video file 212 can contain a version of the particular video content including encoded video content having a first bit rate (and/or a first frame rate, a first resolution, etc.).
  • Video file 214 can contain a version of the particular video content including encoded video content having a second bit rate (and/or a second frame rate, a second resolution, etc.).
  • Matroska container files 210 can include multiple video files (e.g., nine files or any suitable number of files), where each video file contains a version of particular video content (e.g., encoded video content having a particular bit rate, a particular resolution, a particular frame rate, etc.).
  • each video file contains a version of particular video content (e.g., encoded video content having a particular bit rate, a particular resolution, a particular frame rate, etc.).
  • Matroska container files 210 can include one or more audio files, such as an audio file 216 .
  • audio file 216 can contain audio content that is associated with the video content contained in one or more video files, such as video files 212 and 214 .
  • Matroska container files 210 can include one or more files that contain subtitles associated with suitable video content and/or audio content, such as a subtitle file 218 .
  • subtitle file 218 can contain data about subtitles that relate to the video content contained in video files 212 and 214 and/or the audio content contained in audio file 216 .
  • each of Matroska container files 210 can have a structure as illustrated in FIG. 3 .
  • Matroska container file 300 can include a header element 310 , one or more cluster elements 320 , an index element 330 , and/or any other suitable components.
  • Header element 310 can include any suitable information relating to Matroska container file 300 , such as a description of file 300 , the version of file 300 , etc. Header element 310 can also include any suitable information relating to the media content stored in file 300 , such as the bit rate, the resolution, the frame rate, etc. of the media content.
  • header element 310 can include an Extensible Binary Meta Language (EBML) element 311 , one or more segment elements 312 , and/or any other suitable components.
  • EBML element 311 can include information about the EBML version of the file, the type of the EBML file (e.g., a Matroska file), etc.
  • Segment element 312 can contain any suitable data about media content, header, etc.
  • segment element 312 can include a seekhead element 313 , a segmentinfo element 314 , a tracks element 315 , and/or any other suitable components.
  • seekhead element 313 can include any suitable information about one or more components of segment element 312 , such as a list of the positions of the components of segment element 312 (e.g., such as segmentinfo element 314 , tracks element 315 , etc.).
  • Segmentinfo element 314 can include any suitable information about segment element 312 and/or file 300 , such as the duration of the media content contained in segment element 312 , an identification number corresponding to segment element 312 (e.g., a randomly generated unique number that can be used to identify segment element 312 ), a title of segment element 312 and/or file 300 , etc.
  • Tracks element 315 can include any suitable information about one or more media tracks that are stored in segment element 312 , such as the type of each of the tracks (e.g., audio, video, subtitles, etc.), the codec used to generate each of the tracks, the resolution of video content, the frame rate of video content, the bit depth of video content, etc.
  • Cluster element 320 can contain any suitable information relating to media content, such as video content, audio content, subtitles, etc.
  • cluster element 320 can contain video data, audio data, or subtitles corresponding to media content having a particular duration (e.g., two seconds, or any suitable duration).
  • cluster element 320 can also contain a timecode element that can indicate the start time of the media content contained in cluster element 320 .
  • cluster element 320 can include one or more blockgroup elements 322 .
  • Blockgroup element 322 can include any suitable information relating to a part of or all of the media content data contained in cluster element 320 .
  • blockgroup element 322 can contain one or more block elements 324 , each of which can contain a block of media content data (e.g., video data, audio data, subtitles, etc.) that can be rendered by a user device.
  • blockgroup element 322 can also contain any suitable information relating to the block of media content data, such as the start time of the media content, the duration of the media content, the type of media content data contained in blockgroup element 322 (e.g., video, audio, subtitles, etc.), etc.
  • blockgroup element 322 can include one or more suitable timecodes corresponding to the start time, the end time, the duration, and/or other suitable information of the media content contained in blockgroup element 322 .
  • file 300 can include multiple cluster elements 320 (e.g., cluster element 321 , cluster element 326 , . . . , and cluster element 328 ).
  • each of the cluster elements can contain data about a portion of a piece of media content.
  • each cluster element can contain a portion of the piece of media content having the same duration (e.g., such as two seconds, or any other suitable duration).
  • cluster elements 321 and 326 can contain data about a first portion of the piece of media content (e.g., the first two seconds of the media content) and a second portion of the piece of media content (e.g., the second two seconds of the media content), respectively.
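  • As a concrete illustration of this fixed-duration layout (assuming the two-second example above), the cluster covering a given playback time can be located with simple arithmetic:
```python
# Sketch: map a playback time to a cluster index, assuming every cluster element
# covers a fixed two-second portion of the media content (per the example above).
CLUSTER_DURATION_S = 2.0

def cluster_index_for_time(playback_time_s: float) -> int:
    # Cluster 0 covers [0, 2) seconds, cluster 1 covers [2, 4) seconds, and so on.
    return int(playback_time_s // CLUSTER_DURATION_S)

def cluster_start_time(index: int) -> float:
    return index * CLUSTER_DURATION_S

# A seek to 7.3 seconds falls in cluster 3, whose timecode (start time) is 6.0 seconds.
assert cluster_index_for_time(7.3) == 3
assert cluster_start_time(3) == 6.0
```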
  • multiple Matroska container files can contain cluster elements corresponding to the same portion of the piece of media content.
  • for example, the first cluster element of video file 212 (e.g., cluster element 321 of FIG. 3 ) and the first cluster element of video file 214 (e.g., cluster element 321 of FIG. 3 ) can each contain video data corresponding to the first portion of the piece of media content, and the first cluster element of audio file 216 and the first cluster element of subtitle file 218 can contain audio data and subtitles corresponding to the first portion of the media content.
  • index element 330 can include any suitable information relating to identifying one or more cluster elements 320 or any suitable portions of the cluster elements.
  • index element 330 can include one or more Cues elements 332 that can contain any suitable information that can be used to identify and/or seek one or more cluster elements, block elements, etc.
  • Cues element 332 can include one or more timecodes containing information about the duration, the start time, the end time, etc. of the media content contained in one or more cluster elements, block elements, video frames, etc.
  • cues element 332 can include a list of positions of multiple cluster elements, block elements, video frames, etc. More particularly, for example, the list of positions can include the positions of the cluster elements, block elements, video frames, etc. associated with a particular timecode.
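  • The container structure described above (a header, cluster elements with timecodes and block elements, and a cues-based index) can be summarized with the simplified data model below; this is a schematic illustration with assumed field names, not an EBML parser:
```python
# Schematic model of the Matroska container structure described above.
# Field names are assumptions for illustration; real EBML parsing is far more involved.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Block:
    data: bytes                 # a block of encoded video, audio, or subtitle data

@dataclass
class Cluster:
    timecode_s: float           # start time of the media content in this cluster
    blocks: List[Block] = field(default_factory=list)

@dataclass
class CuePoint:
    timecode_s: float           # timecode used to seek
    cluster_position: int       # position of the corresponding cluster element

@dataclass
class MatroskaFile:
    header: bytes               # EBML, seekhead, segmentinfo, and tracks information
    clusters: List[Cluster] = field(default_factory=list)
    cues: List[CuePoint] = field(default_factory=list)
```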
  • top level index file 220 can be any suitable file containing any suitable information relating to one or more of Matroska container files 210 .
  • top level index file 220 can be a Synchronized Multimedia Integration Language (SMIL) file, an Extensible Markup Language (XML) file, a HyperText Markup Language (HTML) file, etc.
  • top level index file 220 can include any suitable information concerning the media content contained in one or more of Matroska container files 210 .
  • top level index file 220 can include information about the bit rates, frame rates, resolutions, etc. of the video content contained in video files 212 and 214 .
  • top level index file 220 can also include any suitable information that can be used to identify and/or seek one or more of Matroska container files 210 and/or any suitable portions of Matroska container files 210 .
  • top level index file 220 can include information that can be used to identify one or more resources from which one or more of Matroska container files 210 can be obtained, such as the names of the resources, the locations of the resources, etc.
  • top level index file 220 can include one or more uniform resource identifiers (URIs) associated with one or more of Matroska container files 210 (e.g., such as video file 212 , video file 214 , audio file 216 , subtitle file 218 , etc.).
  • top level index file 220 can also include one or more URIs associated with one or more header elements, cluster elements, block elements, segment elements, index elements, etc. of one or more Matroska container files 210 .
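  • A top level index of this kind might be consumed as in the sketch below; the XML layout, element names, and attribute names are invented for illustration, since no particular schema is reproduced here:
```python
# Sketch: read per-version URIs and bit rates from a hypothetical SMIL/XML-like
# top level index. The element and attribute names below are assumptions.
import xml.etree.ElementTree as ET

SAMPLE_INDEX = """
<index>
  <video uri="http://example.com/movie_500kbps.mkv"  bitrate="500000"  width="640"  height="360"/>
  <video uri="http://example.com/movie_2000kbps.mkv" bitrate="2000000" width="1280" height="720"/>
  <audio uri="http://example.com/movie_audio.mka"    bitrate="128000"/>
  <subtitle uri="http://example.com/movie_subs.mks"/>
</index>
"""

def parse_top_level_index(xml_text: str) -> list:
    root = ET.fromstring(xml_text)
    # Each entry identifies one version of the video content and its bit rate.
    return [{"uri": el.get("uri"), "bitrate": int(el.get("bitrate"))}
            for el in root.findall("video")]

print(parse_top_level_index(SAMPLE_INDEX))
```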
  • system 100 can also include one or more user devices 108 .
  • Each user device 108 can be any suitable device that is capable of receiving, processing, converting, and/or rendering media content, and/or performing any other suitable functions.
  • each user device 108 can be and/or include a desktop computer, a laptop computer, a tablet computer, a mobile phone, a television device, a set-top box, a streaming media player, a digital media receiver, a DVD player, a BLU-RAY player, a game console, etc., and/or any other suitable combination of the same.
  • communications network 106 may be any one or more networks including the Internet, a mobile phone network, a mobile voice or data network (e.g., a 3G, 4G, or LTE network), a cable network, a satellite network, a public switched telephone network, a local area network, a wide area network, a fiber-optic network, any other suitable type of communications network, and/or any suitable combination of these communications networks.
  • media content source 102 , server(s) 104 , communications network 106 , and user device(s) 108 can be implemented in any suitable hardware.
  • each of media content source 102 , server(s) 104 , communications network 106 , and user device(s) 108 can be implemented in any of a general purpose device such as a computer or a special purpose device such as a client, a server, mobile terminal (e.g., mobile phone), etc.
  • Any of these general or special purpose devices can include any suitable components such as a hardware processor (which can be a microprocessor, digital signal processor, a controller, etc.).
  • each of media content source 102 , server(s) 104 , communications network 106 , and user device(s) 108 can include a suitable storage device, such as random-access memory, read-only memory, hard drives, optical drives, digital video disc (DVD) recorders, compact disc (CD) recorders, BLU-RAY disc (BD) recorders, BLU-RAY 3 D disc recorders, digital video recorders (DVR, sometimes called a personal video recorder, or PVR), solid state devices, quantum storage devices, gaming consoles, gaming media, or any other suitable fixed or removable storage devices, and/or any combination of the same.
  • each of media content source 102 , server(s) 104 , communications network 106 , and user device(s) 108 can be implemented as a stand-alone device or integrated with other components of architecture 100 .
  • media content source 102 can be connected to server(s) 104 and communications network 106 through communications paths 110 and 112 , respectively.
  • communications network 106 can be connected to server(s) 104 and user device(s) 108 through communications paths 114 and 116 , respectively.
  • Communications paths 110 , 112 , 114 , and 116 may separately or together include one or more communications paths, such as a satellite path, a fiber-optic path, a cable path, a path that supports Internet communications (e.g., IPTV), free-space connections (e.g., for broadcast or other wireless signals), or any other suitable wired or wireless communications path or combination of such paths, in some embodiments.
  • process 400 can be implemented in a suitable user device (such as user device 108 of FIG. 1 ).
  • process 400 can begin by requesting a top level index file at 402 .
  • the top level index file can be requested in any suitable manner.
  • the user device can request the top level index file by sending one or more requests containing information about the name of the top level index file, the resources from which the top level index file can be obtained, the location of the top level index file, etc. under a suitable protocol (e.g., such as HTTP, TCP, etc.).
  • the user device can send to the server one or more HTTP requests containing information about one or more URIs associated with the top level index file.
  • the user device can receive a top level index file from the server.
  • the top level index file can be received in any suitable manner.
  • the top level index file can be received via one or more responses that are sent by the server.
  • the user device can receive the top level index file via one or more HTTP responses that are sent by the server in response to the HTTP requests.
  • the top level index file can include any suitable information relating to one or more media content files (e.g., such as Matroska container files, etc.).
  • top level index file 220 as described above in connection with FIG. 2 can be received at 404 .
  • the top level index file can be received in any suitable format.
  • the received top level index file can be a SMIL file, an XML file, etc.
  • the user device can store the received top level index file.
  • the top level index file can be stored in any suitable manner.
  • the top level index file can be cached in a suitable format (e.g., as an index file 710 of FIG. 7 that can be a SMIL file, an XML file, etc.).
  • the cached top level index file 710 can include any suitable portions of SMIL file 600 as shown in FIG. 6 .
  • the user device can then request one or more headers associated with one or more Matroska container files; each of the headers can contain any suitable information relating to the version of its corresponding Matroska container file, the media content contained in the Matroska container file, the components of the Matroska container file, etc.
  • each of the headers can include one or more header elements of a Matroska container file (e.g., such as video files 212 or 214 , audio file 216 , subtitle file 218 , etc. as illustrated in FIG. 2 ).
  • each of the headers can include an EBML element, a segment element, a seekhead element, a segmentinfo element, a tracks elements, and/or any other suitable components.
  • the headers can be requested in any suitable manner.
  • the headers can be requested based on the top level index file received at 404 .
  • the user device can parse the top level index file and obtain information relating to one or more URIs corresponding to the headers. The user device can then send one or more requests (e.g., HTTP requests, etc.) containing the URIs to the server.
  • the user device can receive one or more headers associated with one or more Matroska container files.
  • the headers can be received in any suitable manner.
  • the headers can be received via one or more responses that are sent by the server.
  • the user device can receive the headers via one or more HTTP responses that are sent by the server in response to the HTTP requests.
  • for example, the user device can receive a header of a video file (e.g., video file 212 and/or video file 214 of FIG. 2 ), a header of an audio file (e.g., audio file 216 of FIG. 2 ), and/or a header of a subtitle file (e.g., subtitle file 218 of FIG. 2 ).
  • each of the received headers can be cached as a Matroska container file.
  • for example, the header of the video file (e.g., video file 212 of FIG. 2 ), the header of the audio file (e.g., audio file 216 of FIG. 2 ), and the header of the subtitle file (e.g., subtitle file 218 of FIG. 2 ) can each be cached as a header file (e.g., header files 722 , 742 , and 752 as shown in FIG. 7 ).
  • each of header files 722 , 742 , and 752 can have a structure similar to header element 310 of FIG. 3 .
  • the user device, in response to caching the header file(s), can update the top level index file.
  • the top level index file can be updated in any suitable manner.
  • the top level index file can be updated to include information relating to the one or more headers that have been received and cached, such as the location of the header file(s), the size of the header file(s), etc.
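  • A rough sketch of the request/receive/cache steps for a header is shown below; the URI, the local file name, and the dictionary of index-update information are all assumptions:
```python
# Sketch: request the header of a Matroska container file, cache it as a local EBML
# header file, and return the details a cached top level index might record about it.
import urllib.request
from pathlib import Path

CACHE_DIR = Path("cache")

def fetch_and_cache_header(header_uri: str, local_name: str) -> dict:
    CACHE_DIR.mkdir(exist_ok=True)
    with urllib.request.urlopen(header_uri) as response:   # HTTP request for the header
        header_bytes = response.read()
    header_file = CACHE_DIR / local_name
    header_file.write_bytes(header_bytes)                   # cache the header as an EBML file
    return {"uri": header_uri,
            "cached_path": str(header_file),
            "size": len(header_bytes)}

# e.g. fetch_and_cache_header("http://example.com/movie_500kbps_header.mkv", "video_header.ebml")
```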
  • the user device can request fragment index information. Any suitable fragment index information can be requested.
  • the user device can request one or more index elements associated with one or more Matroska container files as described above in connection with FIGS. 2 and 3 .
  • the user device can request one or more suitable portions of the index elements from the server.
  • the fragment index information can be requested in any suitable manner.
  • the user device can request the fragment index information based on the top level index file received at 404 .
  • the user device can parse the top level index file and obtain information relating to one or more URIs corresponding to the fragment index information.
  • the user device can then send one or more requests (e.g., HTTP requests, etc.) containing the URIs to the server.
  • the user device can request the fragment index information based on one or more of the headers received at 410 .
  • the user device can make such requests based on information relating to the location of the fragment index information (e.g., such as a seekhead element of a header).
  • the user device can receive the requested fragment index information.
  • the fragment index information can be received in any suitable manner.
  • the fragment index information can be received via one or more responses that are sent by the server.
  • the user device can receive the fragment index information via one or more HTTP responses that are sent by the server in response to the HTTP requests.
  • the user device does not need to cache or store the received fragment index information.
  • the user device can request one or more media content fragments from the server. Any suitable media content fragments can be requested.
  • the user device can request one or more cluster elements of one or more Matroska container files containing media content corresponding to a particular timecode (e.g., such as a particular start time and/or a particular end time).
  • the user device can request one or more cluster elements of a video file (e.g., video file 212 and/or 214 of FIG. 2 ), an audio file (e.g., audio file 216 of FIG. 2 ), a subtitle file (e.g., subtitle file 218 of FIG. 2 ) that contain media content corresponding to the particular timecode.
  • the user device can request one or more media content fragments containing media content having a particular version.
  • the user device can request a Cluster element of a video file that contains encoded video content having a particular bit rate, a particular frame rate, a particular resolution, etc.
  • the media content fragments can be requested in any suitable manner.
  • the user device can request the media content fragment(s) based on the streaming conditions experienced by the user device (e.g., the network bandwidth, the processor capacity, etc. that can be utilized to transmit media content), one or more user preferences (e.g., a desired resolution, a desired bit rate, etc.), and/or any other suitable criteria.
  • the user device can request a cluster of a video file (e.g., video file 212 , video file 214 , etc. as illustrated in FIG. 2 ) containing encoded video content having a suitable bit rate that can be transmitted using the network bandwidth.
  • the user device can send to the server one or more requests containing information relating to the names of the media content fragments, the resources from which the media content fragments can be obtained, and/or other suitable information relating to the media content fragments using a suitable protocol (e.g., such as HTTP, TCP, etc.).
  • the requests can contain one or more URIs corresponding to the media content fragments to be requested.
  • the URIs can be obtained based on the top level index file. More particularly, for example, the user device can parse the top level index file received at 404 and obtain one or more URIs corresponding to one or more cluster elements to be requested.
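  • Putting the above together, one fragment request might be issued as in this sketch; the per-cluster URI scheme is a hypothetical stand-in for URIs actually parsed from the top level index:
```python
# Sketch: request a single media content fragment (cluster element) by URI.
# The URI naming scheme below is an assumption standing in for URIs obtained by
# parsing the top level index, not a scheme defined by the patent.
import urllib.request

def cluster_uri(stream_base_uri: str, cluster_index: int) -> str:
    return f"{stream_base_uri}/cluster_{cluster_index:05d}.ebml"

def fetch_fragment(stream_base_uri: str, cluster_index: int) -> bytes:
    uri = cluster_uri(stream_base_uri, cluster_index)
    with urllib.request.urlopen(uri) as response:   # one HTTP request per fragment
        return response.read()
```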
  • the user device can receive one or more media content fragments.
  • the media content fragments can be received in any suitable manner.
  • the media content fragments can be received via one or more responses that are sent by the server.
  • the user device can receive the media content fragments via one or more HTTP responses that are sent by the server in response to the HTTP requests.
  • the user device can store the received media content fragments and update the top level index file.
  • the media content fragments can be stored in any suitable manner.
  • the user device can cache the cluster element as a Matroska container file.
  • in response to receiving a cluster element of video file 212 of FIG. 2 (e.g., cluster element 321 as shown in FIG. 3 ), the user device can cache the received cluster element as an EBML file 724 as shown in FIG. 7 .
  • in response to receiving a cluster of an audio file (e.g., audio file 216 of FIG. 2 ), the user device can cache the cluster element as an EBML file 734 as shown in FIG. 7 .
  • in response to receiving a cluster of a subtitle file (e.g., subtitle file 218 of FIG. 2 ), the user device can cache the cluster element as an EBML file 744 as shown in FIG. 7 .
  • the user device, in response to caching the media content fragments, can update the top level index file that has been stored in the user device.
  • the top level index file can be updated in any suitable manner.
  • the user device can edit the top level index file to include information relating to the EBML file that stores the media content fragments.
  • the user device can include a video element 610 in top level index file 600 of FIG. 6 corresponding to EBML file 724 .
  • video element 610 can include a URI element 612 , a start-time element 614 , an end-time element 616 , and a param element 618 .
  • URI element 612 can include any suitable information relating to the name of the cached EBML file, the location of the EBML file, and/or any other suitable information about the cached EBML file.
  • URI element 612 can include a file path through which EBML file 724 can be retrieved.
  • start-time element 614 and end-time element 616 can contain information about the start time and the end time of the media content contained in the cached cluster element, respectively.
  • param element 618 can include any suitable information about the cached EBML file.
  • param element 618 can include information about the size of the cached EBML file.
  • param element 618 can include information about the bit rate, the resolution, the frame rate, etc. of the media content contained in the cached EBML file.
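  • The caching and index-update steps described above might be sketched as follows; the element and attribute names echo the video/URI/start-time/end-time/param description, but their exact form here is assumed:
```python
# Sketch: cache a received cluster element as an EBML file and record it in the
# cached top level index. File names, element names, and attributes are assumptions.
import xml.etree.ElementTree as ET
from pathlib import Path

CACHE_DIR = Path("cache")

def cache_fragment_and_update_index(index_path: Path, cluster_bytes: bytes,
                                    start_s: float, end_s: float, bitrate: int) -> Path:
    CACHE_DIR.mkdir(exist_ok=True)
    fragment_file = CACHE_DIR / f"fragment_{int(start_s * 1000):08d}.ebml"
    fragment_file.write_bytes(cluster_bytes)            # cache the cluster element

    tree = ET.parse(index_path)                         # cached top level index (XML)
    video = ET.SubElement(tree.getroot(), "video")      # element describing the new file
    video.set("uri", str(fragment_file))                # location of the cached EBML file
    video.set("start-time", str(start_s))
    video.set("end-time", str(end_s))
    param = ET.SubElement(video, "param")               # additional file information
    param.set("size", str(len(cluster_bytes)))
    param.set("bitrate", str(bitrate))
    tree.write(index_path)                              # persist the updated index
    return fragment_file
```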
  • the user device, in response to receiving and/or caching the media content fragments, can extract media content data (e.g., such as video data, audio data, subtitles, etc.) from the media content fragments.
  • the user device can then decode the media content data and cause the decoded media content to be rendered.
  • in response to receiving a cluster of a video file as described above, the user device can extract encoded video data from the block elements of the cluster element.
  • the user device can then decode the encoded video data and cause the decoded video data to be displayed on a suitable display.
  • the decoded video data can be displayed based on one or more timecodes associated with the cluster element.
  • process 400 can loop back to 418 . That is, process 400 can request, receive, and/or cache one or more additional media content fragments.
  • the media content fragments can be requested in any suitable manner.
  • process 400 can request a Cluster element corresponding to a particular timecode.
  • the user device can request and receive cluster element 326 ( FIG. 3 ) of audio file 216 ( FIG. 2 ) and cache the received cluster element as an EBML file 736 as illustrated in FIG. 7 .
  • the user device can request and receive cluster element 326 ( FIG. 3 ) of subtitle file 218 ( FIG. 2 ) and cache the received Cluster element as an EBML file 746 as illustrated in FIG. 7 .
  • process 400 can also request a cluster element of a Matroska container file that contains a particular version of a piece of media content based on the streaming conditions experienced by the user device and/or user preferences. More particularly, for example, the user device can determine the bandwidth, the processor capacity, etc. that can be utilized to transmit media content. Alternatively or additionally, the user device can determine a particular frame rate, a particular resolution, and/or other parameters about the media content to be rendered that are preferred by a user. The user device can then request a cluster element containing video data having a suitable bit rate, a suitable frame rate, a suitable resolution, etc. based on the streaming conditions and/or user preferences.
  • the user device can request and receive cluster element 326 ( FIG. 3 ) of video file 214 ( FIG. 2 ) in response to determining that video file 214 ( FIG. 2 ) contains video content having the suitable bit rate, frame rate, and/or resolution.
  • the user device can also cache the received Cluster element as an EBML file 726 of FIG. 7 .
  • the user device can also update top level index file 600 to include suitable information relating to EBML file 726 .
  • the user device can include a video element 620 in top level index file 600 corresponding to EBML file 726 .
  • video element 620 can include any suitable information relating to the name of the cached EBML file, the location of the EBML file, and/or any other suitable information about the cached EBML file, such as a file path through which EBML file 726 can be retrieved.
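  • The adaptive selection performed on each pass of this loop can be reduced to a simple rule of thumb, sketched below; the throughput estimate and the 0.8 safety margin are illustrative assumptions:
```python
# Sketch: estimate throughput from the previous fragment download and pick the
# highest-bit-rate version that fits. The 0.8 safety margin is an assumption.
def estimate_bandwidth_bps(bytes_received: int, elapsed_s: float) -> float:
    return (bytes_received * 8) / max(elapsed_s, 1e-6)

def choose_version(versions: list, bandwidth_bps: float, margin: float = 0.8) -> dict:
    # versions: entries such as {"uri": ..., "bitrate": ...} parsed from the top level index.
    affordable = [v for v in versions if v["bitrate"] <= bandwidth_bps * margin]
    if not affordable:
        return min(versions, key=lambda v: v["bitrate"])   # fall back to the lowest bit rate
    return max(affordable, key=lambda v: v["bitrate"])

# e.g. choose_version([{"uri": "low.mkv", "bitrate": 500_000},
#                      {"uri": "high.mkv", "bitrate": 2_000_000}],
#                     estimate_bandwidth_bps(1_500_000, 6.0))
```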
  • each of the top level index file, the header file(s), the media content fragments, and other media content files can be stored/cached for any suitable period of time.
  • a suitable cache duration can be specified for each of the stored/cached files, such as seconds, minutes, hours, days, weeks, months, or any suitable period of time.
  • no particular cache duration needs to be specified for the stored/cached files.
  • the content stored/cached in the files can be stored for an indefinite duration and need not expire with time.
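  • Where a cache duration is used, it might be enforced with a simple file-age check as sketched below; the one-week figure in the usage comment is only an example:
```python
# Sketch: treat a cached file as expired once a chosen cache duration has elapsed.
# Passing None models the "indefinite duration, no expiry" behavior described above.
import os
import time
from typing import Optional

def is_cache_expired(path: str, max_age_s: Optional[float]) -> bool:
    if max_age_s is None:
        return False                                   # no cache duration: never expires by time
    age_s = time.time() - os.path.getmtime(path)
    return age_s > max_age_s

# e.g. is_cache_expired("cache/top_level_index.smil", 7 * 24 * 3600)  # one-week cache duration
```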
  • the user device can cause the cached media content contained in the EBML files to be rendered.
  • the cached media content can be rendered at any suitable time.
  • the cached media content can be rendered when the user device is streaming media content from the server (e.g., using process 400 or other suitable processes).
  • the user device can simultaneously cache the media content fragment(s) and render the media content contained in the media content fragment(s).
  • the user device can request, receive, and/or cache one or more media content fragments as fast as it can (e.g., by utilizing the available bandwidth, hardware capacity, etc.). The user device can then render the media content fragment(s).
  • the user device can retrieve the EBML files based on the cached top level index file.
  • the user device can then extract the media content data (e.g., video data, audio data, subtitles, etc.) contained in the EBML files, decode the media content data, and cause the media content to be rendered based on the top level index file and/or one or more of the header files.
  • media content contained in multiple cached/stored media content fragments can be rendered based on process 500 .
  • process 500 can be implemented in a suitable user device (e.g., such as user device 108 of FIG. 1 ).
  • process 500 can begin by retrieving a first cached media content fragment at 502 .
  • the user device can retrieve the first cached media content fragment based on the cached top level index file. More particularly, for example, the user device can parse the cached top level index file and extract data about the location of the media content file that contains the first cached media content fragment.
  • the cached top level index file can include URI element 612 that is associated with EBML file 724 ( FIG. 7 ) containing the first cached media content fragment.
  • the user device can locate EBML file 724 based on URI element 612 (e.g., by converting the URI into one or more file paths corresponding to the location of EBML file 724 ).
  • the user device, upon retrieval of the first cached media content fragment, can render the first cached media content fragment at 504 .
  • the first cached media content fragment can be rendered in any suitable manner.
  • the user device can extract and decode the media content data (e.g., video data, audio data, subtitles, etc.) from the retrieved EBML file.
  • the user device can then cause the decoded media content data to be rendered.
  • the decoded content data can be rendered based on the cached header file (e.g., header file 722 of FIG. 7 ).
  • in response to determining that the retrieved media content fragment is the last cached media content fragment to be retrieved and/or rendered, process 500 can end at 508 .
  • the user device, in response to determining that the retrieved media content fragment is not the last cached media content fragment to be retrieved and/or rendered, can retrieve the next cached media content fragment at 510 .
  • the user device can retrieve the second cached media content fragment based on the cached top level index file. More particularly, for example, the user device can parse the cached top level index file and extract data about the location of the media content file that contains the second cached media content fragment.
  • the cached top level index file can include URI element 622 associated with EBML file 726 ( FIG. 7 ) that contains the second cached media content fragment.
  • the user device can locate EBML file 726 based on URI element 622 (e.g., by converting the URI into one or more file paths corresponding to the location of EBML file 726 ).
  • the user device, in response to retrieval of the second cached media content fragment, can render the second cached media content fragment at 512 .
  • the second cached media content fragment can be rendered in any suitable manner.
  • the user device can extract and decode the media content data (e.g., video data, audio data, subtitles, etc.) from the retrieved EBML file (e.g., EBML file 726 of FIG. 7 ).
  • the user device can then cause the decoded media content data to be rendered.
  • the decoded content data can be rendered based on the cached header file (e.g., header file 722 of FIG. 7 ).
  • the user device can cause a cached media content fragment to be rendered upon retrieval of the cached media content fragment. In some embodiments, the user device can retrieve multiple cached media content fragments and cause some or all of the retrieved media content fragments to be rendered in a suitable order.
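  • In outline, process 500 walks the cached top level index in presentation order and hands each cached fragment, together with the cached header, to a decoder; the sketch below uses a placeholder render step and the assumed attribute names from the earlier index-update sketch:
```python
# Sketch of process 500: retrieve cached fragments via the cached top level index,
# in start-time order, and pass each one to a (placeholder) render step.
import xml.etree.ElementTree as ET
from pathlib import Path

def render_fragment(fragment_bytes: bytes, header_bytes: bytes) -> None:
    # Placeholder: a real player would extract the block elements, decode them with
    # the codec identified in the header's tracks element, and display the result.
    print(f"rendering {len(fragment_bytes)} bytes")

def play_cached_media(index_path: Path, header_path: Path) -> None:
    header_bytes = Path(header_path).read_bytes()            # cached header file
    root = ET.parse(index_path).getroot()
    fragments = sorted(root.findall("video"),                # assumed element/attribute names
                       key=lambda el: float(el.get("start-time")))
    for element in fragments:
        fragment_bytes = Path(element.get("uri")).read_bytes()
        render_fragment(fragment_bytes, header_bytes)
```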
  • any suitable computer readable media can be used for storing instructions for performing the mechanisms and/or processes described herein.
  • computer readable media can be transitory or non-transitory.
  • non-transitory computer readable media can include media such as magnetic media (such as hard disks, floppy disks, etc.), optical media (such as compact discs, digital video discs, Blu-ray discs, etc.), semiconductor media (such as flash memory, electrically programmable read only memory (EPROM), electrically erasable programmable read only memory (EEPROM), etc.), any suitable media that is not fleeting or devoid of any semblance of permanence during transmission, and/or any suitable tangible media.
  • transitory computer readable media can include signals on networks, in wires, conductors, optical fibers, circuits, any suitable media that is fleeting and devoid of any semblance of permanence during transmission, and/or any suitable intangible media.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

Methods, systems, and computer readable media for streaming media content are provided. In some embodiments, the methods comprise: receiving top level index data from a server; caching the top level index data in an index file; receiving header data associated with a first media content file from the server; caching the header data in a header file; receiving a first segment of the first media content file based at least in part on the index file; caching the first segment of the first media content file in a first file; updating the index file to include information about the first file; and causing the first fragment to be displayed based at least in part on the index file and the header file.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. application Ser. No. 13/931,198 entitled “Systems, Methods, and Media for Streaming Media Content” to Jason A. Braness, filed Jun. 28, 2013, the disclosure of which is expressly incorporated by reference herein in its entirety.
  • TECHNICAL FIELD
  • Methods, systems, and media for streaming media content are provided. More particularly, the disclosed subject matter relates to adaptive bitrate streaming.
  • BACKGROUND OF THE INVENTION
  • There are many conventional approaches to streaming media content, such as television programs, pay-per-view programs, on-demand programs, Internet content, movies, etc. For example, media content can be encoded at multiple bit rates. The encoded media content can then be transmitted using a suitable protocol, such as the Hypertext Transfer Protocol (HTTP), the Real-time Transport Protocol (RTP), the Real Time Streaming Protocol (RTSP), etc. However, conventional approaches do not provide users with the capabilities to stream, store, and playback media content at variable bitrates.
  • Accordingly, new mechanisms for streaming media content are desirable.
  • SUMMARY OF THE INVENTION
  • In view of the foregoing, systems, methods, and media for streaming media content are provided.
  • In some embodiments, methods for streaming media content are provided, the methods comprising: receiving top level index data from a server; caching the top level index data in an index file; receiving header data associated with a first media content file from the server; caching the header data in a header file; receiving a first segment of the first media content file based at least in part on the index file; caching the first segment of the first media content file in a first file; updating the index file to include information about the first file; and causing the first fragment to be displayed based at least in part on the index file and the header file.
  • In some embodiments, systems for streaming media content are provided, the systems comprising at least one hardware processor that is configured to: receive top level index data from a server; cache the top level index data in an index file; receive header data associated with a first media content file from the server; cache the header data in a header file; receive a first segment of the first media content file based at least in part on the index file; cache the first segment of the first media content file in a first file; update the index file to include information about the first file; and cause the first segment to be displayed based at least in part on the index file and the header file.
  • In some embodiments, non-transitory media containing computer-executable instructions that, when executed by a hardware processor, cause the hardware processor to perform a method for streaming media content are provided, the method comprising: receiving top level index data from a server; caching the top level index data in an index file; receiving header data associated with a first media content file from the server; caching the header data in a header file; receiving a first segment of the first media content file based at least in part on the index file; caching the first segment of the first media content file in a first file; updating the index file to include information about the first file; and causing the first segment to be displayed based at least in part on the index file and the header file.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other objects and advantages of the invention will be apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings, in which like reference characters refer to like parts throughout, and in which:
  • FIG. 1 shows a generalized block diagram of an example of an architecture of hardware that can be used to stream media content in accordance with some embodiments of the invention;
  • FIG. 2 shows examples of a top level index file and Matroska container files in accordance with some embodiments of the invention;
  • FIG. 3 shows an example of a structure of a Matroska container file in accordance with some embodiments of the invention;
  • FIG. 4 shows a flow chart of an example of a process for streaming media content in accordance with some embodiments of the invention;
  • FIG. 5 shows a flow chart of an example of a process for rendering media content in accordance with some embodiments of the invention;
  • FIG. 6 shows an example of a top level index file in accordance with some embodiments of the invention; and
  • FIG. 7 shows an example of Matroska container files containing cached media content in accordance with some embodiments of the invention.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • This invention generally relates to mechanisms (which can be systems, methods, media, etc.) for streaming media content. The mechanisms can be used in many applications. For example, the mechanisms can be used to stream, store, and/or playback media content having different versions (e.g., such as video content encoded at multiple bit rates, resolutions, frame rates, etc.).
  • In some embodiments, media content (e.g., such as video content, audio content, subtitles, etc.) can be stored in one or more Matroska container files on a server. The Matroska container is a media container developed as an open standard project by the Matroska non-profit organization of Aussonne, France. The Matroska specification (which can be retrieved from the Internet: http://matroska.org/technical/specs/index.html) is hereby incorporated by reference herein in its entirety. In some embodiments, for example, the server can store multiple Matroska container files containing encoded video content having different bit rates, resolutions, frame rates, etc.
  • In some embodiments, a user device can request a top level index file from the server. For example, the user device can send one or more requests containing information relating to resources that can provide the top level index file under a suitable protocol (e.g., such as a Hypertext Transfer Protocol (HTTP), a Transmission Control Protocol (TCP), etc.). In a more particular example, the user device can request the top level index file via one or more HTTP requests containing one or more Uniform Resource Identifiers (URI) associated with the top level index file.
  • In some embodiments, the user device can receive the requested top level index file via one or more responses sent by the server. In the example where multiple HTTP requests are used to request the top level index file, the top level index file can be received via one or more HTTP responses corresponding to the HTTP requests. In some embodiments, the top level index file can be received in any suitable format. For example, the top level index file can be received as a Synchronized Multimedia Integration Language (SMIL) file, an Extensible Markup Language (XML) file, etc.
  • In some embodiments, upon receiving the top level index file, the user device can cache the top level index file in a suitable manner. For example, the top level index file can be cached in the form of one or more SMIL files, XML files, etc.
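  • As a rough, hedged illustration of the request-and-cache step just described, the Python sketch below fetches a top level index over HTTP and writes it to a local cache file. The URL, file name, and cache directory are hypothetical placeholders and are not taken from the disclosure; a real implementation would use the URI(s) actually provided for the top level index.
```python
# A minimal sketch of requesting and caching a top level index file.
# The URL and cache path below are hypothetical placeholders.
import os
import urllib.request

INDEX_URL = "http://example.com/media/top_level_index.smil"  # hypothetical URI
CACHE_DIR = "cache"

def fetch_and_cache_index(url=INDEX_URL, cache_dir=CACHE_DIR):
    """Request the top level index over HTTP and cache it locally."""
    os.makedirs(cache_dir, exist_ok=True)
    with urllib.request.urlopen(url) as response:
        index_data = response.read()
    index_path = os.path.join(cache_dir, "top_level_index.smil")
    with open(index_path, "wb") as f:
        f.write(index_data)
    return index_path
```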
  • In some embodiments, the user device can request one or more headers associated with one or more Matroska container files based on the cached top level index file. For example, the user device can parse the cached top level index file and obtain one or more URIs corresponding to the headers. The user device can then request the headers by sending one or more requests containing the URIs to the server and/or another server.
  • In some embodiments, the user device can receive the requested headers through one or more responses that are sent by the server in response to the requests. In some embodiments, the user device can also cache the received headers in a suitable manner. For example, each of the headers can be cached as an Extensible Binary Meta Language (EBML) file.
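  • The header retrieval and caching described above can be sketched as follows. The per-track URIs, the .ebml file naming scheme, and the cache directory are illustrative assumptions rather than details of the disclosure.
```python
# Illustrative retrieval and caching of container headers; the URIs and the
# per-track file naming scheme are hypothetical.
import os
import urllib.request

HEADER_URIS = {
    "video": "http://example.com/media/video_header",     # hypothetical
    "audio": "http://example.com/media/audio_header",     # hypothetical
    "subtitles": "http://example.com/media/sub_header",   # hypothetical
}

def cache_headers(header_uris=HEADER_URIS, cache_dir="cache"):
    """Download each header and cache it as a separate local file."""
    os.makedirs(cache_dir, exist_ok=True)
    cached = {}
    for track, uri in header_uris.items():
        with urllib.request.urlopen(uri) as response:
            data = response.read()
        path = os.path.join(cache_dir, f"{track}_header.ebml")
        with open(path, "wb") as f:
            f.write(data)
        cached[track] = path
    return cached
```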
  • In some embodiments, the user device can request one or more media content fragments from the server. For example, the user device can request one or more cluster elements of one or more Matroska container files stored on the server. In a more particular example, the user device can request a cluster element of a Matroska container file (e.g., a video file containing suitable video data) based on the streaming conditions (e.g., such as the bandwidth, the hardware capacity, etc. that can be utilized to stream media content) that is experienced by the user device.
  • In some embodiments, upon receiving the requested media content fragments from the server, the user device can cache the media content fragments. For example, the user device can cache each media content fragment in an EBML file upon receiving the media content fragment. In some embodiments, the user device can also update the cached top level index file to include information about the cached media content fragments. For example the cached top level index file can be updated to include one or more URIs corresponding to each EBML file that stores the cached media content fragment.
  • In some embodiments, after one or more media content fragments are cached (e.g., in multiple EBML files, respectively), the user device can cause the cached media content to be rendered. For example, the user device can cause cached video content, audio content, subtitles, etc. to be rendered based on the cached top level index file, the cached headers, and/or any other suitable information. In a more particular example, the user device can retrieve multiple EBML files that store the cached media content fragments based on the top level index file (e.g., using the URIs corresponding to each EBML file). The user device can then extract the media content stored in the EBML files, decode the media content, and cause the decoded media content to be rendered.
  • In some embodiments, the cached media content can be rendered at any suitable time. For example, the cached media content can be rendered when the user device is streaming and/or downloading media content from the server. As another example, the cached media content can be rendered after the user device has finished streaming and/or caching media content from the server. In a more particular example, the user device can cause the cached media content to be rendered in response to a user requesting a playback of part or all of the cached media content at any time with or without a live communication connection with the server.
  • Turning to FIG. 1, a generalized block diagram of an example 100 of an architecture of hardware that can be used to stream media content in accordance with some embodiments is shown. As illustrated, architecture 100 can include a media content source 102, one or more servers 104, a communications network 106, one or more user devices 108, and communications paths 110, 112, 114, and 116.
  • Media content source 102 can include any suitable device that can provide media content. For example, media content source 102 can include any suitable circuitry that is capable of encoding media content, such as one or more suitable video encoders, audio encoders, video decoders, audio decoders, etc. In a more particular example, media content source 102 can include one or more suitable video encoders that are capable of encoding video content into different versions, each of which can have a particular bit rate, a particular resolution, a particular frame rate, a particular bit depth, etc.
  • As another example, media content source 102 can include one or more types of content distribution equipment for distributing any suitable media content, including television distribution facility equipment, cable system head-end equipment, satellite distribution facility equipment, programming source equipment (e.g., equipment of television broadcasters, such as NBC, ABC, HBO, etc.), intermediate distribution facility equipment, Internet provider equipment, on-demand media server equipment, and/or any other suitable media content provider equipment. NBC is a trademark owned by the National Broadcasting Company, Inc., ABC is a trademark owned by the ABC, INC., and HBO is a trademark owned by the Home Box Office, Inc.
  • Media content source 102 may be operated by the originator of content (e.g., a television broadcaster, a Webcast provider, etc.) or may be operated by a party other than the originator of content (e.g., an on-demand content provider, an Internet provider of content of broadcast programs for downloading, etc.).
  • Media content source 102 may be operated by cable providers, satellite providers, on-demand providers, Internet providers, providers of over-the-top content, and/or any other suitable provider(s) of content.
  • Media content source 102 may include a remote media server used to store different types of content (including video content selected by a user) in a location remote from any of the user equipment devices. For example, media content source 102 can include one or more content delivery networks (CDN).
  • As referred to herein, the term “media content” or “content” should be understood to mean one or more electronically consumable media assets, such as television programs, pay-per-view programs, on-demand programs (e.g., as provided in video-on-demand (VOD) systems), Internet content (e.g., streaming content, downloadable content, Webcasts, etc.), movies, films, video clips, audio, audio books, and/or any other media or multimedia and/or combination of the same. As referred to herein, the term “multimedia” should be understood to mean media content that utilizes at least two different content forms described above, for example, text, audio, images, video, or interactivity content forms. Media content may be recorded, played, displayed or accessed by user equipment devices, but can also be part of a live performance. In some embodiments, media content can include over-the-top (OTT) content. Examples of OTT content providers include YOUTUBE, NETFLIX, and HULU, which provide audio and video via IP packets. Youtube is a trademark owned by Google Inc., Netflix is a trademark owned by Netflix Inc., and Hulu is a trademark owned by Hulu, LLC.
  • Media content can be provided from any suitable source in some embodiments. In some embodiments, media content can be electronically delivered to a user's location from a remote location. For example, media content, such as a Video-On-Demand movie, can be delivered to a user's home from a cable system server. As another example, media content, such as a television program, can be delivered to a user's home from a streaming media provider over the Internet.
  • Server(s) 104 can be and/or include any suitable device that is capable of receiving, storing, processing, and/or delivering media content, and/or communicating with one or more user devices and/or other components of architecture 100 under one or more suitable protocols. For example, server(s) 104 can include any suitable circuitry that can receive requests, process requests, send responses, and/or perform other functions under a Hypertext Transfer Protocol (HTTP), a Transmission Control Protocol (TCP), etc.
  • In some embodiments, server(s) 104 can store media content that can be delivered to one or more components of architecture 100 in a suitable manner. For example, the media content can be stored in one or more suitable multimedia containers, such as Matroska media containers, Audio Video Interleaved (AVI) media containers, MPEG-4 Part 14 (MP4) media containers, etc.
  • In a more particular example, as illustrated in FIG. 2, server(s) 104 can store one or more Matroska container files 210 and one or more top level index files 220.
  • Matroska container files 210 can include any suitable files containing data about suitable media content, such as video content, audio content, subtitles, etc. For example, Matroska container files 210 can include one or more MKV files that can include data about video content, audio content, subtitles, etc. As another example, Matroska container files 210 can include one or more MKA files that can include audio data. As yet another example, Matroska container files 210 can include one or more MKS files that can include data about subtitles. As yet another example, Matroska container files 210 can include one or more MK3D files that can include data about stereoscopic video content.
  • In a more particular example, Matroska container files 210 can include one or more video files, such as video files 212 and 214 as illustrated in FIG. 2. In some embodiments, each of video files 212 and 214 can include data about video content having a particular bit rate, a particular resolution, a particular frame rate, etc. In some embodiments, each of video files 212 and 214 can contain a version of particular video content. More particularly, for example, video file 212 can contain a version of the particular video content including encoded video content having a first bit rate (and/or a first frame rate, a first resolution, etc.). Video file 214 can contain a version of the particular video content including encoded video content having a second bit rate (and/or a second frame rate, a second resolution, etc.).
  • Although two video files are shown in FIG. 2 to avoid over-complicating the drawing, any suitable number of these video files can be used in some embodiments. For example, Matroska container files 210 can include multiple video files (e.g., nine files or any suitable number of files), where each video file contains a version of particular video content (e.g., encoded video content having a particular bit rate, a particular resolution, a particular frame rate, etc.).
  • In another more particular example, Matroska container files 210 can include one or more audio files, such as an audio file 216. In some embodiments, audio file 216 can contain audio content that is associated with the video content contained in one or more video files, such as video files 212 and 214.
  • In yet another more particular example, Matroska container files 210 can include one or more files that contain subtitles associated with suitable video content and/or audio content, such as a subtitle file 218. In some embodiments, subtitle file 218 can contain data about subtitles that relate to the video content contained in video files 212 and 214 and/or the audio content contained in audio file 216.
  • In some embodiments, each of Matroska container files 210 can have a structure as illustrated in FIG. 3. As shown, Matroska container file 300 can include a header element 310, one or more cluster elements 320, an index element 330, and/or any other suitable components.
  • Header element 310 can include any suitable information relating to Matroska container file 300, such as a description of file 300, the version of file 300, etc. Header element 310 can also include any suitable information relating to the media content stored in file 300, such as the bit rate, the resolution, the frame rate, etc. of the media content.
  • In some embodiments, header element 310 can include an Extensible Binary Meta Language (EBML) element 311, one or more segment elements 312, and/or any other suitable components.
  • In some embodiments, EBML element 311 can include information about the EBML version of the file, the type of EBML file (e.g., a Matroska file), etc.
  • Segment element 312 can contain any suitable data about media content, header, etc. In some embodiments, segment element 312 can include a seekhead element 313, a segmentinfo element 314, a tracks element 315, and/or any other suitable components.
  • In some embodiments, seekhead element 313 can include any suitable information about one or more components of segment element 312, such as a list of the positions of the components of segment element 312 (e.g., such as segmentinfo element 314, tracks element 315, etc.). Segmentinfo element 314 can include any suitable information about segment element 312 and/or file 300, such as the duration of the media content contained in segment element 312, an identification number corresponding to segment element 312 (e.g., a randomly generated unique number that can be used to identify segment element 312), a title of segment element 312 and/or file 300, etc. Tracks element 315 can include any suitable information about one or more media tracks that are stored in segment element 312, such as the type of each of the tracks (e.g., audio, video, subtitles, etc.), the codec used to generate each of the tracks, the resolution of video content, the frame rate of video content, the bit depth of video content, etc.
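  • The header nesting described above can be summarized with a simplified data model. The sketch below uses Python dataclasses to mirror the EBML/segment/seekhead/segmentinfo/tracks hierarchy; the field names are descriptive stand-ins chosen for illustration and do not reproduce the exact element IDs of the Matroska/EBML specification.
```python
# Simplified, illustrative model of the Matroska header hierarchy described
# above; field names are descriptive stand-ins, not actual EBML element IDs.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class EBMLHeader:
    ebml_version: int         # EBML version of the file
    doc_type: str             # e.g., "matroska"

@dataclass
class Track:
    track_type: str           # "video", "audio", or "subtitle"
    codec_id: str             # codec used to generate the track
    resolution: tuple = None  # (width, height) for video tracks
    frame_rate: float = None  # frames per second for video tracks

@dataclass
class SegmentInfo:
    duration_ms: float        # duration of the media in the segment
    segment_uid: bytes        # randomly generated identification number
    title: str = ""

@dataclass
class Segment:
    seekhead: Dict[str, int] = field(default_factory=dict)  # element name -> position
    info: SegmentInfo = None
    tracks: List[Track] = field(default_factory=list)

@dataclass
class MatroskaHeader:
    ebml: EBMLHeader
    segment: Segment
```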
  • Cluster element 320 can contain any suitable information relating to media content, such as video content, audio content, subtitles, etc. For example, cluster element 320 can contain video data, audio data, or subtitles corresponding to media content having a particular duration (e.g., two seconds, or any suitable duration). As another example, cluster element 320 can also contain a timecode element that can indicate the start time of the media content contained in cluster element 320.
  • In a more particular example, cluster element 320 can include one or more blockgroup elements 322. Blockgroup element 322 can include any suitable information relating to a part of or all of the media content data contained in cluster element 320. For example, blockgroup element 322 can contain one or more block elements 324, each of which can contain a block of media content data (e.g., video data, audio data, subtitles, etc.) that can be rendered by a user device.
  • As another example, blockgroup element 322 can also contain any suitable information relating to the block of media content data, such as the start time of the media content, the duration of the media content, the type of media content data contained in blockgroup element 322 (e.g., video, audio, subtitles, etc.), etc. In a more particular example, blockgroup element 322 can include one or more suitable timecodes corresponding to the start time, the end time, the duration, and/or other suitable information of the media content contained in blockgroup element 322.
  • In some embodiments, file 300 can include multiple cluster elements 320 (e.g., cluster element 321, cluster element 326, . . . , and cluster element 328). In some embodiments, for example, each of the cluster elements can contain data about a portion of a piece of media content. In a more particular example, each cluster element can contain a portion of the piece of media content having the same duration (e.g., such as two seconds, or any other suitable duration). More particularly, for example, cluster elements 321 and 326 can contain data about a first portion of the piece of media content (e.g., the first two seconds of the media content) and a second portion of the piece of media content (e.g., the second two seconds of the media content), respectively.
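  • Because each cluster in this arrangement covers a fixed duration, a player can compute directly which cluster holds a given playback time. The helper below assumes two-second clusters purely for illustration; the duration is an assumption, not a requirement of the disclosure.
```python
# Map a playback position to the cluster that contains it, assuming every
# cluster covers the same fixed duration (two seconds in this illustration).
CLUSTER_DURATION_S = 2.0  # hypothetical fixed cluster duration

def cluster_index_for_time(playback_time_s: float,
                           cluster_duration_s: float = CLUSTER_DURATION_S) -> int:
    """Return the zero-based index of the cluster containing playback_time_s."""
    if playback_time_s < 0:
        raise ValueError("playback time must be non-negative")
    return int(playback_time_s // cluster_duration_s)

# Example: 2.0 s falls in the second cluster (index 1) of a two-second layout.
assert cluster_index_for_time(2.0) == 1
assert cluster_index_for_time(3.9) == 1
```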
  • In some embodiments, multiple Matroska container files can contain cluster elements corresponding to the same portion of the piece of media content. For example, in the example where video file 212 and video file 214 (FIG. 2) contain different versions of the same source media content, the first cluster element of video file 212 (e.g., cluster element 321 of FIG. 3) and the first cluster element of video file 214 (e.g., cluster element 321 of FIG. 3) can contain different versions of the first portion of the piece of media content (e.g., encoded video content having different bit rates, resolutions, frame rates, etc.). In some embodiments, the first cluster element of audio file 216 and the first cluster element of subtitle file 218 can contain audio data and subtitles corresponding to the first portion of the media content.
  • Referring back to FIG. 3, index element 330 can include any suitable information relating to identifying one or more cluster elements 320 or any suitable portions of the cluster elements. For example, index element 330 can include one or more Cues elements 332 that can contain any suitable information that can be used to identify and/or seek one or more cluster elements, block elements, etc. In a more particular example, Cues element 332 can include one or more timecodes containing information about the duration, the start time, the end time, etc. of the media content contained in one or more cluster elements, block elements, video frames, etc. In another more particular example, cues element 332 can include a list of positions of multiple cluster elements, block elements, video frames, etc. More particularly, for example, the list of positions can include the positions of the cluster elements, block elements, video frames, etc. associated with a particular timecode.
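  • One way to use the kind of timecode-to-position list described above is a simple binary search: given a seek target, find the last indexed entry whose timecode does not exceed it. The sketch below assumes the cues have already been parsed into (timecode, position) pairs; it is not a parser for an actual Cues element.
```python
# Illustrative seek lookup over a parsed list of (timecode_ms, byte_position)
# pairs, such as might be extracted from an index (Cues) element.
import bisect

def find_seek_point(cues, target_timecode_ms):
    """Return the (timecode_ms, byte_position) entry at or before the target.

    `cues` must be a list of (timecode_ms, byte_position) tuples sorted by
    timecode, e.g. [(0, 4096), (2000, 150000), (4000, 298000)].
    """
    timecodes = [tc for tc, _ in cues]
    i = bisect.bisect_right(timecodes, target_timecode_ms) - 1
    if i < 0:
        raise ValueError("target precedes the first indexed cluster")
    return cues[i]

# Example: seeking to 3.5 s lands on the cluster indexed at 2.0 s.
cues = [(0, 4096), (2000, 150000), (4000, 298000)]
assert find_seek_point(cues, 3500) == (2000, 150000)
```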
  • Referring back to FIG. 2, top level index file 220 can be any suitable file containing any suitable information relating to one or more of Matroska container files 210. In some embodiments, for example, top level index file 220 can be a Synchronized Multimedia Integration Language (SMIL) file, an Extensible Markup Language (XML) file, a HyperText Markup Language (HTML) file, etc.
  • In some embodiments, for example, top level index file 220 can include any suitable information concerning the media content contained in one or more of Matroska container files 210. In a more particular example, top level index file 220 can include information about the bit rates, frames rates, resolutions, etc. of the video content contained in video files 212 and 214.
  • In some embodiments, top level index file 220 can also include any suitable information that can be used to identify and/or seek one or more of Matroska container files 210 and/or any suitable portions of Matroska container files 210. For example, top level index file 220 can include information that can be used to identify one or more resources from which one or more of Matroska container files 210 can be obtained, such as the names of the resources, the locations of the resources, etc. In a more particular example, top level index file 220 can include one or more uniform resource identifiers (URIs) associated with one or more of Matroska container files 210 (e.g., such as video file 212, video file 214, audio file 216, subtitle file 218, etc.). In another more particular example, top level index file 220 can also include one or more URIs associated with one or more header elements, cluster elements, block elements, segment elements, index elements, etc. of one or more Matroska container files 210.
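  • The sketch below parses a minimal SMIL-like top level index of the general kind described above and collects the URI and declared bit rate of each listed video file. The element and attribute names (video, src, system-bitrate) are hypothetical; an actual top level index may structure this information differently.
```python
# Parse a minimal, hypothetical SMIL-like top level index and collect the
# URI and declared bit rate of each video entry. Element/attribute names are
# illustrative only.
import xml.etree.ElementTree as ET

SAMPLE_INDEX = """
<smil>
  <body>
    <switch>
      <video src="http://example.com/video_2mbps.mkv" system-bitrate="2000000"/>
      <video src="http://example.com/video_500kbps.mkv" system-bitrate="500000"/>
      <audio src="http://example.com/audio.mka"/>
    </switch>
  </body>
</smil>
"""

def list_video_renditions(index_xml: str):
    """Return a list of (uri, bitrate_bps) pairs for the video entries."""
    root = ET.fromstring(index_xml)
    renditions = []
    for video in root.iter("video"):
        uri = video.get("src")
        bitrate = int(video.get("system-bitrate", "0"))
        renditions.append((uri, bitrate))
    return renditions

print(list_video_renditions(SAMPLE_INDEX))
# [('http://example.com/video_2mbps.mkv', 2000000),
#  ('http://example.com/video_500kbps.mkv', 500000)]
```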
  • Referring back to FIG. 1, architecture 100 can also include one or more user devices 108. Each user device 108 can be any suitable device that is capable of receiving, processing, converting, and/or rendering media content, and/or performing any other suitable functions. For example, each user device 108 can be a desktop computer, a laptop computer, a tablet computer, a mobile phone, a television device, a set-top box, a streaming media player, a digital media receiver, a DVD player, a BLU-RAY player, a game console, etc., and/or any suitable combination of the same.
  • In some embodiments, communications network 106 may be any one or more networks including the Internet, a mobile phone network, a mobile voice or data network (e.g., a 3G, 4G, or LTE network), a cable network, a satellite network, a public switched telephone network, a local area network, a wide area network, a fiber-optic network, any other suitable type of communications network, and/or any suitable combination of these communications networks.
  • In some embodiments, media content source 102, server(s) 104, communications network 106, and user device(s) 108 can be implemented in any suitable hardware. For example, each of media content source 102, server(s) 104, communications network 106, and user device(s) 108 can be implemented in any of a general purpose device such as a computer or a special purpose device such as a client, a server, mobile terminal (e.g., mobile phone), etc. Any of these general or special purpose devices can include any suitable components such as a hardware processor (which can be a microprocessor, digital signal processor, a controller, etc.). In some embodiments, each of media content source 102, server(s) 104, communications network 106, and user device(s) 108 can include a suitable storage device, such as random-access memory, read-only memory, hard drives, optical drives, digital video disc (DVD) recorders, compact disc (CD) recorders, BLU-RAY disc (BD) recorders, BLU-RAY 3D disc recorders, digital video recorders (DVR, sometimes called a personal video recorder, or PVR), solid state devices, quantum storage devices, gaming consoles, gaming media, or any other suitable fixed or removable storage devices, and/or any combination of the same.
  • In some embodiments, each of media content source 102, server(s) 104, communications network 106, and user device(s) 108 can be implemented as a stand-alone device or integrated with other components of architecture 100.
  • In some embodiments, media content source 102 can be connected to server(s) 104 and communications network 106 through communications paths 110 and 112, respectively. In some embodiments, communications network 106 can be connected to server(s) 104 and user device(s) 108 through communications paths 114 and 116, respectively.
  • Communications paths 110, 112, 114, and 116 may separately or together include one or more communications paths, such as, a satellite path, a fiber-optic path, a cable path, a path that supports Internet communications (e.g., IPTV), free-space connections (e.g., for broadcast or other wireless signals), or any other suitable wired or wireless communications path or combination of such paths, in some embodiments.
  • Turning to FIG. 4, an example 400 of a process for streaming and storing media content in accordance with some embodiments of the disclosure is shown. In some embodiments, process 400 can be implemented in a suitable user device (such as user device 108 of FIG. 1).
  • As illustrated, process 400 can begin by requesting a top level index file at 402. The top level index file can be requested in any suitable manner. For example, the user device can request the top level index file by sending one or more requests containing information about the name of the top level index file, the resources from which the top level index file can be obtained, the location of the top level index file, etc. under a suitable protocol (e.g., such as HTTP, TCP, etc.). In a more particular example, the user device can send to the server one or more HTTP requests containing information about one or more URIs associated with the top level index file.
  • Next, at 404, the user device can receive a top level index file from the server. The top level index file can be received in any suitable manner. For example, the top level index file can be received via one or more responses that are sent by the server. In a more particular example, in the example described above where one or more HTTP requests are used to request the top level index file, the user device can receive the top level index file via one or more HTTP responses that are sent by the server in response to the HTTP requests.
  • In some embodiments, the top level index file can include any suitable information relating to one or more media content files (e.g., such as Matroska container files, etc.). In a more particular example, top level index file 220 as described above in connection with FIG. 2 can be received at 404.
  • In some embodiments, the top level index file can be received in any suitable format. For example, the received top level index file can be a SMIL file, an XML file, etc.
  • At 406, the user device can store the received top level index file. The top level index file can be stored in any suitable manner. For example, the top level index file can be cached in a suitable format (e.g., as an index file 710 of FIG. 7 that can be a SMIL file, an XML file, etc.). In a more particular example, the cached top level index file 710 can include any suitable portions of SMIL file 600 as shown in FIG. 6.
  • At 408, the user device can request one or more headers associated with one or more Matroska container files. Each of the headers can contain any suitable information relating to the version of its corresponding Matroska container file, the media content contained in the Matroska container file, the components of the Matroska container file, etc. For example, each of the headers can include one or more header elements of a Matroska container file (e.g., such as video files 212 or 214, audio file 216, subtitle file 218, etc. as illustrated in FIG. 2). In a more particular example, as described above in connection with FIG. 3, each of the headers can include an EBML element, a segment element, a seekhead element, a segmentinfo element, a tracks element, and/or any other suitable components.
  • In some embodiments, the headers can be requested in any suitable manner. For example, the headers can be requested based on the top level index file received at 404. In a more particular example, the user device can parse the top level index file and obtain information relating to one or more URIs corresponding to the headers. The user device can then send one or more requests (e.g., HTTP requests, etc.) containing the URIs to the server.
  • At 410, the user device can receive one or more headers associated with one or more Matroska container files. The headers can be received in any suitable manner. For example, the headers can be received via one or more responses that are sent by the server. In a more particular example, in the example described above where one or more HTTP requests are used to request the headers, the user device can receive the headers via one or more HTTP responses that are sent by the server in response to the HTTP requests.
  • More particularly, for example, one or more of a header of a video file (e.g., video file 212 and/or video file 214 of FIG. 2), a header of an audio file (e.g., audio file 216 of FIG. 2), and a header of a subtitle file (e.g., subtitle file 218 of FIG. 2) can be received in response to the requests sent at 408.
  • Next, at 412, the user device can store the received headers and update the top level index file. The headers can be stored in any suitable manner. For example, each of the received headers can be cached as a Matroska container file. In a more particular example, as illustrated in FIG. 7, each of the header of the video file (e.g., video file 212 of FIG. 2), the header of the audio file (e.g., audio file 216 of FIG. 2), and the header of the subtitle file (e.g., subtitle file 218 of FIG. 2) can be cached as an EBML file, i.e., header file 722, header file 732, and header file 742, respectively. In some embodiments, each of header files 722, 732, and 742 can have a structure similar to header element 310 of FIG. 3.
  • In some embodiments, in response to caching the header file(s), the user device can update the top level index file. The top level index file can be updated in any suitable manner. For example, the top level index file can be updated to include information relating to the one or more headers that have been received and cached, such as the location of the header file(s), the size of the header file(s), etc.
  • At 414, the user device can request fragment index information. Any suitable fragment index information can be requested. For example, the user device can request one or more index elements associated with one or more Matroska container files as described above in connection with FIGS. 2 and 3. Alternatively or additionally, the user device can request one or more suitable portions of the index elements from the server.
  • The fragment index information can be requested in any suitable manner. For example, the user device can request the fragment index information based on the top level index file received at 404. In a more particular example, the user device can parse the top level index file and obtain information relating to one or more URIs corresponding to the fragment index information. The user device can then send one or more requests (e.g., HTTP requests, etc.) containing the URIs to the server.
  • As another example, the user device can request the fragment index information based on one or more of the headers received at 410. In a more particular example, the user device can make such requests based on information relating to the location of the fragment index information (e.g., such as a seekhead element of a header).
  • At 416, the user device can receive the requested fragment index information. The fragment index information can be received in any suitable manner. For example, the fragment index information can be received via one or more responses that are sent by the server. In a more particular example, in the example described above where one or more HTTP requests are used to request the fragment index information, the user device can receive the fragment index information via one or more HTTP responses that are sent by the server in response to the HTTP requests. In some embodiments, the user device does not need to cache or store the received fragment index information.
  • At 418, the user device can request one or more media content fragments from the server. Any suitable media content fragments can be requested. For example, the user device can request one or more cluster elements of one or more Matroska container files containing media content corresponding to a particular timecode (e.g., such as a particular start time and/or a particular end time). In a more particular example, the user device can request one or more cluster elements of a video file (e.g., video file 212 and/or 214 of FIG. 2), an audio file (e.g., audio file 216 of FIG. 2), a subtitle file (e.g., subtitle file 218 of FIG. 2) that contain media content corresponding to the particular timecode.
  • As another example, the user device can request one or more media content fragments containing media content having a particular version. In a more particular example, the user device can request a Cluster element of a video file that contains encoded video content having a particular bit rate, a particular frame rate, a particular resolution, etc.
  • In some embodiments, the media content fragments can be requested in any suitable manner. For example, the user device can request the media content fragment(s) based on the streaming conditions experienced by the user device, such as the network bandwidth, the processor capacity, etc. that can be utilized to transmit media content, one or more user preferences (e.g., such as a desired resolution, a desired bit rate, etc.), etc. In a more particular example, upon determining the network bandwidth that can be utilized to transmit media content, the user device can request a cluster of a video file (e.g., video file 212, video file 214, etc. as illustrated in FIG. 2) containing encoded video content having a suitable bit rate that can be transmitted using the network bandwidth.
  • As another example, the user device can send to the server one or more requests containing information relating to the names of the media content fragments, the resources from which the media content fragments can be obtained, and/or other suitable information relating to the media content fragments using a suitable protocol (e.g., such as HTTP, TCP, etc.). In a more particular example, the requests can contain one or more URIs corresponding to the media content fragments to be requested. In some embodiments, the URIs can be obtained based on the top level index file. More particularly, for example, the user device can parse the top level index file received at 404 and obtain one or more URIs corresponding to one or more cluster elements to be requested.
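  • A simple way to implement the bit rate selection described above is to measure recent download throughput and pick the highest-bit-rate rendition that fits within it, leaving some safety margin. The sketch below is a hedged illustration of that policy; the margin, the rendition list, and the URIs are assumptions made for illustration, not requirements of the disclosure.
```python
# Illustrative adaptive bit rate selection: pick the highest rendition whose
# declared bit rate fits within the measured throughput, with a safety margin.
SAFETY_MARGIN = 0.8  # assumed: use at most 80% of measured throughput

def select_rendition(renditions, measured_throughput_bps,
                     margin=SAFETY_MARGIN):
    """Return the (uri, bitrate_bps) rendition best matching the throughput.

    `renditions` is a list of (uri, bitrate_bps) pairs, e.g. as produced by
    parsing the top level index. Falls back to the lowest bit rate if nothing
    fits within the margin.
    """
    budget = measured_throughput_bps * margin
    candidates = sorted(renditions, key=lambda r: r[1])
    chosen = candidates[0]
    for rendition in candidates:
        if rendition[1] <= budget:
            chosen = rendition
    return chosen

renditions = [("http://example.com/video_2mbps.mkv", 2_000_000),
              ("http://example.com/video_500kbps.mkv", 500_000)]
# With ~1.5 Mbps measured, the 500 kbps rendition is the safe choice here.
assert select_rendition(renditions, 1_500_000)[1] == 500_000
```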
  • At 420, the user device can receive one or more media content fragments. The media content fragments can be received in any suitable manner. For example, the media content fragments can be received via one or more responses that are sent by the server. In a more particular example, in the example described above where one or more HTTP requests are used to request the media content fragments, the user device can receive the media content fragments via one or more HTTP responses that are sent by the server in response to the HTTP requests.
  • In some embodiments, in response to receiving the media content fragment, the user device can extract media content data (e.g., such as video data, audio data, subtitles, etc.) from the media content fragments. The user device can then decode the media content data and cause the decoded media content to be rendered. For example, in response to receiving a cluster of a video file as described above, the user device can extract encoded video data from the block elements of the cluster element. The user device can then decode the encoded video data and cause the decoded video data to be displayed on a suitable display. In a more particular example, the decoded video data can be displayed based on one or more timecodes associated with the cluster element.
  • Next, at 422, the user device can store the received media content fragments and update the top level index file. The media content fragments can be stored in any suitable manner. For example, upon receiving a cluster element of a Matroska container file (e.g., such as Matroska container files 210 of FIG. 2), the user device can cache the cluster element as a Matroska container file. In a more particular example, in response to receiving a cluster element of video file 212 of FIG. 2 (e.g., cluster element 321 as shown in FIG. 3), the user device can cache the received cluster element as an EBML file 724 as shown in FIG. 7. In another more particular example, in response to receiving a cluster of an audio file (e.g., audio file 216 of FIG. 2), the user device can cache the cluster element as an EBML file 734 as shown in FIG. 7. In yet another more particular example, in response to receiving a cluster of a subtitle file (e.g., subtitle file 218 of FIG. 2), the user device can cache the cluster element as an EBML file 744 as shown in FIG. 7.
  • In some embodiments, in response to caching the media content fragments, the user device can update the top level index file that has been stored in the user device. The top level index file can be updated in any suitable manner. For example, the user device can edit the top level index file to include information relating to the EBML file that stores the media content fragments.
  • In a more particular example, in the example described above where a cluster of video file 212 (FIG. 2) is cached as EBML file 724, the user device can include a video element 610 in top level index file 600 of FIG. 6 corresponding to EBML file 724. As shown, video element 610 can include a URI element 612, a start-time element 614, an end-time element 616, and a param element 618. URI element 612 can include any suitable information relating to the name of the cached EBML file, the location of the EBML file, and/or any other suitable information about the cached EBML file. In a more particular example, URI element 612 can include a file path through which EBML file 724 can be retrieved.
  • In some embodiments, start-time element 614 and end-time element 616 can contain information about the start time and the end time of the media content contained in the cached cluster element, respectively.
  • In some embodiments, param element 618 can include any suitable information about the cached EBML file. For example, param element 618 can include information about the size of the cached EBML file. As another example, param element 618 can include information about the bit rate, the resolution, the frame rate, etc. of the media content contained in the cached EBML file.
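  • As a rough illustration of the index update described above, the sketch below appends a video entry with a source path, start-time, end-time, and a size parameter to a cached SMIL-like index using ElementTree. The element and attribute names only loosely mirror the description and are assumptions made for illustration.
```python
# Illustrative update of a cached SMIL-like index: record a newly cached
# fragment's file path, time range, and size. Element/attribute names are
# hypothetical.
import os
import xml.etree.ElementTree as ET

def record_cached_fragment(index_path, fragment_path,
                           start_time_s, end_time_s):
    """Append a <video> entry describing a cached fragment to the index."""
    tree = ET.parse(index_path)
    body = tree.getroot().find("body")
    video = ET.SubElement(body, "video", {
        "src": fragment_path,             # URI/path of the cached EBML file
        "start-time": str(start_time_s),  # start time of the cached cluster
        "end-time": str(end_time_s),      # end time of the cached cluster
    })
    ET.SubElement(video, "param", {
        "name": "size",
        "value": str(os.path.getsize(fragment_path)),  # size of the cached file
    })
    tree.write(index_path)
```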
  • In some embodiments, in response to receiving and/or caching the media content fragment, the user device can extract media content data (e.g., such as video data, audio data, subtitles, etc.) from the media content fragments. The user device can then decode the media content data and cause the decoded media content to be rendered. For example, in response to receiving a cluster of a video file as described above, the user device can extract encoded video data from the block elements of the cluster element. The user device can then decode the encoded video data and cause the decoded video data to be displayed on a suitable display. In a more particular example, the decoded video data can be displayed based on one or more timecodes associated with the cluster element.
  • Referring back to FIG. 4, in some embodiments, after step 422 is performed, process 400 can loop back to 418. That is, process 400 can request, receive, and/or cache one or more media content fragments. The media content fragments can be requested in any suitable manner. For example, process 400 can request a Cluster element corresponding to a particular timecode. In a more particular example, in the example where cluster 321 (FIG. 3) of audio file 216 (FIG. 2) has been cached as EBML file 734, the user device can request and receive cluster element 326 (FIG. 3) of audio file 216 (FIG. 2) and cache the received cluster element as an EBML file 736 as illustrated in FIG. 7.
  • In another more particular example, in the example where cluster 321 (FIG. 3) of subtitle file 218 (FIG. 2) has been cached as EBML file 744, the user device can request and receive cluster element 326 (FIG. 3) of subtitle file 218 (FIG. 2) and cache the received cluster element as an EBML file 746 as illustrated in FIG. 7.
  • As another example, process 400 can also request a cluster element of a Matroska container file that contains a particular version of a piece of media content based on the streaming conditions experienced by the user device and/or user preferences. More particularly, for example, the user device can determine the bandwidth, the processor capacity, etc. that can be utilized to transmit media content. Alternatively or additionally, the user device can determine a particular frame rate, a particular resolution, and/or other parameters about the media content to be rendered that are preferred by a user. The user device can then request a cluster element containing video data having a suitable bit rate, a suitable frame rate, a suitable resolution, etc. based on the streaming conditions and/or user preferences.
  • In a more particular example, in the example where cluster 321 (FIG. 3) of video file 212 (FIG. 2) has been cached as EBML file 724, the user device can request and receive cluster element 326 (FIG. 3) of video file 214 (FIG. 2) in response to determining that video file 214 (FIG. 2) contains video content having the suitable bit rate, frame rate, and/or resolution. The user device can also cache the received cluster element as an EBML file 726 of FIG. 7.
  • In some embodiments, upon caching EBML file 726 (FIG. 7), the user device can also update top level index file 600 to include suitable information relating to EBML file 726. In a more particular example, as illustrated in FIG. 6, the user device can include a video element 620 in top level index file 600 corresponding to EBML file 726. As shown, video element 620 can include any suitable information relating to the name of the cached EBML file, the location of the EBML file, and/or any other suitable information about the cached EBML file, such as a file path through which EBML file 726 can be retrieved.
  • In some embodiments, each of the top level index file, the header file(s), the media content fragments, and other media content files (e.g., the EBML files as illustrated in FIG. 7) can be stored/cached for any suitable period of time. For example, a suitable cache duration can be specified for each of the stored/cached files, such as seconds, minutes, hours, days, weeks, months, or any suitable period of time. As another example, no particular cache duration needs to be specified for the stored/cached files. In such an example, the content stored/cached in the files can be stored for an indefinite duration and will not expire by time.
  • In some embodiments, upon caching/storing one or more EBML files as described above, the user device can cause the cached media content contained in the EBML files to be rendered. The cached media content can be rendered at any suitable time. For example, the cached media content can be rendered when the user device is streaming media content from the server (e.g., using process 400 or other suitable processes). In a more particular example, upon receiving one or more media content fragments, the user device can simultaneously cache the media content fragment(s) and render the media content contained in the media content fragment(s). In another more particular example, the user device can request, receive, and/or cache one or more media content fragments as fast as it can (e.g., by utilizing the available bandwidth, hardware capacity, etc.). The user device can then render the media content fragment(s).
  • As another example, the media content can be rendered after the user device has finished streaming and/or caching media content from the server. In a more particular example, the user device can cause the cached media content to be rendered upon a user requesting a playback of the cached media content at any time with or without a live communication connection with the server.
  • For example, the user device can retrieve the EBML files based on the cached top level index file. The user device can then extract the media content data (e.g., video data, audio data, subtitles, etc.) contained in the EBML files, decode the media content data, and cause the media content to be rendered based on the top level index file and/or one or more of the header files.
  • In a more particular example, as illustrated in FIG. 5, media content contained in multiple cached/stored media content fragments can be rendered based on process 500. In some embodiments, process 500 can be implemented in a suitable user device (e.g., such as user device 108 of FIG. 1).
  • As shown, process 500 can begin by retrieving a first cached media content fragment at 502. For example, the user device can retrieve the first cached media content fragment based on the cached top level index file. More particularly, for example, the user device can parse the cached top level index file and extract data about the location of the media content file that contains the first cached media content fragment.
  • In a more particular example, as described above in connection with FIGS. 4, 6, and 7, the cached top level index file can include URI element 612 that is associated with EBML file 724 (FIG. 7) containing the first cached media content fragment. In such an example, the user device can locate EBML file 724 based on URI element 612 (e.g., by converting the URI into one or more file paths corresponding to the location of EBML file 724).
  • In some embodiments, upon retrieval of the first cached media content fragment, the user device can render the first cached media content fragment at 504. The first cached media content fragment can be rendered in any suitable manner. For example, the user device can extract and decode the media content data (e.g., video data, audio data, subtitles, etc.) from the retrieved EBML file. The user device can then cause the decoded media content data to be rendered. In a more particular example, the decoded content data can be rendered based on the cached header file (e.g., header file 722 of FIG. 7).
  • Next, at 506, the user device can determine whether the retrieved media content fragment is the last cached media content fragment to be retrieved and/or rendered. Such determination can be made in any suitable manner. For example, the user device can check the cached top level index file (e.g., top level index file 600 of FIG. 6) and determine whether all the cached media content files linked to the cached top level index file have been retrieved and/or rendered. As another example, a user can select one or more cached media content fragments to be rendered (e.g., one or more cached media content fragments corresponding to a scene of a movie, etc.). In such an example, the user device can determine whether all of the selected cached media content fragments have been retrieved and/or rendered at 506.
  • In some embodiments, in response to determining that the retrieved media content fragment is the last cached media content fragment to be retrieved and/or rendered, process 500 can end at 508.
  • In some embodiments, in response to determining that the retrieved media content fragment is not the last cached media content fragment to be retrieved and/or rendered, the user device can retrieve the next cached media content fragment at 510. For example, the user device can retrieve the second cached media content fragment based on the cached top level index file. More particularly, for example, the user device can parse the cached top level index file and extract data about the location of the media content file that contains the second cached media content fragment. In a more particular example, as described above in connection with FIGS. 4, 6, and 7, the cached top level index file can include URI element 622 associated with EBML file 726 (FIG. 7) that contains the second cached media content fragment. In such an example, the user device can locate EBML file 726 based on URI element 622 (e.g., by converting the URI into one or more file paths corresponding to the location of EBML file 726).
  • In some embodiments, in response to retrieval of the second cached media content fragment, the user device can render the second cached media content fragment at 512. The second cached media content fragment can be rendered in any suitable manner. For example, the user device can extract and decode the media content data (e.g., video data, audio data, subtitles, etc.) from the retrieved EBML file (e.g., EBML file 726 of FIG. 7). The user device can then cause the decoded media content data to be rendered. In a more particular example, the decoded content data can be rendered based on the cached header file (e.g., header file 722 of FIG. 7).
  • In some embodiments, after 512 is performed, process 500 can loop back to 506.
  • In some embodiments, the user device can cause a cached media content fragment to be rendered upon retrieval of the cached media content fragment. In some embodiments, the user device can retrieve multiple cached media content fragments and cause some or all of the retrieved media content fragments to be rendered in a suitable order.
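  • The retrieval-and-render loop of process 500 can be sketched as follows. The parsing of the cached index, the ordering of fragments by recorded start time, and the decode_and_render placeholder are all assumptions made for illustration; the actual decoding and display pipeline is outside the scope of this sketch.
```python
# Illustrative playback loop over cached fragments listed in a SMIL-like
# cached index, in the spirit of process 500. `decode_and_render` is a
# placeholder for the device's decoding/rendering pipeline.
import xml.etree.ElementTree as ET

def decode_and_render(fragment_path, header_path):
    """Placeholder: extract, decode, and render one cached fragment."""
    print(f"rendering {fragment_path} using header {header_path}")

def play_cached_content(index_path, header_path):
    """Retrieve and render each cached fragment referenced by the index."""
    tree = ET.parse(index_path)
    entries = tree.getroot().iter("video")
    # Order fragments by their recorded start time before rendering.
    ordered = sorted(entries, key=lambda e: float(e.get("start-time", "0")))
    for entry in ordered:
        fragment_path = entry.get("src")
        decode_and_render(fragment_path, header_path)  # render this fragment
    # The loop ends once the last cached fragment has been rendered.
```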
  • It should be noted that process 400 of FIG. 4 and process 500 of FIG. 5 can be performed concurrently in some embodiments. It should also be noted that the above steps of the flow diagrams of FIGS. 4-5 may be executed or performed in any order or sequence not limited to the order and sequence shown and described in the figures. Furthermore, it should be noted that some of the above steps of the flow diagrams of FIGS. 4-5 may be executed or performed substantially simultaneously where appropriate or in parallel to reduce latency and processing times. Still furthermore, it should be noted that some of the above steps of the flow diagrams of FIGS. 4-5 may be omitted.
  • In some embodiments, any suitable computer readable media can be used for storing instructions for performing the mechanisms and/or processes described herein. For example, in some embodiments, computer readable media can be transitory or non-transitory. For example, non-transitory computer readable media can include media such as magnetic media (such as hard disks, floppy disks, etc.), optical media (such as compact discs, digital video discs, Blu-ray discs, etc.), semiconductor media (such as flash memory, electrically programmable read only memory (EPROM), electrically erasable programmable read only memory (EEPROM), etc.), any suitable media that is not fleeting or devoid of any semblance of permanence during transmission, and/or any suitable tangible media. As another example, transitory computer readable media can include signals on networks, in wires, conductors, optical fibers, circuits, any suitable media that is fleeting and devoid of any semblance of permanence during transmission, and/or any suitable intangible media.
  • The above described embodiments of the present disclosure are presented for purposes of illustration and not of limitation, and the present disclosure is limited only by the claims which follow.

Claims (20)

What is claimed is:
1. A method for streaming media content, the method comprising:
receiving top level index data from a server;
caching the top level index data in an index file;
receiving header data associated with a first media content file from the server;
caching the header data in a header file;
receiving a first segment of the first media content file based at least in part on the index file;
caching the first segment of the first media content file in a first file;
updating the index file to include information about the first file; and
causing the first segment to be displayed based at least in part on the index file and the header file.
2. The method of claim 1, further comprising:
receiving a second segment of a second media content file based at least in part on the index file;
caching the second segment of the second media content file in a second file;
updating the index file to include information about the location of the second file; and
causing the second segment to be displayed based at least in part on the updated index file and the header file.
3. The method of claim 2, wherein the second media content file and the first media content file contain encoded video data having different bit rates.
4. The method of claim 2, wherein the second media content file and the first media content file contain encoded video data having different frame rates.
5. The method of claim 2, wherein the second media content file and the first media content file contain encoded video data having different resolutions.
6. The method of claim 2, further comprising retrieving the second file based on the updated index file.
7. The method of claim 1, further comprising:
obtaining at least one Uniform Resource Identifier (URI) corresponding to the first segment based on the index file; and
requesting the first segment from the server based on the at least one URI.
8. The method of claim 7, further comprising:
obtaining at least one URI corresponding to the second segment based on the index file; and
requesting the second segment from the server based on the at least one URI.
9. The method of claim 1, further comprising:
requesting fragment index data based at least in part on the index file; and
receiving the fragment index data from the server.
10. The method of claim 1, further comprising retrieving the first file based on the updated index file.
11. The method of claim 1, further comprising simultaneously caching the first segment of the first media content file and causing the first segment to be displayed.
12. The method of claim 1, wherein the first file is a Matroska container file.
13. A system for streaming media content, the system comprising:
at least one hardware processor that is configured to:
receive top level index data from a server;
cache the top level index data in an index file;
receive header data associated with a first media content file from the server;
cache the header data in a header file;
receive a first segment of the first media content file based at least in part on the index file;
cache the first segment of the first media content file in a first file;
update the index file to include information about the first file; and
cause the first segment to be displayed based at least in part on the index file and the header file.
14. The system of claim 13, wherein the at least one hardware processor is further configured to:
receive a second segment of a second media content file based at least in part on the index file;
cache the second segment of the second media content file in a second file;
update the index file to include information about the location of the second file; and
cause the second segment to be displayed based at least in part on the updated index file and the header file.
15. The system of claim 14, wherein the second media content file and the first media content file contain encoded video data having different bit rates.
16. The system of claim 14, wherein the second media content file and the first media content file contain encoded video data having different frame rates.
17. The system of claim 14, wherein the second media content file and the first media content file contain encoded video data having different resolutions.
18. The system of claim 14, wherein the at least one hardware processor is further configured to retrieve the second file based on the updated index file.
19. The system of claim 13, wherein the at least one hardware processor is further configured to:
obtain at least one Uniform Resource Identifier (URI) corresponding to the first segment based on the index file; and
request the first segment from a server based on the at least one URI.
20. The system of claim 19, wherein the at least one hardware processor is further configured to:
obtain at least one URI corresponding to the second segment based on the index file; and
request the second segment from the server based on the at least one URI.
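For a concrete feel for the client-side bookkeeping recited in the method claims above, the following is a loose, non-normative sketch: it caches received top level index data, header data, and one segment in separate local files, records the cached segment's location in the index file, and then "displays" based on the index and header. Every path, file name, and helper is hypothetical and is not drawn from the specification or claims.

```python
# Loose illustrative sketch (not the claimed implementation) of the kind of
# client-side bookkeeping described in claim 1: cache top level index data,
# cache header data, cache a received segment in its own file, record that
# file's location in the local index, and hand the index plus header to a
# player. Every name, path, and helper below is hypothetical.
import json
from pathlib import Path

CACHE_DIR = Path("stream_cache")
CACHE_DIR.mkdir(exist_ok=True)

def cache_top_level_index(index_bytes: bytes) -> Path:
    index_path = CACHE_DIR / "index.json"
    # Start the local index from the received top level index data.
    index_path.write_text(json.dumps({"top_level": index_bytes.decode(),
                                      "segments": {}}))
    return index_path

def cache_header(header_bytes: bytes) -> Path:
    header_path = CACHE_DIR / "header.bin"
    header_path.write_bytes(header_bytes)
    return header_path

def cache_segment(index_path: Path, segment_id: str, segment_bytes: bytes) -> Path:
    segment_path = CACHE_DIR / f"{segment_id}.mkv"   # e.g. a Matroska fragment
    segment_path.write_bytes(segment_bytes)
    # Update the index file to include information about the cached file.
    index = json.loads(index_path.read_text())
    index["segments"][segment_id] = str(segment_path)
    index_path.write_text(json.dumps(index))
    return segment_path

def display(index_path: Path, header_path: Path, segment_id: str) -> None:
    index = json.loads(index_path.read_text())
    segment_path = Path(index["segments"][segment_id])
    # A real player would combine header + segment for decoding; print instead.
    print("decode", header_path, "+", segment_path)

# Example flow mirroring the order of steps in the method claims.
idx = cache_top_level_index(b"top level index data")
hdr = cache_header(b"header data for first media content file")
cache_segment(idx, "segment-0", b"first segment bytes")
display(idx, hdr, "segment-0")
```

The design point this sketch tries to make visible is that the locally maintained index file acts as the single record of where cached segments live, which is what makes later retrieval of a cached file "based on the updated index file" possible.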
US15/972,841 2013-06-28 2018-05-07 Systems, Methods, and Media for Streaming Media Content Abandoned US20180332094A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/972,841 US20180332094A1 (en) 2013-06-28 2018-05-07 Systems, Methods, and Media for Streaming Media Content

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/931,198 US9967305B2 (en) 2013-06-28 2013-06-28 Systems, methods, and media for streaming media content
US15/972,841 US20180332094A1 (en) 2013-06-28 2018-05-07 Systems, Methods, and Media for Streaming Media Content

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/931,198 Continuation US9967305B2 (en) 2013-06-28 2013-06-28 Systems, methods, and media for streaming media content

Publications (1)

Publication Number Publication Date
US20180332094A1 true US20180332094A1 (en) 2018-11-15

Family

ID=52116734

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/931,198 Active 2034-02-23 US9967305B2 (en) 2013-06-28 2013-06-28 Systems, methods, and media for streaming media content
US15/972,841 Abandoned US20180332094A1 (en) 2013-06-28 2018-05-07 Systems, Methods, and Media for Streaming Media Content

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/931,198 Active 2034-02-23 US9967305B2 (en) 2013-06-28 2013-06-28 Systems, methods, and media for streaming media content

Country Status (1)

Country Link
US (2) US9967305B2 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10225588B2 (en) 2011-09-01 2019-03-05 Divx, Llc Playback devices and methods for playing back alternative streams of content protected using a common set of cryptographic keys
US10368096B2 (en) 2011-01-05 2019-07-30 Divx, Llc Adaptive streaming systems and methods for performing trick play
US10687095B2 (en) 2011-09-01 2020-06-16 Divx, Llc Systems and methods for saving encoded media streamed using adaptive bitrate streaming
US10715806B2 (en) 2013-03-15 2020-07-14 Divx, Llc Systems, methods, and media for transcoding video data
US10805368B2 (en) 2012-12-31 2020-10-13 Divx, Llc Systems, methods, and media for controlling delivery of content
US10893305B2 (en) 2014-04-05 2021-01-12 Divx, Llc Systems and methods for encoding and playing back video at different frame rates using enhancement layers
US11102553B2 (en) 2009-12-04 2021-08-24 Divx, Llc Systems and methods for secure playback of encrypted elementary bitstreams
USRE48761E1 (en) 2012-12-31 2021-09-28 Divx, Llc Use of objective quality measures of streamed content to reduce streaming bandwidth
US11457054B2 (en) 2011-08-30 2022-09-27 Divx, Llc Selection of resolutions for seamless resolution switching of multimedia content
US11470405B2 (en) 2013-05-30 2022-10-11 Divx, Llc Network video streaming with trick play based on separate trick play files
US11886545B2 (en) 2006-03-14 2024-01-30 Divx, Llc Federated digital rights management scheme including trusted systems

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2749170C (en) 2009-01-07 2016-06-21 Divx, Inc. Singular, collective and automated creation of a media guide for online content
US8787570B2 (en) 2011-08-31 2014-07-22 Sonic Ip, Inc. Systems and methods for automatically generating top level index files
US10397292B2 (en) 2013-03-15 2019-08-27 Divx, Llc Systems, methods, and media for delivery of content
US9100618B2 (en) 2013-06-17 2015-08-04 Spotify Ab System and method for allocating bandwidth between media streams
US10097604B2 (en) 2013-08-01 2018-10-09 Spotify Ab System and method for selecting a transition point for transitioning between media streams
US9674225B2 (en) * 2013-09-20 2017-06-06 Open Text Sa Ulc System and method for updating downloaded applications using managed container
EP2851833B1 (en) 2013-09-20 2017-07-12 Open Text S.A. Application Gateway Architecture with Multi-Level Security Policy and Rule Promulgations
US9529888B2 (en) 2013-09-23 2016-12-27 Spotify Ab System and method for efficiently providing media and associated metadata
US9654532B2 (en) 2013-09-23 2017-05-16 Spotify Ab System and method for sharing file portions between peers with different capabilities
US9063640B2 (en) 2013-10-17 2015-06-23 Spotify Ab System and method for switching between media items in a plurality of sequences of media items
US9491239B2 (en) 2014-01-31 2016-11-08 Comcast Cable Communications, Llc Methods and systems for processing data requests
JP6357813B2 (en) * 2014-03-12 2018-07-18 富士通株式会社 Distribution method, resource acquisition method, distribution server, and terminal device
JP6944371B2 (en) * 2015-01-06 2021-10-06 ディビックス, エルエルシー Systems and methods for encoding content and sharing content between devices
CN104717528A (en) * 2015-03-23 2015-06-17 北京云拓世通信息技术有限公司 Streaming media live telecast PAAS processing method, device and system
US9954930B2 (en) * 2015-08-06 2018-04-24 Airwatch Llc Generating content fragments for content distribution
US11593075B2 (en) 2015-11-03 2023-02-28 Open Text Sa Ulc Streamlined fast and efficient application building and customization systems and methods
EP3410730A4 (en) * 2016-01-26 2018-12-26 Sony Corporation Reception device, reception method, and transmission device
US11388037B2 (en) 2016-02-25 2022-07-12 Open Text Sa Ulc Systems and methods for providing managed services
US10498726B2 (en) 2016-03-22 2019-12-03 International Business Machines Corporation Container independent secure file system for security application containers
GB2549997B (en) * 2016-04-19 2019-07-03 Cisco Tech Inc Management of content delivery in an IP network
US20170310752A1 (en) * 2016-04-21 2017-10-26 Samsung Electronics Company, Ltd. Utilizing a Content Delivery Network as a Notification System
CN109478400B (en) * 2016-07-22 2023-07-07 杜比实验室特许公司 Network-based processing and distribution of multimedia content for live musical performances
US10498795B2 (en) 2017-02-17 2019-12-03 Divx, Llc Systems and methods for adaptive switching between multiple content delivery networks during adaptive bitrate streaming
KR20210076652A (en) * 2019-12-16 2021-06-24 현대자동차주식회사 In vehicle multimedia system and method of managing storage for the same

Family Cites Families (597)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4009331A (en) 1974-12-24 1977-02-22 Goldmark Communications Corporation Still picture program video recording composing and playback method and system
US4694357A (en) 1985-04-24 1987-09-15 Thomson-Csf Broadcast, Inc. Apparatus and method for video signal processing
US4802170A (en) 1987-04-29 1989-01-31 Matrox Electronics Systems Limited Error disbursing format for digital information and method for organizing same
US4964069A (en) 1987-05-12 1990-10-16 International Business Machines Corporation Self adjusting video interface
US5274758A (en) 1989-06-16 1993-12-28 International Business Machines Computer-based, audio/visual creation and presentation system and method
US5119474A (en) 1989-06-16 1992-06-02 International Business Machines Corp. Computer-based, audio/visual creation and presentation system and method
JP3236015B2 (en) 1990-10-09 2001-12-04 キヤノン株式会社 Information processing apparatus and method
US5557518A (en) 1994-04-28 1996-09-17 Citibank, N.A. Trusted agents for open electronic commerce
US5404436A (en) 1991-11-27 1995-04-04 Digital Equipment Corporation Computer method and apparatus for converting compressed characters for display in full size
US6850252B1 (en) 1999-10-05 2005-02-01 Steven M. Hoffberg Intelligent electronic appliance system and method
CA2084575C (en) 1991-12-31 1996-12-03 Chris A. Dinallo Personal computer with generalized data streaming apparatus for multimedia devices
US5276735A (en) 1992-04-17 1994-01-04 Secure Computing Corporation Data enclave and trusted path system
US5420974A (en) 1992-10-15 1995-05-30 International Business Machines Corporation Multimedia complex form creation, display and editing method apparatus
JP2708683B2 (en) 1992-10-21 1998-02-04 日本電信電話株式会社 Special playback control processing method for digital video files
US5420801A (en) 1992-11-13 1995-05-30 International Business Machines Corporation System and method for synchronization of multimedia streams
US5471576A (en) 1992-11-16 1995-11-28 International Business Machines Corporation Audio/video synchronization for application programs
US5539908A (en) 1992-11-24 1996-07-23 International Business Machines Corporation Dynamically linked and shared compression/decompression
US5509070A (en) 1992-12-15 1996-04-16 Softlock Services Inc. Method for encouraging purchase of executable and non-executable software
US5717816A (en) 1993-01-13 1998-02-10 Hitachi America Ltd. Method and apparatus for the selection of data for use in VTR trick playback operation in a system using intra-coded video frames
US5493339A (en) 1993-01-21 1996-02-20 Scientific-Atlanta, Inc. System and method for transmitting a plurality of digital services including compressed imaging services and associated ancillary data services
US5719786A (en) 1993-02-03 1998-02-17 Novell, Inc. Digital media data stream network management system
CA2115976C (en) 1993-02-23 2002-08-06 Saiprasad V. Naimpally Digital high definition television video recorder with trick-play features
US5396497A (en) 1993-02-26 1995-03-07 Sony Corporation Synchronization of audio/video information
US5574934A (en) 1993-11-24 1996-11-12 Intel Corporation Preemptive priority-based transmission of signals using virtual channels
US5684542A (en) 1993-12-21 1997-11-04 Sony Corporation Video subtitle processing system
US5583652A (en) 1994-04-28 1996-12-10 International Business Machines Corporation Synchronized, variable-speed playback of digitally recorded audio and video
JP3089160B2 (en) 1994-05-20 2000-09-18 シャープ株式会社 Digital recording and playback device
US5642171A (en) 1994-06-08 1997-06-24 Dell Usa, L.P. Method and apparatus for synchronizing audio and video data streams in a multimedia system
US5633472A (en) 1994-06-21 1997-05-27 Microsoft Corporation Method and system using pathnames to specify and access audio data at fidelity levels other than the level at which the data is stored
JPH089319A (en) 1994-06-23 1996-01-12 Sony Corp Method and device for recording and device for reproducing digital video signal
JP3172635B2 (en) 1994-07-14 2001-06-04 シャープ株式会社 Digital recording and playback device
JP3155426B2 (en) 1994-07-29 2001-04-09 シャープ株式会社 Image storage communication device
US5907597A (en) 1994-08-05 1999-05-25 Smart Tone Authentication, Inc. Method and system for the secure communication of data
US5541662A (en) 1994-09-30 1996-07-30 Intel Corporation Content programmer control of video and data display using associated data
JPH08111842A (en) 1994-10-07 1996-04-30 Sanyo Electric Co Ltd Video data recording system
GB9421206D0 (en) 1994-10-20 1994-12-07 Thomson Consumer Electronics Digital VCR MPEG- trick play processing
US6047100A (en) 1994-10-20 2000-04-04 Thomson Licensing S.A. Trick play stream derivation for pre-recorded digital video recording
US6058240A (en) 1994-10-20 2000-05-02 Thomson Licensing S.A. HDTV trick play stream derivation for VCR
US5867625A (en) 1994-10-20 1999-02-02 Thomson Consumer Electronics, Inc. Digital VCR with trick play steam derivation
WO1996017313A1 (en) 1994-11-18 1996-06-06 Oracle Corporation Method and apparatus for indexing multimedia information streams
US5715403A (en) 1994-11-23 1998-02-03 Xerox Corporation System for controlling the distribution and use of digital works having attached usage rights where the usage rights are defined by a usage rights grammar
US20050149450A1 (en) 1994-11-23 2005-07-07 Contentguard Holdings, Inc. System, method, and device for controlling distribution and use of digital works based on a usage rights grammar
JPH08163488A (en) 1994-12-12 1996-06-21 Matsushita Electric Ind Co Ltd Method and device for generating moving image digest
US5533021A (en) 1995-02-03 1996-07-02 International Business Machines Corporation Apparatus and method for segmentation and time synchronization of the transmission of multimedia data
US5892900A (en) 1996-08-30 1999-04-06 Intertrust Technologies Corp. Systems and methods for secure transaction management and electronic rights protection
JP3602635B2 (en) 1995-02-16 2004-12-15 株式会社東芝 Audio switching method and playback device
US5887110A (en) 1995-03-28 1999-03-23 Nippon Telegraph & Telephone Corp. Video data playback system using effective scheme for producing coded video data for fast playback mode
US6965724B1 (en) 1995-03-30 2005-11-15 Thomson Licensing S.A. Trick-play modes for pre-encoded video
US6064794A (en) 1995-03-30 2000-05-16 Thomson Licensing S.A. Trick-play control for pre-encoded video
US6252964B1 (en) 1995-04-03 2001-06-26 Scientific-Atlanta, Inc. Authorization of services in a conditional access system
US5745643A (en) 1995-04-06 1998-04-28 Kabushiki Kaisha Toshiba System for and method of reproducing playback data appropriately by the use of attribute information on the playback data
KR19990014676A (en) 1995-05-12 1999-02-25 비에가스 빅터 Video media protection and tracking system
JPH0937225A (en) 1995-07-19 1997-02-07 Hitachi Ltd Method for distributing fast forwarding image in multimedia system
US5822524A (en) 1995-07-21 1998-10-13 Infovalue Computing, Inc. System for just-in-time retrieval of multimedia files over computer networks by transmitting data packets at transmission rate determined by frame size
JP3326670B2 (en) 1995-08-02 2002-09-24 ソニー株式会社 Data encoding / decoding method and apparatus, and encoded data recording medium
US5763800A (en) 1995-08-14 1998-06-09 Creative Labs, Inc. Method and apparatus for formatting digital audio data
TW305043B (en) 1995-09-29 1997-05-11 Matsushita Electric Ind Co Ltd
US5751280A (en) 1995-12-11 1998-05-12 Silicon Graphics, Inc. System and method for media stream synchronization with a base atom index file and an auxiliary atom index file
US5675511A (en) 1995-12-21 1997-10-07 Intel Corporation Apparatus and method for event tagging for multiple audio, video, and data streams
US5627936A (en) 1995-12-21 1997-05-06 Intel Corporation Apparatus and method for temporal indexing of multiple audio, video and data streams
US5765164A (en) 1995-12-21 1998-06-09 Intel Corporation Apparatus and method for management of discontinuous segments of multiple audio, video, and data streams
US6292621B1 (en) 1996-02-05 2001-09-18 Canon Kabushiki Kaisha Recording apparatus for newly recording a second encoded data train on a recording medium on which an encoded data train is recorded
US5841432A (en) 1996-02-09 1998-11-24 Carmel; Sharon Method and system of building and transmitting a data file for real time play of multimedia, particularly animation, and a data file for real time play of multimedia applications
US5959690A (en) 1996-02-20 1999-09-28 Sas Institute, Inc. Method and apparatus for transitions and other special effects in digital motion video
US5675382A (en) 1996-04-08 1997-10-07 Connectix Corporation Spatial compression and decompression for video
US6031622A (en) 1996-05-16 2000-02-29 Agfa Corporation Method and apparatus for font compression and decompression
US6065050A (en) 1996-06-05 2000-05-16 Sun Microsystems, Inc. System and method for indexing between trick play and normal play video streams in a video delivery system
US5903261A (en) 1996-06-20 1999-05-11 Data Translation, Inc. Computer based video system
US5844575A (en) 1996-06-27 1998-12-01 Intel Corporation Video compression interface
US5828370A (en) 1996-07-01 1998-10-27 Thompson Consumer Electronics Inc. Video delivery system and method for displaying indexing slider bar on the subscriber video screen
US5999812A (en) 1996-08-09 1999-12-07 Himsworth; Winston E. Method for self registration and authentication for wireless telephony devices
US5956729A (en) 1996-09-06 1999-09-21 Motorola, Inc. Multimedia file, supporting multiple instances of media types, and method for forming same
US5805700A (en) 1996-10-15 1998-09-08 Intel Corporation Policy based selective encryption of compressed video data
JP2000508818A (en) 1997-02-03 2000-07-11 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Recording of trick play signal on record carrier
US6038257A (en) 1997-03-12 2000-03-14 Telefonaktiebolaget L M Ericsson Motion and still video picture transmission and display
US6115420A (en) 1997-03-14 2000-09-05 Microsoft Corporation Digital video signal encoder and encoding method
EP0968607B1 (en) 1997-03-21 2003-02-12 Canal+ Technologies Smartcard for use with a receiver of encrypted broadcast signals, and receiver
JP4832619B2 (en) 1997-04-07 2011-12-07 エイ・ティ・アンド・ティ・コーポレーション System and method for processing audio-visual information based on an object
KR100243209B1 (en) 1997-04-30 2000-02-01 윤종용 Apparatus and method of digital recording/reproducing
WO1999010836A1 (en) 1997-08-27 1999-03-04 Geo Interactive Media Group, Ltd. Real-time multimedia transmission
US6044469A (en) 1997-08-29 2000-03-28 Preview Software Software publisher or distributor configurable software security mechanism
US6697566B2 (en) 1997-10-17 2004-02-24 Sony Corporation Encoded signal characteristic point recording apparatus
US6046778A (en) 1997-10-29 2000-04-04 Matsushita Electric Industrial Co., Ltd. Apparatus for generating sub-picture units for subtitles and storage medium storing sub-picture unit generation program
US20020159598A1 (en) 1997-10-31 2002-10-31 Keygen Corporation System and method of dynamic key generation for digital communications
KR100592651B1 (en) 1997-11-27 2006-06-23 브리티쉬 텔리커뮤니케이션즈 파블릭 리미티드 캄퍼니 Transcoding
KR19990042668A (en) 1997-11-27 1999-06-15 정선종 Video encoding apparatus and method for multiple video transmission
JP3206530B2 (en) 1997-11-28 2001-09-10 日本ビクター株式会社 Moving picture coding apparatus, moving picture decoding apparatus, moving picture coding method, and moving picture decoding method
US6141754A (en) 1997-11-28 2000-10-31 International Business Machines Corporation Integrated method and system for controlling information access and distribution
US20020057898A1 (en) 1997-12-19 2002-05-16 James Oliver Normile Method and apparatus for trick play of bitstream data
KR100252108B1 (en) 1997-12-20 2000-04-15 윤종용 Apparatus and method for digital recording and reproducing using mpeg compression codec
US6665835B1 (en) 1997-12-23 2003-12-16 Verizon Laboratories, Inc. Real time media journaler with a timing event coordinator
US6453355B1 (en) 1998-01-15 2002-09-17 Apple Computer, Inc. Method and apparatus for media data transmission
US6453459B1 (en) 1998-01-21 2002-09-17 Apple Computer, Inc. Menu authoring system and method for automatically performing low-level DVD configuration functions and thereby ease an author's job
JP3394899B2 (en) 1998-01-23 2003-04-07 株式会社東芝 Audio data recording medium, data processing apparatus and method
TW416220B (en) 1998-01-23 2000-12-21 Matsushita Electric Ind Co Ltd Image transmitting method, image processing method, image processing device, and data storing medium
EP0936812A1 (en) 1998-02-13 1999-08-18 CANAL+ Société Anonyme Method and apparatus for recording of encrypted digital data
IL123819A (en) 1998-03-24 2001-09-13 Geo Interactive Media Group Lt Network media streaming
JP3120773B2 (en) 1998-04-06 2000-12-25 日本電気株式会社 Image processing device
US6510554B1 (en) 1998-04-27 2003-01-21 Diva Systems Corporation Method for generating information sub-streams for FF/REW applications
JP3383580B2 (en) 1998-05-15 2003-03-04 株式会社東芝 Information storage medium, information recording / reproducing apparatus and method
US6282653B1 (en) 1998-05-15 2001-08-28 International Business Machines Corporation Royalty collection method and system for use of copyrighted digital materials on the internet
US6859496B1 (en) 1998-05-29 2005-02-22 International Business Machines Corporation Adaptively encoding multiple streams of video data in parallel for multiplexing onto a constant bit rate channel
WO1999065239A2 (en) 1998-06-11 1999-12-16 Koninklijke Philips Electronics N.V. Trick play signal generation for a digital video recorder
JP3383587B2 (en) 1998-07-07 2003-03-04 株式会社東芝 Still image continuous information recording method, optical disc, optical disc information reproducing apparatus and information reproducing method
CN1867068A (en) 1998-07-14 2006-11-22 联合视频制品公司 Client-server based interactive television program guide system with remote server recording
ES2212605T3 (en) 1998-09-08 2004-07-16 Sharp Kabushiki Kaisha VARIABLE IMAGE EDITION METHOD IN TIME AND VARIABLE IMAGE EDITION DEVICE IN TIME.
JP2000090644A (en) 1998-09-08 2000-03-31 Sharp Corp Image management method and device
US6155840A (en) 1998-09-18 2000-12-05 At Home Corporation System and method for distributed learning
US6438652B1 (en) 1998-10-09 2002-08-20 International Business Machines Corporation Load balancing cooperating cache servers by shifting forwarded request
US7024678B2 (en) 1998-11-30 2006-04-04 Sedna Patent Services, Llc Method and apparatus for producing demand real-time television
US6389218B2 (en) 1998-11-30 2002-05-14 Diva Systems Corporation Method and apparatus for simultaneously producing compressed play and trick play bitstreams from a video frame sequence
WO2000033307A1 (en) 1998-12-02 2000-06-08 Koninklijke Philips Electronics N.V. Apparatus and method for recording a digital information signal with trick play information in slant tracks on a record carrier
US6414996B1 (en) 1998-12-08 2002-07-02 Stmicroelectronics, Inc. System, method and apparatus for an instruction driven digital video processor
US6374144B1 (en) 1998-12-22 2002-04-16 Varian Semiconductor Equipment Associates, Inc. Method and apparatus for controlling a system using hierarchical state machines
US7209892B1 (en) 1998-12-24 2007-04-24 Universal Music Group, Inc. Electronic music/media distribution system
JP2000201343A (en) 1999-01-05 2000-07-18 Toshiba Corp Moving image data reproducing device, moving image data dividing device and recording medium readable by computer in which program is recorded
US6510513B1 (en) 1999-01-13 2003-01-21 Microsoft Corporation Security services and policy enforcement for electronic data
JP3433125B2 (en) 1999-01-27 2003-08-04 三洋電機株式会社 Video playback device
US6169242B1 (en) 1999-02-02 2001-01-02 Microsoft Corporation Track-based music performance architecture
CN1181676C (en) 1999-02-08 2004-12-22 三洋电机株式会社 Moving image recording device and digital code camera
EP1030257B1 (en) 1999-02-17 2011-11-02 Nippon Telegraph And Telephone Corporation Original data circulation method, system, apparatus, and computer readable medium
US7730300B2 (en) 1999-03-30 2010-06-01 Sony Corporation Method and apparatus for protecting the transfer of data
US6658056B1 (en) 1999-03-30 2003-12-02 Sony Corporation Digital video decoding, buffering and frame-rate converting method and apparatus
US8689265B2 (en) 1999-03-30 2014-04-01 Tivo Inc. Multimedia mobile personalization system
US8065708B1 (en) 1999-03-31 2011-11-22 Cox Communications, Inc. Method for reducing latency in an interactive information distribution system
CA2388565A1 (en) 1999-04-21 2000-10-26 Research Investment Network, Inc. System, method and article of manufacture for updating content stored on a portable storage medium
US6453420B1 (en) 1999-04-21 2002-09-17 Research Investment Network, Inc. System, method and article of manufacture for authorizing the use of electronic content utilizing a laser-centric medium
US20050182828A1 (en) 1999-04-21 2005-08-18 Interactual Technologies, Inc. Platform specific execution
US8055588B2 (en) 1999-05-19 2011-11-08 Digimarc Corporation Digital media methods
US7493018B2 (en) 1999-05-19 2009-02-17 Kwang Su Kim Method for creating caption-based search information of moving picture data, searching and repeating playback of moving picture data based on said search information, and reproduction apparatus using said method
EP1056273A3 (en) 1999-05-25 2002-01-02 SeeItFirst, Inc. Method and system for providing high quality images from a digital video stream
US6289450B1 (en) 1999-05-28 2001-09-11 Authentica, Inc. Information security architecture for encrypting documents for remote access while maintaining access control
US6807306B1 (en) 1999-05-28 2004-10-19 Xerox Corporation Time-constrained keyframe selection method
US6330286B1 (en) 1999-06-09 2001-12-11 Sarnoff Corporation Flow control, latency control, and bitrate conversions in a timing correction and frame synchronization apparatus
US6725281B1 (en) 1999-06-11 2004-04-20 Microsoft Corporation Synchronization of controlled device state using state table and eventing in data-driven remote device control model
US7330875B1 (en) 1999-06-15 2008-02-12 Microsoft Corporation System and method for recording a presentation for on-demand viewing over a computer network
JP2001043668A (en) 1999-08-02 2001-02-16 Nippon Columbia Co Ltd Magnetic disk device
US7133598B1 (en) 1999-08-20 2006-11-07 Thomson Licensing Method for converting packetized video data and corresponding navigation data into a second data format
US7647618B1 (en) 1999-08-27 2010-01-12 Charles Eric Hunter Video distribution system
MXPA02004015A (en) 1999-10-22 2003-09-25 Activesky Inc An object oriented video system.
ATE359669T1 (en) 1999-10-27 2007-05-15 Sedna Patent Services Llc MULTIPLE VIDEO STREAMS USING SLICE-BASED ENCODING
US6449719B1 (en) 1999-11-09 2002-09-10 Widevine Technologies, Inc. Process and streaming server for encrypting a data stream
US7151832B1 (en) 1999-11-18 2006-12-19 International Business Machines Corporation Dynamic encryption and decryption of a stream of data
US6988144B1 (en) 1999-11-18 2006-01-17 International Business Machines Corporation Packet scheduling system and method for multimedia data
JP3403168B2 (en) 1999-12-15 2003-05-06 三洋電機株式会社 Image processing method, image processing apparatus capable of using the method, and television receiver
US6996720B1 (en) 1999-12-17 2006-02-07 Microsoft Corporation System and method for accessing protected content in a rights-management architecture
US20010030710A1 (en) 1999-12-22 2001-10-18 Werner William B. System and method for associating subtitle data with cinematic material
US6871008B1 (en) 2000-01-03 2005-03-22 Genesis Microchip Inc. Subpicture decoding architecture and method
KR100320476B1 (en) 2000-01-12 2002-01-15 구자홍 Video decoder and method for the same
US6810031B1 (en) 2000-02-29 2004-10-26 Celox Networks, Inc. Method and device for distributing bandwidth
WO2001065762A2 (en) 2000-03-02 2001-09-07 Tivo, Inc. Conditional access system and method for prevention of replay attacks
GB0007868D0 (en) 2000-03-31 2000-05-17 Koninkl Philips Electronics Nv Methods and apparatus for editing digital video recordings and recordings made by such methods
EP1143658A1 (en) 2000-04-03 2001-10-10 Canal+ Technologies Société Anonyme Authentication of data transmitted in a digital transmission system
US7526450B2 (en) 2000-04-19 2009-04-28 Sony Corporation Interface for presenting downloadable digital data content format options
JP2001359070A (en) 2000-06-14 2001-12-26 Canon Inc Data processing unit, data processing method and computer-readable storage medium
US8091106B1 (en) 2000-06-26 2012-01-03 Thomson Licensing Method and apparatus for using DVD subpicture information in a television receiver
US6891953B1 (en) 2000-06-27 2005-05-10 Microsoft Corporation Method and system for binding enhanced software features to a persona
US7647340B2 (en) 2000-06-28 2010-01-12 Sharp Laboratories Of America, Inc. Metadata in JPEG 2000 file format
US6771703B1 (en) 2000-06-30 2004-08-03 Emc Corporation Efficient scaling of nonscalable MPEG-2 Video
FI109393B (en) 2000-07-14 2002-07-15 Nokia Corp Method for encoding media stream, a scalable and a terminal
WO2002008948A2 (en) 2000-07-24 2002-01-31 Vivcom, Inc. System and method for indexing, searching, identifying, and editing portions of electronic multimedia files
US6395969B1 (en) 2000-07-28 2002-05-28 Mxworks, Inc. System and method for artistically integrating music and visual effects
KR20020013664A (en) 2000-08-14 2002-02-21 민홍기 An Implementation of the Internet Caption Broadcasting Server and Client for the Hearing Impairment
JP4552294B2 (en) 2000-08-31 2010-09-29 ソニー株式会社 Content distribution system, content distribution method, information processing apparatus, and program providing medium
US7165175B1 (en) 2000-09-06 2007-01-16 Widevine Technologies, Inc. Apparatus, system and method for selectively encrypting different portions of data sent over a network
US7242772B1 (en) 2000-09-07 2007-07-10 Eastman Kodak Company Encryption apparatus and method for synchronizing multiple encryption keys with a data stream
US7689510B2 (en) 2000-09-07 2010-03-30 Sonic Solutions Methods and system for use in network management of content
US6934334B2 (en) 2000-10-02 2005-08-23 Kabushiki Kaisha Toshiba Method of transcoding encoded video data and apparatus which transcodes encoded video data
US7231132B1 (en) 2000-10-16 2007-06-12 Seachange International, Inc. Trick-mode processing for digital video
US20020073177A1 (en) 2000-10-25 2002-06-13 Clark George Philip Processing content for electronic distribution using a digital rights management system
KR20020032803A (en) 2000-10-27 2002-05-04 구자홍 File structure for streaming service
US6985588B1 (en) 2000-10-30 2006-01-10 Geocodex Llc System and method for using location identity to control access to digital information
US6810389B1 (en) 2000-11-08 2004-10-26 Synopsys, Inc. System and method for flexible packaging of software application licenses
ATE552562T1 (en) 2000-11-10 2012-04-15 Aol Musicnow Llc DIGITAL CONTENT DISTRIBUTION AND SUBSCRIPTION SYSTEM
US7043473B1 (en) 2000-11-22 2006-05-09 Widevine Technologies, Inc. Media tracking system and method
US7117231B2 (en) 2000-12-07 2006-10-03 International Business Machines Corporation Method and system for the automatic generation of multi-lingual synchronized sub-titles for audiovisual data
US20070136817A1 (en) 2000-12-07 2007-06-14 Igt Wager game license management in a peer gaming network
KR100433516B1 (en) 2000-12-08 2004-05-31 삼성전자주식회사 Transcoding method
US20030002578A1 (en) 2000-12-11 2003-01-02 Ikuo Tsukagoshi System and method for timeshifting the encoding/decoding of audio/visual signals in real-time
US7150045B2 (en) 2000-12-14 2006-12-12 Widevine Technologies, Inc. Method and apparatus for protection of electronic media
US6798912B2 (en) 2000-12-18 2004-09-28 Koninklijke Philips Electronics N.V. Apparatus and method of program classification based on syntax of transcript information
US6973576B2 (en) 2000-12-27 2005-12-06 Margent Development, Llc Digital content security system
US7472280B2 (en) 2000-12-27 2008-12-30 Proxense, Llc Digital rights management
US7023924B1 (en) 2000-12-28 2006-04-04 Emc Corporation Method of pausing an MPEG coded video stream
EP1220173A1 (en) 2000-12-29 2002-07-03 THOMSON multimedia System and method for the secure distribution of digital content in a sharing network
KR20030005223A (en) 2001-01-12 2003-01-17 코닌클리케 필립스 일렉트로닉스 엔.브이. Method and device for scalable video transcoding
JP4048407B2 (en) 2001-01-18 2008-02-20 富士フイルム株式会社 Digital camera
US20020136298A1 (en) 2001-01-18 2002-09-26 Chandrashekhara Anantharamu System and method for adaptive streaming of predictive coded video data
GB0103245D0 (en) 2001-02-09 2001-03-28 Radioscape Ltd Method of inserting additional data into a compressed signal
US7610365B1 (en) 2001-02-14 2009-10-27 International Business Machines Corporation Automatic relevance-based preloading of relevant information in portable devices
TWI223942B (en) 2001-02-20 2004-11-11 Li Jian Min Contents transmission network system and creating method thereof
US20020120934A1 (en) 2001-02-28 2002-08-29 Marc Abrahams Interactive television browsing and buying method
US7162314B2 (en) 2001-03-05 2007-01-09 Microsoft Corporation Scripting solution for interactive audio generation
US6970822B2 (en) 2001-03-07 2005-11-29 Microsoft Corporation Accessing audio processing components in an audio generation system
US7305273B2 (en) 2001-03-07 2007-12-04 Microsoft Corporation Audio generation system manager
JP3889233B2 (en) 2001-03-08 2007-03-07 株式会社モノリス Image encoding method and apparatus, and image decoding method and apparatus
EP1244310A1 (en) 2001-03-21 2002-09-25 Canal+ Technologies Société Anonyme Data referencing system
US20030061305A1 (en) 2001-03-30 2003-03-27 Chyron Corporation System and method for enhancing streaming media delivery and reporting
US8060906B2 (en) 2001-04-06 2011-11-15 At&T Intellectual Property Ii, L.P. Method and apparatus for interactively retrieving content related to previous query results
JP3788260B2 (en) 2001-04-09 2006-06-21 日本電気株式会社 Distribution system, distribution method thereof, and distribution program
SE0101295D0 (en) 2001-04-10 2001-04-10 Ericsson Telefon Ab L M A method and network for delivering streaming data
US7121666B2 (en) 2001-04-25 2006-10-17 Tseng Scheffer C G Apparatus and method for the kinetic analysis of tear stability
US7136485B2 (en) 2001-05-04 2006-11-14 Hewlett-Packard Development Company, L.P. Packetizing devices for scalable data streaming
US20030041257A1 (en) 2001-05-04 2003-02-27 Wee Susie J. Systems, methods and storage devices for scalable data streaming
US6973445B2 (en) 2001-05-31 2005-12-06 Contentguard Holdings, Inc. Demarcated digital content and method for creating and processing demarcated digital works
US7103261B2 (en) 2001-06-04 2006-09-05 William Grecia Consumer friendly error correcting formating method for white book 2.0 video compact disc with CD-DA red book audio tracks
US7747853B2 (en) 2001-06-06 2010-06-29 Sony Corporation IP delivery of secure digital content
US7181131B2 (en) 2001-06-18 2007-02-20 Thomson Licensing Changing a playback speed for video presentation recorded in a progressive frame structure format
JP2003087785A (en) 2001-06-29 2003-03-20 Toshiba Corp Method of converting format of encoded video data and apparatus therefor
US7356245B2 (en) 2001-06-29 2008-04-08 International Business Machines Corporation Methods to facilitate efficient transmission and playback of digital information
US7421411B2 (en) 2001-07-06 2008-09-02 Nokia Corporation Digital rights management in a mobile communications environment
US20030031178A1 (en) 2001-08-07 2003-02-13 Amplify.Net, Inc. Method for ascertaining network bandwidth allocation policy associated with network address
US6925183B2 (en) 2001-08-16 2005-08-02 Asustek Computer Inc. Preventing shortened lifetimes of security keys in a wireless communications security system
US6829358B2 (en) 2001-08-20 2004-12-07 Asustek Computer Inc. Processing channel resets while performing a ciphering configuration change in a wireless communications protocol
EP1286349A1 (en) 2001-08-21 2003-02-26 Canal+ Technologies Société Anonyme File and content management
JP3593075B2 (en) 2001-09-10 2004-11-24 株式会社東芝 Information storage medium, information recording method and apparatus, and information reproducing method and apparatus
FI20011871A (en) 2001-09-24 2003-03-25 Nokia Corp Processing of multimedia data
US7457359B2 (en) 2001-09-26 2008-11-25 Mabey Danny L Systems, devices and methods for securely distributing highly-compressed multimedia content
US7191216B2 (en) 2001-10-03 2007-03-13 Nokia Corporation System and method for controlling access to downloadable resources
US20030093799A1 (en) 2001-11-14 2003-05-15 Kauffman Marc W. Streamed content Delivery
US7242773B2 (en) 2002-09-09 2007-07-10 Sony Corporation Multiple partial encryption using retuning
US7075587B2 (en) 2002-01-04 2006-07-11 Industry-Academic Cooperation Foundation Yonsei University Video display apparatus with separate display means for textual information
EP1470497A1 (en) 2002-01-12 2004-10-27 Coretrust, Inc. Method and system for the information protection of digital content
US7328345B2 (en) 2002-01-29 2008-02-05 Widevine Technologies, Inc. Method and system for end to end securing of content for video on demand
JP2003250113A (en) 2002-02-22 2003-09-05 Fuji Photo Film Co Ltd Video audio recording and reproducing apparatus
US7254634B1 (en) 2002-03-08 2007-08-07 Akamai Technologies, Inc. Managing web tier session state objects in a content delivery network (CDN)
US20030236836A1 (en) 2002-03-21 2003-12-25 Borthwick Ernest Mark System and method for the design and sharing of rich media productions via a computer network
US7170936B2 (en) 2002-03-28 2007-01-30 Intel Corporation Transcoding apparatus, system, and method
US20030185302A1 (en) 2002-04-02 2003-10-02 Abrams Thomas Algie Camera and/or camera converter
US20040006701A1 (en) 2002-04-13 2004-01-08 Advanced Decisions Inc. Method and apparatus for authentication of recorded audio
CA2485053A1 (en) 2002-05-10 2003-11-20 Protexis Inc. System and method for multi-tiered license management and distribution using networked clearinghouses
WO2003096669A2 (en) 2002-05-10 2003-11-20 Reisman Richard R Method and apparatus for browsing using multiple coordinated device
US7054804B2 (en) 2002-05-20 2006-05-30 International Buisness Machines Corporation Method and apparatus for performing real-time subtitles translation
US7197234B1 (en) 2002-05-24 2007-03-27 Digeo, Inc. System and method for processing subpicture data
US20030233464A1 (en) 2002-06-10 2003-12-18 Jonathan Walpole Priority progress streaming for quality-adaptive transmission of data
JP2004013823A (en) 2002-06-11 2004-01-15 Hitachi Ltd Content distributing system and program
US7155109B2 (en) 2002-06-14 2006-12-26 Microsoft Corporation Programmable video recorder having flexible trick play
US7644172B2 (en) 2002-06-24 2010-01-05 Microsoft Corporation Communicating via a connection between a streaming server and a client without breaking the connection
US9445133B2 (en) 2002-07-10 2016-09-13 Arris Enterprises, Inc. DVD conversion for on demand
US20040021684A1 (en) 2002-07-23 2004-02-05 Dominick B. Millner Method and system for an interactive video system
US7627229B2 (en) 2002-07-26 2009-12-01 Fujifilm Corporation Moving image recording apparatus and method of recording moving image
US7418190B2 (en) 2002-08-22 2008-08-26 Microsoft Corporation Accelerated access to frames from a compressed digital video stream without keyframes
US20040052501A1 (en) 2002-09-12 2004-03-18 Tam Eddy C. Video event capturing system and method
US7594271B2 (en) 2002-09-20 2009-09-22 Widevine Technologies, Inc. Method and system for real-time tamper evidence gathering for software
US7869691B2 (en) 2002-09-26 2011-01-11 Koninklijke Philips Electronics N.V. Apparatus for recording a main file and auxiliary files in a track on a record carrier
KR100910974B1 (en) 2002-10-02 2009-08-05 엘지전자 주식회사 Method for managing a graphic data of high density optical disc
US7185363B1 (en) 2002-10-04 2007-02-27 Microsoft Corporation Using a first device to engage in a digital rights management transaction on behalf of a second device
US20040071453A1 (en) 2002-10-08 2004-04-15 Valderas Harold M. Method and system for producing interactive DVD video slides
US20040081434A1 (en) 2002-10-15 2004-04-29 Samsung Electronics Co., Ltd. Information storage medium containing subtitle data for multiple languages using text data and downloadable fonts and apparatus therefor
US7295673B2 (en) 2002-10-23 2007-11-13 Divx, Inc. Method and system for securing compressed digital video
US7926080B2 (en) 2002-11-07 2011-04-12 Microsoft Corporation Trick mode support for VOD with long intra-frame intervals
US7702904B2 (en) 2002-11-15 2010-04-20 Nec Corporation Key management system and multicast delivery system using the same
EP1420580A1 (en) 2002-11-18 2004-05-19 Deutsche Thomson-Brandt GmbH Method and apparatus for coding/decoding items of subtitling data
JP3935419B2 (en) 2002-11-19 2007-06-20 Kddi株式会社 Video coding bit rate selection method
JP2004187161A (en) 2002-12-05 2004-07-02 Matsushita Electric Ind Co Ltd Moving video data processing equipment and moving video data processing method
WO2004054247A1 (en) 2002-12-09 2004-06-24 Koninklijke Philips Electronics N.V. Interactive television system with partial character set generator
US8054880B2 (en) 2004-12-10 2011-11-08 Tut Systems, Inc. Parallel rate control for digital video encoder with multi-processor architecture and picture-based look-ahead window
US7251328B2 (en) 2003-01-14 2007-07-31 General Instrument Corporation System for secure decryption of streaming media using selective decryption of header information and decryption of reassembled content
US7581255B2 (en) 2003-01-21 2009-08-25 Microsoft Corporation Systems and methods for licensing one or more data streams from an encoded digital media file
JP2004234128A (en) 2003-01-28 2004-08-19 Matsushita Electric Ind Co Ltd Information terminal device and server device therefor
US20060053080A1 (en) 2003-02-03 2006-03-09 Brad Edmonson Centralized management of digital rights licensing
US20040158878A1 (en) 2003-02-07 2004-08-12 Viresh Ratnakar Power scalable digital video decoding
KR100553885B1 (en) 2003-03-13 2006-02-24 삼성전자주식회사 Method for managing multimedia contents made with SMIL and SMIL file system thereof
US7356143B2 (en) 2003-03-18 2008-04-08 Widevine Technologies, Inc System, method, and apparatus for securely providing content viewable on a secure device
US7007170B2 (en) 2003-03-18 2006-02-28 Widevine Technologies, Inc. System, method, and apparatus for securely providing content viewable on a secure device
KR20050118197A (en) 2003-03-20 2005-12-15 코닌클리케 필립스 일렉트로닉스 엔.브이. Cpi data for stream buffer channels
US7313236B2 (en) 2003-04-09 2007-12-25 International Business Machines Corporation Methods and apparatus for secure and adaptive delivery of multimedia content
US7237061B1 (en) 2003-04-17 2007-06-26 Realnetworks, Inc. Systems and methods for the efficient reading of data in a server system
WO2004097824A1 (en) 2003-04-29 2004-11-11 Lg Electronics Inc. Recording medium having a data structure for managing reproduction of graphic data and methods and apparatuses of recording and reproducing
US7616865B2 (en) 2003-04-30 2009-11-10 Lg Electronics Inc. Recording medium having a data structure for managing reproduction of subtitle data and methods and apparatuses of recording and reproducing
WO2004100518A1 (en) 2003-05-05 2004-11-18 Behruz Vazvan A communication method, system, devices and software arranged to operate in this system and devices
CN1791939A (en) 2003-05-16 2006-06-21 皇家飞利浦电子股份有限公司 Method of recording and of replaying and video recording and replay systems
JP4047237B2 (en) 2003-07-04 2008-02-13 株式会社リコー Index data creation device, video playback device
US7382879B1 (en) 2003-07-23 2008-06-03 Sprint Communications Company, L.P. Digital rights management negotiation for streaming media over a network
JP3785642B2 (en) 2003-09-03 2006-06-14 日本電気株式会社 Encoding apparatus and decoding apparatus using encryption key included in digital watermark, and methods thereof
US7467202B2 (en) 2003-09-10 2008-12-16 Fidelis Security Systems High-performance network content analysis platform
US7389273B2 (en) 2003-09-25 2008-06-17 Scott Andrew Irwin System and method for federated rights management
WO2005034092A2 (en) 2003-09-29 2005-04-14 Handheld Entertainment, Inc. Method and apparatus for coding information
US7979886B2 (en) 2003-10-17 2011-07-12 Telefonaktiebolaget Lm Ericsson (Publ) Container format for multimedia presentations
US7406174B2 (en) 2003-10-21 2008-07-29 Widevine Technologies, Inc. System and method for n-dimensional encryption
EP2051510B1 (en) 2003-10-30 2013-08-14 Panasonic Corporation Mobile-terminal-orientated transmission method and apparatus
CN1830164A (en) 2003-10-30 2006-09-06 松下电器产业株式会社 Mobile-terminal-oriented transmission method and apparatus
US7673062B2 (en) 2003-11-18 2010-03-02 Yahoo! Inc. Method and apparatus for assisting with playback of remotely stored media files
US7882034B2 (en) 2003-11-21 2011-02-01 Realnetworks, Inc. Digital rights management for content rendering on playback devices
US8472792B2 (en) 2003-12-08 2013-06-25 Divx, Llc Multimedia distribution system
US7900140B2 (en) 2003-12-08 2011-03-01 Microsoft Corporation Media processing methods, systems and application program interfaces
US7519274B2 (en) 2003-12-08 2009-04-14 Divx, Inc. File format for multiple track digital data
JP4118232B2 (en) 2003-12-19 2008-07-16 三菱電機株式会社 Video data processing method and video data processing apparatus
EP1553779A1 (en) 2004-01-12 2005-07-13 Deutsche Thomson-Brandt GmbH Data reduction of video streams by selection of frames and partial deletion of transform coefficients
US20050183120A1 (en) 2004-01-13 2005-08-18 Saurabh Jain Multi-user personalized digital multimedia distribution methods and systems
US20050180641A1 (en) 2004-02-02 2005-08-18 Clark Adam L. System and method for transmitting live audio/video information
WO2005076601A1 (en) 2004-02-10 2005-08-18 Lg Electronic Inc. Text subtitle decoder and method for decoding text subtitle streams
JP5119566B2 (en) 2004-02-16 2013-01-16 ソニー株式会社 REPRODUCTION DEVICE AND REPRODUCTION METHOD, PROGRAM RECORDING MEDIUM, AND PROGRAM
US7512658B2 (en) 2004-02-26 2009-03-31 International Business Machines Corporation Providing a portion of an electronic mail message based upon a transfer rate, a message size, and a file format
EP1746825B1 (en) 2004-04-16 2011-06-08 Panasonic Corporation Recording medium, reproduction device, program
TWI405466B (en) 2004-04-16 2013-08-11 Panasonic Corp A regeneration device, a regeneration program, a regeneration method, and a regeneration system
US8868772B2 (en) 2004-04-30 2014-10-21 Echostar Technologies L.L.C. Apparatus, system, and method for adaptive-rate shifting of streaming content
US20050254508A1 (en) * 2004-05-13 2005-11-17 Nokia Corporation Cooperation between packetized data bit-rate adaptation and data packet re-transmission
US20060037057A1 (en) 2004-05-24 2006-02-16 Sharp Laboratories Of America, Inc. Method and system of enabling trick play modes using HTTP GET
US7299406B2 (en) 2004-06-02 2007-11-20 Research In Motion Limited Representing spreadsheet document content
GB0413261D0 (en) 2004-06-15 2004-07-14 Ibm Method and arrangement for front building
EP1758121B1 (en) 2004-06-18 2010-09-01 Panasonic Corporation Reproduction device, program, and reproduction method
US7571246B2 (en) 2004-07-29 2009-08-04 Microsoft Corporation Media transrating over a bandwidth-limited network
EP1779659B1 (en) 2004-08-12 2020-10-07 Gracenote Inc. Selection of content from a stream of video or audio data
US8055585B2 (en) 2004-08-12 2011-11-08 Enreach Technology, Inc. Digital media distribution
US20070271317A1 (en) 2004-08-16 2007-11-22 Beinsync Ltd. System and Method for the Synchronization of Data Across Multiple Computing Devices
DE602005012410D1 (en) 2004-09-08 2009-03-05 Panasonic Corp Anti-chatter, anti-rattle timer for application input in a DVD player.
JP2007065928A (en) 2005-08-30 2007-03-15 Toshiba Corp Information storage medium, information processing method, information transfer method, information reproduction method, information reproduction device, information recording method, information recording device, and program
US20060093320A1 (en) 2004-10-29 2006-05-04 Hallberg Bryan S Operation modes for a personal video recorder using dynamically generated time stamps
CN101069423A (en) 2004-11-29 2007-11-07 松下电器产业株式会社 Transmitting apparatus and receiving apparatus
WO2006066052A2 (en) 2004-12-16 2006-06-22 Sonic Solutions Methods and systems for use in network management of content
JP2008527945A (en) 2005-01-19 2008-07-24 トムソン ライセンシング Method and apparatus for real-time parallel encoding
DE102005004941B4 (en) 2005-02-02 2006-12-21 Avt Audio Vision Technology Gmbh Conversion of data, in particular for the reproduction of audio and / or video information
JP4599581B2 (en) 2005-02-08 2010-12-15 ブラザー工業株式会社 Information distribution system, distribution request program, transfer program, distribution program, etc.
US7350029B2 (en) 2005-02-10 2008-03-25 International Business Machines Corporation Data stream prefetching in a microprocessor
EP1862008A2 (en) 2005-02-18 2007-12-05 Koninklijke Philips Electronics N.V. Method of mutltiplexing auxiliary data in an audio/video stream
US7349886B2 (en) 2005-03-25 2008-03-25 Widevine Technologies, Inc. Securely relaying content using key chains
US20080120342A1 (en) 2005-04-07 2008-05-22 Iofy Corporation System and Method for Providing Data to be Used in a Presentation on a Device
US20080120330A1 (en) 2005-04-07 2008-05-22 Iofy Corporation System and Method for Linking User Generated Data Pertaining to Sequential Content
US8261356B2 (en) 2005-04-08 2012-09-04 Electronics And Telecommunications Research Institute Tool pack structure and contents execution device
US7676495B2 (en) 2005-04-14 2010-03-09 Microsoft Corporation Advanced streaming format table of contents object
US20060259589A1 (en) 2005-04-20 2006-11-16 Lerman David R Browser enabled video manipulation
US7478325B2 (en) 2005-04-22 2009-01-13 Microsoft Corporation Methods for providing an accurate visual rendition of a text element formatted with an unavailable font
JP4356645B2 (en) 2005-04-28 2009-11-04 ソニー株式会社 Subtitle generation apparatus and method
US8640166B1 (en) 2005-05-06 2014-01-28 Rovi Guides, Inc. Systems and methods for content surfing
DE102005022834A1 (en) 2005-05-11 2006-11-16 Deutsche Telekom Ag Method for disseminating DRM-based digital content
KR100717008B1 (en) 2005-05-31 2007-05-10 삼성전자주식회사 Method and apparatus for transmitting and receiving of partial font file
US7805470B2 (en) * 2005-06-23 2010-09-28 Emc Corporation Methods and apparatus for managing the storage of content in a file system
US20070005333A1 (en) 2005-06-30 2007-01-04 Benard Setiohardjo Analog tape emulation
CA2513016A1 (en) 2005-07-22 2007-01-22 Research In Motion Limited A secure method of synchronizing cache contents of a mobile browser with a proxy server
JP2007036666A (en) 2005-07-27 2007-02-08 Onkyo Corp Contents distribution system, client, and client program
US8104054B2 (en) 2005-09-01 2012-01-24 At&T Intellectual Property I, L.P. Methods, systems, and devices for bandwidth conservation
JP4682759B2 (en) 2005-09-08 2011-05-11 ソニー株式会社 Playback apparatus, playback method, and playback program
US7817608B2 (en) 2005-09-23 2010-10-19 Widevine Technologies, Inc. Transitioning to secure IP communications for encoding, encapsulating, and encrypting data
US8522290B2 (en) 2005-10-07 2013-08-27 Infosys Technologies, Ltd. Video on demand system and methods thereof
KR101291225B1 (en) 2005-10-07 2013-07-31 구글 인코포레이티드 Indirect subscriptions to user-selected content feeds and top n lists of content feeds
US20070086528A1 (en) 2005-10-18 2007-04-19 Mauchly J W Video encoder with multiple processors
FR2892885B1 (en) 2005-11-02 2008-01-25 Streamezzo Sa METHOD FOR MANAGING CHARACTER FONTS WITHIN MULTIMEDIA SCENES, CORRESPONDING COMPUTER PROGRAM AND TERMINAL.
JP4793856B2 (en) 2005-12-22 2011-10-12 Kddi株式会社 Image scrambling device and image descrambling device
US8949146B2 (en) 2005-12-31 2015-02-03 Michelle Fisher Method for purchasing tickets using a mobile communication device
EP1806919A1 (en) 2006-01-05 2007-07-11 Alcatel Lucent Media delivery system with content-based trick play mode
US8214516B2 (en) 2006-01-06 2012-07-03 Google Inc. Dynamic media serving infrastructure
US8139768B2 (en) 2006-01-19 2012-03-20 Microsoft Corporation Encrypting content in a tuner device and analyzing content protection policy
US20070178933A1 (en) 2006-01-30 2007-08-02 Sony Ericsson Mobile Communications Ab Wireless communication network selection
JP4826270B2 (en) 2006-02-03 2011-11-30 富士ゼロックス株式会社 Electronic ticket issue management system, issuer system, program
US7962942B1 (en) 2006-02-28 2011-06-14 Rovi Guides, Inc. Systems and methods for enhanced trick-play functions
JP4631747B2 (en) 2006-03-02 2011-02-16 ソニー株式会社 Playback apparatus, playback method, and playback program
US20070217759A1 (en) 2006-03-10 2007-09-20 Microsoft Corporation Reverse Playback of Video Data
EP1999883A4 (en) 2006-03-14 2013-03-06 Divx Llc Federated digital rights management scheme including trusted systems
US20070217339A1 (en) 2006-03-16 2007-09-20 Hitachi, Ltd. Cross-layer QoS mechanism for video transmission over wireless LAN
WO2007113836A2 (en) 2006-04-03 2007-10-11 Beinsync Ltd. Peer to peer syncronization system and method
US20070239839A1 (en) 2006-04-06 2007-10-11 Buday Michael E Method for multimedia review synchronization
US7823210B2 (en) 2006-05-23 2010-10-26 Microsoft Corporation Rights management using recording definition information (RDI)
US8245264B2 (en) 2006-05-26 2012-08-14 John Toebes Methods and systems to reduce channel selection transition delay in a digital network
US8516531B2 (en) 2006-05-31 2013-08-20 Alcatel Lucent Reducing channel change delays
US20080005175A1 (en) 2006-06-01 2008-01-03 Adrian Bourke Content description system
US8412927B2 (en) 2006-06-07 2013-04-02 Red Hat, Inc. Profile framework for token processing system
US9380096B2 (en) 2006-06-09 2016-06-28 Qualcomm Incorporated Enhanced block-request streaming system for handling low-latency streaming
US9209934B2 (en) 2006-06-09 2015-12-08 Qualcomm Incorporated Enhanced block-request streaming using cooperative parallel HTTP and forward error correction
KR101013686B1 (en) 2006-06-29 2011-02-10 엘지전자 주식회사 Method and system for managing devices in user domain in digital rights management
EP2043293A4 (en) 2006-07-19 2011-01-19 Panasonic Corp Medium data processing device and medium data processing method
US20080043832A1 (en) 2006-08-16 2008-02-21 Microsoft Corporation Techniques for variable resolution encoding and decoding of digital video
US20080066181A1 (en) 2006-09-07 2008-03-13 Microsoft Corporation DRM aspects of peer-to-peer digital content distribution
US20080066099A1 (en) 2006-09-11 2008-03-13 Apple Computer, Inc. Media systems with integrated content searching
EP2461586A1 (en) 2006-09-29 2012-06-06 United Video Properties, Inc. Management of profiles for interactive media guidance applications
US8832742B2 (en) 2006-10-06 2014-09-09 United Video Properties, Inc. Systems and methods for acquiring, categorizing and delivering media in interactive media guidance applications
US8381249B2 (en) 2006-10-06 2013-02-19 United Video Properties, Inc. Systems and methods for acquiring, categorizing and delivering media in interactive media guidance applications
US8270819B2 (en) 2006-10-31 2012-09-18 Tivo Inc. Performing trick play functions in a digital video recorder with efficient use of resources
US8711929B2 (en) 2006-11-01 2014-04-29 Skyfire Labs, Inc. Network-based dynamic encoding
US8341282B2 (en) 2006-11-21 2012-12-25 Verizon Patent And Licensing Inc. Hybrid buffer management
JP4609773B2 (en) 2006-11-28 2011-01-12 コニカミノルタビジネステクノロジーズ株式会社 Document data creation apparatus, document data creation method, and control program
US8929360B2 (en) 2006-12-07 2015-01-06 Cisco Technology, Inc. Systems, methods, media, and means for hiding network topology
WO2008072093A2 (en) 2006-12-13 2008-06-19 Quickplay Media Inc. Mobile media platform
US20130166580A1 (en) 2006-12-13 2013-06-27 Quickplay Media Inc. Media Processor
US9124650B2 (en) 2006-12-13 2015-09-01 Quickplay Media Inc. Digital rights management in a mobile environment
AR064274A1 (en) 2006-12-14 2009-03-25 Panasonic Corp MOVEMENT IMAGE CODING METHOD, MOVING IMAGE CODING DEVICE, MOVING IMAGE RECORDING METHOD, RECORDING MEDIA, MOVING IMAGE PLAYBACK METHOD, IMPROVEMENT AND IMPROVEMENT SYSTEM
US8537659B2 (en) 2006-12-20 2013-09-17 Apple Inc. Method and system for reducing service interruptions to mobile communication devices
CA2616440C (en) 2006-12-29 2015-02-17 Broadband Royalty Corporation Source optimized dynamic trickplay
US8069260B2 (en) 2007-01-12 2011-11-29 Microsoft Corporation Dynamic buffer settings for media playback
US9826197B2 (en) 2007-01-12 2017-11-21 Activevideo Networks, Inc. Providing television broadcasts over a managed network and interactive content over an unmanaged network to a client device
CA2676192A1 (en) 2007-01-22 2008-07-31 Min Tnetap I Goeteborg Ab Method and apparatus for obtaining digital objects in a communication network
US20080192818A1 (en) 2007-02-09 2008-08-14 Dipietro Donald Vincent Systems and methods for securing media
US7788395B2 (en) 2007-02-14 2010-08-31 Microsoft Corporation Adaptive media playback
US20080209534A1 (en) * 2007-02-15 2008-08-28 Bcode Pty Limited Token based applications platform method, system and apparatus
GB0703695D0 (en) 2007-02-26 2007-04-04 The Technology Partnership Plc A Device for Receiving Digital Broadcasts
JP4812117B2 (en) 2007-03-16 2011-11-09 日本放送協会 Content encryption apparatus and program thereof, and content decryption apparatus and program thereof
US7872975B2 (en) 2007-03-26 2011-01-18 Microsoft Corporation File server pipelining with denial of service mitigation
BRPI0810699B1 (en) 2007-05-04 2021-03-02 Nokia Technologies Oy Method and device for recording a media stream into a reception hint track of a multimedia storage file
US20080279535A1 (en) 2007-05-10 2008-11-13 Microsoft Corporation Subtitle data customization and exposure
US20080294691A1 (en) 2007-05-22 2008-11-27 Sunplus Technology Co., Ltd. Methods for generating and playing multimedia file and recording medium storing multimedia file
US20080294453A1 (en) 2007-05-24 2008-11-27 La La Media, Inc. Network Based Digital Rights Management System
US7953079B2 (en) 2007-06-04 2011-05-31 Cisco Technology, Inc. Method and apparatus to control access to content
US20080310454A1 (en) 2007-06-12 2008-12-18 Bellwood Thomas A Provisioning Bandwidth For A Digital Media Stream
US7558760B2 (en) 2007-06-12 2009-07-07 Microsoft Corporation Real-time key frame generation
EP2172867A4 (en) 2007-06-20 2012-08-08 Panasonic Corp Network av content reproduction terminal, server, and system
KR20090004659A (en) 2007-07-02 2009-01-12 엘지전자 주식회사 Digital broadcasting system and method of processing data in digital broadcasting system
US8321556B1 (en) 2007-07-09 2012-11-27 The Nielsen Company (Us), Llc Method and system for collecting data on a wireless device
US7991904B2 (en) 2007-07-10 2011-08-02 Bytemobile, Inc. Adaptive bitrate management for streaming media over packet networks
WO2009019739A1 (en) 2007-08-09 2009-02-12 Thomson Licensing A video data reproduction system
US8521540B2 (en) 2007-08-17 2013-08-27 Qualcomm Incorporated Encoding and/or decoding digital signals using a permutation value
US9118811B2 (en) 2007-08-24 2015-08-25 The Invention Science Fund I, Llc Predicted concurrent streaming program selection
US9602757B2 (en) 2007-09-04 2017-03-21 Apple Inc. Display of video subtitles
KR101058042B1 (en) 2007-09-06 2011-08-19 삼성전자주식회사 Multistream playback device and playback method
US8023562B2 (en) 2007-09-07 2011-09-20 Vanguard Software Solutions, Inc. Real-time video coding/decoding
US8046453B2 (en) 2007-09-20 2011-10-25 Qurio Holdings, Inc. Illustration supported P2P media content streaming
US10277956B2 (en) 2007-10-01 2019-04-30 Cabot Communications Method and apparatus for streaming digital media content and a communication system
JP4234770B1 (en) 2007-10-10 2009-03-04 株式会社東芝 Playback apparatus and playback control method
ES2895384T3 (en) * 2007-11-16 2022-02-21 Divx Llc Fragment header incorporating binary flags and correlated variable-length fields
CN101861583B (en) 2007-11-16 2014-06-04 索尼克Ip股份有限公司 Hierarchical and reduced index structures for multimedia files
US20090136216A1 (en) 2007-11-28 2009-05-28 Kourosh Soroushian System and Method for Playback of Partially Available Multimedia Content
US8543720B2 (en) 2007-12-05 2013-09-24 Google Inc. Dynamic bit rate scaling
US7697557B2 (en) 2007-12-26 2010-04-13 Alcatel Lucent Predictive caching content distribution network
US8997161B2 (en) 2008-01-02 2015-03-31 Sonic Ip, Inc. Application enhancement tracks
US8265168B1 (en) 2008-02-01 2012-09-11 Zenverge, Inc. Providing trick mode for video stream transmitted over network
US8274874B2 (en) 2008-02-04 2012-09-25 International Business Machines Corporation Apparatus, system, and method for locating and fast-searching units of digital information in volume, optical-storage disks
US8776161B2 (en) 2008-02-12 2014-07-08 Ciena Corporation Systems and methods for video processing in network edge devices
US8401900B2 (en) 2008-02-14 2013-03-19 At&T Intellectual Property I, Lp System and method for presenting advertising data based on end user trick-play trend data
US8745670B2 (en) 2008-02-26 2014-06-03 At&T Intellectual Property I, Lp System and method for promoting marketable items
US8245124B1 (en) 2008-03-20 2012-08-14 Adobe Systems Incorporated Content modification and metadata
US9762692B2 (en) 2008-04-04 2017-09-12 Level 3 Communications, Llc Handling long-tail content in a content delivery network (CDN)
US8456380B2 (en) 2008-05-15 2013-06-04 International Business Machines Corporation Processing computer graphics generated by a remote computer for streaming to a client computer
US8175268B2 (en) 2008-05-22 2012-05-08 Red Hat, Inc. Generating and securing archive keys
KR20110014995A (en) 2008-06-06 2011-02-14 디브이엑스, 인크. Systems and methods for font file optimization for multimedia files
US7991801B2 (en) 2008-06-10 2011-08-02 International Business Machines Corporation Real-time dynamic and synchronized captioning system and method for use in the streaming of multimedia data
US8527876B2 (en) 2008-06-12 2013-09-03 Apple Inc. System and methods for adjusting graphical representations of media files based on previous usage
US20090313564A1 (en) 2008-06-12 2009-12-17 Apple Inc. Systems and methods for adjusting playback of media files based on previous usage
US8472779B2 (en) 2008-06-17 2013-06-25 Microsoft Corporation Concurrently displaying multiple trick streams for video
US8218633B2 (en) 2008-06-18 2012-07-10 Kiu Sha Management Limited Liability Company Bidirectionally decodable Wyner-Ziv video coding
US8387150B2 (en) 2008-06-27 2013-02-26 Microsoft Corporation Segmented media content rights management
US7797426B1 (en) 2008-06-27 2010-09-14 BitGravity, Inc. Managing TCP anycast requests
US8352996B2 (en) 2008-06-27 2013-01-08 Microsoft Corporation Adaptive video switching for variable network conditions
US20110055585A1 (en) 2008-07-25 2011-03-03 Kok-Wah Lee Methods and Systems to Create Big Memorizable Secrets and Their Applications in Information Engineering
US8473628B2 (en) 2008-08-29 2013-06-25 Adobe Systems Incorporated Dynamically altering playlists
US8752100B2 (en) 2008-08-29 2014-06-10 At&T Intellectual Property Ii, Lp Systems and methods for distributing video on demand
US8311111B2 (en) 2008-09-11 2012-11-13 Google Inc. System and method for decoding using parallel processing
US8718135B2 (en) 2008-09-19 2014-05-06 The Hong Kong University Of Science And Technology Method and system for transcoding based robust streaming of compressed video
CN101686383B (en) 2008-09-23 2013-05-01 Utc消防和保安美国有限公司 Method and system for transmitting medium through network
US20100083322A1 (en) 2008-09-29 2010-04-01 Alan Rouse Providing selective video program content and associated license in response to a promotion token
US8117306B1 (en) 2008-09-29 2012-02-14 Amazon Technologies, Inc. Optimizing content management
EP2350909A4 (en) 2008-10-10 2013-06-19 Zapmytv Com Inc Controlled delivery of content data streams to remote users
US20100094970A1 (en) 2008-10-15 2010-04-15 Patentvc Ltd. Latency based selection of fractional-storage servers
US8051287B2 (en) 2008-10-15 2011-11-01 Adobe Systems Incorporated Imparting real-time priority-based network communications in an encrypted communication session
JP5141494B2 (en) 2008-10-27 2013-02-13 ブラザー工業株式会社 Content distributed storage system, special content acquisition method, node device, and node processing program
US8249168B2 (en) 2008-11-06 2012-08-21 Advanced Micro Devices, Inc. Multi-instance video encoder
EP2359536B1 (en) 2008-11-24 2016-03-16 Ankeena Networks, Inc., Adaptive network content delivery system
US9548859B2 (en) 2008-12-03 2017-01-17 Google Technology Holdings LLC Ticket-based implementation of content leasing
US9015209B2 (en) 2008-12-16 2015-04-21 Sandisk Il Ltd. Download management of discardable files
US9009337B2 (en) 2008-12-22 2015-04-14 Netflix, Inc. On-device multiplexing of streaming media content
US20100166060A1 (en) 2008-12-31 2010-07-01 Texas Instruments Incorporated Video transcoder rate control
CA2749170C (en) 2009-01-07 2016-06-21 Divx, Inc. Singular, collective and automated creation of a media guide for online content
US8311115B2 (en) 2009-01-29 2012-11-13 Microsoft Corporation Video encoding using previously calculated motion information
US8396114B2 (en) 2009-01-29 2013-03-12 Microsoft Corporation Multiple bit rate video encoding using variable bit rate and dynamic resolution for adaptive video streaming
JP5353277B2 (en) 2009-02-06 2013-11-27 日本電気株式会社 Stream signal transmission apparatus and transmission method
US8621044B2 (en) 2009-03-16 2013-12-31 Microsoft Corporation Smooth, stateless client media streaming
US8595378B1 (en) 2009-03-30 2013-11-26 Amazon Technologies, Inc. Managing communications having multiple alternative destinations
RU2011147112A (en) 2009-04-20 2013-05-27 Конинклейке Филипс Электроникс Н.В. VERIFICATION AND SYNCHRONIZATION OF FILES RECEIVED SEPARATELY FROM VIDEO CONTENT
US8774609B2 (en) 2009-05-18 2014-07-08 Disney Enterprises, Inc. System and method for providing time-adapted video content
WO2010134996A2 (en) 2009-05-20 2010-11-25 Intertrust Technologies Corporation Content sharing systems and methods
US20100306249A1 (en) 2009-05-27 2010-12-02 James Hill Social network systems and methods
US8296434B1 (en) 2009-05-28 2012-10-23 Amazon Technologies, Inc. Providing dynamically scaling computing load balancing
US9602864B2 (en) 2009-06-08 2017-03-21 Time Warner Cable Enterprises Llc Media bridge apparatus and methods
US8270473B2 (en) 2009-06-12 2012-09-18 Microsoft Corporation Motion based dynamic resolution multiple bit rate video encoding
BRPI1013145B1 (en) 2009-06-15 2021-01-12 Blackberry Limited methods and devices for transmitting media content via hypertext transfer protocol
US8392959B2 (en) 2009-06-16 2013-03-05 Comcast Cable Communications, Llc Portable media asset
US9680892B2 (en) 2009-06-26 2017-06-13 Adobe Systems Incorporated Providing integration of multi-bit-rate media streams
US8588296B2 (en) 2009-07-02 2013-11-19 Dialogic Corporation Bitrate control algorithm for video transcoding systems
US8225061B2 (en) 2009-07-02 2012-07-17 Apple Inc. Method and apparatus for protected content data processing
US8433814B2 (en) 2009-07-16 2013-04-30 Netflix, Inc. Digital content distribution system and method
US8412841B1 (en) 2009-08-17 2013-04-02 Adobe Systems Incorporated Media content streaming using stream message fragments
US8355433B2 (en) 2009-08-18 2013-01-15 Netflix, Inc. Encoding video streams for adaptive video streaming
US8903940B2 (en) 2009-09-10 2014-12-02 Tribal Technologies, Inc. System and method for intelligently distributing content to a mobile device based on a detected location of the mobile device and context data defining characteristics of the location at a particular date and time
US8990854B2 (en) 2009-09-14 2015-03-24 Broadcom Corporation System and method in a television for providing user-selection of objects in a television program
US8392600B2 (en) 2009-09-14 2013-03-05 Adobe Systems Incorporated Dynamic stream switch control
US20110096828A1 (en) 2009-09-22 2011-04-28 Qualcomm Incorporated Enhanced block-request streaming using scalable encoding
CN102034177A (en) 2009-09-29 2011-04-27 国际商业机器公司 Method and device for realizing effective mobile ticket transfer
EP2484090A1 (en) 2009-09-29 2012-08-08 Nokia Corp. System, method and apparatus for dynamic media file streaming
JP2013507084A (en) 2009-10-05 2013-02-28 アイ.シー.ヴイ.ティー リミテッド Method and system for image processing
EP2486517A4 (en) 2009-10-05 2014-06-11 Icvt Ltd Apparatus and methods for recompression of digital images
US9237387B2 (en) 2009-10-06 2016-01-12 Microsoft Technology Licensing, Llc Low latency cacheable media streaming
US8341255B2 (en) 2009-10-06 2012-12-25 Unwired Planet, Inc. Managing network traffic by editing a manifest file
US10264029B2 (en) 2009-10-30 2019-04-16 Time Warner Cable Enterprises Llc Methods and apparatus for packetized content delivery over a content delivery network
KR101750048B1 (en) 2009-11-13 2017-07-03 삼성전자주식회사 Method and apparatus for providing trick play service
US9164967B2 (en) 2009-11-25 2015-10-20 Red Hat, Inc. Extracting font metadata from font files into searchable metadata for package distribution
US8625667B2 (en) 2009-11-30 2014-01-07 Alcatel Lucent Method of opportunity-based transmission of wireless video
US20110138018A1 (en) 2009-12-04 2011-06-09 Qualcomm Incorporated Mobile media server
CA2782825C (en) 2009-12-04 2016-04-26 Divx, Llc Elementary bitstream cryptographic material transport systems and methods
US8806341B2 (en) 2009-12-10 2014-08-12 Hulu, LLC Method and apparatus for navigating a media program via a histogram of popular segments
KR20110066578A (en) 2009-12-11 2011-06-17 삼성전자주식회사 Digital contents, apparatus and method for playing the digital contents
US8949436B2 (en) 2009-12-18 2015-02-03 Alcatel Lucent System and method for controlling peer-to-peer connections
US9344735B2 (en) 2009-12-21 2016-05-17 Tmm, Inc. Progressive shape based encoding of video content within a swarm environment
US20110149753A1 (en) 2009-12-21 2011-06-23 Qualcomm Incorporated Switching between media broadcast streams having varying levels of quality
EP2360924A1 (en) 2010-01-18 2011-08-24 Alcatel-Lucent España, S.A. A digital multimedia data transmission device and method
US20110184738A1 (en) 2010-01-25 2011-07-28 Kalisky Dror Navigation and orientation tools for speech synthesis
US20110191439A1 (en) 2010-01-29 2011-08-04 Clarendon Foundation, Inc. Media content ingestion
US8291460B1 (en) 2010-02-12 2012-10-16 Adobe Systems Incorporated Rate adaptation based on dynamic performance monitoring
EP3647974A1 (en) 2010-02-17 2020-05-06 Verimatrix, Inc. Systems and methods for securing content delivered using a playlist
US8516147B2 (en) 2010-02-26 2013-08-20 Simula Innovation Sa Data segmentation, request and transfer method
US8527649B2 (en) 2010-03-09 2013-09-03 Mobixell Networks Ltd. Multi-stream bit rate adaptation
US8386621B2 (en) 2010-03-12 2013-02-26 Netflix, Inc. Parallel streaming
US8402155B2 (en) 2010-04-01 2013-03-19 Xcira, Inc. Real-time media delivery with automatic catch-up
GB2479455B (en) 2010-04-07 2014-03-05 Apple Inc Real-time or near real-time streaming
US20130152767A1 (en) 2010-04-22 2013-06-20 Jamrt Ltd Generating pitched musical events corresponding to musical content
US20110264530A1 (en) 2010-04-23 2011-10-27 Bryan Santangelo Apparatus and methods for dynamic secondary content and data insertion and delivery
EP2564354A4 (en) 2010-04-29 2014-03-12 Icvt Ltd Apparatus and methods for re-compression having a monotonic relationship between extent of compression and quality of compressed image
WO2011139305A1 (en) 2010-05-04 2011-11-10 Azuki Systems, Inc. Method and apparatus for carrier controlled dynamic rate adaptation and client playout rate reduction
US8533337B2 (en) 2010-05-06 2013-09-10 Citrix Systems, Inc. Continuous upgrading of computers in a load balanced environment
KR101837687B1 (en) 2010-06-04 2018-03-12 삼성전자주식회사 Method and apparatus for adaptive streaming based on plurality of elements determining quality of content
US8705616B2 (en) 2010-06-11 2014-04-22 Microsoft Corporation Parallel multiple bitrate video encoding to reduce latency and dependences between groups of pictures
US8819269B2 (en) 2010-06-30 2014-08-26 Cable Television Laboratories, Inc. Adaptive bit rate method and system using retransmission and replacement
US8782268B2 (en) 2010-07-20 2014-07-15 Microsoft Corporation Dynamic composition of media
US9226045B2 (en) 2010-08-05 2015-12-29 Qualcomm Incorporated Signaling attributes for network-streamed video data
US20120036365A1 (en) 2010-08-06 2012-02-09 Microsoft Corporation Combining request-dependent metadata with media content
EP2614653A4 (en) 2010-09-10 2015-04-15 Nokia Corp A method and apparatus for adaptive streaming
US9014471B2 (en) 2010-09-17 2015-04-21 I.C.V.T. Ltd. Method of classifying a chroma downsampling error
WO2012035533A2 (en) 2010-09-17 2012-03-22 I.C.V.T Ltd. A method of classifying a chroma downsampling error
US9998749B2 (en) 2010-10-19 2018-06-12 Otoy, Inc. Composite video streaming using stateless compression
US8532464B2 (en) 2010-11-08 2013-09-10 Deluxe Digital Studios, Inc. Methods and systems for use in controlling playback of content in relation to recorded content
US8856846B2 (en) 2010-11-29 2014-10-07 At&T Intellectual Property I, L.P. Content placement
US8910295B2 (en) 2010-11-30 2014-12-09 Comcast Cable Communications, Llc Secure content access authorization
US9510061B2 (en) 2010-12-03 2016-11-29 Arris Enterprises, Inc. Method and apparatus for distributing video
US20120144117A1 (en) 2010-12-03 2012-06-07 Microsoft Corporation Recommendation based caching of content items
WO2012094171A1 (en) 2011-01-05 2012-07-12 Divx, Llc. Adaptive bitrate streaming of media stored in matroska container files using hypertext transfer protocol
US9247312B2 (en) 2011-01-05 2016-01-26 Sonic Ip, Inc. Systems and methods for encoding source media in matroska container files for adaptive bitrate streaming using hypertext transfer protocol
US9020039B2 (en) 2011-01-06 2015-04-28 Sonic Ip, Inc. Systems and methods for encoding alternative streams of video for use in adaptive bitrate streaming
US8451905B1 (en) 2011-02-25 2013-05-28 Adobe Systems Incorporated Efficient encoding of video frames in a distributed video coding environment
US8589996B2 (en) 2011-03-16 2013-11-19 Azuki Systems, Inc. Method and system for federated over-the-top content delivery
US8862763B2 (en) 2011-03-30 2014-10-14 Verizon Patent And Licensing Inc. Downloading video using excess bandwidth
US9154826B2 (en) 2011-04-06 2015-10-06 Headwater Partners Ii Llc Distributing content and service launch objects to mobile devices
US8839282B2 (en) 2011-04-07 2014-09-16 Level 3 Communications, Llc Multimedia test systems
US8861929B2 (en) 2011-04-14 2014-10-14 Cisco Technology, Inc. Per-subscriber adaptive bit rate stream management method
EP2515262A1 (en) 2011-04-18 2012-10-24 Amadeus S.A.S. De-synchronization monitoring system and method
US8681866B1 (en) 2011-04-28 2014-03-25 Google Inc. Method and apparatus for encoding video by downsampling frame resolution
US8516144B2 (en) 2011-04-29 2013-08-20 Cbs Interactive Inc. Startup bitrate in adaptive bitrate streaming
US9071841B2 (en) 2011-05-17 2015-06-30 Microsoft Technology Licensing, Llc Video transcoding with dynamically modifiable spatial resolution
US8914478B2 (en) 2011-05-19 2014-12-16 International Business Machines Corporation Automated deployment of software for managed hardware in a storage area network
CA2837755C (en) 2011-06-01 2019-04-09 Zhou Wang Method and system for structural similarity based perceptual video coding
US8856283B2 (en) 2011-06-03 2014-10-07 Apple Inc. Playlists for real-time or near real-time streaming
US8843586B2 (en) 2011-06-03 2014-09-23 Apple Inc. Playlists for real-time or near real-time streaming
US9001898B2 (en) 2011-06-10 2015-04-07 Thinklogical, Llc Method and system for serial digital interface (SDI) video data extension
WO2012171113A1 (en) 2011-06-14 2012-12-20 Zhou Wang Method and system for structural similarity based rate-distortion optimization for perceptual video coding
CN102857478B (en) * 2011-06-30 2016-09-28 华为技术有限公司 Media data control method and device
US9002978B2 (en) 2011-07-08 2015-04-07 Ming Li Content delivery prediction and feedback systems
US8925021B2 (en) 2011-07-11 2014-12-30 Telefonaktiebolaget Lm Ericsson (Publ) Method and system for trick play in over-the-top video delivery
US20130041808A1 (en) 2011-08-10 2013-02-14 Nathalie Pham Distributed media access
US9264508B2 (en) 2011-08-19 2016-02-16 Time Warner Cable Enterprises Llc Apparatus and methods for reduced switching delays in a content distribution network
EP2751990A4 (en) 2011-08-29 2015-04-22 Icvt Ltd Controlling a video content system
US8787570B2 (en) 2011-08-31 2014-07-22 Sonic Ip, Inc. Systems and methods for automatically generating top level index files
US8909922B2 (en) 2011-09-01 2014-12-09 Sonic Ip, Inc. Systems and methods for playing back alternative streams of protected content protected using common cryptographic information
US8964977B2 (en) 2011-09-01 2015-02-24 Sonic Ip, Inc. Systems and methods for saving encoded media streamed using adaptive bitrate streaming
US20130066838A1 (en) * 2011-09-12 2013-03-14 Microsoft Corporation Efficient data recovery
US8885702B2 (en) 2011-10-17 2014-11-11 Google Inc. Rate-distortion-complexity optimization of video encoding guided by video description length
US8726264B1 (en) 2011-11-02 2014-05-13 Amazon Technologies, Inc. Architecture for incremental deployment
US8806193B2 (en) 2011-12-22 2014-08-12 Adobe Systems Incorporated Methods and apparatus for integrating digital rights management (DRM) systems with native HTTP live streaming
US9189252B2 (en) 2011-12-30 2015-11-17 Microsoft Technology Licensing, Llc Context-based device action prediction
US20130179199A1 (en) 2012-01-06 2013-07-11 Rovi Corp. Systems and methods for granting access to digital content using electronic tickets and ticket tokens
US8473630B1 (en) * 2012-01-18 2013-06-25 Google Inc. Adaptive video streaming using time-to-offset mapping
US20140355668A1 (en) 2012-01-23 2014-12-04 I.C.V.T. Ltd. Method and System for Controlling Video Frame Encoding
US20130196292A1 (en) 2012-01-30 2013-08-01 Sharp Kabushiki Kaisha Method and system for multimedia-based language-learning, and computer program therefor
WO2013119802A1 (en) 2012-02-11 2013-08-15 Social Communications Company Routing virtual area based communications
CN104412577A (en) 2012-02-23 2015-03-11 大专院校网站公司 Asynchronous video interview system
US8787726B2 (en) 2012-02-26 2014-07-22 Antonio Rossi Streaming video navigation systems and methods
US9450997B2 (en) 2012-02-27 2016-09-20 Qualcomm Incorporated Dash client and receiver with request cancellation capabilities
WO2013144942A1 (en) 2012-03-28 2013-10-03 I.C.V.T. Ltd. Controlling a compression of an image according to a degree of photo-realism
US9537920B2 (en) 2012-05-18 2017-01-03 Google Technology Holdings LLC Enforcement of trick-play disablement in adaptive bit rate video content delivery
US9571827B2 (en) 2012-06-08 2017-02-14 Apple Inc. Techniques for adaptive video streaming
US9197685B2 (en) 2012-06-28 2015-11-24 Sonic Ip, Inc. Systems and methods for fast video startup using trick play streams
CA2877378A1 (en) 2012-06-29 2014-01-03 Nanostring Technologies, Inc. Methods of treating breast cancer with gemcitabine therapy
US9332292B2 (en) * 2012-08-15 2016-05-03 Verizon Patent And Licensing Inc. Media playlists with selective media expiration
US9215269B2 (en) 2012-08-23 2015-12-15 Amazon Technologies, Inc. Predictive caching for content
US8997254B2 (en) 2012-09-28 2015-03-31 Sonic Ip, Inc. Systems and methods for fast startup streaming of encrypted multimedia content
US8914836B2 (en) 2012-09-28 2014-12-16 Sonic Ip, Inc. Systems, methods, and computer program products for load adaptive streaming
US20140140417A1 (en) 2012-11-16 2014-05-22 Gary K. Shaffer System and method for providing alignment of multiple transcoders for adaptive bitrate streaming in a network environment
US9544344B2 (en) 2012-11-20 2017-01-10 Google Technology Holdings LLC Method and apparatus for streaming media content to client devices
US9300734B2 (en) 2012-11-21 2016-03-29 NETFLIX Inc. Multi-CDN digital content streaming
US9191465B2 (en) 2012-11-21 2015-11-17 NETFLIX Inc. Multi-CDN digital content streaming
US9264475B2 (en) 2012-12-31 2016-02-16 Sonic Ip, Inc. Use of objective quality measures of streamed content to reduce streaming bandwidth
US9191457B2 (en) 2012-12-31 2015-11-17 Sonic Ip, Inc. Systems, methods, and media for controlling delivery of content
US9313510B2 (en) 2012-12-31 2016-04-12 Sonic Ip, Inc. Use of objective quality measures of streamed content to reduce streaming bandwidth
US9357210B2 (en) 2013-02-28 2016-05-31 Sonic Ip, Inc. Systems and methods of encoding multiple video streams for adaptive bitrate streaming
US9350990B2 (en) 2013-02-28 2016-05-24 Sonic Ip, Inc. Systems and methods of encoding multiple video streams with adaptive quantization for adaptive bitrate streaming
US9906785B2 (en) 2013-03-15 2018-02-27 Sonic Ip, Inc. Systems, methods, and media for transcoding video data according to encoding parameters indicated by received metadata
US10397292B2 (en) 2013-03-15 2019-08-27 Divx, Llc Systems, methods, and media for delivery of content
US20140297804A1 (en) 2013-03-28 2014-10-02 Sonic Ip, Inc. Control of multimedia content streaming through client-server interactions
US9344517B2 (en) 2013-03-28 2016-05-17 Sonic Ip, Inc. Downloading and adaptive streaming of multimedia content to a device with cache assist
US9094737B2 (en) 2013-05-30 2015-07-28 Sonic Ip, Inc. Network video streaming with trick play based on separate trick play files
US9247317B2 (en) 2013-05-30 2016-01-26 Sonic Ip, Inc. Content streaming with client device trick play index
US20140359678A1 (en) 2013-05-30 2014-12-04 Sonic Ip, Inc. Device video streaming with trick play based on separate trick play files
WO2014193996A2 (en) 2013-05-30 2014-12-04 Sonic Ip, Inc. Network video streaming with trick play based on separate trick play files
US9400670B2 (en) 2013-07-22 2016-07-26 International Business Machines Corporation Network resource management system utilizing physical network identification for load balancing
US10165281B2 (en) 2013-09-06 2018-12-25 Ssimwave Inc. Method and system for objective perceptual video quality assessment
US20150117837A1 (en) 2013-10-31 2015-04-30 Sonic Ip, Inc. Systems and methods for supplementing content at a user device
US9343112B2 (en) 2013-10-31 2016-05-17 Sonic Ip, Inc. Systems and methods for supplementing content from a server
US20150188842A1 (en) 2013-12-31 2015-07-02 Sonic Ip, Inc. Flexible bandwidth allocation in a content distribution network
US20150189017A1 (en) 2013-12-31 2015-07-02 Sonic Ip, Inc. Cooperative nodes in a content distribution network
US20150188921A1 (en) 2013-12-31 2015-07-02 Sonic Ip, Inc. Local distribution node in a content distribution network
US20150189373A1 (en) 2013-12-31 2015-07-02 Sonic Ip, Inc. Efficient channel surfing in a content distribution network
US20150188758A1 (en) 2013-12-31 2015-07-02 Sonic Ip, Inc. Flexible network configuration in a content distribution network

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11886545B2 (en) 2006-03-14 2024-01-30 Divx, Llc Federated digital rights management scheme including trusted systems
US11102553B2 (en) 2009-12-04 2021-08-24 Divx, Llc Systems and methods for secure playback of encrypted elementary bitstreams
US11638033B2 (en) 2011-01-05 2023-04-25 Divx, Llc Systems and methods for performing adaptive bitrate streaming
US10368096B2 (en) 2011-01-05 2019-07-30 Divx, Llc Adaptive streaming systems and methods for performing trick play
US10382785B2 (en) 2011-01-05 2019-08-13 Divx, Llc Systems and methods of encoding trick play streams for use in adaptive streaming
US11457054B2 (en) 2011-08-30 2022-09-27 Divx, Llc Selection of resolutions for seamless resolution switching of multimedia content
US10687095B2 (en) 2011-09-01 2020-06-16 Divx, Llc Systems and methods for saving encoded media streamed using adaptive bitrate streaming
US10225588B2 (en) 2011-09-01 2019-03-05 Divx, Llc Playback devices and methods for playing back alternative streams of content protected using a common set of cryptographic keys
US10856020B2 (en) 2011-09-01 2020-12-01 Divx, Llc Systems and methods for distributing content using a common set of encryption keys
US10244272B2 (en) 2011-09-01 2019-03-26 Divx, Llc Systems and methods for playing back alternative streams of protected content protected using common cryptographic information
US11683542B2 (en) 2011-09-01 2023-06-20 Divx, Llc Systems and methods for distributing content using a common set of encryption keys
US10341698B2 (en) 2011-09-01 2019-07-02 Divx, Llc Systems and methods for distributing content using a common set of encryption keys
US11178435B2 (en) 2011-09-01 2021-11-16 Divx, Llc Systems and methods for saving encoded media streamed using adaptive bitrate streaming
US11438394B2 (en) 2012-12-31 2022-09-06 Divx, Llc Systems, methods, and media for controlling delivery of content
US10805368B2 (en) 2012-12-31 2020-10-13 Divx, Llc Systems, methods, and media for controlling delivery of content
USRE48761E1 (en) 2012-12-31 2021-09-28 Divx, Llc Use of objective quality measures of streamed content to reduce streaming bandwidth
US11785066B2 (en) 2012-12-31 2023-10-10 Divx, Llc Systems, methods, and media for controlling delivery of content
US10715806B2 (en) 2013-03-15 2020-07-14 Divx, Llc Systems, methods, and media for transcoding video data
US11849112B2 (en) 2013-03-15 2023-12-19 Divx, Llc Systems, methods, and media for distributed transcoding video data
US11470405B2 (en) 2013-05-30 2022-10-11 Divx, Llc Network video streaming with trick play based on separate trick play files
US11711552B2 (en) 2014-04-05 2023-07-25 Divx, Llc Systems and methods for encoding and playing back video at different frame rates using enhancement layers
US10893305B2 (en) 2014-04-05 2021-01-12 Divx, Llc Systems and methods for encoding and playing back video at different frame rates using enhancement layers

Also Published As

Publication number Publication date
US20150006662A1 (en) 2015-01-01
US9967305B2 (en) 2018-05-08

Similar Documents

Publication Publication Date Title
US20180332094A1 (en) Systems, Methods, and Media for Streaming Media Content
US9503765B2 (en) Averting ad skipping in adaptive bit rate systems
US9351020B2 (en) On the fly transcoding of video on demand content for adaptive streaming
US10284615B2 (en) Enhanced playlist definition and delivery for fast channel change with HTTP adaptive streaming
US8548303B2 (en) Reconciling digital content at a digital media device
US10291681B2 (en) Directory limit based system and method for storing media segments
EP3091711B1 (en) Content-specific identification and timing behavior in dynamic adaptive streaming over hypertext transfer protocol
CA2848262C (en) Filtering content for adaptive streaming
US9223944B2 (en) Media rights management on multiple devices
US10277927B2 (en) Movie package file format
US20140064711A1 (en) Systems, Methods, and Media for Presenting Media Content Using Cached Assets
US20100098153A1 (en) System and Method to Record Encoded Video Data
JP2015534312A (en) Control during rendering
US9924239B2 (en) Video on demand over satellite
US10694241B2 (en) Capturing border metadata while recording content

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: SONIC IP, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BRANESS, JASON A.;REEL/FRAME:048257/0446

Effective date: 20140307

Owner name: DIVX CF HOLDINGS LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SONIC IP, INC.;REEL/FRAME:048257/0458

Effective date: 20180212

Owner name: DIVX, LLC, CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:DIVX CF HOLDINGS LLC;REEL/FRAME:048257/0477

Effective date: 20180212

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION