US20120307883A1 - Multi-Instance Video Encoder - Google Patents

Multi-Instance Video Encoder

Info

Publication number
US20120307883A1
Authority
US
United States
Prior art keywords
video
video data
processors
components
individual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/585,421
Inventor
Hans W. Graves
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US13/585,421
Publication of US20120307883A1
Legal status: Abandoned (current)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434 Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4341 Demultiplexing of audio and video streams
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/236 Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/2368 Multiplexing of audio and video streams
    • H04N21/241 Operating system [OS] processes, e.g. server setup


Abstract

A system and method are disclosed for providing improved processing of video data. A multi-instance encoding module receives combined video and audio input, which is then separated into video and audio source streams. The video source stream is pre-processed and corresponding video encoder instances are initiated. The preprocessed video source stream is split into video data components, which are assigned to a corresponding encoder instance. Encoding operations are performed by each video encoder instance to generate video output components. The video output components are then assembled in a predetermined sequence to generate an encoded video output stream. Concurrently, the audio source stream is encoded with an audio encoder to generate an encoded audio output stream. The encoded video and audio output streams are combined to generate a combined encoded output stream, which is provided as combined video and audio output.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • Embodiments of the invention relate generally to information processing systems. More specifically, embodiments of the invention provide a system and method for providing improved processing of video data.
  • 2. Description of the Related Art
  • The resolution of graphical and video formats has continued to improve in recent years. As an example, the current high-definition video standard supports resolutions of 1920×1080 pixels, resulting in over two million pixels of video data per frame. The need to process these large volumes of video data, which includes encoding operations such as compression, transcoding, and scaling, has driven the development of more powerful processors.
  • However, it is not uncommon for the full power of these processors to be underutilized. In some cases, a stream of video data is processed by a single processor of a multi-processor central processing unit, even though the remaining processors are idle. In other cases, the video data stream is not off-loaded to an available display controller that may include one or more specialized graphics processors. In an effort to address these issues, a number of current video encoder products (e.g., DIVX, Microsoft's Windows Media Video, etc.) provide a means to distribute the processing of video data across multiple processors. However, since many such providers fail to publish supporting documentation, it is often difficult to fully utilize the capabilities they provide.
  • Another issue is scalability. Some known approaches are optimized for real-time streaming, meaning the encoder cannot look ahead in time; as a result, scaling linearly across multiple processors becomes challenging. Other approaches segment a video stream, process each segment as a single thread, and then merge the processed segments into a resulting output stream. Each of these approaches has attendant advantages and disadvantages. In view of the foregoing, there is a need to more fully utilize the resources provided by multiple processors to improve the processing of video data.
  • SUMMARY OF THE INVENTION
  • A system and method are disclosed for providing improved processing of video data. In various embodiments, a multi-instance encoding module receives combined video and audio input, which is then separated into a video source stream and an audio source stream. In these and other embodiments, video encoder instances are initiated and associated with a corresponding processor operable to perform encoding operations. In one embodiment, the processor is a central processor of a central processing unit. In another embodiment, the processor is a graphics processor of a display controller.
  • Once the video encoder instances are initiated, the preprocessed video source stream is split into video data components such as a frame or group of frames. The individual video data components are then assigned to a corresponding encoder instance. In one embodiment, the assignment of the individual video data components is load balanced equally across multiple processors. In another embodiment, the assignment of the individual video data components is load balanced dynamically across multiple processors.
  • Encoding operations are then performed by each video encoder instance to generate video output components. In various embodiments, the encoding operations may comprise the transcoding, compression, decompression, or scaling of video data components to generate the video output components. The video output components are then assembled in a predetermined sequence to generate an encoded video output stream. In one embodiment, the audio source stream is encoded with an audio encoder to generate an encoded audio output stream. The encoded video output stream and the encoded audio output stream are combined to generate a combined encoded output stream, which is then provided as combined video and audio output.
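  • The following is a minimal, hedged sketch of the flow summarized above, written in Python for illustration only: a source is split into groups of frames, each group is encoded by a separate instance running on its own processor, and the outputs are reassembled in their original order. The byte-joining "encoder", the function names, and the use of multiprocessing.Pool are illustrative assumptions, not the patented implementation.

```python
from multiprocessing import Pool, cpu_count

def encode_component(indexed_component):
    """Stand-in for one encoder instance: 'encode' one group of frames."""
    index, frames = indexed_component
    encoded = b"".join(frames)  # a real instance would transcode, compress, or scale here
    return index, encoded

def encode_video(components):
    """Fan components out to one encoder instance per processor, then reassemble in order."""
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(encode_component, list(enumerate(components)))
    # Assemble the video output components in their predetermined sequence.
    return b"".join(data for _, data in sorted(results))

if __name__ == "__main__":
    groups_of_frames = [[b"f0", b"f1"], [b"f2", b"f3"], [b"f4", b"f5"]]
    print(encode_video(groups_of_frames))
```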
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention may be better understood, and its numerous objects, features and advantages made apparent to those skilled in the art by referencing the accompanying drawings. The use of the same reference number throughout the several figures designates a like or similar element.
  • FIG. 1 is a generalized block diagram illustrating an information processing system as implemented in accordance with an embodiment of the invention;
  • FIG. 2 is a simplified block diagram of a multi-instance encoding module as implemented in accordance with an embodiment of the invention; and
  • FIG. 3 is a generalized flowchart of the operation of a multi-instance encoding module as implemented in accordance with an embodiment of the invention.
  • DETAILED DESCRIPTION
  • A system and method are disclosed for providing improved processing of video data. FIG. 1 is a generalized block diagram illustrating an information processing system 100 as implemented in accordance with an embodiment of the invention. System 100 comprises a real-time clock 102, a power management module 104, a central processing unit 106, and memory 110, all physically coupled via bus 140. In various embodiments, the central processing unit 106 comprises at least one central processor 144, and the memory 110 comprises volatile random access memory (RAM), non-volatile read-only memory (ROM), non-volatile flash memory, or any combination thereof. In one embodiment, memory 110 also comprises communications stack 142, a multi-instance encoding module 150, and other applications 154.
  • Also physically coupled to bus 140 is an input/output (I/O) controller 112, further coupled to a plurality of I/O ports 114. In different embodiments, I/O port 114 may comprise a keyboard port, a mouse port, a parallel communications port, an RS-232 serial communications port, a gaming port, a universal serial bus (USB) port, an IEEE1394 (Firewire) port, or any combination thereof. Display controller 116 is likewise physically coupled to bus 140 and further coupled to display 118. In various embodiments, display controller 116 comprises at least one graphics processor 146. In one embodiment, display 118 is separately coupled, such as a stand-alone, flat panel video monitor. In another embodiment, display 118 is directly coupled, such as a laptop computer screen, a tablet PC screen, or the screen of a personal digital assistant (PDA). Likewise physically coupled to bus 140 is storage controller 120, which is further coupled to mass storage devices such as a tape drive or hard disk 124. A peripheral device controller is also physically coupled to bus 140 and further coupled to peripheral device 128, such as a redundant array of independent disks (RAID) or a storage area network (SAN).
  • In one embodiment, communications controller 130 is physically coupled to bus 140 and is further coupled to network port 132, which in turn couples the information processing system 100 to one or more physical networks 134, such as a local area network (LAN) based on the Ethernet standard. In other embodiments, network port 132 may comprise a digital subscriber line (DSL) modem, cable modem, or other broadband communications system operable to connect the information processing system 100 to network 134. In these embodiments, network 134 may comprise the public switched telephone network (PSTN), the public Internet, a corporate intranet, a virtual private network (VPN), or any combination of telecommunication technologies and protocols operable to establish a network connection for the exchange of information.
  • In another embodiment, communications controller 130 is likewise physically coupled to bus 140 and is further coupled to wireless modem 136, which in turn couples the information processing system 100 to one or more wireless networks 138. In one embodiment, wireless network 138 comprises a personal area network (PAN), based on technologies such as Bluetooth or Ultra Wideband (UWB). In another embodiment, wireless network 138 comprises a wireless local area network (WLAN), based on variations of the IEEE 802.11 specification, often referred to as WiFi. In yet another embodiment, wireless network 138 comprises a wireless wide area network (WWAN) based on an industry standard including two and a half generation (2.5G) wireless technologies such as general packet radio service (GPRS) and enhanced data rates for GSM evolution (EDGE). In other embodiments, wireless network 138 comprises WWANs based on existing third generation (3G) wireless technologies including universal mobile telecommunications system (UMTS) and wideband code division multiple access (W-CDMA). Other embodiments also comprise the implementation of other 3G technologies, including evolution-data optimized (EVDO), IEEE 802.16 (WiMAX), wireless broadband (WiBro), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), and emerging fourth generation (4G) wireless technologies.
  • FIG. 2 is a simplified block diagram of a multi-instance encoding module 150 as implemented in accordance with an embodiment of the invention to improve the processing of video data. In this embodiment, resources available for the performance of multi-instance video encoding operations are determined, followed by the receipt of combined video and audio input 206 from an application plug-in 202. As used herein, an application plug-in refers to computer executable code that interacts with a host application to provide a function, such as the encoding of video data, on demand.
  • The combined video and audio input 206 is separated into a video source stream 208 and an audio source stream 210. In one embodiment, the video source stream 208 is then pre-processed with a video preprocessor 212 familiar to those of skill in the art and provided to a multi-instance wrapper 214. In various embodiments, the multi-instance wrapper comprises a stream splitter 216, a plurality of video encoder instances ‘1’ 218 through ‘n’ 220, and a stream merge sequencer 222. The video encoder instances ‘1’ 218 through ‘n’ 220 are initiated corresponding to the preprocessed video source stream and resources available for encoding video data. In one embodiment, the individual video encoder instances ‘1’ 218 through ‘n’ 220 are associated with a corresponding processor. In another embodiment, the processor is a central processor of a central processing unit. In yet another embodiment, the processor is a graphics processor of a display controller. In various embodiments, these processors are operable to perform encoding operations.
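  • As a hedged illustration of how the number of encoder instances might follow the resources available for encoding, the sketch below simply counts the processors the operating system reports and creates one instance identifier per processor; querying a display controller for its graphics processors is platform-specific and is only stubbed out. All names here are assumptions, not taken from the patent.

```python
import os

def available_encoder_slots():
    """Count processors that could each host a video encoder instance."""
    cpu_slots = os.cpu_count() or 1
    gpu_slots = 0  # a real module would query the display controller for graphics processors
    return cpu_slots + gpu_slots

def initiate_encoder_instances(slot_count):
    """Create one encoder-instance identifier per available processor."""
    return ["encoder-{}".format(i) for i in range(slot_count)]

if __name__ == "__main__":
    print(initiate_encoder_instances(available_encoder_slots()))
```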
  • Once the video encoder instances ‘1’ 218 through ‘n’ 220 are initiated, the preprocessed video source stream is split into video data components by the stream splitter 216. As used herein, a video data component refers to a unit of video data, such as a frame, a group of frames, or a field. Skilled practitioners of the art will be knowledgeable of many such units of video data, and references to the foregoing are not intended to limit the spirit, scope, or intent of the invention. The individual video data components are then assigned to a corresponding encoder instance ‘1’ 218 through ‘n’ 220. In one embodiment, the assignment of the individual video data components is load balanced equally across multiple processors as described in greater detail herein. In another embodiment, the assignment of the individual video data components is load balanced dynamically across multiple processors as likewise described in greater detail herein.
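  • A minimal sketch of the stream-splitting step, assuming groups of frames as the video data component: each component is tagged with its position in the source so the merge sequencer can later restore the predetermined order. The component size of 30 frames is an arbitrary illustrative choice.

```python
def split_into_components(frames, frames_per_component=30):
    """Cut a preprocessed frame sequence into indexed groups of frames."""
    return [
        (index, frames[start:start + frames_per_component])
        for index, start in enumerate(range(0, len(frames), frames_per_component))
    ]

if __name__ == "__main__":
    components = split_into_components(list(range(95)))
    print([(index, len(group)) for index, group in components])  # [(0, 30), (1, 30), (2, 30), (3, 5)]
```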
  • Encoding operations are then performed by each video encoder instance ‘1’ 218 through ‘n’ 220 to generate video output components. In various embodiments, the encoding operations may comprise the transcoding, compression, or scaling of video data components to generate the video output components. The video output components are then assembled in a predetermined sequence by the stream merge sequencer 222 to generate an encoded video output stream. In one embodiment, the audio source stream 210 is encoded with an audio encoder 224 to generate an encoded audio output stream. In this embodiment, the encoded video output stream and the encoded audio output stream are multiplexed by the stream multiplexer 226 to generate a combined encoded output stream 230. The combined encoded output stream 230 is then provided to the application plug-in 202 as combined video and audio output 232.
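  • The merge-and-multiplex stage might be sketched as follows, assuming encoded components arrive from the encoder instances as (index, bytes) pairs in arbitrary order; the "multiplexer" here is a trivial pairing that stands in for a real container multiplexer.

```python
def merge_components(encoded_components):
    """Reassemble (index, bytes) pairs from the encoder instances in their original order."""
    return b"".join(data for _, data in sorted(encoded_components))

def multiplex(encoded_video, encoded_audio):
    """Pair the two encoded streams; a real multiplexer would write a container format."""
    return {"video": encoded_video, "audio": encoded_audio}

if __name__ == "__main__":
    out_of_order = [(2, b"C"), (0, b"A"), (1, b"B")]
    print(multiplex(merge_components(out_of_order), b"encoded-audio"))
```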
  • FIG. 3 is a generalized flow chart of the operation of a multi-instance encoding module as implemented in accordance with an embodiment of the invention. In various embodiments, a multi-instance encoding module is implemented to improve the processing of video data. In this embodiment, multi-instance video encoding operations are begun in step 302, followed by the determination in step 304 of resources available for the performance of multi-instance video encoding operations. In one embodiment, combined video and audio input is received in step 306 from an application plug-in. In another embodiment, video input and audio input are received in step 306 as separate streams from an application plug-in. As used herein, an application plug-in refers to computer executable code that interacts with a host application to provide a function, such as the encoding of video data, on demand. Known application plug-ins typically rely upon services provided by the host application and generally are unable to operate independently. Conversely, the host application is able to operate independently of an application plug-in, allowing plug-ins to be dynamically added, updated, or removed without a corresponding modification to the host application. In addition, plug-ins generally rely upon the host application's user interface and have well defined boundaries to their allowed actions.
  • In one embodiment, the combined video and audio input is separated into a video source stream and an audio source stream in step 308. In another embodiment, the video input and audio input streams are already separate and are treated as a video source stream and an audio source stream in step 308. In one embodiment, the video source stream is then pre-processed in step 310 with a video preprocessor familiar to those of skill in the art. Instances of a video encoder are then initiated in step 312, corresponding to the preprocessed video source stream and resources available for encoding video data. In one embodiment, individual video encoder instances are associated with a corresponding processor. In another embodiment, the processor is a central processor of a central processing unit. In yet another embodiment, the processor is a graphics processor of a display controller. In various embodiments, these processors are operable to perform encoding operations.
  • Once the instances of the video encoder are initiated, the preprocessed video source stream is split into video data components in step 314. As used herein, a video data component refers to a unit of video data, such as a frame, a group of frames, or a field. Skilled practitioners of the art will be knowledgeable of many such units of video data, and references to the foregoing are not intended to limit the spirit, scope, or intent of the invention. The individual video data components are then assigned to a corresponding encoder instance in step 316. In one embodiment, the assignment of the individual video data components is load balanced equally across multiple processors. As an example, an information processing system may comprise a quad-core processor, each core of which is associated with a corresponding instance of a video encoder. Each video data component is sequentially assigned to the first, second, third, and fourth video encoder instance, which are in turn respectively associated with a first, second, third, and fourth processor. In another embodiment, the assignment of the individual video data components is load balanced dynamically across multiple processors. As an example, an information processing system may comprise a quad-core processor, each core of which is associated with a corresponding instance of a video encoder. Each video data component is dynamically assigned to the first, second, third, and fourth video encoder instance as their respectively associated first, second, third, and fourth processor becomes available.
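  • The two balancing strategies described in this example can be sketched as follows for a four-processor system: round-robin assignment distributes components to the instances in a fixed rotation, while dynamic assignment places components on a shared queue from which each instance pulls as soon as its processor becomes available. The worker identifiers and the use of a queue are illustrative assumptions, not the patent's implementation.

```python
from itertools import cycle
from queue import Queue

def round_robin_assign(components, instance_ids):
    """Equal load balancing: hand components to the encoder instances in fixed rotation."""
    assignment = {instance: [] for instance in instance_ids}
    for component, instance in zip(components, cycle(instance_ids)):
        assignment[instance].append(component)
    return assignment

def dynamic_work_queue(components):
    """Dynamic load balancing: instances pull the next component as their processor frees up."""
    work_queue = Queue()
    for component in components:
        work_queue.put(component)
    return work_queue

if __name__ == "__main__":
    print(round_robin_assign(range(8), ["cpu0", "cpu1", "cpu2", "cpu3"]))
```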
  • Encoding operations are then performed in step 318 by each video encoder instance to generate video output components. In various embodiments, the encoding operations may comprise the transcoding, compression, decompression, or scaling of video data components to generate the video output components. The video output components are then assembled in a predetermined sequence in step 320 to generate an encoded video output stream. In one embodiment, concurrent with the performance of steps 310 through 320, the audio source stream is encoded with an audio encoder in step 322 to generate an encoded audio output stream. In one embodiment, the encoded video output stream and the encoded audio output stream are multiplexed to generate combined video and audio output in step 324. The combined video and audio output is then provided to the application plug-in in step 326, and multi-instance encoding operations are then ended in step 328.
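  • As a hedged sketch of the concurrency noted above, the audio encoder can run alongside the video pipeline (steps 310 through 320) on a separate thread, with the two encoded streams combined once both finish; the trivial "encoders" and thread structure below are illustrative assumptions, not the patent's implementation.

```python
import threading

def encode_streams(video_source, audio_source):
    """Run the video pipeline and the audio encoder concurrently, then pair the results."""
    results = {}

    def encode_video_stream():
        results["video"] = b"".join(video_source)  # stands in for steps 310 through 320

    def encode_audio_stream():
        results["audio"] = audio_source.upper()  # stands in for step 322

    threads = [threading.Thread(target=encode_video_stream),
               threading.Thread(target=encode_audio_stream)]
    for thread in threads:
        thread.start()
    for thread in threads:
        thread.join()
    return results["video"], results["audio"]  # step 324: multiplex the two streams

if __name__ == "__main__":
    print(encode_streams([b"v0", b"v1"], b"audio"))
```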
  • Skilled practitioners in the art will recognize that many other embodiments and variations of the present invention are possible. In addition, each of the referenced components in this embodiment of the invention may be comprised of a plurality of components, each interacting with the other in a distributed environment. Furthermore, other embodiments of the invention may expand on the referenced embodiment to extend the scale and reach of the system's implementation.

Claims (25)

1-20. (canceled)
21. A system for providing improved processing of video data, comprising:
an encoder module operable to:
receive a first stream of video data as video input;
split said first stream of video data into a plurality of video data components;
assign individual video data components of said plurality of video data components to a plurality of processors based on loads of the plurality of processors, the plurality of processors being operable to generate video output components from said individual video data components;
assemble said video output components in a predetermined sequence to generate a second stream of video data; and
provide said second stream of video data as video output.
22. The system of claim 21, wherein said individual video data components are transcoded to generate said video output components.
23. The system of claim 21, wherein said individual video data components are compressed to generate said video output components.
24. The system of claim 21, wherein said individual video data components are decompressed to generate said video output components.
25. The system of claim 21, wherein said individual video data components are scaled to generate said video output components.
26. The system of claim 21, wherein said encoder module comprises a plurality of video encoder instances.
27. The system of claim 26, wherein an individual one of said video encoder instances is associated with an individual processor of said plurality of processors.
28. The system of claim 21, wherein an individual processor in said plurality of processors comprises a core of a central processing unit.
29. The system of claim 21, wherein an individual processor in said plurality of processors comprises a graphics processor.
30. The system of claim 21, wherein said video output components are generated in parallel by said plurality of processors.
31. The system of claim 21, wherein said individual video data components are assigned to said plurality of processors based on loads of the plurality of processors by assigning said individual video data components sequentially to said plurality of processors.
32. The system of claim 21, wherein said individual video data components are assigned to said plurality of processors based on loads of the plurality of processors by assigning said individual video data components dynamically to said plurality of processors as respective processors in said plurality of processors become available.
33. A method for providing improved processing of video data, comprising:
receiving a first stream of video data as video input;
splitting said first stream of video data into a plurality of video data components;
assigning individual video data components of said plurality of video data components to a plurality of processors based on loads of the plurality of processors, the plurality of processors being operable to generate video output components from said individual video data components;
assembling said video output components in a predetermined sequence to generate a second stream of video data; and
providing said second stream of video data as video output.
34. The method of claim 33, wherein generating said video output components includes transcoding said individual video data to generate said video output components.
35. The method of claim 33, wherein generating said video output components includes compressing said individual video data to generate said video output components.
36. The method of claim 33, wherein generating said video output components includes decompressing said individual video data to generate said video output components.
37. The method of claim 33, wherein generating said video output components includes scaling said individual video data to generate said video output components.
38. The method of claim 33, wherein said encoder module comprises a plurality of video encoder instances.
39. The method of claim 38, wherein an individual one of said video encoder instances is associated with an individual processor of said plurality of processors.
40. The method of claim 33, wherein an individual processor in said plurality of processors comprises a core of a central processing unit.
41. The method of claim 33, wherein an individual processor in said plurality of processors comprises a graphics processor.
42. The method of claim 33, wherein said video output components are generated in parallel by said plurality of processors.
43. The method of claim 33, wherein assigning said individual video data components to said plurality of processors based on loads of the plurality of processors includes assigning said individual video data components sequentially to said plurality of processors.
44. The method of claim 33, wherein assigning said individual video data components to said plurality of processors based on loads of the plurality of processors includes assigning said individual video data components dynamically to said plurality of processors as respective processors in said plurality of processors become available.
US13/585,421 2008-11-06 2012-08-14 Multi-Instance Video Encoder Abandoned US20120307883A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/585,421 US20120307883A1 (en) 2008-11-06 2012-08-14 Multi-Instance Video Encoder

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/266,155 US8249168B2 (en) 2008-11-06 2008-11-06 Multi-instance video encoder
US13/585,421 US20120307883A1 (en) 2008-11-06 2012-08-14 Multi-Instance Video Encoder

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/266,155 Continuation US8249168B2 (en) 2008-11-06 2008-11-06 Multi-instance video encoder

Publications (1)

Publication Number Publication Date
US20120307883A1 true US20120307883A1 (en) 2012-12-06

Family

ID=42131369

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/266,155 Active 2031-05-29 US8249168B2 (en) 2008-11-06 2008-11-06 Multi-instance video encoder
US13/585,421 Abandoned US20120307883A1 (en) 2008-11-06 2012-08-14 Multi-Instance Video Encoder

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/266,155 Active 2031-05-29 US8249168B2 (en) 2008-11-06 2008-11-06 Multi-instance video encoder

Country Status (1)

Country Link
US (2) US8249168B2 (en)

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8649669B2 (en) 2011-01-05 2014-02-11 Sonic Ip, Inc. Systems and methods for performing smooth visual search of media encoded for adaptive bitrate streaming via hypertext transfer protocol using trick play streams
US8909922B2 (en) 2011-09-01 2014-12-09 Sonic Ip, Inc. Systems and methods for playing back alternative streams of protected content protected using common cryptographic information
US8914836B2 (en) 2012-09-28 2014-12-16 Sonic Ip, Inc. Systems, methods, and computer program products for load adaptive streaming
US8918908B2 (en) 2012-01-06 2014-12-23 Sonic Ip, Inc. Systems and methods for accessing digital content using electronic tickets and ticket tokens
US8997254B2 (en) 2012-09-28 2015-03-31 Sonic Ip, Inc. Systems and methods for fast startup streaming of encrypted multimedia content
US8997161B2 (en) 2008-01-02 2015-03-31 Sonic Ip, Inc. Application enhancement tracks
US9094737B2 (en) 2013-05-30 2015-07-28 Sonic Ip, Inc. Network video streaming with trick play based on separate trick play files
US9124773B2 (en) 2009-12-04 2015-09-01 Sonic Ip, Inc. Elementary bitstream cryptographic material transport systems and methods
US9143812B2 (en) 2012-06-29 2015-09-22 Sonic Ip, Inc. Adaptive streaming of multimedia
US9184920B2 (en) 2006-03-14 2015-11-10 Sonic Ip, Inc. Federated digital rights management scheme including trusted systems
US9191457B2 (en) 2012-12-31 2015-11-17 Sonic Ip, Inc. Systems, methods, and media for controlling delivery of content
US9197685B2 (en) 2012-06-28 2015-11-24 Sonic Ip, Inc. Systems and methods for fast video startup using trick play streams
US9201922B2 (en) 2009-01-07 2015-12-01 Sonic Ip, Inc. Singular, collective and automated creation of a media guide for online content
US9247317B2 (en) 2013-05-30 2016-01-26 Sonic Ip, Inc. Content streaming with client device trick play index
US9264475B2 (en) 2012-12-31 2016-02-16 Sonic Ip, Inc. Use of objective quality measures of streamed content to reduce streaming bandwidth
US9313510B2 (en) 2012-12-31 2016-04-12 Sonic Ip, Inc. Use of objective quality measures of streamed content to reduce streaming bandwidth
US9344517B2 (en) 2013-03-28 2016-05-17 Sonic Ip, Inc. Downloading and adaptive streaming of multimedia content to a device with cache assist
US9343112B2 (en) 2013-10-31 2016-05-17 Sonic Ip, Inc. Systems and methods for supplementing content from a server
US9350990B2 (en) 2013-02-28 2016-05-24 Sonic Ip, Inc. Systems and methods of encoding multiple video streams with adaptive quantization for adaptive bitrate streaming
US9357210B2 (en) 2013-02-28 2016-05-31 Sonic Ip, Inc. Systems and methods of encoding multiple video streams for adaptive bitrate streaming
US9369687B2 (en) 2003-12-08 2016-06-14 Sonic Ip, Inc. Multimedia distribution system for multimedia files with interleaved media chunks of varying types
CN105873187A (en) * 2015-01-21 2016-08-17 中兴通讯股份有限公司 Method and device for issuing indication information
US9532080B2 (en) 2012-05-31 2016-12-27 Sonic Ip, Inc. Systems and methods for the reuse of encoding information in encoding alternative streams of video data
US9866878B2 (en) 2014-04-05 2018-01-09 Sonic Ip, Inc. Systems and methods for encoding and playing back video at different frame rates using enhancement layers
US9906785B2 (en) 2013-03-15 2018-02-27 Sonic Ip, Inc. Systems, methods, and media for transcoding video data according to encoding parameters indicated by received metadata
US9967305B2 (en) 2013-06-28 2018-05-08 Divx, Llc Systems, methods, and media for streaming media content
US10032485B2 (en) 2003-12-08 2018-07-24 Divx, Llc Multimedia distribution system
US10148989B2 (en) 2016-06-15 2018-12-04 Divx, Llc Systems and methods for encoding video content
US10397292B2 (en) 2013-03-15 2019-08-27 Divx, Llc Systems, methods, and media for delivery of content
US10452715B2 (en) 2012-06-30 2019-10-22 Divx, Llc Systems and methods for compressing geotagged video
US10498795B2 (en) 2017-02-17 2019-12-03 Divx, Llc Systems and methods for adaptive switching between multiple content delivery networks during adaptive bitrate streaming
US10591984B2 (en) 2012-07-18 2020-03-17 Verimatrix, Inc. Systems and methods for rapid content switching to provide a linear TV experience using streaming content distribution
US10687095B2 (en) 2011-09-01 2020-06-16 Divx, Llc Systems and methods for saving encoded media streamed using adaptive bitrate streaming
US10708587B2 (en) 2011-08-30 2020-07-07 Divx, Llc Systems and methods for encoding alternative streams of video for playback on playback devices having predetermined display aspect ratios and network connection maximum data rates
US10721285B2 (en) 2016-03-30 2020-07-21 Divx, Llc Systems and methods for quick start-up of playback
US10902883B2 (en) 2007-11-16 2021-01-26 Divx, Llc Systems and methods for playing back multimedia files incorporating reduced index structures
US10931982B2 (en) 2011-08-30 2021-02-23 Divx, Llc Systems and methods for encoding and streaming video encoded using a plurality of maximum bitrate levels
US11457054B2 (en) 2011-08-30 2022-09-27 Divx, Llc Selection of resolutions for seamless resolution switching of multimedia content

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2840677C (en) 2009-02-03 2020-02-25 Robert N. Clausi Sound attenuating laminate materials
US8587653B1 (en) * 2009-04-30 2013-11-19 Verint Systems, Inc. Modifying the resolution of video before transferring to a display system
US10249305B2 (en) * 2016-05-19 2019-04-02 Microsoft Technology Licensing, Llc Permutation invariant training for talker-independent multi-talker speech separation
US10957337B2 (en) 2018-04-11 2021-03-23 Microsoft Technology Licensing, Llc Multi-microphone speech separation

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020120741A1 (en) * 2000-03-03 2002-08-29 Webb Theodore S. Systems and methods for using distributed interconnects in information management enviroments
US20050018915A1 (en) * 2003-06-16 2005-01-27 Sony Corporation Image processing apparatus and image processing method
US20080162713A1 (en) * 2006-12-27 2008-07-03 Microsoft Corporation Media stream slicing and processing load allocation for multi-user media systems

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030023982A1 (en) * 2001-05-18 2003-01-30 Tsu-Chang Lee Scalable video encoding/storage/distribution/decoding for symmetrical multiple video processors
US8054880B2 (en) * 2004-12-10 2011-11-08 Tut Systems, Inc. Parallel rate control for digital video encoder with multi-processor architecture and picture-based look-ahead window
US7634776B2 (en) * 2004-05-13 2009-12-15 Ittiam Systems (P) Ltd. Multi-threaded processing design in architecture with multiple co-processors
JP2008527945A (en) * 2005-01-19 2008-07-24 トムソン ライセンシング Method and apparatus for real-time parallel encoding
US20070086528A1 (en) * 2005-10-18 2007-04-19 Mauchly J W Video encoder with multiple processors
US8121197B2 (en) * 2007-11-13 2012-02-21 Elemental Technologies, Inc. Video encoding and decoding using parallel processors

Cited By (97)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9369687B2 (en) 2003-12-08 2016-06-14 Sonic Ip, Inc. Multimedia distribution system for multimedia files with interleaved media chunks of varying types
US11735227B2 (en) 2003-12-08 2023-08-22 Divx, Llc Multimedia distribution system
US11735228B2 (en) 2003-12-08 2023-08-22 Divx, Llc Multimedia distribution system
US11012641B2 (en) 2003-12-08 2021-05-18 Divx, Llc Multimedia distribution system for multimedia files with interleaved media chunks of varying types
US11017816B2 (en) 2003-12-08 2021-05-25 Divx, Llc Multimedia distribution system
US11509839B2 (en) 2003-12-08 2022-11-22 Divx, Llc Multimedia distribution system for multimedia files with packed frames
US10032485B2 (en) 2003-12-08 2018-07-24 Divx, Llc Multimedia distribution system
US10257443B2 (en) 2003-12-08 2019-04-09 Divx, Llc Multimedia distribution system for multimedia files with interleaved media chunks of varying types
US11159746B2 (en) 2003-12-08 2021-10-26 Divx, Llc Multimedia distribution system for multimedia files with packed frames
US11355159B2 (en) 2003-12-08 2022-06-07 Divx, Llc Multimedia distribution system
US11297263B2 (en) 2003-12-08 2022-04-05 Divx, Llc Multimedia distribution system for multimedia files with packed frames
US9184920B2 (en) 2006-03-14 2015-11-10 Sonic Ip, Inc. Federated digital rights management scheme including trusted systems
US11886545B2 (en) 2006-03-14 2024-01-30 Divx, Llc Federated digital rights management scheme including trusted systems
US10878065B2 (en) 2006-03-14 2020-12-29 Divx, Llc Federated digital rights management scheme including trusted systems
US9798863B2 (en) 2006-03-14 2017-10-24 Sonic Ip, Inc. Federated digital rights management scheme including trusted systems
US11495266B2 (en) 2007-11-16 2022-11-08 Divx, Llc Systems and methods for playing back multimedia files incorporating reduced index structures
US10902883B2 (en) 2007-11-16 2021-01-26 Divx, Llc Systems and methods for playing back multimedia files incorporating reduced index structures
US8997161B2 (en) 2008-01-02 2015-03-31 Sonic Ip, Inc. Application enhancement tracks
US9201922B2 (en) 2009-01-07 2015-12-01 Sonic Ip, Inc. Singular, collective and automated creation of a media guide for online content
US10437896B2 (en) 2009-01-07 2019-10-08 Divx, Llc Singular, collective, and automated creation of a media guide for online content
US9672286B2 (en) 2009-01-07 2017-06-06 Sonic Ip, Inc. Singular, collective and automated creation of a media guide for online content
US10484749B2 (en) 2009-12-04 2019-11-19 Divx, Llc Systems and methods for secure playback of encrypted elementary bitstreams
US11102553B2 (en) 2009-12-04 2021-08-24 Divx, Llc Systems and methods for secure playback of encrypted elementary bitstreams
US10212486B2 (en) 2009-12-04 2019-02-19 Divx, Llc Elementary bitstream cryptographic material transport systems and methods
US9124773B2 (en) 2009-12-04 2015-09-01 Sonic Ip, Inc. Elementary bitstream cryptographic material transport systems and methods
US9706259B2 (en) 2009-12-04 2017-07-11 Sonic Ip, Inc. Elementary bitstream cryptographic material transport systems and methods
US10382785B2 (en) 2011-01-05 2019-08-13 Divx, Llc Systems and methods of encoding trick play streams for use in adaptive streaming
US9247312B2 (en) 2011-01-05 2016-01-26 Sonic Ip, Inc. Systems and methods for encoding source media in matroska container files for adaptive bitrate streaming using hypertext transfer protocol
US11638033B2 (en) 2011-01-05 2023-04-25 Divx, Llc Systems and methods for performing adaptive bitrate streaming
US9025659B2 (en) 2011-01-05 2015-05-05 Sonic Ip, Inc. Systems and methods for encoding media including subtitles for adaptive bitrate streaming
US8649669B2 (en) 2011-01-05 2014-02-11 Sonic Ip, Inc. Systems and methods for performing smooth visual search of media encoded for adaptive bitrate streaming via hypertext transfer protocol using trick play streams
US10368096B2 (en) 2011-01-05 2019-07-30 Divx, Llc Adaptive streaming systems and methods for performing trick play
US9210481B2 (en) 2011-01-05 2015-12-08 Sonic Ip, Inc. Systems and methods for performing smooth visual search of media encoded for adaptive bitrate streaming via hypertext transfer protocol using trick play streams
US9883204B2 (en) 2011-01-05 2018-01-30 Sonic Ip, Inc. Systems and methods for encoding source media in matroska container files for adaptive bitrate streaming using hypertext transfer protocol
US8914534B2 (en) 2011-01-05 2014-12-16 Sonic Ip, Inc. Systems and methods for adaptive bitrate streaming of media stored in matroska container files using hypertext transfer protocol
US10931982B2 (en) 2011-08-30 2021-02-23 Divx, Llc Systems and methods for encoding and streaming video encoded using a plurality of maximum bitrate levels
US11611785B2 (en) 2011-08-30 2023-03-21 Divx, Llc Systems and methods for encoding and streaming video encoded using a plurality of maximum bitrate levels
US10708587B2 (en) 2011-08-30 2020-07-07 Divx, Llc Systems and methods for encoding alternative streams of video for playback on playback devices having predetermined display aspect ratios and network connection maximum data rates
US11457054B2 (en) 2011-08-30 2022-09-27 Divx, Llc Selection of resolutions for seamless resolution switching of multimedia content
US10225588B2 (en) 2011-09-01 2019-03-05 Divx, Llc Playback devices and methods for playing back alternative streams of content protected using a common set of cryptographic keys
US10341698B2 (en) 2011-09-01 2019-07-02 Divx, Llc Systems and methods for distributing content using a common set of encryption keys
US10856020B2 (en) 2011-09-01 2020-12-01 Divx, Llc Systems and methods for distributing content using a common set of encryption keys
US10687095B2 (en) 2011-09-01 2020-06-16 Divx, Llc Systems and methods for saving encoded media streamed using adaptive bitrate streaming
US8918636B2 (en) 2011-09-01 2014-12-23 Sonic Ip, Inc. Systems and methods for protecting alternative streams in adaptive bitrate streaming systems
US9621522B2 (en) 2011-09-01 2017-04-11 Sonic Ip, Inc. Systems and methods for playing back alternative streams of protected content protected using common cryptographic information
US10244272B2 (en) 2011-09-01 2019-03-26 Divx, Llc Systems and methods for playing back alternative streams of protected content protected using common cryptographic information
US11178435B2 (en) 2011-09-01 2021-11-16 Divx, Llc Systems and methods for saving encoded media streamed using adaptive bitrate streaming
US11683542B2 (en) 2011-09-01 2023-06-20 Divx, Llc Systems and methods for distributing content using a common set of encryption keys
US9247311B2 (en) 2011-09-01 2016-01-26 Sonic Ip, Inc. Systems and methods for playing back alternative streams of protected content protected using common cryptographic information
US8909922B2 (en) 2011-09-01 2014-12-09 Sonic Ip, Inc. Systems and methods for playing back alternative streams of protected content protected using common cryptographic information
US10289811B2 (en) 2012-01-06 2019-05-14 Divx, Llc Systems and methods for enabling playback of digital content using status associable electronic tickets and ticket tokens representing grant of access rights
US9626490B2 (en) 2012-01-06 2017-04-18 Sonic Ip, Inc. Systems and methods for enabling playback of digital content using electronic tickets and ticket tokens representing grant of access rights
US11526582B2 (en) 2012-01-06 2022-12-13 Divx, Llc Systems and methods for enabling playback of digital content using status associable electronic tickets and ticket tokens representing grant of access rights
US8918908B2 (en) 2012-01-06 2014-12-23 Sonic Ip, Inc. Systems and methods for accessing digital content using electronic tickets and ticket tokens
US11025902B2 (en) 2012-05-31 2021-06-01 Nld Holdings I, Llc Systems and methods for the reuse of encoding information in encoding alternative streams of video data
US9532080B2 (en) 2012-05-31 2016-12-27 Sonic Ip, Inc. Systems and methods for the reuse of encoding information in encoding alternative streams of video data
US9197685B2 (en) 2012-06-28 2015-11-24 Sonic Ip, Inc. Systems and methods for fast video startup using trick play streams
US9143812B2 (en) 2012-06-29 2015-09-22 Sonic Ip, Inc. Adaptive streaming of multimedia
US10452715B2 (en) 2012-06-30 2019-10-22 Divx, Llc Systems and methods for compressing geotagged video
US10591984B2 (en) 2012-07-18 2020-03-17 Verimatrix, Inc. Systems and methods for rapid content switching to provide a linear TV experience using streaming content distribution
US8997254B2 (en) 2012-09-28 2015-03-31 Sonic Ip, Inc. Systems and methods for fast startup streaming of encrypted multimedia content
US8914836B2 (en) 2012-09-28 2014-12-16 Sonic Ip, Inc. Systems, methods, and computer program products for load adaptive streaming
USRE48761E1 (en) 2012-12-31 2021-09-28 Divx, Llc Use of objective quality measures of streamed content to reduce streaming bandwidth
US9191457B2 (en) 2012-12-31 2015-11-17 Sonic Ip, Inc. Systems, methods, and media for controlling delivery of content
US10225299B2 (en) 2012-12-31 2019-03-05 Divx, Llc Systems, methods, and media for controlling delivery of content
US11785066B2 (en) 2012-12-31 2023-10-10 Divx, Llc Systems, methods, and media for controlling delivery of content
US11438394B2 (en) 2012-12-31 2022-09-06 Divx, Llc Systems, methods, and media for controlling delivery of content
US10805368B2 (en) 2012-12-31 2020-10-13 Divx, Llc Systems, methods, and media for controlling delivery of content
US9264475B2 (en) 2012-12-31 2016-02-16 Sonic Ip, Inc. Use of objective quality measures of streamed content to reduce streaming bandwidth
US9313510B2 (en) 2012-12-31 2016-04-12 Sonic Ip, Inc. Use of objective quality measures of streamed content to reduce streaming bandwidth
US9357210B2 (en) 2013-02-28 2016-05-31 Sonic Ip, Inc. Systems and methods of encoding multiple video streams for adaptive bitrate streaming
US10728564B2 (en) 2013-02-28 2020-07-28 Sonic Ip, Llc Systems and methods of encoding multiple video streams for adaptive bitrate streaming
US9350990B2 (en) 2013-02-28 2016-05-24 Sonic Ip, Inc. Systems and methods of encoding multiple video streams with adaptive quantization for adaptive bitrate streaming
US10178399B2 (en) 2013-02-28 2019-01-08 Sonic Ip, Inc. Systems and methods of encoding multiple video streams for adaptive bitrate streaming
US10397292B2 (en) 2013-03-15 2019-08-27 Divx, Llc Systems, methods, and media for delivery of content
US9906785B2 (en) 2013-03-15 2018-02-27 Sonic Ip, Inc. Systems, methods, and media for transcoding video data according to encoding parameters indicated by received metadata
US11849112B2 (en) 2013-03-15 2023-12-19 Divx, Llc Systems, methods, and media for distributed transcoding video data
US10715806B2 (en) 2013-03-15 2020-07-14 Divx, Llc Systems, methods, and media for transcoding video data
US10264255B2 (en) 2013-03-15 2019-04-16 Divx, Llc Systems, methods, and media for transcoding video data
US9344517B2 (en) 2013-03-28 2016-05-17 Sonic Ip, Inc. Downloading and adaptive streaming of multimedia content to a device with cache assist
US9094737B2 (en) 2013-05-30 2015-07-28 Sonic Ip, Inc. Network video streaming with trick play based on separate trick play files
US10462537B2 (en) 2013-05-30 2019-10-29 Divx, Llc Network video streaming with trick play based on separate trick play files
US9247317B2 (en) 2013-05-30 2016-01-26 Sonic Ip, Inc. Content streaming with client device trick play index
US9712890B2 (en) 2013-05-30 2017-07-18 Sonic Ip, Inc. Network video streaming with trick play based on separate trick play files
US9967305B2 (en) 2013-06-28 2018-05-08 Divx, Llc Systems, methods, and media for streaming media content
US9343112B2 (en) 2013-10-31 2016-05-17 Sonic Ip, Inc. Systems and methods for supplementing content from a server
US10321168B2 (en) 2014-04-05 2019-06-11 Divx, Llc Systems and methods for encoding and playing back video at different frame rates using enhancement layers
US9866878B2 (en) 2014-04-05 2018-01-09 Sonic Ip, Inc. Systems and methods for encoding and playing back video at different frame rates using enhancement layers
US11711552B2 (en) 2014-04-05 2023-07-25 Divx, Llc Systems and methods for encoding and playing back video at different frame rates using enhancement layers
CN105873187A (en) * 2015-01-21 2016-08-17 中兴通讯股份有限公司 Method and device for issuing indication information
US10721285B2 (en) 2016-03-30 2020-07-21 Divx, Llc Systems and methods for quick start-up of playback
US11483609B2 (en) 2016-06-15 2022-10-25 Divx, Llc Systems and methods for encoding video content
US11729451B2 (en) 2016-06-15 2023-08-15 Divx, Llc Systems and methods for encoding video content
US10148989B2 (en) 2016-06-15 2018-12-04 Divx, Llc Systems and methods for encoding video content
US10595070B2 (en) 2016-06-15 2020-03-17 Divx, Llc Systems and methods for encoding video content
US10498795B2 (en) 2017-02-17 2019-12-03 Divx, Llc Systems and methods for adaptive switching between multiple content delivery networks during adaptive bitrate streaming
US11343300B2 (en) 2017-02-17 2022-05-24 Divx, Llc Systems and methods for adaptive switching between multiple content delivery networks during adaptive bitrate streaming

Also Published As

Publication number Publication date
US20100111192A1 (en) 2010-05-06
US8249168B2 (en) 2012-08-21

Similar Documents

Publication Title
US8249168B2 (en) Multi-instance video encoder
US11120677B2 (en) Transcoding mixing and distribution system and method for a video security system
US8170123B1 (en) Media acceleration for virtual computing services
EP2098995B1 (en) System for real-time volume rendering on thin clients via a render server
US9146884B2 (en) Push pull adaptive capture
US10283091B2 (en) Buffer optimization
US20140074911A1 (en) Method and apparatus for managing multi-session
CN101582926B (en) Method for realizing redirection of playing remote media and system
US20110138069A1 (en) Systems and methods for a client-side remote presentation of a multimedia stream
US9426476B2 (en) Video stream
US7312800B1 (en) Color correction of digital video images using a programmable graphics processing unit
US9860285B2 (en) System, apparatus, and method for sharing a screen having multiple visual components
US9560310B2 (en) Method and system for rescaling image files
EP3643069B1 (en) Effective encoding for screen data
WO2017107911A1 (en) Method and device for playing video with cloud video platform
US8600155B2 (en) Classification and encoder selection based on content
US20150201199A1 (en) Systems and methods for facilitating video encoding for screen-sharing applications
US9894126B1 (en) Systems and methods of smoothly transitioning between compressed video streams
WO2019055086A1 (en) Data transmission with plural jitter buffers
KR20160131830A (en) System for cloud streaming service, method of cloud streaming service of providing multi-view screen based on resize and apparatus for the same
US20060253857A1 (en) Method for processing a data stream by utilizing multi-processor
US11785281B2 (en) System and method for decimation of image data for multiviewer display
CN111432159B (en) Computing task processing method, device and system and computer readable storage medium
Katsak et al. Catalyst: A cloud-based media processing framework
EP2946554B1 (en) System, apparatus and method for sharing a screen having multiple visual components

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION