GB2397964A - Optimising resource allocation in a multipoint communication control unit - Google Patents


Info

Publication number
GB2397964A
GB2397964A GB0408547A
Authority
GB
United Kingdom
Prior art keywords
video
data stream
control unit
compressed
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0408547A
Other versions
GB0408547D0 (en)
GB2397964B (en)
Inventor
Meir Feder
Moshe Elbaza
Noam Eshkoli
Aviv Eisenberg
Ilan Yona
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Polycom Israel Ltd
Original Assignee
Accord Networks Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/506,861 (US6300973B1)
Application filed by Accord Networks Ltd filed Critical Accord Networks Ltd
Publication of GB0408547D0
Publication of GB2397964A
Application granted
Publication of GB2397964B
Legal status: Expired - Fee Related


Classifications

    • H04N7/152 Multipoint control units for conference systems
    • H04N19/115 Selection of the code volume for a coding unit prior to coding
    • H04N19/117 Filters, e.g. for pre-processing or post-processing
    • H04N19/12 Selection from among a plurality of transforms or standards, e.g. selection between discrete cosine transform [DCT] and sub-band transform or selection between H.263 and H.264
    • H04N19/132 Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
    • H04N19/134 Adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/137 Motion inside a coding unit, e.g. average field, frame or block difference
    • H04N19/152 Data rate or code amount at the encoder output, by measuring the fullness of the transmission buffer
    • H04N19/17 Adaptive coding where the coding unit is an image region, e.g. an object
    • H04N19/176 Adaptive coding where the coding unit is an image region that is a block, e.g. a macroblock
    • H04N19/184 Adaptive coding where the coding unit is bits, e.g. of the compressed video stream
    • H04N19/186 Adaptive coding where the coding unit is a colour or a chrominance component
    • H04N19/40 Video transcoding, i.e. partial or full decoding of a coded input stream followed by re-encoding of the decoded output stream
    • H04N19/42 Implementation details or hardware specially adapted for video compression or decompression
    • H04N19/46 Embedding additional information in the video signal during the compression process
    • H04N19/59 Predictive coding involving spatial sub-sampling or interpolation, e.g. alteration of picture size or resolution
    • H04N19/61 Transform coding in combination with predictive coding
    • H04N19/70 Syntax aspects related to video coding, e.g. related to compression standards

Abstract

A multipoint control unit (MCU) for facilitating communication between a plurality of endpoints. Each endpoint sends a compressed video signal and receives a compressed video signal. The MCU has a plurality of video input modules and a video output module. Each of the video input modules receives a video signal from one of the endpoints and generally decodes the data. The video output module includes a rate control unit and a generalized encoder that receive the decoded data for generally encoding to a compressed output stream for transmission to an endpoint. Hardware architecture is arranged to optimise resource allocation for the system requirements. This may be done through the use of a fat port which provides a single logical unit for each single endpoint or a slim port arrangement in which temporary logical connections between endpoints are made.

Description

METHOD AND SYSTEM FOR
COMPRESSED VIDEO PROCESSING
BACKGROUND
In video communication, e.g., video conferencing, Multipoint Control Units ("MCUs") serve as switches and conference builders for the network.
The MCUs receive multiple audio/video streams from the various users' terminals, or codecs, and transmit to the various users' terminals audio/video streams that correspond to the desired signal at the users' stations. In some cases, where the MCU serves as a switchboard, the transmitted stream to the end terminal is a simple stream from a single other user. In other cases, it is a combined "conference" stream, composed of a combination of several users' streams.
An important function of the MCU is to translate the input streams into the desired output streams from and to all codecs. One aspect of this "translation" is a modification of the bit rate between the original stream and the output stream. This rate-matching modification can be achieved, for example, by changing the frame rate, the spatial resolution, or the quantization accuracy of the corresponding video. The output bit rate, and thus the modification factor used to achieve it, can be different for different users, even for the same input stream. For instance, in a four-party conference, one of the parties may be operating at 128 Kbps, another at 256 Kbps, and two others at T1. Each party needs to receive the transmission at the appropriate bit rate.
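To make the rate-matching idea concrete, the following sketch picks an output frame rate that fits a target bit rate by dropping frames, one of the modification factors mentioned above. The function name, the constant-bits-per-frame assumption, and the one-frame-at-a-time search are illustrative, not taken from the patent.

```python
def adapt_stream(bits_per_frame, in_fps, target_bps):
    """Pick an output frame rate that fits the target bit rate,
    assuming (hypothetically) bits per frame stay roughly constant."""
    out_fps = in_fps
    while out_fps > 1 and bits_per_frame * out_fps > target_bps:
        out_fps -= 1  # drop frames until the bit budget is met
    return out_fps

# A 30 fps stream at ~8000 bits/frame squeezed into a 128 kbps link:
assert adapt_stream(8000, 30, 128_000) == 16
```

A real rate matcher would trade off frame rate against resolution and quantizer step rather than adjusting frame rate alone.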
The same principles apply to "translation," or transcoding, between parameters that vary between coders, e.g., different coding standards like H.261/H.263; different input resolutions; and different maximal frame rates in the input streams.
Another use of the MCU can be to construct an output stream that combines several input streams. This option, sometimes called "compositing" or "continuous presence," allows a user at a remote terminal to observe, simultaneously, several other participants in the conference. The choice of these participants can vary among different users at different remote terminals of the conference. In this situation, the amount of bits allocated to each participant can also vary, and may depend on the on-screen activity of the users, on the specific resolution given to the participant, or on some other criterion.
All of this elaborate processing, e.g., transcoding and continuous presence processing, must be done under the constraint that the input streams are already compressed by a known compression method, usually based on a standard like ITU's H.261 or H.263. These standards, as well as other video compression standards like MPEG, are generally based on a Discrete Cosine Transform ("DCT") process wherein the blocks of the image (video frame) are transformed, and the resulting transform coefficients are quantized and coded.
One prior art method first decompresses the video streams; performs the required combination, bridging and image construction; and finally recompresses for transmission. This method requires high computation power, leads to degradation in the resulting video quality and suffers from large propagation delay. One of the most computation-intensive portions of the prior art methods is the encoding portion of the operation, where such things as motion vectors and DCT coefficients have to be generated so as to take advantage of spatial and temporal redundancies. For instance, to take advantage of spatial redundancies in the video picture, the DCT function can be performed. To generate DCT coefficients, each frame of the picture is broken into blocks and the discrete cosine transform function is performed upon each block. In order to take advantage of temporal redundancies, motion vectors can be generated. To generate motion vectors, consecutive frames are compared to each other in an attempt to discern pattern movement from one frame to the next. As would be expected, these computations require a great deal of computing power.
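As an illustration of the block transform step described above, a naive two-dimensional type-II DCT can be written directly from its definition. This is a sketch for clarity, not the optimized transform a codec would use:

```python
import math

def dct_2d(block):
    """Naive NxN type-II DCT of a square pixel block."""
    n = len(block)
    def c(k):
        return math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
    out = []
    for u in range(n):
        row = []
        for v in range(n):
            s = sum(block[x][y]
                    * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                    * math.cos((2 * y + 1) * v * math.pi / (2 * n))
                    for x in range(n) for y in range(n))
            row.append(c(u) * c(v) * s)
        out.append(row)
    return out

flat = [[100.0] * 8 for _ in range(8)]   # a uniform 8x8 block
coeffs = dct_2d(flat)
# For a flat block all energy lands in the DC coefficient
assert abs(coeffs[0][0] - 800.0) < 1e-6
assert abs(coeffs[0][1]) < 1e-6
```

The O(N^4) cost per block, before motion search is even considered, is why the patent emphasizes reusing information already embedded in the incoming compressed stream.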
In order to reduce computation complexity and increase quality, others have searched for methods of performing such operations in a more efficient manner. Proposals have included operating in the transform domain on motion-compensated, DCT-compressed video signals by removing the motion compensation portion and compositing in the DCT transform domain.
Therefore, a method is needed for performing the "translation" operations of an MCU, such as modifying bit rates, frame rates, and compression algorithms, in an efficient manner that reduces propagation delays, degradation in signal quality, video bandwidth use within the MCU, and computational complexity.
SUMMARY

The present invention relates to an improved method of processing multimedia/video data in an MCU. By reusing information embedded in a compressed video stream received from an endpoint, the MCU can reduce the total computations needed to process the video data before resending it to the endpoint.
FIGURES

The construction designed to carry out the invention will hereinafter be described, together with other features thereof. The invention will be more readily understood from a reading of the following specification and by reference to the accompanying drawings forming a part thereof, wherein an example of the invention is shown and wherein:

Figure 1 illustrates a system block diagram for implementation of an exemplary embodiment of the general function of this invention.
Figure 2 illustrates a block diagram of an exemplary embodiment of a generalized decoder.
Figure 3 illustrates a block diagram of another exemplary embodiment of a generalized decoder.
Figure 4 illustrates a block diagram of an exemplary embodiment of a generalized encoder operating in the spatial domain.
Figure 5 illustrates a block diagram of an exemplary embodiment of a generalized encoder operating in the DCT domain.
Figure 6 illustrates an exemplary embodiment of a rate control unit for operation with an embodiment of the present invention.
Figure 7 is a flow diagram depicting exemplary steps in the operation of a rate control unit.
Figure 8 illustrates an exemplary embodiment of the present invention operating within an MCU wherein each endpoint has a single dedicated video output module and a plurality of dedicated video input modules.
Figure 9 illustrates an exemplary embodiment of the present invention having a single video input module and a single video output module per logical unit.
DETAILED DESCRIPTION
An MCU is used where multiple users at endpoint codecs communicate in a simultaneous video conference. A user at a given endpoint may simultaneously view multiple endpoint users at his discretion. In addition, the endpoints may communicate at differing data rates using different coding standards, so the MCU facilitates transcoding of the video signals between these endpoints.
Figure 1 illustrates a system block diagram for implementation of an exemplary embodiment of the general function of the invention. In an MCU, compressed video input 115 from a first endpoint codec is brought into a video input module 105, routed through a common interface 150, and directed to video output module 110 for transmission as compressed video output 195 to a second endpoint codec. The common interface may include any of a variety of interfaces, such as shared memory, ATM bus, TDM bus, switching and direct connect. The invention contemplates that there will be a plurality of endpoints enabling multiple users to participate in a video conference. For each endpoint, a video input module 105 and a video output module 110 may be assigned.
Common interface 150 facilitates the transfer of video information between multiple video input modules 105 and multiple video output modules 110.
Compressed video 115 is sent to error correction decoder block 117 within video input module 105. Error correction decoder block 117 takes the incoming compressed video 115 and removes the error correction code. An example of an error correction code is BCH coding. This error correction block 117 is optional and may not be needed with certain codecs.
The video stream is next routed to the variable length decoder, VLC⁻¹ 120, for decoding the variable length coding usually present within the compressed video input stream. Depending on the compression used (H.261, H.263, MPEG, etc.), it recognizes the stream header markers and the specific fields associated with the video frame structure. Although the main task of the VLC⁻¹ 120 is to decode this variable length code and prepare the data for the following steps, VLC⁻¹ 120 may take some of the information it receives, e.g., stream header markers and specific field information, and pass this information on to later function blocks in the system.
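The variable length decoding step can be illustrated with a toy prefix-free code table. The code words below are invented for illustration; they are not H.261's or H.263's actual tables:

```python
# Illustrative prefix-free (Huffman-style) table, not a real standard's
TABLE = {"0": 0, "10": 1, "110": 2, "111": 3}

def vlc_decode(bits):
    """Walk the bit string, emitting a symbol each time a complete
    code word from the table is matched."""
    out, word = [], ""
    for b in bits:
        word += b
        if word in TABLE:
            out.append(TABLE[word])
            word = ""
    return out

assert vlc_decode("0101100") == [0, 1, 2, 0]
```

Because the code is prefix-free, the decoder never needs to backtrack, which is what makes single-pass parsing of the stream header markers and fields possible.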
The video data of the incoming stream contains quantized DCT coefficients. After decoding the variable length code, Q⁻¹ 125 dequantizes the representation of these coefficients to restore the numerical value of the DCT coefficients in a well known manner. In addition to dequantizing the DCT coefficients, Q⁻¹ 125 may pass through some information, such as the step size, to other blocks for additional processing.
Generalized decoder 130 takes the video stream received from the VLC⁻¹ 120 through Q⁻¹ 125 and, based on the frame memory 135 content, converts it into "generalized decoded" frames (according to the domain chosen for transcoding). The generalized decoder 130 then generates two streams: a primary data stream and a secondary data stream. The primary data stream can be either frames represented in the image (spatial) domain, frames represented in the DCT domain, or some variation of these, e.g., error frames. The secondary data stream contains "control" or "side information" associated with the primary stream and may contain motion vectors, quantizer identifications, coded/uncoded decisions, filter/non-filter decisions, frame type, resolution and other information that would be useful to the encoding of a video signal.
For example, for every macroblock, there may be an associated motion vector. Reuse of the motion vectors can reduce the amount of computations significantly. Quantizer values are established prior to the reception of encoded video 115. Reuse of quantizer values, when possible, can allow generalized encoder 170 to avoid quantization errors and send the video coefficients in the same form as they entered the generalized decoder 130. This configuration avoids quality degradation. In other cases, quantizer values may serve as first guesses during the re-encoding process. Statistical information can be sent from the generalized decoder 130 over the secondary data stream. Such statistical information may include data about the amount of information within each macroblock of an image. In this way, more bits may later be allocated by rate control unit 180 to those macroblocks having more information.
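A minimal sketch of that bit-allocation idea: distribute a frame's budget across macroblocks in proportion to an activity measure carried in the side information. The function name and the simple proportional rule are assumptions for illustration, not the patent's method:

```python
def allocate_bits(activity, frame_budget):
    """Split a frame's bit budget across macroblocks in proportion
    to each macroblock's side-information activity measure."""
    total = sum(activity)
    return [round(frame_budget * a / total) for a in activity]

# Busy macroblocks receive proportionally more bits than quiet ones:
alloc = allocate_bits([10, 40, 50], 1000)
assert alloc == [100, 400, 500]
```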
Because filters may be used in the encoding process, extraction of filter usage information in the generalized decoder 130 also can reduce the complexity of processing in the generalized encoder 170. While the use of filters in the encoding process is a feature of the H.261 standard, it will be appreciated that the notion of the reuse of filter information should be read broadly to include the reuse of information used by other artifact removal techniques.
In addition, the secondary data stream may contain decisions made by processing the incoming stream, such as image segmentation decisions and camera movement identification. Camera movements include such data as pan, zoom and other general camera movement information. By providing this information over the secondary data stream, the generalized encoder 170 may make a better approximation when re-encoding the picture by knowing that the image is being panned or zoomed.
This secondary data stream is routed over the secondary (side information) channel 132 to the rate control unit 180 for use in video output block 110. Rate control unit 180 is responsible for the efficient allocation of bits to the video stream in order to obtain maximum quality while at the same time using the information extracted from generalized decoder 130 within the video input block 105 to reduce the total computations of the video output module 110.
The scaler 140 takes the primary data stream and scales it. The purpose of scaling is to change the frame resolution in order to later incorporate it into a continuous presence frame. Such a continuous presence frame may consist of a plurality of appropriately scaled frames. The scaler 140 also applies proper filters for both decimation and picture quality preservation. The scaler 140 may be bypassed if the scaling function is not required in a particular implementation or usage.
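A simple stand-in for the scaler's decimate-and-filter behaviour halves the resolution by averaging 2x2 pixel groups. The box-average filter is an assumption chosen for brevity; a real scaler would apply a better anti-alias filter before decimation:

```python
def downscale_2x(frame):
    """Halve a frame's resolution by averaging each 2x2 pixel group
    (a crude decimation filter standing in for the scaler's filters)."""
    h, w = len(frame), len(frame[0])
    return [[(frame[y][x] + frame[y][x + 1]
              + frame[y + 1][x] + frame[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

img = [[0, 0, 8, 8],
       [0, 0, 8, 8],
       [4, 4, 4, 4],
       [4, 4, 4, 4]]
assert downscale_2x(img) == [[0.0, 8.0], [4.0, 4.0]]
```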
The data formatter 145 creates a representation of the video stream. This representation may include a progressively compressed stream. In a progressively compressed stream, a progressive compression technique, such as wavelet based compression, represents the video image in an increasing resolution pyramid. Using this technique, the scaler 140 may be avoided and the data analyzer and the editor 160 may take from the common interface only the amount of information that the editor requires for the selected resolution.
The data formatter 145 facilitates communication over the common interface and assists the editor 160 in certain embodiments of the invention.
The data formatter 145 may also serve to reduce the bandwidth required of the common interface by compressing the video stream. The data formatter 145 may be bypassed if its function is not required in a particular embodiment.
When the formatted video leaves data formatter 145 of the video input block, it is routed through common interface 150 to the data analyzer 155 of video output block 110. Routing may be accomplished through various means including busses, switches or memory.
The data analyzer 155 inverts the representation created by the data formatter 145 into a video frame structure. In the case of progressive coding, the data analyzer 155 may take only a portion of the generated bit-stream to create a reduced resolution video frame. In embodiments where the data formatter 145 is not present or is bypassed, the data analyzer 155 is not utilized.
After the video stream leaves the data analyzer 155, the editor 160 can generate the composite video image. It receives a plurality of video frames; it may scale the video frame (applying a suitable filter for decimation and quality), and/or combine various video inputs into one video frame by placing them inside the frame according to a predefined or user defined screen layout scheme. The editor 160 may receive external editor inputs 162 containing layout preferences or text required to be added to the video frame, such as speech translation, menus, or endpoint names. The editor 160 is not required and may be bypassed or not present in certain embodiments not requiring the compositing function.
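The editor's compositing step can be sketched as tiling equally sized sub-frames into one continuous-presence frame. The row-major layout and all names here are illustrative, not the patent's layout scheme:

```python
def composite(frames, tile_h, tile_w, cols):
    """Place equally sized sub-frames into a continuous-presence
    layout, filling left to right, top to bottom."""
    rows = (len(frames) + cols - 1) // cols
    out = [[0] * (tile_w * cols) for _ in range(tile_h * rows)]
    for i, f in enumerate(frames):
        oy, ox = (i // cols) * tile_h, (i % cols) * tile_w
        for y in range(tile_h):
            for x in range(tile_w):
                out[oy + y][ox + x] = f[y][x]
    return out

a = [[1, 1], [1, 1]]   # participant A's scaled frame
b = [[2, 2], [2, 2]]   # participant B's scaled frame
grid = composite([a, b], 2, 2, 2)
assert grid[0] == [1, 1, 2, 2]   # the two tiles sit side by side
```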
The rate control unit 180 controls the bit rate of the outgoing video stream. The rate control operation is not limited to a single stream and can be used to control multiple streams in an embodiment comprising a plurality of video input modules 105. The rate control and bit allocation decisions are made based on the activities and desired quality for the output stream. A simple feedback mechanism that monitors the total amount of bits to all streams can assist in these decisions. In effect, the rate control unit becomes a statistical multiplexer of these streams. In this fashion, certain portions of the video stream may be allocated more bits or more processing effort.
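One step of such a feedback mechanism might, hypothetically, coarsen the quantizer after a frame overshoots its budget and refine it after an undershoot. The 1 to 31 range follows the H.261/H.263 quantizer range; the fixed one-step adjustment is an assumption for illustration:

```python
def next_quantizer(q, bits_used, bits_target, q_min=1, q_max=31):
    """One feedback step: coarsen quantization when the last frame
    overshot its bit budget, refine it when there is headroom."""
    if bits_used > bits_target:
        q = min(q + 1, q_max)   # spend fewer bits on the next frame
    elif bits_used < bits_target:
        q = max(q - 1, q_min)   # headroom available, improve quality
    return q

assert next_quantizer(10, 9000, 8000) == 11   # overshoot, coarser
assert next_quantizer(10, 7000, 8000) == 9    # undershoot, finer
```

The actual bits used come back from the VLC 190, closing the loop described below.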
In addition to the feedback from generalized encoder 170, feedback from VLC 190, side information from the secondary channel 132, and external input 182 all may be used to allow a user to select certain aspects of signal quality. For instance, a user may choose to allocate more bits of a video stream to a particular portion of an image in order to enhance clarity of that portion. The external input 182 is a bidirectional port to facilitate communications from and to an external device.
In addition to using the side information from the secondary channel 132 to assist in its rate control function, rate control unit 180 may, optionally, merely pass side information directly to the generalized encoder 170. The rate control unit 180 also assists the quantizer 175 with quantizing the DCT coefficients by identifying the quantizer to be used.
Generalized encoder 170 basically performs the inverse operation of the generalized decoder 130. The generalized encoder 170 receives two streams: a primary stream, originally generated by one or more generalized decoders, edited and combined by the editor 160; and a secondary stream of relevant side information coming from the respective generalized decoders. Since the secondary streams generated by the generalized decoders are passed to the rate control function 180, the generalized encoder 170 may receive the side information through the rate control function 180 either in its original form or after being processed. The output of the generalized encoder 170 is a stream of DCT coefficients and additional parameters ready to be transformed into a compressed stream after quantization and VLC. The output DCT coefficients from the generalized encoder 170 are quantized by Q2 175, according to a decision made by the rate control unit 180.
These coefficients are fed back to the inverse quantizer block Q2-1 185 to generate as a reference a replica of what the decoder at the endpoint codec would obtain. This reference is typically the sum of this feedback and the content of the frame memory 165. This process is aimed at avoiding error propagation. Then, depending on the domain used for encoding, the difference between the output of the editor 160 and the motion compensated reference (calculated either in the DCT or spatial domain) is encoded into DCT coefficients, which are the output of the generalized encoder 170.
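The quantize-then-dequantize feedback path (Q2 175 followed by Q2-1 185) can be illustrated with a minimal one-dimensional sketch; the plain scalar quantizer and flat-list block representation are simplifying assumptions made for illustration:

```python
def encode_block(input_block, reference, q_step):
    """One step of the encoder feedback loop: quantize the residual, then
    locally dequantize it to rebuild exactly the reference the remote
    decoder will hold, so quantization error cannot accumulate."""
    residual = [x - r for x, r in zip(input_block, reference)]
    quantized = [round(v / q_step) for v in residual]       # Q2 175
    dequantized = [q * q_step for q in quantized]           # Q2-1 185
    # The new reference is feedback plus frame-memory content, matching
    # what the endpoint decoder reconstructs.
    new_reference = [r + d for r, d in zip(reference, dequantized)]
    return quantized, new_reference
```

Because the encoder predicts the next frame from `new_reference` rather than from the lossless input, encoder and remote decoder stay in step.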
The VLC 190, or variable length coder, removes the remaining redundancies from the quantized DCT coefficient stream by using lossless coding tables defined by the chosen standard (H.261, H.263, etc.). The VLC 190 also inserts the appropriate motion vectors and the necessary headers and synchronization fields according to the chosen standard. The VLC 190 also sends to the rate control unit 180 data on the actual number of bits used after variable length coding.
The error correction encoder block 192 next receives the video stream and inserts the error correction code. In some cases this may be BCH coding.
This error correction encoder block 192 is optional and, depending on the codec, may be bypassed. Finally, it sends the stream to the end user codec for viewing.
In order to more fully describe aspects of the invention, further detail on the generalized decoder 130 and the generalized encoder 170 follows.
Figure 2 illustrates a block diagram of an exemplary embodiment of a generalized decoder 130. Dequantized video is routed from the dequantizer to the selector 210 within the generalized decoder 130. The selector 210 splits the dequantized video stream, sending the stream to one or more data processors 220 and a spatial decoder 230. The data processors 220 calculate side information, such as statistical information like pan and zoom, as well as quantizer values and the like, from the video stream. The data processors 220 then pass this information to the side information channel 132. The spatial decoder 230, in conjunction with the frame memory 135 (shown in Figure 1), fully or partially decodes the compressed video stream. The DCT decoder 240, optionally, performs the inverse of the discrete cosine transform function. The motion compensator 250, optionally, in conjunction with the frame memory 135 (shown in Figure 1), uses the motion vectors as pointers to a reference block in the reference frame to be summed with the incoming residual information block. The fully or partially decoded video stream is then sent along the primary channel to the scaler 140, shown in Figure 1, for further processing.
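The summation performed by the motion compensator 250 can be sketched in the spatial domain. The tiny 2x2 block size and the list-of-lists frame representation are assumptions for illustration only:

```python
def motion_compensate(reference_frame, mv, block_pos, residual, size=2):
    """Spatial-domain motion compensation: the motion vector points at a
    block in the reference frame, which is summed with the incoming
    residual block to reconstruct the output block."""
    (bx, by), (dx, dy) = block_pos, mv
    out = []
    for y in range(size):
        row = []
        for x in range(size):
            # Fetch the reference sample the motion vector points at.
            ref = reference_frame[by + dy + y][bx + dx + x]
            row.append(ref + residual[y][x])
        out.append(row)
    return out
```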
Motion vector data is transferred from the spatial decoder 230 via the side channel 132 for possible reuse at the rate control unit 180 and the generalized encoder 170.
Figure 3 illustrates a block diagram of another exemplary embodiment of a generalized decoder 130. Dequantized video is routed from the dequantizer 125 to the selector 210 within the generalized decoder 130. The selector 210 splits the dequantized video stream, sending the stream to one or more data processors 320 and the DCT decoder 330. The data processors 320 calculate side information, such as statistical information like pan and zoom, as well as quantizer values and the like, from the video stream. The data processors 320 then pass this information through the side information channel 132. The DCT decoder 330, in conjunction with the frame memory 135, shown in Figure 1, fully or partially decodes the compressed video stream using a DCT domain motion compensator 340, which performs, in the DCT domain, the calculations needed to sum the reference block pointed to by the motion vectors in the DCT domain reference frame with the residual DCT domain input block. The fully or partially decoded video stream is sent along the primary channel to the scaler 140, shown in Figure 1, for further processing. Motion vector data is transferred from the DCT decoder 330 via the side channel 132 for possible reuse at the rate control unit 180 and the generalized encoder 170.
Figure 4 illustrates a block diagram of an exemplary embodiment of a generalized encoder 170 operating in the spatial domain. The generalized encoder's first task is to determine the motion associated with each macroblock of the received image over the primary data channel from the editor 160. This is performed by the enhanced motion estimator 450. The enhanced motion estimator 450 receives motion predictors that originate in the side information, processed by the rate control function 180 and sent through the encoder manager 410 to the enhanced motion estimator 450. The enhanced motion estimator 450 compares, if needed, the received image with the reference image that exists in the frame memory 165 and finds the best motion prediction in the environment in a manner well known to those skilled in the art. The motion vectors, as well as a quality factor associated with them, are then passed to the encoder manager 410. The coefficients are passed on to the MB processor 460.
The MB, or macroblock, processor 460 is a general purpose processing unit for the macroblock level, wherein one of its many functions is to calculate the difference MB. This is done according to an input coming from the encoder manager 410, in the form of indications of whether to code the MB or not, whether to use a de-blocking filter or not, and other video parameters. In general, the responsibility of the MB processor 460 is to calculate the macroblock in the form that is appropriate for transformation and quantization. The output of the MB processor 460 is passed to the DCT coder 420 for generation of the DCT coefficients prior to quantization.
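The difference-MB calculation, gated by the code/no-code indication from the encoder manager 410, might look like this in outline; the flat-list representation of a macroblock and the `None` skip convention are assumptions for illustration:

```python
def process_macroblock(current_mb, predicted_mb, code_mb=True):
    """MB-processor sketch: when the encoder manager marks the macroblock
    as coded, emit the difference macroblock for the DCT stage; otherwise
    emit None so the block is skipped (copied from the reference)."""
    if not code_mb:
        return None
    return [c - p for c, p in zip(current_mb, predicted_mb)]
```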
All these blocks are controlled by the encoder manager 410. It decides whether or not to code a macroblock; it may decide to use some deblocking filters; it gets quality results from the enhanced motion estimator 450; it serves to control the DCT coder 420; and it serves as an interface to the rate control block 180. The decisions and control made by the encoder manager 410 are subject to the input coming from the rate control block 180.
The generalized encoder 170 also contains a feedback loop. The purpose of the feedback loop is to avoid error propagation by regenerating the frame as seen by the remote decoder and referencing it when encoding the new frame. The output of the encoder which was sent to the quantization block is decoded back by using an inverse quantization block, and then fed back to the generalized encoder 170 into the inverse DCT 430 and motion compensation blocks 440, generating a reference image in the frame memory 165.
Figure 5 illustrates a block diagram of a second exemplary embodiment of a generalized encoder 170 operating in the DCT domain. The generalized encoder's first task is to determine the motion associated with each macroblock of the received image over the primary data channel from the editor 160. This is performed by the DCT domain enhanced motion estimator 540. The DCT domain enhanced motion estimator 540 receives motion predictors that originate in the side information channel, processed by the rate control function 180 and sent through the encoder manager 510 to the DCT domain enhanced motion estimator 540. It compares, if needed, the received image with the DCT domain reference image that exists in the frame memory 165 and finds the best motion prediction in the environment. The motion vectors, as well as a quality factor associated with them, are then passed to the encoder manager 510. The coefficients are passed on to the DCT domain MB processor 520.
The DCT domain macroblock, or MB, processor 520 is a general purpose processing unit for the macroblock level, wherein one of its many functions is to calculate the difference MB in the DCT domain. This is done according to an input coming from the encoder manager 510, in the form of indications of whether to code the MB or not, whether to use a de-blocking filter or not, and other video parameters. In general, the responsibility of the DCT domain MB processor 520 is to calculate the macroblock in the form that is appropriate for transformation and quantization.
All these blocks are controlled by the encoder manager 510. The encoder manager 510 decides whether or not to code a macroblock; it may decide to use some deblocking filters; it gets quality results from the DCT domain enhanced motion estimator 540; and it serves as an interface to the rate control block 180. The decisions and control made by the encoder manager 510 are subject to the input coming from the rate control block 180.
The generalized encoder 170 also contains a feedback loop. The output of the encoder which was sent to the quantization block is decoded back by using an inverse quantization block and then fed back to the DCT domain motion compensation blocks 530, generating a DCT domain reference image in the frame memory 165.
While the generalized encoder 170 has been described with reference to a DCT domain configuration and a spatial domain configuration, it will be appreciated by those skilled in the art that a single hardware configuration may operate in either the DCT domain or the spatial domain. This invention is not limited to either the DCT domain or the spatial domain but may operate in either domain or in the continuum between the two domains.
Figure 6 illustrates an exemplary embodiment of a rate control unit for operation with an embodiment of the present invention. The exemplary rate control unit 180 controls the bit rate of the outgoing video stream. As was stated previously, the rate control operation can apply joint transcoding of multiple streams. Bit allocation decisions are made based on the activities and desired quality for the various streams, assisted by a feedback mechanism that monitors the total amount of bits in all streams. Certain portions of the video stream may be allocated more bits or more processing time.
The rate control unit 180 comprises a communication module 610, a side information module 620, and a quality control module 630. The communication module 610 interfaces with functions outside of the rate control unit 180. The communication module 610 reads side information from the secondary channel 132, serves as a two-way interface with the external input 182, sends the quantizer level to the quantizer 175, reads the actual number of bits needed to encode the information from the VLC 190, and sends instructions and data to and receives processed data from the generalized encoder 170.
The side information module 620 receives the side information from all appropriate generalized decoders via the communication module 610 and arranges the information for use in the generalized encoder. Parameters generated in the side information module 620 are sent via the communication module 610 for further processing in the generalized encoder 170.
The quality control module 630 controls the operative side of the rate control block 180. The quality control module 630 stores the desired and measured quality parameters. Based on these parameters, the quality control module 630 may instruct the side information module 620 or the generalized encoder 170 to begin certain tasks in order to refine the video parameters.
Further understanding of the operation of the rate control module 180 will be facilitated by referencing the flowchart shown in Figure 7. While the rate control unit 180 can perform numerous functions, the illustration of Figure 7 depicts exemplary steps in the operation of a rate control unit such as rate control unit 180. The context of this description is the reuse of motion vectors; in practice, those skilled in the art will appreciate that other information can be exploited in a similar manner. The method depicted in Figure 7 begins at step 705, where the communications module 610 within the rate control unit 180 reads external instructions for the user desired picture quality and frame rate. At step 710, the communications module 610 reads the motion vectors of the incoming frames from all of the generalized decoders that are sending picture data to the generalized encoder. For example, if the generalized encoder is transmitting a continuous presence image from six incoming images, motion vectors from the six incoming images are read by the communications module 610. Once the motion vectors are read by the communications module 610, they are transferred to the side information module 620.
At step 715, the quality control module 630 instructs the side information module 620 to calculate new motion vectors using the motion vectors that were retrieved from the generalized decoders and stored, at step 710, in the side information module 620. The new motion vectors may have to be generated for a variety of reasons, including reduction of frame hopping and down scaling. In addition to use in generating new motion vectors, the motion vectors in the side information module are used to perform error estimation calculations, with the result being used for further estimations or enhanced bit allocation. In addition, the motion vectors give an indication of the degree of movement within a particular region of the picture (or "region of interest"), so that the rate control unit 180 can allocate more bits to blocks in that particular region.
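One plausible way to derive a new motion vector for a downscaled macroblock from the stored decoder vectors is to average and rescale them. This particular rule, and the function name, are assumptions made for illustration, not the embodiment's exact calculation:

```python
def derive_motion_vector(decoder_mvs, scale=2):
    """Derive a candidate motion vector for a downscaled macroblock from
    the motion vectors of the corresponding incoming macroblocks: average
    them, then divide by the downscaling factor of the layout."""
    n = len(decoder_mvs)
    avg_x = sum(mv[0] for mv in decoder_mvs) / n
    avg_y = sum(mv[1] for mv in decoder_mvs) / n
    # A 2:1 downscale halves all displacements in the output picture.
    return (round(avg_x / scale), round(avg_y / scale))
```

The resulting vector can serve as the motion predictor handed to the generalized encoder for optional refinement.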
At step 720, the quality control module 630 may instruct the side information module 620 to send the new motion vectors to the generalized encoder via the communications module 610. The generalized encoder may then refine the motion vectors further. Alternatively, due to constraints in processing power or a decision by the quality control module 630 that refinement is unnecessary, motion vectors may not be sent. At step 725, the generalized encoder will search for improved motion vectors based on the new motion vectors. At step 730, the generalized encoder will return these improved motion vectors to the quality control module 630 and will return information about the frame and/or block quality.
At step 735, the quality control module 630 determines the quantization level parameters and the temporal reference and updates the external devices, such as the VLC 190 and the generalized encoder 170, and the user with this information. At step 740, the quality control module 630 sends the quantization parameters to the quantizer 175. At step 745, the rate control unit 180 receives the bit information from the VLC 190, which informs the rate control unit 180 of the number of bits used to encode each frame or block. At step 750, in response to this information, the quality control module 630 updates its objective parameters for further control, and processing returns to block 710.
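Steps 705 through 750 can be condensed into a single control-loop sketch. The callback signatures, the initial quantizer level, and the step-by-one adaptation rule are all assumptions made for illustration:

```python
def rate_control_cycle(frames, read_mvs, refine, encode, bit_budget):
    """One pass of the Figure 7 loop: gather decoder motion vectors,
    derive and refine new ones, send a quantizer level to the encoder,
    then adapt the quantizer from the bit counts the VLC reports back."""
    quantizer = 8                         # step 705: assumed initial level
    bits_log = []
    for frame in frames:
        mvs = read_mvs(frame)             # step 710: read decoder side info
        refined = refine(mvs)             # steps 715-730: new/refined MVs
        bits = encode(frame, refined, quantizer)   # steps 735-745
        bits_log.append(bits)
        # Step 750: update objective parameters -- coarsen the quantizer
        # when a frame overruns the budget, refine it when under.
        if bits > bit_budget:
            quantizer += 1
        elif bits < bit_budget:
            quantizer = max(1, quantizer - 1)
    return quantizer, bits_log
```

Here `read_mvs`, `refine`, and `encode` stand in for the communications module, the side information module, and the generalized encoder/VLC path respectively.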
The invention described above may be implemented in a variety of configurations. Two such configurations are the "fat port" configuration generally illustrated in Figure 8 and the "slim port" configuration generally illustrated in Figure 9. These two embodiments are for illustrative purposes only, and those skilled in the art will appreciate the variety of possible configurations implementing this invention.
Figure 8 illustrates an exemplary embodiment of the present invention operating within an MCU wherein each endpoint has a single dedicated video output module 110 and a plurality of dedicated video input modules 105. In this so-called "fat port" embodiment, a single logical unit applies all of its functionality for a single endpoint. Incoming video streams are directed from the Back Plane Bus 800 to a plurality of video input modules 105. Video inputs from the Back Plane Bus 800 are assigned to a respective video input module 105. This exemplary embodiment is more costly than the options which follow. While costly, the advantage is that end users may allocate the layout of their conference to their liking. In addition to this "private layout" feature, having all of the video input modules and the video output module on the same logical unit permits a dedicated data pipe 850 that resides within the logical unit to facilitate increased throughput. The fact that this data pipe 850 is internal to a logical unit eases the physical limitation found when multiple units share the pipe. The dedicated data pipe 850 can contain paths for both the primary data channel and the side information channel.
Figure 9 illustrates an exemplary embodiment of the present invention with a single video input module and a single video output module per logical unit. In an MCU in this "slim port" configuration, a video input module 105 receives a single video input stream from the Back Plane Bus 800. After processing, the video input stream is sent to the common interface 950, where it may be picked up by another video output module for processing. The video output module 110 receives multiple video input streams from the common interface 950 for compilation in the editor and output to the Back Plane Bus 800, where it will be routed to an end user codec. In this embodiment of the invention, the video output module 110 and the video input module 105 are on the same logical unit and may be dedicated to serving the input/output video needs of a single end user codec, or the video input module 105 and the video output module 110 may be logically assigned as needed. In this manner, resources may be better utilized; for example, for a video stream of an end user that is never viewed by other end users, there is no need to use a video input module resource.
Because of the reduction in digital processing achieved by the present architecture, including this reuse of video parameters, the video input modules 105 and the video output modules 110 can use microprocessors like digital signal processors (DSP's), which can be significantly more versatile and less expensive than the hardware required for prior art MCU's. Prior art MCU's that perform full, traditional decoding and encoding of video signals typically require specialized video processing chips. These specialized video processing chips are expensive, "black box" chips that are not amenable to rapid development. Their specialized nature means that they have a limited market that does not facilitate the same type of growth in speed and power as has been seen in the microprocessor and digital signal processor ("DSP") field. By reducing the computational complexity of the MCU, this invention facilitates the use of fast, rapidly evolving DSP's to implement the MCU features.
From the foregoing description, it will be appreciated that the present invention describes a method of and apparatus for performing operations on a compressed video stream. The present invention has been described in relation to particular embodiments which are intended in all respects to be illustrative rather than restrictive. Alternative embodiments will become apparent to those skilled in the art to which the present invention pertains without departing from its spirit and scope. Accordingly, the scope of the present invention is described by the appended claims and supported by the foregoing description.

Claims

  1. A multipoint control unit for facilitating communication between a plurality of endpoints, each endpoint including a video screen and a video camera, each endpoint being operative to send a compressed video input signal to the multipoint control unit and receive a compressed video output signal from said multipoint control unit, said multipoint control unit comprising: at least one video fat port, each video fat port receiving at least one compressed video input signal from at least one endpoint and sending a compressed video output signal to at least one endpoint of said plurality of endpoints, each video fat port comprising: at least one video input module, each video input module receiving a compressed video input signal from one endpoint of said plurality of endpoints, each video input module comprising: a generalized decoder for decoding the compressed video input signal from said endpoint for generating a primary video data stream; a video output module, the output module receiving at least one of said primary video data streams, the output module comprising: a rate control unit; a generalized encoder, in communication with said rate control unit and operative to receive each of said primary data streams from at least one of said input modules and encode said primary data streams into a compressed video output stream for transmission to at least one endpoint of said plurality of endpoints; and means to route said primary data stream from at least one input module to the output module; whereby the use of said fat port enables sharing video streams of a conference in the compressed domain on the Backplane; this increases the number of participants in a conference and eliminates fragmentation compared to a case where the video sharing is done in the spatial domain on an uncompressed video bus.
  2. A multipoint control unit for facilitating communication between a plurality of endpoints, each endpoint including a video screen and a video camera, each endpoint being operative to send a compressed video input signal to the multipoint control unit and receive a compressed video output signal from said multipoint control unit, said multipoint control unit comprising: at least one video fat port, each video fat port receiving at least one compressed video input signal from at least one endpoint and sending a compressed video output signal to at least one endpoint of said plurality of endpoints, each video fat port comprising: at least one video input module, each video input module receiving a compressed video input signal from one endpoint of said plurality of endpoints, each video input module comprising: a generalized decoder for decoding the compressed video input signal from said endpoint for generating a primary video data stream; a video output module, the output module receiving at least one of said primary video data streams, the output module comprising: a rate control unit; a generalized encoder, in communication with said rate control unit and operative to receive each of said primary data streams from at least one of said input modules and encode said primary data streams into a compressed video output stream for transmission to at least one endpoint of said plurality of endpoints; and means to route said primary data stream from at least one input module to the output module; whereby the use of said fat port enables sharing video streams of a conference in the compressed domain on the Backplane; this increases the number of participants in a conference and eliminates fragmentation compared to a case where the video sharing is done in the spatial domain on an uncompressed video bus; and wherein the compressed output stream is a transcoded video output signal received from an endpoint, or a combination of two or more video output signals received from
two or more endpoints or both.
  3. An apparatus for manipulating compressed digital video forming manipulated compressed digital video, the manipulated compressed digital video being a manipulation of data from at least one of a plurality of compressed digital video sources and destinations, the apparatus comprising: at least one video input module, each video input module of the at least one video input module being operative to receive a compressed video input signal that belongs to one of the compressed digital video sources depending on the required manipulation, to decode the compressed video input signal for generating a decoded video data stream and to transfer the decoded video data stream to a common interface; at least one video output module, each video output module of the at least one video output module being operative to grab the decoded video data stream from the common interface, to combine the decoded video data stream, to encode the decoded video data stream that was combined into a compressed video output stream, and to transfer the compressed video output stream to at least one destination of the plurality of destinations; and a common interface forming a temporary logical connection for routing the decoded video data stream from at least one input module to at least one output module; wherein there is no permanent logical relation or connection between the at least one video input module and the at least one video output module, and the apparatus has a configuration in which the temporary logical connection depends on the current needs of a current manipulation, whereby use of the configuration improves resources allocation of the apparatus.
  4. An apparatus for manipulating compressed digital video forming manipulated compressed digital video, the manipulated compressed digital video being a manipulation of data from at least one of a plurality of compressed digital video sources and destinations, the apparatus comprising: at least one video input module, each video input module of the at least one video input module being operative to receive a compressed video input signal that belongs to one of the compressed digital video sources depending on the required manipulation, to decode the compressed video input signal for generating a decoded video data stream and to transfer the decoded video data stream to a common interface; at least one video output module, each video output module of the at least one video output module being operative to grab the decoded video data stream from the common interface, to combine the decoded video data stream, to encode the decoded video data stream that was combined into a compressed video output stream, and to transfer the compressed video output stream to at least one destination of the plurality of destinations; and a common interface forming a temporary logical connection for routing the decoded video data stream from at least one input module to at least one output module;
    wherein there is no permanent logical relation or connection between the at least one video input module and the at least one video output module, and the apparatus has a configuration in which the temporary logical connection depends on the current needs of a current manipulation, whereby use of the configuration improves resources allocation of the apparatus.
  5. An apparatus for manipulating compressed digital video forming manipulated compressed digital video, the manipulated compressed digital video being a manipulation of data from at least one of a plurality of compressed digital video sources and destinations, the apparatus comprising: at least one video input module, each video input module of the at least one video input module being operative to receive a compressed video input signal that belongs to one of the compressed digital video sources depending on the required manipulation, to decode the compressed video input signal for generating a decoded video data stream and to transfer the decoded video data stream to a common interface; at least one video output module, each video output module of the at least one video output module being operative to grab the decoded video data stream from the common interface, to encode the decoded video data stream into a compressed video output stream, and to transfer the compressed video output stream to at least one destination of the plurality of destinations; and a common interface forming a non-dedicated connection for routing the decoded video data stream from at least one video input module to at least one video output module;
    wherein there is no dedicated logical relation or connection between the at least one video input module and the at least one video output module, and the apparatus has a configuration in which the non-dedicated logical connection depends on the current needs of a current manipulation, whereby use of the configuration improves resources allocation of the apparatus.
  6. The apparatus of claim 5, wherein the at least one video output module composes the decoded video output stream prior to encoding.
  7. A multipoint control unit for facilitating communication between a plurality of endpoints, each respective endpoint sending a compressed video output signal and receiving a compressed video input signal, comprising: a plurality of video input modules, each video input module receiving a respective video output signal from a respective endpoint, each video input module comprising: a generalized decoder for reading the respective video output signal and for generating a respective primary data stream comprising video information and a respective secondary data stream comprising side information; and a video output module comprising: a rate control unit operable to read each of the respective secondary data streams, pre-process the respective secondary data stream, and control a generalized encoder; and the generalized encoder, in communication with the rate control unit and operable to receive each of the respective primary data streams from each respective video input module, and encode the respective primary data stream into a compressed video output stream for transmission to an endpoint.
  8. A multipoint control unit for facilitating communication between a plurality of endpoints, each of said plurality of endpoints including a video screen and a video camera, each of said plurality of endpoints being operative to send a compressed video input signal to said multipoint control unit and receive a compressed video output signal from said multipoint control unit, the multipoint control unit comprising: at least one video input module, for receiving a compressed video input signal from at least one endpoint of said plurality of endpoints, the video input module comprising: a generalized decoder operative to decode the compressed video input signal and generate a primary video data stream, the generalized decoder comprising: a data processing unit operative to process said compressed video input signal and said primary video data stream to generate a secondary data stream, said secondary data stream being associated with said primary video stream; and at least one video output module operative to receive at least one of said primary video data stream and said secondary data stream, the output module comprising: a rate control unit; and a generalized encoder, in communication with said rate control unit and operative to receive said primary data stream from said at least one input module and encode said primary data stream into a compressed video output stream for transmission to at least one endpoint of said plurality of endpoints; means to route said primary data stream from at least one input module to at least one output module; and means to route said secondary data stream from at least one input module to the at least one output module; whereby the use of said secondary data stream by the output module improves the speed of encoding and the quality of the compressed video output signal.
    9. The multipoint control unit of claim 8, wherein the association between the secondary data stream and the primary video data stream is that the secondary data stream includes side information.
    10. The multipoint control unit of claim 8, wherein the compressed video input signal includes at least one type of information selected from a group consisting of: frame type, resolution, motion vectors, filter usage indication, DCT coefficients and quantizer values.
    11. The multipoint control unit of claim 9 wherein said side information includes at least one type of information selected from a group consisting of: frame type, resolution, motion vectors, filter usage indication, quantizer identifications, coded/uncoded decisions, the amount of information within each macroblock, image segmentation indication, scene cut off indication, camera zoom identification, camera pan identification, camera movements identification and statistical information.
12. The multipoint control unit of claim 8, wherein said rate control unit comprises: means to read said secondary data stream; means to process said secondary data stream; and means to control a generalized encoder based upon said processed secondary data stream.
13. The multipoint control unit of claim 12, wherein said rate control unit comprises: means to read feedback data from a generalized encoder; means to process said secondary data stream with said feedback data; and means to control said generalized encoder based upon said processed secondary data stream and said feedback.
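Outside the claim language, the rate control described in claims 12 and 13 can be pictured as a small feedback loop: the controller reads the side-information (secondary) stream, combines it with bit-count feedback from the generalized encoder, and steers the encoder's quantizer. The sketch below is an illustrative assumption only; the class names (`SideInfo`, `RateControl`), the quantizer range and the simple proportional update rule are not taken from the patent.

```python
# Illustrative sketch of a feedback-driven rate control unit (claims 12-13).
# All names and the update rule are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class SideInfo:
    """One item of the secondary data stream: per-frame side information."""
    frame_type: str          # e.g. "I" or "P"
    macroblock_bits: list    # amount of information within each macroblock

class RateControl:
    def __init__(self, target_bits_per_frame: int, quantizer: int = 16):
        self.target = target_bits_per_frame
        self.quantizer = quantizer   # value handed to the generalized encoder

    def update(self, side_info: SideInfo, encoder_feedback_bits: int) -> int:
        """Process the secondary data stream with encoder feedback (claim 13)
        and return the quantizer to use for the next frame."""
        # Estimate scene complexity from the side information.
        complexity = sum(side_info.macroblock_bits)
        # Feedback: how far the last frame overshot or undershot the budget.
        error = encoder_feedback_bits - self.target
        # Crude proportional control: coarser coding when over budget or the
        # scene is complex, finer coding when under budget.
        if error > 0 or complexity > 4 * self.target:
            self.quantizer = min(31, self.quantizer + 1)
        elif error < 0:
            self.quantizer = max(1, self.quantizer - 1)
        return self.quantizer

rc = RateControl(target_bits_per_frame=20000)
q = rc.update(SideInfo("P", [500] * 30), encoder_feedback_bits=26000)  # over budget
```

Here the last frame used 26000 bits against a 20000-bit budget, so the controller raises the quantizer by one step. A real controller would of course use a more refined model, but the structure (read side info, read feedback, control the encoder) mirrors the claim.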
14. The multipoint control unit of claim 8, wherein said means to route said primary data stream includes a common interface selected from a group consisting of: shared memory, ATM bus, TDM bus, switching and direct connection.
15. The multipoint control unit of claim 8, wherein said means to route the secondary data stream includes a common interface selected from a group consisting of: shared memory, ATM bus, TDM bus, switching and direct connection.
16. The multipoint control unit of claim 8, wherein said primary data stream includes information in the DCT domain.
17. The multipoint control unit of claim 8, wherein said primary data stream includes information in a spatial domain.
18. The multipoint control unit of claim 8, wherein said video output module receives at least one of said primary video data streams, its associated secondary data stream, and control information from an external device.
19. The multipoint control unit of claim 18, wherein said rate control unit of said video output module comprises: means to read said secondary data stream; means to read said control information; means to process said secondary data stream; means to process the control information; and means to control said generalized encoder based upon the processed secondary data stream and the processed control information.
20. The multipoint control unit of claim 19, wherein said rate control unit of said video output module comprises: means to read feedback data from a generalized encoder; means to process said secondary data stream with said control information and said feedback; and means to control said generalized encoder based upon the results.
21. The multipoint control unit of claim 18, wherein the control information includes at least one type of information selected from a group consisting of: region of interest indication, screen layout requirements, user quality preferences and special effects.
22. The multipoint control unit of claim 18, wherein the control information is bi-directional information.
23. The multipoint control unit of claim 8, wherein said video output module receives said primary video data stream and said secondary data stream, and said rate control unit of said video output module comprises: means to read said secondary data stream; means to read data on the actual amount of bits used after variable length coding; means to process the respective secondary data stream with the variable length coding information; and means to control said generalized encoder based on said processed information, whereby the use of said variable length coding information and the secondary data stream by the generalized encoder improves the speed of encoding and the quality of the compressed video output signal by improving the output bits allocation.
24. The multipoint control unit of claim 23, wherein said video output module receives said primary video data stream and said secondary data stream, and said rate control unit of said video output module comprises: means to read feedback data from a generalized encoder; means to process the respective secondary data stream with the variable length coding information and said feedback data; and means to control said generalized encoder based on said processed information.
25. A multipoint control unit for facilitating communication between a plurality of endpoints, each endpoint including a video screen and a video camera, each endpoint being operative to send a compressed video input signal to the multipoint control unit and receive a compressed video output signal from said multipoint control unit, the multipoint control unit comprising: at least one video fat port, each video fat port receiving at least one compressed video input signal from at least one endpoint and sending a compressed video output signal to at least one endpoint of said plurality of endpoints, each video fat port comprising: at least one video input module, each video input module receiving a compressed video input signal from one endpoint of said plurality of endpoints, each video input module comprising: a generalized decoder for decoding the compressed video input signal from said endpoint for generating a primary video data stream, the generalized decoder comprising: a data processing unit for processing said compressed video input signal and said primary video data stream for generating a secondary data stream associated with said primary video data stream comprising side information; a video output module, the video output module receiving at least one of said primary video data streams and said secondary data streams, the output module comprising: a rate control unit; and a generalized encoder, in communication with said rate control unit and operative to receive each of said primary data streams from at least one of said input modules and encode said primary data streams into a compressed video output stream for transmission to at least one endpoint of said plurality of endpoints; and means to route said primary data stream from at least one input module to the output module; and means to route said secondary data stream from at least one input module to the output module, whereby the use of said fat port enables sharing the video streams of a conference in the compressed domain on the backplane, which increases the number of participants in a conference and eliminates fragmentation, in comparison to a case where the video sharing is done in the spatial domain on an uncompressed video bus.
26. The multipoint control unit of claim 25, wherein the association between the secondary data stream and the primary video data stream is that the secondary data stream includes side information.
27. The multipoint control unit of claim 25, wherein the compressed video input signal includes at least one type of information selected from a group consisting of: frame type, resolution, motion vectors, filter usage indication, DCT coefficients and quantizer values.
28. The multipoint control unit of claim 26, wherein said side information includes at least one type of information selected from a group consisting of: frame type, resolution, motion vectors, filter usage indication, quantizer identifications, coded/uncoded decisions, the amount of information within each macroblock, image segmentation indication, scene cut off indication, camera zoom identification, camera pan identification, camera movements identification and statistical information.
29. The multipoint control unit of claim 25, wherein said rate control unit comprises: means to read each of the respective secondary data streams; means to process the respective secondary data streams; and means to control said generalized encoder based upon said processed secondary data stream.
30. The multipoint control unit of claim 25, wherein said means to route said primary data stream includes a common interface selected from a group consisting of: shared memory, ATM bus, TDM bus, switching and direct connection.
31. The multipoint control unit of claim 25, wherein said means to route said secondary data stream includes a common interface selected from a group consisting of: shared memory, ATM bus, TDM bus, switching and direct connection.
32. The multipoint control unit of claim 25, wherein said primary data stream includes information in a DCT domain.
33. The multipoint control unit of claim 25, wherein said primary data stream includes information in a spatial domain.
34. The multipoint control unit of claim 25, wherein said video output module receives said at least one primary video data stream and said secondary data stream and control information from an external device.
35. The multipoint control unit of claim 34, wherein said rate control unit of said video output module comprises: means to read the respective secondary data stream; means to read said control information; means to process the respective secondary data stream; means to process the control information; and means to control the generalized encoder based upon the processed information.
36. The multipoint control unit of claim 34, wherein said rate control unit of said video output module comprises: means to read feedback data from a generalized encoder; means to process said secondary data stream with said control information and said feedback; and means to control said generalized encoder based upon the results.
37. The multipoint control unit of claim 34, wherein the control information includes at least one type of information selected from the group consisting of: region of interest indication, screen layout requirements, user quality preferences and special effects.
38. The multipoint control unit of claim 34, wherein the control information is bi-directional information.
39. The multipoint control unit of claim 25, wherein said video output module receives said primary video data stream and said secondary data stream, and said rate control unit of said video output module comprises: means to read said respective secondary data stream; means to read data on the actual amount of bits used after variable length coding; means to process the respective secondary data stream with the variable length coding information; and means to control a generalized encoder based on said processed information, whereby the use of said variable length coding information and said secondary data stream by the generalized encoder improves the speed of encoding and the quality of the compressed video output signal by improving the output bits allocation.
40. The multipoint control unit of claim 39, wherein said video output module receives said primary video data stream and said secondary data stream, and said rate control unit of said video output module comprises: means to read feedback data from a generalized encoder; means to process the respective secondary data stream with the variable length coding information and said feedback data; and means to control said generalized encoder based on said processed information.
41. A method of performing operations on a compressed video stream, the method comprising the steps of: reading encoding parameters embedded within the compressed input video stream; processing the compressed input video stream into two data streams, a primary data stream and a secondary data stream; routing said primary data stream and secondary data stream to at least one output unit; and encoding said primary data stream by using the information associated with said secondary data stream, whereby using said secondary data stream improves the speed of encoding and the quality of the compressed video output signal.
42. The method of claim 41, wherein the step of reading encoding parameters further comprises the step of reading from the compressed video stream at least one type of parameter selected from a group of parameters consisting of: DCT coefficients, frame type, resolution, motion vectors, filter usage indication and quantizer values.
43. The method of claim 41, wherein the step of reading encoding parameters further comprises the step of reading statistical information from the compressed video stream.
44. The method of claim 41, wherein the step of processing said compressed video stream further comprises the step of analyzing at least one type of indication selected from a group of indications consisting of: coded/uncoded decision, the amount of information within each macroblock, image segmentation, scene cut off, camera zoom identification, camera pan identification, camera movements identification and statistical information.
45. The method of claim 41, wherein the step of encoding further comprises the step of using at least one parameter from a group of parameters, associated with said secondary data stream, consisting of: DCT coefficients, frame type, resolution, motion vectors, filter usage indication, quantizer values, coded/uncoded decision, the amount of information within each macroblock, image segmentation indications, scene cut off indications, camera zoom identification, camera pan identification, camera movements identification, and statistical information.
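Stepping outside the claim language, the method of claims 41-45 amounts to a small pipeline: decode the compressed input into a primary stream while extracting the embedded encoding parameters into a secondary stream, route both to an output unit, and reuse the parameters when re-encoding. The sketch below models the streams as plain dictionaries; the parameter names mirror those listed in claims 42 and 44, but every function and field name here is an illustrative assumption, not the patented implementation.

```python
# Illustrative sketch of the method of claim 41: split a compressed input
# into a primary (decoded video) stream and a secondary (parameters) stream,
# then reuse the parameters when re-encoding. Names are assumptions.
def split_streams(compressed_input: dict) -> tuple:
    """Process the compressed input into two data streams (claim 41)."""
    # Encoding parameters embedded in the bitstream (claim 42).
    secondary = {k: compressed_input[k]
                 for k in ("frame_type", "motion_vectors", "quantizer")
                 if k in compressed_input}
    # Placeholder for the generalized decoder's actual pixel output.
    primary = {"pixels": f"decoded({compressed_input['payload']})"}
    return primary, secondary

def encode(primary: dict, secondary: dict) -> dict:
    """Re-encode the primary stream, reusing the side information so the
    encoder can skip a fresh motion search (the speed gain of claim 41)."""
    return {"payload": primary["pixels"],
            "reused_motion_vectors": secondary.get("motion_vectors", [])}

bitstream = {"payload": "frame0", "frame_type": "P",
             "motion_vectors": [(1, 0), (0, 2)], "quantizer": 12}
primary, secondary = split_streams(bitstream)
output = encode(primary, secondary)
```

The point of the structure is that the motion vectors and quantizer recovered during decoding flow to the output side instead of being recomputed, which is where the claimed speed and quality improvement comes from.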
46. An apparatus embodying the method of claim 41.
47. A system for performing operations on a compressed video stream, the system comprising: at least one generalized decoder for decoding the compressed video stream into a primary data stream and for reading and analyzing encoding parameters embedded within the compressed video stream and for creating a secondary data stream; at least one editor that receives the primary streams and generates a modified decoded video stream; and a generalized encoder for encoding said modified decoded video stream into a second compressed output video stream using the encoding parameters from the generalized decoders.
48. The system of claim 47, wherein the editor is further operable to scale said primary data stream.
49. The system of claim 47, wherein the editor is further operable to receive a second primary data stream.
50. The system of claim 47, wherein the editor is further operable to scale the second primary data stream.
51. The system of claim 47, wherein the editor is further operable to composite the first primary data stream and the second primary data stream.
52. The system of claim 47, further comprising a router for sending the primary data stream to the editor.
53. The system of claim 47, further comprising a router for sending the secondary data stream to the output unit.
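Claims 47-53 place an editor between the generalized decoder and encoder that can scale a primary stream and composite two of them (as in a continuous-presence conference layout). The sketch below operates on frames represented as nested lists of pixel values; the 2x downscale-by-averaging and the side-by-side composition are illustrative assumptions about what "scale" and "composite" mean here, not the claimed operations themselves.

```python
# Illustrative editor operations from claims 48-51: scale a decoded frame
# and composite two primary streams. Both operations are assumptions chosen
# for illustration; frames are nested lists of pixel values.
def scale_half(frame):
    """Downscale a frame (list of pixel rows) by 2 in each dimension,
    averaging each 2x2 block -- one plausible 'scale' operation (claim 48)."""
    return [[(frame[y][x] + frame[y][x + 1]
              + frame[y + 1][x] + frame[y + 1][x + 1]) // 4
             for x in range(0, len(frame[0]) - 1, 2)]
            for y in range(0, len(frame) - 1, 2)]

def composite_side_by_side(left, right):
    """Composite two primary streams into one output frame (claim 51)."""
    return [lrow + rrow for lrow, rrow in zip(left, right)]

a = [[10, 10], [10, 10]]          # 2x2 frame from the first primary stream
b = [[30, 30], [30, 30]]          # 2x2 frame from the second primary stream
mixed = composite_side_by_side(scale_half(a), scale_half(b))
```

Scaling before compositing is what lets several participants share one output frame; the routers of claims 52-53 would deliver the primary frames to this editor and the secondary streams onward to the output unit.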
54. An apparatus for manipulating compressed digital video information to form manipulated compressed video information, the manipulated compressed video information being a manipulation of data from at least one of a plurality of compressed digital video sources, the apparatus comprising: at least one video input module for receiving compressed video input data from at least one source of the plurality of compressed digital video sources, the at least one video input module comprising a generalized decoder operative to decode the compressed video input data, generate a primary video data stream, and process the compressed video input data and the primary video data stream to generate a secondary data stream; and at least one video output module for receiving the primary video data stream and the secondary data stream from the at least one video input module, and being operative to encode the primary video data stream with references to the secondary data stream to form manipulated compressed video output data, whereby the use of the secondary data stream by the at least one video output module improves a speed of encoding and the manipulated compressed video output data's quality.
55. The apparatus of claim 54, wherein the video output module comprises: a rate control unit; and a generalized encoder, in communication with the rate control unit and operative to receive the primary video data stream, having primary video data, from the at least one video input module and encode the primary video data into the manipulated compressed video output data.
56. The apparatus of claim 55, wherein the rate control unit comprises: means to read the secondary data stream; means to process the secondary data stream; and means to control the generalized encoder based upon results of processing the secondary data stream.
57. The apparatus of claim 55, wherein the rate control unit comprises: means to read feedback data from the generalized encoder; means to process the secondary data stream with the feedback data; and means to control the generalized encoder based upon results of processing the secondary data stream with the feedback data.
58. The apparatus of claim 57, wherein the secondary data stream comprises side information which further comprises at least one type of information selected from a group consisting of: frame type, resolution, motion vectors, filter usage indication, quantizer identifications, coded/uncoded decisions, the amount of information within each macroblock, image segmentation indication, scene cut off indication, camera zoom identification, camera pan identification, camera movements identification, and statistical information.
59. The apparatus of claim 55, wherein the secondary data stream is associated with a primary data stream to form an associated secondary data stream, and the at least one video output module receives the primary video data stream, the associated secondary data stream and control information from an external device.
60. The apparatus of claim 59, wherein the rate control unit of the video output module comprises: means to read the secondary data stream; means to read the control information; means to process the secondary data stream; means to process the control information; and means to control the generalized encoder based upon results of processing the secondary data stream and results of processing the control information.
61. The apparatus of claim 60, wherein the rate control unit of the at least one video output module comprises: means to read feedback data from the generalized encoder; means to process the secondary data stream with the control information and the feedback data; and means to control the generalized encoder based upon results of processing the secondary data stream with the control information and the feedback data.
62. The apparatus of claim 59, wherein the control information includes at least one type of information selected from a group consisting of: region of interest indications, screen layout requirements, user quality preferences, and special effects.
    1 6 3. The apparatus of claim 5 5,wherein the at least one video output 2 module receives the primary video data stream and the secondary data stream, and the 3 rate control unit of the at least one video output module comprises: 4 means to read the secondary data stream; means to read data related to how many bits are used after variable length 6 coding; 7 means to process the secondary data stream with variable length coding 8 information; and 9 means to control the generalized encoder based on results of processing the variable length coding information, whereby the use of the variable length 11 coding information and the secondary data stream by the generalized encoder 12 improves a speed of encoding and the compressed video output signal's quality by 13 improving an output bit allocation.
64. The apparatus of claim 63, wherein the at least one video output module receives the primary video data stream and the secondary data stream, and the rate control unit of the at least one video output module comprises: means to read feedback data from the generalized encoder; means to process the secondary data stream with the variable length coding information and the feedback data; and means to control the generalized encoder based on results of processing the secondary data stream with the variable length coding information and the feedback data.
65. The apparatus of claim 54, further comprising: means to route the primary video data from the at least one video input module to the at least one video output module; and means to route the secondary data stream from the at least one video input module to the at least one video output module.
66. The apparatus of claim 65, wherein the means to route the primary video data stream includes a common interface selected from a group consisting of: shared memory, an ATM bus, a TDM bus, switching, and a direct connection.
67. The apparatus of claim 65, wherein the means to route the secondary data stream includes a common interface selected from a group consisting of: shared memory, an ATM bus, a TDM bus, switching, and a direct connection.
68. The apparatus of claim 54, wherein the manipulation of the compressed video input data includes at least one type of manipulation selected from a group consisting of: transcoding and compositing.
69. The apparatus of claim 54, wherein the secondary data stream is associated with the primary video data stream in that the secondary data stream includes side information.
70. The apparatus of claim 54, wherein the compressed video input data includes at least one type of information selected from a group consisting of: frame type, resolution, motion vectors, filter indication, DCT coefficients, and quantizer values.
71. The apparatus of claim 54, wherein the primary video data stream includes information in a DCT domain.
72. The apparatus of claim 54, wherein the primary video data stream includes information in a spatial domain.
73. A compressed video combiner unit for generating a compressed digital video signal, which is a composition of a plurality of compressed digital video sources, the compressed video combiner unit comprising: at least one video input module for receiving compressed video input data from at least one source of the plurality of compressed digital video sources, the at least one video input module further comprising a generalized decoder operative to decode the compressed video input data and generate a primary video data stream, the generalized decoder further comprising a data processing unit operative to process the compressed video input data and the primary video data stream to generate a secondary data stream, the secondary data stream having an association with the primary video stream forming associated secondary data; at least one video output module operative to receive at least one of the primary video data stream and the secondary data stream, the at least one video output module further comprising a rate control unit, and a generalized encoder, in communication with the rate control unit and operative to receive the primary video data from the at least one video input module and encode the primary video data into compressed video output data; means to route the primary video data from the at least one video input module to the at least one video output module; and means to route the secondary data stream from the at least one video input module to the at least one video output module; whereby the use of the secondary data stream by the at least one video output module improves a speed of encoding and the compressed video output data's quality.
74. The compressed video combiner unit of claim 73, wherein the side information includes at least one type of information selected from a group consisting of: frame type, resolution, motion vectors, filter usage indication, quantizer identifications, coded/uncoded decisions, an amount of information within each macroblock, image segmentation indication, scene cut off indication, camera zoom identification, camera pan identification, camera movements identification, and statistical information.
75. The compressed video combiner unit of claim 73, wherein the compressed video input data includes at least one type of information selected from a group consisting of: frame type, resolution, motion vectors, filter indication, DCT coefficients, and quantizer values.
76. The compressed video combiner unit of claim 73, wherein the rate control unit comprises: means to read the secondary data stream; means to process the secondary data stream; and means to control the generalized encoder based upon results of processing the secondary data stream.
77. The compressed video combiner unit of claim 73, wherein the rate control unit comprises: means to read feedback data from the generalized encoder; means to process the secondary data stream with the feedback data; and means to control the generalized encoder based upon results of processing the secondary data stream with the feedback data.
78. The compressed video combiner unit of claim 73, wherein the means to route the primary video data stream includes a common interface selected from a group consisting of: shared memory, an ATM bus, a TDM bus, switching, and a direct connection.
79. The compressed video combiner unit of claim 73, wherein the means to route the secondary data stream includes a common interface selected from a group consisting of: shared memory, an ATM bus, a TDM bus, switching, and a direct connection.
80. The compressed video combiner unit of claim 73, wherein the primary video data stream includes information in a DCT domain.
81. The compressed video combiner unit of claim 73, wherein the primary video data stream includes information in a spatial domain.
82. The compressed video combiner unit of claim 73, wherein the video output module receives at least one of the primary video data streams, the associated secondary data stream, and control information from an external device.
83. The compressed video combiner unit of claim 73, wherein the rate control unit of the video output module comprises: means to read the secondary data stream; means to read the control information; means to process the secondary data stream; means to process the control information; and means to control the generalized encoder based upon results of processing the secondary data stream with results of processing the control information.
84. The compressed video combiner unit of claim 83, wherein the rate control unit of the video output module comprises: means to read feedback data from the generalized encoder; means to process the secondary data stream with the control information and the feedback data; and means to control the generalized encoder based upon results of processing the secondary data stream with the control information and the feedback data.
85. The compressed video combiner unit of claim 82, wherein the control information includes at least one type of information selected from a group consisting of: a region of interest indication, screen layout requirements, user quality preferences, and special effects.
86. The compressed video combiner unit of claim 82, wherein the control information is bi-directional information.
87. The compressed video combiner unit of claim 73, wherein the at least one video output module receives the primary video data stream and the secondary data stream, and the rate control unit of the at least one video output module comprises: means to read the secondary data stream; means to read data related to how many bits are in use after variable length coding; means to process the secondary data stream with the variable length coding information; and means to control the generalized encoder based on results of processing the secondary data stream with the variable length coding information, whereby the use of the variable length coding information and the secondary data stream by the generalized encoder improves a speed of encoding and the compressed video output signal's quality by improving an output bit allocation.
88. The compressed video combiner unit of claim 87, wherein the at least one video output module receives the primary video data stream and the secondary data stream, and the rate control unit of the at least one video output module comprises: means to read feedback data from the generalized encoder; means to process the secondary data stream with the variable length coding information and the feedback data; and means to control the generalized encoder based on results of processing the secondary data stream with the variable length coding information and the feedback data.
GB0408547A 2000-01-13 2001-01-09 Method and system for compressed video processing Expired - Fee Related GB2397964B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/506,861 US6300973B1 (en) 2000-01-13 2000-01-13 Method and system for multimedia communication control
GB0122247A GB2363687B (en) 2000-01-13 2001-01-09 Method and system for compressed video processing

Publications (3)

Publication Number Publication Date
GB0408547D0 GB0408547D0 (en) 2004-05-19
GB2397964A true GB2397964A (en) 2004-08-04
GB2397964B GB2397964B (en) 2004-09-22

Family

ID=32683967

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0408547A Expired - Fee Related GB2397964B (en) 2000-01-13 2001-01-09 Method and system for compressed video processing

Country Status (1)

Country Link
GB (1) GB2397964B (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998023075A2 (en) * 1996-11-22 1998-05-28 Unisys Corporation Multimedia teleconferencing bridge
US5784561A (en) * 1996-07-01 1998-07-21 At&T Corp. On-demand video conference method and apparatus


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008031039A2 (en) * 2006-09-08 2008-03-13 Taylor Nelson Sofres Plc Audio/video recording and encoding
WO2008031039A3 (en) * 2006-09-08 2008-07-24 Taylor Nelson Sofres Plc Audio/video recording and encoding
GB2443969A (en) * 2006-11-20 2008-05-21 Codian Ltd Method of transmitting scaled images using video processing hardware architecture
GB2443967A (en) * 2006-11-20 2008-05-21 Codian Ltd Video processing hardware architecture for video conferencing
GB2443967B (en) * 2006-11-20 2009-02-04 Codian Ltd Hardware architecture for video conferencing
GB2443969B (en) * 2006-11-20 2009-02-25 Codian Ltd Hardware architecture for video conferencing
US7889226B2 (en) 2006-11-20 2011-02-15 Codian Ltd Hardware architecture for video conferencing
US8169464B2 (en) 2006-11-20 2012-05-01 Codian Ltd Hardware architecture for video conferencing
US8532100B2 (en) 2010-10-19 2013-09-10 Cisco Technology, Inc. System and method for data exchange in a heterogeneous multiprocessor system

Also Published As

Publication number Publication date
GB0408547D0 (en) 2004-05-19
GB2397964B (en) 2004-09-22

Similar Documents

Publication Publication Date Title
US6496216B2 (en) Method and system for multimedia communication control
KR100363986B1 (en) Bit rate reduction device and motion vector processing device used therein
Bjork et al. Transcoder architectures for video coding
KR100311943B1 (en) Change Compression Area Video Synthesis System and Central Rate Control Processor
EP0691054B1 (en) Efficient transcoding device and method
US5870146A (en) Device and method for digital video transcoding
US5687095A (en) Video transmission rate matching for multimedia communication systems
US20070285500A1 (en) Method and Apparatus for Video Mixing
US20080212682A1 (en) Reduced resolution video transcoding with greatly reduced complexity
US20050036550A1 (en) Encoding and transmitting video information streams with optimal utilization of a constrained bit-rate channel
Tan et al. A frequency scalable coding scheme employing pyramid and subband techniques
KR20050031460A (en) Method and apparatus for performing multiple description motion compensation using hybrid predictive codes
EP1296520B1 (en) Method and system for multimedia video processing
GB2397964A (en) Optimising resource allocation in a multipoint communication control unit
EP0971542A2 (en) Readjustment of bit rates when switching between compressed video streams
Okubo Video codec standardization in CCITT study group XV
IL145363A (en) Method and system for multimedia communication control
Huitema et al. Software codecs and work station video conferences
KR100386194B1 (en) Apparatus and method for image improvement by DC value additional compensation of quantization error in image compression
Yashima et al. An extrapolative-interpolative prediction coding method for HDTV signals
Chiariglione et al. A variable resolution video codec for low bit-rate applications
Dvorkovich et al. On the implementation of software-only videoconferencing codec on PC
Kalva et al. Reduced resolution MPEG-2 to H.264 transcoder
Jain Digital Coding of Video-Teleconference Signals
Netravali et al. CCITT H.261 (p*64) Videoconferencing Coding Standards

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20150109