AU717526B2 - Media pipeline with multichannel video processing and playback - Google Patents
Description
Regulation 3.2
AUSTRALIA
Patents Act 1990 COMPLETE SPECIFICATION FOR A STANDARD PATENT
(ORIGINAL)
Name of Applicant: Avid Technology, Inc., of Metropolitan Technology Park, One Park West, Tewksbury, Massachusetts 01876, United States of America

Actual Inventors: KURTZ, Jeffrey; CACCIATORE, Ray; ZAWOJSKI, Peter; PETERS, Eric C.; WALSH, John, Jr.

Address for Service: DAVIES COLLISON CAVE, Patent Attorneys, of 1 Little Collins Street, Melbourne, Victoria 3000, Australia

Invention Title: "Media pipeline with multichannel video processing and playback"

The following statement is a full description of this invention, including the best method of performing it known to us:

MEDIA PIPELINE WITH MULTICHANNEL VIDEO PROCESSING AND PLAYBACK

Technology for manipulating digital video has progressed to the point where it can be readily processed and handled on computers. For example, the Avid/1 Media Composer, available from Avid Technology, Inc. of Tewksbury, Massachusetts, is a system wherein digital video can be readily captured, edited, and displayed for various purposes, such as broadcast television and film and video program post-production.
The Avid/1 Media Composer uses a media pipeline to provide real-time digital video output on a computer display. This media pipeline 30 is shown in Fig. 1 and is described in more detail in U.S. Patent No. 5,045,940, issued September 3, 1991. In this media pipeline 30, a permanent storage 40 stores sequences of digital still images which represent digital video and are played back at a rate which provides the appearance of video. The sequences of digital still images do not include any frame synchronization or other type of timing information which is typically found in television signals. The still images also typically are stored in compressed form. The stored sequences are accessed and placed in a data buffer 42, from where they are provided to a compression/decompression system 44. The output of the compression/decompression system 44 is applied to a frame buffer 46, which converts the still image to a typical video signal which is then applied to an input/output unit 48.
Each of the systems 40, 42, 44, 46 and 48 in this media pipeline 30 operates bi-directionally. That is, the output process discussed above can be reversed, and video signals can be input via input/output unit 48 to the frame buffer 46, where they are converted to a sequence of digital still images. The images in the sequence are compressed by compression/decompression system 44, stored in data buffer 42 and then transferred to the permanent storage 40.

Although the media pipeline provides many advantages for digital video, including enabling broadcast and editing from the stored digital images in a computer system, this media pipeline is not able to provide real-time digital video effects, including complex arbitrary three-dimensional effects; simpler, two-dimensional effects such as resizing, x-y translation, rotating and layering (an appearance of picture-in-picture); and finally simple effects such as dissolves, wipes, fades and luma and/or chroma keying. In order to perceive such effects on the computer, the effect generally must first be generated (not in real time), then digitized and stored if generated on tape, and finally played back.
As additional background, most digital video effect systems operate on and mix analog video signals together. Such a system is shown, for example, in U.S. Patent No. 4,698,682 to Brian Astle.
The present invention provides a method of applying a digital video effect to a first sequence of digital still images and a second sequence of digital still images during playback to produce a third sequence of digital still images, the first and second sequences stored in video data files as part of a video file system, the first sequence including at least a first digital still image and the second sequence including at least a second digital still image, each digital image of the first and second sequence including a plurality of pixels, the method comprising the steps of:
controlling the transfer of the first and second sequences from the video data files to a first and a second data buffer, respectively;
receiving effect parameters defining the digital video effect to be applied to a combination of the first and second sequence of digital still images;
for each image in the third sequence, transferring a first pixel of a first image from the first buffer and a second pixel of a second image from the second buffer to a blender in accordance with the effect parameters; and
blending the first pixel and the second pixel in accordance with the effect parameters to generate a third pixel of a third image that is part of the third sequence of digital still images.
The present invention further provides a system for applying a digital video effect to a first sequence of digital still images and a second sequence of digital still images during playback to produce a third sequence of digital still images, the first and second sequences stored in video data files as part of a video file system, the first sequence including at least a first digital still image and the second sequence including at least a second digital still image, each digital image of the first and second sequence including a plurality of pixels, the system comprising:
means for controlling the transfer of the first and second sequences from the video data files to a first and a second data buffer, respectively;
means for receiving effect parameters defining the digital video effect to be applied to a combination of the first and second sequence of digital still images;
means for transferring, for each image in the third sequence, a first pixel of the first image from the first buffer and a second pixel of the second image from the second buffer to a blender in accordance with the effect parameters; and
means for blending, for each image in the third sequence, the first pixel and the second pixel in accordance with the effect parameters to generate a third pixel of a third image that is part of the third sequence of digital still images.
The present invention further provides a system for applying a digital video effect to a first sequence of digital still images and a second sequence of digital still images during playback to produce a third sequence of digital still images, the first and second sequences stored in video data files as part of a video file system, the first sequence including at least a first digital still image and the second sequence including at least a second digital still image, each digital image of the first and second sequence including a plurality of pixels, the system comprising:
a first data buffer and a second data buffer, each to store a sequence of digital still images;
a first controller to control the transfer of the first and second sequences from the video data files to the first and second data buffers, respectively;
a second controller to receive input effect parameters defining the digital video effect to be applied to a combination of the first and second sequence of digital still images, the second controller to control the transfer, for each image of the third sequence, of a first pixel of the first image from the first buffer and a second pixel of the second image from the second buffer to a blender in accordance with the effect parameters, and to output a blend signal to the blender that indicates a blend to be performed based on the effect parameters; and
a blender to receive the first pixel from the first buffer, the second pixel from the second buffer, and the blend signal, the blender to blend the first pixel and the second pixel in accordance with the blend signal to generate as output a third pixel of a third digital still image that is part of the third sequence.
The present invention further provides a method of generating a third sequence of digital still images from a first sequence of digital still images and a second sequence of digital still images during playback, wherein the first and second sequences are stored in data files in a file system, each digital image of the first and second sequence including a plurality of pixels, the method comprising the steps of:
controlling the transfer of the first and second sequences from the data files to a first and a second data buffer, respectively;
receiving a transition signal defining a transition from the first sequence to the second sequence;
controlling reading of the first and second sequences from the first and second buffers, respectively; and
generating the third sequence of digital still images from the read first sequence and the read second sequence in accordance with the defined transition.
The present invention also provides a system for generating a third sequence of digital still images from a first sequence of digital still images and a second sequence of digital still images during playback, wherein the first and second sequences are stored in data files in a file system, each digital image of the first and second sequence including a plurality of pixels, the system comprising:
means for controlling the transfer of the first and second sequences from the data files to a first and a second data buffer, respectively;
means for receiving a transition signal defining a transition from the first sequence to the second sequence;
means for controlling reading of the first and second sequences from the first and second buffers, respectively; and
means for generating the third sequence of digital still images from the read first sequence and the read second sequence in accordance with the defined transition.
The present invention further provides a system for generating a third sequence of digital still images from a first sequence of digital still images and a second sequence of digital still images during playback, wherein the first and second sequences are stored in data files in a file system, each digital image of the first and second sequence including a plurality of pixels, the system comprising:
a first data buffer and a second data buffer;
a first controller to control the transfers of the first and second sequences from the data files to the first and second data buffers, respectively;
a second controller having a first input to receive a transition signal defining the transition from the first sequence to the second sequence, a first output to control a read of one or more digital still images of the first sequence and one or more digital still images of the second sequence from the first buffer and the second buffer, respectively, to the processing module in accordance with the transition signal, and a second output to output a control signal, wherein the control signal indicates the transition to be performed based on the transition signal; and
a digital video processing module for generating the third sequence of digital still images, the processing module having a first input to receive the one or more digital still images of the first sequence, a second input to receive the one or more digital still images of the second sequence, a third input to receive the control signal, and an output to provide the third sequence, the digital video processing module generating the third sequence from the one or more digital still images of the first sequence and the one or more digital still images of the second sequence in accordance with the control signal.
Embodiments of the invention improve over the prior art by providing a media pipeline with two channels for processing sequences of digital still images. A blender is provided so as to enable simple effects on these two streams of video data, such as dissolves, wipes and chroma keys. Complex arbitrary three-dimensional effects and other effects may also be provided using an external interface.
Thus, a system for processing sequences of digital still images to provide real-time digital video effects includes first and second channels for communicating first and second sequences of digital still images at a rate for simulating video. A controller directs still images to one of the first and second channels. A blender, having a first input connected to the first channel, a second input connected to the second channel, and an output, provides a combination of the first and second sequences of digital still images at a rate for simulating video.
The preferred embodiment of the invention is described in greater detail hereinafter, by way of example only, with reference to the accompanying drawings, in which:
Fig. 1 is a block diagram of a media pipeline as is used in the prior art;
Fig. 2 is a block diagram of a modified media pipeline in accordance with an embodiment of the present invention;
Fig. 3 is a more detailed block diagram of a modified compression/decompression subsystem of the media pipeline in accordance with an embodiment of the present invention;
Fig. 4 is a block diagram of a modified media pipeline in accordance with an embodiment of the present invention to provide real-time digital video effects;
Fig. 5 is a block diagram of the α generator for box wipes;
Fig. 6 is a flow chart describing the operation of a state machine for each scan line in a frame for a box wipe;
Fig. 7 is a flow chart describing the operation of a state machine for each pixel in a scan line for a box wipe; and
Fig. 8 is a diagram showing how α is determined for different regions of an image for a box wipe.
The present invention will be more completely understood through the following detailed description which should be read in conjunction with the attached drawing in which similar reference numbers indicate similar structures. All references cited herein, including pending patent applications, are hereby expressly incorporated by reference.
A media pipeline 35 with two channels of digital video for providing effects will now be described in connection with Fig. 2. The media pipeline 30 shown in Fig. 1 is modified to include a compression/decompression (CODEC) unit 58, which is a modification of compression/decompression system 44 of Fig. 1. The CODEC unit 58 has two CODEC channels 50 and 52. One is used for compression and decompression, for both recording and playback, while the other is used only for playback. The outputs of these channels are fed to a blender 54 which combines them according to the desired effect. It is not necessary to use compressed data; however, compression is preferable to reduce storage requirements. This compression/decompression unit 58 is described in more detail in British provisional specification 9307894.7, filed April 16, 1993, under U.S. foreign filing license 504287 granted April 13, 1993.
This CODEC unit 58 will now be described in more detail in connection with Fig. 3. In this figure, a control unit 60 controls two channels of coder/decoders. The modification to the media pipeline 30 is made by assigning, in the control unit 60, different sections of the compressed data buffer 42 to each channel. A sequence of digital still images is also assigned to a channel. Thus, when the sequence is read into the compressed data buffer 42, it is input to the section assigned to the channel for that sequence. Thus, reading and writing of data into the FIFOs 62 and 64 for the CODECs 66 and 68 is based on the assignment of a channel to a selected sequence of digital still images.
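The bookkeeping performed by the control unit can be sketched as follows. This is a minimal, purely illustrative model (the class and method names are hypothetical, not from the patent): each channel owns a section of the compressed data buffer, each sequence is assigned to a channel, and data for a sequence is routed to its channel's section.

```python
class ControlUnit:
    """Illustrative sketch of the channel/buffer-section assignment."""

    def __init__(self, buffer_size: int):
        # Split the compressed data buffer into one section per channel.
        half = buffer_size // 2
        self.sections = {1: (0, half), 2: (half, buffer_size)}
        self.sequence_channel = {}          # sequence id -> channel number

    def assign(self, sequence_id: str, channel: int) -> None:
        """Assign a sequence of digital still images to a channel."""
        self.sequence_channel[sequence_id] = channel

    def section_for(self, sequence_id: str) -> tuple:
        """Return the (start, end) buffer section holding this sequence."""
        return self.sections[self.sequence_channel[sequence_id]]
```

Reads and writes for a given sequence would then target only the section returned by `section_for`, which is what steers the data toward the FIFO of the correct CODEC.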
Each channel has a separate CODEC, either a first CODEC 66 or a second CODEC 68. The CODECs typically use the Joint Photographic Experts Group (JPEG) proposed standard for still image compression. Such CODECs are commercially available, such as the CL550 available from C-Cube of Milpitas, California. Each CODEC has a respective first-in, first-out (FIFO) memory element 62 or 64. The FIFO memory elements 62 and 64 feed the CODECs 66 and 68, respectively, the outputs of which are applied to field buffers 70 and 72, which are also preferably FIFO memory elements. These two channels may be blended using a blender 74 which is controlled by an addressing and alpha information unit 76, as will be described in more detail below. The blender 74 and alpha and addressing information unit 76 are preferably implemented using a field programmable gate array such as the XC3090 manufactured by Xilinx.
Alternatively, a first output sequence may be provided by output A from the FIFO 70 for CODEC 66, and a second output sequence may then be provided by the output of the blender, when no blending is performed, as the output B. Thus, FIFO 70 and blender 74 act as first and second sources of sequences of digital still images. The outputs A and B may be applied to a digital video effects system 59 as shown in Fig. 4. This embodiment is useful for providing arbitrary three-dimensional video effects, as are described in U.S. patent application entitled "Media Pipeline with Mechanism for Real-Time Addition of Digital Video Effects," filed March 18, 1994 by Harry Der et al., and assigned to Avid Technology, Inc. of Tewksbury, Massachusetts.
More complex, two-dimensional effects can also be made using techniques known in the art, including X-Y translation, rotation and scaling. An additional effects board, similar to that for the three-dimensional, arbitrary effects, can be provided so as to perform an operation on a single stream.
To provide this operation, the output A as shown in Fig. 4 is applied to such an effects generator, the output of which would be applied to the input of the blender originally designed to receive channel A. When this capability is provided, the digital effects typically produce an output using the YUV data format with four bits for each of the Y, U and V parameters (4:4:4). In contrast, the normal data format for channels A and B is 4:2:2. Thus, in this instance, the blender 74 should be designed so as to optionally process channel A in either 4:4:4 format or 4:2:2 format, according to whether such digital effects are being provided.
Provision of simpler video effects in real time, such as box wipes and chroma and luma keys, using blender 74 and alpha and addressing information unit 76, will now be described.
Blending of two streams (A and B) of video typically involves the application of the function αA + (1−α)B to the streams of video information, where α is a value which can vary from pixel to pixel in an image, and where A and B, at any given point in time, are pixels in corresponding frames in the two streams (A and B) of video. Each effect is thus applied to one frame from each of the two streams. (One normally does not perform an effect on only a fraction of a frame.) Given α at any point in time and the addresses for pixels A and B, the output image can be generated. The blender 74, which performs this operation by determining the result of the combination αA + (1−α)B, can be implemented using standard digital hardware design techniques.
Preferably, a field programmable gate array is used to implement the function (A−B)α + B.
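The two forms of the blend are algebraically identical: αA + (1−α)B = (A−B)α + B, and the second form needs only one multiplier, which is why it suits a gate-array implementation. A minimal sketch (illustrative only, operating on single pixel component values):

```python
def blend(a_pix: int, b_pix: int, alpha: float) -> int:
    """Blend one pixel of stream A with one of stream B.

    Direct form:   alpha*A + (1 - alpha)*B
    Hardware form: (A - B)*alpha + B  -- identical, but one multiply.
    """
    return round((a_pix - b_pix) * alpha + b_pix)
```

For example, `blend(x, y, 1.0)` passes stream A through, `blend(x, y, 0.0)` passes stream B through, and intermediate α values mix the two.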
The value of α applied to two pixels is dependent upon the kind of effect to be provided. For example, a dissolve uses the same α for all pixels in one frame, and α is gradually decreased for subsequent frames in the dissolve.
The pair of pixels to be combined is also dependent upon the kind of effect to be provided. The indication of which pair of pixels to use is called the addressing information. For each kind of effect to be provided, a state machine and state variables can be defined for processing one frame of output video. The two general types of effects are chroma and/or luma keys and box wipes, which include dissolves and fades.
For example, in order to implement chroma and/or luma keying, two threshold values D1 and D2 and a key point Kc are defined by the user for each parameter of a pixel. These effects are typically applied to the YUV representation of an image. Thus, an incoming image is processed by comparing the pixel Y, U and V values to the key points and threshold values defined for Y, U and V. In particular, using the parameter U as an example, the value |Kc−U| is calculated. If this value is less than D1, α is set to be the maximum possible value. If this value is greater than D2, α is set to be the minimum possible value. When the value is somewhere between D1 and D2, a value for α is determined according to this value. In one embodiment of the invention, D2−D1 is required to be some fixed number, e.g., 16. The magnitude of this fixed number represents the desired number of α values. In this embodiment, when the value of |Kc−U| is between D1 and D2, the value |Kc−U|−D1 is applied to a lookup table (stored in a random access, preferably rewritable, memory), which stores corresponding values of α to be used. The values of α may be any function of the input |Kc−U|−D1, such as a step function, a sigmoid function, a ramp or any function desired by a user. Typically, only Y, or U and V, are keyed and processed. One could apply keying to all of Y, U and V at once, and combine the resulting α values, for example, by using the function (½(αU + αV)) AND αY.
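The keying logic above can be sketched as follows for the single parameter U. This is a hedged illustration: the concrete values of D1, D2 and Kc, and the linear ramp in the lookup table, are choices made for the example, not values given in the specification (the table could equally hold a step, sigmoid, or any user-defined function).

```python
ALPHA_MAX, ALPHA_MIN = 255, 0
D1, D2 = 16, 32      # thresholds chosen for illustration; D2 - D1 = 16
KC = 128             # key point for the U parameter (illustrative)

# Lookup table indexed by |Kc - U| - D1; here a simple linear ramp.
LUT = [round(ALPHA_MAX * (1 - i / (D2 - D1 - 1))) for i in range(D2 - D1)]

def key_alpha(u: int) -> int:
    """Compute alpha for a pixel's U value against the key point."""
    d = abs(KC - u)
    if d < D1:
        return ALPHA_MAX        # well inside the key color: maximum alpha
    if d > D2:
        return ALPHA_MIN        # well outside the key color: minimum alpha
    if d == D2:
        return ALPHA_MIN
    return LUT[d - D1]          # transition region: table lookup
```

In hardware, `LUT` corresponds to the rewritable random-access memory holding the α values, so the transition shape can be changed without changing the logic.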
Box wipes with a border can also be provided. A box wipe is a transition between two streams defined by a rectangular shape. Within the rectangle, information from one channel is provided. Outside the rectangle, information from the other channel is provided. The transition region can be strictly defined by the border of the rectangle or a border color can be provided. The transition can be described as a linear ramp (defined by a ratio of the channels to each other). The transition is thus defined by the lower and upper limits of the ramp, the step size, and the duration. All of these parameters should be user definable. Also, the coordinates of the box should be programmable to provide a horizontal wipe, a vertical wipe, or some corner to corner wipe.
Typically, a blend is performed from the first channel to the border, from the border to the next channel, or among both channels. A state machine can readily be defined according to the variables defining the wipe so as to provide an output α value for each pair of pixels to be combined. There are three values used to define the final α. The αinit values define the initial αX and αY values, where αX and αY are values accumulated according to the state machine. In the simplest wipe, a dissolve, the initial values are held, not changed, throughout a whole frame. In the other box wipes, αX and αY may change, according to the desired wipe. In this process, the final α value is typically taken to be subject to a limiting function defined by αY. That is, the final α typically is αX when αX is less than αY, and typically is αY when αX is greater than αY.
A wipe is defined by two sets of parameters. The first set is parameters for the X direction in a frame; the second set is parameters for the Y direction, defining changes between scan lines in the effect. Both the X and Y parameters include four groups of four parameters, each group representing an operation, including offset, control, interval and delta information. The offset information defines where blending is to begin. In the X direction, it identifies the pixel in the scan line where the first blend begins. In the Y direction, it identifies the scan line where blending is to begin. The next information is control information, identifying whether further operations in the scan line, or in the frame, will follow. For the X parameter, this control information is represented by two bits, wherein the first bit represents whether video is swapped between the A and B channels. The other bit indicates whether another operation in the X direction will appear. After the control information is the interval over which the blend is to be performed. The interval identifies either a number of scan lines or a number of pixels within one scan line. Finally, the delta information represents an increment to be added to αX or αY for each pixel over the defined interval.
Thus, a wipe is defined by four operations in each of the X and Y directions. The first operation signifies the transition from channel A to the border; the second, from the border to the second channel; the third, from the second channel to the border; and the fourth, from the border to the first channel. If there is no border, only two operations are used, and the second operation indicates that there is no further operation to be performed, either for the scan line or for the frame.
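One possible encoding of a single operation's parameter group is sketched below. The field names and the `ramp` helper are hypothetical; the structure simply mirrors the text: each operation carries offset, two control bits, interval, and delta information, and the delta is accumulated over the interval.

```python
from dataclasses import dataclass

@dataclass
class WipeOp:
    """One (offset, control, interval, delta) parameter group."""
    offset: int     # pixel (X) or scan line (Y) where the blend begins
    swap: bool      # control bit: swap video between channels A and B
    more: bool      # control bit: another operation follows
    interval: int   # pixels or scan lines over which the blend runs
    delta: float    # increment added to alpha_x / alpha_y per step

def ramp(op: WipeOp, alpha0: float) -> list:
    """Accumulate alpha over the operation's interval, as the
    accumulator would, returning the alpha value at each step."""
    out, a = [], alpha0
    for _ in range(op.interval):
        a += op.delta
        out.append(a)
    return out
```

A linear wipe transition is then just an operation whose delta steps α from its lower limit to its upper limit across the interval.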
Given the operations defining the wipe to be performed, including the four groups of operational information for each of the X and Y parameters, a state machine can be used to determine α for each pixel in the frame. These state machines will now be described in connection with Figs. 5 through 8.
Fig. 5 is a block diagram illustrating some structures controlled by a state machine. The state machine's operation will be described in connection with the flow charts of Figs. 6 and 7. In Fig. 5, X parameter memory 82 and Y parameter memory 80 store the operations to be performed. An address pointer into each of these memories is stored in registers 84 and 86, as the X and Y address pointers. Initial X and Y delta values are also stored in registers 88 and 90. These are fed to accumulators for the X and Y values 92 and 94 via switches 96 and 98. The outputs of the accumulators 92 and 94 are fed to a compare and switch unit 100, the output of which provides the α value in a manner to be described below in connection with Fig. 8. There is also a loop counter 102 which indicates the part of the frame on which the effect is being performed; its significance will also be discussed further below in connection with Fig. 8. There are also a Y position counter 104 and an X position counter 106, which are used by the control 108, which operates in accordance with the flow charts of Figs. 6 and 7.
Figs. 6 and 7 will now be described.
Upon a horizontal reset (HRST) or vertical sync (VSYNC), as indicated at step 110, the Y accumulator 94 and Y address pointer 86 are cleared. An initial Y delta value 90 is then loaded into the accumulator 94 via switch 98 (step 112). An offset is then read from the Y parameter memory 80 into the Y position counter 104 (step 114). Control information is then read from the Y parameter memory 80 into the loop counter 102 (step 116).
When valid data is available to be processed, operations on a scan line are performed in step 118, as will be discussed below in connection with Fig. 7, and the Y position counter 104 is decremented after each scan line is processed. When the Y position counter reaches zero, the interval is read from the Y parameter memory 80 and loaded into the Y position counter 104 (step 120). A delta value is then read from the Y parameter memory into the Y accumulator 94 and is added to the current value therein (step 122). This value is added for each scan line until the Y position counter again reaches zero. Each scan line is processed in accordance with the steps described below in connection with Fig. 7. When the Y position counter 104 is zero, the control information is examined in step 124 to determine if further Y operations are to be performed. If there are further operations to be performed, processing returns to step 114. Otherwise, the system waits until another horizontal reset or vertical sync occurs.
Operations on one scan line will now be described in connection with Fig. 7. These operations begin in step 130 upon the receipt of a horizontal sync or reset. In step 132, the X accumulator 92 and X address pointer 84 are cleared, and an initial delta value at 88 is then loaded into the X accumulator 92 via switch 96. An offset is then loaded from the X parameter memory 82 in step 134 into the X position counter 106. Next, control information is read from the X parameter memory 82 into loop counter 102 in step 136. The X position counter is decremented when valid data is available, until the X position counter 106 is zero (step 138). The interval is then read from X parameter memory 82 into X position counter 106 in step 140. The delta value is then read from X parameter memory 82 into the X accumulator 92 and is added to the current value in the accumulator 92 until the X position counter is zero (step 142). The X address pointer and Y address pointer are incremented along this process to identify the correct operation. If the control information indicates that more X operations are to be performed, as determined in step 144, processing returns to step 134.
Otherwise, the system waits in step 146 until another horizontal sync or reset occurs.
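The per-scan-line behavior can be sketched in software as follows. This is an illustrative model, not the hardware: the parameter memory is modeled as a list of `(offset, more, interval, delta)` tuples, and the function returns the accumulated αX at every pixel position of the line (holding the accumulator during the offset, ramping it during the interval, and holding the final value thereafter).

```python
def scan_line_alphas(x_params, initial_delta, line_width):
    """Model of the Fig. 7 scan-line loop (illustrative names)."""
    acc = initial_delta          # X accumulator cleared, then loaded
    out, ptr, x = [], 0, 0
    more = True
    while more and x < line_width:
        offset, more, interval, delta = x_params[ptr]
        ptr += 1                 # X address pointer advances per operation
        for _ in range(offset):  # hold alpha until the offset is reached
            out.append(acc)
            x += 1
        for _ in range(interval):  # blend region: add delta each pixel
            acc += delta
            out.append(acc)
            x += 1
    while x < line_width:        # rest of the line: hold the final value
        out.append(acc)
        x += 1
    return out
```

With a single operation of offset 2, interval 3 and delta 10, the line starts flat, ramps for three pixels, then holds, which is the shape of one edge of a box wipe.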
For each pixel in the scan line, as indicated by each decrement operation on the X position counter 106, an α value is output from the compare and switch unit 100. How this operation is provided was discussed above. Further details of this α selection for more complicated box wipes will now be provided in connection with Fig. 8.
As indicated in Fig. 5, the compare and switch unit 100 receives an αY value from the Y accumulator 94, an αX value from the X accumulator 92, and the value of the loop counter 102. The loop counter indicates which quadrant in a box wipe with a border is being processed. The indication of a quadrant can readily be determined by the status of the state machine and a counter. Because the X operations and Y operations are each defined by four groups of four parameters, wherein each group identifies an operation to be performed on a portion of an image, there are precisely sixteen combinations of X parameters and Y parameters, each identifying a quadrant of the resulting image. The α value for each quadrant has a predetermined relationship with the αX and αY values. Thus, according to the loop counter 102, the appropriate selection among αX and αY can be provided.
The relationships of α to αX and αY for each quadrant will now be described in connection with Fig. 8.

Fig. 8 illustrates 25 regions of a box wipe with a border and transitions between one image, a border color, and another image. There are fifteen general types of regions to be considered in this effect, each being numbered accordingly in the upper left-hand corner of the box. For example, the first region 200 is labelled zero, as indicated at 202. A box 204 is shown in each region identifying the source of image data (where CH0 corresponds to the channel applied to input A of the blender 74 and CH1 corresponds to the channel applied to input B of the blender). The first line in box 204 indicates the α value to be provided to the blender. In regions 4, 7, 8 and 11, αX is supplied as α. In regions 1, 2, 13 and 14, αY is supplied as the α value. In regions 0, 3, 10, 12 and 15, αX is provided when it is less than αY, and αY is provided otherwise. In regions 5, 6 and 9, αY is provided when αX is greater than αY, and αX is provided otherwise.
Having now described a few embodiments of the invention, it should be apparent to those skilled in the art that the foregoing is merely illustrative and not limiting, having been presented by way of example only. Numerous modifications and other embodiments are within the scope of one of ordinary skill in the art and are contemplated as falling within the scope of the invention as defined by the appended claims and equivalents thereto.
Throughout this specification and the claims which follow, unless the context requires otherwise, the word "comprise", and variations such as "comprises" and "comprising", will be understood to imply the inclusion of a stated integer or step or group of integers or steps but not the exclusion of any other integer or step or group of integers or steps.
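Before the claims, the overall playback pipeline they recite — two FIFO buffers filled from the video data files according to the space available in each, with a blender combining pixels from the two channels — can be sketched roughly as follows. All class and function names here are hypothetical, not part of the specification:

```python
# Hypothetical sketch of the two-channel playback pipeline: a controller
# fills two FIFO buffers from the data files according to the space
# available in each, and a blender combines frame pairs on demand.
from collections import deque

class Pipeline:
    def __init__(self, capacity: int = 8):
        self.capacity = capacity
        self.buf_a = deque()   # channel A (first sequence)
        self.buf_b = deque()   # channel B (second sequence)

    def fill(self, source_a, source_b):
        """Transfer frames only while a buffer has space available."""
        while len(self.buf_a) < self.capacity and source_a:
            self.buf_a.append(source_a.pop(0))
        while len(self.buf_b) < self.capacity and source_b:
            self.buf_b.append(source_b.pop(0))

    def next_output(self, alpha: float):
        """Blend one frame pair pixel-wise: alpha*A + (1 - alpha)*B."""
        frame_a = self.buf_a.popleft()
        frame_b = self.buf_b.popleft()
        return [round(alpha * a + (1 - alpha) * b)
                for a, b in zip(frame_a, frame_b)]

p = Pipeline(capacity=2)
p.fill([[100, 100]], [[0, 200]])
assert p.next_output(0.5) == [50, 150]
```

In the claimed system the `next_output` step would be driven by demands from the video encoder rather than called directly.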
THE CLAIMS DEFINING THE INVENTION ARE AS FOLLOWS:-

1. A method of applying a digital video effect to a first sequence of digital still images and a second sequence of digital still images during playback to produce a third sequence of digital still images, the first and second sequences stored in video data files as part of a video file system, the first sequence including at least a first digital still image and the second sequence including at least a second digital still image, each digital image of the first and second sequence including a plurality of pixels, the method comprising: controlling the transfer of the first and second sequences from the video data files to a first and a second data buffer, respectively; receiving effect parameters defining the digital video effect to be applied to a combination of the first and second sequence of digital still images; for each image in the third sequence, transferring a first pixel of a first image from the first buffer and a second pixel of a second image from the second buffer to a blender in accordance with the effect parameters; and blending the first pixel and the second pixel in accordance with the effect parameters to generate a third pixel of a third image that is part of the third sequence of digital still images.

2. The method of claim 1, wherein the digital video effect is user-defined.

3. The method of claim 1, further comprising the step of: encoding the third sequence of digital still images in a motion video signal.

4. The method of claim 1, further comprising the step of: transferring the third sequence of digital still images to a video encoder to be encoded in a motion video signal, wherein the first and second sequences of digital still images are transferred from the first and second data buffers, respectively, to the blender in response to demands from the video encoder.
P:\OPER\SSB\89366-98.RES 27/1/00

5. The method of claim 1, wherein each of the first and second buffers receives and outputs data on a first-in, first-out basis.

6. The method of claim 1, wherein the video data in the files is compressed, the method further comprising the step of: decompressing the video data in the files.

7. The method of claim 1, further comprising the steps of: transferring the first sequence of digital still images to a digital video effects system operative to perform three-dimensional video effects on sequences of digital still images; and transferring the third sequence of digital still images to the digital video effects system, wherein, in accordance with the effects parameters, the second sequence is not blended with the first sequence such that the third sequence is the same as the second sequence.

8. The method of claim 1, wherein the step of controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.

9. The method of claim 1, wherein the digital video effect is applied at a real-time rate.

10. The method of claim 9, wherein the step of controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: transferring to the first and second buffers in accordance with the amount of space available in each of the buffers.

11. The method of claim 1, wherein the digital video effect is applied at a user-selectable rate.

12. The method of claim 11, wherein the digital video effect is applied at a real-time rate.

13.
The method of claim 12, wherein the step of controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.
14. The method of claim 11, wherein the step of controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.

15. The method of claim 1, further comprising the step of: determining the first pixel of the first image and the second pixel of the second image to blend based on the effect parameters.
16. The method of claim 1, further comprising the steps of: determining a blending value for blending the first and second pixel, the blending value based on the effect parameters, a position of the first pixel in the first image, and a position of the second pixel in the second image; and determining a blending function defining the blend of the first pixel and the second pixel based on the blending value.
17. The method of claim 16, wherein the blending function is defined as: αA + (1 − α)B, where α is the blending value, A is the first pixel, and B is the second pixel.
18. The method of claim 16, further comprising the step of: determining the first pixel of the first image and the second pixel of the second image to blend based on the effect parameters.
19. The method of claim 18, wherein the step of controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.

20. The method of claim 18, wherein the digital video effect is applied at a real-time rate.
21. The method of claim 20, wherein the step of controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.

22. The method of claim 18, wherein the digital video effect is applied at a user-selectable rate.
23. The method of claim 22, wherein the digital video effect is applied at a real-time rate.
24. The method of claim 23, wherein the step of controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.

25. The method of claim 22, wherein the step of controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.
26. A system for applying a digital video effect to a first sequence of digital still images and a second sequence of digital still images during playback to produce a third sequence of digital still images, the first and second sequences stored in video data files as part of a video file system, the first sequence including at least a first digital still image and the second sequence including at least a second digital still image, each digital image of the first and second sequence including a plurality of pixels, the system comprising: means for controlling the transfer of the first and second sequences from the video data files to a first and a second data buffer, respectively; means for receiving effect parameters defining the digital video effect to be applied to a combination of the first and second sequence of digital still images; means for transferring, for each image in the third sequence, a first pixel of the first image from the first buffer and a second pixel of the second image from the second buffer to a blender in accordance with the effect parameters; and means for blending, for each image in the third sequence, the first pixel and the second pixel in accordance with the effect parameters to generate a third pixel of a third image that is part of the third sequence of digital still images.
27. The system of claim 26, wherein the digital video effect is user-defined.
28. The system of claim 26, further comprising: means for encoding the third sequence of digital still images in a motion video signal.
29. The system of claim 26, further comprising: means for transferring the third sequence of digital still images to a video encoder to be encoded in a motion video signal, wherein the means for transferring transfers the first and second sequences of digital still images from the first and second data buffers, respectively, to the blender in response to demands from the video encoder.

30. The system of claim 26, wherein each of the first and second buffers receives and outputs data on a first-in, first-out basis.
31. The system of claim 26, wherein the video data files are in a compressed state, the system further comprising: means for decompressing the video data files.
32. The system of claim 26, further comprising: means for transferring the first sequence of digital still images to a digital video effects system operative to perform three-dimensional video effects on sequences of digital still images; and means for transferring the third sequence of digital still images to the digital video effects system, wherein, in accordance with the effects parameters, the second sequence is not blended with the first sequence such that the third sequence is the same as the second sequence.
33. The system of claim 26, wherein the step of controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: means for transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.
34. The system of claim 26, wherein the system is operative to apply the digital video effect at a real-time rate.

35. The system of claim 34, wherein the means for controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: means for transferring to the first and second buffers in accordance with the amount of space available in each of the buffers.
36. The system of claim 26, wherein the system is operative to apply the digital video effect at a user-selectable rate.
37. The system of claim 36, wherein the system is operative to apply the digital video effect at a real-time rate.
38. The system of claim 37, wherein the means for controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: means for transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.
39. The system of claim 36, wherein the means for controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: means for transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.
40. The system of claim 26, further comprising: means for determining the first pixel of the first image and the second pixel of the second image to blend based on the effect parameters.
41. The system of claim 26, further comprising: means for determining a blending value for blending the first and second pixel, the blending value based on the following: the effect parameters, a position of the first pixel in the first image, and a position of the second pixel in the second image; and means for determining a blending function defining the blend of the first pixel and the second pixel based on the blending value.
42. The system of claim 41, wherein the blending function is defined as: αA + (1 − α)B, where α is the blending value, A is the first pixel, and B is the second pixel.
43. The system of claim 41, further comprising: means for determining the first pixel of the first image and the second pixel of the second image to blend based on the effect parameters.

44. The system of claim 43, wherein the means for controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: means for transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.

45. The system of claim 43, wherein the system is operative to apply the digital video effect at a real-time rate.

46. The system of claim 45, wherein the means for controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: means for transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.
47. The system of claim 43, wherein the system is operative to apply the digital video effect at a user-selectable rate.
48. The system of claim 47, wherein the system is operative to apply the digital video effect at a real-time rate.
49. The system of claim 48, wherein the means for controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: means for transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.

50. The system of claim 47, wherein the means for controlling the transfer of the first and second sequences from the video data files to the first and second buffers, respectively, includes: means for transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.
51. A system for applying a digital video effect to a first sequence of digital still images and a second sequence of digital still images during playback to produce a third sequence of digital still images, the first and second sequences stored in video data files as part of a video file system, the first sequence including at least a first digital still image and the second sequence including at least a second digital still image, each digital image of the first and second sequence including a plurality of pixels, the system comprising: a first data buffer and a second data buffer, each to store a sequence of digital still images; a first controller to control the transfer of the first and second sequences from the video data files to the first and second data buffers, respectively; and a second controller to receive input effect parameters defining the digital video effect to be applied to a combination of the first and second sequence of digital still images, the second controller to control the transfer, for each image of the third sequence, of a first pixel of the first image from the first buffer and a second pixel of the second image from the second buffer to a blender in accordance with the effect parameters and to output a blend signal to the blender that indicates a blend to be performed based on the effect parameters; and a blender to receive the first pixel from the first buffer, the second pixel from the second buffer, and the blend signal, the blender to blend the first pixel and the second pixel in accordance with the blend signal to generate as output a third pixel of a third digital still image that is part of the third sequence.
52. The system of claim 51, wherein the digital video effect is user-defined.
53. The system of claim 51, further comprising: a video encoder to receive the third sequence of digital still images and encode the third sequence in a motion video signal, wherein the second controller is operative to transfer the first and second sequences of digital still images from the first and second data buffers, respectively, to the blender in response to demands from the video encoder.

54. The system of claim 51, wherein each of the first and second buffers receives and outputs data on a first-in, first-out basis.
55. The system of claim 51, wherein the video data files are in a compressed state, the system further comprising: a decompressor to decompress the video data files.

56. The system of claim 51, further comprising: a digital video effects system operative to receive the first sequence of digital still images and the third sequence of digital still images and to perform three-dimensional video effects on the first and third sequences of digital still images, wherein, in accordance with the effects parameters, the blender is not to blend the second sequence with the first sequence such that the third sequence is the same as the second sequence.
57. The system of claim 51, wherein the first controller is operative to transfer the first and second sequence to the first and second buffers, respectively, in accordance with the amount of space available in each of the first and second buffers.
58. The system of claim 51, wherein the system is operative to apply the digital video effect at a real-time rate.
59. The system of claim 58, wherein the first controller is operative to transfer the first and second sequence to the first and second buffers, respectively, in accordance with the amount of space available in each of the first and second buffers.

60. The system of claim 51, wherein the system is operative to apply the digital video effect at a user-selectable rate.

61. The system of claim 60, wherein the system is operative to apply the digital video effect at a real-time rate.
62. The system of claim 61, wherein the first controller is operative to transfer the first and second sequence to the first and second buffers, respectively, in accordance with the amount of space available in each of the first and second buffers.
63. The system of claim 60, wherein the first controller is operative to transfer the first and second sequence to the first and second buffers, respectively, in accordance with the amount of space available in each of the first and second buffers.
64. The system of claim 51, wherein the second controller is operative to determine the first pixel of the first image and the second pixel of the second image to blend based on the effect parameters.

65. The system of claim 51, wherein the second controller is operative to determine the blend signal based on the following: the effect parameters; a position of the first pixel in the first image; and a position of the second pixel in the second image.
66. The system of claim 65, wherein the blender is operative to determine a blending function defining the blend of the first pixel and the second pixel based on the blend signal.
67. The system of claim 66, wherein the blending function is defined as: αA + (1 − α)B, where α is the blending value, A is the first pixel, and B is the second pixel.
68. The system of claim 66, wherein the second controller is operative to determine the first pixel of the first image and the second pixel of the second image to blend based on the effect parameters.
69. The system of claim 68, wherein the first controller is operative to transfer the first and second sequence to the first and second buffers, respectively, in accordance with the amount of space available in each of the first and second buffers.

70. The system of claim 68, wherein the system is operative to apply the digital video effect at a real-time rate.
71. The system of claim 70, wherein the first controller is operative to transfer the first and second sequence to the first and second buffers, respectively, in accordance with the amount of space available in each of the first and second buffers.
72. The system of claim 68, wherein the system is operative to apply the digital video effect at a user-selectable rate.
73. The system of claim 72, wherein the system is operative to apply the digital video effect at a real-time rate.
74. The system of claim 73, wherein the first controller is operative to transfer the first and second sequence to the first and second buffers, respectively, in accordance with the amount of space available in each of the first and second buffers.

75. The system of claim 72, wherein the first controller is operative to transfer the first and second sequence to the first and second buffers, respectively, in accordance with the amount of space available in each of the first and second buffers.
76. A method of generating a third sequence of digital still images from a first sequence of digital still images and a second sequence of digital still images during playback, wherein the first and second sequences are stored in data files in a file system, each digital image of the first and second sequence including a plurality of pixels, the method comprising the steps of: controlling the transfer of the first and second sequences from the data files to a first and a second data buffer, respectively; receiving a transition signal defining a transition from the first sequence to the second sequence; controlling reading of the first and second sequences from the first and second buffers, respectively; and generating the third sequence of digital still images from the read first sequence and the read second sequence in accordance with the defined transition.
77. The method of claim 76, wherein the transition is user-defined.
78. The method of claim 76, further comprising the step of: encoding the third sequence of digital still images in a motion video signal.
79. The method of claim 76, further comprising the step of: transferring the third sequence of digital still images to a video encoder to be encoded in a motion video signal, wherein the transition is generated in response to demands from the video encoder.

80. The method of claim 76, wherein the first and second buffers both receive and output data on a first-in, first-out basis.
81. The method of claim 77, wherein the data in the files is compressed, the method further comprising: decompressing the data.
82. The method of claim 77, further comprising the steps of: transferring the first sequence of digital still images to a digital video effects system operative to perform three-dimensional video effects on sequences of digital still images; and transferring the third sequence of digital still images to the digital video effects system, wherein, in accordance with the transition defined by the transition signal, the third sequence is the same as the second sequence.
83. The method of claim 76, wherein the step of controlling the transfer of the first and second sequences from the data files to the first and second buffers, respectively, includes: transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.
84. The method of claim 76, wherein the third sequence is generated at a real-time rate.
85. The method of claim 84, wherein the step of controlling the transfer of the first and second sequences from the data files to the first and second buffers, respectively, includes: transferring to the first and second buffers in accordance with the amount of space available in each of the buffers.
86. The method of claim 84, wherein the third sequence is generated at a user-selectable rate.
87. The method of claim 86, wherein the third sequence is generated at a real-time rate.
88. The method of claim 87, wherein the step of controlling the transfer of the first and second sequences from the data files to the first and second buffers, respectively, includes: transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.
89. The method of claim 86, wherein the step of controlling the transfer of the first and second sequences from the data files to the first and second buffers, respectively, includes: transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.

90. The method of claim 76, further comprising: determining one or more digital still images of the first sequence and one or more digital still images of the second sequence to which the transition applies based on the transition signal.
91. A system for generating a third sequence of digital still images from a first sequence of digital still images and a second sequence of digital still images during playback, wherein the first and second sequences are stored in data files in a file system, each digital image of the first and second sequence including a plurality of pixels, the system comprising: means for controlling the transfers of the first and second sequences from the data files to a first and a second data buffer, respectively; means for receiving a transition signal defining a transition from the first sequence to the second sequence; means for controlling reading the first and second sequences from the first and second buffers, respectively; and means for generating the third sequence of digital still images from the read first sequence and the read second sequence in accordance with the defined transition.
92. The system of claim 91, wherein the transition is user-defined. P:\OPER\SSB\8936698.RS 27/1/00 -28-
93. The system of claim 91, further comprising: means for encoding the third sequence of digital still images in a motion video signal.
94. The system of claim 91, further comprising: means for transferring the third sequence of digital still images to a video encoder to be encoded in a motion video signal, wherein the means for generating generates the transition between the first and second sequences of digital still images in response to demands from the video encoder.

95. The system of claim 91, wherein each of the first and second buffers receives and outputs data on a first-in, first-out basis.
96. The system of claim 91, wherein the data files are in a compressed state, the system further comprising: means for decompressing the data files.
97. The system of claim 91, further comprising: means for transferring the first sequence of digital still images to a digital video effects system operative to perform three-dimensional video effects on sequences of digital still images; and means for transferring the third sequence of digital still images to the digital video effects system, wherein, in accordance with the transition defined by the transition signal, the third sequence is the same as the second sequence.
98. The system of claim 91, wherein the means for controlling the transfers of the first and second sequences from the data files to the first and second buffers controls the transfers in accordance with the amount of space available in each of the first and second buffers.
99. The system of claim 91, wherein the system is operative to generate the third sequence at a real-time rate.
100. The system of claim 99, wherein the means for controlling the transfers of the first and second sequences from the data files to the first and second buffers controls the transfers in accordance with the amount of space available in each of the buffers.
101. The system of claim 91, wherein the system is operative to generate the third sequence at a user-selectable rate.
102. The system of claim 91, wherein the system is operative to generate the third sequence at a real-time rate.
103. The system of claim 102, wherein the means for controlling the transfers of the first and second sequences from the data files to the first and second buffers controls the transfers in accordance with the amount of space available in each of the first and second buffers.

104. The system of claim 101, wherein the means for controlling the transfer of the first and second sequences from the data files to the first and second buffers, respectively, includes: means for transferring to the first and second buffers in accordance with the amount of space available in each of the first and second buffers.
105. The system of claim 91, further comprising: means for determining the first digital still image and the second digital still image between which to generate the transition based on the transition signal.
106. A system for generating a third sequence of digital still images from a first sequence of digital still images and a second sequence of digital still images during playback, wherein the first and second sequences are stored in data files in a file system, each digital image of the first and second sequence including a plurality of pixels, the system comprising: a first data buffer and a second data buffer; a first controller to control the transfers of the first and second sequences from the data files to the first and second data buffers, respectively; a second controller having a first input to receive a transition signal defining the transition from the first sequence to the second sequence, a first output to control a read of one or more digital still images of the first sequence and one or more digital still images of the second sequence from the first buffer and the second buffer, respectively, to the processing module in accordance with the transition signal, and a second output to produce a control signal, wherein the control signal indicates the transition to be performed based on the transition signal; and a digital video processing module for generating the third sequence of digital still images, the processing module having a first input to receive the one or more digital still images of the first sequence, a second input to receive the one or more digital still images of the second sequence, a third input to receive the control signal, and an output to provide the third sequence, the digital video processing module generating the third sequence from the one or more digital still images of the first sequence and the one or more digital still images of the second sequence in accordance with the control signal.
107. The system of claim 106, wherein the transition is user-defined.
108. The system of claim 106, further comprising: a video encoder to receive the third sequence of digital still images and encode the third sequence in a motion video signal, wherein the second controller is operative to transfer the first and second sequences of digital still images from the first and second data buffers, respectively, to the processing module in response to demands from the video encoder.
109. The system of claim 106, wherein the first and second buffers both receive and output data on a first-in, first-out basis.
110. The system of claim 106, wherein the data files are in a compressed state, the system further comprising:

a decompressor to decompress the data files.
111. The system of claim 106, further comprising:

a digital video effects system having a first input to receive the first sequence of digital still images and a second input to receive the third sequence of digital still images, the digital video effects system to perform three-dimensional video effects on the first and third sequences of digital still images, wherein, in accordance with the transition defined by the transition signal, the third sequence is the same as the second sequence.

112. The system of claim 106, wherein the first controller is operative to transfer the first and second sequences to the first and second buffers, respectively, in accordance with the amount of space available in each of the first and second buffers.
113. The system of claim 106, wherein the system is operative to generate the third sequence at a real-time rate.
114. The system of claim 113, wherein the first controller is operative to transfer the first and second sequences to the first and second buffers, respectively, in accordance with the amount of space available in each of the first and second buffers.

115. The system of claim 106, wherein the system is operative to generate the third sequence at a user-selectable rate.
116. The system of claim 115, wherein the system is operative to generate the third sequence at a real-time rate.
117. The system of claim 116, wherein the first controller is operative to transfer the first and second sequences to the first and second buffers, respectively, in accordance with the amount of space available in each of the first and second buffers.
118. The system of claim 115, wherein the first controller is operative to transfer the first and second sequences to the first and second buffers, respectively, in accordance with the amount of space available in each of the first and second buffers.

119. The system of claim 106, wherein the second controller is operative to determine the first image and the second image between which to generate the transition based on the transition signal.
120. A method of applying a digital video effect substantially as hereinbefore described with reference to the accompanying drawings.

121. A system for applying a digital video effect substantially as hereinbefore described with reference to the accompanying drawings.
122. A method of generating a third sequence of digital still images from first and second sequences of digital still images during playback substantially as hereinbefore described with reference to the accompanying drawings.
123. A system for generating a third sequence of digital still images from first and second sequences of digital still images during playback substantially as hereinbefore described with reference to the accompanying drawings.

DATED this 27th day of January 2000

Avid Technology, Inc.

By its Patent Attorneys
DAVIES COLLISON CAVE
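The architecture recited in claims 106–119 — two FIFO buffers fed from data files by a first controller according to available buffer space, a second controller that reads frames from both buffers and derives a control signal from a transition signal, and a processing module that mixes the two streams into a third sequence — can be illustrated with a minimal sketch. This is not the patented implementation: it assumes a simple dissolve as the transition, represents each still image as a flat list of pixel values, and all function and variable names (`fill_buffers`, `blend`, `play`, `BUFFER_CAPACITY`) are illustrative.

```python
from collections import deque

BUFFER_CAPACITY = 4  # illustrative FIFO depth for each data buffer


def fill_buffers(seq_a, seq_b, buf_a, buf_b):
    """First controller: transfer frames from the source sequences into
    each FIFO buffer, limited by the space available in that buffer
    (cf. claims 112, 114, 117, 118)."""
    while seq_a and len(buf_a) < BUFFER_CAPACITY:
        buf_a.append(seq_a.pop(0))
    while seq_b and len(buf_b) < BUFFER_CAPACITY:
        buf_b.append(seq_b.pop(0))


def blend(frame_a, frame_b, alpha):
    """Processing module: produce one output frame as a weighted mix of
    the two input frames (a dissolve is one possible transition)."""
    return [round((1 - alpha) * pa + alpha * pb)
            for pa, pb in zip(frame_a, frame_b)]


def play(seq_a, seq_b, transition_len):
    """Second controller: read frames from both buffers on a first-in,
    first-out basis and drive the processing module with a control
    signal (here, a blend weight) derived from the transition."""
    seq_a, seq_b = list(seq_a), list(seq_b)
    n = min(len(seq_a), len(seq_b))
    buf_a, buf_b = deque(), deque()
    third_sequence = []
    for i in range(n):
        fill_buffers(seq_a, seq_b, buf_a, buf_b)
        alpha = min(1.0, i / max(1, transition_len - 1))  # control signal
        third_sequence.append(blend(buf_a.popleft(), buf_b.popleft(), alpha))
    return third_sequence
```

With single-pixel frames, `play([[100]] * 4, [[0]] * 4, 4)` yields a four-frame dissolve from the first sequence to the second: `[[100], [67], [33], [0]]`. In the claimed system the analogous reads would be driven by demands from a video encoder (claim 108) rather than a simple loop.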
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU89366/98A AU717526B2 (en) | 1993-04-16 | 1998-10-16 | Media pipeline with multichannel video processing and playback |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB939307894A GB9307894D0 (en) | 1993-04-16 | 1993-04-16 | Multichannel digital image compression and processing |
GB9307894 | 1993-04-16 | ||
AU67989/94A AU694119B2 (en) | 1993-04-16 | 1994-04-18 | Media pipeline with multichannel video processing and playback |
AU89366/98A AU717526B2 (en) | 1993-04-16 | 1998-10-16 | Media pipeline with multichannel video processing and playback |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU67989/94A Division AU694119B2 (en) | 1993-04-16 | 1994-04-18 | Media pipeline with multichannel video processing and playback |
Publications (2)
Publication Number | Publication Date |
---|---|
AU8936698A AU8936698A (en) | 1999-01-14 |
AU717526B2 true AU717526B2 (en) | 2000-03-30 |
Family
ID=25635533
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU89366/98A Ceased AU717526B2 (en) | 1993-04-16 | 1998-10-16 | Media pipeline with multichannel video processing and playback |
Country Status (1)
Country | Link |
---|---|
AU (1) | AU717526B2 (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4698682A (en) * | 1986-03-05 | 1987-10-06 | Rca Corporation | Video apparatus and method for producing the illusion of motion from a sequence of still images |
- 1998
  - 1998-10-16 AU AU89366/98A patent/AU717526B2/en not_active Ceased
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4698682A (en) * | 1986-03-05 | 1987-10-06 | Rca Corporation | Video apparatus and method for producing the illusion of motion from a sequence of still images |
Also Published As
Publication number | Publication date |
---|---|
AU8936698A (en) | 1999-01-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5644364A (en) | Media pipeline with multichannel video processing and playback | |
US6357047B1 (en) | Media pipeline with multichannel video processing and playback | |
US6333951B1 (en) | Image processing system | |
US6763175B1 (en) | Flexible video editing architecture with software video effect filter components | |
US8306399B1 (en) | Real-time video editing architecture | |
EP0772350A2 (en) | Keying system and composite image producing method | |
WO1999052276A1 (en) | A multi stream switch-based video editing architecture | |
JPH08511385A (en) | Adaptive image compression using variable quantization | |
US6763176B1 (en) | Method and apparatus for real-time video editing using a graphics processor | |
JP2006141042A (en) | Media pipeline with multichannel video processing and playback | |
US5412479A (en) | Computer generated wipes for video editing systems | |
JP3645922B2 (en) | Image processing method and apparatus | |
US7372472B1 (en) | Method and apparatus for graphically defining a video particle explosion effect | |
GB2276789A (en) | Offline digital video mixer | |
AU717526B2 (en) | Media pipeline with multichannel video processing and playback | |
US4870479A (en) | Video graphics memory storage reduction technique | |
AU703989B2 (en) | A system for acquiring and playing back a sequence of animated video images in real time | |
JP3981651B2 (en) | Image processing device | |
US20040136688A1 (en) | Video processing system | |
WO1999052277A1 (en) | A multi stream video editing system using uncompressed video data for real-time rendering performance, and for non real-time rendering acceleration | |
JPH02285867A (en) | Still picture filing device | |
US6711305B2 (en) | Image processing apparatus and method | |
JPH03286271A (en) | Picture display device | |
JP2830038B2 (en) | Editing device | |
JPH10240224A (en) | Method and device for transmitting moving image information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FGA | Letters patent sealed or granted (standard patent) | ||
MK14 | Patent ceased section 143(a) (annual fees not paid) or expired |