GB2312123A - Post production film or video editing using "warping" - Google Patents

Post production film or video editing using "warping"

Info

Publication number
GB2312123A
Authority
GB
United Kingdom
Prior art keywords
identified
image
picture point
response
cursor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9705869A
Other versions
GB2312123B (en)
GB9705869D0 (en)
Inventor
Benoit Sevigny
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Discreet Logic Inc
Original Assignee
Discreet Logic Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GBGB9607647.6A external-priority patent/GB9607647D0/en
Application filed by Discreet Logic Inc filed Critical Discreet Logic Inc
Priority to GB9705869A priority Critical patent/GB2312123B/en
Publication of GB9705869D0 publication Critical patent/GB9705869D0/en
Publication of GB2312123A publication Critical patent/GB2312123A/en
Application granted granted Critical
Publication of GB2312123B publication Critical patent/GB2312123B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

Abstract

Displayed picture points are re-locatable in response to manual operation of an interface device such as a stylus or a mouse. The interface device is activated, by being placed into pressure or by clicking, and an identified picture point x is subsequently moved to a new position x1 in response to manual operation of the interface device. In response to the identified picture point x being moved to x1, other picture points w, y are also moved, to w1, y1, by a displacement which differs from the displacement of the identified picture point and which varies in proportion to the original distance of each other picture point from the identified picture point.

Description

PROCESSING IMAGE DATA

The present invention relates to processing image data, wherein displayed picture points are relocatable in response to manual operation of an interface device.
Introduction

Systems for processing image data in response to manual operation of a control device are known, in which image data is stored as an array of pixel values. Each pixel value may represent a luminance level or, in a full color system, pixel values may be stored for color components such as additive red, green and blue components; subtractive cyan, magenta, yellow and black components; or, particularly in video systems, luminance plus chrominance color difference components.
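By way of illustration, pixel values of this kind might be laid out as follows. This is a minimal sketch assuming numpy-style array storage; the array names and frame dimensions are illustrative, not taken from the specification.

```python
import numpy as np

# A full colour frame as an H x W array of additive red, green and blue
# components, one byte per component.
height, width = 576, 720            # e.g. a PAL-resolution video frame
frame_rgb = np.zeros((height, width, 3), dtype=np.uint8)

# A luminance-only frame stores a single level per pixel.
frame_luma = np.zeros((height, width), dtype=np.uint8)

# Luminance plus chrominance colour difference components (Y, Cb, Cr)
# also use three planes, but carry luma and chroma rather than R, G, B.
frame_ycbcr = np.zeros((height, width, 3), dtype=np.uint8)
```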
In addition to application in television post production facilities, systems of this type are being used increasingly for the production of cinematographic film, where the information content for each frame of film is substantially higher. Film clips are scanned on a frame-by-frame basis producing large volumes of image data for subsequent manipulation. Pixel data may be processed under program control to produce visual effects and a system of this type is produced by the present applicant and distributed under the trade mark "FLAME". Thus, in addition to manipulating images as part of a post production process, it may become necessary to make manual modifications to image data, possibly to remove unwanted items from an image or to change or add color etc.
Traditionally, in order to make manual operations, each frame must be examined to allow manual modifications to be made to the image data on a frame-by-frame basis. This takes a considerable amount of time and limits the extent to which techniques of this type may be implemented for particular productions.
In particular, in a function known as "warping", a grid or mesh is superimposed over an area of an image on a display device, and selected points of that mesh are translated, rotated, or reduced in size, with the effect that the whole mesh is correspondingly translated, rotated or reduced.
During warping operations, control points on an image frame are moved on a frame by frame basis. Since an image contains a large number of pixels, it is not feasible to manually move each pixel on a frame by frame basis. In these situations, processes are applied by which a region of an image can be warped by identifying a few control points, typically at the periphery of a desired image portion, in order to define a mesh covering the image portion. A warp function is applied to the mesh, which moves the remaining pixels of the image portion in accordance with movements of the mesh. However, such warping techniques, although allowing a warp to be carried out in a relatively short time, introduce distortions, particularly at the edges of the warped image portion. Such edges need to be "smoothed out" to reduce the visual distortions and visible discontinuities in the resultant warped image.
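The mesh-based warp described above can be sketched as follows: control point displacements are interpolated across the pixels of the covered image portion. This is a hedged illustration assuming scipy's griddata for the interpolation; the function and variable names are illustrative, and the patent does not specify a particular warp function.

```python
import numpy as np
from scipy.interpolate import griddata

def mesh_warp_displacements(control_src, control_dst, pixel_coords):
    """Interpolate control point displacements across an image portion.

    control_src  : (N, 2) control point positions defining the mesh.
    control_dst  : (N, 2) positions of the same points after the warp.
    pixel_coords : (M, 2) positions of pixels covered by the mesh.
    Returns an (M, 2) array of per-pixel displacements.
    """
    displacement = control_dst - control_src
    # Linear interpolation inside the mesh; pixels outside it get zero
    # movement, which is where the edge discontinuities arise.
    dx = griddata(control_src, displacement[:, 0], pixel_coords,
                  method="linear", fill_value=0.0)
    dy = griddata(control_src, displacement[:, 1], pixel_coords,
                  method="linear", fill_value=0.0)
    return np.stack([dx, dy], axis=-1)

# Four control points at the periphery of a region, one corner dragged.
src = np.array([[0, 0], [100, 0], [100, 100], [0, 100]], dtype=float)
dst = src.copy()
dst[2] += [20.0, 10.0]                     # move one mesh vertex
pixels = np.array([[50, 50], [90, 90]], dtype=float)
print(mesh_warp_displacements(src, dst, pixels))
```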
Presently, it is known to make manual operations to each frame for smoothing out the edges of a warped region. However, each frame requires much work and, as an example, producing five seconds of warped film footage with discontinuities smoothed out can take up to three weeks of work. Film directors are therefore reluctant to make heavy use of warping techniques, due to the high cost of the post production procedures involved.
Summary of the Invention

According to a first aspect of the present invention, there is provided a method of processing image data, wherein displayed picture points are relocatable in response to manual operation of an interface device, comprising steps of moving an identified picture point in response to said manual operation; and moving other picture point data by a displacement which differs from the displacement of said identified picture point and which varies in proportion to distance from the identified picture point.
Preferably, the identified picture point is identified by a displayed cursor and said cursor may be moved in response to movement of a stylus or a mouse.
In a preferred embodiment, a picture point is identified as a transition between image colours.
According to a second aspect of the invention, there is provided an image data processing apparatus, including display means for displaying picture images in the form of a plurality of picture points; a manually operable interface device; and processing means configured to relocate the position of said picture point in response to manual operation of said interface device, wherein said processing device is configured such that the movement of a first picture point in response to operation of said manual device results in the movement of other picture point data by displacements which differ from the displacement of an identified picture point and which vary in proportion to distance from said identified picture point.
Brief Description of the Drawings

Figure 1 shows in general view an editing suite for editing film clips, video images or computer generated images;
Figure 2 shows a general layout comprising the editing suite of Figure 1;
Figure 3 shows a display device comprising the editing suite of Figure 1;
Figure 4 shows a control panel feature of the display device of Figure 3;
Figure 5 shows an Ith image frame displayed on the display device, in which a group of vertices are translated as a whole from a first position to a second position;
Figure 6 shows the vertices of Figure 5 in the second position;
Figure 7 illustrates a method of moving the group of vertices individually, one by one, to positions intermediate between an initial position and a final position, in order to perform a "smoothing" operation;
Figure 8 shows the resultant smoothed image of Figure 7;
Figure 9 shows the image of Figure 7 in a successive I + 20th frame;
Figure 10 shows the I + 20th frame after a smoothing operation as carried out in Figures 7 and 8;
Figure 11 shows a successive I + Nth frame;
Figure 12 shows the I + Nth frame after the individual smoothing operation;
Figure 13 shows schematically a method of smoothing a group of vertices in accordance with a preferred method of the present invention;
Figure 14 shows the group of vertices in smoothed form;
Figure 15 shows the image of Figure 13 in an I + 20th frame;
Figure 16 shows the smoothed image of Figure 15 in the I + 20th frame;
Figure 17 shows a portion of the image in an I + Nth frame;
Figure 18 shows the smoothed image in the I + Nth frame;
Figure 19 illustrates a proportionality distribution function according to a preferred method of the present invention;
Figure 20 shows a second proportionality distribution function;
Figure 21 illustrates smoothing of an edge feature in accordance with the preferred method of the present invention;
Figure 22 shows a third proportionality distribution function as may be applied in Figure 21;
Figure 23 shows a fourth proportionality distribution function as may be applied in Figure 21;
Figure 24 shows a second example of smoothing in accordance with the preferred methods of the present invention;
Figure 25 illustrates a further proportionality distribution function as applied in the smoothing of Figure 24.
Detailed Description of the Preferred Embodiment

Referring to Figure 1 of the accompanying drawings, a film, video or computer generated image editing suite comprises an image display device 103, eg a high resolution video monitor; a control key pad 106; and a graphics tablet 105 and stylus 104 for applying modifications to the displayed image.
Figure 2 shows in schematic form features comprising the image processing apparatus. There is provided a central processing unit 201, a random access memory 202, a graphics hardware device 203, a hard disk storage unit 204 and a graphics interface 205.
In the example of the preferred embodiment a post production process will be considered in which source material, in the form of a film clip, has been recorded and is being processed prior to a final on-line editing operation being performed.
A post production facility is illustrated in Figure 1, in which a video artist 101 is seated at a processing station 102. Images are displayed to the artist via a visual display unit 103 and manual modifications to displayed images are effected in response to manual operation of a stylus 104 upon a graphics touch tablet 105. In addition, a conventional keyboard 106 is provided to allow alpha-numeric values to be entered directly. The monitor 103, tablet 105 and keyboard 106 are interfaced to an image processor 107, which may be based substantially upon a graphics workstation executing the UNIX operating system.
Image data is supplied to the image processor 107 via a digital video tape recorder 108, which may be configured to supply full bandwidth broadcast quality video data to the image processor at video rate.
Alternatively, general purpose data storage tape drives may be used, and image frames substantially larger than video frames, such as image frames derived from cinematographic film, may be received and processed within the system.
Image processor 107 is detailed in Figure 2. The processor includes a central processing unit 201, arranged to receive program instructions from an addressable random access memory 202. The processing unit 201 may also access a hardware graphics card 203, provided as part of the UNIX environment, allowing computationally extensive operations to be effected at very high speed.
Image data is held within the random access memory 202 as modifications are taking place and large data volumes are held on a disk store 204, preferably taking the form of an array of concurrently accessible disks. The processing unit 201 communicates with the display unit 103, the graphics tablet 105, the keyboard 106 and the video recorder 108 via interface circuits 205 and additional interface circuits may be provided, such as an SCSI interface etc, to allow communication with conventional data manipulation and storage devices etc.
In response to program instructions read from RAM 202, the CPU 201 generates image data which is in turn displayed on the display unit 103.
Display unit 103 is a twenty inch non-interlaced visual display unit. A displayed image 301 may be considered as being made up of two component parts, taking the form of a working "canvas" 302 and a control panel 303. Image data is displayed in the region of said canvas 302 either as individual frames or as a moving video/film clip.
While an image is being displayed on the canvas 302, pixel data may be modified in response to manual operations of the stylus 104 upon the touch tablet 105. The position of the stylus 104 is identified to the artist 101 by means of a cursor 304, which tracks the position of the stylus 104 as it is moved over the touch tablet 105. The monitor 103 includes a control panel 305 for the control of monitor variables, as is well known in the art.
In addition to effecting interactive modification to displayed images, by positioning the cursor within region 302, control operations are similarly effected by moving said cursor into region 303. Control region 303 is detailed in Figure 4.
New data objects are stored with reference to particular data layers in which a first layer may be considered as background image data, with a second layer of data taking priority over said first and a third layer of data taking priority over the second. This arrangement of layers is substantially similar to the layering of video source material in on-line mixing systems, in which images are combined using smooth keying or matting signals so as to achieve a smooth blending to create realistic-looking composites. In operation, modifications may be made within any of these layers and an appropriate layer is selected, layer 1, layer 2 or layer 3, by placing the cursor 304 over a respective layer "button" 401, 402, 403 and placing the stylus 104 into pressure.
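The layer priority described here behaves like a conventional "over" composite driven by a keying or matte signal. The sketch below assumes that standard operator, with illustrative names; the patent does not detail the blending arithmetic.

```python
import numpy as np

def composite_over(foreground, background, matte):
    """Blend a higher priority layer over a lower one using a keying
    signal; a soft-edged matte gives the smooth blending described.

    foreground, background : (H, W, 3) float arrays in [0, 1].
    matte                  : (H, W) keying/matte signal in [0, 1].
    """
    alpha = matte[..., np.newaxis]
    return alpha * foreground + (1.0 - alpha) * background

# Layer 3 takes priority over layer 2, which takes priority over layer 1.
h, w = 4, 4
layer1 = np.zeros((h, w, 3))
layer2 = np.full((h, w, 3), 0.5)
layer3 = np.ones((h, w, 3))
result = composite_over(layer3,
                        composite_over(layer2, layer1, np.full((h, w), 0.7)),
                        np.full((h, w), 0.3))
```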
The system may be used to display moving video clips, with soft controls being provided substantially similar to those known within video tape recorders etc. Thus, the control display includes a fast rewind button 404, a reverse button 405, a stop button 406, a play button 407 and a fast forward button 408.
Processes are selectable using process selection buttons 409, 410 and 411. Button 409 selects the color mode of operation, in which manual modifications may be made to pixels displayed within the canvas 302.
Button 410 allows layer information to be considered and in particular it allows various constructed objects to be allocated to particular layers. Button 411 allows visual effects to be controlled and upon selection of button 409, 410 or 411 associated process parameters are displayed in a modifiable fashion within region 412. Region 413 allows geometric shapes to be selected, such as circles, ellipses, squares and rectangles, which are then generated automatically at locations within the canvas 302 identified by the cursor 304.
Consideration may now be given to the initial problem: the smoothing out of the periphery of warped images.
To illustrate a first method of producing a warp, reference will be made to Figures 5 and 6 of the accompanying drawings.
Points a to e of an image frame I undergo a translation as part of the warp, such that points a and e remain in their original positions, and points b, c and d are translated to new positions b1, c1, d1. The result of the warp translation is shown in Figure 6. However, since the points b, c, d are selected as a group and translated as a group, there is a perceived discontinuity around an edge of the warped image, which in the finished film clip or video clip leads to a lack of realism as perceived by a viewer.
Referring to Figures 7 and 8, the warp translation may be improved by, rather than moving selected points b, c, d as a group in a single translation, individually moving selected points b, c, d to respective new positions b2, c2, d2 as shown in Figure 7. This may result in a "smoothing out" of the discontinuity as compared with the warp method of Figure 5. An illustrative result of individually moving the points b, c, d is shown in Figure 8.
It will be appreciated that individual movement of specific points of an image, using the apparatus described with reference to Figures 1 to 4 of the accompanying drawings, is a time consuming operation. For each frame of film clip or video clip, a large number of individual points must be relocated using the touch tablet 105 and the stylus 104. For a clip of film or video having a number N individual frames, an image will generally move from frame to frame, and the warp may need to be effected on a frame by frame basis. Consequently, smoothing of the warp also may need to be effected on a frame by frame basis leading to a large number of individual manual point movements using the stylus 104 and touch tablet 105.
Figures 9 and 10 show illustratively an I + 20th frame of the clip before and after warping and manual post warp smoothing, and Figures 11 and 12 show respectively an illustrative I + Nth frame both before and after warping and manual post warp smoothing.
An example of the specific method according to the present invention will now be described.
In the following discussion, movement of points of an image is described. It will be understood that where movement of a "point" is described, this relates to movement of one or more pixels on the display screen 103. Corresponding pixel data and image data are modified in accordance with movements of pixels, and so where movement of parts of an image frame to new positions within the image frame is described, corresponding processing of image data occurs in the image processor 107.
Referring to Figure 13 of the accompanying drawings, an image characterized by individual points V, W, X, Y, Z at initial positions v, w, x, y, z is to be warped to a new position at points v, w1, x1, y1, z. A point, eg point X, is identified as a source point by manipulation of a cursor on the visual display unit 103 in response to manual operation of an interface device, for example the stylus 104 and the touch tablet 105. Using the stylus and touch tablet, the point X is dragged to a new position x1.
The central processing unit 201 identifies points related to the identified source point X, in this case related points V, W, Y and Z. The processor applies a proportionality distribution function in order to move the related points V, W, Y and Z in accordance with a predetermined distribution, which is proportional to the distance which the source point X has been moved from its source position to its destination position. For example, in Figure 13 source point X is translated from its original position within the frame (its source position) to a new position x1, its destination position.
Related points V, W, Y and Z are translated to respective destination positions v, w1, y1, z.
In Figure 13, in the case of the points V and Z, the source positions of these related points are the same as their destination positions, ie the points V and Z stay where they are in relation to the frame. However, points W and Y are moved from their initial source positions to new destination positions w1, y1. The resultant destination positions of points V to Z are shown in Figure 14.
Referring to Figures 15 and 16, in the I + 20th frame of the clip, the image has moved compared to the Ith frame and so the source point X20 needs to be moved from its source position in the I + 20th frame to its destination position x20 as shown in Figure 16. The line in Figure 16 illustrates the destination positions of related points V20, W20, Y20, Z20.
Similarly, in the I + Nth frame, the image has moved relative to the frame even further, and again the image needs to be identified with reference to a source point XN, which is moved from a source position to a destination position xN.
Thus, for each frame an identified feature of an image may be moved by identifying a source point of that image data and moving it from a source position to a destination position; further related points of the image then move automatically, under control of the central processing unit and in accordance with a predetermined proportionality distribution function, by an amount relative to the distance between the source point and the destination point, the amount being determined by the proportionality distribution function.
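A minimal sketch of this per-frame operation follows, assuming the points are held as a coordinate array and using a gaussian fall-off of the kind discussed below with Figure 19. All names are illustrative; the patent does not prescribe a particular implementation.

```python
import numpy as np

def proportional_move(points, source_idx, destination, weight_fn):
    """Move the identified source point to `destination` and move every
    related point by the same displacement scaled by weight_fn(distance),
    the distance being measured from the original source position.

    points      : (N, 2) array of picture point positions.
    source_idx  : index of the identified (dragged) source point.
    destination : (2,) destination position of the source point.
    weight_fn   : maps distance to a weight in [0, 1], with
                  weight_fn(0) == 1 so the source point moves fully.
    """
    source = points[source_idx].copy()
    displacement = np.asarray(destination, dtype=float) - source
    distances = np.linalg.norm(points - source, axis=1)
    weights = weight_fn(distances)
    return points + weights[:, np.newaxis] * displacement

# Points V, W, X, Y, Z along a line; X (index 2) is dragged upward.
pts = np.array([[0., 0.], [10., 0.], [20., 0.], [30., 0.], [40., 0.]])
moved = proportional_move(pts, 2, [20., 15.],
                          weight_fn=lambda d: np.exp(-(d / 10.0) ** 2))
# V and Z barely move, W and Y move part of the way, X reaches x1.
```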
Referring to Figure 19 of the accompanying drawings, an example of a proportionality distribution function is shown with reference to a "+" electronic cursor 600. The cursor is shown moving in a vertical direction with reference to the frame. The proportionality function is defined as having a y axis in the direction of movement of the cursor, in this case vertically with reference to the frame, and an x axis in a direction transverse to the direction of movement of the cursor. In this case, the x axis of the function happens to be perpendicular to the direction of movement of the cursor, but this is not necessarily so.
In the case shown in Figure 19, the proportionality function comprises a substantially gaussian function. The maximum extent of the gaussian function in the y direction is preferably set such that it corresponds to the distance moved between the source point and the destination point by the cursor. At positions either side of the cursor, related points are moved to a lesser extent, being a proportion of the distance moved by the cursor.
The width of the gaussian function may be selected to enable related points within a predetermined distance from the source point to move with the source point. Selection of the width of the proportionality distribution function determines the fineness or coarseness of the smoothing effect.
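As a sketch of how this width selection might be exposed, a width (sigma) parameter can be baked into the weight function passed to the earlier proportional_move example; the parameterisation is an assumption, not the patent's.

```python
import numpy as np

def gaussian_weight(width):
    """Weight function peaking at 1 (the full cursor displacement) whose
    spread is set by `width`: a narrow width confines the move to points
    near the source (fine smoothing), while a wide width carries distant
    points along as well (coarse smoothing)."""
    return lambda distance: np.exp(-0.5 * (distance / width) ** 2)

fine = gaussian_weight(width=5.0)      # only close neighbours follow
coarse = gaussian_weight(width=40.0)   # the whole neighbourhood follows
print(fine(10.0), coarse(10.0))        # ~0.135 versus ~0.969
```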
Referring to Figure 20, movement of the cursor is made diagonally across a frame. In this case the proportional distribution function is defined in the y direction as being in a direction diagonally across the frame, and in the x direction, transverse to the y direction.
Related points may be identified by way of intensity, color, or their initial unwarped position.
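For instance, colour-based identification might look like the following sketch; the Euclidean colour metric and the tolerance parameter are assumptions for illustration.

```python
import numpy as np

def related_by_colour(pixel_colours, source_colour, tolerance):
    """Mark as related those pixels whose colour lies within `tolerance`
    of the source point's colour (one of the criteria mentioned above).

    pixel_colours : (..., 3) array of RGB values.
    source_colour : (3,) RGB value at the identified source point.
    """
    difference = np.linalg.norm(
        pixel_colours.astype(float) - np.asarray(source_colour, dtype=float),
        axis=-1)
    return difference <= tolerance
```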
Referring to Figure 21 of the accompanying drawings, there is illustrated a source point S at a source position s, which is moved to a destination position s1. The effects of applying a proportional distribution function having a relatively narrow width, as shown in Figure 22 (trace a), and another proportional distribution function having a relatively wide width, as shown in Figure 23 (trace b), are illustrated in Figure 21.
The maximum magnitude of the proportionality function in the y direction is the distance s-s1 in Figure 21; at x positions either side of the maximum value, the value of the proportionality function is less than the distance s-s1. The maximum magnitude of the proportionality function may be varied or preset as a percentage of the distance between the source point and the destination point.
Referring to Figure 24 herein, another identified source point T at source position t is moved to a destination position t1. In this case, a proportional distribution function having a partially negative effect, as shown with reference to Figure 25, is applied. The function extends over a distance of width d in the x direction, transverse to the direction of the movement t-t1.
The effect on the image feature, denoted by the line in Figure 24, is that movement of the source point T to the destination position t1 results in the trace c shown in Figure 24, in which a portion of the image feature corresponding to related points actually moves away from the destination position t1 as the source point is translated from position t to t1.
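The partially negative function of Figure 25 can be sketched as a centre-surround weight. A difference of gaussians is used below because the patent does not specify the exact curve, so this shape and the parameter names are assumptions.

```python
import numpy as np

def negative_lobe_weight(width, lobe_strength=0.5):
    """Weight that is positive near the source point but dips below zero
    further out, so related points there move away from the destination,
    as with trace c in Figure 24.  Modelled as a difference of gaussians;
    the exact curve in the patent is not specified."""
    def weight(distance):
        centre = np.exp(-0.5 * (distance / width) ** 2)
        surround = np.exp(-0.5 * (distance / (2.0 * width)) ** 2)
        return (1.0 + lobe_strength) * centre - lobe_strength * surround
    return weight

w = negative_lobe_weight(width=10.0)
print(w(0.0))    # 1.0: the source point moves the full distance t-t1
print(w(25.0))   # negative: related points here move opposite the drag
```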

Claims (10)

Claims
1. A method of processing image data, wherein displayed picture points are relocatable in response to manual operation of an interface device, comprising steps of moving an identified picture point in response to said manual operation; and moving other picture point data by a displacement which differs from the displacement of said identified picture point and which varies in proportion to distance from the identified picture point.
2. A method according to Claim 1, wherein said identified picture point is identified by a displayed cursor.
3. A method according to Claim 2, wherein said cursor is moved in response to manual movement of a stylus or a mouse.
4. A method according to any of Claims 1 to 3, wherein said picture point is identified as a transition between image colours.
5. Image data processing apparatus, including display means for displaying picture images in the form of a plurality of picture points; a manually operable interface device; and processing means configured to relocate the position of said picture point in response to manual operation of said interface device, wherein said processing device is configured such that the movement of a first picture point in response to operation of said manual device results in the movement of other picture point data by displacements which differ from the displacement of an identified picture point and which vary in proportion to distance from said identified point.
6. Apparatus according to claim 5, including means for generating a displayable cursor, wherein said identified picture point is identified by the position of said cursor.
7. Apparatus according to claim 6, including means for moving said cursor in response to manual operation of a stylus or a mouse.
8. Apparatus according to any of claims 5 to 7, wherein said picture point is identified as a transition between image colours.
9. A method of processing image data substantially as herein described with reference to the accompanying drawings.
10. Image processing apparatus substantially as herein described with reference to the accompanying drawings.
GB9705869A 1996-04-12 1997-03-21 Relocating picture points in response to manual operation of an interface device Expired - Fee Related GB2312123B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB9705869A GB2312123B (en) 1996-04-12 1997-03-21 Relocating picture points in response to manual operation of an interface device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB9607647.6A GB9607647D0 (en) 1996-04-12 1996-04-12 Proportional modelling
GB9705869A GB2312123B (en) 1996-04-12 1997-03-21 Relocating picture points in response to manual operation of an interface device

Publications (3)

Publication Number Publication Date
GB9705869D0 GB9705869D0 (en) 1997-05-07
GB2312123A true GB2312123A (en) 1997-10-15
GB2312123B GB2312123B (en) 1999-01-13

Family

ID=26309119

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9705869A Expired - Fee Related GB2312123B (en) 1996-04-12 1997-03-21 Relocating picture points in response to manual operation of an interface device

Country Status (1)

Country Link
GB (1) GB2312123B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002089059A2 (en) * 2001-04-25 2002-11-07 Siemens Aktiengesellschaft Image processing method
GB2438668A (en) * 2006-06-02 2007-12-05 Siemens Molecular Imaging Ltd Deformation of mask based images
WO2013121239A1 (en) * 2012-02-15 2013-08-22 Thomson Licensing User interface for depictive video editing

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4683468A (en) * 1985-03-11 1987-07-28 International Business Machines Corp. Method for manipulation of graphic sub-objects in an interactive draw graphic system
EP0314395A1 (en) * 1987-10-26 1989-05-03 Crosfield Electronics Limited Interactive image display
US4951040A (en) * 1987-03-17 1990-08-21 Quantel Limited Image transformation processing
GB2229336A (en) * 1989-03-17 1990-09-19 Sony Corp Picture manipulation
WO1993012502A1 (en) * 1991-12-18 1993-06-24 Ampex Systems Corporation Video special effects system
GB2284524A (en) * 1993-12-02 1995-06-07 Fujitsu Ltd Graphic editing apparatus and method
GB2295301A (en) * 1992-04-17 1996-05-22 Computer Design Inc Surface mesh generation and 3-d shape flattening

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4683468A (en) * 1985-03-11 1987-07-28 International Business Machines Corp. Method for manipulation of graphic sub-objects in an interactive draw graphic system
US4951040A (en) * 1987-03-17 1990-08-21 Quantel Limited Image transformation processing
EP0314395A1 (en) * 1987-10-26 1989-05-03 Crosfield Electronics Limited Interactive image display
GB2229336A (en) * 1989-03-17 1990-09-19 Sony Corp Picture manipulation
WO1993012502A1 (en) * 1991-12-18 1993-06-24 Ampex Systems Corporation Video special effects system
GB2295301A (en) * 1992-04-17 1996-05-22 Computer Design Inc Surface mesh generation and 3-d shape flattening
GB2284524A (en) * 1993-12-02 1995-06-07 Fujitsu Ltd Graphic editing apparatus and method

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002089059A2 (en) * 2001-04-25 2002-11-07 Siemens Aktiengesellschaft Image processing method
WO2002089059A3 (en) * 2001-04-25 2003-05-30 Siemens Ag Image processing method
GB2438668A (en) * 2006-06-02 2007-12-05 Siemens Molecular Imaging Ltd Deformation of mask based images
GB2438668B (en) * 2006-06-02 2008-07-30 Siemens Molecular Imaging Ltd Deformation of mask-based images
US7929799B2 (en) 2006-06-02 2011-04-19 Siemens Medical Solutions Usa, Inc. Deformation of mask-based images
WO2013121239A1 (en) * 2012-02-15 2013-08-22 Thomson Licensing User interface for depictive video editing

Also Published As

Publication number Publication date
GB2312123B (en) 1999-01-13
GB9705869D0 (en) 1997-05-07

Similar Documents

Publication Publication Date Title
CA2079918C (en) Image editing system and method having improved multi-dimensional editing controls
US5808628A (en) Electronic video processing system
US6445816B1 (en) Compositing video image data
US5982350A (en) Compositer interface for arranging the components of special effects for a motion picture production
US7030872B2 (en) Image data editing
US6751347B2 (en) Color diamond chroma keying
US6522787B1 (en) Method and system for rendering and combining images to form a synthesized view of a scene containing image information from a second image
US6757425B2 (en) Processing image data to transform color volumes
US5781188A (en) Indicating activeness of clips and applying effects to clips and tracks in a timeline of a multimedia work
US5077610A (en) Previewing cuts and transformations in an electronic image composition system
US20020171668A1 (en) User interface for generating parameter values in media presentations based on selected presentation instances
US5999194A (en) Texture controlled and color synthesized animation process
GB2391148A (en) Selecting functions via a graphical user interface
WO1992021096A1 (en) Image synthesis and processing
US6400832B1 (en) Processing image data
GB2312120A (en) Producing a transition region surrounding an image
US6473094B1 (en) Method and system for editing digital information using a comparison buffer
US6052109A (en) Processing image data
GB2312123A (en) Post production film or video editing using "warping"
EP0947957B1 (en) Defining color regions for applying image processing in a color correction system
JPS61221878A (en) Picture processor
GB2091515A (en) Interactive Video System
JPS61150077A (en) Picture processor
JPH0765189A (en) Method and device for image processing

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20020321