GB2258790A - Animation - Google Patents


Info

Publication number
GB2258790A
GB2258790A (application GB9117409A)
Authority
GB
United Kingdom
Prior art keywords
data
pictures
picture
stored
sequence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB9117409A
Other versions
GB9117409D0 (en)
Inventor
Andrew Louis Berend
Stuart Philip Hawkins
Michael John Brocklehurst
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cambridge Animation Systems Ltd
Original Assignee
Cambridge Animation Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cambridge Animation Systems Ltd
Priority to GB9117409A (GB2258790A)
Publication of GB9117409D0
Priority to US07/844,634 (US5692117A)
Priority to JP4500477A (JPH06503663A)
Priority to EP91920852A (EP0559714A1)
Priority to AU90158/91A (AU9015891A)
Priority to JP4500061A (JPH06505817A)
Priority to PCT/GB1991/002122 (WO1992009965A1)
Priority to PCT/GB1991/002124 (WO1992009966A1)
Priority to AU89321/91A (AU8932191A)
Priority to EP91920646A (EP0559708A1)
Priority to AU17921/92A (AU1792192A)
Priority to JP4510508A (JPH06507742A)
Priority to JP4510509A (JPH06507743A)
Priority to EP19920910492 (EP0586444A1)
Priority to EP92910474A (EP0585298A1)
Priority to PCT/GB1992/000927 (WO1992021095A1)
Priority to AU17934/92A (AU1793492A)
Priority to US08/150,100 (US5598182A)
Priority to PCT/GB1992/000928 (WO1992021096A1)
Publication of GB2258790A
Priority to US08/311,398 (US5611036A)
Priority to US08/643,322 (US5754183A)


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/80 2D [Two Dimensional] animation, e.g. using sprites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/001 Texturing; Colouring; Generation of texture or colour

Abstract

Apparatus for generating an animated sequence of pictures comprises means for storing data defining a plurality of pictures and data defining, for each, a temporal position in said sequence 180, means for reading stored data and generating therefrom data defining a plurality of intervening pictures occurring at time positions between those of said stored pictures 100, and providing a transition therebetween, and means 170A, 170B for editing said data so as to amend said sequence. Objects are preferably described in terms of Bezier curves (Figs 1, 2). Objects may be re-coloured by defining attributes at control points along the Bezier curves and then changing these attributes.

Description

ANIMATION

This invention relates to apparatus for, and a method of, producing a sequence of images defining an animated sequence, such as a cartoon.
Traditionally, cartoons are manually drawn as a sequence of frames which, when played in succession at relatively high speed, form a moving picture (typical frame rates are 24, 25 or 30 frames per second).
Even for a short sequence, many thousands of frames thus need to be drawn by hand, and production of the hand drawn frames requires large teams of skilled animators and assistants. Almost all cartoon animation today is still produced in this way.
The essential steps in the production of a cartoon are:
1. "Key frame" production, in which a senior animator draws each character at significant points throughout the sequence;
2. "In-betweening", in which more junior (less skilled) animators create the missing intermediate frames by a process of interpolating by eye between adjacent key frames; and
3. "Checking", in which the sequence of key frames and in-between frames is recorded on film and then replayed, typically with the sound track, to check for errors.
If necessary, frames are redrawn at this point; otherwise, the pencil drawings are then inked in, painted in the required colours, and recorded on film.
In view of the sheer volume of drawings required, and of the time and expense involved in producing cartoons by this method, some attempts have been made to automate parts of the process. Inking and colouring have successfully been automated, resulting in some savings in manpower and time.
It has also previously been proposed to automate the interpolation or "in betweening" stage. In such proposals, the key frames produced by the senior animator are scanned in some manner by input means into an image processor such as a programmed computer, where an internal representation of each is stored.
Corresponding points or lines, or other parts of two key frames, are identified in some manner and in between images are generated by producing a sequence of frames in each of which a similar set of points, lines or parts are generated by interpolation between those of two adjacent stored key frames. The remainder of the frame between the identified points or parts is then generated.
Such proposals have been uniformly unsuccessful, however, because the problem of identifying corresponding parts in two key frames is extremely difficult. Two key frames drawn by the same artist may appear similar to the human eye, but every point of the two line drawings may be different, and the image processing apparatus is unable to distinguish between differences which correspond to motion or intentional change, and are hence to be interpolated, and those which are merely accidental.
The present invention, in one aspect, provides apparatus which interpolates between key frames which are generated to explicitly correspond one to another.
Preferably, the key frames are stored as data defining parametric curves (that is, curves controlled by control points), and the control points in the two key frames are labelled as corresponding one to the other.
Preferably, the apparatus includes means allowing the user to define key frames, and preferably key frames are defined by adapting a common template key frame so that all key frames correspond.
In this embodiment, the apparatus is provided with means for allowing the user to insert additional control points, so as to permit the complexity of a frame to vary; preferably, in this embodiment, the template is correspondingly modified to add the further control point.
Preferably, the apparatus is arranged to create or insert new key frames between existing ones. This helps overcome a problem with the above discussed type of prior art interpolation aids. This problem arises because two key frames are two dimensional projections of a solid object or objects. The object will have moved through three dimensions between key frames, and not usually merely in the two dimensions in which both key frames lie. Interpolating in the two dimensions of the key frames therefore generally leads to distortions which often make the interpolated frames look unrealistic. However, equally, for simple shapes this distortion may not be particularly noticeable. We have found that an efficient solution is therefore provided by interpolating in two dimensions, replaying the interpolated sequence, and then permitting a user to insert a new intermediate key frame where the present interpolation is unsuccessful due to this problem.
One prior proposal for character animation is described in "3-D character animation on the symbolics system", by P Bergeron, issued as Course Notes on the Course "3D Character Animation by Computer" at the Siggraph '87 Conference in 1987.
This publication describes the use of a three dimensional modelling program (S-geometry) and an animation (in the broad sense of the word) program (S-dynamics) to animate cartoon characters. Each character is modelled as a spline surface defined by a mesh of control points. The article proposes to create a plurality of "expressions" corresponding to different pictures, each of which is derived from a "source" expression. The "expressions" are created by performing a transformation upon a plurality of the control points of the mesh of the "source" expression, using the S-geometry program.
After creating the expressions, the separate S-dynamics program is used to produce a sequence of pictures. A bar chart illustrating the value of a control variable as the height of bars along a timeline is interactively manipulated by a position sensitive input device (a mouse) to create or modify the different pictures of the sequence. The control variable may dictate the amount of a given transformation operation to be applied (such as rotation), or may give a percentage of one of the predetermined expressions so that a given picture is interpolated between the "source" expression and the given expression in proportion to the percentage. In this case, an interpolated sequence of pictures can be produced between first and second percentages of a single expression. Different sub-sequences of pictures can be combined to produce a composite sequence, of which each picture is defined by the control variables from each of the sub-sequences.
The publication teaches that, although automatic interpolation between "key" values of the control variable (for instance, percentages of a single expression) is possible in the S-dynamics program, it is particularly important for character animation not to use this facility for interpolating between percentages of an expression, but instead to allow the user to specify every value of the control variable for each frame by freehand sketch. Further, although the values of the control variable may be edited, the "expressions" themselves cannot be edited while the sequence is retained. We have found, however, that the apparatus described in the publication makes character animation relatively slow and laborious.
Accordingly, in a first embodiment, we provide a system in which, rather than specifying "expressions" and then manipulating percentages of a single expression to create a sequence or sub-sequence, the apparatus defines pictures which will actually occur at predetermined times in the sequence and then generates the interpolated pictures in between. The interpolation may be, in one system mode, performed automatically following a predetermined (e.g. linear) curve or rule. In this case, in particular since the interpolation will not always be successful, it is also strongly preferred (but not essential in all embodiments) to provide editing means enabling the user to subsequently edit the sequence thus produced.
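As an illustrative sketch only (in Python, with invented names; the patent does not specify an implementation), generating an in-between picture at a time between two stored key frames might reduce to linearly blending the positions of corresponding control points:

```python
def interpolate_frame(key_a, key_b, t_a, t_b, t):
    """Linearly interpolate control point positions between two key
    frames stored at times t_a and t_b, for an intermediate time t.

    key_a and key_b are lists of (x, y) control points corresponding
    one-to-one, as the text requires.  Illustrative names only.
    """
    assert t_a < t_b and len(key_a) == len(key_b)
    w = (t - t_a) / (t_b - t_a)          # 0 at key_a, 1 at key_b
    return [(xa + w * (xb - xa), ya + w * (yb - ya))
            for (xa, ya), (xb, yb) in zip(key_a, key_b)]
```

In practice the tangent end points of each control point would be blended in the same way, and a non-linear rule could replace the weight `w`.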
This greatly accelerates the animation process, since in many cases the interpolation will be acceptable, whilst not sacrificing the necessary flexibility to amend those instances where it is not.
In another aspect, the invention provides a system comprising means for defining stored pictures and means for interpolating predetermined proportions of the stored pictures to build up an animated sequence; the means for interpolating may be automatic as above, or could be manually controlled as in the above referenced prior art, but in either case, the unnecessary complexity of the above discussed proposal is avoided by providing that the pictures are provided as lines defined by a small (preferably minimal) number of control points, so as to obtain bold, smooth curves. Very preferably, editing means are provided enabling an operator to edit a picture by editing individual control points in isolation, rather than by effecting a transformation on a plurality of control points as in the prior art discussed above; this is found to provide a simple but fast and effective method of generating character animation.
To facilitate interpolation, the number of such control points in different pictures is generally the same, and control points of different pictures generally correspond one to another. In a particularly preferred embodiment, however, there are provided means for storing points of two different kinds: a first (active) kind, which define properties of a line and are used by the interpolation means to generate interpolated pictures, and a second (dormant) kind, which do not. The editing means is arranged to be capable of converting such a dormant point to an active one, whilst retaining the correspondence between the point and those of other pictures. This enables the complexity of a particular picture to be increased relative to others without losing the correspondence between the pictures which facilitates interpolation, whilst retaining a small set of control points to keep the lines bold and simplify processing.
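A minimal sketch of the two kinds of point, assuming a simple boolean flag (the names `Point`, `activate` and `shaping_points` are illustrative, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float
    active: bool = True   # active points shape the line; dormant ones do not

def shaping_points(frame):
    """The points that actually define line properties: the active kind."""
    return [p for p in frame if p.active]

def activate(point):
    # Converting a dormant point to an active one changes only the flag,
    # so the point's correspondence with other pictures is retained.
    point.active = True
```

The key design point is that dormant points still exist in every picture, so the one-to-one correspondence needed for interpolation is never broken.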
In another preferred embodiment, all pictures are derived from an original or template picture, and new control points may be added to a particular picture to increase the complexity thereof, in which case the corresponding control point (preferably dormant) is added to the template picture so as to maintain the correspondence therebetween.
Other preferred aspects and embodiments of the invention are as described or claimed hereafter, with advantages that will be apparent from the following: singly or in combination, these embodiments enable the provision of an animation system for character animation which is fast and straightforward to use.
Equally, it will be appreciated that many aspects of the invention may be employed in fields other than animation (such as computer aided design).
The invention will now be described, by way of example only, with reference to the accompanying drawings, in which:

BRIEF DESCRIPTION OF DRAWINGS

Figs. 1a-1e illustrate curve approximations;
Figs. 2a and 2b illustrate the effect of varying the control variables used in parametric cubic curves;
Fig. 3 is a block diagram of apparatus according to one embodiment of the invention;
Fig. 4 is a block diagram of apparatus according to a further embodiment of the invention;
Fig. 5 is a block diagram of apparatus according to yet further embodiments of the invention;
Fig. 6a illustrates the information stored in a memory of the apparatus of these embodiments to represent the curve shown in Fig. 6b;
Fig. 7 is a block diagram schematically illustrating the operation of these embodiments in generating a display;
Fig. 8 is a flow diagram schematically illustrating the operation of the apparatus of Fig. 7;
Fig. 9 is a block diagram schematically illustrating the operation of editing the data shown in Fig. 6a;
Fig. 10 is a flow diagram schematically showing the process of operation of the apparatus of Fig. 9;
Fig. 11 is a flow diagram showing schematically the sequence of operations undertaken by a user of the apparatus of the above embodiments;
Fig. 12 is a block diagram indicating schematically the manner in which data is stored within the memory as a linked list;
Figs. 13a-13e provide greater details of the data stored within the elements of Fig. 12;
Fig. 14 shows in greater detail the arrangement of data stored in the memory corresponding to a displayed picture;
Fig. 15 shows schematically the arrangement of the information of Fig. 14 within the memory as a linked list;
Fig. 16 is a block diagram illustrating schematically the contents of an image store in the above embodiments;
Fig. 17 shows schematically the arrangement of display areas of a display device in the above embodiments;
Fig. 18 shows the appearance of related displays in the display areas;
Figs. 19a-19c show alternative display formats;
Fig. 20 illustrates the appearance of the display during editing;
Fig. 21 shows schematically the appearance of a further display area in the above embodiments;
Fig. 22 shows schematically a method of interpolating animated frames in one embodiment of the invention;
Fig. 23 shows schematically a part of the method of Fig. 22;
Fig. 24 shows schematically a method of converting a key frame to an interpolated frame in that embodiment;
Fig. 25 shows schematically a method of deleting a key frame in that embodiment;
Fig. 26 shows schematically a method of moving a key frame in that embodiment;
Figs. 27a-27d show schematically the results displayed on the monitor corresponding to the operations of Figs. 22-26;
Figs. 28a-28d show schematically the effects on the monitor 160 of converting between a point on a curve and a curve control point;
Figs. 29a-29d show corresponding amendments to the contents of the memory 121;
Fig. 30 shows schematically a method of preparing frames for interpolation or addition;
Fig. 31 shows schematically a method of adding frames together to produce a composite frame;
Fig. 32 shows schematically a method of deriving curve point positions in interpolated or added frames; and
Fig. 33 shows illustratively the functional elements of the invention, provided in this embodiment by a single processor.
PARAMETRIC CURVES

Before discussing the invention in detail, a brief description of parametric curves will be given; such curves form part of the common general knowledge of the skilled worker, and are referred to in, for example, "Interactive Computer Graphics", P Burger and D Gillies, 1989, Addison-Wesley, ISBN 0-201-17439-1, or "An Introduction to Splines for Use in Computer Graphics and Geometric Modelling", by R H Bartels, J C Beatty and B A Barsky, published by Morgan Kaufmann, ISBN 0-934613-27-3 (both incorporated herein by reference).
Referring to Fig. 1A, a fairly smooth freehand curve is shown. Referring to Fig. 1B, one way of representing the curve would be to draw a series of straight line segments, meeting at points. However, the number of straight line segments has to be large, as illustrated in Fig. 1C, before the simulation is at all convincing.
Alternatively, the curve may be represented as a series of curve segments running between points. If, as in Fig. 1D, adjacent curve segments have the same slope at the point at which they join, the curve can be made smooth.
One well known type of curve approximating technique employs a cubic curve in which the coordinate variables x and y are each represented as a third order or cubic polynomial of some parameter t.
Commonly, the value of the parameter is constrained to lie between 0 and 1. Thus, each curve segment is described as:

x = ax t^3 + bx t^2 + cx t + dx   (1)
y = ay t^3 + by t^2 + cy t + dy   (2)

Each segment has two end points, at which t = 0 and t = 1. The coordinates of the t = 0 end point are therefore x0 = dx, y0 = dy, and those of the t = 1 end point are given by:

x1 = ax + bx + cx + dx   (3)
y1 = ay + by + cy + dy   (4)

At the end points, the slope of the curve segment is also fixed or predetermined, so that each segment can be matched to its neighbours to provide a continuous curve if desired.
The shape of the curve between the end points is partially dictated by the slopes at the end points, but also by a further item of information at each point, which is conveniently visualised as the length of a tangent vector at each point. The curve between the two points may be thought of as being clamped at the end points, at fixed slopes thereat, whilst the tangent vector exerts a pull on the direction of the curve which is proportional to its length, so that if the tangent vector is long the curve tends to follow the tangent over much of its length.

The tangent vectors may be derived from the above equations (1)-(4) and vice versa; for example, where the end of the Bezier tangent vector at the t = 0 point has coordinates x2,y2, and that at the t = 1 point has coordinates x3,y3, the coefficients a, b, c, d are given by:

dx = x0 (and likewise dy = y0)   (5)
bx = 3(x0 - 2x2 + x3) (and likewise by)   (6)
cx = 3(x2 - x0) (and likewise cy)   (7)
ax = 3x2 - x0 - 3x3 + x1 (and likewise ay)   (8)

The differential of the curve equation with respect to the parameter t is:

dx/dt = cx + 2bx t + 3ax t^2   (9)

The differential values at the t = 0 and t = 1 points are, respectively:

3(x2 - x0) = cx;  3(y2 - y0) = cy;
3(x1 - x3) = cx + 2bx + 3ax;  3(y1 - y3) = cy + 2by + 3ay

From these equations it will be seen, by inspection, that the length of the tangent to the Bezier control points (x2,y2), (x3,y3) is 1/3 that of the actual tangent vector. Although the actual tangent vector could be employed, it is mathematically more convenient to employ the Bezier tangent vector (which has the same direction but 1/3 the magnitude).
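The coefficient equations (5)-(8) and the evaluation of equation (1) can be sketched directly, one coordinate at a time (the function names are illustrative, not from the patent):

```python
def bezier_coeffs(p0, p1, t0, t1):
    """Cubic coefficients (a, b, c, d) for one coordinate, from the
    segment end values p0 (at t = 0) and p1 (at t = 1) and the Bezier
    tangent end values t0, t1: equations (5)-(8) of the text."""
    d = p0
    c = 3 * (t0 - p0)
    b = 3 * (p0 - 2 * t0 + t1)
    a = 3 * t0 - p0 - 3 * t1 + p1
    return a, b, c, d

def bezier_point(a, b, c, d, t):
    """Evaluate equation (1), a*t^3 + b*t^2 + c*t + d, in Horner form."""
    return ((a * t + b) * t + c) * t + d
```

The same two functions serve for the y coordinate with the y values substituted, as the "likewise" clauses of equations (5)-(8) indicate.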
In the so called Hermite form of a cubic equation, the data used to define a curve segment is the coordinates of the end points, the slope of the tangent vector at each end point, and the length of each tangent vector.
In the Bezier format, the data used to define a curve segment are the coordinates of the end points, and the coordinates of the end of each tangent vector.
Conversion between the Hermite and Bezier format is merely a matter of polar to rectangular conversion, and vice versa.
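A sketch of that conversion for a single tangent, assuming angles in radians (function names are illustrative):

```python
import math

def hermite_to_bezier(px, py, angle, length):
    """Polar to rectangular: from the Hermite slope angle and tangent
    length at end point (px, py) to the Bezier tangent end point."""
    return px + length * math.cos(angle), py + length * math.sin(angle)

def bezier_to_hermite(px, py, tx, ty):
    """Rectangular to polar: from the Bezier tangent end point (tx, ty)
    back to the Hermite angle and length."""
    dx, dy = tx - px, ty - py
    return math.atan2(dy, dx), math.hypot(dx, dy)
```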
Fig. 2A shows the effect of varying the magnitude or lengths of the tangent vectors, whilst keeping their angle constant. It will be seen that the effect is to "pull" the curve towards the tangent vector, more or less strongly depending on the length of the tangent vector.
Fig. 2B shows the effect of varying the angle of the tangent vector whilst keeping its magnitude fixed.
Other types of cubic curve are also known; for example, the B-spline, which is defined by two end points and a plurality of intervening control points through which the curve does not pass. However, the Bezier curve description is used in many applications because it is relatively easy to manipulate; for instance, in matching an approximated curve to an existing curve, the coordinates and tangent angles at points along the curve can directly be measured and employed. The PostScript command language used to control many laser printers employs this curve description, accepting values defining the coordinates of curve segment end points and the coordinates of corresponding tangent end points.
In general, a smooth curve is defined by a number of such end points, and two adjacent such segments will share a common end point. If the curve is to be smooth, the tangent angles defined at the end point in relation to each curve segment will be equal, although the tangent vector lengths will in general not be.
However, as shown in Fig. 1e, it is possible to represent a line with a curvature discontinuity by providing that the tangent angle at an end point is different for each of the two segments it defines.
For present purposes, the main usefulness of this form of curve representation is that a smooth, bold curve can be defined using only a small number of coefficients or control points, and parts of it can be amended without extensive recalculation of the whole line.
Apparatus for performing the invention will now be described.
GENERAL DESCRIPTION OF APPARATUS

Referring to Fig. 3, apparatus according to an embodiment of the invention comprises a computer 100 comprising a central processing unit 110, a memory device 120 for storing the program sequence for the CPU 110 and providing working read/write memory, a frame store 130 comprising a series of memory locations each associated with, or mapped to, a point in an image to be generated or processed, and an input/output controller 140 providing input and output ports for reading from and writing to external devices, all intercoupled through common parallel data and address buses 150.
A monitor 160 is connected to the computer 100 and its display updated from the frame store 130 under control of the CPU 110. At least one user input device 170a,170b is provided; typically a keyboard 170b for inputting commands or control signals for controlling peripheral operations such as starting, finishing and storing the results of an image generation or image processing operation, and a position sensitive input device 170a such as, in combination, a stylus and digitising tablet, or a "mouse", or a touch sensitive screen on the monitor 160, or a "trackerball" device or a joystick. A cursor symbol is generated by the computer 100 for display on the monitor 160 in dependence upon the signal from the position sensitive input device 170a to allow a user to inspect an image on the monitor 160 and select or designate a point or region of the image during image generation or processing.
A mass storage device 180 such as, for instance, a hard disk device is preferably provided as a long term image store, since the amount of data associated with a single image stored as a frame at an acceptable resolution is high. Preferably, the mass storage device 180 also or alternatively comprises a removable medium storage device such as a floppy disk drive or a high capacity tape drive, to allow data to be transferred into and out from the computer 100.
Also preferably provided, connected to the input/output device 140, is a printer 190 for producing a permanent visual output record of the image generated. The output may be provided on a transparency or on a sheet of paper. A film recorder 196 and/or a video recorder 197, and means for generating a suitably formatted moving picture output comprising a succession of frames, are also preferably provided.
A picture input device 195 such as a scanner for scanning an image on, for example, a slide, and inputting a corresponding video signal to the computer 100 may also be provided.
Referring to Fig. 4, an animation system in one embodiment of the invention comprises the computer 100 of Fig. 3 providing an animator's workstation, and arranged to execute three different stored sequences so as to comprise an interpolator 101, a replayer 103 and a renderer 105. The interpolator, with which the present invention is chiefly concerned, is arranged to generate sequences of image frames. The replayer 103 is arranged to recall a stored image sequence previously created, and generate a display of the sequence as a moving image on the animator's workstation monitor 160. The renderer 105 is arranged to colour the image, and may also affect the way in which the lines are represented (for example, their thickness). The renderer 105 may operate as disclosed in our earlier application 9110945.4.
Referring to Fig. 5, in an embodiment of the invention for larger scale animation production, there may be a plurality of workstations 110a-110c allowing different users to develop different parts of a given animated sequence, sharing a common mass storage (file server) unit 180 such as a disk drive unit with a controlling processor, and connected thereto via a local area network (LAN) such as Ethernet.
Since the rendering process usually requires many more pixel calculations than the interpolation process according to the invention for each frame, it may be advantageous to provide separate processors (typically, a smaller number) 110d-110f for performing the rendering operation, interconnected to the workstations 110a-110c via the local area network.
This enables simpler computers to be used for the animator workstations 110a-110c; they may, for example, lack maths coprocessor devices and/or sophisticated graphics engines. Alternatively, the same processors may act either as rendering processors or workstations, depending on demand. In this case, control means may be provided for determining the current processing load on each processor and for allocating rendering or workstation tasks to processors connected to the network so as to manage (e.g. balance) the processing load.
One example of a suitable computer 100 for implementing the above embodiments of Figs. 3 and 4 is the NeXTCUBE computer including the NeXTdimension colour board, available from NeXTComputer, Inc., USA.
This arrangement provides direct formatted outputs for connection to a videocassette recorder or other video storage device, and accepts video input signals.
Further, it includes means for compressing images for storage on a disk store 180, and for decompressing such stored images for display.
In this embodiment of the invention, display frames, consisting of line drawings of objects, are created and/or edited with reference to stored control point data (preferably data stored in the Bezier format referred to above). In other words, a stored representation of a display frame comprises a plurality of control points which define line segments which make up a line representation.
Referring to Fig. 6a, the memory 120 includes a working memory area 121 to which data may be written (for example, a random access memory area). Referring to Fig. 6b, an image displayed on the monitor 160 includes at least one line A, which is drawn as a cubic curve defined by three control points A1, A2, A3. Corresponding image frame data representing the line image is stored within a frame table 122 within the working memory 121, as a series of curves (curve 1, curve 2, etc.), each of which is defined by a series of control points (point 1, point 2, point 3). Each control point is represented by data comprising positional data (xi, yi) representing the position within the area of the display of that control point, and tangent data (xei, yei, xfi, yfi) defining two tangent end points associated with the curve segments on either side of the control point. The tangent end point data (xei, yei, xfi, yfi) are stored as position data x, y defining the position of the tangent end point. It would also be possible to store instead the x, y offsets from the control point position.
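Purely as an illustration (the patent does not prescribe a concrete layout), the frame table described above might be modelled like this:

```python
from dataclasses import dataclass

@dataclass
class ControlPoint:
    # Position of the control point within the display area.
    x: float
    y: float
    # Absolute positions of the two tangent end points, for the curve
    # segments on either side of the control point.
    xe: float
    ye: float
    xf: float
    yf: float

# A frame table is a list of curves; each curve is a list of control
# points (curve 1, curve 2, ... in the text).  Values are invented.
frame_table = [
    [ControlPoint(100, 100, 80, 120, 120, 80),    # A1
     ControlPoint(200, 150, 180, 170, 220, 130),  # A2
     ControlPoint(300, 100, 280, 120, 320, 80)],  # A3: curve A
]
```

Storing tangent ends as absolute positions, rather than offsets from the control point, matches the choice stated in the text; the alternative offset representation would differ only in adding the control point position on use.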
The monitor 160 is usually of the raster scanned type and consequently expects a raster scanned image, which is supplied from a memory mapped image store 130 as discussed above. Accordingly, it is necessary to provide a line display generating means 111 arranged to read the stored data representing the curve segments making up the frame, and generate corresponding raster image data comprising a plurality of pixels for storage in the image store 130. Each pixel need only comprise a single data bit or a small number of bits, if the display is monochrome black/white.
The line display generator 111 shown in Fig. 7 is accordingly arranged to access the memory 122 to read the stored data, and the image store 130 to write pixel data. As shown in Fig. 8, it calculates intervening point positions, and sets those memory locations within the image store 130 which correspond to pixels lying on the curve to "dark" and all those which do not to "bright". The contents of the image store 130 are then displayed on the monitor 160. In practice, the line display generating means 111 comprises the CPU 110 operating under control of the programme stored in a programme store area in the memory 120. Where the computer 100 comprises the above mentioned NeXTComputer, the line display generating means 111 may comprise the CPU 110 operating under the "PostScript" display command language provided within the operating system.
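The calculation of intervening point positions might be sketched as follows: a naive fixed-step sampler, not the patent's actual algorithm (practical rasterisers typically use adaptive subdivision), with the image store stood in for by a 2-D list:

```python
def rasterize_segment(image, coeffs_x, coeffs_y, steps=100):
    """Step the parameter t through one cubic segment, computing
    intervening point positions from the (a, b, c, d) coefficients for
    x and y, and set the corresponding pixels in the image store to 1
    ("dark"); untouched pixels remain 0 ("bright")."""
    ax, bx, cx, dx = coeffs_x
    ay, by, cy, dy = coeffs_y
    for i in range(steps + 1):
        t = i / steps
        x = ((ax * t + bx) * t + cx) * t + dx   # equation (1)
        y = ((ay * t + by) * t + cy) * t + dy   # equation (2)
        image[int(round(y))][int(round(x))] = 1  # mark pixel "dark"
```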
The manner in which some basic operations are performed by the above apparatus will now be discussed.
EDITING A FRAME

As will be discussed in greater detail below, the preferred embodiments of the invention provide means for enabling a user to edit a frame. Editing a frame may involve either modifying the trajectory of existing lines or (more rarely) adding new lines. It is therefore necessary both to amend the data held in the frame table 122, and desirably to amend the image data in the image store 130 so as to enable the user to view the effects of the change. It is found that the best way of providing the user with means for amending the frame data stored in the table 122 is to allow him to employ a position sensitive input device 170a, so as to appear to directly amend the displayed representation of the frame on the screen monitor 160.
DEFINING AND EDITING A CURVE In this embodiment, referring to Fig. 9, a user manipulates the position sensing input device 170a, for example a "mouse", by moving the device 170a so as to generate a signal indicating the direction and extent of the movement. This signal is sensed by the device input/output controller 140, which provides a corresponding signal to a cursor position controller 112 (in practice, provided by the CPU 110 operating under stored program control) which maintains stored current cursor position data in x,y co-ordinates and updates the stored cursor position in accordance with the signal from the device input/output controller 140. The cursor position controller 112 accesses the image store 130 and amends the image data corresponding to the stored cursor position to cause the display of a cursor position symbol D on the display shown on the monitor 160. The user may thus, by moving the input device 170a, move the position of the displayed cursor position symbol D.
In a preferred embodiment, the display line generator 111 is arranged in the editing mode not only to write data corresponding to the line A into the image store 130, but also to generate a display of the control point data. Accordingly, for each control point A1,A2, the display generator 111 writes data representing a control point symbol (for example, a dark blob) into the image store 130 at address locations corresponding to the control point co-ordinates x,y.
Further, the display generator 111 preferably, for each control point, correspondingly generates a second control point symbol E1 (or two such symbols) located relative to the point A1 along a line defined by the control point tangent data, at a position xe1,ye1 and/or xf1,yf1; preferably, a line between the two points A1 and E1 is likewise generated to show the tangent itself.
To enter a new curve A, the user signals an intention so to do (for example by typing a command on the keyboard 170b, or by positioning the cursor symbol at a designated area of a displayed control menu), positions the cursor symbol D at a desired point on the display 160, by manipulating the position sensitive input device 170a, and generates a control signal to indicate that the desired point has been reached. The cursor position controller 112 supplies the current cursor position data to the frame table 122 as control point position co-ordinates, and the display generator 111 correspondingly writes data representing a control point symbol into the image store 130 at address locations corresponding to the control point co-ordinates. The user then inputs tangent extent point information, for example via the keyboard 170b, or in the manner described below. When a second path control point has been thus defined and stored in the table 122, the supervisory image generator 111 will correspondingly generate the line segment therebetween on the supervisory display by writing the intervening image points into the image store 130.
Referring to Fig. 10, to amend the shape or path of the line A displayed on the supervisory display, a user manipulates the input device 170a to move the cursor position symbol D to coincide with one of the control point symbols A1 or E1 on the display 160. To indicate that the cursor is at the desired position, the user then generates a control signal (for example, by "clicking" a mouse input device 170a). The device input/ output controller 140 responds by supplying a control signal to the cursor position controller 112.
The cursor position controller 112 supplies the cursor position data to a supervisory display editor 113, (comprising in practice the CPU 110 operating under stored program control) which compares the stored cursor position with, for each point, the point A position (X,Y) and the point E position (xe,ye).
When the cursor position is determined to coincide with any point position A or tangent end position E the display editor 113 is thereafter arranged to receive the updated cursor position from the cursor controller 112 and to amend the point data corresponding to the point A1 with which the cursor symbol coincides, so as to move that point to track subsequent motion of the cursor.
If the cursor is located at the point A1 on the curve A, manipulation by a user of the input device 170a amends the position data (X1,Y1) in the line table 122, but leaves the tangent data (xe1,ye1) unaffected. If, on the other hand, the cursor is located at an end of tangent point E1, manipulation by a user of the input device 170a alters the tangent end point data in the frame table 122 within the memory 120, leaving the control point position data (x,y) unaffected.
In either case, after each such amendment to the contents of the line table 122, the display generator 111 regenerates the line segment affected by the control point in question within the image store 130 so as to change the representation of the line on the monitor 160.
Once a line has been amended to a desired position, the user generates a further control signal (e.g. by "clicking" the mouse input device 170a), and the supervisory display editor 113 thereafter ceases to amend the contents of the memory 120. The cursor controller 112 continues to update the stored cursor position.
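The hit-test and drag behaviour of the supervisory display editor 113 might be sketched as follows (a minimal illustration in Python; the field names and the pixel tolerance are assumptions, not taken from the patent):

```python
def hit_test(cursor, control_points, tolerance=4.0):
    """Return (point, kind) for the first control point position or tangent
    end within `tolerance` pixels of the cursor, or (None, None)."""
    cx, cy = cursor
    for cp in control_points:
        if (cp["x"] - cx) ** 2 + (cp["y"] - cy) ** 2 <= tolerance ** 2:
            return cp, "position"
        if (cp["xe"] - cx) ** 2 + (cp["ye"] - cy) ** 2 <= tolerance ** 2:
            return cp, "tangent"
    return None, None

def drag_to(cp, kind, new_pos):
    """Track subsequent cursor motion: move either the curve point
    (tangent data unaffected) or the tangent end (position unaffected)."""
    if kind == "position":
        cp["x"], cp["y"] = new_pos
    elif kind == "tangent":
        cp["xe"], cp["ye"] = new_pos
```

After each call to `drag_to`, the display generator would regenerate the affected line segment, as described above.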
This method of amending the line representation is found to be particularly simple and quick to use.
GENERAL DESCRIPTION OF PROCESSES The processes performed by the apparatus of the preferred embodiment of the invention to enable a user to define an animated sequence are: 1. Defining Objects to be Animated - for example, characters. As will be disclosed in greater detail below, the apparatus of this embodiment permits the definition of a topological representation of a character or object to be animated.
2. Defining Key Frames - image frames in which the character previously defined is represented in a particular shape, orientation or position are defined, corresponding to spaced apart frames of an animated sequence.
3. Creating Interpolated Frames - from the key frames created above, a plurality of intervening frames in which the object is progressively manipulated is derived.
4. Displaying/Editing - the sequence of key frames and interpolated frames, or a representation thereof, is displayed and may be edited.
5. Replaying - the sequence of frames is successively displayed at a display rate corresponding to a video image (24, 25 or 30 frames per second), to enable the user to view a representation of the animated sequence. The sequence may be replayed with an associated sound track, to assess the correctness of timings and synchronisation.
6. Rendering - frames or sequences are coloured and/or shaded, and/or mixed with a desired background, to produce a finished video sequence.
GENERAL OVERVIEW OF SYSTEM OPERATION One typical sequence of operations of this embodiment is shown in Fig. 11. Initially, the user will wish to create a character or object to animate. The shape of the object will be changeable, but its underlying topology is maintained constant and the user will therefore create an initial "template" or set of data storing this underlying topology. The template is a view of the character or object which includes all the lines (and, therefore, is defined by all the control points) which it is desired to show in later pictures of the character or object. The template picture or frame is created on the monitor 160, preferably using the position sensitive input device 170a (for example a "mouse") as described in greater detail below.
At this stage, it may be desirable to store the template data (the curve control points, together with identification data labelling the template) permanently, on the mass storage device 180. Equally, rather than creating a template anew, the user may summon a stored template from mass storage 180.
The next stage may be to create a number of key frames. As is the case with hand produced animations, key frames are frames spaced apart in time which include some change of shape or position of the character or object to be animated. Each key frame therefore has corresponding data identifying the point in the animated sequence at which the key frame occurs.
Key frames may be produced directly from the template to which they correspond, by copying the control point data making up the template and then editing the copied control point data to cause the key frame to diverge from the template. The editing is preferably performed interactively, using as above the position sensitive input device 170a, and viewing the effects of the editing on the monitor 160. The edited control point data then comprises the key frame data. A key frame may likewise be produced by copying an existing frame; in this case it will be an indirect copy of the template frame.
At this point it may be convenient to save the key frame data to the mass storage device 180.
Preferably, the key frame control point data comprise offset data defining the difference between a given key frame data and the corresponding data in the template. Thus, when the template is amended, the key frames need not be individually amended. Other advantages of this representation are discussed below.
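The offset representation described above can be sketched as follows (an illustrative Python fragment; the flat list-of-points layout is an assumption). Because a key frame stores only differences from the template, amending a template control point implicitly moves the corresponding point in every key frame:

```python
def key_frame_points(template_points, offsets):
    """Absolute key-frame control points = template points + stored offsets.
    Amending a template point shifts every key frame built on it without
    touching the stored offset data."""
    return [(tx + dx, ty + dy)
            for (tx, ty), (dx, dy) in zip(template_points, offsets)]
```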
The key frames thus generated, or key frames recalled from the mass storage device 180, may then be processed to derive the intervening frames (interpolated frames). Each interpolated frame comprises, as above, a set of control points defining the curves or lines making up the image frame. Each control point of each interpolated frame is derived to lie between the control points of the pair of key frames it lies between. The number of interpolated frames depends upon the separation in time of the two key frames between which the interpolation is performed.
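The derivation of in-between control points might look like this (a sketch assuming linear interpolation, which the description of the timeline editing below indicates is the typical progression; the function name is hypothetical):

```python
def interpolate_frames(key_a, key_b, count):
    """Generate `count` in-between frames between two key frames; each
    control point lies on the straight line between its positions in
    the two key frames, its distance set by the interpolation factor f."""
    frames = []
    for i in range(1, count + 1):
        f = i / (count + 1)               # interpolation factor, 0 < f < 1
        frames.append([(ax + f * (bx - ax), ay + f * (by - ay))
                       for (ax, ay), (bx, by) in zip(key_a, key_b)])
    return frames
```

The `count` argument corresponds to the separation in time of the two key frames, one in-between frame per intervening frame number.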
The user may next view the interpolated sequence.
Typically, key frames are separated by less than one second, or less than 30 interpolants (although greater separations are of course possible) and it is therefore possible to provide a display including several key frames and the interpolants lying therebetween simultaneously on the screen of the monitor 160. At this point, the user may store the sequence of interpolated frames in mass storage 180, or may wish to amend the sequence in some manner.
A first type of amendment comprises changing the time occurrence of the key frame; in this case, the key frame itself is not redrawn but the number of interpolants will change and consequently the interpolation must be repeated. Alternatively, the user may wish to edit a key frame. Finally, he may (as discussed below) decide that a sequence cannot be directly interpolated and that therefore a new key frame needs to be inserted between two existing key frames; this may be achieved by converting an interpolated frame into a key frame (as discussed below in greater detail).
The next stage may typically be to animate the sequence, to test whether the timing and appearance are correct. The apparatus therefore displays each key frame and interpolated frame of the sequence in turn, at short intervals in time. If the sequence is to be displayed at "normal" running speed, the interval is 1/24, 1/25 or 1/30 second between frames. Preferably, however, the user can vary the frame repetition rate so as to view the sequence in slow motion. Preferably, the user can also designate a short sub-sequence to be animated, and can move repeatedly forwards or backwards through the short sub-sequence. If the sequence is not correct, then as before the user will edit either the appearance or position in time of the key frame, or add or delete a key frame. The control point data making up the frames of the sequence are then typically saved to the mass storage device 180, for later use.
Additionally, or alternatively, the frames may be coloured and/or filled and/or added to an existing background ("rendered"), to generate a corresponding series of raster image frames which may be displayed on a colour monitor, saved on a video tape recorder, or compression coded and stored on the mass storage device 180.
It will be clear from Fig. 11 and the following description that the above described sequence is by no means exhaustive of the options open at each stage to the user.
STORAGE OF DATA IN MEMORY 120 From the foregoing, it will be apparent that a number of different types of data must be evaluated and stored by the apparatus to enable a completed animated sequence to be produced. One exemplary arrangement of data will now be discussed.
A finished animation will involve different characters, and will be produced in segments or sequences. Referring to Fig. 13a, for each completed animation (labelled "epoch" in Figs. 12 and 13) a table 1000 of data is defined which includes data establishing the identity of the animated sequence (a title), data relating to the soundtrack, and a table of sequences 1100, 1200, 1300, 1400 of successive frames. The sequences will occur in succession.
Conveniently, the sequences are stored as a linked list in the working memory 121; in other words, the complete animation table stores data identifying the location in the memory 121 of the first (and preferably the last) of the sequence tables 1100 and each sequence table 1100 ... includes data identifying the address in memory 121 of the next sequence table (and preferably, the previous sequence table).
In animation of cartoons, for example, it is very common for several parts of a character or object to move simultaneously and substantially independently.
For example, a character may walk and talk at the same time. In a preferred embodiment, the invention therefore enables separate movements to be defined for different parts of the same template. This is achieved by creating separate key frames and interpolated frames therebetween for different parts of the template, and editing the separate sets of key frames and interpolants to achieve the desired motion, and then subsequently merging together the separate sets as will be discussed below. Each set of key frames and interpolants does form a sequence over time, but for consistency the term "sequence" will be reserved in the following for the merged sequence of frames, and the term "timeline" will be used to describe the sequential set of frames (key frames and interpolated frames) corresponding to separate parts of the template, separately animated, which are merged to form the sequence. Of course, where the whole object is animated simultaneously, the single timeline also comprises the finished sequence.
Thus, in general, referring to Fig. 13b, each sequence table 1100 comprises data defining the template frame which the sequence animates, data (e.g. a pointer) indicating to which animation or epoch 1000 the sequence 1100 corresponds, a set of frame tables (curve sets) comprising the composite or merged sequence (conveniently stored as a linked list of frames), a set of timeline tables 1110, 1120, 1130 ... (discussed below), data defining a currently displayed timeline, and, conveniently, a set of frames or curve-sets which comprises the merged sum of all timelines except that currently displayed. This enables the currently displayed timeline to be easily edited, then merged with this "base" sequence of frames to replace the existing composited sequence.
The length, and the first and last frame addresses of the composite sequence are also stored.
Referring to Fig. 13c, each timeline table 1110, 1120 ... likewise defines a series of frame data tables, and for convenience these are stored as a linked list of key frames 1111, 1112, 1113. Referring to Fig. 13d, each key frame data table 1111, 1112, 1113 includes a pointer to a frame table 122, but also includes further data. A pointer to a list of interpolant frame tables comprising those defining the interpolant frames lying after that key frame and prior to the next is included. Frame tables 122 are associated with a stored frame type indicator, which in this case indicates that the frame table 122 is a key frame. Additionally, data defining the key frame number (i.e. its order amongst the key frames in the timeline 1110) is stored.
Referring to Fig. 13e, the interpolant frame data tables 1111A, 1111B, 1111C ... for each key frame 1111 each comprise a pointer to a frame curve set data table 122. Each also includes an interpolation factor (typically 0-1) defining the extent to which the frame depends upon the following key frame 1112; thus, for successive interpolated frames 1111A, 1111B, 1111C ... the interpolation factor gradually rises from close to 0 to close to 1. The interpolated frame 1111A and the key frame 1111 each store a frame number, which defines their position in the timeline 1110 and sequence 1100. Frame numbers correspond to points in time succeeding one another by 1/24, 1/25 or 1/30 of a second (or whatever the frame repetition period is desired to be).
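The assignment of interpolation factors from frame numbers might be sketched as follows (a hypothetical helper, assuming the factor rises linearly between the two key frames):

```python
def interpolation_factors(key_frame_no, next_key_frame_no):
    """Interpolation factor for every frame number strictly between two
    key frames, rising from close to 0 towards 1 as the next key frame
    is approached."""
    span = next_key_frame_no - key_frame_no
    return {n: (n - key_frame_no) / span
            for n in range(key_frame_no + 1, next_key_frame_no)}
```

For key frames at frame numbers 0 and 4, this yields factors 0.25, 0.5 and 0.75 for the three intervening interpolated frames.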
Figs. 14 and 15 show the arrangement of the frame table 122 of Fig. 6a in greater detail. Each frame table 122 includes a list of lines or curves making up a set which represent the object or character which the frame depicts (and corresponds topologically to the template). The template, key frames and interpolated frames may thus all be represented by similar frame tables 122. The lines or curves are conveniently provided as a linked list of curve tables 2100, 2200, 2300, 2400, each curve table comprising a list of curve control points (again conveniently stored as a linked list) 2110, 2120, 2130.
Each control point 2110 comprises position data defining the control point coordinates, and position data defining the control point tangent end coordinates. The curve segment to the next control point may include attribute control points (which will be discussed in greater detail below) for controlling the values of attributes such as colour and transparency during the rendering process, or for enabling compatibility during interpolation as discussed below, and in this case it is desirable for the positions of these attribute control points to be interpolated between key frames at which they are defined, for use in the subsequent rendering operation.
Accordingly, the control points 2110 ... include a list or table of attribute control points over the curve segment to the next curve control point. Each attribute control point table entry may comprise data defining the value of the attribute controlled by the point (for example, the line colour or transparency), and comprises data defining the position along the line segment of the attribute control point; conveniently, this is the value of the parameter t at the point. Further details of attribute control points will be found in our earlier British Application No. 9110945.4.
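The control point organisation just described might be modelled as follows (Python dataclasses; the names are illustrative, not taken from the patent):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class AttributePoint:
    t: float                  # parametric position along the curve segment
    value: object             # e.g. a line colour or transparency value

@dataclass
class ControlPoint:
    x: float                  # curve control point position
    y: float
    xe: float                 # tangent end position
    ye: float
    # attribute control points over the segment to the next control point
    attributes: List[AttributePoint] = field(default_factory=list)
    # linked-list pointers to neighbouring control points on the curve
    next: Optional["ControlPoint"] = None
    prev: Optional["ControlPoint"] = None
```

A curve table would then hold the head of such a linked list, and a frame table a linked list of curve tables.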
The use of a linked list arrangement discussed above for storing frame data, timeline data and sequence data is not essential, but is particularly preferred since this enables individual frames, sub-sequences or sequences to be moved in time by breaking and replacing links at either side of the frame, sequence or sub-sequence.
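The breaking and replacing of links can be sketched with a doubly linked list (an illustrative Python fragment; the node fields are assumptions):

```python
class Node:
    """One frame (or sub-sequence head) in a doubly linked list."""
    def __init__(self, frame):
        self.frame = frame
        self.prev = None
        self.next = None

def unlink(node):
    """Break the links at either side of a node, closing the gap."""
    if node.prev:
        node.prev.next = node.next
    if node.next:
        node.next.prev = node.prev
    node.prev = node.next = None

def insert_after(anchor, node):
    """Replace links so that `node` follows `anchor`."""
    node.prev, node.next = anchor, anchor.next
    if anchor.next:
        anchor.next.prev = node
    anchor.next = node
```

Moving a frame in time thus costs only a few pointer updates, regardless of the length of the sequence.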
During operation of the apparatus, all of the above data tables will normally be resident in the working memory 121. DISPLAYS ON MONITOR 160 From the foregoing, it will be apparent that a considerable amount of data is held by the apparatus during use, and that the data is held in a form in which it is not immediately comprehensible to the user. The manner in which data is presented to the user through the monitor 160, and in which it may be amended by the user, is therefore of considerable importance, and has major functional effects on the performance of the apparatus according to the invention.
Firstly, it is advantageous to provide a single frame display, and means for designating a stored frame (an interpolated or a key frame) for display. To display a single frame, the display generator 111 reads the corresponding frame table 122 and generates corresponding image data in an image buffer 130, which is then displayed.
Referring to Fig. 16, accordingly, preferably, a plurality of image buffers 130a, 130b, 130c ... are provided within the image store 130. One buffer 130a comprises the display buffer, which represents (is mapped to) the display produced on the monitor 160. The other buffers 130b, 130c ... provide "windows", as is known generally in the computing art, and each of which contains image data corresponding to a raster image optionally forming a portion of the monitor 160 display. The images held in the frame buffers 130b, 130c ... are combined into the buffer 130a, and the size, position and order (i.e. which window overwrites which) may be determined by the user manipulating the keyboard 170b or mouse 170a, as is provided in the operating systems of many commercially available personal computers in a manner which is well known and forms no part of the present invention.
Referring to Fig. 17, the display generated on the monitor 160 may therefore include display areas 160b-160e corresponding to some or all of the buffers 130b-130e, although preferably means (the input means 170 and CPU 110) are provided for enabling a user to select only one or some such display areas.
The buffer 130b as discussed above comprises image data which corresponds to a single selected frame of a sequence.
The buffer 130c is likewise dimensioned to contain a single frame image, which is however the image corresponding to the stored template.
The buffer 130d is arranged to store an image which comprises a montage of a succession of frames (key frames and interpolated frames) having successive frame numbers, and defining part of a timeline or sequence, in a manner described in greater detail below.
The buffer 130e stores a bar chart image comprising a plurality of bars each corresponding to one frame of the image in the buffer 130d, and each displaying the value of the interpolant factor for a corresponding frame as a length along the bar.
The appearance of an exemplary corresponding display is as shown in Fig. 18, in which the dark images correspond to key frames and the grey images correspond to interpolated frames. It will be seen that in this embodiment, a given frame may be represented in three different manners simultaneously; firstly, as an individual display on the display area 160b which corresponds to the contents of the image store 130b; secondly, as part of sequence displayed in the sequence display 160d corresponding to the image store 130d; and thirdly, as a bar of the bar chart representing the timeline in which the image is included, displayed in the display area 160e corresponding to the image buffer 130e.
Referring to Figs. 19a-c, the image held in the sequence buffer 130d may be presented in differing formats. In a first format, shown in Figs. 19a and 19b, the sequence image is produced by writing into the buffer 130d raster image data corresponding to each of the frames making up the sequence, so as to generate a display 160d in which the frame images are progressively displaced one from the other, but with some overlap so that each frame image partially overwrites its predecessor in the buffer 130d. The frame images could also be provided with no progressive displacement (i.e. superimposed).
Fig. 19c shows an alternative embodiment in which each frame image is written into the buffer 130d into a spatially separate portion thereof, without overlap.
This embodiment is also illustrated in Fig. 18. The display format of Fig. 19c is of assistance in viewing motion, since corresponding parts of the object in successive frames are close together. The representation of Fig. 19a, however, enables each frame to be more clearly examined. Advantageously, preferred embodiments of the invention provide means (e.g. the keyboard 170b) for selecting between these modes. It may also permit the displacement between successive frame images in the mode shown in Fig. 19c to be varied.
Referring to Fig. 20, the presentation of a frame in the frame display area 160b when it is desired to edit a frame is shown. When the user indicates a desire to edit a frame by selecting that frame (by manipulating the keyboard 170b or position sensitive input device 170a) the display generator 111 is arranged not only to generate the frame image data in the frame buffer 130b, but also to generate symbols (e.g. dots) at curvature control points. Preferably, the tangent end points and, more preferably, the tangent extent lines are also drawn. Finally, a cursor symbol (shown as a "+") is displayed, to enable a user to edit the frame image as discussed above using the position sensitive input device 170a.
Referring to Fig. 21, the display area 160e displays the bar chart data display held in the timeline buffer 130e. Each bar relates to a frame (key frame or interpolated frame) within a single timeline. The length of the bar shows the interpolation factor associated with interpolated frames, and since key frames are (by definition) not interpolated, they have either maximum or minimum bar lengths. The usefulness of the timeline display 160e and corresponding buffer 130e is, firstly, in providing the user with a synopsis of the information shown in the sequence image area 160d and, secondly, in providing a particularly simple way of editing the timeline, and seeing the effects on the timeline as a whole, by using the position sensitive input device 170a to position the cursor symbol at selected bars of the display 160e and signalling an appropriate control signal.
One type of such amendment is to alter the interpolation factor of a given interpolated frame. In this case, the height of the bar for that frame is varied to follow the cursor position symbol manipulated by the user, and the interpolation value stored in the corresponding frame table 1111A is amended accordingly. Initially, the values of the interpolation factor in successive frames follow a progressive sequence which is typically a linear sequence, but could equally follow any predetermined curve (usually monotonic) between the neighbouring key frames. It is advantageous to provide that each bar is displayed in two colours, the height of the bar comprising the interface between the two, and that the colours of key frame bars (determined using the key frame numbers thereof) should alternate, and the height of the interpolant bars should rise with interpolation factor after one key frame, then fall with interpolation factor after the next, so as to present a rising and falling bar height rather than a sawtooth pattern. This is found easier for a user to interpret.
Such a progression gives the user an immediately visible sequence, which is considerably easier to use than having to specify each interpolant value individually, and it is found that in most cases, the same progression (for example, linear interpolation) can be employed. However, it is extremely useful to be able to amend the sequence merely by amending the interpolation value of a given frame (rather than redrawing or editing the frame), and this is particularly advantageously achieved by onscreen manipulation of a bar chart display using the position sensitive input device 170a.
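The rising-and-falling bar profile described above might be computed as follows (a sketch; the parity test on the key frame number is an assumption consistent with the alternating key frame colours described above):

```python
def bar_height(key_frame_number, factor, max_height=100):
    """Bar height for an interpolated frame: rises with the interpolation
    factor after an even-numbered key frame, falls after an odd one, so
    the chart shows a rise-and-fall profile rather than a sawtooth."""
    if key_frame_number % 2 == 0:
        return factor * max_height
    return (1.0 - factor) * max_height
```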
Another type of amendment involves moving a frame or a series of frames in time. The chart display provides a readily visualised means for achieving this; using a position sensitive input device 170a, the user may designate one or a number of frames, and then move the frames along the timeline using the cursor symbol to a desired new position. In this case, the apparatus is arranged to alter the frame numbers of the frames selected by the user, and to generate new intervening frames (or delete old frames) as required. More details will be given below.
DESCRIPTION OF PARTICULAR OPERATIONS A description of one exemplary method of performance of particular operations will now be described.
Creating a Template Frame The user signals a desire to create a new template by generating an appropriate signal using the keyboard 170b or position sensitive input device 170a, typically by selecting an option from a menu displayed (possibly permanently) on the monitor 160.
The CPU 110 then creates within the working memory 121 a template table, which comprises a frame table 122 and a datum indicating that the frame is a template frame. Because the template frame is not itself employed in a sequence, no sequence numbers are necessary. The user will typically signal, via the keyboard 170b, a name to be associated with the template, which is stored therewith in the working memory 121. The template display area 160c is generated on the monitor 160. The cursor symbol is displayed within the display area 160c, and the user can proceed to build up a template. To do so, the user selects from the following options: Creating a new curve - the current cursor position provides the x,y coordinates of the first control point. The length of the tangent at this point is set to 0. These values are written into the frame table 122. The cursor position is continually monitored, and provides the second control point position coordinates, the values in the table 122 being continuously updated with movements of the cursor until a control signal is generated by the user to fix the coordinates of the second control point (when the desired location is reached). A line between the first and second control points is continually generated by the line generator 111 within the template buffer 130c, displayed on the template display area 160c, to enable the user to determine the correct position for the second control point. The second control point tangent length is likewise initially set to 0.
Amending a control point - as described above with reference to Fig. 10.
Adding a new control point - a further control point can be inserted within a curve, to increase the complexity of the curve by dividing a segment of the curve into two. Accordingly, the user positions the cursor symbol at a desired point along a curve displayed on the template display area 160c and initiates a signal via the keyboard 170b, or the position sensitive input device 170a (for example, by "clicking" a mouse device). The current cursor position coordinates are read, and the identity of the two control points which lie to either side of the current position along the line segment are determined. The cubic equation is solved using the current cursor coordinates, to derive the value of the parameter t at the current cursor position. A new control point record 2110 is created within the frame table 122 for the template, including pointers to the records of the two neighbouring control points on the curve. The "next control point" and "previous control point" pointer fields in the surrounding control point data records are amended to point to the new control point. The slope and magnitudes of the tangents at the new control point are calculated, and stored in the new control point record. The new control point may then be edited, to change the shape of the curve running through it.
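Computing the tangents of a control point inserted at parameter t amounts to subdividing the cubic segment into two; the standard de Casteljau construction, sketched below in Python, yields both the new on-curve point and the tangent end data of the two halves (an illustrative implementation; the patent does not name the method used):

```python
def split_segment(p0, p1, p2, p3, t):
    """De Casteljau subdivision: split one cubic segment, given as four
    control points (on-curve ends p0, p3 and tangent ends p1, p2), into
    two segments at parameter t. The new on-curve point is q."""
    lerp = lambda a, b, t: (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
    a = lerp(p0, p1, t)
    b = lerp(p1, p2, t)
    c = lerp(p2, p3, t)
    d = lerp(a, b, t)       # tangent end of the left half at q
    e = lerp(b, c, t)       # tangent end of the right half at q
    q = lerp(d, e, t)       # new control point, lying on the curve
    return (p0, a, d, q), (q, e, c, p3)
```

The shape of the curve is unchanged by the split; only the description gains a control point, which may then be edited.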
Deleting a control point - an appropriate signal being generated by the user, the control point at which the cursor is located is looked up in the table 122 using the current cursor position coordinates, and the corresponding control point record is deleted. The "next control point" and "previous control point" fields of the neighbouring control points on the curve segment are amended to point to each other and omit reference to the deleted control point.
By adding and editing line segments, the desired line drawing is built up on the template display area 160c, and a corresponding set of curve data is stored in a frame table 122 labelled as corresponding to a template.
In addition to the above described operations, which directly affect the shape of the template, the user can also add attribute control points to control attributes of the finally rendered image, by positioning the cursor symbol at a desired point along the curve segment represented on the display device 160 and generating an appropriate signal (e.g. by pressing an appropriate key on the keyboard 170b). On doing so, the current cursor position is used to find the preceding curvature control point along the curve, into the attribute control point list of which a new attribute control point record is inserted, the pointers of surrounding attribute control points being altered accordingly. The value of the parameter t is derived and stored in the attribute control point record, and the user may input data concerning the value of the attribute at that point for later use (as described in our earlier UK application 9110945.4).
Having built up a desired template, typically the contents of the template table 122 are stored to the mass storage device 180 so as to be recallable using the name or identification data for the template (e.g.
the file name).
Creating a Key Frame
The set of curves comprising the key frame may be edited and stored in a key frame table, corresponding images being displayed in the frame display area 160b derived from the frame buffer 130b, in the same manner as described above with reference to a template. The point in time of occurrence of the key frame in the sequence is also of significance; it is therefore necessary to store data defining the sequence and timeline to which the key frame belongs; the position of the key frame in the sequence relative to other key frames; and the absolute position in the sequence or timeline of the key frame (the frame number).
The user may input these via the keyboard 170b.
Alternatively, if the cursor tracker 112 identifies the cursor position as corresponding to that of one of the bars of the bar chart display shown in display area 160e, the key frame may be allocated the corresponding frame number. In the absence of either, the apparatus is preferably arranged to allocate the key frame a frame number equal to the current largest frame number plus one, and a key frame number equal to the current largest key frame number plus one, so that the frame is added to the end of the existing timeline.
A new key frame table 122 is then created within the memory 120, and the CPU 110 copies the contents of the template frame table into the new key frame table so that the new key frame is identical to the template.
The address within the memory 120 of the new key frame is then inserted into the "next key frame" pointer of the neighbouring key frame or key frames in the timeline, and any other necessary pointers within the memory are set to reflect the addition of the new key frame.
The timeline image buffer 130e is amended to cause the generation of a new bar at the key frame position in the display area 160e, and then the interpolated frames of the preceding key frame, if any, are recalculated (as discussed in greater detail below).
If there is a succeeding key frame, a set of interpolated frames to the succeeding key frame are also calculated, and corresponding interpolated frame data tables are set up within the memory 120, as a list pointing to the new key frame. The sequence display buffer is then updated to include the newly interpolated frames, and the display on the monitor 160 in the display area 160d is correspondingly altered, as is the timeline bar chart display area 160e.
Interpolation
When, as above, a key frame is created, amended, deleted or moved in time, it is necessary to recalculate the interpolated frames on either side of the change. Referring to Fig. 22, the CPU 110 first checks whether there is a previous key frame in the timeline. If there is (in other words, if the present key frame is not the first frame of the timeline), the separation in frame numbers of the two key frames is found and the number of interpolants is set equal to this. The interpolation routine shown in Fig. 23 is then executed for the preceding key frame.
Referring to Fig. 23, the length of the list of interpolated frames of the earlier key frame is examined; if the new number of interpolants required differs from the current number in the list, the current list is deleted and a new list of the required length is created. Each interpolant in the list is allocated an interpolation factor; for the i'th list member (i.e. the i'th frame after the earlier key frame) in a list comprising n members, the interpolation factor is L = i/n for linear interpolation, or F(i/n) where F is a non-linear function. One non-linear function which may be used is sigmoidal; that is, it tends to horizontal at either end and is monotonically rising in between, so as to slow the interpolation rate towards either key frame, and smooth the transition through the key frame; other functions smoothing the transition are equally possible. Next, the curve data for each interpolated frame is derived and stored in the associated interpolated frame table. Each key frame is derived from the same template, and hence will have the same number of curve control points. For a given interpolated frame having an interpolation factor L, the CPU 110 therefore takes the first curve control point of the earlier key frame and that of the later key frame, and stores for the first curve control point of the interpolated frame a value intermediate between the two. The x,y position of the interpolated point is derived as: x = x1(1-L) + x2L, where x1 is the x coordinate in the earlier frame (to the list of which the interpolated frame belongs) and x2 that in the later frame. The y coordinate is likewise given by: y = y1(1-L) + y2L.
The coordinates of the tangent extent point at the control point are derived in exactly the same manner.
A preferred embodiment allows values of L greater than unity; this permits a character to "overshoot", which gives a desirable visual effect in cartoon animation.
In this case, the (1-L) term may be set to zero. For example, overshoot may be provided by interpolating from L = 0 to 1.2 in 8 frames, and 1.2 to 1.0 in two following frames.
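The interpolation factors and per-point blending described above can be sketched as follows in Python (the smoothstep polynomial used here is one possible sigmoidal F; the description requires only a monotonic S-shaped function):

```python
def linear_factor(i, n):
    """Interpolation factor L for the i'th of n interpolants."""
    return i / n

def sigmoid_factor(i, n):
    """Smoothstep: flat at either end and monotonically rising between,
    slowing the interpolation rate towards each key frame."""
    t = i / n
    return t * t * (3 - 2 * t)

def interpolate_point(p1, p2, L):
    """Blend corresponding control point (or tangent extent) coordinates
    of the earlier (p1) and later (p2) key frames; L > 1 gives overshoot."""
    return (p1[0] * (1 - L) + p2[0] * L,
            p1[1] * (1 - L) + p2[1] * L)
```

For overshoot, L may exceed unity for the first few interpolants (e.g. rising to 1.2) before settling back to 1.0, so the interpolated shape passes beyond the later key frame and returns.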
The CPU 110 then proceeds to the next control point in the lists of the two key frames, and proceeds until all control points of all curves of the two key frames have been interpolated to produce corresponding control points in the interpolated frame. The CPU 110 then selects the next interpolated frame, with a correspondingly higher interpolation factor, and repeats the process.
Returning to Fig. 22, the CPU 110 next determines whether there is a following key frame occurring in the timeline (e.g. by referring to the pointers maintained in the timeline table 1110) and, if so (in other words, if the key frame is not the last frame of the timeline), the process shown in Fig. 23 is again executed to interpolate frames between the key frame and the following key frame. Once the corresponding interpolated frame tables have been calculated, the CPU 110 amends the data held in the timeline image buffer 130e to reflect the new interpolation factors, and updates the display area 160e.
Likewise, a new image is generated in the sequence image store 130d corresponding to the new interpolant values and the sequence display area 160d is updated.
Converting an Interpolated Frame into a Key Frame
Where an interpolated sequence is unsatisfactory (as sometimes occurs when the sequence is to show an object moving in three dimensions, since the interpolation only interpolates in two dimensions), one convenient way of improving the sequence is to convert one of the interpolated frames into a key frame, and then edit the key frame as desired. To signal his intention to do so, the user may for example position the cursor symbol at the bar on the timeline display area 160e and issue an appropriate control signal (for example, by "clicking" a mouse device 170a twice). The CPU 110 then identifies the cursor position and derives the frame number of the corresponding interpolated frame. Next, the CPU 110 reads the frame table for that interpolated frame and locates the key frame to which the interpolated frame belongs.
Next, referring to Fig. 24, the CPU 110 creates a new key frame data table, and allocates the key frame the next number after that to which the interpolated frame belonged.
The curve data of the interpolated frame table 122 is then copied into the new key frame table, and the interpolated frame table is deleted. Reference to the new key frame is inserted into the list of key frames maintained in the timeline table. The new key frame is then selected for display in the frame display area 160b, and corresponding image data is generated in the frame image buffer 130d and displayed on the monitor 160. The frame is preferably displayed as shown in Fig. 20, with the curve control points and tangents indicated for editing. Subsequent key frames in the list stored in the timeline table are renumbered, each incremented by one, to take account of the insertion of a new key frame. Next, the interpolation process of Fig. 22 is executed, and the sequence display frame store 130d is correspondingly modified to generate an updated sequence display in the display area 160d. With linear interpolation, the appearance of the other interpolated frames may not change until the new key frame has been edited, but the interpolation factors for each will have changed.
Deleting a Key Frame
Where possible, it is desirable to simplify the calculation of the sequence by minimising the number of key frames. Accordingly, it may on occasion be possible to delete a key frame, and correspondingly interpolate frames between the two surrounding key frames. Referring to Fig. 25, when a user signals a desire to delete a key frame (for example, by positioning the cursor symbol at the corresponding bar of the bar chart display area 160e and issuing an appropriate control signal using the position sensitive input device 170a or keyboard 170b), the CPU 110 reads the key frame number of the key frame concerned and accesses the timeline data table. The key frame numbers of succeeding key frames in the list maintained by the timeline table are accordingly decremented by one, and then the current key frame table is deleted from the memory 121.
All interpolated frame tables listed within the key frame table are also deleted. If the key frame is the first key frame of the timeline, the only further action taken is to regenerate the image data in the sequence display buffer 130d and update the sequence display 160d, and likewise amend the timeline buffer 130e and display area 160e, to remove the references to the deleted key frame and its interpolated frames. The succeeding frames may also be shifted back in time.
On the other hand, if the deleted key frame occurs later in the sequence, the CPU 110 performs the interpolation process shown in Fig. 23 from the key frame which preceded the deleted frame to its new successor in the timeline.
Since the frame numbers of the following key frames have not been changed, the key frame will be replaced by an interpolated frame. The sequence image in the sequence image store 130d and the bar chart image in the bar chart image buffer 130e are updated by the CPU 110, and correspondingly redisplayed on the monitor 160.
Moving a Key Frame
To change the length of an interpolated sequence in time, or to rearrange the order of the sequence, the preferred embodiment enables the user to indicate a particular key frame and change its time of occurrence in the sequence (e.g. its frame number).
Typically, the user indicates an intention to move the key frame by positioning the cursor symbol at a desired key frame bar on the bar chart display area 160e and inputting an appropriate control signal, via the keyboard 170b or position sensitive input device 170a, and then moving the cursor symbol to the desired new key frame location.
Referring to Fig. 26, the CPU 110 determines from the cursor symbol position the frame number corresponding to the new location. If the frame has not been moved past either of its neighbouring key frames, the frame number of the key frame is changed to that of the new location and the interpolation routine of Fig. 22 is then executed. If the key frame is moved on to the frame number of one of its neighbouring key frames, the existing key frame at that position is deleted and the key frame list is amended to omit reference to it. The key frame numbers of following key frames are then decremented.
After this, the CPU 110 continues, as above, by allocating the key frame a new frame number and interpolating using the process of Fig. 22.
If the key frame has been moved past either of its neighbours, the CPU 110 first removes the key frame from the list and links the pointers of the neighbouring key frames, and then executes the interpolation routine of Fig. 23 to regenerate the interpolated frames for the key frame preceding the deleted key frame.
Next, the CPU 110 locates the key frame at or immediately preceding the new frame to which the selected key frame is to be moved. If there is already a key frame at the position to which the selected key frame is to be moved, the CPU 110 deletes the record of that key frame. The selected key frame is then inserted in the key frame list maintained in the timeline table just after the previous key frame position, by amending the "previous" and "next" pointers in the key frame tables concerned.
The key frame numbers of key frames between the old position and the new position are then decremented.
Furthermore, if the key frame has replaced an existing key frame at its new position, subsequent key frames are also decremented. Thereafter, the CPU 110 proceeds as above to update the key frame frame number, generate new interpolated frames between the key frame and its neighbour on either side, and regenerate the sequence image buffer 130d and display 160d, and correspondingly the timeline buffer 130e and display area 160e.
In a preferred embodiment, the CPU 110 is arranged to be capable of accepting an instruction to move a block of successive frames in time; the above process is in this embodiment essentially repeated for each such frame.
Example Sequence of Operations
Referring to Figs. 27a-d, and to Fig. 18, the results of the sequence operations described above will be illustrated.
Referring to Fig. 27a, the user positions the position sensitive input device so as to move the cursor symbol to the next vacant point in the bar chart display area 160e on the monitor 160, and initiates a control signal indicating a desire to create a new key frame thereat.
The CPU 110 copies the template (or an existing key frame) to create a new key frame table in the memory 121 as discussed above. The sequence display buffer 130d is regenerated, and the display area 160d consequently displays the new key frame at the end of the sequence. The bar chart display area 160e likewise displays a new key frame bar.
Preferably, the apparatus is arranged also to generate a new key frame which is a copy of an existing key frame; in this case, the user may designate the existing key frame he wishes to copy using the position sensitive input device 170a to position the cursor symbol appropriately, and upon generating an appropriate control signal via an input device the CPU 110 will, rather than copying the template table, copy the designated key frame table curve data to produce the new key frame table curve data.
Referring to Fig. 27b, the user then generates an input signal indicating an intention to move the just created key frame four frames later in time. The CPU 110 performs the routine of the centre path of Fig.
26, and four interpolated frames are added to the interpolated frame list of the preceding key frame. The sequence display and timeline displays 160d, 160e are then updated as above.
Referring to Fig. 27c, the user signals a desire to delete the preceding key frame and the CPU 110 executes the routine of Fig. 25. Since the last two key frames are now substantially identical, it will be seen that the frames interpolated therebetween are likewise identical.
Referring to Fig. 27d, the user next signals an intention to convert one of the intervening interpolated frames into a key frame, to allow for subsequent editing. The CPU 110 follows the routine of Fig. 24, and updates the displays 160d and 160e.
Adding Further Curves to a Frame
In the above described embodiments, each key frame (and consequently each interpolated frame also) includes only those curves defined by curve control points which exist in the template frame.
The method of adding control points and new curves to the template has already been discussed above.
Initially, as discussed above, each key frame comprises the same curve data as the template to which it is consequently identical. However, the user will often wish to delete some parts of the template for a given key frame; for instance, when an object is turned, many lines become invisible as they are obscured by other parts of the object. The key frame corresponding to the turned object would therefore not be required to include those lines.
Accordingly, the user can delete some control points (and/or curves) from a key frame, and the pointers in the frame table 122 will be correspondingly reset to omit references to the deleted points and curves. In this case, the CPU 110 does not affect any other key frame or the template frame table. However, the repositioning of the pointers within the frame table 122 does not affect the correspondence between the remaining control points and curves and their counterparts in the template set. Each is still uniquely identifiable to the CPU as corresponding to a particular point in the template set.
It is thus possible for different frames to correspond to different subsets of the template set. It may also occur that, whilst preparing a particular key frame, a user wishes to add a further control point or a curve comprising a number of control points. To do so directly would, however, introduce points which had no counterparts in the template set. It is nonetheless inconvenient to have to edit the template set directly to produce a result in a particular key frame. In preferred embodiments, therefore, the apparatus is arranged to allow a user to add further control points to a key frame, exactly in the manner described for the template frame, but upon his doing so the CPU 110 is arranged to add a corresponding point to the template frame table. The template frame table therefore always comprises a superset of the points held in each key frame.
Interpolation between frames, and adding of frames of different timelines to produce composite frames, is still possible even if one frame includes extra curve control points or curves.
The operations of interpolating between the frames and adding to frames both require a one-to-one correspondence between curve control points. Thus, to perform either of these operations, a first step is to make the two frames compatible by equalising the number of points to be interpolated between or to be added together. To illustrate the manner in which this is achieved, reference is made to Figs. 28 and 29.
In Fig. 28a, a curve is shown as it would appear in the frame display area 160b. The shape of the curve is defined by two control points A1, A2, at which the corresponding curve tangents are indicated. Three attribute control points B1, B2, B3 are shown on the curve segment between the two curve control points A1, A2. Fig. 29 shows the corresponding curve table 2100 stored within the working memory 121. The table includes two curve control point records, the first corresponding to A1 and pointing to the next record, corresponding to A2. The curve control point record corresponding to A1 also points to the list of attribute control point records, the first of which corresponds to B1, which in turn points to that corresponding to B2, which likewise points to that corresponding to B3.
Referring to Fig. 28b, upon the user generating a control signal indicating that a selected attribute control point B2 is to be converted into a curvature control point located at the same position on the curve, the CPU 110 creates a new curve control point record A3 within the curve table 2100. The record corresponding to the point A1 is altered to point to the new record, which in turn points to A2. The attribute control point record corresponding to B2 is deleted from the attribute control point list. The control point data stored for the new control point A3 corresponds to the position on the curve previously occupied by the attribute control point, with tangent extents such that the tangent slope at the new control point is the same as it had been at the attribute control point B2.
The lengths of the tangents at the three curvature control points A1, A2, A3 are calculated so as to keep the shape of the curve unchanged; it will be observed from Fig. 28b that the lengths, but not the angles, of the tangents at control points A1 and A2 have altered. Accordingly, new extent point data is written to the records corresponding to A1 and A2 by the CPU 110.
The attribute control point B3 is deleted from the list of the curvature control point record for A1, and added to that for A3.
The position data defining the positions along the curve of the attribute control points B1, B3 are recalculated within the curve segments A1-A3 and A3-A2 respectively, and the new data are stored with the attribute control point records B1, B3.
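The shape-preserving conversion described in the preceding paragraphs amounts to subdividing the cubic segment at the attribute point's parameter t. A minimal sketch in Python using the de Casteljau construction (representing each segment by its four Bezier control points is an assumption; the patent stores tangent extent points, which are equivalent):

```python
def lerp(a, b, t):
    """Linear interpolation between two 2-D points."""
    return (a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t)

def split_cubic(p0, p1, p2, p3, t):
    """Subdivide a cubic Bezier segment at parameter t (de Casteljau).
    Returns the control points of the two halves.  The shared point s is
    the position of the new curvature control point; the tangents at the
    original end points keep their angles but shorten, as in Fig. 28b."""
    q0, q1, q2 = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    r0, r1 = lerp(q0, q1, t), lerp(q1, q2, t)
    s = lerp(r0, r1, t)
    return (p0, q0, r0, s), (s, r1, q2, p3)
```

The tangent direction at s is r1 - r0, which is the tangent of the original curve at t, consistent with the requirement that the slope at the new control point equal that at the old attribute control point.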
Having created a new curvature control point A3, the user may employ the apparatus according to this embodiment to amend the curve shape by altering the control point data as described above; in particular, as shown in Fig. 28c, points of inflection may be introduced by setting the tangent extent points to define different tangent angles.
Referring to Fig. 28d, the apparatus of this embodiment is arranged also to convert a control point into an attribute control point if desired; in this case, the control point record A3 is deleted and the pointer stored with the record for A1 is amended to point to the record for A2. A new attribute control point record for the new attribute control point B2 is created. The attribute point records for B2 and B3 are added to the list held for the curvature control point record for A1. The curve is recalculated by the CPU 110, and the position data for the three attribute control points are amended.
Key frames in this embodiment of the invention are permitted to include more curvature control points than does the template frame from which they are derived, where a corresponding attribute control point exists in the template frame. Thus, when two frames are to be added or interpolated between, one may include curvature control points not present in the other, but the other will include a corresponding attribute control point, since it is derived from the same template. Referring to Fig. 30, the CPU 110 is therefore arranged, when a curvature control point not having a corresponding curvature control point in another frame is located, to locate the corresponding attribute control point in the other frame and convert that point into a curvature control point as discussed above with reference to Figs. 28a and 28b and Figs.
29a and 29b. The two frames will then be in correspondence, and may be added or interpolated between as discussed above.
ADDING FRAMES FROM PARALLEL TIMELINES
As stated above, one advantage of the preferred embodiments is that different parts of an object may be animated separately and the separate sub-sequences (timelines) can be amalgamated together. This is possible because all frames of the different timelines have the same topology, or are all sub-sets of a common template table. The operation of adding frames is similar to that of interpolation, as discussed below, except that whereas in interpolation predetermined proportions of a pair of frames are added, in addition it is generally (although not necessarily) the case that equal proportions of each frame are added.
Essentially, the CPU 110 locates a pair (or, in general, a plurality) of frames of different timelines occurring at the same point in time, and derives a composite frame by taking, for each curve control point of the composite frame, the corresponding curve control points in each of the existing frames. From the coordinates of these, the coordinates of the corresponding point of the template frame are subtracted so as to generate difference coordinates, defining the difference between the control point coordinates of the key frames and the coordinates of the corresponding points of the template frame to which they correspond.
The difference coordinates for a corresponding control point in each frame are then added together to form summed difference coordinates for that control point of the composite frame, to which the absolute coordinates of the corresponding control point in the template frame table are added to derive the composite control point coordinates. Thus, each composite control point corresponds to the sum of the corresponding template control point coordinates and the vector sum of the differences between the corresponding control points of time aligned frames of different timelines and the template.
More generally, it is possible to form the sum of the vector differences weighted by predetermined constants, so that the composite frame depends more upon the frame from one timeline than from another.
Equally, the arithmetic can of course be rearranged so that the coordinates of the frames of the timeline are added together first and then predetermined multiples of the template coordinates are subtracted from the sum.
In this way, a composite sequence of frames can be formed which corresponds to the sums of the deviations from the template of the different timelines of the sequence.
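The per-point computation just described can be sketched as follows in Python (uniform weights reproduce the plain vector sum; other weights give the weighted variant mentioned above):

```python
def composite_point(template_pt, frame_pts, weights=None):
    """Composite control point: the template coordinates plus the
    (optionally weighted) sum of each time-aligned frame's deviation
    from the corresponding template point."""
    if weights is None:
        weights = [1.0] * len(frame_pts)
    dx = sum(w * (p[0] - template_pt[0]) for p, w in zip(frame_pts, weights))
    dy = sum(w * (p[1] - template_pt[1]) for p, w in zip(frame_pts, weights))
    return (template_pt[0] + dx, template_pt[1] + dy)
```

For example, a point moved one unit right in one timeline and two units up in another yields a composite point moved both right and up, so the separately animated deviations combine.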
Referring to Fig. 31, one way of adding a plurality of frames is as follows. The CPU 110 creates a new frame table, hereafter termed a difference table, temporarily within the memory 121 for each frame which is to be added. The coordinates of each curve control point of each frame are subtracted from those of the corresponding point stored in the template table, and the differences in coordinates are stored in the difference frame table corresponding to that frame.
When difference tables have been set up for all frames to be added, the difference tables are made mutually compatible according to the process of Fig. 30.
The CPU 110 then creates a result frame table in the memory 121. It then reads the template table and, for each curve record, checks whether that curve record is present in any of the difference frames. If the corresponding curve exists in no difference frame, the CPU 110 proceeds to the next curve in the template table. If the corresponding curve exists in all difference frames, for each curve control point in the sequence, the sum of the difference coordinates for the corresponding control points in the difference tables is taken, and the result is added to the coordinates of the corresponding point in the template table and stored in the result table. The next curve in the template table is then processed.
If a curve is not in all the frames to be added, the CPU 110 tests whether any of the frames to be added are key frames and, if so, whether the curve in question is in a key frame. If so, the sum of the difference coordinates for the frames in which the curve is present is taken and added to the template coordinates as before. If not, in other words if the curve is present only in an interpolated frame or frames, the curve is omitted from the result frame table.
Once the CPU 110 has considered all the curves in the template table, the result frame table will include all the curve control points necessary. At this stage, the CPU 110 derives the positions of any attribute points, as shown in Fig. 32, by taking in turn each curve in the results frame table and considering each attribute point in turn. If an attribute point occurs on all the curves to be added, the CPU 110 derives averaged or interpolated values for the attribute point position parameter and, optionally, for any attribute data which may be stored in the attribute point records. The interpolated values (e.g. the average values) are then stored in the results table.
If an attribute point is not present in all the frames to be added, then unless one of the frames in which it occurs is a key frame, the CPU 110 allocates a value equal to the position value in the template for each frame in which the attribute point is absent and interpolates a new position between all frame values as above.
Preferably, the interpolation of the attribute point position is not simply an interpolation between the two parametric position data values in the corresponding pair of frames interpolated between. Instead, the length of the corresponding curve segment in the interpolated frame is derived; the actual curve segment length is divided in the required interpolation ratio, the corresponding position on the curve is found, and the value of the parameter t at that position is derived and stored as the interpolated attribute point position.
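The arc-length based placement of an attribute point can be sketched as follows in Python (the chord-length approximation and the sampling resolution are assumptions; the segment is again represented by four Bezier control points for illustration):

```python
import math

def point_on_cubic(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier segment at parameter t."""
    u = 1 - t
    return (u*u*u*p0[0] + 3*u*u*t*p1[0] + 3*u*t*t*p2[0] + t*t*t*p3[0],
            u*u*u*p0[1] + 3*u*u*t*p1[1] + 3*u*t*t*p2[1] + t*t*t*p3[1])

def t_at_length_ratio(ctrl, ratio, samples=256):
    """Find the parameter t lying the given fraction of the segment's
    arc length along the curve, by accumulating chord lengths."""
    pts = [point_on_cubic(*ctrl, i / samples) for i in range(samples + 1)]
    cum = [0.0]
    for a, b in zip(pts, pts[1:]):
        cum.append(cum[-1] + math.hypot(b[0] - a[0], b[1] - a[1]))
    target = ratio * cum[-1]
    for i, c in enumerate(cum):
        if c >= target:
            return i / samples
    return 1.0
```

Dividing the actual curve length in the interpolation ratio, rather than interpolating t directly, keeps the attribute point at a consistent visual position even where the parameterisation is uneven.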
If the attribute point is present in a key frame, the key frame attribute point position data is stored in the results table as this is relatively more significant than the position derived from an interpolated frame.
As mentioned above, preferably, for each sequence, a current composite sequence comprising a set of frame tables is maintained together with a corresponding base sequence comprising a further set of frame tables, the base sequence comprising the composite sum, as discussed above, of all timelines other than that presently being displayed for editing. After the current timeline has been edited, it is thus merely added to the current base composite sequence to generate a new composite sequence, thus reducing the amount of computation necessary.
The operations of interpolation and addition will be seen to be closely similar; although in the above described embodiments, for clarity, interpolation between frames and addition of frame differences from the template are described, it is possible on the one hand to interpolate using frame differences (adding the result to the template frame coordinates) and on the other hand to add frames (subtracting the template coordinates or a multiple thereof afterwards); in practice, for convenience, the memory 121 may contain either frame tables stored as absolute point coordinates or frame tables stored as coordinates defining the difference from the corresponding coordinates in the template table. The processes described in Figs. 30-32 are equally applicable, and are preferably applied, to interpolation, mutatis mutandis.
Replayer 103
The replayer 103 in one embodiment of the invention is provided by the CPU 110 operating under suitable stored program control.
In one embodiment, the replayer is arranged to display the frames at a rate corresponding to the frame repetition rate (24, 25 or 30 Hz) at which the sequence is to be displayed, so that the operator can view the sequence at a realistic rate. Preferably, however, the replayer 103 is arranged also to accept input commands from the keyboard or other input device specifying the speed of replay. This is particularly useful in enabling an operator to view crucial parts of the sequence in slow motion, or to move quickly through a sequence for cursory inspection. In another preferred embodiment, the replayer 103 is arranged to accept input signals (from the keyboard 170b or, more preferably, the position sensitive input device 170a in cooperation with the timeline display) to specify an initial and/or a final frame in the sequence between which the sequence is to be replayed.
An operator can thereby designate a particular part of the sequence to be replayed, and the replayer 103 will display in turn each frame between the initial and end frames. It is particularly convenient if the replayer 103 is arranged to constantly cycle between the start and finish frames; this may either be by displaying the sequence repeatedly from the first frame to the last frame, or by displaying the sequence forwardly (from start to finish) and then backwardly (from last to first) repeatedly. This is found particularly useful in enabling the operator to localise a particular frame or series of frames which are incorrect, for subsequent editing.
If the CPU 110 operates sufficiently fast, it would be possible for the replayer 103 to be arranged to access the memory 120, and to cause the display generator 111 to access in turn each frame table 122 corresponding to each frame of a sequence between the first and last frames specified. However, many CPUs available at present are incapable of generating entire frames of data in real time; thus, the replayer 103 is arranged instead to perform an initial operation of creating, for each frame to be displayed, a raster image by causing the display generator 111 to access in turn each frame table 122 and generate an image in the image store 130; after each image is created, the replayer 103 is arranged to cause the image to be stored on the mass storage device (e.g. hard disk) 180. In this context, a computer such as the above mentioned NeXT computer, which includes image compression means for compression encoding the image for storage on hard disk, is preferred, since otherwise the volume of image data stored corresponding to the frames of even a relatively short sequence is extremely large. Once image data for a plurality of frames has been stored on the mass storage device 180, the replayer 103 is arranged to display the sequence by accessing the image data corresponding to each frame in turn to refresh the image store 130 at the desired frame repetition rate. Once the operator signals a desire to cease replaying, the image data files corresponding to the frames in the replayed sequence may be deleted from the mass storage device 180, to reduce the memory used.
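The two-phase strategy (pre-render every frame, then page the stored images through the display at the frame rate) can be sketched as follows in Python; the function names and the in-memory image list are assumptions standing in for the display generator and the mass storage device:

```python
import time

def replay(frames, render, show, fps=25, cycle=False):
    """Phase 1: render each frame once and keep the result (standing in
    for storage on the mass storage device).  Phase 2: display the stored
    images in turn at the frame repetition rate.  With cycle=True a single
    forward-then-backward pass is made; the embodiment repeats such passes
    constantly until the operator signals a halt."""
    images = [render(f) for f in frames]          # phase 1: render to store
    order = images + (images[-2::-1] if cycle else [])
    for img in order:                             # phase 2: timed playback
        show(img)
        time.sleep(1.0 / fps)
```

Separating rendering from display is what allows playback at the true repetition rate even when rendering a single frame takes longer than one frame period.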
In a preferred embodiment, the replayer 103 is also arranged during the initial phase of preparing the sequence of images to cause the renderer 105 to render each frame as discussed below, so that the replayed sequence can be seen in colour and/or against the background.
Having viewed the replayed sequence or part thereof, it will often be desired to edit the sequence and in this case, the operator instructs the CPU 110 to cease replaying and commence editing.
Renderer 105 The renderer 105 may again comprise the CPU 110 operating under stored program control, or may be provided by a different computer 100. In either case, the operation of the renderer, as is conventional, is to colour the image and/or to mix the picture with a background picture. The renderer 105 therefore reads the data stored in a table 122 corresponding to a frame to be rendered, and processes the frame in accordance with predetermined stored colour and/or background information. In particular, the attribute control points stored, as described above, may include colour and other attribute information (for example transparency), the manner of rendering which is described in our British Application No. 9110945.4 (Agents Reference 5086305) incorporated herein by reference in its entirety.
Modifications and Other Embodiments Although the invention has been described with reference to animation of two-dimensional objects, the general principles of the invention are applicable equally to animation of three-dimensional subjects or objects represented, similarly, by parametric control points. In particular, the invention may also be adapted to cooperate with a three-dimensional interpolation and animation system. In this case, all key frames are provided as two-dimensional projections of three-dimensional models, and no two-dimensional template is employed; instead, a three-dimensional template equivalent is maintained by the three-dimensional system, but all two-dimensional projections correspond one to another as all are derived therefrom.
In the foregoing, it will be noted that attribute control points are employed for several purposes; firstly, to set the values of attributes which will subsequently be used during rendering, so that a considerable amount of rendering information need be specified only for individual key frames and is automatically inserted into frames interpolated therebetween at correct positions; secondly, as a means for providing the possibility of extra curve control points, to increase the complexity where necessary without doing so otherwise, whilst maintaining topological similarity between the frames.
However, apparatus allowing the definition of lines in terms of a limited number of control points controlling the path of the lines and also allowing the specification of attribute or feature properties at predetermined points along the lines, may be useful for other purposes; for instance, as described in our British Application No. 9110945.4, such points may mark the points at which further curves or structures of curves are to be connected to a line, which is particularly useful in hierarchical definition of objects in two- or three-dimensional animation.
Other applications are likewise not precluded.
Likewise, attribute values need not be set at separate points to those used to define curvature, but may be provided at curvature control points; although it is very much preferred to provide the flexibility to define attributes at points along the curve segment as well.
Matte Sequences One particular other application of the invention in one aspect comprises its use to provide a system for automated production of a sequence of image masks or mattes used in the compositing or combining of images in film, video or similar visual productions. Image compositing is the combining of two or more images, which is usually performed by defining masks or mattes specifying which parts of the various images are to be retained in a final sequence. For example, one common use of the technique is to take a character from one sequence of images and superimpose the character on another sequence of images in place of the existing picture. A matte or mask in traditional film production is a stencil through which parts of an image are visible; in electronic image processing, the matte or mask is a transparency map which assigns transparency values to parts of the image to determine which are visible when the image is composited with others.
In film and television production, sequences are combined together using a technique known as "chromakey", in which live characters are videotaped against a blue background, the characters not incorporating blue, so that in subsequent video processing, blue portions of the image are ignored and a different background is substituted. A similar technique, incorporating optical filters, is used for film production. The blue portion of the image creates the mask through which the character is visible.
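A per-pixel chromakey decision of the kind just described can be sketched as follows. The dominance threshold and function name are illustrative assumptions; real chromakey systems use more sophisticated colour-difference measures:

```python
def chroma_matte(pixel, dominance=1.5):
    """Return the matte value for one RGB pixel: 0.0 (transparent,
    i.e. blue background to be replaced) where blue clearly dominates
    both other channels, else 1.0 (opaque, keep the character)."""
    r, g, b = pixel
    # max(..., 1) avoids treating a pure-black pixel as background
    return 0.0 if b > dominance * max(r, g, 1) else 1.0
```

Applying this function over every pixel of a frame yields the mask through which the character is visible.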
However, there are many situations where the parts of the image to be selected cannot be simply distinguished on the basis of a colour difference. In these cases, an operator or artist must hand draw a corresponding mask frame for each frame of the sequence, by drawing an outline around the part of the image to be separated. In traditional film production, for each frame of the sequence that requires a matte or mask, the artist projects the frame onto a clear background on which the required mask is drawn. It is also known to provide electronic systems in which the artist creates the appropriate matte or mask by using a stylus and digitising tablet to create a line drawing around parts of each image.
However, in either case, the artist has to create a separate mask for each frame. This is time consuming.
Further, because each frame is created completely independently, any inaccuracies or differences introduced by the artist are manifested as a highly visible phenomenon known as "jitter" or "boil", a motion or busyness on the edges of the image.
Further, in electronic systems, the matte is created on a pixel by pixel basis, and so the higher the image resolution, the more work which must be performed by both the artist and the electronic image processing apparatus in creating an accurate matte.
Accordingly, the present invention provides in one aspect a system for compositing sequences, in which the portions of a sequence to be composited are defined at frames spaced apart in time (key frames) and those for intervening frames are interpolated therebetween. This reduces the amount of work to be performed by the artist, and guarantees that the edges of successive frames, since interpolated, will not include the random jitter associated with current techniques. Very preferably, the mattes are created as line drawings using, as in the above described embodiments, a limited number of control points to create and edit smooth curves; this provides practical and efficient interpolation and, further, provides reduced dependence upon image resolution.
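Interpolating a matte outline between two key frames reduces, for topologically corresponding control points, to blending each point pair. A minimal sketch, assuming linear interpolation and two-dimensional points (both assumptions; the embodiments also contemplate non-linear interpolation functions):

```python
def interpolate_outline(key_a, key_b, t):
    """Linearly interpolate topologically corresponding control points
    of two key-frame outlines. t in [0, 1] is the fractional temporal
    position of the intervening frame between the two key frames."""
    # the outlines must correspond point-for-point
    assert len(key_a) == len(key_b)
    return [((1 - t) * xa + t * xb, (1 - t) * ya + t * yb)
            for (xa, ya), (xb, yb) in zip(key_a, key_b)]
```

Because every intervening frame is derived from the same two key outlines, successive edges change smoothly rather than jittering independently.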
Further, by employing attribute control points as described above and in our copending British Application No. 9110945.4 to control the transparency of the matte, it is possible to smoothly vary the transparency of the matte in time, or along the matte, or both. This provides a vastly increased flexibility and sophistication in specifying the properties of the matte and hence in controlling the image compositing.
Again, the use of interpolation over time makes changes in the matte over time appear smooth and without temporal irregularity.
The mattes created in this way therefore comprise a sequence of electronic images including outline drawings surrounding portions of the image to be separated, with attribute data specifying the transparency associated with those portions. These matte images may be separately printed for subsequent optical combining, or may be employed directly, electronically, by providing that the image buffer 130 is of the type including a transparency plane and three colour planes, and by multiplying each pixel of the frame with which the matte is associated with the corresponding transparency value of the matte, and each pixel at the frame with which it is to be combined by the inverse of that transparency value, for example.
More particularly, where each pixel of each image comprises a red value (R plane), a blue value (B plane), and a green value (G plane), the process of compositing the two comprises multiplying each of the R, G and B pixel values of a first image by the transparency values of the corresponding pixels of the matte image, and multiplying each of the R, G and B values of the second image by unity less the transparency values of the corresponding pixels of the matte.
Next, each colour value of each pixel in the image buffer is set equal to C = C1 + (1 - A1)C2, where A1 is the transparency value (derived from the matte at that pixel) and C1, C2 are the colour values multiplied by the transparency values as described above. If each pixel in the final image is required to have a transparency value, then this too can be derived from the transparency values of the original images, added in a proportion determined by the transparency value of the matte at each point.
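Reading the two multiplication steps and the combining step together, the net per-channel effect on unpremultiplied colours is the blend C = A1*C1 + (1 - A1)*C2. A minimal sketch on that reading (the function name and unpremultiplied inputs are assumptions):

```python
def composite_pixel(fg, bg, a1):
    """Blend one foreground pixel over one background pixel through
    the matte: per colour channel C = a1*fg + (1 - a1)*bg, with a1
    the matte transparency value in [0, 1] at this pixel."""
    return tuple(a1 * f + (1 - a1) * b for f, b in zip(fg, bg))
```

Where a1 = 1 the foreground colour is kept unchanged, and where a1 = 0 the background shows through entirely, matching the stencil behaviour of a traditional matte.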
Still Image Compositing Exactly the same principles may be used to create masks for compositing together still images, with the advantages of increased flexibility and reduced dependence upon resolution. Still image compositing finds application in, for example, electronic illustration systems, desktop publishing systems, illustration and design systems generally, business presentation graphics, audio visual systems and the like.
Automated Reprocessing of Film and Video Images Film and video sequences often include blemishes, or other flaws, or occasional images or sequences of images which are unsatisfactory for some reason. It has long been known to repaint, by hand, parts of such images to repair such blemishes. Further, there has recently been interest in adding colour to images filmed originally in black and white.
It is also known to apply electronic graphics apparatus to this process, to electronically provide a coloured image on a pixel by pixel basis. According to another aspect of the present invention, the rendering and painting techniques described in our UK Application No. 9110945.4 are employed to paint parts or the whole of an image, by displaying the image on a screen, and creating an overlying painted image which is merged with the underlying image to add colour by specifying colour values at spaced apart attribute control points. If it is desired to colour or retouch the sequence of image frames, this process is employed to specify the colour at key frames and the above described embodiments are employed to generate interpolated frames between the key frames.
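Evaluating a colour specified only at spaced-apart attribute control points requires blending between the points bracketing a given position along the curve. A hedged sketch, assuming piecewise-linear blending over a normalised curve parameter (the data layout and function names are illustrative, not the patent's stated method):

```python
from bisect import bisect_right

def colour_at(colour_points, t):
    """colour_points: list of (param, rgb) attribute control points
    sorted by param along the curve. Returns the colour at parameter
    t by linear blending between the bracketing points; positions
    outside the first/last point clamp to the end colours."""
    params = [p for p, _ in colour_points]
    i = bisect_right(params, t)
    if i == 0:
        return colour_points[0][1]
    if i == len(colour_points):
        return colour_points[-1][1]
    (p0, c0), (p1, c1) = colour_points[i - 1], colour_points[i]
    f = (t - p0) / (p1 - p0)
    return tuple((1 - f) * a + f * b for a, b in zip(c0, c1))
```

Interpolating these attribute control points between key frames, exactly as curve control points are interpolated, then carries the colouring smoothly across the intervening frames.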

Claims (46)

CLAIMS:
1. Apparatus for generating an animated sequence of pictures, which comprises: means for storing data defining a plurality of pictures and data defining, for each, a temporal position in said sequence; means for reading said stored data relating to at least a pair of pictures, and generating therefrom data defining one or more intervening pictures occurring at time positions between those of said stored pictures, and providing a transition therebetween.
2. Apparatus according to claim 1, further comprising means for editing said data so as to amend said sequence.
3. Apparatus according to claim 1 or claim 2, in which the stored data defining each picture comprises a plurality of point data, relating to and defining lines comprising said picture, the points in each picture corresponding topologically to those in other pictures, so that the stored picture data defines pictures which generally topologically correspond.
4. Apparatus according to claim 3, wherein said data representing said pictures comprises data defining a plurality of lines, the data defining each line comprising a plurality of control point data each of which comprises position data defining a point on said line and data relating to at least one tangent value at that position.
5. Apparatus according to claim 4, wherein the control point data comprises data defining two tangents at each point, whereby a point may be defined to exhibit a slope discontinuity.
6. Apparatus according to any one of claims 3 to 5, in which the generating means comprises means for reading the stored point data of first and second stored pictures occurring one after the other within the sequence, and generating at least one intervening interpolated picture comprising a plurality of point data each derived from topologically corresponding point data for the two stored frames in predetermined proportions related monotonically to the relative distance in time between the position of the interpolated picture and those of each of the stored pictures.
7. Apparatus according to claim 6, wherein the ratio between the said predetermined proportions is proportional to the ratio between the said distances in time, so as to provide linear interpolation between said stored pictures.
8. Apparatus according to claim 6, wherein the ratio between the predetermined proportions is a non-linear function of the ratio between distances in time, the function being such as to provide generally sigmoidal interpolation so as to smooth the transition between interpolated sequences.
9. Apparatus according to claim 6, wherein the predetermined ratio can exceed unity.
10. Apparatus according to any of claims 6 to 9 appended to claim 2, wherein the editing means is arranged to operate, in response to a control signal, to amend the predetermined proportions for at least one interpolated picture.
11. Apparatus according to claim 10, wherein the editing means is arranged to allow the predetermined proportion to be amended for a single picture, leaving those of other interpolated pictures unchanged.
12. Apparatus according to claim 2, or any of claims 3 to 11 appended thereto, in which the editing means is arranged to amend, in response to a control signal, picture data relating to a stored picture.
13. Apparatus according to claim 12, when appended to claims 3, 4 or 5, in which the editing means is arranged to permit amendment of data relating to a single said point without affecting other said points.
14. Apparatus according to any preceding claim, further comprising display means for displaying at least one said stored picture represented by said stored picture data.
15. Apparatus according to claim 14, appended to claim 2 or any of claims 3 to 13 appended thereto, wherein the editing means comprises position sensitive input means manually operable to cause said apparatus to amend said picture data so as to change the display on said display means, permitting interactive editing.
16. Apparatus according to claim 14 or claim 15, further comprising means for generating upon said display a representation of said sequence comprising a chart defining, in a first direction, the position in time of said stored pictures.
17. Apparatus according to claim 16 when appended to claim 14, in which the editing means is arranged to accept a signal from the input means indicating a stored picture position in said sequence display, and at least a further signal defining a different temporal position in said sequence to be occupied by said stored picture, and for amending said stored picture time sequence data correspondingly.
18. Apparatus according to claim 17, in which the editing means is arranged to accept signals from the input means defining a plurality of said pictures to be temporally moved, and is arranged to edit said data relating to a plurality of said pictures.
19. Apparatus according to any one of claims 16 to 18, in which said sequence display comprises a bar chart display providing a bar corresponding to each stored or interpolated picture in a sequence.
20. Apparatus according to claim 19, when appended to claims 10 or 11 and 15, in which the editing means is arranged to accept a signal from the input means designating one of said bars, and designating an amended length for one of said bars, and is arranged to correspondingly amend the predetermined proportions and to redisplay said bar, so as to allow interactive editing thereof.
21. Apparatus according to claim 14, comprising means for generating on said display a plurality of pictures of said sequence.
22. Apparatus according to claim 21, in which said pictures are displayed progressively mutually displaced.
23. Apparatus according to claim 22, in which said mutually displaced pictures are displayed so as not to overlap.
24. Apparatus according to claim 22, in which said mutually displaced pictures are displayed so as to overlap.
25. Apparatus according to any one of claims 21 to 24, in which said interpolated pictures are represented visually distinctively from said stored pictures.
26. Apparatus according to claim 21, in which said pictures are displayed in an animated sequence.
27. Apparatus for generating an animated sequence of pictures, which comprises: means for storing data defining a plurality of pictures, means for reading said stored data relating to at least a pair of pictures and generating therefrom data defining one or more intervening pictures by interpolation, and means for editing said stored picture data.
28. Apparatus according to claim 27, further comprising means for displaying a plurality of successive pictures of said sequence, to facilitate said editing.
29. Apparatus according to claim 27 or claim 28, in which the stored data defining each picture comprises a plurality of point data, relating to and defining lines comprising said picture, the points in each picture corresponding topologically to those in other pictures, so that the stored picture data defines pictures which generally topologically correspond, the editing means being arranged to amend the stored picture so as to vary the topological correspondence between one picture and another, whilst retaining the possibility of interpolation therebetween.
30. Apparatus according to claim 2 or claim 27, wherein said editing means is arranged to generate picture data corresponding to a new stored picture in response to a signal designating a said interpolated picture, said new stored picture corresponding to said interpolated picture.
31. Apparatus according to claim 3 or claim 29, in which said store means is arranged to store first and second types of point data, said first type of point data defining point positions on lines making up said pictures and said second point data comprising a point position along a said line, said second point data position being defined by reference to point data of said first kind.
32. Apparatus according to claim 31, in which the arrangement is such that point data of a first kind defining a first picture may topologically correspond to point data of a second kind stored in relation to the second picture, in which said editing means is arranged, in response to an appropriate control signal, to convert point data of a second kind into point data of said first kind to enable the complexity of a picture to be increased, and/or vice versa to enable the complexity of a picture to be reduced.
33. Apparatus according to claim 31 or 32, in which the editing means is arranged to allow the position of the point data of a second kind along a line of a picture to be amended.
34. Apparatus according to claim 3 or claim 29, further comprising means for storing picture data defining a template picture, to which said stored pictures correspond.
35. Apparatus according to claim 34, in which the editing means is arranged to amend picture data corresponding to a stored picture to add point data to increase the topological complexity of the picture, the editing means being arranged also to amend the template data accordingly.
36. Apparatus according to claim 34 or claim 35, in which the point position data is stored as data defining the difference between point positions in each stored picture and in the template picture.
37. Apparatus according to any preceding claim, arranged to store, generate and edit a plurality of said sequences, and to merge temporally corresponding pictures from each of said plurality of said sequences to provide a merged picture sequence.
38. Apparatus according to claim 37, further comprising means for storing a merged sequence corresponding to all but one of said sequences, whilst said one of said sequences is being edited, and for subsequently merging said edited sequence with said stored merged sequence.
39. A character animation system comprising means for defining stored pictures and means for interpolating predetermined proportions of the stored pictures to provide intervening pictures to build up an animated sequence, the pictures comprising lines defined by a small number of control points.
40. An animation system comprising means for storing data defining at least one line and for storing data defining a point along said line, and a property unrelated with the path of said line at or in relation to the point, further comprising means for defining at least two pictures including said line and means for generating data relating to a third picture including said line in which said point is present from said first and second pictures.
41. A compositing system for defining a compositing mask specifying portions of an image to be combined with another image, the system comprising means for specifying said portions of a pair of images spaced apart in time, and means for generating from said spaced apart portions, corresponding portions for intervening frames by interpolation therefrom.
42. A system according to claim 41, wherein each portion is defined by data relating to lines defining the shape of the portion, the data comprising a plurality of point data, the point data of each image corresponding topologically to those of other images.
43. Apparatus according to claim 41 or 42, further comprising means for specifying transparency level data indicating the proportion of the portion of an image which is to be added to another.
44. Apparatus according to claim 43 appended to 42, in which the transparency data comprises point data defining a transparency level at a corresponding point.
45. Apparatus for compositing by specifying portions of a first image to be combined with a second image, which apparatus comprises means for defining said portions by defining control points which characterise curves surrounding said portions.
46. Apparatus for recolouring an existing image comprising means for specifying a curve by storing control points defining the curve, and means for specifying the colour thereof at points along the curve, and for interpolating colour values between the colour points.
GB9117409A 1990-11-30 1991-08-12 Animation Withdrawn GB2258790A (en)

Priority Applications (21)

Application Number Priority Date Filing Date Title
GB9117409A GB2258790A (en) 1991-08-12 1991-08-12 Animation
US07/844,634 US5692117A (en) 1990-11-30 1991-11-29 Method and apparatus for producing animated drawings and in-between drawings
JP4500477A JPH06503663A (en) 1990-11-30 1991-11-29 Video creation device
EP91920852A EP0559714A1 (en) 1990-11-30 1991-11-29 Animation
AU90158/91A AU9015891A (en) 1990-11-30 1991-11-29 Animation
JP4500061A JPH06505817A (en) 1990-11-30 1991-11-29 Image synthesis and processing
PCT/GB1991/002122 WO1992009965A1 (en) 1990-11-30 1991-11-29 Animation
PCT/GB1991/002124 WO1992009966A1 (en) 1990-11-30 1991-11-29 Image synthesis and processing
AU89321/91A AU8932191A (en) 1990-11-30 1991-11-29 Image synthesis and processing
EP91920646A EP0559708A1 (en) 1990-11-30 1991-11-29 Image synthesis and processing
PCT/GB1992/000928 WO1992021096A1 (en) 1990-11-30 1992-05-21 Image synthesis and processing
AU17921/92A AU1792192A (en) 1991-05-21 1992-05-21 Image synthesis and processing
JP4510508A JPH06507742A (en) 1991-05-21 1992-05-21 Video creation device
JP4510509A JPH06507743A (en) 1991-05-21 1992-05-21 Image synthesis and processing
EP19920910492 EP0586444A1 (en) 1991-05-21 1992-05-21 Image synthesis and processing
EP92910474A EP0585298A1 (en) 1990-11-30 1992-05-21 Animation
PCT/GB1992/000927 WO1992021095A1 (en) 1990-11-30 1992-05-21 Animation
AU17934/92A AU1793492A (en) 1991-05-21 1992-05-21 Animation
US08/150,100 US5598182A (en) 1991-05-21 1992-05-21 Image synthesis and processing
US08/311,398 US5611036A (en) 1990-11-30 1994-09-23 Apparatus and method for defining the form and attributes of an object in an image
US08/643,322 US5754183A (en) 1991-05-21 1996-05-06 Image processing apparatus and method for producing pixel data in dependence upon the shape of a sectional line extending between boundary lines of an object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB9117409A GB2258790A (en) 1991-08-12 1991-08-12 Animation

Publications (2)

Publication Number Publication Date
GB9117409D0 GB9117409D0 (en) 1991-09-25
GB2258790A true GB2258790A (en) 1993-02-17

Family

ID=10699875

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9117409A Withdrawn GB2258790A (en) 1990-11-30 1991-08-12 Animation

Country Status (1)

Country Link
GB (1) GB2258790A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2303282A (en) * 1992-09-10 1997-02-12 Fujitsu Ltd Graphic editing apparatus
US5854634A (en) * 1995-12-26 1998-12-29 Imax Corporation Computer-assisted animation construction system using source poses within a pose transformation space
EP0892339A1 (en) * 1997-07-18 1999-01-20 International Business Machines Corporation A method and system for defining the movement path of a multimedia object
US5926186A (en) * 1992-09-10 1999-07-20 Fujitsu Limited Graphic editing apparatus and method
US6091427A (en) * 1997-07-18 2000-07-18 International Business Machines Corp. Method and system for a true-scale motion path editor using time segments, duration and synchronization
US6111590A (en) * 1997-07-18 2000-08-29 International Business Machines Corp. Method and system for a true scale motion path editor to create motion paths as independent entities
US7896231B2 (en) * 2006-12-08 2011-03-01 Wells Fargo Bank, N.A. Method and apparatus for check stack visualization
GB2548679A (en) * 2016-03-21 2017-09-27 Adobe Systems Inc Enhancing curves using non-uniformly scaled cubic variation of curvature curves
CN111857812A (en) * 2020-07-29 2020-10-30 珠海天燕科技有限公司 Method and device for transferring animation curves in game interface

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN114549712B (en) * 2022-04-25 2022-07-12 北京搜狐新媒体信息技术有限公司 Method and device for generating dynamic webp format picture

Citations (7)

Publication number Priority date Publication date Assignee Title
GB1437795A (en) * 1973-07-04 1976-06-03 Computer Image Corp Digitally controlled computer animation generating system
GB2017459A (en) * 1978-02-17 1979-10-03 Messerschmitt Boelkow Blohm Synthesis of images for producing cartoons
US4600919A (en) * 1982-08-03 1986-07-15 New York Institute Of Technology Three dimensional animation
WO1989009458A1 (en) * 1988-03-22 1989-10-05 Strandberg Oerjan Method and device for computerized animation
EP0358498A2 (en) * 1988-09-09 1990-03-14 New York Institute Of Technology Method and apparatus for generating animated images
EP0365960A2 (en) * 1988-10-24 1990-05-02 The Walt Disney Company Computer animation production system
US4952051A (en) * 1988-09-27 1990-08-28 Lovell Douglas C Method and apparatus for producing animated drawings and in-between drawings

Patent Citations (8)

Publication number Priority date Publication date Assignee Title
GB1437795A (en) * 1973-07-04 1976-06-03 Computer Image Corp Digitally controlled computer animation generating system
GB2017459A (en) * 1978-02-17 1979-10-03 Messerschmitt Boelkow Blohm Synthesis of images for producing cartoons
US4600919A (en) * 1982-08-03 1986-07-15 New York Institute Of Technology Three dimensional animation
US4600919B1 (en) * 1982-08-03 1992-09-15 New York Inst Techn
WO1989009458A1 (en) * 1988-03-22 1989-10-05 Strandberg Oerjan Method and device for computerized animation
EP0358498A2 (en) * 1988-09-09 1990-03-14 New York Institute Of Technology Method and apparatus for generating animated images
US4952051A (en) * 1988-09-27 1990-08-28 Lovell Douglas C Method and apparatus for producing animated drawings and in-between drawings
EP0365960A2 (en) * 1988-10-24 1990-05-02 The Walt Disney Company Computer animation production system

Cited By (14)

Publication number Priority date Publication date Assignee Title
GB2303282B (en) * 1992-09-10 1997-04-16 Fujitsu Ltd Graphic editing apparatus
GB2303282A (en) * 1992-09-10 1997-02-12 Fujitsu Ltd Graphic editing apparatus
US5926186A (en) * 1992-09-10 1999-07-20 Fujitsu Limited Graphic editing apparatus and method
US6373492B1 (en) 1995-12-26 2002-04-16 Imax Corporation Computer-assisted animation construction system and method and user interface
US5854634A (en) * 1995-12-26 1998-12-29 Imax Corporation Computer-assisted animation construction system using source poses within a pose transformation space
US6577315B1 (en) 1995-12-26 2003-06-10 Imax Corporation Computer-assisted animation construction system and method and user interface
EP0892339A1 (en) * 1997-07-18 1999-01-20 International Business Machines Corporation A method and system for defining the movement path of a multimedia object
US6111590A (en) * 1997-07-18 2000-08-29 International Business Machines Corp. Method and system for a true scale motion path editor to create motion paths as independent entities
US6108010A (en) * 1997-07-18 2000-08-22 International Business Machines Corp. Method and system for a true-scale motion path editor
US6091427A (en) * 1997-07-18 2000-07-18 International Business Machines Corp. Method and system for a true-scale motion path editor using time segments, duration and synchronization
US7896231B2 (en) * 2006-12-08 2011-03-01 Wells Fargo Bank, N.A. Method and apparatus for check stack visualization
GB2548679A (en) * 2016-03-21 2017-09-27 Adobe Systems Inc Enhancing curves using non-uniformly scaled cubic variation of curvature curves
GB2548679B (en) * 2016-03-21 2019-04-03 Adobe Inc Enhancing curves using non-uniformly scaled cubic variation of curvature curves
CN111857812A (en) * 2020-07-29 2020-10-30 珠海天燕科技有限公司 Method and device for transferring animation curves in game interface

Also Published As

Publication number Publication date
GB9117409D0 (en) 1991-09-25

Similar Documents

Publication Publication Date Title
US5692117A (en) Method and apparatus for producing animated drawings and in-between drawings
US5754183A (en) Image processing apparatus and method for producing pixel data in dependence upon the shape of a sectional line extending between boundary lines of an object
Burtnyk et al. Interactive skeleton techniques for enhancing motion dynamics in key frame animation
US5619628A (en) 3-Dimensional animation generating apparatus
EP0950988B1 (en) Three-Dimensional image generating apparatus
US8694888B2 (en) Method and apparatus for titling
Fekete et al. TicTacToon: A paperless system for professional 2D animation
US4600919A (en) Three dimensional animation
US7420574B2 (en) Shape morphing control and manipulation
GB2258790A (en) Animation
JP3616241B2 (en) Animation display method and computer-readable recording medium recording animation display program
Durand The “TOON” project: requirements for a computerized 2D animation system
JP2000149046A (en) Cure generation device and method, recording medium storing program and corresponding point setting method
US8228335B1 (en) Snapsheet animation visualization
JP2714100B2 (en) How to make a video
Higgins The moviemaker's workspace: towards a 3D environment for pre-visualization
JPH10188026A (en) Method and storage medium for moving image preparation
JP2949594B2 (en) Video display device
GB2277856A (en) Computer generating animated sequence of pictures
Okuya et al. Reproduction of perspective in cel animation 2D composition for real-time 3D rendering
Cheng Human Skeleton System Animation
Remley Jr Computer Graphics and Animation
Blum et al. STORY: A HIERARCHICAL ANIMATION AND STORYBOARDING SYSTEM FOR ALPHA 11
EP0586444A1 (en) Image synthesis and processing
Zinn COMPUTER ANIMATION

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)