WO1994023392A1 - Animation

Animation

Info

Publication number
WO1994023392A1
WO1994023392A1 (PCT/GB1994/000631; GB9400631W)
Authority
WO
WIPO (PCT)
Prior art keywords
data, picture, defining, line, local reference
Application number
PCT/GB1994/000631
Other languages
French (fr)
Inventor
Andrew Louis Charles Berend
Mark Jonathan Williams
Michael John Brocklehurst
Graeme Peter Barnes
Craig Duncan Wareham
Original Assignee
Cambridge Animation Systems Limited
Application filed by Cambridge Animation Systems Limited
Priority to EP94912003A (published as EP0694191A1)
Publication of WO1994023392A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00: Animation
    • G06T 13/20: 3D [Three Dimensional] animation
    • G06T 13/40: 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings

Definitions

  • A later stage is film or video recording: each image in turn is recorded in a sequence, either in electronic (video) form or by the film recorder 196 as a sequence of colour transparencies, for projection or display.
  • the present invention is particularly concerned with the stages of defining key frames, creating interpolated frames, and editing key frames.
  • The other stages above are generally as described in our earlier referenced applications.
  • One typical sequence of operations in an embodiment of the present invention is shown in Figure 5. Initially, the user will wish to create a character to animate, and accordingly a "template" frame table defining the topology of the character, or part thereof, is created (Steps S500, S502) in the working memory 121.
  • The next stage is to create a number of key frames. These may be created by editing (S506) the curve control points of the template frame. Another possibility is to scan in an image using the image input device 195, display the scanned image on the monitor 160 simultaneously with the template image, and edit the template image to conform to the scanned image, which will previously have been drawn freehand. In either case, when the animator is satisfied with the key frame, the frame table 122 is then permanently stored (S508).
  • The template frame table (and the key frame tables derived therefrom) specifies a sequence of reference points as well as a sequence of curve tables, each specifying a plurality of curve control points.
  • Each reference point data record in the memory 121, other than that for the single reference point highest in the hierarchy of points, includes a pointer (PREV R.P.) to the location of the data record of a higher reference point in the hierarchy.
  • Each reference point record also includes a pointer to a curve to which it is attached (i.e. a curve whose position and orientation are defined by those of the reference point).
  • Each curve table record in the working memory 121 likewise contains a pointer to a reference point.
  • The user may first cause the creation of the curve tables making up the template frame, comprising a plurality of lines, and then subsequently create the hierarchy of reference points by selecting reference point positions (typically at joints of a character) and designating which reference points are defined by reference to which others. Finally, the user inputs data designating which lines in the template image are to be attached to which reference points (for example, by designating a line and then a reference point using the position-sensitive input device 170a). The editor 113 then derives the relative positions of the control points of the curve, in the coordinate space of the reference point, and writes these into the frame table 122 in place of the absolute positions previously present, as sketched below.
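  • The following Python fragment is a minimal, illustrative sketch of that conversion from absolute picture coordinates into the coordinate space of a reference point; the function name and the simple origin/angle/scale representation of a reference point are assumptions made for illustration rather than the actual layout of the frame table 122.

```python
import math

def world_to_local(point, ref_origin, ref_angle, ref_scale=1.0):
    """Express an absolute picture coordinate as an offset in a reference
    point's local frame: translate to the reference origin, rotate by minus
    the reference orientation, then divide out the scale factor."""
    dx = point[0] - ref_origin[0]
    dy = point[1] - ref_origin[1]
    c, s = math.cos(-ref_angle), math.sin(-ref_angle)
    return ((dx * c - dy * s) / ref_scale,
            (dx * s + dy * c) / ref_scale)

# Attaching a line: every control point (and tangent end point) is rewritten
# as an offset from the designated reference point.
ref_origin, ref_angle = (120.0, 80.0), math.radians(30.0)
absolute = [(130.0, 95.0), (150.0, 90.0), (160.0, 70.0)]
relative = [world_to_local(p, ref_origin, ref_angle) for p in absolute]
print(relative)
```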
  • the animator amends the position and orientation of reference points to produce wholesale variations in all curves the positions of which are defined by the reference points (for example, to rotate or move a limb of a character).
  • the animator also edits the curve control points, as discussed in our earlier above referenced application. Because the curve control points are represented, in this apparatus, by defining data relative to the reference position, the contents of the curve tables need not be altered when the reference points are edited, unless a change of character shape is actually desired.
  • For each character to be animated, there will be one reference point which is highest in the hierarchy. Moving this character reference point will affect the whole of the character, and accordingly provides a convenient means of editing the character position, orientation or scale.
  • this character reference point may be defined positionally with reference to another reference point on another character, so as to be moved with the other character.
  • The animator may then cause the interpolator 101 to interpolate between the key frames as created (S510), may view (S512) the interpolated frames on the monitor 160, and may make any necessary amendments to the key frames, or add a new key frame.
  • Data defining the sequence are then stored (S514).
  • the interpolated sequence generated by the replayer 103 may be viewed (S516), and again any necessary editing of key frames performed.
  • the interpolated sequence is rendered (S518), and may then be stored (S520) as a video sequence of images.
  • The interpolation process generally requires a consistent linkage between reference points and curves, and accordingly it may be necessary to break the interpolated sequence at a point where the hierarchy of reference points, or the linkage between curves and reference points, is changed, by providing two successive key frames.
  • the frame table 122 of Figure 2A comprises a list of lines or curves making up a set which represents the object or character which the frame depicts.
  • the lines or curves are provided as a linked list of curve tables 2100, 2200, 2300, 2400 (only 2100 shown); each curve table comprising a list of curve control points 2110, 2120, 2130 etc.
  • Each control point field 2110 comprises position data defining the control point coordinates, relative to a reference point, and position data defining the control point tangent end coordinates relative to the reference point. Attribute control points, as discussed in WO92/09965, may also be present.
  • Each reference point field comprises data defining a first angle θ1 and a distance R which position the reference point relative to its predecessor in the hierarchy, a second angle θ2 defining its orientation about its own axis, and a scale factor S. The two angles, the distance R and the scale factor S are typically stored as separate transformation matrices, allowing the rotation, translation and scaling transformations to be separately effected. The distance R is, in fact, stored as a transformation matrix defining a two dimensional translation; when used, as described hereafter, to define a distance R, the translation is purely one dimensional.
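  • Purely for illustration, the records described above might be modelled as follows; the field names are assumptions chosen to mirror the text (position and tangent end offsets, the PREV R.P. pointer, and the parameters θ1, R, θ2 and S), not the actual layout of the frame table 122.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ReferencePoint:
    theta1: float                       # rotation about the predecessor reference point
    r: float                            # distance from the predecessor
    theta2: float                       # rotation of the reference point about its own axis
    scale: float = 1.0                  # scale factor S applied to attached curves
    prev: Optional["ReferencePoint"] = None   # "PREV R.P." pointer up the hierarchy

@dataclass
class ControlPoint:
    x: float                            # position, stored as offsets in a reference point's frame
    y: float
    xe: float                           # first tangent end point (offsets)
    ye: float
    xf: float                           # second tangent end point (offsets)
    yf: float
    ref_point: Optional[ReferencePoint] = None   # optional per-point attachment

@dataclass
class Curve:
    ref_point: ReferencePoint           # reference point the curve as a whole is attached to
    control_points: List[ControlPoint] = field(default_factory=list)

@dataclass
class FrameTable:                       # one frame: its curves and reference points
    curves: List[Curve] = field(default_factory=list)
    ref_points: List[ReferencePoint] = field(default_factory=list)
```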
  • Figure 7 shows in a flow chart operations of the line generator 111.
  • Steps S700, S710 and S712 define a loop by which each reference point is processed in turn.
  • The line generator 111 calculates (S702) the cumulative transformation comprising the two rotations, translation and scale transformations for that reference point multiplied together, and multiplied by the cumulative transformation matrices for all higher reference points in the hierarchy (i.e. the reference point which is pointed to by the pointer field (PREV R.P.) and all its predecessors).
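  • A sketch of how such a cumulative transformation might be accumulated is given below, reusing the illustrative ReferencePoint record from the sketch above; the order in which the two rotations, the translation and the scaling are composed is an assumption, since the text does not fix it.

```python
import numpy as np

def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def translation(r):
    # The distance R as a one-dimensional translation along the local X axis,
    # held in a homogeneous two-dimensional matrix.
    return np.array([[1.0, 0.0, r], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])

def scaling(s):
    return np.array([[s, 0.0, 0.0], [0.0, s, 0.0], [0.0, 0.0, 1.0]])

def local_transform(ref):
    # theta1 rotates about the predecessor, R translates away from it,
    # theta2 rotates about the reference point's own axis, S scales.
    return rotation(ref.theta1) @ translation(ref.r) @ rotation(ref.theta2) @ scaling(ref.scale)

def cumulative_transform(ref):
    # Multiply in the transforms of every higher reference point by walking
    # the PREV R.P. chain up to the top of the hierarchy.
    m = local_transform(ref)
    parent = ref.prev
    while parent is not None:
        m = local_transform(parent) @ m
        parent = parent.prev
    return m
```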
  • The animator indicates, using the position-sensitive input device 170a, a command to edit a selected reference point, and then inputs a command signal to select (S800) an editing mode from command options which comprise: a rotation of the reference point about the next reference point up in the hierarchy; a rotation of the reference point about itself; a change in position of the reference point; and a scaling of the reference point axes (and hence of the curve control point coordinates, and thus of the size of each of the curves attached to the reference point).
  • the cursor tracker 112 thereafter is arranged (S802) to read the position indicated by the position-sensitive input device 170a, and, in dependence upon the position, to alter the data in the reference point table 7100 for the selected reference point.
  • The angular position of the cursor relative to the reference point, or to the preceding reference point in the hierarchy (depending on the mode selected), is calculated as ARCTAN(X/Y), and the transformation defining the angle θ1 or θ2 is amended accordingly.
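  • As a hedged sketch of that calculation, atan2 yields the same ARCTAN(X/Y) angle, measured from the vertical axis, while keeping the correct quadrant:

```python
import math

def cursor_angle(cursor_xy, centre_xy):
    """Angle of the cursor relative to the selected reference point (or its
    predecessor), measured as ARCTAN(X/Y), i.e. from the vertical axis."""
    dx = cursor_xy[0] - centre_xy[0]
    dy = cursor_xy[1] - centre_xy[1]
    return math.atan2(dx, dy)
```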
  • the display editor 113 having thus amended the frame table 122, the line image generator 111 is then arranged to draw (S812) the amended frame as discussed above, to allow the user interactively to edit the frame until he is content with the appearance of the frame.
  • Figure 9A illustrates the effect of rotating the shoulder reference point about its own axis.
  • An interpolation factor L is set (S1002) equal to i/N (where i is a frame counting index), and each of θ1, θ2, R and S is interpolated by a function of L (e.g. a cosine function). When all parameters have been interpolated and stored (S1006) in a frame table, an inbetween frame has been defined.
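  • The sketch below shows one possible form of this interpolation; the half-cosine ease is only an example of "a function of L", and the dictionary of parameters is an illustrative stand-in for the frame table entries.

```python
import math

def ease(L):
    # A cosine profile mapping L in [0, 1] onto [0, 1], giving a slow-in /
    # slow-out blend rather than a purely linear one.
    return 0.5 * (1.0 - math.cos(math.pi * L))

def inbetween(key_a, key_b, i, n):
    """Interpolated reference point parameters for inbetween frame i of n."""
    w = ease(i / n)                      # interpolation factor L = i/N (S1002)
    return {k: key_a[k] + w * (key_b[k] - key_a[k])
            for k in ("theta1", "theta2", "r", "scale")}

key_a = {"theta1": 0.0, "theta2": 0.0, "r": 100.0, "scale": 1.0}
key_b = {"theta1": math.pi / 2, "theta2": 0.2, "r": 80.0, "scale": 1.0}
frames = [inbetween(key_a, key_b, i, 8) for i in range(1, 8)]
```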
  • The apparatus is arranged to provide a mode in which a rotation about a vertical axis is simulated by rotating a reference point through an angle about the next reference point up in the hierarchy.
  • the amended frame is then redrawn (S1204/S812) by the line generator 111.
  • Figure 13A(1) indicates a first key frame displayed upon the display unit 160, and Figure 13A(2) indicates a second key frame generated from the first by rotating a reference point about the reference point above it in the hierarchy.
  • A further key frame may be added by converting one of the interpolated frames of Figure 13B into a key frame, as disclosed in our earlier application WO92/09965 (incorporated herein by reference), and reducing the distance R of the reference point from its predecessor. As Figure 13C shows, the effect of this is to change the smooth circular arc path over time shown in Figure 13B into a bi-lobed curve, which is even less satisfactory as a representation of motion through an arc extending out of the plane of the display device 160.
  • Figure 14 is a flowchart illustrating one particular way of achieving this, with the result shown in Figure 13D.
  • The interpolator 101 is arranged to linearly interpolate the angle θ1, and the reference point position is then modified (S1406) to effect a compression of the vertical axis of the frame, thus foreshortening the circular arc marking the path of the reference point in subsequent frames to form an elliptical arc.
  • This compression may be achieved by deriving (S1406) the cartesian (X, Y) position coordinates of the reference point from its cumulative transform, and then multiplying (S1408) the difference in Y coordinate between the reference point and the reference point above it in the hierarchy (about which it is rotated) by a fractional scaling factor, so as to reduce its vertical offset from the axis about which it is rotated during interpolation.
  • The modified reference point position can then be returned (S1410) to polar form, and stored (S1412) in the frame table for interpolated frame i, as sketched below.
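  • A minimal sketch of the foreshortening step follows, assuming the axis of rotation is horizontal in picture space so that only the vertical offset is compressed; depth stands for the fractional scaling factor entered by the animator.

```python
import math

def foreshorten(ref_xy, parent_xy, depth):
    """Compress the vertical offset of an interpolated reference point from the
    reference point it rotates about, so its circular path becomes elliptical.
    depth is the fractional scaling factor (1.0 means no foreshortening)."""
    dx = ref_xy[0] - parent_xy[0]
    dy = (ref_xy[1] - parent_xy[1]) * depth          # scale the Y offset only (S1408)
    r = math.hypot(dx, dy)                           # back to polar form (S1410)
    theta = math.atan2(dy, dx)
    return r, theta
```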
  • the extent of the desired compression (the fractional scaling factor) is input by the animator, from the keyboard 170b, or position-sensitive input device 170a.
  • the table 7100 for each reference point includes, if the reference point is to be rotated with foreshortening, an indication that the above foreshortening interpolation is to be performed, and an indication of the degree of foreshortening desired.
  • a direction of foreshortening might also be specified, rather than assuming rotation will always be about a purely vertical axis.
  • each curve is attached to (in other words defined with reference to) a single reference point.
  • This arrangement works well in many cases.
  • portions of a curve may be linked to two reference points, and appropriate modes of behaviour defined for the curve control points.
  • Referring to Figures 15 and 16, a portion of a cartoon character (for example a leg) comprises a first closed curve A defined by reference to a first reference point, and a second closed curve B defined by reference to a second reference point, each curve being defined by its own curve control points. Data defining each reference point is stored in the memory 120 in a corresponding reference point table 7100, 7200, and each of the curve tables 2100, 2200 relating to curves A, B includes a pointer field containing a reference to the location in the memory of the corresponding reference point table 7100, 7200. Additionally, each of the control point fields 2210, 2220 ... contains a field including a pointer to the location of a corresponding reference point table 7100, 7200. Thus, particular control points of the curve B may be defined with reference to a different reference point from that to which the curve as a whole is attached.
  • The line image generator 111 is arranged to consider each reference point in turn (steps S1700, S1710, S1712). For each reference point, the cumulative transformations are derived (S1702), as described above. These cumulative transformations are then applied (S1704, S1706) to the curve control points whose records explicitly point to that reference point. Likewise, the cumulative transformations for each reference point are applied to the curve control points of each curve whose record points to that reference point, provided that those curve control points do not themselves point to a different reference point.
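  • In outline, that per-point override might be applied as sketched below; cumulative stands for a function returning the cumulative 3x3 matrix of a reference point (as in the earlier sketch), and the dictionary layout is illustrative only.

```python
import numpy as np

def place_control_points(curve, cumulative):
    """Map each control point of a curve into picture coordinates.

    curve is a dict holding a "ref_point" and a list of "control_points"; a
    control point may carry its own "ref_point" entry, in which case that
    reference point, rather than the curve's, positions it."""
    placed = []
    for cp in curve["control_points"]:
        ref = cp.get("ref_point") or curve["ref_point"]      # per-point override wins
        x, y, _ = cumulative(ref) @ np.array([cp["x"], cp["y"], 1.0])
        placed.append((x, y))
    return placed
```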
  • When a reference point is moved, the curve control points attached to it move with it, and the line generator 111 then joins the curve control points with a smooth curve as described above. Figure 18B indicates the effect which would be obtained if one of the reference points were displaced from its position in Figure 18A.
  • Figure 19B illustrates the effect achieved on the display 160 by rotating one of the reference points from the position shown in Figure 19A.
  • In general terms, when a component such as that illustrated in Figure 19A is to be edited by the animator, the editor 113 is arranged to perform the process shown in Figure 20.
  • The process, comprising initially steps S2000 to S2010, is generally similar to that indicated in Figure 8 (steps S800 to S810), but is expanded as follows. After the position of a reference point is amended (S2010), the next reference point up in the hierarchy, pointed to by that reference point, is located.
  • The editor 113 can be arranged, when a reference point is moved such that the distance R from its predecessor in the hierarchy is reduced, to edit the positions of control points so as to reduce their displacement from the point with reference to which they are defined by the same ratio as that by which the distance between the two reference points has changed, as shown in Figure 22.
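  • A sketch of that proportional adjustment, with illustrative names; offsets holds the control point offsets from the point with reference to which they are defined.

```python
def rescale_offsets(offsets, r_old, r_new):
    """When a reference point is dragged so that its distance from its
    predecessor changes from r_old to r_new, shrink (or stretch) the offsets
    of the affected control points by the same ratio, so the bridging portion
    of the curve is compressed or extended in proportion."""
    ratio = r_new / r_old
    return [(x * ratio, y * ratio) for (x, y) in offsets]
```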
  • Rather than curves always being attached directly to reference points, the apparatus also provides that curves can be attached to other curves, in the manner described for example in application WO92/21095. This facility can be particularly useful in creating life-like characters in an intuitive manner, and the apparatus provides modes of editing which ensure that this realism is easily maintained during the creation of key frames for a complete animated sequence.
  • Figure 23 shows a collection of four curves A, B, C and D which are generally attached to a pair of reference points. Curve A is defined by control points attached to one of the reference points, and curve B is likewise defined by four control points all attached to the other. Each of the remaining curves is attached to points on the curves A and B, which are thus used as reference curves or "formers". Curve C is defined by a control point attached to curve A at one end and a control point attached to curve B at the other end, and curve D is defined similarly to curve C, having one end control point attached to curve A and its other end attached to curve B.
  • the apparatus further provides means for automatically adjusting the parametric positions of the points of attachment of curves such as curves C and D, to avoid problems of the type illustrated in Figure 19B.
  • Figure 25 illustrates one technique for identifying a direction in which to maximise extents
  • Figure 26 is a flow chart of one method of maximising the extent for one attachment point (for example, an end point of curve C).
  • The reference curves A and B are defined with reference to particular reference points respectively; an imaginary line 2500 joining these reference points defines an angle θ relative to the horizontal (X) axis.
  • the points of attachment can be moved for example to maximise their extent in a direction perpendicular to this line 2500 with results which are generally satisfactory.
  • The angle θ may be determined simply as the angle of a line joining the reference points. If there are no reference points suitable for determining θ, other methods may be employed to obtain such an angle, including for example direct user input.
  • The reference curve (curve A for example) to which the selected point is attached is rotated by an angle -θ (S2604) in order to bring it into a "horizontal" orientation. Then the maximum extent of the curve in the Y direction (perpendicular to line 2500) is easily calculated (S2606).
  • Since each section of the reference curve is a cubic, the maximum and minimum values of Y can be found by solving a quadratic equation which is the derivative of the curve equation in Y only. The maximum extent in the Y direction of each curve section is found in this way (not forgetting to consider the end points), and the greatest of these over all of the sections forms the maximum Y extent of the reference curve (for example curve A).
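  • The sketch below finds that Y extent for one cubic section, assuming the section is held in the Bezier form used elsewhere in this document (end points plus tangent end points) and has already been rotated by -θ.

```python
import math

def y_extent(y0, y1, y2, y3):
    """Maximum and minimum Y of one cubic Bezier section with control values
    y0 (start point), y1, y2 (tangent end points) and y3 (end point), found by
    solving the quadratic that is the derivative of the curve in Y only and
    also checking the end points."""
    def bezier_y(t):
        u = 1.0 - t
        return u*u*u*y0 + 3*u*u*t*y1 + 3*u*t*t*y2 + t*t*t*y3

    a, b, c = y1 - y0, y2 - y1, y3 - y2
    qa, qb, qc = a - 2*b + c, 2*(b - a), a       # dy/dt = 3*(qa*t^2 + qb*t + qc)

    candidates = [0.0, 1.0]
    if abs(qa) < 1e-12:                          # derivative degenerates to linear
        if abs(qb) > 1e-12:
            candidates.append(-qc / qb)
    else:
        disc = qb*qb - 4*qa*qc
        if disc >= 0.0:
            root = math.sqrt(disc)
            candidates += [(-qb + root) / (2*qa), (-qb - root) / (2*qa)]

    values = [bezier_y(t) for t in candidates if 0.0 <= t <= 1.0]
    return min(values), max(values)

# The greatest of the per-section maxima over all sections of the reference
# curve gives its overall maximum Y extent.
```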
  • For other attachment points, such as the end point of curve D in Figure 23, the method of Figure 26 allows an offset value to be added to the position of the attachment point (S2608), which offset is also defined in terms of the curve parameter.
  • the relative position of the point of attachment is adjusted (S2610) to the desired position.
  • The reference curve (curve A, for example) is rotated (S2612) by angle θ, returning it to its proper orientation.
  • other methods of defining and finding the desired attachment point are possible.
  • Figure 27 illustrates a situation which corresponds broadly to that of Figure 23, except that the sizes of the reference curves A and B are very different. Again, an imaginary line 2700 is drawn joining the reference points to which the curves A and B are attached, and a line 2702 joins the points of maximum extent on curves A and B, in a direction perpendicular to line 2700. This line shows that, in certain cases, the extent finding mechanism described above will not give an ideal path for the points of attachment of curves attached to the curves A and B. In particular, unless it begins in a convex manner, a curve joining the two points of maximum extent will cross a part of the reference curve B, as shown by the line 2702.
  • In the more exact method of Figure 28, each curve to be adjusted is identified in turn by the apparatus (S2800), including for example the curve C in Figure 23.
  • The end points of the curve C are adjusted (S2804, S2806), and the position of the attachment point on curve A is then adjusted (S2810) to the point where the second tangent line touches curve A.
  • The steps S2804 to S2810 are repeated (S2812) until the adjustment each time becomes smaller than a predetermined amount. For example, the iteration process should normally be stopped when the adjustment at each iteration becomes smaller than the resolution of the final displayed image.
  • To construct a character, a first step is to form a skeleton of reference points in a hierarchical arrangement, and then at each reference point to attach a curve to act as a reference curve.
  • Outline curves are then attached to the reference curves as desired.
  • Curves defining features of detail can be attached to the outline curves, but it is often more convenient in practice to add extra constructional curves attached to the reference ("former") curves, and then to add the curves defining features of detail to these constructional curves.
  • An example of this process is shown in Figures 29A to 29C, in which a generally sausage shaped outline is regarded somewhat as a cylinder, whose ends are formed by elliptical reference curves 2902 and 2904.
  • Outline curves 2906 and 2908 are attached to the curves 2902 and 2904 with an attribute of extent finding or tangent finding, while the curves 2902 and 2904 are attached to respective reference points 2900 and 2901.
  • a "chevron" or diamond shape pattern 2910 is desired to appear on a side of the character, and this is defined as a diamond shaped curve attached to a constructional curve 2912.
  • the constructional curve 2912 is attached at each end to one of the reference curves 2902 and 2904.
  • the points of attachment of the constructional curve to the reference curves, and the points of attachment of the feature curve 2910 to the constructional curve 2912 can be adjusted by the user independently of the outline curves, in order to achieve a wide variety of desirable effects.
  • Figure 29B shows that the constructional line 2912 can be adjusted to a position where the surface feature 2910 would project outside the outline of the character.
  • the apparatus can be configured to recognise this situation, and to "matte" the feature against the outline curves. This results in the shape shown at 2910' being displayed, with the portion outlined in dotted lines not being painted in the final picture.
  • In the embodiments above, the interpolation paths have been shown variously as being straight, circular or (in foreshortened rotation mode) elliptical. It should be recognised of course that arbitrary trajectories can be defined for the reference points between key frames. One method of doing this is to specify a Bezier curve section using control points for end positions and tangents. Instead of interpolating the values R, θ1 and so on directly, the interpolator may then derive each interpolated value from points along the specified trajectory curve.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Apparatus for generating an animated sequence of pictures, which comprises: means for storing data defining one or more pictures, said data defining one or more lines which, when displayed, define a said picture; means for reading said stored data and for generating therefrom a sequence of further pictures; and means for editing said stored data so as to amend said picture and means for storing local reference system data. The line defining data defines the position of portions of the line relative to a first local reference system (P2), and the means for storing line data is arranged to store data relating to the position of a portion of a line so as to allow said portion to be influenced by a second local reference system (P1).

Description

ANIMATION
This invention relates to apparatus for, and a method of, producing a sequence of images defining an animated sequence, such as a cartoon featuring an animated character. The invention further relates to motion picture signals and recordings produced by such an apparatus or method.
Traditionally, cartoons are manually drawn as a sequence of frames which, when played in succession at relatively high speed, form a moving picture (typical frame rates 24, 25, or 30 frames per second, although sometimes frames are repeated twice). Even for a short sequence, many thousands of frames thus need to be drawn by hand and production of the hand drawn frames requires large teams of skilled animators and assistants. Almost all cartoon animation today is still produced in this way.
In the production of cartoon animations, typically, there is a "key frame" production stage in which a senior animator draws each character at significant points during the sequence, followed by an "inbetween" stage, in which a more junior animator creates the missing intermediate frames by a process of interpolation by eye between adjacent key frames. After this, the sequence is recorded onto film or video tape and then replayed to check for errors. If necessary, frames are redrawn at this point; otherwise the pencil drawings are inked and painted for subsequent recording.
In view of the sheer volume of drawings required, and of the time and expense involved in producing cartoons by this method, attempts have been made to automate parts of this process.
Our earlier international application WO92/09965 (incorporated herein in its entirety by reference) describes an electronic animation system in which key frames are generated as sets of curves defined by sparse curve control points, the positions of which are interpolated in an automatic "inbetween" stage to generate inbetween animated frames; editing means are provided for correcting any parts of the sequence which appear to be unnatural when replayed, typically by inserting a new key frame.
Our earlier application WO92/21095 describes an improved electronic animation system in which curves may be attached to other curves, so as to allow the manipulation of several curves simultaneously. Thus, curves indicating minor detail can be attached to curves indicating bold outlines, and can be edited therewith.
Another known graphics system, which was proposed to be used for animation, is described in "Automatic Curve Fitting with Quadratic B-Spline Functions and its Applications to Computer Assisted Animation", Yang et al., Computer Vision, Graphics and Image Processing, Vol. 33, No. 3, pp. 346-363, March 1986.
We have found that the production of key frames can be time consuming in the apparatus described in the above two references. Accordingly, the present invention provides improved animation methods and apparatus in which whole curves (representing, for example, component parts or limbs of a cartoon character) are defined with reference to a reference point in space, and the curves can be moved or interpolated as a whole by a user simply manipulating the reference point. Thus, the manual work required by the animator in creating a new key frame by editing a previous key frame is greatly reduced.
In particular embodiments of the invention, the position of a reference point is itself defined by reference to that of another reference point, so as to create a hierarchy of reference points which can reflect a hierarchical structure of the character. Thus, the reference point defining the curve representing a toe may be defined by reference to that by which the curve representing a foot is defined, and so on. In such an embodiment, it is straightforward to move a reference point, and thus cause consequential movement of all reference points lower in the hierarchy (positions of which are defined by reference to the point which has moved), and hence the curves which are defined by reference to those reference points.
Although the above techniques lead to improvements in the efficiency of animation, under some circumstances it can be inefficient, and can even give rise to further problems.
One problem occurs when two different curves, forming part of the same character, are defined with reference to (hereinafter: "attached" to) different reference points. For example, one curve may represent the lower arm and be "attached" to a reference point corresponding to the wrist joint, whereas a second may represent the upper arm and be attached to a reference point corresponding to the elbow. Movement of the wrist reference point can cause the two curves to become separated, and in a complex figure this can present a confusing image for the animator to edit if the separation occurs during an interpolated sequence. Accordingly, in the apparatus according to one embodiment, the portions of a curve attached to a first reference point which are adjacent the next higher reference point in the hierarchy are positioned, when the curve is edited or interpolated, by taking account of the higher reference point in the hierarchy. Thus, when the curve is moved by moving its reference point, the link with the curve defined by the reference point higher in the hierarchy is maintained.
This may be achieved for example by providing that the portions of the curve (represented, for example, by curve control points) adjacent to the reference point higher in the hierarchy are, in terms of their position, defined by the higher reference point, the rest of the curve being defined positionally by the lower reference point in the hierarchy.
This provides a simple mechanism for maintaining the linkage between two curves forming part of the same character. However, difficulties can still arise under some circumstances because the shape of the curve may be distorted in a way which can appear confusing to the animator, especially where a complex figure is represented.
Accordingly, in a further embodiment, the positions of some portions of the curve (for example represented by curve control points defining the curve) may, after the curve is edited, be set in dependence jointly upon the positions of two reference points, between which the curve lies. For example, where a reference point is rotated through an angle about another reference point higher in the hierarchy, the positions of intermediate curve control points may be rotated through half the angle. Or, where a reference point is moved towards one higher in the hierarchy so as to compress the curve portions between the reference points, the distance between curve control points lying between the two reference points may likewise be shortened in proportion. Thus, particularly confusing effects where what was originally a smooth curve lying in a single loop is distorted and caused to cross-over itself, or to include sharp inflections, are ameliorated or avoided, and so the topology and/or general appearance of the curve is usually retained.
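By way of illustration only, the half-angle behaviour described above might be implemented along the following lines; the function names are assumptions and the sketch ignores tangent end points.

```python
import math

def rotate_about(point, centre, angle):
    """Rotate a 2-D point about a centre by the given angle in radians."""
    dx, dy = point[0] - centre[0], point[1] - centre[1]
    c, s = math.cos(angle), math.sin(angle)
    return (centre[0] + dx * c - dy * s, centre[1] + dx * s + dy * c)

def place_intermediate_points(points, higher_ref, angle):
    """When the lower reference point is rotated through `angle` about the
    higher one, carry the intermediate control points through half the angle,
    so the curve bridging the two reference points bends rather than tears."""
    return [rotate_about(p, higher_ref, angle / 2.0) for p in points]
```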
In addition to reference points having attributes of position, orientation and scale, for example, a local reference system can include a reference curve. Using for example the mechanisms for attaching curves to other curves, described in WO92/21095, image features defined with reference to the local reference system can then acquire a parametric position on the reference curve, in addition to the attributes defining the position, scale and orientation of the reference system. In embodiments of the invention permitting attachment to curves, useful behaviour similar to that described above can be achieved by providing for automatic adjustment of this parametric position relative to the reference curve.
A further problem, arising from the limitations of a two dimensional representation of three dimensional characters, can arise as follows. If a (humanoid) character, posing with arm outstretched and thumb uppermost, swings his arm in a vertical plane (i.e. about a horizontal axis) through half a revolution, the thumb will in the final position be lowermost. However, if the arm is swung in a horizontal plane (i.e. about a vertical axis) the thumb will remain uppermost. If the motion is specified between two key frames, one at each of the start and end positions, a linear interpolation process as described in our earlier application WO92/09965, will interpolate the arm linearly between the two end positions; this will appear generally similar to the rotation in a horizontal plane, but with an inaccurate result in the centre of the interpolation as the control points for the outer end of the arm will cross over those for the inner end of the arm, so that the arm loses its width in the centre of the swing.
If, on the other hand, the two key frames were attempted to be specified by simply translating (not rotating) the reference point, the shape of the whole limb would be retained but displaced sideways; an effect not generally desired in cartoon animation. Accordingly, in a further aspect of the invention, there is provided a mode in which movements of a curve representing a component of a character can be specified as rotations, and the intervening interpolated frames correspond to successive degrees of rotation of the component.
This embodiment therefore enables a good representation of the type of motion where a character swings an arm in a vertical plane (i.e. about a horizontal axis projecting "out of" the 2-D image). However, it cannot be used, on its own, to represent a motion with any element of rotation about a vertical axis. Accordingly, in a further embodiment, we provide a means of specifying the depth of a rotation so as to produce elliptical rotations, which provide a more convincing and versatile movement of a character limb (for example) .
A remaining difficulty with this embodiment, however, is that (to use the above illustration) even with the shallowest rotation depth, the positions of appendages (for example the thumb mentioned above) are reflected vertically, whereas with a rotation about a substantially vertical axis this should not occur. Accordingly, in a further particular mode of operation, during editing and interpolation a counter rotation of reference points lower in the hierarchy than that undergoing rotation is applied, so as to avoid this reflection where desired.
Other aspects and exemplary embodiments of the invention are as described or claimed hereafter.
The invention will now be illustrated, by way of example only, with reference to the accompanying drawings in which:
Figure 1 shows schematically a block diagram of apparatus according to one embodiment of the invention;
Figure 2A illustrates the contents of a memory of the apparatus used to represent a curve displayed on a display shown in Figure 2B;
Figure 3 is a block diagram schematically illustrating the operation of the apparatus in generating a display;
Figure 4 is a block diagram illustrating the functional elements of the apparatus;
Figure 5 is a flow diagram schematically illustrating a sequence of operations controlled by a user of the apparatus;
Figure 6 is a block diagram indicating schematically the manner in which data relating to a display frame is stored within the memory of the above apparatus;
Figure 7 is a flow diagram showing schematically the process of generating a curve in the above apparatus;
Figure 8 is a flow diagram showing schematically the process of editing a frame in the above apparatus;
Figures 9A-9C are displays generated by the above embodiment on a display screen to illustrate the operation of the above apparatus;
Figure 10 is a flow diagram showing schematically the process of interpolating to produce intervening frames between two key frames in the above apparatus;
Figures 11A-11C are screen displays illustrating a particular feature of the apparatus;
Figure 12 shows in greater detail a portion of the flow diagram of Figure 8 when implementing the feature of Figure 11C;
Figure 13A represents a screen display illustrating two key frames between which data is to be interpolated;
Figure 13B illustrates the effect of interpolating between the key frames using the feature of Figure 11C;
Figure 13C illustrates the effect of adding a further key frame in the sequence of Figure 13A;
Figure 13D illustrates schematically the path over time of the interpolated frames using a further particular feature of the apparatus;
Figure 14 is a flow diagram showing schematically the operation of the interpolator in the situation of Figure 13D;
Figure 15 is a screen display illustrating a yet further feature of the apparatus;
Figure 16 illustrates the corresponding arrangement of information in a frame table in the memory of the apparatus with reference to Figure 15;
Figure 17 is a flow diagram corresponding to Figure 7, illustrating the generation of a display using the feature of Figures 15-16;
Figure 18A is a screen display producible by the apparatus using the feature of Figures 15-17;
Figure 18B is a corresponding screen display after the frame has been edited;
Figure 19A corresponds to Figure 18A;
Figure 19B is a display corresponding to Figure 19A after the component represented therein has been edited according to the feature of Figures 15 to 17;
Figure 19C illustrates the corresponding display generated using a yet further feature of the apparatus;
Figure 20 corresponds to Figure 8 and illustrates the operation of editing a frame using the feature of Figure 19C;
Figure 21A corresponds to Figure 18A;
Figure 21B is a display corresponding to that of Figure 21A after editing of the frame depicted therein according to the features of Figures 15-19C;
Figure 21C is a corresponding display generated by a yet further feature of the apparatus;
Figure 22 is a flow diagram showing in greater detail a portion of Figure 20 implementing the feature of Figure 21C;
Figure 23 shows a display according to a yet further feature of the apparatus;
Figures 24A to 24F illustrate editing of the frame shown in Figure 23;
Figure 25 illustrates a method of operation in the implementation of the feature of Figures 24E and 24F;
Figure 26 is a flow chart illustrating the operations of the apparatus implementing the features of Figures 24E and 24F;
Figure 27 illustrates the result of the method of Figure 26 in a particular situation;
Figure 28 is a flow chart for a more exact method appropriate to the situation of Figure 27; and
Figures 29A and 29B illustrate example methods of constructing a character using the features of Figures 23 to 28.
General description of system
Full details of aspects of the animation system embodying the present invention are given in our earlier application WO92/09965, incorporated herein in its entirety by reference, and so a detailed recital of the apparatus and some aspects of the method of operation of such an animation system is not necessary. For clarity, however, a brief synopsis will now be given.
Our earlier above referenced application, "Interactive Computer Graphics" by Burger and Gillies (1989, Addison Wesley, ISBN 0-201-17349-1), or "An Introduction to Splines for use in Computer Graphics and Geometric Modelling" by R. H. Bartels, J. C. Beatty and B. A. Barsky (Morgan Kaufmann, ISBN 0-934613-27-3), all incorporated herein by reference, disclose apparatus for editing and displaying smooth curves by defining curve control points dictating the shape of the curves. One example of a class of such curves is the B-spline. A particular way of representing such splines is the "Bezier" format, in which the curve is represented by a number of curve control points, the data for each curve control point comprising the coordinates of a point on the curve, together with the coordinates of two tangent end points marking tangents to the curve at that point.
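The pairing of tangent end points between successive control points shown below is the conventional cubic Bezier construction and is assumed here purely to illustrate how such a curve section can be evaluated.

```python
def bezier_point(p0, e0, f1, p1, t):
    """Evaluate one cubic Bezier section at parameter t in [0, 1]: p0 and p1
    are the on-curve control points, e0 the tangent end point leaving p0 and
    f1 the tangent end point arriving at p1, so (p0, e0, f1, p1) is the
    section's Bezier polygon."""
    u = 1.0 - t
    b0, b1, b2, b3 = u * u * u, 3 * u * u * t, 3 * u * t * t, t * t * t
    return (b0 * p0[0] + b1 * e0[0] + b2 * f1[0] + b3 * p1[0],
            b0 * p0[1] + b1 * e0[1] + b2 * f1[1] + b3 * p1[1])

# Sampling the intervening points along one section, much as a line image
# generator would when filling a frame store.
p0, e0 = (0.0, 0.0), (10.0, 20.0)
f1, p1 = (30.0, 20.0), (40.0, 0.0)
samples = [bezier_point(p0, e0, f1, p1, i / 20.0) for i in range(21)]
```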
Referring to Figure 1, apparatus according to an embodiment of the present invention (described in our earlier application WO92/09965) comprises a computer 100 comprising a central processing unit 110, a memory device 120 for storing the program sequence for the central processing unit (CPU) 110 and providing working read/write memory, a frame store 130 comprising a series of memory locations each associated with, or mapped to, a point in an image to be generated or processed, and an input/output controller 140 providing input and output ports for reading from and writing to external devices, all intercoupled through common parallel data and address buses 150.
A monitor 160 is connected to the computer 100 and its display updated from the frame store 130 under the control of the CPU 110. At least one user input device 170a, 170b is provided; typically a keyboard 170b for inputting commands or control signals for controlling peripheral operations such as starting, finishing and storing the results of an image generation or image processing operation, and a position sensitive input device (cursor control device) such as, in combination, a stylus and digitising tablet or a "mouse", or a touch-sensitive screen on the monitor 160, or a "tracker ball" device or a joystick. A cursor symbol is generated by the computer 100 for display on the monitor 160 in dependence upon the signal from the position sensitive input device 170a, to allow a user to inspect an image on the monitor 160 and to designate a point or region of the image during image generation or processing.
A mass storage device 180 such as, for instance, a hard disk device is preferably provided as a long term image store, and preferably the mass storage device 180 also or alternatively comprises a removable medium storage device such as a floppy disk drive, to allow control programs and data to be transferred into and out from the computer 100. Also preferably provided is a printer 190, a film recorder 196 and/or a video recorder 197. A picture input device 195 such as a scanner for scanning an image on, for example, a transparency, and inputting a corresponding video signal to the computer 100 may also be provided.
Referring to Figures 2A and 2B, as described more fully in our earlier referenced application, the memory 120 includes a working memory area 121. An image displayed on the monitor 160 includes at least one line A which is drawn as a curve defined by three control points A1, A2, A3, the corresponding image frame data representing the line images being stored within a frame table 122 within the working memory 121 as a series of curves (curve A, curve B, ...), each of which is defined by a series of control points (A1, A2, A3) each represented by the control point position (Xi, Yi) and the positions of the end points of the two tangents at that point (Xei, Yei; Xfi, Yfi). In this embodiment, as will be more fully described below, each curve also includes a pointer to a reference point P1, and the frame table 122 includes data defining the position and orientation of the reference point. The coordinates of the control points of the curve A are defined in the reference frame of the reference point (i.e. as offsets from axes running through the reference point position at specified orientations).
Referring to Figure 3, and as described in greater detail in our above referenced application, the CPU 110 functionally comprises a line image generator 111, a cursor tracker 112 and a display editor 113. The line image generator 111 is arranged to read the frame table 122, to calculate from the reference point data and curve control point data the coordinates of intervening points along the curve, and to write the intervening points into the frame display store 130 in such a manner (for example, in a different colour) that they are distinguishable from the background. The memory mapped frame image in the display store 130 is then displayed on the monitor 160. The cursor tracker 112 reads the coordinates of the position-sensitive input device 170a from the device controller 140, and writes a cursor symbol D at a corresponding position in the frame display store 130 for corresponding display. The display editor 113 responds to the cursor position from the cursor tracker 112 to alter the contents of the frame table 122 (specifically, the reference point P1 position or orientation, or the positions of curve control points and tangent end points). After any such editing of the frame table 122 by the display editor 113, the line image generator 111 amends the contents of the frame display store 130 so that the display on the monitor 160 changes correspondingly.
Referring to Figure 4, the CPU 110 further comprises, functionally, an interpolator 101 (described in greater detail hereafter and in our earlier referenced application) which is arranged to generate sequences of image frames between a pair of spaced image key frames; a replayer 103 arranged to recall a stored image sequence previously created, and generate a display of the sequence as a moving image on the animator's screen 160 (as described in our earlier above referenced application); and a renderer 105 arranged to colour each generated image and/or otherwise affect the way in which the image is represented (as described for example in our earlier applications WO92/09966, WO92/21096 and UK Application 9310940.3). Each of the components 105, 103, 111, 113, 101 may be provided by a separate processor, or each may be provided by a suitable sequence of execution steps on a common processor.
General description of animation process
As described in our above referenced earlier application, the processes performed by the apparatus embodying the present invention to enable a user to define an animated sequence are:
1. Defining objects to be animated (characters).
2. Defining key frames (i.e. creating image frames spaced apart in time in which the character is posed or shaped in a particular manner).
3. Creating interpolated "inbetween" frames (intervening image frames created between the key frames by the interpolator 101).
4. Displaying and editing (the interpolated sequence is displayed and changed as necessary).
5. Replaying (the sequence is displayed at a normal replay speed).
6. Rendering (the image is coloured and mixed with a background).
7. Film or video recording (each image in turn is recorded in a sequence either in electronic (video) form or by the film recorder as a sequence of colour transparencies, for projection or display).
The present invention is particularly concerned with the stages of defining key frames, creating interpolated frames, and editing key frames. The other stages above are as described in our earlier referenced applications in general.
One typical sequence of operations in an embodiment of the present invention is shown in Figure 5. Initially, the user will wish to create a character to animate, and accordingly a "template" frame table defining the topology of the character, or part thereof, is created (Steps S500, S502) in the working memory 121.
The next stage (S504, S506) is to create a number of key frames. These may be created by editing (S506) the curve control points of the template frame. Another possibility is to scan in an image using the image input device 195, display the scanned image on the monitor 160, simultaneously with the template image, and edit the template image to conform to the scanned image which will previously have been drawn freehand. In either case, when the animator is satisfied with the keyframe, the frame table 122 is then permanently stored (S508).
In the apparatus, the template frame table (and the key frame tables derived therefrom) specifies a sequence of reference points as well as a sequence of curve tables each specifying a plurality of curve control points. Each reference point data record in the memory 121, other than the single reference point highest in the hierarchy of points, includes a pointer (PREV R.P.) to the location of the data record of a higher reference point in the hierarchy. Each reference point record also includes a pointer to a curve to which it is attached (i.e. a curve whose position and orientation is defined by that of the reference point). Each curve table record in the working memory 121 likewise contains a pointer to a reference point.
Typically, the user may first cause the creation of the curve table including the template frame comprising a plurality of lines, and then subsequently create the hierarchy of reference points by selecting reference point positions (typically at joints of a character) and designating which reference points are defined by reference to which others. Finally, the user inputs data designating which lines in the template image are to be attached to which reference points (for example, by designating a line and then a reference point using the position-sensitive input device 170a). The editor 113 then derives the relative position of the control points of the curve, in the coordinate space of the reference point, and writes these into the frame table 122 in place of the absolute positions previously present.
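A minimal sketch, in Python, of the kind of conversion between absolute and reference-point-relative coordinates that this attachment step implies (scaling is omitted and the function and parameter names are illustrative assumptions, not a recital of the editor 113):

    import math

    def to_local(x, y, ref_x, ref_y, ref_angle):
        # Express an absolute point in the coordinate space of a reference
        # point at (ref_x, ref_y) whose axes are rotated by ref_angle radians.
        dx, dy = x - ref_x, y - ref_y
        c, s = math.cos(-ref_angle), math.sin(-ref_angle)
        return (dx * c - dy * s, dx * s + dy * c)

    def to_absolute(lx, ly, ref_x, ref_y, ref_angle):
        # Inverse conversion: recover the absolute position of a local point.
        c, s = math.cos(ref_angle), math.sin(ref_angle)
        return (ref_x + lx * c - ly * s, ref_y + lx * s + ly * c)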
In editing and creating key frames, therefore, the animator amends the position and orientation of reference points to produce wholesale variations in all curves the positions of which are defined by the reference points (for example, to rotate or move a limb of a character). To change the shape of components of a character, the animator also edits the curve control points, as discussed in our earlier above referenced application. Because the curve control points are represented, in this apparatus, by defining data relative to the reference position, the contents of the curve tables need not be altered when the reference points are edited, unless a change of character shape is actually desired.
For each character to be animated, there will be one reference point which is highest in the hierarchy. Moving this reference point will affect the whole of the character. This accordingly provides a convenient means of editing the character position, orientation or scale. For example, this character reference point may be defined positionally with reference to another reference point on another character, so as to be moved with the other character.
The animator may then cause the interpolator 101 to interpolate between the key frames as created
(S510), and may view (S512) the interpolated frames on the monitor 160, and make any necessary amendments to the key frames, or add a new key frame. Data defining the sequence are then stored (S514). Subsequently, the interpolated sequence generated by the replayer 103 may be viewed (S516), and again any necessary editing of key frames performed. Finally, the interpolated sequence is rendered (S518), and may then be stored (S520) as a video sequence of images.
At some points during an animated sequence, it may be desirable to alter the linkage between curves and reference points, or the hierarchy of reference points. The editor 113 is preferably arranged to permit this. As discussed in greater detail below, the interpolation process generally requires a consistent linkage between reference points and curves, and accordingly it may be necessary to break the interpolated sequence at a point where the hierarchy of reference points, or the linkage between curves and reference points, is changed, by providing two successive key frames.
Referring to Figure 6, in this embodiment, as in our earlier application WO92/09965, the frame table 122 of Figure 2A comprises a list of lines or curves making up a set which represents the object or character which the frame depicts. The lines or curves are provided as a linked list of curve tables 2100, 2200, 2300, 2400 (only 2100 shown); each curve table comprising a list of curve control points 2110, 2120, 2130 etc. Each control point field 2110 comprises position data defining the control point coordinates, relative to a reference point, and position data defining the control point tangent end coordinates relative to the reference point. Attribute control points, as discussed in WO92/09965, may also be present.
Also provided within the frame table 122 is a network of reference point fields 7100, 7200, 7300 .... Each reference point field comprises data defining a first angle θ1 and a distance R which comprise, in polar coordinates, the position of the reference point relative to a reference point higher in the hierarchical network; an angle θ2 defining a rotation of the axes about the reference point relative to those of the other reference points; optionally, a scale factor S for scaling the axes; a pointer "PREV R.P." to the location in the memory 120 of the other reference point relative to which the reference point is defined (i.e. the next reference point up in the hierarchy); and a pointer to the location of any curve tables 2100 the positions of which are defined relative to that reference point 7100. The fields storing the two angles θ1, θ2, the distance R and the scale factor S are typically separate transformation matrices, allowing the rotation, translation and scaling transformations to be separately effected.

The distance R is, in fact, a transformation matrix defining a two dimensional translation; when used, as described hereafter, to define a distance R, the translation is purely one dimensional.
Figure 7 shows in a flow chart operations of the line generator 111. Steps S700, S710 and S712 define a loop by which each reference point is processed in turn. For each reference point table 7100, 7200, 7300 ..., the line generator 111 calculates (S702) the cumulative transformation comprising the two rotations, translation and scale transformations for that reference point multiplied together, and multiplied by the cumulative transformation matrices for all higher reference points in the hierarchy (i.e. the reference point which is pointed to by the pointer field (PREV R.P.), and all its predecessors).
Next (S704), the curves linked to the reference point are identified from the curve pointers in the reference point table. The spatial coordinates of each control point of each linked curve are derived
(S706) by multiplying the stored coordinates in each control point table 2100 with the cumulative transformation matrix of the reference point to which the curve is linked. Having derived the transformed control point positions and sorted them
(S708), the line generator 111 then generates
(S714) a line image, for example in the manner described in the above referenced WO publications.
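The composition of transformations described above might be sketched in Python as follows; the field names theta1, R, theta2, S and prev, and the particular ordering of the matrix product, are assumptions made for illustration rather than a statement of the actual table layout:

    import math

    def rot(a):
        c, s = math.cos(a), math.sin(a)
        return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

    def trans(tx, ty):
        return [[1.0, 0.0, tx], [0.0, 1.0, ty], [0.0, 0.0, 1.0]]

    def scale(k):
        return [[k, 0.0, 0.0], [0.0, k, 0.0], [0.0, 0.0, 1.0]]

    def matmul(a, b):
        # 3 x 3 homogeneous matrix product
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    def cumulative_transform(name, table):
        # Local transform of one reference point: rotation about its parent,
        # translation by the distance R, self-rotation, then scaling, all
        # pre-multiplied by the parent's cumulative transform (cf. S702).
        rec = table[name]
        local = matmul(matmul(rot(rec["theta1"]), trans(rec["R"], 0.0)),
                       matmul(rot(rec["theta2"]), scale(rec["S"])))
        if rec["prev"] is None:
            return local
        return matmul(cumulative_transform(rec["prev"], table), local)

    def apply_transform(m, x, y):
        # Apply a cumulative transform to a control point held in local
        # coordinates (cf. S706).
        return (m[0][0] * x + m[0][1] * y + m[0][2],
                m[1][0] * x + m[1][1] * y + m[1][2])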
Referring to Figure 8, in this embodiment, to edit a reference point position the animator indicates, using the position-sensitive input device 170a, a command to edit a selected reference point, and then inputs a command signal to select (S800) an editing mode from command options which comprise: a rotation of the reference point about the next reference point up in the hierarchy; a rotation of the reference point about itself; a change in position of the reference point; and a scaling of the reference point axes (and hence of the curve control point coordinates and thus the size of each of the curves attached to the reference point).
The cursor tracker 112 thereafter is arranged (S802) to read the position indicated by the position-sensitive input device 170a, and, in dependence upon the position, to alter the data in the reference point table 7100 for the selected reference point.
It may occasionally be desired to translate a reference point position whilst leaving all other aspects of the reference point unchanged. In this instance, if the display cursor D is moved by the position-sensitive input device 170a to a position X, Y relative to the next highest reference point in the hierarchy, those values of X and Y are used to calculate the new components of the translation matrix, which in this instance therefore defines a two dimensional translation by X, Y. We have found, however, that it is generally preferable to provide reference point motions as a combination of a rotation and a translation.
Thus, if the display cursor D is moved by the position-sensitive input device 170a to a position X, Y relative to the next highest reference point in the hierarchy, the distance R and angle θ1 are set corresponding, respectively, to √(X² + Y²) and ARCTAN(X/Y). Corresponding transformations are therefore stored (S810) in the reference point table 7100.
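A brief Python sketch of this conversion, assuming the ARCTAN(X/Y) angle convention used above (atan2 is used so that the quadrant of the cursor is unambiguous):

    import math

    def cursor_to_polar(x, y):
        # Distance and angle of the cursor offset from the parent reference point.
        return math.hypot(x, y), math.atan2(x, y)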
Likewise, where a pure rotation is selected, the angular position of the cursor relative to a reference point, or the preceding reference point in the hierarchy (depending on the mode selected), is calculated as ARCTAN(X/Y) and the transformation defining the angle θ1 or θ2 is correspondingly altered (S806). Likewise, changing the scale (by, for example, typing a new value from the keyboard 170b) causes the editor 113 to write (S808) a new scale factor into the reference point table 7100.
The display editor 113 having thus amended the frame table 122, the line image generator 111 is then arranged to draw (S812) the amended frame as discussed above, to allow the user interactively to edit the frame until he is content with the appearance of the frame.
Referring to Figure 9, curves defining the appearance of a cartoon character are attached to various reference points in the manner described above. For controlling arm movements of the character in an intuitive manner, reference points P1, P2 and P3 are defined corresponding to shoulder, elbow and wrist positions of the arm respectively. The points P1, P2 and P3 are successively lower in the hierarchy.

Figure 9A illustrates the effect of rotating the shoulder reference point P1 about its own axis (although the same effect could be achieved in this case by rotating the elbow reference point P2 about the shoulder reference point P1); Figure 9B illustrates the effects of rotating the reference point P2 about its own axis; and Figure 9C shows the effects of rotating the reference point P3 about its own axis. It will be seen that rotating a reference point automatically results in the rotation of all curves attached to that reference point and of reference points lower in the hierarchy.
Referring to Figure 10, to interpolate a plurality of inbetween frame tables from two key frame tables in this embodiment, a number N of new frame tables 211 are created within the memory 120, one corresponding to each desired interpolated frame. Then, to perform linear interpolation, an interpolation factor L is set (S1002) equal to i/N (where i is a frame counting index), and each of θ1, θ2, R and S is interpolated by setting R = L R1 + (1-L) R2; S = L S1 + (1-L) S2; etc. (where R1, S1 etc. are the parameters of a reference point of the first key frame and R2, S2 etc. are the corresponding parameters of the corresponding reference point of the second key frame). For non-linear interpolation, a function of L (e.g. a cosine function) may be used. When all parameters have been interpolated, and stored (S1006) in a frame table, an inbetween frame has been defined.

Steps S1008 and S1010 implement a loop whereby steps S1002 to S1006 are performed for each inbetween frame index value, i = 1 to N-1.
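The interpolation of steps S1002 to S1006 might be sketched in Python as follows; the dictionary of parameters and the optional ease function are illustrative assumptions:

    def interpolate_reference(params1, params2, i, n, ease=None):
        # Interpolation factor L = i/N; each stored parameter is set to
        # L * (first key frame value) + (1 - L) * (second key frame value),
        # as in the text.  `ease` may be a function of L (e.g. a cosine
        # profile) for non-linear interpolation.
        L = i / n
        if ease is not None:
            L = ease(L)
        return {k: L * params1[k] + (1.0 - L) * params2[k] for k in params1}

    # e.g. inbetweens = [interpolate_reference(key1, key2, i, N) for i in range(1, N)]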
Counter-rotation mode
In cartoon animation, it is often required to move a component of a character (for example an arm) to simulate the effect of swinging about a vertical axis, past a sight line to the character. "Vertical" in this context means in the plane of the 2-D image. Because this type of movement has a component towards the viewer, it is three-dimensional and is therefore often difficult to deal with in a purely two-dimensional animation system.
If, in the apparatus as described so far, it is attempted to emulate this movement by shifting the reference point position, the shape of the component is unchanged (rather than being reflected as it passes the view line to the component) and consequently the motion is extremely unconvincing. Although the final shape could be edited, the correspondence between curve control points would in many cases be lost and so interpolation would be more difficult.
On the other hand, if the appearance of rotation about the vertical axis is achieved by a rotation about an axis perpendicular to the image plane then, as shown in Figure 11B, any other curves attached via reference points will likewise be rotated, which is undesirable and unnatural-looking. Accordingly, in this apparatus, the apparatus is arranged to provide a mode in which a rotation about a vertical axis is simulated by rotating a reference point through an angle θ1 about the next reference point above it in the hierarchy, and then counter-rotating it by an angle θ2 (= -θ1) about itself. It will be appreciated that the same effect could be achieved instead by counter-rotating each of the reference points below the reference point in question by θ2 about that point, but this is less convenient. The effect of the rotation followed by counter-rotation is shown in Figure 11C, where reference point P2 is rotated about the higher reference point P1, and then counter-rotated about itself.
Referring to Figure 12, in this embodiment, when the mode selected by the user in Figure 8 (S800) is the counter-rotate mode according to this embodiment, the offset and rotation R, θ1 are derived from the X and Y cursor position indicated by the position-sensitive input device 170a in the manner previously described. Then, the self rotation value θ2 for the selected reference point is changed by an amount corresponding to θ1, in the opposite sense. The amended frame is then redrawn (S1204/S812) by the line generator 111.
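A sketch, in Python and under assumed field names, of the counter-rotation editing step just described:

    import math

    def counter_rotate(record, x, y):
        # Derive R and theta1 from the cursor offset relative to the parent
        # reference point, then change the self-rotation theta2 by the same
        # angular change in the opposite sense, so that curves and reference
        # points attached below keep their orientation.
        new_theta1 = math.atan2(x, y)           # ARCTAN(X/Y) form used in the text
        delta = new_theta1 - record["theta1"]
        record["R"] = math.hypot(x, y)
        record["theta1"] = new_theta1
        record["theta2"] -= delta               # equal and opposite counter-rotation
        return record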
Foreshortened rotation mode
Referring to Figure 13, Figure 13A(1) indicates a first key frame displayed upon the display unit 160, and Figure 13A(2) indicates a second key frame, generated from the first by rotating reference point P1 about reference point P2 using the counter-rotation mode.
In Figure 13B, the five interpolated frames produced by the interpolator 101 by linear interpolation as described above are indicated. It will be seen that the resulting rotation is fully in the plane of the display device 160. Therefore although the second key frame (2) is correctly positioned to have been produced by a rotation out of the plane of the display device 160 (i.e. around a "vertical" axis), the interpolation does not achieve this effect.
It might be thought that the problem could be overcome in the apparatus as described so far by converting the third interpolant frame (3) of
Figure 13B into a key frame, as disclosed in our earlier application WO92/09965 (incorporated herein by reference), and reducing the distance R of the reference point from the predecessor. However, as shown in Figure 13C, the effect of this is to change the smooth circular arc path over time shown in Figure 13B into a bi-lobed curve, which is even less satisfactory as a representation of motion through an arc extending out of the plane of the display device 160.
Accordingly, in this apparatus, a particular interpolation mode is provided in which interpolation is performed so as to generate a smooth motion curve, approximating for example to an elliptical arc. Figure 14 is a flowchart illustrating one particular way of achieving this, with the result shown in Figure 13D. Steps S1400, S1402 and S1414 implement a loop for processing each of N-1 interpolated frames (index i = 1 to N-1). For each interpolated frame, the interpolator 101 is arranged to linearly interpolate the angle θ1, as in the above embodiment, and the radius R, so as to produce a series of reference point positions corresponding to those of Figure 13B. Subsequently the reference point position is modified (S1406) to effect a compression of the vertical axis of the frame, thus foreshortening the circular arc marking the path of the reference point in subsequent frames to form an elliptical arc.
This compression may be achieved by deriving (S1406) the cartesian (X, Y) position coordinates of the reference point, by deriving the cumulative transform for the reference point, and then multiplying (S1408) the difference in Y coordinate between the reference point and the reference point above it in the hierarchy (about which it is rotated) by a fractional scaling factor, so as to reduce its vertical offset from the axis about which it is rotated during interpolation. The modified reference point position can then be returned (S1410) to polar form, and stored (S1412) in the frame table for interpolated frame i.
The extent of the desired compression (the fractional scaling factor) is input by the animator, from the keyboard 170b, or position-sensitive input device 170a.
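One plausible rendering of the compression of steps S1406 to S1410 in Python, assuming the foreshortening is applied along the Y axis of the parent reference point:

    import math

    def foreshorten(theta1, r, squash):
        # Convert the interpolated polar position to cartesian offsets from
        # the parent reference point, compress the vertical (Y) offset by the
        # fractional factor `squash` (0..1), and convert back, so that a
        # circular interpolation path becomes an elliptical one.
        x = r * math.sin(theta1)                # theta measured as ARCTAN(X/Y)
        y = r * math.cos(theta1)
        y *= squash
        return math.atan2(x, y), math.hypot(x, y)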
It will be appreciated that, instead of compressing the vertical scale of the reference point, the same effect could be achieved after the positions of the curve control points have been derived by scaling the curve control point positions, although this involves rather more computation. Further, it would be possible to achieve the same effect by non-linear interpolation of the angle θ1 and radius R according to a desired interpolation function. Thus, as shown in Figure 13D, a smooth foreshortening of the rotated distance is achieved, giving the impression of rotation into or out of the depth of the display 160.
To facilitate the above mode, in this apparatus, at each key frame the table 7100 for each reference point includes, if the reference point is to be rotated with foreshortening, an indication that the above foreshortening interpolation is to be performed, and an indication of the degree of foreshortening desired. A direction of foreshortening might also be specified, rather than assuming rotation will always be about a purely vertical axis.
Curves attached to plural reference points
In the apparatus as described so far, each curve is attached to (in other words defined with reference to) a single reference point. This arrangement works well in many cases. However, we have found that under some circumstances when editing or interpolating frames, adjacent curves defining adjacent portions of a character (together with rendering information associated therewith) can behave separately in a manner which is confusing for the animator and consequently requires more work in editing. Accordingly, in this apparatus, portions of a curve may be linked to two reference points, and appropriate modes of behaviour defined for the curve control points.
Referring to Figure 15, a portion of a cartoon character (for example a leg) comprises a first closed curve A defined by reference to a first reference point P1, and a second closed curve B defined by reference to a second reference point P2. The first curve A is defined by four curve control points A1-A4 and the curve B is likewise defined by four curve control points B1-B4.
We have found that, using an apparatus of the type disclosed herein, it is common to wish to represent a character by a series of closed curves representing, very roughly, elliptical or sausage-shaped portions of the character. The adjacent portions (for example, upper and lower leg) approach or overlap at a region which, in the human or animal represented by the character, would be a physical joint. Thus, commonly one portion will pivot around the joint relative to the other. Thus, in using the present apparatus, a reference point is conveniently placed at the overlap region where a joint would be, to facilitate such rotation. Since the joint is typically near the end of one or both character portions (defined by their outline curves), the curvature of the outline curves is relatively tight and consequently typically there will be several curve control points located near the reference point P1 at the joint.
If the reference point P2 is moved (for example to increase its distance from reference point P1), the two curves A, B can become separated, which reduces their similarity to a single character. Likewise, if the reference point P2 is moved towards the reference point P1, the resemblance to a character is again reduced. These problems can, of course, be overcome by editing as described above and in our previous patent application WO92/09965 (incorporated herein by reference), but such editing can be time consuming.
Accordingly, in the present apparatus, data defining the portion of the curve B (represented by the control points B2 and B3) which is closest to the reference point P1 is stored in the memory 120 so as to be linked with data defining that reference point P1, so that when generating new frames by editing or interpolating the position and shape of the curve B, the position of the curve B is dictated both by reference point P2 and by the reference point P1.
Referring to Figure 16, each of the curve tables 2100, 2200 relating to curves A, B includes a pointer field containing a reference to the location in the memory of the corresponding reference point table 7100, 7200. Additionally, each of the control point fields 2210, 2220 ... contains a field including a pointer to the location of a corresponding reference point table 7100, 7200. Thus, the control points B2, B3 may be linked to a reference point P1 other than the reference point P2 to which the curve B is linked.
In this apparatus, as described above, when the position of a reference point is changed (for example when creating a new key frame) the new values of the transformations for rotation, shift and scaling (θ1, θ2, R and S) are stored in the reference point table 7100, 7200 etc.
Referring to Figure 17, to draw the frame, the line image generator 111 is arranged to consider each reference point in turn (steps S1700, S1710, S1712). For each reference point, the cumulative transformations are derived (S1702), as described above. These cumulative transformations for each reference point are then applied (S1704, S1706) to the curve control points whose records explicitly point to that reference point. Likewise, the cumulative transformations for each reference point are applied to the curve control points of each curve whose record points to that reference point, provided that those curve control points do not themselves point to a different reference point.
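The rule of Figure 17 might be sketched in Python as follows; the dictionary-based layout of curve and control point records is an assumption made for brevity:

    def transform_curve(curve, cumulative):
        # `curve` is a dict with a default reference point name in curve["ref"]
        # and a list of control points, each a dict with local coordinates
        # "x", "y" and optionally its own "ref"; `cumulative` maps reference
        # point names to functions taking local (x, y) to absolute (x, y).
        out = []
        for cp in curve["points"]:
            ref = cp.get("ref") or curve["ref"]   # per-point override wins
            out.append(cumulative[ref](cp["x"], cp["y"]))
        return out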
Thus, in this apparatus and the example of Figure 15, when reference point P2 in Figure 15 is moved, the control points B1 and B4 move with it, and the curve control points B2 and B3 do not. To render each frame, the line generator 111 then joins the curve control points B1-B4 with a smooth curve as described above and in our above referenced PCT application.
Attachment to "joints"
Referring to Figure 18, Figure 18B indicates the effect which would be obtained if the reference point P2 were displaced from its position in Figure 18A to its position in Figure 18B by merely shifting its axes by an amount X, Y. It will be seen that the effect bears little resemblance to a two dimensional outline of a three dimensional cartoon character, and is rarely of use to the animator. Accordingly, for this reason, in this embodiment it is preferred to represent motion of a reference point P2 by a rotation wherever possible, to maintain the curvature between B2 and B3 convex relative to the curve segments between B1 and B2 and between B3 and B4.
Referring to Figure 19, Figure 19B illustrates the effect achieved on the display 160 by rotating the reference point P2 from the position shown in Figure 19A through an angle θ to the position shown in Figure 19B. It will be seen that, whereas the curvature between B2 and B3 is now convex, the curve segments B1-B2 and B3-B4 now cross in an artificial looking manner which is undesirable in cartoon animation.
Accordingly, to avoid this problem the editor 113 is arranged, when moving the reference point P2, to edit also the positions of the curve control points B2, B3 which are attached to (i.e. positionally defined by) the reference point P1 around which the reference point P2 is rotated. In particular, the two control points B2, B3 are each rotated by an angle θ/2 corresponding to half the angle through which the reference point P2 itself is rotated. As shown in Figure 19C, the effect is that part of a curve can be smoothly rotated by changing the position of the reference point P2 without breaking the outline or producing lines which cross over.
Referring to Figure 20, in general terms, when a component such as that illustrated in Figure 19A is desired to be edited by the animator, the editor 113 is arranged to perform the process shown in Figure 20. The process, comprising initially steps S2000 to S2010, is generally similar to that indicated in Figure 8 (steps S800 to S810), but is expanded as follows. After amending the position of a reference point P2 (S2010), the next reference point up in the hierarchy (in this case, P1) pointed to by that reference point is located in the frame table 122 (S2012), and the two control points B2, B3, comprising part of the curve attached to the reference point P2 which has been edited, and which are linked by a pointer to the higher reference point P1, are also identified (S2014). These control points are then edited (S2016) by (in this embodiment) rotating each through half the angle through which the reference point P2 has been rotated. This editing stage comprises simply calculating the half angle and applying this to each relevant curve control point as a rotation in the coordinate space of the relevant reference point P1.
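A minimal Python sketch of this half-angle adjustment, assuming the joint-attached control points are held as offsets in the joint reference point's coordinate space:

    import math

    def rotate_joint_points(points, delta_theta):
        # Rotate each joint-attached control point, expressed as an (x, y)
        # offset in the joint reference point's coordinate space, through
        # half the angle applied to the lower reference point.
        half = delta_theta / 2.0
        c, s = math.cos(half), math.sin(half)
        return [(x * c - y * s, x * s + y * c) for (x, y) in points]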
The above feature may be referred to as an option for attaching curve control points to a "joint" in the "skeleton" defined by the set of reference points for the character.

Attachment to "bones"
Referring to Figure 21, a further difficulty can arise when, as shown, a reference point P2 is moved towards a reference point P1 (as shown, from its position in Figure 21A to that in Figure 21B). In Figure 21, most of the curve control points, and the curve B in general, are positionally defined with reference to the reference point P2, and two of the curve control points are defined with reference to the reference point P1. It will be seen, from Figure 21B, that the control points which lie between the two reference points P2 and P1 in Figure 21A now project beyond the reference point P1, causing a buckling in the outline of the curve which is confusing and not generally useful in cartoon animation. A similar problem could have arisen had certain other of the control points been attached to the reference point P1. Accordingly, in this apparatus, the editor 113 can be arranged, when a reference point is moved such that the distance R from its predecessor in the hierarchy is reduced, to edit the positions of control points to reduce their displacement from the reference point with reference to which they are defined by the same ratio as the ratio by which the distance between the two reference points has changed (Rnew/Rold), as shown in Figure 22. The
translated control points are then stored once more in the frame table 122, and the amended frame is redrawn as before, to produce the results shown in Figure 21C. In this mode the curve control points can be referred to as being attached to an imaginary "bone" joining two reference points in the character skeleton.
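A minimal Python sketch of this "bone" adjustment; the representation of the control points as plain offsets in the relevant reference point's coordinate space is an assumption:

    def scale_bone_points(points, r_old, r_new):
        # Scale the offsets of "bone"-attached control points by the same
        # ratio as the change in distance between the two reference points.
        k = r_new / r_old
        return [(x * k, y * k) for (x, y) in points]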
Attachment to curves
Rather than curves always being attached directly to reference points, the apparatus also provides that curves can be attached to other curves, in the manner described for example in application WO 92/21095. This facility can be particularly useful in creating life-like characters in an intuitive manner, and the apparatus provides modes of editing so as to ensure that this realism is easily maintained during the creation of key frames for a complete animated sequence.
Figure 23 shows a collection of four curves A, B, C and D which are generally attached to a pair of reference points P1, P2. Curve A is defined by four control points A1 to A4 which are all attached to (defined with reference to) the first reference point P1. Similarly, curve B is defined by four control points B1 to B4, all attached to the second reference point P2.

Rather than attach curves C and D directly to the reference point P1 or P2, each of these outline curves is attached to points on the curves A and B, which are thus used as reference curves or "formers". Curve C is defined by a control point C1 attached to curve A at one end, and a point C2 attached to curve B at the other end. The position of each point C1 and C2 is defined with reference to a parameter of the parametric curve A and B respectively, in a manner described in more detail in Application WO 92/21095. Therefore, no matter how the curve A is rotated or translated, the control point C1 will maintain its position relative to that curve, with its absolute position changing as necessary. On the other hand, changing the parametric position of point C1 on the curve A will cause the attachment point C1 to move along the length of the curve A, for example from a position near one control point of curve A along the curve towards the next.

Curve D is defined similarly to curve C, having one end control point D1 attached to curve A, and another end control point D2 attached to curve B. Figures 24A to 24D show how the curves C and D behave when reference point P2 is moved towards point P1, coinciding exactly with point P1 in Figure 24C, and continuing until the relative positions of P2 and P1 are substantially reversed in Figure 24D.
Because curves C and D are defined as attached curves in the manner described, the behaviour of these curves appears quite natural, exhibiting none of the unnatural behaviour shown for example in Figure 21B. As in the example of Figure 21C, intermediate control points of the curves C and D (for example those between C1 and C2 shown in Figure 23) have their positions defined by a scaling factor between the end control points (C1, C2 etc.) at the ends of the relevant curve. The desirable behaviour illustrated in Figure 21C, for example, is therefore maintained. Various techniques for automatically transforming the intermediate curve control points are possible, as described in more detail in the referenced WO92/21095.

Extent finding
For situations where reference point P2 is rotated relative to point P1, as illustrated in Figures 24E and 24F, the apparatus further provides means for automatically adjusting the parametric positions of the points of attachment of curves such as curves C and D, to avoid problems of the type illustrated in Figure 19B.

In particular, a method is employed whereby the parametric positions of the attached points (C1, C2 etc.) are moved automatically along the curves to which they are attached (A, B etc.). A particular method for this is one in which the extent of the point of attachment is maximised in a given direction.

Figure 25 illustrates one technique for identifying a direction in which to maximise extents, while Figure 26 is a flow chart of one method of maximising the extent for one attachment point (for example C1). In Figure 25, where the reference curves A and B are defined with reference to particular reference points P1 and P2 respectively, the orientation of a line 2500 joining points P1 and P2 is used to define the said direction. Specifically, this line defines an angle θ relative to the horizontal (X) axis. The points of attachment can be moved, for example, to maximise their extent in a direction perpendicular to this line 2500, with results which are generally satisfactory.
In the method of Figure 26, an attachment point (C1 for example) is selected for adjustment (S2600) and a direction (angle θ) is obtained (S2602). As described above, where there are reference points associated with the two ends of the curve as shown in Figure 25, for example, the angle θ may be determined simply as the angle of a line joining the reference points. If there are no reference points suitable for determining θ, other methods may be employed to obtain such an angle, including for example direct user input. Having obtained the angle θ, the reference curve (curve A for example) to which the selected point is attached is rotated by an angle -θ (S2604) in order to bring it into a "horizontal" orientation. Then the maximum extent of the curve in the Y direction (perpendicular to line 2500) is easily calculated (S2606). In particular, since each section of the reference curve is a cubic, the maximum and minimum values of Y can be found by solving a quadratic equation which is the derivative of the curve equation in Y only. The maximum extent in the Y direction of each curve section is found (not forgetting to consider the end points), and the greatest of these over all of the sections forms the maximum Y extent of the reference curve (for example curve A). For other attachment points, such as point D1 in Figure 23, it will be appropriate to find the minimum value of Y to define the desired extent position, or else to employ a different angle which is rotated through 180° relative to angle θ. The position of the maximum Y point on the former curve is calculated in step S2606 in parametric form, that is to say, as a relative position along one of the curve segments. For convenience, the apparatus permits such relative positions to be specified as a percentage of the linear distance along the curve, which for a cubic curve is non-linearly related to the actual curve parameter. Comparing Figures 23 and 24E, for example, it will be seen that the attachment point C1 initially has a relative position very close to one control point of curve A, on the segment of curve A which extends from that control point to the next. As the reference point P2 is rotated to the position shown in Figure 24E, the attachment point C1 moves along the curve A towards the next control point, in accordance with the extent finding method of Figure 26. The method of Figure 26 allows an offset value to be added to the position of the attachment point (S2608), which offset is also defined in terms of the curve parameter. Once the position and any offset have been calculated, the relative position of the point of attachment is adjusted (S2610) to the desired position. Finally, the reference curve (curve A, for example) is rotated (S2612) by angle θ, returning it to its proper orientation. Of course other methods of defining and finding the desired attachment point are possible.
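The quadratic-derivative step (S2606) of the extent finding method might be sketched in Python as follows for a single cubic segment, the curve having first been rotated so that the chosen direction lies along Y; the Bernstein-form coefficients are a detail assumed here rather than taken from the application:

    import math

    def max_y_parameter(y0, y1, y2, y3):
        # Maximum Y on one cubic Bezier segment whose Y control values are
        # y0..y3: solve the quadratic derivative a*t^2 + b*t + c = 0 and also
        # consider the end points, returning (t, y) of the maximum.
        a = 3.0 * (-y0 + 3.0 * y1 - 3.0 * y2 + y3)
        b = 6.0 * (y0 - 2.0 * y1 + y2)
        c = 3.0 * (y1 - y0)
        candidates = [0.0, 1.0]
        if abs(a) < 1e-12:
            if abs(b) > 1e-12:
                candidates.append(-c / b)
        else:
            disc = b * b - 4.0 * a * c
            if disc >= 0.0:
                root = math.sqrt(disc)
                candidates += [(-b + root) / (2.0 * a), (-b - root) / (2.0 * a)]

        def y_at(t):
            u = 1.0 - t
            return u*u*u*y0 + 3*u*u*t*y1 + 3*u*t*t*y2 + t*t*t*y3

        best = max((t for t in candidates if 0.0 <= t <= 1.0), key=y_at)
        return best, y_at(best)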
Tangent finding
Figure 27 illustrates a situation which corresponds broadly to that of Figure 23, except that the sizes of the reference curves A and B are very different. Again, an imaginary line 2700 is drawn joining the reference points P1 and P2 to which the curves A
and B are respectively attached. A line 2702 joins the points of maximum extent on curves A and B, in a direction perpendicular to line 2700. This line shows that, in certain cases, the extent finding mechanism described above will not give an ideal path for the points of attachment of curves attached to the curves A and B. In particular, unless it begins in a convex manner, a curve joining the two points of maximum extent will cross a part of the reference curve B, as shown by the line 2702.
A better solution, but one which is more difficult to implement, involves finding a straight line 2704 which is tangential to both curves A and B. In the region of the smaller curve A, the difference between line 2704 and line 2702 is not visible to the eye. In the region of the larger curve B, however, it can clearly be seen that the tangent line 2704 meets curve B at a different point from the line 2702 of maximum extent. Figure 28 illustrates one possible method for adjusting points of attachment to the preferred tangential position, in cases where visible errors are apparent.
In Figure 28, each curve to be adjusted is identified in turn by the apparatus (S2800), including for example the curve C in Figure 23. Initially (S2802) the end points of the curve (C1, C2) are adjusted to the positions of maximum extent, using for example the method of Figure 26, applied to each end of curve C in turn. These extent-adjusted positions are then used as a basis for an iterative procedure, whereby the attached points C1, C2 are gradually adjusted towards the desired tangentially-adjusted points.

As a first step (S2804), a first one of these two points (C1 for example) is selected, and a line drawn through this point which is tangential to the other reference curve (curve B in this example). The point C2 where the curve C is presently attached to curve B is then adjusted (S2806) to the position where this tangent line touches curve B. The same procedure is then performed from the other end of curve C as follows. A line is found (S2808) which is tangential to curve A, and passes through the attachment point C2 on curve B, as adjusted in step S2806. The position of attachment point C1 on curve A is then adjusted (S2810) to the point where the second tangent line touches curve A. The steps S2804 to S2810 are repeated (S2812) until the adjustment each time becomes smaller than a predetermined amount. For example, the iteration process should normally be stopped when the adjustment at each iteration becomes smaller than the resolution of the final displayed image.
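The tangency condition underlying steps S2804 to S2810 might be approximated numerically as in the following Python sketch for one cubic segment; the coarse sampling is an assumption standing in for whatever root finding a production implementation would use:

    import math

    def tangent_parameter(curve_pts, external, samples=200):
        # Find the parameter t on a cubic Bezier segment (curve_pts is a list
        # of four (x, y) control points) at which the line from `external` to
        # the curve is approximately tangential, i.e. where the chord
        # direction is parallel to the curve tangent.
        def point(t):
            u = 1.0 - t
            b = (u*u*u, 3*u*u*t, 3*u*t*t, t*t*t)
            return (sum(w * p[0] for w, p in zip(b, curve_pts)),
                    sum(w * p[1] for w, p in zip(b, curve_pts)))

        def deriv(t):
            d = [(3.0 * (curve_pts[i+1][0] - curve_pts[i][0]),
                  3.0 * (curve_pts[i+1][1] - curve_pts[i][1])) for i in range(3)]
            u = 1.0 - t
            b = (u*u, 2*u*t, t*t)
            return (sum(w * p[0] for w, p in zip(b, d)),
                    sum(w * p[1] for w, p in zip(b, d)))

        best_t, best_err = 0.0, float("inf")
        for i in range(samples + 1):
            t = i / samples
            px, py = point(t)
            dx, dy = deriv(t)
            cx, cy = px - external[0], py - external[1]
            norm = math.hypot(cx, cy) * math.hypot(dx, dy) + 1e-12
            err = abs(cx * dy - cy * dx) / norm   # sine of angle between chord and tangent
            if err < best_err:
                best_t, best_err = t, err
        return best_t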
An alternative to tangent finding, which works well when the reference curves are fairly regular in shape, is to move the attachment point manually to approximately the ideal tangential position. Using this manual position as an offset in combination with extent finding (S2608) gives in many cases a very good approximation to the ideal result.
Character construction
The above described features of the apparatus, permitting curves to be attached to reference points and/or to other curves, greatly facilitate the construction of cartoon characters and the subsequent editing of image frames to define an animated sequence. In a typical process of constructing a character, a first step is to form a skeleton of reference points in a hierarchical arrangement, and then at each reference point to attach a curve to act as a reference curve. To define the character outline, curves are attached to the reference curves as desired. To include surface details, curves can be attached to the outline curves, but it is often more convenient in practice to add extra constructional curves attached to the former curves, and then to add the curves defining features of detail to these constructional curves.
An example of this process is shown in Figures 29A to 29C in which a generally sausage shaped outline is regarded somewhat as a cylinder, whose ends are formed by elliptical reference curves 2902 and 2904. Outline curves 2906 and 2908 are attached to the curves 2902 and 2904 with an attribute of extent finding or tangent finding, while the curves 2902 and 2904 are attached to respective reference points 2900 and 2901. A "chevron" or diamond shape pattern 2910 is desired to appear on a side of the character, and this is defined as a diamond shaped curve attached to a constructional curve 2912. The constructional curve 2912 is attached at each end to one of the reference curves 2902 and 2904. The points of attachment of the constructional curve to the reference curves, and the points of attachment of the feature curve 2910 to the constructional curve 2912, can be adjusted by the user independently of the outline curves, in order to achieve a wide variety of desirable effects.
Figure 29B shows that the constructional line 2912 can be adjusted to a position where the surface feature 2910 would project outside the outline of the character. The apparatus can be configured to recognise this situation, and to "matte" the feature against the outline curves. This results in the shape shown at 2910' being displayed, with the portion outlined in dotted lines not being painted in the final picture.

Other aspects and embodiments of the invention
Many variations to the above embodiments will be apparent from the foregoing. In particular, the features of each of our earlier applications WO92/09965, WO92/09966, WO92/21095, WO92/21096, GB 2256118 and GB 2258790 may be combined with those of the above embodiments.
Although two dimensional animation has been described above, the extension of the above techniques to three dimensions using a three dimensional reference point hierarchy is also possible, for example using techniques described in WO 92/09965.
In the situations where control points can be attached to plural reference points, rather than editing the positions of selected curve control points, it would be possible to define an additional transformation which effected a rotation of θ/2 on the selected points; this could be arranged to achieve an equivalent effect.

Use of a single scaling parameter for scaling the whole of a curve or curves attached to a reference point has been described above. However, it will be apparent that more complex scaling (for example, applying different scaling along different axes) could equally be provided, as could scaling to provide a perspective effect.
In the apparatus as described above, the interpolation paths have been shown variously as being straight, circular or (in foreshortened rotation mode) elliptical. It should be recognised of course that arbitrary trajectories can be defined for the reference points between key frames. One method of doing this is to specify a Bezier curve section using control points for end positions and tangents. Instead of interpolating the values R, θ1, etc. directly, the interpolator then interpolates a distance along the Bezier curve to obtain the in-between positions of the reference points.

Whilst the above described apparatus provides an interpolation process, in which a plurality of frames are derived by interpolating positional data between first and second key frames, it will equally be apparent that some aspects of the present invention are applicable also to extrapolation; for example, a single key frame could be defined, and a sequence of extrapolated frames created by specifying a progressive rotation or translation of a reference point from the original key frame.
The present invention is therefore not limited to the above described embodiments, but is intended to encompass any and all variations thereto, whether or not included within the scope of the following claims. Protection is sought for any novel matter, or combinations of matter, contained herein.

Claims
1. Apparatus for generating an animated sequence of pictures, which comprises: means for storing data defining one or more pictures, said data defining one or more lines which, when displayed, define a said picture; means for reading said stored data and for generating therefrom a sequence of further pictures; means for editing said stored data so as to amend said picture; and means for storing local reference system data; in which the line defining data defines the position of portions of the line relative to a first local reference system, and in which the means for storing line data is arranged to store data relating to the position of a portion of a line so as to allow said portion to be influenced by a second local reference system.
2. Apparatus according to claim 1, in which the stored data defining said first local reference system defines said first local reference system relative to a second said local reference system, so as to define a hierarchical relationship between said local reference systems .
3. Apparatus according to claim 1 in which said store means is arranged to store said local reference system as data defining a spatial coordinate transformation matrix.
4. Apparatus according to claim 1 in which said editing means is arranged to edit the data defining said local reference system without amending the line data of at least a portion of a line defined in relation thereto.
5. Apparatus according to claim 1, in which said line defining data comprises a plurality of control point position data.
6. Apparatus according to claim 5, wherein each of the control point data comprises position data defining a point on said line and data defining at least one tangent thereto at that point.
7. Apparatus according to claim 1, further comprising display means for displaying at least one said stored picture represented by said stored line data.
8. Apparatus according to claim 7, wherein the editing means comprises position-sensitive input means manually operable to cause said apparatus to amend said picture data so as to change the display on said display means, permitting interactive editing.
9. Apparatus according to claim 1, in which the editing means is arranged, on amending the local reference system data for a line, to amend the data defining the line so as to ameliorate distortion of the shape thereof.
10. Apparatus according to claim 9, in which, upon rotation of a said local reference system through an angle relative to the further local reference system relative to which a line is defined by the editing means, the editing means is arranged to rotate inflecting portions of the curve in the same sense to a lesser degree.
11. Apparatus according to claim 10, in which the inflecting portions of the curve are rotated by approximately half the angle through which the local reference system is rotated.
12. Apparatus according to claim 9, in which, when the displacement between two reference points by which a curve is defined is reduced by the editing means so as to reduce the distance between points on the curve, the editing means is arranged to edit the line defining data so as to reduce the length of the line between those points.
13. Apparatus according to claim 9, in which the editing means is arranged to amend the data defining the positions of portions of a line which are defined relative to the further reference system, in dependence upon the extent of amendments to the first reference system in relation to which other portions of the line are defined.
14. Apparatus according to claim 3, in which the transformation data comprises first data defining a rotation of the reference system; and second data defining a translation of the reference system.
15. Apparatus according to claim 14 in which the first data defines a rotation of the local reference system about a point within another local reference system, and there is provided third transformation data defining a rotation of the local reference system about its origin.
16. Apparatus according to claim 1, in which the line defining data is arranged to define a continuous line enclosing a single space, and the editing means is arranged to change the orientation of a first portion of the line in relation to, and independently of, a second portion of the line, and is further arranged to do so without substantially changing the topology of the line.
17. Apparatus according to claim 1 wherein at least said first local reference system includes data defining a reference curve, and wherein said line defining data defines the position of at least a portion of the line by reference to a parametric position on the defined curve.
18. Apparatus according to claim 17 wherein said means for storing line data is arranged to maximise an extent of the parametric position in a direction determined by reference to said second local reference system.
19. Apparatus according to claim 17 wherein said second local reference system includes data defining a second reference curve, and wherein said means for storing line data is arranged to identify at least approximately a line tangential to both reference curves.
20. Character animation apparatus for generating an animated sequence of pictures, which comprises means for storing data defining at least a first picture, and means for generating therefrom a plurality of incrementally displaced secondary pictures defining a motion sequence, in which the picture defining data comprises data defining a hierarchy of local coordinate reference systems, defined by their spatial relationship with reference systems higher in the hierarchy, and data defining a plurality of portions of an object, the spatial position of said portions being defined in said local reference systems, in which said secondary picture generating means is arranged to generate said secondary pictures so as to include progressive rotations of a first local reference system about its superior local reference system in the hierarchy, with progressive counter-rotations of inferior local reference systems in the hierarchy, so as to maintain the angular orientation between said superior and inferior reference systems during rotation of said first local reference system.
21. Apparatus according to claim 20, arranged to store data defining first and second key frame pictures, said secondary picture generating means operating to interpolate picture data including local reference system orientation data between said first and second key frames, in which there are provided editing means for creating said second key frame by editing data corresponding to said first key frame, said editing means being arranged, upon a rotation of the first local reference system relative to said superior reference system, to store data defining said rotation and data defining a counter-rotation affecting inferior local reference systems in the hierarchy.
22. Apparatus according to claim 21, in which the data defining each local reference system comprises first and second rotation data, for defining respectively a rotation about said superior reference system and a rotation within the local reference system.
23. Character animation apparatus for generating an animated sequence of pictures, which comprises: means for storing picture data defining at least one picture of a character in a first position; and secondary image generation means arranged to generate a plurality of secondary pictures from said picture data, said secondary pictures being progressively modified so as to form an animated sequence; characterised in that said secondary image generating means is arranged to simulate the effect of a rotation of a portion of said character out of the plane of said picture in a third dimension, by generating said secondary pictures so as to be progressively mutually rotated in the plane of said picture.
24. Apparatus according to claim 23, wherein said secondary picture generating means is arranged to generate said secondary pictures so that the locus, in successive said secondary pictures, of a portion of said character travels a curved path in the plane of said picture which is flatter than a circular arc.
25. Apparatus according to claim 24, wherein said curved arc approximates an elliptical arc.
26. Apparatus according to claim 24, further comprising means for defining the degree of flattening of said curved arc.
27. Apparatus according to claim 23 arranged to store picture data defining first and second key frame images, and in which said secondary image generation means comprises interpolation means for generating said sequence of secondary pictures by interpolation of picture data between said first and second key frame pictures.
28. Apparatus according to claim 27, in which data defining the desired extent of representation of movement in said third dimension is stored in connection with picture data defining one of said key frames.
29. A method of generating an animated sequence of images, which method comprises the steps of: storing, in a digital storage device, data defining one or more pictures, said data defining one or more lines which, when displayed, define a said picture; and local reference system data; the line defining data defining the position of portions of the line relative to a first local reference system; and storing data relating to the position of a portion of a line in relation to a first local reference system so as to allow said portion to be influenced by a second local reference system; editing said stored data so as to amend said picture; reading said stored data and generating therefrom data defining a plurality of secondary pictures; converting said data into a plurality of video images; and creating a motion picture signal or motion picture recording conveying said succession of video images in visible or machine-readable form.
30. A motion picture signal or recording created by a method according to claim 29.
31. A method of generating an animated sequence of images, which method comprises the steps of: storing, in a digital storage device, data defining at least a first picture, said picture defining data comprising data defining a hierarchy of local coordinate reference systems, defined by their spatial relationship with reference systems higher in the hierarchy, and data defining a plurality of portions of an object, the spatial position of said portions being defined in said local reference systems; generating therefrom a plurality of incrementally displaced secondary pictures defining a motion sequence, by generating progressive rotations of a first local reference system about its superior local reference system in the hierarchy, with progressive counter-rotations of inferior local reference systems in the hierarchy, so as to maintain the angular orientation between said superior and inferior reference systems during rotation of said first local reference system; converting said secondary image data into a plurality of video images; and creating a motion picture signal or motion picture recording conveying said succession of video images in visible or machine-readable form.
32. A motion picture signal or recording created by a method according to claim 31.
33. A method of generating an animated sequence of images, which method comprises the steps of: storing, in a digital storage device, picture data defining at least one picture of a character in a first position; and generating a plurality of secondary images from said picture data, said secondary pictures being progressively displaced so as to form an animated sequence, said secondary pictures being generated so as to be progressively mutually rotated in the plane of said picture so as to simulate the effect of a rotation of a portion of said character out of the plane of said picture in a third dimension; converting said secondary image data into a plurality of video images; and creating a motion picture signal or motion picture recording conveying said succession of video images in visible or machine-readable form.
34. A motion picture signal or recording created by a method according to claim 33.
35. A method of generating an animated sequence of pictures which comprises the use of apparatus according to any one of claims 1 to 28.
36. A motion picture signal or recording created by a method according to claim 35.
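The hierarchical counter-rotation arrangement of claims 20, 21 and 31 can be pictured with a short sketch. The Python class below is illustrative only (the Frame class, angle field and rotate_with_counter_rotation function are not part of the claimed apparatus): each local reference system stores an angle relative to its superior system, and rotating a system applies the opposite rotation to its inferior systems so that their net orientation is unchanged.

```python
class Frame:
    """A local coordinate reference system defined relative to a superior (parent) frame."""

    def __init__(self, name, parent=None, angle=0.0):
        self.name = name
        self.parent = parent      # superior frame in the hierarchy (None for the root)
        self.angle = angle        # rotation relative to the superior frame, in radians
        self.children = []        # inferior frames
        if parent is not None:
            parent.children.append(self)

    def absolute_angle(self):
        """Orientation accumulated from the root down to this frame."""
        total, frame = 0.0, self
        while frame is not None:
            total += frame.angle
            frame = frame.parent
        return total


def rotate_with_counter_rotation(frame, delta):
    """Rotate `frame` about its superior frame by `delta` radians, counter-rotating
    its immediate inferior frames so their absolute orientation is preserved."""
    frame.angle += delta
    for child in frame.children:
        child.angle -= delta      # counter-rotation cancels the superior's rotation


# Example hierarchy: applying a small delta to the elbow in each secondary picture
# swings the forearm while the wrist keeps its original absolute orientation.
shoulder = Frame("shoulder")
elbow = Frame("elbow", parent=shoulder)
wrist = Frame("wrist", parent=elbow)
rotate_with_counter_rotation(elbow, 0.1)
```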
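Claims 22 and 27 describe storing two rotation values per local reference system and interpolating orientation data between key frames. A minimal sketch, assuming simple linear interpolation and hypothetical field names orbit (rotation about the superior system) and twist (rotation within the local system):

```python
from dataclasses import dataclass


@dataclass
class FrameKey:
    """Orientation data for one local reference system in one key frame."""
    orbit: float   # first rotation data: rotation about the superior reference system
    twist: float   # second rotation data: rotation within the local reference system


def interpolate(key_a: FrameKey, key_b: FrameKey, t: float) -> FrameKey:
    """Interpolate orientation data between two key frames; t runs from 0 to 1."""
    return FrameKey(
        orbit=key_a.orbit + t * (key_b.orbit - key_a.orbit),
        twist=key_a.twist + t * (key_b.twist - key_a.twist),
    )


# Ten secondary pictures between two key frames:
secondary = [interpolate(FrameKey(0.0, 0.0), FrameKey(1.2, -0.3), i / 9) for i in range(10)]
```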
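Claims 23 to 26 and 33 describe simulating a rotation out of the picture plane by moving a portion of the character along a path flatter than a circular arc, approximately elliptical, with a controllable degree of flattening. The sketch below is one plausible reading rather than the claimed implementation: the flattening parameter scales one axis of an otherwise circular locus, so a value of 1.0 gives a circular arc and smaller values give a flatter, more foreshortened path. The function name and parameterisation are illustrative only.

```python
import math


def flattened_arc_point(pivot, radius, theta, flattening):
    """Locus of a character portion in one secondary picture.

    The in-plane rotation by `theta` about `pivot` is scaled by `flattening`
    along one axis, so that over successive pictures the portion follows an
    elliptical arc rather than a circular one, suggesting movement in a
    third dimension.
    """
    px, py = pivot
    return (px + radius * math.cos(theta),
            py + flattening * radius * math.sin(theta))


# Twelve secondary pictures of a simulated quarter-turn out of the picture plane,
# with a flattening factor of 0.4:
path = [flattened_arc_point((0.0, 0.0), 100.0, i * (math.pi / 2) / 11, 0.4)
        for i in range(12)]
```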
PCT/GB1994/000631 1993-04-05 1994-03-25 Animation WO1994023392A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP94912003A EP0694191A1 (en) 1993-04-05 1994-03-25 Animation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB9307107.4 1993-04-05
GB9307107A GB2277856A (en) 1993-04-05 1993-04-05 Computer generating animated sequence of pictures

Publications (1)

Publication Number Publication Date
WO1994023392A1 true WO1994023392A1 (en) 1994-10-13

Family

ID=10733377

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1994/000631 WO1994023392A1 (en) 1993-04-05 1994-03-25 Animation

Country Status (3)

Country Link
EP (1) EP0694191A1 (en)
GB (1) GB2277856A (en)
WO (1) WO1994023392A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6317130B1 (en) * 1996-10-31 2001-11-13 Konami Co., Ltd. Apparatus and method for generating skeleton-based dynamic picture images as well as medium storing therein program for generation of such picture images

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9517115D0 (en) * 1995-08-21 1995-10-25 Philips Electronics Uk Ltd Animation control apparatus

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4600919A (en) * 1982-08-03 1986-07-15 New York Institute Of Technology Three dimensional animation
WO1989009458A1 (en) * 1988-03-22 1989-10-05 Strandberg Oerjan Method and device for computerized animation
GB2245807A (en) * 1990-06-28 1992-01-08 Rank Cintel Ltd Editing of object-based animated computer graphics

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4952051A (en) * 1988-09-27 1990-08-28 Lovell Douglas C Method and apparatus for producing animated drawings and in-between drawings
JPH06503663A (en) * 1990-11-30 1994-04-21 Cambridge Animation Systems Limited Video creation device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4600919A (en) * 1982-08-03 1986-07-15 New York Institute Of Technology Three dimensional animation
US4600919B1 (en) * 1982-08-03 1992-09-15 New York Inst Techn
WO1989009458A1 (en) * 1988-03-22 1989-10-05 Strandberg Oerjan Method and device for computerized animation
GB2245807A (en) * 1990-06-28 1992-01-08 Rank Cintel Ltd Editing of object-based animated computer graphics

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
AGUI E.A.: "A COMPUTER ANIMATION COMPOSED OF CONNECTED ANIMATION PRIMITIVES", SYSTEMS, COMPUTERS, CONTROLS, vol. 12, no. 2, March 1981 (1981-03-01), WASHINGTON US, pages 45 - 54 *
KIMOTO AND YASUDA: "A METHOD OF FRAME REPRESENTATION OF MOVING OBJECTS FOR KNOWLEDGE-BASED CODING", SYSTEMS & COMPUTERS IN JAPAN, vol. 21, no. 7, 1990, NEW YORK US, pages 63 - 74, XP000172928 *
MACIEJEWSKI: "SAM-ANIMATION SOFTWARE FOR SIMULATING ARTICULATED MOTION", COMPUTERS AND GRAPHICS., vol. 9, no. 4, 1985, OXFORD GB, pages 383 - 391 *

Also Published As

Publication number Publication date
EP0694191A1 (en) 1996-01-31
GB9307107D0 (en) 1993-05-26
GB2277856A (en) 1994-11-09

Similar Documents

Publication Publication Date Title
US5692117A (en) Method and apparatus for producing animated drawings and in-between drawings
EP0950988B1 (en) Three-Dimensional image generating apparatus
Schmidt et al. Shapeshop: Sketch-based solid modeling with blobtrees
US6208360B1 (en) Method and apparatus for graffiti animation
Burtnyk et al. Interactive skeleton techniques for enhancing motion dynamics in key frame animation
US5619628A (en) 3-Dimensional animation generating apparatus
EP2043049B1 (en) Facial animation using motion capture data
US7307633B2 (en) Statistical dynamic collisions method and apparatus utilizing skin collision points to create a skin collision response
US5883638A (en) Method and apparatus for creating lifelike digital representations of computer animated objects by providing corrective enveloping
US20070035547A1 (en) Statistical dynamic modeling method and apparatus
US20070132763A1 (en) Method for creating 3-D curved suface by using corresponding curves in a plurality of images
JPH06507743A (en) Image synthesis and processing
Gortler Foundations of 3D computer graphics
Di Fiore et al. Automatic in-betweening in computer assisted animation by exploiting 2.5 D modelling techniques
JPH07109604B2 (en) Method and computer aided design system for placing dimensions and tolerances on a three-dimensional object
US7259764B2 (en) Defrobulated angles for character joint representation
US7333112B2 (en) Rig baking
US8358311B1 (en) Interpolation between model poses using inverse kinematics
Martín et al. Observer dependent deformations in illustration
WO1994023392A1 (en) Animation
US8228335B1 (en) Snapsheet animation visualization
Magnenat-Thalmann et al. Construction and Animation of a Synthetic Actress.
Li et al. Animating cartoon faces by multi‐view drawings
JPH06215105A (en) Three-dimensional picture processor and method for the same
Melikhov et al. Frame skeleton based auto-inbetweening in computer assisted cel animation

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): GB JP US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 1994912003

Country of ref document: EP

ENP Entry into the national phase

Ref country code: US

Ref document number: 1995 532782

Date of ref document: 19951121

Kind code of ref document: A

Format of ref document f/p: F

WWP Wipo information: published in national office

Ref document number: 1994912003

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1994912003

Country of ref document: EP