GB2277856A - Computer generating animated sequence of pictures - Google Patents


Publication number
GB2277856A
GB2277856A (application GB9307107A)
Authority
GB
United Kingdom
Prior art keywords
data
reference system
defining
local reference
picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB9307107A
Other versions
GB9307107D0 (en)
Inventor
Andrew Louis Charles Berend
Mark Jonathan Williams
Michael John Brocklehurst
Graeme Peter Barnes
Craig Duncan Wareham
Current Assignee
Cambridge Animation Systems Ltd
Original Assignee
Cambridge Animation Systems Ltd
Priority date
Filing date
Publication date
Application filed by Cambridge Animation Systems Ltd filed Critical Cambridge Animation Systems Ltd
Priority to GB9307107A
Publication of GB9307107D0
Priority to EP94912003A (EP0694191A1)
Priority to PCT/GB1994/000631 (WO1994023392A1)
Publication of GB2277856A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G06T13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings

Abstract

Apparatus for generating an animated sequence of pictures, which comprises: means for storing data defining one or more pictures, said data defining one or more lines which, when displayed, define a said picture; means for reading said stored data and for generating therefrom a sequence of further pictures; means for editing said stored data so as to amend said picture; and means for storing local reference system data; in which the line defining data defines the position of portions of the line relative to a first local reference system, and in which the means for storing line data is arranged to store data relating to the position of a portion of a line so as to allow said portion to be influenced by a second local reference system.

Description

ANIMATION

Field of the invention

This invention relates to apparatus for, and a method of, producing a sequence of images defining an animated sequence, such as a cartoon featuring an animated character.
Description of the background art

Traditionally, cartoons are manually drawn as a sequence of frames which, when played in succession at relatively high speed, form a moving picture (typical frame rates are 24, 25, or 30 frames per second, although sometimes frames are repeated twice). Even for a short sequence, many thousands of frames thus need to be drawn by hand, and production of the hand-drawn frames requires large teams of skilled animators and assistants. Almost all cartoon animation today is still produced in this way.
In the production of cartoon animations, typically, there is a "key frame" production stage in which a senior animator draws each character at significant points during the sequence, followed by an "inbetween" stage, in which a more junior animator creates the missing intermediate frames by a process of interpolation by eye between adjacent key frames. After this, the sequence is recorded onto film or video tape and then replayed to check for errors. If necessary, frames are redrawn at this point; otherwise the pencil drawings are inked and painted for subsequent recording.
In view of the sheer volume of drawings required, and of the time and expense involved in producing cartoons by this method, attempts have been made to automate parts of this process.
Our earlier international application WO92/09965 (incorporated herein in its entirety by reference) describes a two dimensional animation system in which key frames are generated as sets of curves defined by sparse curve control points, the positions of which are interpolated in an automatic "inbetween" stage to generate inbetween animated frames; editing means are provided for correcting any parts of the sequence which appear to be unnatural when replayed, typically by inserting a new key frame.
Our earlier application WO92/21095 describes an improved two dimensional animation system in which curves may be attached to other curves, so as to allow the manipulation of several curves simultaneously. Thus, curves indicating minor detail can be attached to curves indicating bold outlines, and can be edited therewith.
Another known graphics system, which was proposed to be used for animation, is described in "Automatic Curve Fitting with Quadratic B-Spline Functions and its Applications to Computer Assisted Animation", Yang et al, Computer Vision, Graphics and Image Processing 33, No. 3, pp 346-363, March 1986.
We have found that the production of key frames can be time consuming in the apparatus described in the above applications. Accordingly, an object of the present invention is to provide improved animation methods and apparatus as described in WO92/09965, in which whole curves (representing, for example, component parts or limbs of a cartoon character) are defined with reference to a reference point in space, and the curves can be moved or interpolated as a whole by manipulating the reference point. Thus, the manual work required by the animator in creating a new key frame by editing a previous key frame is greatly reduced.
Very preferably, the position of a reference point is itself defined by reference to that of another reference point, so as to create a hierarchy of reference points which can reflect a hierarchical structure of the character. Thus, the reference point defining the curve representing a toe may be defined by reference to that by which the curve representing a foot is defined, and so on. In this case, it is straightforward to move a reference point, and thus cause consequential movement of all the lower reference points in the hierarchy (positions of which are defined by reference to the point which has moved), and hence the curves which are defined by reference to those reference points.
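The hierarchical positioning described above can be sketched as follows (a minimal Python illustration; the class and field names are assumptions, not taken from the embodiment): each reference point stores its position as an offset from its parent, so moving a point high in the hierarchy implicitly moves every reference point, and hence every curve, below it.

```python
class ReferencePoint:
    def __init__(self, offset, parent=None):
        self.offset = offset          # (x, y) relative to the parent
        self.parent = parent          # next reference point up the hierarchy

    def world_position(self):
        # Walk up the hierarchy, accumulating offsets.
        x, y = self.offset
        node = self.parent
        while node is not None:
            x += node.offset[0]
            y += node.offset[1]
            node = node.parent
        return (x, y)

hip = ReferencePoint((100.0, 200.0))
foot = ReferencePoint((0.0, -80.0), parent=hip)
toe = ReferencePoint((10.0, -5.0), parent=foot)

print(toe.world_position())   # (110.0, 115.0)
hip.offset = (120.0, 200.0)   # move the top of the hierarchy...
print(toe.world_position())   # ...and the toe follows: (130.0, 115.0)
```

This is only the positional part of the scheme; the embodiment also carries orientation and scale per reference point, as described later.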
Although this invention of itself leads to improvements in the efficiency of animation, under some circumstances it can be inefficient, and can even give rise to further problems.
One problem occurs when two different curves, forming part of the same character, are attached (i.e. defined with reference) to different reference points. For example, one curve may represent the lower arm and be defined with reference to a reference point corresponding to the wrist joint, whereas a second may represent the upper arm and be defined with reference to a reference point corresponding to the elbow.
Movement of the wrist reference point can cause the two curves to become separated, and in a complex figure this can present a confusing image for the animator to edit if the separation occurs during an interpolated sequence.
Accordingly, in a preferred embodiment, the portions of a curve attached to a first reference point which are adjacent the next reference point up in the hierarchy are positioned, when the curve is edited or interpolated, by taking account of the higher reference point in the hierarchy. Thus, when the curve is moved by moving its reference point, the link with the curve defined by the reference point higher in the hierarchy is maintained.
In one embodiment, this is achieved by providing that the portions of the curve (represented, for example, by curve control points) adjacent the reference point higher in the hierarchy are, in terms of their position, defined by the higher reference point, the rest of the curve being defined positionally by the lower reference point in the hierarchy.
This embodiment provides a simple mechanism for maintaining the linkage between two curves forming part of the same character. However, difficulties can arise under some circumstances because the shape of the curve may be distorted in a way which can appear confusing to the animator, especially where a complex figure is represented.
Accordingly, in a further embodiment, the positions of some portions of the curve (for example represented by curve control points defining the curve) may, after the curve is edited, be set in dependence jointly upon the positions of two reference points, between which the curve lies.
For example, where a reference point is rotated through an angle about another reference point higher in the hierarchy, the positions of intermediate curve control points may be rotated through half the angle. Or, where a reference point is moved towards one higher in the hierarchy so as to compress the curve portions between the reference points, the distance between curve control points lying between the two reference points may likewise be shortened in proportion.
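The half-angle rule just described might be sketched as follows (Python; all names are hypothetical, and the rule is applied here only to the control points lying between the two reference points):

```python
import math

def rotate_about(point, centre, angle):
    """Rotate a 2-D point about a centre by `angle` radians."""
    dx, dy = point[0] - centre[0], point[1] - centre[1]
    c, s = math.cos(angle), math.sin(angle)
    return (centre[0] + dx * c - dy * s, centre[1] + dx * s + dy * c)

def reposition_intermediate(points, centre, angle):
    # When the lower reference point rotates through `angle` about the
    # higher one, intermediate control points rotate through half of it.
    return [rotate_about(p, centre, angle / 2.0) for p in points]

elbow = (0.0, 0.0)                    # higher reference point
between = [(10.0, 0.0)]               # control point between elbow and wrist
print(reposition_intermediate(between, elbow, math.pi / 2))  # rotated 45 degrees
```

The proportional shortening case would follow the same pattern, scaling each intermediate point's distance from the higher reference point rather than its angle.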
Thus, particularly confusing effects, in which what was originally a smooth curve lying in a single loop is distorted and caused to cross over itself, or to include sharp inflections, are ameliorated or avoided, and so the topology and/or general appearance of the curve is usually retained.
A further problem, arising from the limitations of a two dimensional representation of three dimensional characters, can arise as follows. If a character, posing with arm outstretched and thumb uppermost, swings his arm in a vertical plane (i.e. about a horizontal axis) through half a revolution, the thumb will in the final position be lowermost.
However, if the arm is swung in a horizontal plane (i.e. about a vertical axis) the thumb will remain uppermost. If the motion is specified between two key frames, one at each of the start and end positions, a linear interpolation process as described in our earlier application WO92/09965 will interpolate the arm linearly between the two end positions. This will appear generally similar to the rotation in a horizontal plane, but with an inaccurate result in the centre of the interpolation, as the control points for the outer end of the arm will cross over those for the inner end of the arm, so that the arm loses its width in the centre of the swing.
If, according to the present invention, the two key frames were attempted to be specified by simply moving the reference point, the shape of the whole limb would be retained but displaced sideways; this is obviously not the effect generally desired in cartoon animation.
Accordingly, in a further embodiment, there is provided a mode in which movements of a curve representing a component of a character can be specified as rotations, and the intervening interpolated frames correspond to successive degrees of rotation of the component.
This embodiment therefore enables a good representation of the type of motion where a character swings an arm in a vertical plane (i.e. about a horizontal axis). However, it cannot be used, on its own, to represent a motion with any element of rotation about a vertical axis.
Accordingly, in a further embodiment, we provide a means of specifying the depth of a rotation so as to produce elliptical rotations, which provide a more convincing and versatile movement of a character limb (for example).
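The elliptical rotation can be illustrated by the following sketch (Python; the `depth` parameterisation shown is an assumption about how a "shallow" rotation might be expressed, not necessarily the embodiment's actual formulation):

```python
import math

def elliptical_position(radius, angle, depth):
    """Position of a limb end-point swung through `angle` radians, where
    `depth` in [0, 1] flattens the circular path into an ellipse:
    depth=1 gives a full circle, depth=0 a purely horizontal sweep."""
    return (radius * math.cos(angle), radius * depth * math.sin(angle))

# Half a revolution at a shallow depth: the end-point crosses to the
# other side while barely leaving the horizontal axis.
print(elliptical_position(100.0, math.pi, 0.2))
```

Interpolated frames would sample `angle` at successive values between the two key positions, tracing the elliptical path.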
A remaining difficulty with this embodiment, however, is that (to use the above illustration) even with the shallowest rotation depth, the positions of appendages (for example the thumb mentioned above) are reflected vertically, whereas with a rotation about a substantially vertical axis this should not occur. Accordingly, in a preferred embodiment, during editing and interpolation a counter rotation of reference points lower in the hierarchy than that undergoing rotation is applied, so as to avoid this reflection where desired.
Other aspects and preferred embodiments of the invention are as described or claimed hereafter.
The invention will now be illustrated, by way of example only, with reference to the accompanying drawings in which: Figure 1 shows schematically a block diagram of apparatus according to one embodiment of the invention; Figure 2A illustrates the contents of a memory of apparatus of this embodiment to represent a curve displayed on a display shown in Figure 2B; Figure 3 is a block diagram schematically illustrating the operation of this embodiment in generating a display; Figure 4 is a block diagram illustrating the functional elements of this embodiment; Figure 5 is a flow diagram schematically illustrating a sequence of operations undertaken by a user of the apparatus of the above embodiment; Figure 6 is a block diagram indicating schematically the manner in which data relating to a display frame is stored within the memory of the above embodiment; Figure 7 is a flow diagram showing schematically the process of generating a curve in the above embodiment; Figure 8 is a flow diagram showing schematically the process of editing a frame in the above embodiment; Figures 9A-9C are displays generated by the above embodiment on a display screen to illustrate the operation of the above embodiment; Figure 10 is a flow diagram showing schematically the process of interpolating to produce intervening frames between two key frames in the above embodiments; Figures 11A-11C are screen displays produced by a second embodiment of the invention; Figure 12 shows in greater detail a portion of the flow diagram of Figure 8 when applied to the second embodiment; Figure 13A represents a screen display illustrating two key frames between which data is to be interpolated; Figure 13B illustrates the effect of interpolating between the key frames in the second embodiment described above; Figure 13C illustrates the effect of adding a further key frame in the second embodiment; and Figure 13D illustrates schematically the path over time of the interpolated frames in a third embodiment of the
invention; Figure 14 is a flow diagram showing schematically the operation of the interpolator in the third embodiment; Figure 15 is a screen display illustrating a fourth embodiment of the invention; and Figure 16 illustrates the corresponding arrangement of information in a frame table in the memory of that embodiment; Figure 17 is a flow diagram corresponding to Figure 7, illustrating the generation of a display in this embodiment; Figure 18A is a screen display producible by the fourth embodiment of the invention, and Figure 18B is a corresponding screen display after the frame has been edited according to that embodiment; Figure 19A corresponds to Figure 18A; Figure 19B is a display corresponding to Figure 19A after the component represented therein has been edited according to the fourth embodiment; and Figure 19C illustrates the corresponding display generated by a fifth embodiment of the invention; Figure 20 corresponds to Figure 8 and illustrates the operation of editing a frame in the fifth embodiment; Figure 21 is a flow diagram showing in greater detail a portion of the flow diagram of Figure 20; Figure 22A corresponds to Figure 18A; Figure 22B is a display corresponding to that of Figure 22A after editing of the frame depicted therein according to the fourth or fifth embodiments; and Figure 22C is a corresponding display generated by a sixth embodiment of the invention; Figure 23 is a flow diagram showing in greater detail a portion of Figure 20 according to the sixth embodiment.
General description of system

Full details of aspects of the animation system embodying the present invention are given in our earlier application WO92/09965, incorporated herein in its entirety by reference, and so a detailed recital of the apparatus and some aspects of the method of operation of such an animation system is not necessary. For clarity, however, a brief synopsis will now be given.
Our earlier above referenced application, and "Interactive Computer Graphics" by Burger and Gillies, 1989, Addison-Wesley, ISBN 0-201-17349-1, and "An Introduction to Splines for use in Computer Graphics and Geometric Modelling", by R H Bartels, J C Beatty and B A Barsky, published by Morgan Kaufmann, ISBN 0-934513-27-3 (both incorporated herein by reference), disclose apparatus for editing and displaying smooth curves by defining curve control points dictating the shape of the curves. One example of a class of such curves is B-splines. A particular way of representing such splines is the "Bezier" format, in which the curve is represented by a number of curve control points, the data for each curve control point comprising the coordinates of a point on the curve, together with the coordinates of two tangent end points marking tangents to the curve at that point.

Referring to Figure 1, apparatus according to an embodiment of the present invention (described in our earlier application WO92/09965) comprises a computer 100 comprising a central processing unit 110, a memory device 120 for storing the program sequence for the central processing unit (CPU) 110 and providing working read/write memory, a frame store 130 comprising a series of memory locations each associated with, or mapped to, a point in an image to be generated or processed, and an input/output controller 140 providing input and output ports for reading from and writing to external devices, all intercoupled through common parallel data and address buses 150.
A monitor 160 is connected to the computer 100 and its display updated from the frame store 130 under the control of the CPU 110. At least one user input device 170a, 170b is provided; typically a keyboard 170b for inputting commands or control signals for controlling peripheral operations such as starting, finishing and storing the results of an image generation or image processing operation, and a position sensitive input device (cursor control device) such as, in combination, a stylus and digitising tablet, or a "mouse", or a touch-sensitive screen on the monitor 160, or a "tracker ball" device or a joystick. A cursor symbol is generated by the computer 100 for display on the monitor 160 in dependence upon the signal from the position sensitive input device 170a, to allow a user to inspect an image on the monitor 160 and to designate a point or region of the image during image generation or processing.
A mass storage device 180 such as, for instance, a hard disk device is preferably provided as a long term image store, and preferably the mass storage device 180 also or alternatively comprises a removable medium storage device such as a floppy disk drive, to allow data to be transferred into and out from the computer 100. Also preferably provided is a printer 190, a film recorder 196 and/or a video recorder 197. A picture input device 195 such as a scanner for scanning an image on, for example, a transparency, and inputting a corresponding video signal to the computer 100 may also be provided.
Referring to Figures 2A and 2B, as described more fully in our earlier referenced application, the memory 120 includes a working memory area 121. An image displayed on the monitor 160 includes at least one line A which is drawn as a curve defined by three control points A1, A2, A3, the corresponding image frame data representing the line images being stored within a frame table 122 within the working memory 121, as a series of curves (curve A, curve B, ...) each of which is defined by a series of control points (A1, A2, A3), each represented by the control point position (Xi, Yi) and the positions of the end points of two tangents at that point (Xei, Yei, Xfi, Yfi). In this embodiment, as will be more fully described below, each curve also includes a pointer to a reference point, and the frame table 122 includes data defining the position and orientation of the reference point. The coordinates of the control points of the curve A are defined in the reference frame of the reference point (i.e. as offsets from axes running through the reference point position at specified orientations).
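The Bezier representation described above can be sketched as follows (Python; field names such as `tan_out` are hypothetical). A segment between consecutive control points is a cubic Bezier whose inner control points are the stored tangent end points:

```python
def bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier at parameter t in [0, 1]."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)

# Two control points of a curve: each carries its position plus tangent
# end points (here only the tangents relevant to this segment are shown).
A1 = {"pos": (0.0, 0.0), "tan_out": (10.0, 20.0)}
A2 = {"pos": (30.0, 0.0), "tan_in": (20.0, 20.0)}

# The segment from A1 to A2 uses A1's outgoing and A2's incoming tangents.
mid = bezier(A1["pos"], A1["tan_out"], A2["tan_in"], A2["pos"], 0.5)
print(mid)   # (15.0, 15.0)
```

The line image generator's job of "calculating intervening points along the curve" amounts to evaluating this polynomial at a sequence of t values per segment.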
Referring to Figure 3, and as described in greater detail in our above referenced application, the CPU 110 functionally comprises a line image generator 111, a cursor tracker 112 and a display editor 113.
The line image generator 111 is arranged to read the frame table 122, to calculate from the reference point data and curve control point data the coordinates of intervening points along the curve, and to write the intervening point into the frame display store 130 in such a manner (for example, in a different colour) that they are distinguishable from the background. The memory mapped frame image in the display store 130 is then displayed on the monitor 160.
The cursor tracker 112 reads the coordinates of the position-sensitive input device 170a from the device controller 140, and writes a cursor symbol D at a corresponding position in the frame display store 130 for corresponding display. The display editor 113 responds to the cursor position from the cursor tracker 112 to alter the contents of the frame table 122 (specifically, the reference point position or orientation, or the positions of curve control points and tangent end points). After any such editing of the frame table 122 by the display editor 113, the line image generator 111 amends the contents of the frame display store 130 so that the display on the monitor 160 changes correspondingly.
Referring to Figure 4, the CPU 110 further comprises, functionally, an interpolator 101 (described in greater detail hereafter and in our earlier referenced application) which is arranged to generate sequences of image frames between a pair of spaced image key frames; a replayer 103 arranged to recall a stored image sequence previously created, and generate a display of the sequence as a moving image on the animator's screen 160 (as described in our earlier above referenced application); and a renderer 105 arranged to colour each generated image and/or otherwise affect the way in which the image is represented (as described for example in our earlier applications WO92/09966 and WO92/21096 and UK Application 9211376.0). Each of the components 105, 103, 111, 113, 101 may be provided by a separate processor, or each may be provided by a suitable sequence of execution steps on a common processor.
General description of animation process

As described in our above referenced earlier application, the processes performed by the apparatus embodying the present invention to enable a user to define an animated sequence are:

1. Defining objects to be animated (characters).
2. Defining key frames (i.e. creating image frames spaced apart in time in which the character is posed or shaped in a particular manner).
3. Creating interpolated "inbetween" frames (intervening image frames created between the key frames by the interpolator 101).
4. Displaying and editing (the interpolated sequence is displayed and changed as necessary).
5. Replaying (the sequence is displayed at a normal replay speed).
6. Rendering (the image is coloured and mixed with a background).
7. Film or video recording (each image in turn is recorded in a sequence either in electronic (video) form or by the film recorder as a sequence of colour transparencies, for projection or display).
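Step 3 above, in its simplest linear form (as described in WO92/09965), amounts to interpolating every control-point coordinate between two key frames. A minimal sketch, with hypothetical names:

```python
def inbetween(key_a, key_b, t):
    """Linearly interpolate every control point between two key frames
    (each a list of (x, y) coordinates), for t in [0, 1]."""
    return [(ax + (bx - ax) * t, ay + (by - ay) * t)
            for (ax, ay), (bx, by) in zip(key_a, key_b)]

key1 = [(0.0, 0.0), (10.0, 5.0)]
key2 = [(20.0, 0.0), (30.0, 15.0)]

# Three inbetweens at quarter intervals:
for t in (0.25, 0.5, 0.75):
    print(inbetween(key1, key2, t))
```

In the embodiments below, the interpolated quantities also include reference point angles and distances rather than raw coordinates alone.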
The present invention is particularly concerned with the stages of defining key frames, creating interpolated frames, and editing key frames. The other stages above are as described in our earlier referenced applications in general.
One typical sequence of operations of the present invention is shown in Figure 5. Initially, the user will wish to create a character to animate, and accordingly a "template" frame table defining the topology of the character, or part thereof, is created in the working memory 121.
The next stage is to create a number of key frames.
These may be created by editing the curve control points of the template frame. Another possibility is to scan in an image using the image input device 195, display the scanned image on the monitor 160, simultaneously with the template image, and edit the template image to conform to the scanned image which will previously have been drawn freehand. In either case, when the animator is satisfied with the keyframe, the frame table 122 is then permanently stored.
In the present invention, the template frame table (and the key frame tables derived therefrom) comprise a sequence of reference points as well as a sequence of curve tables each comprising a plurality of curve control points. Each reference point data record in the memory 121, other than the single reference point highest in the hierarchy of points, includes a pointer to the location of the data record of a higher reference point in the hierarchy. Each reference point also includes a pointer to a curve to which it is attached (i.e. a curve the position and orientation of which is defined by the reference point). Each curve table in the working memory 121 likewise contains a pointer to a reference point.
Typically, the user may first cause the creation of the curve table including the template frame comprising a plurality of lines, and then subsequently create the hierarchy of reference points by selecting reference point positions (typically at joints of a character) and designating which reference points are defined by reference to which others. Finally, the user inputs data designating which lines in the template image are to be linked to which reference points (for example, by designating a line and then a reference point in turn using the position-sensitive input device 170a). The editor 113 then derives the relative position of the control points of the curve, in the coordinate space of the reference point, and writes these into the frame table 122 in place of the absolute positions previously present.
In editing and creating key frames, therefore, the animator amends the position and orientation of reference points to produce wholesale variations in all curves the positions of which are defined by the reference points (for example, to rotate or move a limb of a character). To change the shape of components of a character, the animator also edits the curve control points, as discussed in our earlier above referenced application. Because the curve control points are represented, in this invention, by data defined relative to the reference position, in the first embodiment of the present invention, the contents of the curve tables are not altered when the reference points are edited.
For each character to be animated, there will be one reference point which is highest in the hierarchy. Moving this reference point will affect the whole of the character. This accordingly provides a convenient means of editing the character position, orientation or scale. For example, this character reference point may be defined positionally with reference to another reference point on another character, so as to be moved with the other character.
The animator may then cause the interpolator 101 to interpolate between the key frames as created, and may view the interpolated frames on the monitor 160, and make any necessary amendments to the key frames, or add a new key frame. Subsequently, the interpolated sequence generated by the replayer 103 may be viewed, and again any necessary editing of key frames performed. Finally, the interpolated sequence is rendered, and may then be stored as a video sequence of images.
At some points during an animated sequence, it may be desirable to alter the linkage between curves and reference points, or the hierarchy of reference points. The editor 113 is preferably arranged to permit this. As discussed in greater detail below, the interpolation process generally requires a consistent linkage between reference points and curves, and accordingly it may be necessary to break the interpolated sequence at a point where the hierarchy of reference points, or the linkage between curves and reference points, is changed, by providing two successive key frames.
First embodiment

Referring to Figure 6, in this embodiment, as in our earlier application WO92/09965, the frame table 122 of Figure 2A comprises, in greater detail, a list of lines or curves making up a set which represents the object or character which the frame depicts. The lines or curves are provided as a linked list of curve tables 2100, 2200, 2300, 2400; each curve table comprising a list of curve control points 2110, 2120, 2130 etc. Each control point field 2110 comprises position data defining the control point coordinates, relative to a reference point, and position data defining the control point tangent end coordinates relative to the reference point. Attribute control points, as discussed in WO92/09965, may also be present.
Also provided within the frame table 122 is a network of reference point fields 7100, 7200, 7300 ... Each reference point field comprises data defining a first angle θ1 and a distance R which comprise, in polar coordinates, the position of the reference point relative to a reference point higher in the hierarchical network; an angle θ2 defining a rotation of the axes about the reference point relative to those of the other reference points; optionally, a scale factor S for scaling the axes; a pointer to the location in the memory 120 of the other reference point relative to which the reference point is defined (i.e. the next reference point up in the hierarchy); and a pointer to the location of any curve tables 2100 the positions of which are defined relative to that reference point 7100.
The fields storing the two angles θ1, θ2; the distance R; and the scale factor S are typically separate transformation matrices, allowing the rotation, translation and scaling transformations to be separately effected.
The distance R is, in fact, a transformation matrix defining a two dimensional translation; when used, as described hereafter, to define a distance R, the translation is purely one dimensional.
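The separate transformations might be sketched as 3x3 homogeneous matrices as follows (Python; the composition order shown is an assumption consistent with the description above, not a statement of the embodiment's exact convention):

```python
import math

def rotation(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def translation(tx, ty=0.0):
    # The stored "distance R" is a translation matrix used one-dimensionally.
    return [[1.0, 0.0, tx], [0.0, 1.0, ty], [0.0, 0.0, 1.0]]

def scaling(s):
    return [[s, 0.0, 0.0], [0.0, s, 0.0], [0.0, 0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def local_transform(theta1, r, theta2, s):
    # theta1 and r place the reference point, in polar coordinates,
    # relative to its parent; theta2 rotates its own axes; s scales them.
    m = matmul(rotation(theta1), translation(r))
    m = matmul(m, rotation(theta2))
    return matmul(m, scaling(s))

def apply(m, p):
    x, y = p
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])

# A point at the reference origin ends up a distance R from the parent,
# rotated through theta1: here, 10 units straight "up".
print(apply(local_transform(math.pi / 2, 10.0, 0.0, 1.0), (0.0, 0.0)))
```

Keeping the matrices separate, as the embodiment does, lets the editor alter one component (say θ1) without recomputing the others.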
In this embodiment, the line generator 111 is arranged to generate, for each reference point table 7100, 7200, 7300 ..., the cumulative transformation comprising the two rotations, translation and scale transformations for that reference point multiplied together, and multiplied by the cumulative transformation matrices for all higher reference points in the hierarchy (i.e. the reference point which is pointed to by the pointer field, and all its predecessors).
Next, the spatial coordinates of each control point of each curve are derived by multiplying the stored coordinates in each control point table 2100 by the cumulative transformation matrix of the reference point to which the curve is linked (i.e. the reference point pointed to by the curve). Having derived the transformed control point positions, the line generator 111 then generates a line image as described in the above referenced PCT application.
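The cumulative transformation described in the preceding two paragraphs can be sketched as follows (Python; the dictionary-based structure and names are hypothetical stand-ins for the reference point tables):

```python
def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def cumulative(ref, transforms, parents):
    """Walk up the hierarchy, multiplying each reference point's local
    3x3 transform; higher points are applied first (outermost)."""
    m = transforms[ref]
    while parents[ref] is not None:
        ref = parents[ref]
        m = matmul(transforms[ref], m)
    return m

# Two reference points: 'shoulder' translates by (5, 0), 'elbow' by (3, 0).
T = {"shoulder": [[1, 0, 5], [0, 1, 0], [0, 0, 1]],
     "elbow":    [[1, 0, 3], [0, 1, 0], [0, 0, 1]]}
P = {"shoulder": None, "elbow": "shoulder"}

m = cumulative("elbow", T, P)
x, y = 1.0, 2.0                          # control point stored in elbow space
print((m[0][0]*x + m[0][1]*y + m[0][2],  # -> (9.0, 2.0) in frame space
       m[1][0]*x + m[1][1]*y + m[1][2]))
```

In practice each cumulative matrix would be computed once per reference point and reused for every curve linked to it.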
Referring to Figure 8, in this embodiment, to edit a reference point position the animator indicates, using the position-sensitive input device 170a, a command to edit a selected reference point, and then inputs a command signal to select an editing mode from command options which comprise a rotation of the reference point about the next reference point up in the hierarchy; a rotation of the reference point about itself; a change in position of the reference point; and a scaling of the reference point axes (and hence of the curve control point coordinates and thus the size of each of the curves attached to the reference point).
The cursor tracker 112 thereafter is arranged to read the position indicated by the position-sensitive input device 170a, and, in dependence upon the position, to alter the data in the reference point table 7100 for the selected reference point.
It may occasionally be desired to translate a reference point position whilst leaving all other aspects of the reference point unchanged. In this instance, if the display cursor D is moved by the position-sensitive input device 170a to a position X, Y relative to the next highest reference point in the hierarchy, those values of X and Y are used to calculate the new components of the translation matrix, which in this instance therefore defines a two dimensional translation by X, Y. We have found, however, that it is generally preferable to provide reference point motions as a combination of a rotation and a translation.
Thus, if the display cursor D is moved by the position-sensitive input device 170a to a position X, Y relative to the next highest reference point in the hierarchy, the distance R and angle θ1 are set corresponding, respectively, to √(X² + Y²) and ARCTAN (X/Y). Corresponding transformations are therefore stored in the reference point table 7100.
Likewise, where a pure rotation is selected, the angular position of the cursor relative to a reference point, or the preceding reference point in the hierarchy (depending on the mode selected), is calculated as ARCTAN (X/Y) and the transformation defining the angle θ1 or θ2 is correspondingly altered. Likewise, changing the scale (by, for example, typing a new value from the keyboard 170b) causes the editor 113 to write a new scale factor into the reference point table 7100.
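The Cartesian-to-polar conversion above can be sketched as follows; this is an illustration only, and `atan2` is used in place of the literal ARCTAN (X/Y) of the text so that the quadrant is resolved correctly and the case Y = 0 is handled without a division by zero:

```python
import math

def cursor_to_polar(x, y):
    """Convert a cursor offset (x, y), measured from the parent reference
    point, into the stored distance R and angle theta1.  The angle is
    measured from the y axis, matching ARCTAN (X/Y) in the text."""
    r = math.hypot(x, y)       # sqrt(x**2 + y**2)
    theta1 = math.atan2(x, y)  # ARCTAN (X/Y) with correct quadrant
    return r, theta1
```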
The display editor 113 having thus amended the frame table 122, the line image generator 111 is then arranged to draw the amended frame as discussed above, to allow the user interactively to edit the frame until he is content with the appearance of the frame.
Referring to Figure 9, Figure 9A illustrates the effect of rotating a reference point P1 about its own axis (although the same effect could be achieved in this case by rotating the reference point P2 about the reference point P1); Figure 9B illustrates the effects of rotating the reference point P2 about its own axis; and Figure 9C shows the effects of rotating the reference point P3 about its own axis. It will be seen that rotating a reference point automatically results in the rotation of all curves attached to that reference point and to reference points lower in the hierarchy (i.e. which point to that reference point, or to others which point to that reference point).
Referring to Figure 10, to interpolate a plurality of inbetween frame tables from two key frame tables in this embodiment, a number N of new frame tables 211 are created within the memory 120, one corresponding to each desired interpolated frame. Then, to perform linear interpolation, an interpolation factor L is set equal to i/N (where i is a frame counting index), and each of θ1, θ2, R and S is interpolated by setting R = LR1 + (1-L)R2; S = LS1 + (1-L)S2; etc. (where R1, S1 etc. are the parameters of a reference point of the first key frame and R2, S2 etc. are the corresponding parameters of the corresponding reference point of the second key frame). For non-linear interpolation, a function of L (e.g. a cosine function) may be used.
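By way of a non-authoritative sketch, the interpolation of the reference point parameters might look as follows; the blending convention R = L·R1 + (1-L)·R2 with L = i/N is taken exactly as stated in the text, and the parameter dictionary and `ease` argument are assumptions made for the example:

```python
import math

def interpolate_params(p1, p2, i, n, ease=None):
    """Blend reference point parameters (theta1, theta2, R, S) between two
    key frames, using the convention R = L*R1 + (1-L)*R2 with L = i/N.
    `ease` optionally maps L through a non-linear function."""
    L = i / n
    if ease is not None:
        L = ease(L)
    return {k: L * p1[k] + (1.0 - L) * p2[k] for k in p1}

def cosine_ease(L):
    """An example non-linear interpolation function of L."""
    return 0.5 * (1.0 - math.cos(math.pi * L))
```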
Second embodiment In cartoon animation, it is often required to move a component of a character (for example an arm) to simulate the effect of swinging about a vertical axis, past a sight line to the character. Because this type of movement has a component towards the viewer, it is purely three dimensional and is therefore often difficult to deal with in a two dimensional animation system.
If, as in the above described embodiment, it is attempted to emulate this movement by shifting the reference point position, the shape of the component is unchanged (rather than being reflected as it passes the view line to the component) and consequently the motion is extremely unconvincing. Although the final shape could be edited, the correspondence between curve control points would in many cases be lost and so interpolation would be more difficult.
On the other hand, if the rotation about the vertical axis is achieved by a rotation about a horizontal axis, then as shown in Figure 11B, any other curves attached via reference points will likewise be rotated, which is undesirable and unnatural-looking.
Accordingly, in this embodiment, the apparatus is arranged to provide a mode in which a rotation about a vertical axis is simulated by rotating a reference point through an angle θ1 about the next reference point above it in the hierarchy, and then counter-rotating it by an angle θ2 (= -θ1) about itself. It will be found that the same effect could be achieved instead by counter-rotating each of the reference points below the reference point in question about it by θ2, but this is less convenient. The effect of the rotation followed by counter-rotation is shown in Figure 11C.
Referring to Figure 12, in this embodiment, when the mode input by the user in Figure 8 is the counter-rotate mode according to this embodiment, the offset and rotation R, θ1 are derived as in the first embodiment, from the X and Y cursor position indicated by the position-sensitive input device 170a. Then, the self rotation value θ2 is changed by an amount corresponding to θ1, in the opposite sense. The amended frame is then redrawn by the line image generator 111.
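A minimal sketch of the counter-rotate edit, under the same assumed field names as before (this is an illustration, not the disclosed implementation):

```python
import math

def counter_rotate(point, x, y):
    """Counter-rotate mode: derive R and theta1 from the cursor offset as
    in the first embodiment, then change the self-rotation theta2 by the
    same amount in the opposite sense, so that curves attached lower in
    the hierarchy keep their angular orientation."""
    old_theta1 = point["theta1"]
    point["R"] = math.hypot(x, y)
    point["theta1"] = math.atan2(x, y)
    delta = point["theta1"] - old_theta1
    point["theta2"] -= delta  # equal and opposite self-rotation
    return point
```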
Third embodiment Referring to Figure 13, Figure 13A(1) indicates a first key frame displayed upon the display unit 160, and Figure 13A(2) indicates a second key frame, generated from the first by the second embodiment above by rotating reference point P1 about reference point P0.
In Figure 13B, the five interpolated frames produced by the interpolator 101 by linear interpolation as described above are indicated. It will be seen that the resulting rotation is fully in the plane of the display device 160, and accordingly although the second key frame (2) is correctly positioned to have been produced by a rotation out of the plane of the display device 160 (i.e. around a "vertical" axis), the interpolation does not achieve this effect.
It might be thought that the problem could be overcome by converting the third interpolant frame (3) of Figure 13B into a key frame, as disclosed in our earlier application WO92/09965 (incorporated herein by reference) and reducing the distance R of the reference point from the predecessor. However, as shown in Figure 13C, the effect of this is to change the smooth circular arc path over time shown in Figure 13B into a bi-lobed curve, which is even less satisfactory as a representation of motion through an arc extending out of the plane of the display device 160.
Accordingly, in this embodiment, a predetermined interpolation mode is provided in which interpolation is performed so as to generate a smoothed curve (preferably approximating to an elliptical arc).
In one particular way of achieving this, demonstrated in Figure 13D, the interpolator 101 is arranged to linearly interpolate the angle θ1 as in the above embodiment, and the radius R, so as to produce a series of reference point positions corresponding to those of Figure 13B, and subsequently the reference point position is modified to effect a compression of the vertical axis of the frame, thus foreshortening the circular arc marking the path of the reference point in subsequent frames to form an elliptical arc. As shown in Figure 14, this may be achieved by deriving the (X, Y) position of the reference point, by deriving the cumulative transform for the reference point, and then multiplying the difference in Y coordinate between the reference point and the reference point above it in the hierarchy (about which it is rotated) by a fractional scaling factor, so as to reduce its vertical offset from the axis about which it is rotated during interpolation.
The extent of the desired compression (the fractional scaling factor) is input by the animator, from the keyboard 170b, or position-sensitive input device 170a.
It will be appreciated that, instead of compressing the vertical scale of the reference point, the same effect could be achieved after the positions of the curve control points have been derived by scaling the curve control point positions, although this involves rather more computation. Further, it would be possible to achieve the same effect by non-linear interpolation of the angle θ1 and radius R according to a desired interpolation function.
Thus, as shown in Figure 13D, a smooth foreshortening of the rotated distance is achieved, giving the impression of rotation into or out of the depth of the display 160.
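The foreshortening described above can be sketched as follows; the function name and parameters are illustrative assumptions. The angle is interpolated linearly (as in Figure 13B), and each frame's vertical offset from the pivot is then compressed by the fractional scaling factor k, flattening the circular locus into the elliptical arc of Figure 13D:

```python
import math

def elliptical_locus(cx, cy, r, theta_start, theta_end, n, k):
    """Generate n+1 interpolated reference point positions rotating about
    the pivot (cx, cy) at radius r, with the vertical offset from the
    pivot scaled by the fractional factor k (0 < k < 1)."""
    frames = []
    for i in range(n + 1):
        t = theta_start + (theta_end - theta_start) * i / n
        x = cx + r * math.cos(t)
        y = cy + r * math.sin(t)
        frames.append((x, cy + k * (y - cy)))  # foreshorten the y offset
    return frames
```

With k = 1 the locus is the plain circular arc; smaller values of k give a progressively stronger impression of rotation out of the picture plane.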
Preferably, in this embodiment, at each key frame the table 7100 for each reference point includes, if the reference point is to be rotated with foreshortening, an indication that the above foreshortening interpolation is to be performed.
Fourth embodiment In the above described embodiments, each curve is attached to (in other words defined with reference to) a single reference point. This arrangement works well in many cases. However, we have found that under some circumstances when editing or interpolating frames, adjacent curves defining adjacent portions of a character (together with rendering information associated therewith) can become separated in a manner which is confusing for the animator and consequently requires more work in editing. Accordingly, in this embodiment, portions of a curve may be linked to two reference points.
Referring to Figure 15, a portion of a cartoon character (for example a leg) comprises a first closed curve A defined by reference to a first reference point P1, and a second closed curve B defined by reference to a second reference point P2. The first curve A is defined by four curve control points A1-A4 and the curve B is likewise defined by four curve control points B1-B4.
We have found that, using the apparatus of the present invention, it is common to wish to represent a character by a series of closed curves representing, very roughly, elliptical or sausage-shaped portions of the character. The adjacent portions (for example, upper and lower leg) approach or overlap at a region which, in the human or animal the character is to represent, would be a joint. Thus, commonly, one portion will pivot around the joint relative to the other.
Thus, in the present invention, a reference point is conveniently placed at the overlap region where a joint would be, to facilitate such rotation.
Since the joint is typically near the end of one or both character portions (defined by their outline curves), the curvature of the outline curves is relatively tight and consequently typically there will be several curve control points located near the reference point P1 at the joint.
If the reference point P2 is moved (for example to increase its distance from reference point P1), the two curves A, B can become separated, which reduces their similarity to a single character. Likewise, if the reference point P2 is moved towards the reference point P1, the resemblance to a character is again reduced. These problems can, of course, be overcome by editing as described above and in our previous patent application WO92/09965 (incorporated herein by reference), but such editing can be time consuming.
Accordingly, in this embodiment, data defining the portion of the curve B (represented by the control points B2 and B3) which is closest to the reference point P1 is stored in the memory 120 so as to be linked with data defining that reference point P1, so that when generating new frames by editing or interpolating the position of the curve B, the position of the curve B is dictated both by the reference point P2 and by the reference point P1. Referring to Figure 16, each of the curve tables 2100, 2200 relating to curves A, B includes a pointer field containing a reference to the location in the memory of the corresponding reference point table 7100, 7200. Additionally, each of the control point fields 2210, 2220 ...
contains a field including a pointer to the location of a corresponding reference point table 7100, 7200. Thus, the control points B2, B3 may be linked to a reference point P1 other than the reference point P2 to which the curve B is linked.
In this embodiment, as in the earlier embodiments, when the position of a reference point is changed (for example when creating a new key frame) the new values of the transformations for rotation, shift and scaling (θ1, θ2, R and S) are stored in the reference point table 7100 or 7200 ....
Referring to Figure 17, in this embodiment, to draw the frame, the line image generator 111 is arranged to derive the cumulative transformations for each reference point, as described above, and then to apply these cumulative transformations for each reference point to the curve control points the records of which explicitly point to that reference point. Likewise, the cumulative transformations for each reference point are applied to the curve control points of each curve the record of which points to that reference point, provided that those curve control points do not themselves point to a different reference point.
Thus, in this embodiment, when a reference point such as P2 in Figure 15 is moved, the control points B1 and B4 move with it, and the curve control points B2 and B3 do not. The line generator 111 then joins the curve control points B1-B4 with a smooth curve as described above and in our above referenced PCT application.
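As an illustrative sketch of the drawing step described above (the dictionary layout and key names `ref` and `points` are assumptions for the example), each control point uses its own reference point pointer where present, falling back to the curve's reference point otherwise:

```python
import numpy as np

def transform_control_points(curve, transforms):
    """Apply cumulative transforms to a curve's control points.  A point
    that carries its own 'ref' pointer (like B2, B3) uses that reference
    point's transform; otherwise it falls back to the transform of the
    reference point to which the curve as a whole is linked."""
    out = []
    for cp in curve["points"]:
        ref = cp.get("ref") or curve["ref"]
        m = transforms[ref]  # 3x3 cumulative transformation matrix
        x, y, _ = m @ np.array([cp["x"], cp["y"], 1.0])
        out.append((x, y))
    return out
```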
Referring to Figure 18, Figure 18B indicates the effect which would be obtained if the reference point P2 were displaced from its position in Figure 18A to its position in Figure 18B by merely shifting its axes by an amount X, Y. It will be seen that the effect bears little resemblance to a two dimensional outline of a three dimensional cartoon character, and is rarely of use to the animator. Accordingly, for this reason, in this embodiment it is preferred to represent motion of a reference point P by a rotation wherever possible, to maintain the curvature between B1 and B4 convex relative to the curve segments between B2 and B1 and B3 and B4. Referring to Figure 19, Figure 19B illustrates the effect achieved on the display 160 by rotating the reference point P2 from the position shown in Figure 19A through an angle θ to the position shown in Figure 19B. It will be seen that, whereas the curvature between B1 and B4 is now convex, the curve segments B2-B1 and B3-B4 now cross in an artificial looking manner which is undesirable in cartoon animation.
Fifth embodiment Accordingly, in this embodiment, to avoid this problem the editor 113 is arranged, when moving the reference point P2, also to edit the positions of the curve control points B2, B3 which are attached to (i.e. positionally defined by) the reference point P1 around which the reference point P2 is rotated.
In this embodiment, the two curve control points B2, B3 are each rotated by an angle θ/2 corresponding to half the angle through which the reference point P2 is rotated. As shown in Figure 19C, the effect is that part of a curve can be smoothly rotated by changing the position of the reference point P2 without breaking the outline or producing lines which cross over.
Referring to Figure 20, in general terms, when a component such as that illustrated in Figure 19A is desired to be edited by the animator, the editor 113 is arranged to perform the process shown in Figure 20, which is generally similar to that indicated in Figure 8 except that, after amending the position of a reference point P2, the next reference point up in the hierarchy (in this case, P1) pointed to by that reference point is located in the frame table 122, and the two control points B3, B2 comprising part of the curve attached to a reference point P2 which has been edited, and which are linked by a pointer to the higher reference point P1, are edited by (in this embodiment) rotating each through half the angle through which the reference point P2 has been rotated. This editing stage is shown in greater detail in Figure 21.
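The half-angle edit of Figures 20 and 21 can be sketched as follows; this is an assumption-laden illustration (points are represented as plain (x, y) pairs rather than frame table records):

```python
import math

def rotate_about(px, py, cx, cy, angle):
    """Rotate the point (px, py) about the centre (cx, cy) by `angle`
    radians."""
    dx, dy = px - cx, py - cy
    c, s = math.cos(angle), math.sin(angle)
    return cx + c * dx - s * dy, cy + s * dx + c * dy

def edit_joint(p2, p1, shared_points, theta):
    """Rotate reference point P2 about P1 through theta, and rotate the
    control points attached to P1 (e.g. B2, B3) through theta/2, keeping
    the outline joined as in Figure 19C."""
    p2_new = rotate_about(*p2, *p1, theta)
    shared_new = [rotate_about(x, y, *p1, theta / 2.0)
                  for (x, y) in shared_points]
    return p2_new, shared_new
```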
Referring to Figure 22, a further difficulty can arise when, as shown, a reference point P2 is moved towards a reference point P1 (as shown, from its position in Figure 22A to that in Figure 22B). In Figure 22, the curve control points B5, B6, B7 and B10, and the curve B in general, are positionally defined with reference to the reference point P2, and the curve control points B8 and B9 are defined with reference to the reference point P1. It will be seen, from Figure 22B, that the points B7 and B10 which lie between the two reference points P2 and P1 in Figure 22A now project beyond the reference point P1, causing a buckling in the outline of the curve which is confusing and not generally useful in cartoon animation. A similar problem could have arisen had the points B7 and B10 been attached to the reference point P1.
Sixth embodiment Accordingly, in this embodiment, the editor 113 is arranged, when a reference point is moved such that the distance R from its predecessor in the hierarchy is reduced, to edit the positions of control points so as to reduce their displacement from the reference point with reference to which they are defined by the same ratio as the ratio by which the distance between the two reference points has changed (Rnew/Rold), as shown in Figure 23. The translated control points are then stored once more in the frame table 122, and the amended frame is redrawn as before, to produce the results shown in Figure 22C.
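The ratio-based rescaling above amounts to a uniform scaling of each attached control point's displacement from its reference point; a minimal sketch (function and parameter names are assumptions):

```python
def rescale_control_points(points, ref, r_old, r_new):
    """When the distance between two reference points shrinks from r_old
    to r_new, scale each attached control point's displacement from its
    reference point `ref` = (rx, ry) by the same ratio r_new/r_old,
    avoiding the buckling of Figure 22B."""
    k = r_new / r_old
    rx, ry = ref
    return [(rx + k * (x - rx), ry + k * (y - ry)) for (x, y) in points]
```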
Other aspects and embodiments of the invention Many variations to the above embodiments will be apparent from the foregoing. In particular, the features of each of our earlier applications WO92/09965, WO92/09966, WO92/21095, WO92/21096, GB 2256118, GB 2258790 and UK application 9211376.0 may be combined with those of the above embodiments.
Although two dimensional animation has been described above, the extension of the above techniques to three dimensions using a three dimensional reference point hierarchy is also possible.
In the fourth and fifth embodiments, rather than editing the positions of selected curve control points, it would be possible to define an additional transformation which effected a rotation of θ1/2 on the selected points; this could be arranged to achieve an equivalent effect.
Use of a single scaling parameter for scaling the whole of a curve or curves attached to a reference point has been described above. However, it will be apparent that more complex scaling (for example, applying different scaling along different axes) could equally be provided, as could scaling to provide a perspective effect.
Whilst the above described embodiments have described an interpolation process, in which a plurality of frames are derived by interpolating positional data between first and second key frames, it will equally be apparent that some aspects of the present invention are applicable also to extrapolation; for example, a single key frame could be defined, and a sequence of extrapolated frames created by specifying a progressive rotation or translation of a reference point from the original key frame.
The present invention is therefore not limited to the above described embodiments, but is intended to encompass any and all variations thereto, whether or not included within the scope of the following claims. Protection is sought for any novel matter, or combinations of matter, contained herein.

Claims (30)

Claims:
1. Apparatus for generating an animated sequence of pictures, which comprises: means for storing data defining one or more pictures, said data defining one or more lines which, when displayed, define a said picture; means for reading said stored data and for generating therefrom a sequence of further pictures; means for editing said stored data so as to amend said picture; and means for storing local reference system data; in which the line defining data defines the position of portions of the line relative to a first local reference system, and in which the means for storing line data is arranged to store data relating to the position of a portion of a line so as to allow said portion to be influenced by a second local reference system.
2. Apparatus according to claim 1, in which the stored data defining said first local reference system defines said first local reference system relative to a second said local reference system, so as to define a hierarchical relationship between said local reference systems.
3. Apparatus according to claim 1 or claim 2 in which said store means is arranged to store said local reference system as data defining a spatial coordinate transformation matrix.
4. Apparatus according to any one of claims 1 to 3 in which said editing means is arranged to edit the data defining said local reference system without amending the line data of at least a portion of a line defined in relation thereto.
5. Apparatus according to any preceding claim, in which said line defining data comprises a plurality of control point position data.
6. Apparatus according to claim 5, wherein each of the control point data comprises position data defining a point on said line and data defining at least one tangent thereto at that point.
7. Apparatus according to any preceding claim, further comprising display means for displaying at least one said stored picture represented by said stored line data.
8. Apparatus according to claim 7, wherein the editing means comprises position-sensitive input means manually operable to cause said apparatus to amend said picture data so as to change the display on said display means, permitting interactive editing.
9. Apparatus according to any preceding claim, in which the editing means is arranged, on amending the local reference system data for a line, to amend the data defining the line so as to ameliorate the distortion of the shape thereof.
10. Apparatus according to claim 9, in which, upon rotation of a said local reference system through an angle relative to the further local reference system relative to which a line is defined by the editing means, the editing means is arranged to rotate inflecting portions of the curve in the same sense to a lesser degree.
11. Apparatus according to claim 10, in which the inflecting portions of the curve are rotated by approximately half the angle through which the local reference system is rotated.
12. Apparatus according to claim 9, in which, when the displacement between two reference points by which a curve is defined is reduced by the editing means so as to reduce the distance between points on the curve, the editing means is arranged to edit the line defining data so as to reduce the length of the line between those points.
13. Apparatus according to any of claims 9 to 12, in which the editing means is arranged to amend the data defining the positions of portions of a line which are defined relative to the further reference system, in dependence upon the extent of amendments to the first reference system in relation to which other portions of the line are defined.
14. Apparatus according to claim 3, in which the transformation data comprises first data defining a rotation of the reference system; and second data defining a translation of the reference system.
15. Apparatus according to claim 14 in which the first data defines a rotation of the local reference system about a point within another local reference system, and there is provided third transformation data defining a rotation of the local reference system about its origin.
16. Apparatus according to claim 1, in which the line defining data is arranged to define a continuous line enclosing a single space, and the editing means is arranged to change the orientation of a first portion of the line in relation to, and independently of, a second portion of the line, and is further arranged to do so without substantially changing the topology of the line.
17. Character animation apparatus for generating an animated sequence of pictures, which comprises means for storing data defining at least a first picture, and means for generating therefrom a plurality of incrementally displaced secondary pictures defining a motion sequence, in which the picture defining data comprises data defining a hierarchy of local coordinate reference systems, defined by their spatial relationship with reference systems higher in the hierarchy, and data defining a plurality of portions of an object, the spatial position of said portions being defined in said local reference systems, in which said secondary picture generating means is arranged to generate said secondary pictures so as to include progressive rotations of a first local reference system about its superior local reference system in the hierarchy, with progressive counter-rotations of inferior local reference systems in the hierarchy, so as to maintain the angular orientation between said superior and inferior reference systems during rotation of said first local reference system.
18. Apparatus according to claim 17, arranged to store data defining first and second key frame pictures, said secondary picture generating means operating to interpolate picture data including local reference system orientation data between said first and second key frames, in which there are provided editing means for creating said second key frame by editing data corresponding to said first key frame, said editing means being arranged, upon a rotation of the first local reference system relative to said superior reference system, to store data defining said rotation and data defining a counter-rotation affecting inferior local reference systems in the hierarchy.
19. Apparatus according to claim 18, in which the data defining each local reference system comprises first and second rotation data, for defining respectively a rotation about said superior reference system and a rotation within the local reference system.
20. Character animation apparatus for generating an animated sequence of pictures, which comprises: means for storing picture data defining at least one picture of a character in a first position; and secondary image generation means arranged to generate a plurality of secondary pictures from said picture data, said secondary pictures being progressively displaced so as to form an animated sequence; characterised in that said secondary image generating means is arranged to simulate the effect of a rotation of a portion of said character out of the plane of said picture in a third dimension, by generating said secondary pictures so as to be progressively mutually rotated in the plane of said picture.
21. Apparatus according to claim 20, wherein said secondary picture generating means is arranged to generate said secondary pictures so that the locus, in successive said secondary pictures, of a portion of said character travels a curved arc which is flatter than a circular arc.
22. Apparatus according to claim 21, wherein said curved arc approximates an elliptical arc.
23. Apparatus according to claim 21 or claim 22, further comprising means for defining the extent of flattening of said curved arc.
24. Apparatus according to any of claims 20 to 23 arranged to store picture data defining first and second key frame images, and in which said secondary image generation means comprises interpolation means for generating said sequence of secondary pictures by interpolation of picture data between said first and second key frame pictures.
25. Apparatus according to claim 24, in which data defining the desired extent of representation of movement in said third dimension is stored in connection with picture data defining one of said key frames.
26. Apparatus substantially as herein described with reference to any of the accompanying drawings.
27. A method of generating an animated sequence of images, which method comprises the steps of: storing, in a digital storage device, data defining one or more pictures, said data defining one or more lines which, when displayed, define a said picture; and local reference system data; the line defining data defining the position of portions of the line relative to a first local reference system; and storing data relating to the position of a portion of a line in relation to a first local reference system so as to allow said portion to be influenced by a second local reference system; editing said stored data so as to amend said picture; reading said stored data and generating therefrom data defining a plurality of secondary pictures; converting said data into a plurality of video images; and creating a film strip or video tape embodying said succession of video images.
28. A method of generating an animated sequence of images, which method comprises the steps of: storing, in a digital storage device, data defining at least a first picture, said picture defining data comprising data defining a hierarchy of local coordinate reference systems, defined by their spatial relationship with reference systems higher in the hierarchy, and data defining a plurality of portions of an object, the spatial position of said portions being defined in said local reference systems; generating therefrom a plurality of incrementally displaced secondary pictures defining a motion sequence, by generating progressive rotations of a first local reference system about its superior local reference system in the hierarchy, with progressive counter-rotations of inferior local reference systems in the hierarchy, so as to maintain the angular orientation between said superior and inferior reference systems during rotation of said first local reference system; converting said secondary image data into a plurality of video images; and creating a film strip or video tape embodying said succession of video images.
29. A method of generating an animated sequence of images, which method comprises the steps of: storing, in a digital storage device, picture data defining at least one picture of a character in a first position; and generating a plurality of secondary images from said picture data, said secondary pictures being progressively displaced so as to form an animated sequence, said secondary pictures being generated so as to be progressively mutually rotated in the plane of said picture so as to simulate the effect of a rotation of a portion of said character out of the plane of said picture in a third dimension; converting said secondary image data into a plurality of video images; and creating a film strip or video tape embodying said succession of video images.
30. A method of generating an animated sequence of pictures which comprises the use of apparatus according to any one of claims 1 to 26.
GB9307107A 1993-04-05 1993-04-05 Computer generating animated sequence of pictures Withdrawn GB2277856A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
GB9307107A GB2277856A (en) 1993-04-05 1993-04-05 Computer generating animated sequence of pictures
EP94912003A EP0694191A1 (en) 1993-04-05 1994-03-25 Animation
PCT/GB1994/000631 WO1994023392A1 (en) 1993-04-05 1994-03-25 Animation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB9307107A GB2277856A (en) 1993-04-05 1993-04-05 Computer generating animated sequence of pictures

Publications (2)

Publication Number Publication Date
GB9307107D0 GB9307107D0 (en) 1993-05-26
GB2277856A true GB2277856A (en) 1994-11-09

Family

ID=10733377

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9307107A Withdrawn GB2277856A (en) 1993-04-05 1993-04-05 Computer generating animated sequence of pictures

Country Status (3)

Country Link
EP (1) EP0694191A1 (en)
GB (1) GB2277856A (en)
WO (1) WO1994023392A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10188028A (en) * 1996-10-31 1998-07-21 Konami Co Ltd Animation image generating device by skeleton, method for generating the animation image and medium storing program for generating the animation image

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4952051A (en) * 1988-09-27 1990-08-28 Lovell Douglas C Method and apparatus for producing animated drawings and in-between drawings
WO1992009965A1 (en) * 1990-11-30 1992-06-11 Cambridge Animation Systems Limited Animation

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4600919A (en) * 1982-08-03 1986-07-15 New York Institute Of Technology Three dimensional animation
SE8801043D0 (en) * 1988-03-22 1988-03-22 Orjan Strandberg GeniMator
GB2245807A (en) * 1990-06-28 1992-01-08 Rank Cintel Ltd Editing of object-based animated computer graphics

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997007483A1 (en) * 1995-08-21 1997-02-27 Philips Electronics N.V. Animation control apparatus
AU704512B2 (en) * 1995-08-21 1999-04-22 Koninklijke Philips Electronics N.V. Animation control apparatus
CN1114177C (en) * 1995-08-21 2003-07-09 皇家菲利浦电子有限公司 Animation control appts.

Also Published As

Publication number Publication date
WO1994023392A1 (en) 1994-10-13
GB9307107D0 (en) 1993-05-26
EP0694191A1 (en) 1996-01-31

Similar Documents

Publication Publication Date Title
US5692117A (en) Method and apparatus for producing animated drawings and in-between drawings
US5619628A (en) 3-Dimensional animation generating apparatus
EP0950988B1 (en) Three-Dimensional image generating apparatus
Burtnyk et al. Interactive skeleton techniques for enhancing motion dynamics in key frame animation
JPH06507742A (en) Video creation device
US4600919A (en) Three dimensional animation
US5883638A (en) Method and apparatus for creating lifelike digital representations of computer animated objects by providing corrective enveloping
Gomez Twixt: A 3d animation system
US20050231510A1 (en) Shape morphing control and manipulation
US7259764B2 (en) Defrobulated angles for character joint representation
EP1031946B1 (en) Recording medium,Image processing method and unit with integrated shaping model data
Martín et al. Observer dependent deformations in illustration
GB2258790A (en) Animation
US10319133B1 (en) Posing animation hierarchies with dynamic posing roots
GB2277856A (en) Computer generating animated sequence of pictures
US8228335B1 (en) Snapsheet animation visualization
JP6062589B1 (en) Program, information processing apparatus, influence derivation method, image generation method, and recording medium
JP3002972B2 (en) 3D image processing device
JP2003346181A (en) Animation image generator
JP3361437B2 (en) 3D CG animation creation apparatus and creation method
Melikhov et al. Frame skeleton based auto-inbetweening in computer assisted cel animation
CN113805532B (en) Method and terminal for manufacturing physical robot actions
JP2001109901A (en) Animation generation device, animation generation method and computer-readable recording medium recording animation generation program
Tsang et al. Animated surface pasting
JPH11296698A (en) Three-dimensional model creation device

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)