AU765544B2 - Object based animations with timelines - Google Patents


Info

Publication number
AU765544B2
AU765544B2, AU16717/01A, AU1671701A
Authority
AU
Australia
Prior art keywords
attribute values
attribute
label
time line
events
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU16717/01A
Other versions
AU1671701A (en)
Inventor
Sean Matthew Doherty
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AUPQ5887A (AUPQ588700A0)
Application filed by Canon Inc filed Critical Canon Inc
Priority to AU16717/01A
Publication of AU1671701A
Application granted
Publication of AU765544B2
Anticipated expiration
Legal status: Ceased


Landscapes

  • Processing Or Creating Images (AREA)

Description

S&F Ref: 535258
AUSTRALIA
PATENTS ACT 1990 COMPLETE SPECIFICATION FOR A STANDARD PATENT
ORIGINAL
Name and Address of Applicant: Canon Kabushiki Kaisha, 30-2, Shimomaruko 3-chome, Ohta-ku, Tokyo 146, Japan

Actual Inventor(s): Sean Matthew Doherty

Address for Service: Spruson & Ferguson, St Martins Tower, Level 31 Market Street, Sydney NSW 2000

Invention Title: Object Based Animations with Timelines

ASSOCIATED PROVISIONAL APPLICATION DETAILS: [33] Country: AU; [31] Applic. No(s): PQ5887; [32] Application Date: 28 Feb 2000

The following statement is a full description of this invention, including the best method of performing it known to me/us:

OBJECT BASED ANIMATIONS WITH TIMELINES

Technical Field of the Invention

The present invention relates generally to image processing and, in particular, to systems for authoring object based animations.
Background Art

Animation is based on a principle of human vision: if a series of related still images is viewed successively over a short period, the images are perceived by a viewer as continuous motion. Each individual still image is generally referred to as a frame.
Conventionally, the main drawback in generating an animation sequence has been the effort required by an animator to produce the large number of frames involved.
A short period of animation (e.g. one minute) can require between 600 and 1800 separate image frames, depending on the quality of the animation. Thus, generating images manually is very labour intensive. The amount of labour required was the catalyst behind the development of the technique known as "keyframing".
The majority of frames of an animation sequence involve routine incremental changes from a previous frame directed toward a defined endpoint. Conventional animators, such as Walt Disney Studios, realised that they could increase the productivity of their master artists by having them draw only the important frames, called "keyframes". Lower skilled assistants could then determine and produce the frames in between the keyframes. The in-between frames were generally referred to as "tweens".
Once all of the keyframes and tweens had been draughted, the images had to be inked or rendered to produce the final images. Even today, production of a traditional animation sequence requires many artists in order to produce the thousands of images required.
More recently it has become popular to create complex 3D models, photorealistic still images, and film quality animation through the utilisation of a computer system having a high-resolution graphics display. The graphic image animation production industry is undergoing rapid development, and complex and sophisticated image production tools, such as 3D Studio Max™, are often utilised for modelling, animating and rendering scenes.
Image production tools generally provide an integrated modelling environment, such as the user interface 100 of Fig. 1, where a user can perform 2D drawing, 3D modelling and animation within a unified workspace. Generally, an animator creates an animated scene by firstly modifying standard objects, such as 3D geometry and 2D shapes, by adjusting the object's attributes (e.g. colour, transparency, scale, position, rotation). Secondly, the keyframes that record the beginning and end of the animated sequence are created by combining one or more of the modified objects. Thirdly, the animator uses the image production tools to automatically interpolate object attribute values and scene values (e.g. bend angle, taper amount) between each of the keyframes to produce the completed animation sequence, which is generally stored as a "canned" animation. Shading and rendering of each frame is also generally carried out by the image production tool.
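Keyframe interpolation of the kind just described can be illustrated with a minimal sketch (Python; the function name, the linear interpolation and the attribute names are illustrative assumptions, not features of any particular tool):

```python
def interpolate_keyframes(key_a, key_b, num_tweens):
    """Linearly interpolate attribute values between two keyframes.

    key_a, key_b: dicts mapping attribute names (e.g. 'x', 'opacity')
    to numeric values at the two keyframes.
    num_tweens: number of in-between frames to generate.
    """
    frames = []
    for i in range(1, num_tweens + 1):
        t = i / (num_tweens + 1)      # fraction of the way from key_a to key_b
        frame = {name: (1 - t) * key_a[name] + t * key_b[name] for name in key_a}
        frames.append(frame)
    return frames

# Example: tween a ball's x position and opacity over three in-between frames.
print(interpolate_keyframes({'x': 0.0, 'opacity': 1.0},
                            {'x': 10.0, 'opacity': 0.5}, 3))
```

Real tools use richer interpolation (splines, eased curves) than this linear form, but the principle of deriving tweens from keyframe endpoints is the same.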
Traditional animation methods, and early image production tools, are restricted to the concept of producing animations frame by frame, which is sufficient if the animator always works in a single format or does not need to specify an animated effect at a precise time. However, most animation comes in either a film based 24 frames per second (FPS) format or a video based 30 FPS format.
The requirement for time-based animation as opposed to frame-based animation is ever increasing as the uses of animation become more varied. Some image production tools use time based animation where animation values for an animation sequence are stored in fractions of a second. These time based animation image production tools generally include a time control such as the time slider 101 seen in the user interface 100 of Fig. 1. The time slider 101 can be used to move to any point in time in an animation sequence and to play back a segment of the canned animation using a viewing window 103.
The disadvantage of the prior art image production tools, such as 3D Studio Max™, is that they can only produce the canned animation sequences discussed above.
Disclosure of the Invention

It is an object of the present invention to substantially overcome, or at least ameliorate, one or more of the deficiencies of the above mentioned animation method.
According to one aspect of the present invention there is provided a method of generating an animation sequence having a plurality of frames, said method comprising the steps of: displaying at least one of a plurality of attribute values associated with each said frame of said animation sequence as a time line, at least one of said attribute values having one or more associated events; labelling said time line with at least one event label, said label corresponding to at least one of said associated events; and generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value, in response to said at least one associated event occurring.
According to another aspect of the present invention there is provided a method of generating an animation sequence having a plurality of frames, said method comprising the steps of: interpolating between at least one attribute value associated with at least one of said frames and at least one desired attribute value to produce a plurality of attribute values; displaying said plurality of attribute values as at least one time line, at least one of said attribute values having one or more associated events; labelling said time line with an event label, said label corresponding to at least one of said associated events; and generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value, in response to said at least one associated event occurring.
According to still another aspect of the present invention there is provided a method of generating an animation sequence, said method comprising the steps of: selecting at least one of a plurality of objects having at least one of a first plurality of attribute values; modifying at least one of said first plurality of attribute values to produce a modified object having modified attribute values; interpolating between said modified attribute values and at least one desired attribute value to produce a second plurality of attribute values; displaying said second plurality of attribute values as at least one time line, at least one of said attribute values having one or more associated events; labelling said time line with an event label, said label corresponding to at least one of said associated events; and generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value in response to said at least one associated event occurring.
According to still another aspect of the present invention there is provided an apparatus for generating an animation sequence having a plurality of frames, said apparatus comprising: display means for displaying at least one of a plurality of attribute values associated with each said frame of said animation sequence as a time line, at least one of said attribute values having one or more associated events; labelling means for labelling said time line with at least one event label, said label corresponding to at least one of said associated events; and processor means for generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value, in response to said at least one associated event occurring.
According to still another aspect of the present invention there is provided an apparatus for generating an animation sequence having a plurality of frames, said apparatus comprising: interpolation means for interpolating between at least one attribute value associated with at least one of said frames and at least one desired attribute value to produce a plurality of attribute values; display means for displaying said plurality of attribute values as at least one time line, at least one of said attribute values having one or more associated events; labelling means for labelling said time line with an event label, said label corresponding to at least one of said associated events; and processor means for generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value, in response to said at least one associated event occurring.
According to still another aspect of the present invention there is provided an apparatus for generating an animation sequence, said apparatus comprising: selection means for selecting at least one of a plurality of objects having at least one of a first plurality of attribute values; modification means for modifying at least one of said first plurality of attribute values to produce a modified object having modified attribute values; interpolation means for interpolating between said modified attribute values and at least one desired attribute value to produce a second plurality of attribute values; display means for displaying said second plurality of attribute values as at least one time line, at least one of said attribute values having one or more associated events; labelling means for labelling said time line with an event label, said label corresponding to at least one of said associated events; and processor means for generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value in response to said at least one associated event occurring.
According to still another aspect of the present invention there is provided a computer readable medium, having a program recorded thereon, where the program is configured to make a computer execute a procedure to generate an animation sequence having a plurality of frames, said program comprising: code for displaying at least one of a plurality of attribute values associated with each said frame of said animation sequence as a time line, at least one of said attribute values having one or more associated events; code for labelling said time line with at least one event label, said label corresponding to at least one of said associated events; and code for generating said animation sequence based on said event label using said time line, wherein said sequence advances to said label and alters said at least one attribute value, in response to said at least one associated event occurring.
According to still another aspect of the present invention there is provided a computer readable medium, having a program recorded thereon, where the program is configured to make a computer execute a procedure to generate an animation sequence having a plurality of frames, said program comprising: code for interpolating between at least one attribute value associated with at least one of said frames and at least one desired attribute value to produce a plurality of attribute values; code for displaying said plurality of attribute values as at least one time line, at least one of said attribute values having one or more associated events; code for labelling said time line with an event label, said label corresponding to at least one of said associated events; and code for generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value, in response to said at least one associated event occurring.
According to still another aspect of the present invention there is provided a computer readable medium, having a program recorded thereon, where the program is configured to make a computer execute a procedure to generate an animation sequence having a plurality of frames, said program comprising: code for selecting at least one of a plurality of objects having at least one of a first plurality of attribute values; code for modifying at least one of said first plurality of attribute values to produce a modified object having modified attribute values; code for interpolating between said modified attribute values and at least one desired attribute value to produce a second plurality of attribute values; code for displaying said second plurality of attribute values as at least one time line, at least one of said attribute values having one or more associated events; code for labelling said time line with an event label, said label corresponding to at least one of said associated events; and code for generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value, in response to said at least one associated event occurring.
Brief Description of the Drawings

A preferred embodiment of the present invention will now be described with reference to the drawings, in which:

Fig. 1 shows a user interface for a prior art image production tool;
Fig. 2 is a flow diagram of a method of generating an animation sequence in accordance with the preferred embodiment;
Fig. 3 is a schematic block diagram of a general purpose computer upon which the preferred embodiment of the present invention can be practiced;
Fig. 4 shows an example of a graphics object (gob) tree used in accordance with the preferred embodiment;
Fig. 5 shows a user interface that can be used to perform the method of Fig. 2;
Fig. 6 shows transition formulae and profiles used in accordance with the preferred embodiment;
Fig. 7 shows the user interface of Fig. 5 displaying attributes as time lines used in accordance with the preferred embodiment; and
Fig. 8 is a flowchart showing the method of updating gobs of a gob tree which represents a frame in an animation sequence, during the rendering of the animation sequence, in accordance with the preferred embodiment.
Detailed Description including Best Mode

Where reference is made in any one or more of the accompanying drawings to steps and/or features which have the same reference numerals, those steps and/or features have for the purposes of this description the same function(s) or operation(s), unless the contrary intention appears.
The preferred embodiment is a method of generating an animation sequence whereby the animation sequence can be altered in response to certain events to which the animation sequence may be exposed while the sequence is running. In accordance with the preferred embodiment, each object of a frame comprises attributes described using a time line. Preferably, each time line can include one or more event labels positioned at any point on the time line. Further, each attribute can preferably have a list of events associated with the attribute, which the attribute can respond to while an animation sequence is running. For example, an animation sequence may jump to a certain position on an associated attribute time line if a mouse is moved. The preferred embodiment allows an animation sequence to loop back and replay a segment of an animation sequence, or to change an animation sequence while the sequence is running.
Fig. 2 is a flow diagram showing the method of generating an animation sequence in accordance with the preferred embodiment. The method of generating an animation sequence is preferably practiced using a conventional general-purpose computer system 300, such as that shown in Fig. 3, whereby the process of Fig. 2 can be implemented as software, such as an application program executing within the computer system 300. In particular, the steps of the method of generating an animation sequence are effected by instructions in the software that are carried out by the computer. The software may be divided into two separate parts; one part for carrying out the preferred method; and another part to manage the user interface between the latter and the user.
The software can be stored in a computer readable medium, including the storage devices described below, for example. The software is loaded into the computer from the computer readable medium, and then executed by the computer. A computer readable medium having such software or computer program recorded on it is a computer program product. The use of the computer program product in the computer preferably effects an advantageous apparatus for generating an animation sequence in accordance with the embodiments of the invention.
The computer system 300 comprises a computer module 301, input devices such as a keyboard 302 and mouse 303, output devices including a printer 315 and a display device 314. A Modulator-Demodulator (Modem) transceiver device 316 is used by the computer module 301 for communicating to and from a communications network 320, for example connectable via a telephone line 321 or other functional medium. The modem 316 can be used to obtain access to the Internet, and other network systems, such as a Local Area Network (LAN) or a Wide Area Network (WAN).
The computer module 301 typically includes at least one processor unit 305, a memory unit 306, for example formed from semiconductor random access memory (RAM) and read only memory (ROM), input/output interfaces including a video interface 307, and an I/O interface 313 for the keyboard 302 and mouse 303 and optionally a joystick (not illustrated), and an interface 308 for the modem 316. A storage device 309 is provided and typically includes a hard disk drive 310 and a floppy disk drive 311. A magnetic tape drive (not illustrated) may also be used. A CD-ROM drive 312 is typically provided as a non-volatile source of data. The components 305 to 313 of the computer module 301 typically communicate via an interconnected bus 304 and in a manner which results in a conventional mode of operation of the computer system 300 known to those in the relevant art. Examples of computers on which the embodiments can be practised include IBM-PCs and compatibles, Sun Sparcstations or alike computer systems evolved therefrom.
Typically, the application program of the preferred embodiment is resident on the hard disk drive 310 and read and controlled in its execution by the processor 305.
Intermediate storage of the program and any data fetched from the network 320 may be accomplished using the semiconductor memory 306, possibly in concert with the hard disk drive 310. In some instances, the application program may be supplied to the user encoded on a CD-ROM or floppy disk and read via the corresponding drive 312 or 311, or alternatively may be read by the user from the network 320 via the modem device 316.
Still further, the software can also be loaded into the computer system 300 from other computer readable media including magnetic tape, a ROM or integrated circuit, a magneto-optical disk, a radio or infra-red transmission channel between the computer module 301 and another device, a computer readable card such as a PCMCIA card, and the Internet and Intranets including email transmissions and information recorded on websites and the like. The foregoing is merely exemplary of relevant computer readable media. Other computer readable media may be used without departing from the scope and spirit of the invention.
The method of generating an animation sequence can alternatively be implemented in dedicated hardware such as one or more integrated circuits performing the functions or sub functions of Fig. 3. Such dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories.

The method of generating an animation sequence, in accordance with the preferred embodiment, can now be described with reference to the flow diagram of Fig. 2, where the method is performed using the computer system 300. The process begins at step 201, where an animator selects an object from a library of objects, preferably stored on the hard disk drive 310, to be added to keyframes of the animation sequence. At the next step 203, the animator modifies one or more attributes of the object in order to create a desired effect. The process continues at step 205, where the object is added to the keyframes. Also at step 205, the animator can adjust scene values (e.g. bend angle, taper amount) to create the keyframes. At the next step 207, if the animator wants to add more objects to the keyframes, then the process returns to step 201. Otherwise, the process continues to step 209, where, in accordance with the preferred embodiment, object attribute values and scene values are automatically interpolated between each of the keyframes to create the frames for the animation sequence. The process continues at step 211, where each of the object attribute values and scene values is displayed as a time line. An example of attribute values displayed as time lines is shown in Fig. 5, at reference 514.
At the next step 213, a list of events, which the animation sequence can respond to, is generated. Examples of events that can be used in accordance with the preferred embodiment include animation start, animation finish, mouse click, mouse move, presence detected, screen saver start, and new text. The event driven animation sequence will be explained in further detail later in this document.
At the next step 215, each of the events is associated with an event identifier.
For example, the event "Mouse Click" can be associated with the event identifier "ID_ANIMATION_START", such that when the mouse 303 is clicked, the created animation sequence is rendered. The process continues at the next step 217, where each of the event identifiers can be associated with an event label. For example, the event identifier ID_ANIMATION_START can be associated with the event label "START".
At the next step 219, the event labels can be associated with a point on the attribute time lines. As seen in Fig. 5, the time line axis 523 includes the event label 'START' 517.
The event identifiers, labels and associated events will be explained in further detail later in this document. The process concludes at step 221, where the created animation sequence, including the associated event identifiers, labels and events, is stored or rendered. The animation sequence is preferably stored on the hard disk drive 310.
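A minimal sketch of the associations built in steps 213 to 219 follows (Python; the dictionary layout, the mouse_click name and the two-second label position are assumptions made purely for illustration, not a format defined by the preferred embodiment):

```python
# Map raw events to event identifiers (step 215).
event_to_id = {
    'mouse_click': 'ID_ANIMATION_START',
}

# Map event identifiers to event labels (step 217).
id_to_label = {
    'ID_ANIMATION_START': 'START',
}

# Associate event labels with points (in seconds) on an attribute time line (step 219).
label_to_time = {
    'START': 2.0,   # assume the 'START' label sits two seconds into the time line
}

def time_for_event(event):
    """Return the time line label and position the animation should jump to for an event."""
    label = id_to_label[event_to_id[event]]
    return label, label_to_time[label]

print(time_for_event('mouse_click'))   # ('START', 2.0)
```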
In accordance with the preferred embodiment, each frame has a graphic object tree (i.e. a gob tree). The preferred embodiment paints on the display by appropriately updating the gob tree. An example of a gob tree 400 is shown in Fig. 4. The gob tree 400 comprises any number of gobs 401, each representing an object. The application program preferably defines a sequencing function, which updates a gob tree associated with a frame, on a regular basis to create the animation sequence. Once the preferred application program is made aware that there is a sequencing function associated, it will call the function as often as possible. When calling the function, a value representing the current time is passed from the computer system 300 to the application program so that the function can determine the appropriate update for the animation sequence.
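The sequencing-function arrangement might be sketched roughly as follows (Python; the dictionary-based gob, the toy up-and-down motion and the call loop are assumptions for illustration, not the patent's implementation):

```python
import time

def make_gob(name, **attributes):
    """A gob as a plain dictionary: attribute values plus a list of child gobs."""
    return {'name': name, 'attributes': dict(attributes), 'children': []}

def sequencing_function(gob, current_time):
    """Stand-in for the application-defined sequencing function: update every gob
    in the tree for the current time supplied by the system."""
    gob['attributes']['y'] = abs((current_time % 2.0) - 1.0)   # toy up/down motion
    for child in gob['children']:
        sequencing_function(child, current_time)

ball = make_gob('ball', x=0.0, y=0.0)
start = time.time()
for _ in range(3):                      # the real system calls this as often as possible
    sequencing_function(ball, time.time() - start)
    time.sleep(0.05)
print(ball['attributes'])
```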
Particle based animations are suited to the tree structure used in accordance with the preferred embodiment. Each particle can be represented or defined by a gob 401 in the gob tree 400. The particles can be leaf gobs, which have no child gobs, or the gobs can be branch gobs, which are comprised of other gobs using the standard compositing operators known in the art. Leaf gobs can be either a path gob or an image gob. A path gob is preferably defined by an array of 2D coordinates. An image gob is simply an image loaded from an image file. It will be appreciated by a person skilled in the art that text leaf gobs can be used for some applications.
Each gob 401 in the tree 400 has one or more attributes (e.g. opacity) associated with an object represented by the gob 401. Preferably, each attribute for each gob 401 in the gob tree 400 can be adjusted independently of the others, allowing the preferred embodiment to create flexible and varied animations. For example, to animate a bouncing ball (not illustrated), the angle of a circular object can be adjusted with constant angular velocity to simulate the spin on a ball, while the position of the object can be adjusted with a parabolic function to simulate bouncing.
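For instance, the spin and bounce just described could be driven by two independent functions of time, roughly as in this sketch (Python; the constants are arbitrary assumptions):

```python
import math

OMEGA = math.pi          # constant angular velocity (radians per second), arbitrary
BOUNCE_PERIOD = 1.0      # seconds per bounce, arbitrary
PEAK_HEIGHT = 1.0        # metres, arbitrary

def ball_angle(t):
    """Spin: the angle grows linearly with time (constant angular velocity)."""
    return OMEGA * t

def ball_height(t):
    """Bounce: a parabola within each bounce period, zero at the start and end."""
    u = (t % BOUNCE_PERIOD) / BOUNCE_PERIOD       # 0..1 within the current bounce
    return 4.0 * PEAK_HEIGHT * u * (1.0 - u)      # parabolic arc peaking mid-bounce

for t in (0.0, 0.25, 0.5, 0.75):
    print(t, round(ball_angle(t), 3), round(ball_height(t), 3))
```

Because the two functions are independent, either one can be changed (e.g. a faster spin) without touching the other, which is the flexibility the gob-per-attribute arrangement is intended to give.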
Attributes which can be modified while a created animation is running, in accordance with the preferred embodiment, include:
position relative to the parent gob (x, y), or layer ordering simulating a third dimension;
angle;
scale (kx, ky);
index to an image sequence (Image gobs only);
index to a path sequence (Path gobs only);
colour and opacity (Path gobs only); and
individual points in a path definition (i.e. an array of points).
Preferably, an animation sequence created in accordance with the preferred method can change from one state to another whilst the animation sequence is running (i.e. at run time). The changes in state can involve changes in colour, scale, rotation, position and index to image or path sequences. To achieve changes in the state of object attributes, transitions are preferably defined to determine how an object attribute can transfer from one value to another. For example, given a starting value, x1, and a destination value, x2, a number of transition formulae or profiles can be provided, as shown in Fig. 6. In accordance with the preferred embodiment, a user can also define transitions. Therefore, for every attribute of every object which can be impacted by user or system events, a transition is defined. A user can also decide whether the transition occurs at a constant speed (i.e. the value of T is proportional to (x2 - x1)) or alternatively with a fixed duration (i.e. T is the same for all transitions).
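A linear transition of the kind described, supporting either a fixed duration or a duration proportional to the distance travelled, might be sketched as follows (Python; Fig. 6 itself is not reproduced here, and the function names and the speed constant are assumptions):

```python
def linear_transition(x1, x2, elapsed, duration):
    """Move from x1 to x2 over 'duration' seconds; clamp once the transition ends."""
    if elapsed >= duration:
        return x2
    return x1 + (x2 - x1) * (elapsed / duration)

def transition_duration(x1, x2, fixed=None, speed=None):
    """Either a fixed duration T, or T proportional to the distance |x2 - x1|."""
    if fixed is not None:
        return fixed
    return abs(x2 - x1) / speed

# Fixed one-second transition versus a constant-speed transition at 2 units/second.
print(linear_transition(0.0, 4.0, 0.5, transition_duration(0.0, 4.0, fixed=1.0)))   # 2.0
print(linear_transition(0.0, 4.0, 0.5, transition_duration(0.0, 4.0, speed=2.0)))   # 1.0
```

Other profiles from Fig. 6 (or a user-defined transition) would simply replace the linear interpolation with a different curve over the same duration.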
The preferred method can be used to create canned animation as well as allowing an animator to define how the animation may change under certain events to which the animation may be exposed at run time. Preferably, attributes can also generate events when the attribute reaches a defined point in the time line, or when the attribute reaches a defined value. In this instance, the animator will need to identify when the event should be created using a time line axis label, and should also give the event an identifier (ID) so that other attributes can respond. Therefore, for respond events, an event ID will produce an axis label for the attribute to jump to, while for generated events, an axis label will produce the event ID for the event to be generated.
In order for attributes to have the ability to change from one unpredictable point on a time line to another different point, the attribute must have a suitable transition defined for doing so. As discussed above, the preferred embodiment can have a selection of standard transitions with an option of allowing the animator to define custom transitions.
The preferred embodiment preferably allows the animation to produce events at defined times which can be used by other animations.
The output of the application program in accordance with the preferred embodiment can comprise information such as time line labels used in redirecting animations, and gob information. The gob information preferably includes a gob ID; a gob type (e.g. parent, path, image, text gobs, etc.); an associated parent gob ID; associated child gob IDs (parent gobs only); and attributes and associated information (e.g. attributes can include scale, position, layer, angle, colour, etc.). Each gob attribute preferably contains a list of events to respond to (i.e. a reference to a time line label); a list of events to generate (i.e. a reference to a time line label); a transition to be used when changing from one value to another; a time line description, including a starting point; and flags used to reduce unnecessary processing.
Time line labels will be used by all the attributes which respond to events or generate events, in accordance with the preferred embodiment. As discussed above, when an attribute's progression reaches a certain point on a time line, the attribute can generate an event. Alternatively, when an event occurs, an attribute can be re-positioned on the attribute's time line. In either case, the animator will need to determine which points in time these are, hence the need for time line labels. Preferably, each of the time line labels comprises an identifying label (probably a character string); and a time value.
Each gob preferably contains the necessary information to create the associated object, in accordance with the preferred embodiment. Additional information can also be required to enable changes in animation of each gob in response to events. Each gob preferably comprises a gob ID (probably its index in the array); a gob type (e.g. parent (composition), path, image); flags, used to reduce unnecessary rendering or processing of gobs; a parent gob ID (i.e. the parent's index in the array; the position attributes will be relative to those of the parent so a connection must be maintained); an array of child gob IDs; and an array of attributes.
Each gob preferably contains a list of attributes, which varies between gob types.
An attribute describes only a single value. Hence, to describe colour, at least four attributes are required (r, g, b, o). Other attributes include, for example, position (x, y), layering, scale (kx, ky), angle, path (x1, x2, ..., xn, y1, y2, ..., yn), and/or indices to image or path sequences.
Each attribute can comprise an attribute type (e.g. colour.r, colour.g, colour.b, colour.o, position.x, position.y, angle); flags (e.g. used to eliminate unnecessary updates of the gob parameter and event processing) used to indicate the use of events to respond to, events generated and time lines; a list of events that the attribute may respond to (i.e. typically the event will result in the attribute moving to a different position on a time line), where each event in the list will have an ID and a time line label that the attribute can move to; a list of events that the attribute may generate (i.e. attributes can generate an event when the attribute reaches a position in a time line represented by a time line label); a transition type (i.e. as described above, to allow an attribute to change to a different value in an appropriate manner); and a time line.
Each time line describes an attribute as a function of time and can be comprised of several time line segments. In some cases only one time line segment may be necessary, but allowing multiple time line segments creates a flexible set up allowing for variation in event driven animations. Each time line segment preferably comprises a type identifier, which would primarily indicate the method used for joining points; a duration, to assist in locating the appropriate segment; and an array of time line points.
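The data layout described over the preceding paragraphs could be captured roughly as follows (a Python sketch using dataclasses; the field names are assumptions based on the lists above rather than a prescribed format):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TimeLineLabel:
    label: str                 # identifying label, e.g. "START"
    time: float                # time value on the attribute's time line

@dataclass
class TimeLineSegment:
    segment_type: str          # how points are joined, e.g. "linear"
    duration: float            # helps locate the appropriate segment for a given time
    points: List[Tuple[float, float]]   # (time, value) pairs

@dataclass
class Attribute:
    attr_type: str                                        # e.g. "position.x", "colour.r", "angle"
    respond_events: dict = field(default_factory=dict)    # event ID -> time line label
    generate_events: dict = field(default_factory=dict)   # time line label -> event ID
    transition: str = "linear"                             # how to change between values
    timeline: List[TimeLineSegment] = field(default_factory=list)
    constant: bool = False                                 # flag used to skip unnecessary updates

@dataclass
class Gob:
    gob_id: int                                            # e.g. its index in the gob array
    gob_type: str                                          # "parent", "path", "image", "text"
    parent_id: int = -1
    child_ids: List[int] = field(default_factory=list)
    attributes: List[Attribute] = field(default_factory=list)

# Example: a path gob with a single 'position.x' attribute and a two-second linear segment.
gob = Gob(gob_id=0, gob_type="path",
          attributes=[Attribute("position.x",
                                timeline=[TimeLineSegment("linear", 2.0,
                                                          [(0.0, 0.0), (2.0, 1.0)])])])
print(gob)
```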
Preferably, new data can be loaded by a user into the computer system 300 while the preferred application program is running. The application program may already have existing objects animating and hence the preferred application program will need the ability to add and delete separate objects. Preferably, gobs are stored in an array on the hard disk 310 or memory 306, as this allows efficient balanced gob trees to be created quickly.
Events are preferably managed using a global FIFO event queue. Each time the application program provides an update to the gob tree, the application program will preferably offer each event in the queue to each attribute. Once all the events in the queue have been offered, the events in the queue can then be removed. After removal, any events generated by the attributes since the last update can be added to the queue.
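A minimal sketch of this FIFO handling (Python; the EventAttribute class, its offer method and the event names are assumptions made for illustration only):

```python
from collections import deque

class EventAttribute:
    """Toy attribute: jumps its time-line position when offered an event it responds to,
    and keeps a list of events it has generated since the last update."""
    def __init__(self, respond):
        self.respond = respond            # event ID -> time-line position (seconds)
        self.timeline_pos = 0.0
        self.pending = []                 # events generated since the last update

    def offer(self, event_id):
        """Offer one queued event to this attribute."""
        if event_id in self.respond:
            self.timeline_pos = self.respond[event_id]

event_queue = deque(['ID_ANIMATION_START'])      # global FIFO queue of event IDs

def process_events(attributes):
    """Offer every queued event to every attribute, remove the offered events, then
    add any events the attributes have generated since the last update."""
    offered = list(event_queue)
    for attribute in attributes:
        for event_id in offered:
            attribute.offer(event_id)
    for _ in offered:                            # remove the events that were offered
        event_queue.popleft()
    for attribute in attributes:                 # queue newly generated events
        event_queue.extend(attribute.pending)
        attribute.pending.clear()

attrs = [EventAttribute({'ID_ANIMATION_START': 2.0})]
attrs[0].pending.append('ID_STARTED')            # pretend the attribute generated an event
process_events(attrs)
print(attrs[0].timeline_pos, list(event_queue))  # 2.0 ['ID_STARTED']
```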
System events, such as mouse move and mouse click, preferably add events at the time the event is raised using standard event based functions.
Preferably, each attribute has flags to indicate whether the attribute has the ability to generate and/or respond to events. Alternatively, lookup tables can be used, whereby one table is indexed by event IDs, with each entry listing the affected gobs and attributes, and a second table is preferably indexed by gobs or gob attributes to indicate which gobs need to be checked for new events.
Updating each object involves iterating through each attribute of each gob appropriately for a given value of time supplied by the application program. Gobs preferably have a flag to indicate that the object is constant and the flags are preferably checked at run time to prevent unnecessary updates.
In the case where a flag indicates that a gob needs to be updated, the application program will preferably check each gob attribute. Each gob attribute preferably operates independently along its own time line and the application program preferably converts the time supplied by the computer system 300 to a time based on an attribute's time line.
This is preferably done using the time of the last update in terms of the time line scale, and the time of the last update in terms of absolute system 300 time. Once the time line value is derived, the attribute's time line is used to obtain the appropriate attribute or scene value.
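One plausible reading of that conversion, as a sketch (Python; the argument names and the optional wrap-around for looping time lines are assumptions):

```python
def to_timeline_time(system_time, last_update_timeline_time, last_update_system_time,
                     timeline_duration=None):
    """Convert an absolute system time into a position on an attribute's own time line,
    using the time of the last update recorded on both scales."""
    t = last_update_timeline_time + (system_time - last_update_system_time)
    if timeline_duration is not None:
        t %= timeline_duration        # optionally wrap for a looping time line
    return t

# If the attribute was last updated at 2.0 s on its time line at 100.0 s system time,
# then at system time 100.5 s the attribute sits at 2.5 s on its own time line.
print(to_timeline_time(100.5, 2.0, 100.0))
```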
Fig. 8 is a flowchart showing the method of updating gobs of a gob tree which represents a frame in an animation sequence, during the rendering of the animation sequence (e.g. at step 221), in accordance with the preferred embodiment. The process begins at steps 801 and 803 where, for each gob in the tree, a check is performed to determine if the gob is to remain constant for the next frame of the animation sequence.
As discussed above, the constancy of the object is indicated by a flag associated with the gob. If the gob does not remain constant for the next frame then the process performs the loop represented by steps 805 to 831 of Fig. 8, where any attributes associated with the gob which are required to be updated, are updated. As discussed above, each attribute of a gob in the tree can preferably be adjusted independently of the others.
At step 807 a check is performed to determine if an attribute is constant for the next frame. If the attribute is constant at step 807 then the process returns to step 805 and the next attribute of the gob is checked. Otherwise, the process proceeds to step 809, where a check is performed to determine if the attribute is required to respond to any events on a timeline for the animation sequence. As discussed above, each attribute preferably contains a list of events to respond to. If the attribute is required to respond to any events at step 809, then the process proceeds to step 811. At step 811 the events in an event queue for the animation timeline are offered to the attribute by comparing the list of events associated with the attribute to the events in the queue. At the next step 813, if an event is included in the list associated with the attribute then the process proceeds to step 815, where the attribute is re-positioned on the time line associated with the attribute. As discussed above, each attribute preferably operates independently along its own time line and the application program preferably converts the time supplied by the computer system 300 to a time based on the time line associated with the attribute. At the next step 817, the time of the last update in terms of the attribute time line scale is updated to reflect the time associated with the event. Also at step 817, the time of the last update in terms of absolute system time is updated to reflect the time associated with the event. At the next step 819, the current time associated with the time line for the animation sequence is accessed.
The process continues at the next step 821, where the time accessed at step 819 is used to determine a new value for the attribute. The new value for the attribute is obtained by correlating the time accessed at step 819 with a corresponding value on the attribute's time line. The attribute value preferably changes to the new value depending on a predetermined transition as described above. At the next step 823, a check is performed to determine if the attribute is required to generate any events for the animation sequence. If the attribute is required to generate an event then the process proceeds to step 825, where a test is performed to detect new events. At the next step 827, if there are new events then the process proceeds to step 829, where the new events are generated and added to an event queue associated with the animation sequence. If there are no new events at step 827, then the process proceeds to step 831, where the gob is updated using an appropriate sequencing function. The sequencing function is predetermined depending on the animation sequence. For example, as discussed above, to animate a bouncing ball, the angle of a circular object can be adjusted with constant angular velocity to simulate the spin on a ball, while the position of the object can be adjusted with a parabolic function to simulate bouncing. When each gob in the gob tree representing the next frame in the animation sequence has been processed, the process of Fig. 8 proceeds to step 833, where old events are removed from the event queue and the process concludes.
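Putting the pieces together, the update pass of Fig. 8 might be organised roughly as follows (a self-contained Python sketch; the Attr class, the piecewise-linear time line and the one-second step are simplifying assumptions, and the step numbers in the comments indicate only a loose correspondence with the flowchart):

```python
from collections import deque

class Attr:
    """Minimal attribute: a piecewise-linear time line plus respond/generate event maps."""
    def __init__(self, points, respond=None, generate=None, constant=False):
        self.points = points               # [(time, value), ...] along the attribute's time line
        self.respond = respond or {}       # event ID -> time-line label position (seconds)
        self.generate = generate or {}     # time-line label position (seconds) -> event ID
        self.constant = constant
        self.t = 0.0                       # current position on the attribute's own time line
        self.value = points[0][1]

    def sample(self, t):
        """Piecewise-linear lookup of the attribute value at time-line time t."""
        for (t0, v0), (t1, v1) in zip(self.points, self.points[1:]):
            if t0 <= t <= t1:
                return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
        return self.points[-1][1]

def update(gobs, queue, dt):
    """One update pass over all gobs for the next frame (cf. Fig. 8)."""
    offered, generated = list(queue), []
    for gob in gobs:                       # a 'gob' here is simply a list of attributes
        for attr in gob:
            if attr.constant:              # cf. step 807: skip constant attributes
                continue
            for event_id in offered:       # cf. steps 809-815: respond to queued events
                if event_id in attr.respond:
                    attr.t = attr.respond[event_id]        # jump to the labelled position
            prev_t = attr.t
            attr.t += dt                                   # cf. steps 817-819: advance along the time line
            attr.value = attr.sample(attr.t)               # cf. step 821: derive the new attribute value
            for label_t, event_id in attr.generate.items():   # cf. steps 823-829: generate events
                if prev_t < label_t <= attr.t:
                    generated.append(event_id)
    for _ in offered:                      # cf. step 833: remove the events that were offered
        queue.popleft()
    queue.extend(generated)

# Drive the height of the ball's centre from the bouncing-ball example described below:
# rest at 0.3 m, a peak of 1.0 m at 5 s, and a generated event at the 8 s label position.
height = Attr(points=[(0.0, 0.3), (2.0, 0.3), (5.0, 1.0), (8.0, 0.3), (10.0, 0.3)],
              respond={'ID_ANIMATION_START': 2.0},
              generate={8.0: 'ID_BALL_BOUNCE_FINISH'})
queue = deque(['ID_ANIMATION_START'])
for _ in range(5):
    update([[height]], queue, dt=1.0)
    print(round(height.t, 1), round(height.value, 2), list(queue))
```

In this sketch the queued ID_ANIMATION_START event repositions the attribute at the START label (two seconds), after which the value follows the time line until the FINISH_BOUNCE position would generate ID_BALL_BOUNCE_FINISH.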
The preferred embodiment will now be explained with reference to the following example which is based on an animation sequence simulating a bouncing ball (not illustrated).
Fig. 5 shows a preferred user interface 501 that can be used to perform the method in accordance with the preferred embodiment. The preferred user interface 501 comprises a "Tree View" window 503 in which a user can view the Graphic Object Tree (GOB) associated with a particular frame or image. The tree view window 503 preferably displays objects of an image or frame using indentation to express hierarchy. Child objects are displayed indented and below their parent object. The tree view window 503 preferably also displays all of the attributes associated with an object. In the present example, the ball is represented by a circle object having a circle gob 505 in the tree 507 associated with the bouncing ball animation sequence. The circle object has three attributes: centre, radius and scale associated with it, which are represented by three child attributes 509, 511 and 513.
The preferred user interface 501 also preferably includes a time line window 514 where each of the object attribute values and scene values associated with an object can be displayed as a time line 515. The time line 515 indicates how a particular object attribute value or a scene value changes over time. In the present example, the time line window 514 comprises two axes 523 and 521 indicating time (in seconds) and height (in metres), respectively. The child attribute 509 representing the centre attribute of the ball has been selected by a user and is shown highlighted. The selection of the child attribute 509 is preferably carried out using the mouse 303 in a conventional fashion. The selection of the child attribute 509 preferably results in the centre attribute time line 515 being displayed in the time line view window 514. As seen in Fig. 5, the height of the ball varies parabolically over time in order to simulate the effect of gravity on a bouncing ball.
Two event labels, a START label 517 and a FINISH_BOUNCE label 519, have been inserted by a user above the time line 515. In the present example, the START label 517 has been associated with the event identifier ID_ANIMATION_START, and the FINISH_BOUNCE label 519 has been associated with the event identifier ID_BALL_BOUNCE_FINISH.
In the animation sequence of the present example, after two seconds the time line 515 reaches the START label 517. The position of the centre of the ball then increases from a starting value of 0.3 metres to a value of 1.0 metre at 5 seconds. After eight seconds, the time line 515 reaches the FINISH_BOUNCE label 519 and the position of the centre of the ball settles at a value of 0.3. Therefore, when displayed, the animation sequence of the present example would show a ball bouncing from rest to a height of one metre over a period of eight seconds.
In order to create the bouncing ball animation sequence of the above example, a user can create two Event Maps, a Respond Event Map and a Generate Event Map. The Respond Event Map and the Generate Event Map for the centre attribute of the bouncing ball example are shown below. The application program of the preferred embodiment uses the Respond and Generate Event Maps in conjunction with the Event Labels to determine which events an animation sequence will respond to or which events will be generated by the animation sequence.
RESPOND EVENT MAP
Event ID                 Time Line Label
ID_ANIMATION_START       "START"
ID_BALL_BOUNCE_FINISH    "START"

GENERATE EVENT MAP
Time Line Label          Event ID
"FINISH_BOUNCE"          ID_BALL_BOUNCE_FINISH

Preferably, the event maps can be created by a user using the preferred application program and subsequently automatically stored on the hard disk drive 310. In accordance with the preferred embodiment, the Respond Event Map indicates those event identifiers which can have an event associated with the identifier, to which the animation sequence will respond. For example, the event "Mouse Click" can be associated with the event identifier "ID_ANIMATION_START", such that when the mouse 303 is clicked, the bouncing ball animation sequence is rendered. Similarly, the Generate Event Map indicates those event identifiers which can have an event associated with it, such that when the time line reaches the associated label, the event is automatically generated.
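Expressed as plain dictionaries (a Python sketch only, not a stored file format; the mouse_click binding mirrors the "Mouse Click" example in the preceding paragraph):

```python
# Respond Event Map for the centre attribute: when this event occurs,
# the attribute jumps to the named time line label.
centre_respond_map = {
    'ID_ANIMATION_START': 'START',
    'ID_BALL_BOUNCE_FINISH': 'START',
}

# Generate Event Map: when the time line reaches this label, this event is generated.
centre_generate_map = {
    'FINISH_BOUNCE': 'ID_BALL_BOUNCE_FINISH',
}

# System events can be bound to event identifiers, e.g. a mouse click starts the animation.
system_event_bindings = {
    'mouse_click': 'ID_ANIMATION_START',
}

print(centre_respond_map[system_event_bindings['mouse_click']])   # 'START'
```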
Continuing the bouncing ball example, Fig. 7 shows the user interface 501 with another graphic object tree 701 displayed in the Tree View window 703. The graphic object tree 701 is for an ellipse object having an ellipse gob 727. In the present example, the ellipse object is used to simulate the compression and expansion of the ball as the ball bounces. Associated with the ellipse object are the attributes X, Y, Width and Height, which are represented by the attribute gobs 705, 707, 709 and 711. The X and Y attributes indicate the positional change of the object over time with respect to the parent gob of the gob associated with the object. As seen in Fig. 7, all of the attribute gobs 705, 707, 709 and 711 have been selected by the user and are shown highlighted. The selection of the attribute gobs 705, 707, 709 and 711 has resulted in the display of four time lines 713, 715, 719 and 721, indicating the change in the associated attribute values (i.e. Y, Height, X and Width, respectively) over time. As seen in Fig. 7, a further event label, the SQUASH label 717, has been inserted by a user above the time line 713. A label 723 indicating the maximum change in the Y attribute has also been added.
At the beginning of the bouncing ball animation sequence, the SQUASH label 717 results in the width attribute of the ball increasing and the height attribute of the ball decreasing, as indicated by the time lines 715 and 721. The width and height attributes of the ellipse object return to normal as time increases, simulating the return of the ball to its normal shape. When the animation sequence reaches the START label 729, the Y attribute increases to a maximum value set by the label 723 and then decreases, settling to a constant value when the FINISH_BOUNCE label 725 is reached. The X attribute value is constant over the entire animation sequence, indicating that the position of the ellipse object in the x direction, with respect to a parent gob, does not change.
The respond and generate event maps for the ellipse object attributes of the bouncing ball animation sequence are shown below.
(i) X: constant, no events.

(ii) Y:

RESPOND EVENT MAP
Event ID             Time Line Label
ID_ANIMATE_START     "START"
ID_FINISH_BOUNCE     "SQUASH"

Width:

RESPOND EVENT MAP
Event ID             Time Line Label
ID_FINISH_BOUNCE     "SQUASH"

Height:

RESPOND EVENT MAP
Event ID             Time Line Label
ID_FINISH_BOUNCE     "SQUASH"

GENERATE EVENT MAP
Time Line Label      Event ID
"FINISH_BOUNCE"      ID_FINISH_BOUNCE

In accordance with the preferred embodiment, a further label could be added to the attribute time lines 713, 715, 719 and 721 at any point in the animation sequence to create any desired effect relating to more unpredictable events. For example, the SQUASH label 717 can be positioned at any point on the time line such that when the animation sequence is rendered the ball may compress and expand at any point (e.g. at the top of the ball's bounce). Further, the SQUASH label 717 can be associated with a respond event identifier such that when an event is detected by the animation sequence, the animation sequence can change. For example, if the mouse 303 is operated at any point in the animation sequence then the ball may compress and expand.
The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiment(s) being illustrative and not restrictive. For example, the addresses of as many gobs as possible are cached in accordance with the embodiments. It may not be desirable to store and maintain all of these gobs because of the overhead involved.
In the context of this specification, the word "comprising" means "including principally but not necessarily solely" or "having" or "including" and not "consisting only of". Variations of the word "comprising", such as "comprise" and "comprises", have corresponding meanings.

Claims (37)

1. A method of generating an animation sequence having a plurality of frames, said method comprising the steps of: displaying at least one of a plurality of attribute values associated with each said frame of said animation sequence as a time line, at least one of said attribute values having one or more associated events; labelling said time line with at least one event label, said label corresponding to at least one of said associated events; and generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value, in response to said at least one associated event occurring.
2. The method according to claim 1, further comprising the step of automatically generating said events.
3. The method according to claim 1, wherein said events are generated if said attribute value reaches a point in said time line defined by said event label.

4. The method according to claim 1, wherein said events are generated if said attribute value reaches a predetermined value being defined by said event label.

5. The method according to claim 1, wherein said event is generated by a user.
6. The method according to claim 1, wherein said step of generating said animation sequence comprises the sub-steps of proceeding to said event label and generating said animation sequence from a point defined by said label.
7. The method according to claim 1, further comprising the step of interpolating between said attribute value associated with at least one of said frames and at least one desired attribute value to produce a plurality of attribute values.
8. The method according to claim 1, further comprising the steps of: selecting at least one of a plurality of objects having at least one of a first plurality of attribute values; and modifying at least one of said first plurality of attribute values to produce a modified object having modified attribute values.
9. The method according to claim 1, wherein said event label can be associated with any point on said time line.

10. The method according to claim 1, wherein each frame of said animation sequence is represented by a hierarchical structure, said hierarchical structure including a plurality of nodes each representing a component of said frame.
11. The method according to claim 10, wherein each node of said hierarchical structure is associated with one or more attributes.
12. The method according to claim 1, wherein each of said plurality of attribute values can be modified independently.

13. The method according to claim 1, wherein each of said plurality of attribute values can be modified as said animation sequence is being displayed.

14. The method according to claim 1, wherein each of said plurality of attribute values has an associated transition function.

15. The method according to claim 1, wherein each attribute comprises a list of events.
16. A method of generating an animation sequence having a plurality of frames, said method comprising the steps of: interpolating between at least one attribute value associated with at least one of said frames and at least one desired attribute value to produce a plurality of attribute values; displaying said plurality of attribute values as at least one time line, at least one of said attribute values having one or more associated events; labelling said time line with an event label, said label corresponding to at least one of said associated events; and generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value, in response to said at least one associated event occurring.
17. The method according to claim 16, further comprising the step of automatically generating said events.
18. The method according to claim 16, wherein said events are generated if said attribute value reaches a point in said time line defined by said event label.

19. The method according to claim 16, wherein said events are generated if said attribute value reaches a predetermined value being defined by said event label.
20. The method according to claim 16, wherein said event is generated by a user.
21. The method according to claim 16, wherein said step of generating said animation sequence comprises the sub-steps of proceeding to said event label and generating said animation sequence from a point defined by said label.
22. The method according to claim 16, further comprising the steps of: selecting at least one of a plurality of objects having at least one of a first plurality of attribute values; and modifying at least one of said first plurality of attribute values to produce a modified object having modified attribute values.
23. The method according to claim 16, wherein said event label can be associated with any point on said time line.
24. The method according to claim 16, wherein each frame of said animation sequence is represented by a hierarchical structure, said hierarchical structure including a plurality of nodes each representing a component of said frame.

25. The method according to claim 24, wherein each node of said hierarchical structure is associated with one or more attributes.
26. The method according to claim 16, wherein each of said plurality of attribute values can be modified independently.

27. The method according to claim 16, wherein each of said plurality of attribute values can be modified as said animation sequence is being displayed.
28. The method according to claim 16, wherein each of said plurality of attribute values has an associated transition function.
29. The method according to claim 16, wherein each of said plurality of attribute values comprises a list of events.

30. A method of generating an animation sequence, said method comprising the steps of: selecting at least one of a plurality of objects having at least one of a first plurality of attribute values; modifying at least one of said first plurality of attribute values to produce a modified object having modified attribute values; interpolating between said modified attribute values and at least one desired attribute value to produce a second plurality of attribute values; displaying said second plurality of attribute values as at least one time line, at least one of said attribute values having one or more associated events; labelling said time line with an event label, said label corresponding to at least one of said associated events; and generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value in response to said at least one associated event occurring.
31. An animation sequence having a plurality of frames generated in accordance with the method of any one of claims 1 to 30.
32. A production using an animation sequence having a plurality of frames generated in accordance with the method of any one of claims 1 to 30.
33. An apparatus for generating an animation sequence having a plurality of frames, said apparatus comprising: display means for displaying at least one of a plurality of attribute values associated with each said frame of said animation sequence as a time line, at least one of said attribute values having one or more associated events; labelling means for labelling said time line with at least one event label, said label corresponding to at least one of said associated events; and processor means for generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value, in response to said at least one associated event occurring.
34. The apparatus according to claim 33, wherein said processor means automatically generates said events.

35. The apparatus according to claim 33, wherein said events are generated if said attribute value reaches a point in said time line defined by said event label.
36. The apparatus according to claim 33, wherein said events are generated if said attribute value reaches a predetermined value being defined by said event label.

37. The apparatus according to claim 33, wherein said event is generated by a user.
38. The apparatus according to claim 33, wherein said processor means further comprises means for proceeding to said event label and generating said animation sequence from a point defined by said label.

39. The apparatus according to claim 33, further comprising means for interpolating between said attribute value associated with at least one of said frames and at least one desired attribute value to produce a plurality of attribute values.

40. The apparatus according to claim 33, further comprising: selection means for selecting at least one of a plurality of objects having at least one of a first plurality of attribute values; and modification means for modifying at least one of said first plurality of attribute values to produce a modified object having modified attribute values.
41. An apparatus for generating an animation sequence having a plurality of frames, said apparatus comprising: interpolation means for interpolating between at least one attribute value associated with at least one of said frames and at least one desired attribute value to produce a plurality of attribute values; display means for displaying said plurality of attribute values as at least one time line, at least one of said attribute values having one or more associated events; labelling means for labelling said time line with an event label, said label corresponding to at least one of said associated events; and processor means for generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value, in response to said at least one associated event occurring.
42. The apparatus according to claim 41, wherein said processor means automatically generates said events.
43. The apparatus according to claim 41, wherein said events are generated if said attribute value reaches a point in said time line defined by said event label.
44. The apparatus according to claim 41, wherein said events are generated if said attribute value reaches a predetermined value being defined by said event label.

45. The apparatus according to claim 41, wherein said event is generated by a user.
46. The apparatus according to claim 41, wherein said processor means further comprises means for proceeding to said event label and generating said animation sequence from a point defined by said label.
47. The apparatus according to claim 41, further comprising means for interpolating between said attribute value associated with at least one of said frames and at least one desired attribute value to produce a plurality of attribute values.
48. The apparatus according to claim 41, further comprising: selection means for selecting at least one of a plurality of objects having at least one of a first plurality of attribute values; and modification means for modifying at least one of said first plurality of attribute values to produce a modified object having modified attribute values.
49. An apparatus for generating an animation sequence, said apparatus comprising: selection means for selecting at least one of a plurality of objects having at least one of a first plurality of attribute values; modification means for modifying at least one of said first plurality of attribute values to produce a modified object having modified attribute values; interpolation means for interpolating between said modified attribute values and at least one desired attribute value to produce a second plurality of attribute values; display means for displaying said second plurality of attribute values as at least one time line, at least one of said attribute values having one or more associated events; labelling means for labelling said time line with an event label, said label corresponding to at least one of said associated events; and processor means for generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value in response to said at least one associated event occurring.

50. A computer readable medium, having a program recorded thereon, where the program is configured to make a computer execute a procedure to generate an animation sequence having a plurality of frames, said program comprising: code for displaying at least one of a plurality of attribute values associated with each said frame of said animation sequence as a time line, at least one of said attribute values having one or more associated events; code for labelling said time line with at least one event label, said label corresponding to at least one of said associated events; and code for generating said animation sequence based on said event label using said time line, wherein said sequence advances to said label and alters said at least one attribute value, in response to said at least one associated event occurring.
51. A computer readable medium, having a program recorded thereon, where the program is configured to make a computer execute a procedure to generate an animation sequence having a plurality of frames, said program comprising: code for interpolating between at least one attribute value associated with at least one of said frames and at least one desired attribute value to produce a plurality of attribute values; code for displaying said plurality of attribute values as at least one time line, at least one of said attribute values having one or more associated events; code for labelling said time line with an event label, said label corresponding to at least one of said associated events; and code for generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value, in response to said at least one associated event occurring.
52. A computer readable medium, having a program recorded thereon, where the program is configured to make a computer execute a procedure to generate an animation sequence having a plurality of frames, said program comprising: code for selecting at least one of a plurality of objects having at least one of a first plurality of attribute values; code for modifying at least one of said first plurality of attribute values to produce a modified object having modified attribute values; code for interpolating between said modified attribute values and at least one desired attribute value to produce a second plurality of attribute values; code for displaying said second plurality of attribute values as at least one time line, at least one of said attribute values having one or more associated events; code for labelling said time line with an event label, said label corresponding to at least one of said associated events; and code for generating said animation sequence using said time line, wherein said sequence advances to said label and alters said at least one attribute value, in response to said at least one associated event occurring.
53. A method of generating an animation sequence having a plurality of frames, substantially as herein before described with reference to Figs. 2 to 8.

54. An apparatus for generating an animation sequence having a plurality of frames, substantially as herein before described with reference to Figs. 2 to 8.

55. A computer readable medium, having a program recorded thereon, where the program is configured to make a computer execute a procedure to generate an animation sequence having a plurality of frames, said program being substantially as herein before described with reference to Figs. 2 to 8.

DATED this Eighteenth Day of June 2003

CANON KABUSHIKI KAISHA

Patent Attorneys for the Applicant
SPRUSON & FERGUSON
AU16717/01A 2000-02-28 2001-01-30 Object based animations with timelines Ceased AU765544B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU16717/01A AU765544B2 (en) 2000-02-28 2001-01-30 Object based animations with timelines

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
AUPQ5887 2000-02-28
AUPQ5887A AUPQ588700A0 (en) 2000-02-28 2000-02-28 Object based animations with timelines
AU16717/01A AU765544B2 (en) 2000-02-28 2001-01-30 Object based animations with timelines

Publications (2)

Publication Number Publication Date
AU1671701A AU1671701A (en) 2001-08-30
AU765544B2 true AU765544B2 (en) 2003-09-25

Family

ID=25616613

Family Applications (1)

Application Number Title Priority Date Filing Date
AU16717/01A Ceased AU765544B2 (en) 2000-02-28 2001-01-30 Object based animations with timelines

Country Status (1)

Country Link
AU (1) AU765544B2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112667942A (en) * 2019-10-16 2021-04-16 腾讯科技(深圳)有限公司 Animation generation method, device and medium
CN114510183B (en) * 2022-01-26 2023-04-18 荣耀终端有限公司 Dynamic effect duration management method and electronic equipment

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5926186A (en) * 1992-09-10 1999-07-20 Fujitsu Limited Graphic editing apparatus and method

Also Published As

Publication number Publication date
AU1671701A (en) 2001-08-30

Similar Documents

Publication Publication Date Title
US8310485B1 (en) Creating animation effects
US7007295B1 (en) System and method for Internet streaming of 3D animated content
US6011562A (en) Method and system employing an NLE to create and modify 3D animations by mixing and compositing animation data
US9171390B2 (en) Automatic and semi-automatic generation of image features suggestive of motion for computer-generated images and video
US7149961B2 (en) Automatic generation of presentations from “path-enhanced” multimedia
US6924803B1 (en) Methods and systems for a character motion animation tool
US20020063714A1 (en) Interactive, multimedia advertising systems and methods
US20020023103A1 (en) System and method for accessing and manipulating time-based data using meta-clip objects
US6538654B1 (en) System and method for optimizing 3D animation and textures
US6587109B1 (en) System and method for real-time scalability of 3D graphics based on internet bandwidth and CPU speed
CN101290787A (en) Changing video frequency playback speed ratio
RU2005120391A (en) MEDIA INTEGRATION LEVEL
CN112802192B (en) Three-dimensional graphic image player capable of realizing real-time interaction
US6674437B1 (en) Key reduction system and method with variable threshold
US11126856B2 (en) Contextualized video segment selection for video-filled text
US20050128220A1 (en) Methods and apparatuses for adjusting a frame rate when displaying continuous time-based content
CN106657821A (en) Animation subtitle drawing method with changeable effect
AU765544B2 (en) Object based animations with timelines
US6856322B1 (en) Unified surface model for image based and geometric scene composition
US8379028B1 (en) Rigweb
US20080115062A1 (en) Video user interface
CN114913282A (en) VR editor and implementation method thereof
JP2000194873A (en) Multilevel simulation
EP1579391A1 (en) A unified surface model for image based and geometric scene composition
Stuckey A Comparison of ArcGIS and QGIS for Animation

Legal Events

Date Code Title Description
DA3 Amendments made section 104

Free format text: THE NATURE OF THE AMENDMENT IS: SUBSTITUTE PATENT REQUEST REGARDING ASSOCIATED DETAILS

FGA Letters patent sealed or granted (standard patent)