US20050231512A1 - Animation of an object using behaviors - Google Patents

Animation of an object using behaviors

Info

Publication number
US20050231512A1
US20050231512A1 (application US10/826,973)
Authority
US
United States
Prior art keywords
behavior
parameter
user
behaviors
text
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/826,973
Inventor
Gregory Niles
Stephen Sheeler
Guido Hucking
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Computer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US10/826,973 (published as US20050231512A1)
Application filed by Apple Computer, Inc.
Assigned to Apple Computer, Inc. (assignors: Guido Hucking, Gregory E. Niles, Stephen M. Sheeler)
Priority to EP05735929A (EP1735754A2)
Priority to PCT/US2005/012735 (WO2005106800A2)
Publication of US20050231512A1
Priority to US11/257,882 (US20060055700A1)
Priority to US11/786,850 (US7932909B2)
Assigned to Apple Inc. (change of name from Apple Computer, Inc.)
Priority to US12/729,912 (US8542238B2)
Priority to US12/729,890 (US8253747B2)
Priority to US13/052,372 (US8300055B2)
Priority to US13/566,571 (US20130113807A1)
Priority to US13/663,435 (US20130265316A1)
Status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/20 3D [Three Dimensional] animation
    • G06T 2200/00 Indexing scheme for image data processing or generation, in general
    • G06T 2200/24 Indexing scheme for image data processing or generation, in general, involving graphical user interfaces [GUIs]
    • G06T 2213/00 Indexing scheme for animation
    • G06T 2213/12 Rule based animation

Definitions

  • This invention relates generally to computer animation and, more specifically, to animating an object using behaviors.
  • animation software has become more user-friendly, enabling a user to interact with objects at a higher level of abstraction.
  • a user may animate an object by applying a “behavior” to the object.
  • a behavior is an animation abstraction and can be thought of as a macro, script, or plugin.
  • the object is animated in a particular way (e.g., by growing or shrinking or by moving in a specific direction).
  • Some examples of animation software that support behaviors are Anark Studio and Macromedia Director MX.
  • although behaviors make it easier to animate objects, software that supports behaviors can still be difficult to use.
  • Many types of behaviors may be applied to one object, and each type of behavior can be customized based on several parameters. Understanding each of these parameters and its effect on the behavior can be confusing. Providing values for all of these parameters can also be time-consuming.
  • Various embodiments of the invention cover various aspects of behaviors and working with behaviors.
  • one embodiment covers behaviors themselves, including animations that can be produced by applying a behavior to an item and the algorithms underlying these animations.
  • Another embodiment covers using behaviors in conjunction with keyframes.
  • Yet another embodiment covers working with behaviors, including setting parameters of behaviors, saving behaviors, and creating new behaviors.
  • Yet another embodiment covers objects to which behaviors may be applied, including, for example, images, text, particle systems, filters, generators, and other behaviors.
  • Yet another embodiment covers dynamic rendering of objects to which behaviors have been applied, including changing an animation in real-time after the value of a behavior parameter has been changed. Yet another embodiment covers hardware acceleration methods that enable users to work effectively with behaviors.
  • FIG. 1 illustrates a behavior in the Layers tab, according to one embodiment of the invention.
  • FIG. 2 illustrates a behavior in the Timeline, according to one embodiment of the invention.
  • FIG. 3 illustrates a behavior in the Behaviors tab of the Inspector, according to one embodiment of the invention.
  • FIG. 4 illustrates a gear icon, according to one embodiment of the invention.
  • FIG. 5 illustrates a gear icon in the filters tab of the Inspector, according to one embodiment of the invention.
  • FIG. 6 illustrates a gear icon in the Keyframe Editor, according to one embodiment of the invention.
  • FIG. 7 illustrates a parameter behavior in the Layers tab, according to one embodiment of the invention.
  • FIG. 8 illustrates a parameter behavior in the Timeline, according to one embodiment of the invention.
  • FIG. 9 illustrates a parameter's pop-up menu, according to one embodiment of the invention.
  • FIG. 10 illustrates an Apply To pop-up menu, according to one embodiment of the invention.
  • FIG. 11 illustrates the controls for the Fade In/Fade Out behavior in the Dashboard, according to one embodiment of the invention.
  • FIG. 12 illustrates the controls for the Fade In/Fade Out behavior in the Behaviors tab, according to one embodiment of the invention.
  • FIG. 13 illustrates the Activate control, the Enable/Disable control, and the Lock control in the Layers tab, according to one embodiment of the invention.
  • FIG. 14 illustrates the Activate control, the Enable/Disable control, and the Lock control in the Timeline, according to one embodiment of the invention.
  • FIG. 15 illustrates an enable/disable behaviors control that has been toggled to disabled, according to one embodiment of the invention.
  • FIG. 16 illustrates a show behaviors control that has been toggled to show, according to one embodiment of the invention.
  • FIG. 17 illustrates a behavior that has been selected in the Layers tab, according to one embodiment of the invention.
  • FIG. 18 illustrates a behavior that is being dragged to another object in the Layers tab, according to one embodiment of the invention.
  • FIG. 19 illustrates an object with multiple behaviors in the Timeline, according to one embodiment of the invention.
  • FIG. 20 illustrates an object with multiple behaviors in the Layers tab, according to one embodiment of the invention.
  • FIG. 21 illustrates a behavior being dragged and a position indicator, according to one embodiment of the invention.
  • FIG. 22 illustrates an object with a behavior in the Timeline, according to one embodiment of the invention.
  • FIG. 23 illustrates a behavior being trimmed in the Timeline and a tooltip, according to one embodiment of the invention.
  • FIG. 24 illustrates a behavior being moved in the Timeline and a tooltip, according to one embodiment of the invention.
  • FIG. 25 illustrates a behavior after it has been moved in the Timeline, according to one embodiment of the invention.
  • FIG. 26 illustrates a behavior-driven motion path in the Canvas, according to one embodiment of the invention.
  • FIG. 27 illustrates a keyframed motion path in the Canvas, according to one embodiment of the invention.
  • FIG. 28 illustrates a behavior-driven and keyframed motion path in the Canvas, according to one embodiment of the invention.
  • FIG. 29 illustrates a parameter with an oscillate behavior applied to it in the Keyframe Editor, according to one embodiment of the invention.
  • FIG. 30 illustrates a parameter with an oscillate behavior and keyframes applied to it in the Keyframe Editor, according to one embodiment of the invention.
  • FIG. 31 illustrates the parameter of FIG. 30 but with one keyframe lowered, according to one embodiment of the invention.
  • FIG. 32 illustrates a parameter with a behavior curve and a keyframed curve in the Keyframe Editor, according to one embodiment of the invention.
  • FIG. 33 illustrates a parameter with a “final animation curve” in the Keyframe Editor, according to one embodiment of the invention.
  • FIG. 34 illustrates an object with an Orbit Around behavior applied, creating a regular orbit (a circular motion path 340 ), according to one embodiment of the invention.
  • FIG. 35 illustrates the same object as in FIG. 34 , but with a Ramp behavior applied to the Orbit Around behavior's Drag parameter as described above, creating a spiral motion path 340 , according to one embodiment of the invention.
  • FIG. 36 illustrates an object with an Orbit Around behavior applied, creating a regular orbit (a circular motion path), according to one embodiment of the invention.
  • FIG. 37 illustrates the same object as in FIG. 36 , but with keyframes applied to the Orbit Around behavior's Drag parameter as described above, creating a different motion path, according to one embodiment of the invention.
  • FIG. 38 illustrates a Dashboard for a Fade In/Fade Out behavior, according to one embodiment of the invention.
  • FIG. 39 illustrates a Dashboard for a Grow/Shrink behavior, according to one embodiment of the invention.
  • FIG. 40 illustrates a Motion Path behavior, including curves, applied to an object, according to one embodiment of the invention.
  • FIG. 41 illustrates an object moving along a motion path, according to one embodiment of the invention.
  • FIG. 42 illustrates the same object as in FIG. 41 , but also with a Snap Alignment to Motion behavior applied to the object, according to one embodiment of the invention.
  • FIG. 43 illustrates a Dashboard for a Spin behavior, according to one embodiment of the invention.
  • FIG. 44 illustrates a Dashboard for a Throw behavior, according to one embodiment of the invention.
  • FIG. 45 illustrates a motion path behavior applied to an object, according to one embodiment of the invention.
  • FIG. 46 illustrates a motion path behavior applied to an object, and a Negate behavior applied to the object's Position parameter, according to one embodiment of the invention.
  • FIG. 47 illustrates a Dashboard for an Oscillate behavior, according to one embodiment of the invention.
  • FIG. 48 illustrates two objects (an attracting object and an attracted object) and a motion path 480 of the latter object, according to one embodiment of the invention.
  • FIG. 49 illustrates one object and an edge collision motion path 490 , according to one embodiment of the invention.
  • FIG. 50 illustrates an object and a gravity motion path 500 , according to one embodiment of the invention.
  • FIG. 51 illustrates a first object orbiting around a second object and an orbit motion path 510 of the first object, according to one embodiment of the invention.
  • FIG. 52 illustrates a Dashboard of an Orbit Around behavior, according to one embodiment of the invention.
  • FIG. 53 illustrates an object and a Random Motion motion path, according to one embodiment of the invention.
  • FIG. 54 illustrates an Orbit Around behavior applied to an object and the object's motion path, according to one embodiment of the invention.
  • FIG. 55 illustrates both an Orbit Around behavior and a Random Motion behavior applied to an object and the object's motion path, according to one embodiment of the invention.
  • FIG. 56 illustrates a Dashboard for a Random Motion behavior, according to one embodiment of the invention.
  • FIG. 57 illustrates several objects, according to one embodiment of the invention.
  • FIG. 58 illustrates the same objects as in FIG. 57 after the Repel behavior has been applied to the central object, according to one embodiment of the invention.
  • FIG. 59 illustrates a Dashboard of a Wind behavior, according to one embodiment of the invention.
  • FIG. 60 illustrates two graphic objects, according to one embodiment of the invention.
  • FIG. 61 illustrates a pop-up menu showing Basic Motion>Motion Path, according to one embodiment of the invention.
  • FIG. 62 illustrates the top object's motion path, according to one embodiment of the invention.
  • FIG. 63 illustrates the bottom object's motion path, according to one embodiment of the invention.
  • FIG. 64 illustrates a Dashboard for the Motion Path behavior showing the Speed parameter as Ease Out, according to one embodiment of the invention.
  • FIG. 65 illustrates a small text object, according to one embodiment of the invention.
  • FIG. 66 illustrates the text object of FIG. 65 with a new anchor point location, according to one embodiment of the invention.
  • FIG. 67 illustrates the Increment pop-up menu of the Grow/Shrink behavior in the Behaviors tab of the Inspector, according to one embodiment of the invention.
  • FIG. 68 illustrates the text object and the Grow/Shrink Dashboard, according to one embodiment of the invention.
  • FIG. 69 illustrates the Fade In/Fade Out Dashboard, according to one embodiment of the invention.
  • FIG. 70 illustrates the composition at the first frame, according to one embodiment of the invention.
  • FIG. 71 illustrates the composition at a middle frame, according to one embodiment of the invention.
  • FIG. 72 illustrates the composition at the last frame, according to one embodiment of the invention.
  • FIG. 73 illustrates one example of a particle system, according to one embodiment of the invention.
  • FIG. 74 illustrates another example of a particle system, according to one embodiment of the invention.
  • FIG. 75 illustrates yet another example of a particle system, according to one embodiment of the invention.
  • FIG. 76 illustrates an example of a cell, according to one embodiment of the invention.
  • FIG. 77 illustrates an example of a particle system based on the cell of FIG. 76 , according to one embodiment of the invention.
  • FIG. 78 illustrates an example of a particle system based on one cell, according to one embodiment of the invention.
  • FIG. 79 illustrates an example of a particle system based on multiple cells 760 A, 760 B, according to one embodiment of the invention.
  • FIG. 80 illustrates an example of a Project pane showing an emitter that is based on two cells, according to one embodiment of the invention.
  • FIG. 81 illustrates an example of a Timeline showing an emitter that is based on two cells, according to one embodiment of the invention.
  • FIG. 82 illustrates an example of a particle system based on an emitter, according to one embodiment of the invention.
  • FIG. 83 illustrates another example of a particle system based on the same emitter as in FIG. 82 , according to one embodiment of the invention.
  • FIG. 84 illustrates yet another example of a particle system based on the same emitter as in FIGS. 82 and 83 , according to one embodiment of the invention.
  • FIG. 85 illustrates an example of an object, according to one embodiment of the invention.
  • FIG. 86 illustrates an example of a particle system of bubbles along with the object of FIG. 85 , according to one embodiment of the invention.
  • FIG. 87 illustrates another example of a particle system of bubbles along with the object of FIG. 85 , according to one embodiment of the invention.
  • FIG. 88 illustrates an example of a particle system including an emitter and individual particles based on the emitter, according to one embodiment of the invention.
  • FIG. 89 illustrates a simple white circular gradient, according to one embodiment of the invention.
  • FIG. 90 illustrates an Emitter button, according to one embodiment of the invention.
  • FIG. 91 illustrates a new emitter, at the first frame of the particle effect, according to one embodiment of the invention.
  • FIG. 92 illustrates an active particle system, such as the emitter of FIG. 91 but at a later frame, according to one embodiment of the invention.
  • FIG. 93 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 94 illustrates the particle system of FIG. 93 after it has been rescaled, according to one embodiment of the invention.
  • FIG. 95 illustrates a Dashboard for a particle system, according to one embodiment of the invention.
  • FIG. 96 illustrates the particle system of FIGS. 91 and 92 in full effect, according to one embodiment of the invention.
  • FIG. 97 illustrates the particle system of FIG. 96 at another point in time, according to one embodiment of the invention.
  • FIG. 98 illustrates the particle system of FIG. 97 after the value of Scale has been reduced, according to one embodiment of the invention.
  • FIGS. 99 and 100 illustrate the Dashboard and the particle system, respectively, before the previously mentioned actions have been performed, according to one embodiment of the invention.
  • FIGS. 101 and 102 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
  • FIGS. 103 and 104 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
  • FIGS. 105 and 106 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
  • FIGS. 107 and 108 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
  • FIGS. 109 and 110 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
  • FIG. 111 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 112 illustrates the particle system of FIG. 111 after the emitter has been moved, according to one embodiment of the invention.
  • FIG. 113 illustrates a particle system where the emitter's position has been animated using a behavior, or keyframed, according to one embodiment of the invention.
  • FIG. 114 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 115 illustrates the particle system of FIG. 114 after the emitter's Shear parameter has been modified, according to one embodiment of the invention.
  • FIG. 116 illustrates a particle system in the Timeline that comprises one emitter and three nested cells, according to one embodiment of the invention.
  • FIG. 117 illustrates a particle system with dense white particles emerging from the center, according to one embodiment of the invention.
  • FIG. 118 illustrates the particle system of FIG. 117 with more diffuse orange particles appearing around a larger area, according to one embodiment of the invention.
  • FIG. 119 illustrates the particle system of FIG. 118 with small sparks emerging from underneath both of the previous layers as they fade away, according to one embodiment of the invention.
  • FIG. 120 illustrates an Emitter tab and Emitter parameters, according to one embodiment of the invention.
  • FIG. 121 illustrates an Emitter tab and individual controls for several Emitter parameters, according to one embodiment of the invention.
  • FIG. 122 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 123 illustrates the particle system of FIG. 122 after the value of the Scale parameter in the Emitter tab has been increased, according to one embodiment of the invention.
  • FIG. 124 illustrates a particle system with a Point emitter shape, according to one embodiment of the invention.
  • FIG. 125 illustrates a particle system with a Line emitter shape, according to one embodiment of the invention.
  • FIG. 126 illustrates a particle system with a Circle emitter shape, according to one embodiment of the invention.
  • FIG. 127 illustrates a particle system with a Filled Circle emitter shape, according to one embodiment of the invention.
  • FIG. 128 illustrates a particle system with a Geometry emitter shape, according to one embodiment of the invention.
  • FIG. 129 illustrates the shape that was used as the Geometry emitter shape for the particle system of FIG. 128 , according to one embodiment of the invention.
  • FIG. 130 illustrates a particle system with an Image emitter shape, according to one embodiment of the invention.
  • FIG. 131 illustrates the image that was used as the Image emitter shape for the particle system of FIG. 130 , according to one embodiment of the invention.
  • FIG. 132 illustrates a particle system with a lower birth rate, according to one embodiment of the invention.
  • FIG. 133 illustrates the particle system of FIG. 132 but with a higher birth rate, according to one embodiment of the invention.
  • FIG. 134 illustrates a particle system with a higher initial number, according to one embodiment of the invention.
  • FIG. 135 illustrates the particle system of FIG. 134 but with a lower initial number, according to one embodiment of the invention.
  • FIG. 136 illustrates a particle system with a longer life, according to one embodiment of the invention.
  • FIG. 137 illustrates the particle system of FIG. 136 but with a shorter life, according to one embodiment of the invention.
  • FIG. 138 illustrates a particle system with the Additive Blend parameter turned off, according to one embodiment of the invention.
  • FIG. 139 illustrates a particle system with the Additive Blend parameter turned on, according to one embodiment of the invention.
  • FIG. 140 illustrates a particle system with a Solid Color Mode, according to one embodiment of the invention.
  • FIG. 141 illustrates a particle system with an Over Life Color Mode, according to one embodiment of the invention.
  • FIG. 142 illustrates a particle system with a Range Color Mode, according to one embodiment of the invention.
  • FIG. 143 illustrates a particle system with a Take Image Color Mode, according to one embodiment of the invention.
  • FIG. 144 illustrates a particle system with a larger Scale parameter, according to one embodiment of the invention.
  • FIG. 145 illustrates the particle system of FIG. 144 but with a smaller Scale parameter, according to one embodiment of the invention.
  • FIG. 146 illustrates a particle system with a Point Show Particles As parameter, according to one embodiment of the invention.
  • FIG. 147 illustrates a particle system with a Line Show Particles As parameter, according to one embodiment of the invention.
  • FIG. 148 illustrates a particle system with an Outline Show Particles As parameter, according to one embodiment of the invention.
  • FIG. 149 illustrates a particle system with an Image Show Particles As parameter, according to one embodiment of the invention.
  • FIG. 150 illustrates a Particle Cell tab, according to one embodiment of the invention.
  • FIG. 151 illustrates an object that is being dragged to a position in the Layers tab, according to one embodiment of the invention.
  • FIG. 152 illustrates the object of FIG. 151 , now nested within an emitter, according to one embodiment of the invention.
  • FIG. 153 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 154 illustrates the particle system of FIG. 153 after a Sphere filter has been applied, according to one embodiment of the invention.
  • FIG. 155 illustrates a simple graphic with a premultiplied alpha channel, according to one embodiment of the invention.
  • FIG. 156 illustrates an Emitter button, according to one embodiment of the invention.
  • FIG. 157 illustrates a distributed group of particles that partially fills the Canvas, according to one embodiment of the invention.
  • FIG. 158 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 159 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 160 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 161 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 162 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 163 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 164 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 165 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 166 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 167 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 168 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 169 illustrates one example of a slider, according to one embodiment of the invention.
  • FIG. 170 illustrates one example of a value slider, according to one embodiment of the invention.
  • FIG. 171 illustrates one example of a dial, according to one embodiment of the invention.
  • FIG. 172 illustrates one example of a value field, according to one embodiment of the invention.
  • FIG. 173 illustrates one example of a pop-up menu, according to one embodiment of the invention.
  • FIG. 174 illustrates one example of a value list, according to one embodiment of the invention.
  • FIG. 175 illustrates one example of an activation checkbox, according to one embodiment of the invention.
  • FIG. 176 illustrates one example of a color well, according to one embodiment of the invention.
  • FIG. 177 illustrates one example of a pop-up picker, according to one embodiment of the invention.
  • FIG. 178 illustrates one example of a gradient, according to one embodiment of the invention.
  • FIG. 179 illustrates one example of a drop well, according to one embodiment of the invention.
  • FIG. 180 illustrates one example of a parameter selection field, according to one embodiment of the invention.
  • FIG. 181 illustrates one example of a reset button, according to one embodiment of the invention.
  • FIG. 182 illustrates one example of a manage presets button, according to one embodiment of the invention.
  • FIG. 183 illustrates one example of an animation menu button, according to one embodiment of the invention.
  • FIG. 184 illustrates one example of a shortcut menu filled with Animation related controls, according to one embodiment of the invention.
  • FIG. 185 illustrates one example of a Lock icon, according to one embodiment of the invention.
  • FIG. 186 illustrates one example of a Dashboard, according to one embodiment of the invention.
  • FIG. 187 illustrates one example of a Dashboard title bar displaying a downward facing arrow, according to one embodiment of the invention.
  • FIG. 188 illustrates one example of a pop-up menu that lists all of the possible control sets that can be displayed in the Dashboard for the selected object, according to one embodiment of the invention.
  • FIG. 189 illustrates one example of a Dashboard for a particle system, according to one embodiment of the invention.
  • FIG. 190 illustrates one example of a Dashboard for a Grow/Shrink behavior, according to one embodiment of the invention.
  • FIG. 191 illustrates one example of a Dashboard for a Fade In/Fade Out behavior, according to one embodiment of the invention.
  • FIG. 192 illustrates one example of a Dashboard for a Throw behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 193 illustrates one example of a Dashboard for a Throw behavior where the special control specifies movement in a southeastern direction at a low speed, according to one embodiment of the invention.
  • FIG. 194 illustrates one example of a Dashboard for a Throw behavior where the special control specifies movement in the same direction as in FIG. 193 , but at a higher speed, according to one embodiment of the invention.
  • FIG. 195 illustrates one example of a Dashboard for a Wind behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 196 illustrates one example of a Dashboard for a Wind behavior where the special control specifies movement in a northeastern direction at a high speed, according to one embodiment of the invention.
  • FIG. 197 illustrates one example of a Dashboard for a Spin behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 198 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in a clockwise direction at a low speed, according to one embodiment of the invention.
  • FIG. 199 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in the same direction as in FIG. 198 , but at a higher speed, according to one embodiment of the invention.
  • FIG. 200 illustrates one example of a Dashboard for a Spin behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 201 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in a counterclockwise direction at a low speed, according to one embodiment of the invention.
  • FIG. 202 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in the same direction as in FIG. 201 , but at a much higher speed, according to one embodiment of the invention.
  • FIG. 203 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 204 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies a high grow rate, according to one embodiment of the invention.
  • FIG. 205 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 206 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies a high shrink rate, according to one embodiment of the invention.
  • FIG. 207 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies shrinking in the horizontal direction and simultaneous growing in the vertical direction, according to one embodiment of the invention.
  • FIG. 208 illustrates one example of a Dashboard for a Fade In/Fade Out behavior where the special control specifies a fade in time and a fade out time of equivalent length, according to one embodiment of the invention.
  • FIG. 209 illustrates one example of a Dashboard for a Fade In/Fade Out behavior where the special control specifies a shorter fade in time than in FIG. 208 and no fade out time (i.e., no fade out at all), according to one embodiment of the invention.
  • FIG. 210 illustrates one example of a Dashboard for a Fade In/Fade Out behavior where the special control specifies a similar fade in time to that in FIG. 208 and a longer fade out time than in FIG. 208 , according to one embodiment of the invention.
  • FIG. 211 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in all directions (i.e., there is no specified range) at a medium/high speed, according to one embodiment of the invention.
  • FIG. 212 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in only certain directions (i.e., there is a specified range) and at a medium speed, according to one embodiment of the invention.
  • FIG. 213 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in only certain directions (i.e., there is a specified range, and the range is narrower than the range in FIG. 211 ) and at a low speed, according to one embodiment of the invention.
  • FIG. 214 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in only certain directions (i.e., there is a specified range, and the range is narrower than the range in FIG. 212 ) and at a high speed, according to one embodiment of the invention.
  • the visual representation of an object may be specified by two pieces of information: a source image and a collection of parameters that modify the source image.
  • by modifying the values of these parameters over time, an object can be animated.
  • for example, by modifying size or opacity parameters, an object can appear to grow or shrink, or to fade in or fade out, respectively.
  • the visual representation of an object can also be assigned a position parameter.
  • by modifying the value of this position parameter over time, an object can appear to move.
  • a behavior is an animation abstraction that, when applied to an object, causes the object to be animated in a particular way.
  • a behavior changes the value of a parameter of an object over time, thereby animating the object with respect to that parameter.
  • a “shrink” behavior may cause an object to decrease in size by decreasing the values of the object's length and height parameters.
  • a “throw” behavior may cause an object to move in a specific direction with a specific speed by modifying the object's location on the screen over time.
  • a behavior changes the value of only one parameter of an object over time.
  • a “stretch” behavior may stretch an object by increasing the value of the object's length parameter while not modifying the value of the object's height parameter.
  • a behavior changes the value of more than one parameter of an object over time.
  • the “shrink” behavior mentioned above decreases the values of the object's length and height parameters.
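The bullets above describe behaviors only abstractly; the patent text gives no implementation. As a loose sketch of the idea, assuming a toy object model in which every name, signature, and constant is invented here, the "shrink" and "throw" behaviors can be modeled as functions of time that recompute an object's parameters each frame:

```python
# Illustrative sketch only; not the patent's implementation.
class SceneObject:
    def __init__(self, **params):
        self.base = dict(params)    # parameter values before any behavior
        self.params = dict(params)  # parameter values after behaviors run

def shrink(obj, t, rate=0.8):
    """'Shrink' behavior: decreases length and height over time t (seconds)."""
    obj.params["length"] = obj.base["length"] * rate ** t
    obj.params["height"] = obj.base["height"] * rate ** t

def throw(obj, t, vx=50.0, vy=0.0):
    """'Throw' behavior: moves the object in a specific direction at a speed."""
    obj.params["x"] = obj.base["x"] + vx * t
    obj.params["y"] = obj.base["y"] + vy * t

ball = SceneObject(length=100.0, height=100.0, x=0.0, y=0.0)
for frame in range(0, 90, 30):       # sample three frames on a 30 fps timeline
    t = frame / 30.0
    shrink(ball, t)
    throw(ball, t)
    print(t, ball.params)
```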
  • as described above, a behavior changes the value of a parameter of an object over time.
  • a behavior specifies how it affects a parameter, but it may or may not specify which parameter it affects.
  • when a behavior specifies a particular parameter, the behavior is applied to an object and affects that particular parameter of the object.
  • when a behavior does not specify a particular parameter, the behavior is applied to a parameter of an object (any parameter) and affects that parameter in a particular way.
  • there are two ways in which a behavior may affect the value of a parameter of an object: increasing and decreasing. In another embodiment, however, many more such ways exist.
  • these ways include oscillating, randomizing, and reversing.
  • an “oscillate rotation” behavior might be called “rock.”
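For illustration only, continuing the same toy model: a parameter behavior that does not name its target can be written as a transformer from any parameter's base value and a time to a new value, so that "rock" is simply "oscillate" pointed at a rotation parameter. The formulas and defaults below are invented, not taken from the patent:

```python
import math
import random

def oscillate(base, t, amplitude=10.0, frequency=1.0):
    """Oscillate any parameter around its base value."""
    return base + amplitude * math.sin(2.0 * math.pi * frequency * t)

def randomize(base, t, amount=5.0, fps=30):
    """Add deterministic per-frame noise to any parameter."""
    rng = random.Random(round(t * fps))      # same frame -> same value
    return base + rng.uniform(-amount, amount)

# "rock" = the oscillate behavior applied to a rotation parameter
rotation = 0.0
print(oscillate(rotation, t=0.25, amplitude=15.0, frequency=0.5))
```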
  • a behavior that specifies a particular parameter is applied to an object, while a behavior that does not specify a particular parameter is applied to a parameter of an object.
  • one way to refer to a behavior that specifies a parameter is to indicate which parameters the behavior affects and in what way.
  • a behavior that decreases the brightness of an object may be known as the “decrease brightness” behavior.
  • the “decrease brightness” behavior may be called the “darken” behavior.
  • an “increase length, increase height” behavior may be called the “grow” behavior.
  • descriptive titles, such as “darken” and “grow,” help the user understand how a behavior will animate an object.
  • in order to apply a behavior to an object, where the behavior specifies the parameter to be animated, a user selects a behavior and selects an object to which the behavior should be applied. Note that these two steps may occur in any order.
  • a user selects a behavior or an object by choosing it from a menu.
  • a user selects a behavior or an object by clicking on a visual representation of the behavior or object, such as an icon (for a behavior or an object) or the object itself (for an object).
  • a user applies a behavior to an object by clicking on the behavior and dragging it onto the target object.
  • in order to apply a behavior to a parameter of an object, where the behavior does not specify the parameter to be animated, a user selects a behavior and selects a parameter of an object to which the behavior should be applied. Note that these two steps may occur in any order.
  • a user selects a behavior by choosing it from a menu or by clicking on a visual representation of the behavior, as described above.
  • a user selects a parameter of an object by first selecting an object and then selecting a parameter of the object.
  • a user may select an object by choosing it from a menu or by clicking on a visual representation of the object, as described above.
  • a user may display a list of the object's parameters and select a parameter by clicking on it.
  • a user applies a behavior to a parameter of an object by clicking on the behavior and dragging it onto the target parameter.
  • an object parameter to which a behavior has been applied is identified in the list of parameters of the object.
  • an icon appears near the object parameter to which a behavior has been applied.
  • a behavior may be simultaneously applied to multiple objects or to multiple parameters of an object.
  • instead of selecting one object or one parameter of an object to which the behavior should be applied, the user selects multiple objects or multiple parameters of an object to which the behavior should be applied.
  • a behavior may be removed by deleting it.
  • a behavior's target object or target object parameter may be changed without having to delete the behavior and create a new behavior.
  • the animation caused by a behavior may be customized by specifying a value for one or more parameters associated with the behavior.
  • the “stretch” behavior may have a parameter that indicates how fast the object will stretch (i.e., at what rate the object's length parameter will increase).
  • the “throw” behavior may have a parameter that indicates in which direction the object should move (i.e., how the object's location on the screen should be changed).
  • initially, when a behavior is applied, these parameters have default values. Methods of specifying other values for these parameters are discussed further below.
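As a hedged sketch of this, a behavior can carry its own parameters with default values that the user may override; the parameter names "speed" and "direction" are illustrative assumptions, not the patent's:

```python
import math
from dataclasses import dataclass

@dataclass
class ThrowBehavior:
    speed: float = 100.0      # default pixels per second
    direction: float = 0.0    # default angle in degrees

    def offset(self, t):
        """Positional offset contributed at time t."""
        radians = math.radians(self.direction)
        return (self.speed * math.cos(radians) * t,
                self.speed * math.sin(radians) * t)

default_throw = ThrowBehavior()                        # default values apply
custom_throw = ThrowBehavior(speed=250.0, direction=45.0)
print(default_throw.offset(1.0), custom_throw.offset(1.0))
```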
  • behaviors exist independently of the objects to which they are applied. In one embodiment, this means that behaviors are reusable—the same behavior can be applied to two different objects to animate the objects in a similar way.
  • the user may select a behavior from a group of pre-defined behaviors (a “behavior library”). In yet another embodiment, these behaviors may be, for example, the most useful behaviors or the most difficult behaviors to implement.
  • a behavior in the library is saved and may be re-used in the future.
  • the user creates behaviors to add to this library.
  • these behaviors may be created by assigning values to a behavior's parameters or by specifying a particular parameter of an object to be affected (e.g., where the behavior previously did not specify an object parameter).
  • a user creates a behavior from scratch or combines multiple behaviors into one behavior.
  • two behaviors can be combined to form one new behavior.
  • two behaviors may be applied to the same object but still retain their independent nature.
  • any number of behaviors may be applied to one object at the same time.
  • each behavior would affect the object at the same time.
  • the object may be animated in a different way depending on the order in which the behaviors were applied.
  • a keyframe is a visual representation of an object at a particular point in time.
  • a user can specify how the visual representation of an object changes over time.
  • since the representation of an object may change drastically between keyframes, simply showing a number of keyframes in succession would result in jerky transitions.
  • new visual representations must be calculated that fall between keyframes in time and that are similar to surrounding keyframes. In one embodiment, this is known as “inbetweening.”
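The patent does not specify an interpolation scheme; a minimal linear inbetweening sketch, assuming keyframes are (time, value) pairs sorted by time, might look like this:

```python
def inbetween(keyframes, t):
    """Linearly interpolate a parameter between the surrounding keyframes."""
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            u = (t - t0) / (t1 - t0)   # fraction of the way to the next key
            return v0 + u * (v1 - v0)

opacity_keys = [(0.0, 0.0), (1.0, 100.0), (3.0, 100.0), (4.0, 0.0)]
print(inbetween(opacity_keys, 0.5))    # 50.0, halfway through the fade-in
```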
  • applying a behavior to an object does not add keyframes to an object or to its parameters.
  • instead, a behavior generates a range of values for a parameter of an object and then sets the parameter to these values over time, thereby animating the object.
  • the range of values applied to an object's parameters is controlled by the behavior's parameters.
  • keyframes apply specific values to an object's parameters.
  • when two or more keyframes with different values are applied to a parameter, that parameter is animated from the value in the first keyframe to the value in the last keyframe.
  • if the value of one keyframe is changed, the other keyframes, and thus the values that they specify, are not modified.
  • a user can use Behaviors to animate objects using simple, graphical controls.
  • using Behaviors, a user can create simple motion effects or complex simulated interactions between multiple objects quickly and easily.
  • a user can add behaviors to objects or properties in a project to create animated effects without needing to create or adjust keyframes.
  • drag a behavior onto an object and the object is automatically animated based on the type of behavior applied.
  • a user can customize a Behavior's parameters in the Dashboard, or in the Behaviors tab of the Inspector, to change its effect.
  • behaviors are designed to be flexible, and can be combined with one another to create all kinds of effects.
  • motion graphics design becomes interactive, allowing a user to create complex motion effects and simulated object interactions very quickly.
  • behaviors can also be used to animate nearly any individual object, particle system emitter, filter, and generator parameter. In one embodiment, this allows a user to quickly create animated backgrounds, dynamic filter effects, and incredibly complex particle systems, all using a few simple controls.
  • Behaviors vs. Keyframes: In one embodiment, it's important to understand that behaviors do not add keyframes to the objects or parameters to which they're applied. In one embodiment, instead, behaviors automatically generate a range of values that are then applied to an object's parameters, animating it over the duration of that behavior. In another embodiment, changing the parameters of a behavior alters the range of values that behavior generates.
  • keyframes apply specific values directly to a parameter.
  • when a user creates two or more keyframes with different values on a parameter in the Keyframe Editor, that parameter is animated from the first keyframed value to the last.
  • if a user changes the value of a single keyframe, it has no effect on any other keyframes applied to the same parameter.
  • behaviors are most useful for creating generalized, ongoing motion effects.
  • behaviors are also extremely useful for creating animated effects that might be too complex or time-consuming to keyframe manually.
  • keyframing, in turn, may be more useful for creating specific animated effects where the parameter a user is adjusting is required to hit a specific value at a specific time.
  • available behaviors appear in the Library tab.
  • selecting the Behaviors category in the category pane reveals the four behavior subcategories.
  • selecting a subcategory reveals behaviors of that type in the Library Stack pane.
  • when a user selects a behavior in the Library Stack, a short description of it appears to the right of the Preview window.
  • how a user applies a behavior depends on what kind of behavior it is. In one embodiment, some behaviors are applied directly to objects in the Canvas, while others must be applied specifically to individual object parameters in the Inspector.
  • a user applies these behaviors directly to objects in the Canvas, Layers tab, or Timeline.
  • these behaviors automatically animate specific parameters of the object to which they're applied.
  • for example, the Throw behavior only affects an object's Position parameter, while the Grow/Shrink behavior only affects an object's Scale parameter.
  • to apply a behavior to an object in a project, a user can drag the behavior onto the object in the Canvas, Layers tab, or Timeline.
  • a user can also apply behaviors directly to Layers in the Layers tab or Timeline.
  • behaviors applied to a Layer affect all objects nested within that layer as if they were a single object.
  • the object parameters affected by that behavior are automatically animated based on the behavior's default settings.
  • for example, if a user applies the Gravity behavior to an object in the Canvas and then plays the project, that object's position is animated and it moves down, according to the Gravity behavior's default setting.
  • FIG. 1 illustrates a behavior in the Layers tab, according to one embodiment of the invention.
  • FIG. 2 illustrates a behavior in the Timeline, according to one embodiment of the invention.
  • FIG. 3 illustrates a behavior in the Behaviors tab of the Inspector, according to one embodiment of the invention.
  • a gear icon 40 also appears to the right of the layer or object name in the Layers tab or Timeline. In one embodiment, clicking this icon enables and disables all behaviors that have been applied to that layer or object.
  • FIG. 4 illustrates a gear icon, according to one embodiment of the invention.
  • when a user opens the Keyframe Editor and looks at a parameter that's affected by one or more behaviors, he'll see a background curve that represents the behavior's effect in addition to that parameter's keyframe curve. In one embodiment, this curve is uneditable and is there to display the behavior's effect on that parameter.
  • parameter behaviors are applied differently than the other types of behaviors.
  • parameter behaviors can be applied to any of an object's parameters.
  • this also includes the parameters of filters, emitters and cells in particle systems, and other behaviors that have been applied to an object.
  • a parameter behavior's effect on an object depends on the parameter to which it is applied. In one embodiment, for example, if a user applies the Randomize parameter behavior to an object's Position parameter, that object drifts around the screen when the project is played. In another embodiment, applying the Randomize Parameter behavior to an object's Scale parameter, instead, makes the object randomly grow and shrink.
  • when a user saves a parameter behavior as a favorite, the parameter to which it was applied is saved along with the rest of that behavior's settings. In one embodiment, as a result, it can be applied like any other behavior and that parameter is automatically affected.
  • when a Parameter behavior has been applied to an object in a project, a gear icon 40 appears in the keyframe menu to the right of the affected parameter in the Properties, Behaviors, or Filters tab where it's applied. In one embodiment, this shows a user that a parameter behavior is influencing that parameter. In another embodiment, a gear icon also appears in the keyframe menu of each affected parameter in the Keyframe Editor.
  • FIG. 5 illustrates a gear icon in the filters tab of the Inspector, according to one embodiment of the invention.
  • FIG. 6 illustrates a gear icon in the Keyframe Editor, according to one embodiment of the invention.
  • parameter behaviors 10 appear nested underneath the objects to which they're applied in the Layers tab and the Timeline, along with any other behaviors that have been applied to that object.
  • FIG. 7 illustrates a parameter behavior in the Layers tab, according to one embodiment of the invention.
  • FIG. 8 illustrates a parameter behavior in the Timeline, according to one embodiment of the invention.
  • opening a parameter's keyframe menu reveals the names of all the Parameter behaviors 10 currently applied to that parameter. In one embodiment, choosing one automatically opens that object's Behaviors tab.
  • FIG. 9 illustrates a parameter's pop-up menu, according to one embodiment of the invention.
  • the parameter assignment pop-up displays all of the Properties available for the object that behavior has been applied to.
  • those parameters also appear within submenus of the Apply To pop-up menu.
  • each behavior has a subset of parameters that appear in the Dashboard.
  • all controls for behaviors appear in the Behaviors tab of the Inspector.
  • both the Dashboard and the Behaviors tab reference the same parameters, so changing a parameter in one automatically changes the same parameter in the other.
  • the parameters that appear in the Dashboard 110 are the most essential ones for modifying that behavior's effect.
  • the controls available in a behavior's Dashboard are also more descriptive and easier to use than those in the Behaviors tab 18 , although the Behaviors tab may contain more controls.
  • FIG. 11 illustrates the controls for the Fade In/Fade Out behavior in the Dashboard, according to one embodiment of the invention.
  • FIG. 12 illustrates the controls for the Fade In/Fade Out behavior in the Behaviors tab, according to one embodiment of the invention.
  • the controls in the Dashboard consolidate all of the parameters available in the Behaviors tab into a single, graphical control. In one embodiment, there are times, however, when it may be more desirable to use a behavior's individual parameters to finesse the effect a user is trying to achieve with greater detail.
  • to switch among all behaviors applied to an object in the Dashboard, click the disclosure triangle next to the name at the top of the Dashboard to open a pop-up menu that displays all of the behaviors, filters, and masks that are applied to that object. In one embodiment, choose a behavior from this list to display its parameters in the Dashboard.
  • the Behaviors tab displays every behavior that's applied to the selected object. In one embodiment, a disclosure triangle to the left of each behavior's name reveals all of that behavior's parameters underneath. In another embodiment, unlike the Dashboard, the Behaviors tab displays every parameter a behavior has.
  • this section describes how to enable, rename, lock, duplicate, move, and reorganize behaviors in a project. In one embodiment, these procedures apply to every type of behavior.
  • when a user applies a behavior to an object, the behavior appears in three different places: the Layers tab 14, the Timeline 16, and the Behaviors tab 18 of the Inspector.
  • the Behaviors tab in the Inspector contains all of the editable parameters for a behavior that's been applied to an object, while the Layers tab and Timeline have three basic controls for each behavior: Activate 130, Enable/Disable 132, and Lock 134.
  • FIG. 13 illustrates the Activate control, the Enable/Disable control, and the Lock control in the Layers tab, according to one embodiment of the invention.
  • FIG. 14 illustrates the Activate control, the Enable/Disable control, and the Lock control in the Timeline, according to one embodiment of the invention.
  • the Activate control is a checkbox that turns each individual behavior on or off. In one embodiment, behaviors that are turned off are not rendered.
  • Name: In one embodiment, a user can double-click in the Name field to rename the behavior.
  • Lock: In one embodiment, click the lock control to lock or unlock a behavior. In one embodiment, a user cannot modify the parameters of a locked behavior.
  • the enable/disable behaviors control 150 is a gear icon that appears to the right of the name of each object with one or more behaviors applied to it. In one embodiment, clicking this icon toggles all behaviors applied to that object on and off.
  • FIG. 15 illustrates an enable/disable behaviors control that has been toggled to disabled, according to one embodiment of the invention.
  • the show behaviors control 160 is a button at the bottom of the Layers tab and Timeline that lets a user show or hide all behaviors. This button neither enables nor disables behaviors that have been applied to objects in a project; it only controls their visibility.
  • FIG. 16 illustrates a show behaviors control that has been toggled to show, according to one embodiment of the invention.
  • behaviors can be cut, copied, and pasted like any other object.
  • when a user cuts or copies a behavior in the Timeline or Layers tab, he also copies the current state of all that behavior's parameters.
  • a user can also move a behavior 10 from one object 12 to another in the Timeline or Layers tab simply by dragging it.
  • to move a behavior from one object to another in the Timeline or Layers tab, drag a parameter behavior from one object and drop it on top of another.
  • a user moves a parameter behavior 10 to another object 12 , it is applied to whichever parameter it affected in the previous object.
  • FIG. 17 illustrates a behavior that has been selected in the Layers tab, according to one embodiment of the invention.
  • FIG. 18 illustrates a behavior that is being dragged to another object in the Layers tab, according to one embodiment of the invention.
  • a user can also duplicate a behavior in place.
  • a user can also duplicate a behavior and apply the duplicate to another object in the Timeline or Layers tab.
  • when a user duplicates an object, he also duplicates all behaviors that have been applied to the object. In one embodiment, this way, if the user is creating a project with a number of objects that all need to use the same behavior, the user can simply apply that behavior to the first instance of that object, and then duplicate that object as many times as necessary.
  • FIG. 19 illustrates an object with multiple behaviors in the Timeline, according to one embodiment of the invention.
  • FIG. 20 illustrates an object with multiple behaviors in the Layers tab, according to one embodiment of the invention.
  • since each behavior applies a value to a specific parameter, the values generated by all behaviors that affect the same parameter are added together to create the end result.
  • for example, applying the Throw, Spin, and Gravity behaviors to a single object results in the Throw and Gravity behaviors jointly affecting the position of the object, and the Spin behavior affecting its rotation.
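Under this additive model, each behavior can be pictured as contributing a per-parameter offset, with offsets on the same parameter summed. The following sketch is illustrative only; the function names and constants are invented:

```python
def throw_offset(t):
    return {"x": 100.0 * t}                    # affects position only

def gravity_offset(t):
    return {"y": -0.5 * 980.0 * t ** 2}        # also affects position

def spin_offset(t):
    return {"rotation": 90.0 * t}              # affects rotation only

def evaluate(behaviors, base, t):
    """Sum every behavior's offsets into the object's base parameter values."""
    out = dict(base)
    for behavior in behaviors:
        for param, offset in behavior(t).items():
            out[param] = out.get(param, 0.0) + offset
    return out

base = {"x": 0.0, "y": 0.0, "rotation": 0.0}
print(evaluate([throw_offset, gravity_offset, spin_offset], base, t=1.0))
```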
  • a user when a user applies a number of behaviors to a single object, they all appear nested beneath that object in the Timeline and Layers tab.
  • a user can change the order in which the behaviors are applied.
  • because the effects of most behaviors on a parameter are additive, this is useful more as an organizational tool than as a way to change the animated effect created by the behaviors a user has added to an object.
  • one exception is the Stop behavior, which suspends the activity of all behaviors appearing beneath it while ignoring any behaviors above it.
  • a user can change a behavior's timing to control when it starts, how long it lasts, and when it stops. In one embodiment, there are several ways of accomplishing this.
  • the user can use the Stop parameter behavior to suspend one or more behaviors' effects on a single parameter.
  • a user can also trim each behavior in the Timeline.
  • a user can change a parameter behavior's Start Offset parameter to delay its beginning, and its End Offset parameter to end the behavior prior to the end of the object to which it is applied.
  • the easiest way of controlling behavior timing is to use the Stop parameter behavior.
  • the Stop behavior halts the animation occurring in any parameter, whether the animation is due to keyframes in the Keyframe Editor, or behaviors that have been applied to that object.
  • when a user applies a Stop behavior to an object, its position in the Layers tab and Timeline affects which of the other behaviors applied to the same object are stopped. In one embodiment, animation caused by all behaviors appearing underneath the Stop behavior that affect the same parameter is suspended. In another embodiment, behaviors appearing above the Stop behavior are not affected.
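A minimal sketch of that ordering rule, assuming behaviors are stored top to bottom in the stacking order shown in the Layers tab; all names are illustrative.

```python
# Hypothetical sketch: the stack lists behaviors top to bottom, as in the
# Layers tab. A Stop entry suspends behaviors beneath it that affect the
# same parameter; behaviors above it are left untouched.

def active_behaviors(stack, stopped_param):
    stop_seen = False
    result = []
    for name, param in stack:
        if name == "Stop" and param == stopped_param:
            stop_seen = True            # everything below is now gated
            continue
        if stop_seen and param == stopped_param:
            continue                    # suspended: beneath the Stop behavior
        result.append((name, param))
    return result

stack = [("Spin", "rotation"),          # above the Stop: unaffected
         ("Stop", "position"),
         ("Throw", "position"),         # beneath the Stop: suspended
         ("Gravity", "position")]       # beneath the Stop: suspended
print(active_behaviors(stack, "position"))  # -> [('Spin', 'rotation')]
```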
  • the duration of the behavior 10 in the Timeline 16 defaults to the duration of the object 12 to which the behavior 10 has been applied.
  • FIG. 22 illustrates an object with a behavior in the Timeline, according to one embodiment of the invention.
  • a behavior's duration can be modified to limit the duration of its effect. In one embodiment, for example, if a user applies the Spin behavior to an object, by default that object spins around for its duration. In another embodiment, if a user trims the out point of the Spin behavior, the spinning stops at the new position of the out point.
  • trimming the out point of a behavior usually resets the object to its original state.
  • using the Stop behavior to pause the object's animation is a better method than trimming its out point.
  • another way to stop a behavior's effect and leave the affected object in the transformed state is to adjust a behavior's Start and End Offset parameters.
  • the Spin and Throw behaviors leave the object at the transformed state after the last frame of the trimmed behavior for the object's remaining duration.
  • in addition to changing a behavior's 10 duration, a user can also slip its position in the Timeline 16 relative to the object 12 it is nested under. In one embodiment, this lets the user set the frame at which that behavior 10 begins to take effect.
  • parameter behaviors have two additional parameters, Start Offset and End Offset. In one embodiment, these parameters are used to change the frame where a parameter behavior's effect begins and ends.
  • the Start Offset parameter has a slider that lets a user delay the beginning of the behavior's effect, relative to the first frame of its position in the Timeline. In another embodiment, a user can adjust this parameter to make the parameter behavior start later.
  • the End Offset parameter, in turn, lets a user offset the end of the behavior's effect relative to the last frame of its position in the Timeline. In one embodiment, using this slider to stop the effect, instead of trimming the end of the behavior in the Timeline, has the result of freezing the behavior's effect on the object for its remaining duration.
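A minimal sketch of the Start Offset / End Offset semantics just described, assuming a behavior is evaluated as a function of its local frame; per the text, an End Offset freezes the effect rather than resetting it. The function names are hypothetical.

```python
# Hypothetical sketch of offsets on a parameter behavior: the effect starts
# late by start_offset frames and, per the text, freezes (rather than
# resets) once the end offset is reached.

def offset_effect(effect, frame, duration, start_offset=0, end_offset=0):
    first = start_offset
    last = duration - 1 - end_offset
    if frame < first:
        return 0.0                       # effect has not begun yet
    if frame > last:
        return effect(last - first)      # frozen at the last active value
    return effect(frame - first)

ramp = lambda f: 2.0 * f                 # some per-frame effect
# A 100-frame behavior, delayed 10 frames and ending 20 frames early:
for f in (5, 10, 79, 95):
    print(f, offset_effect(ramp, f, duration=100, start_offset=10, end_offset=20))
```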
  • any object can have both behaviors and keyframes applied to it simultaneously.
  • the values generated by the behavior and the keyframed values that are applied to the parameter itself are added together to yield the final value for that parameter.
  • this lets a user combine the automatic convenience of behaviors with the direct control of keyframing to achieve his final result.
  • FIG. 26 illustrates a behavior-driven motion path in the Canvas, according to one embodiment of the invention.
  • if the user turns off the Random Motion behavior temporarily and adds keyframes to the Motion parameter of the same object 12, he can create a completely predictable and smooth motion path 270.
  • FIG. 27 illustrates a keyframed motion path in the Canvas, according to one embodiment of the invention.
  • a user can combine the two by turning the Random Motion behavior back on, with the end result being a motion path 280 that follows the general direction he wants, but that has enough random variation in it to make it interesting.
  • FIG. 28 illustrates a behavior-driven and keyframed motion path in the Canvas, according to one embodiment of the invention.
  • this example shows how a user can combine behaviors and keyframes to create motion paths.
  • a user can combine behaviors and keyframes for any parameter.
  • FIG. 29 illustrates a parameter with an oscillate behavior applied to it in the Keyframe Editor, according to one embodiment of the invention.
  • a user can keyframe 300 a parameter 290 either before or after applying a behavior 10 to the object 12 that affects that parameter 290 .
  • the value of the keyframed curve 296 is added to the value generated by the behavior at each frame. In another embodiment, this has the result of either raising or lowering the resulting value displayed by the background curve 294 .
  • the background curve 294 doesn't just display the behavior's animated values; it displays the sum of all values affecting that parameter 290.
  • FIG. 30 illustrates a parameter with an oscillate behavior and keyframes applied to it in the Keyframe Editor, according to one embodiment of the invention.
  • raising or lowering a keyframe 300 in the Keyframe Editor 292 also raises or lowers the background curve 294 , since it's adding to or subtracting from the values generated by the behavior 10 .
  • FIG. 31 illustrates the parameter of FIG. 30 but with one keyframe lowered, according to one embodiment of the invention.
  • when a user combines keyframes 300 with multiple behaviors 10, the results can appear unpredictable, depending on the combination of behaviors that are applied.
  • the user has the option of converting the behaviors 10 that are applied to any parameter 290 into keyframes 300 .
  • converting behaviors 10 that have already been combined with keyframes 300 turns the sum of all behaviors 10 and keyframes 300 affecting that parameter 290 into a thinned series of keyframes 300 .
  • this results in a final animation curve 330 that closely replicates the shape of the background curve 294 that appeared in the Keyframe Editor 292 .
  • these keyframes 300 can then be edited directly in the Keyframe Editor 292 .
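The baking step can be sketched as sampling the summed background curve and then thinning the samples; the greedy segment-fitting scheme below is one illustrative way to thin keyframes, not necessarily the method the patent contemplates.

```python
import math

# Illustrative thinning: extend a straight segment from the last kept
# keyframe; once any intermediate sample deviates by more than the
# tolerance, commit a keyframe just before the deviation.

def bake(curve, frames, tolerance=0.5):
    keys = [(0, curve(0))]
    for f in range(2, frames):
        f0, v0 = keys[-1]
        slope = (curve(f) - v0) / (f - f0)
        if any(abs(curve(g) - (v0 + slope * (g - f0))) > tolerance
               for g in range(f0 + 1, f)):
            keys.append((f - 1, curve(f - 1)))
    keys.append((frames - 1, curve(frames - 1)))
    return keys

summed = lambda f: 20 * math.sin(f / 8.0) + 0.5 * f   # behaviors + keyframes
print(len(bake(summed, frames=120)), "keyframes kept out of 120 samples")
```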
  • FIG. 32 illustrates a parameter with a behavior curve and a keyframed curve in the Keyframe Editor, according to one embodiment of the invention.
  • FIG. 33 illustrates a parameter with a “final animation curve” in the Keyframe Editor, according to one embodiment of the invention.
  • a user can animate any behavior's parameters in order to change the parameter's effect over time.
  • a user can animate behavior parameters using parameter behaviors, or by keyframing the parameters in the Keyframe Editor.
  • a user can animate a behavior's 10 parameter by applying a parameter behavior 10 .
  • a user could apply the Ramp behavior 10 to an Orbit Around behavior's 10 Drag parameter and adjust the Start and End values to increase from 0 to 8 over time. In another embodiment, this results in the orbit of the object 12 slowly decaying, causing the object 12 to fall towards the center of the orbit.
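A minimal sketch of that decaying orbit, assuming a Ramp drives the Drag parameter from 0 to 8 over the behavior's duration; the orbit and drag formulas are illustrative stand-ins for the real simulation.

```python
import math

def ramp(start, end, frame, duration):
    # Linear Ramp from Start Value to End Value over the behavior's duration.
    return start + (end - start) * frame / (duration - 1)

def decaying_orbit(radius, frames):
    points = []
    for f in range(frames):
        drag = ramp(0.0, 8.0, f, frames)                    # Ramp drives Drag
        r = radius * math.exp(-drag * f / (frames * 10.0))  # drag shrinks radius
        angle = 2 * math.pi * f / 30                        # steady angular motion
        points.append((round(r * math.cos(angle), 1),
                       round(r * math.sin(angle), 1)))
    return points

print(decaying_orbit(radius=100.0, frames=60)[::15])        # spiraling inward
```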
  • FIG. 34 illustrates an object with an Orbit Around behavior applied, creating a regular orbit (a circular motion path 340 ), according to one embodiment of the invention.
  • FIG. 35 illustrates the same object as in FIG. 34 , but with a Ramp behavior applied to the Orbit Around behavior's Drag parameter as described above, creating a spiral motion path 340 , according to one embodiment of the invention.
  • FIG. 36 illustrates an object with an Orbit Around behavior applied, creating a regular orbit (a circular motion path), according to one embodiment of the invention.
  • FIG. 37 illustrates the same object as in FIG. 36 , but with keyframes applied to the Orbit Around behavior's Drag parameter as described above, creating a different motion path, according to one embodiment of the invention.
  • a user can “bake” all the behaviors that have been applied to an object into keyframes using the Convert to Keyframes command, in the Object menu.
  • all behaviors that are applied to that object are converted to keyframes, which are applied to the individual parameters the behaviors originally affected.
  • a user cannot selectively convert individual behaviors.
  • the Convert to Keyframes command converts all behaviors that are applied to an object at once.
  • if a user customizes a behavior and would like to save it for future use, he can drag it to the Favorites folder of the Library.
  • once a behavior has been placed into the Library, it can be applied to objects like any other behavior in the Library.
  • each customized behavior a user drags into the Library is saved as a separate file. In one embodiment, if a user has created one or more custom behaviors that he relies upon, he may want to move them to other computers he uses.
  • this section explains the options that are available for each behavior, presented by category.
  • the Target Object is the object to which the behavior is applied.
  • basic Motion behaviors animate specific parameters of the object to which they are applied. In one embodiment, some basic motion behaviors affect position, while others affect scale or rotation.
  • the Fade In/Fade Out behavior affects an object's Opacity parameter. In one embodiment, the Fade In/Fade Out behavior lets a user dissolve into and out of any object. In one embodiment, the Fade In/Fade Out behavior affects the opacity of the object to which it is applied, fading from 0 percent opacity to 100 percent opacity at the beginning of the clip, and then back to 0 percent opacity at the end. In one embodiment, a user can eliminate the fade in or out by setting the duration of either the fade in or fade out to 0 frames.
  • this behavior is useful for introducing and removing images being animated in the middle of a project.
  • a user could apply the Fade In/Fade Out behavior to text objects moving slowly across the screen to make them fade into existence, and then fade away at the end of their duration.
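A minimal sketch of the opacity ramp described above: 0 to 100 percent over the fade-in time, a hold, and back to 0 percent over the fade-out time, with a 0-frame duration eliminating that fade. The function and parameter names are hypothetical.

```python
def fade_opacity(frame, duration, fade_in=20, fade_out=20):
    if fade_in > 0 and frame < fade_in:
        return 100.0 * frame / fade_in                    # ramp up from 0%
    if fade_out > 0 and frame > duration - 1 - fade_out:
        return 100.0 * (duration - 1 - frame) / fade_out  # ramp back down
    return 100.0                                          # fully visible

for f in (0, 10, 50, 95, 99):
    print(f, fade_opacity(f, duration=100))
```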
  • the Dashboard 110 lets a user control the Fade In and Fade Out durations, equivalent to the Fade In Time and Fade Out Time parameters. In one embodiment, drag anywhere within the shaded area of the Fade In or the Fade Out ramps 380 to adjust their durations. In another embodiment, the user can extend the durations of the Fade In or Fade Out past the limits of the graphical dashboard control.
  • FIG. 38 illustrates a Dashboard for a Fade In/Fade Out behavior, according to one embodiment of the invention.
  • Parameters in the Inspector In one embodiment, the following parameters for the Fade In/Fade Out behavior are available in the Inspector:
  • the Grow/Shrink behavior affects an object's Scale parameter. In one embodiment, use the Grow/Shrink behavior to animate the scale of an object, enlarging or reducing the object's size over time at a speed defined by the Scale Rate. In another embodiment, the Grow/Shrink effect always begins at the object's original size at the first frame of the behavior.
  • the Grow/Shrink behavior is a good behavior to use with high-resolution graphics to zoom into an image, such as a map or photograph.
  • a user can also combine this behavior with the Throw or Wind behavior to pan across the image while zooming into it.
  • the Grow/Shrink behavior can also be used to emphasize or de-emphasize images in a project.
  • a user can enlarge objects to make them the center of attention, or shrink an object while introducing another object to move the viewer's eye to the new element.
  • the Grow/Shrink Dashboard 110 consists of two rectangular regions.
  • the first 390 is a rectangle with a dotted line that represents the original size of the object.
  • the second 392 is a solid rectangle that represents the target size, and can be resized by dragging any of the borders.
  • a slider 394 to the right lets a user adjust the scale of the Dashboard controls, increasing or decreasing the effect the controls have over the object.
  • FIG. 39 illustrates a Dashboard for a Grow/Shrink behavior, according to one embodiment of the invention.
  • Parameters in the Inspector In one embodiment, the following parameters for the Grow/Shrink behavior are available in the Inspector:
  • the Motion Path behavior 10 affects an object's 12 position parameter.
  • the Motion Path behavior lets a user create a motion path 400 for an object 12 to follow.
  • when a user first applies the Motion Path behavior to an object 12, it defaults to a straight path 400 defined by two points at the beginning 410A and end 410B of the motion path 400.
  • the first point 410 A on the path is the position of the object 12 in the Canvas at the first frame of the behavior.
  • a user can double-click or Option-click anywhere on the path to add bezier points 410 C to the path, which allow the user to reshape the motion path by creating curves.
  • FIG. 40 illustrates a Motion Path behavior, including curves, applied to an object, according to one embodiment of the invention.
  • upon playback, the object moves along the assigned path.
  • the speed at which the target object travels is defined by the duration of the behavior, minus the End Offset parameter.
  • the Speed parameter lets a user create acceleration and deceleration at the beginning and end of the behavior.
  • the Motion Path behavior is an easy way to create predictable motion without having to make keyframes for it in the Keyframe Editor.
  • the Motion Path Dashboard lets a user set the Speed parameter using a pop-up menu, with options for Linear, Ease In, Ease Out, or Both.
  • the motion path a user creates in the Canvas can be adjusted by adding points to the default motion path, and using the tangent controls attached to each point to adjust each curve.
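A minimal sketch of the Speed options, assuming the object's progress along the path is a normalized 0-to-1 value that gets remapped before the path is sampled; the specific easing curves below are illustrative choices.

```python
def progress(frame, duration, speed="Linear"):
    t = frame / (duration - 1)               # normalized progress, 0 to 1
    if speed == "Ease In":
        return t * t                         # accelerate away from the start
    if speed == "Ease Out":
        return 1 - (1 - t) * (1 - t)         # decelerate into the end
    if speed == "Both":
        return t * t * (3 - 2 * t)           # smoothstep: ease both ends
    return t                                 # Linear

for f in (0, 25, 50, 75, 99):
    print(f, round(progress(f, 100, speed="Both"), 3))
```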
  • Parameters in the Inspector In one embodiment, the following parameters for the Motion Path behavior are available in the Inspector:
  • behaviors related to Motion Path include Gravity, Random Motion, Throw, and Wind.
  • the Snap Alignment to Motion behavior affects an object's 12 Rotation parameter.
  • the Snap Alignment to Motion behavior aligns the rotation of an object 12 to match all changes made to its position along a motion path 410 .
  • this behavior is meant to be combined with behaviors that animate the position of an object 12 , or with a keyframed motion path 410 a user creates himself.
  • FIG. 41 illustrates an object moving along a motion path, according to one embodiment of the invention.
  • FIG. 42 illustrates the same object as in FIG. 41 , but also with a Snap Alignment to Motion behavior applied to the object, according to one embodiment of the invention.
  • the Snap Alignment to Motion Dashboard has a pop-up menu to control the Axis used to adjust the object's alignment and a checkbox to let the user invert the Axis.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Snap Alignment to Motion behavior in the Inspector:
  • the Axis parameter is set by a pop-up menu that lets a user specify whether the object aligns itself on its Horizontal or Vertical axis.
  • Invert Axis In one embodiment, if the object is aligning on the correct axis, but appears backwards, the Invert Axis checkbox flips the object so that it is facing the proper direction.
  • behaviors related to Snap Alignment to Motion include Align to Motion.
  • the Spin behavior affects an object's Rotation parameter.
  • apply the Spin behavior to animate the rotation of an object, spinning it either clockwise or counter-clockwise.
  • if a user trims the end of the Spin behavior to be shorter than the duration of the object to which it is applied, the object remains at the angle of the last frame of the behavior.
  • uses for Spin are fairly obvious, but another way to use the Spin behavior is with objects that have an off-center anchor point.
  • objects rotate about the anchor point; if a user changes an object's anchor point before he applies a Spin behavior to it, he can quickly change the look of the motion he creates.
  • the Spin behavior's Dashboard 110 control is a ring 430 .
  • drag anywhere within the ring to manipulate an arrow 432 that indicates the direction the object spins.
  • adjust the length of the arrow 432 to change the speed at which the spinning will occur.
  • FIG. 43 illustrates a Dashboard for a Spin behavior, according to one embodiment of the invention.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Spin behavior in the Inspector:
  • the Throw behavior affects an object's position parameter and is the simplest way of setting an object in motion.
  • the Throw behavior controls let a user adjust the speed and direction of a single force that is exerted on the object at the first frame of the behavior.
  • the object continues drifting in a straight line, and at the same speed, for the duration of the Throw behavior.
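A minimal sketch of Throw's constant drift: a velocity fixed at the first frame, integrated every frame thereafter. The names are hypothetical.

```python
def throw_position(start, velocity, frame):
    # One force at the first frame, then straight-line drift at constant speed.
    x0, y0 = start
    vx, vy = velocity
    return (x0 + vx * frame, y0 + vy * frame)

print(throw_position(start=(0, 0), velocity=(3.0, 1.5), frame=30))  # (90.0, 45.0)
```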
  • a simple example of the Throw behavior in use is to send a series of offscreen text objects moving across the screen.
  • combined with other behaviors such as Grow/Shrink and Fade In/Fade Out, a user can create sophisticated moving titles without keyframing a single parameter.
  • the Throw behavior does not apply a continuous force, nor can a user create gradual changes in direction or speed using this behavior alone.
  • keyframed changes to the Throw behavior are instantly applied at the frame they appear, resulting in abrupt motion.
  • the Throw behavior is useful when the user is moving an object through a simulation, for example, a project in which he has arranged a number of other objects with attract or repel behaviors applied to them.
  • since the Throw behavior only applies a single force to move the target object at the initial frame of the behavior, any other behaviors that interact with the target object will have greater influence over its motion.
  • if a user wants to apply a continuous force to an object, use the Wind behavior.
  • if a user needs a more complex motion path, use the Motion Path behavior.
  • the Throw behavior's Dashboard 110 lets a user specify the direction and speed of the throw behavior by dragging an arrow 440 within a circular region 442 .
  • the direction of the arrow 440 defines the direction of movement, and the length of the arrow 440 defines the speed.
  • a slider 444 to the right lets the user adjust the scale of the Dashboard control, increasing or decreasing the effect the control has over the object 12 .
  • the maximum speed a user can define with the Dashboard is not the maximum possible speed. In one embodiment, higher values can be entered into the Rate or Final Value parameter in the Behaviors tab of the Inspector.
  • FIG. 44 illustrates a Dashboard for a Throw behavior, according to one embodiment of the invention.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Throw behavior in the Inspector:
  • behaviors related to Throw include Motion Path, Gravity, Random Motion, and Wind.
  • parameter behaviors can be applied to any object parameter, and their effects are limited to just that parameter.
  • the same parameter behavior can be added to different parameters, resulting in completely different effects.
  • a user can apply the oscillate behavior to the opacity of an object to make the object fade in and out, or he can apply it to the rotation of an object to make the object rock back and forth.
  • a user can also apply parameter behaviors to filter parameters, Generator parameters, the parameters of particle systems, or even the parameters of other behaviors.
  • examples of parameter behaviors include Oscillate, Randomize, and Reverse.
  • the Average behavior smooths the transition from one value to another caused by keyframes and behaviors that are applied to a parameter.
  • use the Average behavior to smooth out animated effects.
  • averaged motion moves more fluidly, while averaged changes to parameters such as Opacity and to filter parameters appear to happen more gradually.
  • use the Window Size parameter to adjust the amount by which to smooth the affected parameter.
  • the Average behavior can be used to smooth out the sequence of values generated by a Randomize behavior.
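A minimal sketch of the Average behavior as a sliding-window mean, with Window Size controlling the amount of smoothing; here it smooths a noisy sequence like one a Randomize behavior might produce.

```python
import random

def average(values, window_size):
    out = []
    for i in range(len(values)):
        lo = max(0, i - window_size + 1)     # trailing window of recent values
        window = values[lo:i + 1]
        out.append(sum(window) / len(window))
    return out

random.seed(1)
noisy = [random.uniform(-10, 10) for _ in range(8)]    # e.g., Randomize output
print([round(v, 1) for v in average(noisy, window_size=4)])
```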
  • the Average behavior's Dashboard lets the user adjust the Window Size parameter.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Average behavior in the Inspector:
  • behaviors related to Average include Negate and Reverse.
  • the Custom behavior allows a user to create his own custom behaviors.
  • the Negate behavior 10 inverts the value of each keyframe 300 and behavior effect in the parameter to which it is applied.
  • the Negate behavior basically flips each parameter value to its opposite.
  • motion paths 450 are flipped, rotation is reversed, and any effect's parameter will be changed to its opposite.
  • applying the Negate behavior to the Position parameter of an object 12 with a Motion Path behavior applied results in the motion path 450 being flipped.
  • FIG. 45 illustrates a motion path behavior applied to an object, according to one embodiment of the invention.
  • FIG. 46 illustrates a motion path behavior applied to an object, and a Negate behavior applied to the object's Position parameter, according to one embodiment of the invention.
  • Dashboard Control In one embodiment, there are no Dashboard controls for the Negate behavior.
  • behaviors related to Negate include Average and Reverse.
  • the Oscillate behavior animates a parameter by cycling the parameter between two different values.
  • a user can customize how widely apart the high and low values are, as well as the number of oscillations per second.
  • the Oscillate behavior can create all kinds of cyclical effects.
  • if a user applies the Oscillate behavior to an object's Rotation parameter, the object will begin to rock back and forth. In another embodiment, this happens because the rotation property cycles back and forth between the initial rotation value plus and minus the Amplitude value that is set in the Oscillate behavior.
  • applying the Oscillate behavior to the X value of the scale parameter instead causes the width of the object to cycle, and it repeatedly stretches and compresses for the duration of the behavior.
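A minimal sketch of the Oscillate cycling, assuming a sinusoidal mapping (the text specifies only that the value cycles between the initial value plus and minus the Amplitude, at a number of oscillations per second).

```python
import math

def oscillate(base, amplitude, speed, phase, frame, fps=30.0):
    # speed = oscillations per second; phase shifts the starting point.
    t = frame / fps
    return base + amplitude * math.sin(2 * math.pi * (speed * t + phase))

# Rotation rocking between -45 and +45 degrees, half an oscillation per second:
for f in range(0, 61, 15):
    print(f, round(oscillate(base=0.0, amplitude=45.0, speed=0.5,
                             phase=0.0, frame=f), 1))
```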
  • the Oscillate Dashboard 110 lets a user adjust the Phase 470 , Amplitude 472 , and Speed 474 of the Oscillate behavior.
  • FIG. 47 illustrates a Dashboard for an Oscillate behavior, according to one embodiment of the invention.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Oscillate behavior in the Inspector:
  • behaviors related to Oscillate include Ramp and Rate.
  • the Ramp behavior lets a user create a gradual transition, in any parameter, from the Start Value to the End Value.
  • the speed of the transition is defined by the length of the Ramp behavior in the Timeline.
  • additional parameters allow a user to define how the transition occurs, whether it is at a single continuous speed, or whether it accelerates over time.
  • ramp is a versatile behavior.
  • if a user applies the Ramp behavior to the Scale property, it works like the Grow/Shrink behavior.
  • if a user applies it to the opacity property, he can fade an object in or out in different ways.
  • while a user can use the Ramp behavior to mimic other behaviors, it can be applied to any parameter he wants.
  • the Ramp Dashboard lets a user adjust the Ramp's Start Value, End Value, and Curvature.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Ramp behavior in the Inspector:
  • behaviors related to Ramp include Oscillate and Rate.
  • the Randomize behavior creates a continuous sequence of randomly increasing and decreasing values, based on the parameters defining the range and type of values that are generated.
  • although the values created with this behavior appear random, they're actually predetermined by the parameter settings chosen by the user.
  • as long as the user doesn't change the parameters, the frame-by-frame values created by this behavior remain the same.
  • if a user doesn't like the values that were randomly generated, click the Generate button in the Behavior tab in the Inspector to pick a new random seed number. In one embodiment, this number is used to generate a new sequence of values.
  • the Apply Mode parameter determines how values generated by this behavior are combined with other behaviors and keyframes that affect the same parameter. In one embodiment, this provides a user with different ways of using a Randomize behavior to modify a parameter's preexisting values. In another embodiment, the Randomize behavior is useful for creating jittery effects, such as twitchy rotation, flickering opacity, and other effects requiring rapid and varied changes over time that would be time-consuming to keyframe. In yet another embodiment, the Randomize behavior can be modified with other behaviors, such as Average and Negate, to exercise further control over the values being generated.
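A minimal sketch of why the sequence is repeatable: values are derived from a stored seed, so they stay identical until a new seed is generated. The seed-mixing scheme is an illustrative assumption.

```python
import random

def randomize_value(seed, frame, amount=10.0):
    rng = random.Random(seed * 1_000_003 + frame)   # deterministic per frame
    return rng.uniform(-amount, amount)

seed = 42
print([round(randomize_value(seed, f), 2) for f in range(4)])   # same every run
print([round(randomize_value(seed, f), 2) for f in range(4)])   # identical
new_seed = random.randrange(1 << 30)          # "Generate" picks a new seed
print([round(randomize_value(new_seed, f), 2) for f in range(4)])  # new sequence
```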
  • the Randomize Dashboard has controls for Amount, Frequency, Wriggle Offset, Noisiness, Link, Start Offset, and End Offset.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Randomize behavior in the Inspector:
  • the Rate behavior increases a parameter's value over time, with the rate of increase determined by the Rate slider. In one embodiment, to use the Rate behavior to decrease a parameter over time, apply the Negate behavior after it.
  • Rate Dashboard Control In one embodiment, the Rate Dashboard has controls for Rate and Curvature.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Rate behavior in the Inspector:
  • behaviors related to Rate include Oscillate and Ramp.
  • the Reverse behavior reverses the direction of any animation affecting a parameter, whether the animation is caused by behaviors or keyframes.
  • in some instances, the Reverse and Negate behaviors have the same effect. In another embodiment, in other instances, their effects are very different.
  • applying the Negate behavior flips an object's motion path, while applying the Reverse behavior leaves the motion path alone, reversing the object's motion instead.
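A minimal sketch of that contrast on a sampled animation curve: Negate flips each value to its opposite, while Reverse plays the same values in the opposite order.

```python
def negate(values):
    return [-v for v in values]          # flip each value to its opposite

def reverse(values):
    return list(reversed(values))        # play the same values backwards

curve = [0, 10, 25, 45, 70]              # a parameter rising over five frames
print(negate(curve))                     # [0, -10, -25, -45, -70]
print(reverse(curve))                    # [70, 45, 25, 10, 0]
```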
  • Dashboard Control In one embodiment, there are no Dashboard controls for the Reverse behavior.
  • Parameters in the Inspector In one embodiment, several parameters are available for the Reverse behavior in the Inspector.
  • behaviors related to Reverse include Average and Negate.
  • the Stop behavior suspends the animation of all behaviors that:
  • Dashboard Control In one embodiment, there is no Dashboard control for the Stop behavior.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Stop behavior in the Inspector:
  • the Wriggle behavior works similarly to the Randomize behavior, but with a slower effect.
  • the Wriggle Dashboard has controls for Amount, Frequency, Wriggle Offset, Noisiness, Link, Start Offset, and End Offset.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Wriggle behavior in the Inspector:
  • the End Offset parameter is set by a slider that lets a user offset the end of the behavior's effect relative to the last frame of its position in the Timeline, in frames. In one embodiment, adjust this parameter to make the behavior stop before the actual end of the behavior in the Timeline. In another embodiment, using this slider to stop the effect, instead of trimming the end of the behavior in the Timeline, freezes the last random value generated by this behavior for the remaining duration of the object. In yet another embodiment, trimming the end of the behavior resets the parameter to its original value.
  • behaviors related to Wriggle include Random Motion and Randomize.
  • simulation behaviors perform one of two tasks.
  • some simulation behaviors, such as Gravity, animate the parameters of the object to which they are applied.
  • other simulation behaviors, such as Attractor and Repel, affect the parameters of objects surrounding the object to which they are applied.
  • these behaviors allow a user to create some very sophisticated interactions among multiple objects in a project with a minimum of adjustments.
  • simulation behaviors also affect specific object parameters.
  • examples of simulation behaviors include Attractor, Gravity, and Repel.
  • the Align To Motion behavior affects an object's Rotation parameter.
  • the Align To Motion behavior changes the rotation of an object to match changes made to the object's direction along a motion path. In one embodiment, this behavior is meant to be combined with behaviors that animate the position of an object, or with a keyframed motion path created by a user.
  • the Align To Motion behavior has a springy quality, creating a more lively effect than Snap Alignment to Motion.
  • if a user has a graphic of a rocket to which he has applied a Motion Path behavior, he can add the Align To Motion behavior to make the rocket point in the direction it is moving.
  • by adjusting the Drag parameter, he can make it careen wildly about its anchor point as it goes around turns in the motion path.
  • the Align to Motion Dashboard has controls for Axis, Invert Axis, Spring Tension, and Drag.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Align to Motion behavior in the Inspector:
  • behaviors related to Align To Motion include Snap Alignment to Motion.
  • the Attracted To behavior is part of a group of simulation behaviors that let a user create complex animated relationships between two or more objects. In one embodiment, these behaviors are extremely powerful, and allow complicated effects to be created with a minimum of steps.
  • the Attracted To behavior affects an object's Position parameter.
  • an object with the Attracted To behavior moves towards a single specified object, the object of attraction (the “attracting object” 12 B).
  • additional parameters allow a user to adjust the area of influence that defines how close an object 12 A needs to be to move towards the object of attraction 12 B, and how strongly it is attracted.
  • FIG. 48 illustrates two objects (an attracting object and an attracted object) and a motion path 480 of the latter object, according to one embodiment of the invention.
  • the Drag parameter lets a user define whether attracted objects overshoot and bounce around the attracting object, or whether they eventually slow down and stop at the position of the attracting object.
  • a user can apply two or more Attracted To behaviors to a single object, each with a different object of attraction, to create tug-of-war situations where the object bounces among all the objects it is attracted to.
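A minimal sketch of Attracted To as a per-frame pull toward the attracting object, with Influence as the active radius, Strength scaling the pull, and Drag damping the overshoot; the force law itself is an assumption for illustration.

```python
import math

def attracted_to_step(pos, vel, target, strength=0.5, influence=500.0, drag=0.1):
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if 0 < dist <= influence:            # pull only within the influence radius
        vel = (vel[0] + strength * dx / dist,
               vel[1] + strength * dy / dist)
    vel = (vel[0] * (1 - drag), vel[1] * (1 - drag))   # drag damps the bouncing
    return (pos[0] + vel[0], pos[1] + vel[1]), vel

pos, vel = (100.0, 0.0), (0.0, 0.0)
for _ in range(5):
    pos, vel = attracted_to_step(pos, vel, target=(0.0, 0.0))
print(tuple(round(c, 1) for c in pos))   # drifting toward the attracting object
```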
  • the Attracted To Dashboard has an image well that the user can use to assign an object of attraction, as well as controls for Strength, Falloff Type, Falloff Rate, Influence, and Drag.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Attracted To behavior in the Inspector:
  • behaviors related to Attracted To include Attractor, Drift Attracted To, Drift Attractor, Orbit Around, Spring, and Vortex.
  • the Attractor behavior affects other objects' Position parameters. In one embodiment, the Attractor behavior is the opposite of the Attracted To behavior. In one embodiment, if a user applies an Attractor behavior to an object, other objects that lie within the area of influence move toward it. In another embodiment, a user can manipulate the strength with which other objects are attracted, as well as the distance required for attraction to begin.
  • the Drag parameter lets a user adjust this behavior, changing whether attracted objects overshoot and bounce around, or whether they eventually slow down and stop at the position of the target object.
  • the Attractor behavior can affect all objects in the Canvas that fall within the area of attraction, or a user can limit the Attractor behavior's effect to a specific group of objects, using the Affect parameter.
  • the Attractor behavior can also be applied to objects in motion. In one embodiment, if a user animates the position of the Target object to which he has applied the Attractor behavior, all other objects in the Canvas continue to be attracted to the Target object's new position.
  • Attractor Dashboard has controls for Affect, Strength, Falloff Type, Falloff Rate, Influence, and Drag.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Attractor behavior in the Inspector:
  • behaviors related to Attractor include Attracted To, Drift Attracted To, Drift Attractor, Orbit Around, Spring, and Vortex.
  • the Drag behavior affects an object's Position parameter.
  • the Drag behavior lets a user simulate the force of friction on a moving object, slowing it down over time until it eventually comes to a stop.
  • applying the Drag behavior is an easy way to decelerate objects with multiple behaviors that create complex motion.
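A minimal sketch of Drag as per-frame friction that removes a fraction of the velocity each frame until the object coasts to a stop; Amount plays the role of the behavior's drag setting.

```python
def apply_drag(velocity, amount, frames):
    positions, x = [], 0.0
    for _ in range(frames):
        x += velocity
        velocity *= (1.0 - amount)   # friction removes a fraction each frame
        positions.append(round(x, 2))
    return positions

print(apply_drag(velocity=10.0, amount=0.2, frames=8))   # motion tapering off
```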
  • Drag Dashboard Control In one embodiment, the Drag Dashboard lets a user adjust the Amount of drag.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Drag behavior in the Inspector:
  • behaviors related to Drag include Rotational Drag.
  • the Drift Attracted To behavior affects an object's Position parameter.
  • the Drift Attracted To behavior is similar to the Attracted To behavior, but by default an object moves towards the object of attraction and comes to rest, rather than overshooting the object of attraction and bouncing around.
  • the Drift Attracted To Dashboard has an image well that the user can use to assign an object of attraction, as well as sliders for Strength and Drag.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Drift Attracted To behavior in the Inspector:
  • behaviors related to Drift Attracted To include Attracted To, Attractor, Drift Attractor, Orbit Around, Spring, and Vortex.
  • the Drift Attractor behavior affects other objects' Position parameters.
  • the Drift Attractor behavior is similar to the Attractor behavior, but by default objects within the area of influence move towards the object of attraction and come to rest, rather than overshooting the object of attraction and bouncing around.
  • the Drift Attractor Dashboard has controls for Affect, Strength, and Drag.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Drift Attractor behavior in the Inspector:
  • behaviors related to Drift Attractor include Attracted To, Attractor, Drift Attracted To, Orbit Around, Spring, and Vortex.
  • the Edge Collision behavior affects an object's 12 Position parameter.
  • the Edge Collision behavior is a good behavior to use to set up complex motion simulations where objects 12 should not exit the Canvas.
  • objects 12 with the edge collision behavior applied either come to a stop, or bounce off after colliding with the edge of the Canvas frame.
  • FIG. 49 illustrates one object and an edge collision motion path 490 , according to one embodiment of the invention.
  • the angle at which the object bounces depends on the angle at which it hits the edge of the frame, while the speed it travels after bouncing is set by the Bounce Strength parameter.
  • the Edge Collision behavior uses only the rectangular edges of the object's bounding box to determine how the object collides with the Canvas edge. In one embodiment, if a user is using this behavior with an object that has an alpha channel that is smaller than its bounding box, adjust the Crop parameter in the object's Properties tab to fit the bounding box as closely as possible to the edge of the image.
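A minimal sketch of the bounce, shown for the horizontal axis only: when the bounding box crosses a canvas edge, the velocity is reflected and scaled by Bounce Strength. The reflection rule is an illustrative assumption.

```python
def edge_collide_x(x, vx, half_width, canvas_width, bounce_strength=0.8):
    left, right = half_width, canvas_width - half_width
    x += vx
    if x < left or x > right:
        x = max(left, min(x, right))   # clamp the bounding box inside the frame
        vx = -vx * bounce_strength     # reflect and scale the speed
    return x, vx

x, vx = 300.0, 40.0
for _ in range(6):
    x, vx = edge_collide_x(x, vx, half_width=20, canvas_width=400)
    print(round(x, 1), round(vx, 1))
```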
  • the Edge Collision Dashboard has controls for Affect, Bounce Strength, and Active Edges.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Edge Collision behavior in the Inspector:
  • the Gravity behavior affects an object's 12 Position parameter.
  • the Gravity behavior causes an object 12 to fall over time.
  • the gravitational acceleration can be increased or decreased, resulting in a change to the rate of fall.
  • objects 12 affected by the Gravity behavior continue to fall past the bottom edge of the Canvas (unless the Edge Collision behavior has been applied).
  • FIG. 50 illustrates an object and a gravity motion path 500 , according to one embodiment of the invention.
  • the Gravity behavior can be used in conjunction with other behaviors that animate the position of objects to create natural-looking arcs and motion paths that simulate thrown objects falling to the ground.
  • a user can also set the Acceleration parameter to a negative value, effectively applying “anti-gravity” to the object and making it fly up.
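A minimal sketch of Gravity as a constant downward acceleration combined with an initial Throw-style velocity, tracing the arc described above; a negative acceleration makes the object fly up instead.

```python
def trajectory(vx, vy, acceleration, frames):
    # Constant horizontal drift plus a vertical velocity that the
    # Acceleration parameter pulls down (or up, if negative) every frame.
    x = y = 0.0
    points = []
    for _ in range(frames):
        x += vx
        vy -= acceleration
        y += vy
        points.append((round(x, 1), round(y, 1)))
    return points

print(trajectory(vx=4.0, vy=12.0, acceleration=2.0, frames=10))  # rises, then falls
```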
  • the Gravity Dashboard lets a user adjust the Acceleration parameter.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Gravity behavior in the Inspector:
  • behaviors related to Gravity include Motion Path, Random Motion, Throw, and Wind.
  • the Orbit Around behavior affects an object's 12 A Position parameter.
  • the Orbit Around behavior's default parameter settings cause an object 12 A to orbit around another object 12 B in a perfect circle.
  • FIG. 51 illustrates a first object orbiting around a second object and an orbit motion path 510 of the first object, according to one embodiment of the invention.
  • the Orbit Around Dashboard 110 has an image well 520 that a user can use to assign an object 12 of attraction, as well as controls for Strength 522 , Falloff Type 524, Falloff Rate 526 , Influence 527 , Drag 528 , and Direction 529 .
  • FIG. 52 illustrates a Dashboard of an Orbit Around behavior, according to one embodiment of the invention.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Orbit Around behavior in the Inspector:
  • behaviors related to Orbit Around include Attracted To, Attractor, Drift Attracted To, Drift Attractor, Spring, and Vortex.
  • the Random Motion behavior affects an object's 12 Position parameter. In one embodiment, if a user applies the Random Motion behavior to an object 12 , the behavior animates the position of the object, and makes the object move around the Canvas along a random path 530 .
  • FIG. 53 illustrates an object and a Random Motion motion path, according to one embodiment of the invention.
  • although the motion created with this behavior appears random, it is actually predetermined by the particular group of parameters a user has chosen. In one embodiment, as long as the user doesn't change the parameters, the motion path created by this behavior will remain the same. In another embodiment, if the user doesn't like the path that was randomly generated, click the Generate button in either the Dashboard or the Behavior tab in the Inspector to pick a new random seed number. In yet another embodiment, this number is used to generate a new path.
  • the Random Motion behavior is useful for quickly creating varied motion paths for large numbers of objects that a user wants to move at the same time.
  • a user can create an arrangement of ten objects in the canvas and apply the Random Motion behavior to all of them.
  • a user can also use the random motion behavior to add variation to the motion paths 540 created by other behaviors affecting an object's 12 position.
  • adding Random Motion to an object 12 with the Orbit Around behavior results in a more erratic motion path 540 , although the object 12 still orbits as before.
  • FIG. 54 illustrates an Orbit Around behavior applied to an object and the object's motion path, according to one embodiment of the invention.
  • FIG. 55 illustrates both an Orbit Around behavior and a Random Motion behavior applied to an object and the object's motion path, according to one embodiment of the invention.
  • the Random Motion Dashboard 110 has controls for the Amount 560 , Frequency 562 , Noisiness 564 , Drag 566 , and Random Seed 568 parameters.
  • FIG. 56 illustrates a Dashboard for a Random Motion behavior, according to one embodiment of the invention.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Random Motion behavior in the Inspector:
  • behaviors related to Random Motion include Motion Path, Gravity, Throw, and Wind.
  • the Repel behavior affects other objects' Position parameters.
  • the Repel behavior is the opposite of the Attractor behavior, and is part of a group of simulation behaviors that create complex animated relationships between two or more objects.
  • the Repel behavior if a user applies the Repel behavior to an object 12 A, the behavior pushes away all objects 12 B within the area of influence in the Canvas.
  • the strength with which objects 12 B are pushed away can be increased or decreased, as can the distance repelled objects 12 B travel.
  • FIG. 57 illustrates several objects, according to one embodiment of the invention.
  • FIG. 58 illustrates the same objects as in FIG. 57 after the Repel behavior has been applied to the central object, according to one embodiment of the invention.
  • the Repel Dashboard has controls for Strength, Falloff Type, Falloff Rate, Influence, and Drag.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Repel behavior in the Inspector:
  • behaviors related to Repel include Repel From.
  • the Repel From behavior affects an object's Position parameter. In one embodiment, while the Repel behavior pushes other objects away, the Repel From behavior has the opposite effect, making the object it is applied to move away from a selected object in the Canvas.
  • the Repel From Dashboard has an image well that the user can use to assign an object to move away from, as well as controls for Strength, Falloff Type, Falloff Rate, Influence, and Drag.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Repel From behavior in the Inspector:
  • behaviors related to Repel From include Repel.
  • the Rotational Drag behavior affects an object's Rotation parameter.
  • the Rotational Drag behavior is similar to the Drag behavior, except that it affects Rotation instead of Position.
  • rotational drag simulates friction affecting objects that are spinning due to keyframed or behavior-driven changes to the Rotation parameter.
  • by setting higher drag values a user can slow rotational changes to an eventual stop.
  • Rotational Drag Dashboard lets a user control the Amount of drag.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Rotational Drag behavior in the Inspector:
  • behaviors related to Rotational Drag include Drag.
  • the Spring behavior affects an object's Position parameter.
  • the Spring behavior creates a relationship between two objects, so that an object with the Spring behavior applied to it moves back and forth around a second object by a specified distance.
  • the Attract To parameter defines the object that serves as the target and center of the spring behavior.
  • additional parameters let a user adjust the speed of the behavior (Spring Tension) and the acceleration of the object at each change in direction (Relaxed Length).
  • if the Attract To object is stationary, the resulting motion is fairly simple and the springing object moves back and forth in a straight line. In one embodiment, if the Attract To object is in motion, the springing object's motion will be much more complex, changing direction according to the velocity of the Attract To object.
  • the Spring Dashboard contains an image well that lets a user set the Attract To object.
  • the Spring Dashboard contains two sliders that let a user adjust the Spring Tension and Relaxed Length of the Spring effect.
  • a checkbox lets a user turn on the Repel parameter.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Spring behavior in the Inspector:
  • behaviors related to Spring include Attracted To, Attractor, Drift Attracted To, Drift Attractor, Orbit Around, and Vortex.
  • the Vortex behavior affects other objects' Position parameters.
  • the Vortex behavior is the opposite of the Orbit Around behavior.
  • while the Orbit Around behavior causes one object to orbit around another target object, the Vortex behavior exerts a force on all objects surrounding the object to which it is applied.
  • Vortex Dashboard has a pop-up menu that lets a user limit the objects affected by this behavior, as well as controls for Strength, Falloff Type, Falloff Rate, Influence, Drag, and Direction.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Vortex behavior in the Inspector:
  • behaviors related to Vortex include Attracted To, Attractor, Drift Attracted To, Drift Attractor, Orbit Around, and Spring.
  • the Wind behavior affects an object's Position parameter.
  • apply the Wind behavior to an object to animate its position and move it in a specified direction.
  • the velocity specified by the Wind behavior is a continuous force, and its parameters can be keyframed to achieve gradual changes in speed and direction.
  • the Wind behavior is better than the Throw behavior when a user wants to vary the speed of the object being animated.
  • a user can either apply another behavior (such as Randomize or Ramp) or keyframe the Velocity parameter of the Wind behavior to vary the speed and direction at which the object moves.
  • a user cannot make gradual changes in either speed or direction with the Throw behavior.
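A minimal sketch of the difference: Wind integrates a (possibly keyframed) velocity every frame, so the speed can change over time, whereas Throw fixes its velocity once at the first frame.

```python
def wind_positions(velocity_at, frames):
    x, out = 0.0, []
    for f in range(frames):
        x += velocity_at(f)              # velocity may vary frame to frame
        out.append(round(x, 1))
    return out

# A keyframed Velocity ramping from 2 to 9 over 8 frames:
print(wind_positions(lambda f: 2.0 + f, frames=8))
# Throw, by contrast, would use velocity_at(0) for every frame.
```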
  • the Wind Dashboard 110 lets a user specify the direction and speed of the Wind behavior by dragging an arrow 590 within a circular region 592 .
  • the direction of the arrow defines the direction of movement, and the length of the arrow defines speed.
  • a slider 594 to the right lets a user adjust the scale of the Dashboard control, increasing or decreasing the effect the control has over the object.
  • the maximum speed a user can define with the Dashboard 110 is not the maximum speed possible.
  • higher values can be entered into the Throw Velocity or Throw Distance parameter in the Behaviors tab of the Inspector.
  • FIG. 59 illustrates a Dashboard of a Wind behavior, according to one embodiment of the invention.
  • Parameters in the Inspector In one embodiment, the following parameters are available for the Wind behavior in the Inspector:
  • behaviors related to Wind include Motion Path, Gravity, Random Motion, and Throw.
  • the following three examples illustrate different ways that groups of behaviors can be combined to create different effects.
  • multiple behaviors will be used to bring four text objects onscreen to create a title.
  • the first three text objects fly in from the sides, while the last text object zooms out from the center of the screen.
  • this example assumes that the Create Objects At preference in the Project Preferences window is set to Start of Project, so that newly applied behaviors are placed from the beginning of each object all the way through the end.
  • two parameter behaviors will be used to create an animated clock.
  • each part's motion can be created quickly and easily using the Rate and Oscillate behaviors.
  • Behaviors can be applied to text, one of the most essential motion graphics elements.
  • type has become more than words that provide basic information, such as what time to tune into your favorite television program.
  • Type design has become an art form.
  • a title sequence can set the mood of the film it is introducing, a certain combination of typeface and animation style can provide instant recognition of the identity of a broadcast network, or a clever television interstitial can keep a viewer from flipping channels during a commercial break.
  • the animated panther created by Friz Freleng for Blake Edwards' "The Pink Panther" went from movie title to movie and television star, with a design and graphics style that holds up even today, nearly 40 years later.
  • text is added to a project directly in a Canvas.
  • select a Text tool, click in the Canvas, and start typing.
  • text may be added and edited in the Canvas, or in a Text Editor in a Format pane of a Text Inspector.
  • the text may be put on a line or elliptical path that can be animated.
  • when text is created, it becomes a text object.
  • the stacking order of text objects can be changed within a layer, or text objects can be moved to another layer, similar to other types of objects (e.g., video clips, images, paint objects, and shapes).
  • text objects can be easily duplicated or copied from one layer to another.
  • filters, transfer modes, and shadows can be applied to text objects, similar to other types of objects.
  • text objects can be moved, rotated, scaled, and easily animated using Basic Motion or Simulation behaviors (such as Throw or Gravity) or by setting keyframes.
  • unlike other object types, a text object has a special group of behaviors called Text Animation behaviors.
  • text behaviors create text animation by generating a range of values in text parameters specific to titling effects, without setting any keyframes.
  • the Text Tracking behavior can be dragged onto a text object, and the text characters will gracefully spread out across the Canvas over time.
  • using behaviors is an ideal workflow to interactively test different looks and animations.
  • the rate of an applied behavior can be quickly adjusted using a behavior's Dashboard, while the animation updates in the canvas.
  • parameters for a behavior may be accessed in an Inspector.
  • Text Animation behaviors can be converted to keyframes in order to fine tune the animation.
  • using Behaviors is not required to animate text; instead, text can be animated using traditional keyframing or a combination of keyframing and behaviors.
  • although both keyframes and behaviors can be applied to an object, some thought must be given to the desired effect, since this workflow can defeat the purpose of Behaviors, as well as yield unexpected results.
  • the animation can be saved to a Library for use on another text object or a future project.
  • text objects have unique attributes, such as face and outline, and the ability to change fonts or edit the text of an existing, animated text object.
  • text can be created directly in a canvas using a Text tool.
  • behaviors and filters can be applied to a text object.
  • text can be added to a project in a Canvas.
  • a text object is created at the first frame of a project and exists for the duration of the project.
  • for example, if the project is 900 frames long, the duration of the text object is 900 frames.
  • to shorten the duration of a text object, shorten the text object in a Timeline.
  • while text is being entered, the mode is text-entry mode, so pressing S will add an "S" to the text rather than change to the Select tool.
  • a Dashboard for the new text object is displayed. In one embodiment, if no Dashboard is present, press D to display the text object Dashboard.
  • a text Dashboard contains some of the most commonly-adjusted text parameters, such as opacity, type family, and color.
  • text parameters in a text Dashboard include:
  • text parameters are located in the Text tab in the Inspector.
  • to display the Text tab of the Inspector, select the text object and click the "i" button on the Dashboard (or press Command+3).
  • the Inspector contains text parameters divided into three tabs: Format, Style, and Paragraph.
  • filters and behaviors are applied to text objects in the same manner as they are applied to other object types. This section provides a quick start to applying Behaviors and Filters to text objects.
  • to apply a behavior or filter to a text object do one of the following:
  • a Dashboard can be displayed for any object.
  • to display a Dashboard, select the object and press H.
  • the Dashboard that is displayed represents the currently selected object.
  • the parameters contained in a Dashboard depend on the type of object that it represents.
  • a text object Dashboard displays text-specific parameters, such as Typeface and Line Spacing.
  • a particle emitter Dashboard displays particle-specific controls, such as Particles per Second and Lifetime.
  • the displayed Dashboard changes to the most recently added effect.
  • the Dashboard name is displayed on the top bar of the Dashboard window.
  • to cycle through the Dashboards for an object press H.
  • the Dashboards cycle in the order that the effects are applied.
  • to jump to a specific Dashboard, click a disclosure triangle next to the Dashboard name and select a Dashboard from the list.
  • any supported font may be used.
  • supported fonts include OpenType, Type1 (or PostScript), TrueType, and LiveType.
  • a Library includes a font browser that allows a user to preview fonts, select fonts, or apply a font to an existing text object.
  • to access the font browser, click the Library tab and then click the Fonts category.
  • fonts can be browsed using the Browse button in the Format panel of the Text Inspector.
  • to use the font browser, do one of the following:
  • to quickly locate a font by its name in the font stack, type the first letter or first few letters of the font name in the browser.
  • click in the font stack (on a font name or thumbnail).
  • In one embodiment, quickly type the first two letters of the font name. If the second letter of the font name is not typed quickly, the selection is reset and jumps to the font whose name begins with the second letter entered.
  • text becomes a text object when created.
  • a text object is like any other object type, with one exception.
  • text object properties can be animated, and behaviors and filters can be applied to text objects, similar to other object types.
  • text-specific parameters can be animated and Text Behaviors can be applied, unlike with other object types.
  • behaviors and filters aside, there are two ways to edit a text object: (1) as an object using the object parameters in the Inspector>Properties tab (or the onscreen controls); and (2) as text using Text parameters in the Inspector>Text tab.
  • This section discusses the tools that can be used with text objects, according to one embodiment of the invention.
  • the following interface tools may be used to edit text:
  • the standard onscreen controls can be used to move and animate the text object in the Canvas.
  • a Text tool is located in a Toolbar above a Canvas.
  • other tools may be used with text objects, such as a Magnify tool, Grab tool, and Selection tool.
  • the Toolbar layout can be customized.
  • a Text tool may be used to create, select, and edit text. In one embodiment, to add text, do one of the following:
  • to select text characters, do one of the following:
  • a Selection Tool may be used to select or deselect one or more objects.
  • click the Selection tool or press Esc to select the text object.
  • the object's Dashboard may be displayed (press H), or the object's Inspector may be displayed (press I).
  • when the Select tool is selected, double-clicking a text object automatically enters text editing mode.
  • Magnify Tool In one embodiment, a Magnify Tool zooms in or out of the canvas. In one embodiment, to zoom in, click the Magnify tool, click in the canvas, and drag to the right. In another embodiment, to zoom out, drag to the left. In yet another embodiment, the zoom is based around the position of the cursor in the canvas.
  • a Grab Tool moves the image within the canvas. In one embodiment, to reposition the canvas, click the Grab tool, click in the canvas, and drag.
  • text controls are located in the Text tab of the Inspector.
  • the Text tab is divided into three panes: Format, Style, and Paragraph.
  • the Format pane contains text basics, such as font, size, and tracking.
  • text characteristics such as face, outline, and blur are controlled in the Style pane.
  • the Paragraph pane contains text layout controls, such as margins and justification.
  • the Text Format panel contains the controls for text basics, such as font, typeface, size, kerning, and character rotation.
  • most of the Format parameters can be animated (keyframed), including the font family.
  • the Animation menu icon appears next to the parameter in the Inspector.
  • the following Format parameters appear in the text Dashboard: Family, Typeface, Size, and Tracking.
  • a Text Editor is an additional tool that allows text to be added and edited in the Inspector rather than the Canvas. In one embodiment, the Text Editor is useful when working with large amounts of text.
  • the Text Editor can also be used to edit text objects in projects.
  • use a Text Style pane to specify the fill of a text object and to adjust its opacity and softness.
  • a text object can be a solid color, an image, or a color gradient.
  • most of the style parameters can be animated.
  • outlines, glows, and drop shadows can be created for a text object in the Style pane.
  • predefined Text Styles may be used in a project.
  • Text Styles use parameters in the Text Style pane to create a specific “look” for a text object.
  • one style is a yellow-to-orange gradient with a soft white outline.
  • these styles are located in a Library.
  • click the Text tab in the Inspector and click Style to show the Text Style panel.
  • Style Pane In one embodiment, there are four main groups of controls in the Style pane: Face, Outline, Glow, and Drop Shadow.
  • a style can be enabled or disabled for a text object. In one embodiment, by default, Outline, Glow, and Drop Shadow are disabled.
  • Text Face controls are used to specify whether the text is a solid color, a texture, or a color gradient.
  • the following Face parameters are available:
  • Changing the Text Color In one embodiment, to change the color of a text object, use the color picker in the text object Dashboard or in the Inspector. In one embodiment, to adjust individual color channels, use the Text Inspector.
  • gradient fills for text objects can be created and animated.
  • the gradient controls for a text object are similar to the gradient controls for a shape or particle object.
  • a gradient preset can be applied to a text object.
  • the gradient presets are located in a Library.
  • a gradient that has been created can be saved to the Library for use in a current project or future projects.
  • a Gradient Editor can be used to change the color, color position, number of colors, opacity, and direction of a gradient.
  • the color and opacity of a gradient can be animated.
  • the following sections assume that a text object is selected, and the Gradient option is selected from the “Fill with” pop-up menu in the Face controls.
  • to change gradient colors:
  • to change the spread of a gradient color click and drag the triangle between the color tags. In one embodiment, the closer the triangle is to a color tag, the sharper the gradient.
  • to add a color to a gradient place the cursor in the lower gradient bar in the position to add the new color, and click.
  • a new color tag is added to the gradient.
  • the color of the new color tag is based on the last selected color in the color picker.
  • although the colors and opacity of a gradient can be animated, the number of color and opacity tags cannot be animated.
  • to remove a color from a gradient click and drag the color tag away from the gradient bar. In one embodiment, the color tag is removed.
  • the controls to move, change the spread of, add, or remove an opacity tag are similar to those of the color tags.
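The color tags and spread triangles described above can be illustrated with a small evaluation function. This is a sketch under assumptions, not the product's gradient code: tags are (position, color) pairs, and each segment has a midpoint in (0, 1) standing in for the triangle between two color tags.

```python
def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def eval_gradient(tags, midpoints, u):
    """Evaluate a gradient at u in [0, 1].

    tags: (position, color) pairs sorted by position.
    midpoints: one value per segment; dragging it toward a color tag
    sharpens the transition on that side, as described above.
    """
    for i in range(len(tags) - 1):
        (p0, c0), (p1, c1) = tags[i], tags[i + 1]
        if p0 <= u <= p1:
            t = (u - p0) / (p1 - p0)
            m = midpoints[i]
            # remap t through the midpoint: t == m maps to 0.5
            t = 0.5 * t / m if t <= m else 0.5 + 0.5 * (t - m) / (1.0 - m)
            return lerp(c0, c1, t)
    return tags[-1][1]

# a yellow-to-orange gradient with the triangle pulled toward the left tag
tags = [(0.0, (1.0, 1.0, 0.0)), (1.0, (1.0, 0.5, 0.0))]
print(eval_gradient(tags, [0.25], 0.25))  # already halfway blended
```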
  • an object (image, clip, or shape) can be used as the fill for a text object with the Texture option in the Face controls of the Inspector.
  • Applying a Texture to a Character vs. Applying a Texture to a Text Object In one embodiment, when an image (or object) is applied as the texture for a text object, the texture is applied to each character in the text object. In one embodiment, to use the image as a continual texture throughout a text object, use the text as a mask.
  • Editing a Texture In one embodiment, the position of a texture that is applied to a text object can be adjusted using Image Offset in the Texture controls. In one embodiment, if the image used as the texture is offset and is cut off in a text object, the edge behavior of the texture can be specified. In another embodiment, if an image sequence is being used, certain frames can be specified to use as the texture.
  • Wrap Mode In one embodiment, use the Wrap Mode controls to specify how the edge of a texture is treated when the texture is offset and appears cut off in the text object.
  • Lock/Unlock In one embodiment, use Lock to use only the frame specified in the Frame field as the texture for all frames of a project. In one embodiment, unlock the Frame field to use the sequence of images as the texture.
  • keyframes can be set for the offset values of the texture source to create a moving element within a text object.
  • an object (image, clip, shape, or layer) that has applied behaviors and filters can be used as the texture source for a text object.
  • the result of the filters is included in the texture source; i.e., the result of the filters can be seen in the texture.
  • if the object has applied, active behaviors or transforms, the behaviors and transforms are ignored, and only the image appears as the texture. In one embodiment, use the following guidelines when using objects as texture sources:
  • if an object has an applied behavior or active transforms (e.g., rotate), the effects of the behavior or transforms are ignored.
  • the Opacity slider or value field in the Dashboard or in the Inspector can be used to adjust the opacity of a text object.
  • because a text object is like objects of other types, its opacity can also be adjusted in the Properties tab.
  • in one embodiment, the changes are multiplicative.
  • if the Opacity of a text object is set in the Text Style parameters to 50 percent, the opacity of the text object is 50 percent.
  • if the Opacity in the Properties tab is then set to 50 percent, the opacity of the text object is 25 percent.
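The multiplicative combination of the two Opacity settings reduces to a single multiplication, as the following minimal sketch shows.

```python
def effective_opacity(style_opacity, properties_opacity):
    """The Opacity in the Text Style pane and the Opacity in the
    Properties tab combine multiplicatively, as described above."""
    return style_opacity * properties_opacity

# 50 percent in the Text Style parameters, 50 percent in Properties
print(effective_opacity(0.5, 0.5))  # -> 0.25, i.e., 25 percent
```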
  • Text Outline Controls In one embodiment, use the Outline controls in the Style pane to create text object outlines.
  • the color, opacity, softness, width, and fill of the outline can be changed.
  • Adding a Text Outline In one embodiment, to create a text outline, enable the Outline parameter in the Style pane of the Text Inspector.
  • Editing Text Object Outlines In one embodiment, use the Outline controls to adjust the opacity or blur of a text outline, to change the width of an outline, or to set and edit the fill of an outline.
  • to change the color of an outline click the color picker and select a color from the Colors window.
  • to adjust the opacity of an outline use the Opacity slider or the value field to change the opacity of the outline.
  • to adjust the blur of a text outline use the Blur slider or the value field to change the blur of the outline.
  • to change the width of a text outline use the Width slider or the value field to change the width of the outline.
  • the Outline fill controls are similar to the controls for the Face parameters.
  • Text Glow Controls In one embodiment, use the Glow controls to create a glow around a text object.
  • Adding a Text Glow In one embodiment, to create a text glow, enable the Glow parameter in the Style pane of the Text Inspector.
  • Editing Text Object Glow In one embodiment, use the Glow controls to adjust the opacity or blur of the text glow, change the size of the glow, or set and edit the fill of a glow.
  • to change the color of the glow click the color picker and select a color from the Colors window.
  • to adjust the opacity of the glow use the Opacity slider or the value field to change the opacity of the glow.
  • to adjust the blur of the glow, use the Blur slider or the value field to change the softness of the glow.
  • to change the width of the glow use the Width slider or the value field to change the size of the glow.
  • the Glow fill controls are similar to the controls for the Face parameters.
  • Creating a Drop Shadow In one embodiment, use the Drop Shadow controls to create a drop shadow on a text object, and to adjust its color, opacity, offset from the text object, softness, and angle.
  • the Shadow parameters include:
  • Adding a Drop Shadow In one embodiment, to create a text drop shadow, enable the Drop Shadow parameter in the Style pane of the Text Inspector.
  • Adjusting the Drop Shadow Parameters In one embodiment, use the Drop Shadow controls to change the color or opacity of the shadow and to adjust the softness of the shadow. In one embodiment, the distance the shadow is offset from the text object, and its angle, may also be adjusted. In another embodiment, the Drop Shadow parameters can be animated.
  • to change the color of the drop shadow click the color box and use the Color window to set a new color.
  • the distance the shadow is offset is represented in pixels.
  • to change the angle of the shadow from the text object click and drag in a circular motion on the Angle dial, or use the value field.
  • the Shadow fill controls are similar to the controls for the Face parameters.
  • the Text Layout pane contains controls for type layout, such as setting margins, alignment, justification, and line spacing.
  • a “typewriter” effect can be created using the Type On parameter in the Layout pane.
  • Text Layout Controls In one embodiment, use the Text Layout controls to specify the general "layout" of text. In one embodiment, these controls include specifying whether the text flows in a single line, in a paragraph with set margins, or on a path.
  • to create a text box do one of the following:
  • Margins In one embodiment, if a user is working with a large amount of text and needs paragraph controls, he can establish margins. In one embodiment, a user can draw a custom text box in the Canvas, or set up margins in the Layout pane of the Text Inspector.
  • the default type layout option is Type.
  • text is entered in one string that extends beyond the Canvas, unless the user manually inserts breaks or returns at the end of his text lines.
  • Drawing Text Margins In one embodiment, use the Text tool to draw a text box in the Canvas. A user can draw a box that extends beyond the edge of the Canvas.
  • when entering text via the Text Editor, a user can set text margins using the Paragraph Layout Method option and the margin controls in the Layout pane.
  • a user can create text on a line or an ellipse.
  • a user can change the shape of a text path, add or remove control points, and animate the text along the path.
  • Text Path Controls In one embodiment, the following Text Path Controls are available:
  • Creating Text on a Path In one embodiment, use the Path options in the Layout pane to create text on a path.
  • Type On In one embodiment, there are two ways to create a type-on text effect: the Type On parameters in the Text Layout controls, or the Type On behavior (in the Text Animation behavior category). In one embodiment, this section discusses using the Type On controls in the Layout pane.
  • text behaviors create animation by applying a range of values to text parameters without creating keyframes.
  • behaviors work like expressions.
  • by dragging a behavior to a text object in the Canvas, Layers List, or Timeline, a user can easily set up a left or right text crawl or scroll, generate random text characters, create a type-on effect, or create a tracking animation.
  • a user can also use the Sequencing behavior to create custom behaviors that animate individual text properties. In one embodiment, for example, the user can select the Scale and Opacity properties and set them to animate through the text characters.
  • text behaviors are applied in the same manner as other behaviors and filters.
  • to apply a text behavior, drag it to an object in the Canvas, Layers List, or Timeline.
  • Text animation behaviors include:
  • a user can apply other behaviors to a text object.
  • a user can create keyframes for text parameters.
  • the following example uses both methods to animate text Tracking and Opacity.
  • some text behaviors automatically animate the text parameters.
  • when the Tracking behavior is applied to a text object, the tracking occurs at the rate specified in the behavior.
  • the user can adjust the rate of the tracking in the behavior parameters.
  • behaviors do not create keyframes.
  • the following example creates text that fades in as the tracking animates.
  • a user can also create this effect using the Fade In/Fade Out behavior (in the Basic Motion behavior category) and the Tracking behavior (in the Text Animation behavior category).
  • keyframing applies very specific values to an object's parameters.
  • if the effect is more general, for example, if the user wants the text to be completely transparent at frame 1, opaque at frames 60-90, and transparent again by frame 120, he should use the Fade In/Fade Out behavior.
  • behaviors generate a range of values that are applied to an object's parameters, animating those parameters over the duration of the behavior.
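To make the contrast concrete, a behavior can be modeled as a function that computes a parameter value procedurally for every frame of its duration, with no stored keyframes. The following sketch of a Fade In/Fade Out-style opacity ramp uses a hypothetical signature and assumes linear fades.

```python
def fade_in_fade_out(frame, start, fade_in_end, fade_out_start, end):
    """Compute opacity procedurally for each frame of the behavior's
    duration; nothing is keyframed."""
    if frame <= start or frame >= end:
        return 0.0
    if frame < fade_in_end:        # ramping up
        return (frame - start) / (fade_in_end - start)
    if frame > fade_out_start:     # ramping down
        return (end - frame) / (end - fade_out_start)
    return 1.0                     # fully opaque

# transparent at frame 1, opaque at frames 60-90, transparent by 120
for f in (1, 30, 60, 75, 90, 105, 120):
    print(f, round(fade_in_fade_out(f, 1, 60, 90, 120), 2))
```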
  • a user can combine keyframing and behaviors on an object.
  • a user can then apply the Tracking behavior to animate the text object tracking, or he can keyframe the tracking parameter.
  • if a keyframe is applied to the text Opacity parameter, and then a Fade In/Fade Out behavior is applied to the text object, unexpected results may occur.
  • a user can use the object onscreen controls (e.g., Shear, Four Corner, Pivot, Scale, and Drop Shadow) to transform a selected text object.
  • the onscreen tools are shortcuts to the object controls in the Inspector>Properties tab.
  • to set specific values, or to fine-tune any of the following transforms, use the Properties tab in the Inspector.
  • the onscreen controls and the Inspector>Properties parameters are applied to the text as an object (such as a clip or image), not as editable text.
  • the controls for editing the text itself are located in the Inspector>Text tab.
  • although some object properties are similar to some text style and format controls, such as the Shear property and the Slant text format, the object properties are independent of the text format controls, and vice versa.
  • if a Slant value of 20 is applied in the Text tab, the slant is applied to each character in the word, simulating italics.
  • if a Shear value of 20 is applied in Inspector>Properties or using the onscreen controls, the shear is applied to the object, not to the individual text characters.
  • the next section briefly describes how to transform a text object using the onscreen controls.
  • to move the text object click in the bounding box and drag the text object.
  • to scale the text object do one of the following:
  • an object may be scaled around its pivot point. In one embodiment, to scale proportionally, press Shift while dragging any of the control points.
  • a user may select a single character in a text object. In one embodiment, a user may select multiple characters in a text object.
  • a user can use a text object as a particle shape.
  • a user can edit the text after the fact.
  • a user can apply a style to a text object.
  • a user can apply a mask to a text object.
  • a user can save a custom text setup.
  • if a user has LiveFonts installed on his system, he can use the LiveType fonts.
  • FIG. 73 illustrates one example of a particle system, according to one embodiment of the invention.
  • FIG. 74 illustrates another example of a particle system, according to one embodiment of the invention.
  • FIG. 75 illustrates yet another example of a particle system, according to one embodiment of the invention.
  • FIG. 76 illustrates an example of a cell, according to one embodiment of the invention.
  • FIG. 77 illustrates an example of a particle system based on the cell of FIG. 76 , according to one embodiment of the invention.
  • the object used as a particle system's cell 760 determines how that particle system looks.
  • Particle systems can contain multiple cells 760 , resulting in the release of several types of particles 770 from a single emitter. Sophisticated particle presets may be constructed in this way.
  • FIG. 78 illustrates an example of a particle system based on one cell, according to one embodiment of the invention.
  • FIG. 79 illustrates an example of a particle system based on multiple cells 760 A, 760 B, according to one embodiment of the invention.
  • a particle system comprises an emitter 800 and one or more cells 760 .
  • a cell 760 is nested inside of the emitter 800 in a Project pane and a Timeline.
  • FIG. 80 illustrates an example of a Project pane showing an emitter that is based on two cells, according to one embodiment of the invention.
  • FIG. 81 illustrates an example of a Timeline showing an emitter that is based on two cells, according to one embodiment of the invention.
  • the emitter and cells have separate sets of parameters that control the particle system's behavior. If a garden hose were a particle system, the nozzle would act as the emitter, while the water would represent the flow of particles. Changing the parameters of the emitter changes the direction and number of particles that are created, while changing the cell's parameters affects each individual particle. By changing a few parameters, it's possible to create very different effects using the same cell.
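The separation between emitter and cell parameters can be sketched as a data structure: the emitter is the nozzle, the nested cells describe the water. The field names below are hypothetical and chosen only to mirror the parameters discussed in this section.

```python
from dataclasses import dataclass, field

@dataclass
class Cell:
    # parameters affecting each individual particle (hypothetical names)
    scale: float = 1.0
    lifetime: float = 100.0   # frames
    speed: float = 100.0      # pixels per frame

@dataclass
class Emitter:
    # parameters affecting the direction and number of particles created
    position: tuple = (0.0, 0.0)
    emission_angle: float = 0.0
    emission_range: float = 360.0
    birth_rate: float = 1.0   # particles per frame
    cells: list = field(default_factory=list)

# one emitter, two nested cells: the same nozzle, two kinds of water
emitter = Emitter(cells=[Cell(scale=1.0), Cell(scale=0.5, lifetime=50.0)])
```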
  • FIG. 82 illustrates an example of a particle system based on an emitter, according to one embodiment of the invention.
  • FIG. 83 illustrates another example of a particle system based on the same emitter as in FIG. 82 , according to one embodiment of the invention.
  • FIG. 84 illustrates yet another example of a particle system based on the same emitter as in FIGS. 82 and 83 , according to one embodiment of the invention.
  • Particle system parameters can be keyframed in order to change a particle effect's dynamics over time. For example, by keyframing an emitter's 800 Position property in a Keyframe Editor, a path 860 of bubbles can be created that follows an object 850 onscreen.
  • FIG. 85 illustrates an example of an object, according to one embodiment of the invention.
  • FIG. 86 illustrates an example of a particle system of bubbles along with the object of FIG. 85 , according to one embodiment of the invention.
  • FIG. 87 illustrates another example of a particle system of bubbles along with the object of FIG. 85 , according to one embodiment of the invention.
  • Behaviors can be added to a cell to create even more varied effects.
  • simulation behaviors can be especially effective.
  • a behavior that is applied to a cell is in turn applied to a particle that it generates. This enables almost limitless variation. Adding behaviors to particles in addition to the particle system's own parameters is an easy way to create complex, organic motion that would be impossible to accomplish any other way. For example, if a Repel behavior is added to a cell, it causes emitted particles to weave around one another like amoebas under a microscope.
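The Repel example can be illustrated with a pairwise force that is applied to every particle a cell generates. This sketch assumes particles are dictionaries with 'pos' and 'vel' 2D vectors; the falloff law is an assumption, not the behavior's actual formula.

```python
import math

def repel(particles, strength=1.0, min_dist=1e-6):
    """Push every particle away from every other particle, a sketch of
    a Repel-style behavior applied per particle."""
    for a in particles:
        for b in particles:
            if a is b:
                continue
            dx = a["pos"][0] - b["pos"][0]
            dy = a["pos"][1] - b["pos"][1]
            d = max(math.hypot(dx, dy), min_dist)
            # accelerate a away from b, falling off with distance squared
            a["vel"][0] += strength * dx / (d * d)
            a["vel"][1] += strength * dy / (d * d)
```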
  • Adding a particle system to a project can be fast and easy.
  • Pre-made particle systems can be used from a particle library.
  • a simple particle system can be created.
  • a particle library found in a Content category of a Library, is a collection of pre-made particle effects that can be added to a project. There are many types of particle effects to choose from. The easiest way to add a particle system to a project is to use one from a particle library. If a user finds one that's close to what he needs, he can easily customize its parameters after he has added it to his project. Particle systems are added to a project exactly like any other object.
  • Customizing Preset Particle Systems Once a particle system has been added from a Library, it acts as it appeared in the library preview animation. If necessary, a particle system's Emitter parameters can be edited in a Dashboard to tailor the particle system. In one embodiment, a particle system can only be modified after it has been added to a project.
  • a Dashboard displays a selected particle system's most essential parameters, including, for example, the size and number of particles that are created, how long they remain onscreen, how fast they move, and the direction and area in which they travel.
  • a cell may also be selected in a Layers tab or Timeline to edit its parameters in the Dashboard.
  • creating a particle system begins by selecting an object 12 in a project and using it as a cell 760 within a new particle emitter 800 .
  • the emitter is a source of particles that are created.
  • Particle systems are very flexible, and any object in a project can be used as a cell in an emitter, including still graphics, animation or video clips, or shape objects.
  • the object 12 selected when an emitter 800 is created becomes the first Cell 760 in that particle system.
  • cells are nested inside of emitters and are used to create the actual particles 770 in that system.
  • FIG. 88 illustrates an example of a particle system including an emitter and individual particles based on the emitter, according to one embodiment of the invention.
  • FIG. 91 illustrates a new emitter, at the first frame of the particle effect, according to one embodiment of the invention.
  • the first frame of a new particle system has three particles. If the project is played, additional particles are generated, emerging from the center of the emitter.
  • a new cell 760 emits one particle 770 per frame in all directions, and each particle 770 moves 100 pixels per frame away from the emitter 800 over a lifetime of 100 frames.
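Those defaults can be sketched as a per-frame simulation loop. This is illustrative only; the emission direction is randomized here to stand in for "all directions".

```python
import math, random

def step(emitter_pos, particles, frame):
    """One frame of the default system: the cell emits one particle in
    a random direction, each particle moves 100 pixels per frame, and
    particles are retired after a 100-frame lifetime."""
    angle = random.uniform(0.0, 2.0 * math.pi)
    particles.append({
        "pos": list(emitter_pos),
        "vel": [100.0 * math.cos(angle), 100.0 * math.sin(angle)],
        "birth": frame,
    })
    for p in particles:
        p["pos"][0] += p["vel"][0]
        p["pos"][1] += p["vel"][1]
    particles[:] = [p for p in particles if frame - p["birth"] < 100]

particles = []
for frame in range(200):
    step((0.0, 0.0), particles, frame)
print(len(particles))  # steady state: 100 live particles
```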
  • FIG. 92 illustrates an active particle system, such as the emitter of FIG. 91 but at a later frame, according to one embodiment of the invention.
  • the Initial Number parameter in the Emitter or Particle Cell tabs changes the default behavior so that a particle system begins with a burst of particles at the first frame.
  • the particle system starts working according to the default parameters in its Emitter and Particle Cell tabs. In one embodiment, these are located in the Inspector.
  • the emitter Dashboard 110 can be used to easily change the most important of these parameters. Select an emitter to see its parameters in the Dashboard.
  • the Dashboard contains emitter controls that modify a particle system's size and shape. In one embodiment, these parameters are a subset of those found in the Emitter tab of the Inspector. In one embodiment, the Dashboard contains a group of sliders and an Emission control. In one embodiment, an Emission control provides a visual way to manipulate three different particle system parameters: Emission Range, Emission Angle, and Speed.
  • the emitter 800 Dashboard 110 parameters simultaneously modify the effect of each cell's parameters relative to one another. This means that for a particle system consisting of three cells with different Scale values, changing the scale in the Dashboard 110 resizes all three cells simultaneously. For example, setting the scale in the Dashboard 110 to 130 percent does not change the scale of all three cells to 130 percent. Instead, it multiplies the scale of each cell by 130 percent, so that all are resized relative to their original scale values (see the sketch after the description of FIG. 94 below).
  • FIG. 93 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 94 illustrates the particle system of FIG. 93 after it has been rescaled, according to one embodiment of the invention. For this reason, in one embodiment, the Dashboard parameters are displayed as percentages, since they represent the percent at which these particle cell parameters are modified.
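The relative scaling shown in FIGS. 93 and 94 amounts to multiplying each cell's Scale by the Dashboard percentage, as the following minimal sketch illustrates.

```python
def apply_dashboard_scale(cell_scales, percent):
    """The Dashboard scale is relative: each cell's Scale value is
    multiplied by the percentage rather than set to it, so the cells
    keep their sizes relative to one another."""
    factor = percent / 100.0
    return [scale * factor for scale in cell_scales]

cell_scales = [1.0, 0.5, 0.25]                  # three cells
print(apply_dashboard_scale(cell_scales, 130))  # -> [1.3, 0.65, 0.325]
```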
  • FIG. 95 illustrates a Dashboard for a particle system, according to one embodiment of the invention.
  • emitter parameters in the Dashboard include:
  • in one embodiment, the emitter controls in the Dashboard can be used to create a smoke effect using the emitter created in the procedure "Creating a Simple Custom Particle System."
  • to create a smoke effect using the emitter Dashboard:
  • FIGS. 109 and 110 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
  • a single object can thus be used to create a credible column of smoke rising gently into the sky. While the Dashboard controls are quite powerful, in one embodiment, the Emitter and Particle Cell tabs in the Inspector have many more parameters that can be customized.
  • emitter parameters can be modified in the Properties tab of the Inspector like any other object. Since particle systems are collections of independently generated objects, these parameters have a different effect than they do with other objects. In one embodiment, the only parameter that appears for cells in the Properties tab of the Inspector is Timing.
  • Transform Controls As a particle system plays, in one embodiment, cells 760 in that system are duplicated according to the parameters for that system to create individual particles 770 . Since particles 770 emerge from the position of the emitter 800 , changing the emitter's position in the Canvas also changes the position of particles 770 in that system. This results in the particle system being moved as a unit.
  • FIG. 111 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 112 illustrates the particle system of FIG. 111 after the emitter has been moved, according to one embodiment of the invention.
  • FIG. 113 illustrates a particle system where the emitter's position has been animated using a behavior, or keyframed, according to one embodiment of the invention.
  • modifying an emitter's other geometric parameters changes the distribution of particles from that emitter, as well as transforming each individual particle. For example, in one embodiment, if an emitter's Shear parameter is modified, the distribution of the emitted particles changes to reflect the new plane of the emitter, and the particles are sheared along the same plane.
  • FIG. 114 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 115 illustrates the particle system of FIG. 114 after the emitter's Shear parameter has been modified, according to one embodiment of the invention.
  • Blending In one embodiment, any changes made to the opacity or blend mode parameters for an emitter are applied to the particle system as a whole.
  • Mask and Drop Shadow Parameters In one embodiment, masks and drop shadows cannot be applied to particle systems.
  • FIG. 116 illustrates a particle system in the Timeline that comprises one emitter and three nested cells, according to one embodiment of the invention.
  • a cell in a system generates particles over the entire duration of the emitter.
  • the duration of an individually generated particle is defined by the Lifetime parameter of the cell that generated it, and not by the duration of the nested cell itself.
  • the duration of the nested cell itself controls the duration for which it generates particles.
  • a cell's duration can be changed by dragging either its overall position or its in and out points in the Timeline. In this way, the timing that defines when each cell's particles emerge can be adjusted.
  • a particle system can be created that simulates an explosion by offsetting the appearance of three different types of particles.
  • dense white particles 770 A emerge from the center.
  • FIG. 117 illustrates a particle system with dense white particles emerging from the center, according to one embodiment of the invention.
  • FIG. 118 illustrates the particle system of FIG. 117 with more diffuse orange particles appearing around a larger area, according to one embodiment of the invention.
  • FIG. 119 illustrates the particle system of FIG. 118 with small sparks emerging from underneath both of the previous layers as they fade away, according to one embodiment of the invention.
  • creating new particle systems from scratch begins with designing the particles that will be emitted.
  • Any graphic or video clip may be a cell.
  • Still images are the easiest to create, and are often all that is necessary to create a compelling particle system.
  • Graphics Size In one embodiment, it's a good idea to make graphics larger rather than smaller.
  • the size of the particles may be reduced without a loss of quality, but increasing the size of particles beyond the size of the original graphic may introduce unwanted artifacts.
  • Particle Edges In one embodiment, the quality of the edges of graphics can be extremely important for creating convincing particles. Soft, translucent edges might look better than hard, over-defined ones.
  • Object Color In one embodiment, by default, particles are created using the original colors of the image being used as the cell. If necessary, the emitted particles can be tinted using the Color Mode, Color, and Color Over Life parameters in the Emitter and Particle Cell tabs. In one embodiment, particles may be tinted by a single color, or they may be tinted with a gradiated tint that changes color over time. In one embodiment, tinting particles applies the tint color uniformly over the entire object.
  • if the source graphic has a premultiplied alpha channel, the “Is Premultiplied” parameter in the Particle Cell tab can be turned on for that cell to eliminate any edge fringing.
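A Color Over Life-style tint, as described above, can be pictured as a gradient lookup indexed by each particle's normalized age. The sketch below is an assumption about the mechanism, not the product's code.

```python
def lerp_color(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def color_over_life(gradient, age, lifetime):
    """Tint a particle from a gradient of (position, rgba) tags,
    indexed by the particle's normalized age."""
    u = min(max(age / lifetime, 0.0), 1.0)
    for (p0, c0), (p1, c1) in zip(gradient, gradient[1:]):
        if p0 <= u <= p1:
            return lerp_color(c0, c1, (u - p0) / (p1 - p0))
    return gradient[-1][1]

# white-hot at birth, fading to a transparent orange at death
grad = [(0.0, (1, 1, 1, 1)), (1.0, (1, 0.5, 0, 0))]
print(color_over_life(grad, 50, 100))  # halfway through its life
```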
  • Video clips such as QuickTime movies, may also be used as cells.
  • an animation can be created, rendered as a QuickTime movie, and used as a cell.
  • the same recommendations for creating still graphics apply to the creation of animation or video clips to use as cells, but in one embodiment, there are additional considerations.
  • video clips to be used as particles should be saved using an uncompressed codec, such as Animation or Uncompressed 8- and 10-bit 4:2:2.
  • other codecs can be used, but they may introduce unwanted artifacts depending on the level of compression used.
  • the particle system's Emitter and Particle Cell tabs in the Inspector provide total control over every aspect of a particle system. This includes, for example, individual parameters for each cell in a system.
  • Emitter and Particle Cell parameters serve different purposes.
  • emitter parameters control the overall shape and direction of the animated mass of particles generated by the system.
  • other emitter parameters simultaneously modify the parameters of cells nested inside an emitter.
  • Particle Cell parameters separately control the behavior of the particles generated from each cell that's nested inside the particle emitter.
  • Emitter Tab In one embodiment, several parameters 1202 in an Emitter tab 1200 are identical to those found in an emitter Dashboard 110, with one difference.
  • whereas the Emission control in the emitter Dashboard 110 allows manipulation of the Range, Angle, and Speed parameters using a single, graphical control, the Emitter tab 1200 lists individual controls 1210 for each parameter 1202.
  • FIG. 121 illustrates an Emitter tab and individual controls for several Emitter parameters, according to one embodiment of the invention.
  • the contents of the Emitter tab are dynamic, and different parameters appear depending on the number of cells in the particle system, as well as the emitter shape that's used.
  • Single Cell vs. Multi-Cell Emitter Parameters In one embodiment, at first glance, many of the parameters in the Emitter tab appear to mirror identically named parameters in the Particle Cell tabs for each cell within a system. In one embodiment, if a particle system has only one cell, then the Emitter tab displays parameters for the nested cell alongside the emitter's own parameters. In this case, an aspect of the particle system may be controlled directly from this tab, without having to go back and forth between the Emitter and Particle Cell tabs.
  • for a particle system with multiple cells, an Emitter tab looks different.
  • the list of parameters is shorter, and some cell parameters are replaced with a smaller group of master controls.
  • changes made using the master controls modify the effect of a cell's parameters relative to other cells in a system. This means that, in one embodiment, for a particle system with three cells that have different Scale values, increasing the Scale parameter in the Emitter tab multiplies the Scale value of all three cells by the same percentage. In one embodiment, this has the result of increasing or reducing the size of a particle in the system, while keeping the size of a particle relative to other particles the same.
  • FIG. 122 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 123 illustrates the particle system of FIG. 122 after the value of the Scale parameter in the Emitter tab has been increased, according to one embodiment of the invention.
  • the first parameter 1202 in an Emitter tab 1200 is an Emitter Shape pop-up menu. In one embodiment, the options in this menu significantly alter the distribution of generated particles 770 . In one embodiment, when an emitter 800 shape is chosen, different Emitter tab 1200 parameters 1202 are revealed which are unique to that shape. These parameters provide additional control over the distribution of particles 770 .
  • Emitter Shapes In one embodiment, there are six Emitter Shapes:
  • Additional Cell Parameters for Animation or Video Clips In one embodiment, if a particle system uses an animation or video clip as a cell, additional parameters are available. In one embodiment, these parameters are:
  • Random Start Frame In one embodiment, if Random Start Frame is turned off, the following parameter appears:
  • choosing an option in the Color Mode pop-up menu displays a different set of parameters, based on the option chosen.
  • parameters in the Particle Cell tab 1500 control the behavior of an individual particle 770 that is generated by the system, independently of the parameters governing the emitter 800 .
  • in particle systems with multiple cells, each cell has its own particle cell parameters 1502. In another embodiment, this enables the creation of a particle system made up of many kinds of particles, each with distinctly different behaviors.
  • a particle system may use multiple cells.
  • a particle system may emit different kinds of overlapping particles by nesting multiple cells inside of a single emitter.
  • any number of cells may be nested within a single emitter object.
  • a cell has its own particle cell parameters, which govern how particles from that cell are created.
  • a particle system with multiple cells generates particles from each cell simultaneously, according to each cell's parameters.
  • the Interleave Particles parameter determines how particles generated from the different cells blend together.
  • any Emitter or Cell parameter in a particle system can be animated by using Parameter Behaviors or by keyframing the parameter directly.
  • if an emitter-specific parameter is animated, such as Emission Angle or Emission Range, the position and distribution of new particles generated by that emitter are animated.
  • in one embodiment, this animation occurs relative to the duration of the emitter.
  • if a cell parameter is animated, the resulting animation is instead scaled to fit the Life parameter of each generated particle.
  • if the Life parameter changes, the keyframed animation will scale to the new duration of each particle.
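This scaling can be expressed by sampling the keyframe curve at the particle's normalized age rather than at an absolute frame, as in the sketch below (the helper names are hypothetical).

```python
def sample_keyframes(keyframes, t):
    """keyframes: (normalized_time, value) pairs, sorted by time."""
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return keyframes[-1][1]

def cell_value_for_particle(keyframes, age, lifetime):
    """The same curve plays back over 50 frames for a particle with a
    50-frame Life and over 200 frames for a 200-frame one."""
    return sample_keyframes(keyframes, age / lifetime)

curve = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)]    # e.g., an opacity bump
print(cell_value_for_particle(curve, 25, 50))   # -> 1.0
print(cell_value_for_particle(curve, 100, 200)) # -> 1.0 as well
```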
  • animating an emitter's Property tab parameters is useful for altering the position and geometric distribution of a particle system over time.
  • keyframing an emitter object's Position parameter moves the source of newly emitted particles without affecting any particles that were generated at previous frames, which creates a trail of particles.
  • keyframing an emitter's Emitter tab parameters is a good way to modify the particle system's overall characteristics over time, such as increasing or decreasing the size, speed, or lifetime of newly generated particles.
  • adding behaviors to a particle system's emitter, or to the cells themselves can quickly achieve sophisticated, organic effects with very little effort.
  • behaviors may be added to a particle system's emitter, or to the cells themselves.
  • a Basic Motion behavior when applied to an emitter, the position of the source of all new particles generated by that system is affected. In one embodiment, once an individual particle emerges, it is unaffected by changes to the position of the emitter, so moving the emitter around the screen using behaviors results in the creation of a trail of particles that behave according to their particle cell parameters. In another embodiment, this behavior can be overridden by turning on a cell's Attach to Emitter parameter.
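The trail effect and the Attach to Emitter override can be sketched in a single update step. The particle fields and function shape here are hypothetical.

```python
def advance(emitter_pos, prev_emitter_pos, particles, attach_to_emitter):
    """Once emitted, a particle normally ignores later emitter motion,
    so a moving emitter leaves a trail of particles behind it. With
    Attach to Emitter on, existing particles are offset along with the
    emitter instead."""
    dx = emitter_pos[0] - prev_emitter_pos[0]
    dy = emitter_pos[1] - prev_emitter_pos[1]
    for p in particles:
        p["pos"][0] += p["vel"][0]
        p["pos"][1] += p["vel"][1]
        if attach_to_emitter:
            p["pos"][0] += dx
            p["pos"][1] += dy
```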
  • a behavior to apply a behavior to an emitter, drag a behavior from the Library onto an emitter in the Canvas, Layers tab, or Timeline. In one embodiment, the behavior is applied to the emitter, which begins to move according to the parameters of the behavior.
  • a behavior that is applied directly to a cell is in turn applied to individual particles generated from that cell. In one embodiment, this can result in some extremely complex interactions as dozens of particles weave and collide according to the defined behaviors. In another embodiment, a behavior applied to a Cell has no effect on the position of the Emitter.
  • to apply a behavior to a cell drag a behavior from the Library to a cell in the Layers tab or Timeline.
  • the behavior is applied to the cell, and all particles generated from that cell begin to move according to the parameters of the behavior.
  • Particle Behavior Category In one embodiment, there is a behavior category that contains a behavior specifically for use with the cells in a particle system.
  • the Particles category contains the Scale Over Life behavior.
  • this behavior grows or shrinks a particle in a system over the duration of the particle's life.
  • the Scale Over Life behavior has two parameters:
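Since the two parameters are not reproduced here, the sketch below assumes they are a scale at birth and a scale at death, interpolated over each particle's life; the names are hypothetical.

```python
def scale_over_life(scale_at_birth, scale_at_death, age, lifetime):
    """Grow or shrink a particle over its life by interpolating
    between two endpoint scales (assumed parameters)."""
    t = min(max(age / lifetime, 0.0), 1.0)
    return scale_at_birth + (scale_at_death - scale_at_birth) * t

print(scale_over_life(0.0, 2.0, 50, 100))  # half-grown: -> 1.0
```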
  • a filter may be applied only to a particle system's emitter.
  • a filter affects an entire particle system, including every cell, as if it were a single object.
  • an individual cell cannot have a separate filter applied to it.
  • FIG. 153 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 154 illustrates the particle system of FIG. 153 after a Sphere filter has been applied, according to one embodiment of the invention.
  • This section presents three examples of how to use particle systems to create very different effects, according to one embodiment of the invention.
  • an animated background is created using a single still image, according to one embodiment of the invention.
  • a single image can be turned into a complex animated texture.
  • a particle system is created that uses two different cells to generate a streak of particles that trails behind another animated object, according to one embodiment of the invention.
  • using two cells adds more variation to a particle system than can be achieved with a single set of cell parameters.
  • a particle system can be saved as a particle preset in the Favorites folder of the Library for future use. In one embodiment, once a particle system has been saved in the Library, it can be used just like any other particle preset.
  • the particle preset when a particle preset is saved, the particle preset is saved as a file. In one embodiment, any custom objects used to create the particle system that were stored in the library appear in the same directory as this file. In another embodiment, particle presets that have been created may be copied from this location to give to other users, or particle presets received from other users can be added to this same directory. In yet another embodiment, whenever a particle preset file is copied, any graphics or video clips used by the particle preset should also be copied.
  • every visual effect, from a behavior to a particle system to a gradient, is controlled by a collection of parameters that modify the various attributes of that effect.
  • a Blur filter has an amount slider that controls how much blur is applied.
  • a system may contain thousands of parameters.
  • many different types of controls may be used to set these parameters. These controls may include, for example, sliders, dials, and shortcut menus.
  • even objects without effects applied to them have many parameters that can be modified to alter the nature of the object and how it appears in a project.
  • these parameters include the object's scale, opacity, and position on screen, as well as more obscure attributes such as its pixel aspect ratio or field dominance.
  • parameters that control a visual effect are accessed in an Inspector.
  • the Inspector contains four tabs, each of which contains a set of parameters for the selected object.
  • the first three tabs, Properties, Behaviors, and Filters are present for any selected object.
  • the fourth tab generically called the Object tab, changes its name and contents depending on the type of object selected.
  • a control provides the opportunity to change the value of a parameter in a special way.
  • selecting different things will cause different controls to populate the Inspector.
  • the various types of controls include:
  • FIG. 169 illustrates one example of a slider, according to one embodiment of the invention.
  • a Value Slider 1700 is a special type of slider that includes the numerical value of the parameter in the control.
  • dragging the middle area 1702 works just like an ordinary slider; i.e., dragging to the right increases the value and dragging to the left decreases the value.
  • some parameters allow a value slider to increase or decrease the value indefinitely.
  • a user can click the Increment 1704 or Decrement 1706 arrows to change the value one step at a time.
  • a user can double-click the number itself to convert the slider 1700 into a value field so that he can type a specific number directly into the control.
  • an example of a parameter that uses a value slider is Position.
  • FIG. 170 illustrates one example of a value slider, according to one embodiment of the invention.
  • a Dial 1710 is used for values based on angles or degrees. In one embodiment, rotate the dial by dragging it in a clockwise or counter-clockwise motion. In another embodiment, a parameter that uses a dial is Rotation.
  • FIG. 171 illustrates one example of a dial, according to one embodiment of the invention.
  • a Value Field 1720 allows direct entry of text to set the value of the parameter.
  • an example of a parameter that uses a value field is the Text Entry field.
  • FIG. 172 illustrates one example of a value field, according to one embodiment of the invention.
  • a Pop-up Menu 1730 is a menu with preset values. In one embodiment, click the menu and choose the desired value. In another embodiment, an example of a pop-up menu is Throw Increment. FIG. 173 illustrates one example of a pop-up menu, according to one embodiment of the invention.
  • a Value List 1740 is another type of shortcut menu.
  • a user can click the arrow 1742 to the right of the field to display preset values or he can type a value directly into the Value field 1744 .
  • an example of a value list is Typeface.
  • FIG. 174 illustrates one example of a value list, according to one embodiment of the invention.
  • an Activation Checkbox 1750 is an on/off toggle for a parameter.
  • an example of an Activation Checkbox is Preserve Opacity.
  • FIG. 175 illustrates one example of an activation checkbox, according to one embodiment of the invention.
  • a Color Well 1760 enables a user to select a color.
  • the Color well can be used by clicking the box 1762, which opens the Colors window, by Control-clicking and picking a color from the pop-up picker 1770, or by clicking the disclosure triangle 1764 and manipulating the individual RGB sliders 176A, 176B, 176C and the A slider 176D.
  • an example of a color well is Drop Shadow Color.
  • FIG. 176 illustrates one example of a color well, according to one embodiment of the invention.
  • FIG. 177 illustrates one example of a pop-up picker, according to one embodiment of the invention.
  • a Gradient 1780 enables a user to select a preset gradient style or create a new one.
  • a user can choose an existing preset from the Preset shortcut menu 1782.
  • click the disclosure triangle 1784 to reveal the Gradient Editor 1786 .
  • a user can set the gradient's opacity as well as its color values.
  • FIG. 178 illustrates one example of a gradient, according to one embodiment of the invention.
  • a Drop Well 1790 enables a user to drag an object 12 (e.g., a clip or still image) to provide input data for a type of effect.
  • for example, a bump map filter needs an image to provide the bumps, and a Repel From behavior needs to know which object to repel from.
  • an example of a Drop Well is the Attracted To behavior's Object parameter.
  • FIG. 179 illustrates one example of a drop well, according to one embodiment of the invention.
  • a Parameter Selection Field 1800 is a special type of shortcut menu, specifically for Parameter Behaviors.
  • the user needs to identify which parameter the behavior should affect.
  • a user can either type the name of the parameter directly into the value field 1802 , or he can choose from the Go shortcut menu 1804 (which lists all current parameters).
  • an example of the Parameter Selection Field is the Average behavior's Apply To parameter.
  • FIG. 180 illustrates one example of a parameter selection field, according to one embodiment of the invention.
  • these controls include:
  • a Reset button 1810 automatically restores the parameter value (or, in some cases, an entire set of parameters) to the default values.
  • FIG. 181 illustrates one example of a reset button, according to one embodiment of the invention.
  • Manage Presets Button In one embodiment, some parameter settings (e.g., Gradients and Type Styles) are so complex that they are commonly stored in presets. In one embodiment, whenever a Manage Presets Button 1820 is displayed, a user can save that particular parameter (or set of parameters) into a preset. In another embodiment, for example, the Text Style pane has a Manage Presets control at the top of the parameter list that allows a user to save styles, formats, or both. In yet another embodiment, this enables a user to save all of the settings in the window. In one embodiment, in some cases, a user can also use this control to load an existing preset.
  • FIG. 182 illustrates one example of a manage presets button, according to one embodiment of the invention.
  • to load an existing preset click the Manage Presets button, and then choose the preset from the list in the pop-up menu.
  • the current parameter settings are replaced by the settings in the preset.
  • Animation Menu Button In one embodiment, most parameters of an item are animateable. In one embodiment, this means that a user can assign specific values to certain frames (keyframes) so the parameter value changes over time. In another embodiment, a parameter that can be animated has an Animation Menu Button 1830 to the right of the parameter settings. In yet another embodiment, depending on the current condition of the parameter, the Animation Menu Button displays a different icon. FIG. 183 illustrates one example of an animation menu button, according to one embodiment of the invention.
  • clicking on the Animation Menu Button displays a shortcut menu 1840 filled with Animation related controls.
  • FIG. 184 illustrates one example of a shortcut menu filled with Animation related controls, according to one embodiment of the invention.
  • these menu items include:
  • Enable/Disable Animation In one embodiment, the Enable/Disable Animation menu item 1842 remains dim until keyframing is applied to the parameter, either by using the Record button or by adding a keyframe. In one embodiment, once the parameter has some animation applied, the menu item is automatically renamed “Disable Animation.” In another embodiment, activating it at that point effectively hides the keyframes that have been set, restoring the parameter to its default value. In yet another embodiment, however, the keyframes are not thrown away. In one embodiment, choosing Enable Animation restores the channel to its last keyframed state.
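The Enable/Disable Animation semantics just described can be modeled as a parameter channel whose keyframes are hidden, not discarded. The class below is an illustrative sketch; a real channel would interpolate between keyframes.

```python
class ParameterChannel:
    """Disabling animation restores the default value but keeps the
    keyframes, so enabling animation again restores the last
    keyframed state."""

    def __init__(self, default):
        self.default = default
        self.keyframes = {}            # frame -> value
        self.animation_enabled = True

    def value_at(self, frame):
        if not self.animation_enabled or not self.keyframes:
            return self.default
        # nearest-keyframe lookup keeps the sketch short
        nearest = min(self.keyframes, key=lambda k: abs(k - frame))
        return self.keyframes[nearest]

opacity = ParameterChannel(default=1.0)
opacity.keyframes[10] = 0.25
opacity.animation_enabled = False   # "Disable Animation": back to 1.0
print(opacity.value_at(10))         # -> 1.0
opacity.animation_enabled = True    # "Enable Animation": keyframes kept
print(opacity.value_at(10))         # -> 0.25
```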
  • Reset Parameter 1843 removes all keyframes and settings for this parameter. In one embodiment, the parameter value is reset to its default value.
  • Add Keyframe 1844 adds a keyframe at the current frame. In one embodiment, if the playhead is positioned on a frame where a keyframe has already been added, this menu item is dimmed.
  • Delete Keyframe 1845 deletes the current keyframe. In one embodiment, the Delete Keyframe command is available only if the playhead is positioned on a frame where a keyframe already exists.
  • Previous Keyframe 1846 moves the playhead to the previous keyframe for this parameter. In one embodiment, Previous Keyframe is available only if a keyframe exists earlier in the project.
  • Next Keyframe 1847 moves the playhead to the next keyframe for this parameter.
  • Next Keyframe is available only if a keyframe exists later in the project.
  • Show In Keyframe Editor 1848 opens the Keyframe Editor if it is not showing and displays the graph for the parameter that is being modified.
  • the parameters in the inspector are grouped into four categories:
  • the Properties tab contains basic attributes about the selected object, such as Transformation (e.g., position, scale, and rotation), Blending (e.g., opacity and blend mode), Drop Shadow controls, Corner Pinning, and the object's In and Out points.
  • Behaviors In one embodiment, whenever a behavior is applied to an object, the parameters associated with that behavior appear in the Behaviors tab. In one embodiment, multiple behaviors are grouped by the behavior name.
  • Filters In one embodiment, whenever a filter is applied to an object, the parameters associated with that filter appear in the Filters tab. In one embodiment, multiple filters are grouped by the filter name.
  • the title and contents of the Object tab change depending on what type of object is selected. In one embodiment, there are seven types of Object tabs, corresponding to seven types of objects.
  • a Generators tab displays the parameters and attributes of the selected generator (e.g., the colors and number of bars in a checkerboard). In one embodiment, the specific parameters listed depend on the specific generator that is selected.
  • the Inspector 19 typically changes dynamically based on the selection in the Canvas. In one embodiment, however, sometimes a user wants to select another object 12 while continuing to look at the parameters 290 for the current object 12 . In one embodiment, when a user locks the Inspector 19 , the view of the Inspector will not change based on the user's selection.
  • to lock the Inspector do one of the following:
  • FIG. 185 illustrates one example of a Lock icon, according to one embodiment of the invention.
  • a Dashboard 110 is a dynamically updating floating window.
  • the Dashboard contains the most common controls 1860 for any selected object 12 .
  • the Dashboard provides graphical animation control over images and other items that appear in the canvas window.
  • the Dashboard 110 is semi-transparent. In one embodiment, a user can set the opacity (transparency) of the Dashboard.
  • FIG. 186 illustrates one example of a Dashboard, according to one embodiment of the invention.
  • the Dashboard is designed to keep a selected object visible even while using the Dashboard to adjust the object's parameters. In another embodiment, this enables a user to keep his eye on the screen instead of switching his eye line from a main window to a utility panel and back.
  • the Dashboard can show a variety of controls, even for a single object. In one embodiment, for example, if a Throw behavior is applied to a shape with a blur filter on it, the Dashboard could conceivably show the shape controls, the blur controls, or the Throw controls. In another embodiment, the Dashboard shows all three. In yet another embodiment, a user can choose between which set of controls to view in the Dashboard using the pop-up menu in the title bar.
  • the Dashboard 110 title bar 1870 displays a downward facing arrow 1872 to the right of the name 1874 .
  • clicking the arrow 1872 displays a pop-up menu 1880 that lists all of the possible control sets that can be displayed in the Dashboard for the selected object.
  • FIG. 187 illustrates one example of a Dashboard title bar displaying a downward facing arrow, according to one embodiment of the invention.
  • FIG. 188 illustrates one example of a pop-up menu that lists all of the possible control sets that can be displayed in the Dashboard for the selected object, according to one embodiment of the invention.
  • the Dashboard displays a subset of the parameters visible in the Inspector for the selected object. In another embodiment, if a user is working in the Dashboard, he can quickly jump to the corresponding Inspector to access the remainder of the controls for that object.
  • to jump to the Inspector from the Dashboard click the Inspector icon in the upper-right corner of the Dashboard.
  • the Inspector is opened and the tab corresponding to the Dashboard controls is brought to the front.
  • a Dashboard contains controls that resemble controls used in the Inspector, such as sliders, checkboxes, and pop-up menu buttons.
  • the Dashboard contains special controls for certain types of effects such as Basic Motion Behaviors and particle systems.
  • these unique controls allow a user to set multiple parameters simultaneously and in an intuitive way.
  • these controls use standard English-like terminology and simple graphical diagrams that, when dragged interactively, cause the target image to react immediately to the changes in the diagram.
  • the Particle System Dashboard 110 contains a single control 1890 that lets a user set shape, angle, and range of a particle system simultaneously.
  • FIG. 189 illustrates one example of a Dashboard for a particle system, according to one embodiment of the invention.
  • FIG. 190 illustrates one example of a Dashboard for a Grow/Shrink behavior, according to one embodiment of the invention.
  • FIG. 191 illustrates one example of a Dashboard for a Fade In/Fade Out behavior, according to one embodiment of the invention.
  • the Throw behavior and corresponding special control make an image move in a certain direction at a controlled speed.
  • the user clicks and drags in a graphical “dish” 1920 to set the direction and speed of the object.
  • the dish appears initially with a small “+” 1922 in the center to indicate no movement.
  • as the user drags, a small arrow 1930 appears in the center region.
  • FIG. 192 illustrates one example of a Dashboard for a Throw behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 193 illustrates one example of a Dashboard for a Throw behavior where the special control specifies movement in a southeastern direction at a low speed, according to one embodiment of the invention.
  • FIG. 194 illustrates one example of a Dashboard for a Throw behavior where the special control specifies movement in the same direction as in FIG. 193 , but at a higher speed, according to one embodiment of the invention.
  • these images illustrate how the image's speed increases as the arrow increases in size.
  • a slider 1924 at the right side of the window controls the “zoom” of the dish. In one embodiment, dragging the slider upwards zooms out to display more area of the dish. In another embodiment, if more area of the dish is displayed, the control becomes more sensitive, so dragging the arrow will create more dramatic motion. In yet another embodiment, dragging the slider downward zooms in to display a smaller region, so dragging the arrow will create finer control over the movement.
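The dish control reduces to mapping a drag vector to a velocity, with the zoom slider scaling the control's sensitivity. The linear mapping below is an assumption for illustration.

```python
def throw_velocity(drag_dx, drag_dy, zoom):
    """Map a drag in the Throw dish to an object velocity: the drag
    direction sets the direction of motion, the arrow length sets the
    speed, and zooming the dish out makes the same drag produce more
    dramatic motion."""
    return (drag_dx * zoom, drag_dy * zoom)

print(throw_velocity(10, 5, 1.0))  # gentle drift to the lower right
print(throw_velocity(10, 5, 4.0))  # same drag, zoomed out: 4x faster
```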
  • the Wind behavior and corresponding special control make an image move in a certain direction and speed.
  • the graphical controls are similar to those of Throw.
  • the Wind behavior is designed to emulate real-life wind.
  • Wind pushes on the image constantly and ramps up over time. In one embodiment, for example, the object starts out moving slowly and picks up speed over time.
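The difference between Throw and Wind can be seen in a two-line update rule: Throw sets a velocity once, while Wind applies a constant push so that speed accumulates. A minimal one-dimensional sketch:

```python
def simulate(frames, throw_velocity=0.0, wind_force=0.0):
    """Throw: constant velocity. Wind: constant push, so the object
    starts out moving slowly and picks up speed over time."""
    pos, vel = 0.0, throw_velocity
    for _ in range(frames):
        vel += wind_force          # Wind acts as a per-frame acceleration
        pos += vel
    return pos, vel

print(simulate(10, throw_velocity=5.0))  # -> (50.0, 5.0), constant speed
print(simulate(10, wind_force=1.0))      # -> (55.0, 10.0), ramping up
```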
  • FIG. 195 illustrates one example of a Dashboard for a Wind behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 196 illustrates one example of a Dashboard for a Wind behavior where the special control specifies movement in a northeastern direction at a high speed, according to one embodiment of the invention.
  • the Spin behavior and corresponding special control make an image rotate at a constant rate.
  • the user clicks and drags in the graphical “dish” 1970 to set the speed and rotation direction (clockwise or counterclockwise) of the object.
  • the dish appears initially with a small “+” 1972 at the upper edge of the dish to indicate no movement.
  • as the user drags, a small arrow 1980 appears around the dish and follows the edge.
  • FIG. 197 illustrates one example of a Dashboard for a Spin behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 198 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in a clockwise direction at a low speed, according to one embodiment of the invention.
  • FIG. 199 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in the same direction as in FIG. 198 , but at a higher speed, according to one embodiment of the invention.
  • FIG. 200 illustrates one example of a Dashboard for a Spin behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 201 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in a counterclockwise direction at a low speed, according to one embodiment of the invention.
  • FIG. 202 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in the same direction as in FIG. 201 , but at a much higher speed, according to one embodiment of the invention.
  • the Grow/Shrink behavior and corresponding special control make an image grow or shrink at a constant rate.
  • the user clicks and drags in the rectangular area in the center to set the speed and direction (grow or shrink) of the object.
  • the control appears initially with a dotted rectangle 2030 in the center to indicate the “normal” size.
  • as the user drags, an additional rectangle 2040 and several arrows 2042 appear to indicate the rate of change from the initial state to the new state (either larger or smaller than the initial state).
  • the size of the box and the size of the arrows indicate growth or reduction.
  • FIG. 203 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 204 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies a high grow rate, according to one embodiment of the invention.
  • the outer box displays arrows that show the progression from the initial state to the larger state of the image over time.
  • FIG. 205 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 206 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies a high shrink rate, according to one embodiment of the invention.
  • the box is now smaller than the initial state, and displays arrows that show the progression from the initial state to the smaller state of the image over time.
  • the grow/shrink special control also has small draggable handles 2032 at the four edges of the user-defined box to set the rate differently for the horizontal vs. vertical axes.
  • an image can shrink horizontally over time, but simultaneously grow vertically over time.
  • FIG. 207 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies shrinking in the horizontal direction and simultaneous growing in the vertical direction, according to one embodiment of the invention.
  • a slider 2034 at the right side of the window controls the “zoom” of the control.
  • dragging the slider upwards zooms out to display more area of the control.
  • the control becomes more sensitive, so dragging the box and arrows will create more dramatic motion.
  • dragging the slider downward zooms in to display a smaller region, so dragging the box and arrows will create finer control over the movement.
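  • A minimal sketch of the per-axis rates implied by this control, assuming a constant rate of change per frame as described above; the names (rate_x, rate_y) are illustrative, not from the specification.

        def grow_shrink_scale(frame, rate_x, rate_y):
            """Per-axis scale factors for a constant-rate Grow/Shrink.

            Dragging the box sets both rates together; the edge handles
            set them independently. A positive rate grows that axis and
            a negative rate shrinks it, so rate_x < 0 with rate_y > 0
            shrinks horizontally while growing vertically (FIG. 207).
            """
            scale_x = max(0.0, 1.0 + rate_x * frame)
            scale_y = max(0.0, 1.0 + rate_y * frame)
            return scale_x, scale_y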
  • the Fade In/Fade Out behavior and corresponding special control make an image fade in and/or fade out.
  • the user clicks and drags in the sloped, shaded regions at the left 2080A and right 2080B edges of the graphic to set the fade in or fade out time (displayed in number of frames) of the object.
  • the control appears initially with a predefined fade time of 20 frames at either end.
  • as the user drags, the slope changes to indicate a longer or shorter fade time (a sketch of the resulting opacity ramp follows the figures below).
  • FIG. 208 illustrates one example of a Dashboard for a Fade In/Fade Out behavior where the special control specifies a fade in time and a fade out time of equivalent length, according to one embodiment of the invention.
  • FIG. 209 illustrates one example of a Dashboard for a Fade In/Fade Out behavior where the special control specifies a shorter fade in time than in FIG. 208 and no fade out time (i.e., no fade out at all), according to one embodiment of the invention.
  • FIG. 210 illustrates one example of a Dashboard for a Fade In/Fade Out behavior where the special control specifies a similar fade in time to that in FIG. 208 and a longer fade out time than in FIG. 208 , according to one embodiment of the invention.
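  • The fade control above amounts to a piecewise opacity ramp over the object's duration. A minimal sketch, assuming linear ramps and durations expressed in frames (fade_out = 0 reproduces the "no fade out at all" case of FIG. 209):

        def fade_opacity(frame, duration, fade_in=20, fade_out=20):
            """Opacity for the Fade In/Fade Out behavior at a given frame.

            fade_in and fade_out are in frames; the control's default is
            20 frames at either end. A steeper slope in the control
            corresponds to a shorter fade.
            """
            opacity = 1.0
            if fade_in > 0 and frame < fade_in:
                opacity = min(opacity, frame / fade_in)
            if fade_out > 0 and frame > duration - fade_out:
                opacity = min(opacity, (duration - frame) / fade_out)
            return max(0.0, min(1.0, opacity))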
  • a particle emitter is a special type of image object that starts with one or more small images as sources and automatically generates large numbers of copies (particles) of those images.
  • a particle emitter has numerous controls specifying, for example, how many copies are created, where they are created, how fast they move, and what direction they move in.
  • a Dashboard for an emitter includes both traditional sliders 2110A, 2110B, 2110C and a custom graphical element.
  • the custom graphical element is a dish 2112 that simultaneously controls three different aspects of the particles: Direction, Speed, and Range.
  • draggable arrows 2114 radiate out from the center of the dish to indicate direction and speed, similar to the Throw and Wind controls.
  • the ring of the dish defines a restricted range 2120 of where the particles travel outwards.
  • this ring resembles a “pie” shape.
  • it acts as a graphical representation of the emitter “nozzle.”
  • the particles move in a “stream” defined by the shaded area of the pie.
  • FIG. 211 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in all directions (i.e., there is no specified range) at a medium/high speed, according to one embodiment of the invention.
  • FIG. 212 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in only certain directions (i.e., there is a specified range) and at a medium speed, according to one embodiment of the invention.
  • FIG. 213 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in only certain directions (i.e., there is a specified range, and the range is narrower than the range in FIG. 211 ) and at a low speed, according to one embodiment of the invention.
  • FIG. 214 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in only certain directions (i.e., there is a specified range, and the range is narrower than the range in FIG. 212 ) and at a high speed, according to one embodiment of the invention.
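  • One plausible reading of the emitter dish control, sketched below (names are hypothetical): the arrow sets direction and speed, and the shaded "pie" wedge restricts the angles at which particles may stream out.

        import math
        import random

        def emit_particle_velocity(direction_deg, range_deg, speed):
            """Initial velocity of one particle from the emitter dish.

            Particles leave at angles uniformly distributed within the
            wedge centered on direction_deg; range_deg = 360 emits in
            all directions (the 'no specified range' case of FIG. 211).
            """
            half = range_deg / 2.0
            angle = math.radians(direction_deg + random.uniform(-half, half))
            return (math.cos(angle) * speed, math.sin(angle) * speed)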
  • a Dashboard can be closed by clicking an “x” in the upper left of the Dashboard window.
  • Inspector Button
  • In one embodiment, if the user wants more control over the image that is being manipulated, clicking on the small “i” in the upper right corner of the Dashboard will bring an Inspector window to the front. In one embodiment, the user can then use the Inspector to control the image via standard controls such as sliders, checkboxes, and numeric text fields. In another embodiment, this provides two levels of control over the animation: level 1 is an interactive graphical diagram, and level 2 is based on more traditional entry of values in the Inspector.
  • simulation behaviors implement two main functions: accumForces and accumInitialValues.
  • accumForces takes as input the current state of the object being simulated, including the position, rotation, velocity and angular velocity, and outputs the forces that should be applied at the given time.
  • accumInitialValues takes the same inputs and sets up the initial velocity of the object.
  • the simulator traverses a data structure, such as a tree structure, to find the behaviors affecting the object.
  • the simulator iterates across the list of behaviors and accumulates the forces on the object. If this is the first frame of the object, then the initial velocity is first calculated. Derivatives are then fed into a “mid-point method” differential solver to calculate a new position.
  • the simulator then traverses the list of simulation behaviors for collision behaviors. Collision behaviors examine the current state to determine if a collision has occurred. If so, the behavior adjusts the state of the system to maintain the collision constraints. The simulation is iteratively stepped forward in this fashion until the desired frame is reached.
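  • Putting the two functions together, the following is a minimal sketch of one simulator step under the scheme just described (1D for brevity; accum_forces and accum_initial_values mirror the accumForces and accumInitialValues functions above, while State, mass, and resolve are illustrative):

        class State:
            def __init__(self, position, velocity):
                self.position = position
                self.velocity = velocity

        def simulate_step(state, mass, behaviors, collisions, t, dt, first_frame=False):
            """Advance the simulation by one frame using the mid-point method."""
            if first_frame:
                # First frame only: accumulate each behavior's initial velocity.
                for b in behaviors:
                    state.velocity += b.accum_initial_values(state)

            def force(s, time):
                # Accumulate the forces from every behavior on the object.
                return sum(b.accum_forces(s, time) for b in behaviors)

            # Mid-point method: step halfway, evaluate derivatives there,
            # then take the full step using the mid-point derivatives.
            a0 = force(state, t) / mass
            mid = State(state.position + state.velocity * dt / 2,
                        state.velocity + a0 * dt / 2)
            a_mid = force(mid, t + dt / 2) / mass
            state.position += mid.velocity * dt
            state.velocity += a_mid * dt

            # Collision behaviors inspect the new state and, if a collision
            # occurred, adjust it to maintain the collision constraints.
            for c in collisions:
                c.resolve(state)
            return state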
  • position and rotation properties that are keyframed are handled by a special “motion to forces” behavior which converts the keyframes into a series of forces that when applied produce a motion similar to that represented by the keyframe. This is done by examining the velocity and acceleration at the current frame and deriving the necessary forces from these values. These forces can then be input into the simulator so that they can interact with the other behaviors.
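  • A minimal sketch of the "motion to forces" idea: sample the keyframed curve, approximate acceleration by a central difference, and derive a force the simulator can blend with the other behaviors (keyframe_value is a hypothetical sampler of the keyframed position curve):

        def motion_to_force(keyframe_value, t, dt, mass):
            """Derive a force from a keyframed position curve.

            The second central difference approximates the acceleration
            at time t; F = m * a then yields a force that, when applied,
            produces a motion similar to the keyframed one.
            """
            accel = (keyframe_value(t + dt)
                     - 2.0 * keyframe_value(t)
                     + keyframe_value(t - dt)) / (dt * dt)
            return mass * accel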
  • parameter behaviors are evaluated as a stack of operations on a range of values. First the stack is traversed to determine if all evaluations can be done using only the current value of the behavior before it in the stack. If so, an optimized path is taken which only passes the single value up the stack of operations. If not, then each behavior is queried to discover what range of input values will be needed to compute the requested output range. This stack of ranges is then used to evaluate each parameter behavior in turn, passing it the input it requested in the first step. Parameter behaviors such as Average use a range of values to compute a single output value, so they generally follow this second path. Also, while updating the curve editor, a large range of values can be calculated in one batch. This improves cache locality and reduces re-computation of partial values needed in the evaluation of an individual parameter behavior.
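  • A sketch of the two-pass range evaluation just described, under the assumption that each behavior can report the input frames it needs (input_range_for) and evaluate a batch of frames at once (evaluate); both method names are hypothetical, and the single-value fast path is omitted for brevity.

        def evaluate_stack(stack, base_curve, out_frames):
            """Evaluate a stack of parameter behaviors over a frame range.

            Pass 1 walks the stack from the top down, asking each behavior
            which input frames it needs to produce its requested output
            (Average, for example, needs a window of inputs per output).
            Pass 2 evaluates from the bottom up in batches, which improves
            cache locality and avoids re-computing partial values.
            """
            # Pass 1: propagate the requested range down the stack.
            ranges = [list(out_frames)]
            for behavior in reversed(stack):
                ranges.append(behavior.input_range_for(ranges[-1]))
            ranges.reverse()  # ranges[0] is what the base curve must supply

            # Pass 2: evaluate upward; each behavior maps a dict of input
            # frame values to a dict of its own output frame values.
            values = {f: base_curve(f) for f in ranges[0]}
            for behavior, produced in zip(stack, ranges[1:]):
                values = behavior.evaluate(values, produced)
            return [values[f] for f in out_frames]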
  • objects to which behaviors have been applied are dynamically rendered.
  • a behavior animation changes in real-time after the value of a behavior parameter has been changed.
  • caching is used to achieve dynamic rendering.
  • a behavior animation for an object is generated by rendering each frame sequentially and calculating a current frame based on a previous frame.
  • the result of evaluating the effect of a behavior on a previous frame is cached, thereby enabling the effect of a behavior on a current frame to be evaluated more rapidly.
  • an interval cache is also kept.
  • values are periodically added to the interval cache to speed up behavior evaluation when jumping to random frames.
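  • A minimal sketch of the two caches, assuming a per-frame evaluate(prev_state, frame) function: sequential playback reuses the previous frame's result, and a sparse interval cache lets a jump to a random frame restart from the nearest snapshot rather than from frame 0. All names are illustrative.

        class BehaviorCache:
            def __init__(self, evaluate, initial_state, interval=30):
                self.evaluate = evaluate        # evaluate(prev_state, frame) -> state
                self.interval = interval        # snapshot every N frames
                self.snapshots = {0: initial_state}

            def state_at(self, frame):
                """Return the behavior state at an arbitrary frame."""
                # Restart from the nearest snapshot at or before the frame.
                start = max(f for f in self.snapshots if f <= frame)
                state = self.snapshots[start]
                for f in range(start + 1, frame + 1):
                    state = self.evaluate(state, f)
                    if f % self.interval == 0:
                        self.snapshots[f] = state  # grow the interval cache
                return state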
  • multithreading is used to achieve dynamic rendering.
  • frames are rendered sequentially.
  • a second thread simultaneously evaluates behaviors for the next frame.
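  • A sketch of that two-stage pipeline (names hypothetical): a worker thread evaluates behaviors one frame ahead and hands the results to the render loop through a small queue.

        import queue
        import threading

        def pipelined_playback(evaluate_behaviors, render, frames):
            """Render frame N while a second thread evaluates frame N+1."""
            handoff = queue.Queue(maxsize=1)   # one frame of look-ahead

            def worker():
                for frame in frames:
                    handoff.put((frame, evaluate_behaviors(frame)))
                handoff.put(None)              # sentinel: playback finished

            threading.Thread(target=worker, daemon=True).start()
            while (item := handoff.get()) is not None:
                frame, state = item
                render(frame, state)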
  • hardware acceleration enables users to work effectively with behaviors.
  • hardware acceleration methods include, for example:
      • multithreading, so that a program can, e.g., run on multiple CPUs;
      • “Altivec'ing” algorithms, i.e., modifying algorithms to take advantage of the G4 and/or G5 Altivec hardware on which they will be run (e.g., by vectorizing the algorithms); and
      • OpenGL (e.g., standard OpenGL, OpenGL vertex shaders, and OpenGL pixel shaders).
  • pixel shaders are used to accelerate various image processing tasks (such as, for example, applying filters) and enable custom blending.
  • a pixel shader is a graphics function that calculates effects on a per-pixel basis. Depending on resolution, in excess of 2 million pixels may need to be rendered, lit, shaded, and colored for each frame, at 60 frames per second. That creates a tremendous computational load. Per-pixel shading brings out an extraordinary level of surface detail, allowing a user to see effects beyond the triangle level.
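  • The arithmetic behind that claim: a 1920×1080 frame is about 2.07 million pixels, and at 60 frames per second that is roughly 124 million per-pixel evaluations per second for a single effect. The sketch below illustrates the idea of a per-pixel function on the CPU (using NumPy for illustration only); in practice the work is pushed onto GPU pixel shaders.

        import numpy as np

        # One 1920x1080 RGB frame: about 2.07 million pixels.
        frame = np.zeros((1080, 1920, 3), dtype=np.float32)

        def per_pixel_gain(img, gain):
            """Apply the same small function independently to every pixel,
            which is what a pixel shader does in hardware."""
            return np.clip(img * gain, 0.0, 1.0)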
  • the basics of pixel shader technology are known to those of ordinary skill in the relevant art and are further described in the course notes for Course 17: “State of the Art in Hardware Shading” at SIGGRAPH 2002. The course is described at http://www.siggraph.org/s2002/conference/courses/crsl7.html and the course notes are available at http://www.csee.umbc.edu/~olano/s2002c17.
  • Certain aspects of the present invention include process steps and instructions described herein in the form of an algorithm. It should be noted that the process steps and instructions of the present invention could be embodied in software, firmware or hardware, and when embodied in software, could be loaded to reside on and be operated from different types of computing platforms.
  • the present invention also relates to an apparatus for performing the operations herein.
  • This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer.
  • a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus.
  • the computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
  • the present invention is well-suited to a wide variety of computer network systems over numerous topologies.
  • the configuration and management of large networks comprises storage devices and computers that are communicatively coupled to dissimilar computers and storage devices over a network, such as the Internet.

Abstract

Various embodiments of the invention cover various aspects of behaviors and working with behaviors. One embodiment covers behaviors themselves, including animations that can be produced by applying a behavior to an item and the algorithms underlying these animations. Another embodiment covers using behaviors in conjunction with keyframes. Yet another embodiment covers working with behaviors, including setting parameters of behaviors, saving behaviors, and creating new behaviors. Yet another embodiment covers objects to which behaviors may be applied, including, for example, images, text, particle systems, filters, generators, and other behaviors. Yet another embodiment covers dynamic rendering of objects to which behaviors have been applied, including changing an animation in real-time after the value of a behavior parameter has been changed. Yet another embodiment covers hardware acceleration methods that enable users to work effectively with behaviors.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is related to the following commonly owned and co-pending U.S. patent applications, the disclosures of which are incorporated herein by reference:
      • U.S. patent application Ser. No. ______, for “Editing within Single Timeline”, filed Apr. 16, 2004.
      • U.S. patent application Ser. No. ______, for “Gesture Control of Multimedia Editing Applications”, filed Apr. 16, 2004.
    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • This invention relates generally to computer animation and, more specifically, to animating an object using behaviors.
  • 2. Background Art
  • In the last few decades, computers and software have been used to animate objects. Initially, animation software was complicated and difficult to use. A user was generally required to interact with objects using a low level of abstraction. For example, a user would manually create different visual representations of an object (keyframes) and then use software to interpolate between them.
  • Recently, animation software has become more user-friendly, enabling a user to interact with objects at a higher level of abstraction. For example, a user may animate an object by applying a “behavior” to the object. A behavior is an animation abstraction and can be thought of as a macro, script, or plugin. When a behavior is applied to an object, the object is animated in a particular way (e.g., by growing or shrinking or by moving in a specific direction). Some examples of animation software that support behaviors are Anark Studio and Macromedia Director MX.
  • Although behaviors make it easier to animate objects, software that supports behaviors can still be difficult to use. Many types of behaviors may be applied to one object, and each type of behavior can be customized based on several parameters. Understanding each of these parameters and its effect on the behavior can be confusing. Providing values for all of these parameters can also be time-consuming.
  • What is needed is a better user interface for animating objects using behaviors.
  • SUMMARY OF THE INVENTION
  • Various embodiments of the invention cover various aspects of behaviors and working with behaviors. One embodiment covers behaviors themselves, including animations that can be produced by applying a behavior to an item and the algorithms underlying these animations. Another embodiment covers using behaviors in conjunction with keyframes. Yet another embodiment covers working with behaviors, including setting parameters of behaviors, saving behaviors, and creating new behaviors. Yet another embodiment covers objects to which behaviors may be applied, including, for example, images, text, particle systems, filters, generators, and other behaviors.
  • Yet another embodiment covers dynamic rendering of objects to which behaviors have been applied, including changing an animation in real-time after the value of a behavior parameter has been changed. Yet another embodiment covers hardware acceleration methods that enable users to work effectively with behaviors.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates a behavior in the Layers tab, according to one embodiment of the invention.
  • FIG. 2 illustrates a behavior in the Timeline, according to one embodiment of the invention.
  • FIG. 3 illustrates a behavior in the Behaviors tab of the Inspector, according to one embodiment of the invention.
  • FIG. 4 illustrates a gear icon, according to one embodiment of the invention.
  • FIG. 5 illustrates a gear icon in the filters tab of the Inspector, according to one embodiment of the invention.
  • FIG. 6 illustrates a gear icon in the Keyframe Editor, according to one embodiment of the invention.
  • FIG. 7 illustrates a parameter behavior in the Layers tab, according to one embodiment of the invention.
  • FIG. 8 illustrates a parameter behavior in the Timeline, according to one embodiment of the invention.
  • FIG. 9 illustrates a parameter's pop-up menu, according to one embodiment of the invention.
  • FIG. 10 illustrates an Apply To pop-up menu, according to one embodiment of the invention.
  • FIG. 11 illustrates the controls for the Fade In/Fade Out behavior in the Dashboard, according to one embodiment of the invention.
  • FIG. 12 illustrates the controls for the Fade In/Fade Out behavior in the Behaviors tab, according to one embodiment of the invention.
  • FIG. 13 illustrates the Activate control, the Enable/Disable control, and the Lock control in the Layers tab, according to one embodiment of the invention.
  • FIG. 14 illustrates the Activate control, the Enable/Disable control, and the Lock control in the Timeline, according to one embodiment of the invention.
  • FIG. 15 illustrates an enable/disable behaviors control that has been toggled to disabled, according to one embodiment of the invention.
  • FIG. 16 illustrates a show behaviors control that has been toggled to show, according to one embodiment of the invention.
  • FIG. 17 illustrates a behavior that has been selected in the Layers tab, according to one embodiment of the invention.
  • FIG. 18 illustrates a behavior that is being dragged to another object in the Layers tab, according to one embodiment of the invention.
  • FIG. 19 illustrates an object with multiple behaviors in the Timeline, according to one embodiment of the invention.
  • FIG. 20 illustrates an object with multiple behaviors in the Layers tab, according to one embodiment of the invention.
  • FIG. 21 illustrates a behavior being dragged and a position indicator, according to one embodiment of the invention.
  • FIG. 22 illustrates an object with a behavior in the Timeline, according to one embodiment of the invention.
  • FIG. 23 illustrates a behavior being trimmed in the Timeline and a tooltip, according to one embodiment of the invention.
  • FIG. 24 illustrates a behavior being moved in the Timeline and a tooltip, according to one embodiment of the invention.
  • FIG. 25 illustrates a behavior after it has been moved in the Timeline, according to one embodiment of the invention.
  • FIG. 26 illustrates a behavior-driven motion path in the Canvas, according to one embodiment of the invention.
  • FIG. 27 illustrates a keyframed motion path in the Canvas, according to one embodiment of the invention.
  • FIG. 28 illustrates a behavior-driven and keyframed motion path in the Canvas, according to one embodiment of the invention.
  • FIG. 29 illustrates a parameter with an oscillate behavior applied to it in the Keyframe Editor, according to one embodiment of the invention.
  • FIG. 30 illustrates a parameter with an oscillate behavior and keyframes applied to it in the Keyframe Editor, according to one embodiment of the invention.
  • FIG. 31 illustrates the parameter of FIG. 30 but with one keyframe lowered, according to one embodiment of the invention.
  • FIG. 32 illustrates a parameter with a behavior curve and a keyframed curve in the Keyframe Editor, according to one embodiment of the invention.
  • FIG. 33 illustrates a parameter with a “final animation curve” in the Keyframe Editor, according to one embodiment of the invention.
  • FIG. 34 illustrates an object with an Orbit Around behavior applied, creating a regular orbit (a circular motion path 340), according to one embodiment of the invention.
  • FIG. 35 illustrates the same object as in FIG. 34, but with a Ramp behavior applied to the Orbit Around behavior's Drag parameter as described above, creating a spiral motion path 340, according to one embodiment of the invention.
  • FIG. 36 illustrates an object with an Orbit Around behavior applied, creating a regular orbit (a circular motion path), according to one embodiment of the invention.
  • FIG. 37 illustrates the same object as in FIG. 36, but with keyframes applied to the Orbit Around behavior's Drag parameter as described above, creating a different motion path, according to one embodiment of the invention.
  • FIG. 38 illustrates a Dashboard for a Fade In/Fade Out behavior, according to one embodiment of the invention.
  • FIG. 39 illustrates a Dashboard for a Grow/Shrink behavior, according to one embodiment of the invention.
  • FIG. 40 illustrates a Motion Path behavior, including curves, applied to an object, according to one embodiment of the invention.
  • FIG. 41 illustrates an object moving along a motion path, according to one embodiment of the invention.
  • FIG. 42 illustrates the same object as in FIG. 41, but also with a Snap Alignment to Motion behavior applied to the object, according to one embodiment of the invention.
  • FIG. 43 illustrates a Dashboard for a Spin behavior, according to one embodiment of the invention.
  • FIG. 44 illustrates a Dashboard for a Throw behavior, according to one embodiment of the invention.
  • FIG. 45 illustrates a motion path behavior applied to an object, according to one embodiment of the invention.
  • FIG. 46 illustrates a motion path behavior applied to an object, and a Negate behavior applied to the object's Position parameter, according to one embodiment of the invention.
  • FIG. 47 illustrates a Dashboard for an Oscillate behavior, according to one embodiment of the invention.
  • FIG. 48 illustrates two objects (an attracting object and an attracted object) and a motion path 480 of the latter object, according to one embodiment of the invention.
  • FIG. 49 illustrates one object and an edge collision motion path 490, according to one embodiment of the invention.
  • FIG. 50 illustrates an object and a gravity motion path 500, according to one embodiment of the invention.
  • FIG. 51 illustrates a first object orbiting around a second object and an orbit motion path 510 of the first object, according to one embodiment of the invention.
  • FIG. 52 illustrates a Dashboard of an Orbit Around behavior, according to one embodiment of the invention.
  • FIG. 53 illustrates an object and a Random Motion motion path, according to one embodiment of the invention.
  • FIG. 54 illustrates an Orbit Around behavior applied to an object and the object's motion path, according to one embodiment of the invention.
  • FIG. 55 illustrates both an Orbit Around behavior and a Random Motion behavior applied to an object and the object's motion path, according to one embodiment of the invention.
  • FIG. 56 illustrates a Dashboard for a Random Motion behavior, according to one embodiment of the invention.
  • FIG. 57 illustrates several objects, according to one embodiment of the invention.
  • FIG. 58 illustrates the same objects as in FIG. 57 after the Repel behavior has been applied to the central object, according to one embodiment of the invention.
  • FIG. 59 illustrates a Dashboard of a Wind behavior, according to one embodiment of the invention.
  • FIG. 60 illustrates two graphic objects, according to one embodiment of the invention.
  • FIG. 61 illustrates a pop-up menu showing Basic Motion>Motion Path, according to one embodiment of the invention.
  • FIG. 62 illustrates the top object's motion path, according to one embodiment of the invention.
  • FIG. 63 illustrates the bottom object's motion path, according to one embodiment of the invention.
  • FIG. 64 illustrates a Dashboard for the Motion Path behavior showing the Speed parameter as Ease Out, according to one embodiment of the invention.
  • FIG. 65 illustrates a small text object, according to one embodiment of the invention.
  • FIG. 66 illustrates the text object of FIG. 65 with a new anchor point location, according to one embodiment of the invention.
  • FIG. 67 illustrates the Increment pop-up menu of the Grow/Shrink behavior in the Behaviors tab of the Inspector, according to one embodiment of the invention.
  • FIG. 68 illustrates the text object and the Grow/Shrink Dashboard, according to one embodiment of the invention.
  • FIG. 69 illustrates the Fade/Fade Out Dashboard, according to one embodiment of the invention.
  • FIG. 70 illustrates the composition at the first frame, according to one embodiment of the invention.
  • FIG. 71 illustrates the composition at a middle frame, according to one embodiment of the invention.
  • FIG. 72 illustrates the composition at the last frame, according to one embodiment of the invention.
  • FIG. 73 illustrates one example of a particle system, according to one embodiment of the invention.
  • FIG. 74 illustrates another example of a particle system, according to one embodiment of the invention.
  • FIG. 75 illustrates yet another example of a particle system, according to one embodiment of the invention.
  • FIG. 76 illustrates an example of a cell, according to one embodiment of the invention.
  • FIG. 77 illustrates an example of a particle system based on the cell of FIG. 76, according to one embodiment of the invention.
  • FIG. 78 illustrates an example of a particle system based on one cell, according to one embodiment of the invention.
  • FIG. 79 illustrates an example of a particle system based on multiple cells 760A, 760B, according to one embodiment of the invention.
  • FIG. 80 illustrates an example of a Project pane showing an emitter that is based on two cells, according to one embodiment of the invention.
  • FIG. 81 illustrates an example of a Timeline showing an emitter that is based on two cells, according to one embodiment of the invention.
  • FIG. 82 illustrates an example of a particle system based on an emitter, according to one embodiment of the invention.
  • FIG. 83 illustrates another example of a particle system based on the same emitter as in FIG. 82, according to one embodiment of the invention.
  • FIG. 84 illustrates yet another example of a particle system based on the same emitter as in FIGS. 82 and 83, according to one embodiment of the invention.
  • FIG. 85 illustrates an example of an object, according to one embodiment of the invention.
  • FIG. 86 illustrates an example of a particle system of bubbles along with the object of FIG. 85, according to one embodiment of the invention.
  • FIG. 87 illustrates another example of a particle system of bubbles along with the object of FIG. 85, according to one embodiment of the invention.
  • FIG. 88 illustrates an example of a particle system including an emitter and individual particles based on the emitter, according to one embodiment of the invention.
  • FIG. 89 illustrates a simple white circular gradient, according to one embodiment of the invention.
  • FIG. 90 illustrates an Emitter button, according to one embodiment of the invention.
  • FIG. 91 illustrates a new emitter, at the first frame of the particle effect, according to one embodiment of the invention.
  • FIG. 92 illustrates an active particle system, such as the emitter of FIG. 91 but at a later frame, according to one embodiment of the invention.
  • FIG. 93 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 94 illustrates the particle system of FIG. 93 after it has been rescaled, according to one embodiment of the invention.
  • FIG. 95 illustrates a Dashboard for a particle system, according to one embodiment of the invention.
  • FIG. 96 illustrates the particle system of FIGS. 91 and 92 in full effect, according to one embodiment of the invention.
  • FIG. 97 illustrates the particle system of FIG. 96 at another point in time, according to one embodiment of the invention.
  • FIG. 98 illustrates the particle system of FIG. 97 after the value of Scale has been reduced, according to one embodiment of the invention.
  • FIGS. 99 and 100 illustrate the Dashboard and the particle system, respectively, before the previously mentioned actions have been performed, according to one embodiment of the invention.
  • FIGS. 101 and 102 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
  • FIGS. 103 and 104 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
  • FIGS. 105 and 106 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
  • FIGS. 107 and 108 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
  • FIGS. 109 and 110 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
  • FIG. 111 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 112 illustrates the particle system of FIG. 111 after the emitter has been moved, according to one embodiment of the invention.
  • FIG. 113 illustrates a particle system where the emitter's position has been animated using a behavior, or keyframed, according to one embodiment of the invention.
  • FIG. 114 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 115 illustrates the particle system of FIG. 114 after the emitter's Shear parameter has been modified, according to one embodiment of the invention.
  • FIG. 116 illustrates a particle system in the Timeline that comprises one emitter and three nested cells, according to one embodiment of the invention.
  • FIG. 117 illustrates a particle system with dense white particles emerging from the center, according to one embodiment of the invention.
  • FIG. 118 illustrates the particle system of FIG. 117 with more diffuse orange particles appearing around a larger area, according to one embodiment of the invention.
  • FIG. 119 illustrates the particle system of FIG. 118 with small sparks emerging from underneath both of the previous layers as they fade away, according to one embodiment of the invention.
  • FIG. 120 illustrates an Emitter tab and Emitter parameters, according to one embodiment of the invention.
  • FIG. 121 illustrates an Emitter tab and individual controls for several Emitter parameters, according to one embodiment of the invention.
  • FIG. 122 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 123 illustrates the particle system of FIG. 122 after the value of the Scale parameter in the Emitter tab has been increased, according to one embodiment of the invention.
  • FIG. 124 illustrates a particle system with a Point emitter shape, according to one embodiment of the invention.
  • FIG. 125 illustrates a particle system with a Line emitter shape, according to one embodiment of the invention.
  • FIG. 126 illustrates a particle system with a Circle emitter shape, according to one embodiment of the invention.
  • FIG. 127 illustrates a particle system with a Filled Circle emitter shape, according to one embodiment of the invention.
  • FIG. 128 illustrates a particle system with a Geometry emitter shape, according to one embodiment of the invention.
  • FIG. 129 illustrates the shape that was used as the Geometry emitter shape for the particle system of FIG. 128, according to one embodiment of the invention.
  • FIG. 130 illustrates a particle system with an Image emitter shape, according to one embodiment of the invention.
  • FIG. 131 illustrates the image that was used as the Image emitter shape for the particle system of FIG. 130, according to one embodiment of the invention.
  • FIG. 132 illustrates a particle system with a lower birth rate, according to one embodiment of the invention.
  • FIG. 133 illustrates the particle system of FIG. 132 but with a higher birth rate, according to one embodiment of the invention.
  • FIG. 134 illustrates a particle system with a higher initial number, according to one embodiment of the invention.
  • FIG. 135 illustrates the particle system of FIG. 134 but with a lower initial number, according to one embodiment of the invention.
  • FIG. 136 illustrates a particle system with a longer life, according to one embodiment of the invention.
  • FIG. 137 illustrates the particle system of FIG. 136 but with a shorter life, according to one embodiment of the invention.
  • FIG. 138 illustrates a particle system with the Additive Blend parameter turned off, according to one embodiment of the invention.
  • FIG. 139 illustrates a particle system with the Additive Blend parameter turned on, according to one embodiment of the invention.
  • FIG. 140 illustrates a particle system with a Solid Color Mode, according to one embodiment of the invention.
  • FIG. 141 illustrates a particle system with an Over Life Color Mode, according to one embodiment of the invention.
  • FIG. 142 illustrates a particle system with a Range Color Mode, according to one embodiment of the invention.
  • FIG. 143 illustrates a particle system with a Take Image Color Mode, according to one embodiment of the invention.
  • FIG. 144 illustrates a particle system with a larger Scale parameter, according to one embodiment of the invention.
  • FIG. 145 illustrates the particle system of FIG. 144 but with a smaller Scale parameter, according to one embodiment of the invention.
  • FIG. 146 illustrates a particle system with a Point Show Particles As parameter, according to one embodiment of the invention.
  • FIG. 147 illustrates a particle system with a Line Show Particles As parameter, according to one embodiment of the invention.
  • FIG. 148 illustrates a particle system with an Outline Show Particles As parameter, according to one embodiment of the invention.
  • FIG. 149 illustrates a particle system with an Image Show Particles As parameter, according to one embodiment of the invention.
  • FIG. 150 illustrates a Particle Cell tab, according to one embodiment of the invention.
  • FIG. 151 illustrates an object that is being dragged to a position in the Layers tab, according to one embodiment of the invention.
  • FIG. 152 illustrates the object of FIG. 151, now nested within an emitter, according to one embodiment of the invention.
  • FIG. 153 illustrates a particle system, according to one embodiment of the invention.
  • FIG. 154 illustrates the particle system of FIG. 153 after a Sphere filter has been applied, according to one embodiment of the invention.
  • FIG. 155 illustrates a simple graphic with a premultiplied alpha channel, according to one embodiment of the invention.
  • FIG. 156 illustrates an Emitter button, according to one embodiment of the invention.
  • FIG. 157 illustrates a distributed group of particles that partially fills the Canvas, according to one embodiment of the invention.
  • FIG. 158 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 159 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 160 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 161 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 162 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 163 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 164 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 165 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 166 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 167 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 168 illustrates the resulting image, according to one embodiment of the invention.
  • FIG. 169 illustrates one example of a slider, according to one embodiment of the invention.
  • FIG. 170 illustrates one example of a value slider, according to one embodiment of the invention.
  • FIG. 171 illustrates one example of a dial, according to one embodiment of the invention.
  • FIG. 172 illustrates one example of a value field, according to one embodiment of the invention.
  • FIG. 173 illustrates one example of a pop-up menu, according to one embodiment of the invention.
  • FIG. 174 illustrates one example of a value list, according to one embodiment of the invention.
  • FIG. 175 illustrates one example of an activation checkbox, according to one embodiment of the invention.
  • FIG. 176 illustrates one example of a color well, according to one embodiment of the invention.
  • FIG. 177 illustrates one example of a pop-up picker, according to one embodiment of the invention.
  • FIG. 178 illustrates one example of a gradient, according to one embodiment of the invention.
  • FIG. 179 illustrates one example of a drop well, according to one embodiment of the invention.
  • FIG. 180 illustrates one example of a parameter selection field, according to one embodiment of the invention.
  • FIG. 181 illustrates one example of a reset button, according to one embodiment of the invention.
  • FIG. 182 illustrates one example of a manage presets button, according to one embodiment of the invention.
  • FIG. 183 illustrates one example of an animation menu button, according to one embodiment of the invention.
  • FIG. 184 illustrates one example of a shortcut menu filled with Animation related controls, according to one embodiment of the invention.
  • FIG. 185 illustrates one example of a Lock icon, according to one embodiment of the invention.
  • FIG. 186 illustrates one example of a Dashboard, according to one embodiment of the invention.
  • FIG. 187 illustrates one example of a Dashboard title bar displaying a downward facing arrow, according to one embodiment of the invention.
  • FIG. 188 illustrates one example of a pop-up menu that lists all of the possible control sets that can be displayed in the Dashboard for the selected object, according to one embodiment of the invention.
  • FIG. 189 illustrates one example of a Dashboard for a particle system, according to one embodiment of the invention.
  • FIG. 190 illustrates one example of a Dashboard for a Grow/Shrink behavior, according to one embodiment of the invention.
  • FIG. 191 illustrates one example of a Dashboard for a Fade In/Fade Out behavior, according to one embodiment of the invention.
  • FIG. 192 illustrates one example of a Dashboard for a Throw behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 193 illustrates one example of a Dashboard for a Throw behavior where the special control specifies movement in a southeastern direction at a low speed, according to one embodiment of the invention.
  • FIG. 194 illustrates one example of a Dashboard for a Throw behavior where the special control specifies movement in the same direction as in FIG. 193, but at a higher speed, according to one embodiment of the invention.
  • FIG. 195 illustrates one example of a Dashboard for a Wind behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 196 illustrates one example of a Dashboard for a Wind behavior where the special control specifies movement in a northeastern direction at a high speed, according to one embodiment of the invention.
  • FIG. 197 illustrates one example of a Dashboard for a Spin behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 198 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in a clockwise direction at a low speed, according to one embodiment of the invention.
  • FIG. 199 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in the same direction as in FIG. 198, but at a higher speed, according to one embodiment of the invention.
  • FIG. 200 illustrates one example of a Dashboard for a Spin behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 201 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in a counterclockwise direction at a low speed, according to one embodiment of the invention.
  • FIG. 202 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in the same direction as in FIG. 201, but at a much higher speed, according to one embodiment of the invention.
  • FIG. 203 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 204 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies a high grow rate, according to one embodiment of the invention.
  • FIG. 205 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies no movement, according to one embodiment of the invention.
  • FIG. 206 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies a high shrink rate, according to one embodiment of the invention.
  • FIG. 207 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies shrinking in the horizontal direction and simultaneous growing in the vertical direction, according to one embodiment of the invention.
  • FIG. 208 illustrates one example of a Dashboard for a Fade In/Fade Out behavior where the special control specifies a fade in time and a fade out time of equivalent length, according to one embodiment of the invention.
  • FIG. 209 illustrates one example of a Dashboard for a Fade In/Fade Out behavior where the special control specifies a shorter fade in time than in FIG. 208 and no fade out time (i.e., no fade out at all), according to one embodiment of the invention.
  • FIG. 210 illustrates one example of a Dashboard for a Fade In/Fade Out behavior where the special control specifies a similar fade in time to that in FIG. 208 and a longer fade out time than in FIG. 208, according to one embodiment of the invention.
  • FIG. 211 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in all directions (i.e., there is no specified range) at a medium/high speed, according to one embodiment of the invention.
  • FIG. 212 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in only certain directions (i.e., there is a specified range) and at a medium speed, according to one embodiment of the invention.
  • FIG. 213 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in only certain directions (i.e., there is a specified range, and the range is narrower than the range in FIG. 211) and at a low speed, according to one embodiment of the invention.
  • FIG. 214 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in only certain directions (i.e., there is a specified range, and the range is narrower than the range in FIG. 212) and at a high speed, according to one embodiment of the invention.
  • The figures depict a preferred embodiment of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
  • DETAILED DESCRIPTIONS OF THE PREFERRED EMBODIMENTS
  • In one embodiment, the visual representation of an object may be specified by two pieces of information: a source image and a collection of parameters that modify the source image. In one embodiment, by modifying the values of these parameters over time, an object can be animated. In another embodiment, for example, by modifying the size of an image and the opacity of an image over time, an object can appear to grow or shrink or fade in or fade out, respectively. In one embodiment, the visual representation of an object can also be assigned a position parameter. In another embodiment, by modifying the value of this position parameter over time, an object can appear to move.
  • In one embodiment, a behavior is an animation abstraction that, when applied to an object, causes the object to be animated in a particular way. In another embodiment, specifically, a behavior changes the value of a parameter of an object over time, thereby animating the object with respect to that parameter. In yet another embodiment, for example, a “shrink” behavior may cause an object to decrease in size by decreasing the values of the object's length and height parameters. In one embodiment, as another example, a “throw” behavior may cause an object to move in a specific direction with a specific speed by modifying the object's location on the screen over time.
  • In one embodiment, a behavior changes the value of only one parameter of an object over time. In another embodiment, for example, a “stretch” behavior may stretch an object by increasing the value of the object's length parameter while not modifying the value of the object's height parameter. In yet another embodiment, a behavior changes the value of more than one parameter of an object over time. In one embodiment, for example, the “shrink” behavior mentioned above decreases the values of the object's length and height parameters.
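  • As a concrete sketch of these ideas (class and attribute names are illustrative, not from the specification), a behavior can be modeled as a function of time that drives one or more parameters of its target object:

        class Shrink:
            """Changes two parameters (length and height) over time."""
            def __init__(self, rate):
                self.rate = rate                     # fraction lost per second

            def apply(self, obj, t):
                factor = max(0.0, 1.0 - self.rate * t)
                obj.length = obj.base_length * factor
                obj.height = obj.base_height * factor

        class Throw:
            """Changes one compound parameter (position) over time."""
            def __init__(self, vx, vy):
                self.vx, self.vy = vx, vy            # direction and speed

            def apply(self, obj, t):
                obj.x = obj.base_x + self.vx * t
                obj.y = obj.base_y + self.vy * t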
  • Recall that, in one embodiment, a behavior changes the value of a parameter of an object over time. In one embodiment, while a behavior specifies how it affects a parameter, a behavior may or may not specify which parameter it affects. In another embodiment, when a behavior specifies a particular parameter, the behavior is applied to an object and affects that particular parameter of the object. In yet another embodiment, when a behavior does not specify a particular parameter, the behavior is applied to a parameter of an object (any parameter) and affects that parameter in a particular way. In one embodiment, thus far, only two ways have been discussed in which a behavior may affect the value of a parameter of an object—increasing and decreasing. In another embodiment, however, many more such ways exist. In yet another embodiment, these ways include oscillating, randomizing, and reversing. In one embodiment, for example, an “oscillate rotation” behavior might be called “rock.” In another embodiment, thus, a behavior that specifies a particular parameter is applied to an object, while a behavior that does not specify a particular parameter is applied to a parameter of an object.
  • In one embodiment, one way to refer to a behavior that specifies a parameter is to indicate which parameters the behavior affects and in what way. In one embodiment, for example, a behavior that decreases the brightness of an object may be known as the “decrease brightness” behavior. In another embodiment, sometimes, though, it is more useful to name a behavior based on the animation that it causes. In yet another embodiment, for example, the “decrease brightness” behavior may be called the “darken” behavior. In one embodiment, similarly, an “increase length, increase height” behavior may be called the “grow” behavior. In another embodiment, descriptive titles, such as “darken” and “grow,” help the user understand how a behavior will animate an object.
  • In one embodiment, in order to apply a behavior to an object, where the behavior specifies the parameter to be animated, a user selects a behavior and selects an object to which the behavior should be applied. In one embodiment, note that these two steps may occur in any order. In another embodiment, a user selects a behavior or an object by choosing it from a menu. In yet another embodiment, a user selects a behavior or an object by clicking on a visual representation of the behavior or object, such as an icon (for a behavior or an object) or the object itself (for an object). In one embodiment, a user applies a behavior to an object by clicking on the behavior and dragging it onto the target object.
  • In one embodiment, in order to apply a behavior to a parameter of an object, where the behavior does not specify the parameter to be animated, a user selects a behavior and selects a parameter of an object to which the behavior should be applied. In one embodiment, note that these two steps may occur in any order. In another embodiment, a user selects a behavior by choosing it from a menu or by clicking on a visual representation of the behavior, as described above. In yet another embodiment, a user selects a parameter of an object by first selecting an object and then selecting a parameter of the object. In one embodiment, a user may select an object by choosing it from a menu or by clicking on a visual representation of the object, as described above. In another embodiment, once an object has been selected, a user may display a list of the object's parameters and select a parameter by clicking on it. In yet another embodiment, a user applies a behavior to a parameter of an object by clicking on the behavior and dragging it onto the target parameter. In one embodiment, an object parameter to which a behavior has been applied is identified in the list of parameters of the object. In another embodiment, an icon appears near the object parameter to which a behavior has been applied.
  • In one embodiment, a behavior may be simultaneously applied to multiple objects or to multiple parameters of an object. In one embodiment, instead of selecting one object or one parameter of an object to which the behavior should be applied, the user selects multiple objects or multiple parameters of an object to which the behavior should be applied.
  • In one embodiment, once a behavior has been applied to an object or to an object parameter, it may be removed by deleting it. In one embodiment, a behavior's target object or target object parameter may be changed without having to delete the behavior and create a new behavior.
  • In one embodiment, the animation caused by a behavior may be customized by specifying a value for one or more parameters associated with the behavior. In one embodiment, for example, the “stretch” behavior may have a parameter that indicates how fast the object will stretch (i.e., at what rate the object's length parameter will increase). In another embodiment, as another example, the “throw” behavior may have a parameter that indicates in which direction the object should move (i.e., how the object's location on the screen should be changed). In yet another embodiment, initially, when a behavior is applied, these parameters have default values. In one embodiment, methods of specifying other values for these parameters will be further discussed below.
  • In one embodiment, behaviors exist independently of the objects to which they are applied. In one embodiment, this means that behaviors are reusable—the same behavior can be applied to two different objects to animate the objects in a similar way. In another embodiment, the user may select a behavior from a group of pre-defined behaviors (a “behavior library”). In yet another embodiment, these behaviors may be, for example, the most useful behaviors or the most difficult behaviors to implement. In one embodiment, a behavior in the library is saved and may be re-used in the future.
  • In one embodiment, the user creates behaviors to add to this library. In one embodiment, these behaviors may be created by assigning values to a behavior's parameters or by specifying a particular parameter of an object to be affected (e.g., where the behavior previously did not specify an object parameter). In another embodiment, a user creates a behavior from scratch or combines multiple behaviors into one behavior.
  • In one embodiment, as mentioned above, two behaviors can be combined to form one new behavior. In one embodiment, alternatively, two behaviors may be applied to the same object but still retain their independent nature. In another embodiment, in fact, any number of behaviors may be applied to one object at the same time. In yet another embodiment, in this situation, each behavior would affect the object at the same time. In one embodiment, sometimes, when multiple behaviors are applied to the same object, the object may be animated in a different way depending on the order in which the behaviors were applied.
  • Behaviors Contrasted with Keyframes
  • In one embodiment, a keyframe is a visual representation of an object at a particular point in time. In one embodiment, by defining several keyframes, a user can specify how the visual representation of an object changes over time. In another embodiment, since the representation of an object may change drastically between keyframes, simply showing a number of keyframes in succession would result in jerky transitions. In yet another embodiment, in order to obtain a smooth animation, new visual representations must be calculated that fall between keyframes in time and that are similar to surrounding keyframes. In one embodiment, this is known as “inbetweening.”
  • In one embodiment, applying a behavior to an object does not add keyframes to an object or to its parameters. In one embodiment, instead, a behavior generates a range of values for a parameter of an object and then sets the parameter to these values over time, thereby animating the object. In another embodiment, the range of values applied to an object's parameters is controlled by the behavior's parameters.
  • In one embodiment, keyframes apply specific values to an object's parameters. In one embodiment, when multiple keyframes are created that specify different values for the same parameter of an object, that parameter is animated from the value in the first keyframe to the value in the last keyframe. In another embodiment, if the value specified by one keyframe is changed, the other keyframes (and thus, the values that they specify) are not modified.
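To make the contrast concrete, the following minimal Python sketch (hypothetical, linear interpolation only) shows keyframe inbetweening next to a behavior, which generates its values from its own parameters rather than from stored keyframes.

```python
def inbetween(keyframes, t):
    """keyframes: sorted list of (time, value) pairs; returns the
    interpolated value at time t (linear inbetweening)."""
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            return v0 + (t - t0) / (t1 - t0) * (v1 - v0)

# Keyframes pin specific values; inbetweening fills the frames between.
opacity_keys = [(0.0, 0.0), (1.0, 100.0), (3.0, 100.0), (4.0, 0.0)]
print(inbetween(opacity_keys, 0.5))       # 50.0

# A behavior, by contrast, generates values from its own parameters:
throw_x = lambda t, rate=60.0: rate * t   # no keyframes stored anywhere
print(throw_x(0.5))                       # 30.0
```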
  • Behaviors and keyframes will be further discussed below.
  • Working with Behaviors
  • In one embodiment, a user can use Behaviors to animate objects using simple, graphical controls. In one embodiment, with Behaviors, a user can create simple motion effects or complex simulated interactions between multiple objects quickly and easily.
  • In one embodiment, a user can add behaviors to objects or properties in a project to create animated effects without needing to create or adjust keyframes. In one embodiment, when a user drags a behavior onto an object, the object is automatically animated based on the type of behavior applied. In another embodiment, a user can customize a Behavior's parameters in the Dashboard, or in the Behaviors tab of the Inspector, to change its effect.
  • In one embodiment, behaviors are designed to be flexible, and can be combined with one another to create all kinds of effects. In one embodiment, using behaviors, motion graphics design becomes interactive, allowing a user to create complex motion effects and simulated object interactions very quickly.
  • In one embodiment, behaviors can also be used to animate nearly any individual object, particle system emitter, filter, and generator parameter. In one embodiment, this allows a user to quickly create animated backgrounds, dynamic filter effects, and incredibly complex particle systems, all using a few simple controls.
  • In one embodiment, there are five different kinds of behaviors available to the user.
      • Basic Motion behaviors—In one embodiment, basic motion behaviors are among the simplest behaviors. In one embodiment, basic motion behaviors animate specific parameters of the object to which they are applied. In another embodiment, some basic motion behaviors affect position, while others affect scale or rotation. In yet another embodiment, examples include Fade In/Fade Out, Spin, and Throw.
      • Parameter behaviors—In one embodiment, parameter behaviors can be applied to any object parameter, and their effects are limited to just that parameter. In one embodiment, the same parameter behavior can be added to different parameters, resulting in completely different effects. In another embodiment, for example, a user can apply the oscillate behavior to the opacity of an object to make it fade in and out, or a user can apply it to the rotation of an object to make it rock back and forth. In yet another embodiment, a user can also apply parameter behaviors to filter parameters, Generator parameters, the parameters of particle systems, or even the parameters of other behaviors. In one embodiment, examples include Oscillate, Randomize, and Reverse.
      • Simulation behaviors—In one embodiment, simulation behaviors perform one of two tasks. In one embodiment, some simulation behaviors, such as Gravity, animate the parameters of an object in a way that simulates a real-world phenomenon. In another embodiment, other simulation behaviors, such as Attractor and Repel, affect the parameters of one or more objects surrounding the object to which they're applied. In yet another embodiment, simulation behaviors allow a user to create some very sophisticated interactions among multiple objects in a project with a minimum of adjustments. In one embodiment, like the basic motion behaviors, simulation behaviors also affect specific object parameters. In another embodiment, examples include Attractor, Gravity, and Repel.
      • Particles behaviors—In one embodiment, particles behaviors are specifically designed to be applied to cells within particle systems. In one embodiment, these behaviors affect how individual particles are animated over the duration of their life.
      • Text behaviors—In one embodiment, text behaviors animate the parameters of text objects to create various animated effects. In one embodiment, examples include Scroll Up, which causes text to move vertically to create scrolling titles or credits, and Type On, which reveals a text object letter by letter.
  • Behaviors vs. Keyframes—In one embodiment, it's important to understand that behaviors do not add keyframes to the objects or parameters to which they're applied. In one embodiment, instead, behaviors automatically generate a range of values that are then applied to an object's parameters, animating it over the duration of that behavior. In another embodiment, changing the parameters of a behavior alters the range of values that behavior generates.
  • In one embodiment, keyframes, on the other hand, apply specific values directly to a parameter. In one embodiment, when a user creates two or more keyframes with different values in a Parameter in the Keyframe Editor, a user animates that parameter from the first keyframed value to the last. In another embodiment, if a user changes the value of a single keyframe, it has no effect on any other keyframes applied to the same parameter.
  • In one embodiment, by design, behaviors are most useful for creating generalized, ongoing motion effects. In one embodiment, behaviors are also extremely useful for creating animated effects that might be too complex or time-consuming to keyframe manually. In another embodiment, keyframing, in turn, may be more useful for creating specific animated effects where the parameter a user is adjusting is required to hit a specific value at a specific time.
  • A. Browsing For Behaviors
  • In one embodiment, available behaviors appear in the Library tab. In one embodiment, selecting the Behaviors category in the category pane reveals the four behavior subcategories. In another embodiment, selecting a subcategory reveals behaviors of that type in the Library Stack pane. In yet another embodiment, when a user selects a behavior in the Library Stack, a short description of it appears to the right of the Preview window.
  • B. Applying and Removing Behaviors
  • In one embodiment, how a user applies a behavior depends on what kind of behavior it is. In one embodiment, some behaviors are applied directly to objects in the Canvas, while others must be applied specifically to individual object parameters in the Inspector.
  • i. Applying and Removing Motion, Simulation, and Text Behaviors
  • In one embodiment, a user applies these behaviors directly to objects in the Canvas, Layers tab, or Timeline. In one embodiment, these behaviors automatically animate specific parameters of the object to which they're applied. In another embodiment, for example, the Throw behavior only affects an object's Position parameter, and the Grow/Shrink behavior only affects an object's Scale parameter.
  • In one embodiment, if the Create Objects At preference in the Project Preferences window is set to Current Frame, newly applied behaviors will be added at the position of the Playhead in the Timeline. In one embodiment, behavior animation begins at the first frame a behavior appears, so a behavior's position in the Timeline is important.
  • In one embodiment, to apply a behavior to an object in a project, do one of the following:
      • In one embodiment, drag a Basic Motion, Simulation, or Text behavior onto an object in the Canvas, Layers tab, or Timeline.
      • In one embodiment, select an object in the Canvas, Layers tab, or Timeline, and choose a behavior from the Behaviors menu (in the menu bar or toolbar).
      • In one embodiment, select an object in the Canvas, Layers tab, or Timeline, then select a behavior from the Library stack and click Apply in the Preview pane.
  • In one embodiment, to apply a behavior to multiple objects:
      • In one embodiment, select all of the objects to which to apply the behavior. In one embodiment, in either the Layers tab or Timeline, Shift-click to select a contiguous set of objects, or Command-click to select individual, non-contiguous objects.
      • In one embodiment, do one of the following:
        • In one embodiment, choose a behavior from the Behaviors menu (in the menu bar or Toolbar).
        • In one embodiment, select a behavior in the Library and click Apply in the Preview pane. In one embodiment, to see the animated effect in action, play the project.
  • In one embodiment, a user can also apply behaviors directly to Layers in the Layers tab or Timeline. In one embodiment, behaviors applied to a Layer affect all objects nested within that layer as if they were a single object.
  • In one embodiment, when a behavior is applied to an object, the object parameters affected by that behavior are automatically animated based on the behavior's default settings. In one embodiment, for example, if a user applies the Gravity behavior to an object in the Canvas and then plays the project, that object's position is animated and it moves down, according to the Gravity behavior's default setting.
  • a. Where Behaviors Appear
  • In one embodiment, when a user applies a behavior 10 to an object 12, it appears nested underneath that object in the Layers tab 14 and the Timeline 16. In one embodiment, behaviors are listed in the order in which they were applied in the Behaviors tab 18 of the Inspector 19. FIG. 1 illustrates a behavior in the Layers tab, according to one embodiment of the invention. FIG. 2 illustrates a behavior in the Timeline, according to one embodiment of the invention. FIG. 3 illustrates a behavior in the Behaviors tab of the Inspector, according to one embodiment of the invention.
  • In one embodiment, a gear icon 40 also appears to the right of the layer or object name in the Layers tab or Timeline. In one embodiment, clicking this icon enables and disables all behaviors that have been applied to that layer or object. FIG. 4 illustrates a gear icon, according to one embodiment of the invention.
  • b. Behavior Effects in the Keyframe Editor
  • In one embodiment, if a user opens the Keyframe Editor and looks at a parameter that's affected by one or more behaviors, he'll see a background curve that represents the behavior's effect in addition to that parameter's keyframe curve. In one embodiment, this curve is uneditable, and is there to display the behavior's effect on that parameter.
  • c. Removing Behaviors
  • In one embodiment, since behaviors don't add keyframes, removing a behavior instantly eliminates its animated effect. In one embodiment, all types of behaviors are removed in the same way.
  • In one embodiment, to remove a behavior from an object:
      • In one embodiment, select a behavior in the Layers tab, Timeline, or Behaviors tab.
      • In one embodiment, do one of the following:
      • In one embodiment, press the Delete key.
      • In one embodiment, choose Edit>Delete.
      • In one embodiment, right-click the behavior in the Timeline, and choose Delete from the shortcut menu.
  • ii. Applying Parameter Behaviors
  • In one embodiment, parameter behaviors are applied differently than the other types of behaviors. In one embodiment, while all other behaviors affect specific object parameters, parameter behaviors can be applied to any of an object's parameters. In another embodiment, this also includes the parameters of filters, emitters and cells in particle systems, and other behaviors that have been applied to an object.
  • In one embodiment, a parameter behavior's effect on an object depends on the parameter to which it is applied. In one embodiment, for example, if a user applies the Randomize parameter behavior to an object's Position parameter, that object drifts around the screen when the project is played. In another embodiment, applying the Randomize parameter behavior to an object's Scale parameter, instead, makes the object randomly grow and shrink.
  • In one embodiment, to apply a parameter behavior to an object's specific parameter:
      • In one embodiment, select an object to which to apply the parameter behavior.
      • In one embodiment, open the Inspector.
      • In one embodiment, do one of the following:
        • In one embodiment, right-click a parameter in the Inspector, and choose a parameter behavior to add from the shortcut menu.
        • In one embodiment, select a parameter, click the Behaviors button in the Toolbar, and choose a Parameter behavior from the pop-up list.
  • In one embodiment, if a user saves a parameter behavior as a favorite, the parameter to which it was applied is saved along with the rest of that behavior's settings. In one embodiment, as a result, it can be applied like any other behavior and that parameter is automatically affected.
  • a. Where Parameter Behaviors Appear
  • In one embodiment, when a Parameter behavior has been applied to an object in a project, a gear icon 40 appears in the keyframe menu to the right of the affected parameter in the Properties, Behaviors, or Filters tab where it's applied. In one embodiment, this shows a user that a parameter behavior is influencing that parameter. In another embodiment, a gear icon also appears in the keyframe menu of each affected parameter in the Keyframe Editor. FIG. 5 illustrates a gear icon in the filters tab of the Inspector, according to one embodiment of the invention. FIG. 6 illustrates a gear icon in the Keyframe Editor, according to one embodiment of the invention.
  • In one embodiment, like other behaviors, parameter behaviors 10 appear nested underneath the objects to which they're applied in the Layers tab and the Timeline, along with any other behaviors that have been applied to that object. FIG. 7 illustrates a parameter behavior in the Layers tab, according to one embodiment of the invention. FIG. 8 illustrates a parameter behavior in the Timeline, according to one embodiment of the invention.
  • In one embodiment, opening a parameter's keyframe menu reveals the names of all the Parameter behaviors 10 currently applied to that parameter. In one embodiment, choosing one automatically opens that object's Behaviors tab. FIG. 9 illustrates a parameter's pop-up menu, according to one embodiment of the invention.
  • b. Reassigning a Parameter Behavior to Another Parameter
  • In one embodiment, once a user applies a parameter behavior, it remains assigned to that parameter unless the user reassigns it. In one embodiment, this is possible using the parameter assignment pop-up, located at the bottom of each parameter behavior in the Behaviors tab. In another embodiment, the parameter assignment pop-up displays all of the Properties available for the object to which that behavior has been applied. In yet another embodiment, if an object has other behaviors or filters applied to it, those parameters also appear within submenus of the Apply To pop-up menu.
  • In one embodiment, to reassign a parameter behavior to another parameter:
      • In one embodiment, select the object with the parameter behavior to reassign.
      • In one embodiment, open the Behaviors tab in the Inspector.
      • In one embodiment, choose a new parameter from the Apply To pop-up menu 100. FIG. 10 illustrates an Apply To pop-up menu, according to one embodiment of the invention.
  • C. Customizing Behaviors
  • In one embodiment, each behavior has a subset of parameters that appear in the Dashboard. In one embodiment, in addition, all controls for behaviors appear in the Behaviors tab of the Inspector. In another embodiment, both the Dashboard and the Behaviors tab reference the same parameters, so changing a parameter in one automatically changes the same parameter in the other.
  • i. Customizing Parameters in the Dashboard
  • In one embodiment, in general, the parameters that appear in the Dashboard 110 are the most essential ones for modifying that behavior's effect. In one embodiment, frequently, the controls available in a behavior's Dashboard are also more descriptive and easier to use than those in the Behaviors tab 18, although the Behaviors tab may contain more controls. In another embodiment, for example, compare the controls for the Fade In/Fade Out behavior in the Behaviors tab 18 to those available in the Dashboard 110. FIG. 11 illustrates the controls for the Fade In/Fade Out behavior in the Dashboard, according to one embodiment of the invention. FIG. 12 illustrates the controls for the Fade In/Fade Out behavior in the Behaviors tab, according to one embodiment of the invention.
  • In one embodiment, the controls in the Dashboard consolidate all of the parameters available in the Behaviors tab into a single, graphical control. In one embodiment, there are times, however, when it may be more desirable to use a behavior's individual parameters to control the effect a user is trying to achieve in greater detail.
  • In one embodiment, to display the Dashboard of a behavior:
      • In one embodiment, select an object with a behavior applied to it.
      • In one embodiment, do one of the following:
        • In one embodiment, select the behavior to modify in the Layers tab, Timeline, or Behaviors tab.
        • In one embodiment, control-click an object in the Canvas, and choose a behavior from the Behavior submenu in the shortcut menu.
      • In one embodiment, make adjustments to the behavior using the controls in the Dashboard. In one embodiment, if the Dashboard doesn't appear, the user may need to choose Window>Show Dashboard (or press F7).
  • In one embodiment, to switch among all behaviors applied to an object in the Dashboard, click the disclosure triangle next to the name at the top of the Dashboard to open a pop-up menu that displays all of the behaviors, filters, and masks that are applied to that object. In one embodiment, choose a behavior from this list to display its parameters in the Dashboard.
  • ii. Customizing Parameters in the Behaviors Tab of the Inspector
  • In one embodiment, the Behaviors tab displays every behavior that's applied to the selected object. In one embodiment, a disclosure triangle to the left of each behavior's name reveals all of that behavior's parameters underneath. In another embodiment, unlike the Dashboard, the Behaviors tab displays every parameter a behavior has.
  • In one embodiment, to display the Behaviors tab:
      • In one embodiment, select an object with a behavior applied to it.
      • In one embodiment, open the Inspector, then click the Behaviors tab. In one embodiment, all of the behaviors applied to that object appear within.
  • D. Working With Behaviors
  • In one embodiment, this section describes how to enable, rename, lock, duplicate, move, and reorganize behaviors in a project. In one embodiment, these procedures apply to every type of behavior.
  • i. Controls for Enabling, Renaming, and Locking Behaviors
  • In one embodiment, when a user applies a behavior to an object, the behavior appears in three different places—the Layers tab 14, the Timeline 16, and the Behaviors tab 18 of the Inspector. In one embodiment, while the Behaviors tab in the Inspector contains all of the editable parameters for a behavior that's been applied to an object, the Layers tab and Timeline have three basic controls for each behavior: Activate 130, Enable/Disable 132, and Lock 134. FIG. 13 illustrates the Activate control, the Enable/Disable control, and the Lock control in the Layers tab, according to one embodiment of the invention. FIG. 14 illustrates the Activate control, the Enable/Disable control, and the Lock control in the Timeline, according to one embodiment of the invention.
  • Activate control—In one embodiment, the Activate control is a checkbox that turns each individual behavior on or off. In one embodiment, behaviors that are turned off are not rendered.
  • Name—In one embodiment, a user can double-click in the Name field to rename the behavior.
  • Lock—In one embodiment, click the lock control to lock or unlock a behavior. In one embodiment, a user cannot modify the parameters of a locked behavior.
  • Enable/Disable Behaviors control—In one embodiment, the enable/disable behaviors control 150 is a gear icon that appears to the right of the name of each object with one or more behaviors applied to it. In one embodiment, clicking this icon toggles all behaviors applied to that object on and off. FIG. 15 illustrates an enable/disable behaviors control that has been toggled to disabled, according to one embodiment of the invention.
  • Show Behaviors control—In one embodiment, the show behaviors control 160 is a button at the bottom of the Layers tab and Timeline that lets a user show or hide all behaviors. In one embodiment, this button neither enables nor disables behaviors that have been applied to objects in a project; it only controls their visibility. FIG. 16 illustrates a show behaviors control that has been toggled to show, according to one embodiment of the invention.
  • ii. Copying, Pasting, and Moving Behaviors
  • In one embodiment, after a user has added behaviors to an object, there are a number of ways he can copy and move them among the other objects in the Timeline or Layers tab. In one embodiment, behaviors can be cut, copied, and pasted like any other object. In another embodiment, when a user cuts or copies a behavior in the Timeline or Layers tab, he also copies the current state of all that behavior's parameters.
  • In one embodiment, to cut or copy a behavior:
      • In one embodiment, select a behavior.
      • In one embodiment, do one of the following:
        • In one embodiment, choose Edit>Cut (or press Command+X) to remove the behavior and place it on the Clipboard.
        • In one embodiment, choose Edit>Copy (or press Command+C) to leave the behavior in place and copy it to the Clipboard.
  • In one embodiment, to paste a behavior:
      • In one embodiment, select an object into which to paste the behavior.
      • In one embodiment, choose Edit>Paste (or press Command+V).
      • In one embodiment, the cut or copied behavior is applied to the selected object, with all its parameter settings intact.
  • In one embodiment, a user can also move a behavior 10 from one object 12 to another in the Timeline or Layers tab simply by dragging it. In one embodiment, to move a behavior from one object to another, in the Timeline or Layers tab, drag the behavior from one object and drop it on top of another. In another embodiment, if a user moves a parameter behavior 10 to another object 12, it is applied to whichever parameter it affected in the previous object. FIG. 17 illustrates a behavior that has been selected in the Layers tab, according to one embodiment of the invention. FIG. 18 illustrates a behavior that is being dragged to another object in the Layers tab, according to one embodiment of the invention.
  • In one embodiment, a user can also duplicate a behavior in place. In one embodiment, to duplicate a behavior:
      • In one embodiment, select the behavior to duplicate.
      • In one embodiment, do one of the following:
        • In one embodiment, choose Edit>Duplicate (or press Command+D).
        • In one embodiment, right-click on the behavior to duplicate, and choose Duplicate from the shortcut menu.
  • In one embodiment, a user can also duplicate a behavior and apply the duplicate to another object in the Timeline or Layers tab. In one embodiment, to drag a duplicate of a behavior to another object:
      • In one embodiment, press and hold the Option key, and click on the behavior to duplicate.
      • In one embodiment, holding the mouse button down, drag the behavior to the object to which the duplicate should be applied.
      • In one embodiment, release the mouse button.
        In one embodiment, the duplicated behavior is applied to the second object.
  • In one embodiment, when a user duplicates an object, he also duplicates all behaviors that have been applied to the object. In one embodiment, this way, if the user is creating a project with a number of objects that all need to use the same behavior, the user can simply apply that behavior to the first instance of that object, and then duplicate that object as many times as necessary.
  • iii. Applying Multiple Behaviors to an Object
  • In one embodiment, there is no limit to the number of behaviors 10 a user can add to an object 12. In one embodiment, when multiple behaviors 10 are applied to a single object 12, they all work together to create a final animated effect. FIG. 19 illustrates an object with multiple behaviors in the Timeline, according to one embodiment of the invention. FIG. 20 illustrates an object with multiple behaviors in the Layers tab, according to one embodiment of the invention.
  • In one embodiment, since each behavior applies a value to a specific parameter, the values generated by all behaviors that affect the same parameters are added together to create the end result. In one embodiment, for example, applying the Throw, Spin, and Gravity behaviors to a single object results in the Throw and Gravity behaviors jointly affecting the position of the object, and the Spin behavior affecting its rotation.
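A minimal Python sketch of this additive evaluation might look as follows; the function names and rate values are hypothetical, and each behavior is reduced to a function of time for clarity.

```python
def evaluate(base_params, behaviors, t):
    """behaviors: list of (parameter_name, value_fn) pairs; every
    contribution to the same parameter is summed onto the base value."""
    result = dict(base_params)
    for param, value_fn in behaviors:
        result[param] += value_fn(t)
    return result

base = {"position_x": 100.0, "position_y": 100.0, "rotation": 0.0}
behaviors = [
    ("position_x", lambda t: 60.0 * t),            # Throw: constant drift
    ("position_y", lambda t: 0.5 * 30.0 * t * t),  # Gravity: accelerates down
    ("rotation",   lambda t: 90.0 * t),            # Spin: degrees per second
]
print(evaluate(base, behaviors, t=2.0))
# {'position_x': 220.0, 'position_y': 160.0, 'rotation': 180.0}
```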
  • a. Reordering Behaviors
  • In one embodiment, when a user applies a number of behaviors to a single object, they all appear nested beneath that object in the Timeline and Layers tab. In one embodiment, a user can change the order in which the behaviors are applied. In another embodiment, while the effects of most behaviors on a parameter are additive, this is useful more as an organizational tool than as a way to change the animated effect created by the behaviors a user has added to an object. In yet another embodiment, one notable exception to this is the Stop behavior, which suspends the activity of all behaviors appearing beneath it, while ignoring any behaviors above it.
  • In one embodiment, to reorder a behavior:
      • In one embodiment, in the Timeline or Layers tab, select the behavior to reorder.
      • In one embodiment, drag the behavior 10 up or down the list of nested behaviors applied to the same object 12. In one embodiment, a position indicator 210 shows where the behavior 10 appears when the user releases the mouse button. FIG. 21 illustrates a behavior being dragged and a position indicator, according to one embodiment of the invention.
      • In one embodiment, when the position indicator is in the correct position, release the mouse button.
  • E. Changing the Timing of Behaviors
  • In one embodiment, a user can change a behavior's timing to control when it starts, how long it lasts, and when it stops. In one embodiment, there are several ways of accomplishing this. In another embodiment, the user can use the Stop parameter behavior to suspend one or more behaviors' effects on a single parameter. In yet another embodiment, a user can also trim each behavior in the Timeline. In one embodiment, finally, a user can change a parameter behavior's Start Offset parameter to delay its beginning, and its End Offset parameter to end the behavior prior to the end of the object to which it is applied.
  • i. Using the Stop Behavior
  • In one embodiment, the easiest way of controlling behavior timing is to use the Stop parameter behavior. In one embodiment, the Stop behavior halts the animation occurring in any parameter, whether the animation is due to keyframes in the Keyframe Editor, or behaviors that have been applied to that object.
  • In one embodiment, to stop a parameter from animating with the Stop parameter behavior:
      • In one embodiment, select an object, and open the Properties tab in the Inspector.
      • In one embodiment, if the Create Objects At preference is set to Current Frame, move the Playhead to the time the animation should stop.
      • In one embodiment, command-click the parameter to stop, and choose Stop from the shortcut menu.
  • In one embodiment, when a user applies a Stop behavior to an object, its position in the Layers tab and Timeline affects which of the other behaviors applied to the same object are stopped. In one embodiment, animation caused by all behaviors appearing underneath the Stop behavior that affect the same parameter is suspended. In another embodiment, behaviors appearing above the Stop behavior are not affected.
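The following minimal Python sketch illustrates these ordering semantics, assuming behaviors are evaluated in the top-to-bottom order in which they are listed; the names are hypothetical.

```python
def evaluate_with_stop(stack, t):
    """stack: behaviors listed top to bottom, as in the Layers tab.
    Entries are ('stop', parameter) or (parameter, value_fn).  A Stop
    suspends later (beneath) contributions to its parameter; entries
    above it are unaffected."""
    stopped, totals = set(), {}
    for entry in stack:
        if entry[0] == "stop":
            stopped.add(entry[1])
        else:
            param, value_fn = entry
            if param not in stopped:
                totals[param] = totals.get(param, 0.0) + value_fn(t)
    return totals

stack = [
    ("rotation", lambda t: 90.0 * t),    # above the Stop: still animates
    ("stop", "rotation"),
    ("rotation", lambda t: -45.0 * t),   # beneath the Stop: suspended
]
print(evaluate_with_stop(stack, t=1.0))  # {'rotation': 90.0}
```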
  • ii. Trimming Behaviors
  • In one embodiment, when a user applies a behavior 10 to an object 12, the duration of the behavior 10 in the Timeline 16 defaults to the duration of the object 12 to which the behavior 10 has been applied. FIG. 22 illustrates an object with a behavior in the Timeline, according to one embodiment of the invention.
  • In one embodiment, a behavior's duration can be modified to limit the duration of its effect. In one embodiment, for example, if a user applies the Spin behavior to an object, by default that object spins around for its duration. In another embodiment, if a user trims the out point of the Spin behavior, the spinning stops at the new position of the out point.
  • In one embodiment, to alter the duration of a behavior 10 applied to an object 12 in the Timeline 16:
      • In one embodiment, move the cursor to the in or out point of any behavior in the Timeline.
      • In one embodiment, when the cursor changes to the Trim cursor 230, do one of the following:
        • In one embodiment, drag the in point to delay the beginning of the behavior's effect.
        • In one embodiment, drag the out point to end the behavior's effect prior to the end of the object.
          In one embodiment, when a user drags the In or Out point of a behavior, a tooltip 232 appears that displays the new location and duration of the behavior's edit point. FIG. 23 illustrates a behavior being trimmed in the Timeline and a tooltip, according to one embodiment of the invention.
  • In one embodiment, since behaviors don't add keyframes to the objects to which they're applied, trimming the out point of a behavior usually resets the object to its original state. In one embodiment, for many behaviors, using the Stop behavior to pause the object's animation is a better method than trimming its out point. In another embodiment, another way to stop a behavior's effect and leave the affected object in the transformed state is to adjust a behavior's Start and End Offset parameters.
  • Note: In one embodiment, the Spin and Throw behaviors leave the object at the transformed state after the last frame of the trimmed behavior for the object's remaining duration.
  • a. Slipping Behaviors in Time
  • In one embodiment, in addition to changing a behavior's 10 duration, a user can also slip its position in the Timeline 16 relative to the object 12 it is nested under. In one embodiment, this lets the user set the frame at which that behavior 10 begins to take effect.
  • In one embodiment, to slip a behavior in the Timeline:
      • In one embodiment, click anywhere within the middle of a behavior 10 in the Timeline.
      • In one embodiment, drag the behavior 10 to the left or right to move it to another position in the Timeline 16. In one embodiment, while the user moves the behavior, a tooltip 240 appears which displays the new In and Out points for the behavior. FIG. 24 illustrates a behavior being moved in the Timeline and a tooltip, according to one embodiment of the invention.
        FIG. 25 illustrates a behavior after it has been moved in the Timeline, according to one embodiment of the invention.
  • iii. Changing the Offset of Parameter Behaviors
  • In one embodiment, parameter behaviors have two additional parameters, Start Offset and End Offset. In one embodiment, these parameters are used to change the frame where a parameter behavior's effect begins and ends. In one embodiment, the Start Offset parameter has a slider that lets a user delay the beginning of the behavior's effect, relative to the first frame of its position in the Timeline. In another embodiment, a user can adjust this parameter to make the parameter behavior start later. In yet another embodiment, the End Offset, in turn, lets a user offset the end of the behavior's effect relative to the last frame of its position in the Timeline. In one embodiment, using this slider to stop the effect, instead of trimming the end of the behavior in the Timeline, has the result of freezing the behavior's effect on the object for its remaining duration.
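The following minimal Python sketch models these offsets, assuming a behavior's effect is a function of its elapsed active time; note how the End Offset freezes the effect at its final value rather than resetting the object, matching the description above. The function names are hypothetical.

```python
def offset_effect(value_fn, t, duration, start_offset=0.0, end_offset=0.0):
    """value_fn maps elapsed active time to an effect value.  The effect
    begins start_offset frames late and freezes at its final value once
    t passes (duration - end_offset), instead of resetting the object."""
    active_span = duration - end_offset - start_offset
    elapsed = max(0.0, min(t - start_offset, active_span))
    return value_fn(elapsed)

ramp = lambda t: 10.0 * t            # e.g., a Ramp parameter behavior
print(offset_effect(ramp, t=5.0,  duration=100.0, start_offset=10.0))  # 0.0 (not started)
print(offset_effect(ramp, t=30.0, duration=100.0, start_offset=10.0))  # 200.0
print(offset_effect(ramp, t=99.0, duration=100.0, end_offset=20.0))    # 800.0 (frozen at t=80)
```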
  • iv. Combining Behaviors with Keyframes
  • In one embodiment, any object can have both behaviors and keyframes applied to it simultaneously. In one embodiment, when this happens, the values generated by the behavior and the keyframed values that are applied to the parameter itself are added together to yield the final value for that parameter. In another embodiment, this lets a user combine the automatic convenience of behaviors with the direct control of keyframing to achieve his final result.
  • In one embodiment, for example, if the user applies the Random Motion behavior to an object 12, that object 12 might weave around onscreen with a completely random motion path 260 similar to the following. FIG. 26 illustrates a behavior-driven motion path in the Canvas, according to one embodiment of the invention. In one embodiment, if the user turns off the Random Motion behavior temporarily and adds keyframes to the Motion parameter of the same object 12, he can create a completely predictable and smooth motion path 270. FIG. 27 illustrates a keyframed motion path in the Canvas, according to one embodiment of the invention. In another embodiment, a user can combine the two by turning the Random Motion behavior back on, with the end result being a motion path 280 that follows the general direction he wants, but that has enough random variation in it to make it interesting. FIG. 28 illustrates a behavior-driven and keyframed motion path in the Canvas, according to one embodiment of the invention.
  • In one embodiment, while this example shows how a user can combine behaviors and keyframes to create motion paths, a user can combine behaviors and keyframes for any parameter.
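A minimal Python sketch of this combination (hypothetical functions, a single x coordinate only): at each frame, the keyframed value and the behavior-generated value are summed to produce the final parameter value. A fixed sine wiggle stands in for the Random Motion behavior's variation, since true random motion would use a seeded noise function.

```python
import math

def keyframed_x(t):
    """Predictable, keyframed motion: a straight move across the screen."""
    return 100.0 * t

def random_motion_x(t):
    """Stand-in for behavior-generated variation (a fixed wiggle here)."""
    return 25.0 * math.sin(7.0 * t)

def final_x(t):
    # Keyframed value + behavior value = final parameter value.
    return keyframed_x(t) + random_motion_x(t)

print(final_x(1.0))   # 100.0 plus the behavior's variation at t = 1.0
```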
  • a. Combining Behaviors and Keyframes in the Keyframe Editor
  • In one embodiment, when a user displays a parameter 290 that is affected by a behavior 10 in the Keyframe Editor 292, two curves appear for that parameter 290. In one embodiment, an uneditable curve 294 in the background displays the parameter 290 as it is affected by the behavior 10. In another embodiment, there are no keyframes over this first curve. In yet another embodiment, superimposed over the first curve 294 is the parameter's editable curve 296. FIG. 29 illustrates a parameter with an oscillate behavior applied to it in the Keyframe Editor, according to one embodiment of the invention.
  • In one embodiment, a user can keyframe 300 a parameter 290 either before or after applying a behavior 10 to the object 12 that affects that parameter 290. In one embodiment, when a user keyframes 300 a parameter that's affected by a behavior, the value of the keyframed curve 296 is added to the value generated by the behavior at each frame. In another embodiment, this has the result of either raising or lowering the resulting value displayed by the background curve 294. In yet another embodiment, the background curve 294 doesn't just display the behavior's animated values, it displays the sum of all values affecting that parameter 290. FIG. 30 illustrates a parameter with an oscillate behavior and keyframes applied to it in the Keyframe Editor, according to one embodiment of the invention.
  • In one embodiment, raising or lowering a keyframe 300 in the Keyframe Editor 292 also raises or lowers the background curve 294, since it's adding to or subtracting from the values generated by the behavior 10. FIG. 31 illustrates the parameter of FIG. 30 but with one keyframe lowered, according to one embodiment of the invention.
  • Note: In one embodiment, when a user combines keyframes 300 with multiple behaviors 10, the results can appear to be unpredictable, depending on the combination of behaviors that are applied. In one embodiment, the user has the option of converting the behaviors 10 that are applied to any parameter 290 into keyframes 300. In another embodiment, converting behaviors 10 that have already been combined with keyframes 300 turns the sum of all behaviors 10 and keyframes 300 affecting that parameter 290 into a thinned series of keyframes 300. In yet another embodiment, this results in a final animation curve 330 that closely replicates the shape of the background curve 294 that appeared in the Keyframe Editor 292. In one embodiment, these keyframes 300 can then be edited directly in the Keyframe Editor 292. FIG. 32 illustrates a parameter with a behavior curve and a keyframed curve in the Keyframe Editor, according to one embodiment of the invention. FIG. 33 illustrates a parameter with a “final animation curve” in the Keyframe Editor, according to one embodiment of the invention.
  • F. Animating Behavior Parameters
  • In one embodiment, a user can animate any behavior's parameters in order to change the parameter's effect over time. In one embodiment, a user can animate behavior parameters using parameter behaviors, or by keyframing the parameters in the Keyframe Editor.
  • i. Applying Parameter Behaviors to a Behavior
  • In one embodiment, a user can animate a behavior's 10 parameter by applying a parameter behavior 10. In one embodiment, for example, a user could apply the Ramp behavior 10 to an Orbit Around behavior's 10 Drag parameter and adjust the Start and End values to increase from 0 to 8 over time. In another embodiment, this results in the orbit of the object 12 slowly decaying, causing the object 12 to fall towards the center of the orbit. FIG. 34 illustrates an object with an Orbit Around behavior applied, creating a regular orbit (a circular motion path 340), according to one embodiment of the invention. FIG. 35 illustrates the same object as in FIG. 34, but with a Ramp behavior applied to the Orbit Around behavior's Drag parameter as described above, creating a spiral motion path 340, according to one embodiment of the invention.
  • ii. Keyframing Behaviors
  • In one embodiment, if a user needs more detailed control when animating a behavior's 10 parameters, he can use keyframes 300. In one embodiment, for example, by keyframing the Drag parameter of the Orbit Around behavior 10, a user can grow and shrink the object's orbit many times, creating a much more complex motion path 340. In another embodiment, keyframing this motion path 340 manually would be incredibly difficult, but by keyframing a single parameter within a single behavior 10, a user can create this effect with ease. FIG. 36 illustrates an object with an Orbit Around behavior applied, creating a regular orbit (a circular motion path), according to one embodiment of the invention. FIG. 37 illustrates the same object as in FIG. 36, but with keyframes applied to the Orbit Around behavior's Drag parameter as described above, creating a different motion path, according to one embodiment of the invention.
  • iii. Converting Behaviors to Keyframes
  • In one embodiment, if necessary, a user can “bake” all the behaviors that have been applied to an object into keyframes using the Convert to Keyframes command, in the Object menu. In one embodiment, when a user uses the Convert to Keyframes command on an object in a project, all behaviors that are applied to that object are converted to keyframes, which are applied to the individual parameters the behaviors originally affected.
  • In one embodiment, to convert behaviors to keyframes:
      • In one embodiment, select an object that has behaviors to convert.
      • In one embodiment, choose Object>Convert to Keyframes. In one embodiment, all behaviors are converted into keyframes, which appear in the Keyframe Editor.
  • Note: In one embodiment, a user cannot selectively convert individual behaviors. In one embodiment, the Convert to Keyframes command converts all behaviors that are applied to an object at once.
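One plausible way to implement such a conversion, sketched minimally in Python below, is to sample the summed animation curve once per frame and then thin out keyframes that straight-line interpolation between their neighbors already reproduces. The thinning strategy and tolerance are assumptions; the actual algorithm is not specified in the text above.

```python
import math

def convert_to_keyframes(value_fn, frames, fps=30.0, tolerance=0.5):
    """Bake a behavior-driven curve into a thinned list of keyframes."""
    samples = [(f / fps, value_fn(f / fps)) for f in range(frames + 1)]
    keys = [samples[0]]
    for prev, cur, nxt in zip(samples, samples[1:], samples[2:]):
        # Keep a sample only if interpolating its neighbors misses it.
        predicted = (prev[1] + nxt[1]) / 2.0
        if abs(cur[1] - predicted) > tolerance:
            keys.append(cur)
    keys.append(samples[-1])
    return keys

spin = lambda t: 90.0 * t                                # linear: thins to 2 keys
wobble = lambda t: 90.0 * t + 40.0 * math.sin(6.0 * t)   # curved: keeps more
print(len(convert_to_keyframes(spin, 60)), len(convert_to_keyframes(wobble, 60)))
```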
  • G. Saving and Sharing Custom Behaviors
  • In one embodiment, if a user customizes a behavior and would like to save it for future use, he can drag it to the Favorites folder of the Library. In one embodiment, once a Behavior has been placed into the Library, it can be applied to objects like any other behavior in the Library.
  • In one embodiment, to save a behavior to the library:
      • In one embodiment, open the Library and select either the Favorites or Favorites Menu categories.
      • In one embodiment, drag the behavior to save into the stack at the bottom of the Library.
        In one embodiment, for organizational purposes, a user may find it useful to create a new folder of his own in the Favorites or Favorites Menu categories in which to put his customized behaviors. In one embodiment, when a user saves a customized behavior, the behavior is saved in the User/Library/Application Support folder.
  • i. Importing and Exporting Behaviors
  • In one embodiment, each customized behavior a user drags into the Library is saved as a separate file. In one embodiment, if a user has created one or more custom behaviors that he relies upon, he may want to move them to other computers he uses.
  • In one embodiment, to copy a custom behavior to another computer, copy the custom behavior files to that computer's User/Library/Application Support folder.
  • H. Examples of Behaviors
  • In one embodiment, this section explains the options that are available for each behavior, presented by category. In one embodiment, in each description, the Target Object is the object to which the behavior is applied.
  • i. Basic Motion Behaviors
  • In one embodiment, Basic Motion behaviors animate specific parameters of the object to which they are applied. In one embodiment, some basic motion behaviors affect position, while others affect scale or rotation.
  • a. Fade In/Fade Out
  • In one embodiment, the Fade In/Fade Out behavior affects an object's Opacity parameter. In one embodiment, the Fade In/Fade Out behavior lets a user dissolve into and out of any object. In one embodiment, the Fade In/Fade Out behavior affects the opacity of the object to which it is applied, fading from 0 percent opacity to 100 percent opacity at the beginning of the clip, and then back to 0 percent opacity at the end. In one embodiment, a user can eliminate the fade in or out by setting the duration of either the fade in or fade out to 0 frames.
  • In one embodiment, this behavior is useful for introducing and removing images being animated in the middle of a project. In one embodiment, for example, a user could apply the Fade In/Fade Out behavior to text objects moving slowly across the screen to make them fade into existence, and then fade away at the end of their duration.
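The opacity ramp this behavior applies can be sketched minimally in Python as follows; the function is hypothetical and frame-based, matching the Fade In Time and Fade Out Time parameters described below.

```python
def fade_opacity(frame, total_frames, fade_in=20, fade_out=20):
    """Opacity (percent) for a given frame of the object's duration:
    0 -> 100 over the first fade_in frames, then 100 -> 0 over the last
    fade_out frames.  A duration of 0 frames gives a straight cut."""
    if fade_in > 0 and frame < fade_in:
        return 100.0 * frame / fade_in
    if fade_out > 0 and frame > total_frames - fade_out:
        return 100.0 * (total_frames - frame) / fade_out
    return 100.0

print(fade_opacity(10, total_frames=120))    # 50.0  (halfway through fade in)
print(fade_opacity(60, total_frames=120))    # 100.0 (fully visible)
print(fade_opacity(110, total_frames=120))   # 50.0  (halfway through fade out)
```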
  • Dashboard Control—In one embodiment, the Dashboard 110 lets a user control the Fade In and Fade Out durations, equivalent to the Fade In Time and Fade Out Time parameters. In one embodiment, drag anywhere within the shaded area of the Fade In or the Fade Out ramps 380 to adjust their durations. In another embodiment, the user can extend the durations of the Fade In or Fade Out past the limits of the graphical dashboard control. FIG. 38 illustrates a Dashboard for a Fade In/Fade Out behavior, according to one embodiment of the invention.
  • Parameters in the Inspector—In one embodiment, the following parameters for the Fade In/Fade Out behavior are available in the Inspector:
      • Fade In Time—In one embodiment, the Fade In Time parameter is set by a slider defining the duration, in frames, that the object will fade in, from 0 to 100 percent opacity, from the first frame of the object. In one embodiment, a duration of 0 frames results in a straight cut into the object, making it appear instantly.
      • Fade Out Time—In one embodiment, the Fade Out Time parameter is set by a slider defining the duration, in frames, that the object will fade out, from 100 to 0 percent opacity, from the last frame of the object. In one embodiment, a duration of 0 frames results in a straight cut away from the object, making it disappear instantly.
      • Start Offset—In one embodiment, the Start Offset parameter is set by a slider that lets a user delay the beginning of the behavior's effect relative to the first frame of its position in the Timeline. In one embodiment, adjust this parameter to make the behavior start later. In another embodiment, this parameter is in frames.
      • End Offset—In one embodiment, the End Offset parameter is set by a slider that lets a user offset the end of the behavior's effect relative to the last frame of its position in the Timeline, in frames. In one embodiment, adjust this parameter to make the behavior stop before the actual end of the behavior in the Timeline. In another embodiment, using this slider to stop the effect, instead of trimming the end of the behavior in the Timeline, freezes the end of the fade out for the remaining duration of the object. In yet another embodiment, trimming the end of the behavior resets the object to its original opacity.
  • b. Grow/Shrink
  • In one embodiment, the Grow/Shrink behavior affects an object's Scale parameter. In one embodiment, use the Grow/Shrink behavior to animate the scale of an object, enlarging or reducing the object's size over time at a speed defined by the Scale Rate. In another embodiment, the Grow/Shrink effect always begins at the object's original size at the first frame of the behavior.
  • In one embodiment, the Grow/Shrink behavior is a good behavior to use with high-resolution graphics to zoom into an image, such as a map or photograph. In one embodiment, a user can also combine this behavior with the Throw or Wind behavior to pan across the image while zooming into it. In another embodiment, the Grow/Shrink behavior can also be used to emphasize or de-emphasize images in a project. In yet another embodiment, a user can enlarge objects to make them the center of attention, or shrink an object while introducing another object to move the viewer's eye to the new element.
  • Dashboard Control—In one embodiment, the Grow/Shrink Dashboard 110 consists of two rectangular regions. In one embodiment, the first 390 is a rectangle with a dotted line that represents the original size of the object. In another embodiment, the second 392 is a solid rectangle that represents the target size, and can be resized by dragging any of the borders. In yet another embodiment, enlarge the box to grow the target object, or reduce the box to shrink it. In one embodiment, a slider 394 to the right lets a user adjust the scale of the Dashboard controls, increasing or decreasing the effect the controls have over the object. FIG. 39 illustrates a Dashboard for a Grow/Shrink behavior, according to one embodiment of the invention.
  • Parameters in the Inspector—In one embodiment, the following parameters for the Grow/Shrink behavior are available in the Inspector:
      • Increment—In one embodiment, the Increment parameter is set by a pop-up menu that lets the user choose how the behavior's effect progresses over its duration in the Timeline. In one embodiment, there are two options:
        • Continuous Rate—In one embodiment, this option uses the Scale Rate parameter to grow or shrink the object by a steady number of pixels per second.
        • Ramp to Final Value—In one embodiment, this option grows or shrinks the object from its original size to the specified percentage in the Scale To parameter.
      • Scale Rate/Scale To—In one embodiment, depending on the option selected in the Increment pop-up menu, the Scale Rate or Scale To parameter defines the speed and magnitude of the effect. In one embodiment, this parameter can be opened into X and Y sub-parameters by clicking the disclosure triangle to the left. In another embodiment, this lets the user adjust the horizontal or vertical scale independently.
      • Curvature—In one embodiment, the Curvature parameter lets a user adjust the acceleration with which this behavior transitions from the original to the final size. In one embodiment, higher Curvature values result in an easing into the effect, where the object slowly starts to change size, and this change gradually speeds up as the behavior continues. In another embodiment, curvature does not affect the overall duration of the effect, since the duration is defined by the length of the behavior in the Timeline, minus the End Offset.
      • End Offset—In one embodiment, the End Offset parameter is set by a slider that lets a user offset the end of the behavior's effect relative to the last frame of its position in the Timeline, in frames. In one embodiment, adjust this parameter to make the behavior stop before the actual end of the behavior in the Timeline. In another embodiment, using this slider to stop the effect, instead of trimming the end of the behavior in the Timeline, freezes the end of the Grow/Shrink effect for the remaining duration of the object. In yet another embodiment, trimming the end of the behavior resets the object to its original scale.
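The two Increment modes above can be sketched minimally in Python as follows. The power-curve easing for Curvature and the percent-based units are assumptions: the text above describes the Scale Rate in pixels per second, which would change only the units, and the exact easing curve is not specified.

```python
def grow_shrink(t, duration, increment, scale_rate=25.0, scale_to=200.0,
                curvature=0.0):
    """Scale (percent of original size) at time t for the two modes."""
    if increment == "continuous_rate":
        return 100.0 + scale_rate * t         # steady change per second
    if increment == "ramp_to_final":
        progress = min(t / duration, 1.0)
        progress **= 1.0 + curvature          # higher curvature eases into the change
        return 100.0 + (scale_to - 100.0) * progress

print(grow_shrink(2.0, 4.0, "continuous_rate"))               # 150.0
print(grow_shrink(2.0, 4.0, "ramp_to_final"))                 # 150.0
print(grow_shrink(2.0, 4.0, "ramp_to_final", curvature=1.0))  # 125.0 (still speeding up)
```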
  • c. Motion Path
  • In one embodiment, the Motion Path behavior 10 affects an object's 12 position parameter. In one embodiment, the Motion Path behavior lets a user create a motion path 400 for an object 12 to follow. In another embodiment, when a user first applies the Motion Path behavior to an object 12, it defaults to a straight path 400 defined by two points at the beginning 410A and end 410B of the motion path 400. In yet another embodiment, the first point 410A on the path is the position of the object 12 in the Canvas at the first frame of the behavior. In one embodiment, a user can double-click or Option-click anywhere on the path to add bezier points 410C to the path, which allow the user to reshape the motion path by creating curves. FIG. 40 illustrates a Motion Path behavior, including curves, applied to an object, according to one embodiment of the invention.
  • In one embodiment, upon playback, the object moves along the assigned path. In one embodiment, the speed at which the target object travels is defined by the duration of the behavior, minus the End Offset parameter. In another embodiment, the Speed parameter lets a user create acceleration and deceleration at the beginning and end of the behavior. In yet another embodiment, the Motion Path behavior is an easy way to create predictable motion without having to make keyframes for it in the Keyframe Editor.
  • Dashboard Control—In one embodiment, the Motion Path Dashboard lets a user set the Speed parameter using a pop-up menu, with options for Linear, Ease In, Ease Out, or Both.
  • Additional Canvas Controls—In one embodiment, the motion path a user creates in the Canvas can be adjusted by adding points to the default motion path, and using the tangent controls attached to each point to adjust each curve.
  • Parameters in the Inspector—In one embodiment, the following parameters for the Motion Path behavior are available in the Inspector:
      • Speed—In one embodiment, the Speed parameter lets a user set how the object will accelerate from the first to the last point in the motion path. In one embodiment, there are four options:
        • Linear—In one embodiment, the object moves at a steady speed from the first to the last point on the motion path.
        • Ease In—In one embodiment, the object starts at a steady speed, and then slows down as it gradually decelerates to a stop at the last point of the motion path.
        • Ease Out—In one embodiment, the object slowly accelerates from the first point on the motion path, reaching and maintaining a steady speed through the last point on the motion path.
        • Ease Both—In one embodiment, the object slowly accelerates from the first point on the motion path, and then slows down as it gradually decelerates to a stop at the last point of the motion path.
      • End Offset—In one embodiment, the End Offset parameter is set by a slider that lets a user offset the end of the behavior's effect relative to the last frame of its position in the Timeline, in frames. In one embodiment, adjust this parameter to make the object reach the end of the motion curve before the actual end of the behavior in the Timeline. In another embodiment, using this slider to stop the effect, instead of trimming the end of the behavior in the Timeline, freezes the object at the end of the motion path for the remaining duration of the object. In yet another embodiment, trimming the end of the behavior resets the object to its original position.
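The four Speed options above can be sketched minimally in Python as easing functions that remap normalized progress along the path; the quadratic and smoothstep curves are assumptions, since the exact shapes are not specified, and a straight two-point path stands in for a full bezier path.

```python
def ease(u, speed):
    """Remap normalized progress u in [0, 1] for the four Speed options."""
    if speed == "linear":
        return u
    if speed == "ease_in":        # steady start, decelerate to a stop
        return 1.0 - (1.0 - u) ** 2
    if speed == "ease_out":       # accelerate from rest to steady speed
        return u * u
    if speed == "ease_both":      # smoothstep: slow at both ends
        return u * u * (3.0 - 2.0 * u)

def point_on_path(u, start, end, speed="linear"):
    f = ease(u, speed)
    return (start[0] + f * (end[0] - start[0]),
            start[1] + f * (end[1] - start[1]))

print(point_on_path(0.5, (0.0, 0.0), (400.0, 100.0), "ease_both"))
# (200.0, 50.0): smoothstep is symmetric, so halfway lands at the middle
```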
  • Related Behaviors—In one embodiment, behaviors related to Motion Path include Gravity, Random Motion, Throw, and Wind.
  • d. Snap Alignment to Motion
  • In one embodiment, the Snap Alignment to Motion behavior affects an object's 12 Rotation parameter. In one embodiment, the Snap Alignment to Motion behavior aligns the rotation of an object 12 to match all changes made to its position along a motion path 410. In one embodiment, this behavior is meant to be combined with behaviors that animate the position of an object 12, or with a keyframed motion path 410 a user creates himself. FIG. 41 illustrates an object moving along a motion path, according to one embodiment of the invention. FIG. 42 illustrates the same object as in FIG. 41, but also with a Snap Alignment to Motion behavior applied to the object, according to one embodiment of the invention.
  • Dashboard Control—In one embodiment, the Snap Alignment to Motion Dashboard has a pop-up menu to control the Axis used to adjust the object's alignment and a checkbox to let the user invert the Axis.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Snap Alignment to Motion behavior in the Inspector:
  • Axis—In one embodiment, the Axis parameter is set by a pop-up menu that lets a user specify whether the object aligns itself on its Horizontal or Vertical axis.
  • Invert Axis—In one embodiment, if the object is aligning on the correct axis, but appears backwards, the Invert Axis checkbox flips the object so that it is facing the proper direction.
  • Related Behaviors—In one embodiment, behaviors related to Snap Alignment to Motion include Align to Motion.
  • e. Spin
  • In one embodiment, the Spin behavior affects an object's Rotation parameter. In one embodiment, apply the Spin behavior to animate the rotation of an object, spinning it either clockwise or counter-clockwise. In another embodiment, if a user trims the end of the Spin behavior to be shorter than the duration of the object to which it is applied, the object remains at the angle of the last frame of the behavior.
  • In one embodiment, uses for spin are fairly obvious, but another way to use the Spin behavior is with objects that have an off-center anchor point. In one embodiment, since objects rotate about the anchor point, if a user changes an object's anchor point before he applies a spin behavior to it, he can quickly change the look of the motion he creates.
  • Dashboard Control—In one embodiment, the Spin behavior's Dashboard 110 control is a ring 430. In one embodiment, drag anywhere within the ring to manipulate an arrow 432 that indicates the direction the object spins. In another embodiment, adjust the length of the arrow 432 to change the speed at which the spinning will occur. FIG. 43 illustrates a Dashboard for a Spin behavior, according to one embodiment of the invention.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Spin behavior in the Inspector:
      • Increment—In one embodiment, the Increment parameter is set by a pop-up menu that lets a user choose how the behavior's effect progresses over its duration in the Timeline. In one embodiment, there are two options:
        • Continuous Rate—In one embodiment, this option uses the Spin Rate parameter to spin the object by a steady number of degrees per second.
        • Ramp to Final Value—In one embodiment, this option spins the object for the number of degrees specified in the Spin To parameter over the duration of the behavior in the Timeline.
      • Spin Rate/Spin To—In one embodiment, the Spin Rate/Spin To parameter is set by a dial that controls the speed at which the object spins. In one embodiment, the Spin Rate defines a continuous rate of spin in degrees per second. In another embodiment, Spin To defines a number of degrees to spin over that object's duration. In yet another embodiment, negative values result in clockwise motion, while positive values result in counter-clockwise motion.
  • f. Throw
  • In one embodiment, the Throw behavior affects an object's position parameter and is the simplest way of setting an object in motion. In one embodiment, the Throw behavior controls let a user adjust the speed and direction of a single force that is exerted on the object at the first frame of the behavior. In another embodiment, after this initial force is applied, the object continues drifting in a straight line, and at the same speed, for the duration of the Throw behavior.
  • In one embodiment, a simple example of the Throw behavior in use is to send a series of offscreen text objects moving across the screen. In one embodiment, when used in conjunction with other behaviors such as Grow/Shrink and Fade In/Fade Out, a user can create sophisticated moving titles without keyframing a single parameter. In another embodiment, the Throw behavior does not apply a continuous force, nor can a user create gradual changes in direction or speed using this behavior alone. In yet another embodiment, keyframed changes to the Throw behavior are instantly applied at the frame they appear, resulting in abrupt motion.
  • In one embodiment, the Throw behavior is useful when the user is moving an object through a simulation, for example, a project in which he has arranged a number of other objects with attract or repel behaviors applied to them. In one embodiment, since the Throw behavior only applies a single force to move the target object at the initial frame of the behavior, any other behaviors that interact with the target object will have greater influence over its motion. In another embodiment, if a user wants to apply a continuous force to an object, use the Wind behavior. In yet another embodiment, if a user needs a more complex motion path, use the Motion Path behavior.
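  • As a minimal sketch only (the constant-velocity drift model is inferred from the description above rather than given by it), a Throw behavior could be evaluated in Python as follows:

        def throw_position(start, velocity, t):
            # A single impulse at the first frame, then straight-line
            # drift at constant speed for the behavior's duration.
            return (start[0] + velocity[0] * t,
                    start[1] + velocity[1] * t)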
  • Dashboard Control—In one embodiment, the Throw behavior's Dashboard 110 lets a user specify the direction and speed of the throw behavior by dragging an arrow 440 within a circular region 442. In one embodiment, the direction of the arrow 440 defines the direction of movement, and the length of the arrow 440 defines speed. In another embodiment, a slider 444 to the right lets the user adjust the scale of the Dashboard control, increasing or decreasing the effect the control has over the object 12. In yet another embodiment, the maximum speed a user can define with the Dashboard is not the maximum possible speed. In one embodiment, higher values can be entered into the Rate or Final Value parameter in the Behaviors tab of the Inspector. FIG. 44 illustrates a Dashboard for a Throw behavior, according to one embodiment of the invention.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Throw behavior in the Inspector:
      • Increment—In one embodiment, the Increment parameter is set by a pop-up menu that lets the user choose how the behavior's effect progresses over its duration in the Timeline. In one embodiment, there are two options:
        • Rate—In one embodiment, this option sets the speed of the object at a steady number of pixels per second, specified in the Throw Velocity parameter.
        • Final Value—In one embodiment, this option moves the object from its original position to the specified distance (in pixels) in the Throw distance parameter.
      • Throw Velocity/Throw Distance—In one embodiment, when the Increment pop-up menu is set to Rate, the Throw Velocity parameter appears which lets a user set a continuous speed for the object to move. In one embodiment, when the Increment pop-up menu is set to Final Value, the Throw Distance parameter appears, which sets a total distance (in pixels) for the object to travel over its duration.
  • Related Behaviors—In one embodiment, behaviors related to Throw include Motion Path, Gravity, Random Motion, and Wind.
  • ii. Parameter Behaviors
  • In one embodiment, parameter behaviors can be applied to any object parameter, and their effects are limited to just that parameter. In one embodiment, the same parameter behavior can be added to different parameters, resulting in completely different effects. In another embodiment, for example, a user can apply the Oscillate behavior to the opacity of an object to make the object fade in and out, or he can apply it to the rotation of an object to make the object rock back and forth. In yet another embodiment, a user can also apply parameter behaviors to filter parameters, Generator parameters, the parameters of particle systems, or even the parameters of other behaviors. In one embodiment, examples of parameter behaviors include Oscillate, Randomize, and Reverse.
  • a. Average
  • In one embodiment, the Average behavior smooths the transition from one value to another caused by keyframes and behaviors that are applied to a parameter. In one embodiment, use the Average behavior to smooth out animated effects. In another embodiment, averaged motion moves more fluidly, while averaged changes to parameters such as Opacity and to filter parameters appear to happen more gradually. In yet another embodiment, use the Window Size parameter to adjust the amount by which to smooth the affected parameter. In one embodiment, the Average behavior can be used to smooth out the sequence of values generated by a Randomize behavior.
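  • A minimal Python sketch of this kind of windowed smoothing follows; the centered moving average is an assumption, since the description above states only that adjacent values are averaged together:

        def average_values(values, window_size):
            # Average each per-frame value with its neighbors; a wider
            # window yields smoother, more fluid animation.
            half = max(window_size // 2, 0)
            out = []
            for i in range(len(values)):
                lo, hi = max(0, i - half), min(len(values), i + half + 1)
                out.append(sum(values[lo:hi]) / (hi - lo))
            return out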
  • Dashboard Control—In one embodiment, the Average behavior's Dashboard lets the user adjust the Window Size parameter.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Average behavior in the Inspector:
      • Window Size—In one embodiment, the Window Size parameter is set by a slider that lets a user adjust the amount of smoothing to apply to the affected parameter by specifying the number of adjacent keyframes to average together. In one embodiment, higher values apply more smoothing by averaging a wider range of values, resulting in more fluid animation. In another embodiment, lower values average a narrower range of values, and apply less smoothing with values that are closer to the original.
      • Apply To—In one embodiment, the Apply To pop-up menu shows the parameter being affected, and can be used to reassign the behavior to another parameter.
  • Related Behaviors—In one embodiment, behaviors related to Average include Negate and Reverse.
  • b. Custom
  • In one embodiment, the Custom behavior allows a user to create his own custom behaviors.
  • c. Negate
  • In one embodiment, the Negate behavior 10 inverts the value of each keyframe 300 and behavior effect in the parameter to which it is applied. In one embodiment, the Negate behavior basically flips each parameter value to its opposite. In another embodiment, motion paths 450 are flipped, rotation is reversed, and any effect's parameter will be changed to its opposite. In yet another embodiment, for example, applying the Negate behavior to the Position parameter of an object 12 with a Motion Path behavior applied results in the motion path 450 being flipped. FIG. 45 illustrates a motion path behavior applied to an object, according to one embodiment of the invention. FIG. 46 illustrates a motion path behavior applied to an object, and a Negate behavior applied to the object's Position parameter, according to one embodiment of the invention.
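  • By way of illustration, the flip described above can be sketched in a few lines of Python; the tuple handling for multi-dimensional parameters such as Position is an assumption:

        def negate(value):
            # Flip a parameter value to its opposite, frame by frame.
            if isinstance(value, tuple):
                return tuple(-v for v in value)  # e.g., Position (x, y)
            return -value                        # e.g., Rotation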
  • Dashboard Control—In one embodiment, there are no Dashboard controls for the Negate behavior.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Negate behavior in the Inspector:
      • Apply To—In one embodiment, the Apply To pop-up menu shows the parameter being affected, and can be used to reassign the behavior to another parameter.
  • Related Behaviors—In one embodiment, behaviors related to Negate include Average and Reverse.
  • d. Oscillate
  • In one embodiment, the Oscillate behavior animates a parameter by cycling the parameter between two different values. In one embodiment, a user can customize how widely apart the high and low values are, as well as the number of oscillations per second.
  • In one embodiment, the Oscillate behavior can create all kinds of cyclical effects. In one embodiment, for example, if a user applies the Oscillate behavior to the rotation property of an object, the object will begin to rock back and forth. In another embodiment, this happens because the rotation property cycles back and forth between the initial rotation value plus and minus the Amplitude value that is set in the Oscillate behavior. In yet another embodiment, applying the Oscillate behavior to the X value of the scale parameter instead causes the width of the object to cycle, and it repeatedly stretches and compresses for the duration of the behavior.
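  • No formula is given here; purely as an illustration, a sinusoidal cycle is one plausible model, sketched below in Python (the sine shape and the treatment of Phase as a fraction of a cycle are assumptions):

        import math

        def oscillate(base, t, amplitude, speed, phase=0.0):
            # Cycle between base - amplitude and base + amplitude,
            # `speed` times per second, starting `phase` into the cycle.
            return base + amplitude * math.sin(
                2.0 * math.pi * (speed * t + phase))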
  • Dashboard Control—In one embodiment, the Oscillate Dashboard 10 lets a user adjust the Phase 470, Amplitude 472, and Speed 474 of the Oscillate behavior. FIG. 47 illustrates a Dashboard for an Oscillate behavior, according to one embodiment of the invention.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Oscillate behavior in the Inspector:
      • Phase—In one embodiment, the Phase parameter can be set by a slider that lets a user adjust the point in the oscillation at which the behavior starts. In one embodiment, this parameter allows the user to put multiple objects with identical Oscillate behaviors out of phase with one another so that they don't all look the same.
      • Amplitude—In one embodiment, the Amplitude parameter can be set by a slider that lets a user adjust the maximum value, which defines the beginning and end of each oscillation. In one embodiment, higher values result in more extreme swings from the beginning to the ending of each oscillation.
      • Speed—In one embodiment, the Speed parameter can be set by a slider that lets a user adjust the speed at which the oscillation occurs, in oscillations per second. In one embodiment, higher values result in faster oscillations.
      • Start Offset—In one embodiment, the Start Offset parameter can be set by a slider that lets a user delay the beginning of the behavior's effect relative to the first frame of its position in the Timeline. In one embodiment, adjust this parameter to make the behavior start later. In another embodiment, this parameter is in frames.
      • End Offset—In one embodiment, the End Offset parameter can be set by a slider that lets a user offset the end of the behavior's effect relative to the last frame of its position in the Timeline, in frames. In one embodiment, adjust this parameter to make the behavior stop before the actual end of the behavior in the Timeline. In another embodiment, using this slider to stop the effect, instead of trimming the end of the behavior in the Timeline, freezes the end of the effect for the remaining duration of the object. In yet another embodiment, trimming the end of the behavior resets the object to its original parameters.
      • Apply To—In one embodiment, the Apply To pop-up menu shows the parameter being affected, and can be used to reassign the behavior to another parameter.
  • Related Behaviors—In one embodiment, behaviors related to Oscillate include Ramp and Rate.
  • e. Ramp
  • In one embodiment, the Ramp behavior lets a user create a gradual transition, in any parameter, from the Start Value to the End Value. In one embodiment, the speed of the transition is defined by the length of the Ramp behavior in the Timeline. In another embodiment, additional parameters allow a user to define how the transition occurs, whether it is at a single continuous speed, or whether it accelerates over time. In yet another embodiment, Ramp is a versatile behavior. In one embodiment, if a user applies the Ramp behavior to the Scale property, it works like the Grow/Shrink behavior. In another embodiment, if a user applies it to the Opacity property, he can fade an object in or out in different ways. In yet another embodiment, although a user can use the Ramp behavior to mimic other behaviors, it can be applied to any parameter he wants.
  • In one embodiment, for example, suppose a user is animating different segments of a bar graph, and each segment needs to grow until it reaches a specific height. Once the user has arranged the different bars in the graph in the Canvas, he can apply Ramp behaviors to the Y values of the Four Corner Top Right and Top Left parameters of each bar, and set the End Value parameters of each object's pair of Ramp behaviors to the height he wants each bar to reach.
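  • A minimal Python sketch of such a transition follows; the power-curve easing used to model the Curvature parameter is an assumption:

        def ramp(start_value, end_value, t, duration, curvature=0.0):
            u = min(max(t / duration, 0.0), 1.0)  # progress, 0 to 1
            u = u ** (1.0 + curvature)            # curvature > 0 eases in
            return start_value + (end_value - start_value) * u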
  • Dashboard Control—In one embodiment, the Ramp Dashboard lets a user adjust the Ramp's Start Value, End Value, and Curvature.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Ramp behavior in the Inspector:
      • Start Value—In one embodiment, the Start Value is the value that's applied at the first frame of the Ramp behavior.
      • End Value—In one embodiment, the End Value is the value the Ramp behavior reaches at the last frame of the behavior. In one embodiment, over the life of the behavior, the parameter the Ramp behavior is applied to makes a transition from the Start Value to the End Value.
      • Curvature—In one embodiment, the Curvature parameter lets a user ease the acceleration with which the Ramp behavior transitions from the Start Value to the End Value. In one embodiment, higher Curvature values result in an Ease In effect, where the value slowly begins the transition, and gradually speeds up as the behavior continues. In another embodiment, curvature does not affect the overall duration of the effect, since that is defined by the length of the behavior in the Timeline.
      • Start Offset—In one embodiment, the Start Offset parameter is set by a slider that lets a user delay the beginning of the behavior's effect relative to the first frame of its position in the Timeline. In one embodiment, adjust this parameter to make the behavior start later. In another embodiment, this parameter is in frames.
      • End Offset—In one embodiment, the End Offset parameter is set by a slider that lets a user offset the end of the behavior's effect relative to the last frame of its position in the Timeline, in frames. In one embodiment, adjust this parameter to make the behavior stop before the actual end of the behavior in the Timeline. In another embodiment, using this slider to stop the effect, instead of trimming the end of the behavior in the Timeline, freezes the end of the effect for the remaining duration of the object. In yet another embodiment, trimming the end of the behavior resets the object to its original parameter.
      • Apply To—In one embodiment, the Apply To pop-up menu shows the parameter being affected, and can be used to reassign the behavior to another parameter.
  • Related Behaviors—In one embodiment, behaviors related to Ramp include Oscillate and Rate.
  • f. Randomize
  • In one embodiment, the Randomize behavior creates a continuous sequence of randomly increasing and decreasing values, based on the parameters defining the range and type of values that are generated. In one embodiment, although the values created with this behavior are random, they're actually predetermined by the parameter settings chosen by the user. In another embodiment, as long as the user doesn't change the parameters, the frame-by-frame values created by this behavior remain the same. In yet another embodiment, if a user doesn't like the values that were randomly generated, click the Generate button in the Behavior tab in the Inspector to pick a new random seed number. In one embodiment, this number is used to generate a new sequence of values.
  • In one embodiment, the Apply Mode parameter determines how values generated by this behavior are combined with other behaviors and keyframes that affect the same parameter. In one embodiment, this provides a user with different ways of using a Randomize behavior to modify a parameter's preexisting values. In another embodiment, the Randomize behavior is useful for creating jittery effects, such as twitchy rotation, flickering opacity, and other effects requiring rapid and varied changes over time that would be time-consuming to keyframe. In yet another embodiment, the Randomize behavior can be modified with other behaviors, such as Average and Negate, to exercise further control over the values being generated.
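  • The seeded, repeatable quality of the behavior can be sketched in Python as follows; the uniform distribution and per-frame sampling are assumptions, with only the roles of the seed and the Apply Mode combinations taken from the description above:

        import random

        def randomize_values(base, frames, amount, seed, apply_mode="Add"):
            # The same seed and parameters always reproduce the same
            # sequence; clicking Generate corresponds to a new seed.
            rng = random.Random(seed)
            out = []
            for _ in range(frames):
                r = rng.uniform(-amount, amount)
                if apply_mode == "Add":
                    out.append(base + r)
                elif apply_mode == "Subtract":
                    out.append(base - r)
                else:                  # "Multiply"
                    out.append(base * r)
            return out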
  • Dashboard Control—In one embodiment, the Randomize Dashboard has controls for Amount, Frequency, Wriggle Offset, Noisiness, Link, Start Offset, and End Offset.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Randomize behavior in the Inspector:
      • Amount/Multiplier—In one embodiment, the Amount/Multiplier parameter is set to Amount when the Apply Mode is set to Add or Subtract, and Multiplier when the Apply Mode is set to Multiply. In one embodiment, this parameter defines the maximum value the Randomize behavior will generate.
      • Apply Mode—In one embodiment, the Apply Mode pop-up menu determines how values generated by this behavior are combined with other behaviors and keyframes that affect the same parameter. In one embodiment, this provides a user with different ways of using a Randomize behavior to modify a parameter's preexisting values. In another embodiment, the options are Add, Subtract, or Multiply.
      • Frequency—In one embodiment, the Frequency parameter is set by a slider that lets a user adjust the amount of random variation per second. In one embodiment, higher values will generate faster variations, whereas lower values will generate slower variations.
      • Wriggle Offset—In one embodiment, the Wriggle Offset parameter allows a user to offset the sequence of random values when he wants to apply the same randomize behavior to multiple objects. In one embodiment, by offsetting each object's version of the Randomize behavior, a user can prevent them from moving in sync.
      • Noisiness—In one embodiment, the Noisiness parameter adds an additional overlay of random variance to the Frequency the user has set. In one embodiment, higher Noisiness values result in more erratic variations in the affected parameter.
      • Link—In one embodiment, the Link parameter appears when the user applies this behavior to a two-dimensional parameter, such as Position or Scale, that consists of X and Y values. In one embodiment, turn this checkbox on to keep the behavior's effect on each value proportional.
      • Random Seed—In one embodiment, a button lets a user pick a new random seed number. In one embodiment, this number is used to randomly generate new sequences of values, based on the other parameters of this behavior.
      • Start Offset—In one embodiment, a slider lets a user delay the beginning of the behavior's effect relative to the first frame of its position in the Timeline. In one embodiment, adjust this parameter to make the behavior start later. In one embodiment, this parameter is in frames.
      • End Offset—In one embodiment, the End Offset parameter is set by a slider that lets a user offset the end of the behavior's effect relative to the last frame of its position in the Timeline, in frames. In one embodiment, adjust this parameter to make the behavior stop before the actual end of the behavior in the Timeline. In another embodiment, using this slider to stop the effect, instead of trimming the end of the behavior in the Timeline, freezes the last random value generated by this behavior for the remaining duration of the object. In yet another embodiment, trimming the end of the behavior resets the parameter to its original value.
  • Related Behaviors—In one embodiment, behaviors related to Randomize include Random Motion and Wriggle.
  • g. Rate
  • In one embodiment, the Rate behavior increases a parameter's value over time, with the rate of increase determined by the Rate slider. In one embodiment, to use the Rate behavior to decrease a parameter over time, apply the Negate behavior after it.
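  • Read as a percentage increase per second, the Rate behavior can be sketched in Python as follows (the linear accumulation is an assumption, and the Curvature easing described below is omitted):

        def rate_value(base, t_seconds, rate_percent_per_sec):
            # Steady percentage increase per second; follow with a Negate
            # behavior to make the parameter decrease instead.
            return base * (1.0 + rate_percent_per_sec / 100.0 * t_seconds)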
  • Dashboard Control—In one embodiment, the Rate Dashboard has controls for Rate and Curvature.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Rate behavior in the Inspector:
      • Rate—In one embodiment, the Rate parameter is set by a slider that lets a user set a rate of increase over time for the affected parameter. In one embodiment, the Rate parameter is measured in percentage increase per second.
      • Curvature—In one embodiment, the Curvature parameter lets a user adjust the acceleration with which the Rate behavior gets up to speed. In one embodiment, higher Curvature values result in an Ease In effect, where the value begins slowly, gradually reaching the target speed as the behavior continues. In another embodiment, curvature does not affect the overall duration of the effect, since that is defined by the length of the behavior in the Timeline.
      • Apply To—In one embodiment, the Apply To pop-up menu shows the parameter being affected, and can be used to reassign the behavior to another parameter.
  • Related Behaviors—In one embodiment, behaviors related to Rate include Oscillate and Ramp.
  • h. Reverse
  • In one embodiment, the Reverse behavior reverses the direction of any animation affecting a parameter, whether the animation is caused by behaviors or keyframes. In one embodiment, in some instances, the Reverse and Negate behaviors have the same effect. In another embodiment, in other instances, their effects are very different. In yet another embodiment, for example, applying the Negate behavior flips an object's motion path, while applying the Reverse behavior leaves the motion path alone, reversing the object's motion, instead.
  • Dashboard Control—In one embodiment, there are no Dashboard controls for the Reverse behavior.
  • Parameters in the Inspector—In one embodiment, several parameters are available for the Reverse behavior in the Inspector.
  • Related Behaviors—In one embodiment, behaviors related to Reverse include Average and Negate.
  • i. Stop
  • In one embodiment, the Stop behavior suspends the animation of all behaviors that:
      • are below it in the Layers tab
      • also affect the parameter to which Stop is applied
      • begin prior to Stop's in point in the Timeline
        In one embodiment, each behavior's effect on the object is frozen at the parameter values at the first frame of the Stop behavior in the Timeline. In another embodiment, keyframes that are applied to that parameter are disabled for the duration of the Stop behavior in the Timeline. In yet another embodiment, if the Stop behavior is shorter than the object to which it is applied, all keyframes and behaviors affecting that channel immediately take effect after the last frame of the Stop behavior.
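  • As an illustrative sketch only, the freeze described above can be modeled as a filter over a parameter channel's evaluation function; the callable-based formulation is an assumption:

        def stop_filter(evaluate, stop_in, stop_out, t):
            # `evaluate(t)` stands for the value produced by the keyframes
            # and behaviors beneath the Stop behavior in the Layers tab.
            if stop_in <= t <= stop_out:
                return evaluate(stop_in)  # hold the in-point value
            return evaluate(t)            # outside the range, pass through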
  • Dashboard Control—In one embodiment, there is no Dashboard control for the Stop behavior.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Stop behavior in the Inspector:
      • Apply To—In one embodiment, the Apply To pop-up menu shows the parameter being stopped, and can be used to reassign the stop behavior to another parameter.
  • j. Wriggle
  • In one embodiment, this behavior works similarly to the Randomize behavior, but with a slower effect.
  • Dashboard Control—In one embodiment, the Wriggle Dashboard has controls for Amount, Frequency, Wriggle Offset, Noisiness, Link, Start Offset, and End Offset.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Wriggle behavior in the Inspector:
      • Amount/Multiplier—In one embodiment, the Amount/Multiplier parameter is set to Amount when the Apply Mode is set to Add or Subtract, and Multiplier when the Apply Mode is set to Multiply. In one embodiment, this parameter defines the maximum value the Wriggle behavior will generate.
      • Apply Mode—In one embodiment, the Apply Mode parameter is set by a pop-up menu that determines how values generated by this behavior are combined with other behaviors and keyframes that affect the same parameter. In one embodiment, this provides a user with different ways of using a Wriggle behavior to modify a parameter's preexisting values. In one embodiment, the options are Add, Subtract, or Multiply.
      • Frequency—In one embodiment, the Frequency parameter is set by a slider that lets a user adjust the amount of random variation per second. In one embodiment, higher values will generate faster variations, whereas lower values will generate slower variations.
      • Wriggle Offset—In one embodiment, the Wriggle Offset parameter is set by a slider that allows a user to offset the sequence of random values when he wants to apply the same Wriggle behavior to multiple objects. In one embodiment, by offsetting each object's version of the Wriggle behavior, a user can prevent them from moving in sync.
      • Noisiness—In one embodiment, the Noisiness parameter is set by a slider that adds an additional overlay of random variance to the Frequency the user has set. In one embodiment, higher Noisiness values result in more erratic variations in the affected parameter.
      • Link—In one embodiment, the Link parameter appears when a user applies this behavior to a two-dimensional parameter, such as Position or Scale, that consists of X and Y values. In one embodiment, turn this checkbox on to keep the behavior's effect on each value proportional.
      • Random Seed—In one embodiment, the Random Seed parameter is set by a button that lets a user pick a new random seed number. In one embodiment, this number is used to randomly generate new sequences of values, based on the other parameters of this behavior.
      • Start Offset—In one embodiment, the Start Offset parameter is set by a slider that lets a user delay the beginning of the behavior's effect relative to the first frame of its position in the Timeline. In one embodiment, adjust this parameter to make the behavior start later. In another embodiment, this parameter is in frames.
      • End Offset—In one embodiment, the End Offset parameter is set by a slider that lets a user offset the end of the behavior's effect relative to the last frame of its position in the Timeline, in frames. In one embodiment, adjust this parameter to make the behavior stop before the actual end of the behavior in the Timeline. In another embodiment, using this slider to stop the effect, instead of trimming the end of the behavior in the Timeline, freezes the last random value generated by this behavior for the remaining duration of the object. In yet another embodiment, trimming the end of the behavior resets the parameter to its original value.
  • Related Behaviors—In one embodiment, behaviors related to Wriggle include Random Motion and Randomize.
  • iii. Simulation Behaviors
  • In one embodiment, simulation behaviors perform one of two tasks. In one embodiment, some simulation behaviors, such as Gravity, animate the parameters of an object in a way that simulates a real-world phenomenon. In another embodiment, other simulation behaviors, such as Attractor and Repel, affect the parameters of one or more objects surrounding the object to which they're applied. In yet another embodiment, these behaviors allow a user to create some very sophisticated interactions among multiple objects in a project with a minimum of adjustments. In one embodiment, like the basic motion behaviors, simulation behaviors also affect specific object parameters. In another embodiment, examples of simulation behaviors include Attractor, Gravity, and Repel.
  • a. Align To Motion
  • In one embodiment, the Align To Motion behavior affects an object's Rotation parameter. In one embodiment, the Align To Motion behavior changes the rotation of an object to match changes made to the object's direction along a motion path. In one embodiment, this behavior is meant to be combined with behaviors that animate the position of an object, or with a keyframed motion path created by a user.
  • In one embodiment, unlike the Snap Alignment to Motion behavior, which produces absolute changes in rotation that precisely match changes in direction, the Align To Motion behavior has a springy quality that creates a livelier effect. In one embodiment, for example, if a user has a graphic of a rocket to which he has applied a Motion Path behavior, he can add the Align To Motion behavior to make the rocket point in the direction it is moving. In another embodiment, by adjusting the Drag parameter, he can make it careen wildly about its anchor point as it goes around turns in the motion path.
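  • The springy quality described above suggests a spring-damper model; purely as an illustration, one integration step might look like the following Python sketch, where the specific spring and drag formulation is an assumption:

        import math

        def align_to_motion_step(angle, angular_vel, velocity,
                                 spring_tension, drag, dt):
            # Pull rotation toward the direction of travel; drag damps
            # the overshoot so the object settles instead of careening.
            target = math.degrees(math.atan2(velocity[1], velocity[0]))
            error = (target - angle + 180.0) % 360.0 - 180.0  # shortest arc
            angular_vel += spring_tension * error * dt
            angular_vel *= max(1.0 - drag * dt, 0.0)
            return angle + angular_vel * dt, angular_vel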
  • Dashboard Control—In one embodiment, the Align to Motion Dashboard has controls for Axis, Invert Axis, Spring Tension, and Drag.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Align to Motion behavior in the Inspector:
      • Axis—In one embodiment, the Axis parameter is set by a pop-up menu that lets a user align the target object's rotation to the X or Y value of its position.
      • Invert Axis—In one embodiment, the Invert Axis parameter is set by a checkbox that flips the orientation with which the object aligns itself to the motion.
      • Spring Tension—In one embodiment, the Spring Tension parameter is set by a slider that adjusts how quickly the object's rotation changes to match a change in the object's direction. In one embodiment, lower values create a delay between a change to an object's position and its subsequent change in rotation. In another embodiment, higher values create more responsive changes in rotation.
      • Drag—In one embodiment, the Drag parameter is set by a slider that adjusts whether or not the change in rotation made by this behavior overshoots the new direction of the object. In one embodiment, low drag values result in springy changes in rotation, where the object rotates back and forth as it overshoots changes in direction. In another embodiment, high drag values dampen this effect, making the object's rotation stick more closely to the changes made in direction.
  • Related Behaviors—In one embodiment, behaviors related to Align To Motion include Snap Alignment to Motion.
  • b. Attracted To
  • In one embodiment, the Attracted To behavior is part of a group of simulation behaviors that let a user create complex animated relationships between two or more objects. In one embodiment, these behaviors are extremely powerful, and allow complicated effects to be created with a minimum of steps.
  • In one embodiment, the Attracted To behavior affects an object's Position parameter. In one embodiment, an object with the Attracted To behavior (the “attracted object” 12A) moves towards a single specified object, the object of attraction (the “attracting object” 12B). In one embodiment, additional parameters allow a user to adjust the area of influence that defines how close an object 12A needs to be to move towards the object of attraction 12B, and how strongly it is attracted. FIG. 48 illustrates two objects (an attracting object and an attracted object) and a motion path 480 of the latter object, according to one embodiment of the invention.
  • In one embodiment, the Drag parameter lets a user define whether attracted objects overshoot and bounce around the attracting object, or whether they eventually slow down and stop at the position of the attracting object. In one embodiment, a user can apply two or more Attracted To behaviors to a single object, each with a different object of attraction, to create tug-of-war situations where the object bounces among all the objects it is attracted to.
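  • A minimal Python sketch of one simulation step follows; the exact force and falloff formulas are assumptions, preserving only the roles of Strength, Influence, Falloff Type, Falloff Rate, and Drag from the description above:

        import math

        def attracted_to_step(pos, vel, target, strength, influence,
                              falloff_type, falloff_rate, drag, dt):
            dx, dy = target[0] - pos[0], target[1] - pos[1]
            dist = math.hypot(dx, dy)
            if 0.0 < dist <= influence:
                f = strength                 # Linear: equal pull throughout
                if falloff_type == "Exponential":
                    f *= (1.0 - dist / influence) ** falloff_rate
                vel = (vel[0] + f * dx / dist * dt,
                       vel[1] + f * dy / dist * dt)
            damping = max(1.0 - drag * dt, 0.0)  # drag curbs the overshoot
            vel = (vel[0] * damping, vel[1] * damping)
            return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt), vel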
  • Dashboard Control—In one embodiment, the Attracted To Dashboard has an image well that the user can use to assign an object of attraction, as well as controls for Strength, Falloff Type, Falloff Rate, Influence, and Drag.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Attracted To behavior in the Inspector:
      • Object—In one embodiment, the Object parameter is set by an image well that defines the object of attraction.
      • Strength—In one embodiment, the Strength parameter is set by a slider that defines the speed at which the attracted object moves towards the object of attraction. In one embodiment, with a value of 0, the object doesn't move at all. In another embodiment, the higher the value, the faster the object will move.
      • Falloff Type—In one embodiment, the Falloff Type parameter is set by a pop-up menu that determines whether the distance defined by the Influence parameter falls off linearly or exponentially.
        • Linear—In one embodiment, all objects that are within the area of influence are attracted equally.
        • Exponential—In one embodiment, the closer an object is within the area of influence, the stronger it is attracted, and the faster it will move towards the object of attraction.
      • Falloff Rate—In one embodiment, the Falloff Rate parameter is set by a slider that defines the rate of acceleration with which objects move towards the object of attraction. In one embodiment, a low Falloff Rate value results in objects quickly getting up to speed as they move towards the object of attraction. In another embodiment, a high Falloff Rate causes objects to accelerate much more slowly.
      • Influence—In one embodiment, the Influence parameter is set by a slider that defines the area of influence, in pixels. In one embodiment, objects that fall within the area of influence move towards the object of attraction. In another embodiment, objects that are outside of the area of influence remain where they are.
      • Drag—In one embodiment, the Drag parameter is set by a slider that can be used to reduce the distance attracted objects overshoot the object of attraction. In one embodiment, the effect is of the attracted objects skidding to a stop at the position of the target object. In another embodiment, lower Drag values result in the object overshooting the object of attraction, moving past and then careening back around towards the target object again and again. In yet another embodiment, higher Drag values result in the object coming to rest sooner.
  • Related Behaviors—In one embodiment, behaviors related to Attracted To include Attractor, Drift Attracted To, Drift Attractor, Orbit Around, Spring, and Vortex.
  • c. Attractor
  • In one embodiment, the Attractor behavior affects other objects' Position parameters. In one embodiment, the Attractor behavior is the opposite of the Attracted To behavior. In one embodiment, if a user applies an Attractor behavior to an object, other objects that lie within the area of influence move toward it. In another embodiment, a user can manipulate the strength with which other objects are attracted, as well as the distance required for attraction to begin.
  • In one embodiment, by default, objects overshoot the object of attraction and bounce around, never coming to rest. In one embodiment, the Drag parameter lets a user adjust this behavior, changing whether attracted objects overshoot and bounce around, or whether they eventually slow down and stop at the position of the target object. In another embodiment, the Attractor behavior can affect all objects in the Canvas that fall within the area of attraction, or a user can limit the Attractor behavior's effect to a specific group of objects, using the Affect parameter. In yet another embodiment, the Attractor behavior can also be applied to objects in motion. In one embodiment, if a user animates the position of the Target object to which he has applied the Attractor behavior, all other objects in the Canvas continue to be attracted to the Target object's new position.
  • Dashboard Control—In one embodiment, the Attractor Dashboard has controls for Affect, Strength, Falloff Type, Falloff Rate, Influence, and Drag.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Attractor behavior in the Inspector:
      • Affect—In one embodiment, the Affect parameter is set by a pop-up menu that limits which objects in a project are affected by the Attractor behavior. In one embodiment, there are three options:
        • All Objects—In one embodiment, all objects in the Canvas are affected by the Attractor behavior. In one embodiment, this is the default behavior.
        • Related Objects—In one embodiment, only other objects that are within the same layer as the object of attraction are affected.
        • Specific Objects—In one embodiment, only objects appearing in the Affected Objects list are affected by the Attractor behavior. In one embodiment, the Affected Objects list appears when Specific Objects is selected in the Affect pop-up menu. In another embodiment, drag objects from the Layers tab into this list to have them affected by the Attractor behavior when the Specific Objects option is selected in the Affect pop-up menu. In one embodiment, drag the layer icon of objects in a project from the Layers tab to add them to this list.
      • Strength—In one embodiment, the Strength parameter is set by a slider that defines the speed with which attracted objects move towards the target object. In one embodiment, with a value of 0, attracted objects don't move at all. In another embodiment, the higher the value, the faster attracted objects will move.
      • Falloff Type—In one embodiment, the Falloff Type parameter is set by a pop-up menu that determines whether the distance defined by the Influence parameter falls off linearly or exponentially.
        • Linear—In one embodiment, all objects that are within the area of influence are attracted equally.
        • Exponential—In one embodiment, the closer an object is within the area of influence, the stronger it is attracted, and the faster it will move towards the object of attraction.
      • Falloff Rate—In one embodiment, the Falloff Rate parameter is set by a slider that defines the rate of acceleration with which objects move towards the object of attraction. In one embodiment, a low Falloff Rate value results in objects quickly getting up to speed as they move towards the object of attraction. In another embodiment, a high Falloff Rate causes objects to accelerate much more slowly.
      • Influence—In one embodiment, the Influence parameter is set by a slider that defines the area of influence, in pixels. In one embodiment, objects that fall within the area of influence move towards the object of attraction. In another embodiment, objects that are outside of the area of influence remain where they are.
      • Drag—In one embodiment, the Drag parameter is set by a slider that can be used to reduce the distance attracted objects overshoot the object of attraction. In one embodiment, the effect is of the attracted objects skidding to a stop at the position of the target object. In another embodiment, lower Drag values result in the object overshooting the object of attraction, moving past and then careening back around towards the target object again and again. In yet another embodiment, higher Drag values result in the object coming to rest sooner.
  • Related Behaviors—In one embodiment, behaviors related to Attractor include Attracted To, Drift Attracted To, Drift Attractor, Orbit Around, Spring, and Vortex.
  • d. Drag
  • In one embodiment, the Drag behavior affects an object's Position parameter. In one embodiment, the Drag behavior lets a user simulate the force of friction on a moving object, slowing it down over time until it eventually comes to a stop. In another embodiment, applying the Drag behavior is an easy way to decelerate objects with multiple behaviors that create complex motion.
  • Dashboard Control—In one embodiment, the Drag Dashboard lets a user adjust the Amount of drag.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Drag behavior in the Inspector:
      • Amount—In one embodiment, the Amount parameter is set by a slider that can be used to slow down an object over time, causing it to eventually come to a stop. In one embodiment, higher Drag values result in the object coming to rest sooner. In another embodiment, a user can adjust the drag applied to the X and Y values separately. In yet another embodiment, one example of this would be to create a situation where an object's vertical speed slows down faster than its horizontal speed.
  • Related Behaviors—In one embodiment, behaviors related to Drag include Rotational Drag.
  • e. Drift Attracted To
  • In one embodiment, the Drift Attracted To behavior affects an object's Position parameter. In one embodiment, the Drift Attracted To behavior is similar to the Attracted To behavior, but by default an object moves towards the object of attraction and comes to rest, rather than overshooting the object of attraction and bouncing around.
  • Dashboard Control—In one embodiment, the Drift Attracted To Dashboard has an image well that the user can use to assign an object of attraction, as well as sliders for Strength and Drag.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Drift Attracted To behavior in the Inspector:
      • Object—In one embodiment, the Object parameter is set by an image well that defines the object of attraction.
      • Strength—In one embodiment, the Strength parameter is set by a slider that defines the speed at which the object moves towards the object of attraction. In one embodiment, with a value of 0, the object doesn't move at all. In another embodiment, the higher the value, the faster the object will move.
      • Falloff Type—In one embodiment, the Falloff Type parameter is set by a pop-up menu that determines whether the distance defined by the Influence parameter falls off linearly or exponentially.
        • Linear—In one embodiment, all objects that are within the area of influence are attracted equally.
        • Exponential—In one embodiment, the closer an object is within the area of influence, the stronger it is attracted, and the faster it will move towards the object of attraction.
      • Falloff Rate—In one embodiment, the Falloff Rate parameter is set by a slider that defines the rate of acceleration with which objects move towards the object of attraction. In one embodiment, a low Falloff Rate value results in objects quickly getting up to speed as they move towards the object of attraction. In another embodiment, a high Falloff Rate causes objects to accelerate much more slowly.
      • Influence—In one embodiment, the Influence parameter is set by a slider that defines the area of influence, in pixels. In one embodiment, objects that fall within the area of influence move towards the object of attraction. In another embodiment, objects that are outside of the area of influence remain where they are.
      • Drag—In one embodiment, the Drag parameter is set by a slider that can be used to reduce the distance attracted objects overshoot the object of attraction. In one embodiment, the effect is of the attracted objects skidding to a stop at the position of the target object. In another embodiment, lower Drag values result in the object overshooting the object of attraction, moving past and then careening back around towards the target object again and again. In yet another embodiment, higher Drag values result in the object coming to rest sooner.
  • Related Behaviors—In one embodiment, behaviors related to Drift Attracted To include Attracted To, Attractor, Drift Attractor, Orbit Around, Spring, and Vortex.
  • f. Drift Attractor
  • In one embodiment, the Drift Attractor behavior affects other objects' Position parameters. In one embodiment, the Drift Attractor behavior is similar to the Attractor behavior, but by default objects within the area of influence move towards the object of attraction and come to rest, rather than overshooting the object of attraction and bouncing around.
  • Dashboard Control—In one embodiment, the Drift Attractor Dashboard has controls for Affect, Strength, and Drag.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Drift Attractor behavior in the Inspector:
      • Affect—In one embodiment, the Affect parameter is set by a pop-up menu that limits which objects in a project are affected by the Drift Attractor behavior. In one embodiment, there are three options:
        • All Objects—In one embodiment, all objects in the Canvas are affected by the Drift Attractor behavior. In one embodiment, this is the default behavior.
        • Related Objects—In one embodiment, only other objects that are within the same layer as the object of attraction are affected.
        • Specific Objects—In one embodiment, only objects appearing in the Affected Objects list are affected by the Drift Attractor behavior. In one embodiment, the Affected Objects list appears when Specific Objects is selected in the Affect pop-up menu. In another embodiment, drag objects from the Layers tab into this list to have them affected by the Drift Attractor behavior when the Specific Objects option is selected in the Affect pop-up menu. In yet another embodiment, drag the layer icon of objects in a project from the Layers tab to add them to this list.
      • Strength—In one embodiment, the Strength parameter is set by a slider that defines the speed with which attracted objects move towards the target object. In one embodiment, with a value of 0, attracted objects don't move at all. In another embodiment, the higher the value, the faster attracted objects will move.
      • Falloff Type—In one embodiment, the Falloff parameter is set by a pop-up menu that determines whether the distance defined by the Influence parameter falls off linearly or exponentially.
        • Linear—In one embodiment, all objects that are within the area of influence are attracted equally.
        • Exponential—In one embodiment, the closer an object is within the area of influence, the stronger it is attracted, and the faster it will move towards the object of attraction.
      • Falloff Rate—In one embodiment, the Falloff Rate parameter is set by a slider that defines the rate of acceleration with which objects move towards the object of attraction. In one embodiment, a low Falloff Rate value results in objects quickly getting up to speed as they move towards the object of attraction. In another embodiment, a high Falloff Rate causes objects to accelerate much more slowly.
      • Influence—In one embodiment, the Influence parameter is set by a slider that defines the area of influence, in pixels. In one embodiment, objects that fall within the area of influence move towards the object of attraction. In another embodiment, objects that are outside of the area of influence remain where they are.
      • Drag—In one embodiment, the Drag parameter is set by a slider that can be used to reduce the distance attracted objects overshoot the object of attraction. In one embodiment, the effect is of the attracted objects skidding to a stop at the position of the target object. In another embodiment, lower Drag values result in the object overshooting the object of attraction, moving past and then careening back around towards the target object again and again. In yet another embodiment, higher Drag values result in the object coming to rest sooner.
  • Related Behaviors—In one embodiment, behaviors related to Drift Attractor include Attracted To, Attractor, Drift Attracted To, Orbit Around, Spring, and Vortex.
  • g. Edge Collision
  • In one embodiment, the Edge Collision behavior affects an object's 12 Position parameter. In one embodiment, the Edge Collision behavior is a good behavior to use to set up complex motion simulations where objects 12 should not exit the Canvas. In another embodiment, objects 12 with the Edge Collision behavior applied either come to a stop or bounce off after colliding with the edge of the Canvas frame. In yet another embodiment, for example, if a user applied the Throw behavior to an object 12 and set the velocity to send the object towards the edge of the frame, then applied Edge Collision, the object 12 would hit the edge of the frame, then bounce off according to the Bounce Strength parameter. FIG. 49 illustrates one object and an edge collision motion path 490, according to one embodiment of the invention. In one embodiment, the angle at which the object bounces depends on the angle with which it hit the edge of the frame, while the speed it travels after bouncing is set by the Bounce Strength parameter.
  • In one embodiment, the Edge Collision behavior uses only the rectangular edges of the object's bounding box to determine how the object collides with the Canvas edge. In one embodiment, if a user is using this behavior with an object that has an alpha channel that is smaller than its bounding box, adjust the Crop parameter in the object's Properties tab to fit the bounding box as closely as possible to the edge of the image.
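  • As an illustration only, a single collision test against the Canvas edges might be sketched as follows in Python; the bounding-box test and velocity reflection are assumptions consistent with the description above:

        def edge_collision(pos, vel, half_size, canvas_size, bounce_strength,
                           active_edges=("left", "right", "top", "bottom")):
            # Reflect velocity off any active edge the bounding box touches,
            # scaling the rebound speed by Bounce Strength (0 = full stop).
            (x, y), (vx, vy) = pos, vel
            (hw, hh), (w, h) = half_size, canvas_size
            if (("left" in active_edges and x - hw <= 0) or
                    ("right" in active_edges and x + hw >= w)):
                vx = -vx * bounce_strength
            if (("top" in active_edges and y - hh <= 0) or
                    ("bottom" in active_edges and y + hh >= h)):
                vy = -vy * bounce_strength
            return (vx, vy)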
  • Dashboard Control—In one embodiment, the Edge Collision Dashboard has controls for Affect, Bounce Strength, and Active Edges.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Edge Collision behavior in the Inspector:
      • Affect—In one embodiment, the Affect parameter is set by a pop-up menu that determines which objects in a project are affected by the Edge Collision behavior. In one embodiment, there are three options:
        • All Objects—In one embodiment, all objects in the Canvas bounce off the edge of the frame. In one embodiment, this is the default behavior.
        • Related Objects—In one embodiment, only other objects that are within the same layer as the object of attraction bounce off the edge of the frame.
        • Specific Objects—In one embodiment, only objects appearing in the Affected Objects list bounce off the edge of the frame. In one embodiment, the Affected Objects list appears when Specific Objects is selected in the Affect pop-up menu. In another embodiment, drag objects from the Layers tab into this list to have them affected by the Edge Collision behavior when the Specific Objects option is selected in the Affect pop-up menu. In yet another embodiment, drag the layer icon of objects in a project from the Layers tab to add them to this list.
      • Bounce Strength—In one embodiment, the Bounce Strength parameter is the speed at which objects travel after colliding with an edge. In one embodiment, a value of 0 causes objects to come to a complete stop when colliding with an edge. In another embodiment, higher values cause an object to move faster after bouncing.
      • Active Edges—In one embodiment, the Active Edges parameter is set by four checkboxes that define which Canvas edges are detected by the Edge Collision behavior. In one embodiment, a user can turn on and off edges in any combination.
  • h. Gravity
  • In one embodiment, the Gravity behavior affects an object's 12 Position parameter. In one embodiment, the Gravity behavior causes an object 12 to fall over time. In another embodiment, the gravitational acceleration can be increased or decreased, resulting in a change to the rate of fall. In yet another embodiment, objects 12 affected by the Gravity behavior continue to fall past the bottom edge of the Canvas (unless the Edge Collision behavior has been applied). FIG. 50 illustrates an object and a gravity motion path 500, according to one embodiment of the invention.
  • In one embodiment, the Gravity behavior can be used in conjunction with other behaviors that animate the position of objects to create natural-looking arcs and motion paths that simulate thrown objects falling to the ground. In one embodiment, for example, apply the Throw behavior to an object to send it flying through the air, and then apply the Gravity behavior to it to make the object arc up and then fall down past the bottom of the Canvas. In one embodiment, a user can also set the Acceleration parameter to a negative value, effectively applying "anti-gravity" to the object and making it fly up.
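  • In screen coordinates where y increases downward, a single Gravity step can be sketched as follows; the plain Euler integration is an assumption:

        def gravity_step(pos, vel, acceleration, dt):
            # Positive acceleration makes the object fall; a negative
            # value gives the "anti-gravity" effect described above.
            vel = (vel[0], vel[1] + acceleration * dt)
            return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt), vel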
  • Dashboard Control—In one embodiment, the Gravity Dashboard lets a user adjust the Acceleration parameter.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Gravity behavior in the Inspector:
      • Acceleration—In one embodiment, the Acceleration parameter is set by a slider that defines the strength of gravity affecting the target object. In one embodiment, the higher this value, the faster the target object will fall.
  • Related Behaviors—In one embodiment, behaviors related to Gravity include Motion Path, Random Motion, Throw, and Wind.
  • i. Orbit Around
  • In one embodiment, the Orbit Around behavior affects an object's 12A Position parameter. In one embodiment, similar to the Attracted To behavior, the Orbit Around behavior's default parameter settings cause an object 12A to orbit around another object 12B in a perfect circle. FIG. 51 illustrates a first object orbiting around a second object and an orbit motion path 510 of the first object, according to one embodiment of the invention.
  • Dashboard Control—In one embodiment, the Orbit Around Dashboard 110 has an image well 520 that a user can use to assign an object 12 of attraction, as well as controls for Strength 522, Falloff Type 524, Falloff Rate 526, Influence 527, Drag 528, and Direction 529. FIG. 52 illustrates a Dashboard of an Orbit Around behavior, according to one embodiment of the invention.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Orbit Around behavior in the Inspector:
      • Object—In one embodiment, the Object Parameter is set by an image well that defines the object to orbit around.
      • Strength—In one embodiment, the Strength parameter is set by a slider that defines the speed at which the object moves.
      • Falloff Type—In one embodiment, the Falloff Type parameter is set by a pop-up menu that determines whether the distance defined by the Influence parameter falls off linearly or exponentially. In one embodiment, the default is Linear.
        • Linear—In one embodiment, all objects that are within the area of influence are attracted equally.
        • Exponential—In one embodiment, the closer an object is within the area of influence, the stronger it is attracted, and the faster it will move around the object of attraction.
      • Falloff Rate—In one embodiment, the Falloff Rate parameter is set by a slider that defines the rate of acceleration with which objects move around the object of attraction. In one embodiment, for orbit, the default value is 1, keeping the object in a stable orbit around the target object. In another embodiment, a low Falloff Rate value results in objects quickly getting up to speed as they move around the object of attraction. In yet another embodiment, a high Falloff Rate causes objects to accelerate much more slowly.
      • Influence—In one embodiment, the Influence parameter is set by a slider that defines the area of influence, in pixels. In one embodiment, objects that fall within the area of influence move around the object of attraction. In another embodiment, objects that are outside of the area of influence remain as they are.
      • Drag—In one embodiment, the Drag parameter is set by a slider that can be used to reduce the distance attracted objects overshoot the object of attraction if they're set to fall. In one embodiment, the effect is of the attracted objects skidding to a stop at the position of the target object. In another embodiment, lower Drag values result in the object overshooting the object of attraction, moving past and then careening back around towards the target object again and again. In yet another embodiment, higher Drag values result in the object coming to rest sooner. In one embodiment, the default value for the Drag parameter is 0.
      • Direction—In one embodiment, the Direction parameter reverses the direction of this behavior.
  • Related Behaviors—In one embodiment, behaviors related to Orbit Around include Attracted To, Attractor, Drift Attracted To, Drift Attractor, Spring, and Vortex.
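  • By way of illustration, the following Python sketch shows one way a per-frame orbit update of this kind could be computed: the object is seeded with a velocity perpendicular to the line joining it to the target, and a constant centripetal pull (Strength) keeps it on an approximately circular path while it remains inside the area of influence. The function names and formulas are hypothetical assumptions, not the described embodiment; semi-implicit Euler integration (velocity first, then position) is used because it keeps such orbits stable over many frames.

    import math

    def initial_orbit_velocity(pos, center, strength):
        # Seed a velocity perpendicular to the radius so that, with the
        # default settings, the object starts on a roughly circular orbit.
        dx, dy = center[0] - pos[0], center[1] - pos[1]
        r = math.hypot(dx, dy)
        speed = math.sqrt(strength * r)          # v = sqrt(a * r) for a circle
        return (-dy / r * speed, dx / r * speed)

    def orbit_step(pos, vel, center, strength, influence, dt=1.0):
        # One frame of a hypothetical Orbit Around update: accelerate the
        # object toward the target, but only inside the area of influence.
        dx, dy = center[0] - pos[0], center[1] - pos[1]
        dist = math.hypot(dx, dy)
        if dist == 0 or dist > influence:
            return pos, vel                      # outside the area: unaffected
        ax, ay = strength * dx / dist, strength * dy / dist
        vx, vy = vel[0] + ax * dt, vel[1] + ay * dt
        return (pos[0] + vx * dt, pos[1] + vy * dt), (vx, vy)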
  • j. Random Motion
  • In one embodiment, the Random Motion behavior affects an object's 12 Position parameter. In one embodiment, if a user applies the Random Motion behavior to an object 12, the behavior animates the position of the object, and makes the object move around the Canvas along a random path 530. FIG. 53 illustrates an object and a Random Motion motion path, according to one embodiment of the invention.
  • In one embodiment, although the motion created with this behavior is random, the motion is actually predetermined by the particular group of parameters a user has chosen. In one embodiment, as long as the user doesn't change the parameters, the motion path created by this behavior will remain the same. In another embodiment, if the user doesn't like the path that was randomly generated, the user can click the Generate button in either the Dashboard or the Behavior tab in the Inspector to pick a new random seed number. In yet another embodiment, this number is used to generate a new path.
  • In one embodiment, the Random Motion behavior is useful for quickly creating varied motion paths for large numbers of objects that a user wants to move at the same time. In one embodiment, for example, a user can create an arrangement of ten objects in the Canvas and apply the Random Motion behavior to all of them. In another embodiment, a user can also use the Random Motion behavior to add variation to the motion paths 540 created by other behaviors affecting an object's 12 position. In yet another embodiment, for example, adding Random Motion to an object 12 with the Orbit Around behavior results in a more erratic motion path 540, although the object 12 still orbits as before. FIG. 54 illustrates an Orbit Around behavior applied to an object and the object's motion path, according to one embodiment of the invention. FIG. 55 illustrates both an Orbit Around behavior and a Random Motion behavior applied to an object and the object's motion path, according to one embodiment of the invention.
  • Dashboard Control—In one embodiment, the Random Motion Dashboard 110 has controls for the Amount 560, Frequency 562, Noisiness 564, Drag 566, and Random Seed 568 parameters. FIG. 56 illustrates a Dashboard for a Random Motion behavior, according to one embodiment of the invention.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Random Motion behavior in the Inspector:
      • Amount—In one embodiment, the Amount parameter is set by a slider that determines the speed the object moves by changing the length of the motion path. In one embodiment, higher values result in faster motion and longer motion paths.
      • Frequency—In one embodiment, the Frequency parameter is set by a slider that determines the number of twists and turns in the motion path, which can be seen by the crookedness of the resulting motion path. In one embodiment, higher values create more turns in the motion path. In another embodiment, lower values result in straighter motion paths.
      • Noisiness—In one embodiment, the Noisiness parameter is set by a slider that determines an additional level of jaggedness along the motion path shape defined by the Amount parameter. In one embodiment, higher values result in a more jagged looking motion path.
      • Drag—In one embodiment, the Drag parameter is set by a slider that controls the speed the object moves along the motion path, without changing the shape of the motion path itself. In one embodiment, while the Amount parameter controls the length of the motion path, the Drag parameter shrinks or enlarges the motion path as a whole.
      • Random Seed—In one embodiment, the Random Seed parameter is set by a button that lets a user pick a new random seed number. In one embodiment, this number is used to randomly generate new motion paths, based on the values the user has picked in the other parameters of this behavior.
  • Related Behaviors—In one embodiment, behaviors related to Random Motion include Motion Path, Gravity, Throw, and Wind.
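  • The role of the Random Seed parameter can be made concrete with a short Python sketch: because the generator is seeded, the same seed and parameter values always reproduce the same "random" path, and the Generate button only has to pick a new seed. The path model below (a random walk whose turns come from Frequency and whose jitter comes from Noisiness) is an illustrative assumption, not the described embodiment.

    import math
    import random

    def random_motion_path(seed, frames, amount, frequency, noisiness):
        # Hypothetical random-walk path: Amount sets the step length (speed),
        # Frequency adds turns, and Noisiness adds extra jitter on top.
        rng = random.Random(seed)                # deterministic for a given seed
        x = y = 0.0
        heading = rng.uniform(0.0, 2.0 * math.pi)
        path = [(x, y)]
        for _ in range(frames - 1):
            heading += rng.uniform(-1.0, 1.0) * frequency
            heading += rng.uniform(-1.0, 1.0) * noisiness
            x += amount * math.cos(heading)
            y += amount * math.sin(heading)
            path.append((x, y))
        return path

    # The same seed always yields the same path; "Generate" = a new seed.
    assert random_motion_path(42, 10, 5.0, 0.3, 0.1) == \
           random_motion_path(42, 10, 5.0, 0.3, 0.1)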
  • k. Repel
  • In one embodiment, the Repel behavior affects other objects' Position parameters. In one embodiment, the Repel behavior is the opposite of the Attractor behavior, and is part of a group of simulation behaviors that create complex animated relationships between two or more objects. In another embodiment, if a user applies the Repel behavior to an object 12A, the behavior pushes away all objects 12B within the area of influence in the Canvas. In yet another embodiment, the strength with which objects 12B are pushed away can be increased or decreased, as can the distance repelled objects 12B travel. FIG. 57 illustrates several objects, according to one embodiment of the invention. FIG. 58 illustrates the same objects as in FIG. 57 after the Repel behavior has been applied to the central object, according to one embodiment of the invention.
  • Dashboard Control—In one embodiment, the Repel Dashboard has controls for Strength, Falloff Type, Falloff Rate, Influence, and Drag.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Repel behavior in the Inspector:
      • Affect—In one embodiment, the Affect parameter is set by a pop-up menu that limits which objects in a project are affected by the Repel behavior. In one embodiment, there are three options:
        • All Objects—In one embodiment, all objects in the Canvas are affected by the Repel behavior. In one embodiment, this is the default behavior.
        • Related Objects—In one embodiment, only other objects that are within the same layer as the repelling object are affected.
        • Specific Objects—In one embodiment, only objects appearing in the Affected Objects list are affected by the Repel behavior. In one embodiment, the Affected Objects list appears when Specific Objects is selected in the Affect pop-up menu. In another embodiment, drag objects from the Layers tab into this list to be affected by the Repel behavior when the Specific Objects option is selected in the Affect pop-up menu. In yet another embodiment, drag the layer icon of objects in a project from the Layers tab to add them to this list.
      • Strength—In one embodiment, the Strength parameter is set by a slider that defines the speed with which repelled objects move away from the object. In one embodiment, with a value of 0, repelled objects don't move at all. In another embodiment, the higher the value, the faster repelled objects move.
      • Falloff Type—In one embodiment, the Falloff Type parameter is set by a pop-up menu that determines whether the distance defined by the Influence parameter falls off linearly or exponentially.
        • Linear—In one embodiment, all objects that are within the area of influence are repelled equally.
        • Exponential—In one embodiment, the closer an object is within the area of influence, the more it is repelled, and the faster it moves away from the repelling object.
      • Falloff Rate—In one embodiment, the Falloff Rate parameter is set by a slider that defines the rate of acceleration with which objects move away from the repelling object. In one embodiment, a low Falloff Rate value results in objects quickly getting up to speed as they move away. In another embodiment, a high Falloff Rate causes objects to accelerate more slowly.
      • Influence—In one embodiment, the Influence parameter is set by a slider that defines the area of influence, in pixels. In one embodiment, objects that fall within the area of influence move away from the repelling object. In another embodiment, objects that are outside of the area of influence remain where they are.
      • Drag—In one embodiment, the Drag parameter is set by a slider that can be used to reduce the distance repelled objects travel away from the repelling object.
  • Related Behaviors—In one embodiment, behaviors related to Repel include Repel From.
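  • The interaction of Strength, Influence, and the two Falloff Types might be expressed as in the following hypothetical Python sketch, which computes the per-frame push applied to one candidate object; the two weighting functions are illustrative assumptions only. An Affect setting of Specific Objects would simply restrict the set of objects the function is applied to.

    import math

    def repel_offset(obj_pos, repeller_pos, strength, influence,
                     falloff_type="linear", falloff_rate=1.0):
        # Per-frame displacement applied by a hypothetical Repel update.
        dx, dy = obj_pos[0] - repeller_pos[0], obj_pos[1] - repeller_pos[1]
        dist = math.hypot(dx, dy)
        if dist == 0.0 or dist >= influence:
            return (0.0, 0.0)                    # outside the area: unaffected
        if falloff_type == "linear":
            weight = 1.0 - dist / influence      # even attenuation with distance
        else:                                    # "exponential": closer objects
            weight = math.exp(-falloff_rate * dist / influence)  # pushed harder
        push = strength * weight
        return (push * dx / dist, push * dy / dist)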
  • l. Repel From
  • In one embodiment, the Repel From behavior affects an object's Position parameter. In one embodiment, while the Repel behavior pushes other objects away, the Repel From behavior works in the reverse direction, making the object it is applied to move away from a selected object in the Canvas.
  • Dashboard Control—In one embodiment, the Repel From Dashboard has an image well that the user can use to assign an object to move away from, as well as controls for Strength, Falloff Type, Falloff Rate, Influence, and Drag.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Repel From behavior in the Inspector:
      • Object—In one embodiment, the Object parameter is set by an image well that defines the object to be repelled from.
      • Strength—In one embodiment, the Strength parameter is set by a slider that defines the speed at which the object is repelled. In one embodiment, with a value of 0, the object is not repelled at all. In another embodiment, the higher the value, the faster the object is repelled.
      • Falloff Type—In one embodiment, the Falloff Type parameter is set by a pop-up menu that determines whether the distance defined by the Influence parameter falls off linearly or exponentially.
        • Linear—In one embodiment, the object is repelled equally regardless of its distance from the repelling object.
        • Exponential—In one embodiment, the closer an object is within the area of influence, the more it is repelled, and the faster it moves away from the repelling object.
      • Falloff Rate—In one embodiment, the Falloff Rate is set by a slider that defines the rate of acceleration at which the object moves away from the repelling object. In one embodiment, a low Falloff Rate value results in the object quickly getting up to speed as it moves away. In another embodiment, a high Falloff Rate causes the object to accelerate more slowly.
      • Influence—In one embodiment, the Influence parameter is set by a slider that defines the area of influence, in pixels. In one embodiment, if the object falls within the area of influence, it is repelled. In another embodiment, if the object is outside of the area of influence, it remains unaffected.
      • Drag—In one embodiment, the Drag parameter is set by a slider that can be used to reduce the distance the object travels away from the repelling object.
  • Related Behaviors—In one embodiment, behaviors related to Repel From include Repel.
  • m. Rotational Drag
  • In one embodiment, the Rotational Drag behavior affects an object's Rotation parameter. In one embodiment, the Rotational Drag behavior is similar to the Drag behavior, except that it affects Rotation instead of Position. In another embodiment, rotational drag simulates friction affecting objects that are spinning due to keyframed or behavior-driven changes to the Rotation parameter. In yet another embodiment, by setting higher drag values, a user can slow rotational changes to an eventual stop.
  • Dashboard Control—In one embodiment, the Rotational Drag Dashboard lets a user control the Amount of drag.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Rotational Drag behavior in the Inspector:
      • Amount—In one embodiment, the Amount parameter can be set by a slider that can be used to slow down an object's rotation over time, causing it to eventually come to a stop. In one embodiment, higher Amount values result in the rotation ending sooner.
  • Related Behaviors—In one embodiment, behaviors related to Rotational Drag include Drag.
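  • A hypothetical per-frame implementation of such rotational friction is simply an exponential decay of angular velocity, as in the Python sketch below; the damping formula is an assumption rather than the described embodiment.

    def rotational_drag(angular_velocity, amount, dt=1.0):
        # Damp spin toward zero; a higher Amount stops the rotation sooner.
        return angular_velocity * max(0.0, 1.0 - amount * dt)

    # A spinning object slows a little more each frame:
    w = 10.0                                     # degrees per frame
    for _ in range(5):
        w = rotational_drag(w, amount=0.2)
    # w is now 10 * 0.8**5, approximately 3.28 degrees per frame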
  • n. Spring
  • In one embodiment, the Spring behavior affects an object's Position parameter. In one embodiment, the Spring behavior creates a relationship between two objects, so that an object with the Spring behavior applied to it moves back and forth around a second object by a specified distance. In another embodiment, the Attract To parameter defines the object that serves as the target and center of the spring behavior. In yet another embodiment, additional parameters let a user adjust the speed of the behavior (Spring Tension) and the rest distance around which the object oscillates (Relaxed Length).
  • In one embodiment, if the Attract To object is at a stop, the resulting motion is fairly simple and the springing object moves back and forth in a straight line. In one embodiment, if the Attract To object is in motion, the springing object's motion will be much more complex, changing direction according to the velocity of the Attract To object.
  • Dashboard Control—In one embodiment, the Spring Dashboard contains an image well that lets a user set the Attract To object. In one embodiment, the Spring Dashboard contains two sliders that let a user adjust the Spring Tension and Relaxed Length of the Spring effect. In another embodiment, a checkbox lets a user turn on the Repel parameter.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Spring behavior in the Inspector:
      • Attract To—In one embodiment, the Attract To parameter is set by an image well that defines the object of attraction.
      • Spring Tension—In one embodiment, the Spring Tension parameter is set by a slider that determines how fast the object is pulled towards the object of attraction.
      • Relaxed Length—In one embodiment, the Relaxed Length parameter is set by a slider that determines how far away the object can be pulled from a moving object of attraction.
      • Repel—In one embodiment, when this checkbox is turned on, when the object gets closer to the object of attraction than the Relaxed Length value, the objects are pushed apart. In one embodiment, when this checkbox is turned off, no repelling force is applied.
  • Related Behaviors—In one embodiment, behaviors related to Spring include Attracted To, Attractor, Drift Attracted To, Drift Attractor, Orbit Around, and Vortex.
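  • The back-and-forth motion described above is essentially Hooke's law applied around a rest distance. The following Python sketch is one hypothetical way to model it, including the optional Repel checkbox; the names and the clamping rule for the non-repelling case are assumptions.

    import math

    def spring_step(pos, vel, target, tension, relaxed_length,
                    repel=False, dt=1.0):
        # One frame of a hypothetical Spring update: F = k * x, where x is
        # how far the current distance deviates from the Relaxed Length.
        dx, dy = target[0] - pos[0], target[1] - pos[1]
        dist = math.hypot(dx, dy) or 1e-9
        stretch = dist - relaxed_length          # > 0: too far; < 0: too close
        if stretch < 0.0 and not repel:
            stretch = 0.0                        # Repel off: no outward push
        force = tension * stretch
        ax, ay = force * dx / dist, force * dy / dist
        vx, vy = vel[0] + ax * dt, vel[1] + ay * dt
        return (pos[0] + vx * dt, pos[1] + vy * dt), (vx, vy)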
  • o. Vortex
  • In one embodiment, the Vortex behavior affects other objects' Position parameters. In one embodiment, the Vortex behavior is the opposite of the Orbit Around behavior. In another embodiment, whereas the Orbit Around behavior causes one object to orbit around another target object, the Vortex behavior exerts a force on all objects surrounding the object to which the Vortex behavior is applied.
  • Dashboard Control—In one embodiment, the Vortex Dashboard has a pop-up menu that lets a user limit the objects affected by this behavior, as well as controls for Strength, Falloff Type, Falloff Rate, Influence, Drag, and Direction.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Vortex behavior in the Inspector:
      • Affect—In one embodiment, the Affect parameter is set by a pop-up menu that limits which objects in a project are affected by the Vortex behavior. In one embodiment, there are three options:
        • All Objects—In one embodiment, all objects in the Canvas are affected by the Vortex behavior. In one embodiment, this is the default behavior.
        • Related Objects—In one embodiment, only other objects that are within the same layer as the object of attraction are affected.
        • Specific Objects—In one embodiment, only objects appearing in the Affected Objects list are affected by the Vortex behavior. In one embodiment, the Affected Objects list appears when Specific Objects is selected in the Affect pop-up menu. In another embodiment, drag objects from the Layers tab into this list to be affected by the Vortex behavior when the Specific Objects option is selected in the Affect pop-up menu. In yet another embodiment, drag the layer icon of objects in a project from the Layers tab to add them to this list.
      • Strength—In one embodiment, the Strength parameter is set by a slider that defines the speed at which the affected objects move about the object of attraction.
      • Falloff Type—In one embodiment, the Falloff Type parameter is set by a pop-up menu that determines whether the distance defined by the Influence parameter falls off linearly or exponentially. In one embodiment, the default is Linear.
        • Linear—In one embodiment, all objects that are within the area of influence are affected equally.
        • Exponential—In one embodiment, the closer an object is within the area of influence, the stronger the effect, and the faster it will move.
      • Falloff Rate—In one embodiment, the Falloff Rate parameter is set by a slider that defines the rate of acceleration with which objects move around the object of attraction. In one embodiment, a low Falloff Rate value results in objects quickly getting up to speed as they move around the object of attraction. In another embodiment, a high Falloff Rate causes objects to accelerate much more slowly.
      • Influence—In one embodiment, the Influence parameter is set by a slider that defines the area of influence, in pixels. In one embodiment, objects that fall within the area of influence move around the object of attraction. In another embodiment, objects that are outside of the area of influence remain where they are.
      • Drag—In one embodiment, the Drag parameter is set by a slider that can be used to reduce the distance attracted objects overshoot the object of attraction if they fall towards it at any point. In one embodiment, the effect is of the attracted objects skidding to a stop at the position of the target object. In another embodiment, lower Drag values result in the object overshooting the object of attraction, moving past and then careening back around towards the target object again and again. In yet another embodiment, higher Drag values result in the object coming to rest sooner.
      • Direction—In one embodiment, the Direction parameter is set by a pop-up menu that lets a user set whether objects move around in a Clockwise or Counter-clockwise direction.
  • Related Behaviors—In one embodiment, behaviors related to Vortex include Attracted To, Attractor, Drift Attracted To, Drift Attractor, Orbit Around, and Spring.
  • p. Wind
  • In one embodiment, the Wind behavior affects an object's Position parameter. In one embodiment, apply the Wind behavior to an object to animate its position and move it in a specified direction. In another embodiment, unlike the Throw behavior, the velocity specified by the Wind behavior is a continuous force, and its parameters can be keyframed to achieve gradual changes in speed and direction.
  • In one embodiment, the Wind behavior is better than the Throw behavior when a user wants to vary the speed of the object being animated. In one embodiment, a user can either apply another behavior (such as randomize or ramp) or keyframe the Velocity parameter of the Wind behavior to vary the speed and direction at which the object moves. In another embodiment, a user cannot make gradual changes in either speed or direction with the Throw behavior.
  • Dashboard Control—In one embodiment, the Wind Dashboard 110 lets a user specify the direction and speed of the Wind behavior by dragging an arrow 590 within a circular region 592. In one embodiment, the direction of the arrow defines the direction of movement, and the length of the arrow defines speed. In another embodiment, a slider 594 to the right lets a user adjust the scale of the Dashboard control, increasing or decreasing the effect the control has over the object. In yet another embodiment, the maximum speed a user can define with the Dashboard 110 is not the maximum speed possible. In one embodiment, higher values can be entered into the Velocity parameter in the Behaviors tab of the Inspector. FIG. 59 illustrates a Dashboard of a Wind behavior, according to one embodiment of the invention.
  • Parameters in the Inspector—In one embodiment, the following parameters are available for the Wind behavior in the Inspector:
      • Air Thickness—In one embodiment, the Air Thickness parameter is set by a slider that adjusts how fast the object accelerates when the speed is changed. In one embodiment, lower values (simulating thinner air) have less effect when pushing the object, so it takes longer to get up to speed. In another embodiment, higher values (thicker air) have more effect, and push the object up to speed more quickly.
      • Velocity—In one embodiment, the Velocity parameter is set by a slider that adjusts the speed at which the simulated air is blowing the object. In one embodiment, higher values result in faster motion.
  • Related Behaviors—In one embodiment, behaviors related to Wind include Motion Path, Gravity, Random Motion, and Throw.
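  • The contrast drawn above between a one-time impulse (Throw) and a continuous, keyframeable force (Wind) can be sketched in a few lines of Python. Treating Air Thickness as an easing factor toward a target speed is an illustrative assumption; because target_speed is read every frame, it can change over time, which is exactly what Throw's single impulse cannot do.

    def throw_positions(velocity, frames, dt=1.0):
        # Throw: a single initial impulse; the velocity never changes.
        return [velocity * dt * f for f in range(frames)]

    def wind_positions(target_speed, air_thickness, frames, dt=1.0):
        # Wind: a continuous force; thicker air pushes the object up to
        # the (possibly keyframed) target speed more quickly.
        pos, vel, out = 0.0, 0.0, []
        for _ in range(frames):
            vel += (target_speed - vel) * air_thickness * dt
            pos += vel * dt
            out.append(pos)
        return out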
  • I. EXAMPLES
  • In one embodiment, the following examples illustrate different ways that groups of behaviors can be combined to create different effects.
  • i. Example 1 Creating Animated Title
  • In one embodiment, in this example, multiple behaviors will be used to bring up three objects to create a title. In one embodiment, the first two graphic objects fly in from the sides, while the last text object zooms out from the center of the screen. In another embodiment, this example assumes that the Create Objects At preference in the Project Preferences window is set to Start of Project, so that newly applied behaviors are placed from the beginning of each object all the way through the end.
  • In one embodiment, to create an animated title sequence:
      • In one embodiment, arrange the first two graphic objects 600A, 600B to determine their vertical position in the composition. FIG. 60 illustrates two graphic objects, according to one embodiment of the invention.
      • In one embodiment, select both objects, click the Add Behavior icon in the Toolbar, and choose Basic Motion>Motion Path from the pop-up menu 610 to apply this behavior to both objects at the same time. FIG. 61 illustrates a pop-up menu showing Basic Motion>Motion Path, according to one embodiment of the invention.
      • In one embodiment, select the top object 600A. In one embodiment, if necessary, choose the Motion Path behavior from the Dashboard pop-up menu to make that object's motion path 620 editable. In another embodiment, move the start point 622 of the motion path 620 to the off-screen position where it will start, and move the end point 624 to the onscreen position where it will stop. FIG. 62 illustrates the top object's motion path, according to one embodiment of the invention.
      • In one embodiment, next, select the bottom object 600B. In one embodiment, choose the Motion Path behavior from the Dashboard pop-up menu to make its motion path 630 editable. In another embodiment, move the start point 632 of the motion path 630 to the off-screen position where it will start, and move the end point 634 to the onscreen position where it will stop. FIG. 63 illustrates the bottom object's motion path, according to one embodiment of the invention.
      • In one embodiment, click the Play button or scrub the playhead in the Timeline or Canvas to see both objects moving onscreen. In one embodiment, both objects come to an abrupt stop. In another embodiment, this is probably not the desired effect, so in the next step the Ease Out option will be used to slow both objects to a gentle stop.
      • In one embodiment, for each object in the Layers tab, choose its Motion Path behavior from the Dashboard 110 pop-up menu, and choose Ease Out 640 from the speed pop-up menu. FIG. 64 illustrates a Dashboard for the Motion Path behavior showing the Speed parameter as Ease Out, according to one embodiment of the invention. In one embodiment, as a result, both objects will slow down before gradually coming to a stop.
      • In one embodiment, now, create a text object 12C. In one embodiment, this is the object that will fade in and zoom up to fill the screen. In one embodiment, resize this object 12C to the size it will be at the beginning of the sequence. FIG. 65 illustrates a small text object, according to one embodiment of the invention.
      • In one embodiment, next, choose the Adjust Anchor Point tool and move the anchor point 660 to the center of the object 12C. In one embodiment, this way, when the object is scaled up with the Grow/Shrink behavior, it will zoom from its center. FIG. 66 illustrates the text object of FIG. 65 with a new anchor point location, according to one embodiment of the invention.
      • In one embodiment, select the text object, then click the Add Behavior icon and choose Basic Motion>Grow/Shrink from the pop-up menu.
      • In one embodiment, next, open the Inspector, and click the Behaviors tab 18. In one embodiment, choose Final Value 670 from the Grow/Shrink behavior's 10 Increment pop-up menu 672. In another embodiment, this enables the Grow/Shrink Dashboard 110 control to control the size of the affected object at the last frame of the behavior, so that the object doesn't grow indefinitely. FIG. 67 illustrates the Increment pop-up menu of the Grow/Shrink behavior in the Behaviors tab of the Inspector, according to one embodiment of the invention.
      • In one embodiment, reposition the text object 12C at the center of the canvas, move the playhead to the last frame of the animation, and drag the Grow/Shrink control 680 in the Dashboard 110 until the text object 12C reaches its final size. FIG. 68 illustrates the text object and the Grow/Shrink Dashboard, according to one embodiment of the invention.
      • In one embodiment, back in the Behaviors tab of the Inspector, increase the value of the Curvature parameter. In one embodiment, this causes the increase in scale to gradually slow to a stop, rather than stopping abruptly.
      • In one embodiment, lastly, the Fade In/Fade Out behavior 10 will be used to fade the text object onscreen. In one embodiment, select the text object 12C, then click the Add Behavior icon, and choose Basic Motion>Fade In/Fade Out from the Behavior pop-up menu in the Toolbar.
      • In one embodiment, drag the left shaded ramp 690A of the Fade In/Fade Out control 692 in the Dashboard 110 to the right to lengthen the fade in effect. FIG. 69 illustrates the Fade In/Fade Out Dashboard, according to one embodiment of the invention.
      • In one embodiment, drag the right shaded ramp 690B all the way to the right, until it's a non-shaded, vertical edge. In one embodiment, this eliminates the fade out part of the effect, so that the center text object remains onscreen for the remainder of its duration. In another embodiment, the animation is now complete.
        FIG. 70 illustrates the composition at the first frame, according to one embodiment of the invention. FIG. 71 illustrates the composition at a middle frame, according to one embodiment of the invention. FIG. 72 illustrates the composition at the last frame, according to one embodiment of the invention.
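  • The stack of behaviors applied to the center text object in this example (an eased motion, Grow/Shrink toward a final value, and a fade in with the fade out disabled) can be summarized as one per-frame evaluation. The Python sketch below is a loose paraphrase under assumed easing formulas, not the application's actual evaluation order.

    def ease_out(t):
        # Decelerate to a gentle stop (t normalized to 0..1).
        return 1.0 - (1.0 - t) ** 2

    def lerp(a, b, t):
        return a + (b - a) * t

    def title_frame(frame, total, start, end, size0, size1, fade_frames):
        # Evaluate position, scale, and opacity for one frame of the title.
        t = frame / (total - 1)
        u = ease_out(t)                          # Ease Out / Curvature analog
        pos = (lerp(start[0], end[0], u), lerp(start[1], end[1], u))
        scale = lerp(size0, size1, u)            # Grow/Shrink to a final value
        opacity = min(1.0, frame / fade_frames)  # fade in; fade out disabled
        return pos, scale, opacity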
  • ii. Example 2 Creating a Clock Animation
  • In one embodiment, in this example, two parameter behaviors will be used to create an animated clock. In one embodiment, by arranging the objects and their anchor points properly, each part's motion can be created quickly and easily using the Rate and Oscillate behaviors.
  • In one embodiment, to create a clock animation:
      • In one embodiment, place the graphics objects constituting the hands, face, and pendulum into the Canvas, arranging them to create the clock. In one embodiment, the hands are on top, the face is in the middle, and the pendulum is in the back. In another embodiment, by default, the anchor point is located at the center of each object. In yet another embodiment, prior to adding behaviors to animate these objects, move the anchor points so that the objects move the way they're supposed to. In one embodiment, in this example, the hands should spin about the center of the clock face, not the center of the hand itself, and the pendulum should swing from its top.
      • In one embodiment, choose the Adjust Anchor Point tool, and move the anchor points of both hand objects and the pendulum object to the area that should appear to be attached to the rest of the clock.
      • In one embodiment, now that the composition is set up to be animated, the only remaining thing to do is to assign behaviors to each of the objects. In one embodiment, select the minute hand object, and open the Properties tab in the Inspector. In another embodiment, control-click the Rotation parameter, and choose Rate from the shortcut menu to apply the Rate parameter behavior.
      • In one embodiment, now, open the Behaviors tab, and set the Rate parameter to −49. In one embodiment, this rotates the minute hand clockwise at a continuous rate.
      • In one embodiment, next, select the hour hand object, then open the Properties tab in the Inspector. In one embodiment, control-click its Rotation parameter, and choose Rate from the shortcut menu to apply the Rate parameter behavior to this object, as well.
      • In one embodiment, again, open the Behaviors tab, but this time set the Rate parameter to −4. In one embodiment, this also rotates the hour hand clockwise at a continuous rate, but much more slowly than the minute hand, replicating the relative movement of both hands.
      • In one embodiment, now it's time to make the pendulum swing. In one embodiment, select the pendulum object. In another embodiment, the user should have already adjusted its anchor point to be at the top. In yet another embodiment, this way, the pendulum object will swing properly from its top. In one embodiment, open the Properties tab, Control-click the Rotation parameter, but this time choose Oscillate from the shortcut menu to add the Oscillate parameter behavior.
      • In one embodiment, open the Behaviors tab. In one embodiment, reduce the Oscillate behavior's Amplitude to 20 so that the pendulum object doesn't swing so widely. In another embodiment, then, increase the Speed to 50 in keeping with the overall fast-forward motion of the clock.
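  • In terms of the underlying math, the Rate behavior is a linear ramp on the Rotation channel and the Oscillate behavior is a sine wave around it. The following Python sketch mirrors the settings used in this example; the units and exact waveforms are assumptions rather than the described embodiment.

    import math

    def rate(t, rate_value):
        # Rate parameter behavior: a steady change over time.
        return rate_value * t

    def oscillate(t, amplitude, speed):
        # Oscillate parameter behavior: a sine wave around zero.
        return amplitude * math.sin(speed * t)

    def clock_rotations(t):
        # Rotation of each clock part at time t, per the example's values.
        return {
            "minute_hand": rate(t, -49.0),       # fast clockwise sweep
            "hour_hand":   rate(t, -4.0),        # same direction, ~1/12 as fast
            "pendulum":    oscillate(t, amplitude=20.0, speed=50.0),
        }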
  • Examples of Object Types to Which Behaviors May Be Applied
  • A. Text
  • Behaviors can be applied to text, one of the most essential motion graphics elements.
  • i. About Type
  • In motion graphics, type has become more than words that provide basic information, such as what time to tune into your favorite television program. Type design has become an art form. A title sequence can set the mood of the film it is introducing, a certain combination of typeface and animation style can provide instant recognition of the identity of a broadcast network, or a clever television interstitial can keep a viewer from flipping channels during a commercial break. For example, the opening title sequence by Friz Freleng for Blake Edwards' “The Pink Panther” turned a movie title into a movie and television star, with a design and graphics style that hold up even today, nearly 40 years later.
  • Although trends in type design change, the balanced use of type and graphics will remain the key to achieving the right effect on the subject of commercials, documentaries, titles, broadcast identification, corporate presentations, or personal video projects. No matter what style a project requires, unique text animation tools offer immediate results.
  • ii. Using Text
  • In one embodiment, text is added to a project directly in a Canvas. In one embodiment, select a Text tool, click in the Canvas, and start typing. In another embodiment, once a text object has been created, text may be added and edited in the Canvas, or in a Text Editor in a Format pane of a Text Inspector. In yet another embodiment, once a text object has been created, the text may be put on a line or elliptical path that can be animated.
  • In one embodiment, when text is created, it becomes a text object. In one embodiment, the stacking order of text objects can be changed within a layer, or text objects can be moved to another layer, similar to other types of objects (e.g., video clips, images, paint objects, and shapes). In another embodiment, text objects can be easily duplicated or copied from one layer to another. In yet another embodiment, filters, transfer modes, and shadows can be applied to text objects, similar to other types of objects. In one embodiment, text objects can be moved, rotated, scaled, and easily animated using Basic Motion or Simulation behaviors (such as Throw or Gravity) or by setting keyframes.
  • In one embodiment, a text object, unlike other object types, has a special group of behaviors called Text Animation behaviors. In one embodiment, text behaviors create text animation by generating a range of values in text parameters specific to titling effects, without setting any keyframes. In another embodiment, for example, the Text Tracking behavior can be dragged onto a text object, and the text characters will gracefully spread out across the Canvas over time.
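  • For instance, the spreading effect of a Text Tracking behavior amounts to animating a single tracking value and offsetting each character by its index times that value, with no keyframes on the characters themselves. The Python sketch below illustrates the idea; the names and units are assumptions.

    def tracked_offsets(num_chars, t, rate):
        # Horizontal offset of each character for a hypothetical Text
        # Tracking behavior whose tracking value grows over time.
        tracking = rate * t
        return [i * tracking for i in range(num_chars)]

    # At t=0 the characters sit at their normal positions; later frames
    # push each successive character a little further along the baseline:
    print(tracked_offsets(5, t=0.0, rate=2.0))   # [0.0, 0.0, 0.0, 0.0, 0.0]
    print(tracked_offsets(5, t=1.0, rate=2.0))   # [0.0, 2.0, 4.0, 6.0, 8.0]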
  • In one embodiment, using behaviors is an ideal workflow to interactively test different looks and animations. In one embodiment, it is not necessary to explicitly set keyframes, wait for a preview, and play the preview only to discover the need to go back and adjust the keyframes. In another embodiment, the rate of an applied behavior can be quickly adjusted using a behavior's Dashboard, while the animation updates in the canvas. In yet another embodiment, parameters for a behavior may be accessed in an Inspector.
  • In one embodiment, Text Animation behaviors, like other types of behaviors, can be converted to keyframes in order to fine tune the animation. In one embodiment, using Behaviors is not required to animate text; instead, text can be animated using traditional keyframing or a combination of keyframing and behaviors. In another embodiment, although both keyframes and behaviors can be applied to an object, some thought must be given to the desired effect, since this workflow can defeat the purpose of Behaviors, as well as yield unexpected results.
  • In one embodiment, once a text treatment has been created (e.g., a customized text behavior, a combination of behaviors, or a keyframed animation), the animation can be saved to a Library for use on another text object or a future project.
  • In one embodiment, text objects have unique attributes, such as face and outline, and the ability to change fonts or edit the text of an existing, animated text object.
  • iii. Creating Text
  • In one embodiment, text can be created directly in a canvas using a Text tool. In one embodiment, once text has been added to a project, behaviors and filters can be applied to a text object.
  • a. Creating Text in a Project
  • In one embodiment, text can be added to a project in a Canvas. In one embodiment, when text is created, a text object is created at the first frame of a project and exists for the duration of the project. In another embodiment, for example, if a text object is added to a 900-frame project, the duration of the text object is 900 frames. In yet another embodiment, to shorten the duration of a text object, shorten the text object in a Timeline.
  • In one embodiment, to add text to a project in a Canvas:
      • In one embodiment, in a Layers list, select the layer to which the text will be added. In one embodiment, if no layer is selected, a new layer is created that contains the text object.
      • In one embodiment, in a Toolbar, click a Text tool (or press T).
      • In one embodiment, click in the Canvas. In one embodiment, the cursor flashes in the Canvas. In another embodiment, a “blank” text object is added to a Layers tab and a Timeline before any text has been entered.
      • In one embodiment, type the text. In one embodiment, a text object appears in the Canvas. In another embodiment, the name of the text object in the Layers tab, Timeline, and Dashboard is based on the entered text. In yet another embodiment, by default, the text Layout is set to Type. In one embodiment, the Type layout option creates no margin, so if a long string of text is entered, it extends on a single line beyond the canvas until a manual line break is created (e.g., by pressing Return). In another embodiment, this mode is useful for working with short text objects and panning text across the canvas.
  • In one embodiment, when done typing, press Esc or select another tool. In one embodiment, once text has been typed, press Esc or select another tool on the toolbar before using a hot key. In another embodiment, when a Text tool is selected, the mode is text-entry mode, so pressing S will add an “S” to the text rather than change to the Select tool. In one embodiment, a Dashboard for the new text object is displayed. In one embodiment, if no Dashboard is present, press D to display the text object Dashboard.
  • Using a Text Dashboard—In one embodiment, a text Dashboard contains some of the most commonly-adjusted text parameters, such as opacity, type family, and color. In one embodiment, text parameters in a text Dashboard include:
      • Opacity: In one embodiment, by default, the Opacity of a text object is set to 100 percent. In one embodiment, use a slider to change the Opacity value.
      • Blend Mode: In one embodiment, by default, the Blend Mode of a text object is set to Normal. In one embodiment, click a Blend Mode pop-up menu to choose another mode for the selected text object. In another embodiment, because a text object is similar to other object types, a Properties tab (and Layers tab) also contains controls to change the blend mode of the text object. In yet another embodiment, when the blend mode of a text object is changed in the Text tab of the Inspector, the blend mode is also changed in the Properties tab (and vice-versa).
      • Family: In one embodiment, by default, a text object's font family is set to Geneva. In one embodiment, to change the font of the selected text object, click an arrow and choose a font from the pop-up menu.
      • Typeface: In one embodiment, click an arrow to choose the type style, such as Bold, Italic, etc. In one embodiment, the available typefaces are specific to the selected font family.
      • Color: In one embodiment, a text object's color is white by default. In one embodiment, click a color well to display a Colors window and choose another color for the selected text object.
      • Size: In one embodiment, a text object is created at 48 points by default. In one embodiment, to change the point size of a text object, drag a Size slider. In another embodiment, to change font size in single-point increments, press Option and drag the slider. In yet another embodiment, text Size sliders (in the Dashboard and in the Inspector) are limited to 288 points. In one embodiment, to set the text to a larger point size, type a number in the Size field in the Text Inspector>Format pane.
      • Tracking: In one embodiment, tracking is set to 0 by default. In one embodiment, to change the Tracking value of a text object, drag a Tracking slider left (for a negative tracking value) or right (for a positive tracking value). In another embodiment, text Tracking sliders (in the Dashboard and in the Inspector) are limited to 100 points. In yet another embodiment, to set a larger tracking value, type a number in the Tracking field in the Text Inspector>Format pane.
      • Line Spacing: In one embodiment, when working with multiple lines of text, drag a slider to change the Line Spacing value.
      • Alignment: In one embodiment, text alignment is set to Left by default. In one embodiment, to change alignment, click an Alignment pop-up menu and choose Right or Center.
  • More Text Parameters—In one embodiment, text parameters (including those in the Text Dashboard) are located in the Text tab in the Inspector. In one embodiment, to display the Text tab of the Inspector, select the text object and click the “i” button on the Dashboard (or press Command+3). In another embodiment, the Inspector contains text parameters divided into three tabs: Format, Style, and Paragraph.
  • b. Adding Behaviors and Filters to Text Objects
  • In one embodiment, filters and behaviors are applied to text objects in the same manner as they are applied to other object types. This section provides a quick start to applying Behaviors and Filters to text objects.
  • In one embodiment, to apply a behavior or filter to a text object, do one of the following:
      • In one embodiment, to apply a behavior to a text object, drag a behavior from Library>Behaviors>Behavior category>behavior name, or use the Behaviors file menu. In one embodiment, the text object Dashboard is replaced with the behavior Dashboard.
      • In one embodiment, to apply a filter to a text object, drag a filter from Library>Filters>Filter category>filter name, or use the Filters file menu.
  • Using the Dashboards—In one embodiment, a Dashboard can be displayed for any object. In one embodiment, to display a Dashboard, select the object and press H. In another embodiment, the Dashboard that is displayed represents the currently selected object. In yet another embodiment, the parameters contained in a Dashboard depend on the type of object that it represents. In one embodiment, for example, a text object Dashboard displays text-specific parameters, such as Typeface and Line Spacing. In another embodiment, a particle emitter Dashboard displays particle-specific controls, such as Particles per Second and Lifetime.
  • In one embodiment, as effects (e.g., behaviors and filters) are added to an object, the displayed Dashboard changes to the most recently added effect. In one embodiment, the Dashboard name is displayed on the top bar of the Dashboard window. In another embodiment, to cycle through the Dashboards for an object, press H. In yet another embodiment, the Dashboards cycle in the order that the effects are applied.
  • In one embodiment, to jump to a specific Dashboard, click a disclosure triangle next to the Dashboard name and select a Dashboard from the list.
  • iv. Fonts
  • In one embodiment, any supported font may be used. In one embodiment, supported fonts include OpenType, Type1 (or PostScript), TrueType, and LiveType.
  • In one embodiment, to preview the available fonts:
      • In one embodiment, in the Library, click the Fonts category and then select a font subcategory. In one embodiment, the fonts appear in the stack. In another embodiment, to view thumbnails of the fonts, set the Library to icon view.
      • In one embodiment, in the stack, select a font. In one embodiment, the font is previewed in the Preview window. In another embodiment, clicking a font causes it to appear in the font browser.
  • a. Using the Font Browser
  • In one embodiment, a Library includes a font browser that allows a user to preview fonts, select fonts, or apply a font to an existing text object. In one embodiment, to access the font browser, click the Library tab and then click the Fonts category. In another embodiment, when working with a text object, fonts can be browsed using the Browse button in the Format panel of the Text Inspector.
  • In one embodiment, to use the font browser, do one of the following.
  • In one embodiment, to preview a font:
      • In one embodiment, in the Library, click the Fonts category.
      • In one embodiment, click the font sub-category to use. In one embodiment, if in list view, the font list appears in the font stack. In another embodiment, if in icon view, the font thumbnails appear in the font stack.
      • In one embodiment, in the stack, click a font thumbnail or name. In one embodiment, the font is displayed in the Preview window. In another embodiment, use the scroll bar on the left side of the browser to scrub through fonts in alphabetical order.
  • In one embodiment, to select a font:
      • In one embodiment, follow steps 1-3, above.
      • In one embodiment, create a text object. In one embodiment, the selected font is applied to the text object.
  • In one embodiment, to change the font of an existing text object:
      • In one embodiment, in the canvas or Layers list, select the text object.
      • In one embodiment, in the Library, click the Fonts category.
      • In one embodiment, click the font sub-category to use.
      • In one embodiment, in the stack, click a font.
      • In one embodiment, drag the font onto the text object in the Canvas. In one embodiment, the text object is changed to the selected font.
  • b. Navigating the Font Browser Lists
  • In one embodiment, to quickly locate a font by its name in the font stack, type the first letter or first few letters of the font name in the browser.
  • In one embodiment, to select a font by the first letter of its name:
      • In one embodiment, click in the font stack (on a font name or thumbnail).
      • In one embodiment, type the first letter of the font name. In one embodiment, the first font that begins with that letter is selected in the stack.
  • In another embodiment, it is also possible to quickly type the first few letters of the font name to select the font. In one embodiment, to select a font by the first few letters of its name:
  • In one embodiment, click in the font stack (on a font name or thumbnail).
  • In one embodiment, quickly type the first two letters of the font name. In one embodiment, if the second letter of the font name is not typed quickly, the selection is reset and jumps to the font whose name begins with the second letter entered.
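  • The timing rule described above, where a quickly typed second letter extends the search prefix while a slow one restarts it, is a standard type-ahead pattern. The Python sketch below shows one hypothetical way to implement it; the class name and the half-second timeout are assumptions.

    import time

    class FontTypeAhead:
        # Hypothetical type-ahead matcher for the font stack: letters typed
        # within `timeout` seconds accumulate into one search prefix.
        def __init__(self, fonts, timeout=0.5):
            self.fonts = sorted(fonts)
            self.timeout = timeout
            self.prefix = ""
            self.last = 0.0

        def key(self, letter, now=None):
            now = time.monotonic() if now is None else now
            if now - self.last > self.timeout:
                self.prefix = ""                 # typed too slowly: start over
            self.prefix += letter.lower()
            self.last = now
            return next((f for f in self.fonts
                         if f.lower().startswith(self.prefix)), None)

    stack = FontTypeAhead(["Arial", "Avenir", "Baskerville", "Geneva"])
    assert stack.key("a", now=0.0) == "Arial"
    assert stack.key("v", now=0.1) == "Avenir"   # quick "v": prefix is "av"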
  • v. Text Tools
  • In one embodiment, text becomes a text object when created. In one embodiment, a text object is like any other object type, with one exception. In another embodiment, text object properties can be animated, and behaviors and filters can be applied to text objects, similar to other object types. In yet another embodiment, however, text-specific parameters can be animated and Text Behaviors can be applied, unlike with other object types.
  • In one embodiment, behaviors and filters aside, there are two ways to edit a text object: (1) as an object using the object parameters in the Inspector>Properties tab (or the onscreen controls); and (2) as text using Text parameters in the Inspector>Text tab. This section discusses the tools that can be used with text objects, according to one embodiment of the invention.
  • In one embodiment, the following interface tools may be used to edit text:
      • Toolbar
      • Text Dashboards
      • Text Inspector
  • In one embodiment, when a text object is selected, the standard onscreen controls can be used to move and animate the text object in the Canvas.
  • a. Text Tool and the Toolbar
  • In one embodiment, a Text tool is located in a Toolbar above a Canvas. In one embodiment, in addition to the Text tool, other tools may be used with text objects, such as a Magnify tool, Grab tool, and Selection tool. In one embodiment, the Toolbar layout can be customized.
  • Text Tool—In one embodiment, a Text Tool may be used to create, select, and edit text. In one embodiment, to add text, do one of the following:
      • In one embodiment, select the Text tool (or press T), click in the canvas, and begin typing.
      • In one embodiment, select the Text tool, and click and drag in the Canvas to draw a custom margin. In one embodiment, once the margin is drawn, release the mouse and begin typing.
      • In one embodiment, select the Text tool, click in the canvas, and type in the Text Editor. In one embodiment, the Text Editor is located in the Format pane of the Text Inspector.
  • In one embodiment, to select text characters, do one of the following:
      • In one embodiment, select the Text tool, then click and drag across the text to select it.
      • In one embodiment, position the cursor in between two characters, press Shift, and press the Right Arrow key to add characters to the right of the cursor to the selection, or press the Left Arrow key to add characters to the left of the cursor to the selection.
  • Selection Tool—In one embodiment, a Selection Tool may be used to select or deselect one or more objects. In one embodiment, once a text object has been created, click the Selection tool (or press Esc) to select the text object. In another embodiment, once a text object has been selected, the object's Dashboard may be displayed (press H), or the object's Inspector may be displayed (press I). In yet another embodiment, when the Select tool is selected, double-clicking a text object automatically enters text editing mode.
  • Magnify Tool—In one embodiment, a Magnify Tool zooms in or out of the canvas. In one embodiment, to zoom in, click the Magnify tool, click in the canvas, and drag to the right. In another embodiment, to zoom out, drag to the left. In yet another embodiment, the zoom is based around the position of the cursor in the canvas.
  • Grab Tool—In one embodiment, a Grab Tool moves the image within the canvas. In one embodiment, to reposition the canvas, click the Grab tool, click in the canvas, and drag.
  • vi. Editing Text in the Inspector
  • In one embodiment, text controls are located in the Text tab of the Inspector. In one embodiment, the Text tab is divided into three panes: Format, Style, and Paragraph. In another embodiment, the Format pane contains text basics, such as font, size, and tracking. In yet another embodiment, text characteristics such as face, outline, and blur are controlled in the Style pane. In one embodiment, the Paragraph pane contains text layout controls, such as margins and justification.
  • vii. Editing Text Format
  • In one embodiment, the Text Format panel contains the controls for text basics, such as font, typeface, size, kerning, and character rotation. In one embodiment, most of the Format parameters can be animated (keyframed), including the font family. In another embodiment, if a parameter can be animated, the Animation menu icon appears next to the parameter in the Inspector.
  • In one embodiment, to show the Text Format panel, click the Text tab in the Inspector and click the Format button.
  • a. Text Format Controls
  • The following section describes the Text Format parameters, according to one embodiment of the invention.
      • Font Type—In one embodiment, two tools are provided for font filtering and selection:
        • In one embodiment, a first pop-up menu filters which fonts appear in the Family list (see below). In one embodiment, the menu can show, for example, All Fonts, System Fonts, LiveFonts (LiveType), or Favorites. In one embodiment, fonts are displayed in alphabetical order.
        • In one embodiment, a font browser.
      • Collection
      • Family—In one embodiment, the font family (the set of characters, letters, and symbols of a single typeface) of a text object may be set. In one embodiment, typing the first letter or few letters of a type family name into the Family text field jumps to that font.
  • In one embodiment, to preview different font families for a text object in the canvas:
      • In one embodiment, select the text object.
      • In one embodiment, in the Text Format pane, click the Family list arrow. In one embodiment, the font family list appears.
      • In one embodiment, click and hold the cursor in the font list, and scrub up or down to select a font. In one embodiment, as the font family list is scrubbed through, the text changes in the Canvas to the currently selected font family.
      • In one embodiment, once the font has been selected, release the mouse. In one embodiment, the scroll bar can be used to move up and down the font list.
      • Typeface—In one embodiment, the type style, such as Bold, Condensed, etc., may be set. In one embodiment, the available typefaces are specific to the selected font family.
      • Size—In one embodiment, the size of the type may be set. In one embodiment, the size may be set by entering a value in the value field or using a slider. In another embodiment, the text may be scaled in the Canvas using onscreen controls; however, this scaling is independent of setting type point size in the Format controls. In yet another embodiment, to change font size, drag the Size slider left or right. In one embodiment, to change font sizes in single-point increments, press Option and drag the slider. In another embodiment, the slider value in the Dashboard and the slider value in the Inspector are limited to 288. In yet another embodiment, larger text can be created by typing a value in the Size value slider in the Inspector.
      • Tracking—In one embodiment, Tracking determines the spacing between the characters of a text object. In one embodiment, Tracking applies a uniform value between each character.
      • Kerning—In one embodiment, Kerning is used to adjust the spacing between individual characters of a text object.
      • In one embodiment, to kern individual characters in a text object:
      • In one embodiment, select the text object.
      • In one embodiment, click the Text tool (or press T).
      • In one embodiment, in the canvas, position the cursor in between the characters to kern, and do one of the following:
        • In one embodiment, use the Kerning slider or value field to set a specific kerning value.
        • In one embodiment, press Opt+Right Arrow to increase the space between the characters by one-pixel increments.
        • In one embodiment, press Opt+Left Arrow to reduce the space between the characters by one-pixel increments.
          In one embodiment, once the cursor is positioned in between the adjacent characters to kern, use the Right Arrow and Left Arrow keys to move between the characters. In one embodiment, if there are multiple lines of text (with a single text object), use the Up Arrow and Down Arrow keys to move the cursor between the lines of text.
      • Baseline—In one embodiment, Baseline adjusts the baseline of text characters. In one embodiment, a baseline is a horizontal “line” to which the bottom of characters is aligned.
      • Slant—In one embodiment, Slant simulates italics by adding a slant value to the characters of a text object.
      • Character Scale—In one embodiment, Character Scale scales the characters of the text object either proportionately, in X, or in Y. In one embodiment, to scale in only X or Y, click the disclosure triangle to enter separate X and Y scale values.
      • Offset—In one embodiment, Offset offsets the text from the text bounding box. In one embodiment, enter a value in the Offset field to proportionally edit the X and Y offsets, or click the disclosure triangle to enter separate X and Y position values.
      • Character Rotate—In one embodiment, Character Rotate rotates each text character around its base. In one embodiment, click and drag on the dial or enter a value in the value field to rotate the text characters.
      • Monospace—In one embodiment, when enabled, monospace applies a fixed amount of space between each text character.
      • All Caps—In one embodiment, All Caps sets all text characters to upper-case.
      • All Caps Size—In one embodiment, when All Caps is enabled, All Caps Size sets the size of the upper-case characters based on a percentage of the font point size.
  • In one embodiment, the following Format parameters appear in the text Dashboard: Family, Typeface, Size, and Tracking.
  • Using the Text Editor—In one embodiment, a Text Editor is an additional tool that allows text to be added and edited in the Inspector rather than the Canvas. In one embodiment, the Text Editor is useful when working with large amounts of text.
  • In one embodiment, to add or change text in the Text Inspector:
      • In one embodiment, in the Layers list, select the layer to which to add text.
      • In one embodiment, select the Text tool (e.g., click the tool or press T).
      • In one embodiment, click in the Canvas. In one embodiment, the cursor flashes in the Canvas.
      • In one embodiment, in the Text Inspector, click the Format pane.
      • In one embodiment, click in the Text Editor (e.g., in the lower portion of the Format pane), and begin typing. In one embodiment, when text is entered in the Text Editor, margins are automatically set based on project safe zones.
  • In one embodiment, the Text Editor can also be used to edit text objects in projects.
  • In one embodiment, to edit existing text using the Text Editor:
      • In one embodiment, in either the Layers List or the Canvas, select the text object to be edited.
      • In one embodiment, in the Text Inspector, click the Format pane. In one embodiment, the selected text appears in the Text Editor.
      • In one embodiment, make changes in the Text Editor.
  • viii. Editing Text Style
  • In one embodiment, use a Text Style pane to specify the fill of a text object and to adjust its opacity and softness. In one embodiment, a text object can be a solid color, an image, or a color gradient. In another embodiment, most of the style parameters can be animated. In yet another embodiment, outlines, glows, and drop shadows can be created for a text object in the Style pane.
  • In one embodiment, predefined Text Styles may be used in a project. In one embodiment, Text Styles use parameters in the Text Style pane to create a specific “look” for a text object. In another embodiment, for example, one style is a yellow-to-orange gradient with a soft white outline. In yet another embodiment, these styles are located in a Library. In one embodiment, to show the Text Style pane, click the Text tab in the Inspector and click Style.
  • In one embodiment, there are four main groups of controls in the Style pane: Face, Outline, Glow, and Drop Shadow. In one embodiment, a style can be enabled or disabled for a text object. In one embodiment, by default, Outline, Glow, and Drop Shadow are disabled.
  • a. Text Face Controls
  • In one embodiment, Text Face controls are used to specify whether the text is a solid color, a texture, or a color gradient. In one embodiment, the following Face parameters are available:
      • Enable—In one embodiment, the Enable parameter enables and disables the face of the text object. In one embodiment, the Face is enabled by default.
      • Fill with—In one embodiment, clicking the Fill with pop-up menu sets the fill for the text object. In one embodiment, the fill options are Color, Gradient, and Texture.
      • Color—In one embodiment, clicking the color box selects a color for the text object from the Colors window. In one embodiment, the individual Red, Green, Blue, and Alpha values for a text object can be adjusted by clicking the Color disclosure triangle.
      • Opacity—In one embodiment, the Opacity parameter sets the opacity of the text object. In one embodiment, the opacity is applied to the Color, Texture, and Gradient options.
      • Blur—In one embodiment, the Blur parameter sets the softness of the text object. In one embodiment, the blur is applied to the Color, Texture, and Gradient options.
  • Changing the Text Color—In one embodiment, to change the color of a text object, use the color picker in the text object Dashboard or in the Inspector. In one embodiment, to adjust individual color channels, use the Text Inspector.
  • In one embodiment, to set the text color in the Dashboard:
      • In one embodiment, select the text object.
      • In one embodiment, if the Dashboard is not displayed, press H.
      • In one embodiment, click the color picker and use the Colors window to set the text color. In one embodiment, the text object is dynamically updated as the color is selected. In another embodiment, once a Color has been set, that color becomes the default color for all new text objects added to a project. In yet another embodiment, to select a color from the Canvas (or anything on a computer's desktop), click the Color Picker/magnification tool in the Colors window, position the tool over the color, and click. In one embodiment, the Colors window is the color picker for the operating system.
  • In one embodiment, to set the text object color in the Inspector:
      • In one embodiment, select the text object.
      • In one embodiment, in the Inspector (I), click the Text tab.
      • In one embodiment, click Style.
      • In one embodiment, in the Fill with pop-up menu, ensure Color is selected.
      • In one embodiment, click the color picker and use the Colors window to set the text color.
  • In one embodiment, to adjust an individual color channel:
      • In one embodiment, in the Inspector, click the Color disclosure triangle to show the channel parameters.
      • In one embodiment, use the sliders or value fields to adjust the value of a color channel, or the alpha value of the text object. In one embodiment, text object colors can be animated.
  • Applying a Gradient to a Text Object—In one embodiment, in the Inspector, gradient fills for text objects can be created and animated. In one embodiment, the gradient controls for a text object are similar to the gradient controls for a shape or particle object. In another embodiment, a gradient preset can be applied to a text object. In yet another embodiment, the gradient presets are located in a Library. In one embodiment, a gradient that has been created can be saved to the Library for use in a current project or future projects.
  • In one embodiment, to create a text object gradient:
      • In one embodiment, select the text object.
      • In one embodiment, in the Inspector (I), click the Text tab.
      • In one embodiment, click Style.
      • In one embodiment, in the Fill with pop-up menu, select Gradient. In one embodiment, the Color controls are replaced with the gradient controls, and the gradient is applied to the selected text object. In another embodiment, the gradient is set to two colors: red and yellow, by default.
  • In one embodiment, to apply a gradient preset to a text object:
      • In one embodiment, follow steps 1-4, above.
      • In one embodiment, click the preset button (located next to the gradient display), and select a preset. In one embodiment, once a gradient preset is applied to a text object, the preset can be edited. In another embodiment, a user can preview a gradient preset.
  • Using the Gradient Editor—In one embodiment, a Gradient Editor can be used to change the color, color position, number of colors, opacity, and direction of a gradient. In one embodiment, the color and opacity of a gradient can be animated. A sketch of how color tags are interpolated appears after the procedures below.
  • In one embodiment, the following sections assume that a text object is selected, and the Gradient option is selected from the “Fill with” pop-up menu in the Face controls.
  • In one embodiment, to change gradient colors:
      • In one embodiment, click the Gradient disclosure triangle to show the Gradient Editor. In one embodiment, the Gradient editor includes opacity controls (bar and tags), a gradient rep bar, a gradient editing bar, gradient tags, a color bar, and color position carets.
      • In one embodiment, to change the color of a gradient tag, do one of the following:
        • In one embodiment, double-click a gradient color tag. In one embodiment, the Colors window appears. Use the Colors window to set a new color for the tag.
        • In one embodiment, click a gradient color tag. In one embodiment, the color controls for that tag are enabled. In another embodiment, in the Color controls, either click the color picker to show the Colors window, or use the individual color channel controls to set a new color for the tag.
  • In one embodiment, to move the position of a color tag:
      • In one embodiment, click the color tag to move.
      • In one embodiment, do one of the following:
        • In one embodiment, drag the color tag left to right.
        • In one embodiment, in the Location parameter, use the slider or value field to enter a specific value. In one embodiment, a value of 100 percent is the right-most position of the gradient, and a value of 0 percent is the left-most position of the gradient.
  • In one embodiment, to change the spread of a gradient color, click and drag the triangle between the color tags. In one embodiment, the closer the triangle is to a color tag, the sharper the gradient.
  • In one embodiment, to add a color to a gradient, place the cursor in the lower gradient bar in the position to add the new color, and click. In one embodiment, a new color tag is added to the gradient. In another embodiment, the color of the new color tag is based on the last selected color in the color picker.
  • In one embodiment, although the colors and opacity of a gradient can be animated, the number of color and opacity tags cannot.
  • In one embodiment, to remove a color from a gradient, click and drag the color tag away from the gradient bar. In one embodiment, the color tag is removed.
  • In one embodiment, to change the opacity of a gradient color:
      • In one embodiment, in the opacity bar of the gradient Editor, click an opacity tag. In one embodiment, the Opacity controls are enabled.
      • In one embodiment, use the slider or value field to change the value of the Opacity. In one embodiment, the gradient opacity is applied to the area of a gradient, not to a specific color tag.
  • In one embodiment, the controls to move, change the spread, add, or remove an opacity tag are similar to those of the color tags.
  • In one embodiment, to reverse the gradient color or transparency direction, click the Change Tags button next to the opacity or lower gradient bars.
  • In one embodiment, to evenly distribute the gradient color or transparency tags, click the Divide Tags button next to the opacity or lower gradient bars.
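  • By way of illustration only, the mapping from color tags to interpolated colors can be sketched in Python as follows (the function and data names here are hypothetical, not the application's API, and linear interpolation between adjacent tags is an assumption):

        def lerp(c0, c1, t):
            # Blend two RGB colors component by component.
            return tuple(a + (b - a) * t for a, b in zip(c0, c1))

        def evaluate_gradient(tags, location):
            # tags: list of (location 0-100, (r, g, b)) pairs, one per
            # color tag; location: point along the gradient, where 0 is
            # the left-most position and 100 the right-most.
            tags = sorted(tags)
            if location <= tags[0][0]:
                return tags[0][1]
            if location >= tags[-1][0]:
                return tags[-1][1]
            for (l0, c0), (l1, c1) in zip(tags, tags[1:]):
                if l0 <= location <= l1:
                    return lerp(c0, c1, (location - l0) / (l1 - l0))

        # The default two-tag gradient: red at 0 percent, yellow at 100.
        default = [(0, (1.0, 0.0, 0.0)), (100, (1.0, 1.0, 0.0))]
        print(evaluate_gradient(default, 50))  # (1.0, 0.5, 0.0): orange

    In this sketch, dragging a spread triangle toward a tag would correspond to biasing the interpolation parameter toward that tag, which is what makes the transition sharper.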
  • Using a Texture—In one embodiment, an object (image, clip, or shape) can be used as the fill for a text object with the Texture option in the Face controls of the Inspector.
  • In one embodiment, to apply a texture to a text object:
      • In one embodiment, select the text object.
      • In one embodiment, in the Inspector (I), click the Text tab.
      • In one embodiment, click Style.
      • In one embodiment, in the Fill with pop-up menu, select Texture. In one embodiment, the Color controls are replaced with the Texture controls.
      • In one embodiment, click the Texture disclosure triangle. In one embodiment, by default, no texture is applied to the text object.
      • In one embodiment, in the Layers or Media List (of the Project Panel), click and drag the image to use for the texture to the Input Image well. In one embodiment, the image appears in the well and is applied to the text object. In another embodiment, when selecting an image to put into the Input Image well, click and drag in one movement. In yet another embodiment, if the object is clicked on and the mouse is released, that object is selected and the relative Inspector appears. In one embodiment, this also applies to the Input Image well for masks.
  • Applying a Texture to a Character vs. Applying a Texture to a Text Object—In one embodiment, when an image (or object) is applied as the texture for a text object, the texture is applied to each character in the text object individually. In one embodiment, to use the image as a continual texture throughout a text object, use the text as a mask.
      • In one embodiment, to use a text object as a mask:
      • In one embodiment, in the Layers List or Canvas, select the object or layer to use as the texture.
      • In one embodiment, in the Inspector (I), click the Properties tab.
      • In one embodiment, in the Layers list, click and drag the text object to use as a mask to the Input Image well in the Mask controls of the Properties tab.
        In one embodiment, the text object masks the image.
  • In one embodiment, to replace a texture:
      • In one embodiment, select the text object and display the expanded Texture controls.
      • In one embodiment, in the Layers or Media List, click and drag the image to use to replace the existing texture to the Input Image well. In one embodiment, the new image appears in the well and is applied to the text object. In one embodiment, when footage is replaced that is linked to a text object as a texture (or any object as a mask) in the Layers or Media lists, the texture is replaced for the text object with the new footage.
  • Editing a Texture—In one embodiment, the position of a texture that is applied to a text object can be adjusted using Image Offset in the Texture controls. In one embodiment, if the image used as the texture is offset and is cut off in a text object, the edge behavior of the texture can be specified. In another embodiment, if an image sequence is being used, certain frames can be specified to use as the texture.
  • In one embodiment, to change the position of a texture, do one of the following in the Texture controls:
      • In one embodiment, press Command and click and drag in the Input Image well. In one embodiment, the image moves in the well and is offset in the text object in the Canvas.
      • In one embodiment, adjust the Image Offset values. In one embodiment, click the disclosure triangle to independently adjust the X and Y position values of the input texture.
  • Wrap Mode—In one embodiment, use the Wrap Mode controls to specify how the edge of a texture is treated when the texture is offset and appears cut off in the text object. A sketch of the three modes follows this list.
      • Clamp—In one embodiment, as the default wrap mode, the texture remains transparent beyond the edge of the source image.
      • Repeat—In one embodiment, similar to tiling behavior, the texture source is repeated beyond the edge of the source image.
      • Mirror—In one embodiment, beyond the edge of the source image, the texture source is reflected like in a mirror.
      • Frame—In one embodiment, use the Frame field to specify a frame or timecode value of the frame to use as the texture.
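  • As a minimal sketch (hypothetical Python, not the application's API), the three wrap modes can be expressed as rules for resolving a normalized texture coordinate that falls outside the 0.0-1.0 range of the source image:

        def wrap_coordinate(u, mode):
            # u: normalized coordinate; values outside 0.0-1.0 fall
            # beyond the edge of the source image.
            if mode == "clamp":
                # Transparent beyond the edge (the default wrap mode).
                return u if 0.0 <= u <= 1.0 else None
            if mode == "repeat":
                # Tile the source image, similar to tiling behavior.
                return u % 1.0
            if mode == "mirror":
                # Reflect the source image beyond the edge.
                u = u % 2.0
                return 2.0 - u if u > 1.0 else u

        for mode in ("clamp", "repeat", "mirror"):
            print(mode, wrap_coordinate(1.25, mode))
        # clamp None (transparent), repeat 0.25, mirror 0.75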
  • Lock/Unlock—In one embodiment, use Lock to use only the frame specified in the Frame field as the texture for all frames of a project. In one embodiment, unlock the Frame field to use the sequence of images as the texture.
  • Animating a Texture—In one embodiment, keyframes can be set for the offset values of the texture source to create a moving element within a text object; a sketch of the idea follows the steps below.
  • In one embodiment, to animate the texture offset:
      • In one embodiment, go to the frame where the texture animation will begin.
      • In one embodiment, click the Animate button. In one embodiment, keyframing is enabled. In another embodiment, when Animate is enabled, any changes made to a project are keyframed.
      • In one embodiment, to position the texture, do one of the following:
        • In one embodiment, press Command and click and drag in the Input Image well.
        • In one embodiment, use Image Offset sliders or value fields to enter an offset value.
      • In one embodiment, go to the next frame where a keyframe will be set.
      • In one embodiment, move the texture to the new position.
      • In one embodiment, go to frame 1 (or the start frame of the animation) and play the clip. In one embodiment, the texture offset is animated.
      • In one embodiment, click Animate again to disable keyframing.
        In one embodiment, the Animation Menu in the Inspector can also be used to set keyframes.
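  • The record-on-change idea behind the Animate button can be modeled in a few lines (an illustrative Python sketch; the class and method names are hypothetical): while recording is enabled, any change to the offset at the current frame is stored as a keyframe.

        class TextureOffsetChannel:
            def __init__(self):
                self.keyframes = {}     # frame -> (x, y) offset
                self.recording = False  # state of the Animate button

            def set_offset(self, frame, x, y):
                # Changing a value while recording creates a keyframe.
                if self.recording:
                    self.keyframes[frame] = (x, y)

        channel = TextureOffsetChannel()
        channel.recording = True           # click Animate
        channel.set_offset(1, 0.0, 0.0)    # offset at the start frame
        channel.set_offset(30, 50.0, 0.0)  # texture moved at a later frame
        channel.recording = False          # click Animate again
        print(sorted(channel.keyframes.items()))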
  • Using an Object with Applied Behaviors and Filters as a Texture Source—In one embodiment, an object (image, clip, shape, or layer) that has applied behaviors and filters can be used as the texture source for a text object. In one embodiment, if the object has applied, active filters, the result of the filters is included in the texture source; i.e., the effect of the filters can be seen in the texture. In another embodiment, if the object has applied, active behaviors or transforms, the behaviors and transforms are ignored. In yet another embodiment, only the image appears as the texture. In one embodiment, use the following guidelines when using objects as texture sources.
  • In one embodiment, when using an object with an applied filter as a texture source:
      • In one embodiment, to use the object with the effect of the filter, use the steps similar to those given for applying a texture to a text object.
      • In one embodiment, if the object is an image or image sequence, use the object without the effect of the filters by dragging the image from the Media tab to the Texture Input Image well, rather than from the Layers List.
      • In one embodiment, to use an object without the effect of the applied filter, make a copy of the object in the Layers List, remove the filters from the object, and turn the object off. In one embodiment, the object can then be dragged from the Layers List to the input well.
  • In one embodiment, when using an object with an applied behavior or active transforms (e.g., rotate) as a texture source, use the steps similar to those given for applying a texture to a text object. In one embodiment, the effects of the behavior or transforms are ignored.
  • Changing the Text Opacity—In one embodiment, the Opacity slider or value field in the Dashboard or in the Inspector can be used to adjust the opacity of a text object.
  • In one embodiment, to set the text object opacity in the Dashboard:
      • In one embodiment, select the text object.
      • In one embodiment, press H to display the Dashboard. In one embodiment, the Opacity controls are located at the top of the Dashboard.
      • In one embodiment, click and drag the Opacity slider, or enter an opacity value in the field. In one embodiment, the text object opacity is dynamically updated as the slider is dragged.
  • In one embodiment, to set the opacity in the Inspector:
      • In one embodiment, select the text object.
      • In one embodiment, in the Inspector (I), click the Text tab.
      • In one embodiment, click Style.
      • In one embodiment, in the Face controls, click and drag the Opacity slider, or enter an opacity value in the field.
  • In one embodiment, because a text object is like objects of other types, its opacity can be adjusted in the Properties tab. In one embodiment, the changes are multiplicative. In another embodiment, in other words, if the Opacity of a text object is set in the Text Style parameters to 50 percent, the opacity of the text object is 50 percent. In yet another embodiment, if the Opacity in the Properties tab is then set to 50 percent, the opacity of the text object is 25 percent.
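  • The multiplicative combination described above can be checked with simple arithmetic (a sketch, not application code):

        style_opacity = 0.50       # Opacity in the Text Style parameters
        properties_opacity = 0.50  # Opacity in the Properties tab
        effective = style_opacity * properties_opacity
        print(effective)  # 0.25: the text object is 25 percent opaque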
  • Setting the Text Blur—In one embodiment, use the Blur parameter to adjust the softness of the text object.
  • In one embodiment, to adjust the softness in the Inspector:
      • In one embodiment, select the text object.
      • In one embodiment, in the Inspector (I), click the Text tab.
      • In one embodiment, click Style.
      • In one embodiment, in the Face controls, click and drag the Blur slider, or enter a blur value in the field. In one embodiment, the text object softness is dynamically updated as the slider is dragged.
  • Text Outline Controls—In one embodiment, use the Outline controls in the Style pane to create text object outlines. In one embodiment, the color, opacity, softness, width, and fill of the outline can be changed.
      • Enable—In one embodiment, Enable enables and disables the text outline.
      • Fill with—In one embodiment, Fill with sets the fill for the text outline. In one embodiment, as with the Face controls, the outline fill can be set to Color, Gradient, or Texture.
      • Color—In one embodiment, Color sets the color for the text outline. In one embodiment, click the Color disclosure triangle to adjust the individual Red, Green, Blue, and Alpha values of the outline.
      • Opacity—In one embodiment, Opacity sets the opacity of the text outline.
      • Blur—In one embodiment, Blur sets the softness of the text outline.
      • Width—In one embodiment, Width sets the thickness of the outline.
      • Priority/Over/Under—In one embodiment, Priority/Over/Under specifies whether the outline is drawn over or under the text object face.
  • Adding a Text Outline—In one embodiment, to create a text outline, enable the Outline parameter in the Style pane of the Text Inspector.
  • In one embodiment, to create an outline for a text object:
      • In one embodiment, select the text object.
      • In one embodiment, in the Inspector (I), click the Text tab.
      • In one embodiment, click Style.
      • In one embodiment, in the Outline controls, turn on Outline.
        In one embodiment, the outline only of a text object may be displayed by turning off the Face parameters.
  • Editing Text Object Outlines—In one embodiment, use the Outline controls to soften the opacity or blur of a text outline, change the width of an outline, or to set and edit the fill of an outline.
  • In one embodiment, to change the color of an outline, click the color picker and select a color from the Colors window.
  • In one embodiment, to adjust the opacity of an outline, use the Opacity slider or the value field to change the opacity of the outline.
  • In one embodiment, to adjust the blur of a text outline, use the Blur slider or the value field to change the blur of the outline.
  • In one embodiment, to change the width of a text outline, use the Width slider or the value field to change the width of the outline.
  • In one embodiment, the Outline fill controls are similar to the controls for the Face parameters.
  • Text Glow Controls—In one embodiment, use the Glow controls to create a glow around a text object.
      • Enable—In one embodiment, Enable enables and disables the text glow.
      • Fill with—In one embodiment, Fill with sets the fill for the text glow. In one embodiment, as with the Face and Outline controls, the glow fill can be set to Color, Gradient, or Texture.
      • Color—In one embodiment, Color sets the color for the text glow. In one embodiment, click the Color disclosure triangle to adjust the individual Red, Green, Blue, and Alpha values of the glow.
      • Opacity—In one embodiment, Opacity sets the opacity of the text glow.
      • Blur—In one embodiment, Blur sets the softness of the text glow.
      • Width—In one embodiment, Width sets the size of the glow.
  • Adding a Text Glow—In one embodiment, to create a text glow, enable the Glow parameter in the Style pane of the Text Inspector.
  • In one embodiment, to create a glow for a text object:
      • In one embodiment, select the text object.
      • In one embodiment, in the Inspector (I), click the Text tab.
      • In one embodiment, click Style.
      • In one embodiment, in the Glow controls, turn on Glow.
        In one embodiment, the glow only of a text object may be displayed by turning off the Face (and any other active) parameters.
  • Editing Text Object Glow—In one embodiment, use the Glow controls to soften the opacity or blur of the text glow, change the size of the glow, or set and edit the fill of a glow.
  • In one embodiment, to change the color of the glow, click the color picker and select a color from the Colors window.
  • In one embodiment, to adjust the opacity of the glow, use the Opacity slider or the value field to change the opacity of the glow.
  • In one embodiment, to adjust the blur of the glow, use the Blur slider or the value field to change the softness of the glow.
  • In one embodiment, to change the width of the glow, use the Width slider or the value field to change the size of the glow.
  • In one embodiment, the Glow fill controls are similar to the controls for the Face parameters.
  • Creating a Drop Shadow—In one embodiment, use the Drop Shadow controls to create a drop shadow on a text object, and to adjust its color, opacity, offset from the text object, softness, and angle. In one embodiment, the Shadow parameters include the following (a sketch of the offset arithmetic follows the list):
      • Enable—In one embodiment, Enable enables and disables the drop shadow.
      • Color—In one embodiment, click the color box to select a color for the drop shadow from the Colors window. In one embodiment, click the Color disclosure triangle to adjust the individual Red, Green, Blue, and Alpha values of the shadow.
      • Opacity—In one embodiment, Opacity sets the opacity of the drop shadow.
      • Distance—In one embodiment, Distance sets the distance, in pixels, of the drop shadow from the text object.
      • Blur—In one embodiment, Blur sets the softness of the drop shadow.
      • Angle—In one embodiment, Angle sets the angle (or direction) of the drop shadow.
      • Size—In one embodiment, Size determines the size, in points, of the drop shadow. In one embodiment, by default, the shadow is the same size as the font size.
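  • By way of illustration, the Distance and Angle parameters can be treated as polar coordinates for the shadow's offset. A hypothetical Python sketch (the angle convention, with 0 degrees pointing right and screen Y increasing downward, is an assumption):

        import math

        def shadow_offset(distance, angle_degrees):
            # Convert the Distance (pixels) and Angle (degrees) controls
            # into the X/Y offset at which the shadow is drawn.
            radians = math.radians(angle_degrees)
            return (distance * math.cos(radians),
                    -distance * math.sin(radians))

        print(shadow_offset(5, 315))  # about (3.5, 3.5): down and right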
  • Adding a Drop Shadow—In one embodiment, to create a text drop shadow, enable the Drop Shadow parameter in the Style pane of the Text Inspector.
  • In one embodiment, to add a drop shadow:
      • In one embodiment, select the text object.
      • In one embodiment, in the Inspector (I), click the Text tab.
      • In one embodiment, click Style.
      • In one embodiment, in the Drop Shadow parameters, turn on Drop Shadow. In one embodiment, the default drop shadow is applied to the text object.
  • Adjusting the Drop Shadow Parameters—In one embodiment, use the Drop Shadow controls to change the color or opacity of the shadow and to adjust the softness of the shadow. In one embodiment, the distance the shadow is offset from the text object, and its angle, may also be adjusted. In another embodiment, the Drop Shadow parameters can be animated.
  • In one embodiment, to change the color of the drop shadow, click the color box and use the Colors window to set a new color.
  • In one embodiment, to change the opacity of the drop shadow, in the Opacity parameter, click and drag the slider or use the value field.
  • In one embodiment, to change the size of the drop shadow, in the Size parameter, click and drag the slider or use the value field.
  • In one embodiment, to change the distance of the shadow from the text object, in the Distance parameter, click and drag the slider or use the value field. In one embodiment, the distance the shadow is offset is represented in pixels.
  • In one embodiment, to change the angle of the shadow from the text object, click and drag in a circular motion on the Angle dial, or use the value field.
  • In one embodiment, the Shadow fill controls are similar to the controls for the Face parameters.
  • ix. Editing Text Paragraphs
  • In one embodiment, the Text Layout pane contains controls for type layout, such as setting margins, alignment, justification, and line spacing. In one embodiment, a “typewriter” effect can be created using the Type On parameter in the Layout pane.
  • In one embodiment, to show the Text Layout pane, in the Inspector, click the Text tab, and click Layout.
  • a. Text Layout Controls
  • In one embodiment, use the Text Layout controls to specify general “layout” of text. In one embodiment, these controls include specifying if the text flows in a single line, a paragraph with set margins, or on a path.
      • Layout Method—In one embodiment, Layout Method specifies whether the text layout is set to Type, Paragraph, or Path. In one embodiment, the default layout is Type, which creates a single line of text.
      • Alignment—In one embodiment, Alignment sets the alignment of the lines of text. The options include Left, Center, or Right.
      • Justification—In one embodiment, Justification sets the justification of the lines of text. In one embodiment, the options include None, Partial, or Full.
      • Line Spacing—In one embodiment, Line Spacing specifies the distance between each line of text (leading) in point-sized increments.
      • Type On—In one embodiment, Type On creates a type-on effect, similar to a typewriter.
      • Left, Right, Top, and Bottom Margin—In one embodiment, Left, Right, Top, and Bottom Margin sets the margins for the text layout in the Canvas. In one embodiment, to create a custom margin, a user can use the Margin controls or draw a text box in the Canvas.
  • In one embodiment, to create a text box, do one of the following:
      • In one embodiment, select the Text tool (T), and click and drag a text box in the Canvas.
      • In one embodiment, in the Text Layout controls, set values using the Left, Right, Top, and Bottom Margin parameters.
  • b. Setting Text Margins
  • In one embodiment, if a user is working with a large amount of text and needs paragraph controls, he can establish margins. In one embodiment, a user can draw a custom text box in the Canvas, or set up margins in the Layout pane of the Text Inspector.
  • In one embodiment, the default type layout option is Type. In one embodiment, when Type is enabled, text is entered in one string that extends beyond the Canvas, unless the user manually inserts line breaks at the end of his text lines.
  • Drawing Text Margins—In one embodiment, use the Text tool to draw a text box in the Canvas. A user can draw a box that extends beyond the edge of the Canvas.
  • In one embodiment, to draw a text box:
      • In one embodiment, select the Text tool (T).
      • In one embodiment, click and drag in the Canvas to draw the text box.
      • In one embodiment, begin typing.
      • In one embodiment, to resize the text margins, ensure the Text tool is still selected and click and drag a control point on the text box. In one embodiment, a user can also resize the margins of the text box using the Margin controls in the Layout pane of the Text Inspector.
      • In one embodiment, press Esc or click the Selection (S) tool to select the text box and exit editing mode. In one embodiment, a user cannot use the Selection tool to resize only the text box margins and not the text. In another embodiment, if a user selects a control point of a text box with the Selection tool and resizes, the object itself is resized, not just the bounding box.
  • Using Margins with the Text Editor—In one embodiment, when entering text via the Text Editor, a user can set text margins using the Paragraph Layout Method option and the margin controls in the Layout pane.
  • In one embodiment, to set margins for text entered in the Text Editor:
      • In one embodiment, select the Text tool (T) and click in the Canvas.
      • In one embodiment, in the Inspector>Text tab, click Layout.
      • In one embodiment, select Paragraph from the Layout Method pop-up menu.
      • In one embodiment, use the margin controls in the lower portion of the Layout pane to set the text margins.
      • In one embodiment, click Format and enter the text in the Text Editor.
  • c. Working with Text on a Path
  • In one embodiment, a user can create text on a line or an ellipse. In one embodiment, a user can change the shape of a text path, add or remove control points, and animate the text along the path.
  • Text Path Controls—In one embodiment, the following Text Path Controls are available (a sketch of placing text on an elliptical path follows this list):
      • Path Type—In one embodiment, Path Type sets the type of path. In one embodiment, the options include Line and Ellipse.
      • Inside—In one embodiment, when Inside is enabled, the baseline of text on an elliptical path is shifted so the text appears inside of the ellipse.
      • Outside—In one embodiment, when Outside is enabled, the baseline of text on an elliptical path is shifted so the text appears outside of the ellipse.
      • Path Offset—In one embodiment, Path Offset determines where the text begins on the path. In one embodiment, animate this value to move text along a path.
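  • Under assumed conventions (hypothetical Python; evenly spaced characters and a percentage-based Path Offset are assumptions), placing text on an elliptical path can be sketched as follows:

        import math

        def place_on_ellipse(num_chars, rx, ry, path_offset_percent):
            # rx, ry: ellipse radii; path_offset_percent: where the text
            # begins on the path (animating it moves the text along).
            positions = []
            for i in range(num_chars):
                t = path_offset_percent / 100.0 + i / max(num_chars, 1)
                theta = 2 * math.pi * t
                positions.append((rx * math.cos(theta),
                                  ry * math.sin(theta)))
            return positions

        # Keyframing path_offset_percent from 0 to 100 carries the text
        # once around the ellipse.
        print(place_on_ellipse(4, 200, 100, 0))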
  • Creating Text on a Path—In one embodiment, use the Path options in the Layout pane to create text on a path.
  • d. Creating a Type-On Effect
  • In one embodiment, there are two ways to create a type-on text effect: the Type On parameters in the Text Layout controls, or the Type On behavior (in the Text Animation behavior category). In one embodiment, this section discusses using the Type On controls in the Layout pane; a sketch of the reveal logic follows the steps below.
  • In one embodiment, to create a type-on effect:
      • In one embodiment, select the Text object.
      • In one embodiment, in the Inspector (I), click the Text tab.
      • In one embodiment, click Layout.
      • In one embodiment, go to the frame where the animation should start.
      • In one embodiment, turn on Animate (in the Playback controls).
      • In one embodiment, in the Type On controls, enter 0 in the Start value field. In one embodiment, a user can also use the Animation menu rather than enabling Animate in the playback controls. In another embodiment, click the Animation Menu icon next to the Start parameter and select Add Keyframe.
      • In one embodiment, go to the frame where the animation should end (the type-on effect to be complete).
      • In one embodiment, enter 100 in the End value field.
      • In one embodiment, to create a softer fade in as the characters appear, turn on Fade In.
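  • The reveal logic of the type-on effect can be sketched in a few lines (illustrative Python; the Start/End percentage semantics mirror the steps above, and the Fade In handling is omitted):

        def type_on(text, percent_revealed):
            # A value keyframed from 0 (Start) to 100 (End) reveals the
            # characters one by one over the animation.
            count = round(len(text) * percent_revealed / 100.0)
            return text[:count]

        for pct in (0, 25, 50, 100):
            print(repr(type_on("Hello, world", pct)))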
  • x. Using Text Animation Behaviors
  • In one embodiment, text behaviors create animation by applying a range of values to text parameters without creating keyframes. In one embodiment, in other words, behaviors work like expressions. In another embodiment, by dragging a behavior to a text object in the Canvas, Layers List, or Timeline, a user can easily set up a left or right text crawl or a scroll, generate random text characters, create a type-on effect, or create a tracking animation. In yet another embodiment, a user can also use the Sequencing behavior to create custom behaviors that animate individual text properties. In one embodiment, for example, the user can select the Scale and Opacity properties and set them to animate through the text characters.
  • a. Applying Text Animation Behaviors
  • In one embodiment, text behaviors are applied in the same manner as other behaviors and filters. In one embodiment, for example, drag a behavior to an object in the Canvas, Layers List, or Timeline.
  • In one embodiment, Text animation behaviors include:
      • Crawl Left—In one embodiment, the Crawl Left behavior scrolls the text object to the left across the Canvas.
      • Crawl Right—In one embodiment, the Crawl Right behavior scrolls the text object to the right across the Canvas.
      • Scroll Up—In one embodiment, to scroll text upward in the Canvas:
        • In one embodiment, select the text object to which to apply the scroll.
        • In one embodiment, in the Library, select the Behaviors category and the Text Animation subcategory.
        • In one embodiment, click and drag the Scroll Up behavior to one of the following:
          • In one embodiment, the text object in the Canvas
          • In one embodiment, the text object in the Layers List
          • In one embodiment, the text object in the Timeline
            Adjusting the Rate of the Scroll—In one embodiment, to adjust the rate of the Scroll Up (or any) behavior, shorten the duration of the behavior in the Timeline.
      • Scroll Down
      • Randomize—In one embodiment, the Randomize behavior randomly generates different characters in a text object. In one embodiment, to randomize the characters in a text object, select the text object to randomize.
      • Sequence
        • Position
        • Rotation
        • Opacity
        • Scale
        • Tracking—In one embodiment, to track from the center, text format should be set to Center Alignment.
      • Tracking
      • Type On
  • xi. Applying Other Behaviors to Text Objects
  • In one embodiment, a user can apply other behaviors to a text object.
  • xii. Creating Text Keyframes
  • In one embodiment, a user can create keyframes for text parameters. In one embodiment, as with objects of other types, there are two ways to create keyframes: use the Animate button in the Playback controls, or use the Animation Menu in the Inspector. In another embodiment, the following example uses both methods to animate text Tracking and Opacity.
  • In one embodiment, some text behaviors automatically animate the text parameters. In one embodiment, for example, when the Tracking behavior is applied to a text object, the tracking occurs at the rate specified in the behavior. In another embodiment, the user can adjust the rate of the tracking in the behavior parameters. In yet another embodiment, however, keep in mind that behaviors do not create keyframes.
  • a. Creating Text Object Tracking and Opacity Keyframes
  • In one embodiment, the following example creates text that fades in as the tracking animates. In one embodiment, a user can also create this effect using the Fade In/Fade Out behavior (in the Basic Motion behavior category) and the Tracking behavior (in the Text Animation behavior category).
  • Using Keyframes vs. Using Behaviors—In one embodiment, which text animation method is used (keyframing or behaviors) depends on a project, or more specifically, the project's timing needs. In one embodiment, in general, if the user needs a very specific action to happen at a very specific point in time in a project, he should use keyframing. In another embodiment, for example, if the user wants a text object to be completely transparent at frame 1, become completely opaque at frame 60, become transparent again at frame 90, and opaque again at frame 120, he should use keyframing. In yet another embodiment, in other words, keyframing applies very specific values to an object's parameters.
  • In one embodiment, if the effect is more general, for example, the user wants the text to be completely transparent at frame 1, opaque at frames 60-90, and become transparent by frame 120, he should use the Fade In/Fade Out behavior. In one embodiment, behaviors generate a range of values that are applied to an object's parameters, animating those parameters over the duration of the behavior.
  • In one embodiment, a user can combine keyframing and behaviors on an object. In one embodiment, for example, if a user keyframed the text opacity parameter, he can then apply the Tracking behavior to animate the text object tracking, or he can keyframe the tracking parameter. Keep in mind, however, that if a keyframe is applied to the text Opacity parameter, and then a Fade In/Fade Out behavior is applied to the text object, unexpected results may occur.
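  • The keyframing example above pins exact opacity values to exact frames. A sketch of that schedule (hypothetical Python; linear interpolation between keyframes is an assumption):

        KEYFRAMES = [(1, 0.0), (60, 1.0), (90, 0.0), (120, 1.0)]

        def opacity_at(frame):
            # Exact values at the keyframed frames, interpolated between.
            if frame <= KEYFRAMES[0][0]:
                return KEYFRAMES[0][1]
            if frame >= KEYFRAMES[-1][0]:
                return KEYFRAMES[-1][1]
            for (f0, v0), (f1, v1) in zip(KEYFRAMES, KEYFRAMES[1:]):
                if f0 <= frame <= f1:
                    return v0 + (v1 - v0) * (frame - f0) / (f1 - f0)

        print(opacity_at(30))  # about 0.49: halfway through the fade-in

    A behavior, by contrast, would generate such a range of values procedurally over its duration, without storing any keyframes.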
  • In one embodiment, to create text Tracking and Opacity keyframes:
      • In one embodiment, go to the frame where the tracking animation should start.
      • In one embodiment, select the text object.
      • In one embodiment, in the playback controls, click the Animate button. In one embodiment, when enabled, the Animate button appears bright red and a keyframe is automatically created whenever a value is changed.
      • In one embodiment, in the Inspector, click Format and set the first Tracking value. In one embodiment, because keyframing is enabled, a Tracking keyframe is created. In another embodiment, the lower the Tracking value, the closer the text characters are to each other.
      • In one embodiment, go to the frame where the tracking animation should end.
      • In one embodiment, enter the end tracking value.
      • In one embodiment, go to the frame where the opacity animation should start.
      • In one embodiment, click Style, and set the first Opacity value.
      • In one embodiment, go to the frame where the opacity animation should end.
      • In one embodiment, enter the end opacity value.
  • In one embodiment, to view the keyframes for a Text parameter in the Curve Editor, click its Animation menu and select Show in Curve Editor.
  • b. Onscreen Controls and Text Objects
  • In one embodiment, because text objects share most of the characteristics of objects of other types, a user can use the object onscreen controls (e.g., Shear, Four Corner, Pivot, Scale, and Drop Shadow) to transform a selected text object. In one embodiment, the onscreen tools are shortcuts to the object controls in the Inspector>Properties tab. In another embodiment, to set specific values, or fine tune any of the following transforms, use the Properties tab in the Inspector.
  • In one embodiment, the onscreen controls and the Inspector>Properties parameters are applied to the text as an object (such as a clip or image), not as editable text. In one embodiment, the controls for editing the text itself are located in the Inspector>Text tab. In another embodiment, although some object properties are similar to some text style and format controls, such as the Shear property and the Slant text format, the object properties are independent of the text format controls, and vice versa. In yet another embodiment, for example, if a user applies a Slant value of 20 in Inspector>Text>Format, a slant value of 20 is applied to each character in the word, simulating italics. In one embodiment, if a user applies a Shear value of 20 in Inspector>Properties (or using the onscreen controls), a shear value of 20 is applied to the object, not the individual text characters.
  • In one embodiment, the next section briefly describes how to transform a text object using the onscreen controls.
  • Using the Onscreen Controls—For all of the following transforms, ensure the text object is selected (e.g., a bounding box appears around a selected object in the Canvas).
  • In one embodiment, to select a text object:
      • In one embodiment, on the Toolbar, click the Selection tool (or press S).
      • In one embodiment, in the Canvas, click on the text object to transform. In one embodiment, a user can also select the text object in the Layers list.
  • In one embodiment, to move the text object, click in the bounding box and drag the text object.
  • In one embodiment, to rotate the text object, click the rotation handle and drag.
  • In one embodiment, to scale the text object, do one of the following:
      • In one embodiment, to scale in X, click a center left or right control point and drag.
      • In one embodiment, to scale in Y, click a center upper or lower control point and drag.
      • In one embodiment, to scale in X and Y, click one of the corner control points on the bounding box and drag.
  • In one embodiment, an object may be scaled around its pivot point. In one embodiment, to scale proportionally, press Shift while dragging any of the control points.
  • In one embodiment, to shear a text object:
      • In one embodiment, select Shear from the menu.
      • In one embodiment, do one of the following:
        • In one embodiment, to shear the object in X, click and drag on either of the upper or lower control points.
        • In one embodiment, to shear the object in Y, click and drag on either of the right or left control points.
  • In one embodiment, to use the four-corner controls:
      • In one embodiment, select Four-corner from the menu.
      • In one embodiment, click and drag one of the four-corner control points.
  • In one embodiment, to add a drop shadow to a text object:
      • In one embodiment, select Drop shadow from the menu.
      • In one embodiment, adjust the shadow parameters in the Dashboard, or in the Inspector>Properties. In one embodiment, the shadow is applied to the object as a whole. In one embodiment, some object properties are similar to text styles and formats. In another embodiment, shadow controls specific to text are located in the Inspector>Text>Style controls.
  • In one embodiment, to change the anchor point of a text object:
      • In one embodiment, select Anchor point from the menu.
      • In one embodiment, click and drag the anchor point to the new position.
        In one embodiment, in addition to using the onscreen transform controls, a user can enter precise values for the transforms in Inspector>Properties.
  • In one embodiment, a user may select a single character in a text object. In one embodiment, a user may select multiple characters in a text object.
  • xiii. Using Text as a Particle Shape
  • In one embodiment, a user can use a text object as a particle shape.
  • In one embodiment, to add an emitter to a text object:
      • In one embodiment, select the text object.
      • In one embodiment, press E. In one embodiment, a particle emitter is added to the text object and the text object becomes the emitted particle shape.
  • In one embodiment, a user can edit the text after the fact.
  • xiv. Using Text Styles
  • In one embodiment, a user can apply a style to a text object.
  • xv. Using Text as a Mask
  • In one embodiment, a user can apply a mask to a text object.
  • xvi. Saving Custom Text Setups
  • In one embodiment, a user can save a custom text setup.
  • xvii. Using LiveType Fonts
  • In one embodiment, if a user has LiveFonts installed on his system, he can use the LiveType fonts.
  • In one embodiment, to use LiveFonts:
      • In one embodiment, ensure LiveFonts is installed. In one embodiment, by default, LiveType is installed with Final Cut Pro 4. In another embodiment, the LiveFonts are installed separately.
      • In one embodiment, in the Library, click the Fonts tab.
      • In one embodiment, the LiveType fonts appear in the lower portion.
  • B. Particle Systems
  • Using Particle Systems, a user can simulate real-world effects such as smoke and sparks, or he can create sophisticated abstract textures. Particle Systems allow a user to quickly and easily create sophisticated animated effects involving large numbers of automatically animated objects. A particle effects library can be used to add a pre-made particle system to a composition, or custom particle effects can be created using nearly any object in a project. Particle systems are flexible enough to create many different kinds of effects. FIG. 73 illustrates one example of a particle system, according to one embodiment of the invention. FIG. 74 illustrates another example of a particle system, according to one embodiment of the invention. FIG. 75 illustrates yet another example of a particle system, according to one embodiment of the invention.
  • Particle systems work by using a specified object, referred to as a cell 760, as the model for the creation of numerous individual particles 770. Each particle 770 is essentially a duplicate of the original cell 760 and is animated according to the parameters for that particle system. This means that potentially hundreds of animated particles 770 can be created and animated using a single cell 760. FIG. 76 illustrates an example of a cell, according to one embodiment of the invention. FIG. 77 illustrates an example of a particle system based on the cell of FIG. 76, according to one embodiment of the invention.
  • In one embodiment, the object used as a particle system's cell 760 determines how that particle system looks. Particle systems can contain multiple cells 760, resulting in the release of several types of particles 770 from a single emitter. Sophisticated particle presets may be constructed in this way. FIG. 78 illustrates an example of a particle system based on one cell, according to one embodiment of the invention. FIG. 79 illustrates an example of a particle system based on multiple cells 760A, 760B, according to one embodiment of the invention.
  • i. The Anatomy of a Particle System
  • In one embodiment, a particle system comprises an emitter 800 and one or more cells 760. In one embodiment, a cell 760 is nested inside of the emitter 800 in a Project pane and a Timeline. FIG. 80 illustrates an example of a Project pane showing an emitter that is based on two cells, according to one embodiment of the invention. FIG. 81 illustrates an example of a Timeline showing an emitter that is based on two cells, according to one embodiment of the invention.
  • In one embodiment, the emitter and cells have separate sets of parameters that control the particle system's behavior. If a garden hose were a particle system, the nozzle would act as the emitter, while the water would represent the flow of particles. Changing the parameters of the emitter changes the direction and number of particles that are created, while changing the cell's parameters affects each individual particle. By changing a few parameters, it's possible to create very different effects using the same cell. FIG. 82 illustrates an example of a particle system based on an emitter, according to one embodiment of the invention. FIG. 83 illustrates another example of a particle system based on the same emitter as in FIG. 82, according to one embodiment of the invention. FIG. 84 illustrates yet another example of a particle system based on the same emitter as in FIGS. 82 and 83, according to one embodiment of the invention.
  • Particle system parameters can be keyframed in order to change a particle effect's dynamics over time. For example, by keyframing an emitter's 800 Position property in a Keyframe Editor, a path 860 of bubbles can be created that follows an object 850 onscreen. FIG. 85 illustrates an example of an object, according to one embodiment of the invention. FIG. 86 illustrates an example of a particle system of bubbles along with the object of FIG. 85, according to one embodiment of the invention. FIG. 87 illustrates another example of a particle system of bubbles along with the object of FIG. 85, according to one embodiment of the invention.
  • Behaviors can be added to a cell to create even more varied effects. In one embodiment, simulation behaviors can be especially effective. In one embodiment, a behavior that is applied to a cell is in turn applied to a particle that it generates. This enables almost limitless variation. Adding behaviors to particles in addition to the particle system's own parameters is an easy way to create complex, organic motion that would be impossible to accomplish any other way. For example, if a Repel behavior is added to a cell, it causes emitted particles to weave around one another like amoebas under a microscope.
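  • The emitter/cell/particle relationship described above can be modeled in miniature (an illustrative Python sketch; the class and field names are hypothetical, and the defaults follow the values given later in this section):

        import random

        class Cell:
            def __init__(self, name, image):
                self.name = name
                self.image = image   # the object each particle duplicates
                self.birth_rate = 1  # particles emitted per frame
                self.speed = 100     # pixels per frame away from the emitter
                self.life = 100      # frames each particle remains onscreen

        class Emitter:
            def __init__(self, position):
                self.position = position
                self.cells = []      # several cells yield several particle types

            def emit(self):
                # Each frame, every nested cell contributes duplicates of
                # itself, sent out in all directions by default.
                born = []
                for cell in self.cells:
                    for _ in range(cell.birth_rate):
                        angle = random.uniform(0.0, 360.0)
                        born.append((cell.name, self.position, angle, cell.speed))
                return born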
  • ii. Using Particle Systems
  • Adding a particle system to a project can be fast and easy: a pre-made particle system can be used from a particle library, or a simple custom particle system can be created.
  • a. Using a Particle Library
  • In one embodiment, a particle library, found in a Content category of a Library, is a collection of pre-made particle effects that can be added to a project. There are many types of particle effects to choose from. The easiest way to add a particle system to a project is to use one from a particle library. If a user finds one that's close to what he needs, he can easily customize its parameters after he has added it to his project. Particle systems are added to a project exactly like any other object.
  • In one embodiment, to add a particle effect from a library:
      • Open the Library, select the Content category, and click <Particle Library>.
      • Select a particle preset in the Library Stack, and click a Play button in a Preview pane of a Browser to see an animated preview of the selected particle effect.
      • To use a particle preset, do one of the following:
        • Click Apply to add the selected particle system to a project at the center of a Canvas. It appears in its own layer in a Layers tab and Timeline.
        • Drag the particle system into the Canvas at the position where it should appear. It appears in its own layer in the Layers tab and Timeline.
        • Drag the particle system into a layer in the Layers tab or Timeline. It appears at the center of the Canvas.
          The new particle system object appears in the project. In one embodiment, the new object appears composited against any other objects that have already been added.
  • Customizing Preset Particle Systems—Once a particle system has been added from a Library, it acts as it appeared in the library preview animation. If necessary, a particle system's Emitter parameters can be edited in a Dashboard to tailor the particle system. In one embodiment, a particle system can only be modified after it has been added to a project.
  • In one embodiment, a Dashboard displays a selected particle system's most essential parameters, including, for example, the size and number of particles that are created, how long they remain onscreen, how fast they move, and the direction and area in which they travel. In one embodiment, a cell may also be selected in a Layers tab or Timeline to edit its parameters in the Dashboard.
  • b. Creating a Simple Custom Particle System
  • In one embodiment, creating a particle system begins by selecting an object 12 in a project and using it as a cell 760 within a new particle emitter 800. In a particle system, the emitter is a source of particles that are created. Particle systems are very flexible, and any object in a project can be used as a cell in an emitter, including still graphics, animation or video clips, or shape objects. In one embodiment, the object 12 selected when an emitter 800 is created becomes the first Cell 760 in that particle system. In one embodiment, cells are nested inside of emitters and are used to create the actual particles 770 in that system. FIG. 88 illustrates an example of a particle system including an emitter and individual particles based on the emitter, according to one embodiment of the invention.
  • In one embodiment, to create an emitter:
      • Place an object that will be used to generate particles into a project. This example will use a graphic of a simple white circular gradient 890 that was created with an alpha channel. FIG. 89 illustrates a simple white circular gradient, according to one embodiment of the invention.
      • If necessary, move the object in a Timeline to a frame where the particle effect will begin.
      • Move the object in a Canvas to a location where the center of the particle system will be.
      • Select the object, and do one of the following:
        • Click an Emitter button 900 in the Toolbar.
        • Press the E key.
          FIG. 90 illustrates an Emitter button, according to one embodiment of the invention. The object 12 selected is replaced by an emitter 800, represented by a transform control in the Canvas. In one embodiment, the emitter 800 appears at the same location in the Canvas as the original object 12. In one embodiment, a cell 760 with the same name as the object 12 first selected is nested within this emitter 800. In one embodiment, the original object remains in the Layers tab, but is turned off.
  • FIG. 91 illustrates a new emitter, at the first frame of the particle effect, according to one embodiment of the invention. In one embodiment, by default, the first frame of a new particle system has three particles. If the project is played, additional particles are generated, emerging from the center of the emitter.
  • In one embodiment, by default, a new cell 760 emits one particle 770 per frame in all directions, and each particle 770 moves 100 pixels per frame away from the emitter 800 over a lifetime of 100 frames. FIG. 92 illustrates an active particle system, such as the emitter of FIG. 91 but at a later frame, according to one embodiment of the invention. In one embodiment, the Initial Number parameter in the Emitter or Particle Cell tabs changes the default behavior so that a particle system begins with a burst of particles at the first frame.
  • The Predictability of Particle Systems—When a particle system is created or a parameter of an existing particle system is modified, the path of a particle in that system is immediately calculated and predetermined. While the number and motion of particles may seem random, they are actually predictable based on that system's parameters. Playing the same particle system twice with the same parameters results in the same particle motion, so that once a particle system is created that looks right, it will always be the same.
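  • One way such predictability can be achieved (an assumption, sketched in Python) is to derive every "random" choice from a generator seeded by the system's parameters, so that the same parameters always replay the same motion:

        import random

        def particle_angles(seed, count):
            # The seed would be derived from the particle system's
            # parameters; identical parameters give identical sequences.
            rng = random.Random(seed)
            return [rng.uniform(0.0, 360.0) for _ in range(count)]

        # Playing the same particle system twice yields the same motion.
        print(particle_angles(42, 3) == particle_angles(42, 3))  # True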
  • c. Customizing a Particle System's Emitter
  • When an emitter is created, the particle system starts working according to the default parameters in its Emitter and Particle Cell tabs. In one embodiment, these are located in the Inspector. The emitter Dashboard 110 can be used to easily change the most important of these parameters. Select an emitter to see its parameters in the Dashboard.
  • Emitter Dashboard Parameters—In one embodiment, the Dashboard contains emitter controls that modify a particle system's size and shape. In one embodiment, these parameters are a subset of those found in the Emitter tab of the Inspector. In one embodiment, the Dashboard contains a group of sliders and an Emission control. In one embodiment, an Emission control provides a visual way to manipulate three different particle system parameters: Emission Range, Emission Angle, and Speed. A sketch combining these three controls appears after the parameter list below.
  • In one embodiment, for particle systems containing multiple cells 760, the emitter 800 Dashboard 110 parameters simultaneously modify the effect of each cell's parameters relative to one another. This means that for a particle system consisting of three cells with different Scale values, changing the scale in the Dashboard 110 resizes all three cells simultaneously. For example, increasing the scale in the Dashboard 110 by 130 percent does not change the scale of all three cells to 130 percent. Instead, it multiplies the scale of each cell by 130 percent, so that all are resized relative to their original scale values. FIG. 93 illustrates a particle system, according to one embodiment of the invention. FIG. 94 illustrates the particle system of FIG. 93 after it has been rescaled, according to one embodiment of the invention. For this reason, in one embodiment, the Dashboard parameters are displayed as percentages, since they represent the percent at which these particle cell parameters are modified.
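  • The relative resizing described above is multiplicative, as this small sketch shows (illustrative values):

        cell_scales = [0.50, 1.00, 2.00]  # three cells with different Scale values
        dashboard_scale = 1.30            # 130 percent in the Dashboard
        print([round(s * dashboard_scale, 2) for s in cell_scales])
        # [0.65, 1.3, 2.6]: each cell is resized relative to its own
        # original value, not set to a uniform 130 percent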
  • FIG. 95 illustrates a Dashboard for a particle system, according to one embodiment of the invention. In one embodiment, emitter parameters in the Dashboard include:
      • Birth Rate: In one embodiment, a slider 950 defines how many particles are created every second. In conjunction with the Life parameter, this defines how many particles appear in the Canvas at a given time.
      • Life: In one embodiment, a slider 952 defines how long each particle remains on-screen before disappearing from existence.
      • Scale: In one embodiment, a slider 954 defines the size of each particle, relative to the original size of the cell.
      • Emission Range: In one embodiment, moving two points 1012 on the outer ring of the graphical Emission control defines a segment 1010 of the circumference about the center of the emitter from which particles emerge.
      • Emission Angle: In one embodiment, if the Emission Range into which particles emerge is constrained to a subsection of the Emission control, dragging the inside of this section changes the direction into which particles will be emitted.
      • Speed: In one embodiment, draggable arrows 956 within the defined Emission Range of the Emission control can be shortened or lengthened to define how quickly particles move away from the emitter.
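  • Taken together, the Emission Range, Emission Angle, and Speed parameters above determine each new particle's initial velocity. A minimal sketch of one plausible mapping (the parameter names are assumptions, not the actual implementation):

        import math, random

        def initial_velocity(emission_angle, emission_range, speed, rng):
            # Pick a direction uniformly within the wedge centered on the
            # Emission Angle and spanning the Emission Range (both in degrees),
            # then scale it by Speed (pixels per frame).
            theta = math.radians(rng.uniform(emission_angle - emission_range / 2.0,
                                             emission_angle + emission_range / 2.0))
            return speed * math.cos(theta), speed * math.sin(theta)

        rng = random.Random(0)
        vx, vy = initial_velocity(emission_angle=90.0, emission_range=30.0,
                                  speed=100.0, rng=rng)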
  • Using the Dashboard to Create a Simple Smoke Effect—In this example, the emitter controls in the Dashboard are used to create a smoke effect using the emitter created in the procedure “Creating a Simple Custom Particle System.” In one embodiment, to create a smoke effect using the emitter Dashboard:
      • Before making adjustments to the selected particle system, it may be helpful to move the playhead forward in the Timeline to a frame where the particle system can be seen in full effect. That way, any adjustments made will be readily apparent. FIG. 96 illustrates the particle system of FIGS. 91 and 92 in full effect, according to one embodiment of the invention.
      • Currently, the size of each particle is so large that it's difficult to make out any texture in the particle system. With the emitter Dashboard open, drag the Scale slider to the left to reduce every particle's size so that the individual particles are more textured. FIG. 97 illustrates the particle system of FIG. 96 at another point in time, according to one embodiment of the invention. FIG. 98 illustrates the particle system of FIG. 97 after the value of Scale has been reduced, according to one embodiment of the invention.
      • In the Dashboard, click anywhere along the outer edge of the Emission control and drag to define a narrow segment 1010 that limits the area in which particles are created (the Emission Range). In one embodiment, both points 1012 defining the segment rotate around the center of the emission control symmetrically, so the initial wedge points to the right. As the Emission Range is adjusted, the particles rearrange themselves in the Canvas so the resulting effect can be seen. FIGS. 99 and 100 illustrate the Dashboard and the particle system, respectively, before the previously mentioned actions have been performed, according to one embodiment of the invention. FIGS. 101 and 102 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
      • To make the particles drift upwards, click in the middle of the Emission Range segment that was defined and drag to rotate the Emission Angle up and slightly to the left of the center control. FIGS. 103 and 104 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
      • In one embodiment, the middle of the Emission Range segment can be dragged towards or away from the center of the Emission control to adjust the Speed of the particles flying away from the emitter. As this adjustment is made, in one embodiment, one or more arrows within the currently defined Emission Range become longer to indicate a faster speed or shorter for a slower speed. Drag the Speed arrow so that it's approximately halfway between the center and the edge of the Emission control to create a slowly drifting column of particles. FIGS. 105 and 106 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
      • At this point, the particles are all moving in the correct direction, but there aren't very many of them (there isn't much of a smoke effect yet). Move the Birth Rate slider to the right, increasing the number of particles created by the emitter. In one embodiment, moving this slider to the right creates more particles. In one embodiment, at 430 percent or over, a nearly unified column of particles is created, in which particles move farther apart as they drift away from the emitter. FIGS. 107 and 108 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
  • Finally, adjust the Life slider to define the length of the column of smoke. In one embodiment, moving this slider to the left reduces the duration each particle remains on the screen. This results in a shorter column of particles. In one embodiment, moving it to the right increases each particle's duration, creating a longer column of particles. In one embodiment, moving this slider to 130 percent or over creates a smoke-like column of particles drifting all the way past the edge of the Canvas. FIGS. 109 and 110 illustrate the Dashboard and the particle system, respectively, after the previously mentioned actions have been performed, according to one embodiment of the invention.
  • A single object can thus be used to create a credible column of smoke rising gently into the sky. While the Dashboard controls are quite powerful, in one embodiment, the Emitter and Particle Cell tabs in the Inspector have many more parameters that can be customized.
  • d. Modifying Emitter Properties
  • In one embodiment, emitter parameters can be modified in the Properties tab of the Inspector like any other object. Since particle systems are collections of independently generated objects, these parameters have a different effect than they do with other objects. In one embodiment, the only parameter that appears for cells in the Properties tab of the Inspector is Timing.
  • Transform Controls—As a particle system plays, in one embodiment, cells 760 in that system are duplicated according to the parameters for that system to create individual particles 770. Since particles 770 emerge from the position of the emitter 800, changing the emitter's position in the Canvas also changes the position of particles 770 in that system. This results in the particle system being moved as a unit. FIG. 111 illustrates a particle system, according to one embodiment of the invention. FIG. 112 illustrates the particle system of FIG. 111 after the emitter has been moved, according to one embodiment of the invention.
  • In one embodiment, if the emitter's 800 position is animated using a behavior, or keyframed, the particle system does not move as a unit. In this case, particles 770 emerging from the emitter's position at each frame continue to move relative to that position, regardless of changes to the emitter's position in subsequent frames. This results in a trail 1130 of particles 770 following the path of the emitter 800. FIG. 113 illustrates a particle system where the emitter's position has been animated using a behavior, or keyframed, according to one embodiment of the invention.
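  • This trail effect can be understood as each particle recording the emitter's position at its birth frame and thereafter moving independently of the emitter. A rough sketch, with a hypothetical keyframed path:

        def emitter_position(frame):
            # Hypothetical keyframed path: the emitter slides right 5 pixels per frame.
            return (5.0 * frame, 0.0)

        births = []  # (birth_frame, birth_x, birth_y) for each particle

        def emit_and_trail(frame, drift_per_frame=2.0):
            births.append((frame, *emitter_position(frame)))
            # Each particle moves relative to where the emitter WAS when it was born,
            # not where the emitter is now -- hence a trail along the emitter's path.
            return [(bx, by + drift_per_frame * (frame - bf)) for bf, bx, by in births]

        for f in range(10):
            positions = emit_and_trail(f)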
  • In one embodiment, modifying an emitter's other geometric parameters (e.g., Rotation, Scale, and Shear) changes the distribution of particles from that emitter, as well as transforming each individual particle. For example, in one embodiment, if an emitter's Shear parameter is modified, the distribution of the emitted particles changes to reflect the new plane of the emitter, and the particles are sheared along the same plane. FIG. 114 illustrates a particle system, according to one embodiment of the invention. FIG. 115 illustrates the particle system of FIG. 114 after the emitter's Shear parameter has been modified, according to one embodiment of the invention.
  • Blending—In one embodiment, any changes made to the opacity or blend mode parameters for an emitter are applied to the particle system as a whole.
  • Mask and Drop Shadow Parameters—In one embodiment, masks and drop shadows cannot be applied to particle systems.
  • Timing—In one embodiment, once a particle system has been created, its duration can be as long or short as necessary, regardless of the duration of the original objects 12 used to create the particle system. The duration of a particle system is defined by the duration of the emitter 800. In one embodiment, changing the out point of an emitter 800 in the Timeline 16 changes the duration of the entire particle system. FIG. 116 illustrates a particle system in the Timeline that comprises one emitter and three nested cells, according to one embodiment of the invention.
  • In one embodiment, by default, a cell in a system generates particles over the entire duration of the emitter. In another embodiment, the duration of an individually generated particle is defined by the Lifetime parameter of the cell that generated it, and not by the duration of the nested cell itself. In one embodiment, the duration of the nested cell itself controls the duration for which it generates particles. In another embodiment, a cell's duration can be changed by dragging either its overall position or its in and out points in the Timeline. In this way, the timing that defines when each cell's particles emerge can be adjusted.
  • For example, a particle system can be created that simulates an explosion by offsetting the appearance of three different types of particles. First, dense white particles 770A emerge from the center. FIG. 117 illustrates a particle system with dense white particles emerging from the center, according to one embodiment of the invention. One second later, more diffuse orange particles appear around a larger area. FIG. 118 illustrates the particle system of FIG. 117 with more diffuse orange particles appearing around a larger area, according to one embodiment of the invention. One second after that, small sparks emerge from underneath both of these layers as they fade away. FIG. 119 illustrates the particle system of FIG. 118 with small sparks emerging from underneath both of the previous layers as they fade away, according to one embodiment of the invention.
  • iii. Creating Graphics and Animations for Particle Systems
  • In one embodiment, creating new particle systems from scratch begins with designing the particles that will be emitted. Any graphic or video clip may be a cell.
  • a. Creating Still Image Graphics for Particle Systems
  • Still images are the easiest to create, and are often all that is necessary to create a compelling particle system. Here are some guidelines for creating graphics for use as particles.
  • Graphics Size—In one embodiment, it's a good idea to make graphics larger rather than smaller. The size of the particles may be reduced without a loss of quality, but increasing the size of particles beyond the size of the original graphic may introduce unwanted artifacts.
  • Particle Edges—In one embodiment, the quality of the edges of graphics can be extremely important for creating convincing particles. Soft, translucent edges might look better than hard, over-defined ones.
  • Object Color—In one embodiment, by default, particles are created using the original colors of the image being used as the cell. If necessary, the emitted particles can be tinted using the Color Mode, Color, and Color Over Life parameters in the Emitter and Particle Cell tabs. In one embodiment, particles may be tinted by a single color, or they may be tinted with a gradient tint that changes color over time. In one embodiment, tinting particles applies the tint color uniformly over the entire object.
  • Create Graphics With an Alpha Channel—In one embodiment, create graphics to use as cells with pre-defined alpha channels. In one embodiment, if a graphic with a pre-multiplied alpha channel is imported, the “Is Premultiplied” parameter in the Particle Cell tab can be turned on for that cell to eliminate any edge fringing.
  • b. Creating Animations to Use as Cells
  • Video clips, such as QuickTime movies, may also be used as cells. For example, in one embodiment, an animation can be created, rendered as a QuickTime movie, and used as a cell. In general, the same recommendations for creating still graphics apply to the creation of animation or video clips to use as cells, but in one embodiment, there are additional considerations.
  • Create Clips That Loop—In one embodiment, particles created from video clips loop over and over for the duration of each individual particle's life. If the clip doesn't loop well, there will be a jump cut at every loop point.
  • Use Video Clips With Minimal Compression—Ideally, in one embodiment, video clips to be used as particles should be saved using an uncompressed codec, such as Animation or Uncompressed 8- and 10-bit 4:2:2. In other embodiments, other codecs can be used, but they may introduce unwanted artifacts depending on the level of compression used.
  • iv. Advanced Particle System Controls
  • In one embodiment, while the Dashboard provides a fast way to modify a particle system's main parameters, the particle system's Emitter and Particle Cell tabs in the Inspector provide total control over every aspect of a particle system. This includes, for example, individual parameters for each cell in a system.
  • a. The Difference Between Emitter and Particle Cell Parameters
  • In one embodiment, Emitter and Particle Cell parameters, though closely related, serve different purposes. In one embodiment, emitter parameters control the overall shape and direction of the animated mass of particles generated by the system. In another embodiment, other emitter parameters simultaneously modify the parameters of cells nested inside an emitter. In yet another embodiment, Particle Cell parameters, on the other hand, separately control the behavior of particles generated from each cell that's nested inside the particle emitter.
  • In one embodiment, to open a particle system's Emitter tab:
      • Select an emitter object.
      • Open an Inspector.
      • Click an Emitter tab 1200. Emitter parameters 1202 appear, ready to edit. FIG. 120 illustrates an Emitter tab and Emitter parameters, according to one embodiment of the invention.
  • b. Emitter Parameters
  • In one embodiment, several parameters 1202 in an Emitter tab 1200 are identical to those found in an emitter Dashboard 110, with one difference. In one embodiment, while the Emission control in the emitter Dashboard 110 allows manipulation of the Range, Angle, and Speed parameters using a single, graphical control, the Emitter tab 1200 lists individual controls 1210 for each parameter 1202. FIG. 121 illustrates an Emitter tab and individual controls for several Emitter parameters, according to one embodiment of the invention. In one embodiment, the contents of the Emitter tab are dynamic, and different parameters appear depending on the number of cells in the particle system, as well as the emitter shape that's used.
  • Single Cell vs. Multi-Cell Emitter Parameters—In one embodiment, at first glance, many of the parameters in the Emitter tab appear to mirror identically named parameters in the Particle Cell tabs for each cell within a system. In one embodiment, if a particle system has only one cell, then the Emitter tab displays parameters for the nested cell alongside the emitter's own parameters. In this case, an aspect of the particle system may be controlled directly from this tab, without having to go back and forth between the Emitter and Particle Cell tabs.
  • In one embodiment, if a particle system has multiple cells, an Emitter tab looks different. In one embodiment, the list of parameters is shorter, and some cell parameters are replaced with a smaller group of master controls. In one embodiment, changes made using the master controls modify the effect of a cell's parameters relative to other cells in a system. This means that, in one embodiment, for a particle system with three cells that have different Scale values, increasing the Scale parameter in the Emitter tab multiplies the Scale value of all three cells by the same percentage. In one embodiment, this has the result of increasing or reducing the size of a particle in the system, while keeping the size of a particle relative to other particles the same. FIG. 122 illustrates a particle system, according to one embodiment of the invention. FIG. 123 illustrates the particle system of FIG. 122 after the value of the Scale parameter in the Emitter tab has been increased, according to one embodiment of the invention.
  • Options in the Emitter Shape Parameter—In one embodiment, the first parameter 1202 in an Emitter tab 1200 is an Emitter Shape pop-up menu. In one embodiment, the options in this menu significantly alter the distribution of generated particles 770. In one embodiment, when an emitter 800 shape is chosen, different Emitter tab 1200 parameters 1202 are revealed which are unique to that shape. These parameters provide additional control over the distribution of particles 770.
  • In one embodiment, there are six Emitter Shapes:
      • Point: In one embodiment, this is the simplest emitter shape and is the default shape for newly created emitters. In one embodiment, it specifies a single point of emission for a particle system. In one embodiment, there are no additional parameters for the Point shape. FIG. 124 illustrates a particle system with a Point emitter shape, according to one embodiment of the invention.
      • Line: In one embodiment, particles emerge from a line stretching through a Canvas. In one embodiment, the length and location of the line segment may be specified, as well as how widely particle emission points are distributed across the line segment. In one embodiment, this emitter shape is good for creating sheets of particles that cascade over a wide area. In one embodiment, the Line shape displays additional parameters, such as Start Point, End Point, and Emit at Points. FIG. 125 illustrates a particle system with a Line emitter shape, according to one embodiment of the invention.
      • Circle: In one embodiment, particles emerge from an edge of a radius around a position of an emitter. In one embodiment, this emitter shape is good for surrounding an element in a composition with particles that emerge from the element's edge. In one embodiment, the Circle shape displays additional parameters, such as Radius and Emit at Points. FIG. 126 illustrates a particle system with a Circle emitter shape, according to one embodiment of the invention.
      • Filled Circle: In one embodiment, particles emerge from an area within a circle surrounding a position of an emitter. In one embodiment, this emitter shape is good for creating a cluster of particles that spreads from within a defined area of a Canvas, rather than from a single point. In one embodiment, the Filled Circle shape displays additional parameters, such as Radius and Emit at Points. FIG. 127 illustrates a particle system with a Filled Circle emitter shape, according to one embodiment of the invention.
      • Geometry: In one embodiment, particles emerge from an edge of a shape. In one embodiment, a spline object is used as the shape source. In one embodiment, the Geometry shape displays additional parameters, such as Shape Source and Emit at Points. FIG. 128 illustrates a particle system with a Geometry emitter shape, according to one embodiment of the invention. FIG. 129 illustrates the shape that was used as the Geometry emitter shape for the particle system of FIG. 128, according to one embodiment of the invention.
      • Image: In one embodiment, particles emerge from within an area defined by an image. The image may or may not have an alpha channel. In one embodiment, if it does have an alpha channel, the shape of the alpha channel may be used to define the emitter shape. In one embodiment, the Image shape displays additional parameters, such as Image Source, Emit at Alpha, Emission Alpha Cutoff, and Emit at Points. FIG. 130 illustrates a particle system with an Image emitter shape, according to one embodiment of the invention. FIG. 131 illustrates the image that was used as the Image emitter shape for the particle system of FIG. 130, according to one embodiment of the invention.
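  • The shapes above differ mainly in where a new particle's birth point is sampled. The following sketch covers three of the six shapes; the sampling rules shown are plausible simplifications rather than the actual implementation:

        import math, random

        def sample_emission_point(shape, rng, **params):
            # Hypothetical sampling for three of the six emitter shapes.
            if shape == "point":
                return params["position"]
            if shape == "line":
                (x0, y0), (x1, y1) = params["start"], params["end"]
                t = rng.random()                      # anywhere along the segment
                return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
            if shape == "circle":                     # on the edge of the radius
                theta = rng.uniform(0.0, 2.0 * math.pi)
                cx, cy = params["center"]
                r = params["radius"]
                return (cx + r * math.cos(theta), cy + r * math.sin(theta))
            raise ValueError("unsupported shape: " + shape)

        rng = random.Random(7)
        p = sample_emission_point("circle", rng, center=(0.0, 0.0), radius=50.0)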
  • In one embodiment, the following parameters are available for Emitters:
      • Start Point: (Line) In one embodiment, two infinite sliders define, in X and Y coordinates, the first point of the line used as the emitter shape.
      • End Point: (Line) In one embodiment, two infinite sliders define, in X and Y coordinates, the second point of the line used as the emitter shape.
      • Radius: (Circle and Filled Circle) In one embodiment, a slider defines the size of the circle used as the emitter shape.
      • Shape Source: (Geometry) In one embodiment, an object defines the shape of the emitter. In one embodiment, either paint or spline objects may be dropped onto this control to assign the desired shape.
      • Image Source: (Image) In one embodiment, an object defines the image used to define the shape of the emitter. In one embodiment, any graphic or video clip may be dropped onto this control to assign the desired shape.
      • Emit at Alpha: (Image) In one embodiment, a checkbox controls whether or not an image's alpha channel will be used to define the shape of the emitter. In one embodiment, if Emit at Alpha is turned off, the entire image is used as the emitter shape. In one embodiment, if Emit at Alpha is turned on, the shape of the alpha channel defines the emitter. In another embodiment, an Emission Alpha Cutoff parameter modifies how the shape of the alpha channel defines the emitter (see below).
      • Emission Alpha Cutoff: (Image) In one embodiment, if the Emit at Alpha parameter is turned on, this slider determines how much of the alpha channel is used to define the emitter area.
      • Emission Angle: In one embodiment, a dial sets the direction in which particles travel. In another embodiment, this parameter works in conjunction with the Emission Range parameter. In yet another embodiment, the Emission Angle parameter is similar to one of the functions of the visual Emission control in the Dashboard. In one embodiment, this parameter is unique to the emitter object. In one embodiment, when using an emitter shape other than a point, such as a line, circle, or shape, setting the Emission Angle parameter to 0 degrees restricts the emission of particles to the outside of the shape. In another embodiment, setting the Emission Angle to 180 degrees restricts the emission of particles to the inside of the shape.
      • Emission Range: In one embodiment, a dial restricts the area around the center of the emitter into which particles are generated, in the direction of the Emission Angle. In another embodiment, the Emission Range parameter is similar to one of the functions of the visual Emission control in the Dashboard. In yet another embodiment, this parameter is unique to the emitter object. In one embodiment, when using an emitter shape other than a point, such as a line, circle, or shape, setting the Emission Range parameter to 0 degrees keeps particles perpendicular to the emitter when they emerge.
      • Emit at Points: (for all shapes except Point) In one embodiment, a checkbox defines whether or not points on an edge of a shape emit particles. In one embodiment, if the Emit at Points parameter is turned off, particles may emerge from anywhere along the edge of the shape. In one embodiment, if the Emit at Points parameter is turned on, particles may emerge from a limited number of locations on the edge of the shape, as defined by the Points parameter (below).
      • Points: (Line, Circle, Geometry) In one embodiment, a slider defines the number of points on the edge of the currently selected Emitter Shape that emit particles. In one embodiment, emitter points are distributed evenly along the edge of the shape. In one embodiment, this parameter is available for Line, Circle, and Geometry Emitter Shapes.
      • Grid X: (Filled Circle, Image) In one embodiment, a slider specifies the horizontal number of emitter points on a grid that is overlaid on the selected Emitter Shape. In one embodiment, in the case of an irregular shape (non-rectangular), grid points that fall outside the shape are ignored. In one embodiment, this parameter is available for Filled Circle and Image Emitter Shapes.
      • Grid Y: (Filled Circle, Image) In one embodiment, a slider specifies the vertical number of emitter points on a grid that is overlaid on the selected Emitter Shape. In one embodiment, in the case of an irregular shape (non-rectangular), grid points that fall outside the shape are ignored. In one embodiment, this parameter is available for Filled Circle and Image Emitter Shapes.
      • Render Order: In one embodiment, a pop-up menu determines whether new particles are drawn on top of, or underneath, particles that have already been generated. In one embodiment, there are two options:
        • Oldest First: New particles appear on top of older particles.
        • Oldest Last: New particles appear underneath older particles.
      • Interleave Particle Cells: In one embodiment, turning on this checkbox blends particles generated from multiple cells together. In one embodiment, turning off this checkbox layers particles in the same order as the nested cells that generate them.
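  • Render Order and Interleave Particle Cells can both be viewed as choices of sort key applied to the live particles before they are drawn back to front. A hypothetical sketch:

        def draw_order(particles, oldest_first=True, interleave=False):
            # Each particle is a dict with "birth" (frame number) and "cell" (layer index).
            # Interleaved: sort purely by age, so particles from different cells blend.
            # Not interleaved: keep cell layering first, then age within each cell.
            if interleave:
                key = lambda p: p["birth"]
            else:
                key = lambda p: (p["cell"], p["birth"])
            # Oldest First draws older particles earlier, so new ones land on top.
            return sorted(particles, key=key, reverse=not oldest_first)

        particles = [{"birth": 3, "cell": 0}, {"birth": 1, "cell": 1}, {"birth": 2, "cell": 0}]
        back_to_front = draw_order(particles, oldest_first=True, interleave=True)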
  • In one embodiment, the following parameters are available for Single-Cell Emitters and Particle Elements:
      • Birth Rate: In one embodiment, a slider defines how much to increase or decrease the Birth Rate of a cell in a system. In another embodiment, the Birth Rate parameter defines how many particles emerge from an emitter every second. In yet another embodiment, higher values create denser particle effects. FIG. 132 illustrates a particle system with a lower birth rate, according to one embodiment of the invention. FIG. 133 illustrates the particle system of FIG. 132 but with a higher birth rate, according to one embodiment of the invention.
      • Birth Rate Range: In one embodiment, a slider defines an amount of variance in the Birth Rate of particles generated. In one embodiment, a value of 0 results in no variance; i.e., particles emerge from the emitter at the same rate. In another embodiment, a value greater than 0 introduces a variance defined by the Birth Rate parameter, plus or minus a predetermined random value falling within the Birth Rate Range (see the sketch following this list).
      • Initial Number: In one embodiment, a slider defines how much to increase or decrease the Initial Number of every cell in a system. In one embodiment, this parameter determines how many particles appear at the first frame of a particle effect. In another embodiment, the result is an initial burst of particles that eventually even out according to the Birth Rate parameter. FIG. 134 illustrates a particle system with a higher initial number, according to one embodiment of the invention. FIG. 135 illustrates the particle system of FIG. 134 but with a lower initial number, according to one embodiment of the invention.
      • Life: In one embodiment, a slider defines how much to increase or decrease the duration over which a cell lasts. In one embodiment, the Life parameter determines how long a particle lasts before vanishing from existence, similar to how sparks disappear after flying away from a sparkler. In another embodiment, unless the Color Over Life or Opacity Over Life parameters are used to fade each particle out over its life, a particle will immediately vanish at the end of its lifetime. FIG. 136 illustrates a particle system with a longer life, according to one embodiment of the invention. FIG. 137 illustrates the particle system of FIG. 136 but with a shorter life, according to one embodiment of the invention.
      • Life Range: (single cell emitter/particle cell parameter) In one embodiment, a slider defines an amount of variance in the Life parameter of generated particles. In one embodiment, a value of 0 results in no variance; i.e., particles from the selected cell emerge with the same lifetime. In another embodiment, a value greater than 0 introduces a variance defined by the Life parameter, plus or minus a predetermined random value falling within the Life Range.
      • Speed: In one embodiment, a slider defines how much to increase or decrease the Speed of a cell in a system. In one embodiment, this determines how quickly each particle flies away from the emitter. In another embodiment, the Speed parameter, in conjunction with the Lifetime and Birth Rate parameters, determines how many particles appear in the Canvas at any given frame. In yet another embodiment, the Speed parameter is similar to one of the functions of the visual Emission control in the Dashboard.
      • Speed Range: (single cell emitter/particle cell parameter) In one embodiment, a slider defines an amount of variance in the Speed parameter of generated particles. In one embodiment, a value of 0 results in no variance; i.e., particles from the selected cell emerge with the same speed. In another embodiment, a value greater than 0 introduces a variance defined by the Speed parameter, plus or minus a predetermined random value falling within the Speed Range.
      • Angle: (single cell emitter/particle cell parameter) In one embodiment, a dial defines an angle of rotation, in degrees, with which new particles are created.
      • Angle Range: (single cell emitter/particle cell parameter) In one embodiment, a dial defines an amount of variance in the Angle parameter of generated particles.
      • Spin: In one embodiment, a dial animates particles in a system by spinning an individual particle around its center. In one embodiment, adjustments to this control are in degrees per second.
      • Spin Range: (single cell emitter/particle cell parameter) In one embodiment, a dial defines an amount of variance in the Spin parameter for a generated particle. In another embodiment, a value of 0 results in no variance; i.e., particles from the selected cell spin at the same rate. In yet another embodiment, a value greater than 0 introduces a variance defined by the Spin parameter, plus or minus a predetermined random value falling within the Spin Range.
      • Additive Blend: (single cell emitter/particle cell parameter) In one embodiment, by default, particles are composited together using the blending mode specified in a Properties tab of a particle system's emitter object. When Additive Blend is on, all overlapping generated particles are composited together using the Additive blending mode. In one embodiment, this compositing occurs in addition to whichever compositing method is already being used. In another embodiment, the result is that the brightness of overlapping objects is intensified. FIG. 138 illustrates a particle system with the Additive Blend parameter turned off, according to one embodiment of the invention. FIG. 139 illustrates a particle system with the Additive Blend parameter turned on, according to one embodiment of the invention.
      • Premultiplied: (single cell emitter/particle cell parameter) In one embodiment, if a cell's source image has a premultiplied alpha channel, the Premultiplied parameter may be turned on to eliminate any fringing that appears around the edge of particles generated from this object.
      • Color Mode: (single cell emitter/particle cell parameter) In one embodiment, a pop-up menu determines if and how particles will be tinted. In one embodiment, there are five options:
        • Original: In one embodiment, particles are generated using their original colors. In one embodiment, when Original is chosen, the Opacity Over Life parameter appears, which is a gradient control that allows changes to the opacity of particles to be animated over time.
        • Solid: In one embodiment, particles are tinted using the color specified in the Color parameter. In one embodiment, additional parameters appear, such as Color and Opacity Over Life. FIG. 140 illustrates a particle system with a Solid Color Mode, according to one embodiment of the invention.
        • Over Life: In one embodiment, particles are tinted based on their age. In one embodiment, a gradient control defines the range of color that a particle assumes as it ages, beginning with the leftmost color in the gradient, and progressing through the range of colors until finally reaching the rightmost color at the end of its life. In another embodiment, an additional grayscale gradient control at the top functions as an Opacity Over Life control. FIG. 141 illustrates a particle system with an Over Life Color Mode, according to one embodiment of the invention.
        • Range: In one embodiment, particles are tinted at random, and the range of possible colors is defined by a color gradient control. FIG. 142 illustrates a particle system with a Range Color Mode, according to one embodiment of the invention.
        • Take Image Color: In one embodiment, when this checkbox is turned on, a new particle's color is based on the color of the image at the position of the emitter point from which the particle was generated. FIG. 143 illustrates a particle system with a Take Image Color Mode, according to one embodiment of the invention.
      • Scale: In one embodiment, a slider defines how much to increase or decrease the Scale of a cell in a system. In one embodiment, the Scale parameter defines how large each particle in the system is. In another embodiment, opening the disclosure triangle of the Scale parameter reveals separate X Scaling and Y Scaling sub-parameters, which can be optionally used to scale the width and height of generated particles separately. FIG. 144 illustrates a particle system with a larger Scale parameter, according to one embodiment of the invention. FIG. 145 illustrates the particle system of FIG. 144 but with a smaller Scale parameter, according to one embodiment of the invention.
      • Scale Range: (single cell emitter/particle cell parameter) In one embodiment, a slider defines an amount of variance in the Scale parameter of generated particles. In one embodiment, a value of 0 results in no variance; i.e., particles from the selected cell emerge with the same size. In another embodiment, a value greater than 0 introduces a variance defined by the Scale parameter, plus or minus a predetermined random value falling within the Scale Range. In yet another embodiment, opening the disclosure triangle of the Scale parameter reveals separate X and Y sub-parameters, which can be used to set the width and height of the Scale Range separately.
      • Attach to Emitter: (single cell emitter/particle cell parameter) In one embodiment, a checkbox determines if particles follow the position of the emitter when the emitter is animated with keyframes or behaviors. In one embodiment, if the Attach to Emitter parameter is turned off, a particle follows its own path after being emitted, resulting in a trail of particles along the motion path the emitter is following. In another embodiment, if the Attach to Emitter parameter is turned on, then generated particles follow along with the emitter, surrounding the emitter in a moving cloud of particles.
      • Show Particles As: (single cell emitter/particle cell parameter) In one embodiment, by default, the Show Particles As parameter is set to Image, which displays each particle as a duplicate of the object being used as the particle system's cell. In one embodiment, the particles in a system may be viewed in one of a variety of preview modes. In another embodiment, these modes play more efficiently when viewing a complex particle system and also provide other ways of analyzing particle motion. In yet another embodiment, there are four options to choose from:
        • Point: In one embodiment, each particle 770 is represented by a single point 1460. In one embodiment, this is the fastest preview mode and is useful for displaying the type and speed of particle motion in a system. FIG. 146 illustrates a particle system with a Point Show Particles As parameter, according to one embodiment of the invention.
        • Line: In one embodiment, each particle 770 is represented by a line 1470. In one embodiment, this is a good preview mode to use to analyze the vector of each particle's motion. In another embodiment, the length of a line represents a particle's speed, and the angle of a line represents a particle's direction. FIG. 147 illustrates a particle system with a Line Show Particles As parameter, according to one embodiment of the invention.
        • Outline: In one embodiment, a particle 770 is represented by a bounding box 1480. In one embodiment, because a bounding box is a good indicator of a particle's orientation in a system, this preview mode is useful for evaluating the movements of individual particles. In one embodiment, for example, it's easy to see the angle of rotation for a particle that is spinning or following a complex motion path. FIG. 148 illustrates a particle system with an Outline Show Particles As parameter, according to one embodiment of the invention.
        • Image: In one embodiment, this is the final particle system effect. FIG. 149 illustrates a particle system with an Image Show Particles As parameter, according to one embodiment of the invention.
      • Random Seed: (single cell emitter/particle cell parameter) In one embodiment, although a particle system seems random, it is actually deterministic. In one embodiment, this means that the random variation in a particle system is created based on the random seed number shown here. In another embodiment, unless this seed number is changed, a particle system with the same parameter settings will play back with the same motion. In yet another embodiment, in order to change the current random motion or distribution of the particle system, change the seed number by typing in a new number or clicking Generate. In one embodiment, this changes the random calculations performed for that system.
      • Particle Shape, or individual cells: In one embodiment, in a particle system with multiple cells, a cell appears at the bottom of the Emitter tab. In one embodiment, a cell parameter has a checkbox that can be used to enable or disable that cell, a name field, and an image well for that object.
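  • As referenced above, each "Range" parameter in this list (Birth Rate Range, Life Range, Speed Range, Spin Range, Scale Range) can be read as a symmetric random variance around its base parameter, with all randomness derived from the cell's Random Seed. One plausible sketch, with hypothetical names:

        import random

        def vary(base, value_range, rng):
            # A Range of 0 means no variance; otherwise each particle gets the base
            # value plus or minus a random amount falling within the range.
            return base + rng.uniform(-value_range, value_range)

        rng = random.Random(10000)   # the cell's Random Seed: same seed, same variation
        life = vary(base=100.0, value_range=20.0, rng=rng)
        speed = vary(base=100.0, value_range=50.0, rng=rng)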
  • Additional Cell Parameters for Animation or Video Clips—In one embodiment, if a particle system uses an animation or video clip as a cell, additional parameters are available. In one embodiment, these parameters are:
      • Animate Image: In one embodiment, a checkbox controls playback looping. In one embodiment, if the Animate Image parameter is turned on, it loops the playback of the animation or video clip used to generate each particle. In another embodiment, if the Animate Image parameter is turned off, particles are generated using a still frame specified by a Hold Frames slider.
      • Random Start Frame: In one embodiment, a checkbox introduces variation into animated particles generated from animation or video clips. If the Random Start Frame parameter is turned on, a newly generated particle begins at a different frame of the animation.
      • Hold Frames: In one embodiment, a Hold Frames parameter overrides the automatic animation that occurs for animation or video clips being used as particle cells. In one embodiment, setting the hold frames parameter to a value other than 0 chooses a still frame from the source animation or video clip to use to generate particles.
      • Hold Frames Range: In one embodiment, a Hold Frames Range parameter varies the frame that's chosen to generate unanimated particles based on a source animation or video clip.
  • In one embodiment, if Random Start Frame is turned off, the following parameter appears:
      • Source Start Frame: In one embodiment, a slider defines which frame of an animation to use as the still frame.
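  • These clip-related parameters select which source frame each particle displays. A simplified sketch (in a real system the random start frame would be chosen once at each particle's birth; the names here are assumptions):

        import random

        def source_frame(particle_age, clip_length, animate=True,
                         random_start=False, hold_frame=0, rng=None):
            # Animated particles loop the source clip over each particle's life;
            # otherwise every particle shows a single held frame.
            if not animate:
                return hold_frame
            start = rng.randrange(clip_length) if random_start else 0
            return (start + particle_age) % clip_length   # looping playback

        rng = random.Random(3)
        frame = source_frame(particle_age=37, clip_length=24, random_start=True, rng=rng)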
  • Additional Cell Parameters Based on the Selected Color Mode—In one embodiment, an option in the Color Mode pop-up menu displays a different set of parameters, based on the option.
      • Color: In one embodiment, a color may be specified that will be used to tint particles from a cell in a system. In one embodiment, a particle's alpha channel may also be modified, altering its opacity. In another embodiment, changes made to this parameter do not take effect until the Emitter Tint Amount slider is set to a value other than 0 percent. In yet another embodiment, this parameter is unique to the emitter object. In one embodiment, a color may be chosen by, for example, clicking a color control to choose a color from a color picker or opening a disclosure triangle and using Red, Green, Blue, and Alpha channel sliders.
      • Opacity Over Life: (Original, Solid) In one embodiment, a gradient control changes the opacity of a particle based on the particle's age (see the sketch following this list). In one embodiment, this gradient control is limited to grayscale values, which are used to represent varying levels of transparency. In another embodiment, white represents solid particles, progressively darker levels of grey represent decreasing opacity, and black represents complete transparency. In yet another embodiment, a simple white to black gradient represents a particle that is solid when first generated and that fades out gradually over its lifetime until finally vanishing at the end. In one embodiment, the Opacity Over Life parameter has four controls:
        • Gradient Favorites pop-up menu: In one embodiment, a Gradient Favorites pop-up menu displays favorite gradients that a user has saved. In one embodiment, choose a gradient from this menu to load it into a Gradient control.
        • Alpha Gradient control: In one embodiment, to add a new color to a gradient, click anywhere within the gradient bar to create a new color tag. In one embodiment, color tags in an Alpha Gradient control are limited to shades of grey. In another embodiment, click a gradient tag to select it and use an Opacity slider to change its color. In yet another embodiment, to change the distribution of color, drag a selected gradient tag along the gradient bar or select a gradient tag and use the Location slider. In one embodiment, change the spread of color between each segment between two gradient tags using the triangles. In another embodiment, to delete a gradient tag, drag it up off of the gradient bar until it disappears.
        • Opacity slider: In one embodiment, an Opacity slider changes the shade of the selected gradient tag, from 100 (solid/white) to 0 (transparent/black).
        • Location slider: In one embodiment, a location slider changes the location of the selected gradient tag relative to the gradient bar. A gradient tag may also be dragged directly to slide it along the gradient control.
      • Color Over Life: (Over Life) In one embodiment, a gradient control changes the color of a generated particle based on its age. In one embodiment, when born, a particle is tinted with the leftmost color in the gradient. In another embodiment, over its life, its color changes through the range of the gradient, from left to right, until finally reaching the rightmost color at the end of its life. In yet another embodiment, similar to the Opacity Over Life parameter, the Color Over Life parameter has five controls:
        • Gradient Favorites pop-up menu: In one embodiment, a Gradient Favorites pop-up menu displays favorite gradients that a user has saved. In one embodiment, choose a gradient from this menu to load it into the Gradient control.
        • Alpha Gradient control: In one embodiment, a gradient control changes the opacity of a generated particle based on its age. In one embodiment, color tags are limited to shades of grey.
        • Color Gradient control: In one embodiment, a gradient control tints a particle based on its age.
        • Color control: In one embodiment, when a color tag is selected in the Color Gradient control, its color may be changed by clicking the Color control and choosing a color using the Color Picker.
        • Opacity slider: In one embodiment, when a color tag is selected in the Alpha Gradient control, an Opacity slider may be used to change its shade, from 100 (solid/white) to 0 (transparent/black).
        • Location slider: In one embodiment, a Location slider changes the location of the selected gradient tag in either gradient control relative to the gradient bar. A gradient tag may also be dragged directly to slide it along the gradient control.
      • Color Range: (Range) In one embodiment, a gradient control defines a range of colors used to randomly tint new particles. In one embodiment, the number of colors that appear within the gradient is relevant but the direction of the gradient colors is not. In another embodiment, the Color Range parameter has the same controls as the Color Over Life parameter.
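  • As noted above, each "Over Life" gradient control reduces to a lookup keyed on the particle's normalized age, 0 at birth and 1 at death. A minimal sketch with linear interpolation between gradient tags (the data layout is an assumption):

        def gradient_lookup(tags, t):
            # tags: list of (location, value) pairs sorted by location in [0, 1];
            # "value" is an opacity for Opacity Over Life or an (r, g, b) tuple
            # for Color Over Life.
            t = max(0.0, min(1.0, t))
            for (loc0, v0), (loc1, v1) in zip(tags, tags[1:]):
                if loc0 <= t <= loc1:
                    f = (t - loc0) / (loc1 - loc0) if loc1 > loc0 else 0.0
                    if isinstance(v0, tuple):
                        return tuple(a + f * (b - a) for a, b in zip(v0, v1))
                    return v0 + f * (v1 - v0)
            return tags[-1][1]

        # A white-to-black alpha gradient: solid at birth, fading over the particle's life.
        opacity_over_life = [(0.0, 1.0), (1.0, 0.0)]
        alpha = gradient_lookup(opacity_over_life, t=0.25)   # 0.75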
  • c. Particle Cell Parameters
  • In one embodiment, parameters in the Particle Cell tab 1500 control the behavior of an individual particle 770 that is generated by the system, independently of the parameters governing the emitter 800. In one embodiment, in particle systems with multiple cells, a cell has its own particle cell parameters 1502. In another embodiment, this enables the creation of a particle system made up of many kinds of particles, each with distinctly different behaviors.
  • In one embodiment, to open a cell's Particle Cell tab 1500:
      • Select a cell nested underneath an emitter in the Layers tab or Timeline.
      • Open the Inspector, and click the Particle Cell tab.
        The Particle Cell parameters will appear. FIG. 150 illustrates a Particle Cell tab, according to one embodiment of the invention.
  • v. Using Multiple Cells Within a Single Emitter
  • In one embodiment, a particle system may use multiple cells. In one embodiment, a particle system may emit different kinds of overlapping particles by nesting multiple cells inside of a single emitter. In another embodiment, any number of cells may be nested within a single emitter object. In yet another embodiment, a cell has its own particle cell parameters, which govern how particles from that cell are created. In one embodiment, a particle system with multiple cells generates particles from each cell simultaneously, according to each cell's parameters.
  • In one embodiment, to nest an additional cell within an emitter:
      • In one embodiment, select an object 12 to use as a cell 760, and drag it to a position in the Layers tab 14 or Timeline 16 directly underneath the Emitter 800 to nest the new cell 760 inside the emitter 800. In one embodiment, as a user moves the object 12, a position indicator 1510 appears underneath the object 12 that indicates its new position. FIG. 151 illustrates an object that is being dragged to a position in the Layers tab, according to one embodiment of the invention.
      • In one embodiment, when the object has been dragged to the desired position within the emitter hierarchy, release the mouse button. In one embodiment, the object that was dragged now appears nested within the emitter object. FIG. 152 illustrates the object of FIG. 151, now nested within an emitter, according to one embodiment of the invention.
  • In one embodiment, in a particle system with multiple cells, the Interleave Particles parameter determines how particles generated from the different cells blend together.
  • vi. Animating Objects in Particle Systems
  • In one embodiment, any Emitter or Cell parameter in a particle system can be animated by using Parameter Behaviors or by keyframing the parameter directly. In one embodiment, if an emitter-specific parameter is animated, such as Emission Angle and Emission Range, the position and distribution of new particles generated by that emitter are animated. In another embodiment, animation occurs relative to the duration of the emitter. In one embodiment, when a cell parameter is animated, on the other hand, the actual duration of the original behavior or keyframes is ignored. In another embodiment, the resulting animation is instead scaled to fit the Life parameter of each generated particle. In yet another embodiment, if the Life parameter is increased or decreased, the keyframed animation will scale to the new duration of each particle.
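  • The retiming of cell-parameter animation described here can be modeled by normalizing the original keyframe times into each particle's lifetime. A hypothetical sketch:

        def evaluate_cell_parameter(keyframes, particle_age, particle_life):
            # keyframes: (time, value) pairs spanning the ORIGINAL animation's duration.
            # The animation is rescaled so its full range fits the particle's Life,
            # regardless of how long the original behavior or keyframes lasted.
            t0, t1 = keyframes[0][0], keyframes[-1][0]
            t = t0 + (particle_age / float(particle_life)) * (t1 - t0)
            for (ka, va), (kb, vb) in zip(keyframes, keyframes[1:]):
                if ka <= t <= kb:
                    f = (t - ka) / (kb - ka) if kb > ka else 0.0
                    return va + f * (vb - va)
            return keyframes[-1][1]

        # A 30-frame keyframed animation stretched over a 100-frame particle life:
        value = evaluate_cell_parameter([(0, 0.0), (30, 1.0)],
                                        particle_age=50, particle_life=100)
        # 0.5: halfway through the particle's life, halfway through the animation.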
  • a. Animating Emitters and Cells
  • In one embodiment, animating an emitter's Property tab parameters is useful for altering the position and geometric distribution of a particle system over time. In one embodiment, keyframing an emitter object's Position parameter moves the source of newly emitted particles without affecting any particles that were generated at previous frames, which creates a trail of particles. In another embodiment, keyframing an emitter's Emitter tab parameters is a good way to modify the particle system's overall characteristics over time, such as increasing or decreasing the size, speed, or lifetime of newly generated particles.
  • vii. Using Behaviors with Particle Systems
  • In one embodiment, adding behaviors to a particle system's emitter, or to the cells themselves, can quickly achieve sophisticated, organic effects with very little effort. In one embodiment, behaviors may be added to a particle system's emitter, or to the cells themselves.
  • a. Applying Behaviors to Emitters
  • In one embodiment, when a Basic Motion behavior is applied to an emitter, the position of the source of all new particles generated by that system is affected. In one embodiment, once an individual particle emerges, it is unaffected by changes to the position of the emitter, so moving the emitter around the screen using behaviors results in the creation of a trail of particles that behave according to their particle cell parameters. In another embodiment, this behavior can be overridden by turning on a cell's Attach to Emitter parameter.
  • In one embodiment, to apply a behavior to an emitter, drag a behavior from the Library onto an emitter in the Canvas, Layers tab, or Timeline. In one embodiment, the behavior is applied to the emitter, which begins to move according to the parameters of the behavior.
  • b. Applying Behaviors to Cells
  • In one embodiment, a behavior that is applied directly to a cell is in turn applied to individual particles generated from that cell. In one embodiment, this can result in some extremely complex interactions as dozens of particles weave and collide according to the defined behaviors. In another embodiment, a behavior applied to a Cell has no effect on the position of the Emitter.
  • In one embodiment, to apply a behavior to a cell, drag a behavior from the Library to a cell in the Layers tab or Timeline. In one embodiment, the behavior is applied to the cell, and all particles generated from that cell begin to move according to the parameters of the behavior.
  • The Particle Behavior Category—In one embodiment, there's a category that contains a behavior specifically for use with the cells in a particle system. In one embodiment, the Particles category contains the Scale Over Life behavior. In another embodiment, this behavior grows or shrinks a particle in a system over the duration of the particle's life. In yet another embodiment, the Scale Over Life behavior has two parameters:
      • Increment Type: In one embodiment, choose which method is used to resize particles generated with a particle effect. In one embodiment, there are three options:
      • Rate: In one embodiment, Rate specifies a steady rate at which particles change size over their entire lifetime. In one embodiment, a Scale Rate parameter appears, allowing the user to define how quickly each particle changes size. In one embodiment, positive values grow particles over time, while negative values shrink particles over time.
      • Birth and Death Values: In one embodiment, Birth and Death Values specify starting and ending scale percentages that are used to animate each particle's size over its lifetime. In one embodiment, two parameters appear when this option is selected. In one embodiment, Scale at Birth determines the initial size of particles when they are first created. In one embodiment, Scale at Death determines the size each particle changes to at the end of its lifetime.
      • Custom: In one embodiment, Custom reveals the Custom Scale parameter, which allows a user to set the size of each particle generated by a cell. In one embodiment, a user can apply a parameter behavior to this parameter to create different animated effects.
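  • The three increment types above might be sketched as follows; the function and parameter names are hypothetical, not the patented implementation:

        def scale_over_life(mode, age, life, scale_rate=0.0,
                            scale_at_birth=100.0, scale_at_death=100.0,
                            custom=None):
            # Returns the particle's scale (in percent) at a given age.
            if mode == "rate":
                # Steady change: positive rates grow particles, negative rates shrink them.
                return 100.0 + scale_rate * age
            if mode == "birth_and_death":
                # Interpolate between starting and ending scale over the lifetime.
                f = age / float(life)
                return scale_at_birth + f * (scale_at_death - scale_at_birth)
            if mode == "custom":
                # Custom Scale: driven directly (e.g., by a parameter behavior).
                return custom(age) if callable(custom) else custom
            raise ValueError("unknown increment type: " + mode)

        s = scale_over_life("birth_and_death", age=25, life=100,
                            scale_at_birth=100.0, scale_at_death=0.0)   # 75.0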
  • viii. Applying Filters to Particle Systems
  • In one embodiment, a filter may be applied only to a particle system's emitter. In one embodiment, as a result, a filter affects an entire particle system, including every cell, as if it were a single object. In another embodiment, an individual cell cannot have a separate filter applied to it. FIG. 153 illustrates a particle system, according to one embodiment of the invention. FIG. 154 illustrates the particle system of FIG. 153 after a Sphere filter has been applied, according to one embodiment of the invention.
  • ix. Particle System Examples
  • This section presents three examples of how to use particle systems to create very different effects, according to one embodiment of the invention.
  • a. Example 1 Creating an Animated Background
  • In this first example, an animated background is created using a single still image, according to one embodiment of the invention. In one embodiment, by using parameters available in the Emitter tab, a single image can be turned into a complex animated texture.
  • In one embodiment, to create an animated background from a single image:
      • In one embodiment, drag a file to use into the Canvas. This example uses a simple graphic 1550 with a premultiplied alpha channel, according to one embodiment of the invention. FIG. 155 illustrates a simple graphic with a premultiplied alpha channel, according to one embodiment of the invention.
      • In one embodiment, with the new object selected, click the Emitter button 1560 in the Toolbar to turn it into an emitter (or press the E key). FIG. 156 illustrates an Emitter button, according to one embodiment of the invention. In one embodiment, the original object is replaced with an Emitter, but nothing happens yet because the Playhead is at the first frame of the project, and no particles have been created yet.
      • In one embodiment, open the Inspector, and choose Filled Circle from the Emitter Shape pop-up menu. In one embodiment, set the Initial Number parameter to 12. In another embodiment, this creates a distributed group of particles 770 that partially fills the Canvas. FIG. 157 illustrates a distributed group of particles that partially fills the Canvas, according to one embodiment of the invention.
      • In one embodiment, to turn the particles into a uniform abstract mass, adjust the following parameters:
        • In one embodiment, set Life to 4.
        • In one embodiment, set Speed to 140.
        • In one embodiment, turn the Spin dial to 60.
        • In one embodiment, turn the Spin Range dial to 15.
        • In one embodiment, turn on Additive Blend.
        • In one embodiment, set Color Mode to Pick From Range.
        • In one embodiment, set Scale to 65.
        • In one embodiment, set Scale Range to 150.
        • In one embodiment, set Random Seed to 10000.
          In one embodiment, advance to frame 100. FIG. 158 illustrates the resulting image, according to one embodiment of the invention.
      • In one embodiment, an additional step might be to apply a filter to the emitter. In this example, adding the Crystallize filter creates an even more abstract effect, according to one embodiment of the invention. In one embodiment, color correction may also be applied to make the background fit more appropriately with the foreground elements. FIG. 159 illustrates the resulting image, according to one embodiment of the invention.
    b. Example 2 Creating Animated Pixie Dust
  • In this example, a particle system is created that uses two different cells to generate a streak of particles that trails behind another animated object, according to one embodiment of the invention. In one embodiment, using two cells adds more variation to a particle system than can be achieved with a single set of cell parameters.
  • In one embodiment, to create a two-celled particle system that trails:
      • In one embodiment, drag a first graphics file into the Canvas. This example uses a small graphic 1600 of a lens flare against black, with a built-in alpha channel, according to one embodiment of the invention. FIG. 160 illustrates the resulting image, according to one embodiment of the invention.
      • In one embodiment, while the object is selected, click the Emitter button in the Toolbar to turn it into an emitter (or press the E key). In one embodiment, the original object is replaced with an Emitter, but nothing happens yet because the Playhead is at the first frame of the project, and no particles have been created. In another embodiment, move the Playhead forward five seconds to view the particle system at a frame where more particles have been generated. In yet another embodiment, this allows the particle system to be viewed in action without having to play it. FIG. 161 illustrates the resulting image, according to one embodiment of the invention.
      • In one embodiment, to create a variety of particles, nest an additional image into the emitter that was just created. In one embodiment, the easiest way to do this is to open the Layers tab, and drag each additional file to use underneath the emitter. FIG. 162 illustrates the resulting image, according to one embodiment of the invention.
      • In one embodiment, to make the particles generated by different cells mingle together, select the emitter, open the Inspector, and turn the Interleave Particle Cells parameter on. FIG. 163 illustrates the resulting image, according to one embodiment of the invention.
      • In one embodiment, select the topmost cell in the Layers tab to adjust its parameters, which automatically appear in the Inspector.
      • In one embodiment, adjust the Scale slider to 45 to reduce the size of the particles generated by this cell. FIG. 164 illustrates the resulting image, according to one embodiment of the invention.
      • In one embodiment, change the color of the particles generated by this cell by doing the following:
        • In one embodiment, choose Solid from the Color Mode pop-up menu.
        • In one embodiment, click the color control in the Color parameter that appears.
        • In one embodiment, choose a color in the Color Picker window that appears. In this example, a light red is used, according to one embodiment of the invention.
        • In one embodiment, close the Color Picker window.
          In one embodiment, particles generated by that cell are now small and red. FIG. 165 illustrates the resulting image, according to one embodiment of the invention.
      • In one embodiment, use the Opacity Over Life gradient to make this cell's particles fade out over their lives. In one embodiment, for a simple fade-out, use one of the gradient favorites that exist in the system. FIG. 166 illustrates the resulting image, according to one embodiment of the invention.
      • In one embodiment, to make these particles spin as they move away, turn the dial in the Spin parameter clockwise, to 60 degrees.
      • In one embodiment, adjust the second cell's parameters. In one embodiment, in the Layers tab, select the second cell of the particle system. The second cell's parameters automatically appear in the Inspector.
      • In one embodiment, adjust the Scale slider to 125.
      • In one embodiment, follow the procedure in Step 8 to make these particles light yellow.
      • In one embodiment, to make the particles generated from this cell spin in the opposite direction, turn the dial in the Spin parameter counter-clockwise, to −60. FIG. 167 illustrates the resulting image, according to one embodiment of the invention.
      • In one embodiment, to create a trail of particles, the emitter is animated to follow the required motion path. FIG. 168 illustrates the resulting image, according to one embodiment of the invention.
  • x. Saving Custom Particle Effects to the Library
  • In one embodiment, a particle system can be saved as a particle preset in the Favorites folder of the Library for future use. In one embodiment, once a particle system has been saved in the Library, it can be used just like any other particle preset.
  • In one embodiment, to save a particle system to the particle library:
      • In one embodiment, open the Library and select either the Favorites or Favorites Menu categories.
      • In one embodiment, drag the emitter object to be saved, along with any custom objects used by that emitter, into the stack at the bottom of the Library. In one embodiment, for organizational purposes, it may be useful to create a new folder in the Favorites or Favorites Menu categories in which to put created particle systems.
  • In one embodiment, when a particle preset is saved, the particle preset is saved as a file. In one embodiment, any custom objects used to create the particle system that were stored in the library appear in the same directory as this file. In another embodiment, particle presets that have been created may be copied from this location to give to other users, or particle presets received from other users can be added to this same directory. In yet another embodiment, whenever a particle preset file is copied, any graphics or video clips used by the particle preset should also be copied.
  • Setting Parameters of Behaviors
  • In one embodiment, a visual effect, from a behavior to a particle system to a gradient, is controlled by a collection of parameters that modify the various attributes for that effect. In one embodiment, for example, a Blur filter has an amount slider that controls how much blur is applied. In another embodiment, a system may contain thousands of parameters. In yet another embodiment, many different types of controls may be used to set these parameters. These controls may include, for example, sliders, dials, and shortcut menus.
  • In one embodiment, even objects without effects applied to them have many parameters that can be modified to alter the nature of the object and how it appears in a project. In one embodiment, these parameters include the object's scale, opacity, and position on screen, as well as more obscure attributes such as its pixel aspect ratio or field dominance.
  • A. The Inspector
  • In one embodiment, parameters that control a visual effect are accessed in an Inspector. In one embodiment, the Inspector contains four tabs, each of which contains a set of parameters for the selected object. In another embodiment, the first three tabs, Properties, Behaviors, and Filters, are present for any selected object. In yet another embodiment, the fourth tab, generically called the Object tab, changes its name and contents depending on the type of object selected.
  • i. Type of Controls
  • In one embodiment, there are eleven different types of controls that may appear in the Inspector. In one embodiment, a control provides the opportunity to change the value of a parameter in a special way. In another embodiment, since different types of objects and effects require different parameters, selecting different objects or effects will cause different controls to populate the Inspector.
  • In one embodiment, the various types of controls include:
  • Slider—In one embodiment, dragging the thumb 1690 of a slider 1692 changes the value of the parameter. In one embodiment, typically, dragging to the right increases the value and dragging to the left decreases the value. In another embodiment, an example of a parameter that uses a slider is Scale. FIG. 169 illustrates one example of a slider, according to one embodiment of the invention.
  • Value Slider—In one embodiment, a Value Slider 1700 is a special type of slider that includes the numerical value of the parameter in the control. In one embodiment, dragging the middle area 1702 (where the number is) works just like an ordinary slider; i.e., dragging to the right increases the value and dragging to the left decreases the value. In another embodiment, some parameters allow a value slider to increase or decrease the value indefinitely. In yet another embodiment, additionally, a user can click the Increment 1704 or Decrement 1706 arrows to change the value one step at a time. In one embodiment, a user can double-click the number itself to convert the slider 1700 into a value field so that he can type a specific number directly into the control. In another embodiment, an example of a parameter that uses a value slider is Position. FIG. 170 illustrates one example of a value slider, according to one embodiment of the invention.
  • Dial—In one embodiment, a Dial 1710 is used for values based on angles or degrees. In one embodiment, rotate the dial by dragging it in a clockwise or counter-clockwise motion. In another embodiment, a parameter that uses a dial is Rotation. FIG. 171 illustrates one example of a dial, according to one embodiment of the invention.
  • Value Field—In one embodiment, a Value Field 1720 allows direct entry of text to set the value of the parameter. In one embodiment, an example of a parameter that uses a value field is the Text Entry field. FIG. 172 illustrates one example of a value field, according to one embodiment of the invention.
  • Pop-up Menu—In one embodiment, a Pop-up Menu 1730 is a menu with preset values. In one embodiment, click the menu and choose the desired value. In another embodiment, an example of a pop-up menu is Throw Increment. FIG. 173 illustrates one example of a pop-up menu, according to one embodiment of the invention.
  • Value List—In one embodiment, a Value List 1740 is another type of shortcut menu. In one embodiment, a user can click the arrow 1742 to the right of the field to display preset values or he can type a value directly into the Value field 1744. In another embodiment, an example of a value list is Typeface. FIG. 174 illustrates one example of a value list, according to one embodiment of the invention.
  • Activation Checkbox—In one embodiment, an Activation Checkbox 1750 is an on/off toggle for a parameter. In one embodiment, an example of an Activation Checkbox is Preserve Opacity. FIG. 175 illustrates one example of an activation checkbox, according to one embodiment of the invention.
  • Color Well—In one embodiment, a Color Well 1760 enables a user to select a color. In one embodiment, the Color well can be used by clicking the box 1762, which opens the Colors window, by Control-clicking the box and picking a color from the pop-up picker 1770, or by clicking the disclosure triangle 1764 and manipulating the individual RGB 176A, 176B, 176C and A 176D sliders. In another embodiment, an example of a color well is Drop Shadow Color. FIG. 176 illustrates one example of a color well, according to one embodiment of the invention. FIG. 177 illustrates one example of a pop-up picker, according to one embodiment of the invention.
  • Gradient—In one embodiment, a Gradient 1780 enables a user to select a preset gradient style or create a new one. In one embodiment, when a Gradient is collapsed, a user can only choose an existing preset from the Preset shortcut menu 1782. In another embodiment, alternately, click the disclosure triangle 1784 to reveal the Gradient Editor 1786. In yet another embodiment, a user can set the gradient's opacity as well as its color values. FIG. 178 illustrates one example of a gradient, according to one embodiment of the invention.
  • Drop Well—In one embodiment, a Drop Well 1790 enables a user to drag an object 12 (e.g., a clip or still image) to provide input data for a type of effect. In one embodiment, for example, a bump map filter needs an image to provide the bumps, or a Repel From behavior needs to know which object to repel from. In another embodiment, an example of a Drop Well is the Attracted To behavior's Object parameter. FIG. 179 illustrates one example of a drop well, according to one embodiment of the invention.
  • Parameter Selection Field—In one embodiment, a Parameter Selection Field 1800 is a special type of shortcut menu, specifically for Parameter Behaviors. In one embodiment, when a Parameter Behavior is applied to an object, the user needs to identify which parameter the behavior should affect. In another embodiment, a user can either type the name of the parameter directly into the value field 1802, or he can choose from the Go shortcut menu 1804 (which lists all current parameters). In yet another embodiment, an example of the Parameter Selection Field is the Average behavior's Apply To parameter. FIG. 180 illustrates one example of a parameter selection field, according to one embodiment of the invention.
  • In one embodiment, in addition to the parameter control types listed above, several other controls are used within the Inspector tab. In one embodiment, these controls include:
  • Reset Button—In one embodiment, a Reset button 1810 automatically restores the parameter value (or, in some cases, an entire set of parameters) back to their default values. FIG. 181 illustrates one example of a reset button, according to one embodiment of the invention.
  • Manage Presets Button—In one embodiment, some parameter settings (e.g., Gradients and Type Styles) are so complex that they are commonly stored in presets. In one embodiment, whenever a Manage Presets Button 1820 is displayed, a user can save that particular parameter (or set of parameters) into a preset. In another embodiment, for example, the Text Style pane has a Manage Presets control at the top of the parameter list that allows a user to save styles, formats, or both. In yet another embodiment, this enables a user to save all of the settings in the window. In one embodiment, in some cases, a user can also use this control to load an existing preset. FIG. 182 illustrates one example of a manage presets button, according to one embodiment of the invention.
  • In one embodiment, to save a preset:
      • In one embodiment, set the parameter values to the settings to save.
      • In one embodiment, click the Manage Presets button, and then choose Save from the pop-up menu. In one embodiment, a dialog appears.
      • In one embodiment, type a name for the preset to save, then click OK.
        In one embodiment, the preset is now stored as a file on the hard disk. In one embodiment, it will appear in the Manage Presets menu in this and future projects until it is manually deleted in the Finder.
  • In one embodiment, to load an existing preset, click the Manage Presets button, and then choose the preset from the list in the pop-up menu. In one embodiment, the current parameter settings are replaced by the settings in the preset.
  • Animation Menu Button—In one embodiment, most parameters of an item are animatable. In one embodiment, this means that a user can assign specific values to certain frames (keyframes) so the parameter value changes over time. In another embodiment, a parameter that can be animated has an Animation Menu Button 1830 to the right of the parameter settings. In yet another embodiment, depending on the current condition of the parameter, the Animation Menu Button displays a different icon. FIG. 183 illustrates one example of an animation menu button, according to one embodiment of the invention.
  • In one embodiment, clicking on the Animation Menu Button displays a shortcut menu 1840 filled with Animation related controls. FIG. 184 illustrates one example of a shortcut menu filled with Animation related controls, according to one embodiment of the invention. In one embodiment, these menu items include:
  • Enable/Disable Animation—In one embodiment, the Enable/Disable Animation menu item 1842 remains dim until keyframing is applied to the parameter, either by using the Record button or by adding a keyframe. In one embodiment, once the parameter has some animation applied, the menu item is automatically renamed “Disable Animation.” In another embodiment, activating it at that point effectively hides the keyframes that have been set, restoring the parameter to its default value. In yet another embodiment, however, the keyframes are not thrown away. In one embodiment, choosing Enable Animation restores the channel to its last keyframed state.
  • Reset Parameter—In one embodiment, Reset Parameter 1843 removes all keyframes and settings for this parameter. In one embodiment, the parameter value is reset to its default value.
  • Add Keyframe—In one embodiment, Add Keyframe 1844 adds a keyframe at the current frame. In one embodiment, if the playhead is positioned on a frame where a keyframe has already been added, this menu item is dimmed.
  • Delete Keyframe—In one embodiment, Delete Keyframe 1845 deletes the current keyframe. In one embodiment, the Delete Keyframe command is available only if the playhead is positioned on a frame where a keyframe already exists.
  • Previous Keyframe—In one embodiment, Previous Keyframe 1846 moves the playhead to the previous keyframe for this parameter. In one embodiment, Previous Keyframe is available only if a keyframe exists earlier in the project.
  • Next Keyframe—In one embodiment, Next Keyframe 1847 moves the playhead to the next keyframe for this parameter. In one embodiment, Next Keyframe is available only if a keyframe exists later in the project.
  • Show In Keyframe Editor—In one embodiment, Show In Keyframe Editor 1848 opens the Keyframe Editor if it is not showing and displays the graph for the parameter that is being modified.
  • ii. Inspector Tabs
  • In one embodiment, the parameters in the inspector are grouped into four categories:
  • Properties—In one embodiment, the Properties tab contains basic attributes about the selected object, such as Transformation (e.g., position, scale, and rotation), Blending (e.g., opacity and blend mode), Drop Shadow controls, Corner Pinning, and the object's In and Out points.
  • Behaviors—In one embodiment, whenever a behavior is applied to an object, the parameters associated with that behavior appear in the Behaviors tab. In one embodiment, multiple behaviors are grouped by the behavior name.
  • Filters—In one embodiment, whenever a filter is applied to an object, the parameters associated with that filter appear in the Filters tab. In one embodiment, multiple filters are grouped by the filter name.
  • Object—In one embodiment, the title and contents of the Object tab change depending on what type of object is selected. In one embodiment, there are seven types of Object tabs, corresponding to seven types of objects.
      • Media—In one embodiment, a Media tab appears when a media object is selected. In one embodiment, the Media tab contains parameters that deal mostly with attributes of a file on disk or how a file is interpreted. In another embodiment, because multiple objects can point to a single media file, the Inspector Media tab contains a list of linked objects including the name of the layer where the objects exist. In yet another embodiment, making changes in this tab affects all objects that refer to the selected media file.
      • Text—In one embodiment, a Text tab appears when a text object is selected. In one embodiment, a Text tab contains controls that affect the text object. In another embodiment, the Text tab is divided into three panes: Format, Style and Layout.
        • Format—In one embodiment, the Format pane contains standard type controls such as font, size, tracking, and kerning. In one embodiment, the Format pane also contains a large text entry box where a user can edit the actual contents of the text.
        • Style—In one embodiment, the Style pane controls the color, texture, and similar attributes for the typeface, outline, glow, and drop shadow. In one embodiment, each of these sections is grouped and can be turned on or off by clicking the activation checkbox next to the category name.
        • Layout—In one embodiment, the Layout pane contains paragraph style controls such as justification, alignment, and line spacing (leading). In one embodiment, this pane also contains controls to create a type-on effect or to modify text path options.
      • Mask—In one embodiment, a Mask tab appears when a mask object is selected. In one embodiment, the only keyframeable attribute is the feather (softness) parameter, but a user can also control the mask type and how multiple masks interact by setting the Mask Blend mode.
      • Shape—In one embodiment, a Shape tab appears when a shape object is selected. In one embodiment, controls include the Shape type, fill and outline colors, and textures.
      • Emitter—In one embodiment, an Emitter tab appears when a particle emitter is selected. In one embodiment, the Emitter tab controls aspects of the emitter such as the emitter shape, angle, and range. In one embodiment, the Emitter tab also provides access to cell controls. For Emitters with multiple cells, these controls affect all cells.
      • Particle Cell—In one embodiment, a Particle Cell tab appears when a particle cell object is selected. In one embodiment, particle cell objects are only selectable in the Layers list. In one embodiment, the Particle Cell tab contains attributes such as birth rate, speed, angle, and color.
      • Generators—In one embodiment, a Generators tab displays the parameters and attributes of the selected generator (e.g., the colors and number of bars in a checkerboard). In one embodiment, the specific parameters listed depend on the specific generator that is selected.
  • iii. Locking the Inspector
  • In one embodiment, the Inspector 19 typically changes dynamically based on the selection in the Canvas. In one embodiment, however, sometimes a user wants to select another object 12 while continuing to look at the parameters 290 for the current object 12. In one embodiment, when a user locks the Inspector 19, the view of the Inspector will not change based on the user's selection.
  • In one embodiment, to lock the Inspector, do one of the following:
  • In one embodiment, click the Lock icon 1850 in the upper right corner of the Preview area of the Inspector 19. FIG. 185 illustrates one example of a Lock icon, according to one embodiment of the invention.
  • In one embodiment, choose Window>Create Locked Inspector. In one embodiment, this creates a new Inspector window showing the parameters of the currently selected object. In another embodiment, the main Inspector window continually updates to reflect whatever object is selected.
  • B. The Dashboard
  • In one embodiment, a Dashboard 110 is a dynamically updating floating window. In one embodiment, the Dashboard contains the most common controls 1860 for any selected object 12. In another embodiment, the Dashboard provides graphical animation control over images and other items that appear in the canvas window.
  • In one embodiment, the Dashboard 110 is semi-transparent. In one embodiment, a user can set the opacity (transparency) of the Dashboard. FIG. 186 illustrates one example of a Dashboard, according to one embodiment of the invention.
  • In one embodiment, the Dashboard is designed to keep a selected object visible even while using the Dashboard to adjust the object's parameters. In another embodiment, this enables a user to keep his eye on the screen instead of switching his eye line from a main window to a utility panel and back.
  • i. Choosing Control Sets
  • In one embodiment, the Dashboard can show a variety of controls, even for a single object. In one embodiment, for example, if a Throw behavior is applied to a shape with a blur filter on it, the Dashboard could conceivably show the shape controls, the blur controls, or the Throw controls. In another embodiment, the Dashboard shows all three. In yet another embodiment, a user can choose which set of controls to view in the Dashboard using the pop-up menu in the title bar.
  • In one embodiment, when an object 12 with multiple effects is selected, the Dashboard 110 title bar 1870 displays a downward facing arrow 1872 to the right of the name 1874. In one embodiment, clicking the arrow 1872 displays a pop-up menu 1880 that lists all of the possible control sets that can be displayed in the Dashboard for the selected object. FIG. 187 illustrates one example of a Dashboard title bar displaying a downward facing arrow, according to one embodiment of the invention. FIG. 188 illustrates one example of a pop-up menu that lists all of the possible control sets that can be displayed in the Dashboard for the selected object, according to one embodiment of the invention.
  • In one embodiment, to switch between control sets on a selected item, click on the Dashboard title bar, and then choose from the pop-up menu the control set to view. In one embodiment, most of the time, the Dashboard displays a subset of the parameters visible in the Inspector for the selected object. In another embodiment, if a user is working in the Dashboard, he can quickly jump to the corresponding Inspector to access the remainder of the controls for that object.
  • In one embodiment, to jump to the Inspector from the Dashboard, click the Inspector icon in the upper-right corner of the Dashboard. In one embodiment, the Inspector is opened and the tab corresponding to the Dashboard controls is brought to the front.
  • ii. Special Controls
  • In one embodiment, a Dashboard contains controls that resemble controls used in the Inspector, such as sliders, checkboxes, and pop-up menu buttons. In one embodiment, the Dashboard contains special controls for certain types of effects such as Basic Motion Behaviors and particle systems. In another embodiment, these unique controls allow a user to set multiple parameters simultaneously and in an intuitive way. In yet another embodiment, these controls use standard English-like terminology and simple graphical diagrams that, when dragged interactively, cause the target image to react immediately to the changes in the diagram.
  • In one embodiment, for example, the Particle System Dashboard 110 contains a single control 1890 that lets a user set shape, angle, and range of a particle system simultaneously. FIG. 189 illustrates one example of a Dashboard for a particle system, according to one embodiment of the invention. FIG. 190 illustrates one example of a Dashboard for a Grow/Shrink behavior, according to one embodiment of the invention. FIG. 191 illustrates one example of a Dashboard for a Fade In/Fade Out behavior, according to one embodiment of the invention.
  • a. Throw
  • In one embodiment, the Throw behavior and corresponding special control make an image move in a certain direction at a controlled speed. In one embodiment, to control the movement, the user clicks and drags in a graphical “dish” 1920 to set the direction and speed of the object. In another embodiment, the dish appears initially with a small “+” 1922 in the center to indicate no movement. In yet another embodiment, as the user drags the mouse, a small arrow 1930 appears in the center region. In one embodiment, the size of the arrow determines speed (larger=faster), while the direction of the arrow is the angle of movement.
  • FIG. 192 illustrates one example of a Dashboard for a Throw behavior where the special control specifies no movement, according to one embodiment of the invention. FIG. 193 illustrates one example of a Dashboard for a Throw behavior where the special control specifies movement in a southeastern direction at a low speed, according to one embodiment of the invention. FIG. 194 illustrates one example of a Dashboard for a Throw behavior where the special control specifies movement in the same direction as in FIG. 193, but at a higher speed, according to one embodiment of the invention. In one embodiment, these images illustrate how the image's speed increases as the arrow increases in size.
  • In one embodiment, a slider 1924 at the right side of the window controls the “zoom” of the dish. In one embodiment, dragging the slider upwards zooms out to display more area of the dish. In another embodiment, if more area of the dish is displayed, the control becomes more sensitive, so dragging the arrow will create more dramatic motion. In yet another embodiment, dragging the slider downward zooms in to display a smaller region, so dragging the arrow will create finer control over the movement.
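  • One possible sketch of the arrow-to-motion mapping described above follows; the function name, the coordinate convention, and the linear zoom scaling are assumptions for illustration, not the actual implementation:

```python
import math

def dish_drag_to_velocity(dx, dy, zoom=1.0):
    """Map a drag inside the Throw dish to the behavior's velocity.

    dx, dy : drag offset in pixels from the dish center (the "+" mark).
    zoom   : factor set by the slider at the side of the dish; zooming
             out makes the same drag produce more dramatic motion.
    Returns (speed, angle_degrees): the arrow's length sets the speed,
    its direction sets the angle of movement.
    """
    speed = math.hypot(dx, dy) * zoom          # larger arrow = faster
    angle = math.degrees(math.atan2(-dy, dx))  # screen y grows downward
    return speed, angle

# Throw applies this velocity once, as an initial push; the Wind control
# described next instead applies a per-frame force, so speed ramps up.
print(dish_drag_to_velocity(30.0, 18.0, zoom=1.5))
```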
  • b. Wind
  • In one embodiment, the Wind behavior and corresponding special control make an image move in a certain direction at a certain speed. In one embodiment, the graphical controls are similar to those of Throw. In another embodiment, however, unlike Throw, the Wind behavior is designed to emulate real-life wind. In yet another embodiment, instead of a single, initial force, Wind pushes on the image constantly and ramps up over time. In one embodiment, for example, the object starts out moving slowly and picks up speed over time.
  • FIG. 195 illustrates one example of a Dashboard for a Wind behavior where the special control specifies no movement, according to one embodiment of the invention. FIG. 196 illustrates one example of a Dashboard for a Wind behavior where the special control specifies movement in a northeastern direction at a high speed, according to one embodiment of the invention.
  • c. Spin
  • In one embodiment, the Spin behavior and corresponding special control make an image rotate at a constant rate. In one embodiment, to control the spin, the user clicks and drags in the graphical “dish” 1970 to set the speed and rotation direction (clockwise or counterclockwise) of the object. In another embodiment, the dish appears initially with a small “+” 1972 at the upper edge of the dish to indicate no movement. In yet another embodiment, as the user drags the mouse, a small arrow 1980 appears around the dish and follows the edge. In one embodiment, the length of the arrow determines speed (longer=faster spin), while the direction of the arrow is the rotation direction.
  • FIG. 197 illustrates one example of a Dashboard for a Spin behavior where the special control specifies no movement, according to one embodiment of the invention. FIG. 198 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in a clockwise direction at a low speed, according to one embodiment of the invention. FIG. 199 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in the same direction as in FIG. 198, but at a higher speed, according to one embodiment of the invention.
  • In one embodiment, as the amount of spin increases beyond the circumference of the dish, the arrow overlaps the trailing line and displays a small multiplier 2020 (e.g., “×3”) in the lower right corner of the control, indicating revolutions. FIG. 200 illustrates one example of a Dashboard for a Spin behavior where the special control specifies no movement, according to one embodiment of the invention. FIG. 201 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in a counterclockwise direction at a low speed, according to one embodiment of the invention. FIG. 202 illustrates one example of a Dashboard for a Spin behavior where the special control specifies movement in the same direction as in FIG. 201, but at a much higher speed, according to one embodiment of the invention.
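  • As an illustrative sketch, the swept arc might map to a spin rate and the revolution multiplier as follows; the names and the clockwise sign convention are assumptions:

```python
def spin_drag_to_rate(arc_degrees):
    """Map the arc swept around the Spin dish to a rotation rate.

    arc_degrees: signed angle dragged along the dish's edge from the
    "+" mark (positive taken here as clockwise). Past each full
    circumference the control shows a revolution multiplier, e.g. "x3".
    """
    direction = "clockwise" if arc_degrees >= 0 else "counterclockwise"
    rate = abs(arc_degrees)                 # longer arrow = faster spin
    multiplier = int(rate // 360) + 1       # displayed only when > 1
    return rate, direction, multiplier

print(spin_drag_to_rate(-800.0))  # counterclockwise, displayed as "x3"
```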
  • d. Grow/Shrink
  • In one embodiment, the Grow/Shrink behavior and corresponding special control make an image grow or shrink at a constant rate. In one embodiment, to control the size, the user clicks and drags in the rectangular area in the center to set the speed and direction (grow or shrink) of the object. In another embodiment, the control appears initially with a dotted rectangle 2030 in the center to indicate the “normal” size. In yet another embodiment, as the user drags the mouse, an additional rectangle 2040 and several arrows 2042 appear to indicate the rate of change from the initial state to the new state (either larger or smaller than the initial state). In one embodiment, the size of the box and the size of the arrows indicate growth or reduction.
  • FIG. 203 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies no movement, according to one embodiment of the invention. FIG. 204 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies a high grow rate, according to one embodiment of the invention. In one embodiment, the outer box displays arrows that show the progression from the initial state to the larger state of the image over time.
  • FIG. 205 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies no movement, according to one embodiment of the invention. FIG. 206 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies a high shrink rate, according to one embodiment of the invention. In one embodiment, the box is now smaller than the initial state, and displays arrows that show the progression from the initial state to the smaller state of the image over time.
  • In one embodiment, the grow/shrink special control also has small draggable handles 2032 at the four edges of the user-defined box to set the rate differently for the horizontal vs. vertical axes. In one embodiment, for example, an image can shrink horizontally over time, but simultaneously grow vertically over time. FIG. 207 illustrates one example of a Dashboard for a Grow/Shrink behavior where the special control specifies shrinking in the horizontal direction and simultaneous growing in the vertical direction, according to one embodiment of the invention.
  • In one embodiment, a slider 2034 at the right side of the window controls the "zoom" of the control. In one embodiment, dragging the slider upwards zooms out to display more area of the control. In another embodiment, when more area is displayed, the control becomes more sensitive, so dragging the box and arrows will create more dramatic motion. In yet another embodiment, dragging the slider downward zooms in to display a smaller region, so dragging the box and arrows will create finer control over the movement.
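  • A minimal sketch of how the dragged rectangle might yield independent per-axis rates; the names and the linear scaling are assumptions for illustration:

```python
def grow_shrink_rates(drag_w, drag_h, normal_w, normal_h, zoom=1.0):
    """Per-axis grow/shrink rates from the control.

    The dotted rectangle is the object's "normal" size; the dragged
    rectangle sets the rate of change. Positive rates grow the image,
    negative rates shrink it, and the edge handles let the horizontal
    and vertical rates differ.
    """
    rate_x = (drag_w / normal_w - 1.0) * zoom
    rate_y = (drag_h / normal_h - 1.0) * zoom
    return rate_x, rate_y

# A box narrower but taller than the dotted rectangle yields a negative
# horizontal rate and a positive vertical rate: the image shrinks
# horizontally while growing vertically, as in FIG. 207.
print(grow_shrink_rates(80.0, 150.0, 100.0, 100.0))
```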
  • e. Fade In/Fade Out
  • In one embodiment, the Fade In/Fade Out behavior and corresponding special control make an image fade in and/or fade out. In one embodiment, to control the fade, the user clicks and drags in the sloped, shaded regions at the left 2080A and right 2080B edges of the graphic to set the fade in or fade out time (displayed in number of frames) of the object. In another embodiment, the control appears initially with a predefined fade time of 20 frames at either end. In yet another embodiment, as the user drags the mouse, the slope changes to indicate a longer or shorter fade time.
  • FIG. 208 illustrates one example of a Dashboard for a Fade In/Fade Out behavior where the special control specifies a fade in time and a fade out time of equivalent length, according to one embodiment of the invention. FIG. 209 illustrates one example of a Dashboard for a Fade In/Fade Out behavior where the special control specifies a shorter fade in time than in FIG. 208 and no fade out time (i.e., no fade out at all), according to one embodiment of the invention. FIG. 210 illustrates one example of a Dashboard for a Fade In/Fade Out behavior where the special control specifies a similar fade in time to that in FIG. 208 and a longer fade out time than in FIG. 208, according to one embodiment of the invention.
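  • A minimal sketch of the opacity ramp implied by the two sloped regions, assuming linear fades and hypothetical parameter names:

```python
def fade_opacity(frame, duration, fade_in=20, fade_out=20):
    """Opacity multiplier for one frame of Fade In/Fade Out.

    fade_in / fade_out are the frame counts set by dragging the sloped
    regions (both default to 20 frames); a value of 0 disables that
    fade entirely, as with the fade out in FIG. 209.
    """
    opacity = 1.0
    if fade_in > 0 and frame < fade_in:
        opacity = min(opacity, frame / fade_in)
    if fade_out > 0 and frame >= duration - fade_out:
        opacity = min(opacity, (duration - 1 - frame) / fade_out)
    return max(0.0, min(1.0, opacity))

# Over a 100-frame duration: fully opaque by frame 20, fading from 80.
print([round(fade_opacity(f, 100), 2) for f in (0, 10, 50, 90, 99)])
```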
  • f. Particle Emitter
  • In one embodiment, a particle emitter is a special type of image object that starts with one or more small images as sources and automatically generates large numbers of copies (particles) of those images. In one embodiment, a particle emitter has numerous controls specifying, for example, how many copies are created, where they are created, how fast they move, and what direction they move in.
  • In one embodiment, a Dashboard for an emitter includes both traditional sliders 2110A, 2110B, 2110C and a custom graphical element. In one embodiment, the custom graphical element is a dish 2112 that simultaneously controls three different aspects of the particles: Direction, Speed, and Range. In another embodiment, draggable arrows 2114 radiate out from the center of the dish to indicate direction and speed, similar to the Throw and Wind controls.
  • In one embodiment, however, there is an additional control around the ring of the dish that defines a restricted range 2120 of where the particles travel outwards. In one embodiment, this ring resembles a “pie” shape. In another embodiment, it acts as a graphical representation of the emitter “nozzle.” In yet another embodiment, as the range narrows, the particles move in a “stream” defined by the shaded area of the pie.
  • FIG. 211 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in all directions (i.e., there is no specified range) at a medium/high speed, according to one embodiment of the invention. FIG. 212 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in only certain directions (i.e., there is a specified range) and at a medium speed, according to one embodiment of the invention. FIG. 213 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in only certain directions (i.e., there is a specified range, and the range is narrower than the range in FIG. 212) and at a low speed, according to one embodiment of the invention. FIG. 214 illustrates one example of a Dashboard for a particle emitter where the special control specifies that particles should be emitted in only certain directions (i.e., there is a specified range, and the range is narrower than the range in FIG. 212) and at a high speed, according to one embodiment of the invention.
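  • One way the dish's three aspects (direction, speed, range) might combine into a particle's initial velocity; the uniform sampling within the range and all names are illustrative assumptions:

```python
import math
import random

def emit_velocity(direction_deg, range_deg, speed):
    """Initial velocity for one particle from the emitter dish.

    direction_deg : where the draggable arrows point.
    range_deg     : angular width of the shaded "pie"; 360 emits in
                    all directions, a narrow range makes a stream.
    """
    half = range_deg / 2.0
    angle = math.radians(direction_deg + random.uniform(-half, half))
    return speed * math.cos(angle), speed * math.sin(angle)

# FIG. 211-style emitter: all directions, medium/high speed.
vx, vy = emit_velocity(direction_deg=0.0, range_deg=360.0, speed=8.0)
```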
  • iii. Other Controls
  • Close Button—In one embodiment, a Dashboard can be closed by clicking an “x” in the upper left of the Dashboard window.
  • Inspector Button—In one embodiment, if the user wants more controls over the image that is being manipulated, clicking on the small “i” in the upper right corner of the Dashboard will bring an Inspector window to the front. In one embodiment, the user can then use the Inspector to control the image via standard controls such as sliders, checkboxes, and numeric text fields. In another embodiment, this provides two levels of control over the animation: level 1 is an interactive graphical diagram, and level 2 is based on more traditional entry of values in the Inspector.
  • Algorithms Underlying Behaviors
  • A. Motion and Simulation Behaviors
  • In one embodiment, simulation behaviors implement two main functions: accumForces and accumInitialValues. accumForces takes as input the current state of the object being simulated, including the position, rotation, velocity and angular velocity, and outputs the forces that should be applied at the given time. accumInitialValues takes the same inputs and sets up the initial velocity of the object.
  • For a given object, the simulator traverses a data structure, such as a tree structure, to find the behaviors affecting the object. The simulator iterates across the list of behaviors and accumulates the forces on the object. If this is the first frame of the object, then the initial velocity is first calculated. Derivatives are then fed into a "mid-point method" differential solver to calculate a new position. The simulator then traverses the list of simulation behaviors for collision behaviors. Collision behaviors examine the current state to determine if a collision has occurred. If so, the collision behavior adjusts the state of the system to maintain the collision constraints. The simulation is iteratively stepped forward in this fashion until the desired frame is reached.
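  • A self-contained toy version of this loop follows; the behavior classes, the one-dimensional state, and the midpoint_step helper are illustrative stand-ins for the actual data structures:

```python
class ThrowLike:
    """Toy simulation behavior: a single initial push, no ongoing force."""
    is_collision = False
    def accum_initial_values(self, state):
        state["velocity"] += 5.0          # set up the initial velocity
    def accum_forces(self, state, frame):
        return 0.0

class WindLike:
    """Toy simulation behavior: a constant force applied every frame."""
    is_collision = False
    def accum_initial_values(self, state):
        pass
    def accum_forces(self, state, frame):
        return 0.8

def midpoint_step(state, force, dt=1.0, mass=1.0):
    # Mid-point method: advance position using the velocity evaluated
    # halfway through the step rather than at its start.
    accel = force / mass
    mid_velocity = state["velocity"] + accel * dt / 2.0
    state["position"] += mid_velocity * dt
    state["velocity"] += accel * dt
    return state

def simulate_to_frame(behaviors, target_frame):
    state = {"position": 0.0, "velocity": 0.0}
    for b in behaviors:                   # first frame of the object only
        b.accum_initial_values(state)
    for frame in range(target_frame):     # step until the desired frame
        force = sum(b.accum_forces(state, frame)
                    for b in behaviors if not b.is_collision)
        state = midpoint_step(state, force)
        # Collision behaviors would examine `state` here and adjust it
        # to maintain their constraints.
    return state

print(simulate_to_frame([ThrowLike(), WindLike()], target_frame=10))
```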
  • In one embodiment, position and rotation properties that are keyframed are handled by a special “motion to forces” behavior which converts the keyframes into a series of forces that when applied produce a motion similar to that represented by the keyframe. This is done by examining the velocity and acceleration at the current frame and deriving the necessary forces from these values. These forces can then be input into the simulator so that they can interact with the other behaviors.
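  • Such a "motion to forces" conversion might be sketched with a central finite-difference estimate of acceleration and F = m·a; the function name and signature here are hypothetical:

```python
def motion_to_force(position_at, frame, mass=1.0, dt=1.0):
    """Derive a force from a keyframed position curve.

    Velocity and acceleration around the current frame are estimated
    with finite differences over the keyframed curve, and the force
    follows from F = m * a, so the keyframed motion can be fed into
    the simulator and interact with the other behaviors.
    """
    p_prev = position_at(frame - dt)
    p_curr = position_at(frame)
    p_next = position_at(frame + dt)
    accel = (p_next - 2.0 * p_curr + p_prev) / (dt * dt)
    return mass * accel

# A parabolic keyframe curve has constant acceleration 2.0.
print(motion_to_force(lambda f: f * f, frame=10.0))
```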
  • B. Parameter Behaviors
  • In one embodiment, parameter behaviors are evaluated as a stack of operations on a range of values. First the stack is traversed to determine if all evaluations can be done using only the current value of the behavior before it in the stack. If so, an optimized path is taken which only passes the single value up the stack of operations. If not, then each behavior is queried to discover what range of input values will be needed to compute the requested output range. This stack of ranges is then used to evaluate each parameter behavior in turn, passing it the input it requested in the first step. Parameter behaviors such as Average use a range of values to compute a single output value, so they generally follow this second path. Also, while updating the curve editor, a large range of values can be calculated in one batch. This improves cache locality and reduces re-computation of partial values needed in the evaluation of an individual parameter behavior.
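  • A toy version of the range-based (second) path, using an Average-like behavior that declares the input range it needs; the class and method names are assumptions for illustration:

```python
class Average:
    """Toy parameter behavior. Its output at a frame is the mean of its
    input over a window of frames, so it needs a range of input values
    rather than a single current value."""
    def __init__(self, window=5):
        self.window = window

    def input_range(self, frame):
        # Frames whose values are needed to produce output at `frame`.
        return range(frame - self.window + 1, frame + 1)

    def apply(self, inputs):
        return sum(inputs) / len(inputs)

def evaluate(behavior, base_value_at, frames):
    # Batch evaluation over a requested output range; a real
    # implementation would share partial values across the batch for
    # cache locality instead of recomputing them per output frame.
    return [
        behavior.apply([base_value_at(f) for f in behavior.input_range(fr)])
        for fr in frames
    ]

# Smooth a jittery parameter curve over frames 10..19.
print(evaluate(Average(window=5), lambda f: f % 10, range(10, 20)))
```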
  • Dynamic Rendering
  • In one embodiment, objects to which behaviors have been applied are dynamically rendered. In one embodiment, for example, a behavior animation changes in real-time after the value of a behavior parameter has been changed.
  • In one embodiment, caching is used to achieve dynamic rendering. In one embodiment, for example, a behavior animation for an object is generated by rendering each frame sequentially and calculating a current frame based on a previous frame. In another embodiment, the result of evaluating the effect of a behavior on a previous frame is cached, thereby enabling the effect of a behavior on a current frame to be evaluated more rapidly.
  • In yet another embodiment, an interval cache is also kept. In one embodiment, values are periodically added to the interval cache to speed up behavior evaluation when jumping to random frames.
  • In one embodiment, multithreading is used to achieve dynamic rendering. In one embodiment, for example, frames are rendered sequentially. In another embodiment, while a first thread renders a current frame, a second thread simultaneously evaluates behaviors for the next frame.
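  • A minimal sketch combining the sequential cache and the interval cache; the class name, checkpoint spacing, and step function are illustrative assumptions:

```python
class BehaviorEvaluator:
    """Toy sketch of the caching scheme: frames are computed
    sequentially, each from the previous one, and a checkpoint is kept
    every `interval` frames so a jump to a random frame can resume from
    the nearest checkpoint instead of re-evaluating from frame 0."""
    def __init__(self, step, interval=30):
        self.step = step                   # (state, frame) -> next state
        self.interval = interval
        self.interval_cache = {0: 0.0}     # frame -> cached state

    def state_at(self, frame):
        start = max(f for f in self.interval_cache if f <= frame)
        state = self.interval_cache[start]
        for f in range(start, frame):
            state = self.step(state, f)    # current frame from previous
            if (f + 1) % self.interval == 0:
                self.interval_cache[f + 1] = state
        return state

ev = BehaviorEvaluator(step=lambda s, f: s + 0.1)
print(ev.state_at(95))   # walks frames 0..95, checkpointing at 30, 60, 90
print(ev.state_at(65))   # resumes from the frame-60 checkpoint
```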
  • Hardware Acceleration Methods
  • In one embodiment, hardware acceleration enables users to work effectively with behaviors. In one embodiment, hardware acceleration methods include, for example: using multithreading (so that a program can, e.g., run on multiple CPUs); “Altivec'ing” algorithms, i.e., modifying algorithms to take advantage of G4 and/or G5 Altivec hardware on which they will be run (e.g., by vectorizing the algorithms); and using OpenGL (e.g., standard OpenGL, OpenGL vertex shaders, and OpenGL pixel shaders).
  • In one embodiment, recent advancements in the OpenGL standard, such as pixel shaders, are used to accelerate various image processing tasks (such as, for example, applying filters) and enable custom blending. A pixel shader is a graphics function that calculates effects on a per-pixel basis. Depending on resolution, in excess of 2 million pixels may need to be rendered, lit, shaded, and colored for each frame, at 60 frames per second. That creates a tremendous computational load. Per-pixel shading brings out an extraordinary level of surface detail, allowing a user to see effects beyond the triangle level. The basics of pixel shader technology are known to those of ordinary skill in the relevant art and are further described in the course notes for Course 17: "State of the Art in Hardware Shading" at SIGGRAPH 2002. The course is described at http://www.siggraph.org/s2002/conference/courses/crs17.html and the course notes are available at http://www.csee.umbc.edu/~olano/s2002c17.
  • Until now, however, no commercial software application has used pixel shaders for motion graphics or compositing.
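  • For scale, a back-of-the-envelope calculation of the per-pixel load quoted above; the full-HD frame size is an assumption standing in for the "in excess of 2 million pixels":

```python
# Rough pixel-shading load for the figures quoted above.
pixels_per_frame = 1920 * 1080            # 2,073,600 pixels (full HD)
frames_per_second = 60
evaluations_per_second = pixels_per_frame * frames_per_second
print(f"{evaluations_per_second:,}")      # 124,416,000 per-pixel evaluations/s
```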
  • The present invention has been described in particular detail with respect to one possible embodiment. Those of skill in the art will appreciate that the invention may be practiced in other embodiments. First, the particular naming of the components, capitalization of terms, the attributes, data structures, or any other programming or structural aspect is not mandatory or significant, and the mechanisms that implement the invention or its features may have different names, formats, or protocols. Further, the system may be implemented via a combination of hardware and software, as described, or entirely in hardware elements. Also, the particular division of functionality between the various system components described herein is merely exemplary, and not mandatory; functions performed by a single system component may instead be performed by multiple components, and functions performed by multiple components may instead be performed by a single component.
  • Some portions of the above description present the features of the present invention in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. These operations, while described functionally or logically, are understood to be implemented by computer programs, which are stored in computer-readable media. Furthermore, these arrangements of operations can be equivalently referred to as modules or code devices, without loss of generality.
  • It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as “calculating” or “determining” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system memories or registers or other such information storage, transmission or display devices.
  • Certain aspects of the present invention include process steps and instructions described herein in the form of an algorithm. It should be noted that the process steps and instructions of the present invention could be embodied in software, firmware or hardware, and when embodied in software, could be loaded to reside on and be operated from different types of computing platforms.
  • The present invention also relates to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus. Furthermore, the computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
  • The algorithms and illustrations presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description above. In addition, the present invention is not described with reference to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any references to specific languages are provided for disclosure of enablement and best mode of the present invention.
  • The present invention is well-suited to a wide variety of computer network systems over numerous topologies. Within this field, the configuration and management of large networks comprise storage devices and computers that are communicatively coupled to dissimilar computers and storage devices over a network, such as the Internet.
  • Finally, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.

Claims (82)

1. In a computer-implemented animation system, a method for animating an object, the method comprising:
receiving a first input, the first input specifying a first behavior, the first behavior indicating how to change a value of a first parameter of the object over time;
animating the object by changing the value of the first parameter of the object over time according to the specified behavior; and
outputting the animated object.
2. The method of claim 1, wherein the object comprises a two-dimensional object.
3. The method of claim 1, further comprising receiving a second input, the second input specifying a keyframe indicating the value for the first parameter of the object at a first point in time, and wherein animating the object comprises changing the value of the first parameter of the object according to the specified behavior and further according to the specified keyframe.
4. The method of claim 1, further comprising receiving a second input, the second input specifying a second behavior, the second behavior indicating how to change a value of a second parameter of the object over time, and wherein animating the object further comprises changing the value of the second parameter of the object according to the second specified behavior.
5. The method of claim 1, further comprising receiving a second input, the second input specifying a second behavior, the second behavior indicating how to change the value of the first parameter of the object over time, and wherein animating the object comprises changing the value of the first parameter of the object according to the first specified behavior and the second specified behavior.
6. The method of claim 5, wherein changing the value of the first parameter of the object according to the first specified behavior and the second specified behavior comprises determining a combined change to the value of the first parameter of the object according to a combination of the first specified behavior and the second specified behavior.
7. The method of claim 1, wherein the first behavior comprises one from a group consisting of:
a Fade In/Fade Out behavior;
a Grow/Shrink behavior;
a Motion Path behavior;
a Snap Alignment to Motion behavior;
a Spin behavior;
a Throw behavior;
an Align to Motion behavior;
an Attracted To behavior;
an Attractor behavior;
a Drag behavior;
a Drift Attracted To behavior;
a Drift Attractor behavior;
an Edge Collision behavior;
a Gravity behavior;
an Orbit Around behavior;
a Random Motion behavior;
a Repel behavior;
a Repel From behavior;
a Rotational Drag behavior;
a Spring behavior;
a Vortex behavior; and
a Wind behavior.
8. The method of claim 1, wherein the object comprises a text object and the first behavior comprises one from a group consisting of:
a Crawl Left behavior;
a Crawl Right behavior;
a Scroll Up behavior;
a Scroll Down behavior;
a Randomize behavior;
a Sequence behavior;
a Position behavior;
a Rotation behavior;
an Opacity behavior;
a Scale behavior;
a Tracking behavior; and
a Type On behavior.
9. The method of claim 1, wherein the first behavior indicates that the value of the first parameter of the object should be averaged over time.
10. The method of claim 1, wherein the first behavior indicates that the value of the first parameter of the object should be changed using a user-specified custom change.
11. The method of claim 1, wherein the first behavior indicates that the value of the first parameter of the object should be negated.
12. The method of claim 1, wherein the first behavior indicates that the value of the first parameter of the object should oscillate over time.
13. The method of claim 1, wherein the first behavior indicates that the value of the first parameter of the object should ramp over time.
14. The method of claim 1, wherein the first behavior indicates that the value of the first parameter of the object should be randomized.
15. The method of claim 1, wherein the first behavior indicates that the value of the first parameter of the object should change over time according to a specified rate.
16. The method of claim 1, wherein the first behavior indicates that changes to the value of the first parameter of the object should be executed in reverse order.
17. The method of claim 1, wherein the first behavior indicates that the value of the first parameter of the object should not change.
18. The method of claim 1, wherein the first behavior indicates that the value of the first parameter of the object should wriggle over time.
19. The method of claim 1, wherein the object comprises one from a group consisting of:
an image object;
a text object;
a particle system;
a filter;
a generator; and
a behavior.
20. The method of claim 1, wherein the first behavior comprises at least one user-settable behavior parameter, the method further comprising receiving a second input specifying a value for the behavior parameter, and wherein animating the object comprises changing the value of the first parameter of the object according to the first specified behavior and the specified value for the behavior parameter.
21. In a computer-implemented animation system, a method for animating an object, the method comprising:
receiving an input, the input specifying the object;
creating one or more duplicates of the object according to a first plurality of parameters; and
animating the one or more duplicates by changing a value of a parameter of a duplicate over time according to a second plurality of parameters.
22. The method of claim 21, further comprising receiving an input, the input specifying a parameter, and wherein animating the one or more duplicates comprises changing the value of the parameter of the duplicate over time according to the second plurality of parameters and the specified parameter.
23. A user interface for a computer program for animating an object according to a behavior, the behavior having at least one user-settable parameter specifying how the behavior changes a value of at least one parameter of the object, the user interface comprising:
a control area; and
a user-manipulable control element located within the control area, for specifying a value for the at least one user-settable parameter of the behavior.
24. The user interface of claim 23, wherein the user-manipulable control element comprises a representation of a vector having a magnitude and an orientation.
25. The user interface of claim 24, wherein the control area comprises a circle, and wherein the representation of the vector comprises an arrow, the body of the arrow being a straight line, the tail of the arrow located in the center of the circle, the arrow pointing toward a point on the circumference of the circle.
26. The user interface of claim 25, wherein the magnitude of the vector controls a first user-settable parameter of the behavior, and wherein the orientation of the vector controls a second user-settable parameter of the behavior.
27. The user interface of claim 26, wherein the first user-settable parameter comprises a speed with which the object moves and wherein the second user-settable parameter comprises a direction in which the object moves.
28. The user interface of claim 27, wherein the behavior comprises a Throw behavior.
29. The user interface of claim 27, wherein the behavior comprises a Wind behavior.
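Claims 24–29 describe an on-screen arrow whose length and angle set two behavior parameters at once (e.g., speed and direction for a Throw or Wind behavior). A hedged sketch of that mapping, assuming the control reports the arrow head's pixel offset from the circle's center (the scaling constants are assumptions):

```python
import math

def arrow_to_throw_params(dx, dy, max_radius=50.0, max_speed=200.0):
    """Map an arrow drawn from the circle's center to (speed, direction).

    dx, dy: offset of the arrow head from the center, in pixels
    (screen coordinates, so positive dy points down).
    Magnitude -> speed; orientation -> direction in degrees.
    """
    magnitude = min(math.hypot(dx, dy), max_radius)
    speed = max_speed * (magnitude / max_radius)
    direction_deg = math.degrees(math.atan2(dy, dx))
    return speed, direction_deg

# Dragging the arrow head 25 px right and 25 px up:
speed, direction = arrow_to_throw_params(25.0, -25.0)
```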
30. The user interface of claim 23, wherein the user-manipulable control element comprises an arrow having a curved body.
31. The user interface of claim 30, wherein the control area comprises a circle, and wherein the curved body of the arrow comprises an arc of the circumference of the circle, the tail and the head of the arrow located on the circumference of the circle, the arrow pointing along the circumference of the circle.
32. The user interface of claim 31, wherein the length of the arrow controls a first user-settable parameter of the behavior, and wherein the direction of the arrow controls a second user-settable parameter of the behavior.
33. The user interface of claim 32, wherein the first user-settable parameter comprises a speed with which the object rotates and wherein the second user-settable parameter comprises a direction in which the object rotates.
34. The user interface of claim 33, wherein the behavior comprises a Spin behavior.
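Claims 30–34 describe a curved arrow drawn along the circle's circumference, whose arc length sets rotation speed and whose winding direction sets rotation direction (the Spin behavior). A sketch of one plausible mapping (the constants and the sign convention are assumptions):

```python
def arc_to_spin_params(start_angle_deg, end_angle_deg, full_turn_speed=360.0):
    """Map a circumferential arc to (rotation speed, rotation direction).

    The signed sweep from the arrow's tail to its head gives both
    quantities: its size relative to a full turn scales the speed, and
    its sign gives clockwise vs. counterclockwise.
    """
    sweep = end_angle_deg - start_angle_deg            # signed arc, in degrees
    speed = abs(sweep) / 360.0 * full_turn_speed       # degrees per second
    direction = "counterclockwise" if sweep >= 0 else "clockwise"
    return speed, direction

speed, direction = arc_to_spin_params(0.0, -90.0)      # quarter turn clockwise
```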
35. The user interface of claim 23, wherein the user-manipulable control element comprises a first rectangle.
36. The user interface of claim 35, wherein the control area comprises a second rectangle, the second rectangle indicating an original size of the object.
37. The user interface of claim 36, wherein a difference between a width of the first rectangle and a width of the second rectangle controls a first user-settable parameter of the behavior, and wherein a difference between a height of the first rectangle and a height of the second rectangle controls a second user-settable parameter of the behavior.
38. The user interface of claim 37, wherein the first user-settable parameter comprises a change in the object's width and the second user-settable parameter comprises a change in the object's height.
39. The user interface of claim 38, wherein the behavior comprises a Grow/Shrink behavior.
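Claims 35–39 describe two nested rectangles: the control-area rectangle shows the object's original size, and the user drags the first rectangle against it; the width and height differences drive the Grow/Shrink behavior's two parameters. A sketch, with all names assumed:

```python
def rect_to_grow_shrink(orig_w, orig_h, ctrl_w, ctrl_h):
    """Map the dragged (first) rectangle against the original-size (second)
    rectangle: the width difference drives one user-settable parameter,
    the height difference the other (claims 37-38)."""
    return {
        "width_scale": ctrl_w / orig_w,    # >1 grows, <1 shrinks
        "height_scale": ctrl_h / orig_h,
    }

# Control dragged 20 px wider and 10 px shorter than a 100x100 original:
params = rect_to_grow_shrink(100, 100, 120, 90)
# {'width_scale': 1.2, 'height_scale': 0.9}
```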
40. The user interface of claim 23, wherein the user-manipulable control element comprises a first triangular region and a second triangular region.
41. The user interface of claim 40, wherein the control area comprises an area separating the first triangular region and the second triangular region.
42. The user interface of claim 40, wherein a width of the first triangular region controls a first user-settable parameter of the behavior, and wherein a width of the second triangular region controls a second user-settable parameter of the behavior.
43. The user interface of claim 42, wherein the first user-settable parameter comprises a fade-in time of the object and the second user-settable parameter comprises a fade-out time of the object.
44. The user interface of claim 43, wherein the behavior comprises a Fade In/Fade Out behavior.
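Claims 40–44 describe two triangular regions whose widths set the fade-in and fade-out times of a Fade In/Fade Out behavior. A sketch of the resulting opacity envelope (the piecewise-linear shape is an assumption; the claims only fix which widths control which parameters):

```python
def opacity_at(t, duration, fade_in, fade_out):
    """Piecewise-linear opacity envelope: ramp up over fade_in seconds,
    hold at 1.0, then ramp down over the final fade_out seconds."""
    if t <= 0.0 or t >= duration:
        return 0.0
    if t < fade_in:
        return t / fade_in
    if t > duration - fade_out:
        return (duration - t) / fade_out
    return 1.0

# A 10 s clip with a 2 s fade-in and a 3 s fade-out:
samples = [round(opacity_at(t, 10.0, 2.0, 3.0), 2) for t in (0, 1, 2, 5, 8, 9.5)]
# [0.0, 0.5, 1.0, 1.0, 0.67, 0.17]
```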
45. The user interface of claim 23, wherein the control area is semi-transparent.
46. A user interface for a computer program for animating an object, wherein animating an object comprises creating one or more duplicates of the object and animating the one or more duplicates by changing a value of a parameter of the one or more duplicates over time, the user interface comprising:
a control area; and
one or more controls for setting one or more parameters of the animation.
47. The user interface of claim 46, wherein the user-manipulable control element comprises a representation of a vector having a magnitude and an orientation.
48. The user interface of claim 47, wherein the control area comprises a circle, and wherein the representation of the vector comprises an arrow, the body of the arrow being a straight line, the tail of the arrow located in the center of the circle, the arrow pointing toward a point on the circumference of the circle.
49. The user interface of claim 48, wherein the magnitude of the vector controls a first user-settable parameter of the animation, and wherein the orientation of the vector controls a second user-settable parameter of the animation.
50. The user interface of claim 49, wherein the first user-settable parameter comprises a speed with which the one or more duplicates moves and wherein the second user-settable parameter comprises a direction in which the one or more duplicates moves.
51. The user interface of claim 46, wherein the user-manipulable control element comprises two points.
52. The user interface of claim 51, wherein the control area comprises a circle, and wherein the two points are located on the circumference of the circle, and wherein the two points specify a segment of the circle.
53. The user interface of claim 52, wherein the size of the segment of the circle controls a first user-settable parameter of the animation, and wherein the position of the segment of the circle controls a second user-settable parameter of the animation.
54. The user interface of claim 53, wherein the first user-settable parameter comprises a size of a range in which the one or more duplicates moves and wherein the second user-settable parameter comprises a location of the range in which the one or more duplicates moves.
55. The user interface of claim 46, wherein the control area is semi-transparent.
56. A method for generating a frame of an object using behaviors, comprising:
determining a current state of the object;
traversing a data structure to identify behaviors affecting the object;
accumulating forces for the behaviors affecting the object; and
generating a frame of the object according to the accumulated forces.
57. The method of claim 56, further comprising determining an initial velocity for the object.
58. The method of claim 56, wherein at least one of the behaviors is a motion behavior.
59. The method of claim 56, wherein at least one of the behaviors is a simulation behavior.
60. The method of claim 56, wherein at least one of the behaviors is a parameter behavior.
61. The method of claim 56, wherein the data structure comprises a tree structure.
62. The method of claim 56, wherein generating the frame comprises applying a mid-point method differential equation solver to determine a new parameter value for the object.
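Claims 56–62 outline force-based frame generation: walk a data structure of behaviors affecting the object, sum the forces each contributes, then integrate; claim 62 names a mid-point (second-order Runge-Kutta) solver. A sketch under assumed types, with the traversed structure flattened to a list for brevity:

```python
from dataclasses import dataclass

@dataclass
class State:
    position: float
    velocity: float

def accumulate_forces(behaviors, state, t):
    """Claim 56: sum the force each behavior contributes at time t.
    `behaviors` stands in for the traversed data structure (claim 61
    makes it a tree; a flat list keeps this sketch short)."""
    return sum(b(state, t) for b in behaviors)

def midpoint_step(state, behaviors, t, dt, mass=1.0):
    """Claim 62: one mid-point (RK2) step of the equations of motion."""
    a1 = accumulate_forces(behaviors, state, t) / mass
    mid = State(
        position=state.position + state.velocity * dt / 2,
        velocity=state.velocity + a1 * dt / 2,
    )
    a2 = accumulate_forces(behaviors, mid, t + dt / 2) / mass
    return State(
        position=state.position + mid.velocity * dt,
        velocity=state.velocity + a2 * dt,
    )

gravity = lambda s, t: -9.8          # a constant-force "behavior"
wind = lambda s, t: 2.0 * (1 + t)    # a time-varying "behavior"
s = State(position=0.0, velocity=5.0)   # claim 57: initial velocity
for frame in range(30):              # claim 68: repeat per output frame
    s = midpoint_step(s, [gravity, wind], t=frame / 30, dt=1 / 30)
```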
63. The method of claim 62, wherein the parameter value comprises a position of the object.
64. The method of claim 56, further comprising:
traversing the data structure to identify collisions; and
responsive to the existence of a collision, adjusting a system state to maintain a collision constraint.
65. The method of claim 56, further comprising iteratively repeating the generating step until a desired frame is reached.
66. The method of claim 56, wherein at least one object state is specified in terms of a keyframe, the method further comprising converting at least one keyframe into a set of forces that, when applied to the object, approximate the motion represented by the keyframe.
67. The method of claim 66, wherein converting at least one keyframe into a set of forces comprises deriving a set of forces based on the velocity and acceleration at the keyframe.
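Claims 66–67 describe converting keyframed motion into forces derived from the velocity and acceleration at each keyframe, so keyframes can coexist with force-based behaviors. Under the simplest reading (F = m·a, with acceleration estimated by finite differences over three neighboring keyframes — an assumption, not the patent's stated method), a sketch:

```python
def keyframes_to_forces(keyframes, mass=1.0):
    """keyframes: list of (time, position) pairs, sorted by time.
    Estimate velocity on each side of an interior keyframe, difference
    the velocities to get acceleration, and derive the force (F = m * a)
    that approximates the keyframed motion there."""
    forces = []
    for i in range(1, len(keyframes) - 1):
        (t0, x0), (t1, x1), (t2, x2) = keyframes[i - 1 : i + 2]
        v_in = (x1 - x0) / (t1 - t0)
        v_out = (x2 - x1) / (t2 - t1)
        accel = (v_out - v_in) / ((t2 - t0) / 2)
        forces.append((t1, mass * accel))
    return forces

# Keyframes of a slowing object: velocity drops from 10 to 2 units/s.
forces = keyframes_to_forces([(0.0, 0.0), (1.0, 10.0), (2.0, 12.0)])
# [(1.0, -8.0)]
```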
68. A method for generating an animation for an object using behaviors, the animation comprising a plurality of frames, the method comprising:
for each frame:
determining a current state of the object;
traversing a data structure to identify behaviors affecting the object;
accumulating forces for the behaviors affecting the object;
generating a frame of the object according to the accumulated forces; and
outputting the generated frame.
69. The method of claim 68, wherein at least one of the determining, traversing, accumulating, generating and outputting steps for a first frame is performed concurrently with at least one of the determining, traversing, accumulating, generating and outputting steps for a second frame.
70. A method for animating an object using parameter behaviors, comprising:
traversing a stack of operations on a range of values;
responsive to a single behavior value being sufficient to evaluate all operations in the stack, passing the single behavior value to each operation in the stack; and
responsive to a single behavior value not being sufficient to evaluate all operations in the stack:
determining a range of input values to compute a requested output range; and
passing the determined range of input values to each operation in the stack.
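Claim 70 describes evaluating a stack of parameter operations where some operation may need a whole range of input values (for example, an effect that smooths or echoes neighboring values) rather than one value per output sample. A sketch of that two-path evaluation; the operation encoding and both example operations are assumptions for illustration:

```python
def evaluate_stack(stack, requested_range):
    """Each operation is (needs_range, fn). If a single value per output
    sample satisfies every operation, pass single values through;
    otherwise pass the determined range of values to each operation.
    (A full implementation would first widen the requested range by
    however much each range-consuming operation needs; this sketch
    feeds the requested range through as-is.)"""
    if all(not needs_range for needs_range, _ in stack):
        return [apply_all_single(stack, v) for v in requested_range]
    values = list(requested_range)
    for needs_range, fn in stack:
        values = fn(values) if needs_range else [fn(v) for v in values]
    return values

def apply_all_single(stack, v):
    for _, fn in stack:
        v = fn(v)
    return v

double = (False, lambda v: v * 2)                      # per-value operation
smooth = (True, lambda vs: [                           # range operation
    sum(vs[max(i - 1, 0): i + 2]) / len(vs[max(i - 1, 0): i + 2])
    for i in range(len(vs))])
out = evaluate_stack([double, smooth], range(5))
# [1.0, 2.0, 4.0, 6.0, 7.0]
```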
71. A method for animating an object using a behavior, comprising:
outputting an original animation for the object according to a first behavior;
concurrently with outputting the original animation, accepting user input; and
outputting an updated animation for the object according to the user input.
72. The method of claim 71, wherein the user input comprises a command for changing a value of a parameter of the behavior, and wherein outputting the updated animation comprises outputting the updated animation according to the changed value of the parameter.
73. The method of claim 71, wherein the user input comprises a command for applying a second behavior to the object and wherein outputting the updated animation comprises outputting the updated animation according to the first and second behaviors.
74. The method of claim 71, wherein outputting the updated animation is performed without interrupting the animation for the object.
75. The method of claim 72, wherein the updated animation reflects the changed value of the parameter in real-time.
76. The method of claim 71, wherein outputting the original animation and outputting the updated animation each comprise rendering a plurality of frames and caching the rendered frames.
77. The method of claim 71, wherein outputting the original animation and outputting the updated animation each comprise rendering each of a plurality of frames sequentially.
78. The method of claim 71, wherein outputting the original animation and outputting the updated animation each comprise rendering each of a plurality of frames sequentially by calculating a current frame based on a previous frame.
79. The method of claim 71, wherein outputting the original animation and outputting the updated animation each comprise rendering a plurality of frames and periodically caching a subset of the rendered frames in an interval cache.
80. The method of claim 71, wherein outputting the original animation and outputting the updated animation each comprise evaluating, by a first thread, a first subset of frames, and evaluating, by a second thread, a second subset of frames.
81. The method of claim 80, wherein the first subset and the second subset of frames each comprise alternate frames of the animation.
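Claims 76–81 cover rendering strategies that keep playback live while the user edits: caching rendered frames, sequential frame-on-frame evaluation, interval caches, and splitting frame evaluation across two threads. A sketch of the alternate-frame split of claims 80–81, with frame rendering faked by a trivial stand-in function:

```python
from concurrent.futures import ThreadPoolExecutor

def render_frame(n):
    """Stand-in for real frame evaluation."""
    return f"frame-{n}"

def render_animation(total_frames):
    """Claims 80-81: one thread evaluates the even frames, a second the
    odd frames; the two subsets are then interleaved into display order."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        evens = pool.submit(lambda: [render_frame(n) for n in range(0, total_frames, 2)])
        odds = pool.submit(lambda: [render_frame(n) for n in range(1, total_frames, 2)])
        even_frames, odd_frames = evens.result(), odds.result()
    frames = [None] * total_frames
    frames[0::2], frames[1::2] = even_frames, odd_frames
    return frames

cache = {}                           # claim 76: cache the rendered frames
for i, f in enumerate(render_animation(8)):
    cache[i] = f
```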
82. In a computer-implemented animation system, a method for animating an object, the method comprising:
receiving a first input, the first input specifying a first behavior, the first behavior indicating how to change a value of a parameter of the object over time;
using at least one of a pixel shader and a vertex shader to generate a plurality of frames of the object, according to the specified behavior; and
outputting the plurality of frames.

Priority Applications (10)

Application Number Priority Date Filing Date Title
US10/826,973 US20050231512A1 (en) 2004-04-16 2004-04-16 Animation of an object using behaviors
EP05735929A EP1735754A2 (en) 2004-04-16 2005-04-13 Animation of an object using behaviors
PCT/US2005/012735 WO2005106800A2 (en) 2004-04-16 2005-04-13 Animation of an object using behaviors
US11/257,882 US20060055700A1 (en) 2004-04-16 2005-10-24 User interface for controlling animation of an object
US11/786,850 US7932909B2 (en) 2004-04-16 2007-04-13 User interface for controlling three-dimensional animation of an object
US12/729,912 US8542238B2 (en) 2004-04-16 2010-03-23 User interface for controlling animation of an object
US12/729,890 US8253747B2 (en) 2004-04-16 2010-03-23 User interface for controlling animation of an object
US13/052,372 US8300055B2 (en) 2004-04-16 2011-03-21 User interface for controlling three-dimensional animation of an object
US13/566,571 US20130113807A1 (en) 2004-04-16 2012-08-03 User Interface for Controlling Animation of an Object
US13/663,435 US20130265316A1 (en) 2004-04-16 2012-10-29 User interface for controlling three-dimensional animation of an object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10/826,973 US20050231512A1 (en) 2004-04-16 2004-04-16 Animation of an object using behaviors

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/257,882 Continuation US20060055700A1 (en) 2004-04-16 2005-10-24 User interface for controlling animation of an object

Publications (1)

Publication Number Publication Date
US20050231512A1 true US20050231512A1 (en) 2005-10-20

Family

ID=34966123

Family Applications (5)

Application Number Title Priority Date Filing Date
US10/826,973 Abandoned US20050231512A1 (en) 2004-04-16 2004-04-16 Animation of an object using behaviors
US11/257,882 Abandoned US20060055700A1 (en) 2004-04-16 2005-10-24 User interface for controlling animation of an object
US12/729,912 Active 2024-05-11 US8542238B2 (en) 2004-04-16 2010-03-23 User interface for controlling animation of an object
US12/729,890 Active 2024-11-24 US8253747B2 (en) 2004-04-16 2010-03-23 User interface for controlling animation of an object
US13/566,571 Abandoned US20130113807A1 (en) 2004-04-16 2012-08-03 User Interface for Controlling Animation of an Object

Family Applications After (4)

Application Number Title Priority Date Filing Date
US11/257,882 Abandoned US20060055700A1 (en) 2004-04-16 2005-10-24 User interface for controlling animation of an object
US12/729,912 Active 2024-05-11 US8542238B2 (en) 2004-04-16 2010-03-23 User interface for controlling animation of an object
US12/729,890 Active 2024-11-24 US8253747B2 (en) 2004-04-16 2010-03-23 User interface for controlling animation of an object
US13/566,571 Abandoned US20130113807A1 (en) 2004-04-16 2012-08-03 User Interface for Controlling Animation of an Object

Country Status (3)

Country Link
US (5) US20050231512A1 (en)
EP (1) EP1735754A2 (en)
WO (1) WO2005106800A2 (en)

Cited By (112)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050091615A1 (en) * 2002-09-06 2005-04-28 Hironori Suzuki Gui application development supporting device, gui display device, method, and computer program
US20060103667A1 (en) * 2004-10-28 2006-05-18 Universal-Ad. Ltd. Method, system and computer readable code for automatic resize of product oriented advertisements
US20060129353A1 (en) * 2004-12-13 2006-06-15 Olympus Corporation Laser scanning microscope apparatus
US20060129569A1 (en) * 2004-12-10 2006-06-15 International Business Machines Corporation System and method for partially collapsing a hierarchical structure for information navigation
US20060132812A1 (en) * 2004-12-17 2006-06-22 You Software, Inc. Automated wysiwyg previewing of font, kerning and size options for user-selected text
US20060192783A1 (en) * 2005-01-26 2006-08-31 Pixar Interactive spacetime constraints: wiggly splines
US20060214935A1 (en) * 2004-08-09 2006-09-28 Martin Boyd Extensible library for storing objects of different types
US20060274070A1 (en) * 2005-04-19 2006-12-07 Herman Daniel L Techniques and workflows for computer graphics animation system
US20060282786A1 (en) * 2005-06-14 2006-12-14 Microsoft Corporation User interface state reconfiguration through animation
US20060284878A1 (en) * 2004-06-24 2006-12-21 Apple Computer, Inc. Resolution Independent User Interface Design
US20070038424A1 (en) * 2005-08-10 2007-02-15 Simon Schirm Application programming interface for fluid simulations
US20070057951A1 (en) * 2005-09-12 2007-03-15 Microsoft Corporation View animation for scaling and sorting
US20070150364A1 (en) * 2005-12-22 2007-06-28 Andrew Monaghan Self-service terminal
US20070263010A1 (en) * 2006-05-15 2007-11-15 Microsoft Corporation Large-scale visualization techniques
US20080012864A1 (en) * 2006-04-12 2008-01-17 Teruyuki Nakahashi Image Processing Apparatus and Method, and Program
US20080155459A1 (en) * 2006-12-22 2008-06-26 Apple Inc. Associating keywords to media
US20080155458A1 (en) * 2006-12-22 2008-06-26 Joshua Fagans Interactive Image Thumbnails
US20080163119A1 (en) * 2006-12-28 2008-07-03 Samsung Electronics Co., Ltd. Method for providing menu and multimedia device using the same
US20080163053A1 (en) * 2006-12-28 2008-07-03 Samsung Electronics Co., Ltd. Method to provide menu, using menu set and multimedia device using the same
US20080180642A1 (en) * 2001-12-14 2008-07-31 Wichner Brian D Illumination field blending for use in subtitle projection systems
US20080195655A1 (en) * 2005-04-26 2008-08-14 Fumihito Kondou Video Object Representation Data Structure, Program For Generating Video Object Representation Data Structure, Method Of Generating Video Object Representation Data Structure, Video Software Development Device, Image Processing Program
WO2008137538A1 (en) * 2007-05-04 2008-11-13 Autodesk, Inc. Looping motion space registration for real-time character animation
US20090002376A1 (en) * 2007-06-29 2009-01-01 Microsoft Corporation Gradient Domain Editing of Animated Meshes
US20090044133A1 (en) * 2007-08-06 2009-02-12 Apple Inc. Updating Content Display Based on Cursor Position
US20090179901A1 (en) * 2008-01-10 2009-07-16 Michael Girard Behavioral motion space blending for goal-directed character animation
US20090295807A1 (en) * 2008-05-28 2009-12-03 Michael Girard Real-Time Goal-Directed Performed Motion Alignment For Computer Animated Characters
US20090295809A1 (en) * 2008-05-28 2009-12-03 Michael Girard Real-Time Goal-Directed Performed Motion Alignment For Computer Animated Characters
US20090295808A1 (en) * 2008-05-28 2009-12-03 Michael Girard Real-Time Goal-Directed Performed Motion Alignment For Computer Animated Characters
US20100095239A1 (en) * 2008-10-15 2010-04-15 Mccommons Jordan Scrollable Preview of Content
US20100194763A1 (en) * 2004-04-16 2010-08-05 Apple Inc. User Interface for Controlling Animation of an Object
US7805678B1 (en) 2004-04-16 2010-09-28 Apple Inc. Editing within single timeline
US20100281366A1 (en) * 2009-04-30 2010-11-04 Tom Langmacher Editing key-indexed graphs in media editing applications
US20100281404A1 (en) * 2009-04-30 2010-11-04 Tom Langmacher Editing key-indexed geometries in media editing applications
US20100289807A1 (en) * 2009-05-18 2010-11-18 Nokia Corporation Method, apparatus and computer program product for creating graphical objects with desired physical features for usage in animation
US20110012903A1 (en) * 2009-07-16 2011-01-20 Michael Girard System and method for real-time character animation
US20110029904A1 (en) * 2009-07-30 2011-02-03 Adam Miles Smith Behavior and Appearance of Touch-Optimized User Interface Elements for Controlling Computer Function
US20110069017A1 (en) * 2009-09-22 2011-03-24 Victor B Michael Device, Method, and Graphical User Interface for Manipulating User Interface Objects
US20110145742A1 (en) * 2004-06-22 2011-06-16 Imran Chaudhri Color labeling in a graphical user interface
US20110145743A1 (en) * 2005-11-11 2011-06-16 Ron Brinkmann Locking relationships among parameters in computer programs
US20110181528A1 (en) * 2010-01-26 2011-07-28 Jay Christopher Capela Device, Method, and Graphical User Interface for Resizing Objects
US8230358B1 (en) 2004-06-22 2012-07-24 Apple Inc. Defining motion in a computer system with a graphical user interface
US8265300B2 (en) 2003-01-06 2012-09-11 Apple Inc. Method and apparatus for controlling volume
US20120313957A1 (en) * 2011-06-09 2012-12-13 Microsoft Corporation Staged Animated Transitions for Aggregation Charts
US20120328130A1 (en) * 2011-06-24 2012-12-27 Yamaha Corporation Parameter Controlling Apparatus
US20130050224A1 (en) * 2011-08-30 2013-02-28 Samir Gehani Automatic Animation Generation
US20130055131A1 (en) * 2011-08-26 2013-02-28 Microsoft Corporation Animation for Cut and Paste of Content
US20130093795A1 (en) * 2011-10-17 2013-04-18 Sony Corporation Information processing apparatus, display control method, and computer program product
US8508549B2 (en) 2004-06-24 2013-08-13 Apple Inc. User-interface design
US8539386B2 (en) 2010-01-26 2013-09-17 Apple Inc. Device, method, and graphical user interface for selecting and moving objects
US8539385B2 (en) 2010-01-26 2013-09-17 Apple Inc. Device, method, and graphical user interface for precise positioning of objects
US20130311914A1 (en) * 2011-11-11 2013-11-21 Rockwell Automation Technologies, Inc. Method and apparatus for computer aided design of human-machine interface animated graphical elements
US8698844B1 (en) 2005-04-16 2014-04-15 Apple Inc. Processing cursor movements in a graphical user interface of a multimedia application
US8730246B2 (en) 2007-05-04 2014-05-20 Autodesk, Inc. Real-time goal space steering for data-driven character animation
US8766928B2 (en) 2009-09-25 2014-07-01 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8780069B2 (en) 2009-09-25 2014-07-15 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8799826B2 (en) 2009-09-25 2014-08-05 Apple Inc. Device, method, and graphical user interface for moving a calendar entry in a calendar application
US8819567B2 (en) 2011-09-13 2014-08-26 Apple Inc. Defining and editing user interface behaviors
US8832585B2 (en) 2009-09-25 2014-09-09 Apple Inc. Device, method, and graphical user interface for manipulating workspace views
US20140375572A1 (en) * 2013-06-20 2014-12-25 Microsoft Corporation Parametric motion curves and manipulable content
US8972879B2 (en) 2010-07-30 2015-03-03 Apple Inc. Device, method, and graphical user interface for reordering the front-to-back positions of objects
US9075434B2 (en) 2010-08-20 2015-07-07 Microsoft Technology Licensing, Llc Translating user motion into multiple object responses
US9081494B2 (en) 2010-07-30 2015-07-14 Apple Inc. Device, method, and graphical user interface for copying formatting attributes
US9098182B2 (en) 2010-07-30 2015-08-04 Apple Inc. Device, method, and graphical user interface for copying user interface objects between content regions
US9164576B2 (en) 2011-09-13 2015-10-20 Apple Inc. Conformance protocol for heterogeneous abstractions for defining user interface behaviors
US20150302628A1 (en) * 2014-04-18 2015-10-22 Alibaba Group Holding Limited Animating content display
US20160132201A1 (en) * 2014-11-06 2016-05-12 Microsoft Technology Licensing, Llc Contextual tabs in mobile ribbons
USD775141S1 (en) * 2014-10-30 2016-12-27 Kardium Inc. Display screen or portion thereof with animated graphical user interface for a monitoring and control device for an intra-cardiac procedure system
USD775634S1 (en) * 2014-10-30 2017-01-03 Kardium Inc. Display screen or portion thereof with animated graphical user interface for a monitoring and control device for an intra-cardiac procedure system
US20170139895A1 (en) * 2015-11-13 2017-05-18 Richard Scott Rosenblum Method and System for Report Generation
USD803233S1 (en) * 2015-08-14 2017-11-21 Sonos, Inc. Display device with animated graphical user interface element
USD815650S1 (en) * 2015-12-24 2018-04-17 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
US10311607B2 (en) * 2014-04-18 2019-06-04 Sugarcrm Inc. Chart decomposition and sequencing for limited display devices
US20190273901A1 (en) * 2018-03-01 2019-09-05 Motorola Mobility Llc Selectively applying color to an image
CN110383269A (en) * 2017-03-03 2019-10-25 微软技术许可有限责任公司 Animation font based on multi-shaft variable font
US10600225B2 (en) * 2013-11-25 2020-03-24 Autodesk, Inc. Animating sketches via kinetic textures
CN111047527A (en) * 2019-11-25 2020-04-21 福州市暖色网络科技有限公司 Method and storage medium for adjusting dynamic element based on input element
CN111105482A (en) * 2019-12-24 2020-05-05 上海莉莉丝科技股份有限公司 Animation system, animation method, and computer-readable storage medium
US10825142B2 (en) * 2016-11-30 2020-11-03 Boe Technology Group Co., Ltd. Human face resolution re-establishing method and re-establishing system, and readable medium
USD900839S1 (en) * 2018-01-05 2020-11-03 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US10860748B2 (en) * 2017-03-08 2020-12-08 General Electric Company Systems and method for adjusting properties of objects depicted in computer-aid design applications
US10884592B2 (en) 2015-03-02 2021-01-05 Apple Inc. Control of system zoom magnification using a rotatable input mechanism
USD915445S1 (en) * 2014-09-03 2021-04-06 Apple Inc. Display screen or portion thereof with graphical user interface
USD916852S1 (en) * 2018-03-12 2021-04-20 Apple Inc. Electronic device with animated graphical user interface
US11004249B2 (en) * 2019-03-18 2021-05-11 Apple Inc. Hand drawn animation motion paths
USD923642S1 (en) * 2018-09-06 2021-06-29 Apple Inc. Display screen or portion thereof with animated graphical user interface
US11080774B2 (en) * 2015-08-25 2021-08-03 Cardly Pty Ltd Online system and method for personalising a greeting card or stationery with handwriting and doodles using a computer
US11145027B2 (en) * 2019-04-02 2021-10-12 Rightware Oy Dynamic transitioning between visual user interface elements on a display
USD957420S1 (en) * 2019-03-22 2022-07-12 Apple Inc. Electronic device with graphical user interface
US20220222908A1 (en) * 2021-01-11 2022-07-14 Boe Technology Group Co., Ltd. Method and apparatus for displaying image
US11412159B2 (en) * 2018-07-27 2022-08-09 Beijing Microlive Vision Technology Co., Ltd Method and apparatus for generating three-dimensional particle effect, and electronic device
US11430195B2 (en) * 2016-08-31 2022-08-30 Sony Corporation Information processing apparatus, information processing method, and program for improving user-friendliness of an animated tutorial depicting assembling parts for creating a robot
USD962990S1 (en) 2020-06-09 2022-09-06 J. Morita Mfg. Corp. Display screen with icon
USD962980S1 (en) * 2020-06-09 2022-09-06 J. Morita Mfg. Corp. Display screen with animated graphical user interface
USD962986S1 (en) 2020-06-09 2022-09-06 J. Morita Mfg. Corp. Display screen with icon
USD962987S1 (en) * 2020-06-09 2022-09-06 J. Morita Mfg. Corp. Display screen with animated icon
USD966335S1 (en) 2020-01-31 2022-10-11 Mitsubishi Electric Corporation Display screen with animated graphical user interface
US20220343612A1 (en) * 2019-11-18 2022-10-27 Magic Leap, Inc. Mapping and localization of a passable world
US20220374139A1 (en) * 2021-05-19 2022-11-24 Snap Inc. Video editing application for mobile devices
USD971231S1 (en) * 2020-11-25 2022-11-29 Apple Inc. Electronic device with animated graphical user interface
USD982601S1 (en) * 2020-10-30 2023-04-04 Stryker Corporation Display screen or portion thereof with a graphical user interface
US11644941B1 (en) * 2020-08-10 2023-05-09 Apple Inc. Manipulation of animation timing
USD990508S1 (en) 2020-10-30 2023-06-27 Stryker Corporation Display screen or portion thereof having a graphical user interface
USD1003908S1 (en) 2020-10-30 2023-11-07 Stryker Corporation Display screen or portion thereof having a graphical user interface
USD1008301S1 (en) 2020-10-30 2023-12-19 Stryker Corporation Display screen or portion thereof having a graphical user interface
USD1008300S1 (en) 2020-10-30 2023-12-19 Stryker Corporation Display screen or portion thereof having a graphical user interface
CN117392358A (en) * 2023-12-04 2024-01-12 腾讯科技(深圳)有限公司 Collision detection method, collision detection device, computer device and storage medium
USD1011360S1 (en) 2020-10-30 2024-01-16 Stryker Corporation Display screen or portion thereof having a graphical user interface
USD1012116S1 (en) * 2020-09-18 2024-01-23 Glowstik, Inc. Display screen with animated icon
USD1012959S1 (en) 2020-10-30 2024-01-30 Stryker Corporation Display screen or portion thereof having a graphical user interface
USD1013701S1 (en) * 2020-09-18 2024-02-06 Glowstik, Inc. Display screen with animated icon
USD1014514S1 (en) 2020-10-30 2024-02-13 Stryker Corporation Display screen or portion thereof having a graphical user interface
USD1016078S1 (en) 2020-10-30 2024-02-27 Stryker Corporation Display screen or portion thereof having a graphical user interface

Families Citing this family (236)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8370770B2 (en) 2005-06-10 2013-02-05 T-Mobile Usa, Inc. Variable path management of user contacts
US8370769B2 (en) * 2005-06-10 2013-02-05 T-Mobile Usa, Inc. Variable path management of user contacts
US8359548B2 (en) * 2005-06-10 2013-01-22 T-Mobile Usa, Inc. Managing subset of user contacts
US7685530B2 (en) * 2005-06-10 2010-03-23 T-Mobile Usa, Inc. Preferred contact group centric interface
US7496416B2 (en) * 2005-08-01 2009-02-24 Luxology, Llc Input/output curve editor
US7509588B2 (en) 2005-12-30 2009-03-24 Apple Inc. Portable electronic device with interface reconfiguration mode
US20070162857A1 (en) * 2006-01-06 2007-07-12 Ralf Weber Automated multimedia authoring
US7636889B2 (en) * 2006-01-06 2009-12-22 Apple Inc. Controlling behavior of elements in a display environment
US20070162855A1 (en) * 2006-01-06 2007-07-12 Kelly Hawk Movie authoring
US7616203B1 (en) * 2006-01-20 2009-11-10 Adobe Systems Incorporated Assigning attributes to regions across frames
US7844920B2 (en) * 2006-02-09 2010-11-30 Sony Corporation Modular entertainment system with movable components
US7774706B2 (en) * 2006-03-21 2010-08-10 Sony Corporation System and method for mixing media content
US8255281B2 (en) * 2006-06-07 2012-08-28 T-Mobile Usa, Inc. Service management system that enables subscriber-driven changes to service plans
US10313505B2 (en) 2006-09-06 2019-06-04 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US20080086687A1 (en) * 2006-10-06 2008-04-10 Ryutaro Sakai Graphical User Interface For Audio-Visual Browsing
JP5013511B2 (en) * 2006-10-17 2012-08-29 任天堂株式会社 GAME PROGRAM AND GAME DEVICE
US8519964B2 (en) 2007-01-07 2013-08-27 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US8391786B2 (en) * 2007-01-25 2013-03-05 Stephen Hodges Motion triggered data transfer
US20130086471A1 (en) * 2007-02-07 2013-04-04 Kenneth B. Moore Workflow integration and management of presentation options
US20080256484A1 (en) * 2007-04-12 2008-10-16 Microsoft Corporation Techniques for aligning and positioning objects
US8643653B2 (en) * 2007-06-08 2014-02-04 Apple Inc. Web-based animation
US9607408B2 (en) * 2007-06-08 2017-03-28 Apple Inc. Rendering semi-transparent user interface elements
US9772751B2 (en) 2007-06-29 2017-09-26 Apple Inc. Using gestures to slide between user interfaces
US8208067B1 (en) 2007-07-11 2012-06-26 Adobe Systems Incorporated Avoiding jitter in motion estimated video
US8619038B2 (en) 2007-09-04 2013-12-31 Apple Inc. Editing interface
US9619143B2 (en) 2008-01-06 2017-04-11 Apple Inc. Device, method, and graphical user interface for viewing application launch icons
US11126321B2 (en) 2007-09-04 2021-09-21 Apple Inc. Application menu user interface
US20100265250A1 (en) * 2007-12-21 2010-10-21 David Koenig Method and system for fast rendering of a three dimensional scene
CN101521004B (en) * 2008-02-29 2011-12-14 鹏智科技(深圳)有限公司 Electronic device with image processing function and image processing method thereof
JP5008605B2 (en) * 2008-05-26 2012-08-22 富士フイルム株式会社 Image processing apparatus and method, and program
KR101594861B1 (en) * 2008-06-03 2016-02-19 삼성전자주식회사 Web server for supporting collaborative animation production service and method thereof
US20100138782A1 (en) * 2008-11-30 2010-06-03 Nokia Corporation Item and view specific options
US8489569B2 (en) * 2008-12-08 2013-07-16 Microsoft Corporation Digital media retrieval and display
US20100207951A1 (en) * 2009-01-20 2010-08-19 Pvt Solar, Inc. Method and device for monitoring operation of a solar thermal system
US20130335425A1 (en) * 2009-03-02 2013-12-19 Adobe Systems Incorporated Systems and Methods for Combining Animations
US8428561B1 (en) 2009-03-27 2013-04-23 T-Mobile Usa, Inc. Event notification and organization utilizing a communication network
USD636403S1 (en) 2009-03-27 2011-04-19 T-Mobile Usa, Inc. Portion of a display screen with a user interface
USD631891S1 (en) 2009-03-27 2011-02-01 T-Mobile Usa, Inc. Portion of a display screen with a user interface
USD636402S1 (en) 2009-03-27 2011-04-19 T-Mobile Usa, Inc. Portion of a display screen with a user interface
US8577350B2 (en) 2009-03-27 2013-11-05 T-Mobile Usa, Inc. Managing communications utilizing communication categories
USD636400S1 (en) 2009-03-27 2011-04-19 T-Mobile Usa, Inc. Portion of a display screen with a user interface
US9355382B2 (en) 2009-03-27 2016-05-31 T-Mobile Usa, Inc. Group based information displays
US9369542B2 (en) 2009-03-27 2016-06-14 T-Mobile Usa, Inc. Network-based processing of data requests for contact information
USD631890S1 (en) 2009-03-27 2011-02-01 T-Mobile Usa, Inc. Portion of a display screen with a user interface
USD631889S1 (en) 2009-03-27 2011-02-01 T-Mobile Usa, Inc. Portion of a display screen with a user interface
US9195966B2 (en) * 2009-03-27 2015-11-24 T-Mobile Usa, Inc. Managing contact groups from subset of user contacts
USD636399S1 (en) 2009-03-27 2011-04-19 T-Mobile Usa, Inc. Portion of a display screen with a user interface
US8676626B1 (en) 2009-03-27 2014-03-18 T-Mobile Usa, Inc. Event notification and organization utilizing a communication network
US9210247B2 (en) 2009-03-27 2015-12-08 T-Mobile Usa, Inc. Managing contact groups from subset of user contacts
USD633918S1 (en) 2009-03-27 2011-03-08 T-Mobile Usa, Inc. Portion of a display screen with a user interface
US8140621B2 (en) * 2009-03-27 2012-03-20 T-Mobile, Usa, Inc. Providing event data to a group of contacts
USD631886S1 (en) 2009-03-27 2011-02-01 T-Mobile Usa, Inc. Portion of a display screen with a user interface
USD631887S1 (en) 2009-03-27 2011-02-01 T-Mobile Usa, Inc. Portion of a display screen with a user interface
US8631070B2 (en) 2009-03-27 2014-01-14 T-Mobile Usa, Inc. Providing event data to a group of contacts
US8893025B2 (en) 2009-03-27 2014-11-18 T-Mobile Usa, Inc. Generating group based information displays via template information
USD636401S1 (en) 2009-03-27 2011-04-19 T-Mobile Usa, Inc. Portion of a display screen with a user interface
USD631888S1 (en) 2009-03-27 2011-02-01 T-Mobile Usa, Inc. Portion of a display screen with a user interface
US8195701B1 (en) * 2009-04-08 2012-06-05 Ithaka Harbors, Inc. Integration of data sets into documents for interactive exploration
US8762886B2 (en) * 2009-07-30 2014-06-24 Lenovo (Singapore) Pte. Ltd. Emulating fundamental forces of physics on a virtual, touchable object
KR101727039B1 (en) * 2010-09-17 2017-04-14 엘지전자 주식회사 Mobile terminal and method for processing image thereof
US20110084962A1 (en) * 2009-10-12 2011-04-14 Jong Hwan Kim Mobile terminal and image processing method therein
KR101631273B1 (en) * 2009-10-26 2016-06-17 삼성전자주식회사 Method and apparatus for providing UI animation
US20110094184A1 (en) * 2009-10-28 2011-04-28 Honeywell International Inc. Systems and methods to display smoke propagation in multiple floors
US20110131526A1 (en) * 2009-12-01 2011-06-02 Microsoft Corporation Overlay user interface for command confirmation
EP2355472B1 (en) * 2010-01-22 2020-03-04 Samsung Electronics Co., Ltd. Apparatus and method for transmitting and receiving handwriting animation message
US10025458B2 (en) 2010-04-07 2018-07-17 Apple Inc. Device, method, and graphical user interface for managing folders
US10788976B2 (en) 2010-04-07 2020-09-29 Apple Inc. Device, method, and graphical user interface for managing folders with multiple pages
US9762975B2 (en) * 2010-04-30 2017-09-12 Thomas Loretan Content navigation guide
KR20110121888A (en) * 2010-05-03 2011-11-09 삼성전자주식회사 Apparatus and method for determining the pop-up menu in portable terminal
US9208599B2 (en) * 2010-06-17 2015-12-08 Microsoft Technology Licensing, Llc Visual previews
US9323438B2 (en) 2010-07-15 2016-04-26 Apple Inc. Media-editing application with live dragging and live editing capabilities
JP5454405B2 (en) * 2010-07-21 2014-03-26 ヤマハ株式会社 Acoustic adjustment console
US8487932B1 (en) 2010-08-30 2013-07-16 Disney Enterprises, Inc. Drawing figures in computer-based drawing applications
US8427483B1 (en) * 2010-08-30 2013-04-23 Disney Enterprises, Inc. Drawing figures in computer-based drawing applications
KR101780020B1 (en) * 2010-09-02 2017-09-19 삼성전자주식회사 Method and apparatus for interface
US8727611B2 (en) 2010-11-19 2014-05-20 Nest Labs, Inc. System and method for integrating sensors in thermostats
US20120089933A1 (en) * 2010-09-14 2012-04-12 Apple Inc. Content configuration for device platforms
US8918219B2 (en) 2010-11-19 2014-12-23 Google Inc. User friendly interface for control unit
US9489062B2 (en) 2010-09-14 2016-11-08 Google Inc. User interfaces for remote management and control of network-connected thermostats
US9104211B2 (en) 2010-11-19 2015-08-11 Google Inc. Temperature controller with model-based time to target calculation and display
US8538737B2 (en) * 2010-09-17 2013-09-17 Adobe Systems Incorporated Curve editing with physical simulation of mass points and spring forces
US9524279B2 (en) * 2010-10-28 2016-12-20 Microsoft Technology Licensing, Llc Help document animated visualization
USD665403S1 (en) * 2010-11-11 2012-08-14 Microsoft Corporation Display screen with graphical user interface
US9075419B2 (en) 2010-11-19 2015-07-07 Google Inc. Systems and methods for a graphical user interface of a controller for an energy-consuming system having spatially related discrete display elements
US9552002B2 (en) 2010-11-19 2017-01-24 Google Inc. Graphical user interface for setpoint creation and modification
US9092039B2 (en) 2010-11-19 2015-07-28 Google Inc. HVAC controller with user-friendly installation features with wire insertion detection
US9256230B2 (en) 2010-11-19 2016-02-09 Google Inc. HVAC schedule establishment in an intelligent, network-connected thermostat
US9459018B2 (en) 2010-11-19 2016-10-04 Google Inc. Systems and methods for energy-efficient control of an energy-consuming system
US9453655B2 (en) 2011-10-07 2016-09-27 Google Inc. Methods and graphical user interfaces for reporting performance information for an HVAC system controlled by a self-programming network-connected thermostat
US11334034B2 (en) 2010-11-19 2022-05-17 Google Llc Energy efficiency promoting schedule learning algorithms for intelligent thermostat
US8195313B1 (en) 2010-11-19 2012-06-05 Nest Labs, Inc. Thermostat user interface
US8850348B2 (en) 2010-12-31 2014-09-30 Google Inc. Dynamic device-associated feedback indicative of responsible device usage
US10346275B2 (en) 2010-11-19 2019-07-09 Google Llc Attributing causation for energy usage and setpoint changes with a network-connected thermostat
JP5728235B2 (en) * 2011-01-05 2015-06-03 ソニー株式会社 Display control apparatus, display control method, and program
US8954477B2 (en) 2011-01-28 2015-02-10 Apple Inc. Data structures for a media-editing application
US8780091B2 (en) * 2011-02-10 2014-07-15 General Electric Company Methods and systems for controlling an information display
US11747972B2 (en) 2011-02-16 2023-09-05 Apple Inc. Media-editing application with novel editing tools
US9997196B2 (en) 2011-02-16 2018-06-12 Apple Inc. Retiming media presentations
US20120229514A1 (en) * 2011-03-10 2012-09-13 Microsoft Corporation Transitioning presence indication through animation
US20120327114A1 (en) * 2011-06-21 2012-12-27 Dassault Systemes Device and associated methodology for producing augmented images
US8601366B2 (en) * 2011-09-08 2013-12-03 Microsoft Corporation Visualization and editing of composite layouts
US9223395B2 (en) 2011-09-14 2015-12-29 Microsoft Technology Licensing, Llc Viewing presentations in a condensed animation mode
US20130073933A1 (en) 2011-09-20 2013-03-21 Aaron M. Eppolito Method of Outputting a Media Presentation to Different Tracks
US9536564B2 (en) * 2011-09-20 2017-01-03 Apple Inc. Role-facilitated editing operations
US20130096987A1 (en) * 2011-10-06 2013-04-18 Ut Battelle, Llc Citizen engagement for energy efficient communities
US9222693B2 (en) 2013-04-26 2015-12-29 Google Inc. Touchscreen device user interface for remote control of a thermostat
US8893032B2 (en) 2012-03-29 2014-11-18 Google Inc. User interfaces for HVAC schedule display and modification on smartphone or other space-limited touchscreen device
CA2853033C (en) 2011-10-21 2019-07-16 Nest Labs, Inc. User-friendly, network connected learning thermostat and related systems and methods
EP3486743B1 (en) 2011-10-21 2022-05-25 Google LLC Energy efficiency promoting schedule learning algorithms for intelligent thermostat
US20130117698A1 (en) * 2011-10-31 2013-05-09 Samsung Electronics Co., Ltd. Display apparatus and method thereof
TWI571790B (en) * 2011-11-10 2017-02-21 財團法人資訊工業策進會 Method and electronic device for changing coordinate values of icons according to a sensing signal
US9529486B2 (en) * 2012-03-29 2016-12-27 FiftyThree, Inc. Methods and apparatus for providing a digital illustration system
EP2831687B1 (en) 2012-03-29 2020-01-01 Google LLC Processing and reporting usage information for an hvac system controlled by a network-connected thermostat
US20130271473A1 (en) * 2012-04-12 2013-10-17 Motorola Mobility, Inc. Creation of Properties for Spans within a Timeline for an Animation
USD740314S1 (en) * 2012-05-02 2015-10-06 Samsung Electronics Co., Ltd. Display screen or portion thereof with icon
US9582165B2 (en) 2012-05-09 2017-02-28 Apple Inc. Context-specific user interfaces
US9804759B2 (en) 2012-05-09 2017-10-31 Apple Inc. Context-specific user interfaces
US10990270B2 (en) 2012-05-09 2021-04-27 Apple Inc. Context-specific user interfaces
US10613743B2 (en) * 2012-05-09 2020-04-07 Apple Inc. User interface for receiving user input
US8766986B1 (en) * 2012-06-22 2014-07-01 Google Inc. Efficient caching and drawing of objects whose rendering properties change frequently
US9865230B2 (en) 2012-07-02 2018-01-09 Microsoft Technology Licensing, Llc Animated visualization of alpha channel transparency
US20140059496A1 (en) * 2012-08-23 2014-02-27 Oracle International Corporation Unified mobile approvals application including card display
USD738901S1 (en) * 2012-11-08 2015-09-15 Uber Technologies, Inc. Computing device display screen with graphical user interface
USD741888S1 (en) * 2013-01-15 2015-10-27 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
USD746856S1 (en) * 2013-02-07 2016-01-05 Tencent Technology (Shenzhen) Company Limited Display screen portion with an animated graphical user interface
US9754392B2 (en) * 2013-03-04 2017-09-05 Microsoft Technology Licensing, Llc Generating data-mapped visualization of data
US9070227B2 (en) 2013-03-04 2015-06-30 Microsoft Technology Licensing, Llc Particle based visualizations of abstract information
US9076258B2 (en) * 2013-03-14 2015-07-07 Pixar Stylizing animation by example
US9176940B2 (en) * 2013-03-15 2015-11-03 Blackberry Limited System and method for text editor text alignment control
US9600151B2 (en) * 2013-05-09 2017-03-21 Autodesk, Inc. Interactive design variations interface
USD747732S1 (en) * 2013-08-30 2016-01-19 SkyBell Technologies, Inc. Display screen or portion thereof with a graphical user interface
USD711427S1 (en) 2013-10-22 2014-08-19 Apple Inc. Display screen or portion thereof with icon
KR102405189B1 (en) 2013-10-30 2022-06-07 애플 인크. Displaying relevant user interface objects
US10096296B2 (en) 2013-11-13 2018-10-09 Red Hat, Inc. Temporally adjusted application window drop shadows
USD752084S1 (en) * 2013-12-12 2016-03-22 Tencent Technology (Shenzhen) Company Limited Display screen portion with graphical user interface
USD755850S1 (en) * 2013-12-30 2016-05-10 Samsung Electronics Co., Ltd. Display screen or portion thereof with icon
USD759667S1 (en) * 2014-01-13 2016-06-21 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD752081S1 (en) * 2014-01-21 2016-03-22 Accompani, Inc. Display with graphical user interface
US20150206447A1 (en) * 2014-01-23 2015-07-23 Zyante, Inc. System and method for authoring content for web viewable textbook data object
USD763868S1 (en) * 2014-02-11 2016-08-16 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
USD763269S1 (en) * 2014-02-11 2016-08-09 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
USD763268S1 (en) * 2014-02-12 2016-08-09 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
CN104850389B (en) * 2014-02-14 2020-09-29 腾讯科技(深圳)有限公司 Method and device for realizing dynamic interface
USD786266S1 (en) 2014-03-07 2017-05-09 Sonos, Inc. Display screen or portion thereof with graphical user interface
USD792420S1 (en) 2014-03-07 2017-07-18 Sonos, Inc. Display screen or portion thereof with graphical user interface
USD775632S1 (en) 2014-03-07 2017-01-03 Sonos, Inc. Display screen or portion thereof with graphical user interface
USD759688S1 (en) * 2014-03-12 2016-06-21 Tencent Technology (Shenzhen) Company Limited Portion of a display screen with graphical user interface
USD774061S1 (en) * 2014-03-12 2016-12-13 Tencent Technology (Shenzhen) Company Limited Portion of a display screen with graphical user interface
US9436366B1 (en) * 2014-03-18 2016-09-06 Kenneth Davis System for presenting media content
USD760271S1 (en) * 2014-03-19 2016-06-28 Wargaming.Net Limited Display screen with graphical user interface
USD774540S1 (en) * 2014-05-08 2016-12-20 Express Scripts, Inc. Display screen with a graphical user interface
USD773518S1 (en) * 2014-05-08 2016-12-06 Express Scripts, Inc. Display screen with a graphical user interface
USD762688S1 (en) 2014-05-16 2016-08-02 SkyBell Technologies, Inc. Display screen or a portion thereof with a graphical user interface
USD776689S1 (en) * 2014-06-20 2017-01-17 Google Inc. Display screen with graphical user interface
EP2960767A1 (en) * 2014-06-24 2015-12-30 Google, Inc. Computerized systems and methods for rendering an animation of an object in response to user input
AU2015279544B2 (en) 2014-06-27 2018-03-15 Apple Inc. Electronic device with rotatable input mechanism for navigating calendar application
US10135905B2 (en) 2014-07-21 2018-11-20 Apple Inc. Remote user interface
US10452253B2 (en) 2014-08-15 2019-10-22 Apple Inc. Weather user interface
CN115623117A (en) 2014-09-02 2023-01-17 苹果公司 Telephone user interface
US10073590B2 (en) 2014-09-02 2018-09-11 Apple Inc. Reduced size user interface
WO2016036481A1 (en) 2014-09-02 2016-03-10 Apple Inc. Reduced-size user interfaces for dynamically updated application overviews
USD785673S1 (en) * 2014-09-03 2017-05-02 Rockwell Collins, Inc. Avionics display screen with computer icon
CN106797439B (en) * 2014-09-12 2020-02-07 索尼半导体解决方案公司 Image processing apparatus, image processing method, and storage medium
JP1531722S (en) * 2014-11-21 2015-08-24
USD781877S1 (en) * 2015-01-05 2017-03-21 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD760738S1 (en) 2015-01-15 2016-07-05 SkyBell Technologies, Inc. Display screen or a portion thereof with a graphical user interface
USD759702S1 (en) 2015-01-15 2016-06-21 SkyBell Technologies, Inc. Display screen or a portion thereof with a graphical user interface
US10055121B2 (en) 2015-03-07 2018-08-21 Apple Inc. Activity based thresholds and feedbacks
US10168899B1 (en) * 2015-03-16 2019-01-01 FiftyThree, Inc. Computer-readable media and related methods for processing hand-drawn image elements
KR101726844B1 (en) * 2015-03-25 2017-04-13 네이버 주식회사 System and method for generating cartoon data
JP1547602S (en) * 2015-03-27 2016-04-11
USD771679S1 (en) * 2015-09-01 2016-11-15 Grand Rounds, Inc. Display screen with graphical user interface
US10089120B2 (en) * 2015-09-25 2018-10-02 Entit Software Llc Widgets in digital dashboards
US9702582B2 (en) 2015-10-12 2017-07-11 Ikorongo Technology, LLC Connected thermostat for controlling a climate system based on a desired usage profile in comparison to other connected thermostats controlling other climate systems
USD786890S1 (en) * 2015-11-09 2017-05-16 Aetna Inc. Computer display screen for a server maintenance tool with graphical user interface
USD772250S1 (en) * 2015-11-09 2016-11-22 Aetna Inc. Computer display for a server maintenance tool graphical user interface
USD795891S1 (en) * 2015-11-09 2017-08-29 Aetna Inc. Computer display screen for a server maintenance tool with graphical user interface
US10606443B2 (en) * 2015-12-10 2020-03-31 Appelago Inc. Interactive dashboard for controlling delivery of dynamic push notifications
FR3046228B1 (en) * 2015-12-29 2018-01-05 Thales METHOD FOR THREE-DIMENSIONALLY GRAPHIC REPRESENTATION OF A LANDING TRACK ON AN AIRCRAFT VISUALIZATION DEVICE
WO2017197217A1 (en) 2016-05-12 2017-11-16 Life Technologies Corporation Systems, methods, and apparatuses for image capture and display
DK201770423A1 (en) 2016-06-11 2018-01-15 Apple Inc Activity and workout updates
DK201670595A1 (en) 2016-06-11 2018-01-22 Apple Inc Configuring context-specific user interfaces
US11816325B2 (en) 2016-06-12 2023-11-14 Apple Inc. Application shortcuts for carplay
USD837239S1 (en) 2016-07-06 2019-01-01 Fujifilm Corporation Digital camera display panel with transitional graphical user interface
USD824935S1 (en) * 2016-07-20 2018-08-07 Biolase, Inc. Display screen including a dental laser graphical user interface
USD912689S1 (en) * 2016-07-22 2021-03-09 L&L Candle Company, Llc Display screen with graphical user interface for controlling an electronic candle
USD802622S1 (en) 2016-08-29 2017-11-14 Lutron Electronics Co., Inc. Display screen or portion thereof with graphical user interface
CN106874350B (en) * 2016-12-27 2020-09-25 合肥阿巴赛信息科技有限公司 Diamond ring retrieval method and system based on sketch and distance field
JP1590265S (en) * 2017-02-10 2017-11-06
USD812072S1 (en) * 2017-03-29 2018-03-06 Sorenson Ip Holdings, Llc Display screen or a portion thereof with graphical user interface
USD837234S1 (en) * 2017-05-25 2019-01-01 Palantir Technologies Inc. Display screen or portion thereof with transitional graphical user interface
CN107479784B (en) * 2017-07-31 2022-01-25 腾讯科技(深圳)有限公司 Expression display method and device and computer readable storage medium
USD855639S1 (en) * 2017-09-05 2019-08-06 Byton Limited Display screen with a graphical user interface
USD864227S1 (en) * 2017-09-05 2019-10-22 Byton Limited Display screen with an animated graphical user interface
USD879121S1 (en) * 2017-09-05 2020-03-24 Byton Limited Display screen with a graphical user interface
USD860228S1 (en) 2017-09-05 2019-09-17 Byton Limited Display screen with a graphical user interface
USD854043S1 (en) 2017-09-29 2019-07-16 Sonos, Inc. Display screen or portion thereof with graphical user interface
USD854563S1 (en) * 2017-11-03 2019-07-23 Nutanix, Inc. Display panel or portion thereof with a graphical user interface
USD860223S1 (en) * 2018-01-04 2019-09-17 Panasonic Intellectual Property Management Co., Ltd. Display screen with graphical user interface
USD877174S1 (en) 2018-06-03 2020-03-03 Apple Inc. Electronic device with graphical user interface
US10650184B2 (en) * 2018-06-13 2020-05-12 Apple Inc. Linked text boxes
USD963685S1 (en) 2018-12-06 2022-09-13 Sonos, Inc. Display screen or portion thereof with graphical user interface for media playback control
US11715246B1 (en) 2019-02-12 2023-08-01 Apple Inc. Modification and transfer of character motion
USD942996S1 (en) * 2019-03-29 2022-02-08 Home Run Dugout LLC Display screen or portion thereof with graphical user interface
US11675476B2 (en) 2019-05-05 2023-06-13 Apple Inc. User interfaces for widgets
USD926809S1 (en) * 2019-06-05 2021-08-03 Reliaquest Holdings, Llc Display screen or portion thereof with a graphical user interface
USD926810S1 (en) 2019-06-05 2021-08-03 Reliaquest Holdings, Llc Display screen or portion thereof with a graphical user interface
USD926811S1 (en) 2019-06-06 2021-08-03 Reliaquest Holdings, Llc Display screen or portion thereof with a graphical user interface
USD926782S1 (en) 2019-06-06 2021-08-03 Reliaquest Holdings, Llc Display screen or portion thereof with a graphical user interface
USD926200S1 (en) 2019-06-06 2021-07-27 Reliaquest Holdings, Llc Display screen or portion thereof with a graphical user interface
USD910660S1 (en) * 2019-07-26 2021-02-16 Lutron Technology Company Llc Display screen or portion thereof with graphical user interface
USD921012S1 (en) * 2019-09-19 2021-06-01 Keurig Green Mountain, Inc. Display screen or portion thereof with graphical user interface
USD921010S1 (en) * 2019-09-19 2021-06-01 Keurig Green Mountain, Inc. Display screen or portion thereof with graphical user interface
USD921009S1 (en) * 2019-09-19 2021-06-01 Keurig Green Mountain, Inc. Display screen or portion thereof with graphical user interface
USD915425S1 (en) * 2019-09-19 2021-04-06 Keurig Green Mountain, Inc. Display screen with graphical user interface
USD921008S1 (en) * 2019-09-19 2021-06-01 Keurig Green Mountain, Inc. Display screen or portion thereof with graphical user interface
USD920377S1 (en) * 2019-09-20 2021-05-25 Eolp Llc Display screen with graphical user interface
USD920350S1 (en) * 2019-10-25 2021-05-25 Eli Lilly And Company Display screen with animated graphical user interface
US11393148B2 (en) 2019-12-19 2022-07-19 Microsoft Technology Licensing, Llc Automatic weight determination for unassigned variables
USD941837S1 (en) * 2020-03-16 2022-01-25 Innovator Capital Management, LLC Display screen with animated graphical user interface
US11461874B2 (en) * 2020-04-02 2022-10-04 Adobe Inc. Graphics processing using matrices of transformations
USD944830S1 (en) 2020-05-14 2022-03-01 Lutron Technology Company Llc Display screen or portion thereof with graphical user interface
USD946022S1 (en) * 2020-10-19 2022-03-15 Splunk Inc. Display screen or portion thereof having a graphical user interface for a work table-based presentation of information
USD946027S1 (en) * 2020-10-19 2022-03-15 Splunk Inc. Display screen or portion thereof having a graphical user interface for an application home page
USD946032S1 (en) * 2020-10-19 2022-03-15 Splunk Inc. Display screen or portion thereof having a graphical user interface for a process control editor
USD946023S1 (en) * 2020-10-19 2022-03-15 Splunk Inc. Display screen or portion thereof having a graphical user interface for a work table-based presentation of information
USD946029S1 (en) * 2020-10-19 2022-03-15 Splunk Inc. Display screen or portion thereof having a graphical user interface for a process control editor
USD982029S1 (en) * 2020-12-15 2023-03-28 Cowbell Cyber, Ino. Display screen or portion thereof with a graphical user interface
USD977515S1 (en) * 2020-12-21 2023-02-07 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
CN114943791A (en) * 2021-02-08 2022-08-26 北京小米移动软件有限公司 Animation playing method, device, equipment and storage medium
US11714536B2 (en) 2021-05-21 2023-08-01 Apple Inc. Avatar sticker editor user interfaces
USD976945S1 (en) * 2021-06-18 2023-01-31 Jaret Christopher Computing device display screen with graphical user interface for generating an omni-channel message
USD991274S1 (en) * 2021-06-18 2023-07-04 DePuy Synthes Products, Inc. Display screen with a graphical user interface for a computer-assisted orthopaedic surgical system
US11657556B2 (en) * 2021-08-31 2023-05-23 Paul J. Hultgren Scrolling with damped oscillation
CN113920224A (en) * 2021-09-29 2022-01-11 北京达佳互联信息技术有限公司 Material display method and device, electronic equipment and storage medium
US20230281903A1 (en) * 2022-03-01 2023-09-07 Adobe Inc. Dynamic path animation of animation layers and digital design objects

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3038286A1 (en) * 1980-10-10 1982-05-19 Behringwerke Ag, 3550 Marburg Method for determining streptococcal deoxyribonuclease B by the toluidine blue O method
US4844291A (en) * 1986-05-28 1989-07-04 Free Flow Packaging Corporation Dispensing device
WO1993021588A1 (en) * 1992-04-10 1993-10-28 Avid Technology, Inc. Digital audio workstation providing digital storage and display of video information
US5498002A (en) * 1993-10-07 1996-03-12 Gechter; Jerry Interactive electronic games and screen savers with multiple characters
US6628303B1 (en) * 1996-07-29 2003-09-30 Avid Technology, Inc. Graphical user interface for a motion video planning and editing system for a computer
US6229456B1 (en) * 1998-08-10 2001-05-08 Tektronix, Inc. Method and apparatus for facilitating user interaction with a measurement instrument using a display-based control knob
US6549219B2 (en) * 1999-04-09 2003-04-15 International Business Machines Corporation Pie menu graphical user interface
US6762778B1 (en) 1999-06-10 2004-07-13 Dassault Systemes Three dimensional graphical manipulator
US6615199B1 (en) * 1999-08-31 2003-09-02 Accenture, Llp Abstraction factory in a base services pattern environment
JP2001325615A (en) * 2000-05-18 2001-11-22 Sony Corp Device and method for processing three-dimensional model and program providing medium
EP1249792A3 (en) * 2001-04-12 2006-01-18 Matsushita Electric Industrial Co., Ltd. Animation data generation apparatus, animation data generation method, animated video generation apparatus, and animated video generation method
US7073127B2 (en) * 2002-07-01 2006-07-04 Arcsoft, Inc. Video editing GUI with layer view
US7274375B1 (en) * 2002-11-19 2007-09-25 Peter David Timekeeping system and method for graphically tracking and representing activities
US7210107B2 (en) * 2003-06-27 2007-04-24 Microsoft Corporation Menus whose geometry is bounded by two radii and an arc
US7932909B2 (en) 2004-04-16 2011-04-26 Apple Inc. User interface for controlling three-dimensional animation of an object
US20050231512A1 (en) * 2004-04-16 2005-10-20 Niles Gregory E Animation of an object using behaviors
US20060274070A1 (en) * 2005-04-19 2006-12-07 Herman Daniel L Techniques and workflows for computer graphics animation system

Patent Citations (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4841291A (en) * 1987-09-21 1989-06-20 International Business Machines Corp. Interactive animation of graphics objects
US5428721A (en) * 1990-02-07 1995-06-27 Kabushiki Kaisha Toshiba Data processing apparatus for editing image by using image conversion
US5717848A (en) * 1990-06-11 1998-02-10 Hitachi, Ltd. Method and apparatus for generating object motion path, method of setting object display attribute, and computer graphics system
US20020003540A1 (en) * 1990-07-12 2002-01-10 Munetoshi Unuma Method and apparatus for representing motion of multiple-jointed object, computer graphic apparatus, and robot controller
US5513303A (en) * 1990-08-02 1996-04-30 Xerox Corporation Moving an object in a three-dimensional workspace
US5261041A (en) * 1990-12-28 1993-11-09 Apple Computer, Inc. Computer controlled animation system based on definitional animated objects and methods of manipulating same
US5588098A (en) * 1991-11-22 1996-12-24 Apple Computer, Inc. Method and apparatus for direct manipulation of 3-D objects on computer displays
US6778195B2 (en) * 1991-12-20 2004-08-17 Apple Computer, Inc. Zooming controller
US5883639A (en) * 1992-03-06 1999-03-16 Hewlett-Packard Company Visual software engineering system and method for developing visual prototypes and for connecting user code to them
US5544295A (en) * 1992-05-27 1996-08-06 Apple Computer, Inc. Method and apparatus for indicating a change in status of an object and its disposition using animation
US5596694A (en) * 1992-05-27 1997-01-21 Apple Computer, Inc. Method and apparatus for indicating a change in status of an object and its disposition using animation
US5673380A (en) * 1994-02-15 1997-09-30 Fujitsu Limited Parallel processing of calculation processor and display processor for forming moving computer graphic image in a real-time manner
US6271863B1 (en) * 1994-04-04 2001-08-07 Alive, Inc. Interactive method for operating a computer so as to graphically display the results of a computation
US5583981A (en) * 1994-06-28 1996-12-10 Microsoft Corporation Method and system for changing the size of edit controls on a graphical user interface
US5835693A (en) * 1994-07-22 1998-11-10 Lynch; James D. Interactive system for simulation and display of multi-body systems in three dimensions
US6285380B1 (en) * 1994-08-02 2001-09-04 New York University Method and system for scripting interactive animated actors
US6115053A (en) * 1994-08-02 2000-09-05 New York University Computer animation method and system for synthesizing human-like gestures and actions
US5835692A (en) * 1994-11-21 1998-11-10 International Business Machines Corporation System and method for providing mapping notation in interactive video displays
US5619631A (en) * 1995-06-07 1997-04-08 Binaryblitz Method and apparatus for data alteration by manipulation of representational graphs
US5731819A (en) * 1995-07-18 1998-03-24 Softimage Deformation of a graphic object to emphasize effects of motion
US6154601A (en) * 1996-04-12 2000-11-28 Hitachi Denshi Kabushiki Kaisha Method for editing image information with aid of computer and editing system
US6317128B1 (en) * 1996-04-18 2001-11-13 Silicon Graphics, Inc. Graphical user interface with anti-interference outlines for enhanced variably-transparent applications
US6045446A (en) * 1996-05-22 2000-04-04 Konami Co., Ltd. Object-throwing video game system
US5923561A (en) * 1996-06-17 1999-07-13 Toyota Jidosha Kabushiki Kaisha Process of generating a succession of discrete points defining cutter path, by calculating a space interval of discrete points
US6414685B1 (en) * 1997-01-29 2002-07-02 Sharp Kabushiki Kaisha Method of processing animation by interpolation between key frames with small data quantity
US6141018A (en) * 1997-03-12 2000-10-31 Microsoft Corporation Method and system for displaying hypertext documents with visual effects
US6144381A (en) * 1997-05-14 2000-11-07 International Business Machines Corporation Systems, methods and computer program products for compass navigation of avatars in three dimensional worlds
US6369821B2 (en) * 1997-05-19 2002-04-09 Microsoft Corporation Method and system for synchronizing scripted animations
US6664986B1 (en) * 1997-05-20 2003-12-16 Cadent Ltd. Computer user interface for orthodontic use
US6285347B1 (en) * 1997-05-28 2001-09-04 Sony Corporation Digital map display scrolling method, digital map display scrolling device, and storage device for storing digital map display scrolling program
US6011562A (en) * 1997-08-01 2000-01-04 Avid Technology Inc. Method and system employing an NLE to create and modify 3D animations by mixing and compositing animation data
US5977970A (en) * 1997-11-14 1999-11-02 International Business Machines Corporation Method and apparatus for moving information displayed in a window
US6266053B1 (en) * 1998-04-03 2001-07-24 Synapix, Inc. Time inheritance scene graph for representation of media content
US6310621B1 (en) * 1998-04-03 2001-10-30 Avid Technology, Inc. Extended support for numerical controls
US6353437B1 (en) * 1998-05-29 2002-03-05 Avid Technology, Inc. Animation system and method for defining and using rule-based groups of objects
US6909431B1 (en) * 1999-03-01 2005-06-21 Lucas Digital Ltd. Position and shape control for cloth and soft body animation
US6989831B2 (en) * 1999-03-15 2006-01-24 Information Decision Technologies, Llc Method for simulating multi-layer obscuration from a viewpoint
US6500008B1 (en) * 1999-03-15 2002-12-31 Information Decision Technologies, Llc Augmented reality-based firefighter training system and method
US6809744B2 (en) * 1999-03-15 2004-10-26 Information Decision Technologies, Llc Method for simulating flow of an extinguishing agent
US6714201B1 (en) * 1999-04-14 2004-03-30 3D Open Motion, Llc Apparatuses, methods, computer programming, and propagated signals for modeling motion in computer applications
US6512522B1 (en) * 1999-04-15 2003-01-28 Avid Technology, Inc. Animation of three-dimensional characters along a path for motion video sequences
US6573896B1 (en) * 1999-07-08 2003-06-03 Dassault Systemes Three-dimensional arrow
US6317140B1 (en) * 1999-08-02 2001-11-13 Hewlett-Packard Company Displaying interactive bitmap images within a display space
US6525736B1 (en) * 1999-08-20 2003-02-25 Koei Co., Ltd Method for moving grouped characters, recording medium and game device
US6636242B2 (en) * 1999-08-31 2003-10-21 Accenture Llp View configurer in a presentation services patterns environment
US20010030647A1 (en) * 1999-09-24 2001-10-18 Sun Microsystems, Inc. Using messaging to manage scene-based rendering
US6690376B1 (en) * 1999-09-29 2004-02-10 Sega Enterprises, Ltd. Storage medium for storing animation data, image processing method using same, and storage medium storing image processing programs
US7050955B1 (en) * 1999-10-01 2006-05-23 Immersion Corporation System, method and data structure for simulated interaction with graphical objects
US6756984B1 (en) * 1999-11-17 2004-06-29 Square Enix Co., Ltd. Object displaying method, a recording medium and game apparatus
US6532014B1 (en) * 2000-01-13 2003-03-11 Microsoft Corporation Cloth animation modeling
US6760030B2 (en) * 2000-01-14 2004-07-06 Hitachi, Ltd. Method of displaying objects in a virtual 3-dimensional space
US20040039934A1 (en) * 2000-12-19 2004-02-26 Land Michael Z. System and method for multimedia authoring and playback
US20020112180A1 (en) * 2000-12-19 2002-08-15 Land Michael Z. System and method for multimedia authoring and playback
US7071919B2 (en) * 2001-02-26 2006-07-04 Microsoft Corporation Positional scrolling
US7027055B2 (en) * 2001-04-30 2006-04-11 The Commonwealth Of Australia Data view of a modelling system
US6967654B2 (en) * 2002-02-15 2005-11-22 Computer Associates Think, Inc. System and method for specifying elliptical parameters
US20040036711A1 (en) * 2002-08-23 2004-02-26 Anderson Thomas G. Force frames in animation
US20050091576A1 (en) * 2003-10-24 2005-04-28 Microsoft Corporation Programming interface for a computer platform
US20050268279A1 (en) * 2004-02-06 2005-12-01 Sequoia Media Group, Lc Automated multimedia object models
US20060155576A1 (en) * 2004-06-14 2006-07-13 Ryan Marshall Deluz Configurable particle system representation for biofeedback applications
US7441206B2 (en) * 2004-06-14 2008-10-21 Medical Simulation Corporation 3D visual effect creation system and method

Cited By (186)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080180642A1 (en) * 2001-12-14 2008-07-31 Wichner Brian D Illumination field blending for use in subtitle projection systems
US7802888B2 (en) * 2001-12-14 2010-09-28 Wichner Brian D Illumination field blending for use in subtitle projection systems
US7870511B2 (en) * 2002-09-06 2011-01-11 Sony Corporation GUI application development supporting device, GUI display device, method, and computer program
US20050091615A1 (en) * 2002-09-06 2005-04-28 Hironori Suzuki GUI application development supporting device, GUI display device, method, and computer program
US8265300B2 (en) 2003-01-06 2012-09-11 Apple Inc. Method and apparatus for controlling volume
US8253747B2 (en) 2004-04-16 2012-08-28 Apple Inc. User interface for controlling animation of an object
US8542238B2 (en) 2004-04-16 2013-09-24 Apple Inc. User interface for controlling animation of an object
US8543922B1 (en) 2004-04-16 2013-09-24 Apple Inc. Editing within single timeline
US7805678B1 (en) 2004-04-16 2010-09-28 Apple Inc. Editing within single timeline
US20100194763A1 (en) * 2004-04-16 2010-08-05 Apple Inc. User Interface for Controlling Animation of an Object
US20100201692A1 (en) * 2004-04-16 2010-08-12 Apple Inc. User Interface for Controlling Animation of an Object
US20110145742A1 (en) * 2004-06-22 2011-06-16 Imran Chaudhri Color labeling in a graphical user interface
US20120221973A1 (en) * 2004-06-22 2012-08-30 Imran Chaudhri Color labeling in a graphical user interface
US9606698B2 (en) * 2004-06-22 2017-03-28 Apple Inc. Color labeling in a graphical user interface
US8230358B1 (en) 2004-06-22 2012-07-24 Apple Inc. Defining motion in a computer system with a graphical user interface
US20120256960A1 (en) * 2004-06-22 2012-10-11 Imran Chaudhri Defining motion in a computer system with a graphical user interface
US20060284878A1 (en) * 2004-06-24 2006-12-21 Apple Computer, Inc. Resolution Independent User Interface Design
US8130237B2 (en) * 2004-06-24 2012-03-06 Apple Inc. Resolution independent user interface design
US8508549B2 (en) 2004-06-24 2013-08-13 Apple Inc. User-interface design
US20120131479A1 (en) * 2004-06-24 2012-05-24 Apple Inc. Resolution Independent User Interface Design
US7411590B1 (en) * 2004-08-09 2008-08-12 Apple Inc. Multimedia file format
US20060214935A1 (en) * 2004-08-09 2006-09-28 Martin Boyd Extensible library for storing objects of different types
US7518611B2 (en) 2004-08-09 2009-04-14 Apple Inc. Extensible library for storing objects of different types
US20060103667A1 (en) * 2004-10-28 2006-05-18 Universal-Ad. Ltd. Method, system and computer readable code for automatic resize of product oriented advertisements
US20060129569A1 (en) * 2004-12-10 2006-06-15 International Business Machines Corporation System and method for partially collapsing a hierarchical structure for information navigation
US7984388B2 (en) * 2004-12-10 2011-07-19 International Business Machines Corporation System and method for partially collapsing a hierarchical structure for information navigation
US20060129353A1 (en) * 2004-12-13 2006-06-15 Olympus Corporation Laser scanning microscope apparatus
US7696996B2 (en) * 2004-12-13 2010-04-13 Olympus Corporation Laser scanning microscope apparatus
US20060132812A1 (en) * 2004-12-17 2006-06-22 You Software, Inc. Automated wysiwyg previewing of font, kerning and size options for user-selected text
US7483030B2 (en) * 2005-01-26 2009-01-27 Pixar Interactive spacetime constraints: wiggly splines
US20060192783A1 (en) * 2005-01-26 2006-08-31 Pixar Interactive spacetime constraints: wiggly splines
US8698844B1 (en) 2005-04-16 2014-04-15 Apple Inc. Processing cursor movements in a graphical user interface of a multimedia application
US9805491B2 (en) 2005-04-19 2017-10-31 Digitalfish, Inc. Techniques and workflows for computer graphics animation system
US20060274070A1 (en) * 2005-04-19 2006-12-07 Herman Daniel L Techniques and workflows for computer graphics animation system
US9216351B2 (en) 2005-04-19 2015-12-22 Digitalfish, Inc. Techniques and workflows for computer graphics animation system
US10546405B2 (en) 2005-04-19 2020-01-28 Digitalfish, Inc. Techniques and workflows for computer graphics animation system
US20100214313A1 (en) * 2005-04-19 2010-08-26 Digitalfish, Inc. Techniques and Workflows for Computer Graphics Animation System
US20080195655A1 (en) * 2005-04-26 2008-08-14 Fumihito Kondou Video Object Representation Data Structure, Program For Generating Video Object Representation Data Structure, Method Of Generating Video Object Representation Data Structure, Video Software Development Device, Image Processing Program
US20060282786A1 (en) * 2005-06-14 2006-12-14 Microsoft Corporation User interface state reconfiguration through animation
US7432928B2 (en) * 2005-06-14 2008-10-07 Microsoft Corporation User interface state reconfiguration through animation
US7580821B2 (en) * 2005-08-10 2009-08-25 Nvidia Corporation Application programming interface for fluid simulations
US20070038424A1 (en) * 2005-08-10 2007-02-15 Simon Schirm Application programming interface for fluid simulations
US20070057951A1 (en) * 2005-09-12 2007-03-15 Microsoft Corporation View animation for scaling and sorting
US20110145743A1 (en) * 2005-11-11 2011-06-16 Ron Brinkmann Locking relationships among parameters in computer programs
US20070150364A1 (en) * 2005-12-22 2007-06-28 Andrew Monaghan Self-service terminal
US8009171B2 (en) * 2006-04-12 2011-08-30 Sony Corporation Image processing apparatus and method, and program
US20080012864A1 (en) * 2006-04-12 2008-01-17 Teruyuki Nakahashi Image Processing Apparatus and Method, and Program
US20070263010A1 (en) * 2006-05-15 2007-11-15 Microsoft Corporation Large-scale visualization techniques
US9798744B2 (en) 2006-12-22 2017-10-24 Apple Inc. Interactive image thumbnails
US20080155458A1 (en) * 2006-12-22 2008-06-26 Joshua Fagans Interactive Image Thumbnails
US20080155459A1 (en) * 2006-12-22 2008-06-26 Apple Inc. Associating keywords to media
US8276098B2 (en) 2006-12-22 2012-09-25 Apple Inc. Interactive image thumbnails
US9142253B2 (en) * 2006-12-22 2015-09-22 Apple Inc. Associating keywords to media
US9959293B2 (en) 2006-12-22 2018-05-01 Apple Inc. Interactive image thumbnails
US20080163119A1 (en) * 2006-12-28 2008-07-03 Samsung Electronics Co., Ltd. Method for providing menu and multimedia device using the same
US20080163053A1 (en) * 2006-12-28 2008-07-03 Samsung Electronics Co., Ltd. Method to provide menu, using menu set and multimedia device using the same
WO2008137538A1 (en) * 2007-05-04 2008-11-13 Autodesk, Inc. Looping motion space registration for real-time character animation
US8154552B2 (en) 2007-05-04 2012-04-10 Autodesk, Inc. Looping motion space registration for real-time character animation
US9934607B2 (en) 2007-05-04 2018-04-03 Autodesk, Inc. Real-time goal space steering for data-driven character animation
US8730246B2 (en) 2007-05-04 2014-05-20 Autodesk, Inc. Real-time goal space steering for data-driven character animation
US8379029B2 (en) 2007-05-04 2013-02-19 Autodesk, Inc. Looping motion space registration for real-time character animation
US7843456B2 (en) 2007-06-29 2010-11-30 Microsoft Corporation Gradient domain editing of animated meshes
US20090002376A1 (en) * 2007-06-29 2009-01-01 Microsoft Corporation Gradient Domain Editing of Animated Meshes
US20090044133A1 (en) * 2007-08-06 2009-02-12 Apple Inc. Updating Content Display Based on Cursor Position
US8261209B2 (en) * 2007-08-06 2012-09-04 Apple Inc. Updating content display based on cursor position
US20090179901A1 (en) * 2008-01-10 2009-07-16 Michael Girard Behavioral motion space blending for goal-directed character animation
US10026210B2 (en) 2008-01-10 2018-07-17 Autodesk, Inc. Behavioral motion space blending for goal-oriented character animation
US20090295809A1 (en) * 2008-05-28 2009-12-03 Michael Girard Real-Time Goal-Directed Performed Motion Alignment For Computer Animated Characters
US20090295807A1 (en) * 2008-05-28 2009-12-03 Michael Girard Real-Time Goal-Directed Performed Motion Alignment For Computer Animated Characters
US8350860B2 (en) 2008-05-28 2013-01-08 Autodesk, Inc. Real-time goal-directed performed motion alignment for computer animated characters
US8363057B2 (en) 2008-05-28 2013-01-29 Autodesk, Inc. Real-time goal-directed performed motion alignment for computer animated characters
US8373706B2 (en) 2008-05-28 2013-02-12 Autodesk, Inc. Real-time goal-directed performed motion alignment for computer animated characters
US20090295808A1 (en) * 2008-05-28 2009-12-03 Michael Girard Real-Time Goal-Directed Performed Motion Alignment For Computer Animated Characters
US20100095239A1 (en) * 2008-10-15 2010-04-15 Mccommons Jordan Scrollable Preview of Content
US8788963B2 (en) 2008-10-15 2014-07-22 Apple Inc. Scrollable preview of content
US20100281404A1 (en) * 2009-04-30 2010-11-04 Tom Langmacher Editing key-indexed geometries in media editing applications
US20130097502A1 (en) * 2009-04-30 2013-04-18 Apple Inc. Editing and Saving Key-Indexed Geometries in Media Editing Applications
US20100281380A1 (en) * 2009-04-30 2010-11-04 Tom Langmacher Editing and saving key-indexed geometries in media editing applications
US8286081B2 (en) 2009-04-30 2012-10-09 Apple Inc. Editing and saving key-indexed geometries in media editing applications
US8566721B2 (en) 2009-04-30 2013-10-22 Apple Inc. Editing key-indexed graphs in media editing applications
US8543921B2 (en) * 2009-04-30 2013-09-24 Apple Inc. Editing key-indexed geometries in media editing applications
US20100281366A1 (en) * 2009-04-30 2010-11-04 Tom Langmacher Editing key-indexed graphs in media editing applications
US20100289807A1 (en) * 2009-05-18 2010-11-18 Nokia Corporation Method, apparatus and computer program product for creating graphical objects with desired physical features for usage in animation
US8427503B2 (en) 2009-05-18 2013-04-23 Nokia Corporation Method, apparatus and computer program product for creating graphical objects with desired physical features for usage in animation
WO2010133943A1 (en) 2009-05-18 2010-11-25 Nokia Corporation Method, apparatus and computer program product for creating graphical objects with desired physical features for usage in animations
US20110012903A1 (en) * 2009-07-16 2011-01-20 Michael Girard System and method for real-time character animation
US20110029904A1 (en) * 2009-07-30 2011-02-03 Adam Miles Smith Behavior and Appearance of Touch-Optimized User Interface Elements for Controlling Computer Function
US8863016B2 (en) 2009-09-22 2014-10-14 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8458617B2 (en) 2009-09-22 2013-06-04 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8456431B2 (en) 2009-09-22 2013-06-04 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US10282070B2 (en) 2009-09-22 2019-05-07 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8464173B2 (en) 2009-09-22 2013-06-11 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US10564826B2 (en) 2009-09-22 2020-02-18 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US20110069017A1 (en) * 2009-09-22 2011-03-24 Victor B Michael Device, Method, and Graphical User Interface for Manipulating User Interface Objects
US10788965B2 (en) 2009-09-22 2020-09-29 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US11334229B2 (en) 2009-09-22 2022-05-17 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8766928B2 (en) 2009-09-25 2014-07-01 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8780069B2 (en) 2009-09-25 2014-07-15 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8832585B2 (en) 2009-09-25 2014-09-09 Apple Inc. Device, method, and graphical user interface for manipulating workspace views
US8799826B2 (en) 2009-09-25 2014-08-05 Apple Inc. Device, method, and graphical user interface for moving a calendar entry in a calendar application
US10254927B2 (en) 2009-09-25 2019-04-09 Apple Inc. Device, method, and graphical user interface for manipulating workspace views
US11366576B2 (en) 2009-09-25 2022-06-21 Apple Inc. Device, method, and graphical user interface for manipulating workspace views
US9310907B2 (en) 2009-09-25 2016-04-12 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US10928993B2 (en) 2009-09-25 2021-02-23 Apple Inc. Device, method, and graphical user interface for manipulating workspace views
US8539385B2 (en) 2010-01-26 2013-09-17 Apple Inc. Device, method, and graphical user interface for precise positioning of objects
US20110181527A1 (en) * 2010-01-26 2011-07-28 Jay Christopher Capela Device, Method, and Graphical User Interface for Resizing Objects
US8612884B2 (en) * 2010-01-26 2013-12-17 Apple Inc. Device, method, and graphical user interface for resizing objects
US20110181528A1 (en) * 2010-01-26 2011-07-28 Jay Christopher Capela Device, Method, and Graphical User Interface for Resizing Objects
US8539386B2 (en) 2010-01-26 2013-09-17 Apple Inc. Device, method, and graphical user interface for selecting and moving objects
US8677268B2 (en) * 2010-01-26 2014-03-18 Apple Inc. Device, method, and graphical user interface for resizing objects
US8972879B2 (en) 2010-07-30 2015-03-03 Apple Inc. Device, method, and graphical user interface for reordering the front-to-back positions of objects
US9081494B2 (en) 2010-07-30 2015-07-14 Apple Inc. Device, method, and graphical user interface for copying formatting attributes
US9626098B2 (en) 2010-07-30 2017-04-18 Apple Inc. Device, method, and graphical user interface for copying formatting attributes
US9098182B2 (en) 2010-07-30 2015-08-04 Apple Inc. Device, method, and graphical user interface for copying user interface objects between content regions
US9075434B2 (en) 2010-08-20 2015-07-07 Microsoft Technology Licensing, Llc Translating user motion into multiple object responses
US20120313957A1 (en) * 2011-06-09 2012-12-13 Microsoft Corporation Staged Animated Transitions for Aggregation Charts
US9477389B2 (en) * 2011-06-24 2016-10-25 Yamaha Corporation Parameter controlling apparatus
US20120328130A1 (en) * 2011-06-24 2012-12-27 Yamaha Corporation Parameter Controlling Apparatus
US20130055131A1 (en) * 2011-08-26 2013-02-28 Microsoft Corporation Animation for Cut and Paste of Content
US9633464B2 (en) 2011-08-30 2017-04-25 Apple Inc. Automatic animation generation
US20130050224A1 (en) * 2011-08-30 2013-02-28 Samir Gehani Automatic Animation Generation
US10176620B2 (en) 2011-08-30 2019-01-08 Apple Inc. Automatic animation generation
US8907957B2 (en) * 2011-08-30 2014-12-09 Apple Inc. Automatic animation generation
US8819567B2 (en) 2011-09-13 2014-08-26 Apple Inc. Defining and editing user interface behaviors
US9164576B2 (en) 2011-09-13 2015-10-20 Apple Inc. Conformance protocol for heterogeneous abstractions for defining user interface behaviors
US20130093795A1 (en) * 2011-10-17 2013-04-18 Sony Corporation Information processing apparatus, display control method, and computer program product
US20130311914A1 (en) * 2011-11-11 2013-11-21 Rockwell Automation Technologies, Inc. Method and apparatus for computer aided design of human-machine interface animated graphical elements
US20140375572A1 (en) * 2013-06-20 2014-12-25 Microsoft Corporation Parametric motion curves and manipulable content
US10600225B2 (en) * 2013-11-25 2020-03-24 Autodesk, Inc. Animating sketches via kinetic textures
US9767592B2 (en) * 2014-04-18 2017-09-19 Alibaba Group Holding Limited Animating content display
US20150302628A1 (en) * 2014-04-18 2015-10-22 Alibaba Group Holding Limited Animating content display
US10311607B2 (en) * 2014-04-18 2019-06-04 Sugarcrm Inc. Chart decomposition and sequencing for limited display devices
USD915445S1 (en) * 2014-09-03 2021-04-06 Apple Inc. Display screen or portion thereof with graphical user interface
USD775634S1 (en) * 2014-10-30 2017-01-03 Kardium Inc. Display screen or portion thereof with animated graphical user interface for a monitoring and control device for an intra-cardiac procedure system
USD775141S1 (en) * 2014-10-30 2016-12-27 Kardium Inc. Display screen or portion thereof with animated graphical user interface for a monitoring and control device for an intra-cardiac procedure system
US20160132201A1 (en) * 2014-11-06 2016-05-12 Microsoft Technology Licensing, Llc Contextual tabs in mobile ribbons
US10884592B2 (en) 2015-03-02 2021-01-05 Apple Inc. Control of system zoom magnification using a rotatable input mechanism
USD803233S1 (en) * 2015-08-14 2017-11-21 Sonos, Inc. Display device with animated graphical user interface element
USD879799S1 (en) 2015-08-14 2020-03-31 Sonos, Inc. Display device with animated graphical user interface element
US11080774B2 (en) * 2015-08-25 2021-08-03 Cardly Pty Ltd Online system and method for personalising a greeting card or stationery with handwriting and doodles using a computer
US20170139895A1 (en) * 2015-11-13 2017-05-18 Richard Scott Rosenblum Method and System for Report Generation
USD815650S1 (en) * 2015-12-24 2018-04-17 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
US11430195B2 (en) * 2016-08-31 2022-08-30 Sony Corporation Information processing apparatus, information processing method, and program for improving user-friendliness of an animated tutorial depicting assembling parts for creating a robot
US10825142B2 (en) * 2016-11-30 2020-11-03 Boe Technology Group Co., Ltd. Human face resolution re-establishing method and re-establishing system, and readable medium
CN110383269A (en) * 2017-03-03 2019-10-25 Microsoft Technology Licensing, LLC Animated font based on multi-axis variable font
US10860748B2 (en) * 2017-03-08 2020-12-08 General Electric Company Systems and method for adjusting properties of objects depicted in computer-aid design applications
USD900839S1 (en) * 2018-01-05 2020-11-03 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US11032529B2 (en) 2018-03-01 2021-06-08 Motorola Mobility Llc Selectively applying color to an image
US20190273901A1 (en) * 2018-03-01 2019-09-05 Motorola Mobility Llc Selectively applying color to an image
US10645357B2 (en) * 2018-03-01 2020-05-05 Motorola Mobility Llc Selectively applying color to an image
USD936690S1 (en) 2018-03-12 2021-11-23 Apple Inc. Electronic device with graphical user interface
USD954096S1 (en) 2018-03-12 2022-06-07 Apple Inc. Electronic device with graphical user interface
USD919651S1 (en) 2018-03-12 2021-05-18 Apple Inc. Electronic device with animated graphical user interface
USD916852S1 (en) * 2018-03-12 2021-04-20 Apple Inc. Electronic device with animated graphical user interface
US11412159B2 (en) * 2018-07-27 2022-08-09 Beijing Microlive Vision Technology Co., Ltd Method and apparatus for generating three-dimensional particle effect, and electronic device
USD923642S1 (en) * 2018-09-06 2021-06-29 Apple Inc. Display screen or portion thereof with animated graphical user interface
US11004249B2 (en) * 2019-03-18 2021-05-11 Apple Inc. Hand drawn animation motion paths
US11494965B2 (en) 2019-03-18 2022-11-08 Apple Inc. Hand drawn animation motion paths
USD957420S1 (en) * 2019-03-22 2022-07-12 Apple Inc. Electronic device with graphical user interface
US11145027B2 (en) * 2019-04-02 2021-10-12 Rightware Oy Dynamic transitioning between visual user interface elements on a display
US20220343612A1 (en) * 2019-11-18 2022-10-27 Magic Leap, Inc. Mapping and localization of a passable world
CN111047527A (en) * 2019-11-25 2020-04-21 Fuzhou Nuanse Network Technology Co., Ltd. Method and storage medium for adjusting dynamic element based on input element
CN111105482A (en) * 2019-12-24 2020-05-05 Shanghai Lilith Technology Corporation Animation system, animation method, and computer-readable storage medium
USD966335S1 (en) 2020-01-31 2022-10-11 Mitsubishi Electric Corporation Display screen with animated graphical user interface
USD967183S1 (en) * 2020-01-31 2022-10-18 Mitsubishi Electric Corporation Display screen with animated graphical user interface
USD962986S1 (en) 2020-06-09 2022-09-06 J. Morita Mfg. Corp. Display screen with icon
USD962980S1 (en) * 2020-06-09 2022-09-06 J. Morita Mfg. Corp. Display screen with animated graphical user interface
USD962990S1 (en) 2020-06-09 2022-09-06 J. Morita Mfg. Corp. Display screen with icon
USD962987S1 (en) * 2020-06-09 2022-09-06 J. Morita Mfg. Corp. Display screen with animated icon
US11644941B1 (en) * 2020-08-10 2023-05-09 Apple Inc. Manipulation of animation timing
USD1013701S1 (en) * 2020-09-18 2024-02-06 Glowstik, Inc. Display screen with animated icon
USD1012116S1 (en) * 2020-09-18 2024-01-23 Glowstik, Inc. Display screen with animated icon
USD1003908S1 (en) 2020-10-30 2023-11-07 Stryker Corporation Display screen or portion thereof having a graphical user interface
USD1012959S1 (en) 2020-10-30 2024-01-30 Stryker Corporation Display screen or portion thereof having a graphical user interface
USD990508S1 (en) 2020-10-30 2023-06-27 Stryker Corporation Display screen or portion thereof having a graphical user interface
USD1016078S1 (en) 2020-10-30 2024-02-27 Stryker Corporation Display screen or portion thereof having a graphical user interface
USD982601S1 (en) * 2020-10-30 2023-04-04 Stryker Corporation Display screen or portion thereof with a graphical user interface
USD1008301S1 (en) 2020-10-30 2023-12-19 Stryker Corporation Display screen or portion thereof having a graphical user interface
USD1008300S1 (en) 2020-10-30 2023-12-19 Stryker Corporation Display screen or portion thereof having a graphical user interface
USD1014514S1 (en) 2020-10-30 2024-02-13 Stryker Corporation Display screen or portion thereof having a graphical user interface
USD1011360S1 (en) 2020-10-30 2024-01-16 Stryker Corporation Display screen or portion thereof having a graphical user interface
USD971231S1 (en) * 2020-11-25 2022-11-29 Apple Inc. Electronic device with animated graphical user interface
USD997967S1 (en) 2020-11-25 2023-09-05 Apple Inc. Electronic device with animated graphical user interface
US20220222908A1 (en) * 2021-01-11 2022-07-14 Boe Technology Group Co., Ltd. Method and apparatus for displaying image
US20220374139A1 (en) * 2021-05-19 2022-11-24 Snap Inc. Video editing application for mobile devices
CN117392358A (en) * 2023-12-04 2024-01-12 Tencent Technology (Shenzhen) Co., Ltd. Collision detection method, collision detection device, computer device and storage medium

Also Published As

Publication number Publication date
US20060055700A1 (en) 2006-03-16
US8542238B2 (en) 2013-09-24
US8253747B2 (en) 2012-08-28
US20100201692A1 (en) 2010-08-12
WO2005106800A2 (en) 2005-11-10
US20130113807A1 (en) 2013-05-09
EP1735754A2 (en) 2006-12-27
WO2005106800A3 (en) 2006-10-26
US20100194763A1 (en) 2010-08-05

Similar Documents

Publication Publication Date Title
US8253747B2 (en) User interface for controlling animation of an object
US7518611B2 (en) Extensible library for storing objects of different types
US7932909B2 (en) User interface for controlling three-dimensional animation of an object
US9997196B2 (en) Retiming media presentations
US6317142B1 (en) Taxonomy of objects and a system of non-modal property inspectors
CA2233819A1 (en) Computer imaging using graphics components
US20110285727A1 (en) Animation transition engine
US20140111534A1 (en) Media-Editing Application for Generating and Editing Shadows
Meyer et al. After Effects Apprentice: Real-world Skills for the Aspiring Motion Graphics Artist
Team Adobe After Effects CC Classroom in a Book
Faulkner et al. Adobe After Effects CC: 2014 Release
Shupe et al. Flash 8: Projects for Learning Animation and Interactivity: Projects for Learning Animation and Interactivity
Neumann Gimp pocket reference
Fridsma et al. Adobe After Effects Classroom in a Book 2024 Release
Christiansen Adobe After Effects CC: Classroom in a Book: the Official Training Workbook from Adobe Systems
GREEN et al. Learning the Animate CC Interface
Penston Adobe Creative Suite 2 How-Tos: 100 Essential Techniques
Gee 3D in Photoshop: the ultimate guide for creative professionals
Underdahl et al. Macromedia Director MX 2004 Bible
Wickes Welcome to Blender!
Maidasani Straight to the Point-3ds Max 9
Hinkson et al. The Ins And Outs Of Animation
Kerr et al. Let’s Get Animated!
Eagle Pan/Crop, Track Motion, and Basic Compositing in Vegas
Bhangal et al. Foundation Flash 8

Legal Events

Date Code Title Description
AS Assignment

Owner name: APPLE COMPUTER, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NILES, GREGORY E.;SHEELER, STEPHEN M.;HUCKING, GUIDO;REEL/FRAME:015078/0948;SIGNING DATES FROM 20040806 TO 20040816

AS Assignment

Owner name: APPLE INC., CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:APPLE COMPUTER, INC.;REEL/FRAME:020638/0127

Effective date: 20070109

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION