US20180300938A1 - Method for representing an animated object

Method for representing an animated object

Info

Publication number
US20180300938A1
Authority
US
United States
Prior art keywords
animation
sequence
texture
individual objects
vector
Prior art date
Legal status
Abandoned
Application number
US16/014,213
Inventor
Sven Schreiber
Original Assignee
Progressive3D GmbH
Priority date
Filing date
Publication date
Application filed by Progressive3D GmbH
Priority to US16/014,213
Publication of US20180300938A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/04: Texture mapping
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/001: Texturing; Colouring; Generation of texture or colour
    • G06T 13/00: Animation
    • G06T 13/20: 3D [Three Dimensional] animation
    • G06T 19/00: Manipulating 3D models or images for computer graphics

Definitions

  • the respective texture sequence can be synchronously played in an endless loop, the first and last individual objects of the animation sequence and the associated surface change as a texture of the first and last individual objects being virtually identical.
  • One advantageous refinement of the method provides for the sequence of the individual images and/or the texture animation and/or the light texture to be displayed in a display unit.
  • the respective sections or sectional planes of the individual objects can then be viewed by a human viewer in a quick and simple manner by interactively selecting the respectively desired animation sequence and thus the respectively desired sectional plane and it is possible to switch back and forth between the individual animation sequences by means of the vector-based page description language.
  • the sequence of the individual objects and/or the texture animation is/are determined using a simulation unit.
  • Advantageously, a simulation unit tailored to the problem to be simulated is used. Simulation programs require comprehensive operation and control which can only be carried out by specially trained experts.
  • the simulation unit can be designed in such a manner that only a minimum amount of storage space is required and the input by a layman is also possible.
  • the parameter input carried out by the user can create a flow of water through a pipe, which, according to a simulation behavior, can meet a collision object (here the pipe), where the objects, in the form of water constituents, are then distributed according to the simulation.
  • the real-time representation is effected using particle points or voxels for rapid understanding or on the basis of predefined sectional planes of the collision object.
  • the collision object is precisely defined by locating the areas of polygons/triangles, that is to say by reading the boundaries of the collision object or by specifications from the user.
  • the user must open a toolbox and can then choose between different pipe cross sections, so-called “shapes”.
  • the user specifies a radius for the diameter of the pipe as a collision object or interactively defines it using a graphical selection.
  • the user draws the line through the pipe as a collision object.
  • Curve tools, distributing guides and other tools are available for creating the path.
  • the user can selectively choose the shapes using the toolbox. Alternatively, this task can be transferred to a computer which selects the shapes in an automated method.
  • the real-time preview of the animation sequence is then represented in the path which has been created and the user can work with the parameterization. If the user is satisfied with the simulation, he can start the complex three-dimensional simulation by pressing a button and the animation sequence derived therefrom can be output. Optionally, the user can also generate texture animations which are subsequently projected onto the finished animation sequence.
  • the present method likewise makes it possible to import an area defined by the user, for example a water surface, into the simulation unit.
  • the user can then input the necessary parameters, for example the wind direction and strength, and the simulation of a wave can thus be calculated. Additional objects such as ships can be placed on the water surface.
  • the wave movements produce further waves and spray.
  • the animation sequence indicates geometries and textures for the rough preview.
  • the user can start the complex three-dimensional simulation by pressing a button and the finished animation sequence can be output.
  • the closed water surface is broken up into sections; the spray first of all consists of particles, polygons or volume objects such as “voxels” and is subsequently likewise broken up into sections.
  • the impinging drops of water of the spray and the breaking waves are stored as textures on the water surface. Both methods can be represented in real time separately or together using more modern and more powerful computers or may likewise be exported and/or processed as 3D object sequences.
  • Selected properties for the surface simulations of the water surface can likewise be changed by the user.
  • the wave movements of the water surface are created using so-called centers in which the waves arise.
  • the method makes it possible for the wave to know its volume, force and speed in order to carry out correct force distributions at further objects, such as a wall or a ship, so that everything physically moves in a correct manner.
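  • As a rough illustration only (not the patented simulation itself), such a wave centre can be pictured as a circular, decaying wave radiating across a height-field water surface; in the following Python sketch the grid, amplitude, speed, wavelength and decay are freely chosen example values.

    # Illustrative sketch: one wave centre on a height-field water surface.
    import numpy as np

    def wave_height(xx, yy, centre, t, amplitude=0.5, speed=2.0, wavelength=1.0, decay=0.3):
        r = np.hypot(xx - centre[0], yy - centre[1])        # distance to the wave centre
        phase = 2.0 * np.pi * (r - speed * t) / wavelength
        return amplitude * np.exp(-decay * r) * np.cos(phase)

    x = np.linspace(-10.0, 10.0, 256)
    xx, yy = np.meshgrid(x, x)
    frame = wave_height(xx, yy, centre=(0.0, 0.0), t=1.5)   # one moment of the surface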
  • a flash program can also be used to play the animation sequence.
  • PDF: portable document format
  • a computer program and a computer program product also achieve the object, the computer program product being stored in a computer-readable medium and comprising computer-readable means which cause a computer to carry out the method according to the invention when the program runs in the computer.
  • the present invention may be implemented in the form of hardware, software or a combination of hardware and software. Any type of system or any other apparatus set up to carry out the method according to the invention is suitable for this purpose.
  • the present invention may also be integrated in a computer program product which comprises all of the features that enable it to implement the computer-assisted methods described here and which, after being loaded into a computer system, is able to carry out these methods.
  • The terms "computer program" and "computer program product" should be understood as meaning any expression in any desired computer language, code or notation of a set of instructions which enable a computer system to process data and thus to perform a particular function.
  • the computer program or the computer program product can be executed on the computer system either directly or after conversion into another language, code or notation or by means of representation in another material form.
  • FIG. 1 shows a flowchart with the essential method steps
  • FIG. 2 shows a diagrammatic illustration of the essential method steps
  • FIG. 3 shows a perspective view of an object with different sectional planes
  • FIG. 4 shows a view of an object in a vector-based page description language program
  • FIG. 5 shows a diagrammatic illustration of the essential method steps with a preview program.
  • FIG. 1 shows a flowchart with the essential method steps.
  • the object 1 (not illustrated) is simulated 10 .
  • In a drawing program 5 (not illustrated), for example Lightwave3D, or in a simulation unit, the behavior of the object 1 is simulated and is generated in temporally successive individual objects 2 a , 2 b , 2 c , 2 d , 2 e , 2 f , 2 g (not illustrated).
  • the texture of the animated object 1 is generated and simulated 12 either in the drawing program 5 or by means of a separate editor 6 (not illustrated) and is stored as a texture animation 4 a , 4 b (not illustrated).
  • the texture animation can also be simulated 12 in a parallel manner to the simulation of the object 10 or completely independently of the simulation of the object 10 .
  • the individual objects are then joined 13 , as a temporal sequence, to form an animation sequence 3 a , 3 b (not illustrated).
  • the individual objects 2 a , 2 b , 2 c , 2 d , 2 e , 2 f , 2 g may also be individually read from the drawing program 5 and then viewed in a further preview program 17 (not illustrated) as a previewer and/or can be assembled as a sequence and thus as an animation sequence 3 a , 3 b .
  • the texture animation 4 a , 4 b is then projected 14 onto the animation sequence 3 a , 3 b and thus depicts the surface changes of the animated object 1 .
  • the playback of the animation sequence 3 a , 3 b with the texture animation 4 a , 4 b is repeated using a loop operation 16 until an internal condition occurs or a human viewer terminates the process.
  • FIG. 2 shows a diagrammatic illustration of the basic method sequences.
  • the individual objects 2 a , 2 b , 2 c are generated and the temporal behavior is simulated.
  • the individual objects 2 a , 2 b , 2 c are already joined, as a sequence, to form an animation sequence 3 a .
  • the animation sequence 3 a is indicated by the vertical lines between the individual objects 2 a , 2 b , 2 c and is intended to indicate the temporal sequence of the individual objects 2 a , 2 b , 2 c .
  • the textures are connected to form a texture animation 4 a , the texture animation 4 a again being indicated by the vertical lines between the textures.
  • the animation sequence 3 a and the texture animation 4 a are loaded into a program with a vector-based page description language 7 and can be played in a display unit 18 by means of control elements 8 a , 8 b .
  • the human user can use the control elements 8 a , 8 b to control the course and speed of the animation sequence 3 a with the texture animation 4 a.
  • FIG. 3 shows a perspective view of an object 1 with different sectional planes 9 a , 9 b , 9 c , 9 d , in which case not all sectional planes illustrated are assigned to a figure designation for the sake of clarity.
  • the animation sequence 3 a , 3 b (not illustrated) and the texture animation 4 a , 4 b , 4 c (not illustrated) are calculated on the basis of a simulation for the entire object 1 . However, the actual animations 3 a , 3 b , 4 a , 4 b , 4 c are only illustrated and projected for predefinable sectional planes 9 a , 9 b , 9 c , 9 d , 9 f , 9 g .
  • three radially running sectional planes 9 a , 9 b , 9 c subdivide the interior of a tubular object 1 . Furthermore, three axially running sectional planes 9 d , 9 f , 9 g run inside the pipe as an object 1 .
  • the animation sequence 3 a , 3 b and the texture animation 4 a , 4 b , 4 c are output only for the sectional planes 9 a , 9 b , 9 c , 9 d , 9 f , 9 g , which requires only small computation capacities.
  • the three-dimensional simulation behavior of the animated object 1 can be represented in two dimensions by representing a three-dimensional object behavior in the two-dimensional sectional planes 9 a , 9 b , 9 c , 9 d , 9 f , 9 g —if appropriate on the basis of automatic detection of the object geometries during the simulation 10 .
  • a path to which the 2D textures of the fluid simulation of the animation sequence 3 a , 3 b , which functions in real time, can be tied is automatically created during the simulation 10 .
  • a plurality of such "2D slices" as sectional planes 9 a , 9 b , 9 c , 9 d , 9 f , 9 g are placed in the pipe as an object 1 and are horizontally and vertically interleaved.
  • With the sectional planes 9 a , 9 b , 9 c , 9 d , 9 f , 9 g placed as in the example shown in FIG. 3 , nine animation sequences 3 a , 3 b and/or texture animations 4 a , 4 b , 4 c are thus created and, in combination, provide a user with a very realistic three-dimensional impression of the behavior of the animated object.
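  • A minimal sketch of such "2D slices" is given below: a three-dimensional simulation volume is sampled on one axial and one radial sectional plane, so that only two-dimensional slices need to be stored and played back. The volume and the slice indices are placeholder assumptions, not data from the method.

    # Hedged sketch: sample a 3-D simulation volume on sectional planes.
    import numpy as np

    def axial_plane(volume, y_index):
        """Slice at a fixed height: a plane running along the pipe axis."""
        return volume[:, y_index, :].copy()

    def radial_plane(volume, z_index):
        """Cross-section of the pipe at a fixed position along its axis."""
        return volume[:, :, z_index].copy()

    volume = np.random.rand(64, 64, 256)          # placeholder field, indexed (x, y, z)
    slice_axial = axial_plane(volume, 32)
    slice_radial = radial_plane(volume, 128)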
  • FIG. 4 shows an excerpt from an animation sequence 4 a (not illustrated) of the animated object 1 .
  • the object behavior of the static pipe does not need to be calculated.
  • the medium flowing through the pipe is simulated using a simulation and is stored as individual objects 2 a , 2 b , 2 c .
  • the surface change of the liquid when flowing through the pipe is then simulated and stored as a texture animation 4 a , 4 b .
  • the animation sequence 3 a , 3 b and the texture animation 4 a , 4 b , 4 c can then be joined and played.
  • buttons and icons are also possible as control elements 8 a , 8 b in the vector-based page description language, which control elements make it possible to play the animation sequence 3 a , 3 b with the texture animation 4 a , 4 b , 4 c .
  • the human viewer can use the control elements 8 a , 8 b to interactively change the viewing angle of the animated object 1 .
  • FIG. 5 shows a diagrammatic illustration of the essential method sequences with a preview program 17 .
  • the individual objects 2 a , 2 b , 2 c , 2 d , 2 e , 2 f , 2 g are generated on the basis of a simulation predefined by the user.
  • Corresponding textures with regard to the surface changes of the individual objects 2 a , 2 b , 2 c , 2 d , 2 e , 2 f , 2 g are determined in the further program 6 .
  • the temporal sequence of the respectively associated individual objects 2 a , 2 b , 2 c , 2 d , 2 e , 2 f , 2 g can be viewed in the preview program 17 , a user then being able to generate an animation sequence 3 a , 3 b from this preview.
  • the sequence of the textures can likewise be viewed and a texture animation 4 a , 4 b , 4 c can then be created.
  • the animation sequences 3 a , 3 b and the texture animations 4 a , 4 b , 4 c are then loaded into the program with the vector-based page description language 7 and can be played in a display unit 18 using control elements 8 a , 8 b .
  • a simulation unit in which the individual objects 2 a , 2 b , 2 c , 2 d , 2 e , 2 f , 2 g generated in the drawing program 5 and/or the textures generated in the further program 6 are simulated may likewise be integrated in the preview program 17 .
  • the user starts a program or a plug-in and imports the individual objects 2 a , 2 b , 2 c , 2 d , 2 e , 2 f , 2 g using Adobe Acrobat 3D Toolkit, as an OBJ file or in further supported file formats.
  • the user is then able to define the pipe as an object 1 and selects, for example, the radius of the pipe and the resolution of the further objects as volume bodies of the fluid.
  • the user can then view a corresponding preview of the individual objects 2 a , 2 b , 2 c , 2 d , 2 e , 2 f , 2 g and/or of the textures and the temporal sequence of the individual objects 2 a , 2 b , 2 c , 2 d , 2 e , 2 f , 2 g and/or of the textures in the preview program 17 in a display unit.
  • the model behavior of the individual objects 2 a , 2 b , 2 c , 2 d , 2 e , 2 f , 2 g and/or of the textures is calculated for a plurality of sectional planes 9 a , 9 b , 9 c , 9 d , 9 e , 9 f and is represented in the three-dimensional pipe as an object 1 by means of a real-time preview.
  • the parameters and properties of the fluid can be interactively changed during the preview.
  • While the user is making his parameter input, he can see, in the preview program 5 in real time, how the settings directly change in the previewer.
  • the user can likewise also set and interactively change the type of textures.
  • When the user is satisfied with the setting, he presses a button "Create 3D sequence" and must also specify the formula according to which he would like to have the physical simulation calculated.
  • the animation sequences 3 a , 3 b and/or texture animations 4 a , 4 b , 4 c are then edited in the loop editor in such a manner that a starting part, a central part and an end part of the animation sequences 3 a , 3 b and/or texture animations 4 a , 4 b , 4 c are available and can be loaded into the 3D-PDF program or into other platforms such as Microsoft's “Silverlight” or Adobe's “Flash”. If necessary, the central part of the animation sequences 3 a , 3 b and/or texture animations 4 a , 4 b , 4 c can be run through again in certain loops.
  • the user uses the function of the object detail reduction in the loop editor to reduce the resolution of the individual objects 2 a , 2 b , 2 c , 2 d , 2 e , 2 f , 2 g and/or of the animation sequence 3 a , 3 b and/or of the texture animation 4 a , 4 b , 4 c .
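  • One generic way such an object detail reduction could be implemented is vertex clustering, sketched below; this is a stand-in illustration and not necessarily the reduction used by the loop editor, and the grid cell size is an arbitrary assumption.

    # Vertex-clustering decimation: vertices are snapped to a coarse grid and
    # triangles that collapse onto a single cell are dropped.
    import numpy as np

    def decimate(vertices, triangles, cell=0.1):
        verts = np.asarray(vertices, dtype=float)
        tris = np.asarray(triangles, dtype=int)
        keys = np.floor(verts / cell).astype(int)           # grid cell of each vertex
        _, remap, counts = np.unique(keys, axis=0, return_inverse=True, return_counts=True)
        remap = remap.reshape(-1)
        new_verts = np.zeros((len(counts), 3))
        np.add.at(new_verts, remap, verts)                  # sum the vertices per cell
        new_verts /= counts[:, None]                        # average: one vertex per cell
        new_tris = remap[tris]
        keep = (new_tris[:, 0] != new_tris[:, 1]) & \
               (new_tris[:, 1] != new_tris[:, 2]) & \
               (new_tris[:, 0] != new_tris[:, 2])           # drop degenerate triangles
        return new_verts, new_tris[keep]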
  • Definitive animation sequences 3 a , 3 b and/or texture animations 4 a , 4 b , 4 c are now automatically created using a control element 8 a , 8 b and are directly created in the program with the vector-based page description language, for example 3D-PDF.
  • the user can track his changes to the individual objects 2 a , 2 b , 2 c , 2 d , 2 e , 2 f , 2 g and/or to the animation sequence 3 a , 3 b and/or to the texture animation 4 a , 4 b , 4 c on the freely defined fluid flow or ocean current interactively and in real time.
  • These surface colors can be seen only when a button for the visibility of properties has previously been activated.
  • the animation function changes the color of the selected ball as an animated object 1 to red. It is therefore possible to discern which objects 1 or object parts are currently running on the basis of the selection using the animation function. Blue could stand for water simulation with a large number of objects 1 , green could stand for the masking of an object 1 , pink stands for an object 1 which will collide, and pink-red stands for an object 1 which has its own deformation.
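  • Expressed as a simple lookup table, this colour coding might look as follows; the key names and RGB values are purely illustrative assumptions, not part of the described method.

    # Illustrative lookup table for the colour coding of animated objects.
    PROPERTY_COLOURS = {
        "selected_running": (1.0, 0.0, 0.0),   # red: object currently running in the animation
        "water_simulation": (0.0, 0.0, 1.0),   # blue: water simulation with many objects
        "masking":          (0.0, 1.0, 0.0),   # green: masking of an object
        "will_collide":     (1.0, 0.4, 0.7),   # pink: object that will collide
        "own_deformation":  (0.9, 0.1, 0.3),   # pink-red: object with its own deformation
    }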
  • the present method also makes it possible to simulate film and gaming applications.
  • the drawing program 5 , 6 is used to create a percentage 3D map of hairs which are defined as objects 1 .
  • a multiplicity of sectional planes 9 a , 9 b , 9 c , 9 d , 9 e , 9 f are placed through the hair volume.
  • the sectional planes 9 a , 9 b , 9 c , 9 d , 9 e , 9 f pierce the hairs to be modeled and leave behind a cross section of the hair which has just been cut on their 2D plane. This cross section is represented by the drawing program 5 . Everything which has not been cut is transparent.

Abstract

The invention relates to a method for representing an animated object. In a three-dimensional drawing program used to generate and animate objects, the model behavior of objects is calculated. For this purpose, sequences of individual objects are output at defined times and subsequently the sequence of the individual objects is joined into an animation sequence. Surface changes of the object are simulated by way of an additional texture animation and output. The animation sequence and the texture animation are then joined in a vector-based page description language, such as the 3D-PDF program, and played at the same time. Based on the available sequence of the individual objects, a user can interactively modify the object animated in this way while playing back the animation sequence and the texture animation and change the viewing angle for the animated object. The animation sequence or the texture animation can likewise be configured as an endless loop and thereby give the human user a dynamic view of the animated object.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is a continuation of patent application Ser. No. 13/265,071, filed Nov. 30, 2011; which was a § 371 national stage filing of international application No. PCT/DE2010/000391, filed Mar. 26, 2010, which designated the United States; this application also claims the priority, under 35 U.S.C. § 119, of German patent application No. DE 10 2009 018 165.2, filed Apr. 18, 2009; the prior applications are herewith incorporated by reference in their entirety.
  • BACKGROUND OF THE INVENTION Field of the Invention
  • The invention relates to a method for representing an animated object.
  • The representation of graphical objects is one of the main uses of computers with sometimes very large computer resources. Comprehensive computation operations are required, in particular, when simulating and representing three-dimensional objects, for example during the computer-aided design (CAD) of objects and the subsequent simulation of a particular object behavior. The object is conventionally created in a corresponding drawing program, for example a CAD program, and the object behavior is then simulated with respect to defined parameter properties. The simulation calculations may sometimes last for hours or even days on account of the comprehensive computation operations. The object which has been calculated in this manner can then be visualized in the drawing program, but any change in the view or parameters makes it necessary to recalculate the simulation of the object within the drawing program.
  • For example, DE 602 14 696 T2 thus describes the simulation of a flow of fluids and a structural analysis in thin-walled three-dimensional geometries, the simulation being input as an outer skin with a finite element mesh.
  • Furthermore, DE 698 31 385 T2 describes a method and an arrangement for blending graphical objects using planar maps which are described in a page description language. For this purpose, sections of a page description language representation are converted into a planar map representation and are blended with the planar map representations of the graphical objects. The advantage is that planar maps allow a type of representation which is independent of the color space and the resolution.
  • U.S. Pat. No. 7,123,269 B1 likewise describes the modification of vector objects. After the user has selected particular sections of an image with a large number of vector objects, parameters of selectively determined vector objects can be changed and the changed vector objects can be represented again.
  • The laid-open specification US 2008/0303826 A1 discloses a method and a system for the animated representation of objects corresponding to data items by means of an intuitive input language.
  • The problem with all solutions in the prior art is that an animated object can only ever be represented in the drawing program. Alternatively, an image sequence of the animated object can be exported by the drawing program, in which case the image sequence and particular parameters of the image sequence, for example the viewing angle, cannot be subsequently changed.
  • For example, the so-called 3D-PDF from Adobe Systems Incorporated contains, as standard, rudimentary animation functions which can be used to animate bodies in the form of puppets. However, surface changes of the bodies cannot be changed, for example deformed, expanded or produced as a wave movement on the body, with the aid of the animation functions. In particular, it is currently not possible to interactively represent the disintegration of the body on account of external forces, for example as melting or exploding, in the 3D-PDF format. Therefore, it is currently not possible to run a precalculated animation and to interactively change the latter by superimposing object sequences and textures or to interactively change the viewing angle of the animated object during animation.
  • BRIEF SUMMARY OF THE INVENTION
  • The invention is based on the object of providing a fast and resource-saving method for representing an animated object, in which case the representation of the animated object can be interactively changed by a user.
  • The object is achieved by means of a method having the features according to claim 1. The invention provides a method for representing an animated object in the form of an animation sequence, in which a sequence of individual objects is generated for each moment of the animation sequence of the object. The individual objects fully depict the object at the respective moment and may be two-dimensional or three-dimensional representations of the object.
  • An object in the sense of the present invention is a two-dimensional or three-dimensional graphical representation of a real or a computer-generated item, for example a ship or a moving surface of the sea.
  • The individual objects are joined in succession for each moment and form the principal part of the animation sequence of the object. The surface changes of the object are then calculated as a texture animation. Texture animations are two-dimensional effects of the three-dimensional object which are projected onto the surface of the three-dimensional object in a similar manner to a video projector. Surface changes in the sense of the invention may be, for example, deformations of the surface, material changes of the surface, liquid movements on or in the object or lighting effects. Lighting conditions can be changed by means of texture animation, as can color effects and two-dimensional movements, but not the shape and design of the three-dimensional object. The calculated surface changes as a texture animation are then projected onto the object in the animation sequence. The impression of an animated object is produced for a human viewer by simultaneously running the animation sequence as a sequence of the individual objects with the texture animation on the object.
  • The simulation of the object on the basis of internal or external forces is calculated using a three-dimensional animation program, for example “Blender” or “Lightwave3D”. Alternatively, the animation program may also be integrated inside a program with a vector-based page description language. As part of the representation of the simulation, the object is represented in a simple manner and the sequence of the individual objects for defined moments is output. After the physical animations have been calculated in a protracted manner, an animation sequence which can be played as a sequence of the three-dimensional individual objects is obtained. Consequently, the viewing angle of the animated object can be interactively moved. A plurality of such precalculated animation and/or texture sequences can be freely assembled and can be combined with one another via a controller in such a manner that the visual impression of being able to change the physical properties of the animated object to a limited extent is produced for a human viewer.
  • Advantageously, 25 objects are produced per second, which, on account of the physiology of the eye, makes it possible for the viewer to perceive the running animation sequence with texture animation in a jerk-free manner. These individual objects, which are calculated in a complicated manner, are then played in succession at high speed. The principle corresponds to the presentation of film, which likewise simulates a movement from a large number of still individual images by rapidly playing the latter.
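  • As a minimal sketch of this playback principle (the frame list and the draw routine are assumptions, not part of the described method), precalculated individual objects could be presented at 25 per second as follows:

    # Minimal sketch: present precalculated individual objects at 25 per second.
    import time

    FRAME_RATE = 25                      # individual objects per second
    FRAME_TIME = 1.0 / FRAME_RATE

    def play_sequence(individual_objects, draw):
        """`draw` stands in for whatever routine the playback environment provides."""
        for obj in individual_objects:
            start = time.perf_counter()
            draw(obj)                                     # show the current individual object
            elapsed = time.perf_counter() - start
            time.sleep(max(0.0, FRAME_TIME - elapsed))    # keep the 25 per second pacing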
  • A human viewer is provided with the impression of an animated object by integrating the texture animation as a simulated change of the surfaces of the object with the simultaneous running of the individual objects. This animation sequence of the animated object, which has been combined in this manner, requires less storage capacity on account of the reduced data density in comparison with playback in the drawing program. This makes it possible to interactively represent the animated object since the respective individual objects are present in full and the texture animations, for example in the form of a light or flow simulation, have likewise been determined on the basis of the animated object. Therefore, when interactively changing the viewing angle of the animated object, there is no need to determine a new animation sequence in the drawing program with sometimes a high degree of computational complexity, as previously.
  • One advantageous refinement of the method provides for the sequence of the individual objects to be created using a vector-based drawing program and to be assembled to form the animation sequence using a vector-based page description language. The individual objects are advantageously output via the conventional export and/or storage functions of the vector-based drawing program. Alternatively, provision is made for the individual objects to be filtered out from the vector-based drawing program and from the graphics memories used by the vector-based drawing program or to be recorded using grabbing software tools. In this case, the grabbing software tool may either be used autonomously or is part of the program written in a vector-based page description language.
  • In this case, the grabbing software tool may be part of the vector-based drawing program or may be a separate program, for example a so-called gadget as a software application. The grabbing software tool has direct access to the geometry memory, to the texture memory of the graphics card and/or to the graphics output, for example in the OpenGL or DirectX output format, in order to filter out the individual objects there.
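  • A hedged sketch of the simplest form of such grabbing is shown below: the rendered graphics output is read back from the OpenGL colour buffer and stored as an image file, for example for use as a texture frame. It assumes that a rendering context created by the drawing program is current and uses PyOpenGL and Pillow purely for illustration; grabbing geometry from the geometry memory would need API-specific hooks and is not shown.

    # Sketch of grabbing one rendered frame from the OpenGL graphics output.
    from OpenGL.GL import glReadPixels, GL_RGBA, GL_UNSIGNED_BYTE
    from PIL import Image

    def grab_frame(width, height, path):
        raw = glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE)
        image = Image.frombytes("RGBA", (width, height), bytes(raw))
        image = image.transpose(Image.Transpose.FLIP_TOP_BOTTOM)  # OpenGL delivers rows bottom-up
        image.save(path)                                           # e.g. "frame_0001.png"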
  • This makes it possible, for example, to combine the simulation of a ship in conjunction with an animation of sea waves in such a manner that the ship is interactively viewed using a program written in a vector-based page description language. In one exemplary use of the method, a ship can be created as an object in the “Lightwave3D” program. The ship object is then illuminated and the resultant light simulation is stored in a graphics file, for example in the JPG format, with the result that the lighting moment can be recorded and can be projected, as a texture animation, onto the animated object in the form of a ship as part of the animation sequence. After the movement of the ship has been calculated as a movement of the object, the sea waves surrounding the object are stored as further objects with an additional texture animation. The resultant effects and interactions between the objects are simulated for each moment and are stored as respective individual objects in simple object models. In this respect, the individual object and optionally a possibly associated texture animation for the respective object are stored for each individual object of the object “ship” and for all individual objects of the objects “sea waves”.
  • The individual objects and texture animations stored in this manner in respective files are now imported into a program with a vector-based page description language, for example the program “Acrobat 3D Toolkit” from Adobe Systems Incorporated. After the animated objects have been grouped with respect to one another and the texture animations have been projected onto the objects, the animation sequence can be visualized. The animated objects are combined as a sequence of the individual objects in the program with the vector-based page description language. Alternatively, the already existing sequence of the individual objects is imported into the program with the vector-based page description language.
  • The texture animation of the object is advantageously created using the vector-based drawing program and is combined with the object using the program with the vector-based page description language. This has the advantage that the sequence of the individual objects and the texture animation are directly joined with the aid of programs with a vector-based page description language and can be played in a manner virtually independent of the platform. Particularly programs with a vector-based page description language make it possible to represent the objects in a manner independent of the platform.
  • One advantageous refinement of the method provides for the texture animation of the object to be determined on the basis of a numerical simulation and to be combined with the object using the vector-based page description language.
  • The calculation and simulation of the object behavior as an animation sequence or of the texture of the object as a texture animation are carried out using corresponding basic equations, for example taking the lattice Boltzmann method as a basis for taking into account internal and external frictional forces for simulating liquid behavior. Liquids can thus be calculated in a physically correct manner in the sense of a simulation and allow the human viewer to be given a visual impression of the sequences connected with the animated objects and interactions between the objects. For example, the flow behavior of different liquids inside an object, for example inside a pipe, can be simulated.
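  • For illustration, a minimal two-dimensional lattice Boltzmann relaxation (the D2Q9 scheme with BGK collision) can be written in a few lines; the grid size, relaxation time and initial flow below are arbitrary example values and periodic boundaries are used, so this is a sketch of the technique named above rather than the simulation of the method itself.

    # Minimal D2Q9 lattice Boltzmann sketch with BGK collision and periodic boundaries.
    import numpy as np

    nx, ny, tau = 200, 80, 0.6                      # lattice size and relaxation time
    ex = np.array([0, 1, 0, -1, 0, 1, -1, -1, 1])   # D2Q9 lattice velocities
    ey = np.array([0, 0, 1, 0, -1, 1, 1, -1, -1])
    w = np.array([4/9] + [1/9]*4 + [1/36]*4)        # lattice weights

    def equilibrium(rho, ux, uy):
        cu = 3.0 * (ex[:, None, None] * ux + ey[:, None, None] * uy)
        usq = 1.5 * (ux**2 + uy**2)
        return w[:, None, None] * rho * (1.0 + cu + 0.5 * cu**2 - usq)

    rho = np.ones((ny, nx))
    ux = np.full((ny, nx), 0.05)                    # small uniform flow to the right
    uy = np.zeros((ny, nx))
    f = equilibrium(rho, ux, uy)

    for step in range(1000):
        rho = f.sum(axis=0)                                   # macroscopic density
        ux = (ex[:, None, None] * f).sum(axis=0) / rho        # macroscopic velocity
        uy = (ey[:, None, None] * f).sum(axis=0) / rho
        f += (equilibrium(rho, ux, uy) - f) / tau             # BGK collision step
        for i in range(9):                                    # streaming along each direction
            f[i] = np.roll(np.roll(f[i], ex[i], axis=1), ey[i], axis=0)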
  • The advantage is considered to be the fact that a further texture animation is created as a background plane of the object using the vector-based drawing program and is combined with the object as a background plane to form the animation sequence using the vector-based page description language. For this purpose, the background plane need not have any objects to be animated but rather the animated object with texture animation is projected against the background of an exclusive texture animation. This dispenses with computation operations since no individual objects have to be created and joined for the background plane.
  • As a result of the fact that the texture animation is generated independently of the individual objects or the objects, the texture animation can be interactively varied on the basis of predefinable boundary conditions and the respectively varied texture animation can be interactively projected onto the animated object using the vector-based page description language. Changes in the texture animation, for example lighting conditions or a changed material behavior, can thus be interactively changed by the user or on the basis of specifications and can then be projected onto the object to be animated. This makes it possible not only to interactively view the animated object from all sides in a program with a vector-based page description language but also to interactively change the texture animation in the program with the vector-based page description language at the same time.
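  • The following sketch hints at how such an interactive variation could look in code: a precomputed lighting texture sequence is rescaled from user parameters and then modulated onto the object's base texture. The simple multiplication used as the "projection", and all names, are simplifying assumptions.

    # Sketch: vary a texture animation from user parameters and modulate it
    # onto the object's base texture (all arrays are float images in [0, 1]).
    import numpy as np

    def vary_lighting(light_frames, intensity=1.0, tint=(1.0, 1.0, 1.0)):
        tint = np.asarray(tint)
        return [np.clip(frame * intensity * tint, 0.0, 1.0) for frame in light_frames]

    def project_onto_object(base_texture, light_frame):
        return np.clip(base_texture * light_frame, 0.0, 1.0)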
  • A viewing angle of the animated object is advantageously interactively controlled using the vector-based page description language. Since the individual objects are in the form of two-dimensional or three-dimensional object bodies, the latter can also be viewed from all sides. Since the respective texture animations are likewise projected onto the respectively associated objects, the viewing angle can also be changed during the animation sequence of the animated object. This change in the viewing angle of the animation sequence of the animated object was not possible in previous object representations in vector-based page description languages. For this purpose, it is likewise important for a light texture to be calculated on the basis of the object and to be projected onto the object on the basis of the viewing angle.
  • So that the representation of the animation sequence of the animated object can also be played when there are few computer resources, the object is composed of polygons and/or triangles and/or non-uniform rational B-splines and/or voxels. Non-uniform rational B-splines (NURBS for short) are mathematically defined curves or areas which are used to model any desired shapes in the field of computer graphics. The geometrical information is represented using geometrical elements which are functionally defined piece by piece. Any desired technically producible or natural shape of an object or sections of an object can be represented with the aid of NURBS.
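  • As a small worked example of the NURBS representation mentioned above (the control points, weights and knot vector are freely chosen), one point on a rational B-spline curve of degree 2 can be evaluated with the Cox-de Boor recursion:

    # Evaluate one point on a NURBS curve; example data, degree 2.
    import numpy as np

    def basis(i, p, t, knots):
        """Cox-de Boor recursion for the B-spline basis function N_{i,p}(t)."""
        if p == 0:
            return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
        left = right = 0.0
        if knots[i + p] != knots[i]:
            left = (t - knots[i]) / (knots[i + p] - knots[i]) * basis(i, p - 1, t, knots)
        if knots[i + p + 1] != knots[i + 1]:
            right = (knots[i + p + 1] - t) / (knots[i + p + 1] - knots[i + 1]) * basis(i + 1, p - 1, t, knots)
        return left + right

    def nurbs_point(t, ctrl, weights, knots, p=2):
        """Rational combination of the weighted control points at parameter t."""
        n = np.array([basis(i, p, t, knots) for i in range(len(ctrl))])
        wn = weights * n
        return (wn[:, None] * ctrl).sum(axis=0) / wn.sum()

    ctrl = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 2.0], [4.0, 0.0]])   # example control points
    weights = np.array([1.0, 2.0, 2.0, 1.0])
    knots = np.array([0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0])               # clamped knot vector
    print(nurbs_point(0.25, ctrl, weights, knots))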
  • The advantage is considered to be the fact that a first sequence of individual images of the object for a first animation sequence is combined with a second sequence of individual images of the object for a second animation sequence. As a result of the fact that a plurality of animation sequences of the object are provided, the animation sequences can be played either at the same time or alternatively. Animation sequences can therefore be replaced with one another by the user during the playback operation. The user is therefore provided with extensive variation possibilities within the vector-based page description language in conjunction with the interactive control of the viewing angle.
  • First surface changes for the first sequence of individual images of the object are advantageously calculated as a first texture animation and second surface changes for the second sequence of individual images of the object are calculated as a second texture animation and are joined with the animation sequences using the vector-based page description language. This results in extensive possibilities for varying the texture animations with respect to the animated object within the vector-based page description language. In conjunction with the presence of alternative animation sequences as a sequence of individual objects which have been varied, extensive variations of the animated object, which either relate only to the surface of the animated object as a texture animation or even relate to the movement and the object per se as an animation sequence, can be carried out within the vector-based page description language.
  • Surface changes are respectively advantageously determined for a plurality of objects and/or the background plane and are joined using the vector-based page description language.
  • One advantageous refinement of the method provides for the first and last individual objects of the animation sequence to be matched to one another in such a manner that an endless loop of the animation sequence can be represented. The starting and end objects of the animation sequence are advantageously matched using a so-called loop editor. The endless loop can also be used with respect to selected individual objects in the central part of the animation sequence. The loop editor represents the calculated sequences of the individual objects in pictograms or as an object representation on a timeline—similar to a video editing program or a node editor which is known from 3D animation programs for the overview of the graphical programming of shaders, for example. In this case too, sequences of the individual objects can be copied, deleted and cut. An additional window in which the animation sequence of the object currently being animated runs is integrated in the loop editor. The first window shows, in animated form, the region which is currently being edited, and the second window shows the first and last individual objects overlapping one another, displayed in a semi-transparent manner. Optionally, the first ten individual objects and the last ten individual objects may also be represented in animated form in order to see where the cut is best placed. An additional function of the loop editor automatically brings the start and end of the endless loop closer to one another.
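One plausible way such an automatic matching step could be implemented, sketched here purely for illustration: candidate start/end frames are scored by how closely their vertex positions agree, and the pair with the smallest difference is suggested as the loop cut. The frame representation and the mean-squared metric are assumptions, not part of the described loop editor.

```python
# Hedged sketch of an automatic loop-cut suggestion. Frames are assumed to be a
# list of (N x 3) vertex arrays with identical topology.
import numpy as np

def frame_distance(a, b):
    """Mean squared vertex displacement between two individual objects."""
    return float(np.mean(np.sum((a - b) ** 2, axis=1)))

def best_loop_cut(frames, min_length=10):
    """Return (start_index, end_index) whose individual objects match most closely."""
    best = (0, len(frames) - 1)
    best_score = frame_distance(frames[0], frames[-1])
    for s in range(len(frames)):
        for e in range(s + min_length, len(frames)):
            score = frame_distance(frames[s], frames[e])
            if score < best_score:
                best, best_score = (s, e), score
    return best
```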
  • As a result of the fact that a virtually seamless transition between the starting individual object and the end individual object is defined, the animation sequence can be played in a virtually endless manner without the transitions from the end individual object to the starting individual object being apparent to the human viewer. One advantageous refinement of the method provides for corresponding boundary conditions and/or parameters to already be set when simulating and generating the sequence of the individual objects in such a manner that the end individual object of the animation sequence virtually corresponds to the starting individual object.
  • If a transition from the end individual object to the starting individual object of the animation sequence is not possible, the animation sequence is divided into partial sequences. For example, when simulating the flow of water through a curved pipe, the animation sequence can be such that the flowing of the water into the pipe is not repeated within an endless loop. The water begins to run and runs through the pipe until the desired end of the animation sequence. In the central part of the animation sequence, water has already reached the end of the pipe and flows through the pipe, with the result that there are presumably no longer any great changes in the flow properties and this central sequence can therefore be repeated. In an end sequence of the animation sequence, the switching-off of the water and the emptying of the pipe can then be represented as a sequence which cannot be repeated.
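The playback order that results from such a division into partial sequences can be pictured with a small generator: a non-repeatable intro (the water starts to flow), a repeatable central part, and a non-repeatable outro (the pipe empties). This is only a toy sketch of the playback logic; the function and its arguments are invented for the example.

```python
# Illustrative sketch, not the patented loop editor itself: play the partial
# sequences of an animation sequence in order, repeating only the central part.
def play_partial_sequences(intro, middle, outro, middle_repeats=3):
    """Yield individual objects in order: intro once, middle looped, outro once."""
    for obj in intro:
        yield obj
    for _ in range(middle_repeats):
        for obj in middle:
            yield obj
    for obj in outro:
        yield obj

# Example with frame labels standing in for individual objects 2a, 2b, ...
sequence = list(play_partial_sequences(["in1", "in2"], ["mid1", "mid2"], ["out1"]))
```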
  • In the loop editor, the human viewer can define and configure the partial sequences of the animation sequence on the basis of his experience in such a manner that the partial sequences which are possibly configured in different ways form the animation sequence. The sequence of the individual objects which is defined in this manner can be played as desired within a program with a vector-based page description language, for example as a 3D PDF, or alternatively in other playback environments such as the "Silverlight" program from the Microsoft Corporation or the "Flash" software tool from Adobe Systems Incorporated. In addition, the sequence of the individual objects can also be subsequently modified in the vector-based page description language. Further graphical optimizations, for example a polygon reduction, can likewise be carried out. The animation sequence created in this manner can then be read into a program with a vector-based page description language and can be used as a Flash animation or in the form of a control file for an interactive three-dimensional object.
  • In addition to synchronizing the starting individual objects and the end individual objects in an animation sequence, the respective texture sequence can be synchronously played in an endless loop, the first and last individual objects of the animation sequence and the associated surface change as a texture of the first and last individual objects being virtually identical. This provides the human viewer with the greatest possible range of variation possibilities, with the result that, in addition to the possibility of selecting from a plurality of animation sequences, different texture animations for the respectively selected animation sequence can also be compiled by the human viewer in the vector-based page description language.
  • One advantageous refinement of the method provides for the sequence of the individual images and/or the texture animation and/or the light texture to be displayed in a display unit.
  • In order to reduce the computational complexity for simulating and determining the individual objects for the animation sequence, only sectional planes of the individual objects are determined on the basis of predefinable planes and are displayed on the basis of these planes. Even if the behavior of a three-dimensional object is intended to be simulated and animated, the animation sequence is composed only of sections or sectional planes of the individual objects. This minimizes the computational complexity and the data size of the animation sequence. Provision is made for different animation sequences to be created with respect to different sectional planes and sections. The respective sections or sectional planes of the individual objects can then be viewed by a human viewer in a quick and simple manner by interactively selecting the respectively desired animation sequence and thus the respectively desired sectional plane and it is possible to switch back and forth between the individual animation sequences by means of the vector-based page description language.
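A rough sketch of restricting the output to predefinable sectional planes, under the assumption that the simulated individual objects are available as 3D voxel grids: only a handful of axis-aligned 2D slices are retained per frame, which keeps the displayed animation sequence small. The grid size, the slicing axis and the plane positions are example values only.

```python
# Keep only selected 2D sectional planes of a 3D voxel simulation per frame.
import numpy as np

def extract_sectional_planes(volume_frames, axis, positions):
    """For each 3D frame, keep only the 2D slices at the given index positions."""
    slices_per_frame = []
    for volume in volume_frames:
        slices_per_frame.append([np.take(volume, pos, axis=axis) for pos in positions])
    return slices_per_frame

# Example: three planes out of a 64^3 simulation volume, over five frames.
frames = [np.random.rand(64, 64, 64) for _ in range(5)]
planes = extract_sectional_planes(frames, axis=2, positions=[16, 32, 48])
```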
  • The sequence of the individual objects and/or the texture animation is/are determined using a simulation unit. By virtue of the fact that the sequence of the individual objects and/or the texture animation does/do not have to be calculated using expensive and complex simulation programs, it is possible to use a simulation unit tailored to the problem to be simulated. Simulation programs require comprehensive operation and control which can only be carried out by specially trained experts. In contrast, the simulation unit can be designed in such a manner that only a minimum amount of storage space is required and the input by a layman is also possible. For example, the parameter input carried out by the user can create a flow of water through a pipe, which, according to a simulation behavior, can meet a collision object (here the pipe), where the objects, in the form of water constituents, are then distributed according to the simulation. The real-time representation is effected using particle points or voxels for rapid understanding or on the basis of predefined sectional planes of the collision object. The collision object is precisely defined by locating the areas of polygons/triangles, that is to say by reading the boundaries of the collision object or by specifications from the user.
  • For this purpose, the user must open a toolbox and can then choose between different pipe cross sections, so-called “shapes”. The user then specifies a radius for the diameter of the pipe as a collision object or interactively defines it using a graphical selection. The user then draws the line through the pipe as a collision object. Curve tools, distributing guides and other tools are available for creating the path. The user can selectively choose the shapes using the toolbox. Alternatively, this task can be transferred to a computer which selects the shapes in an automated method.
  • The real-time preview of the animation sequence is then represented in the path which has been created and the user can work with the parameterization. If the user is satisfied with the simulation, he can start the complex three-dimensional simulation by pressing a button and the animation sequence derived therefrom can be output. Optionally, the user can also generate texture animations which are subsequently projected onto the finished animation sequence.
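A heavily simplified stand-in for the real-time preview produced by such a simulation unit is sketched below: point particles are advected along a straight pipe axis and clamped to the pipe radius so that they stay inside the collision object. All names and parameter values are invented for illustration; the actual simulation behavior described above is considerably more elaborate.

```python
# Toy particle preview of flow through a straight pipe (collision object).
import numpy as np

def preview_pipe_flow(n_particles=200, radius=1.0, speed=0.2, steps=50, seed=0):
    rng = np.random.default_rng(seed)
    # random start positions at the pipe inlet (x = 0); clamped to the wall below
    y, z = rng.uniform(-radius, radius, (2, n_particles))
    x = np.zeros(n_particles)
    frames = []
    for _ in range(steps):
        x = x + speed                          # advect along the pipe axis
        r = np.sqrt(y ** 2 + z ** 2)
        too_far = r > radius
        y[too_far] *= radius / r[too_far]      # clamp particles to the pipe wall
        z[too_far] *= radius / r[too_far]
        frames.append(np.stack([x, y, z], axis=1).copy())
    return frames
```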
  • The present method likewise makes it possible to import an area defined by the user, for example a water surface, into the simulation unit. The user can then input the necessary parameters, for example the wind direction and strength, and the simulation of a wave can thus be calculated. Additional objects such as ships can be placed on the water surface. The wave movements produce further waves and spray. The animation sequence indicates geometries and textures for the rough preview.
  • If the user is satisfied with the simulation, he can start the complex three-dimensional simulation by pressing a button and the finished animation sequence can be output. With this form of simulation, the closed water surface is broken up into sections; the spray first of all consists of particles, polygons or volume objects such as “voxels” and is subsequently likewise broken up into sections. The impinging drops of water of the spray and the breaking waves are stored as textures on the water surface. Both methods can be represented in real time separately or together using more modern and more powerful computers or may likewise be exported and/or processed as 3D object sequences.
  • Selected properties for the surface simulations of the water surface, such as color or transparency, can likewise be changed by the user. The wave movements of the water surface are created using so-called centers in which the waves arise. The method makes it possible for the wave to carry information on its volume, force and speed, so that correct force distributions can be applied to further objects, such as a wall or a ship, and everything moves in a physically correct manner. Alternatively, a Flash program can also be used to play the animation sequence.
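One way to picture the wave centers mentioned above is as a superposition of circular waves radiating from user-defined points, each carrying its own amplitude (a stand-in for volume and force), speed and wavelength. The height-field sketch below is an assumption chosen for illustration, not the concrete wave model of the method.

```python
# Water surface height as a sum of circular waves radiating from "centers".
import numpy as np

def water_surface(grid_x, grid_y, centers, t, damping=0.5):
    """Height field at time t as a superposition of radially expanding waves."""
    height = np.zeros_like(grid_x)
    for (cx, cy, amplitude, speed, wavelength) in centers:
        r = np.sqrt((grid_x - cx) ** 2 + (grid_y - cy) ** 2)
        k = 2 * np.pi / wavelength
        height += amplitude * np.exp(-damping * r) * np.cos(k * (r - speed * t))
    return height

# Two wave centers on a 128 x 128 patch of water, evaluated for one texture frame.
xs, ys = np.meshgrid(np.linspace(0, 10, 128), np.linspace(0, 10, 128))
frame = water_surface(xs, ys, centers=[(2, 3, 0.5, 1.0, 2.0), (7, 6, 0.3, 1.5, 1.0)], t=0.4)
```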
  • The portable document format (PDF) data format is advantageously used as the vector-based page description language. However, other platforms such as Microsoft's “Silverlight”, Adobe's “Flash” etc. can also be alternatively used. The advantage is considered to be the fact that the data size of the animation sequence is reduced to the image size of a display unit.
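The statement that the data size of the animation sequence is reduced to the image size of a display unit can be read, for instance, as capping the stored texture resolution at the resolution the display can actually show. The sketch below is one possible, deliberately naive reading of that idea; the function and the nearest-neighbour downsampling are assumptions made for the example.

```python
# Cap texture frames at the display resolution before embedding them.
import numpy as np

def cap_to_display(frame, display_w, display_h):
    """Nearest-neighbour downsample of an (H x W x C) frame to at most display size."""
    h, w = frame.shape[:2]
    step_y = max(1, int(np.ceil(h / display_h)))
    step_x = max(1, int(np.ceil(w / display_w)))
    return frame[::step_y, ::step_x]

frames = [np.random.rand(2048, 2048, 3) for _ in range(4)]
reduced = [cap_to_display(f, display_w=1024, display_h=768) for f in frames]
```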
  • A computer program and a computer program product also achieve the object, the computer program product being stored in a computer-readable medium and comprising computer-readable means which cause a computer to carry out the method according to the invention when the program runs in the computer. The present invention may be implemented in the form of hardware, software or a combination of hardware and software. Any type of system or any other apparatus set up to carry out the method according to the invention is suitable for this purpose. The present invention may also be integrated in a computer program product which comprises all of the features that enable it to implement the computer-assisted methods described here and which, after being loaded into a computer system, is able to carry out these methods.
  • In the present context, the terms “computer program” and “computer program product” should be understood as meaning any expression in any desired computer language, code or notation of a set of instructions which enable a computer system to process data and thus to perform a particular function. The computer program or the computer program product can be executed on the computer system either directly or after conversion into another language, code or notation or by means of representation in another material form.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING
  • Further advantageous refinements are found in the subclaims. The present invention is explained in more detail using the exemplary embodiments in the figures, in which, by way of example,
  • FIG. 1 shows a flowchart with the essential method steps;
  • FIG. 2 shows a diagrammatic illustration of the essential method steps;
  • FIG. 3 shows a perspective view of an object with different sectional planes;
  • FIG. 4 shows a view of an object in a vector-based page description language program;
  • FIG. 5 shows a diagrammatic illustration of the essential method steps with a preview program.
  • DESCRIPTION OF THE INVENTION
  • FIG. 1 shows a flowchart with the essential method steps. After the method has been started, the object 1 (not illustrated) is simulated 10. In a drawing program 5 (not illustrated), for example Lightwave3D, or in a simulation unit, the behavior of the object 1 is simulated and is generated in temporally successive individual objects 2 a, 2 b, 2 c, 2 d, 2 e, 2 f, 2 g (not illustrated).
  • In addition, the texture of the animated object 1 is generated and simulated 12 either in the drawing program 5 or by means of a separate editor 6 (not illustrated) and is stored as a texture animation 4 a, 4 b (not illustrated). Alternatively, the texture animation can also be simulated 12 in a parallel manner to the simulation of the object 10 or completely independently of the simulation of the object 10. The individual objects are then joined 13, as a temporal sequence, to form an animation sequence 3 a, 3 b (not illustrated). Alternatively, the individual objects 2 a, 2 b, 2 c, 2 d, 2 e, 2 f, 2 g may also be individually read from the drawing program 5 and then viewed in a further preview program 17 (not illustrated) as a previewer and/or can be assembled as a sequence and thus as an animation sequence 3 a, 3 b. The texture animation 4 a, 4 b is then projected 14 onto the animation sequence 3 a, 3 b and thus depicts the surface changes of the animated object 1. The playback of the animation sequence 3 a, 3 b with the texture animation 4 a, 4 b is repeated using a loop operation 16 until an internal condition occurs or a human viewer terminates the process.
  • FIG. 2 shows a diagrammatic illustration of the basic method sequences. In the drawing program 5, the individual objects 2 a, 2 b, 2 c are generated and the temporal behavior is simulated. In the drawing program 5, the individual objects 2 a, 2 b, 2 c are already joined, as a sequence, to form an animation sequence 3 a. The animation sequence 3 a is indicated by the vertical lines between the individual objects 2 a, 2 b, 2 c and is intended to indicate the temporal sequence of the individual objects 2 a, 2 b, 2 c. In a further program 6, the textures are connected to form a texture animation 4 a, the texture animation 4 a again being indicated by the vertical lines between the textures. The animation sequence 3 a and the texture animation 4 a are loaded into a program with a vector-based page description language 7 and can be played in a display unit 18 by means of control elements 8 a, 8 b. The human user can use the control elements 8 a, 8 b to control the course and speed of the animation sequence 3 a with the texture animation 4 a.
  • FIG. 3 shows a perspective view of an object 1 with different sectional planes 9 a, 9 b, 9 c, 9 d, in which case not all of the sectional planes illustrated are provided with a reference sign, for the sake of clarity. The animation sequence 3 a, 3 b (not illustrated) and the texture animation 4 a, 4 b, 4 c (not illustrated) are calculated on the basis of a simulation for the entire object 1. However, the actual animations 3 a, 3 b, 4 a, 4 b, 4 c are only illustrated and projected for predefinable sectional planes 9 a, 9 b, 9 c, 9 d, 9 f, 9 g. In the example shown in FIG. 3, three radially running sectional planes 9 a, 9 b, 9 c subdivide the interior of a tubular object 1. Furthermore, three axially running sectional planes 9 d, 9 f, 9 g run inside the pipe as an object 1. The animation sequence 3 a, 3 b and the texture animation 4 a, 4 b, 4 c are output only for the sectional planes 9 a, 9 b, 9 c, 9 d, 9 f, 9 g, which requires only small computation capacities.
  • The three-dimensional simulation behavior of the animated object 1 can be represented in two dimensions by representing a three-dimensional object behavior in the two-dimensional sectional planes 9 a, 9 b, 9 c, 9 d, 9 f, 9 g—if appropriate on the basis of automatic detection of the object geometries during the simulation 10. During the simulation 10, a path is automatically created to which the 2D textures of the real-time fluid simulation of the animation sequence 3 a, 3 b can be tied.
  • Overall, depending on the computer capacity, a plurality of such “2D slices” as sectional planes 9 a, 9 b, 9 c, 9 d, 9 f, 9 g are placed in the pipe as an object 1 and are horizontally and vertically interleaved. In the case of 3×3 sectional planes 9 a, 9 b, 9 c, 9 d, 9 f, 9 g, as in the example shown in FIG. 3, nine animation sequences 3 a, 3 b and/or texture animations 4 a, 4 b, 4 c are thus created and therefore, in combination, provide a user with a very realistic three-dimensional impression of the behavior of the animated object.
  • FIG. 4 shows an excerpt from an animation sequence 4 a (not illustrated) of the animated object 1. In the example shown in FIG. 4, the object behavior of the static pipe does not need to be calculated. The medium flowing through the pipe is simulated using a simulation and is stored as individual objects 2 a, 2 b, 2 c. The surface change of the liquid when flowing through the pipe is then simulated and stored as a texture animation 4 a, 4 b. In the program with the vector-based page description language, for example 3D-PDF or Microsoft's “Silverlight” or Adobe's “Flash”, the animation sequence 3 a, 3 b and the texture animation 4 a, 4 b, 4 c can then be joined and played. On account of the simple type of programming, it is also possible to define buttons and icons as control elements 8 a, 8 b in the vector-based page description language, which control elements make it possible to play the animation sequence 3 a, 3 b with the texture animation 4 a, 4 b, 4 c. At the same time, the human viewer can use the control elements 8 a, 8 b to interactively change the viewing angle of the animated object 1.
  • FIG. 5 shows a diagrammatic illustration of the essential method sequences with a preview program 17. In the drawing program 5, the individual objects 2 a, 2 b, 2 c, 2 d, 2 e, 2 f, 2 g are generated on the basis of a simulation predefined by the user. Corresponding textures with regard to the surface changes of the individual objects 2 a, 2 b, 2 c, 2 d, 2 e, 2 f, 2 g are determined in the further program 6. The temporal sequence of the respectively associated individual objects 2 a, 2 b, 2 c, 2 d, 2 e, 2 f, 2 g can be viewed in the preview program 17, a user then being able to generate an animation sequence 3 a, 3 b from this preview. The sequence of the textures can likewise be viewed and a texture animation 4 a, 4 b, 4 c can then be created. The animation sequences 3 a, 3 b and the texture animations 4 a, 4 b, 4 c are then loaded into the program with the vector-based page description language 7 and can be played in a display unit 18 using control elements 8 a, 8 b. In this case, during the running of a first animation sequence 3 a, 3 b with a first texture animation 4 a, 4 b, 4 c, the user is likewise able to interactively mutually combine the animation sequence 3 a, 3 b and/or the texture animation 4 a, 4 b, 4 c and to view it/them from different viewing angles. A simulation unit in which the individual objects 2 a, 2 b, 2 c, 2 d, 2 e, 2 f, 2 g generated in the drawing program 5 and/or the textures generated in the further program 6 are simulated may likewise be integrated in the preview program 17.
  • In the program with the vector-based page description language 7, the user starts a program or a plug-in and imports the individual objects 2 a, 2 b, 2 c, 2 d, 2 e, 2 f, 2 g using Adobe Acrobat 3D Toolkit or as an OBJ file or as further supporting file formats.
  • For example, in the case of simulation of the flow through a pipe according to the examples in FIG. 3 and FIG. 4, the user is then able to define the pipe as an object 1 and selects, for example, the radius of the pipe and the resolution of the further objects as volume bodies of the fluid. The user can then view a corresponding preview of the individual objects 2 a, 2 b, 2 c, 2 d, 2 e, 2 f, 2 g and/or of the textures and the temporal sequence of the individual objects 2 a, 2 b, 2 c, 2 d, 2 e, 2 f, 2 g and/or of the textures in the preview program 17 in a display unit. For this purpose, the model behavior of the individual objects 2 a, 2 b, 2 c, 2 d, 2 e, 2 f, 2 g and/or of the textures is calculated for a plurality of sectional planes 9 a, 9 b, 9 c, 9 d, 9 e, 9 f and is represented in the three-dimensional pipe as an object 1 by means of a real-time preview.
  • Furthermore, the parameters and properties of the fluid can be interactively changed during the preview. There is also a standard setting which adopts materials such as water, oil or gases into the parameter setting at the click of a mouse. While the user is making his parameter input, he can see, in the preview program 17 in real time, how the settings directly change in the previewer. The user can likewise also set and interactively change the type of textures. When the user is satisfied with the setting, he presses a button "Create 3D sequence" and must also specify the formula according to which he would like to have the physical simulation calculated. All parameters are then adopted into the 3D engine and the animation sequence 3 a, 3 b either with or without a texture animation 4 a, 4 b, 4 c is calculated 13, 14 from polygons, particles and/or volume models such as voxels. The animation sequences 3 a, 3 b and/or texture animations 4 a, 4 b, 4 c calculated in this manner are stored. The animation sequences 3 a, 3 b and/or texture animations 4 a, 4 b, 4 c are then edited in the loop editor in such a manner that a starting part, a central part and an end part of the animation sequences 3 a, 3 b and/or texture animations 4 a, 4 b, 4 c are available and can be loaded into the 3D-PDF program or into other platforms such as Microsoft's "Silverlight" or Adobe's "Flash". If necessary, the central part of the animation sequences 3 a, 3 b and/or texture animations 4 a, 4 b, 4 c can be run through again in certain loops. In order to reduce the file size of the animation sequences 3 a, 3 b and/or texture animations 4 a, 4 b, 4 c, the user uses the function of the object detail reduction in the loop editor to reduce the resolution of the individual objects 2 a, 2 b, 2 c, 2 d, 2 e, 2 f, 2 g and/or of the animation sequence 3 a, 3 b and/or of the texture animation 4 a, 4 b, 4 c. Definitive animation sequences 3 a, 3 b and/or texture animations 4 a, 4 b, 4 c are now automatically created using a control element 8 a, 8 b and are placed directly in the program with the vector-based page description language, for example 3D-PDF.
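One generic technique that could serve as such an object detail reduction is vertex-clustering decimation, sketched below purely for illustration: vertices are snapped to a coarse grid and merged, and triangles that collapse are dropped. It is not necessarily the reduction used by the loop editor described above, and the function names and grid size are assumptions.

```python
# Vertex-clustering decimation as one possible "object detail reduction" step.
import numpy as np

def reduce_mesh(vertices, triangles, cell_size):
    """Return a coarser (vertices, triangles) pair via vertex clustering."""
    keys = np.floor(np.asarray(vertices, dtype=float) / cell_size).astype(int)
    cluster_of, new_vertices, remap = {}, [], []
    for v, key in zip(vertices, map(tuple, keys)):
        if key not in cluster_of:
            cluster_of[key] = len(new_vertices)
            new_vertices.append(v)          # first vertex in a cell represents the cell
        remap.append(cluster_of[key])
    new_triangles = []
    for a, b, c in triangles:
        a, b, c = remap[a], remap[b], remap[c]
        if len({a, b, c}) == 3:             # drop triangles that collapsed
            new_triangles.append((a, b, c))
    return np.asarray(new_vertices), new_triangles
```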
  • The user can track his changes to the individual objects 2 a, 2 b, 2 c, 2 d, 2 e, 2 f, 2 g and/or to the animation sequence 3 a, 3 b and/or to the texture animation 4 a, 4 b, 4 c on the freely defined fluid flow or ocean current interactively and in real time. Although the technologies on the market, for example "Next Limit" or "RealFlow", make it possible for the user to parameterize the objects 1 as fluids, there is no real preview which allows the user to directly discern, at 25 frames per second, what sort of effects the parameter changes have on the behavior of the individual objects 2 a, 2 b, 2 c, 2 d, 2 e, 2 f, 2 g and/or of the animation sequence 3 a, 3 b and/or of the texture animation 4 a, 4 b, 4 c. In such programs, there is a need for a time-consuming preview simulation which is then cached in order to show it to the user. If he changes the parameters, the entire simulation must be recalculated, sometimes with a considerable expenditure of time.
  • Provision is likewise made for the graphical data format, for example the OpenGL or the DirectX format, to be able to be used inside the drawing program 5 and/or the further program 6 and for the change in the surface colors to be able to be graphically represented in real time and thus observed by a viewer. These surface colors can be seen only when a button for the visibility of properties has previously been activated. If, for example, a ball as an animated object 1 is intended to be changed, the animation function changes the color of the selected ball as an animated object 1 to red. It is therefore possible to discern which objects 1 or object parts are currently running on the basis of the selection using the animation function. Blue could stand for a water simulation with a large number of objects 1, green for the masking of an object 1, pink for an object 1 which will collide, and pink-red for an object 1 which has its own deformation.
  • The present method also makes it possible to simulate film and gaming applications. The drawing program 5, 6 is used to create a percentage 3D map of hairs which are defined as objects 1. In order to be able to represent the hairs as objects 1 in real time, a multiplicity of sectional planes 9 a, 9 b, 9 c, 9 d, 9 e, 9 f are placed through the volume hair. The sectional planes 9 a, 9 b, 9 c, 9 d, 9 e, 9 f pierce the hairs to be modeled and leave behind a cross section of the hair which has just been cut on their 2D plane. This cross section is represented by the drawing program 5. Everything which has not been cut is transparent. Fanned out from the hairline to the tip of the hair, fifteen two-dimensional sectional planes 9 a, 9 b, 9 c, 9 d, 9 e, 9 f which are distributed along the hairs thus depict the hairs with the respective cross section of a hair and a texture.
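To make the sectional-plane representation of hairs more concrete, here is a toy sketch under the assumption that each hair is a 3D polyline with a radius: for every horizontal plane the crossing point is found by linear interpolation, and only a small disc (the cross section of the hair) would be drawn there, everything else staying transparent. The data structure and the fifteen plane positions are illustrative assumptions.

```python
# Cross sections of hairs on horizontal sectional planes z = z0.
import numpy as np

def hair_cross_sections(hairs, z0):
    """Return (x, y, radius) of every hair cross section on the plane z = z0."""
    discs = []
    for points, radius in hairs:                 # points: (N, 3) polyline along one hair
        pts = np.asarray(points, dtype=float)
        for p, q in zip(pts[:-1], pts[1:]):
            lo, hi = sorted((p[2], q[2]))
            if lo <= z0 < hi:                    # half-open so shared endpoints count once
                t = (z0 - p[2]) / (q[2] - p[2])  # interpolation parameter along the segment
                x, y = p[:2] + t * (q[:2] - p[:2])
                discs.append((float(x), float(y), radius))
    return discs

# One slightly bent hair from z = 0 to z = 1, sliced at fifteen evenly spaced planes.
hair = (np.array([[0.0, 0.0, 0.0], [0.02, 0.01, 0.5], [0.05, 0.03, 1.0]]), 0.01)
slices = [hair_cross_sections([hair], z0) for z0 in np.linspace(0.0, 0.99, 15)]
```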
  • There is also another variant for creating hairs in real time. For this purpose, a base plate with small holes is used as an object 1. This base plate as an object 1 moves very quickly from one point to the next. In this case, motion blur is switched on during the simulation 10. The holes in the base plate as an object 1 remain free because there can also be no motion blur where there is air. For this purpose, only every second individual object 2 a, 2 b, 2 c, 2 d, 2 e, 2 f, 2 g is calculated so that it is not possible to discern any flowing movement from one point to the next. The impression that the motion blur "is stationary" and is not moving is thus produced. As a result of the effect, it appears as if grass is growing from the base plate as an object 1.

Claims (22)

1. A method for representing an animated object in the form of an animation sequence, having the following steps of:
generating a sequence of individual objects for each moment of the animation sequence of the object;
joining the individual objects for each moment to form the animation sequence;
calculating surface changes of the object as a texture animation;
projecting the texture animation onto the object in the animation sequence;
simultaneously running the animation sequence with the texture animation and thus generating the impression of an animated object;
interactively controlling a viewing angle of the animated object using a program with a vector-based page description language; and
reducing a data size of the animation sequence to an image size of a display unit.
2. The method according to claim 1, which comprises using a Microsoft Silverlight data format as the vector-based page description language.
3. The method according to claim 1, which comprises using an Adobe Flash data format as the vector-based page description language.
4. The method according to claim 1, which comprises interactively moving the viewing angle of the animated object after calculating physical animations in a protracted manner.
5. The method according to claim 1, which comprises:
obtaining an animation sequence that is played as a sequence of the individual objects, wherein the individual objects in the sequence of the individual objects that is played are three-dimensional; and
interactively moving the viewing angle of the animated object after obtaining the animation sequence.
6. The method according to claim 1, which comprises performing the step of reducing the data size of the animation sequence by a graphical optimization implemented as a polygon reduction, wherein a surface of the animated object is composed of a plurality of polygons.
7. The method according to claim 1, wherein the sequence of the individual objects is created using a drawing program and is assembled to form the animation sequence with texture animation of the object using the program with the vector-based page description language.
8. The method according to claim 1, wherein the texture animation of the object is created using the drawing program and is combined with the object using the program with the vector-based page description language.
9. The method according to claim 1, wherein the texture animation of the object is determined on the basis of a numerical simulation and is combined with the object using the program with the vector-based page description language.
10. The method according to claim 1, wherein a further texture animation is created as a background plane of the object using the drawing program and is combined with the animation sequence as a background plane of the object using the program with the vector-based page description language.
11. The method according to claim 1, wherein the texture animation is varied on the basis of the interactive variation of predefinable boundary conditions and the respectively varied texture animation is interactively projected onto the animated object using the program of the vector-based page description language.
12. The method according to claim 1, wherein a light texture is calculated on the basis of the object and is projected onto the object on the basis of the viewing angle.
13. The method according to claim 1, wherein the object is composed of polygons and/or triangles and/or non-uniform rational B-splines and/or voxels.
14. The method according to claim 1, wherein a first sequence of individual objects for a first animation sequence of the object is combined with a second sequence of individual objects for a second animation sequence of the object.
15. The method according to claim 14, wherein first surface changes for the first sequence of individual objects are calculated as a first texture animation and second surface changes for the second sequence of individual objects are calculated as a second texture animation and are joined with the animation sequences of the object using the program with the vector-based page description language.
16. The method according to claim 1, wherein surface changes are respectively determined for a plurality of objects and/or the background plane and are joined using the program with the vector-based page description language.
17. The method according to claim 1, wherein the first and last individual objects of the animation sequence are matched to one another in such a manner that an endless loop of the animation sequence of the object can be represented.
18. The method according to claim 17, wherein the animation sequence and the texture animation are synchronously played in an endless loop, the first and last individual objects of the animation sequence and the associated surface change as a texture animation of the first and last individual objects being virtually identical.
19. The method according to claim 1, wherein the sequence of the individual objects and/or the texture animation and/or the light texture is/are displayed in a display unit.
20. The method according to claim 1, wherein the individual objects are determined on the basis of predefinable planes and are displayed on the basis of these planes in the animation sequence of the object.
21. The method according to claim 1, wherein the individual objects and/or the texture animation is/are determined using a simulation unit.
22. The method according to claim 1, wherein the portable document format (PDF) data format is used as the vector-based page description language.
US16/014,213 2009-04-18 2018-06-21 Method for representing an animated object Abandoned US20180300938A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/014,213 US20180300938A1 (en) 2009-04-18 2018-06-21 Method for representing an animated object

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
DE102009018165A DE102009018165A1 (en) 2009-04-18 2009-04-18 Method for displaying an animated object
DE102009018165.2 2009-04-18
PCT/DE2010/000391 WO2010118729A2 (en) 2009-04-18 2010-03-26 Method for representing an animated object
US201113265071A 2011-11-30 2011-11-30
US16/014,213 US20180300938A1 (en) 2009-04-18 2018-06-21 Method for representing an animated object

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
PCT/DE2010/000391 Continuation WO2010118729A2 (en) 2009-04-18 2010-03-26 Method for representing an animated object
US13/265,071 Continuation US10008022B2 (en) 2009-04-18 2010-03-26 Method for representing an animated object

Publications (1)

Publication Number Publication Date
US20180300938A1 true US20180300938A1 (en) 2018-10-18

Family

ID=42751129

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/265,071 Active 2031-05-22 US10008022B2 (en) 2009-04-18 2010-03-26 Method for representing an animated object
US16/014,213 Abandoned US20180300938A1 (en) 2009-04-18 2018-06-21 Method for representing an animated object

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/265,071 Active 2031-05-22 US10008022B2 (en) 2009-04-18 2010-03-26 Method for representing an animated object

Country Status (8)

Country Link
US (2) US10008022B2 (en)
EP (1) EP2419883B1 (en)
CN (1) CN102687176B (en)
DE (2) DE102009018165A1 (en)
DK (1) DK2419883T3 (en)
ES (1) ES2553737T3 (en)
RU (1) RU2541925C2 (en)
WO (1) WO2010118729A2 (en)

Families Citing this family (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10304347B2 (en) 2012-05-09 2019-05-28 Apple Inc. Exercised-based watch face and complications
WO2016022205A1 (en) * 2014-08-02 2016-02-11 Apple Inc. Context-specific user interfaces
US10990270B2 (en) 2012-05-09 2021-04-27 Apple Inc. Context-specific user interfaces
US9547425B2 (en) 2012-05-09 2017-01-17 Apple Inc. Context-specific user interfaces
US9459781B2 (en) 2012-05-09 2016-10-04 Apple Inc. Context-specific user interfaces for displaying animated sequences
US10613743B2 (en) 2012-05-09 2020-04-07 Apple Inc. User interface for receiving user input
FR3008814B1 (en) * 2013-07-18 2016-12-23 Allegorithmic SYSTEM AND METHOD FOR GENERATING PROCEDURAL TEXTURES USING PARTICLES
AU2015279544B2 (en) 2014-06-27 2018-03-15 Apple Inc. Electronic device with rotatable input mechanism for navigating calendar application
JP6350037B2 (en) 2014-06-30 2018-07-04 株式会社安川電機 Robot simulator and robot simulator file generation method
TWI647608B (en) 2014-07-21 2019-01-11 美商蘋果公司 Remote user interface
US10452253B2 (en) 2014-08-15 2019-10-22 Apple Inc. Weather user interface
WO2016036541A2 (en) 2014-09-02 2016-03-10 Apple Inc. Phone user interface
WO2016036481A1 (en) 2014-09-02 2016-03-10 Apple Inc. Reduced-size user interfaces for dynamically updated application overviews
US10055121B2 (en) 2015-03-07 2018-08-21 Apple Inc. Activity based thresholds and feedbacks
WO2016144385A1 (en) 2015-03-08 2016-09-15 Apple Inc. Sharing user-configurable graphical constructs
JP2016194843A (en) * 2015-04-01 2016-11-17 ファナック株式会社 Numerical control device having program display function using plural images
US9916075B2 (en) 2015-06-05 2018-03-13 Apple Inc. Formatting content for a reduced-size user interface
EP4321088A3 (en) 2015-08-20 2024-04-24 Apple Inc. Exercise-based watch face
DK201770423A1 (en) 2016-06-11 2018-01-15 Apple Inc Activity and workout updates
CN106303722B (en) * 2016-08-04 2019-12-10 腾讯科技(深圳)有限公司 animation playing method and device
US10521937B2 (en) * 2017-02-28 2019-12-31 Corel Corporation Vector graphics based live sketching methods and systems
DK179412B1 (en) 2017-05-12 2018-06-06 Apple Inc Context-Specific User Interfaces
CN108171281A (en) * 2017-11-10 2018-06-15 高萍 Nasal cavity cleaning compression pump automatic regulating system
US11327650B2 (en) 2018-05-07 2022-05-10 Apple Inc. User interfaces having a collection of complications
RU2701453C1 (en) * 2018-06-25 2019-09-26 Михаил Григорьевич Блайвас Method of displaying graphic objects
WO2020108779A1 (en) * 2018-11-30 2020-06-04 HELLA GmbH & Co. KGaA Method for performing an animation with a lighting device comprising a plurality of light sources
JP6921338B2 (en) 2019-05-06 2021-08-18 アップル インコーポレイテッドApple Inc. Limited operation of electronic devices
US11131967B2 (en) 2019-05-06 2021-09-28 Apple Inc. Clock faces for an electronic device
US11960701B2 (en) 2019-05-06 2024-04-16 Apple Inc. Using an illustration to show the passing of time
CN110213641B (en) * 2019-05-21 2022-03-29 北京睿格致科技有限公司 4D micro-course playing method and device
US10852905B1 (en) 2019-09-09 2020-12-01 Apple Inc. Techniques for managing display usage
JP7427930B2 (en) * 2019-11-26 2024-02-06 セイコーエプソン株式会社 Video data generation method, video data generation device, and program
US11372659B2 (en) 2020-05-11 2022-06-28 Apple Inc. User interfaces for managing user interface sharing
CN115552375A (en) 2020-05-11 2022-12-30 苹果公司 User interface for managing user interface sharing
DK181103B1 (en) 2020-05-11 2022-12-15 Apple Inc User interfaces related to time
US11694590B2 (en) 2020-12-21 2023-07-04 Apple Inc. Dynamic user interface with time indicator
US11720239B2 (en) 2021-01-07 2023-08-08 Apple Inc. Techniques for user interfaces related to an event
US11921992B2 (en) 2021-05-14 2024-03-05 Apple Inc. User interfaces related to time

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7123263B2 (en) * 2001-08-14 2006-10-17 Pulse Entertainment, Inc. Automatic 3D modeling system and method
US20090021513A1 (en) * 2007-07-18 2009-01-22 Pixblitz Studios Inc. Method of Customizing 3D Computer-Generated Scenes
US20090179901A1 (en) * 2008-01-10 2009-07-16 Michael Girard Behavioral motion space blending for goal-directed character animation
US8335675B1 (en) * 2009-02-27 2012-12-18 Adobe Systems Incorporated Realistic real-time simulation of natural media paints

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5261041A (en) * 1990-12-28 1993-11-09 Apple Computer, Inc. Computer controlled animation system based on definitional animated objects and methods of manipulating same
US5630043A (en) * 1995-05-11 1997-05-13 Cirrus Logic, Inc. Animated texture map apparatus and method for 3-D image displays
US6049339A (en) 1997-12-22 2000-04-11 Adobe Systems Incorporated Blending with planar maps
US6538654B1 (en) * 1998-12-24 2003-03-25 B3D Inc. System and method for optimizing 3D animation and textures
US7123269B1 (en) 2002-06-21 2006-10-17 Adobe Systems Incorporated Modifying vector objects
DE60214696T2 (en) 2002-07-23 2007-09-06 Simcon Kunststofftechnische Software Gmbh Simulation of liquid flow and structure analysis in thin-walled geometries
US7173623B2 (en) * 2003-05-09 2007-02-06 Microsoft Corporation System supporting animation of graphical display elements through animation object instances
KR100680191B1 (en) * 2003-09-05 2007-02-08 삼성전자주식회사 Proactive user interface system with empathized agent
KR100678120B1 (en) * 2004-11-01 2007-02-02 삼성전자주식회사 Apparatus and method for proceeding 3d animation file in mobile terminal
KR100898989B1 (en) * 2006-12-02 2009-05-25 한국전자통신연구원 Apparatus for generating and shading foam on the water surface and method thereof
US8199152B2 (en) * 2007-01-16 2012-06-12 Lucasfilm Entertainment Company Ltd. Combining multiple session content for animation libraries
US20080303826A1 (en) 2007-06-11 2008-12-11 Adobe Systems Incorporated Methods and Systems for Animating Displayed Representations of Data Items
US20090033674A1 (en) * 2007-08-02 2009-02-05 Disney Enterprises, Inc. Method and apparatus for graphically defining surface normal maps
KR20080034419A (en) * 2007-11-23 2008-04-21 야파 코포레이션 3d image generation and display system
US8629871B2 (en) * 2007-12-06 2014-01-14 Zynga Inc. Systems and methods for rendering three-dimensional objects

Also Published As

Publication number Publication date
EP2419883B1 (en) 2015-08-19
DK2419883T3 (en) 2015-11-30
DE102009018165A1 (en) 2010-10-21
EP2419883A2 (en) 2012-02-22
RU2541925C2 (en) 2015-02-20
ES2553737T3 (en) 2015-12-11
WO2010118729A2 (en) 2010-10-21
RU2011146891A (en) 2013-05-27
DE112010001686A5 (en) 2012-10-25
WO2010118729A3 (en) 2011-07-14
US20120086718A1 (en) 2012-04-12
US10008022B2 (en) 2018-06-26
CN102687176A (en) 2012-09-19
CN102687176B (en) 2015-11-25

Similar Documents

Publication Publication Date Title
US20180300938A1 (en) Method for representing an animated object
US8749588B2 (en) Positioning labels in an engineering drawing
EP2469474B1 (en) Creation of a playable scene with an authoring system
JP6787661B2 (en) Simulation of machining of workpieces
CA2563700C (en) System and method for smoothing three-dimensional images
EP3040945B1 (en) Creation of bounding boxes on a 3d modeled assembly
KR20130004066A (en) Method for designing a geometrical three-dimensional modeled object
US10210304B2 (en) Method and system for designing an assembly of objects in a system of computer-aided design
CN101866379B (en) Method, program and product edition system for visualizing objects displayed on a computer screen
KR20160082477A (en) Selection of a viewpoint of a set of objects
Bikmullina et al. The development of 3D object modeling techniques for use in the unity environmen
KR20120001114A (en) Method for controlling 3d object in virtual world, ineterface and input/output therefor
Stiver et al. Sketch based volumetric clouds
Cheng Human Skeleton System Animation
Perles et al. Interactive virtual tools for manipulating NURBS surfaces in a virtual environment
El Rhalibi et al. Highly realistic mpeg-4 compliant facial animation with charisma
Roa Santamaria Jr Development of design tools for the evaluation of complex CAD models
Lavender Maya manual
Lee et al. Novel irregular mesh tagging algorithm for wound synthesis on a 3D face
JP2021033769A (en) Apparatus for editing three-dimensional shape data and program for editing three-dimensional shape data
Kolárik Comparison of Computer Graphics
Fei et al. PASCAL: physics augmented space canvases for animating locomotion

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION