US7859538B2 - Converting deformation data for a mesh to animation data for a skeleton, skinning and shading in a runtime computer graphics animation engine


Info

Publication number
US7859538B2
US7859538B2 (application US11/496,241)
Authority
US
United States
Prior art keywords
source
mesh
target
animation
locators
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires 2028-03-18
Application number
US11/496,241
Other versions
US20080024487A1 (en)
Inventor
Michael Isner
Javier Nicolai von der Pahlen
Thomas Ho-min Kang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Autodesk Inc
Original Assignee
Autodesk Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Autodesk Inc
Priority to US11/496,241
Assigned to AVID TECHNOLOGY, INC. Assignors: ISNER, MICHAEL; KANG, THOMAS HO-MIN; VON DER PAHLEN, JAVIER NICOLAI
Publication of US20080024487A1
Assigned to AUTODESK, INC. Assignor: AVID TECHNOLOGY, INC.
Application granted
Publication of US7859538B2
Application status: Active
Adjusted expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00: Animation
    • G06T13/20: 3D [Three Dimensional] animation
    • G06T13/40: 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00: Indexing scheme for image generation or computer graphics
    • G06T2210/32: Image data format

Abstract

Animation of an object from a character modeling and/or animation tool is converted from a representation used by that tool to a representation used in a runtime animation system, such as a game engine. Such a tool typically represents the object using a source structure and a source skin. The runtime animation engine typically uses a target structure, target skin and shading to represent animation of an object. In addition to transferring motion of the object from its source structure to the target structure, deformation and shading also are converted. Low resolution information about the deformation of the source skin is converted into a set of skinning weights for associating the target skin with virtual bones added to the target structure and animated deformation data for each frame of animation. High resolution detail from the deformation of the source skin is converted into a set of normal maps, one or more masks and animated mask parameters for use by one or more shaders.

Description

BACKGROUND

Electronic games commonly use three dimensional modeling, animation and rendering techniques to achieve realistic characters, playing environments and interaction. Some electronic game platforms, such as the Sony PLAYSTATION, Nintendo GAMECUBE and Microsoft XBOX game consoles, have “engines” which render three-dimensional animations in real time during play, i.e., at runtime. Example game engines include, but are not limited to, Source from Valve and Unreal from Epic.

These game engines generally represent a character as a skeleton with an associated envelope or skin to which color and textures are applied using a process called shading. A skeleton typically is defined by a set of interconnected or related bones. The envelope or skin typically is defined as a three-dimensional mesh. A set of envelope weights or skinning weights defines the relationship between bones in the skeleton and the vertices in the mesh defining the envelope or skin. The process of defining these weights is called skinning. Animation is applied to the skeleton. The set of envelope weights determines how the mesh deforms in response to movement of the skeleton. A set of normal maps affects how shading is applied to the mesh. The envelope weights and normal maps can be animated over time.
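For concreteness, the deformation driven by these envelope weights can be sketched as a standard linear blend skinning computation. The following Python sketch is illustrative only; the function name, array shapes and matrix conventions are assumptions, not taken from any particular engine.

    import numpy as np

    def skin_vertex(rest_pos, bone_matrices, weights):
        # Deform one vertex as a weighted blend of per-bone transforms.
        # rest_pos:      (3,) vertex position in the rest pose
        # bone_matrices: list of 4x4 matrices mapping rest space to posed space
        # weights:       per-bone envelope weights summing to 1.0 (100%)
        p = np.append(rest_pos, 1.0)              # homogeneous coordinates
        out = np.zeros(4)
        for w, m in zip(weights, bone_matrices):
            out += w * (m @ p)                    # each bone pulls on the vertex
        return out[:3]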

When games are developed, various three-dimensional modeling, animation and rendering tools are used by artists to define the characters and the environments of the games. Typically, these artists work with models with a higher resolution than the resolution used in the game engine. Further, a tool used for creating a model or animation may represent characters and animation in a way that is different from the representation used in the game engine. In particular, if the representation of motion, deformations and shading of the mesh in the tool is different from the representation in the game engine, the representation of the characters and animation needs to be converted to a representation that can be used by the game engine.

SUMMARY

Animation of an object from a character modeling and/or animation tool is converted from a representation used by that tool to a representation used in a runtime animation system, such as a game engine. Such a tool typically represents the object using a source structure and a source skin. The runtime animation engine typically uses a target structure, target skin and shading to represent animation of an object. In addition to transferring motion of the object from its source structure to the target structure, deformation and shading also are converted. Low resolution information about the deformation of the source skin is converted into a set of skinning weights for associating the target skin with virtual bones added to the target structure and animated deformation data for each frame of animation. High resolution detail from the deformation of the source skin is converted into a set of normal maps, one or more masks and animated mask parameters for use by one or more shaders.

This conversion process may be applied by the character modeling and/or animation tool that is used to define the animation of the object, or may be implemented in a separate tool. The conversion process uses the target structure and target skin defined for the object for the runtime animation engine and involves specifying relationships between the source structure and target structure, and specifying relationships between vertices in the source skin and vertices in the target skin.

Motion may be transferred from the source structure to the target structure using motion retargeting techniques. Such techniques are described in U.S. patent application Ser. No. 11/134,653, filed May 20, 2005 and entitled “TRANSFER OF MOTION BETWEEN ANIMATED CHARACTERS”, which is hereby incorporated by reference.

Low resolution deformation is provided by the addition of virtual bones to the target structure and by computing skinning weights and per-frame animated deformation data. First, key points, called locators, are placed on the source mesh. The artist may specify which points are locators. These locators are used to specify virtual bones to be added to the target structure, which are parented to the key structures of the target skeleton. The parenting may be defined by a user interface that permits a user to identify which parts of the mesh are parented to which key structures of the skeleton. The skinning weights are computed by identifying points on the target mesh that correspond to locators. These identified points on the target mesh are associated with the virtual bones that were added to the target structure. Each identified point has a skinning weight of 100% for its corresponding virtual bone. Weights for other points in the target mesh are determined according to the weighted topological distances to the closest points in the target mesh that correspond to locators.

Per-frame animated deformation data also are calculated. For each frame, the displacement of each key point (locator) on the source mesh with reference to the bone with which it is associated is computed. The user may indicate the relationship between points in the mesh and bones in the source structure through a parenting map, which may be defined by painting on the mesh in a display. The displacement of each locator is transformed to the local space of its corresponding virtual bone, which was added to the target structure. The set of transformed displacement values for each locator for each frame is the set of animated deformation data. As a result, when each virtual bone is moved by the animated deformation data at runtime, the mesh is deformed by virtue of the skinning of the mesh to these animated virtual bones.

To generate information for shading, two normal maps are computed. The first normal map is computed as the difference between the base pose of the source skin and the base pose of the target skin. The second normal map is computed as the difference between the source skin in a stressed state and the base pose of the target skin. These normal maps capture the detailed, high-frequency variations in the source skin, which then are applied to the target skin through shading.

Shading also uses one or more masks and corresponding animated mask parameters. Each mask is associated with a region of the source skin, which may be associated with a deformer used by the tool. Each mask and its corresponding animated mask parameters define blending parameters which control how much the second normal map is blended with the first normal map to provide a normal map to be used by the runtime shader. More masks may be used to provide higher resolution control of the blending of the normal maps. A user interface may be provided to permit a user to edit these masks. These masks are multiplied together and scaled based on per-frame animated mask parameters. For each frame of animation in the tool, the animated mask parameters that scale each mask are computed. The level of stress in a region of the source skin that is associated with a mask is computed in each frame to provide this parameter for the mask. As a result, when the masks are scaled by the animated mask parameters and blended together, the stressed normal map for a region is exposed based on the level of stress in the region to create an illusion of subtle deformation detail.

The transferred motion, virtual bones, skinning weights, per-frame skinning deformers, normal maps, masks and per-frame animation mask parameters are provided to the game engine. Given the shaders, target skin and target skeleton, the information provided to the game engine is sufficient for it to reproduce with high fidelity the animation, and particularly the deformation of the mesh of the source object, as generated by the source tool.
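One way to picture the data delivered to the game engine is as a single export record. The sketch below is purely illustrative; the field names and types are assumptions rather than part of the described system.

    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class ExportPayload:
        retargeted_motion: object                      # motion for the target structure
        virtual_bones: List[str]                       # one virtual bone per locator
        skinning_weights: Dict[int, Dict[str, float]]  # point id -> bone -> weight
        deformation_data: List[Dict[str, tuple]]       # per frame: bone -> displacement
        base_normal_map: object                        # base-pose difference map
        stressed_normal_map: object                    # stressed-pose difference map
        masks: List[object]                            # one mask per skin region
        mask_parameters: List[List[float]]             # per frame, per mask scale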

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a data flow diagram illustrating how a character and animation from one tool can be converted to data for use in a game engine.

FIG. 2 is a user interface describing how locators are identified.

FIGS. 3A and 3B are user interfaces describing how parenting is identified.

FIG. 4 is a flow chart describing how skinning weights are computed for the target mesh and target structure.

FIG. 5 is a flow chart describing how per-frame animated deformation data are computed.

DETAILED DESCRIPTION

Referring now to FIG. 1, the conversion of characters and animation from one character modeling or animation tool to a representation used in a runtime animation system will now be described.

The animation tool 100 permits a user to define a character or other object and associated animation. The character or other object may have an underlying structure such as a skeleton, called source structure 102, and a surface, called source mesh 104. An exporter 110 converts the representation of the character and animation to a format to be used by the runtime animation system. This conversion is done by (a) converting low resolution deformation into a set of virtual bones 123 to be added to the target structure 120, skinning weights 122 for associating points in a target mesh 121 with the virtual bones, and animated deformation data 124 for each frame of animation, and (b) converting high resolution deformation detail into a set of normal maps 126, a set of one or more masks 128, and per-frame animated mask parameters 130 for each mask, to be used by a set of one or more shaders 132. This information can be used by a runtime animation system to play back an animated character in real time. In particular, the target structure, target mesh, virtual bones, skinning weights and animated deformation data are used for runtime skinning, whereas the normal maps, masks, animated mask parameters and shaders are used for runtime shading.

To perform this conversion, the exporter receives user input 112 and a specification of the target structure 120 and target mesh 121 in addition to the source animation. The source animation includes a source structure and a source mesh. The target mesh may be the same as the source mesh, but typically has a lower resolution than the source mesh. The source structure and target structure also typically are different.

Motion may be transferred from the source structure to the target structure using motion retargeting techniques. Such techniques are described in U.S. patent application Ser. No. 11/134,653, filed May 20, 2005 and entitled “TRANSFER OF MOTION BETWEEN ANIMATED CHARACTERS”, which is hereby incorporated by reference. The retargeted motion 114 can be used by the runtime animation system to manipulate the target structure.

Low resolution deformation information is generated in two parts. First, virtual bones are added to the target structure, and skinning weights for associating the target mesh to the virtual bones are defined, based on the key points (called locators) in the source mesh as identified by the user. Second, per-frame animated deformation data is generated based on the deformation of the mesh at these key points in each frame of animation.

Initially, the exporter displays the source mesh and the target mesh and permits the user to align the two meshes in position, orientation and scale. Given the aligned meshes, the exporter can determine a transform between the source mesh coordinate space and the target mesh coordinate space.
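Given such an alignment in position, orientation and scale, the transform can be composed directly. The sketch below assumes a uniform scale factor and a 3x3 rotation matrix produced by the interactive alignment; these are simplifying assumptions.

    import numpy as np

    def alignment_transform(translation, rotation, scale):
        # Compose a 4x4 source-to-target transform from the user's alignment.
        # translation: (3,) position offset between the aligned meshes
        # rotation:    (3, 3) rotation matrix from the alignment
        # scale:       uniform scale factor (an assumption for simplicity)
        m = np.eye(4)
        m[:3, :3] = rotation * scale
        m[:3, 3] = translation
        return m   # maps source mesh coordinates into target mesh coordinates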

The exporter displays the source mesh and the user is permitted to identify locations, called “locators,” on vertices of the source mesh. Locators are placed by the user at anchor positions in areas of the greatest deformation, and every major deformation area should have at least one locator. The user interface displays the source mesh and permits a user to simply paint the mesh or select points on the mesh to indicate the desired locators. A locator has a position on the mesh, an orientation (normal to its point on the surface) and a spin. The user can control the position of each locator and its spin. Each locator is used to define a virtual bone that is parented to a corresponding bone in the target structure, based on parenting information between the mesh and the skeleton of the character.

FIG. 2 illustrates where some locators may be placed when exporting information about a face animation. For example, a locator may be placed at each of the following points on both the left and right sides of the face: inner brow 200, top of the eye 202, bottom of the eye 204, nose side 206, nostril 208, nasal labial fold 210, above lips 212, below lips 214, mouth corner 216, jaw bone 218, cheek puffer 220, cheek bone 222. Some locators may be placed near the center of the face, such as the nose tip 240, and chin 242.

The exporter permits the user to identify the parenting of each point in the mesh to each bone in the skeleton. For example, it may provide a user interface that displays the mesh, and, for each bone of the skeleton, permits the user to paint the mesh as a way to identify those vertices associated with the selected bone. If the relationships among the source mesh, source skeleton, target skeleton and target mesh are provided, this parenting can be done once to specify how each point in the source mesh is parented to each bone in the target structure. This parenting indicates the area of the mesh which is deformed by motion of the associated bone. FIG. 3A illustrates typical parenting for a jaw bone, whereas FIG. 3B illustrates typical parenting for a neck bone. In FIG. 3A, points 300 are indicated as being associated with the jaw bone (not shown). Points 304 are not associated with the jaw bone. In FIG. 3B, points 306 are indicated as being associated with a neck bone. Points 308 are not associated with the neck bone.

Given the virtual bones defined through the locators, the skinning weights are computed by identifying points on the target mesh that correspond to the locators. The identified points on the target mesh are associated with the virtual bones in the target structure. Each identified point has a skinning weight of 100% associating it to its corresponding virtual bone. Weights for other points in the target mesh are determined according to the weighted topological distances to the closest points in the target mesh that correspond to locators.

In particular, referring now to FIG. 4, for each locator (l0-ln) painted by the artist, the closest point (pl0-pln) in the target mesh is identified (400). The point-locator relationship is stored both in the memory of the program and in the locator name. For every point on the target mesh (p), its immediate neighbors (n1) are identified. Information about these points may be cached. For every point of the target mesh, as indicated at 402, if there is a locator associated with it, as determined at 404, then that point receives a weight of 100%, as indicated at 406. Thus, any displacement of the point in the source mesh to which the locator is attached will be applied fully to this point in the target mesh. If the point in the target mesh is not associated with a locator, then a recursive process is applied, examining each of the neighbors n1, n2, n3 . . . in turn, in order of topological distance. If a neighboring point has a locator associated with it, as determined at 408, an indication of the locator, along with an indication of the recursion step (x) in which it was found, is temporarily stored (410) for that point in the target mesh. The neighboring points are examined until a predefined limit on the number of locators to be found is met, as indicated at 412. For example, if the limit is three, the process of evaluating neighbors stops when three locators are found. The recursion step stored for each locator is inverted (1/x) and the results are normalized to 100%, yielding the skinning weights, as indicated at 414. In this way, the closer a locator is to a point in the target mesh, the higher its influence on that point. As a result of this process, each point in the target mesh has an indication of one or more locators and a corresponding weight for each, with all weights summing to 100%. These skinning weights are relative to the virtual bones added through specifying the locators.
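The process of FIG. 4 can be summarized in a short breadth-first sketch. The container names and the adjacency representation are assumptions; the limit of three locators matches the example above.

    from collections import deque

    def skinning_weights(points, neighbors, locator_at, limit=3):
        # points:     point ids in the target mesh
        # neighbors:  point id -> list of adjacent point ids (mesh topology)
        # locator_at: point id -> locator id, for points matched to locators
        weights = {}
        for p in points:
            if p in locator_at:                       # steps 404/406
                weights[p] = {locator_at[p]: 1.0}     # 100% to its virtual bone
                continue
            found, seen = {}, {p}
            frontier = deque([(p, 0)])
            while frontier and len(found) < limit:    # step 412
                q, x = frontier.popleft()             # x = recursion step
                if q in locator_at:                   # step 408
                    found.setdefault(locator_at[q], 1.0 / x)  # step 410, inverted
                for n in neighbors[q]:
                    if n not in seen:
                        seen.add(n)
                        frontier.append((n, x + 1))
            total = sum(found.values())               # step 414: normalize to 100%
            weights[p] = {loc: w / total for loc, w in found.items()}
        return weights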

In order to ensure proper coverage, the envelope is smoothed and the weights are clipped again to a designated number of locators per point. Clipping is done by sorting the weights in decreasing order, removing the smallest weights and normalizing the remaining weights to 100%. If multiple target meshes are available, different envelopes can be saved, some with more bones per point and some with fewer. A minimal version of the clipping step is sketched below.
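In this sketch the designated number of locators per point is a parameter; the names are illustrative.

    def clip_weights(point_weights, max_locators):
        # Sort by decreasing weight, drop the smallest, renormalize to 100%.
        top = sorted(point_weights.items(), key=lambda kv: kv[1],
                     reverse=True)[:max_locators]
        total = sum(w for _, w in top)
        return {loc: w / total for loc, w in top}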

To further optimize weights so as to fit into compressed data, the weights may be rounded. The precision for this rounding may be user-specified. When rounding, less influential weights can be rounded down with the rounding error added to the most influential weight, so that the sum of the weights for each point remains at 100%.
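The rounding scheme can be sketched as follows. The precision value is user-specified; pushing the accumulated rounding error onto the largest weight keeps each point's weights summing to 100%.

    import math

    def round_weights(point_weights, precision=0.01):
        items = sorted(point_weights.items(), key=lambda kv: kv[1], reverse=True)
        # Round the less influential weights down...
        rounded = {loc: math.floor(w / precision) * precision
                   for loc, w in items[1:]}
        # ...and add the rounding error to the most influential weight.
        rounded[items[0][0]] = 1.0 - sum(rounded.values())
        return rounded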

Referring now to FIG. 5, per-frame animated deformation data are calculated in the following manner. For each frame, each locator (as indicated at 500) is processed. For each locator, its displacement on the source mesh from its base pose, with reference to any movement of the bone with which it is associated, is determined (502). This displacement is transformed to the local space of the locator's corresponding virtual bone in the target structure (504). If more locators remain to be processed, as determined in step 506, steps 500, 502 and 504 are repeated for each locator in the frame. The set of transformed displacement values for each locator for each frame is output (508) as the set of animated deformation data. Using this data at runtime, when each virtual bone is moved by the animated deformation data, the mesh is deformed by virtue of the skinning of the mesh to these animated virtual bones.
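For a single locator in a single frame, steps 502 and 504 amount to subtracting the rigidly-moved base position and re-expressing the residual in the virtual bone's local space. The matrix conventions in this sketch are assumptions.

    import numpy as np

    def locator_deformation(pos, base_pos, bone_motion, virtual_bone_world):
        # pos, base_pos:      (4,) homogeneous locator positions (posed, base)
        # bone_motion:        4x4 world motion of the source bone driving the locator
        # virtual_bone_world: 4x4 world transform of the corresponding virtual bone
        rigid = bone_motion @ base_pos      # where bone motion alone would put it
        displacement = pos - rigid          # residual deformation (step 502)
        local = np.linalg.inv(virtual_bone_world) @ displacement  # step 504
        return local[:3]                    # stored as animated deformation data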

To generate information for shading, two normal maps are computed. The first normal map is computed as the difference between the base pose of the source skin and the base pose of the target skin. The second normal map is computed as the difference between the source skin in a stressed state and the base pose of the target skin. These normal maps enable the conversion to capture the detailed, high-frequency variations in the source skin, which are applied to the target skin through shading.

Shading also uses one or more masks and corresponding animated mask parameters. Each mask is associated with a region of the source skin, which may be associated with a deformer used by the tool. Each mask and its corresponding animated mask parameters define blending parameters that control how much the second normal map is blended with the first normal map to provide a normal map to be used by the runtime shader. More masks may be used to provide higher resolution control of the blending of the normal maps. A user interface may be provided to permit a user to edit these masks. These masks are multiplied together and scaled based on per-frame animated mask parameters.
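Read literally, this blending might be sketched as follows. The multiply-and-scale combination below is one reading of the description, not a definitive shader implementation.

    import numpy as np

    def blend_normal_maps(base_map, stressed_map, masks, params):
        # base_map, stressed_map: (H, W, 3) normal maps
        # masks:  list of (H, W) region masks with values in [0, 1]
        # params: per-frame animated scale for each mask
        blend = np.ones(masks[0].shape)
        for mask, scale in zip(masks, params):
            blend *= mask * scale           # masks multiplied together and scaled
        b = blend[..., None]                # broadcast over the normal channels
        out = (1.0 - b) * base_map + b * stressed_map
        # keep the blended normals at unit length
        return out / np.linalg.norm(out, axis=-1, keepdims=True)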

For each frame of animation in the tool, the animated mask parameters that scale each mask are computed. The level of stress in a region of the source skin that is associated with a mask is computed in each frame to provide this parameter for the mask. As a result, when the masks are scaled by the animated mask parameters and blended together, the stressed normal map for a region is exposed based on the level of stress in the region to create an illusion of subtle deformation detail.
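The description does not spell out the stress metric. One plausible choice, shown here purely as an assumption, is the mean relative change in edge length over the region associated with the mask:

    def region_stress(edges, base_length, current_length):
        # edges:          edge ids in the skin region associated with one mask
        # base_length:    edge id -> rest-pose length of each edge
        # current_length: edge id -> length of each edge in the current frame
        return sum(abs(current_length[e] - base_length[e]) / base_length[e]
                   for e in edges) / len(edges)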

A mask can be represented using image data, with each pixel corresponding to a point on the source mesh. Each color component may represent one mask. Thus one image can represent three masks.
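Packing three masks into the color components of a single image is straightforward; the 8-bit quantization in this sketch is an assumption.

    import numpy as np

    def pack_masks(mask_r, mask_g, mask_b):
        # Stack three single-channel masks (values in [0, 1]) into one RGB
        # image, one mask per color component.
        img = np.stack([mask_r, mask_g, mask_b], axis=-1)
        return (img * 255.0).astype(np.uint8)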

Shaders for the runtime engine that correspond to the shaders used in the source tool also are needed. If the shaders in the source tool are implemented, for example, as shaders in the OpenGL, CGFX or DirectX formats, then most runtime engines will be able to use the same shaders.

The runtime engine typically colors and textures the surface of an object using the shaders and the blended normal maps. As a character or object is animated over time, the blending of the normal maps through use of the animation masks results in animation of the shading. The shading typically provides high resolution details, such as wrinkles and hair. This technique permits transfer of animation of these characteristics to the runtime engine.

The various components of the system described herein may be implemented as a computer program using a general-purpose computer system. Such a computer system typically includes a main unit connected to both an output device that displays information to a user and an input device that receives input from a user. The main unit generally includes a processor connected to a memory system via an interconnection mechanism. The input device and output device also are connected to the processor and memory system via the interconnection mechanism.

One or more output devices may be connected to the computer system. Example output devices include, but are not limited to, a cathode ray tube (CRT) display, liquid crystal displays (LCD) and other video output devices, printers, communication devices such as a modem, and storage devices such as disk or tape. One or more input devices may be connected to the computer system. Example input devices include, but are not limited to, a keyboard, keypad, track ball, mouse, pen and tablet, communication device, and data input devices. The invention is not limited to the particular input or output devices used in combination with the computer system or to those described herein.

The computer system may be a general purpose computer system which is programmable using a computer programming language, a scripting language or even assembly language. The computer system may also be specially programmed, special purpose hardware. In a general-purpose computer system, the processor is typically a commercially available processor. The general-purpose computer also typically has an operating system, which controls the execution of other computer programs and provides scheduling, debugging, input/output control, accounting, compilation, storage assignment, data management and memory management, and communication control and related services.

A memory system typically includes a computer readable medium. The medium may be volatile or nonvolatile, writeable or not writeable, and/or rewriteable or not rewriteable. A memory system stores data typically in binary form. Such data may define an application program to be executed by the microprocessor, or information stored on the disk to be processed by the application program. The invention is not limited to a particular memory system.

A system such as described herein may be implemented in software or hardware or firmware, or a combination of the three. The various elements of the system, either individually or in combination, may be implemented as one or more computer program products in which computer program instructions are stored on a computer readable medium for execution by a computer. Various steps of a process may be performed by a computer executing such computer program instructions. The computer system may be a multiprocessor computer system or may include multiple computers connected over a computer network. The components shown in FIG. 1 may be separate modules of a computer program, or may be separate computer programs, which may be operable on separate computers. The data produced by these components may be stored in a memory system or transmitted between computer systems.

Having now described an example embodiment, it should be apparent to those skilled in the art that the foregoing is merely illustrative and not limiting, having been presented by way of example only. Numerous modifications and other embodiments are within the scope of one of ordinary skill in the art and are contemplated as falling within the scope of the invention.

Claims (20)

1. A method for converting animation of an object from a first tool to a representation used in a runtime animation system, wherein the first tool represents the object using a source structure and a source mesh, and wherein the runtime animation engine uses a target structure, a target mesh and shading, the method comprising:
receiving a source animation including the source structure and the source mesh, the target structure, and the target mesh;
determining a transform between a source coordinate space of the source mesh and a target coordinate space of the target mesh based on an alignment between the source mesh and the target mesh;
adding virtual bones to the target structure based on locators within the source mesh that are positioned on the source mesh at major deformation areas, the locators are user identified locations on vertices of the source mesh, each of the locators includes a position on the source mesh, an orientation and a spin;
computing skinning weights for associating the target mesh to the virtual bones, wherein the skinning weights are computed by identifying points on the target mesh that correspond to the locators;
processing the locators for each frame of the source animation to transform displacements of each locator from the source coordinate space to the target coordinate space to produce a set of animated deformation data that specifies displacement values of each locator for each frame of the source animation; and
moving the virtual bones within the target structure based on the set of animated deformation data for each frame of the source animation to convert the animation of the object from the first tool to the representation used in the runtime animation system.
2. The method of claim 1, further comprising retargeting motion from the source structure to the target structure.
3. A computer program product, comprising:
a non-transitory computer readable storage medium;
computer program instructions stored on the non-transitory computer readable storage medium that, when processed by a computer, instruct the computer to perform a method for converting animation of an object from a first tool to a representation used in a runtime animation system, wherein the first tool represents the object using a source structure and a source mesh, and wherein the runtime animation engine uses a target structure, a target mesh and shading, the method comprising:
receiving a source animation including the source structure and the source mesh, the target structure, and the target mesh;
determining a transform between a source coordinate space of the source mesh and a target coordinate space of the target mesh based on an alignment between the source mesh and the target mesh;
adding virtual bones to the target structure based on locators within the source mesh that are positioned on the source mesh at major deformation areas, the locators are user identified locations on vertices of the source mesh, each of the locators includes a position on the source mesh, an orientation and a spin;
computing skinning weights for associating the target mesh to the virtual bones, wherein the skinning weights are computed by identifying points on the target mesh that correspond to the locators;
processing the locators for each frame of the source animation to transform displacements of each locator from the source coordinate space to the target coordinate space to produce a set of animated deformation data that specifies displacement values of each locator for each frame of the source animation; and
moving the virtual bones within the target structure based on the set of animated deformation data for each frame of the source animation to convert the animation of the object from the first tool to the representation used in the runtime animation system.
4. The computer program product of claim 3, wherein the non-transitory computer readable storage medium further comprising programming instructions for retargeting motion from the source structure to the target structure.
5. The method of claim 1, further comprising the step of identifying points on the target mesh that correspond to the locators and computing a skinning weight of 100% for each of the identified points.
6. The method of claim 5, further comprising the step of determining weights for other points in the target mesh according to weighted topological distances to the closest points in the target mesh that correspond to locators.
7. The method of claim 1, further comprising the step of generating information describing a set of normal maps, one or more masks and animated mask parameters for use by one or more shaders for the runtime animation system.
8. The method of claim 7, wherein a mask and associated animated mask parameters defines blending parameters that control blending of a first normal map and a second normal map to produce one of the normal maps in the set of normal maps.
9. The method of claim 1, wherein the target mesh has a lower resolution than the source mesh.
10. The method of claim 1, further comprising the step of computing a first animated mask parameter for a frame of the source animation based on a level of stress in a region of the source skin that is associated with a first mask.
11. The computer program product of claim 3, wherein the non-transitory computer readable storage medium further comprising programming instructions for identifying points on the target mesh that correspond to the locators and computing a skinning weight of 100% for each of the identified points.
12. The computer program product of claim 11, wherein the non-transitory computer readable storage medium further comprising programming instructions for determining weights for other points in the target mesh according to weighted topological distances to the closest points in the target mesh that correspond to locators.
13. The computer program product of claim 3, wherein the non-transitory computer readable storage medium further comprising programming instructions for generating information describing a set of normal maps, one or more masks and animated mask parameters for use by one or more shaders for the runtime animation system.
14. The computer program product of claim 13, wherein in the programming instructions for generating information, a mask and associated animated mask parameters defines blending parameters that control blending of a first normal map and a second normal map to produce one of the normal maps in the set of normal maps.
15. The computer program product of claim 3, wherein in the computer program instructions stored on the non-transitory computer readable storage medium, the target mesh has a lower resolution than the source mesh.
16. The computer program product of claim 3, wherein the non-transitory computer readable storage medium further comprising programming instructions for computing a first animated mask parameter for a frame of the source animation based on a level of stress in a region of the source skin that is associated with a first mask.
17. The method of claim 1, wherein the source mesh and the target mesh are aligned in position, orientation, and scale to produce the alignment.
18. The computer program product of claim 3, wherein in the computer program instructions stored on the non-transitory computer readable storage medium, the source mesh and the target mesh are aligned in position, orientation, and scale to produce the alignment.
19. The method of claim 8, wherein the mask is represented using image data, with each pixel corresponding to a point on the source mesh.
20. The computer program product of claim 14, wherein in the computer program instructions stored on the non-transitory computer readable storage medium, the mask is represented using image data, with each pixel corresponding to a point on the source mesh.
US11/496,241 2006-07-31 2006-07-31 Converting deformation data for a mesh to animation data for a skeleton, skinning and shading in a runtime computer graphics animation engine Active 2028-03-18 US7859538B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/496,241 US7859538B2 (en) 2006-07-31 2006-07-31 Converting deformation data for a mesh to animation data for a skeleton, skinning and shading in a runtime computer graphics animation engine

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US11/496,241 US7859538B2 (en) 2006-07-31 2006-07-31 Converting deformation data for a mesh to animation data for a skeleton, skinning and shading in a runtime computer graphics animation engine
CA 2593485 CA2593485A1 (en) 2006-07-31 2007-07-12 Converting deformation data for a mesh to animation data for a skeleton, skinning and shading in a runtime computer graphics animation engine
EP07252885A EP1884896A3 (en) 2006-07-31 2007-07-20 Converting deformation data for a mesh to animation data for a skeleton, skinning and shading in a runtime computer graphics animation engine
JP2007198396A JP2008033940A (en) 2006-07-31 2007-07-31 Run time/computer graphic animation/conversion in engine from deformation data for mesh to animation data for skeleton, skinning, and shading

Publications (2)

Publication Number Publication Date
US20080024487A1 US20080024487A1 (en) 2008-01-31
US7859538B2 true US7859538B2 (en) 2010-12-28

Family

ID=38657699

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/496,241 Active 2028-03-18 US7859538B2 (en) 2006-07-31 2006-07-31 Converting deformation data for a mesh to animation data for a skeleton, skinning and shading in a runtime computer graphics animation engine

Country Status (4)

Country Link
US (1) US7859538B2 (en)
EP (1) EP1884896A3 (en)
JP (1) JP2008033940A (en)
CA (1) CA2593485A1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080129738A1 (en) * 2006-12-02 2008-06-05 Electronics And Telecommunications Research Institute Method and apparatus for rendering efficient real-time wrinkled skin in character animation
US20100033488A1 (en) * 2008-08-11 2010-02-11 Microsoft Corporation Example-Based Motion Detail Enrichment in Real-Time
US20120226983A1 (en) * 2011-03-01 2012-09-06 Lucasfilm Entertainment Company Ltd. Copying an Object in an Animation Creation Application
WO2012151446A2 (en) * 2011-05-03 2012-11-08 Microsoft Corporation Employing mesh files to animate transitions in client applications
US8373704B1 (en) * 2008-08-25 2013-02-12 Adobe Systems Incorporated Systems and methods for facilitating object movement using object component relationship markers
US8683429B2 (en) 2008-08-25 2014-03-25 Adobe Systems Incorporated Systems and methods for runtime control of hierarchical objects
US9418465B2 (en) 2013-12-31 2016-08-16 Dreamworks Animation Llc Multipoint offset sampling deformation techniques
US20170243396A1 (en) * 2016-02-19 2017-08-24 Samsung Electronics Co., Ltd Method for processing image and electronic device thereof
US9786083B2 (en) 2011-10-07 2017-10-10 Dreamworks Animation L.L.C. Multipoint offset sampling deformation
US10134167B2 (en) 2013-03-15 2018-11-20 Dreamworks Animation Llc Using curves to emulate soft body deformation
US10410431B2 (en) 2017-07-11 2019-09-10 Nvidia Corporation Skinning a cluster based simulation with a visual mesh using interpolated orientation and position

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8115774B2 (en) * 2006-07-28 2012-02-14 Sony Computer Entertainment America Llc Application of selective regions of a normal map based on joint position in a three-dimensional model
JP5089422B2 (en) 2008-02-15 2012-12-05 岡本化学工業株式会社 Photosensitive composition and lithographic printing plate precursor using the same
US8379036B2 (en) * 2008-02-22 2013-02-19 Pixar Mesh transfer
US20100259547A1 (en) 2009-02-12 2010-10-14 Mixamo, Inc. Web platform for interactive design, synthesis and delivery of 3d character motion data
US8704832B2 (en) 2008-09-20 2014-04-22 Mixamo, Inc. Interactive design, synthesis and delivery of 3D character motion data through the web
US8749556B2 (en) 2008-10-14 2014-06-10 Mixamo, Inc. Data compression for real-time streaming of deformable 3D models for 3D animation
US8982122B2 (en) 2008-11-24 2015-03-17 Mixamo, Inc. Real time concurrent design of shape, texture, and motion for 3D character animation
US8659596B2 (en) 2008-11-24 2014-02-25 Mixamo, Inc. Real time generation of animation-ready 3D character models
WO2010060113A1 (en) * 2008-11-24 2010-05-27 Mixamo, Inc. Real time generation of animation-ready 3d character models
WO2010129721A2 (en) * 2009-05-05 2010-11-11 Mixamo, Inc. Distributed markerless motion capture
US8928672B2 (en) 2010-04-28 2015-01-06 Mixamo, Inc. Real-time automatic concatenation of 3D animation sequences
US8797328B2 (en) * 2010-07-23 2014-08-05 Mixamo, Inc. Automatic generation of 3D character animation from 3D meshes
US10049482B2 (en) 2011-07-22 2018-08-14 Adobe Systems Incorporated Systems and methods for animation recommendations
US9747495B2 (en) 2012-03-06 2017-08-29 Adobe Systems Incorporated Systems and methods for creating and distributing modifiable animated video messages
US10489956B2 (en) 2015-07-27 2019-11-26 Autodesk, Inc. Robust attribute transfer for character animation
WO2017223530A1 (en) 2016-06-23 2017-12-28 LoomAi, Inc. Systems and methods for generating computer ready animation models of a human head from captured data images
US10198845B1 (en) 2018-05-29 2019-02-05 LoomAi, Inc. Methods and systems for animating facial expressions

Citations (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09330424A (en) 1996-06-07 1997-12-22 Matsushita Electric Ind Co Ltd Movement converter for three-dimensional skeleton structure
US5767861A (en) 1994-08-11 1998-06-16 Kabushiki Kaisha Sega Enterprises Processing apparatus and method for displaying a moving figure constrained to provide appearance of fluid motion
US5852450A (en) 1996-07-11 1998-12-22 Lamb & Company, Inc. Method and apparatus for processing captured motion data
JPH11185055A (en) 1997-12-24 1999-07-09 Fujitsu Ltd Motion data preparing device and storage medium storing program therefor
US5966141A (en) 1996-09-03 1999-10-12 Monolith Co., Ltd. Apparatus and method for animation using topology
US6166746A (en) 1994-07-21 2000-12-26 Matsushita Electric Industrial Co., Ltd. Three-dimensional image processing apparatus for jointed objects
US6203425B1 (en) 1996-02-13 2001-03-20 Kabushiki Kaisha Sega Enterprises Image generating device, method thereof, game device and storage medium
US6215496B1 (en) 1998-07-23 2001-04-10 Microsoft Corporation Sprites with depth
US20010004262A1 (en) 1997-08-01 2001-06-21 Matsushita Electric Industrial Co. Ltd. Motion data generation apparatus, motion data generation method, and motion data generation program storage medium
US6326972B1 (en) 1998-08-21 2001-12-04 Pacific Data Images, Inc. 3D stroke-based character modeling suitable for efficiently rendering large crowds
US6377281B1 (en) 2000-02-17 2002-04-23 The Jim Henson Company Live performance control of computer graphic characters
US20020050997A1 (en) 2000-01-28 2002-05-02 Square Co., Ltd. Method, game machine and recording medium for displaying motion in a video game
US20020067363A1 (en) 2000-09-04 2002-06-06 Yasunori Ohto Animation generating method and device, and medium for providing program
US6503144B1 (en) 2000-01-28 2003-01-07 Square Co., Ltd. Computer readable program product storing program for ball-playing type game, said program, and ball-playing type game processing apparatus and method
US6522332B1 (en) 2000-07-26 2003-02-18 Kaydara, Inc. Generating action data for the animation of characters
US6535215B1 (en) 1999-08-06 2003-03-18 Vcom3D, Incorporated Method for animating 3-D computer generated characters
US20030164829A1 (en) 2001-03-21 2003-09-04 Christopher Bregler Method, apparatus and computer program for capturing motion of a cartoon and retargetting the motion to another object
US6626759B1 (en) 2000-06-05 2003-09-30 Kabushiki Kaisha Square Enix Game apparatus, method for displaying motion of character, and computer readable recording medium for recording program used to display motion of character
US20030193503A1 (en) 2002-04-10 2003-10-16 Mark Seminatore Computer animation system and method
US20040001064A1 (en) * 2002-06-28 2004-01-01 Microsoft Corporation Methods and system for general skinning via hardware accelerators
US20040012594A1 (en) 2002-07-19 2004-01-22 Andre Gauthier Generating animation data
US20040036689A1 (en) * 2002-08-23 2004-02-26 Hung-Chun Chiu Method for capturing and creating an animated image
US20040160445A1 (en) 2002-11-29 2004-08-19 Whatmough Kenneth J. System and method of converting frame-based animations into interpolator-based animations
US20040179013A1 (en) 2003-03-13 2004-09-16 Sony Corporation System and method for animating a digital facial model
US6976918B2 (en) 2000-01-24 2005-12-20 Konami Corporation Video game that interpolates between animated segments to create new segments
US7012608B1 (en) 2001-08-02 2006-03-14 Iwao Fujisaki Simulation device
US20060061574A1 (en) 2003-04-25 2006-03-23 Victor Ng-Thow-Hing Joint component framework for modeling complex joint behavior
US20060139355A1 (en) 2004-12-27 2006-06-29 Seyoon Tak Physically based motion retargeting filter
US20060181535A1 (en) 2003-07-22 2006-08-17 Antics Technologies Limited Apparatus for controlling a virtual environment
US7102647B2 (en) * 2001-06-26 2006-09-05 Microsoft Corporation Interactive horizon mapping
US7104890B2 (en) 2002-07-30 2006-09-12 Koei Co., Ltd. Program, recording medium, game character rendering method, and game apparatus
US7106334B2 (en) 2001-02-13 2006-09-12 Sega Corporation Animation creation program
US7126607B2 (en) 2002-08-20 2006-10-24 Namco Bandai Games, Inc. Electronic game and method for effecting game features
US20060262119A1 (en) 2005-05-20 2006-11-23 Michael Isner Transfer of motion between animated characters
US20060274070A1 (en) 2005-04-19 2006-12-07 Herman Daniel L Techniques and workflows for computer graphics animation system
US7168953B1 (en) 2003-01-27 2007-01-30 Massachusetts Institute Of Technology Trainable videorealistic speech animation
US20070024632A1 (en) 2005-07-29 2007-02-01 Jerome Couture-Gagnon Transfer of attributes between geometric surfaces of arbitrary topologies with distortion reduction and discontinuity preservation
US20070030266A1 (en) * 2005-08-02 2007-02-08 Sony Computer Entertainment America Inc. Scheme for providing wrinkled look in computer simulation of materials
US7221380B2 (en) 2003-05-14 2007-05-22 Pixar Integrated object bend, squash and stretch method and apparatus
US7251593B2 (en) 2001-10-29 2007-07-31 Honda Giken Kogyo Kabushiki Kaisha Simulation system, method and computer-readable medium for human augmentation devices
US7253817B1 (en) 1999-12-29 2007-08-07 Virtual Personalities, Inc. Virtual human interface for conducting surveys
US7515155B2 (en) * 2003-05-14 2009-04-07 Pixar Statistical dynamic modeling method and apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6826759B2 (en) * 1997-04-01 2004-11-30 Sun Microsystems, Inc. Method and apparatus for discovering and activating software components

Patent Citations (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6166746A (en) 1994-07-21 2000-12-26 Matsushita Electric Industrial Co., Ltd. Three-dimensional image processing apparatus for jointed objects
US5767861A (en) 1994-08-11 1998-06-16 Kabushiki Kaisha Sega Enterprises Processing apparatus and method for displaying a moving figure constrained to provide appearance of fluid motion
US6203425B1 (en) 1996-02-13 2001-03-20 Kabushiki Kaisha Sega Enterprises Image generating device, method thereof, game device and storage medium
JPH09330424A (en) 1996-06-07 1997-12-22 Matsushita Electric Ind Co Ltd Movement converter for three-dimensional skeleton structure
US5852450A (en) 1996-07-11 1998-12-22 Lamb & Company, Inc. Method and apparatus for processing captured motion data
US5966141A (en) 1996-09-03 1999-10-12 Monolith Co., Ltd. Apparatus and method for animation using topology
US20010004262A1 (en) 1997-08-01 2001-06-21 Matsushita Electric Industrial Co. Ltd. Motion data generation apparatus, motion data generation method, and motion data generation program storage medium
JPH11185055A (en) 1997-12-24 1999-07-09 Fujitsu Ltd Motion data preparing device and storage medium storing program therefor
US6215496B1 (en) 1998-07-23 2001-04-10 Microsoft Corporation Sprites with depth
US6326972B1 (en) 1998-08-21 2001-12-04 Pacific Data Images, Inc. 3D stroke-based character modeling suitable for efficiently rendering large crowds
US6535215B1 (en) 1999-08-06 2003-03-18 Vcom3D, Incorporated Method for animating 3-D computer generated characters
US7253817B1 (en) 1999-12-29 2007-08-07 Virtual Personalities, Inc. Virtual human interface for conducting surveys
US6976918B2 (en) 2000-01-24 2005-12-20 Konami Corporation Video game that interpolates between animated segments to create new segments
US6503144B1 (en) 2000-01-28 2003-01-07 Square Co., Ltd. Computer readable program product storing program for ball-playing type game, said program, and ball-playing type game processing apparatus and method
US20020050997A1 (en) 2000-01-28 2002-05-02 Square Co., Ltd. Method, game machine and recording medium for displaying motion in a video game
US6697071B2 (en) 2000-01-28 2004-02-24 Kabushiki Kaisha Square Enix Method, game machine and recording medium for displaying motion in a video game
US6377281B1 (en) 2000-02-17 2002-04-23 The Jim Henson Company Live performance control of computer graphic characters
US6626759B1 (en) 2000-06-05 2003-09-30 Kabushiki Kaisha Square Enix Game apparatus, method for displaying motion of character, and computer readable recording medium for recording program used to display motion of character
US6522332B1 (en) 2000-07-26 2003-02-18 Kaydara, Inc. Generating action data for the animation of characters
US20020067363A1 (en) 2000-09-04 2002-06-06 Yasunori Ohto Animation generating method and device, and medium for providing program
US7106334B2 (en) 2001-02-13 2006-09-12 Sega Corporation Animation creation program
US20030164829A1 (en) 2001-03-21 2003-09-04 Christopher Bregler Method, apparatus and computer program for capturing motion of a cartoon and retargetting the motion to another object
US7102647B2 (en) * 2001-06-26 2006-09-05 Microsoft Corporation Interactive horizon mapping
US7012608B1 (en) 2001-08-02 2006-03-14 Iwao Fujisaki Simulation device
US7251593B2 (en) 2001-10-29 2007-07-31 Honda Giken Kogyo Kabushiki Kaisha Simulation system, method and computer-readable medium for human augmentation devices
US20030193503A1 (en) 2002-04-10 2003-10-16 Mark Seminatore Computer animation system and method
US20040001064A1 (en) * 2002-06-28 2004-01-01 Microsoft Corporation Methods and system for general skinning via hardware accelerators
US20040012594A1 (en) 2002-07-19 2004-01-22 Andre Gauthier Generating animation data
US7104890B2 (en) 2002-07-30 2006-09-12 Koei Co., Ltd. Program, recording medium, game character rendering method, and game apparatus
US7126607B2 (en) 2002-08-20 2006-10-24 Namco Bandai Games, Inc. Electronic game and method for effecting game features
US20040036689A1 (en) * 2002-08-23 2004-02-26 Hung-Chun Chiu Method for capturing and creating an animated image
US20040160445A1 (en) 2002-11-29 2004-08-19 Whatmough Kenneth J. System and method of converting frame-based animations into interpolator-based animations
US7168953B1 (en) 2003-01-27 2007-01-30 Massachusetts Institute Of Technology Trainable videorealistic speech animation
US20040179013A1 (en) 2003-03-13 2004-09-16 Sony Corporation System and method for animating a digital facial model
US7068277B2 (en) 2003-03-13 2006-06-27 Sony Corporation System and method for animating a digital facial model
US20060061574A1 (en) 2003-04-25 2006-03-23 Victor Ng-Thow-Hing Joint component framework for modeling complex joint behavior
US7221380B2 (en) 2003-05-14 2007-05-22 Pixar Integrated object bend, squash and stretch method and apparatus
US7515155B2 (en) * 2003-05-14 2009-04-07 Pixar Statistical dynamic modeling method and apparatus
US20060181535A1 (en) 2003-07-22 2006-08-17 Antics Technologies Limited Apparatus for controlling a virtual environment
US20060139355A1 (en) 2004-12-27 2006-06-29 Seyoon Tak Physically based motion retargeting filter
US20060274070A1 (en) 2005-04-19 2006-12-07 Herman Daniel L Techniques and workflows for computer graphics animation system
US20060262119A1 (en) 2005-05-20 2006-11-23 Michael Isner Transfer of motion between animated characters
US20070024632A1 (en) 2005-07-29 2007-02-01 Jerome Couture-Gagnon Transfer of attributes between geometric surfaces of arbitrary topologies with distortion reduction and discontinuity preservation
US20070030266A1 (en) * 2005-08-02 2007-02-08 Sony Computer Entertainment America Inc. Scheme for providing wrinkled look in computer simulation of materials

Non-Patent Citations (15)

* Cited by examiner, † Cited by third party
Title
Antony Ward, Game Character Development with Maya, 2004, New Riders. *
English Abstract of "Motobayashi et al." (Provided as explanation of relevance), 2009.
English abstract of "Yukawa et al." (provided as explanation of relevance), 2009.
English Translation of JP 09-330424, 1997.
English Translation of JP 11-185055, 1999.
Florian Loitsch, "Maya File Translator," 2004. http://florian.loitsch.com/gpExport/oldDocs/report.html. *
Gleicher, Michael. "Retargetting Motion to New Characters," Proceedings of SIGGRAPH 98, pp. 33-42, Jul. 1998.
James, Doug L., et al, "Skinning Mesh Animations", SIGGRAPH '05: ACM Siggraph 2005 Papers, 2005, XP002468241, pp. 399-407.
Mohr, Alex, et al, "Building Efficient, Accurate Character Skins From Examples", ACM Transactions On Graphics, vol. 22, No. 3, Jul. 2003, XP002468240, pp. 562-568.
Motobayashi et al. "Assimilated Motion Generation for Characters with Various Features," Journal of Institute of Electronics, Information and Communication Engineers, Information and System II-Pattern Processing, Japan, Jul. 1, 2004, vol. J87-D-II, No. 7, pp. 1473-1486.
Office Action. U.S. Appl. No. 12/220,254 dtd. Aug. 20, 2009.
Sumner, R. W. and Popović, J. 2004. Deformation transfer for triangle meshes. ACM Trans. Graph. 23, 3 (Aug. 2004), 399-405. DOI=http://doi.acm.org/10.1145/1015706.1015736. *
Yukawa et al. "Human Motion Description System Using BUYO-FU," Journal of Information Processing Society of Japan, Japan, Information Processing Society of Japan, Oct. 15, 2000, vol. 41, No. 10, pp. 2873-2880.

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080129738A1 (en) * 2006-12-02 2008-06-05 Electronics And Telecommunications Research Institute Method and apparatus for rendering efficient real-time wrinkled skin in character animation
US20100033488A1 (en) * 2008-08-11 2010-02-11 Microsoft Corporation Example-Based Motion Detail Enrichment in Real-Time
US8144155B2 (en) * 2008-08-11 2012-03-27 Microsoft Corp. Example-based motion detail enrichment in real-time
US8373704B1 (en) * 2008-08-25 2013-02-12 Adobe Systems Incorporated Systems and methods for facilitating object movement using object component relationship markers
US8683429B2 (en) 2008-08-25 2014-03-25 Adobe Systems Incorporated Systems and methods for runtime control of hierarchical objects
US8464153B2 (en) * 2011-03-01 2013-06-11 Lucasfilm Entertainment Company Ltd. Copying an object in an animation creation application
US20120226983A1 (en) * 2011-03-01 2012-09-06 Lucasfilm Entertainment Company Ltd. Copying an Object in an Animation Creation Application
US9335902B2 (en) 2011-03-01 2016-05-10 Lucasfilm Entertainment Company, Ltd. Copying an object in an animation creation application
WO2012151446A2 (en) * 2011-05-03 2012-11-08 Microsoft Corporation Employing mesh files to animate transitions in client applications
WO2012151446A3 (en) * 2011-05-03 2013-03-21 Microsoft Corporation Employing mesh files to animate transitions in client applications
US9786083B2 (en) 2011-10-07 2017-10-10 Dreamworks Animation L.L.C. Multipoint offset sampling deformation
US10134167B2 (en) 2013-03-15 2018-11-20 Dreamworks Animation Llc Using curves to emulate soft body deformation
US9418465B2 (en) 2013-12-31 2016-08-16 Dreamworks Animation Llc Multipoint offset sampling deformation techniques
US20170243396A1 (en) * 2016-02-19 2017-08-24 Samsung Electronics Co., Ltd Method for processing image and electronic device thereof
US10410407B2 (en) * 2016-02-19 2019-09-10 Samsung Electronics Co., Ltd. Method for processing image and electronic device thereof
US10410431B2 (en) 2017-07-11 2019-09-10 Nvidia Corporation Skinning a cluster based simulation with a visual mesh using interpolated orientation and position

Also Published As

Publication number Publication date
EP1884896A3 (en) 2008-03-26
US20080024487A1 (en) 2008-01-31
JP2008033940A (en) 2008-02-14
CA2593485A1 (en) 2008-01-31
EP1884896A2 (en) 2008-02-06

Similar Documents

Publication Publication Date Title
Kähler et al. Geometry-based muscle modeling for facial animation
Hill Jr., Computer Graphics Using OpenGL
US5267154A (en) Biological image formation aiding system and biological image forming method
US6434278B1 (en) Generating three-dimensional models of objects defined by two-dimensional image data
JP4364409B2 (en) How to model graphic objects interactively using linked and unlinked surface elements
US6677944B1 (en) Three-dimensional image generating apparatus that creates a three-dimensional model from a two-dimensional image by image processing
US6664956B1 (en) Method for generating a personalized 3-D face model
JP3184327B2 (en) 3D graphics processing method and apparatus
JP5232358B2 (en) Rendering outline fonts
EP0889437A2 (en) Raster image mapping
US6700586B1 (en) Low cost graphics with stitching processing hardware support for skeletal animation
US8982122B2 (en) Real time concurrent design of shape, texture, and motion for 3D character animation
JP2008519318A (en) Depth tracking method in scan line based raster image processor
JP5829371B2 (en) Facial animation using motion capture data
KR100720309B1 (en) Automatic 3D modeling system and method
US6268861B1 (en) Volumetric three-dimensional fog rendering technique
US6268865B1 (en) Method and apparatus for three-dimensional painting
US6283858B1 (en) Method for manipulating images
Wright Jr et al. OpenGL SuperBible: comprehensive tutorial and reference
US7920144B2 (en) Method and system for visualization of dynamic three-dimensional virtual objects
US7289119B2 (en) Statistical rendering acceleration
Wang et al. Multi-weight enveloping: least-squares approximation techniques for skin animation
US7515155B2 (en) Statistical dynamic modeling method and apparatus
US8659596B2 (en) Real time generation of animation-ready 3D character models
US6822653B2 (en) Methods and system for general skinning via hardware accelerators

Legal Events

Date Code Title Description
AS Assignment

Owner name: AVID TECHNOLOGY, INC., MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ISNER, MICHAEL;VON DER PAHLEN, JAVIER NICOLAI;KANG, THOMAS HO-MIN;REEL/FRAME:018428/0148

Effective date: 20061023

AS Assignment

Owner name: AUTODESK, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AVID TECHNOLOGY, INC.;REEL/FRAME:021962/0974

Effective date: 20081117


STCF Information on status: patent grant

Free format text: PATENTED CASE

CC Certificate of correction
REMI Maintenance fee reminder mailed
FPAY Fee payment

Year of fee payment: 4

SULP Surcharge for late payment
MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552)

Year of fee payment: 8