GB2574460A - Computer Animation Method and Apparatus - Google Patents


Publication number
GB2574460A
GB2574460A (application GB1809389.8A / GB201809389A; granted publication GB2574460B)
Authority
GB
United Kingdom
Prior art keywords
model
intermediate frame
foot
condition
frame time
Prior art date
Legal status
Granted
Application number
GB1809389.8A
Other versions
GB2574460B (en)
GB201809389D0 (en)
Inventor
Powell Oliver
Current Assignee
Frontier Developments PLC
Original Assignee
Frontier Developments PLC
Priority date
Filing date
Publication date
Application filed by Frontier Developments PLC filed Critical Frontier Developments PLC
Priority to GB1809389.8A priority Critical patent/GB2574460B/en
Publication of GB201809389D0 publication Critical patent/GB201809389D0/en
Publication of GB2574460A publication Critical patent/GB2574460A/en
Application granted granted Critical
Publication of GB2574460B publication Critical patent/GB2574460B/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00: Animation
    • G06T13/20: 3D [Three Dimensional] animation
    • G06T13/40: 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A method of creating a computer animation of a model, such as a skeletal model, comprises: a) providing a plurality of keyframes that describe a pose of a model at a plurality of times; b) identifying a plurality of intermediate frame times in an interval between the plurality of keyframes; c) for each intermediate frame time, identifying a weighting value associated with that intermediate frame time; and d) for each intermediate frame time, creating an intermediate frame by a combination of interpolation between adjacent keyframes and inverse kinematics (IK); wherein e) the extent to which each of interpolation and inverse kinematics determines the content of each intermediate frame depends upon the weighting value associated with that intermediate frame. Typically the model includes a representation of a foot walking on terrain, or the method is applied to head tracking. Preferably, "float-streams", each a series of weighting values, describe how much IK blending is used within keyframed animations.

Description

TITLE: COMPUTER ANIMATION METHOD AND APPARATUS
DESCRIPTION
The present invention relates to computer animation and particularly, but not exclusively, to computer animation of a model for display in a computer simulation or a computer game.
In computer programs such as video games, there is often a need to generate an animated display that presents a visualisation of models representative of an object moving in a virtual space. Within the scenario being represented by the program, the modelled objects may interact dynamically at run-time with arbitrary terrain and obstacles within the virtual space. The models used in more advanced video games typically define a skeleton that comprises a plurality of virtual solid components (e.g., the bones of a virtual animal) that are interconnected by pivotable joints. References to objects, terrain, etc., in this specification are references to virtual objects that exist in a virtual space created by a computer model.
A typical animation technique used in video games involves an artist creating “keyframes” that describe an object’s pose at various points over a timeline, and interpolating between these poses at run-time. In cases where the object represents a person or an animal, the animation is usually created to match certain scenarios such as walking on flat terrain, climbing stairs or making sharp turns. In such cases, footsteps are explicitly defined to determine when certain events should be triggered in game, such as playing foot-step sounds.
Creation of these animations can be laborious as different animations must be created to match likely variations in terrain, such as large or small steps and slopes. This limits the terrain that can be used within a game and requires explicit synchronisation of animation at run-time to produce correct foot-planting events, where a foot-plant is defined to be where and when a foot is mostly in contact with the ground. By implication, at a foot-plant, the visual representation of the foot should have its orientation dictated by its interaction with the ground.
One approach to ensuring that an object is represented in an appropriate configuration is the use of “Inverse Kinematics”, hereafter referred to as “IK”. IK is a process that, given a model’s skeleton configuration, can produce the approximate correct pose resulting from a desired position and orientation of a particular bone. A walk cycle will typically be modelled on a flat surface plane. However, if this model is applied to animate a character walking on uneven terrain, the resulting animation will cause unnatural foot placement, with feet appearing above or below the terrain at certain points.
This is illustrated in Figure 1 which shows two terrains: a flat terrain 10 (shown by a chain line) and an uneven terrain 12 (shown by a dotted line). The model (of a walking human, in this example) is placed on the terrain by its root transform 14. Points 16 and 18 show the placement of the model's feet during a particular frame of animation and clearly show a discrepancy between the assumed flat surface and the actual uneven terrain. IK corrects these discrepancies by calculating a new skeletal pose for connected bones when a foot makes contact with the ground, effectively augmenting or overriding the pre-generated animation key-frames. Not only will IK place the foot on the ground at the correct position, but it will also align the foot with the ground. IK can also be applied to other scenarios such as hand placement and head look targeting.
IK is typically applied in reaction to events that occur. In the case of foot placement, IK will only affect transforms once the “foot-plant” has occurred on the modelled, flat terrain. This gives rise to a problem, in that, on uneven terrain, the application of IK can result in unnatural or jarring correction of skeletal poses after IK has calculated the correct point and orientation of the foot while it is in contact with the terrain.
The present applicant has appreciated the need for an improved computer animation technique that overcomes or at least alleviates problems with the aforementioned prior art.
In accordance with a first aspect of the present invention, there is provided a method of creating a computer animation of a model (e.g. skeletal model), comprising: a) providing a plurality of keyframes that describe the pose of a model at a plurality of times (e.g. at a plurality of times during an animation sequence (e.g. animation loop)); b) identifying a plurality of intermediate frame times in an interval between first and second keyframes of the plurality of keyframes (e.g. the interval between the initial and end keyframes of the animation sequence, or the interval between adjacent keyframes); c) for each intermediate frame time, identifying a weighting value associated with the intermediate frame time; and
d) for each intermediate frame time, creating an intermediate frame by a combination of interpolation between adjacent keyframes and inverse kinematics; wherein e) the extent to which each of interpolation and inverse kinematics determines the content of each intermediate frame is dependent upon the weighting value associated with that intermediate frame.
In this way, a method of creating a computer animation of a model is provided whereby IK can be applied to correct the position of an object in animated frames by gradually blending between the interpolated animation and the most natural pose at a particular location without causing sudden movement of the object within an animated sequence. Advantageously, this technique provides for improved animation with minimal additional computational complexity/minimal additional data requirement.
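As a minimal sketch of steps (d) and (e), the blend can be expressed as a per-joint linear interpolation between the interpolated pose and the IK pose. Scalar joint values are assumed for simplicity (a real implementation would blend per-bone rotations), and all names are illustrative, not taken from the patent:

```python
def lerp(a, b, t):
    """Linear interpolation between two scalar joint values."""
    return a + (b - a) * t

def make_intermediate_frame(key_a, key_b, t, ik_pose, weight):
    """key_a, key_b: dicts of joint -> value at the adjacent keyframes;
    t: normalised time within the keyframe interval;
    ik_pose: the pose produced by the IK solver for this frame;
    weight: the weighting value (0 = pure interpolation, 1 = pure IK)."""
    interp_pose = {j: lerp(key_a[j], key_b[j], t) for j in key_a}
    # Step (e): the weighting value decides how much IK determines the frame.
    return {j: lerp(interp_pose[j], ik_pose[j], weight) for j in interp_pose}
```

With a weight of 0 the frame is purely interpolated; with a weight of 1 the IK pose takes complete control, and intermediate weights blend smoothly between the two.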
In one embodiment, the model is a three-dimensional model (e.g. three-dimensional skeletal model).
In one embodiment, the inverse kinematics is operative to selectively focus influence on the positioning of one or more component part of the model. In one embodiment the one or more component part of the model comprises a limb (e.g. leg) or head of a character.
Through selection of suitable weighting values (which, as a set, will be referred to as a “float-stream”), IK can be applied to intermediate frames to ensure that represented objects exhibit smooth, continuous movement from one keyframe to the next.
In one embodiment, steps a)-e) occur dynamically at run-time. In this way, the gradual blending between the interpolated animation and IK may be implemented during live gameplay.
In one embodiment, each weighting value may be calculated in an automated manner. Accordingly, the step of calculating the weighting values may be carried out in advance of run-time (e.g. during software development) or carried out dynamically during run-time.
In one embodiment, each weighting value may be calculated by (e.g. the automated computer-implemented steps of):
for each intermediate frame time, determining whether or not a condition is met and assigning a first or a second condition value to that frame time depending upon whether the condition is met; and calculating the weighting value of a frame time as a function of one or more of the condition values (e.g. as a function of the condition value assigned to that frame time and/or condition values assigned to adjacent frame times).
In such cases, the weighting values are calculated by fitting condition values to a curve (e.g. fitting condition values for successive frame times to a curve) and/or by averaging condition values (e.g. averaging condition values assigned to successive frame times).
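For the averaging variant, a hedged sketch follows; the window size and the use of a simple moving average are assumptions, as the patent leaves the exact function unspecified:

```python
def weights_from_conditions(conditions, window=3):
    """conditions: one 0/1 condition value per intermediate frame time.
    Returns a weighting value per frame, computed as the mean of the
    condition values in a small window centred on that frame, so IK
    influence ramps in and out instead of switching abruptly."""
    half = window // 2
    out = []
    for i in range(len(conditions)):
        lo, hi = max(0, i - half), min(len(conditions), i + half + 1)
        out.append(sum(conditions[lo:hi]) / (hi - lo))
    return out
```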
In one embodiment each condition value may be either 0 or 1.
In one embodiment, each weighting value is a real number in the range 0 to 1.
In one embodiment, the weighting values are set to gradually increase the extent to which inverse kinematics is determinate on the content of each intermediate frame (e.g. gradually increase the extent as the model moves towards the meeting of the condition). In this way, the animation may gradually change from maximum interpolated (e.g. fully interpolated) to maximum IK (e.g. fully IK).
In another embodiment, the weighting values are set to gradually decrease the extent to which inverse kinematics is determinate on the content of each intermediate frame (e.g. gradually decrease the extent as the model moves towards the meeting of the condition). In this way, the animation may gradually change from maximum IK (e.g. fully IK) to maximum interpolated (e.g. fully interpolated).
In a first embodiment, the condition value may relate to proximity to a feature in a scene (e.g. proximity of a part of the model to terrain in the scene).
In a typical application of this first embodiment, the model includes a representation of a foot walking on terrain. In such embodiments, the condition value may indicate whether or not the foot is planted on the terrain. The foot may be determined as being planted if greater than a predetermined number of vertices in a set of vertices of a model representing the foot are within a threshold distance from the terrain. In addition, in one embodiment the foot may only be determined as being planted if the speed of movement of the foot falls below a predetermined velocity limit (e.g. a velocity limit indicative of a foot-plant). In one embodiment, a change from the foot not being planted to the foot being planted corresponds to a change from the first to the second condition value. In this example, the weighting value may be selected to gradually increase the influence of inverse kinematics upon occurrence of the second condition.
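A sketch of that plant test under the stated conditions; the parameter names and the fraction-based vertex count are illustrative assumptions:

```python
def is_foot_planted(vertex_heights, foot_speed,
                    height_threshold, min_fraction, speed_limit):
    """vertex_heights: heights above the terrain of the foot vertices
    chosen for the test. The foot counts as planted only if enough of
    them are within the height threshold AND the foot is moving more
    slowly than the velocity limit."""
    within = sum(1 for h in vertex_heights if h <= height_threshold)
    return (within / len(vertex_heights) >= min_fraction
            and foot_speed <= speed_limit)
```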
In a second embodiment, the condition value may be indicative of an orientation of a part of the model relative to a feature in a scene. For example, in one embodiment the condition value may be indicative of intersection of a ray cast from the model and the feature.
In a typical application of this second embodiment, the condition value may be indicative of intersection of a ray cast from a part of the model and a plane in the scene (e.g. a plane associated with an object or another character in the scene). Typically, the direction of the ray is determined by a direction of a pose of the model. In such embodiments, the model may represent a head of a character (i.e. the head of an entity within a space). In one embodiment, a change from the ray intersecting the plane to the ray no longer intersecting the plane corresponds to a change from the first to the second condition value. In this example, the weighting value may be selected to gradually reduce the influence of inverse kinematics upon occurrence of the second condition.
In accordance with a second aspect of the present invention, there is provided apparatus operative to carry out a method in accordance with any embodiment of the first aspect of the invention.
In accordance with a third aspect of the present invention, there is provided a computer program which, when executed by a processor, causes the processor to carry out a method in accordance with any embodiment of the first aspect of the invention.
In accordance with a fourth aspect of the present invention, there is provided a computer program product carrying instructions which, when executed by a processor, causes the processor to carry out a method in accordance with any embodiment of the first aspect of the invention.
The computer program product may be provided on any suitable carrier medium.
In one embodiment the computer program product comprises a tangible carrier medium (e.g. a computer-readable storage medium).
In another embodiment the computer program product comprises a transient carrier medium (e.g. data stream).
Embodiments of the present invention will now be described in detail, by way of example, and with reference to the accompanying drawings, in which:
Figure 1 shows the interaction of a model representative of a walking person with flat and with uneven terrain, and has already been discussed;
Figure 2 illustrates the stages of a foot-plant in an animation of a walking person;
Figure 3 is a table showing values in a stream constructed as part of a method embodying the invention;
Figures 4 and 5 illustrate two potential output streams that can be generated from the table of Figure 3; and
Figure 6 displays two different frames of animation where a model performs a “ramming” motion.
Embodiments of the invention will be described using the term "float-streams", which can be understood to mean a lightweight method of describing IK blending within keyframed animations. Their purpose and nature will become apparent from the description that follows.
A float-stream is a series of weighting values that describe how much IK blending should be applied at certain periods over the life-time of the animation loop. In this example, each value is a floating-point value in the range 0 to 1.
For example, each component that represents a foot in a model’s skeleton is associated with its own float-stream. Within that stream, a value of “1” indicates that IK can take complete control of the pose for that foot, such as when the foot is fully in contact with the ground. Conversely, a value of “0” indicates that no IK blending should occur on the foot orientation, typically when the foot is not considered to be touching the ground.
Float-streams are flexible in how they allow blending to be specified and do not require a value to be recorded for every frame. Instead, blending values can be inferred via interpolation between two values. For example, given a walk cycle animation for a humanoid skeleton, each foot may plant once. To describe the blending thresholds using float-streams requires only four values per foot. For each foot in this example, a value of "1" is recorded at the frame during which the foot first becomes fully in contact with the ground, and therefore should be fully aligned with the ground surface. A subsequent value of "1" is then recorded at the frame the foot ceases to be completely in contact with the ground. For the period between these two frames the blend weight value remains at "1" and therefore, the IK system will have complete control over that foot's pose. A value of "0" is set at corresponding points in the animation timeline during which the orientation of the foot is completely under the control of the animation key-frames. Animations can contain data at a rate of 30 to 60 frames-per-second, whereas a single foot-plant can be described in as few as four values over a basic walk cycle, thus adding a negligible amount of data. These values are not absolute and serve as a guide for illustrative purposes. The flexibility of float-streams allows more data points to be added for finer control.
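That sparse representation can be sketched as follows; the class shape and names are assumptions, but it shows how a handful of (time, weight) pairs describe a whole foot-plant via interpolation:

```python
class FloatStream:
    """A sparse, sorted list of (time, weight) samples. Sampling
    linearly interpolates between neighbouring samples, so a single
    foot-plant needs only four recorded values over a walk cycle."""
    def __init__(self, points):
        self.points = sorted(points)

    def sample(self, t):
        """Return the blend weight at animation time t."""
        pts = self.points
        if t <= pts[0][0]:
            return pts[0][1]
        if t >= pts[-1][0]:
            return pts[-1][1]
        for (t0, w0), (t1, w1) in zip(pts, pts[1:]):
            if t0 <= t <= t1:
                return w0 + (w1 - w0) * (t - t0) / (t1 - t0)
```

For instance, the four-value foot-plant described above might be `FloatStream([(0.0, 0.0), (0.2, 1.0), (0.5, 1.0), (0.7, 0.0)])`: the weight holds at 1 throughout the plant and ramps to 0 on either side.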
Typically, animators must annotate their animations with the events that occur within them, such as foot planting, which can present considerable overhead in a development cycle. To reduce this burden, float-streams can be derived automatically using custom tooling. Rather than require manual creation of the float-stream data for every animation, instead the animator can provide a set of information that will allow these values to be determined automatically. First, the animator assigns which vertices are important for a foot-plant. Secondly, a height threshold is then set, such that vertices that fall below this threshold are candidates for a foot-plant. Third, to fine-tune when a foot-plant is considered to have occurred, a percentage of vertices that fall below the height threshold should be set as it is common for a foot to be planted on the ground, but not all parts of the foot may be fully grounded. Lastly, a velocity tolerance is set to account for situations where a foot may be moving close to the ground and therefore, should not be considered to be “planted”. With these parameters set, the float-streams for all feet in all animations can be derived.
The process of creating the float-stream iterates through each frame of animation and, where a foot's vertices satisfy the conditions set, a "1" is recorded for that frame; otherwise a "0" is recorded. Figure 2 presents an illustration of the float-stream evaluation tooling. The foot at the position indicated at 20 is just beginning to make contact with the ground at 22. The plane 24 represents the height threshold below which vertices are considered to contribute to a foot-plant. For the foot at position 20, insufficient vertices are within the planting threshold and therefore this is not considered to be a foot-plant. The foot at position 26 has most of the target vertices within the planting threshold and therefore is considered a plant.
This process produces a stream of sub-sequences of values that are either “0” or “1”. Such a stream can then be processed to produce a “blend curve” by taking the median point in each sub-sequence and blending the other values according to a specified curve fit. The result is a float-stream that is accessible to the animator and can be manually tuned if desired.
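One way to sketch that processing step, with a simple linear ramp standing in for the unspecified curve fit:

```python
def blend_curve(stream, ramp=2):
    """stream: raw per-frame 0/1 condition values from the tooling.
    Keeps each run of 1s at full IK weight and ramps the weight down
    linearly over `ramp` frames on either side of the run."""
    out = [float(v) for v in stream]
    for i, v in enumerate(stream):
        if v == 1:
            for d in range(1, ramp + 1):
                w = 1 - d / (ramp + 1)
                if i - d >= 0:
                    out[i - d] = max(out[i - d], w)
                if i + d < len(stream):
                    out[i + d] = max(out[i + d], w)
    return out
```

Choosing a wider `ramp`, or a non-linear easing function in place of the linear weights, corresponds to the different curve fits illustrated in Figures 4 and 5.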
Figure 3 presents an example of the generated float-stream processing and output for an animation where a foot plants twice. The produced float-stream “FS” is shown at the top of the figure, with the processed stream “PS” below. Note that, in this example, the animation is a loop, hence the retention of the “0” values at the beginning and end. The values marked by are then assigned via a defined curve fit to produce the final output. The plots in
Figures 4 and 5 illustrate two potential output streams that can be generated from the same float-stream produced by the tool. The flexibility of the float-streams allows for more values to be used to create a more precise curve fit, or fewer values for coarser grained blending in cases where such precision is not necessary. During gameplay, the stream can be sampled using the current animation time to retrieve the blend value.
With the knowledge of foot-plant positions encoded in the animation data, predictions can now be made as to when and where subsequent foot-plants will occur at run-time. This information allows more accurate positional tracking to be applied to limbs and connected hip bones. With this information, continuous adjustments can be made to correct the positioning of foot placement throughout an animation cycle. When a foot is planted, the float-streams are used to locate the next point in the animation where a foot-plant will occur. The pose information of the skeleton at this point in the animation can then be queried to deduce where the foot will come in contact with the ground. To further optimise this process, the animation poses at the foot-plant positions can be calculated once and cached for future retrieval.
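Locating the next foot-plant at run-time can be sketched as a wrapping forward scan over the per-frame blend weights (the wrap-around mirrors the looping animations described here; all names are illustrative):

```python
def next_plant_frame(blend_stream, current_frame):
    """Return the next frame (wrapping, since animations loop) whose
    blend weight reaches 1, i.e. the next full foot-plant. The skeleton
    pose at that frame can then be queried, and cached, to predict
    where the foot will contact the ground."""
    n = len(blend_stream)
    for offset in range(1, n + 1):
        frame = (current_frame + offset) % n
        if blend_stream[frame] >= 1.0:
            return frame
    return None  # this animation contains no foot-plant
```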
With the position and orientation of the current and future foot-plants known, a smooth motion curve can be generated. Existing systems provide a means to determine the supporting height of a model depending on the distribution of support on connected “feet”. Float-streams make this information readily available, and additionally provide a simple means to determine how much each foot contributes to the overall support of the model through the value of the blend weights.
Float-streams can be used in creating intermediate frames in various other scenarios where control over IK influence would be beneficial. One example is head tracking. During head tracking, certain animation poses limit the freedom of movement of the head and thus IK influence should be diminished or disabled entirely, otherwise the resulting pose would look unnatural.
In head tracking, as with foot-planting, it is often desirable to blend IK control with the key-framed animation in a dynamic manner. Head tracking is often achieved by allowing IK to override the head and neck bones to make the head “look at” a target point, within the constraints of the skeleton joints. However, this can often lead to unnatural results with the head appearing to move independently of the movement of the rest of the body. If an animation is depicting a specialised action, such as a “ramming action”, or even a sharp turn, it is often the case that IK overrides should be limited or disabled entirely for certain periods of the animation. The float-streams mechanism provides an extremely flexible and lightweight means of describing this behaviour, as previously illustrated with foot-planting.
Automatic generation of float-streams for head tracking is also possible. The blend weight for a given frame is determined using a custom target plane to describe the acceptable range of head motion and ray casting to determine if the direction of the head during a specific frame of animation lies within this range. This process is evaluated for each frame in the animation resulting in a series of blend weights.
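The per-frame test can be sketched as a standard ray-plane intersection; treating the target plane as infinite is a simplifying assumption (a bounded plane would also need an in-bounds check), and the function names are illustrative:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ray_hits_plane(origin, direction, plane_point, plane_normal):
    """True if the head's facing ray intersects the target motion
    plane ahead of the head (intersection parameter t >= 0)."""
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:   # ray parallel to the plane: no intersection
        return False
    t = dot([p - o for p, o in zip(plane_point, origin)], plane_normal) / denom
    return t >= 0.0

def head_condition_stream(frames, plane_point, plane_normal):
    """One 0/1 value per frame: 1 while the head ray still hits the
    plane (IK permitted), 0 once it no longer does."""
    return [1 if ray_hits_plane(o, d, plane_point, plane_normal) else 0
            for o, d in frames]
```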
Figure 6 displays a first frame of animation in which a model performs a "ramming" motion. At the start of the animation, as shown in Figure 6, the model is upright and facing forward. A ray 30 is cast from the head in the direction it currently faces and can be seen to intersect the target motion plane 32. While this remains true, values are recorded in the float-stream to permit IK motion.
The second frame, also shown in Figure 6, shows the same model while executing part of the "ramming" action. Clearly the position of the head is important to depicting this action within a game and therefore should not be overridden by IK. By applying the same ray-casting technique it can be observed that there is no longer any intersection of the ray 30 with the target motion plane 32, and thus the float-stream is updated to disable IK during this frame. The float-stream can then be optimised by removing sequences of duplicate values, resulting in a minimal set of weights similar to the float-streams for foot-planting.
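The duplicate-removal optimisation mentioned above can be sketched as keeping only the first and last sample of each run of equal weights; representing the stream as (time, value) pairs is an assumption:

```python
def compact_stream(samples):
    """samples: list of (time, weight) pairs, one per frame. Interior
    duplicates within a run of equal weights carry no information under
    linear interpolation, so only each run's endpoints are kept."""
    out = []
    for i, (t, v) in enumerate(samples):
        first = i == 0 or samples[i - 1][1] != v
        last = i == len(samples) - 1 or samples[i + 1][1] != v
        if first or last:
            out.append((t, v))
    return out
```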
At run-time, the game can query the blend weights provided by the float-stream for head tracking and dynamically adjust IK influence over the head. This results in natural blending of head tracking.

Claims (18)

Claims:
1. A method of creating a computer animation of a model, comprising:
a) providing a plurality of key frames that describe the pose of a model at a plurality of times;
b) identifying a plurality of intermediate frame times in an interval between the plurality of the keyframes;
c) for each intermediate frame time, identifying a weighting value associated with the intermediate frame time; and
d) for each intermediate frame time, creating an intermediate frame by a combination of interpolation between adjacent keyframes and inverse kinematics; wherein
e) the extent to which each of interpolation and inverse kinematics determines the content of each intermediate frame is dependent upon the weighting value associated with that intermediate frame.
2. A method according to claim 1, wherein each weighting value is calculated by:
for each intermediate frame time, determining whether or not a condition is met and assigning a first or a second condition value to that frame time depending upon whether the condition is met; and calculating the weighting value of a frame time as a function of one or more of the condition values.
3. A method according to claim 2, wherein the weighting values are calculated by fitting condition values to a curve.
4. A method according to claim 2 or claim 3, wherein the weighting values are calculated by averaging condition values.
5. A method according to any of claims 2-4, wherein each condition value is either 0 or 1, and each weighting value is a real number in the range 0 to 1.
6. A method according to any of claims 2-5, wherein the condition value relates to proximity of the model to a feature in a scene.
7. A method according to claim 6, wherein the model includes a representation of a foot walking on terrain.
8. A method according to claim 7, wherein the condition value is indicative of whether or not the foot is planted on the terrain.
9. A method according to claim 8, wherein the foot is determined as being planted if greater than a predetermined number of vertices in a set of vertices of a model representing the foot are within a threshold distance from the terrain.
10. A method according to claim 9, wherein the foot is only determined as being planted if the speed of movement of the foot falls below a predetermined velocity limit.
11. A method according to any of claims 2-5, wherein the condition value is indicative of an orientation of a part of the model relative to a feature in a scene.
12. A method according to claim 11, wherein the condition value is indicative of intersection of a ray cast from a part of the model and the feature.
13. A method according to claim 12, wherein the feature is a plane in the scene.
14. A method according to claim 12 or claim 13, wherein the direction of the ray is determined by a direction of a pose of the model.
15. A method according to any of claims 12-14, wherein the model represents a head of a character.
16. Apparatus operative to carry out a method in accordance with any of the preceding claims.
17. A computer program which, when executed by a processor, causes the processor to carry out a method in accordance with any of claims 1-15.
18. A computer program product carrying instructions which, when executed by a processor, causes the processor to carry out a method in accordance with any of claims 1-15.
Amendments to the claims have been filed as follows:
Claims:
1. A method of creating a computer animation of a model, comprising:
a) providing a plurality of keyframes that describe a pose of a model at a plurality of times;
b) identifying a plurality of intermediate frame times in an interval between the plurality of the keyframes;
c) for each intermediate frame time, identifying a weighting value associated with the intermediate frame time; and
d) for each intermediate frame time, creating an intermediate frame by a combination of interpolation between adjacent keyframes and inverse kinematics; wherein
e) the extent to which each of interpolation and inverse kinematics determines the content of each intermediate frame is dependent upon the weighting value associated with the intermediate frame time of that intermediate frame.
2. A method according to claim 1, wherein each weighting value is calculated by:
for each intermediate frame time, determining whether or not a condition is met and assigning a first or a second condition value to that frame time depending upon whether the condition is met; and
calculating the weighting value of a frame time as a function of one or more of the condition values.
3. A method according to claim 2, wherein the weighting values are calculated by fitting condition values to a curve.
4. A method according to claim 2 or claim 3, wherein the weighting values are calculated by averaging condition values.
5. A method according to any of claims 2-4, wherein each condition value is either 0 or 1, and each weighting value is a real number in the range 0 to 1.
6. A method according to any of claims 2-5, wherein the condition value relates to proximity of the model to a feature in a scene.
7. A method according to claim 6, wherein the model includes a representation of a foot walking on terrain.
8. A method according to claim 7, wherein the condition value is indicative of whether or not the foot is planted on the terrain.
9. A method according to claim 8, wherein the foot is determined as being planted if greater than a predetermined number of vertices in a set of vertices of a model representing the foot are within a threshold distance from the terrain.
10. A method according to claim 9, wherein the foot is only determined as being planted if the speed of movement of the foot falls below a predetermined velocity limit.
11. A method according to any of claims 2-5, wherein the condition value is indicative of an orientation of a part of the model relative to a feature in a scene.
12. A method according to claim 11, wherein the condition value is indicative of intersection of a ray cast from a part of the model and the feature.
13. A method according to claim 12, wherein the feature is a plane in the scene.
14. A method according to claim 12 or claim 13, wherein the direction of the ray is determined by a direction of a pose of the model.
15. A method according to any of claims 12-14, wherein the model represents a head of a character.
16. Apparatus operative to carry out a method in accordance with any of the preceding claims.
17. A computer program which, when executed by a processor, causes the processor to carry out a method in accordance with any of claims 1-15.
GB1809389.8A 2018-06-07 2018-06-07 Computer Animation Method and Apparatus Active GB2574460B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1809389.8A GB2574460B (en) 2018-06-07 2018-06-07 Computer Animation Method and Apparatus

Publications (3)

Publication Number Publication Date
GB201809389D0 GB201809389D0 (en) 2018-07-25
GB2574460A true GB2574460A (en) 2019-12-11
GB2574460B GB2574460B (en) 2021-05-19

Family

ID=62975533

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111739129B (en) * 2020-06-09 2024-04-12 广联达科技股份有限公司 Keyframe adding method and device in simulated animation

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6535215B1 (en) * 1999-08-06 2003-03-18 Vcom3D, Incorporated Method for animating 3-D computer generated characters
US20050041029A1 (en) * 2003-07-21 2005-02-24 Felt Adam C. Processing image data
CN102467749A (en) * 2010-11-10 2012-05-23 上海日浦信息技术有限公司 Three-dimensional virtual human body movement generation method based on key frames and spatiotemporal restrictions
GB2523560A (en) * 2014-02-27 2015-09-02 Naturalmotion Ltd Defining an animation of a virtual object within a virtual world
