CN111508047A - Animation data processing method and device


Info

Publication number
CN111508047A
CN111508047A
Authority
CN
China
Prior art keywords
data
animation
determining
target
processed
Prior art date
Legal status
Granted
Application number
CN202010318721.2A
Other languages
Chinese (zh)
Other versions
CN111508047B (en)
Inventor
盘琪
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202010318721.2A priority Critical patent/CN111508047B/en
Publication of CN111508047A publication Critical patent/CN111508047A/en
Application granted granted Critical
Publication of CN111508047B publication Critical patent/CN111508047B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/20 3D [Three Dimensional] animation

Abstract

The embodiment of the invention provides an animation data processing method and apparatus, which acquire vertex data of each frame of an animation to be processed; determine a target subject corresponding to the animation to be processed; generate map texture coordinates corresponding to the target subject; determine motion parameters of the target subject according to the vertex data; compress the motion parameters to generate color channel data; and generate a storage map according to the color channel data and the map texture coordinates. Different animations to be processed have different target subjects, and the determined motion parameters of the target subjects differ accordingly, so different kinds of data are stored in different ways; compressing the data before storing it in the map improves the precision of data storage.

Description

Animation data processing method and device
Technical Field
The present invention relates to the field of animation technology, and in particular, to an animation data processing method and an animation data processing apparatus.
Background
Animation in three-dimensional games is generally divided into skeletal animation, particle animation, map animation, and vertex animation. The first three are widely used in games because of their low cost, small bandwidth footprint, and complete tool sets. Vertex animation, by contrast, previously required exchanging large amounts of data between the CPU and the GPU, which was too expensive at runtime, and its production cost is not low either, so it received little attention in a game industry with demanding real-time rendering requirements. In recent years, with texture support in the vertex stage, developers have proposed schemes that bake animation into vertex textures.
In the existing scheme, Houdini software is used to produce and record the motion of each vertex of a model; the original scheme supports soft animation, rigid animation, particle animation, fluid animation, and the like, and provides two recording modes: recording accurate displacement data in the EXR format; or clamping and packing vertex displacements within a bounding box and storing pixel displacement values in the range [0-1] in the TGA format. The scheme also supports exporting separate animation maps for vertex color change and for normals; for rigid body animation, the rotation quaternion of each rigid body's centroid is stored.
However, the EXR displacement-recording scheme is limited by the map formats of existing platforms, since most smartphone GPUs do not support floating-point textures; the TGA displacement-recording scheme is limited in precision, degrading so severely for animations with large displacements that it becomes unusable, and the NPOT-sized textures the scheme uses also introduce a degree of performance risk.
Disclosure of Invention
In view of the above problems, embodiments of the present invention are proposed to provide an animation data processing method and a corresponding animation data processing apparatus that overcome or at least partially solve the above problems.
In order to solve the above problem, an embodiment of the present invention discloses an animation data processing method, including:
acquiring vertex data of each frame of an animation to be processed;
determining a target subject corresponding to the animation to be processed;
generating map texture coordinates corresponding to the target subject;
determining motion parameters of the target subject according to the vertex data;
compressing the motion parameters to generate color channel data;
and generating a storage map according to the color channel data and the map texture coordinates.
Optionally, the target subject comprises any one of:
vertex, centroid, particle.
Optionally, the determining a target subject corresponding to the animation to be processed includes:
when the animation to be processed is a soft animation, determining that the target subject corresponding to the animation to be processed is a vertex; or,
when the animation to be processed is a rigid animation, determining that the target subject corresponding to the animation to be processed is a centroid; or,
when the animation to be processed is a particle animation, determining that the target subject corresponding to the animation to be processed is a particle.
Optionally, the determining the motion parameter of the target subject according to the vertex data includes:
acquiring position data of the target subject in the vertex data of each frame;
from the position data, a plurality of displacement data of the target subject in each frame relative to the first frame is determined.
Optionally, the compressing the motion parameters to generate color channel data includes:
determining a maximum value M1 of the plurality of displacement data;
normalizing the displacement data by using the maximum value M1 to generate a first numerical value;
multiplying the first numerical value by a preset numerical value to obtain a second numerical value;
and determining a first RGB channel value and a second RGB channel value according to the second numerical value.
Optionally, when the target subject is a vertex,
the determining the motion parameters of the target subject according to the vertex data further comprises:
acquiring normal data of the target subject in the vertex data of each frame;
determining, from the normal data, a plurality of normal displacement data of the target subject in each frame relative to the first frame;
the compressing the motion parameters to generate color channel data further includes:
determining a maximum value M2 of the plurality of normal displacement data;
normalizing the normal displacement data by using the maximum value M2 to generate a third numerical value;
converting the third numerical value into a parameter in a spherical coordinate system to obtain a first angle value and a second angle value;
and determining the first angle value as a first A channel value, and determining the second angle value as a second A channel value.
Optionally, when the target subject is a centroid,
the determining the motion parameters of the target subject according to the vertex data further comprises:
determining, from the position data, a plurality of quaternions of the target subject in each frame relative to the first frame;
the compressing the motion parameters to generate color channel data further includes:
and compressing the quaternions to generate RGBA channel values.
The embodiment of the invention also discloses an animation data processing device, which comprises:
the data acquisition module is used for acquiring vertex data of each frame of the animation to be processed;
the subject determining module is used for determining a target subject corresponding to the animation to be processed;
the coordinate determining module is used for generating map texture coordinates corresponding to the target subject;
the parameter determining module is used for determining the motion parameters of the target subject according to the vertex data;
the parameter compression module is used for compressing the motion parameters to generate color channel data;
and the map generating module is used for generating a storage map according to the color channel data and the map texture coordinates.
The embodiment of the invention also discloses an electronic device, which comprises:
one or more processors; and
one or more machine-readable media having instructions stored thereon, which when executed by the one or more processors, cause the electronic device to perform one or more of the method steps as described in embodiments of the invention.
Embodiments of the invention also disclose a computer-readable storage medium having instructions stored thereon, which, when executed by one or more processors, cause the processors to perform one or more of the method steps as described in embodiments of the invention.
The embodiment of the invention has the following advantages:
In the embodiment of the invention, vertex data of each frame of the animation to be processed is acquired; a target subject corresponding to the animation to be processed is determined; map texture coordinates corresponding to the target subject are generated; motion parameters of the target subject are determined according to the vertex data and compressed to generate color channel data; and a storage map is generated according to the color channel data and the map texture coordinates. Different animations to be processed have different target subjects, and the determined motion parameters of the target subjects differ accordingly, so different kinds of data are stored in different ways; compressing the data before storing it in the map improves the precision of data storage.
Drawings
FIG. 1 is a flow chart of the steps of an embodiment of a method of animation data processing of the present invention;
FIG. 2A is a diagram of a frame of a soft animation according to the present invention;
FIG. 2B is a diagram of another frame of a soft animation according to the present invention;
FIG. 3 is a schematic diagram of map texture coordinates recorded into a map according to the present invention;
FIG. 4 is a schematic diagram of a map resulting from storing motion parameters in RGB color channels in accordance with the present invention;
FIG. 5A is a diagram of a map H for storing high order data according to the present invention;
FIG. 5B is a diagram of a map L for storing low order data according to the present invention;
FIG. 6 is a schematic diagram of a map resulting from storing normal displacement data in an RGB channel according to the present invention;
FIG. 7 is a schematic diagram of a map obtained by storing θ and φ on a spherical coordinate system into the RG channel according to the present invention;
FIG. 8A is a schematic diagram of storing θ in the A channel of map H according to the present invention;
FIG. 8B is a schematic diagram of storing φ in the A channel of map L according to the present invention;
FIG. 9 is a schematic diagram of a storage map obtained by tiling maps H and L according to the present invention;
FIG. 10 is a schematic representation of a rigid body animation of the present invention;
FIG. 11 is a schematic diagram of the present invention decomposing the rigid body animation shown in FIG. 10 into small modules;
FIG. 12 is a schematic diagram of map texture coordinates, determined with the centroid as the target subject, recorded into a map according to the present invention;
FIG. 13 is a schematic diagram of storing the rotation quaternion of the centroid in the RGBA channels according to the present invention;
FIG. 14A is a diagram of a frame of a particle animation according to the present invention;
FIG. 14B is a diagram of another frame of a particle animation according to the present invention;
FIG. 15 is a schematic of a base patch of the present invention;
FIG. 16 is a schematic diagram of map texture coordinates, determined with particles as the target subjects, recorded into a map according to the present invention;
FIG. 17 is a block diagram showing the configuration of an embodiment of the animation data processing apparatus according to the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Referring to FIG. 1, a flow chart of the steps of an embodiment of an animation data processing method according to the present invention is shown; the method may specifically include the following steps:
Step 101, acquiring vertex data of each frame of the animation to be processed;
the animation to be processed may include soft animation, rigid animation, particle animation, and the like. The animation to be processed may include a plurality of frames of image frames.
In the embodiment of the invention, the vertex data of each frame of the animation to be processed can be obtained. The vertex data may include position data, normal data, texture mapping coordinates, and the like corresponding to the vertex.
Step 102, determining a target subject corresponding to the animation to be processed;
In the embodiment of the invention, target subjects correspond one-to-one with animations to be processed, and different animations to be processed have different target subjects. For example, the target subject corresponding to soft animation is the vertex, the target subject corresponding to rigid animation is the centroid, and the target subject corresponding to particle animation is the particle. FIG. 2A is a diagram of one frame of a soft animation of the present invention, and FIG. 2B is a diagram of another frame of the soft animation.
In a preferred embodiment of the present invention, the target subject may include any one of:
vertex, centroid, particle.
Step 103, generating map texture coordinates corresponding to the target subject;
In the embodiment of the present invention, a dedicated texture mapping channel may be assigned to each target subject to generate the map texture coordinates corresponding to the target subject. The map texture coordinates are used to map the motion parameters of the target subject into the storage map.
Specifically, the map texture coordinates of the target subjects are arranged from 0 to 1 along the X axis in serial-number order, with each target subject occupying one pixel, and a texture width is specified at output time. In the Y-axis direction, a spacing of one pixel between target subjects must be guaranteed; since the texture height must be adjusted dynamically according to the total number of animation frames, the frame count must also be specified.
As an example, assuming the texture width is 1024, at most 1024 target subjects fit in the horizontal direction; if the number of target subjects exceeds 1024, they are stacked in the Y-axis direction, continuing from the second row until all target subjects are arranged. Assuming 50 frames are to be output and the targets stack into 5 rows, a texture height of 5 × 50 = 250 pixels is required in total, and the step size in the Y-axis direction is 1/250, ensuring that each target subject falls on a single pixel. FIG. 3 is a schematic diagram of map texture coordinates, spaced one pixel apart in the Y-axis direction, recorded into a map according to the present invention.
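As an illustration of the layout rule above, the following Python sketch computes the one-pixel UV slot of each target subject; the function name and return convention are assumptions for illustration, not part of the recorded scheme.

import math

def layout_uvs(num_targets, num_frames, tex_width=1024):
    # targets fill the X axis up to the texture width; overflow stacks in Y
    stacks = (num_targets + tex_width - 1) // tex_width
    tex_height = stacks * num_frames            # e.g. 5 stacks x 50 frames = 250 rows
    uvs = []
    for i in range(num_targets):
        u = (i % tex_width + 0.5) / tex_width   # pixel-center U of this target
        v = (i // tex_width + 0.5) / tex_height # first-frame V; the Y step is 1/tex_height
        uvs.append((u, v))
    return uvs, tex_height

uvs, height = layout_uvs(num_targets=5120, num_frames=50)
print(height)  # 250, matching the 5 x 50 example above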
Step 104, determining the motion parameters of the target subject according to the vertex data;
In the embodiment of the invention, the motion parameters of the target subject can be determined according to the vertex data.
Specifically, the displacement parameters of the target subject may be determined according to the position data of the target subject in each frame. The normal displacement data of the target subject may also be determined according to the normal data of the target subject in each frame. If the target subject is a centroid, the rotation quaternion of the target subject may be determined according to the displacement data of the target subject in each frame.
Step 105, compressing the motion parameters to generate color channel data;
In the embodiment of the present invention, the motion parameters may be compressed, and the numerical values obtained after the compression processing may be used as color channel data.
The color channel data may include R channel data, G channel data, B channel data, and A channel data.
For example, the motion parameters may include displacement data composed of the target subject's displacement in the X-axis direction, displacement in the Y-axis direction, and displacement in the Z-axis direction; the X-axis displacement may then be designated as R channel data, the Y-axis displacement as G channel data, and the Z-axis displacement as B channel data. The motion parameters may further include normal displacement data, which may be designated as A channel data. The motion parameters may also be a rotation quaternion, whose four components may be designated as R channel data, G channel data, B channel data, and A channel data respectively. FIG. 4 is a schematic diagram of a map obtained by storing motion parameters in the RGB color channels according to the present invention.
Step 106, generating a storage map according to the color channel data and the map texture coordinates.
In an embodiment of the present invention, a storage map may be generated from the color channel data and the map texture coordinates.
In a preferred embodiment of the present invention, the step 102 may include the following sub-steps:
when the animation to be processed is a soft animation, determining that the target subject corresponding to the animation to be processed is a vertex; or, when the animation to be processed is a rigid animation, determining that the target subject is a centroid; or, when the animation to be processed is a particle animation, determining that the target subject is a particle.
In the embodiment of the invention, when the animation to be processed is a soft animation, the target subject corresponding to the animation to be processed can be determined to be a vertex; when the animation to be processed is a rigid animation, the target subject can be determined to be a centroid; when the animation to be processed is a particle animation, the target subject can be determined to be a particle.
In a preferred embodiment of the present invention, the step 104 may comprise the following sub-steps:
acquiring position data of the target subject in the vertex data of each frame; determining, according to the position data, a plurality of displacement data of the target subject in each frame relative to the first frame.
In the embodiment of the invention, the position data of the target subject in the vertex data of each frame can be acquired, and a plurality of displacement data of the target subject in each frame relative to the first frame can be determined according to the position data.
In a preferred embodiment of the present invention, the step 105 may comprise the following sub-steps:
determining a maximum value M1 of the plurality of displacement data; normalizing the displacement data using the maximum value M1 to generate a first numerical value; multiplying the first numerical value by a preset numerical value to obtain a second numerical value; and determining a first RGB channel value and a second RGB channel value according to the second numerical value.
Specifically, the displacement data of the target subject in each frame may be divided by M1 to obtain normalized displacement data. The displacement data consist of the displacement in the X-axis direction, the displacement in the Y-axis direction, and the displacement in the Z-axis direction, denoted XYZ. The maximum value of each of X, Y, and Z is determined, and the displacement data in each frame are then normalized to values between 0 and 1, i.e., the first numerical value. The first numerical value is multiplied by a preset numerical value to obtain the second numerical value, where the preset numerical value is chosen in advance to improve the precision of the displacement data, for example 65025, i.e., 255 × 255. The resulting second numerical value is divided into two parts, high-order data and low-order data, where the integer part is the high-order data and the decimal part is the low-order data; for example, if the second numerical value is 40.625, the high-order data is determined to be 40 and the low-order data to be 625.
In a specific implementation, the first RGB channel value and the second RGB channel value may be stored as high-order and low-order bits in two 8-bit maps, denoted maps H and L respectively; for example, the high-order data 40 is stored in map H and the low-order data in map L, further improving the precision of the stored data. FIG. 5A is a schematic diagram of a map H for storing high-order data according to the present invention, and FIG. 5B is a schematic diagram of a map L for storing low-order data according to the present invention.
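The split into high-order and low-order data and its inverse can be sketched as follows; this is a minimal Python reading of the scheme, assuming the preset value 65025 and 8-bit channels, with helper names chosen for illustration.

def encode_displacement(d, m1):
    first = d / m1              # normalize into [0, 1] by the maximum M1
    second = first * 65025.0    # multiply by the preset value 255 x 255
    high = int(second) // 255   # high-order value, stored in map H
    low = int(second) % 255     # low-order value, stored in map L
    return high, low

def decode_displacement(high, low, m1):
    # mirrors the playback decompression ((H x 255 + L) / 65025) x M
    return (high * 255 + low) / 65025.0 * m1

h, l = encode_displacement(2.5, 10.0)
assert abs(decode_displacement(h, l, 10.0) - 2.5) <= 10.0 / 65025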
In a preferred embodiment of the invention, when the target subject is a vertex,
the step 104 may further comprise the sub-steps of:
acquiring normal data of the target subject in the vertex data of each frame; determining, according to the normal data, a plurality of normal displacement data of the target subject in each frame relative to the first frame.
In the embodiment of the present invention, the normal data of the target subject in the vertex data of each frame may be acquired, and a plurality of normal displacement data of the target subject in each frame relative to the first frame may be determined according to the normal data.
The step 105 may further comprise the sub-steps of:
determining a maximum value M2 of the plurality of normal displacement data; normalizing the normal displacement data by using the maximum value M2 to generate a third numerical value; converting the third numerical value into a parameter in a spherical coordinate system to obtain a first angle value and a second angle value; and determining the first angle value as a first A channel value, and determining the second angle value as a second A channel value.
In the embodiment of the invention, the normal displacement data can be converted into θ and φ on a spherical coordinate system, and θ and φ can then be stored in the A channel.
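A minimal sketch of the angle conversion follows, assuming the standard spherical convention (θ the polar angle from +Z, φ the azimuth in the XY plane) and angles remapped to [0, 1] for storage in an 8-bit A channel; the patent text does not fix the convention, so this is an illustrative choice.

import math

def normal_to_angles(n):
    x, y, z = n                                    # direction of the normal displacement
    theta = math.acos(max(-1.0, min(1.0, z)))      # polar angle in [0, pi]
    phi = math.atan2(y, x) % (2.0 * math.pi)       # azimuth in [0, 2*pi)
    return theta / math.pi, phi / (2.0 * math.pi)  # remap both angles to [0, 1]

def angles_to_normal(u, v):
    theta, phi = u * math.pi, v * 2.0 * math.pi
    s = math.sin(theta)                            # spherical formula, unit radius
    return (s * math.cos(phi), s * math.sin(phi), math.cos(theta))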
The normal displacement data obtained may be recorded into the RGB channels in the storage manner described above; for example, FIG. 6 shows a schematic diagram of a map obtained by storing the normal displacement data in the RGB channels. The θ and φ on the spherical coordinate system obtained by converting the normal displacement data can be stored in the RG channels, as shown in FIG. 7, a schematic diagram of a map obtained by storing θ and φ on a spherical coordinate system into the RG channels according to the present invention. It should be noted that FIGS. 6 and 7 are only intermediate processing data and are not used for generating the final storage map.
Specifically, the high-order displacement data and θ may be used as one set of RGBA to generate the high-order map H, and the low-order displacement data and φ may be used as another set of RGBA to generate the low-order map L; maps H and L are then tiled to generate the storage map. FIG. 8A shows a schematic diagram of storing θ in the A channel of map H according to the present invention; FIG. 8B shows a schematic diagram of storing φ in the A channel of map L; and FIG. 9 shows a schematic diagram of a storage map obtained by splicing maps H and L according to the present invention.
In an embodiment of the present invention, when the target subject is a centroid, the step 104 may further include the following sub-steps:
determining, according to the position data, a plurality of quaternions of the target subject in each frame relative to the first frame; the compressing the motion parameters to generate color channel data further includes: compressing the quaternions to generate RGBA channel values.
In rigid body animation, normal displacement data do not need to be recorded, but the rotation quaternion of the centroid does; therefore the quaternions of the target subject are determined and compressed to obtain RGBA channel values. FIG. 10 is a schematic diagram of a rigid body animation according to the present invention, and FIG. 11 is a schematic diagram of the rigid body animation shown in FIG. 10 decomposed into small modules. In FIGS. 10 and 11, the black dots are the centroids corresponding to each module. FIG. 12 is a schematic diagram of map texture coordinates, determined with the centroid as the target subject, recorded into a map according to the present invention. FIG. 13 is a schematic diagram of storing the rotation quaternion of the centroid in the RGBA channels according to the present invention; in rigid animation, the rotation quaternion of the centroid is stored in the RGBA channels in place of the normal.
When the storage map of a rigid body animation is generated, the displacement data of the centroid are stored first to obtain the high-order map H and the low-order map L; the quaternion of the centroid is then stored to obtain a map N; and maps H, L, and N are spliced to obtain the storage map of the rigid body animation.
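The patent text does not detail the quaternion compression itself; the Python sketch below assumes the simplest packing, a per-component remap of a unit quaternion from [-1, 1] to [0, 1] with one component per RGBA channel, and renormalizes on decode to absorb 8-bit quantization error. The helper names are illustrative.

def pack_quaternion(q):
    # q = (x, y, z, w), unit length; one value per R, G, B, A channel of map N
    return tuple((c + 1.0) * 0.5 for c in q)

def unpack_quaternion(rgba):
    x, y, z, w = (c * 2.0 - 1.0 for c in rgba)
    n = (x * x + y * y + z * z + w * w) ** 0.5   # renormalize after quantization
    return (x / n, y / n, z / n, w / n)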
For particle animation, since it replicates a base patch a number of times equal to the number of particles, and each patch always faces the player, only the displacement data of the particles need to be stored; there is no normal data to process, nor centroid quaternions as in rigid animation.
For the particle animation, the displacement data of the particles may be recorded, and the particle animation may be generated from the displacement data of the particles. The process of recording the displacement data of the particles is the same as for soft animation and rigid animation, and is not repeated here. FIG. 14A is a schematic diagram of one frame of a particle animation according to the present invention, and FIG. 14B is a schematic diagram of another frame of the particle animation. FIG. 15 shows a schematic diagram of a base patch of the present invention. FIG. 16 shows a schematic diagram of map texture coordinates, determined with particles as the target subjects, recorded into a map according to the present invention.
During game play, for soft animation, let t be the progress of the animation (frame number / total number of frames). The texture coordinate y is shifted along the Y axis to obtain the coordinate to be sampled by the vertex in the current frame; combining it with the original coordinate x gives (x, (y + t)/2) and (x, (y + t)/2 + 0.5), from which the channel data of H and L are sampled. Applying the decompression algorithm ((H × 255 + L)/65025) and multiplying by the maximum deviation value M yields the corresponding offset, which is set as the vertex position. Sampling the alpha channels of H and L gives the spherical coordinate parameters θ and φ of the current vertex's normal at the current time; the normal in the Cartesian coordinate system is solved using the spherical formula and set as the vertex normal, which completes playback of the animation.
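The per-vertex sampling arithmetic above can be restated offline in Python for checking; sample() stands in for a texture fetch returning raw byte triples, and all names here are illustrative assumptions.

def vertex_offset(x, y, t, sample, m):
    # maps H and L are tiled vertically, so H occupies the top half of the
    # storage map and L the bottom half
    h = sample(x, (y + t) / 2.0)        # (R, G, B) bytes from map H
    l = sample(x, (y + t) / 2.0 + 0.5)  # (R, G, B) bytes from map L
    # decompression formula ((H x 255 + L) / 65025) x M, applied per axis
    return tuple((hc * 255 + lc) / 65025.0 * m for hc, lc in zip(h, l))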
For rigid animation, the playback rule is similar to that of soft animation, except that when the normal is processed, the points of the model and their normals are rotated using the quaternion, and the decompressed offset is then added, restoring the position in the animation.
For particle animation, the playback rule is similar to that of rigid animation: the normal does not need to be processed, and only one offset needs to be applied per patch, with each patch kept facing the player, to complete playback of the animation.
On the basis of the above animation data, the following algorithm is used in the vertex calculation stage to control animation playback, governing looping and speed:
float loop = (GameTime - Offset) * Speed;                  // elapsed animation time, rescaled
float played_once = step(1, loop);                         // becomes 1 once the first pass completes
float t = lerp(frac(loop), 0.9999, played_once * Freeze);  // loop progress, or hold near the last frame
the input value GameTime is the current game time, Speed is the animation Speed, Offset is a duration Offset, Freeze is the control switch, and the return value t is the current animation progress.
For single play, the parameter Offset passed in is the GameTime at which playback starts, and Freeze is 1; for looped play, Freeze is passed in as 0.
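Restating step/lerp/frac in Python gives a quick check of the control logic; this is an offline verification sketch, not shader code from the scheme.

import math

def step(edge, x): return 1.0 if x >= edge else 0.0
def lerp(a, b, s): return a + (b - a) * s
def frac(x): return x - math.floor(x)

def progress(game_time, offset, speed, freeze):
    loop = (game_time - offset) * speed
    played_once = step(1.0, loop)
    return lerp(frac(loop), 0.9999, played_once * freeze)

print(progress(1.5, 0.0, 1.0, 1.0))  # single play past the end: holds at 0.9999
print(progress(1.5, 0.0, 1.0, 0.0))  # looped play: wraps to 0.5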
In the embodiment of the invention, vertex data of each frame of the animation to be processed is acquired; a target subject corresponding to the animation to be processed is determined; map texture coordinates corresponding to the target subject are generated; motion parameters of the target subject are determined according to the vertex data and compressed to generate color channel data; and a storage map is generated according to the color channel data and the map texture coordinates. Different animations to be processed have different target subjects, and the determined motion parameters of the target subjects differ accordingly, so different kinds of data are stored in different ways; compressing the data before storing it in the map improves the precision of data storage.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 17, a block diagram of an embodiment of an animation data processing apparatus according to the present invention is shown, and may specifically include the following modules:
a data obtaining module 201, configured to obtain vertex data of each frame of the animation to be processed;
a subject determining module 202, configured to determine a target subject corresponding to the animation to be processed;
a coordinate determination module 203, configured to generate map texture coordinates corresponding to the target subject;
a parameter determining module 204, configured to determine a motion parameter of the target subject according to the vertex data;
a parameter compression module 205, configured to perform compression processing on the motion parameters to generate color channel data;
and the map generating module 206 is configured to generate a storage map according to the color channel data and the map texture coordinates.
In a preferred embodiment of the present invention, the target subject includes any one of:
vertex, centroid, particle.
In a preferred embodiment of the present invention, the subject determination module 202 includes:
a first determining submodule, configured to determine that the target subject corresponding to the animation to be processed is a vertex when the animation to be processed is a soft animation; or,
a second determining submodule, configured to determine that the target subject is a centroid when the animation to be processed is a rigid body animation; or,
a third determining submodule, configured to determine that the target subject is a particle when the animation to be processed is a particle animation.
In a preferred embodiment of the present invention, the parameter determining module 204 includes:
the data acquisition submodule is used for acquiring position data of the target subject in the vertex data of each frame;
and the data determination submodule is used for determining, according to the position data, a plurality of displacement data of the target subject in each frame relative to the first frame.
In a preferred embodiment of the present invention, the parameter compression module 205 includes:
a maximum value determination submodule for determining a maximum value M1 of the plurality of displacement data;
the normalization processing submodule is used for performing normalization processing on the displacement data by adopting the maximum value M1 to generate a first numerical value;
the numerical value generation submodule is used for multiplying the first numerical value by a preset numerical value to obtain a second numerical value;
and the channel value determining submodule is used for determining a first RGB channel value and a second RGB channel value according to the second numerical value.
In a preferred embodiment of the invention, when the target subject is a vertex,
the parameter determining module 204 further includes:
the normal data acquisition submodule is used for acquiring normal data of the target subject in the vertex data of each frame;
a normal displacement determination submodule, configured to determine, according to the normal data, a plurality of normal displacement data of the target subject in each frame relative to the first frame;
the parameter compression module 205 further includes:
a maximum determination submodule for determining a maximum M2 of the plurality of normal displacement data;
the normalization processing submodule is used for performing normalization processing on the plurality of normal displacement data by adopting the maximum value M2 to generate a third numerical value;
the angle determining submodule is used for converting the third numerical value into a parameter in a spherical coordinate system to obtain a first angle value and a second angle value;
and the A-channel value determining submodule is used for determining that the first angle value is a first A-channel value and determining that the second angle value is a second A-channel value.
In a preferred embodiment of the invention, when the target subject is a centroid,
the parameter determining module 204 further includes:
a quaternion determining submodule for determining a plurality of quaternions of the target subject in each frame relative to the first frame based on the position data;
the parameter compression module 205 further includes:
and the quaternion compression submodule is used for compressing the quaternions to generate RGBA channel values.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
An embodiment of the present invention further provides an electronic device, including:
one or more processors; and
one or more machine-readable media having instructions stored thereon, which when executed by the one or more processors, cause the electronic device to perform steps of a method as described by embodiments of the invention.
Embodiments of the present invention also provide a computer-readable storage medium having stored thereon instructions, which, when executed by one or more processors, cause the processors to perform the steps of the method according to embodiments of the present invention.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The animation data processing method and apparatus provided by the present invention have been described in detail above. Specific examples are used herein to explain the principle and implementation of the invention, and the description of the embodiments is only intended to help understand the method and its core idea. Meanwhile, for those of ordinary skill in the art, there may be variations in the specific implementation and application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. An animation data processing method, comprising:
acquiring vertex data of each frame of an animation to be processed;
determining a target subject corresponding to the animation to be processed;
generating map texture coordinates corresponding to the target subject;
determining motion parameters of the target subject according to the vertex data;
compressing the motion parameters to generate color channel data;
and generating a storage map according to the color channel data and the map texture coordinates.
2. The method of claim 1, wherein the target subject comprises any one of:
vertex, centroid, particle.
3. The method of claim 2, wherein the determining a target subject corresponding to the animation to be processed comprises:
when the animation to be processed is a soft animation, determining that the target subject corresponding to the animation to be processed is a vertex; or,
when the animation to be processed is a rigid animation, determining that the target subject corresponding to the animation to be processed is a centroid; or,
when the animation to be processed is a particle animation, determining that the target subject corresponding to the animation to be processed is a particle.
4. The method of claim 1, wherein said determining motion parameters of said target subject from said vertex data comprises:
acquiring position data of the target subject in the vertex data of each frame;
from the position data, a plurality of displacement data of the target subject in each frame relative to the first frame is determined.
5. The method according to claim 4, wherein the compressing the motion parameters to generate color channel data comprises:
determining a maximum value M1 of the plurality of displacement data;
normalizing the displacement data by using the maximum value M1 to generate a first numerical value;
multiplying the first numerical value by a preset numerical value to obtain a second numerical value;
and determining a first RGB channel value and a second RGB channel value according to the second numerical value.
6. The method of claim 2, wherein, when the target subject is a vertex,
the determining the motion parameters of the target subject according to the vertex data further comprises:
acquiring normal data of the target subject in the vertex data of each frame;
determining, from the normal data, a plurality of normal displacement data of the target subject in each frame relative to the first frame;
the compressing the motion parameters to generate color channel data further includes:
determining a maximum value M2 of the plurality of normal displacement data;
normalizing the normal displacement data by using the maximum value M2 to generate a third numerical value;
converting the third numerical value into a parameter in a spherical coordinate system to obtain a first angle value and a second angle value;
and determining the first angle value as a first A channel value, and determining the second angle value as a second A channel value.
7. The method of claim 2, wherein, when the target subject is a centroid,
the determining the motion parameters of the target subject according to the vertex data further comprises:
determining, from the position data, a plurality of quaternions of the target subject in each frame relative to the first frame;
the compressing the motion parameters to generate color channel data further includes:
and compressing the quaternions to generate RGBA channel values.
8. An animation data processing apparatus, characterized in that the apparatus comprises:
the data acquisition module is used for acquiring vertex data of each frame of the animation to be processed;
the subject determining module is used for determining a target subject corresponding to the animation to be processed;
the coordinate determination module is used for generating map texture coordinates corresponding to the target subject;
the parameter determining module is used for determining the motion parameters of the target subject according to the vertex data;
the parameter compression module is used for compressing the motion parameters to generate color channel data;
and the map generating module is used for generating a storage map according to the color channel data and the map texture coordinates.
9. An electronic device, comprising:
one or more processors; and
one or more machine readable media having instructions stored thereon, which when executed by the one or more processors, cause the electronic device to perform the steps of the method of one or more of claims 1-7.
10. A computer-readable storage medium having stored thereon instructions, which, when executed by one or more processors, cause the processors to perform the steps of the method of one or more of claims 1-7.
CN202010318721.2A 2020-04-21 2020-04-21 Animation data processing method and device Active CN111508047B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010318721.2A CN111508047B (en) 2020-04-21 2020-04-21 Animation data processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010318721.2A CN111508047B (en) 2020-04-21 2020-04-21 Animation data processing method and device

Publications (2)

Publication Number Publication Date
CN111508047A (en) 2020-08-07
CN111508047B (en) 2023-08-22

Family

ID=71877716

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010318721.2A Active CN111508047B (en) 2020-04-21 2020-04-21 Animation data processing method and device

Country Status (1)

Country Link
CN (1) CN111508047B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6072496A (en) * 1998-06-08 2000-06-06 Microsoft Corporation Method and system for capturing and representing 3D geometry, color and shading of facial expressions and other animated objects
US6690376B1 (en) * 1999-09-29 2004-02-10 Sega Enterprises, Ltd. Storage medium for storing animation data, image processing method using same, and storage medium storing image processing programs
JP2010033295A (en) * 2008-07-28 2010-02-12 Namco Bandai Games Inc Image generation system, program and information storage medium
CN109045691A (en) * 2018-07-10 2018-12-21 网易(杭州)网络有限公司 A kind of the special efficacy implementation method and device of special efficacy object
CN109598777A (en) * 2018-12-07 2019-04-09 腾讯科技(深圳)有限公司 Image rendering method, device, equipment and storage medium


Also Published As

Publication number Publication date
CN111508047B (en) 2023-08-22

Similar Documents

Publication Publication Date Title
CN109215123B (en) Method, system, storage medium and terminal for generating infinite terrain based on cGAN
US20040176908A1 (en) Map displaying apparatus
US7920143B1 (en) Method for defining animation parameters for an animation definition interface
CN106558017B (en) Spherical display image processing method and system
US11189096B2 (en) Apparatus, system and method for data generation
CN107146274A (en) Image data processing system, texture mapping compression and the method for producing panoramic video
US7274367B2 (en) Method for defining animation parameters for an animation definition interface
CN111476877A (en) Shadow rendering method and device, electronic equipment and storage medium
CN107203962B (en) Method for making pseudo-3D image by using 2D picture and electronic equipment
US6774897B2 (en) Apparatus and method for drawing three dimensional graphics by converting two dimensional polygon data to three dimensional polygon data
CN114092611A (en) Virtual expression driving method and device, electronic equipment and storage medium
CN111724313B (en) Shadow map generation method and device
CN111508047A (en) Animation data processing method and device
KR102065632B1 (en) Device and method for acquiring 360 VR images in a game using a plurality of virtual cameras
CN116977532A (en) Cube texture generation method, apparatus, device, storage medium, and program product
CN1902661A (en) Method of rendering graphical objects
JP5007633B2 (en) Image processing program, computer-readable recording medium storing the program, image processing apparatus, and image processing method
US20080012864A1 (en) Image Processing Apparatus and Method, and Program
US20100207940A1 (en) Image display method and image display apparatus
EP0473152B1 (en) Topographical data construction system
CN111506680B (en) Terrain data generation and rendering method and device, medium, server and terminal
CN113506220B (en) Face gesture editing method and system driven by 3D vertex and electronic equipment
JP2005092782A (en) Method, apparatus and computer program for generating three-dimensional model,
CN111354064B (en) Texture image generation method and device
CN117654031A (en) Sky sphere model generation method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant