CN112102452A - Animation model processing method and device, electronic equipment and storage medium - Google Patents

Animation model processing method and device, electronic equipment and storage medium

Info

Publication number
CN112102452A
Authority
CN
China
Prior art keywords
bone
animation model
simplification
skeleton
original
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011035179.6A
Other languages
Chinese (zh)
Other versions
CN112102452B (en)
Inventor
马浩然
刘振涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Perfect World Beijing Software Technology Development Co Ltd
Original Assignee
Perfect World Beijing Software Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Perfect World Beijing Software Technology Development Co Ltd filed Critical Perfect World Beijing Software Technology Development Co Ltd
Priority to CN202011035179.6A priority Critical patent/CN112102452B/en
Publication of CN112102452A publication Critical patent/CN112102452A/en
Application granted granted Critical
Publication of CN112102452B publication Critical patent/CN112102452B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/20 3D [Three Dimensional] animation
    • G06T 13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application relates to an animation model processing method, an animation model processing device, electronic equipment and a storage medium, wherein the method comprises the following steps: acquiring a first original animation model; according to a simplification strategy, executing a simplification operation on the first original animation model to obtain a first simplified animation model; and rendering the first simplified animation model. According to the technical scheme, after face pinching/person pinching is completed, the animation model is simplified, so that the amount of bone data in the animation model is reduced, the amount of calculation during rendering is reduced, the calculation speed is increased, and the game fluency is optimized.

Description

Animation model processing method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to an animation model processing method and apparatus, an electronic device, and a storage medium.
Background
When a user pinches a face or pinches a person by modifying model bones, if all parameters are fully exposed, the number of face-pinching bones can reach 45 and person pinching adds about another 50 bones, and a certain amount of memory is occupied to record and store the modified parameters of each model bone. In the game engine, the complete skeleton participates in the calculation of the animation model, which involves a large amount of computation. Moreover, if there are many animation models in a scene, the calculation becomes slow, and phenomena such as display stuttering and frame dropping easily occur.
Disclosure of Invention
In order to solve the technical problem or at least partially solve the technical problem, embodiments of the present application provide an animation model processing method, apparatus, electronic device, and storage medium.
According to an aspect of an embodiment of the present application, there is provided an animation model processing method, including:
acquiring a first original animation model;
according to a simplification strategy, executing simplification operation on the first original animation model to obtain a first simplified animation model;
rendering the first simplified animation model.
Optionally, the performing, according to a preset simplification policy, a simplification operation on the first original animation model includes:
acquiring bone data corresponding to the first original animation model;
assigning the vertex information corresponding to the bone data to the vertex of the original animation model;
and deleting the modified bone data corresponding to the simplified strategy from the bone data.
Optionally, the performing a simplification operation on the first original animation model according to a simplification policy further includes:
determining the simplification level of the first original animation model according to the adjustment operation of the simplification control;
determining the adjusted bone data corresponding to the reduced level.
Optionally, the method further includes:
storing the first original animation model in a server;
when a first adjustment operation on the first simplified animation model is received, acquiring the first original animation model from the server;
executing the first adjustment operation on the first original animation model to obtain a second original animation model;
according to the simplification strategy, executing simplification operation on the second original animation model to obtain a second simplified animation model;
rendering the second simplified animation model;
storing the second original animation model in the server.
Optionally, the method further includes:
acquiring performance parameters of equipment for executing rendering;
and determining the simplification strategy according to the performance parameters.
Optionally, the performing the first adjustment operation on the first original animation model includes:
determining a current skeleton corresponding to the first adjustment operation and an adjusted skeleton corresponding to the current skeleton;
acquiring a first weight corresponding to the current skeleton and a second weight corresponding to the adjusted skeleton;
determining a second adjustment parameter corresponding to the adjusted bone according to a first adjustment parameter corresponding to the first adjustment operation, the first weight and the second weight;
and executing a second adjustment operation on the adjusted bone according to the second adjustment parameter.
Optionally, the determining an adjusted bone corresponding to the current bone includes:
acquiring a control state of the associated adjusting control;
when the control state is an open state, determining that the adjusted bone comprises the current bone and an associated bone with the current bone;
when the control state is the closing state, determining that the adjusted bone is the current bone.
Optionally, when the second adjustment operation is a zoom operation, the performing a second adjustment operation on the adjusted bone according to the second adjustment parameter includes:
determining a scaling skeleton corresponding to the adjusting skeleton, wherein the scaling skeleton is bound with an animation model and inherits the coordinates of the adjusting skeleton;
reversely obtaining a second scaling factor of the scaling bone in the first coordinate axis according to a first scaling factor of the adjusting bone in the first coordinate axis, wherein the first coordinate axis comprises one coordinate axis or two coordinate axes of the adjusting bone, and the scaling operation of the adjusting bone and the scaling bone in the first coordinate axis is opposite;
obtaining an auxiliary adjusting parameter of the scaling bone according to the first coordinate axis and the second scaling multiple;
performing an adjustment of the zoom bone according to the auxiliary adjustment parameter.
According to another aspect of an embodiment of the present application, there is provided an animation model processing apparatus including:
the acquisition module is used for acquiring a first original animation model;
the simplification module is used for executing simplification operation on the first original animation model according to a simplification strategy to obtain a first simplified animation model;
and the rendering module is used for rendering the first simplified animation model.
According to another aspect of the embodiments of the present application, there is also provided a storage medium including a stored program that executes the above steps when the program is executed.
According to another aspect of an embodiment of the present application, there is provided an electronic device including: the system comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory complete mutual communication through the communication bus;
the memory is used for storing a computer program;
the processor is configured to implement the above method steps when executing the computer program.
According to another aspect of embodiments of the present application, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the above-mentioned method steps.
Compared with the prior art, the technical scheme provided by the embodiment of the application has the following advantages:
after face pinching/person pinching is completed, simplification operation is performed on the animation model, the bone data volume in the animation model is reduced, the calculated amount is reduced in the rendering process, the calculating speed is increased, and the game fluency is optimized.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments or technical solutions in the prior art of the present invention, the drawings used in the description of the embodiments or prior art will be briefly described below, and it is obvious for those skilled in the art that other drawings can be obtained based on these drawings without creative efforts.
FIG. 1 is a flowchart of a method for processing an animation model according to an embodiment of the present disclosure;
FIG. 2 is a flowchart of a method for processing an animation model according to another embodiment of the present application;
FIG. 3a is a schematic diagram of a simplified control provided in an embodiment of the present application when closed;
FIG. 3b is a diagram illustrating a simplified control according to an embodiment of the present disclosure when the simplified control is turned on;
FIG. 3c is a schematic diagram of an associated adjustment control provided in accordance with another embodiment of the present application;
FIG. 4 is a flowchart of a method for processing an animation model according to another embodiment of the present application;
FIG. 5 is a flowchart of a method for processing an animation model according to another embodiment of the present application;
FIG. 6 is a flowchart of a method for processing an animation model according to another embodiment of the present application;
FIG. 7 is a skeletal level schematic representation provided in accordance with an embodiment of the present application;
FIG. 8 is a flowchart of a method for processing an animation model according to another embodiment of the present application;
FIG. 9 is a block diagram of an animation model processing apparatus according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
According to the embodiments of the application, after face pinching/person pinching is completed, the animation model is simplified: the vertex information corresponding to the bone data is assigned to the vertices of the original animation model, and the configured face-pinching/person-pinching bones, including the modification parameters corresponding to those bones, are deleted. The animation model obtained through rendering is therefore the model after face pinching/person pinching, while the bone amount of the animation model is reduced, the amount of calculation during rendering is reduced, the calculation speed is increased, and the game fluency is optimized.
First, an animation model processing method provided by an embodiment of the present invention is described below.
Fig. 1 is a flowchart of an animation model processing method according to an embodiment of the present application. As shown in fig. 1, the method comprises the steps of:
step S11, acquiring a first original animation model;
step S12, according to the simplification strategy, executing the simplification operation of the first original animation model to obtain a first simplified animation model;
in step S13, the first simplified animation model is rendered.
Optionally, the simplification strategy may include presetting the skeleton to be deleted, and may also include a simplification level for the animation model.
Through the steps S11 to S13, after the face/person pinching is completed, the animation model is simplified, the bone data amount in the animation model is reduced, the calculated amount in the rendering process is reduced, the calculating speed is increased, and the game fluency is optimized.
Fig. 2 is a flowchart of an animation model processing method according to another embodiment of the present application. As shown in fig. 2, the step S12 includes the following steps:
step S21, obtaining bone data corresponding to the first original animation model;
step S22, assigning vertex information corresponding to the bone data to the vertex of the original animation model;
in step S23, modified skeleton data corresponding to the reduction policy is deleted from the skeleton data.
Through the steps S21 to S23, after the face-pinching/person-pinching is completed, the animation model is simplified, the vertex information corresponding to the skeleton data is assigned to the vertex of the original animation model, and the set face-pinching/person-pinching skeleton, including the modification parameters corresponding to the skeleton, is deleted, so that the animation model obtained through rendering is the model after the face-pinching/person-pinching, the skeleton amount of the animation model is reduced, the calculation amount is reduced in the rendering process, the calculation speed is increased, and the game fluency is optimized.
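As a concrete illustration of steps S21 to S23, the following is a minimal sketch assuming a simple in-memory representation of the model; the AnimationModel and Bone classes, the flattened skinning matrices and the linear-blend bake are illustrative assumptions, since the patent does not prescribe a data layout or a skinning method.

```python
# Hypothetical sketch of the simplification operation (steps S21-S23).
from dataclasses import dataclass, field

@dataclass
class Bone:
    name: str
    # Flattened 4x4 skinning matrix (row-major); identity if the bone was never modified.
    matrix: list = field(default_factory=lambda: [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1])

@dataclass
class AnimationModel:
    vertices: list   # [(x, y, z), ...]
    bones: dict      # bone name -> Bone
    weights: dict    # vertex index -> {bone name: weight}

def transform(matrix, v):
    """Apply a flattened 4x4 matrix to a 3D point."""
    x, y, z = v
    return tuple(matrix[4 * r] * x + matrix[4 * r + 1] * y +
                 matrix[4 * r + 2] * z + matrix[4 * r + 3] for r in range(3))

def simplify(model, bones_to_delete):
    """Bake the pinch-bone pose into the vertices (step S22), then delete the
    modified bone data named by the simplification strategy (step S23)."""
    baked = []
    for i, v in enumerate(model.vertices):
        influences = model.weights.get(i, {})
        if not influences:
            baked.append(v)            # unskinned vertex: keep as-is
            continue
        acc = (0.0, 0.0, 0.0)
        # Linear-blend bake; assumes the per-vertex weights sum to 1.
        for bone_name, w in influences.items():
            p = transform(model.bones[bone_name].matrix, v)
            acc = tuple(a + w * b for a, b in zip(acc, p))
        baked.append(acc)
    model.vertices = baked
    for name in bones_to_delete:
        model.bones.pop(name, None)
        for infl in model.weights.values():
            infl.pop(name, None)
    return model
```

Under these assumptions, the deleted face-pinching/person-pinching bones no longer participate in any subsequent animation calculation, which is what reduces the computational load during rendering.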
In this embodiment, a simplified control may be provided, and the control may be in the form of a slide bar, a knob, or the like. By simplifying the controls, the simplification operations can be turned on or off, and the level of simplification can also be adjusted.
The step S12 further includes: determining the simplification level of the first original animation model according to the adjustment operation of the simplification control; and determining the adjusted bone data corresponding to the simplification level.
Fig. 3a and fig. 3b are schematic diagrams of a simplified control provided in an embodiment of the present application when the simplified control is closed and opened, respectively. The simplified control may be a knob, as shown in fig. 3a, which is in an off state when the knob 31 is upright. As shown in fig. 3b, when the knob 31 is horizontal, the simplified control is in the on state.
Fig. 3c is a schematic diagram of an associated adjustment control according to another embodiment of the present application. As shown in fig. 3c, the simplified control is provided with at least one gear 32. When the knob 31 is rotated to a gear 32, the animation model is simplified according to the simplification level corresponding to that gear 32.
Optionally, the simplified control may be provided with a plurality of gears: the simplification level corresponding to gear 1 is 1, and 30% of the bone data is deleted; the simplification level corresponding to gear 2 is 2, and 50% of the bone data is deleted; the simplification level corresponding to gear 3 is 3, and 70% of the bone data is deleted.
In an alternative embodiment, the simplification strategy is automatically adjusted according to the processor. The method further comprises the following steps: acquiring performance parameters of the device that performs rendering; and determining the simplification strategy according to the performance parameters.
Wherein the apparatus to perform rendering may include: graphics Processing Unit (GPU), Central Processing Unit (CPU), and so on.
Wherein the performance parameters of the GPU include at least one of: the number of cores, the core frequency, the video memory bit width, the video memory frequency, the video memory size and the like.
The performance parameters of the CPU include at least one of: frequency, cache capacity, operating voltage, bus style, manufacturing process, superscalar, etc.
According to the performance parameters of the equipment, the performance grade or performance score corresponding to the equipment can be calculated, and the simplification strategy can be determined according to the performance grade or performance score. When the device performing the rendering includes at least two, an average of the performance levels or performance scores of the at least two devices may be calculated, and the simplification policy may be determined based on the average.
Based on performance parameters of a GPU or a CPU and the like, whether a simplified control needs to be started or not can be determined, and if the simplified control is started, the simplification level of the animation model, such as the percentage of deleted skeleton data, can be determined according to the performance parameters.
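As an illustration only, the following sketch maps a hypothetical 0-100 device performance score, averaged over the rendering devices, to a simplification strategy; the score scale and the thresholds are assumptions, since the text only states that a performance grade or score is calculated and the strategy is determined from it.

```python
def performance_based_policy(device_scores):
    """Average the performance scores of the rendering devices (e.g. GPU and CPU)
    and map the average to a simplification strategy. Score scale and thresholds
    are illustrative assumptions."""
    avg = sum(device_scores) / len(device_scores)
    if avg >= 80:                                # high-end device: no simplification
        return {"enabled": False, "level": 0}
    if avg >= 60:
        return {"enabled": True, "level": 1}     # e.g. delete 30% of bone data
    if avg >= 40:
        return {"enabled": True, "level": 2}     # e.g. delete 50% of bone data
    return {"enabled": True, "level": 3}         # e.g. delete 70% of bone data
```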
In an alternative embodiment, whether the simplified control needs to be opened or not can be determined according to the number of the animation model needing to be rendered in the same screen or the number of bones (or the amount of bone data) needing to be rendered, and the simplification level of the animation model after the simplified control is opened. For example, when the number of animation models needing to be rendered under the same screen is 1-3, the simplified control is not started, and when the number of animation models under the same screen is 4 or more, the simplified control is started. When the number of animation models under the same screen is 4-6, a simplification strategy of a simplification level 1 is adopted, and 30% of skeleton data of each animation model is deleted; when the number of animation models under the same screen is 7-9, a simplification strategy of a simplification level 2 is adopted, and 50% of skeleton data of each animation model is deleted; when the number of the animation models under the same screen is 10 or more, 70% of the skeletal data of each animation model is deleted by adopting the simplification strategy of the simplification level 3.
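The same-screen thresholds in the example above can be expressed as a small helper; the function name and the returned (enabled, level, deletion ratio) format are illustrative.

```python
def choose_simplification_level(models_on_screen: int):
    """Return (enabled, level, deletion_ratio) following the example thresholds:
    1-3 models: simplification off; 4-6: level 1 (delete 30% of bone data);
    7-9: level 2 (delete 50%); 10 or more: level 3 (delete 70%)."""
    if models_on_screen <= 3:
        return (False, 0, 0.0)
    if models_on_screen <= 6:
        return (True, 1, 0.30)
    if models_on_screen <= 9:
        return (True, 2, 0.50)
    return (True, 3, 0.70)
```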
In an alternative embodiment, whether the simplification strategy is turned on may also be controlled by the user, and the user may also select the simplification level as desired.
In this embodiment, after the simplification operation is performed on the animation model, the original animation model is not deleted, but the simplified animation model is rendered, and the original animation model is stored in the server, so that the animation model can be modified again in the following process.
FIG. 4 is a flowchart of a method for processing an animation model according to another embodiment of the present application. As shown in fig. 4, the method comprises the steps of:
step S31, storing the first original animation model in the server;
step S32, when receiving the first adjustment operation of the first simplified animation model, obtaining a first original animation model from the server;
step S33, executing a first adjustment operation on the first original animation model to obtain a second original animation model;
step S34, according to the simplification strategy, executing the simplification operation of the second original animation model to obtain a second simplified animation model;
step S35, rendering the second simplified animation model;
in step S36, the second original animated model is stored in the server.
Alternatively, the simplification policy in the above step S34 may be the same as or different from the simplification policy in the above step S12.
Through the steps S31 to S36, when an adjustment operation on the first simplified animation model is received, the first original animation model, which has not been simplified, is obtained from the server; the adjustment operation and then the simplification operation are performed on the first original animation model, the resulting second simplified animation model is rendered, and the second original animation model, obtained by adjusting the first original animation model, is again stored on the server.
In this embodiment, when the animation model is adjusted, since a certain association relationship exists between bones, when the current bone is adjusted, other bones related to the current bone can be adjusted.
FIG. 5 is a flowchart of a method for processing an animation model according to another embodiment of the present application. As shown in fig. 5, the step S33 includes:
step S41, determining a current skeleton corresponding to the first adjustment operation and an adjusted skeleton corresponding to the current skeleton;
step S42, acquiring a first weight corresponding to the current bone and a second weight corresponding to the adjusted bone;
step S43, determining a second adjustment parameter corresponding to the adjusted bone according to the first adjustment parameter, the first weight and the second weight corresponding to the first adjustment operation;
and step S44, executing a second adjustment operation on the adjusted bone according to the second adjustment parameter.
In this embodiment, when a user adjusts one of the bones, the adjustment bone corresponding to the bone also performs a corresponding adjustment operation, so as to achieve synchronous adjustment of the bones and the related bones thereof, and quickly and accurately adjust the animation model according to the user's needs. Meanwhile, the user does not need to manually adjust each skeleton one by one, and other related skeletons can be synchronously adjusted only by adjusting one skeleton, so that the complexity of skeleton adjustment operation is reduced.
For mutually associated bones, each bone has a corresponding weight, and the weight represents the corresponding adjustment relationship of each bone during the adjustment process. For example, each bone is divided into 31 gears from -15 to 15 according to its original bone data; if the weight of the current bone A is 0.6 and the weight of the associated bone B is 1.2, then each time the current bone A adjusts by 0.6 gear, the associated bone B adjusts by 1.2 gears. If the current bone A is adjusted 3 gears from the initial position, the associated bone B is adjusted 3 × 1.2 ÷ 0.6 = 6 gears.
The weight may be an adjustment relationship between actual bone data, including a relationship between adjustment parameters such as a rotation angle and a displacement distance, that is, the adjustment parameter is directly calculated according to the bone data of the current bone, and the adjustment parameter corresponding to the associated bone is calculated according to the weight of each bone, which is not described herein again.
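The weighted follow-up adjustment described above can be sketched as follows, using the 31-gear example (-15 to 15); the helper names are illustrative.

```python
def associated_adjustment(delta_gears: float, current_weight: float,
                          associated_weight: float) -> float:
    """Given how many gears the current bone moved, return how many gears the
    associated bone should move: delta * associated_weight / current_weight.
    Example from the text: current bone A (weight 0.6) moves 3 gears, so the
    associated bone B (weight 1.2) moves 3 * 1.2 / 0.6 = 6 gears."""
    return delta_gears * associated_weight / current_weight

def clamp_gear(gear: float, lo: int = -15, hi: int = 15) -> float:
    """Keep the result inside the 31-gear range (-15 to 15) used in the example."""
    return max(lo, min(hi, gear))

# Usage: bone B follows bone A's 3-gear adjustment.
assert clamp_gear(associated_adjustment(3, 0.6, 1.2)) == 6
```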
In an alternative embodiment, the range of adjusted facial bones corresponding to the current facial bone may be determined according to whether the associated adjustment control is turned on or off. The above step S41 includes the following steps: acquiring a control state of the associated adjustment control; when the control state is an open state, determining that the adjusted bones comprise the current bone and the associated bones of the current bone; and when the control state is a closed state, determining that the adjusted bone is the current bone.
For example, for the bones corresponding to an eye, when the associated adjustment control is closed and the user adjusts the "right eye socket" bone, the other bones "right inner canthus," "right outer canthus," "left eye socket," "left inner canthus," and "left outer canthus" are not adjusted along with it. Only when the associated adjustment control is on will the associated bones of the "right eye socket" be adjusted synchronously.
Optionally, when the control state is the open state, the control state further includes an association range level; the step S41 further includes: and determining the related bone of the current bone according to the related range level.
For example, for the bones corresponding to the eye, when the association range level is one level, the user adjusts the bones of the "right eye socket", only the "right inner corner of the eye" and the "right outer corner of the eye" bones follow the adjustment, while the bones of the "left eye socket", "left inner corner of the eye" and "left outer corner of the eye" remain unchanged.
Optionally, a plurality of gears may also be set on the associated adjustment control, and the associated bone ranges corresponding to different gears are different. For example, for the bones of the eye, the associated bones corresponding to gear 1 are all the bones of the eye; the associated bones corresponding to gear 2 include the nose bones in addition to all the bones of the eye; and the associated bones corresponding to gear 3 include the mouth bones in addition to all the bones of the eye and the nose bones.
In the above embodiment, each gear of the associated adjustment control is associated with a tree-shaped bone data selection range. When the associated adjustment control is turned on, all the child bones of the current bone are read; when the associated adjustment control is turned off, only the current bone is read; when the associated adjustment control is at a certain gear, the child bones corresponding to the range of that gear are read. In the process of adjusting a bone, the parent bone of the current bone is generally not affected, but if required the parent bone can be controlled to make a corresponding follow-up adjustment when a specific bone is adjusted.
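A sketch of how the set of bones that follow an adjustment could be selected from the control state and gear described above; the tree traversal, the children attribute and the gear-to-range predicates are illustrative assumptions.

```python
def select_adjusted_bones(current, control_open, gear=None, gear_ranges=None):
    """Return the bones that should follow the adjustment of `current`:
    - control closed: only the current bone;
    - control open, no gear: the current bone and all of its child bones;
    - control open with a gear: the child bones inside the range configured for
      that gear (gear_ranges maps gear -> predicate on a bone).
    The parent bone is deliberately not included, matching the text above."""
    if not control_open:
        return [current]
    selected = [current]
    stack = list(current.children)           # assumed: each bone exposes .children
    while stack:
        bone = stack.pop()
        if gear is None or gear_ranges is None or gear_ranges[gear](bone):
            selected.append(bone)
            stack.extend(bone.children)      # descend only into bones inside the range
    return selected
```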
In an alternative embodiment, when adjusting multiple associated bones, each bone needs a uniform initial state, i.e. multiple bones are adjusted from the same initial state. The step S44 includes: acquiring an intermediate gear and a parameter adjusting range corresponding to the associated skeleton, wherein the intermediate gear of the current skeleton is the same as that of the associated skeleton; and determining a second adjusting parameter corresponding to the second adjusting gear according to the intermediate gear and the parameter adjusting range.
When a plurality of associated bones are adjusted, the initial state of each bone, namely the intermediate state of each bone, needs to be synchronized, so that each associated bone can be uniformly adjusted subsequently. FIG. 6 is a flowchart of a method for processing an animation model according to another embodiment of the present application. As shown in fig. 6, the method further includes the step of determining the intermediate gear of each bone, which is as follows:
step S51, acquiring a first parameter adjusting range of the current skeleton, a second parameter adjusting range of the associated skeleton and the gear number of the first adjusting component and the second adjusting component;
step S52, determining a first original gear corresponding to the current bone original bone data according to the first parameter adjusting range and the gear number, and determining a second original gear corresponding to the associated bone original bone data according to the second parameter adjusting range and the gear number;
in step S53, an intermediate gear is calculated according to the first original gear, the second original gear, the first weight and the second weight.
The process of determining the intermediate gear will be described in detail below by way of a specific example.
Each bone is divided into 31 gears of-15 to 15 according to original bone data.
Consider three associated bones A, B and C, whose original gears are calculated from their original bone data to be -10, 7 and 1, respectively.
If the weights corresponding to bones A, B and C are the same, the intermediate gear is the average of the three original gears, i.e. (-10 + 7 + 1) / 3 ≈ -0.67.
If the weights corresponding to bones A, B and C are 0.6, 1.2 and 0.8 respectively, the intermediate gear is the correspondingly weighted combination of the original gears (the exact formula appears only as an image in the original publication).
In this way, bones A, B and C are adjusted with the same intermediate gear as the starting point; although this initial state differs slightly from the original effect, the error is relatively small, which facilitates the subsequent associated adjustment of multiple bones.
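A sketch of the intermediate-gear computation for the example above; because the formula is given only as an image in the original publication, the plain and weighted averages used here are assumptions consistent with the surrounding text.

```python
def intermediate_gear(original_gears, weights=None):
    """Return a common starting gear for associated bones. With equal weights this
    is the plain average of the original gears; with per-bone weights it is assumed
    to be the weighted average."""
    if weights is None:
        return sum(original_gears) / len(original_gears)
    return sum(g * w for g, w in zip(original_gears, weights)) / sum(weights)

# Example from the text: bones A, B, C with original gears -10, 7, 1.
print(intermediate_gear([-10, 7, 1]))                    # equal weights
print(intermediate_gear([-10, 7, 1], [0.6, 1.2, 0.8]))   # weights 0.6, 1.2, 0.8
```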
In this embodiment, in order to adjust the skeleton more freely, so that the user can pinch exaggerated and unusual face shapes and give full play to this function, a bone is allowed to be scaled along at least a single axis when it is scaled during face pinching.
In this example, a scaled bone xxx_Adjust is implanted for each bone (where xxx is the name of the bone). Fig. 7 is a schematic bone hierarchy diagram provided in an embodiment of the present application. As shown in fig. 7, each bone has a corresponding set of child bones: this set includes the scaled bone corresponding to the bone and, if present, its sub-bones; each sub-bone in turn has its own corresponding scaled bone and sub-bones, and so on.
For example, bones A and B and the Head bone Head are all children of the Root bone Root. The child bones of bone A comprise the scaled bone A_Adjust of bone A and the sub-bone A_1 of bone A; the sub-bone A_1 also has its corresponding scaled bone A_1_Adjust.
The child bones of bone B comprise the scaled bone B_Adjust of bone B and the sub-bone B_1 of bone B; the child bones of the sub-bone B_1 comprise the scaled bone B_1_Adjust and the sub-bone B_2, and the sub-bone B_2 also has its corresponding scaled bone B_2_Adjust.
The child bones of the Head bone Head include the head adjustment bone Head_Adjust, while the facial bones C_1, C_2 and C_3 serve as child bones of the head adjustment bone Head_Adjust. The facial bones include the bones of the eyes, nose, mouth and the like.
Wherein, the scaling skeleton inherits the coordinates of the corresponding skeleton and is bound with the animation model.
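To make the hierarchy of fig. 7 concrete, here is a small sketch of the bone tree with the implanted scaled bones; the SkelNode class and the add_adjust helper are illustrative assumptions rather than the patent's own data structure.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SkelNode:
    name: str
    adjust: Optional["SkelNode"] = None              # the implanted scaled bone xxx_Adjust
    children: List["SkelNode"] = field(default_factory=list)

def add_adjust(bone: SkelNode) -> SkelNode:
    """Implant the scaled bone <name>_Adjust; per the text above it inherits the
    bone's coordinates and is bound with the animation model."""
    bone.adjust = SkelNode(name=bone.name + "_Adjust")
    return bone

# The hierarchy of fig. 7, expressed with the assumed SkelNode structure.
root = SkelNode("Root", children=[
    add_adjust(SkelNode("A", children=[add_adjust(SkelNode("A_1"))])),
    add_adjust(SkelNode("B", children=[
        add_adjust(SkelNode("B_1", children=[add_adjust(SkelNode("B_2"))])),
    ])),
    SkelNode("Head", adjust=SkelNode("Head_Adjust",
             children=[SkelNode("C_1"), SkelNode("C_2"), SkelNode("C_3")])),
])
```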
Optionally, the scaled bone is a Bone-type bone that is aligned with the CS bone of the corresponding bone.
The animation model has a skeleton structure formed by mutually connected animation bones, and animation is generated for the model by changing the orientation and position of the bones.
A CS bone comes from Character Studio, an important plug-in module of 3DS MAX used to simulate the motions of humans and bipedal animals.
The scaled bone is aligned with the CS bone, that is, the scaled bone is scaled to match the size of the CS bone and moved to the same location as the CS bone.
FIG. 8 is a flowchart of a method for processing an animation model according to another embodiment of the present application. As shown in fig. 8, when the second adjustment operation is a zoom operation, the step S44 includes the following steps:
step S61, determining a scaling skeleton corresponding to the adjustment skeleton, binding the scaling skeleton and the animation model and inheriting the coordinates of the adjustment skeleton;
step S62, reversely obtaining a second scaling factor of the scaled bone in the first coordinate axis according to a first scaling factor of the adjusted bone in the first coordinate axis, wherein the first coordinate axis comprises one coordinate axis or two coordinate axes of the adjusted bone, and the scaling operations of the adjusted bone and the scaled bone in the first coordinate axis are opposite;
step S63, obtaining auxiliary adjustment parameters of the zoomed skeleton according to the first coordinate axis and the second zoom factor;
in step S64, the adjustment of the zoom bone is performed according to the auxiliary adjustment parameter.
Wherein the second scaling factor may be less than or equal to the first scaling factor.
For example, when the length of bone A needs to be enlarged, bone A is enlarged uniformly along the three coordinate axes X, Y and Z: the length of the bone is enlarged in the X-axis direction, the thickness of the bone is enlarged in the Y and Z axis directions, and the magnification on the three axes is the same.
In order to enlarge only the length of bone A, the scaled bone A_Adjust needs to be controlled to perform a reduction operation in the Y and Z axis directions, and the reduction factor in the Y and Z axis directions may be slightly smaller than or equal to the overall magnification factor on the three axes.
Thus, by combining the enlargement operation on bone A with the reduction operation on the scaled bone A_Adjust, bone A as a whole is elongated only in length, and the bone thickness does not change.
For another example, when it is necessary to adjust only the thickness of bone B, only the change of the scaled bone B_Adjust of bone B in the Y and Z axis directions may be adjusted. For example, when scaling the thickness of the abdominal bone Abdomen, its scaled bone Abdomen_Adjust may be adjusted only in the Y and Z axis directions. In this way, bone B is scaled only in thickness, with no change in length.
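A minimal sketch of the counter-scaling described in steps S61 to S64 and in the two examples above; the axis-name tuples, the dictionary return format and the numeric factors are illustrative assumptions.

```python
def auxiliary_scale(first_axes, first_factor):
    """Step S62/S63 sketch: the scaled bone xxx_Adjust performs the opposite scaling
    on the first coordinate axes (one or two axes of the adjusted bone). Here the
    second factor is the exact inverse of the first; per the description above the
    reduction multiple may also be chosen slightly smaller than the magnification."""
    second_factor = 1.0 / first_factor
    return {axis: second_factor for axis in first_axes}

# Lengthen bone A along X only: bone A is enlarged uniformly (factor 1.5 assumed),
# while its scaled bone A_Adjust is counter-scaled on Y and Z.
a_adjust_scale = auxiliary_scale(("Y", "Z"), 1.5)    # {'Y': 0.666..., 'Z': 0.666...}

# Thicken bone B only: bone B itself is untouched; only B_Adjust is scaled on Y and Z.
b_adjust_scale = {"Y": 1.3, "Z": 1.3}                # factor 1.3 assumed
```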
Through the embodiment, the single-axis or two-axis adjustment of the skeleton can be realized, and the free change of the body of the animation model is realized. And moreover, the animation model is bound with the scaling skeleton, so that the adjusted skeleton can adapt to animation motion and interaction effect.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application.
Fig. 9 is a block diagram of an animation model processing apparatus provided in an embodiment of the present application, which may be implemented as part or all of an electronic device through software, hardware, or a combination of the two. As shown in fig. 9, the animation model processing apparatus includes:
the acquisition module 1 is used for acquiring a first original animation model;
the simplification module 2 is used for executing simplification operation on the first original animation model according to a simplification strategy to obtain a first simplified animation model;
and the rendering module 3 is used for rendering the first simplified animation model.
Optionally, the simplifying module 2 is configured to obtain bone data corresponding to the first original animation model; assign the vertex information corresponding to the bone data to the vertices of the original animation model; and delete the modified bone data corresponding to the simplification strategy from the bone data.
Optionally, the simplification module 2 is further configured to determine a simplification level of the first original animation model according to an adjustment operation on a simplification control; determining the adjusted bone data corresponding to the reduced level.
Optionally, the apparatus further comprises: a storage module 4 and an adjustment module 5.
The storage module 4 is used for storing the first original animation model in a server;
an obtaining module 1, configured to obtain the first original animation model from the server when a first adjustment operation on the first simplified animation model is received;
the adjusting module 5 is used for executing the first adjusting operation on the first original animation model to obtain a second original animation model;
the simplification module 2 is used for executing simplification operation on the second original animation model according to the simplification strategy to obtain a second simplified animation model;
a rendering module 3, configured to render the second simplified animation model;
and the storage module 4 is used for storing the second original animation model in the server.
Optionally, the apparatus further comprises:
a parameter obtaining module 6, configured to obtain performance parameters of a device that performs rendering;
and a policy determining module 7, configured to determine the simplification policy according to the performance parameter.
Optionally, the adjusting module 5 is configured to determine a current bone corresponding to the first adjusting operation and an adjusted bone corresponding to the current bone; acquiring a first weight corresponding to the current skeleton and a second weight corresponding to the adjusted skeleton; determining a second adjustment parameter corresponding to the adjusted bone according to a first adjustment parameter corresponding to the first adjustment operation, the first weight and the second weight; and executing a second adjustment operation on the adjusted bone according to the second adjustment parameter.
Optionally, the adjusting module 5 is further configured to obtain a control state of the associated adjusting control; when the control state is an open state, determining that the adjusted bone comprises the current bone and an associated bone with the current bone; when the control state is the closing state, determining that the adjusted bone is the current bone.
Optionally, the adjusting module 5 is configured to determine, when the second adjusting operation is a zooming operation, a zooming bone corresponding to the adjusting bone, where the zooming bone is bound with the animation model and inherits coordinates of the adjusting bone; reversely obtaining a second scaling factor of the scaling bone in the first coordinate axis according to a first scaling factor of the adjusting bone in the first coordinate axis, wherein the first coordinate axis comprises one coordinate axis or two coordinate axes of the adjusting bone, and the scaling operation of the adjusting bone and the scaling bone in the first coordinate axis is opposite; obtaining an auxiliary adjusting parameter of the scaling bone according to the first coordinate axis and the second scaling multiple; performing an adjustment of the zoom bone according to the auxiliary adjustment parameter.
An embodiment of the present application further provides an electronic device, as shown in fig. 10, the electronic device may include: the system comprises a processor 1501, a communication interface 1502, a memory 1503 and a communication bus 1504, wherein the processor 1501, the communication interface 1502 and the memory 1503 complete communication with each other through the communication bus 1504.
A memory 1503 for storing a computer program;
the processor 1501, when executing the computer program stored in the memory 1503, implements the steps of the method embodiments described below.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The Processor may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; but may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the above method embodiments.
It should be noted that, for the above-mentioned apparatus, electronic device and computer-readable storage medium embodiments, since they are basically similar to the method embodiments, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiments.
It is further noted that, herein, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present invention, which enable those skilled in the art to understand or practice the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (11)

1. An animation model processing method, comprising:
acquiring a first original animation model;
according to a simplification strategy, executing simplification operation on the first original animation model to obtain a first simplified animation model;
rendering the first simplified animation model.
2. The method of claim 1, wherein the performing the simplification operation on the first original animation model according to a preset simplification strategy comprises:
acquiring bone data corresponding to the first original animation model;
assigning the vertex information corresponding to the bone data to the vertex of the original animation model;
and deleting the modified bone data corresponding to the simplified strategy from the bone data.
3. The method of claim 2, wherein the performing the simplification operation on the first proto-animation model according to a simplification policy further comprises:
determining the simplification level of the first original animation model according to the adjustment operation of the simplification control;
determining the adjusted bone data corresponding to the reduced level.
4. The method of claim 1, further comprising:
storing the first original animation model in a server;
when a first adjustment operation on the first simplified animation model is received, acquiring the first original animation model from the server;
executing the first adjustment operation on the first original animation model to obtain a second original animation model;
according to the simplification strategy, executing simplification operation on the second original animation model to obtain a second simplified animation model;
rendering the second simplified animation model;
storing the second original animation model in the server.
5. The method according to any one of claims 1-4, further comprising:
acquiring performance parameters of equipment for executing rendering;
and determining the simplification strategy according to the performance parameters.
6. The method of claim 4, wherein said performing the first adjustment operation on the first original animation model comprises:
determining a current skeleton corresponding to the first adjustment operation and an adjusted skeleton corresponding to the current skeleton;
acquiring a first weight corresponding to the current skeleton and a second weight corresponding to the adjusted skeleton;
determining a second adjustment parameter corresponding to the adjusted bone according to a first adjustment parameter corresponding to the first adjustment operation, the first weight and the second weight;
and executing a second adjustment operation on the adjusted bone according to the second adjustment parameter.
7. The method of claim 6, wherein said determining an adjusted bone to which said current bone corresponds comprises:
acquiring a control state of the associated adjusting control;
when the control state is an open state, determining that the adjusted bone comprises the current bone and an associated bone with the current bone;
when the control state is the closing state, determining that the adjusted bone is the current bone.
8. The method of claim 6, wherein when the second adjustment operation is a zoom operation, the performing a second adjustment operation on the adjusted bone according to the second adjustment parameter comprises:
determining a scaling skeleton corresponding to the adjusting skeleton, wherein the scaling skeleton is bound with an animation model and inherits the coordinates of the adjusting skeleton;
reversely obtaining a second scaling factor of the scaling bone in the first coordinate axis according to a first scaling factor of the adjusting bone in the first coordinate axis, wherein the first coordinate axis comprises one coordinate axis or two coordinate axes of the adjusting bone, and the scaling operation of the adjusting bone and the scaling bone in the first coordinate axis is opposite;
obtaining an auxiliary adjusting parameter of the scaling bone according to the first coordinate axis and the second scaling multiple;
performing an adjustment of the zoom bone according to the auxiliary adjustment parameter.
9. An animation model processing apparatus, comprising:
the acquisition module is used for acquiring a first original animation model;
the simplification module is used for executing simplification operation on the first original animation model according to a simplification strategy to obtain a first simplified animation model;
and the rendering module is used for rendering the first simplified animation model.
10. An electronic device, comprising: the system comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory complete mutual communication through the communication bus;
the memory is used for storing a computer program;
the processor, when executing the computer program, implementing the method steps of any of claims 1-8.
11. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method steps of any one of claims 1 to 8.
CN202011035179.6A 2020-09-27 2020-09-27 Animation model processing method and device, electronic equipment and storage medium Active CN112102452B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011035179.6A CN112102452B (en) 2020-09-27 2020-09-27 Animation model processing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011035179.6A CN112102452B (en) 2020-09-27 2020-09-27 Animation model processing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112102452A (en) 2020-12-18
CN112102452B (en) 2024-03-22

Family

ID=73782415

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011035179.6A Active CN112102452B (en) 2020-09-27 2020-09-27 Animation model processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112102452B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020080143A1 (en) * 2000-11-08 2002-06-27 Morgan David L. Rendering non-interactive three-dimensional content
JP2007072916A (en) * 2005-09-08 2007-03-22 Ricoh Co Ltd Three-dimensional shape data capacity reduction apparatus, three-dimensional animation reproducer, three-dimensional shape data capacity reduction method, three-dimensional animation reproduction method, and program
CN101308579A (en) * 2008-05-12 2008-11-19 中山大学 Adaptive simplifying method for three-dimensional animation model
CN101356549A (en) * 2003-05-14 2009-01-28 皮克萨公司 Defrobulated angles for character joint representation
CN101770655A (en) * 2009-12-25 2010-07-07 电子科技大学 Method for simplifying large-scale virtual dynamic group
WO2014008387A2 (en) * 2012-07-05 2014-01-09 King Abdullah University Of Science And Technology Three-dimensional object compression
CN104077797A (en) * 2014-05-19 2014-10-01 无锡梵天信息技术股份有限公司 Three-dimensional game animation system
CN105894555A (en) * 2016-03-30 2016-08-24 腾讯科技(深圳)有限公司 Method and device for simulating body motions of animation model
CN107527384A (en) * 2017-07-14 2017-12-29 中山大学 A kind of lattice simplified method of Three-Dimensional Dynamic based on motion feature and its system
CN206975717U (en) * 2017-08-08 2018-02-06 南京美卡数字科技有限公司 A kind of rendering device of extensive three-dimensional animation

Also Published As

Publication number Publication date
CN112102452B (en) 2024-03-22

Similar Documents

Publication Publication Date Title
US11270488B2 (en) Expression animation data processing method, computer device, and storage medium
WO2020001013A1 (en) Image processing method and device, computer readable storage medium, and terminal
JP2020510262A (en) Expression animation generating method and apparatus, storage medium, and electronic device
CN109151540B (en) Interactive processing method and device for video image
US8405676B2 (en) Techniques for interior coordinates
US8538737B2 (en) Curve editing with physical simulation of mass points and spring forces
CN107578467B (en) Three-dimensional modeling method and device for medical instrument
CN110060348A (en) Facial image shaping methods and device
CN114881893A (en) Image processing method, device, equipment and computer readable storage medium
CN114049287B (en) Face model fusion method, device, equipment and computer readable storage medium
CN112090082A (en) Facial skeleton processing method and device, electronic equipment and storage medium
CN101578635A (en) Game device, game device control method, program, and information storage medium
CN109146770A (en) A kind of strain image generation method, device, electronic equipment and computer readable storage medium
CN112102452B (en) Animation model processing method and device, electronic equipment and storage medium
CN112102453B (en) Animation model skeleton processing method and device, electronic equipment and storage medium
JP2007234033A (en) Three-dimensional game apparatus and information storage medium
CN116843809A (en) Virtual character processing method and device
CN109074670B (en) Information processing apparatus, image generating method, and recording medium
CN116052263A (en) Control method and electronic equipment
WO2022042570A1 (en) Image processing method and apparatus
JP2024503930A (en) Map display control method, apparatus, device and medium
CN114596221A (en) Face contour automatic smoothing method and device, electronic equipment and storage medium
CN112107865A (en) Facial animation model processing method and device, electronic equipment and storage medium
CN110827413B (en) Method, apparatus and computer readable storage medium for controlling a change in a form of a virtual object
CN113332712B (en) Game scene picture moving method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant