CN113590334B - Method, device, medium and electronic equipment for processing character model - Google Patents


Info

Publication number
CN113590334B
CN113590334B (application CN202110902409.2A)
Authority
CN
China
Prior art keywords
files
material ball
character model
ball
character
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110902409.2A
Other languages
Chinese (zh)
Other versions
CN113590334A
Inventor
莫顺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Boguan Information Technology Co Ltd
Original Assignee
Guangzhou Boguan Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Boguan Information Technology Co Ltd filed Critical Guangzhou Boguan Information Technology Co Ltd
Priority to CN202110902409.2A priority Critical patent/CN113590334B/en
Publication of CN113590334A publication Critical patent/CN113590334A/en
Application granted granted Critical
Publication of CN113590334B publication Critical patent/CN113590334B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/50Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F9/5005Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F9/5027Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present disclosure relates to the field of computers, and in particular to a method, an apparatus, a computer-readable storage medium, and an electronic device for processing a character model. The method includes: acquiring a character model to be processed, together with a plurality of first material ball files and animation data corresponding to the character model; merging the plurality of first material ball files to obtain one or more second material ball files; baking the animation data into a map to obtain a map file carrying the animation data; generating, from the map file, a blueprint comprising a plurality of action nodes, where the action nodes indicate the positions of actions in the map file; and rendering the character model according to the second material ball files, the map file, and the blueprint. The technical solution of the embodiments of the disclosure mitigates the game performance degradation caused by a large number of character models.

Description

Method, device, medium and electronic equipment for processing character model
Technical Field
The present disclosure relates to the field of computers, and more particularly, to a method of character model processing, an apparatus for character model processing, a computer-readable storage medium, and an electronic device.
Background
With the rapid development of software and hardware, the number of game engines keeps growing. These engines provide game designers with the various tools needed to write a game, so that designers can build game programs with ease.
In some game scenarios, in-game characters need to be set up. Each character corresponds to different actions, and when a certain condition is triggered the character performs the corresponding action. For example, in some action role-playing games, a hostile character may be set that attacks the player-controlled character once the player-controlled character comes within a certain distance. Many game engines provide tools to handle such characters; for example, UE4 (Unreal Engine 4) provides an AI system to drive the actions of character models.
However, because each in-game character carries many material balls, increasing the number of character models increases the number of draw calls, and the action interactions between characters also grow sharply. This raises the computational load on the CPU, degrades running performance, and worsens the player's game experience.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to a character model processing method, a character model processing apparatus, a computer-readable storage medium, and an electronic device, which can solve the problem of a decrease in game performance caused by a large number of character models.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a method of character model processing, comprising: acquiring a character model to be processed, together with a plurality of first material ball files and animation data corresponding to the character model; merging the plurality of first material ball files to obtain one or more second material ball files; baking the animation data into a map to obtain a map file carrying the animation data; generating, from the map file, a blueprint comprising a plurality of action nodes, wherein the action nodes indicate the positions of actions in the map file; and rendering the character model according to the second material ball files, the map file, and the blueprint.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the action node is further configured to indicate a switching relationship between actions.
In an exemplary embodiment of the disclosure, based on the foregoing scheme, the action nodes include an instance-object random node, a vector node, and a judgment node; the instance-object random node is used to randomly determine an initial action for the character model from the map file; the vector node is used to determine the actions other than the initial action from the map file; and the judgment node is used to determine, from the map file, the action to switch to and the switching time.
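As a rough illustration of how the three node types could cooperate, consider the toy class below. It assumes a blueprint is simply a list of action names; `ActionBlueprint`, its methods, and the fixed one-second switch time are all invented for this sketch and do not reflect actual UE4 blueprint nodes.

```python
import random

class ActionBlueprint:
    """Toy stand-in for the three node types: a random node picks the
    initial action, a vector node lists the remaining actions, and a
    judgment node decides when to switch."""

    def __init__(self, actions, rng=None):
        self.actions = list(actions)       # actions available in the map file
        self.rng = rng or random.Random(0)
        # instance-object random node: random initial action per model instance
        self.current = self.rng.choice(self.actions)

    def other_actions(self):
        """Vector node: all actions except the current one."""
        return [a for a in self.actions if a != self.current]

    def judge(self, elapsed, switch_time=1.0):
        """Judgment node: switch once the current action has played long
        enough; returns True when a switch happened."""
        if elapsed >= switch_time and self.other_actions():
            self.current = self.rng.choice(self.other_actions())
            return True
        return False
```

Because the switch always draws from `other_actions()`, the new action is guaranteed to differ from the one just played.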
In an exemplary embodiment of the disclosure, based on the foregoing solution, the merging the plurality of first material ball files to obtain one or more second material ball files includes: and combining the plurality of first material ball files according to the number of the plurality of first material ball files to obtain one or more second material ball files.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, merging the plurality of first material ball files according to the number of the plurality of first material ball files to obtain one or more second material ball files, including: and when the number of the plurality of first material ball files is larger than or equal to a first preset value, merging the plurality of first material ball files to obtain one or more second material ball files.
In an exemplary embodiment of the disclosure, based on the foregoing solution, the merging the plurality of first material ball files to obtain one or more second material ball files includes: and combining the plurality of first material ball files according to the material ball attributes of the plurality of first material ball files to obtain one or more second material ball files.
In an exemplary embodiment of the disclosure, based on the foregoing solution, the merging the plurality of first material ball files according to the material ball attributes of the plurality of first material ball files to obtain one or more second material ball files includes: obtaining similarity of material ball attributes among the plurality of first material ball files; and merging the first material ball files with the similarity being greater than or equal to a second preset value to obtain one or more second material ball files.
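The similarity-based merge described above can be pictured as a greedy grouping: a file joins an existing group only if its attribute similarity to every member meets the second preset value. The `similarity` metric below (based on a transparency attribute) and the name `merge_by_similarity` are illustrative assumptions, not the disclosure's actual computation.

```python
def merge_by_similarity(files, threshold):
    """Greedily group first material files whose attribute similarity is
    greater than or equal to the 'second preset value' (threshold); each
    group stands in for one merged second material file."""
    def similarity(a, b):
        # toy metric: 1 minus the absolute difference in transparency
        return 1.0 - abs(a["transparency"] - b["transparency"])

    groups = []
    for f in files:
        for g in groups:
            if all(similarity(f, member) >= threshold for member in g):
                g.append(f)
                break
        else:
            groups.append([f])   # no group is similar enough: start a new one
    return groups
```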
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the material ball attribute is material ball transparency.
In an exemplary embodiment of the present disclosure, based on the foregoing, the character model is a model in a cluster, the cluster containing a plurality of the character models.
In one exemplary embodiment of the present disclosure, based on the foregoing schemes, the actions of the plurality of character models are controlled by a centralized control architecture.
According to a second aspect of the present disclosure, there is provided an apparatus for character model processing, the apparatus comprising: a character model acquisition module, used to acquire a character model to be processed, together with a plurality of first material ball files and animation data corresponding to the character model; a material ball file merging module, used to merge the plurality of first material ball files to obtain one or more second material ball files; a map file baking module, used to bake the animation data into a map to obtain a map file carrying the animation data; a blueprint generating module, used to generate, from the map file, a blueprint comprising a plurality of action nodes, wherein the action nodes indicate the positions of actions in the map file; and a character model rendering module, used to render the character model according to the second material ball files, the map file, and the blueprint.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method of character model processing as described in the first aspect of the embodiments described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising:
a processor; and
a memory for storing one or more programs that, when executed by the processor, cause the processor to implement the method of character model processing as in the first aspect of the embodiments described above.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects:
in the method for processing a character model according to an embodiment of the present disclosure, a character model to be processed, a plurality of first material ball files and animation data corresponding to the character model may be obtained, the plurality of first material ball files are combined to obtain one or more second material ball files, the animation data are baked into a map to obtain a map file with animation data, a blueprint including a plurality of action nodes is generated according to the map file, the action nodes are used for indicating positions of actions in the map file, and the character model is rendered according to the second material ball files, the map file and the blueprint.
According to the embodiment of the disclosure, the material ball files corresponding to the character model can be combined, the animation data are baked into the map file, and the character model is rendered through the combined material ball files, the map file and the blueprint. On one hand, the number of material balls of the character model can be reduced, the number of DRAW CALLS is reduced, the rendering speed is improved, and the game frame rate is improved; on the other hand, the animation data is baked into the map, and a plurality of actions of the character are realized through the animation map, so that the operation pressure of the CPU can be reduced, and the game performance can be improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort. In the drawings:
FIG. 1 schematically illustrates a schematic diagram of an exemplary system architecture for a method of character model processing in an exemplary embodiment of the present disclosure;
FIG. 2 schematically illustrates a flow chart of a method of character model processing in an exemplary embodiment of the present disclosure;
FIG. 3 schematically illustrates a flowchart of another method of character model processing in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a schematic diagram of a cluster character application scenario in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates a flowchart of another method of character model processing in an exemplary embodiment of the present disclosure;
FIG. 6 schematically illustrates a schematic composition of an apparatus for character model processing in an exemplary embodiment of the present disclosure;
Fig. 7 schematically illustrates a structural schematic diagram of a computer system suitable for use in implementing the electronic device of the exemplary embodiments of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the disclosed aspects may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known structures, methods, devices, implementations, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams depicted in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
FIG. 1 illustrates a schematic diagram of an exemplary system architecture for a method of character model processing to which embodiments of the present disclosure may be applied.
As shown in fig. 1, system architecture 1000 may include one or more of terminal devices 1001, 1002, 1003, a network 1004, and a server 1005. The network 1004 serves as a medium for providing a communication link between the terminal apparatuses 1001, 1002, 1003 and the server 1005. The network 1004 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, the server 1005 may be a server cluster formed by a plurality of servers.
A user can interact with a server 1005 via a network 1004 using terminal apparatuses 1001, 1002, 1003 to receive or transmit messages or the like. The terminal devices 1001, 1002, 1003 may be various electronic devices having a display screen including, but not limited to, smartphones, tablet computers, portable computers, desktop computers, and the like. In addition, the server 1005 may be a server providing various services.
In one embodiment, the execution subject of the method for processing a character model of the present disclosure may be a server 1005, and the server 1005 may acquire a plurality of first material ball files and animation data corresponding to the character model sent by the terminal devices 1001, 1002, 1003, merge the plurality of first material ball files to obtain one or more second material ball files, bake the animation data into a map to obtain a map file with animation data, generate a blueprint including a plurality of action nodes according to the map file, and render the character model according to the second material ball files, the map file, and the blueprint. In addition, the method of character model processing of the present disclosure may be performed by the terminal apparatuses 1001, 1002, 1003, etc. to implement a process of rendering a character model according to the second material ball file, the map file, and the blueprint.
Further, the implementation of the method of character model processing of the present disclosure may also be implemented by the terminal devices 1001, 1002, 1003 and the server 1005 in common. For example, the terminal device 1001, 1002, 1003 may acquire a plurality of first material ball files and animation data corresponding to the character model, merge the plurality of first material ball files to obtain one or more second material ball files, bake the animation data into a map to obtain a map file with animation data, generate a blueprint including a plurality of action nodes according to the map file, and send the second material ball files, the map file and the blueprint to the server 1005, so that the server 1005 may render the character model according to the second material ball files, the map file and the blueprint.
With the rapid development of software and hardware, more and more game-engine tools have been developed. For example, Unreal Engine 4 (UE4) provides various tools for game designers to write games, aiming to let designers make game programs easily and quickly, to enable rapid development, and to let the engine schedule resources according to the requirements of the game design and control the running of the game.
In some games, character models need to be set up, and when a character model triggers certain preset conditions, the corresponding action is triggered. For example, in some life-simulation games, multiple identical or similar passer-by characters may be placed in the game scene; when a player-controlled character collides with a passer-by character, the passer-by character will avoid it. Or, in some real-time strategy games (Real-Time Strategy, RTS), opponent soldier characters may be placed within the game scene that attack the player-controlled soldiers as the player's soldiers approach them. In UE4, an AI system is provided for game designers to handle the actions of character models of the type described above: a character model can be built, a plurality of material balls rendered onto the model, and the CPU then controls the character model so that it performs the corresponding actions.
However, since the character model has a plurality of material balls, increasing the number of character models increases the number of draw calls, which reduces rendering speed and makes the game frame rate drop rapidly. The CPU also processes the animation data of the character models; the pressure on the CPU is small when there are few models, but it rises continuously as the number of models grows, again lowering the frame rate and affecting running performance. In addition, as the number of character models increases, action interactions between models become more frequent, further increasing the load on the CPU. The result is a lower frame rate and reduced game performance, leaving players dissatisfied.
According to the method for processing the character model provided in the present exemplary embodiment, a character model to be processed, a plurality of first material ball files and animation data corresponding to the character model may be obtained, the plurality of first material ball files are combined to obtain one or more second material ball files, the animation data are baked into a map to obtain a map file with animation data, a blueprint including a plurality of action nodes is generated according to the map file, and the character model is rendered according to the second material ball files, the map file and the blueprint. As shown in FIG. 2, the method of character model processing can include the steps of:
Step S210, acquiring a character model to be processed, and a plurality of first material ball files and animation data corresponding to the character model;
Step S220, combining the plurality of first material ball files to obtain one or more second material ball files;
step S230, baking the animation data into the map to obtain a map file with the animation data;
Step S240, generating a blueprint comprising a plurality of action nodes according to the mapping file, wherein the action nodes are used for indicating the positions of actions in the mapping file;
And step S250, rendering the character model according to the second material ball file, the map file and the blueprint.
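Steps S210 to S250 above can be sketched as a minimal pipeline. The sketch below is illustrative only, not the patented UE4 implementation; all names (`CharacterJob`, `merge_material_files`, `bake_animation`, `build_blueprint`, `render`) are hypothetical, and the data structures are toy stand-ins for the real material, map, and blueprint assets.

```python
from dataclasses import dataclass

@dataclass
class CharacterJob:
    """Inputs of step S210: the model plus its material files and animation data."""
    model: str
    material_files: list   # the "first material ball files"
    animation: dict        # action name -> list of keyframes

def merge_material_files(files):
    """S220: merge first material files into second material files (here: by type)."""
    merged = {}
    for f in files:
        merged.setdefault(f["type"], []).append(f["name"])
    return [{"type": t, "sources": names} for t, names in merged.items()]

def bake_animation(animation):
    """S230: bake the animation data into a 'map file' (one row per action)."""
    return {action: {"row": i, "frames": frames}
            for i, (action, frames) in enumerate(animation.items())}

def build_blueprint(map_file):
    """S240: one action node per action, recording its position in the map file."""
    return [{"action": a, "row": info["row"]} for a, info in map_file.items()]

def render(job):
    """S250: combine the merged materials, the baked map file and the blueprint."""
    materials = merge_material_files(job.material_files)
    map_file = bake_animation(job.animation)
    blueprint = build_blueprint(map_file)
    return {"model": job.model, "materials": materials,
            "map": map_file, "blueprint": blueprint}
```

The point of the structure is that after `render`, the per-frame work is a texture lookup driven by the blueprint rather than per-model CPU animation.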
According to the embodiment of the disclosure, the material ball files corresponding to the character model can be combined, the animation data are baked into the map file, and the character model is rendered through the combined material ball files, the map file and the blueprint. On one hand, the number of material balls of the character model can be reduced, the number of DRAW CALLS is reduced, the rendering speed is improved, and the game frame rate is improved; on the other hand, the animation data is baked into the map, and a plurality of actions of the character are realized through the animation map, so that the operation pressure of the CPU can be reduced, and the game performance can be improved.
Next, steps S210 to S250 of the character model processing in the present exemplary embodiment will be described in more detail with reference to the drawings and embodiments.
Step S210, acquiring a character model to be processed, and a plurality of first material ball files and animation data corresponding to the character model;
In one example embodiment of the present disclosure, the character model may include game characters in various games, such as TPS (third-person shooter), FPS (first-person shooter), RPG (role-playing game), ACT (action game), SLG (strategy game), FTG (fighting game), SPG (sports game), RCG (racing game), AVG (adventure game), RTS (real-time strategy game), etc.; the character model processing scheme of the present disclosure may be applied to any scene involving a character model. The character model may also include characters in interactive image applications, and the like. Note that the present disclosure does not limit the application scenario of the character model.
In one example embodiment of the present disclosure, a character model may include a main character, supporting characters, enemies, unit objects in a game, groups of objects with independent action elements, and the like. The presentation form of the character model may include human, animal, and object forms, and the character model may further include characters with little humanoid presence that are embodied in device form (e.g., machines in mecha games, tanks in tank games, aircraft in aviation games). The specific form of the character model is not particularly limited in the present disclosure.
In one example embodiment of the present disclosure, a character model may be created in a model creation tool. For example, a character model may be created in a tool such as ZBrush, Maya, or 3DMax, and material balls rendered for the character model. Specifically, a material ball rendering tool may be used when rendering material balls for a character model: an object to be rendered may be selected in a material editor, and then the material ball to be rendered may be selected to complete the rendering. Furthermore, the built-in material balls of the rendering tool can be used, material balls made by other authors can be downloaded into the tool, or material balls can be made from scratch.
In one example embodiment of the present disclosure, because different parts of a character model have different characteristics, rendering the model involves a plurality of material balls, and the character model therefore has a plurality of first material ball files. For example, for a given character model, the material balls may include diffuse, reflective, opaque, and bump-type material balls, and each type further includes material balls with different patterns, so a plurality of first material ball files corresponding to the character model can be obtained.
In one example embodiment of the present disclosure, animation data corresponding to a character model may be acquired. Specifically, the animation data can be used to drive the character model to perform corresponding actions, and after the character model is created, the animation data corresponding to the character model can be created through an animation data creation tool.
For example, animation data corresponding to a character model can be produced through 3Dmax, a skeleton system corresponding to the character model is created according to the character model, binding between the skeleton system corresponding to the character model and the character model is performed, influence weight values of each skeleton in the skeleton system on corresponding parts of the character model are adjusted, so that the skeleton system corresponding to the character model is completely matched with the character model, action data of the corresponding skeleton are adjusted on key frames of each time axis, and then all actions of the character model are recorded through all the key frames on the time axis, so that the animation data corresponding to the character model is obtained. The animation data corresponding to the character model may be stored in a memory or a server of the terminal device, and may be acquired in the memory or the server when the animation data corresponding to the character model needs to be used. The method for generating the animation data corresponding to the character model is not particularly limited in the present disclosure.
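The later baking of this keyframed bone data into a map can be pictured as writing one texel per bone per frame, with each action occupying a contiguous block of rows. The sketch below is a highly simplified stand-in (the name `bake_to_texture` is hypothetical), with plain numbers in place of real bone transforms.

```python
def bake_to_texture(animations, bone_count):
    """Bake per-bone keyframe data into a 2D 'texture': one row per frame,
    one column per bone. Rendering can then look up a pose by
    (action row offset + frame index) instead of evaluating bones on the CPU."""
    texture = []   # rows of per-bone values, i.e. the texels
    offsets = {}   # action name -> first row of that action in the texture
    for name, frames in animations.items():
        offsets[name] = len(texture)
        for frame in frames:
            assert len(frame) == bone_count, "each frame carries one value per bone"
            texture.append(list(frame))
    return texture, offsets
```

In a real engine the stored values would be packed bone matrices or vertex offsets, but the row-offset bookkeeping is the same idea.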
Step S220, combining the plurality of first material ball files to obtain one or more second material ball files;
In an example embodiment of the present disclosure, a plurality of first material ball files may be combined to obtain one or more second material ball files. Specifically, in the process of rendering the character model, each render issues a draw call, and as the number of first material ball files corresponding to the character model increases, the number of draw calls also increases; therefore, the plurality of first material ball files can be combined into one or more second material ball files. When merging, the material balls can be combined according to their attributes. For example, first material ball files of the same style may be combined together; material balls may be combined according to their type; or material balls may be combined according to their processing mode. For instance, material balls may render the character model in a Normal Bump mode or in a DirectX shader mode, and material balls that use the same rendering mode can be combined, which facilitates rendering the character model's material balls and reduces the computational pressure on the processor. The inventor of the application found that, for 192 character models, 7700 draw calls were needed before the material ball files were merged, and only 1000 draw calls were needed afterwards, greatly improving the game frame rate and stability. Note that the method of merging the first material ball files is not particularly limited in this disclosure.
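Grouping material files by rendering mode, as described above, can be sketched as follows. The names `group_by_render_mode` and `draw_call_count`, the mode strings, and the model of one draw call per material (or per merged group) are all illustrative assumptions.

```python
def group_by_render_mode(material_files):
    """Group material files that share a rendering mode, so each group
    can stand in for one merged second material file."""
    groups = {}
    for mat in material_files:
        groups.setdefault(mat["mode"], []).append(mat["name"])
    return groups

def draw_call_count(material_files):
    """Toy draw-call model: without merging, one call per material file;
    after merging, one call per group."""
    return len(material_files), len(group_by_render_mode(material_files))
```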
In an example embodiment of the present disclosure, a plurality of first material ball files may be combined to obtain one or more second material ball files. Specifically, the number of second material ball files may be one or more. For example, the first material ball files may be merged into one or more second material ball files according to the type of the material ball files: if the plurality of first material ball files all belong to the same material ball type, there is only one merged material ball file; if the plurality of first material ball files span two material ball types, there are two merged material ball files. Alternatively, a standard value for the number of second material ball files can be set so that the number of merged second material ball files equals a specific value: the plurality of first material ball files are merged according to the merging standard, and once the current number of material ball files equals the standard value, merging stops and all current material ball files are taken as the second material ball files. Note that the number of second material ball files is not particularly limited in this disclosure.
In an example embodiment of the present disclosure, the plurality of first material ball files may be combined according to their number to obtain one or more second material ball files. For example, when the number of first material ball files is large, the plurality of first material ball files may be combined; or, when the number of material ball files of a certain type among the plurality of first material ball files is large, the plurality of first material ball files may be combined. Note that the specific manner of merging the plurality of first material ball files according to their number is not particularly limited in the present disclosure.
In an example embodiment of the present disclosure, when the number of the plurality of first material ball files is greater than or equal to a first preset value, the plurality of first material ball files may be combined to obtain one or more second material ball files. Specifically, the first preset value indicates the point at which the number of first material ball files corresponding to the character model causes significant rendering pressure; therefore, when the number of first material ball files is greater than or equal to the first preset value, the plurality of first material ball files may be combined. The first preset value may be adjusted according to the business scenario and stored in the terminal device or the server; when the number of first material ball files needs to be compared with the first preset value, the first preset value is retrieved from the terminal device or the server. It should be noted that the specific numerical value of the first preset value is not particularly limited in this disclosure.
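The threshold check described above can be sketched as follows. This is an illustrative sketch, not code from the patent: the value of the first preset value, the file representation, and the grouping key (rendering mode) are all assumptions.

```python
# Hypothetical sketch: merge first material ball files only when their count
# reaches the first preset value; files sharing a rendering mode are merged
# into one second material ball file. All names here are illustrative.
FIRST_PRESET_VALUE = 8  # assumed threshold, adjustable per business scenario

def maybe_merge(material_files):
    """Return second material ball files (merged), or the input unchanged."""
    if len(material_files) < FIRST_PRESET_VALUE:
        return material_files  # few files: rendering pressure is acceptable
    groups = {}
    for f in material_files:
        groups.setdefault(f["mode"], []).append(f["name"])
    # One merged file per rendering mode -> fewer draw calls.
    return [{"mode": mode, "name": "+".join(names)}
            for mode, names in groups.items()]
```

With ten input files spread over two rendering modes, the sketch produces two second material ball files; below the threshold, the input is returned as-is.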
In an example embodiment of the present disclosure, the plurality of first material ball files may be combined according to the material ball attributes of the plurality of first material ball files to obtain one or more second material ball files.
Specifically, the material ball attributes corresponding to the plurality of first material ball files may be acquired; that is, after the plurality of first material ball files corresponding to the character model are acquired, the material ball attributes corresponding to them may be acquired. The material ball attributes may include common material attributes such as color, material ball transparency, ambient color, and self-illumination; special effects such as hide source and glow intensity; specular shading attributes such as eccentricity, specular roll-off, and reflectivity; and ray-tracing attributes such as refractive index, refraction limit, and light absorbance. Note that the material ball attributes are not particularly limited in this disclosure. When the character model is rendered, different material balls may be rendered at different positions of the character model, and the material ball attributes corresponding to the plurality of first material ball files may be determined in a material ball rendering tool. For example, when obtaining the material ball attribute corresponding to a first material ball file in a tool such as Maya/3ds Max, one may click the position corresponding to the first material ball file, and the material ball attributes corresponding to the plurality of first material ball files may then be determined from the prompt information; alternatively, the plurality of first material ball files corresponding to the character model may be obtained through the material ball rendering tool, and the material ball attributes may be extracted from the first material ball files. Note that the manner of obtaining the material ball attributes is not particularly limited in this disclosure.
After the material ball attributes corresponding to the plurality of first material ball files are obtained, the plurality of first material ball files may be combined into one or more second material ball files according to the material ball attributes. Specifically, first material ball files with the same material ball attributes may be combined together. Note that the manner of merging the first material balls is not particularly limited in this disclosure. In this way, the number of material ball files can be reduced, the number of draw calls can be reduced, and the frame rate of the game can be improved.
In an example embodiment of the present disclosure, a similarity of material ball properties between a plurality of first material ball files may be obtained, and the first material ball files with the similarity greater than or equal to a second preset value may be combined to obtain one or more second material ball files. Referring to fig. 3, merging the first material ball files with similarity greater than or equal to the second preset value to obtain one or more second material ball files, which may include the following steps S310 to S320:
step S310, obtaining the similarity of the material ball attributes among a plurality of first material ball files;
In an example embodiment of the present disclosure, the similarity of the material ball attributes among the plurality of first material ball files may be obtained. Specifically, a specific value of a material ball attribute of each first material ball file may be obtained; the closer the specific values of the material ball attribute, the higher the similarity of the material ball files.
For example, if the transparency of material ball file A is 50%, the transparency of material ball file B is 60%, and the transparency of material ball file C is 60%, then the similarity between the material ball attributes of file B and file C is greater than the similarity between the material ball attributes of file A and file B. The similarity of the material ball attributes between the first material ball files may be characterized numerically. Note that the method for obtaining the similarity of the material ball attributes among the plurality of first material ball files is not particularly limited in this disclosure.
Step S320, merging the first material ball files with similarity greater than or equal to the second preset value to obtain one or more second material ball files.
In an example embodiment of the present disclosure, after the similarity of the material ball attributes among the plurality of first material ball files is obtained, the first material ball files whose similarity is greater than or equal to the second preset value may be merged. Specifically, a similarity greater than or equal to the second preset value indicates that the material ball attributes of the material ball files are sufficiently similar, at which point the material ball files may be combined.
The second preset value may be adjusted according to the business scenario and stored in the terminal device or the server; when the similarity of the material ball attributes between the first material ball files needs to be compared with the second preset value, the second preset value is retrieved from the terminal device or the server. It should be noted that the specific numerical value of the second preset value is not particularly limited in this disclosure.
Through the steps S310 to S320, the similarity of the material ball attributes among the plurality of first material ball files can be obtained, and the first material ball files with the similarity greater than or equal to the second preset value are combined to obtain one or more second material ball files.
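Steps S310 to S320 can be sketched as below, using material ball transparency as the compared attribute. The similarity measure, the grouping strategy, and the second preset value are assumptions made for illustration.

```python
# Illustrative sketch of steps S310-S320: compute attribute similarity
# between first material ball files and greedily merge files whose
# similarity to a group's first member meets the second preset value.
SECOND_PRESET_VALUE = 0.95  # assumed similarity threshold in [0, 1]

def similarity(a, b):
    """Closer transparency values give higher similarity (step S310)."""
    return 1.0 - abs(a["transparency"] - b["transparency"])

def merge_similar(files):
    """Group files whose similarity meets the threshold (step S320)."""
    groups = []
    for f in files:
        for g in groups:
            if similarity(g[0], f) >= SECOND_PRESET_VALUE:
                g.append(f)
                break
        else:
            groups.append([f])  # start a new second material ball file
    return groups
```

With the transparencies from the example above (A: 50%, B: 60%, C: 60%), files B and C merge into one second material ball file while A remains separate.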
Step S230, baking the animation data into the map to obtain a map file with the animation data;
In one example embodiment of the present disclosure, after the animation data corresponding to the character model is obtained, the animation data may be baked into a map to obtain a map file with the animation data. Specifically, the map file of the animation data may include the change information of the vertex data of the character model. At this point, the animation data is no longer calculated by the CPU but by the GPU (graphics processing unit), so the computational pressure on the CPU is greatly reduced and game performance is improved.
In one example embodiment of the present disclosure, the animation data corresponding to the character model may be baked into a map file of the animation data, and the baking may be performed by an animation baking tool. For example, the animation data corresponding to the character model may be baked into a map file through DCC tools such as Maya/3ds Max and related plugins (e.g., the Vertex Anim Toolset plugin in UE4). The specific manner of baking the animation data into the map to obtain the map file with the animation data is not particularly limited in the present disclosure.
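The baking step can be illustrated with a toy sketch. Real tools such as the Vertex Anim Toolset encode vertex positions into texture channels; this assumed version simply stores one frame per row and one vertex per column, which is enough to show why the GPU can replace CPU-side animation computation with a lookup.

```python
# Toy sketch of baking animation data into a map: row = frame (time unit),
# column = vertex; a "texture fetch" replaces CPU-side animation computation.
def bake_animation_to_map(frames):
    """frames: list of per-frame vertex position lists -> row-major texture."""
    return [[tuple(pos) for pos in frame] for frame in frames]

def sample(texture, frame_index, vertex_index):
    """The GPU-side lookup: read a vertex position at (frame, vertex)."""
    return texture[frame_index][vertex_index]
```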
Step S240, generating a blueprint comprising a plurality of action nodes according to the mapping file, wherein the action nodes are used for indicating the positions of actions in the mapping file;
In one example embodiment of the present disclosure, a blueprint including a plurality of action nodes may be generated from the map file. The blueprint may be used to drive the character model to switch states, i.e., to drive the character model to perform the actions corresponding to the animation data. Specifically, a plurality of map files are stored in the blueprint, and the character model may be driven to switch states through the map files. The blueprint further includes a plurality of action nodes, which may be used to indicate the positions of actions in the map file; that is, the action to be applied can be located in the map file through the action nodes.
Further, the action nodes may also be used to indicate a switching relationship between actions. Specifically, the next action to switch to, its specific position in the map file, or the switching time can be obtained from the action node. The specific form of the switching relationship is not particularly limited in this disclosure.
Specifically, the action nodes include instance object random nodes, vector nodes, and judgment nodes. The instance object random nodes are used to randomly determine an initial action for the character model from the map file; the vector nodes are used to determine actions other than the initial action from the map file; the judgment nodes are used to determine, from the map file, the action to switch to and the switching time. For example, when the character model meets an action trigger condition, the character model may be controlled to perform the corresponding action.
Specifically, when the character model is driven to switch states by the map file, the vertex data of the character model may be obtained from the map file. For example, the model vertex data may be accessed directly by a vertex detection plugin, or determined by manual labeling. The model vertex data corresponding to the character model may then be recorded for each time unit as time changes, or recorded in sequence according to the order of the key frames. It should be noted that the method for obtaining the model vertex data is not particularly limited in this disclosure. After the model vertex data corresponding to the map file is obtained, the motion of the character model can be controlled according to it; in particular, the current character model may be controlled to move in accordance with the changes in the model vertex data.
For example, if the character model includes a vertex A with coordinates (15, 20, 15) and the vertex A corresponding to the first time unit in the map file has coordinates (15, 20, 20), then vertex A of the character model is moved to the position (15, 20, 20); by analogy, all vertices in the character model are moved according to the map file, so that the motion of the character model can be controlled through the model vertex data. The specific manner in which the character model is driven to switch states by the map file is not particularly limited in this disclosure.
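The vertex-driven state switching described above can be sketched as follows, reusing the coordinates from the example (vertex A moves from (15, 20, 15) to (15, 20, 20)). The map is represented as an assumed frame-by-vertex table, not an actual texture format.

```python
# Illustrative sketch: move every model vertex to the position the map file
# records for the given time unit, so the whole pose comes from the map.
def apply_frame(model_vertices, animation_map, frame_index):
    """Overwrite each vertex with the baked position for this time unit."""
    for i, baked_position in enumerate(animation_map[frame_index]):
        model_vertices[i] = baked_position
    return model_vertices

vertices = [(15, 20, 15)]         # vertex A before the update
animation_map = [[(15, 20, 20)]]  # first time unit in the map file
```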
And step S250, rendering the character model according to the second material ball file, the map file and the blueprint.
In one example embodiment of the present disclosure, the character model may be rendered from the second material ball file, the map file, and the blueprint. Specifically, after the second material ball file, the map file, and the blueprint are obtained through the steps, the character model may be rendered according to the second material ball file, the map file, and the blueprint. At this time, the number of material ball files is reduced, and the state switching is performed by using the map files and the blueprint to drive the character model, so that the calculation pressure of the CPU can be reduced, and the game performance can be improved.
In one example embodiment of the present disclosure, the aspects of the present disclosure may be applied in a clustered character model scenario, in which multiple character models are included, as shown in FIG. 4.
In one example embodiment of the present disclosure, the second material ball file, the map file, and the blueprint corresponding to the character model may be generated in batches through an instanced mesh model. Specifically, in the application scenario of a cluster game, if multiple character models are to be added, the number of draw calls increases; at this time, the second material ball file, the map file, and the blueprint corresponding to the character model may be generated in batches through the instanced mesh model, and the cluster of character models may be rendered according to the second material ball file, the map file, and the blueprint. Specifically, the instanced mesh model (InstanceMesh) can reuse the existing second material ball file, map file, and blueprint, achieving the purpose of generating them in batches. The inventor of the present application found that when 216 character models were generated in batches without using the instanced mesh model, the game frame rate was 51, whereas when 900 character models were generated in batches using the instanced mesh model, the game frame rate was 53; because the character models are generated in batches through the instanced mesh model, the computational pressure on the CPU (central processing unit) is reduced and game performance is ensured.
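The effect of instancing on the draw-call count can be sketched abstractly. This is not the UE4 API; the idea that characters sharing an asset set form one batch is an assumption used for illustration.

```python
# Illustrative sketch: characters sharing the same second material ball file,
# map file and blueprint can be submitted as one instanced batch, so draw
# calls scale with the number of distinct asset sets, not with characters.
def draw_calls_without_instancing(characters):
    return len(characters)  # one draw call per character

def draw_calls_with_instancing(characters):
    shared = {(c["material"], c["map"], c["blueprint"]) for c in characters}
    return len(shared)      # one draw call per shared asset set
```

A crowd of 900 characters sharing a single asset set collapses from 900 draw calls to 1 under this model.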
In one example embodiment of the present disclosure, the actions of multiple character models may be controlled through a centralized control architecture. Specifically, when processing the actions of multiple character models, because the number of character models is large, all of them can be controlled by a single overall console. For example, when a character model collides within the game scene, the collision information may be sent to the console, which in turn controls the character model to react accordingly (e.g., take evasive action). Through the centralized control architecture, the running performance of the game can be improved.
In an example embodiment of the present disclosure, the centralized control architecture may include an Entity-Component-System (ECS) architecture, which follows the composition-over-inheritance principle. Each character model is an entity, and each entity is in turn made up of one or more components; each component contains only the data representing its characteristics (i.e., there are no methods in a component). A system is a tool that processes the collection of entities having one or more of the same components; systems have only behavior (i.e., there is no data in a system). Entities and components are in a one-to-many relationship, and the behavior of an entity can be changed at run time by dynamically adding or deleting components. The ECS architecture is a data-oriented design and can process data in parallel, thereby improving processing speed.
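A minimal ECS sketch following the description above (components hold only data, systems hold only behavior) might look as follows; all class, component, and system names are illustrative.

```python
# Minimal ECS sketch: entities are ids, components are pure data keyed by
# entity, and a system iterates the entities that own the required components.
class World:
    def __init__(self):
        self.components = {}  # component type -> {entity id: data}

    def add(self, entity, ctype, data):
        self.components.setdefault(ctype, {})[entity] = data

    def entities_with(self, *ctypes):
        ids = set(self.components.get(ctypes[0], {}))
        for t in ctypes[1:]:
            ids &= set(self.components.get(t, {}))
        return ids

def movement_system(world, dt):
    """Behavior only: advance every entity having Position and Velocity."""
    for e in world.entities_with("Position", "Velocity"):
        px, py = world.components["Position"][e]
        vx, vy = world.components["Velocity"][e]
        world.components["Position"][e] = (px + vx * dt, py + vy * dt)
```

Because each system touches a flat table of component data, such loops are straightforward to parallelize, which is the data-oriented advantage mentioned above.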
In an example embodiment of the present disclosure, the second material ball file, the map file, and the blueprint corresponding to the character model may be stored in components, and the action performed by each character model may be controlled by the system. Specifically, the components in the ECS architecture have no behavior of their own; when the character model needs to be controlled to perform an action, the system sends a command to the component corresponding to that action, thereby controlling the component corresponding to the character model to complete the corresponding action. The ECS architecture optimizes rendering efficiency on the CPU side, improves the running frame rate of the game, and further improves the stability of the game.
In one example embodiment of the present disclosure, the action trigger instructions corresponding to each character model may be monitored, and after an action trigger instruction for a particular character model is received, the action trigger instruction may be sent to the system of the ECS architecture. Specifically, the character model receives the action trigger instruction in an application scenario (game scenario). For example, in a real-time strategy game, when a player-controlled character comes within a certain distance of a hostile character (a non-player character, NPC), an action trigger instruction for the hostile character may be monitored, the purpose of which is to cause the current hostile character to attack the player-controlled character. At this point, the action trigger instruction may be sent into the system of the ECS architecture.
When the system receives the action trigger instruction, the component corresponding to the character model can be controlled to switch to the action corresponding to the action trigger instruction. Specifically, the system sends a command to execute the action to the component corresponding to the action trigger instruction, so that the component corresponding to the character model switches to the action corresponding to the action trigger instruction, thereby controlling the character model to perform the corresponding action. Each character model, as an entity, may have one or more components, and each component may carry a different action.
For example, in a real-time strategy game, when the player-controlled character comes within distance A of a hostile character (character model), the hostile character may perform action A corresponding to component A (e.g., an alert posture), and when the player-controlled character comes within distance B of the hostile character (character model), the hostile character may perform action B corresponding to component B (e.g., attacking the player-controlled character).
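The distance-triggered switching in this example can be sketched as below; the distance thresholds and action names are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch: a console monitors the player-to-NPC distance and
# chooses the action trigger to send into the ECS system.
DISTANCE_A = 10.0  # alert range (assumed)
DISTANCE_B = 4.0   # attack range (assumed)

def choose_action(distance):
    """Pick the action whose trigger condition the distance satisfies."""
    if distance <= DISTANCE_B:
        return "attack"  # action B, carried by component B
    if distance <= DISTANCE_A:
        return "alert"   # action A, carried by component A
    return "idle"
```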
In an example embodiment of the present disclosure, a character model may be imported; a plurality of first material ball files corresponding to the character model may be obtained and combined; animation data corresponding to the character model may be obtained and baked into a map; character models may be generated in batches by instantiating the mesh model; and the actions of the character models may be processed centrally. Referring to fig. 5, a method for processing a character model of the present disclosure may include the following steps S510 to S570:
Step S510, importing a character model; step S520, a plurality of first material ball files corresponding to the character model are acquired; step S530, combining the plurality of first material ball files; step S540, obtaining animation data corresponding to the character model; step S550, baking the animation data into the map; step S560, generating character models in batches by instantiating the grid models; step S570, the actions of the character model are collectively processed.
In the method for processing the character model provided by the embodiment of the disclosure, a character model to be processed, a plurality of first material ball files and animation data corresponding to the character model can be obtained, the first material ball files are combined to obtain one or more second material ball files, the animation data are baked into a map to obtain the map file with the animation data, a blueprint comprising a plurality of action nodes is generated according to the map file, the action nodes are used for indicating positions of actions in the map file, and the character model is rendered according to the second material ball files, the map file and the blueprint.
According to the embodiment of the disclosure, the material ball files corresponding to the character model can be combined, the animation data can be baked into a map file, and the character model can be rendered through the combined material ball files, the map file, and the blueprint. On one hand, the number of material balls of the character model can be reduced, reducing the number of draw calls, improving rendering speed, and increasing the game frame rate; on the other hand, the animation data is baked into the map and the character's multiple actions are realized through the animation map, which reduces the computational pressure on the CPU and improves game performance.
It is noted that the above-described figures are merely schematic illustrations of processes involved in a method according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
In addition, in exemplary embodiments of the present disclosure, an apparatus for character model processing is also provided. Referring to fig. 6, an apparatus 600 for character model processing includes: a character model acquisition module 610, a texture ball file merging module 620, a map file baking module 630, a blueprint generation module 640, and a character model rendering module 650.
The character model acquisition module is used for acquiring a character model to be processed, and a plurality of first material ball files and animation data corresponding to the character model; the material ball file merging module is used for merging the plurality of first material ball files to obtain one or more second material ball files; the map file baking module is used for baking the animation data into a map to obtain a map file with the animation data; the blueprint generating module is used for generating a blueprint comprising a plurality of action nodes according to the mapping file, wherein the action nodes are used for indicating the positions of actions in the mapping file; and the character model rendering module is used for rendering the character model according to the second material ball file, the map file and the blueprint.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the action node is further configured to indicate a switching relationship between actions.
In one exemplary embodiment of the present disclosure, the action nodes include instance object random nodes, vector nodes, judgment nodes; the example object random nodes are used to randomly determine an initial action from the map file for the character model; the vector node is used for determining other actions except the initial action from the mapping file; the judging node is used for determining the action to be switched from the mapping file and the switching time.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the merging processing is performed on the plurality of first material ball files to obtain one or more second material ball files, where the apparatus further includes: a first merging unit, configured to merge the plurality of first material ball files according to the number of the plurality of first material ball files to obtain one or more second material ball files.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the merging processing is performed on the plurality of first material ball files according to the number of the plurality of first material ball files, to obtain one or more second material ball files, where the apparatus further includes: and the second merging unit is used for merging the plurality of first material ball files to obtain one or more second material ball files when the number of the plurality of first material ball files is larger than or equal to a first preset value.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the merging processing is performed on the plurality of first material ball files to obtain one or more second material ball files, where the apparatus further includes: and the third merging unit is used for merging the plurality of first material ball files according to the material ball attributes of the plurality of first material ball files to obtain one or more second material ball files.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the merging processing is performed on the plurality of first material ball files according to the material ball attributes of the plurality of first material ball files, to obtain one or more second material ball files, where the apparatus further includes: the similarity acquisition unit is used for acquiring the similarity of the material ball attributes among the plurality of first material ball files; and the fourth merging unit is used for merging the first material ball files with the similarity being greater than or equal to a second preset value to obtain one or more second material ball files.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the texture ball attribute is texture ball transparency.
In one exemplary embodiment of the present disclosure, the character model is a model in a cluster containing a plurality of character models based on the foregoing.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the apparatus further includes: and the action control unit is used for controlling actions of the plurality of character models through the centralized control architecture.
Since each functional module of the apparatus for character model processing according to the exemplary embodiment of the present disclosure corresponds to a step of the exemplary embodiment of the method for character model processing described above, for details not disclosed in the apparatus embodiments of the present disclosure, reference is made to the embodiment of the method for character model processing described above in the present disclosure.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above-described method of character model processing is also provided.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system."
An electronic device 700 according to such an embodiment of the present disclosure is described below with reference to fig. 7. The electronic device 700 shown in fig. 7 is merely an example and should not be construed to limit the functionality and scope of use of embodiments of the present disclosure in any way.
As shown in fig. 7, the electronic device 700 is embodied in the form of a general purpose computing device. Components of electronic device 700 may include, but are not limited to: the at least one processing unit 710, the at least one storage unit 720, a bus 730 connecting the different system components (including the storage unit 720 and the processing unit 710), and a display unit 740.
Wherein the storage unit stores program code that is executable by the processing unit 710 such that the processing unit 710 performs steps according to various exemplary embodiments of the present disclosure described in the above-described "exemplary methods" section of the present specification. For example, the processing unit 710 may perform step S210 shown in fig. 2, to acquire a character model to be processed, and a plurality of first material ball files and animation data corresponding to the character model; step S220, combining the plurality of first material ball files to obtain one or more second material ball files; step S230, baking the animation data into the map to obtain a map file with the animation data; step S240, generating a blueprint comprising a plurality of action nodes according to the mapping file, wherein the action nodes are used for indicating the positions of actions in the mapping file; and step S250, rendering the character model according to the second material ball file, the map file and the blueprint.
As another example, the electronic device may implement the various steps shown in fig. 2.
The storage unit 720 may include readable media in the form of volatile memory units, such as a random access memory (RAM) 721 and/or a cache memory 722, and may further include a read-only memory (ROM) 723.
The storage unit 720 may also include a program/utility 724 having a set (at least one) of program modules 725, such program modules 725 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 730 may be a bus representing one or more of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 700 may also communicate with one or more external devices 770 (e.g., keyboard, pointing device, bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 700, and/or any device (e.g., router, modem, etc.) that enables the electronic device 700 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 750. Also, electronic device 700 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet, through network adapter 760. As shown, network adapter 760 communicates with other modules of electronic device 700 over bus 730. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 700, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
From the above description of the embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in software combined with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions to cause a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, a computer-readable storage medium is also provided, on which a program product capable of implementing the method described above in this specification is stored. In some possible embodiments, the various aspects of the present disclosure may also be implemented in the form of a program product comprising program code; when the program product is run on a terminal device, the program code causes the terminal device to carry out the steps according to the various exemplary embodiments of the present disclosure described in the "exemplary methods" section of this specification.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electromagnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium, other than a readable storage medium, that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out the operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" programming language. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. Where a remote computing device is involved, it may be connected to the user's computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., via the Internet using an Internet service provider).
Furthermore, the above-described figures are only schematic illustrations of processes included in the method according to the exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (13)

1. A method of character model processing, the method comprising:
acquiring a character model to be processed, and a plurality of first material ball files and animation data corresponding to the character model;
merging the plurality of first material ball files to obtain one or more second material ball files;
baking the animation data into a map to obtain a map file with the animation data;
generating a blueprint comprising a plurality of action nodes according to the map file, wherein the action nodes are used for indicating the positions of actions in the map file; and
rendering the character model according to the second material ball file, the map file, and the blueprint.
2. The method of claim 1, wherein the action node is further configured to indicate a switching relationship between actions.
3. The method of claim 1, wherein the action nodes comprise an instance object random node, a vector node, and a decision node;
the instance object random node is used to randomly determine an initial action for the character model from the map file;
the vector node is used to determine actions other than the initial action from the map file; and
the decision node is used to determine, from the map file, the action to be switched to and the switching time.
4. The method of claim 1, wherein merging the plurality of first material ball files to obtain one or more second material ball files comprises:
merging the plurality of first material ball files according to the number of the plurality of first material ball files to obtain one or more second material ball files.
5. The method of claim 4, wherein merging the plurality of first material ball files according to the number of the plurality of first material ball files to obtain one or more second material ball files comprises:
merging the plurality of first material ball files to obtain one or more second material ball files when the number of the plurality of first material ball files is greater than or equal to a first preset value.
6. The method of claim 1, wherein merging the plurality of first material ball files to obtain one or more second material ball files comprises:
merging the plurality of first material ball files according to the material ball attributes of the plurality of first material ball files to obtain one or more second material ball files.
7. The method of claim 6, wherein merging the plurality of first material ball files according to the material ball attributes of the plurality of first material ball files to obtain one or more second material ball files comprises:
obtaining the similarity of material ball attributes among the plurality of first material ball files; and
merging first material ball files whose similarity is greater than or equal to a second preset value to obtain one or more second material ball files.
8. The method of claim 6, wherein the material ball attribute is material ball transparency.
9. The method of claim 1, wherein the character model is a model in a cluster, the cluster comprising a plurality of the character models.
10. The method of claim 9, further comprising:
controlling actions of the plurality of character models through a centralized control architecture.
11. An apparatus for character model processing, comprising:
a character model acquisition module, used for acquiring a character model to be processed, and a plurality of first material ball files and animation data corresponding to the character model;
a material ball file merging module, used for merging the plurality of first material ball files to obtain one or more second material ball files;
a map file baking module, used for baking the animation data into a map to obtain a map file with the animation data;
a blueprint generating module, used for generating a blueprint comprising a plurality of action nodes according to the map file, wherein the action nodes are used for indicating the positions of actions in the map file; and
a character model rendering module, used for rendering the character model according to the second material ball file, the map file, and the blueprint.
12. A computer-readable storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method according to any one of claims 1-10.
13. An electronic device, comprising:
a processor; and
a memory for storing one or more programs that, when executed by the processor, cause the processor to implement the method of any one of claims 1-10.
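Claims 2, 3, and 10 describe how the blueprint's action nodes drive action selection and switching. A minimal sketch follows, assuming each action node maps an action to a row range in the baked map file; the concrete behaviors (random initial pick, switching to the next action once the current one's frames are exhausted) are illustrative assumptions, not the patent's logic.

```python
import random

def pick_initial_action(action_nodes, rng=None):
    # Instance object random node (claim 3): randomly determine an
    # initial action for a character model from the baked map file.
    rng = rng or random.Random()
    return rng.choice(sorted(action_nodes))

def decide_switch(current_action, elapsed_frames, action_nodes):
    # Decision node (claim 3): determine the action to switch to and the
    # switching time. Here, as an assumed policy: switch to the next
    # action once the current action's frames in the map file run out.
    if elapsed_frames < action_nodes[current_action]["frame_count"]:
        return current_action  # keep playing the current action
    actions = sorted(action_nodes)
    return actions[(actions.index(current_action) + 1) % len(actions)]
```

For a cluster of character models (claims 9-10), a centralized controller would call `decide_switch` once per character per frame, with every character indexing into the same baked map file.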
CN202110902409.2A 2021-08-06 2021-08-06 Method, device, medium and electronic equipment for processing character model Active CN113590334B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110902409.2A CN113590334B (en) 2021-08-06 2021-08-06 Method, device, medium and electronic equipment for processing character model


Publications (2)

Publication Number Publication Date
CN113590334A CN113590334A (en) 2021-11-02
CN113590334B true CN113590334B (en) 2024-06-04

Family

ID=78255896

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110902409.2A Active CN113590334B (en) 2021-08-06 2021-08-06 Method, device, medium and electronic equipment for processing character model

Country Status (1)

Country Link
CN (1) CN113590334B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103136778A (en) * 2013-01-28 2013-06-05 吉林禹硕动漫游戏科技股份有限公司 Movie-level group animation manufacture method based on autonomous cache system
CN107103638A (en) * 2017-05-27 2017-08-29 杭州万维镜像科技有限公司 A kind of Fast rendering method of virtual scene and model
CN108961367A (en) * 2018-06-21 2018-12-07 珠海金山网络游戏科技有限公司 The method, system and device of role image deformation in the live streaming of three-dimensional idol
CN109272567A (en) * 2018-11-29 2019-01-25 成都四方伟业软件股份有限公司 Model optimization method and apparatus
CN109615683A (en) * 2018-08-30 2019-04-12 广州多维魔镜高新科技有限公司 A kind of 3D game animation model production method based on 3D dress form
CN109961498A (en) * 2019-03-28 2019-07-02 腾讯科技(深圳)有限公司 Image rendering method, device, terminal and storage medium
CN111028361A (en) * 2019-11-18 2020-04-17 杭州群核信息技术有限公司 Three-dimensional model and material merging method, device, terminal, storage medium and rendering method
CN111192353A (en) * 2019-12-30 2020-05-22 珠海金山网络游戏科技有限公司 Material generation method and device
CN111710020A (en) * 2020-06-18 2020-09-25 腾讯科技(深圳)有限公司 Animation rendering method and device and storage medium
CN112843704A (en) * 2021-03-12 2021-05-28 腾讯科技(深圳)有限公司 Animation model processing method, device, equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7106325B2 (en) * 2001-08-03 2006-09-12 Hewlett-Packard Development Company, L.P. System and method for rendering digital images having surface reflectance properties
US8411086B2 (en) * 2009-02-24 2013-04-02 Fuji Xerox Co., Ltd. Model creation using visual markup languages


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"Visualizing large-scale atomistic simulations in ultra-resolution immersive environments"; Khairi Reda et al.; 2013 IEEE Symposium on Large-Scale Data Analysis and Visualization (LDAV); full text *
"Research and Design of an Intelligent Manufacturing Simulation System Based on Unity3d"; Zhu Ran; China Master's Theses Full-text Database; main text, p. 44 *
"An Efficient Data Engine Construction Method Based on Model Porting"; Wang Min; Lv Qiang; Computer Engineering and Applications (35); full text *

Also Published As

Publication number Publication date
CN113590334A (en) 2021-11-02

Similar Documents

Publication Publication Date Title
KR102656970B1 (en) Virtual object selection methods and devices, devices, and storage media
CN111880877B (en) Animation switching method, device, equipment and storage medium
CN113350779A (en) Game virtual character action control method and device, storage medium and electronic equipment
CN111589135B (en) Virtual object control method, device, terminal and storage medium
EP3575958B1 (en) Object moving method and device, storage medium, and electronic device
CN113018862B (en) Virtual object control method and device, electronic equipment and storage medium
JP2024072870A (en) Server-based video help in video game
CN113590334B (en) Method, device, medium and electronic equipment for processing character model
KR102198838B1 (en) Apparatus and method for improving the processing speed of a game implementing multi-threaded
WO2024026206A1 (en) User sentiment detection to identify user impairment during game play providing for automatic generation or modification of in-game effects
CN111589115A (en) Visual field control method and device for virtual object, storage medium and computer equipment
EP4432237A1 (en) Animation frame display method and apparatus, device, and storage medium
CN111389007A (en) Game control method and device, computing equipment and storage medium
CN117695623A (en) Method and device for managing physical scene resources in virtual world and computer equipment
CN113262468B (en) Skill rendering method and device, electronic equipment and storage medium
CN117046111B (en) Game skill processing method and related device
CN115155057B (en) Interface display method and device, storage medium and electronic equipment
CN112587924B (en) Avoidance method and device for game AI, storage medium and computer equipment
US11876685B1 (en) Locally predicting state using a componentized entity simulation
US20240127520A1 (en) Dynamic interactions between avatar attachments
US20240127521A1 (en) Safety policy for contact interactions between avatars
WO2023226569A9 (en) Message processing method and apparatus in virtual scenario, and electronic device, computer-readable storage medium and computer program product
US20240261678A1 (en) Text extraction to separate encoding of text and images for streaming during periods of low connectivity
WO2024179194A1 (en) Virtual object generation method and apparatus, and device and storage medium
KR20240101669A (en) Loading state detection for gaming applications

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant