CN117456140A - Model processing method and device, electronic equipment and storage medium

Info

Publication number
CN117456140A
Authority
CN
China
Prior art keywords
color
map
mapping
dimensional model
pixel block
Prior art date
Legal status
Pending
Application number
CN202311269084.4A
Other languages
Chinese (zh)
Inventor
黄茂
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202311269084.4A
Publication of CN117456140A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 - Manipulating 3D models or images for computer graphics
    • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T 15/00 - 3D [Three Dimensional] image rendering
    • G06T 15/005 - General purpose rendering architectures
    • G06T 15/04 - Texture mapping


Abstract

The application discloses a model processing method and device, an electronic device and a storage medium, and relates to the technical field of image processing. The method comprises the following steps: obtaining an initial three-dimensional model, and splitting the initial three-dimensional model into a plurality of sub three-dimensional models; obtaining a color summary map; obtaining a UV unfolded view of the sub three-dimensional model, and laying out the UV unfolded view in a first mapping template according to the position of a color pixel block corresponding to the UV unfolded view in the color summary map to obtain a UV unfolded view map; obtaining a display attribute pixel block of the sub three-dimensional model, and laying out the display attribute pixel block in a second mapping template according to the position of the color pixel block corresponding to the UV unfolded view in the color summary map to obtain a display attribute map; and rendering the initial three-dimensional model according to the UV unfolded view map and the display attribute map to obtain a rendered three-dimensional model. In this way, the package size of the game model files can be reduced, the performance consumed by the game models is reduced, and the game experience of the user is improved.

Description

Model processing method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a method and apparatus for model processing, an electronic device, and a storage medium.
Background
In game production, in order to express the realism and detail of a game model, a virtual scene of a game is generally realized by building a three-dimensional model together with a large number of maps. Some game models have more details and may require large maps with a greater number of pixels. A game contains many game components, and each game component needs a large number of large maps, so that the size of the whole game package can exceed 10 GB. When multiple game components need to be loaded simultaneously, the superimposed calculation load can cause the game to lag or stutter and may even crash the running system, which affects the game experience of the user.
Disclosure of Invention
In view of this, embodiments of the present application provide a method, an apparatus, an electronic device, and a storage medium for model processing. The method provided by the embodiments of the present application can, to a certain extent, solve the problems in the prior art that the game package occupied by game models is large, the performance consumption of the game models is high, and the game experience of the user is affected.
An embodiment of the present application provides a method for model processing, where the method includes:
obtaining an initial three-dimensional model, and splitting the initial three-dimensional model into a plurality of sub three-dimensional models;
Obtaining a color summary map comprising color pixel blocks of different colors;
obtaining a UV unfolded view of the sub three-dimensional model, and laying out the UV unfolded view in a first mapping template according to the position of a color pixel block corresponding to the UV unfolded view in the color summary mapping to obtain a UV unfolded view mapping;
obtaining a display attribute pixel block of the sub three-dimensional model, and laying out the display attribute pixel block in a second mapping template according to the position of the color pixel block corresponding to the UV unfolded graph in the color summarizing mapping to obtain a display attribute mapping;
and rendering the initial three-dimensional model according to the UV unfolding map and the display attribute map to obtain a rendered three-dimensional model.
A second aspect of an embodiment of the present application provides an apparatus for model processing, including:
an initial model obtaining unit, configured to obtain an initial three-dimensional model, and split the initial three-dimensional model into a plurality of sub three-dimensional models;
a color summary map obtaining unit configured to obtain a color summary map, where the color summary map includes color pixel blocks of different colors;
a UV-unfolded-map-mapping obtaining unit, configured to obtain a UV unfolded map of the sub-three-dimensional model, and layout the UV unfolded map in a first mapping template according to a position of a color pixel block corresponding to the UV unfolded map in the color summary mapping, to obtain a UV unfolded-map;
A display attribute mapping obtaining unit, configured to obtain a display attribute pixel block of the sub three-dimensional model, and layout the display attribute pixel block in a second mapping template according to a position of a color pixel block corresponding to the UV-unfolded graph in the color summary mapping, so as to obtain a display attribute mapping;
and the rendering unit is used for rendering the initial three-dimensional model according to the UV unfolding map and the display attribute map to obtain a rendered three-dimensional model.
The third aspect of the embodiments of the present application further provides an electronic device, including:
a processor;
a memory;
the memory is used for storing a program of a method of model processing, which program, when read by a processor for execution, performs the method as described in the first aspect.
A fourth aspect of the embodiments of the present application also provides a computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, perform the method according to the first aspect.
Compared with the prior art, in the model processing method provided by the application, an initial three-dimensional model is obtained and split into a plurality of sub three-dimensional models; a color summary map is obtained from color pixel blocks of different colors; UV unwrapping is then performed on the sub three-dimensional models to obtain UV unfolded views, and the UV unfolded views are laid out in a first mapping template according to the positions of the color pixel blocks corresponding to the UV unfolded views in the color summary map, to obtain a UV unfolded view map; a display attribute pixel block of each sub three-dimensional model is further obtained and laid out in a second mapping template according to the position of the color pixel block corresponding to the UV unfolded view in the color summary map, to obtain a display attribute map; finally, the initial three-dimensional model is rendered according to the obtained UV unfolded view map and the display attribute map, and a rendered three-dimensional model is obtained. In this way, the initial three-dimensional model is split, the UV unfolded views of the sub three-dimensional models are obtained in turn, the UV unfolded view map is obtained, the display attribute map corresponding to the positions of the UV unfolded views is obtained, and the initial three-dimensional model is rendered according to the UV unfolded view map and the display attribute map. A plurality of attribute features of the three-dimensional model are thus integrated on a single set of maps, changing the traditional way of rendering a model with many separately pasted maps, reducing the package size of the game model, reducing the consumption of game performance, and improving the game experience of the user.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments of the present application will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person of ordinary skill in the art.
FIG. 1 is a schematic diagram of a prior art process for creating a generic model using a PBR flow;
FIG. 2 is a schematic diagram of an exemplary system architecture provided by an embodiment of the present application;
FIG. 3 is a flow diagram of a method of model processing provided by an embodiment of the present application;
FIG. 4 is a schematic illustration of an initial three-dimensional model provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a plurality of sub-three-dimensional models after splitting provided in an embodiment of the present application;
FIG. 6 is a schematic diagram of a color summary map provided by an embodiment of the present application;
FIG. 7 is a schematic illustration of another color summary map provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a categorized color summary map provided by an embodiment of the present application;
FIG. 9 is a detailed view of a UV expanded view of a pant sub-three-dimensional model provided by an embodiment of the present application;
FIG. 10 is a schematic diagram of all UV-expanded image pixel block maps for an initial three-dimensional model provided by an embodiment of the present application;
FIG. 11 is a schematic diagram of a display attribute map provided in an embodiment of the present application;
FIG. 12 is a schematic diagram of a rendered model provided by an embodiment of the present application;
FIG. 13 is a schematic diagram of a rendered model file memory obtained by the model processing method according to the embodiment of the present application;
FIG. 14 is a block diagram of a device of a model processing method provided in an embodiment of the present application;
fig. 15 is a schematic logic structure diagram of an electronic device according to an embodiment of the present application.
Specific embodiments of the present disclosure have been shown by way of the above drawings and will be described in more detail below. These drawings and the written description are not intended to limit the scope of the disclosed concepts in any way, but rather to illustrate the disclosed concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions of the present application, the present application is clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application. This application is not intended to be limited to the details of construction set forth in the following description; rather, it is intended to cover all modifications and variations that fall within the scope of the present application.
It should be noted that the terms "first," "second," "third," and the like in the claims, specification, and drawings herein are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. The data so used may be interchanged where appropriate to facilitate the embodiments of the present application described herein, and may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and their variants are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "and/or" is merely an association relationship describing an association object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship. "comprising A, B and/or C" means comprising any 1 or any 2 or 3 of A, B, C.
It should be understood that in the embodiments of the present application, "B corresponding to a", "a corresponding to B", or "B corresponding to a", means that B is associated with a, from which B may be determined. Determining B from a does not mean determining B from a alone, but may also determine B from a and/or other information.
Before describing the embodiments of the present application, a brief description of related background and nouns of the present application and problems existing in the prior art will be presented.
1. MMORPG: namely, a Massively Multiplayer Online Role-Playing Game, a category of online role-playing game distinguished by the large number of players who play simultaneously. In an MMORPG, a player may play one or more virtual roles and control the activities and behavior of those roles in the virtual world of the game. Typically, in an MMORPG, the total number of scene object models is about 3000, occupying about 6 GB of the game package; there are about 1000 character models (e.g., main characters, mounts, non-player characters, monsters, etc.), occupying about 5 GB of the game package. The total game package of a typical MMORPG therefore commonly exceeds 10 GB.
An MMORPG adopts a server-authoritative logic architecture: as far as possible, the local client only reports the player's operations, the server checks the legality of each operation, and the calculation result is finally sent back to the local client, which displays it. Taking skills as an example, the local client reports the ID (Identity) of the skill used by the player; the server checks whether the player can use the skill, checks whether the skill CD (Cool Down Time) is still cooling down, calculates the skill result (such as which monster the skill hits, how much health is lost, what other effects are caused, etc.), and finally returns the result to the client, which displays it.
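The flow just described can be sketched as follows. This is a minimal illustration, not part of the patent; all class, field and function names (Skill, Player, handle_cast_skill) are assumptions for illustration only.

```python
import time
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Skill:
    skill_id: int
    cooldown: float     # cool down time (CD), in seconds
    damage: int

@dataclass
class Player:
    learned: dict                                   # skill_id -> Skill the player knows
    last_cast: dict = field(default_factory=dict)   # skill_id -> timestamp of last cast

def handle_cast_skill(player: Player, skill_id: int, target_hp: int,
                      now: Optional[float] = None) -> dict:
    """Server side: validate the reported skill ID, check the CD, compute the result."""
    now = time.time() if now is None else now
    skill = player.learned.get(skill_id)
    if skill is None:                               # legality check
        return {"ok": False, "reason": "skill not learned"}
    if now - player.last_cast.get(skill_id, float("-inf")) < skill.cooldown:
        return {"ok": False, "reason": "skill still cooling down"}
    player.last_cast[skill_id] = now
    new_hp = max(0, target_hp - skill.damage)       # compute the skill result
    return {"ok": True, "damage": skill.damage, "target_hp": new_hp}  # returned to the client for display
```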
This increases the computational pressure on the server. When thousands of players act on the same screen at the same time, the different professions, different skills, different release times and different damage values of the different game models are all superimposed into a very large amount of calculation, which causes the game to lag and may even crash the running system. Likewise, when thousands of characters appear on the same screen, all of the different character models, monster models and scene items required for the whole screen must be loaded; even though loading each individual game model consumes only a little performance, together they pose a great challenge to the whole running system.
Therefore, if the size of the game package of an MMORPG can be reduced, it is very helpful for improving the running performance of the game.
2. PBR: the physical-Based rendering is a physical-Based rendering mode, provides an illumination and rendering method, and can more accurately depict the effect between light and a surface. The PBR process can be generally employed to model MMORPG games.
PBR rendering methods, in general, simulate real-time illumination and real reflection. The use of PBR for model rendering has the following advantages: first, PBR provides a stable art workflow; second, the rendered material appears to be substantially free of significant errors under all lighting conditions; third, creating texture attributes using the PBR method is no longer based on experience and intuition alone, but is based on real data.
PBR is composed of two parts: light attributes, including but not limited to direct illumination, indirect illumination, direct highlights, indirect highlights, shadows and ambient occlusion; and surface attributes, including but not limited to Color (Base Color/Albedo), Normal, highlights, Roughness and Metallic. In real life, the presented visual effect mainly depends on: the color the object shows under natural illumination (Base Color/Albedo); the specular reflection behavior of the object surface towards light (Roughness); and the ratio of specular to diffuse reflection of light at the object surface (Metallic/Specular).
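For concreteness, the surface attributes listed above can be collected in a small container like the following. This is a minimal sketch; the PBRMaterial name and the default values are assumptions, not part of the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class PBRMaterial:
    base_color: Tuple[float, float, float] = (1.0, 1.0, 1.0)  # Base Color / Albedo under natural illumination
    normal: Tuple[float, float, float] = (0.0, 0.0, 1.0)      # tangent-space Normal
    roughness: float = 0.5                                     # Roughness: 0 = mirror-like, 1 = fully diffuse
    metallic: float = 0.0                                      # Metallic: ratio of specular to diffuse response
```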
3. The process of making a model with a PBR flow in the prior art, and its defects:
in general, a game model made with a PBR process consists of multiple materials and requires strict mapping specifications: at least three maps are needed for each material, namely a base color map, a normal map and an M map (a combined roughness + metallic + AO (ambient occlusion) map), so the amount of map resources used is huge.
As shown in fig. 1, which is a schematic diagram of a conventional model made with a PBR process in the prior art. Fig. 1 (a) is the final effect of the game model, fig. 1 (b) shows the overall UV layout of the body part model in fig. 1 (a), fig. 1 (c) shows the full set of maps required to make fig. 1 (a), and fig. 1 (d) shows the memory size occupied by the files of this game model.
Specifically, after determining the target game model, performing UV expansion on the target game model to obtain a UV expansion result, as shown in fig. 1 (b); and mapping all the maps shown in the figure 1 (c) to corresponding positions of the UV unfolding results according to the UV unfolding results to obtain a mapped target game model, wherein the mapped target game model is used as a rendered game model as shown in the figure 1 (a).
As can be seen from the file attributes shown in fig. 1 (d), even a simple game model already occupies more than 8 MB of memory. In an MMORPG, thousands of game models exist, so the consumed memory space is huge and the game package is quite large; it is therefore necessary to reduce the memory occupied by the game package while not greatly affecting the display effect of the game models.
4. UV mapping: UV refers to the U and V texture map coordinates (also called texture mapping coordinates), which define the location of each point in a two-dimensional image. In a game model, the UVs precisely correspond each point of the image to the surface of the game model, so that the game model presents the corresponding visual effect. UV expansion (UV unwrapping) is a representation that converts a 3D (Three-Dimensional) game model into a 2D (Two-Dimensional) plane.
A game model is generally rendered by mapping: UV unwrapping is performed on the game model to obtain a UV unfolding result; the map is then mapped to the corresponding positions according to the UV unfolding result to obtain a mapped game model, which is used as the rendered model.
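As a minimal, hedged illustration of what "mapping the map to the corresponding position" means, the following nearest-neighbour lookup converts a (u, v) coordinate in [0, 1] into a texel of a 2D map. The sample_map helper is illustrative only, not an API from the patent.

```python
def sample_map(texture, u: float, v: float):
    """Nearest-neighbour lookup: texture is a 2D array-like of shape (height, width, channels)."""
    height, width = len(texture), len(texture[0])
    x = min(int(u * width), width - 1)      # clamp so that u = 1.0 still falls inside the map
    y = min(int(v * height), height - 1)    # clamp so that v = 1.0 still falls inside the map
    return texture[y][x]
```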
5. LowPoly: LowPoly (low polygon) is a method of building a model with a low number of polygons; the resulting low-poly model has fewer faces and less detail than a high-poly model. A LowPoly image is essentially a multi-colored image divided into triangles, with the color of each triangle taken from the corresponding location of the original image. Because LowPoly triangles are filled with solid colors and shape recognition depends on color shading, adjacent triangles cannot use exactly the same color and need to alternate in shade.
Before describing the method for model processing provided in the embodiments of the present application, an application scenario of the method for model processing applicable to the embodiments of the present application is described below with reference to fig. 2.
Fig. 2 illustrates an exemplary system architecture 200 of a model processing method provided in an embodiment of the present application, the system comprising: terminal devices 201, 202, 203, a network 204 and a background server 205. The network 204 is the medium used to provide the communication links between the terminal devices 201, 202, 203 and the background server 205. The network 204 may include various connection types, such as wired links, wireless communication links, or fiber optic cables. A user may use the terminal devices 201, 202, 203 to interact with the background server 205 via the network 204, i.e. to send or receive data of the model processing. The terminal devices 201, 202, 203 may be various electronic devices having a display screen and supporting page browsing, including but not limited to smartphones, tablets, laptop and desktop computers, etc. The background server 205 may be a background server providing various services, such as a background server providing support for the model processing performed by users on the terminal devices 201, 202, 203. The background server may perform analysis or other processing on the received data such as operations or modifications, and feed the processing results back to the terminal devices 201, 202, 203, so that the processing results are displayed on the terminal devices.
It should be noted that, the method for model processing provided in the embodiments of the present application is generally executed by the server 205, and accordingly, the device for model processing is generally deployed in the server 205.
It will be appreciated that the number of terminal devices, networks and background servers shown in fig. 2 is merely illustrative, and that any number of terminal devices, networks and background servers may be provided in actual practice as desired for implementation.
The first embodiment of the application provides a method for model processing. Fig. 3 is a schematic flow chart of a method for model processing according to the first embodiment of the present application.
A method of model processing according to the first embodiment of the present application will be described in detail with reference to fig. 3. It should be noted that the steps illustrated in the flowchart may be performed in a computer system, such as a set of computer-executable instructions, and in some cases, the steps illustrated may be performed in a different logical order than that illustrated in the flowchart.
Step S301, an initial three-dimensional model is obtained, and the initial three-dimensional model is split into a plurality of sub three-dimensional models.
The method comprises the steps of obtaining an initial three-dimensional model and splitting the obtained initial three-dimensional model into a plurality of sub three-dimensional models.
In this embodiment, a white model (an untextured base mesh) with suitable proportions and structure can be made using LowPoly as the initial three-dimensional model to be processed. As shown in fig. 4, which is a schematic diagram of an initial three-dimensional model according to the first embodiment of the present application. The initial three-dimensional model is a simple white model, i.e. the game model before rendering. The initial three-dimensional model includes one or more of the following: a character model, a scene model, a UI element.
After the initial three-dimensional model is obtained, the initial three-dimensional model is split according to the model material types to obtain a plurality of sub three-dimensional models. The fineness finally desired for the rendered three-dimensional model is related to the number of sub three-dimensional models split here. For the same initial three-dimensional model, the more sub three-dimensional models are split out according to model materials, the finer the rendered three-dimensional model finally obtained, but the more game resources are consumed and the larger the game package. Therefore, when choosing the number of split sub three-dimensional models, it is necessary to balance the consumed game resources against the fineness of the finally obtained three-dimensional model. The specific degree of splitting is not limited here, and the user can set it according to actual demands.
In this embodiment, as shown in fig. 5, which is a schematic diagram of the plurality of split sub three-dimensional models provided in the first embodiment of the present application. The initial three-dimensional model is split into 38 sub-models, and the split sub three-dimensional models are stored in separate files and named in turn. The naming convention can be set by the user. In particular, a name can first distinguish the large class of the model, such as body parts and head parts, and then be combined with the specific part. For example, "body_weijin1" in fig. 4 is the scarf of the body part, and "tou_eyes00" is one eye of the head. It will be appreciated that the specific naming scheme is not limited in this embodiment, as long as the user can identify the sub-models.
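A hedged sketch of this splitting-and-naming step follows. The Face type and helper functions are illustrative assumptions; an actual pipeline would do this in a DCC tool or engine exporter rather than in plain Python.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Face:
    vertices: tuple   # indices into the model's vertex list
    material: str     # material/part name, e.g. "body_weijin1" or "tou_eyes00"

def split_by_material(faces):
    """Group faces by material type; each group becomes one sub three-dimensional model."""
    groups = defaultdict(list)
    for face in faces:
        groups[face.material].append(face)
    return dict(groups)

def sub_model_filenames(groups, out_dir):
    """Name each sub-model file after its part, following the naming idea described above."""
    return {name: f"{out_dir}/{name}.obj" for name in groups}
```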
This step obtains an initial three-dimensional model and splits it into a plurality of sub three-dimensional models, providing a data basis for the subsequent processing of the plurality of sub three-dimensional models.
In step S302, a color summary map is obtained, the color summary map comprising color pixel blocks of different colors.
The step is used for obtaining a color summary map, wherein the color summary map comprises color pixel blocks with different colors. It should be noted that the color summary map is a summary of all color maps required for multiple game models in a game scene, and a developer can continuously add new pixel blocks in the blank area of the color summary map. The color summary map may be a blank map or a map containing some commonly used color pixel blocks at the beginning of the summary of the color summary map.
Fig. 6 is a schematic diagram of a color summary map according to the first embodiment of the present application. In this example, the color summary map is initially an empty map and is built up from the color attribute maps required for mapping the sub three-dimensional models. For the initial three-dimensional model, the total number of split sub three-dimensional models is 38, and the color attribute to be set for each sub three-dimensional model is different, so 38 color pixel blocks are required when constructing the color summary map.
Fig. 7 is a schematic diagram of another color summary map according to an embodiment of the disclosure. This color summary map gathers the color maps of a plurality of game models, and the color maps contain a plurality of color pixel blocks. Since the image itself is colored and fig. 7 is a black-and-white conversion of a color image, the different colors of the pixel blocks can be told apart by their brightness. Because the color pixel blocks in the color summary map are numerous and are all monochrome pixel blocks, in order to save memory and reduce the size of the game package, each monochrome pixel block in the color summary map shown in fig. 7 can occupy only one pixel of memory. It will be appreciated that the size of a monochrome pixel block can also be another preset size.
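The following is a minimal sketch of packing monochrome color blocks, one pixel each, into a summary map. The row-major layout and the 64-pixel width are assumptions made for illustration only.

```python
import numpy as np

def build_color_summary_map(colors, width=64):
    """Pack each RGB color into a single pixel of a width x height summary map."""
    height = max(1, (len(colors) + width - 1) // width)   # enough rows for all color blocks
    atlas = np.zeros((height, width, 3), dtype=np.uint8)  # blank map template
    for index, rgb in enumerate(colors):
        row, col = divmod(index, width)
        atlas[row, col] = rgb                             # one monochrome pixel block = one pixel
    return atlas

# e.g. 38 sub three-dimensional models -> 38 single-pixel color blocks in the summary map
```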
To facilitate the operation of different developers, pixel blocks may be categorized according to the kind of color pixel block. Fig. 8 is a schematic diagram of a classified color summary map according to an embodiment of the present application. In this example of color summary mapping, all cloth, leather, plastic color blocks are classified as one type, all metallic color blocks are classified as one type, all skin color blocks are classified as one type, and all clay, stone color blocks are also classified as one type. The pixel blocks of different types are classified into one type, so that a developer can conveniently and quickly identify whether the color summarization map has the color pixel blocks required by the model rendering or not, and if not, the color summarization map can be timely adjusted.
It should be noted that the text content in fig. 8 is merely schematic; since text labels would consume part of the memory, the actual color summary map may carry no text, and a developer can recognize the type of a color region from the maps classified into the same category.
The step is used for obtaining the color summarization map, and color pixel blocks with different colors are included in the color summarization map, so that the color pixel blocks with different colors can be conveniently provided for the sub-three-dimensional model.
Step S303, a UV unfolded view of the sub three-dimensional model is obtained, and the UV unfolded view is laid out in a first mapping template according to the position of a color pixel block corresponding to the UV unfolded view in the color summary mapping, so that a UV unfolded view mapping is obtained.
The method comprises the steps of obtaining a UV unfolded view of the sub three-dimensional model, and laying out the UV unfolded view in a first mapping template according to the position of a color pixel block corresponding to the UV unfolded view in the color summary mapping to obtain a UV unfolded view mapping.
The split sub three-dimensional model is a 3D model, which needs to be converted into a 2D representation in order to meet the subsequent rendering requirements. In this embodiment, a UV unfolded view of the sub three-dimensional model may be obtained by UV unwrapping. Specifically, this step may include the following sub-steps:
Setting the size of the color pixel block to be a first size;
obtaining a UV unfolded view of the sub three-dimensional model, and reducing the size of the UV unfolded view to a first size to obtain a UV unfolded view pixel block;
and according to the position of the color pixel block corresponding to the UV unfolded image pixel block in the color summarizing mapping, arranging the UV unfolded image pixel block in a first mapping template to obtain a UV unfolded image pixel block mapping, wherein the first mapping template is a blank mapping.
In specific implementation, when the size of the color pixel block is set to be one pixel, sequentially performing UV unfolding operation on the plurality of sub-three-dimensional models to obtain UV unfolding graphs of the plurality of sub-three-dimensional models, and respectively shrinking the UV unfolding graphs of the plurality of sub-three-dimensional models to be one pixel size to obtain the UV unfolding image pixel block.
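One way to read the "reduce the UV unfolded view to a one-pixel block" step is to scale each sub-model's UV shell into the single-pixel cell whose (row, col) matches its color pixel block in the summary map. The sketch below is an assumption-laden illustration of that reading, not the patent's implementation; the function name and atlas_size convention are invented for clarity.

```python
def fit_uvs_into_block(uvs, block_pos, atlas_size):
    """Scale and translate a sub-model's UV shell so it fits inside the one-pixel cell
    whose (row, col) equals the position of the corresponding color pixel block."""
    rows, cols = atlas_size
    row, col = block_pos
    us = [u for u, _ in uvs]
    vs = [v for _, v in uvs]
    u_min, v_min = min(us), min(vs)
    span_u = (max(us) - u_min) or 1.0       # avoid division by zero for degenerate shells
    span_v = (max(vs) - v_min) or 1.0
    cell_w, cell_h = 1.0 / cols, 1.0 / rows
    return [(col * cell_w + (u - u_min) / span_u * cell_w,
             row * cell_h + (v - v_min) / span_v * cell_h)
            for u, v in uvs]                # every vertex now samples the same pixel block
```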
As shown in fig. 9, it is a detailed view of the UV development view of the trousers sub-three-dimensional model provided in the first embodiment of the present application. The left side of fig. 9 is a schematic diagram of a trousers sub-three-dimensional model in the selected split sub-three-dimensional model, and UV unfolding operation is performed on the trousers sub-three-dimensional model to obtain a UV unfolding diagram as shown on the right side of fig. 9. And reducing the UV expansion map of the trousers sub-three-dimensional model to a pixel size to obtain a UV expansion map pixel block of the trousers sub-three-dimensional model. As shown in fig. 10, a schematic diagram of a pixel block map of all UV unfolded images for an initial three-dimensional model according to the first embodiment of the present application is shown. And arranging the UV unfolded image pixel blocks of all the sub three-dimensional models according to a certain rule to obtain a UV unfolded image pixel block map. The highlight position indicated by the arrow is a schematic diagram of the UV-unfolding image pixel block of the trousers submodel of fig. 9.
After the UV unfolded image pixel blocks of the plurality of sub three-dimensional models are obtained, arranging all unfolded image pixel blocks according to a certain rule, and laying out UV unfolded image mapping in a first mapping template.
The specific layout of the UV-unfolded map in the first map template may be as follows: the color pixel blocks of different colors in the color summary map occupy different positions, and the positions of the corresponding UV unfolded map or planar pixel submodel are determined according to the positions of the color pixel blocks in the color summary map, namely, the positions of the UV unfolded map pixel blocks are in one-to-one correspondence with the positions of the color pixel blocks in the color summary map.
Continuing with the above example, in the color summary map of this embodiment, there are 38 color pixel blocks, and all the color pixel blocks are arranged in a certain order, after the UV-developed image pixel blocks of the initial three-dimensional model are obtained, the UV-developed image pixel blocks are laid out at corresponding positions according to the colors required to be set for the UV-developed image pixel blocks, so as to obtain the UV-developed image pixel block map.
For example, in one example, the color pixel blocks required for the pant sub-model in fig. 9 are the color pixel blocks at the first pixel point position of the third row in fig. 10, and then the UV expanded image pixel blocks of the pant sub-model are also laid out at the first pixel point position of the third row in the first mapping template.
For example, when the color summary map includes the color pixel blocks of a plurality of different game models (for example, 3000 blocks), but only 30 sub three-dimensional models are obtained after splitting the current three-dimensional model, UV unwrapping is performed on the 30 sub three-dimensional models to obtain the UV unfolded image pixel block map, and the 30 UV unfolded image pixel blocks need to correspond to the positions of the 30 required color pixel blocks in the color summary map. It will be appreciated that in this case there are a large number of blank pixels in the UV unfolded image pixel block map, and the 30 UV unfolded image pixel blocks are laid out at the same positions as the required color pixel blocks.
And if the color summarization map does not contain the color pixel blocks corresponding to the UV unfolded map of the sub three-dimensional model, generating the color pixel blocks corresponding to the UV unfolded map, and filling the color pixel blocks corresponding to the UV unfolded map into the color summarization map.
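A hedged sketch of this fallback is given below. It assumes, purely for illustration, that a pure-black pixel marks an unused slot in the summary map and that free slots are searched in row-major order; neither assumption comes from the patent.

```python
import numpy as np

def find_or_add_color(summary_map: np.ndarray, rgb) -> tuple:
    """Return the (row, col) of rgb in the summary map, appending it if it is missing."""
    rows, cols, _ = summary_map.shape
    occupied = summary_map.any(axis=-1)            # assumption: pure black = free slot
    for r in range(rows):
        for c in range(cols):
            if occupied[r, c] and tuple(summary_map[r, c]) == tuple(rgb):
                return (r, c)                      # color pixel block already present
    for r in range(rows):
        for c in range(cols):
            if not occupied[r, c]:
                summary_map[r, c] = rgb            # fill the generated color pixel block in
                return (r, c)
    raise ValueError("color summary map is full")
```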
In the above example, fig. 6 is a schematic diagram of the color summary map, and fig. 10 is a schematic diagram of the obtained UV unfolded image pixel block map. The color pixel blocks in fig. 6 correspond one-to-one with the positions of the UV unfolded image pixel blocks in fig. 10.
The method comprises the steps of obtaining a UV unfolded view of the sub three-dimensional model, and laying out the UV unfolded view in a first mapping template according to the position of a color pixel block corresponding to the UV unfolded view in the color summary mapping to obtain a UV unfolded view mapping. And processing the split sub three-dimensional model to obtain the UV unfolded image map constructed by the UV unfolded image pixel blocks corresponding to the positions of the color pixel blocks.
And S304, obtaining a display attribute pixel block of the sub three-dimensional model, and laying out the display attribute pixel block in a second mapping template according to the position of the color pixel block corresponding to the UV unfolded graph in the color summary mapping to obtain a display attribute mapping.
The step is used for obtaining a display attribute pixel block of the sub three-dimensional model, and laying out the display attribute pixel block in a second mapping template according to the position of the color pixel block corresponding to the UV unfolded graph in the color summarizing mapping to obtain a display attribute mapping.
Wherein the display attribute pixel block includes one or more of: color attribute pixel blocks, normal pixel blocks, or texture pixel blocks.
Similar to the method of obtaining the UV unfolded map, this step also requires one or more of a color attribute map, a normal attribute map, and a texture attribute map to be obtained.
Specifically, it can be obtained by:
and obtaining a color attribute pixel block of the sub three-dimensional model, and laying out the color attribute pixel block in a first sub mapping template of a second mapping template according to the coordinate information of the color pixel block corresponding to the UV unfolded graph in the color summary mapping to obtain a color attribute mapping.
And/or obtaining a normal attribute pixel block of the sub-three-dimensional model, and laying out the normal attribute pixel block in a second sub-mapping template of a second mapping template according to the coordinate information of the color pixel block corresponding to the UV unfolded graph in the color summary mapping to obtain a normal attribute mapping.
And/or obtaining a texture attribute pixel block of the sub three-dimensional model, and laying out the texture attribute pixel block in a third sub-mapping template of the second mapping template according to the coordinate information of the color pixel block corresponding to the UV unfolded graph in the color summary mapping, to obtain a texture attribute mapping.
Continuing with the above example, after the UV unfolded map is obtained: since the color summary map in this embodiment is built on a blank map, the color summary map itself serves as the color attribute map of this embodiment. When the color summary map contains more color pixel blocks (for example, blocks belonging to other game models), the color attribute map needs to be obtained according to the coordinate information of the color pixel blocks corresponding to the UV unfolded map in the color summary map.
In the case where the display attribute pixel block includes the color attribute pixel block, after obtaining the color attribute map, the method further includes:
And adjusting at least one of hue, brightness and saturation of the color attribute map.
For the normal attribute map, a normal attribute pixel block of the sub three-dimensional model is obtained, and the normal attribute pixel block is laid out in the second sub-mapping template of the second mapping template according to the coordinate information of the color pixel block corresponding to the UV unfolded graph in the color summary mapping, to obtain the normal attribute map.
For the texture attribute map, a texture attribute pixel block of the sub three-dimensional model is obtained, and the texture attribute pixel block is laid out in the third sub-mapping template of the second mapping template according to the coordinate information of the color pixel block corresponding to the UV unfolded graph in the color summary mapping, to obtain the texture attribute map.
Fig. 11 is a schematic diagram of a display attribute map according to the first embodiment of the present application. Fig. 11 (a) is a color attribute map, fig. 11 (b) is a normal attribute map, and fig. 11 (c) is a texture attribute map.
The method is used for obtaining the display attribute map of the sub three-dimensional model and provides basis for rendering the model by using the display attribute map.
And step S305, rendering the initial three-dimensional model according to the UV unfolded map and the display attribute map to obtain a rendered three-dimensional model.
The step is used for rendering the initial three-dimensional model according to the UV unfolding map and the display attribute map to obtain a rendered three-dimensional model.
And rendering the initial three-dimensional model through the UV unfolding mapping and one or more of the color mapping, the normal mapping and the texture mapping to obtain a rendered three-dimensional model.
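As a hedged illustration of this rendering step (reusing the illustrative sample_map helper sketched in the UV mapping note above), each vertex simply samples every attribute map at its collapsed UV coordinate; in practice this would run per fragment in a shader rather than in Python.

```python
def shade_vertex(uv, color_map, normal_map, texture_map):
    """uv comes from the UV unfolded map: every vertex of a sub-model shares one pixel block."""
    u, v = uv
    base_color = sample_map(color_map, u, v)   # color attribute
    normal = sample_map(normal_map, u, v)      # normal attribute
    packed = sample_map(texture_map, u, v)     # texture attribute (packed channels)
    return base_color, normal, packed
```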
After obtaining the rendered three-dimensional model, the method further comprises:
and adjusting the display attribute pixel blocks in the display attribute map to obtain an adjusted three-dimensional model.
The adjusting of the display attribute pixel block in the display attribute map includes one or more of: adjusting the roughness of the texture attribute map through the R channel of the display attribute pixel block; adjusting the metalness of the texture attribute map through the G channel of the display attribute pixel block; adjusting the displacement of the texture attribute map through the B channel of the display attribute pixel block; and adjusting the transparency of the texture attribute map through the A channel of the display attribute pixel block.
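A minimal sketch of how such per-channel adjustment could be applied to a packed RGBA texture attribute block is shown below. The scale-factor interface and the helper name are assumptions for illustration, not an API from the patent.

```python
import numpy as np

def adjust_texture_attributes(rgba_block: np.ndarray,
                              roughness_scale: float = 1.0,
                              metallic_scale: float = 1.0,
                              displacement_scale: float = 1.0,
                              alpha_scale: float = 1.0) -> np.ndarray:
    """rgba_block: uint8 array of shape (..., 4) holding the R, G, B, A channels."""
    out = rgba_block.astype(np.float32)
    out[..., 0] *= roughness_scale      # R channel: roughness
    out[..., 1] *= metallic_scale       # G channel: metalness
    out[..., 2] *= displacement_scale   # B channel: displacement
    out[..., 3] *= alpha_scale          # A channel: transparency
    return np.clip(out, 0, 255).astype(np.uint8)
```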
As shown in fig. 12, a schematic diagram of a rendered model according to an embodiment of the present application is provided. Continuing with the above example, the initial three-dimensional model is rendered by the obtained color attribute map, normal attribute map, and texture attribute map, and a rendered three-dimensional model is obtained.
Fig. 13 is a schematic diagram of the file memory of the rendered model obtained by the model processing method according to the embodiment of the present application. The size of the three-dimensional model file rendered in the above manner is only 124 KB. Compared with the 8 MB file size in the prior art, the memory occupied by the rendered three-dimensional model is greatly reduced. When thousands of game models exist in a game, the memory they occupy is greatly reduced, which reduces the size of the game package.
The first embodiment of the application provides a model processing method: an initial three-dimensional model is obtained and split into a plurality of sub three-dimensional models; a color summary map is obtained from color pixel blocks of different colors; UV unwrapping is performed on the sub three-dimensional models to obtain UV unfolded views, which are laid out in a first mapping template according to the positions of the color pixel blocks corresponding to the UV unfolded views in the color summary map, to obtain a UV unfolded view map; display attribute pixel blocks of the sub three-dimensional models are further obtained and laid out in a second mapping template according to the positions of the color pixel blocks corresponding to the UV unfolded views in the color summary map, to obtain a display attribute map; finally, the initial three-dimensional model is rendered according to the obtained UV unfolded view map and the display attribute map, thereby obtaining a rendered three-dimensional model. In this way, the initial three-dimensional model is split, the UV unfolded views of the sub three-dimensional models are obtained in turn, the UV unfolded view map is obtained, the display attribute map corresponding to the positions of the UV unfolded views is obtained, and the initial three-dimensional model is rendered according to the UV unfolded view map and the display attribute map. A plurality of attribute features of the three-dimensional model are thus integrated on a single set of maps, changing the traditional way of rendering a model with many separately pasted maps, reducing the package size of the game model, reducing the consumption of game performance, and improving the game experience of the user.
A second embodiment of the present application provides a device for model processing, which corresponds to the method for model processing provided in the first embodiment of the present application, and is only briefly described here. For details not described in this embodiment, reference may be made to the first embodiment.
Please refer to fig. 14, which is a block diagram of the apparatus provided in this embodiment of the present application.
This embodiment of the present application provides an apparatus 1400 for model processing, the apparatus comprising: an initial model obtaining unit 1401, a color summary map obtaining unit 1402, a UV unfolded map obtaining unit 1403, a display attribute map obtaining unit 1404, and a rendering unit 1405.
An initial model obtaining unit 1401, configured to obtain an initial three-dimensional model, and split the initial three-dimensional model into a plurality of sub three-dimensional models;
a color summary map obtaining unit 1402 configured to obtain a color summary map, the color summary map including color pixel blocks of different colors;
a UV-unfolded-map obtaining unit 1403, configured to obtain a UV unfolded map of the sub-three-dimensional model, and layout the UV unfolded map in a first map template according to the position of the color pixel block corresponding to the UV unfolded map in the color summary map, to obtain a UV unfolded-map;
A display attribute map obtaining unit 1404, configured to obtain a display attribute pixel block of the sub three-dimensional model, and layout the display attribute pixel block in a second map template according to a position of the color pixel block corresponding to the UV-unfolded map in the color summary map, to obtain a display attribute map;
and a rendering unit 1405, configured to render the initial three-dimensional model according to the UV-unfolded map and the display attribute map, to obtain a rendered three-dimensional model.
Optionally, the size of the color pixel block is a first size; the step of obtaining a UV-unfolded view of the sub-three-dimensional model, the step of laying out the UV-unfolded view in a first mapping template according to the position of a color pixel block corresponding to the UV-unfolded view in the color summary mapping, and the step of obtaining a UV-unfolded view mapping comprises the following steps: obtaining a UV unfolded view of the sub three-dimensional model, and reducing the size of the UV unfolded view to a first size to obtain a UV unfolded view pixel block; and according to the position of the color pixel block corresponding to the UV unfolded image pixel block in the color summarization mapping, arranging the UV unfolded image pixel block in a first mapping template to obtain a UV unfolded image pixel block mapping.
Optionally, the display attribute pixel block includes one or more of: color attribute pixel blocks, normal pixel blocks, or texture pixel blocks; the obtaining a display attribute pixel block of the sub three-dimensional model, laying out the display attribute pixel block in a second mapping template according to the position of the color pixel block corresponding to the UV-unfolded graph in the color summary mapping, and obtaining a display attribute mapping, including one or more of the following steps: obtaining a color attribute pixel block of the sub three-dimensional model, and laying out the color attribute pixel block in a first sub mapping template of a second mapping template according to the coordinate information of the color pixel block corresponding to the UV unfolded graph in the color summarizing mapping to obtain a color attribute mapping; obtaining a normal attribute pixel block of the sub three-dimensional model, and laying out the normal attribute pixel block in a second sub mapping template of a second mapping template according to coordinate information of a color pixel block corresponding to the UV unfolded graph in the color summarizing mapping to obtain a normal attribute mapping; and obtaining texture attribute pixel blocks of the sub three-dimensional model, and laying out the texture attribute pixel blocks in a third sub mapping template of a second mapping template according to the coordinate information of the color pixel blocks corresponding to the UV unfolded graph in the color summary mapping to obtain texture attribute mapping.
Optionally, the rendering the initial three-dimensional model according to the UV-unfolded-map and the display attribute-map to obtain a rendered three-dimensional model includes: and rendering the initial three-dimensional model through the UV unfolding mapping and one or more of the color mapping, the normal mapping and the texture mapping to obtain a rendered three-dimensional model.
Optionally, the method further comprises: and if the color summarization map does not contain the color pixel blocks corresponding to the UV unfolded map of the sub three-dimensional model, generating the color pixel blocks corresponding to the UV unfolded map, and filling the color pixel blocks corresponding to the UV unfolded map into the color summarization map.
Optionally, in a case where the display attribute pixel block includes the color attribute pixel block, after obtaining the color attribute map, the method further includes: and adjusting at least one of hue, brightness and saturation of the color attribute map.
Optionally, after obtaining the rendered three-dimensional model, the method further comprises: and adjusting the display attribute pixel blocks in the display attribute map to obtain an adjusted three-dimensional model.
Optionally, the adjusting of the display attribute pixel block in the display attribute map includes one or more of: adjusting the roughness of the texture attribute map through the R channel of the display attribute pixel block; adjusting the metalness of the texture attribute map through the G channel of the display attribute pixel block; adjusting the displacement of the texture attribute map through the B channel of the display attribute pixel block; and adjusting the transparency of the texture attribute map through the A channel of the display attribute pixel block.
Optionally, the initial three-dimensional model includes one or more of: character model, scene model, UI element.
The fourth embodiment of the present application also provides an electronic device corresponding to the method of the first embodiment of the present application. Fig. 15 is a schematic view of an electronic device according to this embodiment of the present application. The electronic device includes: at least one processor 1501, at least one communication interface 1502, at least one memory 1503, and at least one communication bus 1504. Alternatively, the communication interface 1502 may be an interface of a communication module, such as an interface of a GSM module; the processor 1501 may be a CPU, an ASIC (Application Specific Integrated Circuit), or one or more integrated circuits configured to implement embodiments of the present application. The memory 1503 may comprise high-speed RAM memory or may also comprise non-volatile memory, such as at least one disk memory. The memory 1503 stores a program, and the processor 1501 calls the program stored in the memory 1503 to execute the method of the first embodiment of the present application.
An embodiment of the present application also provides a computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, perform the method of model processing described above.
In the foregoing embodiments of the present application, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
While the invention has been described in terms of preferred embodiments, it is not intended to be limiting, but rather, it will be apparent to those skilled in the art that various changes and modifications can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.
It should be noted that the embodiments of the present application may involve the use of user data. In practical applications, user-specific personal data may be used in the solutions described herein within the scope permitted by the applicable laws and regulations of the relevant country (for example, with the user's explicit consent after the user has been actually notified, etc.).
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or fully authorized by each party, and the collection, use and processing of the related data need to comply with the related laws and regulations and standards of the related country and region, and provide corresponding operation entries for the user to select authorization or rejection.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random Access Memory (RAM) and/or nonvolatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of computer-readable media.
1. Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of storage media for a computer include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical storage) having computer-usable program code embodied therein.

Claims (12)

1. A method of model processing, comprising:
obtaining an initial three-dimensional model, and splitting the initial three-dimensional model into a plurality of sub three-dimensional models;
obtaining a color summary map comprising color pixel blocks of different colors;
obtaining a UV unfolded view of the sub three-dimensional model, and laying out the UV unfolded view in a first mapping template according to a position, in the color summary map, of a color pixel block corresponding to the UV unfolded view, to obtain a UV unfolded map;
obtaining a display attribute pixel block of the sub three-dimensional model, and laying out the display attribute pixel block in a second mapping template according to the position, in the color summary map, of the color pixel block corresponding to the UV unfolded view, to obtain a display attribute map;
and rendering the initial three-dimensional model according to the UV unfolded map and the display attribute map to obtain a rendered three-dimensional model.
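For illustration only: the layout step of claim 1 can be pictured as a color summary map built from a grid of uniform-color blocks, with the first and second mapping templates reusing that same grid, so every sub model's UV block and attribute block land in the slot occupied by its color block. The sketch below is a minimal illustration of that idea, not the claimed implementation; the grid size, block size, slot numbering and NumPy representation are all assumptions.

```python
import numpy as np

BLOCK = 8   # assumed "first size": each pixel block is BLOCK x BLOCK
GRID = 4    # assumed number of blocks per row/column in the color summary map


def slot_origin(slot):
    """Top-left corner of a grid slot, shared by the color summary map and both templates."""
    row, col = divmod(slot, GRID)
    return row * BLOCK, col * BLOCK


def place(template, slot, block):
    """Copy a BLOCK x BLOCK pixel block into the slot matching its color block's position."""
    y, x = slot_origin(slot)
    template[y:y + BLOCK, x:x + BLOCK] = block


# Color summary map: one uniform-color block per grid slot.
color_summary = np.zeros((GRID * BLOCK, GRID * BLOCK, 3), dtype=np.uint8)
palette = np.random.randint(0, 256, size=(GRID * GRID, 3), dtype=np.uint8)
for slot, color in enumerate(palette):
    place(color_summary, slot, np.broadcast_to(color, (BLOCK, BLOCK, 3)))

# First and second mapping templates mirror the layout of the color summary map.
uv_map = np.zeros_like(color_summary)    # UV unfolded map
attr_map = np.zeros_like(color_summary)  # display attribute map

# Two hypothetical sub models whose color blocks sit in slots 0 and 5.
for slot in (0, 5):
    uv_block = np.random.randint(0, 256, (BLOCK, BLOCK, 3), dtype=np.uint8)    # shrunken UV unwrap
    attr_block = np.random.randint(0, 256, (BLOCK, BLOCK, 3), dtype=np.uint8)  # display attributes
    place(uv_map, slot, uv_block)
    place(attr_map, slot, attr_block)
```

Because the two templates mirror the color summary map, a renderer can sample both atlases with the same coordinates per sub model.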
2. The method of claim 1, wherein the size of the color pixel block is a first size;
the obtaining a UV unfolded view of the sub three-dimensional model, and laying out the UV unfolded view in the first mapping template according to the position of the color pixel block corresponding to the UV unfolded view in the color summary map, to obtain the UV unfolded map, comprises:
obtaining the UV unfolded view of the sub three-dimensional model, and reducing the size of the UV unfolded view to the first size to obtain a UV unfolded view pixel block;
and laying out the UV unfolded view pixel block in the first mapping template according to the position, in the color summary map, of the color pixel block corresponding to the UV unfolded view pixel block, to obtain the UV unfolded map.
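For illustration only: the shrinking step of claim 2 amounts to downsampling each sub model's UV unfolded view to the block size before placement. The sketch below uses a nearest-neighbour stride as one possible resampler; the 256-pixel source size and the block size are assumptions.

```python
import numpy as np

BLOCK = 8  # assumed "first size"


def shrink_to_block(uv_unfolded_view):
    """Reduce a square UV unfolded view to a BLOCK x BLOCK pixel block
    by nearest-neighbour sampling (illustrative; any resampler would do)."""
    step = uv_unfolded_view.shape[0] // BLOCK
    return uv_unfolded_view[::step, ::step][:BLOCK, :BLOCK]


uv_unfolded_view = np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8)  # hypothetical unwrap
uv_block = shrink_to_block(uv_unfolded_view)  # ready to be laid out in the first mapping template
assert uv_block.shape[:2] == (BLOCK, BLOCK)
```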
3. The method of claim 2, wherein the display attribute pixel block comprises one or more of: a color attribute pixel block, a normal attribute pixel block, or a texture attribute pixel block;
the obtaining a display attribute pixel block of the sub three-dimensional model, and laying out the display attribute pixel block in a second mapping template according to the position of the color pixel block corresponding to the UV unfolded view in the color summary map, to obtain a display attribute map, comprises one or more of the following steps:
obtaining a color attribute pixel block of the sub three-dimensional model, and laying out the color attribute pixel block in a first sub mapping template of the second mapping template according to coordinate information of the color pixel block corresponding to the UV unfolded view in the color summary map, to obtain a color attribute map;
obtaining a normal attribute pixel block of the sub three-dimensional model, and laying out the normal attribute pixel block in a second sub mapping template of the second mapping template according to the coordinate information of the color pixel block corresponding to the UV unfolded view in the color summary map, to obtain a normal attribute map;
and obtaining a texture attribute pixel block of the sub three-dimensional model, and laying out the texture attribute pixel block in a third sub mapping template of the second mapping template according to the coordinate information of the color pixel block corresponding to the UV unfolded view in the color summary map, to obtain a texture attribute map.
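For illustration only: claim 3 can be pictured as a second mapping template made of three parallel sub templates (color, normal and texture attributes), all indexed by the same slot as the sub model's color block. The dictionary-of-atlases layout, block size and RGBA texture channels below are assumptions for this sketch.

```python
import numpy as np

BLOCK, GRID = 8, 4  # assumed block size and grid size

# Second mapping template represented as three parallel sub templates.
sub_templates = {
    "color":   np.zeros((GRID * BLOCK, GRID * BLOCK, 3), dtype=np.uint8),
    "normal":  np.zeros((GRID * BLOCK, GRID * BLOCK, 3), dtype=np.uint8),
    "texture": np.zeros((GRID * BLOCK, GRID * BLOCK, 4), dtype=np.uint8),  # RGBA, cf. claim 8
}


def place_attribute(kind, slot, block):
    """Lay out one display attribute pixel block at the slot of its color block."""
    row, col = divmod(slot, GRID)
    y, x = row * BLOCK, col * BLOCK
    sub_templates[kind][y:y + BLOCK, x:x + BLOCK] = block


# Hypothetical sub model in slot 5 contributing all three attribute blocks.
place_attribute("color",   5, np.full((BLOCK, BLOCK, 3), 200, dtype=np.uint8))
place_attribute("normal",  5, np.full((BLOCK, BLOCK, 3), 128, dtype=np.uint8))
place_attribute("texture", 5, np.full((BLOCK, BLOCK, 4), 255, dtype=np.uint8))
```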
4. The method of claim 3, wherein the rendering the initial three-dimensional model according to the UV unfolded map and the display attribute map to obtain a rendered three-dimensional model comprises:
rendering the initial three-dimensional model according to the UV unfolded map and one or more of the color attribute map, the normal attribute map and the texture attribute map, to obtain the rendered three-dimensional model.
5. The method according to claim 4, wherein the method further comprises:
if the color summary map does not contain a color pixel block corresponding to the UV unfolded view of the sub three-dimensional model, generating the color pixel block corresponding to the UV unfolded view, and filling the generated color pixel block into the color summary map.
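For illustration only: claim 5 covers the case where the summary map has no block for a sub model's color. A minimal sketch follows, assuming exact color matching, a fixed grid capacity and row-major slot allocation, none of which is specified by the claim.

```python
import numpy as np

BLOCK, GRID = 8, 4

color_summary = np.zeros((GRID * BLOCK, GRID * BLOCK, 3), dtype=np.uint8)
slot_colors = {}  # slot index -> RGB tuple already present in the summary map


def slot_for_color(rgb):
    """Return the slot whose color block matches rgb, generating and filling
    a new block in the next free slot if none exists (cf. claim 5)."""
    for slot, existing in slot_colors.items():
        if existing == rgb:
            return slot
    slot = len(slot_colors)  # next free slot (assumes the grid still has capacity)
    row, col = divmod(slot, GRID)
    y, x = row * BLOCK, col * BLOCK
    color_summary[y:y + BLOCK, x:x + BLOCK] = rgb  # fill the generated color block
    slot_colors[slot] = rgb
    return slot


print(slot_for_color((10, 200, 30)))  # creates slot 0
print(slot_for_color((10, 200, 30)))  # reuses slot 0
```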
6. The method of claim 4, wherein, in the case where the display attribute pixel block includes the color attribute pixel block, after obtaining the color attribute map, the method further comprises:
adjusting at least one of the hue, brightness and saturation of the color attribute map.
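For illustration only: the adjustment in claim 6 can be sketched with the standard-library colorsys conversion. The per-pixel loop and the specific offsets are illustrative choices, not part of the claim.

```python
import colorsys

import numpy as np


def adjust_hsv(color_attr_map, dh=0.0, ds=0.0, dv=0.0):
    """Shift hue, saturation and value (brightness) of an RGB color attribute map.
    Offsets are fractions in [0, 1]; the per-pixel loop favors clarity over speed."""
    out = color_attr_map.astype(np.float32) / 255.0
    height, width, _ = out.shape
    for y in range(height):
        for x in range(width):
            h, s, v = colorsys.rgb_to_hsv(*out[y, x])
            h = (h + dh) % 1.0
            s = min(max(s + ds, 0.0), 1.0)
            v = min(max(v + dv, 0.0), 1.0)
            out[y, x] = colorsys.hsv_to_rgb(h, s, v)
    return (out * 255.0).astype(np.uint8)


color_attr_map = np.random.randint(0, 256, (8, 8, 3), dtype=np.uint8)
adjusted = adjust_hsv(color_attr_map, dh=0.05, dv=0.1)  # slightly shifted hue, brighter
```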
7. The method of claim 5, wherein after obtaining the rendered three-dimensional model, the method further comprises:
adjusting the display attribute pixel blocks in the display attribute map to obtain an adjusted three-dimensional model.
8. The method of claim 7, wherein the adjusting the display attribute pixel blocks in the display attribute map comprises one or more of:
adjusting the roughness of the texture attribute map through an R channel in the display attribute pixel block;
adjusting the metalness of the texture attribute map through a G channel in the display attribute pixel block;
adjusting the displacement of the texture attribute map through a B channel in the display attribute pixel block;
and adjusting the transparency of the texture attribute map through an A channel in the display attribute pixel block.
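For illustration only: claim 8 packs four scalar controls into one RGBA block, with R driving roughness, G metalness, B displacement and A transparency. A minimal sketch of reading those channels back follows; the 0-255 to 0-1 scaling is an assumption.

```python
import numpy as np


def unpack_display_attributes(attr_block):
    """Split an RGBA display attribute pixel block into the four per-pixel
    controls named in claim 8, scaled from 0-255 to 0-1 (scaling assumed)."""
    scaled = attr_block.astype(np.float32) / 255.0
    return {
        "roughness":    scaled[..., 0],  # R channel
        "metalness":    scaled[..., 1],  # G channel
        "displacement": scaled[..., 2],  # B channel
        "transparency": scaled[..., 3],  # A channel
    }


attr_block = np.random.randint(0, 256, (8, 8, 4), dtype=np.uint8)
controls = unpack_display_attributes(attr_block)  # per-pixel material controls
```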
9. The method of claim 1, wherein the initial three-dimensional model comprises one or more of: a character model, a scene model, or a UI element.
10. An apparatus for model processing, comprising:
an initial model obtaining unit, configured to obtain an initial three-dimensional model, and split the initial three-dimensional model into a plurality of sub three-dimensional models;
a color summary map obtaining unit configured to obtain a color summary map, where the color summary map includes color pixel blocks of different colors;
a UV unfolded map obtaining unit, configured to obtain a UV unfolded view of the sub three-dimensional model, and lay out the UV unfolded view in a first mapping template according to a position of a color pixel block corresponding to the UV unfolded view in the color summary map, to obtain a UV unfolded map;
a display attribute map obtaining unit, configured to obtain a display attribute pixel block of the sub three-dimensional model, and lay out the display attribute pixel block in a second mapping template according to the position of the color pixel block corresponding to the UV unfolded view in the color summary map, to obtain a display attribute map;
and a rendering unit, configured to render the initial three-dimensional model according to the UV unfolded map and the display attribute map to obtain a rendered three-dimensional model.
11. An electronic device, comprising: a processor, a memory, and computer program instructions stored on the memory and executable on the processor; wherein the computer program instructions, when executed by the processor, implement the method of model processing according to any one of claims 1-9.
12. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein computer executable instructions for implementing a method of model processing according to any of the preceding claims 1-9 when executed by a processor.
CN202311269084.4A 2023-09-27 2023-09-27 Model processing method and device, electronic equipment and storage medium Pending CN117456140A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311269084.4A CN117456140A (en) 2023-09-27 2023-09-27 Model processing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311269084.4A CN117456140A (en) 2023-09-27 2023-09-27 Model processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117456140A true CN117456140A (en) 2024-01-26

Family

ID=89586402

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311269084.4A Pending CN117456140A (en) 2023-09-27 2023-09-27 Model processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117456140A (en)

Similar Documents

Publication Publication Date Title
US11344806B2 (en) Method for rendering game, and method, apparatus and device for generating game resource file
CN109087369B (en) Virtual object display method, device, electronic device and storage medium
CN110090440B (en) Virtual object display method and device, electronic equipment and storage medium
CN104200506A (en) Method and device for rendering three-dimensional GIS mass vector data
US20230055516A1 (en) Collision data processing method and apparatus, computer device, and storage medium
CN111429557A (en) Hair generating method, hair generating device and readable storage medium
CN103914868A (en) Method for mass model data dynamic scheduling and real-time asynchronous loading under virtual reality
CN104680572A (en) BIM-based mobile terminal building model rendering performance optimization method and system
CN109102560A (en) Threedimensional model rendering method and device
CN110148203B (en) Method and device for generating virtual building model in game, processor and terminal
CN109147023A (en) Three-dimensional special efficacy generation method, device and electronic equipment based on face
KR20080018404A (en) Computer readable recording medium having background making program for making game
CN105144243A (en) Data visualization
US20180276870A1 (en) System and method for mass-animating characters in animated sequences
US20230405452A1 (en) Method for controlling game display, non-transitory computer-readable storage medium and electronic device
KR101670958B1 (en) Data processing method and apparatus in heterogeneous multi-core environment
CN107077746A (en) System, method and computer program product for network transmission and the Automatic Optimal of the 3D texture models of real-time rendering
CN111450529A (en) Game map acquisition method and device, storage medium and electronic device
WO2021096982A1 (en) Programmatically configuring materials
KR102108244B1 (en) Image processing method and device
JP2020532022A (en) Sphere light field rendering method in all viewing angles
JP5893142B2 (en) Image processing apparatus and image processing method
CN117390322A (en) Virtual space construction method and device, electronic equipment and nonvolatile storage medium
CN115965735B (en) Texture map generation method and device
CN117456140A (en) Model processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination