CN112435285A - Normal map generation method and device - Google Patents


Info

Publication number
CN112435285A
Authority
CN
China
Prior art keywords
normal
vertex
edge
color
mesh
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010723807.3A
Other languages
Chinese (zh)
Inventor
陈思敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Hode Information Technology Co Ltd
Original Assignee
Shanghai Hode Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Hode Information Technology Co Ltd filed Critical Shanghai Hode Information Technology Co Ltd
Priority to CN202010723807.3A priority Critical patent/CN112435285A/en
Publication of CN112435285A publication Critical patent/CN112435285A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/40 Analysis of texture
    • G06T7/90 Determination of colour characteristics
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a method for generating a normal map, comprising the following steps: loading a 3D model, and unwrapping its texture mapping coordinates onto a UV plane; determining the normal color corresponding to each vertex and each edge according to the tangent normals of the vertices and edges of the 3D model; filling meshes on the UV plane according to the vertices and the edges; and generating a normal map according to the meshes and the normal colors. The method generates the normal map directly and automatically from the 3D model, greatly saving the time of high-poly modeling and baking and substantially improving object rendering efficiency.

Description

Normal map generation method and device
Technical Field
The invention relates to the technical field of image processing, in particular to a method and a device for generating a normal map.
Background
With the continuous development of computer graphics, image, and video technology, the pursuit of image fidelity keeps growing. Increasingly realistic images can be produced with image-processing techniques; in particular, normal mapping improves the fidelity of three-dimensional rendering. A normal map is a kind of bump map: it gives the surface of a three-dimensional object a shaded, light-and-shadow rendering effect while greatly reducing the face count and computation needed to render the object, thereby optimizing rendering for animation and games.
In next-generation games, objects often need a soft-edge effect so that finer results can be rendered with fewer model faces. In Fig. 1, panels 1(a) and 1(c) show the normal effect without soft edges, and panels 1(b) and 1(d) show the effect with soft edges. A soft edge is the smoothing-group information of the model at every corner of the object model; soft-edge processing yields a continuous outline effect. The existing workflow has an artist build both a low-poly and a high-poly model and then bake the high-poly normals onto the low-poly model to obtain the normal map. However, the inventors of the present application found that this approach is costly: building the high-poly model and baking the map consume a great deal of time, the normal map is obtained inefficiently, and object rendering efficiency suffers severely.
Disclosure of Invention
The invention aims to provide a normal map generation method and device, computer equipment, and a readable storage medium that save manufacturing cost and baking time.
According to an aspect of the present invention, there is provided a method for generating a normal map, the method including the steps of:
loading a 3D model, and unwrapping its texture mapping coordinates onto a UV plane;
determining the normal color corresponding to each vertex and each edge according to the tangent normals of the vertices and edges of the 3D model;
filling meshes on the UV plane according to the vertices and the edges;
and generating a normal map according to the meshes and the normal colors.
Optionally, the filling of the corresponding meshes on the UV plane according to the vertices and the edges includes:
filling an edge mesh and a vertex mesh on the UV plane according to the vertices and the edges, and writing the normal colors into the vertex colors corresponding to the edge mesh and the vertex mesh to obtain an edge normal mesh and a vertex normal mesh, respectively;
the generating of the normal map from the meshes and the normal colors includes: rendering the edge normal mesh and the vertex normal mesh into a target texture map to obtain the normal map.
Optionally, the filling of an edge mesh and a vertex mesh on the UV plane according to the vertices and the edges, and the writing of the normal colors into the vertex colors corresponding to the edge mesh and the vertex mesh to obtain an edge normal mesh and a vertex normal mesh, respectively, includes:
on the UV plane, generating a corresponding edge mesh for each pair of adjacent edges whose tangent normals differ, and writing the normal color of the edge and the normal colors of its vertices into the vertex colors corresponding to the edge mesh to obtain the edge normal mesh;
and, on the UV plane, generating a circle centered on each vertex, placing the circle below the edge mesh, and writing the normal color of the vertex into the corresponding vertex of the vertex mesh to obtain the vertex normal mesh.
Optionally, the writing of the normal color of the edge and the normal colors of the vertices into the vertex colors corresponding to the edge mesh to obtain the edge normal mesh includes:
writing the normal color of the edge and the normal colors of the vertices into the vertex colors corresponding to the edge mesh;
decomposing the quadrilateral faces of the edge mesh into triangular faces to obtain an original normal map;
and, according to a user's move-down instruction, moving the mesh vertices located at the UV vertices downward on the basis of the original normal map to obtain the edge normal mesh.
Optionally, the determining of the normal color corresponding to each vertex and each edge according to the tangent normals of the vertices and edges of the 3D model includes:
acquiring the tangent normals corresponding to the vertices and the edges;
processing the tangent normals corresponding to the vertices and the edges, respectively, to obtain target tangent normals corresponding to the vertices and the edges;
and determining the normal color corresponding to each vertex and each edge according to the target tangent normals.
Optionally, the processing of the tangent normals corresponding to the vertices and the edges, respectively, to obtain target tangent normals corresponding to the vertices and the edges includes:
converting the tangent normals corresponding to the vertices and the edges into corresponding world normals, respectively, the world normals being normals of a preset space;
calculating the average of the world normal of each vertex and the world normals of its adjacent vertices to obtain the target world normal of each vertex in the preset space;
calculating the average of the world normal of each edge and the world normal of its adjacent edge to obtain the target world normal of each edge in the preset space;
converting the target world normal of each vertex into the target tangent normal corresponding to that vertex on the UV plane; and
converting the target world normal of each edge into the target tangent normal corresponding to that edge on the UV plane.
Optionally, the calculation formula for determining the normal color corresponding to each vertex and each edge according to the target tangent normal is as follows:
R=x*0.5+0.5;
G=y*0.5+0.5;
B=1;
where R, G, and B are the color components of a texture pixel, x and y are the x- and y-components of the target tangent normal, x and y range over [-1, 1], and R and G range over [0, 1].
In order to achieve the above object, the present invention further provides a normal map generation device, including:
a loading module for loading the 3D model and unwrapping its texture mapping coordinates onto a UV plane;
a determining module for determining the normal color corresponding to each vertex and each edge according to the tangent normals of the vertices and edges of the 3D model;
a filling module for filling meshes on the UV plane according to the vertices and the edges;
and a generating module for generating a normal map according to the meshes and the normal colors.
In order to achieve the above object, the present invention further provides a computer device, which specifically includes: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method for generating a normal map introduced above when executing the computer program.
In order to achieve the above object, the present invention also provides a computer-readable storage medium on which a computer program is stored, which, when being executed by a processor, realizes the steps of the above-described normal map generation method.
According to the normal map generation method and device, the computer equipment, and the readable storage medium provided above, the normal map is generated automatically from the 3D model, greatly saving the time and cost of building a high-poly model and baking its map, and improving object rendering efficiency.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a graph comparing the normal effect without soft edges to the normal effect with soft edges;
fig. 2 is an alternative flow chart of a normal map generation method provided in the embodiment of the present disclosure;
FIG. 3 is a diagram of an exemplary low-poly 3D model;
FIG. 4 is a schematic view of the UV unwrap of the low-poly model of FIG. 3;
fig. 5 is a schematic diagram illustrating an alternative specific flowchart of step S102 in fig. 2;
fig. 6 is a schematic diagram illustrating an alternative specific flowchart of step S202 in fig. 5;
fig. 7 is a schematic diagram illustrating an alternative specific flowchart of step S104 in fig. 2;
fig. 8 is a schematic diagram illustrating an alternative specific flowchart of step S400 in fig. 7;
FIG. 9 is a schematic diagram of edge mesh generation;
FIG. 10 is a diagram of the effect of writing the normal colors into the vertex colors corresponding to the edge mesh, with the corresponding line drawing;
FIG. 11 is a graph of the effect, before and after vertex normal mesh generation, at edges whose included angle exceeds 180°;
FIG. 12 is a schematic diagram of vertex normal mesh circle generation;
FIG. 13 is an effect diagram of the mesh after the edge normal mesh and the vertex normal mesh are superimposed;
fig. 14 is a schematic flow chart of another alternative method for generating a normal map according to an embodiment of the present disclosure;
FIG. 15 is a diagram of the effect of reserving preset pixels in the edge mesh, with the corresponding line drawing;
fig. 16 is a schematic flow chart of another alternative method for generating a normal map according to an embodiment of the present disclosure;
FIG. 17 is an exploded view of the quadrilateral faces in the edge mesh and the corresponding line drawing;
FIG. 18 is a diagram of the effect of the use of the original normal map;
FIG. 19 is a schematic drawing showing the mesh vertices at the UV vertices being moved downward;
fig. 20 is a schematic flow chart of another alternative method for generating a normal map according to an embodiment of the present disclosure;
FIG. 21 is a diagram illustrating normal map effects generated by the final rendering;
FIG. 22 is a schematic view of the final model display effect;
FIG. 23 is a diagram of other display effects;
FIG. 24 is a schematic diagram of an alternative program module of an apparatus for generating a normal map provided in an embodiment of the present disclosure;
fig. 25 is a schematic diagram of an alternative hardware architecture of a computer device according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The method for generating the normal map according to the present invention will be described below with reference to the drawings.
Definitions of terms:
Texture: in the usual sense, the relief of an object's surface; also the colored pattern on the object's smooth surface.
UV: short for the U and V texture-map coordinates. Analogous to the X, Y, and Z axes of the spatial model, they define the position of every point on a picture: UV maps each point on the image exactly onto the surface of the model object. All image files are two-dimensional planes; the horizontal direction is U, the vertical direction is V, and together they form the UV coordinate system, in which any pixel of the image can be located. Positions between points are filled in by smooth image interpolation, which is what "UV mapping" refers to.
Vertex color: the color of a model vertex. Taking a triangle as an example, the colors of all pixels inside it can be determined from the colors of its 3 vertices alone.
Normal: the orientation of the model surface. Normals may be recorded per vertex only, or per pixel. A map recording per-pixel tangent normals is called a normal map and is typically used in next-generation games.
Normal map: a map recording the tangent normal of each pixel of the model, giving the 3D surface a shaded, light-and-shadow rendering effect. It can simulate detail textures of a high-poly model, in particular projecting the high-poly model's smoothness and roughness onto the low-poly model so that the low-poly model shows the high-poly effect, greatly reducing the face count and computation needed to render the object and thereby optimizing rendering for animation and games. The usual encoding is RGB corresponding to xyz, with the z component fixed to 1.
Tangent space: the tangent coordinate system corresponding to each face at each vertex of the model.
Preset space: in this embodiment, the game space (e.g., a next-generation game space); in actual use the preset space is determined by the rendering requirements.
World normal: a normal expressed in the preset space.
Soft edge: achieved by unifying the model's smoothing-group information. When the smoothing group of the faces is uniform, the transition between faces is not apparent (for example, a sphere); when the smoothing groups differ, hard corners appear (for example, a cuboid).
Low-poly model: a 3D model with a small face count.
High-poly model: also called a high-precision 3D model; characterized by a complex structure, a high face count, and rich detail.
Fig. 2 is an alternative flow chart of the method for generating the normal map according to the present invention. It is to be understood that the flow charts in the embodiments of the present method are not used to limit the order of executing the steps, and a computer device is taken as an executing subject to be described as an example below. The computer devices may include mobile terminals such as tablet computers, notebook computers, palm top computers, Personal Digital Assistants (PDAs), and the like, and fixed terminals such as Digital TVs, desktop computers, and the like.
As shown in fig. 2, the method specifically includes the following steps:
step S100: and loading a 3D model, and unfolding and placing texture mapping coordinates of the 3D model on a UV plane. In an embodiment of the invention, the 3D model may be a low mode model.
In an exemplary embodiment, the 3D model may be a low-mode model as shown in fig. 3, and the texture map coordinates of the low-mode model of fig. 3 are expanded to obtain the UV expansion map as shown in fig. 4. The UV developed image of fig. 4 was then placed on the UV plane.
Step S102: determining the normal color corresponding to each vertex and each edge according to the tangent normals of the vertices and edges of the 3D model.
Specifically, in order to make the corners of the 3D model appear smooth, they are distinguished by different colors; that is, color is used to make the 3D model look visually rounded.
In an exemplary embodiment, as shown in fig. 5, the step S102 may include steps S200 to S204.
Step S200: acquiring the tangent normals corresponding to the vertices and the edges. It should be noted that vertices and edges on the same face share that face's tangent normal.
Step S202: processing the tangent normals corresponding to the vertices and the edges, respectively, to obtain the target tangent normals corresponding to the vertices and the edges.
Specifically, in practical applications each vertex has several adjacent vertices and each edge has 0 or 1 adjacent edge, so the adjacent vertices of each vertex and the adjacent edge of each edge must be taken into account when determining the tangent normals of the vertices and edges.
It should be noted that, because the normals of two adjacent faces of the 3D model differ, and every edge of a face carries the same normal as that face, a shared edge carries a different normal in each of the two faces that contain it; the copy of the edge in the other face is called its adjacent edge. For a vertex, all points sharing its coordinate position are called its adjacent vertices.
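The adjacency definitions above can be sketched in Python as follows. This is a minimal illustration only; the function names and the index-based mesh representation (a vertex list plus triangle index tuples) are assumptions, not part of the patent.

```python
def adjacent_vertices(vertices, i, eps=1e-6):
    """Indices of vertices sharing vertex i's coordinate position
    (the patent's 'adjacent vertices': duplicates created by UV seams)."""
    px, py, pz = vertices[i]
    return [j for j, (x, y, z) in enumerate(vertices)
            if j != i and abs(x - px) < eps and abs(y - py) < eps and abs(z - pz) < eps]

def adjacent_edge_faces(vertices, faces, edge, eps=1e-6):
    """Faces containing an edge at the same two coordinate positions as `edge`.
    Each such face gives the shared edge a different normal; in the patent's
    terms, the copy of the edge in the other face is its adjacent edge."""
    def same(i, j):
        return all(abs(a - b) < eps for a, b in zip(vertices[i], vertices[j]))
    a, b = edge
    out = []
    for f in faces:
        for i, j in ((f[0], f[1]), (f[1], f[2]), (f[2], f[0])):
            if (same(i, a) and same(j, b)) or (same(i, b) and same(j, a)):
                out.append(f)
                break
    return out
```

For example, two triangles meeting along a seam carry duplicated vertices at the same positions, so the shared edge is found in both faces even though the index pairs differ.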
In an exemplary embodiment, as shown in fig. 6, the step S202 may include steps S300 to S308:
step S300: and respectively converting tangent normals corresponding to the vertexes and the sides into corresponding world normals, wherein the world normals are normals of a preset space, and the preset space is a game world space in the embodiment of the invention.
Step S302: and calculating the average value of the world normal of each vertex and the world normal of the adjacent vertex to obtain the target world normal of each vertex in the preset space.
Step S304: and calculating the average value of the world normal of each side and the world normal of the adjacent side to obtain the target world normal of each side in the preset space.
Step S306: and converting the target world normal of each vertex into a target tangent normal corresponding to each vertex on the UV plane.
Step S308: and converting the target world normal of each side into a target tangent normal corresponding to each side on the UV plane.
Since the tangent normals of different faces live in different coordinate systems, and computing across different coordinate spaces is meaningless, the tangent normals of the faces must first be converted into world normals in a single coordinate space, where the averages of the world normals of the vertices and edges are computed. The results are then converted back to tangent normals, from which the various color effects are obtained. Averaging each vertex with its adjacent vertices and each edge with its adjacent edge captures the tangent-normal relationship between them, so the tangent normals corresponding to the vertices and edges can be described accurately.
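The averaging step (S302) can be sketched as follows. This is a simplified sketch under assumptions: normals are already in world space, and the tangent-to-world conversions of steps S300/S306 (multiplication by each vertex's TBN matrix and, for an orthonormal TBN, by its transpose on the way back) are noted in the comment rather than implemented.

```python
def smooth_vertex_normals(world_normals, neighbor_groups):
    """Average each vertex's world normal with those of its adjacent vertices
    and renormalize (step S302). Steps S300/S306 would bracket this with a
    TBN-matrix multiply and its transpose, assuming an orthonormal TBN."""
    out = []
    for i, group in enumerate(neighbor_groups):
        idx = [i] + list(group)  # the vertex itself plus its adjacent vertices
        sx = sum(world_normals[j][0] for j in idx) / len(idx)
        sy = sum(world_normals[j][1] for j in idx) / len(idx)
        sz = sum(world_normals[j][2] for j in idx) / len(idx)
        n = (sx * sx + sy * sy + sz * sz) ** 0.5
        out.append((sx / n, sy / n, sz / n))
    return out
```

Two duplicated corner vertices with world normals along +x and +y, for example, both receive the averaged normal along the diagonal, which is what produces the smooth corner color.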
Step S204: determining the normal color corresponding to each vertex and each edge according to the target tangent normal.
In an exemplary embodiment, the calculation formula for determining the normal color corresponding to each vertex and each edge according to the target tangent normal is as follows:
R=x*0.5+0.5;
G=y*0.5+0.5;
B=1;
where R, G, and B are the color components of a texture pixel, x and y are the x- and y-components of the target tangent normal, x and y range over [-1, 1], and R and G range over [0, 1]. That is, once the target tangent normals of the vertices and edges are determined, their normal colors follow from this formula, which converts a tangent normal accurately into the corresponding normal color.
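The encoding formula above, together with its inverse (used when a finished normal map is sampled), can be written directly; the function names are illustrative.

```python
def normal_to_color(x, y):
    """Map the target tangent normal's x and y components, each in [-1, 1],
    to R and G in [0, 1]; B is fixed at 1 (R = x*0.5+0.5, G = y*0.5+0.5, B = 1)."""
    return (x * 0.5 + 0.5, y * 0.5 + 0.5, 1.0)

def color_to_normal(r, g):
    """Inverse mapping: recover the tangent normal's x and y from R and G,
    with z fixed at 1 as in the patent's encoding."""
    return (r * 2.0 - 1.0, g * 2.0 - 1.0, 1.0)
```

Note that the flat normal (0, 0, 1) encodes to (0.5, 0.5, 1), the light-blue background color used later in the rendering step.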
Step S104: filling corresponding meshes on the UV plane according to the vertices and the edges.
Step S106: generating a normal map according to the meshes and the normal colors.
Specifically, the normal colors are written into the meshes, and the normal map is generated from the meshes with the normal colors written in.
In an exemplary embodiment, as shown in fig. 7, the step S104 may include a step S400.
Step S400: filling an edge mesh and a vertex mesh on the UV plane according to the vertices and the edges, and writing the normal colors into the vertex colors corresponding to the edge mesh and the vertex mesh to obtain an edge normal mesh and a vertex normal mesh, respectively.
Specifically, after the edge mesh and the vertex mesh are filled on the UV plane, the normal color of each vertex and the normal color of each edge are written into the vertex colors corresponding to the edge mesh to obtain the edge normal mesh; and the normal color of each vertex is written into the corresponding vertex of the vertex mesh to obtain the vertex normal mesh.
In an exemplary embodiment, referring to fig. 8, the step S400 may include steps S500 and S502:
step S500: on the UV plane, generating a corresponding edge mesh for each pair of adjacent edges whose tangent normals differ, and writing the normal color of the edge and the normal colors of its vertices into the vertex colors corresponding to the edge mesh to obtain the edge normal mesh.
Specifically, fig. 9 is a schematic diagram of edge mesh generation: fig. 9(a) shows the UV unwrap of the low-poly model, and fig. 9(b) shows the resulting edge mesh. As fig. 9 shows, a mesh is generated for adjacent edges whose normals differ, and none for adjacent edges whose normals are the same; the bold frame in fig. 9(b) is the generated edge mesh. Fig. 10 illustrates writing the normal colors into the vertex colors of the edge mesh: figs. 10(a) and 10(b) are the effect before and after writing, and figs. 10(c) and 10(d) are the corresponding line drawings before and after writing.
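The geometry of one edge-mesh quad in step S500 can be sketched as below. This is an assumption-laden sketch: the patent does not specify how wide the strip is or how its corners are built, so `edge_quad` simply extrudes the UV edge perpendicular to its direction by a chosen width.

```python
import math

def edge_quad(p0, p1, width):
    """Four UV corners of one edge-mesh quad: the edge p0->p1 extruded by
    `width` on each side, perpendicular to its direction. Geometry only;
    the normal colors are written into the quad's vertex colors separately."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    length = math.hypot(dx, dy)
    nx, ny = -dy / length, dx / length  # unit perpendicular in the UV plane
    ox, oy = nx * width, ny * width
    return [(p0[0] + ox, p0[1] + oy), (p1[0] + ox, p1[1] + oy),
            (p1[0] - ox, p1[1] - oy), (p0[0] - ox, p0[1] - oy)]
```

Only adjacent edges whose tangent normals differ would receive such a quad, matching the rule illustrated in fig. 9.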
Step S502: on the UV plane, generating a circle centered on each vertex, placing the circle below the edge mesh, and writing the normal color of the vertex into the corresponding vertex of the vertex mesh to obtain the vertex normal mesh.
In practical operation, when the included angle of an edge is greater than 180°, a gap appears (see fig. 11(a)) and needs to be filled; the filled result is shown in fig. 11(b). Fig. 12 is a schematic diagram of vertex normal mesh circle generation: when the edge angle exceeds 180°, a circle is generated with the vertex as its center and placed below the edge mesh. Then, according to the user's circumference-vertex-color setting instruction, the vertex colors on the circumference are set to (0.5, 0.5, 1), corresponding to the normal (0, 0, 1), i.e. the color that produces no outline. This yields the vertex normal mesh. Through these steps, the gap at the junction of two edges whose angle exceeds 180° is filled, making the generated vertex normal mesh complete. Because the vertex normal mesh is placed below the edge mesh, the resulting effect after generation is as shown in fig. 13.
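The circle of step S502 can be sketched as a triangle fan; the function name, the fan representation, and the segment count are illustrative assumptions, while the rim color (0.5, 0.5, 1) comes from the text above.

```python
import math

def vertex_circle(center, radius, segments=16):
    """Triangle fan for the circle placed under the edge mesh at a vertex
    whose edge angle exceeds 180 degrees. Rim vertices carry the color
    (0.5, 0.5, 1), i.e. the flat normal (0, 0, 1), so only the center
    contributes the vertex's own normal color."""
    rim = [(center[0] + radius * math.cos(2 * math.pi * k / segments),
            center[1] + radius * math.sin(2 * math.pi * k / segments))
           for k in range(segments)]
    fan = [(center, rim[k], rim[(k + 1) % segments]) for k in range(segments)]
    rim_color = (0.5, 0.5, 1.0)
    return fan, rim_color
```

Because rendering draws the circle below the edge mesh, only the gap region between the two edge quads remains visible, which is exactly where the fill is needed.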
The steps can accurately acquire the vertex color and the texture of the normal map by generating the normal mesh of the edge and the normal mesh of the vertex.
In an exemplary embodiment, with continuing reference to fig. 7, the step S106 may include a step S402: and rendering the edge normal mesh and the vertex normal mesh into a target texture map to obtain the normal map.
Specifically, after the normal meshes of the vertices and the edges are generated, the normal meshes corresponding to the vertices and the edges are rendered into a texture map, and a final normal map is obtained.
In an exemplary embodiment, as shown in fig. 14, after generating a corresponding edge grid on the UV plane by using adjacent edges with different tangent normals in step S500, step S600 may further be included:
and acquiring the grid operation information of the user, and reserving preset pixels in the side grid according to the grid operation information.
In practical applications, the compression of the picture by the game engine often causes distortion of the picture, so that after the edge mesh is generated, a preset pixel (i.e., "bleeding") needs to be set aside to solve the problem that the picture is distorted due to the compression of the game engine. Please refer to fig. 15, where fig. 15(a) is an effect diagram before the preset pixels are set aside in the side grid, fig. 15(b) is an effect diagram after the preset pixels are set aside in the side grid, fig. 15(c) is a line diagram corresponding to the effect diagram before the preset pixels are set aside in the side grid, and fig. 15(d) is a line diagram corresponding to the effect diagram after the preset pixels are set aside in the side grid.
In an exemplary embodiment, as shown in fig. 16, writing the normal color of the edge and the normal color of the vertex into a vertex color corresponding to the edge mesh in step S500 to obtain the edge-normal mesh includes steps S700 to S704:
step S700: and writing the normal color of the edge and the normal color of the vertex into the vertex color corresponding to the edge grid.
Step S702: decomposing the quadrilateral faces of the edge mesh into triangular faces to obtain an original normal map.
Specifically, after the normal color of the edge and the normal colors of the vertices are written into the vertex colors of the edge mesh, the quadrilateral faces of the edge mesh must be decomposed into triangular faces to obtain an original normal map. Note that when decomposing a quadrilateral face, the direction of the diagonal matters: the decomposition should ensure that the tangent normal at the diagonal's midpoint is 1/2 of the vertex tangent normal rather than 1/2 of the edge tangent normal. Fig. 17 shows the decomposition of the quadrilateral faces and the corresponding line drawings: fig. 17(a) is the decomposition with the correct diagonal direction, fig. 17(b) with the incorrect direction, and figs. 17(c) and 17(d) are their corresponding line drawings.
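The quad-to-triangle split of step S702 can be sketched as below. The function and its color handling are illustrative assumptions; the point being demonstrated is that the chosen diagonal determines which two vertex colors average at its midpoint, which is why the diagonal direction in fig. 17 matters.

```python
def split_quad(quad, colors, diagonal="02"):
    """Split a quad v0..v3 into two triangles along the chosen diagonal.
    Rasterization interpolates colors along the diagonal, so its midpoint
    color is the average of the two diagonal endpoint colors; picking the
    wrong diagonal therefore yields the wrong midpoint normal."""
    v0, v1, v2, v3 = quad
    if diagonal == "02":
        tris = [(v0, v1, v2), (v0, v2, v3)]
        mid = tuple((a + b) / 2 for a, b in zip(colors[0], colors[2]))
    else:  # diagonal "13"
        tris = [(v0, v1, v3), (v1, v2, v3)]
        mid = tuple((a + b) / 2 for a, b in zip(colors[1], colors[3]))
    return tris, mid
```

Comparing `mid` for the two diagonal choices against the intended edge-midpoint color is one way to select the correct split direction.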
Step S704: according to a user's move-down instruction, moving the mesh vertices located at the UV vertices downward on the basis of the original normal map to obtain the edge normal mesh.
Specifically, after the edge normal colors are written in, the edge mesh, being composed of quadrilateral faces, must be decomposed into triangular faces. A mesh triangle is then filled automatically from its vertex colors: once all three vertices of a triangle carry vertex colors, the interior color of the triangle is interpolated automatically. Decomposing the quadrilateral faces of the edge mesh into triangular faces yields a preliminarily usable normal map, the original normal map. In practice, however, when the original normal map is used and both edges carry a normal mesh, problems arise where the meshes interleave; an example is shown in fig. 18. Therefore, the mesh vertices at the UV vertices are moved downward, as shown in fig. 19, and the normal mesh corresponding to the final edge is obtained after the move. These steps avoid the problem that occurs where the normal meshes of the two edges interleave.
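The automatic fill that step S704 relies on is ordinary barycentric interpolation of the three vertex colors, which can be sketched as follows (function name illustrative):

```python
def interpolate_color(tri, colors, p):
    """Color at point p inside triangle `tri`, interpolated barycentrically
    from the three vertex colors; this is the 'automatic interpolation' a
    rasterizer performs when all three vertices carry vertex colors."""
    (x0, y0), (x1, y1), (x2, y2) = tri
    det = (y1 - y2) * (x0 - x2) + (x2 - x1) * (y0 - y2)
    w0 = ((y1 - y2) * (p[0] - x2) + (x2 - x1) * (p[1] - y2)) / det
    w1 = ((y2 - y0) * (p[0] - x2) + (x0 - x2) * (p[1] - y2)) / det
    w2 = 1.0 - w0 - w1
    return tuple(w0 * a + w1 * b + w2 * c for a, b, c in zip(*colors))
```

At a vertex the weights collapse to (1, 0, 0), reproducing that vertex's color exactly; at the centroid all three colors contribute equally.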
In an exemplary embodiment, as shown in fig. 20, step S402 of rendering the edge normal mesh and the vertex normal mesh into the target texture map may further include step S800 and step S802.
Step S800: acquiring the background color setting information, the mesh material information and the color space information input by the user.
Step S802: setting the background color, the mesh material and the color space corresponding to the target texture map according to the background color setting information, the mesh material information and the color space information.
Specifically, in practical applications, the background color is set to (0.5,0.5,1), the mesh material is opaque so that color blending is avoided, and the Gamma color space is used rather than the linear color space. The finally generated normal map is shown in fig. 21, and the final model display effect is shown in fig. 22. Further display effects are shown in fig. 23, in which fig. 23(a) is the normal map without soft edges, fig. 23(b) is the mesh without the soft-edge normal map, fig. 23(c) is the normal map with soft edges, and fig. 23(d) is the mesh with the soft-edge normal map.
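As a sanity check (assuming the color encoding R = x*0.5 + 0.5, G = y*0.5 + 0.5, B = 1 given elsewhere in this document), the background color (0.5, 0.5, 1) decodes to exactly the flat tangent normal (0, 0, 1):

```python
# Sketch: invert the normal-color encoding to confirm that the background
# color (0.5, 0.5, 1) represents the unperturbed tangent normal (0, 0, 1).
def decode_normal(r, g, b):
    # Inverse of R = x*0.5 + 0.5, G = y*0.5 + 0.5, B = 1.
    return (r * 2 - 1, g * 2 - 1, b)

assert decode_normal(0.5, 0.5, 1) == (0.0, 0.0, 1)
```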
According to the normal map generation method provided by this embodiment of the invention, a 3D model is loaded, its texture map coordinates are unwrapped and placed on a UV plane, the normal colors corresponding to each vertex and each edge are determined from the tangent normals of each vertex and each edge, meshes are then filled on the UV plane, and a normal map is generated from the meshes and the normal colors. Because the normal map is generated directly and automatically from the 3D model, the time and cost of making a high-poly model and baking the map are greatly reduced, and the rendering efficiency of the object is improved.
Based on the normal map generation method provided in the foregoing embodiments, this embodiment provides a normal map generation apparatus, which can be applied to a computer device. Specifically, fig. 24 shows an optional block diagram of the normal map generation apparatus, which is divided into one or more program modules; the one or more program modules are stored in a storage medium and executed by one or more processors to carry out the present invention. A program module as referred to in the present invention is a series of computer program instruction segments capable of performing specific functions, and is better suited than the program itself for describing how the normal map generation apparatus executes in the storage medium.
As shown in fig. 24, the normal map generation device specifically includes the following components:
the loading module 201 is configured to load a 3D model, and to unwrap its texture map coordinates and place them on a UV plane. The 3D model may be a low-poly model.
Specifically, the 3D model loaded by the loading module 201 is shown in fig. 3; the 3D model is a low-poly model, and unwrapping the texture map coordinates of the low-poly model of fig. 3 yields the UV layout shown in fig. 4. The UV layout of fig. 4 is then placed on the UV plane.
A determining module 202, configured to determine, according to tangent normals of each vertex and each edge of the 3D model, a normal color corresponding to each vertex and each edge.
Specifically, different colors are used to distinguish the corners of the 3D model; for example, the 3D model can be made to appear visually rounded through color.
In an exemplary embodiment, the determination module 202 may include a calculation unit, a processing unit, and a determination unit.
And the computing unit is used for acquiring tangent normals corresponding to the vertexes and the edges of the low-poly model. It should be noted that a vertex and an edge on the same face have the same tangent normal.
And the processing unit is used for respectively processing the tangent normals corresponding to the vertexes and the sides so as to obtain target tangent normals corresponding to the vertexes and the sides.
Specifically, in practical applications, each vertex has multiple adjacent vertices and each edge has 0 or 1 adjacent edges, so the adjacent vertices of each vertex and the adjacent edge of each edge must be taken into account when determining the tangent normal of each vertex and each edge.
It should be noted that, in the 3D model, the normals of two adjacent faces differ, and every edge within a face shares the normal of the face it lies in; therefore, even when two faces share the same edge, that edge has a different normal on each face, and such a shared edge is called an adjacent edge. For a vertex, all points that share its coordinate position are called adjacent vertices.
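The "adjacent vertices share a coordinate position" definition above can be sketched as a simple grouping pass (a hypothetical helper for illustration, not part of the claimed apparatus):

```python
# Hypothetical sketch: find groups of "adjacent vertices" by grouping vertex
# indices that share a (quantized) coordinate position, per the definition
# that all points sharing a position are adjacent vertices.
from collections import defaultdict

def adjacent_vertex_groups(positions, eps=1e-6):
    groups = defaultdict(list)
    for i, p in enumerate(positions):
        # Quantize coordinates so floating-point duplicates merge.
        key = tuple(round(c / eps) for c in p)
        groups[key].append(i)
    return [g for g in groups.values() if len(g) > 1]

# Two split vertices at the same corner count as adjacent vertices.
groups = adjacent_vertex_groups([(0, 0, 0), (1, 0, 0), (0, 0, 0)])
```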
In an exemplary embodiment, the processing unit may include a conversion unit and a calculation unit:
the conversion unit is configured to convert tangent normals corresponding to the vertices and the edges into corresponding world normals, respectively, where the world normals are normals of a preset space, and in an embodiment of the present invention, the preset space is a game world space.
And the computing unit is used for carrying out average value computation on the world normal of each vertex and the world normal of the adjacent vertex so as to obtain the target world normal of each vertex in the preset space.
The computing unit is further configured to perform average computation on the world normal of each side and the world normal of an adjacent side to obtain a target world normal of each side in the preset space.
The conversion unit is further configured to convert the target world normal of each vertex into a target tangent normal corresponding to each vertex on the UV plane.
The conversion unit is further configured to convert the target world normal of each side into a target tangent normal corresponding to each side on the UV plane.
Since the tangent normals of different faces lie in different coordinate systems, and computation across different coordinate spaces is meaningless, the tangent normal of each face must be converted into a world normal in a common coordinate space, where the averages of the world normals of each vertex and each edge are calculated. The result is then converted back into a tangent normal, from which the various color effects can be obtained. Averaging each vertex with its adjacent vertices, and each edge with its adjacent edges, captures the tangent normal relationship between the vertices and edges and their neighbors, so the tangent normals corresponding to each vertex and each edge can be described accurately.
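A hedged sketch of this convert–average–convert pipeline follows; the TBN bases below are illustrative assumptions (in practice they come from each face's tangent frame on the model):

```python
# Hedged sketch of the averaging pipeline: tangent-space normals are taken to
# world space via each face's TBN basis, averaged, and renormalized. The TBN
# matrices here are illustrative, not taken from the source.
import numpy as np

def tangent_to_world(n_t, tbn):
    return tbn @ n_t          # columns of tbn: tangent, bitangent, normal

def world_to_tangent(n_w, tbn):
    return tbn.T @ n_w        # TBN is orthonormal, so inverse == transpose

def average_normal(tangent_normals, tbns):
    world = [tangent_to_world(n, m) for n, m in zip(tangent_normals, tbns)]
    avg = np.mean(world, axis=0)
    return avg / np.linalg.norm(avg)

# Two faces meeting at 90 degrees: identity basis, and a basis rotated about X.
tbn_a = np.eye(3)
tbn_b = np.array([[1, 0, 0], [0, 0, 1], [0, -1, 0]], dtype=float)
# Both faces store the flat tangent normal (0, 0, 1); the averaged world
# normal points halfway between the two face normals.
avg = average_normal([np.array([0.0, 0.0, 1.0])] * 2, [tbn_a, tbn_b])
```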
And the determining unit is used for determining the normal color corresponding to each vertex and each side according to the target tangent normal.
In an exemplary embodiment, the calculation formula for determining the normal color corresponding to each vertex and each edge according to the target tangent normal is as follows:
R=x*0.5+0.5;
G=y*0.5+0.5;
B=1;
where R, G, and B represent the color of a texture pixel, x represents the x-coordinate of the target tangent normal, and y represents its y-coordinate; the value ranges of x and y are both [-1,1], and the value range of R and G is [0,1]. That is, once the tangent normals of the vertices and edges have been determined, the normal colors of the vertices and edges can be obtained by applying the formula. Through this step, a tangent normal can be accurately converted into its corresponding normal color.
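The formula can be transcribed directly:

```python
# Direct transcription of the formula above: encode the (x, y) components of a
# target tangent normal into an RGB normal color.
def normal_to_color(x, y):
    assert -1 <= x <= 1 and -1 <= y <= 1  # valid tangent-normal range
    r = x * 0.5 + 0.5
    g = y * 0.5 + 0.5
    b = 1
    return (r, g, b)

# A flat normal (x = 0, y = 0) maps to the familiar normal-map blue.
assert normal_to_color(0, 0) == (0.5, 0.5, 1)
```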
And a filling module 203, configured to fill the corresponding mesh on the UV plane according to the vertices and the edges.
A generating module 204, configured to generate a normal map according to the grid and the normal color.
Specifically, the normal color is written into the grid, and the normal map is generated according to the grid into which the normal color is written.
In an exemplary embodiment, the filling module 203 may include a color writing unit.
And the color writing unit is used for filling an edge mesh and a vertex mesh on the UV plane according to the vertexes and the edges, and writing the normal color into the vertex color corresponding to the edge mesh and the vertex mesh so as to respectively obtain an edge normal mesh and a vertex normal mesh.
Specifically, after the edge mesh and the vertex mesh are filled on the UV plane, the color writing unit writes the normal color of each vertex and the normal color of each edge into the vertex colors corresponding to the edge mesh, so as to obtain the edge normal mesh; and writes the normal color of each vertex into the corresponding vertices of the vertex mesh, so as to obtain the vertex normal mesh.
In an exemplary embodiment, the color writing unit may be configured to:
and on the UV plane, generating corresponding edge grids by adjacent edges with different tangent normal lines, writing the normal color of the edge and the normal color of the vertex into the vertex color corresponding to the edge grids, and obtaining the edge normal grids.
Specifically, fig. 9 is a schematic diagram of edge mesh generation: fig. 9(a) shows the low-poly UV layout, and fig. 9(b) shows the generated edge mesh. As can be seen from fig. 9, a mesh is generated for adjacent edges whose normals differ, and no mesh is generated for adjacent edges whose normals are the same; the bold frame in fig. 9(b) is the generated edge mesh. Fig. 10 shows example effects of writing the normal color into the vertex colors of the edge mesh, together with the corresponding line drawings: fig. 10(a) is the effect before writing, fig. 10(b) is the effect after writing, fig. 10(c) is the line drawing corresponding to the effect before writing, and fig. 10(d) is the line drawing corresponding to the effect after writing.
In an exemplary embodiment, the color writing unit may be further configured to:
and on the UV plane, respectively taking each vertex as a circle center, generating a circle, placing the circle below the edge grid, and writing the normal color of the vertex into the vertex corresponding to the vertex grid to obtain the vertex normal grid.
In practical operation, when the angle between edges is greater than 180°, a gap appears; the effect is shown in fig. 11(a), and the gap needs to be filled, with the filled result shown in fig. 11(b). Fig. 12 is a schematic diagram of circle generation for the vertex normal mesh. When the edge angle is greater than 180°, a circle is generated with the vertex as its center and placed below the edge mesh. Then, according to the user's circumference vertex color setting instruction, the vertex colors on the circumference are set to (0.5,0.5,1), and the corresponding normal is set to (0,0,1), i.e. the color representing no normal perturbation. A vertex normal mesh is thus obtained. Through these steps, the gap at the junction of two edges whose angle exceeds 180° can be filled, making the generated vertex normal mesh more complete. Because the vertex normal mesh is placed below the edge mesh, the effect after the vertex normal mesh is generated is as shown in fig. 13.
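The circle placed under the edge mesh can be sketched as a triangle fan whose rim carries the (0.5, 0.5, 1) color (a generic construction; the segment count is an assumption for illustration):

```python
# Hypothetical sketch: a triangle-fan circle centered on a vertex, used to
# plug the gap when the edge angle exceeds 180 degrees. Rim vertices carry
# (0.5, 0.5, 1), which encodes the unperturbed normal (0, 0, 1).
import math

def vertex_circle(center, radius, segments=16):
    cx, cy = center
    rim = [(cx + radius * math.cos(2 * math.pi * i / segments),
            cy + radius * math.sin(2 * math.pi * i / segments))
           for i in range(segments)]
    # One fan triangle per segment: (center, rim[i], rim[i+1]).
    tris = [(center, rim[i], rim[(i + 1) % segments])
            for i in range(segments)]
    rim_color = (0.5, 0.5, 1)  # "no normal perturbation" color
    return tris, rim_color

tris, color = vertex_circle((0.0, 0.0), 1.0)
```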
Through the above steps of generating the edge normal mesh and the vertex normal mesh, the vertex colors and the texture of the normal map can be obtained accurately.
In an exemplary embodiment, the generating module 204 may include a rendering unit, and the rendering unit is configured to render the edge normal mesh and the vertex normal mesh into a target texture map to obtain the normal map.
Specifically, after the normal meshes of the vertices and the edges are generated, the rendering unit renders the normal meshes corresponding to the vertices and the edges into texture maps, so as to obtain final normal maps.
In an exemplary embodiment, the generating device of the normal map further includes a pixel processing module, configured to:
and acquiring the grid operation information of the user, and reserving preset pixels in the side grid according to the grid operation information.
In practical applications, compression of the picture by the game engine often distorts it, so after the edge mesh is generated, preset pixels (i.e. "bleed" padding) need to be reserved to avoid the distortion caused by engine compression. Referring to fig. 15, fig. 15(a) is the effect before the preset pixels are reserved in the edge mesh, fig. 15(b) is the effect after, fig. 15(c) is the line drawing corresponding to the effect before, and fig. 15(d) is the line drawing corresponding to the effect after.
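The "bleed" reservation can be approximated by a one-texel dilation of the written pixels (a generic technique commonly used for UV seams, not necessarily the patent's exact procedure):

```python
# Hedged sketch: dilate written pixels outward by one texel so compression
# and filtering do not pull background color into the edge mesh at seams.
def dilate(pixels, width, height, empty=None):
    out = list(pixels)
    for y in range(height):
        for x in range(width):
            if pixels[y * width + x] is not empty:
                continue  # already written; leave it alone
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < width and 0 <= ny < height:
                    c = pixels[ny * width + nx]
                    if c is not empty:
                        out[y * width + x] = c  # copy a neighboring texel
                        break
    return out

# A single written texel in a 3x3 image spreads into its 4-neighborhood.
img = dilate([None, None, None,
              None, (1, 0, 0), None,
              None, None, None], 3, 3)
```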
In an exemplary embodiment, the color writing unit is further configured to:
writing the normal color of the edge and the normal color of the vertex into the vertex color corresponding to the edge grid; decomposing the quadrilateral surface composition in the side grid into a triangular surface composition to obtain an original normal map; and according to a downward movement instruction of a user, downward moving the grid vertex at the UV vertex on the basis of the original normal map to obtain the edge normal grid.
It should be noted that, when decomposing a quadrilateral face of the edge mesh, attention should be paid to the direction of the decomposition diagonal: the tangent normal at the midpoint of the diagonal must be 1/2 of the vertex's tangent normal, not 1/2 of the edge's tangent normal. Fig. 17 shows the decomposition of a quadrilateral face in the edge mesh together with the corresponding line drawings, where fig. 17(a) is the decomposition with the correct diagonal direction, fig. 17(b) is the decomposition with the incorrect diagonal direction, fig. 17(c) is the line drawing corresponding to the correct decomposition, and fig. 17(d) is the line drawing corresponding to the incorrect decomposition.
Specifically, after the normal color of each edge is written into the corresponding edge, the quadrilateral faces that make up the edge mesh need to be decomposed into triangular faces, because the subsequent steps rely on a property of triangular mesh faces: when all three vertices of a triangular face carry vertex colors, the color across the face is interpolated automatically. Once the quadrilateral faces of the edge mesh have been decomposed into triangular faces, a preliminarily usable normal map, i.e. the original normal map, is obtained. In practice, however, when the original normal map is used, artifacts appear where the normal meshes of two edges interleave; an example of this effect is shown in fig. 18. Therefore, the mesh vertices located at the UV vertices are moved downward, as shown in fig. 19, and the final edge normal mesh is obtained after the move. Through this step, the artifacts where the normal meshes of two edges interleave can be avoided.
In an exemplary embodiment, the generating device of the normal map may further include an obtaining module and a setting module.
The acquisition module is used for acquiring the background color setting information, the mesh material information and the color space information input by the user.
And the setting module is used for setting the background color, the mesh material and the color space corresponding to the target texture map according to the background color setting information, the mesh material information and the color space information.
Specifically, in practical applications, the background color is set to (0.5,0.5,1), the mesh material is opaque so that color blending is avoided, and the Gamma color space is used rather than the linear color space. The finally generated normal map is shown in fig. 21, and the final model display effect is shown in fig. 22. Further display effects are shown in fig. 23, in which fig. 23(a) is the normal map without soft edges, fig. 23(b) is the mesh without the soft-edge normal map, fig. 23(c) is the normal map with soft edges, and fig. 23(d) is the mesh with the soft-edge normal map.
The normal map generation apparatus provided by this embodiment of the invention loads a 3D model, unwraps its texture map coordinates and places them on a UV plane, determines the normal colors corresponding to each vertex and each edge from the tangent normals of each vertex and each edge, then fills meshes on the UV plane, and generates a normal map from the meshes and the normal colors. Because the normal map is generated directly and automatically from the 3D model, the time and cost of making a high-poly model and baking the map are greatly reduced, and the rendering efficiency of the object is improved.
This embodiment further provides a computer device capable of executing programs, such as a smartphone, a tablet computer, a notebook computer, a desktop computer, a rack server, a blade server, a tower server, or a rack-mounted server (including a standalone server or a server cluster composed of multiple servers). As shown in fig. 25, the computer device 30 of this embodiment includes at least, but is not limited to, a memory 301 and a processor 302 communicatively coupled to each other via a system bus. It should be noted that fig. 25 shows only the computer device 30 with components 301 and 302, but it should be understood that not all of the shown components are required, and more or fewer components may be implemented instead.
In this embodiment, the memory 301 (i.e. a readable storage medium) includes a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the memory 301 may be an internal storage unit of the computer device 30, such as a hard disk or memory of the computer device 30. In other embodiments, the memory 301 may also be an external storage device of the computer device 30, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card equipped on the computer device 30. Of course, the memory 301 may also include both an internal storage unit and an external storage device of the computer device 30. In this embodiment, the memory 301 is generally used to store the operating system and the various application software installed on the computer device 30, such as the program code of the normal map generation apparatus of the above embodiment. In addition, the memory 301 may also be used to temporarily store various types of data that have been output or are to be output.
Processor 302 may be a Central Processing Unit (CPU), controller, microcontroller, microprocessor, or other data Processing chip in some embodiments. The processor 302 generally serves to control the overall operation of the computer device 30.
Specifically, in this embodiment, the processor 302 is configured to execute a program of the normal map generation method stored in the memory 301, and the program of the normal map generation method implements the following steps when executed:
loading a 3D model, and unfolding and placing texture mapping coordinates of the 3D model on a UV plane;
determining normal line colors corresponding to each vertex and each side according to tangent line normals of each vertex and each side of the 3D model;
filling corresponding grids on the UV plane according to the vertexes and the edges;
and generating a normal map according to the grid and the normal color.
For the specific embodiment of the process of the above method steps, reference may be made to the above embodiments, and details of this embodiment are not repeated herein.
The present embodiments also provide a computer readable storage medium, such as a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, a server, an App application mall, etc., having stored thereon a computer program that when executed by a processor implements the method steps of:
loading a 3D model, and unfolding and placing texture mapping coordinates of the 3D model on a UV plane;
determining normal line colors corresponding to each vertex and each side according to tangent line normals of each vertex and each side of the 3D model;
filling corresponding grids on the UV plane according to the vertexes and the edges;
and generating a normal map according to the grid and the normal color.
For the specific embodiment of the process of the above method steps, reference may be made to the above embodiments, and details of this embodiment are not repeated herein.
The computer device and the readable storage medium provided in this embodiment load a 3D model, unwrap its texture map coordinates and place them on a UV plane, determine the normal colors corresponding to each vertex and each edge from the tangent normals of each vertex and each edge, fill meshes on the UV plane according to the vertices and edges, and generate a normal map from the meshes and the normal colors. Because the normal map is generated directly and automatically from the 3D model, the time and cost of making a high-poly model and baking the map are greatly reduced, and the rendering efficiency of the object is improved.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A method for generating a normal map, the method comprising:
loading a 3D model, and unfolding and placing texture mapping coordinates of the 3D model on a UV plane;
determining normal line colors corresponding to each vertex and each side according to tangent line normals of each vertex and each side of the 3D model;
filling corresponding grids on the UV plane according to the vertexes and the edges;
and generating a normal map according to the grid and the normal color.
2. A method for generating a normal map as claimed in claim 1, wherein said filling a corresponding mesh on said UV plane based on said vertices and said edges comprises:
filling an edge mesh and a vertex mesh on the UV plane according to the vertexes and the edges, and writing the normal color into the vertex colors corresponding to the edge mesh and the vertex mesh to respectively obtain an edge normal mesh and a vertex normal mesh;
the generating a normal map from the grid and the normal color comprises: and rendering the edge normal mesh and the vertex normal mesh into a target texture map to obtain the normal map.
3. The method for generating a normal map as claimed in claim 2, wherein the filling an edge mesh and a vertex mesh on the UV plane according to the vertices and the edges, and writing the normal color into the vertex colors corresponding to the edge mesh and the vertex mesh to obtain an edge normal mesh and a vertex normal mesh, respectively, comprises:
on the UV plane, generating corresponding edge grids by adjacent edges with different tangent normals, and writing the normal color of the edge and the normal color of the vertex into the vertex color corresponding to the edge grids to obtain the edge normal grids;
and on the UV plane, respectively taking each vertex as a circle center, generating a circle, placing the circle below the edge grid, and writing the normal color of the vertex into the vertex corresponding to the vertex grid to obtain the vertex normal grid.
4. A method for generating a normal map as claimed in claim 3, wherein said writing the normal color of the edge and the normal color of the vertex into the vertex color corresponding to the edge mesh to obtain the edge normal mesh comprises:
writing the normal color of the edge and the normal color of the vertex into the vertex color corresponding to the edge grid;
decomposing the quadrilateral surface composition in the side grid into a triangular surface composition to obtain an original normal map;
and according to a downward movement instruction of a user, downward moving the grid vertex at the UV vertex on the basis of the original normal map to obtain the edge normal grid.
5. The method for generating a normal map according to claim 1, wherein the determining the normal color corresponding to each vertex and each edge according to the tangent normal of each vertex and each edge of the 3D model comprises:
acquiring tangent normal lines corresponding to the vertexes and the edges;
respectively processing tangent normals corresponding to the vertexes and the sides to obtain target tangent normals corresponding to the vertexes and the sides;
and determining the normal color corresponding to each vertex and each side according to the normal of the target tangent line.
6. The method for generating a normal map according to claim 5, wherein the processing the tangent normals corresponding to the vertices and the edges to obtain target tangent normals corresponding to the vertices and the edges comprises:
respectively converting tangent normals corresponding to the vertexes and the sides into corresponding world normals, wherein the world normals are normals of a preset space;
calculating the average value of the world normal of each vertex and the world normal of the adjacent vertex to obtain the target world normal of each vertex in the preset space;
calculating the average value of the world normal of each side and the world normal of the adjacent side to obtain the target world normal of each side in the preset space;
converting the target world normal of each vertex into a target tangent normal corresponding to each vertex on the UV plane; and
and converting the target world normal of each side into a target tangent normal corresponding to each side on the UV plane.
7. The method for generating a normal map according to claim 5, wherein the calculation formula for determining the normal color corresponding to each vertex and each edge according to the target tangent normal is as follows:
R=x*0.5+0.5;
G=y*0.5+0.5;
B=1;
where R, G, and B represent the color of a texture pixel, x represents the x-coordinate of the target tangent normal, y represents the y-coordinate of the target tangent normal, the value ranges of x and y are both [-1,1], and the value range of R and G is [0,1].
8. An apparatus for generating a normal map, comprising:
the loading module is used for loading the 3D model and unfolding and placing the texture mapping coordinates of the 3D model on a UV plane;
the determining module is used for determining the normal color corresponding to each vertex and each side according to the tangent normal of each vertex and each side of the 3D model;
a filling module for filling the mesh on the UV plane according to the vertexes and the edges;
and the generating module is used for generating a normal map according to the grid and the normal color.
9. A computer device, the computer device comprising: memory, processor and computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the method of generating the normal map of any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of generating the normal map of any one of claims 1 to 7.
CN202010723807.3A 2020-07-24 2020-07-24 Normal map generation method and device Pending CN112435285A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010723807.3A CN112435285A (en) 2020-07-24 2020-07-24 Normal map generation method and device


Publications (1)

Publication Number Publication Date
CN112435285A true CN112435285A (en) 2021-03-02

Family

ID=74689896


Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007328458A * 2006-06-06 2007-12-20 Sega Corp Image forming program, computer-readable storage medium recording the program, image processor and image processing method
US20090033674A1 * 2007-08-02 2009-02-05 Disney Enterprises, Inc. Method and apparatus for graphically defining surface normal maps
CN104268922A * 2014-09-03 2015-01-07 Guangzhou Boguan Information Technology Co., Ltd. Image rendering method and device
CN104574488A * 2014-12-08 2015-04-29 Beijing Institute of Technology Method for optimizing three-dimensional model for mobile augmented reality browser
US20150348285A1 * 2014-05-30 2015-12-03 Apple Inc. Systems and methods for generating refined, high fidelity normal maps for 2d and 3d textures
US20170148205A1 * 2015-11-19 2017-05-25 Adobe Systems Incorporated Creating bump and normal maps from images with multi-scale control
CN107316337A * 2016-04-20 2017-11-03 NetEase (Hangzhou) Network Co., Ltd. Vertex normal processing method and device
CN107864343A * 2017-10-09 2018-03-30 Shanghai Hode Information Technology Co., Ltd. Graphics-card-based live image rendering method and system for computers
CN109427088A * 2017-08-18 2019-03-05 Tencent Technology (Shenzhen) Co., Ltd. Illumination simulation rendering method and terminal
CN110223372A * 2019-06-13 2019-09-10 NetEase (Hangzhou) Network Co., Ltd. Model rendering method, apparatus, device and storage medium
CN111383311A * 2020-03-06 2020-07-07 NetEase (Hangzhou) Network Co., Ltd. Normal map generation method, apparatus, device and storage medium

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
A. MERLO et al.: "3D Model Visualization Enhancements in Real-time Game Engines", The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 13 February 2013 (2013-02-13), pages 181-188 *
PENGRUI WANG et al.: "A Unified Multi-output Semi-supervised Network for 3D Face Reconstruction", 2019 International Joint Conference on Neural Networks, 30 December 2019 (2019-12-30), pages 1-8 *
管阳 (GUAN, YANG): "Implementation of a 3D Virtual Campus Network Roaming System", China Master's Theses Full-text Database (Information Science and Technology), no. 2, 15 February 2016 (2016-02-15), pages 138-1943 *
赵静 (ZHAO, JING): "Character Design and Related Technology Research for Next-Generation Games", China Master's Theses Full-text Database (Information Science and Technology), no. 7, 15 July 2012 (2012-07-15), pages 138-1793 *

Similar Documents

Publication Publication Date Title
US8294726B2 (en) Methods and apparatus for multiple texture map storage and filtering
US20230053462A1 (en) Image rendering method and apparatus, device, medium, and computer program product
US9013499B2 (en) Methods and apparatus for multiple texture map storage and filtering including irregular texture maps
CN111402390B (en) Model rendering method, device, equipment and storage medium
US8289323B2 (en) Drawing processing apparatus, texture processing apparatus, and tessellation method
WO2000002165A1 (en) Method for generating polygon data and image display using the same
JP2008257752A (en) Perspective editing tool to two-dimensional image
CN112365598B (en) Method, device and terminal for converting oblique photography data into three-dimensional data
CN113112581A (en) Texture map generation method, device and equipment for three-dimensional model and storage medium
CN114375464A (en) Ray tracing dynamic cells in virtual space using bounding volume representations
CN112419460A (en) Method, apparatus, computer device and storage medium for baking model maps
CN116597063B (en) Picture rendering method, device, equipment and medium
KR101107114B1 (en) Method of rendering graphical objects
CN110038302B (en) Unity 3D-based grid generation method and device
CN112233241A (en) Method and device for generating height map of virtual scene terrain and storage medium
CN112435285A (en) Normal map generation method and device
CN113674419B (en) Three-dimensional display method and device for meteorological cloud data, electronic equipment and storage medium
CN112149383B (en) Text real-time layout method based on GPU, electronic device and storage medium
CN114119831A (en) Snow accumulation model rendering method and device, electronic equipment and readable medium
CN115409962A (en) Method for constructing coordinate system in illusion engine, electronic equipment and storage medium
CN114820980A (en) Three-dimensional reconstruction method and device, electronic equipment and readable storage medium
Rose et al. Interactive visualization of large finite element models.
CN112419459A (en) Method, apparatus, computer device and storage medium for baking model AO maps
JP2830339B2 (en) Image conversion method
US20070198783A1 (en) Method Of Temporarily Storing Data Values In A Memory

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination