CN109685869B - Virtual model rendering method and device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN109685869B
Authority
CN
China
Prior art keywords
virtual model
map
rendering
pixel information
weight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811594804.3A
Other languages
Chinese (zh)
Other versions
CN109685869A (en)
Inventor
朱长卫
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201811594804.3A
Publication of CN109685869A
Application granted
Publication of CN109685869B


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/40Filling a planar surface by adding surface attributes, e.g. colour or texture

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The disclosure belongs to the technical field of computer graphics, and relates to a virtual model rendering method and apparatus, a computer-readable storage medium, and an electronic device. The method comprises the following steps: acquiring a grayscale image of an original virtual model, and sampling the grayscale image in a first axial direction to obtain first pixel information; acquiring a weight map, and sampling the weight map in a second axial direction to obtain second pixel information; and determining a texture map according to the first pixel information and the second pixel information, and rendering the original virtual model with the texture map to obtain a target virtual model. The method and apparatus use the weight map to adjust the thickness of the dark-part edge lines, which is flexible and visually pleasing and makes the drawn contour lines more vivid and smooth; moreover, because the rendering operation on the virtual model is transferred to a two-dimensional plane, the processing is simpler, which lowers the performance requirements placed on the electronic device executing the virtual model rendering method.

Description

Virtual model rendering method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer graphics technologies, and in particular, to a virtual model rendering method, a virtual model rendering apparatus, a computer-readable storage medium, and an electronic device.
Background
Cartoon (toon) rendering is an important field in computer graphics and is widely used in online game production. Rather than pursuing highly realistic detail, cartoon rendering pursues stylization: the parts that need emphasis are extracted and expressed in a concentrated, stylized way, while unneeded realistic detail is ignored. Compared with realistic rendering, it uses simpler and clearer representations of color, outline, and the like. For character modeling, rendering a character's hair realistically yields finer detail without much attention to the stylized sense of its curves, whereas cartoon rendering places very exacting demands on the character's hair, body proportions, silhouette, and so on.
Typical cartoon rendering mainly reproduces the bright and dark regions of a model, but when models differ in structure and their surface curvature is non-uniform, the edge delineation of the dark-part contour suffers. For example, on a character's torso and sleeves, the contour line on the sleeve may come out too thin while the line on the relatively flat back comes out too thick, so the line thickness is not controllable. Achieving more rendering styles requires more kinds of stylized expression, and the problem of flexibly controlling the dark-part contour lines therefore needs to be solved.
In view of this, there is a need in the art to develop a new virtual model rendering method and apparatus.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to a virtual model rendering method, a virtual model rendering apparatus, a computer-readable storage medium, and an electronic device, so as to overcome, at least to some extent, the problem that the thickness of the dark-part contour line is uncontrollable when the model surface has varying curvature, a limitation of the related art.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to an aspect of the present disclosure, there is provided a virtual model rendering method, the method including: acquiring a gray level image of an original virtual model; sampling the gray level image in a first axial direction to obtain first pixel information; acquiring a weight map, and sampling the weight map in a second axial direction to obtain second pixel information; and determining a texture map according to the first pixel information and the second pixel information, and rendering an original virtual model by using the texture map to obtain a target virtual model.
In an exemplary embodiment of the present disclosure, the acquiring a grayscale image of an original virtual model includes: acquiring a preset light source vector and a normal vector of an original virtual model; and generating a gray level image of the original virtual model according to the calculation result of the light source vector and the normal vector.
In an exemplary embodiment of the present disclosure, the determining the texture map according to the first pixel information and the second pixel information includes: acquiring weight information of the weight map, and converting the weight information into floating-point weight information; adjusting the second pixel information according to the floating-point weight information; and determining the texture map according to the first pixel information and the adjusted second pixel information.
In an exemplary embodiment of the disclosure, adjusting the mesh structure of the original two-dimensional map model to obtain an optimized two-dimensional map model includes: replacing a first mesh in the mesh structure of the original two-dimensional map model with a second mesh; wherein the second mesh has a mesh shape different from that of the first mesh.
In an exemplary embodiment of the present disclosure, the generating a grayscale image of the original virtual model according to the calculation result of the light source vector and the normal vector includes: performing point multiplication on the light source vector and the normal vector to obtain a gray value; calculating the gray value to obtain a brightness information gray value; and determining the gray image of the original virtual model according to the brightness information gray value.
In an exemplary embodiment of the disclosure, the sampling the weight map in the second axis to obtain second pixel information includes: rendering according to the weight map to obtain a virtual model; and sampling the rendered virtual model, and determining second pixel information of the second axial direction.
In an exemplary embodiment of the present disclosure, the rendering the original virtual model by using the texture map to obtain the target virtual model includes: acquiring a light shadow map corresponding to the original virtual model; and simultaneously rendering the texture mapping and the light shadow mapping of the original virtual model to obtain the target virtual model.
In an exemplary embodiment of the disclosure, the rendering the original virtual model by using the texture map to obtain the target virtual model includes: acquiring a color map and a light shadow map corresponding to the original virtual model; and simultaneously rendering the texture map, the color map and the light shadow map of the original virtual model to obtain the target virtual model.
In an exemplary embodiment of the present disclosure, the texture map includes gray maps displayed in an R channel, a G channel, and/or a B channel.
According to an aspect of the present disclosure, there is provided a virtual model rendering apparatus, the apparatus including: the virtual model comprises a first axial module, a second axial module and a third axial module, wherein the first axial module is configured to acquire a gray level image of an original virtual model, and the gray level image is sampled in a first axial direction to obtain first pixel information; a second axial module configured to obtain a weight map, and sample the weight map in a second axial direction to obtain second pixel information; and the rendering module is configured to determine a texture map according to the first pixel information and the second pixel information, and render the original virtual model by using the texture map to obtain a target virtual model.
According to an aspect of the present disclosure, there is provided an electronic device including: a processor and a memory; wherein the memory has stored thereon computer readable instructions which, when executed by the processor, implement the virtual model rendering method of any of the above exemplary embodiments.
According to an aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a virtual model rendering method in any of the above exemplary embodiments.
Exemplary embodiments of the present disclosure have the following advantageous effects:
in the method and apparatus provided by the exemplary embodiments of the disclosure, a grayscale image is determined from the result of an operation on a light source vector and a normal vector; the grayscale image is sampled to obtain first pixel information, a weight map is sampled to obtain second pixel information, and a texture map is determined from the first and second pixel information to obtain a target virtual model. On the one hand, the weight map allows the thickness of the dark-part edge line to be adjusted in real time, which is more flexible and attractive and makes the rendered line more vivid and smooth; on the other hand, the traditional rendering process for a three-dimensional virtual model is moved to a two-dimensional plane, and the drawn texture map is then used to render the virtual model, so the processing is simpler while achieving the same effect, lowering the performance requirements placed on the electronic device executing the method.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It should be apparent that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived by those of ordinary skill in the art without inventive effort.
FIG. 1 schematically illustrates a flow chart of a virtual model rendering method in an exemplary embodiment of the present disclosure;
FIG. 2 schematically illustrates a flow diagram of a method of generating a grayscale image of a virtual model in an exemplary embodiment of the disclosure;
FIG. 3 schematically illustrates a flow chart for determining a texture map in an exemplary embodiment of the disclosure;
FIG. 4 schematically illustrates a flow diagram for generating a grayscale image from light source vectors and normal vectors in an exemplary embodiment of the disclosure;
FIG. 5 is a schematic diagram illustrating a flow of sampling a weight map to obtain second pixel information according to an exemplary embodiment of the disclosure;
FIG. 6 schematically illustrates a flow diagram for generating a target virtual model in an exemplary embodiment of the disclosure;
FIG. 7 schematically illustrates another flow diagram for generating a target virtual model in exemplary embodiments of the present disclosure;
FIG. 8 schematically illustrates a two-channel overlay display effect in an exemplary embodiment of the disclosure;
fig. 9 (a) schematically illustrates a grayscale image of a sphere in an exemplary embodiment of the disclosure;
FIG. 9 (b) is a schematic representation of a grayscale image of a character model in an exemplary embodiment of the disclosure;
FIG. 10 (a) schematically illustrates a schematic diagram of sampling the X-axis in an exemplary embodiment of the disclosure;
FIG. 10 (b) schematically illustrates a corresponding sphere delineation obtained by sampling the X-axis in an exemplary embodiment of the disclosure;
FIG. 10 (c) schematically illustrates a corresponding character model delineation diagram obtained by sampling the X-axis in an exemplary embodiment of the disclosure;
fig. 11 (a) schematically illustrates a texture map acquired in an exemplary embodiment of the present disclosure;
FIG. 11 (b) is a schematic diagram of a virtual model rendered using texture mapping in an exemplary embodiment of the present disclosure;
FIG. 11 (c) schematically illustrates a sampling of the Y-axis in an exemplary embodiment of the disclosure;
fig. 11 (d) schematically illustrates a corresponding sphere delineation obtained by sampling the Y-axis in an exemplary embodiment of the disclosure;
FIG. 11 (e) is a schematic diagram illustrating a corresponding character model delineation obtained by sampling the Y-axis in an exemplary embodiment of the present disclosure;
FIG. 12 (a) is a schematic diagram illustrating a target virtual model rendered via a light map and a color map of a character model in an exemplary embodiment of the disclosure;
FIG. 12 (b) is a schematic diagram illustrating a target virtual model rendered via a light map, a color map, and a texture map of a character model in an exemplary embodiment of the disclosure;
fig. 13 is a schematic structural diagram of a virtual model rendering apparatus in an exemplary embodiment of the present disclosure;
FIG. 14 schematically illustrates an electronic device for implementing a virtual model rendering method in an exemplary embodiment of the present disclosure;
fig. 15 schematically illustrates a computer-readable storage medium for implementing a virtual model rendering method in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/parts/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first" and "second", etc. are used merely as labels, and are not limiting on the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
For the problems existing in the related art, the present disclosure provides a virtual model rendering method, fig. 1 shows a flow diagram of the virtual model rendering method, and as shown in fig. 1, the virtual model rendering method mainly includes the following steps:
s101, acquiring a gray image of an original virtual model, and sampling the gray image in a first axial direction to obtain first pixel information;
s102, obtaining a weight map, and sampling the weight map in a second axial direction to obtain second pixel information;
and S103, determining a texture map according to the first pixel information and the second pixel information, and rendering the original virtual model by using the texture map to obtain a target virtual model.
In the exemplary embodiments of the disclosure, a grayscale image is determined from the result of an operation on a light source vector and a normal vector; the grayscale image is sampled to obtain first pixel information, a weight map is sampled to obtain second pixel information, and a texture map is determined from the first and second pixel information to obtain a target virtual model. On the one hand, the weight map allows the thickness of the dark-part edge line to be adjusted in real time, which is more flexible and attractive and makes the rendered line more vivid and smooth; on the other hand, the traditional rendering process for a three-dimensional virtual model is moved to a two-dimensional plane, and the drawn texture map is then used to render the virtual model, so the processing is simpler while achieving the same effect, lowering the performance requirements placed on the electronic device executing the method.
The following describes each step of the virtual model rendering method in detail.
In step S101, a grayscale image of the original virtual model is obtained, and the grayscale image is sampled in the first axis to obtain first pixel information.
In an exemplary embodiment of the present disclosure, the illumination direction and intensity of ambient light, that is, light in the scene that has no point source and no direction and lights the original virtual model uniformly, may be simulated by hardware to display the color of the original virtual model. A common rendering approach uses a Blinn-Phong illumination model, or a Lambert illumination model for diffuse reflection. This embodiment may obtain a preset light source vector and the normal vectors of the original virtual model, where a normal vector is perpendicular to the surface at a vertex and characterizes the smoothness of the model. For the purposes of this rendering method, the virtual model may be regarded as composed of vertices, i.e., of points on the model surface. Fig. 2 shows a flow diagram of a method of generating a grayscale image of the virtual model from the light source vector and the normal vectors. As shown in Fig. 2, the method comprises at least the following steps. In step S201, a preset light source vector and the normal vectors of the original virtual model are acquired. Fig. 3 shows a flow diagram for generating the grayscale image from the light source vector and the normal vectors. As shown in Fig. 3, the method comprises at least the following steps. In step S301, the dot product of the light source vector and a normal vector is computed to obtain a gray value. In an exemplary embodiment of the present disclosure, the shading of the surface vertices of the original virtual model is determined using the Lambert illumination model, i.e., the dot product of the light source vector and the normal vector, whose result ranges from -1 to 1.
The grayscale image of the original virtual model is a two-dimensional plane, with U as the horizontal coordinate and V as the vertical coordinate; any pixel of the grayscale image can be located through this planar two-dimensional UV coordinate system. When deriving the grayscale image from the original virtual model, since the model itself carries UV parameters, which locate points on its surface and are likewise two-dimensional, it is easy to put the points of the original virtual model into correspondence with the pixels of the grayscale image by conversion. However, U and V on the grayscale image both range from 0 to 1, a range the raw dot product of the light source vector and the normal vector clearly does not satisfy. In step S302, the gray value is processed to obtain a luminance-information gray value. In an exemplary embodiment of the present disclosure, since the range of the dot product at a vertex of the original virtual model does not coincide with the range of the UV texture coordinates, the dot product result may be remapped. For example, 1 may be added to the result, which is then multiplied by 0.5; or values of the dot product smaller than 0 may be set to 0 while values greater than 0 are retained. In this way, the gray values are confined to the range 0 to 1 and can be defined as luminance-information gray values reflecting the luminance of the vertices, exactly matching the range of the model's UV texture coordinates. In step S303, the grayscale image of the original virtual model is determined from the luminance-information gray values.
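The Lambert shading and remapping steps described above can be sketched in a few lines. This is an illustrative sketch only; the function and parameter names are assumptions, not identifiers from the patent, and both remapping options mentioned in the text are shown.

```python
# Sketch of the Lambert shading step: the dot product of the light source
# vector and a vertex normal lies in [-1, 1] and is remapped into [0, 1]
# to serve as a luminance-information gray value.

def dot(a, b):
    # Plain 3-vector dot product.
    return sum(x * y for x, y in zip(a, b))

def lambert_gray(light_dir, normal, mode="half"):
    """Remap dot(light, normal) from [-1, 1] into [0, 1].

    mode="half":  (d + 1) * 0.5   -- the 'add 1, multiply by 0.5' option
    mode="clamp": max(d, 0)       -- the 'clamp negatives to 0' option
    """
    d = dot(light_dir, normal)
    if mode == "half":
        return (d + 1.0) * 0.5
    return max(d, 0.0)

# A normal facing the light yields 1.0 (brightest); a normal facing away
# yields 0.0 under either remapping (darkest).
print(lambert_gray((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))            # 1.0
print(lambert_gray((0.0, 0.0, 1.0), (0.0, 0.0, -1.0)))           # 0.0
print(lambert_gray((0.0, 0.0, 1.0), (0.0, 0.0, -1.0), "clamp"))  # 0.0
```

Note that the two options differ for side-facing vertices: with "half" a perpendicular normal maps to 0.5, while with "clamp" it maps to 0, which changes where the light-dark boundary falls.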
In an exemplary embodiment of the present disclosure, the dot product of the light source vector and the normal vector is converted into the UV coordinates of each vertex according to the correspondence between the computed luminance-information gray value and the UV coordinates, and the grayscale image of the original virtual model is thereby determined. A pixel with value 0 represents the darkest region, not reached by the light source, and a pixel with value 1 represents the brightest region, directly illuminated by the light source. In step S202, the grayscale image of the original virtual model is generated from the result of the operation on the light source vector and the normal vectors; that is, the grayscale image can be generated by operating on the light source vector and the normal vector of each vertex. The grayscale image is then sampled in a first axial direction to obtain first pixel information. The generated grayscale image is two-dimensionally distributed information that may be sampled, with the sampled pixel values taken as the pixel information along a first axis, for example the X axis: the first pixel information. At this point the sampling coordinate along the second axis, for example the Y axis, is 0, so the bottommost row of pixel values is obtained.
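The first-axis sampling step can be sketched as follows. The small texture contents, the nearest-neighbour sampler, and all names are illustrative assumptions; the point is only that fixing the second-axis coordinate at 0 reads out the bottommost row of the grayscale data.

```python
# Sketch of first-axis (X) sampling: a texture is sampled with v fixed at
# 0, so the bottom row of pixel values becomes the "first pixel information".

def sample(texture, u, v):
    """Nearest-neighbour sample of a row-major texture at normalized (u, v)."""
    rows, cols = len(texture), len(texture[0])
    x = min(int(u * cols), cols - 1)
    y = min(int(v * rows), rows - 1)
    return texture[y][x]

# A 3x4 texture whose bottom row (v = 0) is a dark-to-light ramp.
ramp = [
    [0.0, 0.3, 0.7, 1.0],   # row y = 0, i.e. v = 0
    [0.1, 0.4, 0.8, 1.0],
    [0.2, 0.5, 0.9, 1.0],
]

# Sampling along the first axis with the second-axis coordinate at 0
# yields exactly the bottommost row.
first_pixel_info = [sample(ramp, u / 3.0, 0.0) for u in range(4)]
print(first_pixel_info)  # [0.0, 0.3, 0.7, 1.0]
```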
In step S102, a weight map is obtained, and the weight map is sampled in the second axis direction to obtain second pixel information.
In an exemplary embodiment of the present disclosure, a weight map may be obtained, in which an artist paints black-to-white gray values at different positions of the model, judged by eye, so as to shift the sampling position along the second axis and thereby control the thickness of the dark-part edge line; the map may be stored in any channel as needed. The brightest regions reflect a gray value of 1 and the darkest regions a gray value of 0, and the choice of which brightest and darkest areas to paint is determined by the bright and dark regions desired on the resulting virtual model. In an exemplary embodiment of the disclosure, Fig. 4 shows a flow chart of sampling the weight map along the second axis to obtain the second pixel information. As shown in Fig. 4, the method comprises at least the following steps. In step S401, a virtual model is rendered according to the weight map; that is, the obtained weight map is used to render a virtual model. At this point, since the weight map carries only the gray values 1 and 0 at the brightest and darkest positions, the rendered virtual model is composed of gray patches. Generally, areas of the model with smaller curvature, such as a character's relatively flat back, appear white, while areas with larger curvature, such as the curved part of a sleeve, appear black. To improve the dark-part edge effect, the weight map's gray values may be adjusted to floating-point values, so that the rendered model is composed of continuous grays. In step S402, the rendered virtual model is sampled, and the second pixel information along the second axis is determined.
In an exemplary embodiment of the present disclosure, with the first pixel information along the first axis already determined, the rendered virtual model is sampled to obtain the second pixel information along the second axis.
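The effect of the weight map on the second-axis sampling can be sketched as follows. Using the painted weight directly as the V coordinate is an assumption made for illustration, not the patent's exact formula; the texture contents and names are likewise hypothetical. The sketch shows the key idea: different weights select different rows of the texture, and the rows differ in how wide the dark (contour-line) band is.

```python
# Sketch of second-axis (Y) sampling driven by the weight map: the
# artist-painted weight (0 = strongly curved area, 1 = flat area) shifts
# the V coordinate at which the texture is read, so different surface
# regions pick up different edge-line thicknesses.

def sample(texture, u, v):
    # Nearest-neighbour sample at normalized (u, v).
    rows, cols = len(texture), len(texture[0])
    x = min(int(u * cols), cols - 1)
    y = min(int(v * rows), rows - 1)
    return texture[y][x]

# Rows differ in the width of the dark band: row v = 0 has a wide dark
# band (thick line), the top row a narrow one (thin line).
tex = [
    [0.0, 0.0, 0.0, 1.0],   # v = 0: thick dark band
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 1.0, 1.0, 1.0],   # v near 1: thin dark band
]

def second_axis_sample(texture, u, weight):
    # Illustrative choice: use the weight directly as the V coordinate.
    return sample(texture, u, weight)

# Same brightness u = 0.4: a curved region (weight 0.0) falls inside the
# thick dark band, while a flat region (weight 1.0) falls outside the
# thin one, so the contour line comes out thicker on the curved region.
print(second_axis_sample(tex, 0.4, 0.0))  # 0.0
print(second_axis_sample(tex, 0.4, 1.0))  # 1.0
```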
In step S103, a texture map is determined according to the first pixel information and the second pixel information, and the original virtual model is rendered by using the texture map to obtain a target virtual model.
In an exemplary embodiment of the present disclosure, a texture map may be determined based on the first pixel information and the second pixel information that have been obtained. The texture map may be a grayscale map carried in the R channel, the G channel, and/or the B channel; for example, it may be a grayscale map in the G channel, with that channel used to control the thickness of the edge line. Fig. 5 shows a flowchart for determining the texture map. As shown in Fig. 5, determining the texture map from the first pixel information and the second pixel information comprises at least the following steps. In step S501, the weight information of the weight map is obtained and converted into floating-point weight information. In the exemplary embodiment, the weight map carries only the values 0 and 1, so the weight information can be converted into floating-point values for the subsequent texture-map determination; the edge-line thickness controlled by the floating-point weights is then more precise, and the delineation effect better. In step S502, the second pixel information is adjusted according to the floating-point weight information. The sampled second pixel information may be adjusted according to the converted floating-point weights, after which the second pixel information may be gray-value information with a smooth transition from 0 to 1. In step S503, the texture map is determined from the first pixel information and the adjusted second pixel information.
In an exemplary embodiment of the present disclosure, the texture map may be determined from the obtained first pixel information and the adjusted second pixel information. The texture map may be a grayscale map carried in the R channel, the G channel, and/or the B channel; for example, a grayscale map in the G channel may be used, with the edge-line thickness adjusted through that channel.
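Carrying the edge-line grayscale in a single channel of an RGB texture, as the text describes for the G channel, can be sketched as a packing step. The layout below (R and B left at 0, G holding the gray value) is an assumption for illustration; the patent only states that a grayscale map may be stored in the R, G, and/or B channel.

```python
# Sketch of packing a 2-D grayscale (values in [0, 1]) into the G channel
# of an RGB texture map, so the renderer can read edge-line thickness
# information from that channel alone.

def pack_g_channel(gray_rows):
    """Return RGB pixels with the gray value in G and the other channels 0."""
    return [[(0.0, g, 0.0) for g in row] for row in gray_rows]

gray = [
    [0.0, 0.5],
    [1.0, 0.25],
]
rgb = pack_g_channel(gray)
print(rgb[0][1])  # (0.0, 0.5, 0.0): the gray value lives in the G channel
```

A second grayscale map could be packed into the R channel of the same texture in the same way, which is what produces the green/yellow two-channel overlay effect discussed with Fig. 8 below.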
In an exemplary embodiment of the present disclosure, Fig. 6 illustrates a flow diagram for generating the target virtual model. As shown in Fig. 6, rendering the original virtual model with the texture map to obtain the target virtual model comprises at least the following steps. In step S601, a light-and-shadow map corresponding to the original virtual model is obtained; the light-and-shadow map is a two-dimensional map of the bright and dark regions that appear on the original virtual model under hardware-simulated lighting. In step S602, the texture map and the light-and-shadow map of the original virtual model are rendered simultaneously to obtain the target virtual model: a map carrying both the bright and dark regions and the edge-line thickness is drawn according to the correspondence between the texture map and the light-and-shadow map, and this map is applied to the surface of the original virtual model, completing the rendering and producing the target virtual model.
In an exemplary embodiment of the present disclosure, Fig. 7 shows another flowchart for generating the target virtual model. As shown in Fig. 7, rendering the original virtual model with the texture map to obtain the target virtual model comprises at least the following steps. In step S701, a color map and a light-and-shadow map corresponding to the original virtual model are obtained. The light-and-shadow map is a two-dimensional map of the bright and dark regions presented on the original virtual model under hardware-simulated lighting; the color map carries the color information of the original virtual model. In step S702, the texture map, the color map, and the light-and-shadow map of the original virtual model are rendered simultaneously to obtain the target virtual model: a map carrying the bright and dark regions, the color information of the original virtual model, and the edge-line thickness is drawn according to the correspondence among the texture map, the color map, and the light-and-shadow map, and this map is applied to the surface of the original virtual model, completing the rendering and producing the target virtual model.
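The simultaneous rendering step can be sketched per pixel. A multiplicative blend of the three maps is an assumption for illustration; the patent states that the maps are combined according to their correspondence but does not fix the exact blend formula, and the names below are hypothetical.

```python
# Sketch of compositing one pixel from the three maps of Fig. 7: the
# color-map value is modulated by the light-and-shadow value and by the
# edge-line value from the texture map (0 = on the contour line, 1 = off
# it), so contour-line pixels come out black regardless of lighting.

def composite(color_px, shadow_px, edge_px):
    """color_px: (r, g, b) in [0, 1]; shadow_px, edge_px: scalars in [0, 1]."""
    return tuple(c * shadow_px * edge_px for c in color_px)

# A lit pixel off the contour keeps its color; a pixel on the dark-part
# contour is drawn as a black line.
print(composite((0.8, 0.6, 0.4), 1.0, 1.0))  # (0.8, 0.6, 0.4)
print(composite((0.8, 0.6, 0.4), 1.0, 0.0))  # (0.0, 0.0, 0.0)
```

The Fig. 6 variant is the same sketch with the color map omitted (a constant white color), leaving only the light-and-shadow and edge-line factors.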
The following describes the virtual model rendering method in the embodiment of the present disclosure in detail with reference to an application scenario.
Fig. 8 is a schematic diagram showing the superimposed display effect of the R channel and the G channel. As shown in fig. 8, superimposing the grayscale maps of the R channel and the G channel yields an effect map in which the left portion is displayed as green and the right portion as yellow. To illustrate the problem of controlling the thickness of the edge line vividly, this application scenario uses the grayscale map of the G channel as the sampled texture map. To show more concretely that this virtual model rendering method solves the problem that the edge-line thickness of a virtual model with non-uniform surface curvature is uncontrollable, two rendering objects are used: a sphere with uniform surface curvature is selected as the comparison object, and a character model is selected as the rendering object. Fig. 9(a) shows the grayscale image of the sphere, generated by obtaining a preset light source vector and the normal vector of the sphere and performing operations such as a dot product; fig. 9(b) shows the grayscale image of the character model, generated in the same way. Fig. 10(a) is a schematic diagram of a texture map in which the pixel information in the X-axis direction is obtained by sampling the grayscale image while the sampling values in the Y-axis direction are all 0; the sphere and character model corresponding to this case are shown in fig. 10(b) and 10(c). It can be seen that the edge line at the light-dark boundary of the sphere is uniform in thickness, whereas the edge line at the light-dark boundary of the character model is thick and needs adjustment.
Fig. 11(a) shows the obtained weight map; rendering the virtual model with this weight map produces the model shown in fig. 11(b), in which the back area is displayed in white while the head and sleeve areas are displayed in black. Fig. 11(c) shows the texture map obtained when, in addition to the X-axis sampling, the pixel values in the Y-axis direction are determined by sampling the weight-rendered virtual model. After the pixel information in both axial directions is obtained, the determined texture map is used to render the sphere and the character model respectively, giving the display effects shown in fig. 11(d) and 11(e). As shown, the edge lines of the dark part of the sphere do not change, because its curvature is uniform, while the edge lines of the dark part of the character model become thinner. This shows that, for a virtual model with non-uniform surface curvature, the rendering method solves the problem that the thickness of the dark-part edge lines is uncontrollable. Fig. 12(a) shows the virtual model rendered with only the light shadow map and the color map of the character model; the edge line has not been adjusted and is therefore thick. Fig. 12(b) shows the virtual model rendered with the light shadow map, the color map, and the texture map of the character model; its edge line is thinner than the former, so the edge-drawing effect is better.
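The scenario above can be condensed into a small sketch: the grayscale value comes from the dot product of the preset light source vector and the normal vector, the first-axis coordinate samples that grayscale, and the second-axis coordinate comes from the weight map. All names, the half-Lambert-style remap of the dot product, and the nearest-neighbour sampling below are illustrative assumptions, not the patent's actual shader code.

```python
def half_lambert_gray(light_dir, normal):
    # Dot product of the light source vector and the surface normal,
    # remapped into [0, 1]. Assumption: a half-Lambert-style remap
    # stands in for the patent's unspecified gray-value calculation.
    dot = sum(l * n for l, n in zip(light_dir, normal))
    return max(0.0, min(1.0, dot * 0.5 + 0.5))

def sample_texture(gray_u, weight_v, texture):
    # First-axis coordinate taken from the grayscale image, second-axis
    # coordinate taken from the weight map (nearest-neighbour lookup).
    h, w = len(texture), len(texture[0])
    x = min(w - 1, int(gray_u * w))
    y = min(h - 1, int(weight_v * h))
    return texture[y][x]

# A tiny 2x4 texture map: the low-weight row has a wide dark band
# (thick edge line), the high-weight row a narrow one (thin edge line).
texture = [
    [0.0, 0.0, 1.0, 1.0],   # weight near 0: thick edge band
    [0.0, 1.0, 1.0, 1.0],   # weight near 1: thin edge band
]
g = half_lambert_gray((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))  # facing the light
print(sample_texture(g, 0.9, texture))
```

In this toy setup, pixels with a high weight (such as the character's head and sleeve areas) fall into the row with the narrower dark band, which is one way to picture why their edge lines come out thinner.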
Furthermore, in an example embodiment of the present disclosure, a virtual model rendering apparatus is also provided. Referring to fig. 13, the virtual model rendering apparatus 1300 may include: a first axial module 1301, a second axial module 1302, and a rendering module 1303. Wherein:
a first axial module 1301, configured to obtain a grayscale image of an original virtual model, and sample the grayscale image in a first axial direction to obtain first pixel information; a second axial module 1302 configured to obtain a weight map, and sample the weight map in a second axial direction to obtain second pixel information; and the rendering module 1303 is configured to determine a texture map according to the first pixel information and the second pixel information, and render the original virtual model by using the texture map to obtain a target virtual model.
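The claims additionally recite converting the weight information of the weight texture into floating point form and using it to adjust the second pixel information before the texture map is determined. The sketch below illustrates that step under stated assumptions: the 8-bit integer range and the multiplicative adjustment are assumptions for exposition, not the patent's actual implementation.

```python
def to_float_weight(byte_weight):
    # Assumption: the weight texture stores 8-bit integer values;
    # convert one value to a floating point weight in [0.0, 1.0].
    return byte_weight / 255.0

def adjust_second_pixel(second_pixel, float_weight):
    # Assumption: the adjustment is a simple scaling of the second-axis
    # pixel information by the floating point weight.
    return second_pixel * float_weight

print(adjust_second_pixel(0.5, to_float_weight(128)))
```

The adjusted second pixel information would then be combined with the first pixel information to determine the texture map, as described for the rendering module above.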
The specific details of the virtual model rendering device have been described in detail in the corresponding virtual model rendering method, and therefore are not described herein again.
It should be noted that although several modules or units of the virtual model rendering apparatus 1300 are mentioned in the above detailed description, such division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided and embodied by a plurality of modules or units.
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
An electronic device 1400 according to such an embodiment of the invention is described below with reference to fig. 14. The electronic device 1400 shown in fig. 14 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 14, the electronic device 1400 is embodied in the form of a general-purpose computing device. The components of the electronic device 1400 may include, but are not limited to: at least one processing unit 1410, at least one storage unit 1420, a bus 1430 connecting the various system components (including the storage unit 1420 and the processing unit 1410), and a display unit 1440.
Wherein the storage unit stores program code that is executable by the processing unit 1410, such that the processing unit 1410 performs steps according to various exemplary embodiments of the present invention described in the above section "exemplary methods" of the present specification.
The storage unit 1420 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 1421 and/or a cache memory unit 1422, and may further include a read only memory unit (ROM) 1423.
Storage unit 1420 may also include a program/utility 1424 having a set (at least one) of program modules 1425, such program modules 1425 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 1430 may be any of several types of bus structures, including a storage-unit bus or storage-unit controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
The electronic device 1400 can also communicate with one or more external devices 1600 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1400, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 1400 to communicate with one or more other computing devices. Such communication can occur via an input/output (I/O) interface 1450. Also, the electronic device 1400 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 1460. As shown, the network adapter 1460 communicates with the other modules of the electronic device 1400 over the bus 1430. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 1400, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary method" of this description, when said program product is run on the terminal device.
Referring to fig. 15, a program product 1500 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (10)

1. A method of virtual model rendering, the method comprising:
acquiring a gray image of an original virtual model, and sampling the gray image in a first axial direction to obtain first pixel information;
acquiring a weight map, and sampling the weight map in a second axial direction to obtain second pixel information;
acquiring weight information of a weight texture, and converting the weight information into floating point weight information;
adjusting the second pixel information according to the floating point weight information;
and determining a texture map according to the first pixel information and the adjusted second pixel information, and rendering an original virtual model by using the texture map to obtain a target virtual model.
2. The virtual model rendering method of claim 1, wherein the obtaining a grayscale image of an original virtual model comprises:
acquiring a preset light source vector and a normal vector of an original virtual model;
and generating a gray image of the original virtual model according to the calculation result of the light source vector and the normal vector.
3. The virtual model rendering method of claim 2, wherein the generating a grayscale image of the original virtual model from the calculation results of the light source vector and the normal vector comprises:
performing point multiplication on the light source vector and the normal vector to obtain a gray value;
calculating the gray value to obtain a brightness information gray value;
and determining the gray image of the original virtual model according to the brightness information gray value.
4. The virtual model rendering method of claim 1, wherein the sampling the weight map in the second axis to obtain second pixel information comprises:
rendering according to the weight map to obtain a virtual model;
and sampling the rendered virtual model, and determining pixel information in a second axial direction.
5. The virtual model rendering method of claim 1, wherein the rendering the original virtual model using the texture map to obtain the target virtual model comprises:
acquiring a light shadow map corresponding to the original virtual model;
and simultaneously rendering the texture mapping and the light mapping of the original virtual model to obtain the target virtual model.
6. The virtual model rendering method of claim 1, wherein the rendering the original virtual model using the texture map to obtain the target virtual model comprises:
acquiring a color map and a light shadow map corresponding to the original virtual model;
and simultaneously rendering the texture map, the color map and the light shadow map of the original virtual model to obtain the target virtual model.
7. The virtual model rendering method of any one of claims 1-6, wherein the texture map comprises a grayscale map displayed in an R channel, a G channel, and/or a B channel.
8. A virtual model rendering apparatus, comprising:
the virtual model comprises a first axial module, a second axial module and a third axial module, wherein the first axial module is configured to acquire a gray level image of an original virtual model, and the gray level image is sampled in a first axial direction to obtain first pixel information;
a second axial module configured to obtain a weight map, and sample the weight map in a second axial direction to obtain second pixel information;
the rendering module is configured to acquire weight information of the weight texture and convert the weight information into floating point weight information;
adjusting the second pixel information according to the floating point weight information;
and determining a texture map according to the first pixel information and the adjusted second pixel information, and rendering an original virtual model by using the texture map to obtain a target virtual model.
9. A computer-readable storage medium on which a computer program is stored, the computer program, when being executed by a processor, implementing a virtual model rendering method according to any one of claims 1 to 7.
10. An electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the virtual model rendering method of any one of claims 1-7 via execution of the executable instructions.
CN201811594804.3A 2018-12-25 2018-12-25 Virtual model rendering method and device, storage medium and electronic equipment Active CN109685869B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811594804.3A CN109685869B (en) 2018-12-25 2018-12-25 Virtual model rendering method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811594804.3A CN109685869B (en) 2018-12-25 2018-12-25 Virtual model rendering method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN109685869A CN109685869A (en) 2019-04-26
CN109685869B true CN109685869B (en) 2023-04-07

Family

ID=66189565

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811594804.3A Active CN109685869B (en) 2018-12-25 2018-12-25 Virtual model rendering method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN109685869B (en)

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110196746B (en) * 2019-05-30 2022-09-30 网易(杭州)网络有限公司 Interactive interface rendering method and device, electronic equipment and storage medium
CN110124318B (en) * 2019-06-12 2022-11-04 网易(杭州)网络有限公司 Method and device for making virtual vegetation, electronic equipment and storage medium
CN110390709B (en) * 2019-06-19 2023-01-03 北京巴别时代科技股份有限公司 Cartoon rendering edge-hooking smoothing method
CN110310359B (en) * 2019-06-28 2023-10-24 网易(杭州)网络有限公司 Method and device for transforming object states in game
CN110400372B (en) * 2019-08-07 2023-10-20 网易(杭州)网络有限公司 Image processing method and device, electronic equipment and storage medium
CN111009026B (en) * 2019-12-24 2020-12-01 腾讯科技(深圳)有限公司 Object rendering method and device, storage medium and electronic device
US11276227B2 (en) 2019-12-24 2022-03-15 Tencent Technology (Shenzhen) Company Limited Object rendering method and apparatus, storage medium, and electronic device using a simulated pre-integration map
CN111127623B (en) * 2019-12-25 2023-08-29 上海米哈游天命科技有限公司 Model rendering method and device, storage medium and terminal
CN111145330B (en) * 2019-12-31 2023-06-30 广州方硅信息技术有限公司 Human model rendering method and device, electronic equipment and storage medium
CN111383320B (en) * 2020-03-09 2024-01-26 网易(杭州)网络有限公司 Virtual model processing method, device, equipment and storage medium
CN111402385B (en) * 2020-03-26 2023-11-17 网易(杭州)网络有限公司 Model processing method and device, electronic equipment and storage medium
CN111494945B (en) * 2020-04-22 2024-04-26 网易(杭州)网络有限公司 Virtual object processing method and device, storage medium and electronic equipment
CN111467805B (en) * 2020-05-11 2023-04-07 网易(杭州)网络有限公司 Method and device for realizing dynamic change of virtual scene, medium and electronic equipment
CN111724463B (en) * 2020-06-29 2021-07-02 苏州幻塔网络科技有限公司 Rendering method and device for model line drawing
CN112070873B (en) * 2020-08-26 2021-08-20 完美世界(北京)软件科技发展有限公司 Model rendering method and device
CN112435323B (en) * 2020-11-26 2023-08-22 网易(杭州)网络有限公司 Light effect processing method, device, terminal and medium in virtual model
CN112370783B (en) * 2020-12-02 2024-06-11 网易(杭州)网络有限公司 Virtual object rendering method, device, computer equipment and storage medium
CN112419465B (en) * 2020-12-09 2024-05-28 网易(杭州)网络有限公司 Virtual model rendering method and device
CN112634416B (en) * 2020-12-23 2023-07-28 北京达佳互联信息技术有限公司 Method and device for generating virtual image model, electronic equipment and storage medium
CN112675545B (en) * 2021-01-07 2022-12-13 腾讯科技(深圳)有限公司 Method and device for displaying surface simulation picture, storage medium and electronic equipment
CN113064539B (en) * 2021-03-04 2022-07-29 北京达佳互联信息技术有限公司 Special effect control method and device, electronic equipment and storage medium
CN113034662B (en) * 2021-03-29 2023-03-31 网易(杭州)网络有限公司 Virtual scene rendering method and device, storage medium and electronic equipment
CN113240783B (en) * 2021-05-27 2023-06-27 网易(杭州)网络有限公司 Stylized rendering method and device, readable storage medium and electronic equipment
CN113487717B (en) * 2021-07-13 2024-02-23 网易(杭州)网络有限公司 Picture processing method and device, computer readable storage medium and electronic equipment
CN113521738B (en) * 2021-08-11 2024-07-16 网易(杭州)网络有限公司 Special effect generation method and device, computer readable storage medium and electronic equipment
CN113947657B (en) * 2021-10-18 2024-07-23 网易(杭州)网络有限公司 Rendering method, device, equipment and storage medium of target model
CN114119847B (en) * 2021-12-05 2023-11-07 北京字跳网络技术有限公司 Graphic processing method, device, computer equipment and storage medium
CN117095108B (en) * 2023-10-17 2024-01-23 海马云(天津)信息技术有限公司 Texture rendering method and device for virtual digital person, cloud server and storage medium

Citations (1)

Publication number Priority date Publication date Assignee Title
CN107358643A (en) * 2017-07-04 2017-11-17 网易(杭州)网络有限公司 Image processing method, device, electronic equipment and storage medium

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US7098925B1 (en) * 2000-03-10 2006-08-29 Intel Corporation Shading of images using texture
JP2015228186A (en) * 2014-06-02 2015-12-17 株式会社ソニー・コンピュータエンタテインメント Image processor and image processing method

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN107358643A (en) * 2017-07-04 2017-11-17 网易(杭州)网络有限公司 Image processing method, device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN109685869A (en) 2019-04-26

Similar Documents

Publication Publication Date Title
CN109685869B (en) Virtual model rendering method and device, storage medium and electronic equipment
CN110196746B (en) Interactive interface rendering method and device, electronic equipment and storage medium
US7583264B2 (en) Apparatus and program for image generation
CN106575445B (en) Fur avatar animation
CN106652007B (en) Virtual sea surface rendering method and system
AU2018253460B2 (en) Framework for local parameterization of 3d meshes
US20230120253A1 (en) Method and apparatus for generating virtual character, electronic device and readable storage medium
CN112700528B (en) Virtual object shadow rendering method for head-mounted augmented reality device
CN110163945B (en) Water surface simulation method in real-time rendering
CN112053423A (en) Model rendering method and device, storage medium and computer equipment
CN113144611B (en) Scene rendering method and device, computer storage medium and electronic equipment
CN113012273A (en) Illumination rendering method, device, medium and equipment based on target model
US20080129738A1 (en) Method and apparatus for rendering efficient real-time wrinkled skin in character animation
US20040056859A1 (en) Image generating method, storage medium, image generating apparatus, data signal and program
CN112862943A (en) Virtual model rendering method and device, storage medium and electronic equipment
CN116363288A (en) Rendering method and device of target object, storage medium and computer equipment
US9135746B2 (en) Image processing apparatus and control method thereof
CN109448123A (en) The control method and device of model, storage medium, electronic equipment
CN114332339A (en) System for coloring vector objects
CN113409465A (en) Method and device for generating hair model, storage medium and electronic equipment
JP3278501B2 (en) Image processing apparatus and method
JP2003168130A (en) System for previewing photorealistic rendering of synthetic scene in real-time
CN112465941B (en) Volume cloud processing method and device, electronic equipment and storage medium
CN100568289C (en) Computer drawing element describing method and device
Boorboor et al. VoxAR: Adaptive Visualization of Volume Rendered Objects in Optical See-Through Augmented Reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant