CN111402381B - Model rendering method and device, and readable storage medium - Google Patents


Publication number
CN111402381B
CN111402381B (application CN202010185503.6A)
Authority
CN
China
Prior art keywords
model
width
vertex
rendering
contour line
Prior art date
Legal status: Active
Application number
CN202010185503.6A
Other languages
Chinese (zh)
Other versions
CN111402381A (en)
Inventor
宋田骥
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202010185503.6A priority Critical patent/CN111402381B/en
Publication of CN111402381A publication Critical patent/CN111402381A/en
Application granted granted Critical
Publication of CN111402381B publication Critical patent/CN111402381B/en
Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/005 General purpose rendering architectures
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress

Abstract

The application provides a model rendering method and device and a readable storage medium. The method comprises the following steps: obtaining a basic model to be rendered and vertex data of the basic model, wherein the vertex data comprises vertex colors; determining width rendering parameters of a contour line model at each vertex of the basic model, wherein the width rendering parameters are related to the vertex colors; rendering the contour line model corresponding to the basic model according to the width rendering parameters; and outputting a target model according to the basic model to be rendered and its corresponding contour line model. The method solves the problem that existing model rendering methods cannot meet personalized art-style requirements, and supports personalized rendering of the outer contour line.

Description

Model rendering method and device, and readable storage medium
Technical Field
The present application relates to the field of computer graphics, and in particular, to a model rendering method and apparatus, and a readable storage medium.
Background
In real-time rendering of game animation, shaders are of paramount importance. A shader is a program running on a graphics processing unit (GPU) that renders digital asset data produced offline, such as models and textures, to the screens seen by players and spectators. The style into which the offline digital asset data is rendered, e.g., cartoon style or realistic style, is determined by the shader. One of the core technologies of cartoon-style rendering is the rendering of the outer contour line.
Existing model rendering schemes, such as drawing the outer contour line based on the viewing angle and the surface normal, drawing it with a procedural geometry method, or drawing it based on image processing, can only achieve an outer contour rendering effect of uniform width and a similarly uniform, industrialized look. Fig. 1 shows a schematic view of a prior-art outer contour rendering effect; as shown in Fig. 1, the outer contour line of the three-dimensional model has a uniform width.
However, with the development of game animation technology, higher requirements are being placed on the diversification of art styles. Existing model rendering schemes can only produce outer contour lines of uniform width and therefore cannot meet personalized art-style requirements.
Disclosure of Invention
The application provides a model rendering method and device and a readable storage medium, which solve the problem that conventional model rendering methods cannot meet personalized art-style requirements, and which support personalized rendering of the outer contour line.
In a first aspect, the present application provides a model rendering method, including:
obtaining a basic model to be rendered and vertex data of the basic model to be rendered, wherein the vertex data comprise vertex colors;
determining width rendering parameters of an outline model at each vertex in the basic model to be rendered, wherein the width rendering parameters of the outline model are related to the vertex colors;
rendering according to the width rendering parameters of the contour line model to obtain a contour line model corresponding to the basic model to be rendered;
and outputting a target model according to the basic model to be rendered and the corresponding contour line model.
In a second aspect, the present application provides a model rendering apparatus comprising:
the system comprises an acquisition module, a rendering module and a rendering module, wherein the acquisition module is used for acquiring a basic model to be rendered and vertex data of the basic model to be rendered, and the vertex data comprise vertex colors;
the determining module is used for determining width rendering parameters of the contour line model at each vertex in the basic model to be rendered, wherein the width rendering parameters of the contour line model are related to the vertex colors;
the rendering module is used for rendering according to the width rendering parameters of the contour line model to obtain a contour line model corresponding to the basic model to be rendered;
and the output module is used for outputting a target model according to the basic model to be rendered and the contour line model corresponding to the basic model to be rendered.
In a third aspect, the present application provides a model rendering apparatus comprising:
a memory;
a processor; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor to implement the method of any of the first aspects.
In a fourth aspect, the present application provides a computer readable storage medium having stored thereon a computer program for execution by a processor to implement the method of the first aspect.
According to the model rendering method and device and the readable storage medium, the width rendering parameter at each vertex can be determined based on the vertex color in the vertex data of the model, so that the contour line width is rendered in a personalized way and the output target model has a personalized contour line effect. The embodiment of the application thus realizes rendering of personalized outer contour line widths by exploiting the association between vertex color and contour line width, solving the problem that conventional model rendering methods cannot meet personalized art-style requirements.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a schematic diagram of an outer contour rendering effect in the prior art;
FIG. 2 is a schematic flow chart of a model rendering method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of the vertices of a model in an embodiment of the application;
FIG. 4 is a schematic diagram of a model rendering effect according to an embodiment of the present application;
FIG. 5 is a flowchart of another model rendering method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a method for determining a contour line according to an embodiment of the present application;
FIG. 7 is a flowchart of another model rendering method according to an embodiment of the present application;
FIG. 8 is a functional block diagram of a model rendering apparatus according to an embodiment of the present application;
fig. 9 is a schematic physical structure diagram of a model rendering device according to an embodiment of the present application.
Specific embodiments of the present disclosure have been shown by way of the above drawings and will be described in more detail below. These drawings and the written description are not intended to limit the scope of the disclosed concepts in any way, but rather to illustrate the disclosed concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
The model rendering method provided by the application can be applied in a shader. For example, the model rendering method provided by the embodiment of the application can be applied in computer graphics (CG) shading environments such as the High Level Shader Language (HLSL), the Open Graphics Library (OpenGL), or a node editor based on a game engine.
Further, the method can be applied in particular to the Unreal Engine (UE). Illustratively, it may be performed in UE4, and UE4 may implement drawing of the outer contour line based on the vertex data.
Further, the present application can be applied to any electronic device provided with a shader. The electronic device according to the embodiment of the present application may include, but is not limited to, a terminal device. The terminal device may be a wireless terminal or a wired terminal. A wireless terminal may be a device that provides voice and/or other traffic data connectivity to a user, a handheld device with wireless connectivity, or another processing device connected to a wireless modem. The wireless terminal may communicate with one or more core network devices via a radio access network (Radio Access Network, RAN for short). The wireless terminal may be a mobile terminal, such as a mobile phone (or "cellular" phone) or a computer with a mobile terminal, for example a portable, pocket, hand-held, computer-built-in, or vehicle-mounted mobile device that exchanges voice and/or data with the radio access network. For another example, the wireless terminal may be a personal communication service (Personal Communication Service, abbreviated PCS) phone, a cordless phone, a session initiation protocol (Session Initiation Protocol, abbreviated SIP) phone, a wireless local loop (Wireless Local Loop, abbreviated WLL) station, a personal digital assistant (Personal Digital Assistant, abbreviated PDA), or the like. A wireless terminal may also be referred to as a system, Subscriber Unit, Subscriber Station, Mobile Station, Mobile, Remote Station, Remote Terminal, Access Terminal, User Terminal, User Agent, or User Device/User Equipment, without limitation. Optionally, the terminal device may also be a device such as a smart watch or a tablet computer.
The application scenario specific to the embodiment of the application is a contour line rendering scenario for a multi-dimensional model. The multi-dimensional model may include, but is not limited to, a three-dimensional (3D) model. For example, the scenario may be rendering an outer contour for a three-dimensional model, such as rendering an outline for a 3D character (3D mannequin) or an object (3D object model) in a 3D game.
The embodiment of the application is particularly applicable to a scenario in which an artist draws a three-dimensional model; the three-dimensional model can then be given an outline in a personalized style, for example an outline in an ink-wash style.
In another exemplary implementation scenario, the embodiment of the present application may also be applied where a three-dimensional model is rendered and output. For example, an artist can draw a three-dimensional model with a contour line in an ink-wash style; when that three-dimensional model is later output and displayed, it can be rendered according to the model rendering scheme provided by the embodiment of the application.
It should be understood that, in the embodiment of the present application, the rendering style of the model is not particularly limited; the foregoing ink-wash style is an exemplary embodiment, and in an actual scenario, rendering in other styles may also be implemented according to the present scheme. For example, rendering of models in watercolor style, oil-painting style, sketch style, etc., may also be implemented, which is not exhaustively listed here.
When performing outer contour line rendering on a multi-dimensional model, the prior art provides various implementations, which may include, but are not limited to: drawing the outer contour line based on the viewing angle and the surface normal, drawing the outer contour line with a procedural geometry method, drawing the outer contour line based on image processing, and drawing the outer contour line based on contour edges.
The method based on the viewing angle and the surface normal computes the dot product of the view direction and the surface normal, judges whether the surface is nearly perpendicular to the view direction, and draws the outline based on the result. When the procedural geometry method is used, the back-face layer is usually rendered first with the model vertices expanded outward so that the contour is visible; the front-face model is then rendered normally, and the two layers are superimposed to achieve contour rendering of the model. When contour rendering is performed based on image processing, operators can be applied directly to the depth and normal textures of the model, and the outer contour line is then drawn based on the computed result. When the outer contour line is drawn based on contour edges, the edges are detected first and the contour lines are then drawn on them: for each edge, it is checked whether one of the two adjacent triangles faces the viewpoint while the other faces away from it; if so, the shared edge between the two triangles is a contour edge. In addition to using one of the foregoing methods alone, a combination of them may also be used. For example, contour lines may be detected using contour-edge detection, the model and contour edges rendered, the contour lines then identified using image-based processing, and stylized rendering finally performed in image space.
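The viewing-angle and surface-normal test described above can be illustrated with a minimal Python sketch. This is an illustrative assumption, not the patent's implementation (which runs in a shader); the threshold value and function names are invented for the example.

```python
# Sketch of the view-angle / surface-normal contour test described above.
# A surface point lies on the outer contour when the view direction is
# nearly perpendicular to the surface normal, i.e. |dot(view, normal)| ~ 0.
# The threshold and the function names are illustrative assumptions.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def is_contour_point(view_dir, normal, threshold=0.1):
    """Return True if the point faces edge-on to the viewer."""
    return abs(dot(view_dir, normal)) < threshold

# A normal perpendicular to the view direction lies on the silhouette:
print(is_contour_point((0.0, 0.0, 1.0), (1.0, 0.0, 0.0)))  # True
# A normal facing the viewer head-on does not:
print(is_contour_point((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # False
```

In a real shader this dot product is evaluated per fragment or per vertex; the sketch only shows the decision rule.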
The model rendering methods in the prior art can only obtain an outer contour rendering effect of uniform width and style, as shown in Fig. 1. Obviously, this cannot meet personalized art-style requirements. In the prior art, to achieve outer contour effects with varying contour line widths, the user must additionally perform complex post-processing (the post-processing procedure and manner are not limited here); in that case hardware consumption is extremely high, and the operation is complicated and hard to control.
The technical scheme provided by the application aims to solve the technical problems in the prior art.
The following describes the technical scheme of the present application and how the technical scheme of the present application solves the above technical problems in detail with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
The embodiment of the application provides a model rendering method. Referring to fig. 2, the method includes the following steps:
s202, obtaining vertex data of a to-be-rendered basic model and a to-be-rendered model, wherein the vertex data comprise: and (3) vertex color.
As before, the embodiment of the application is applicable to contour line rendering scenarios for multi-dimensional models; therefore, the basic model to be rendered referred to herein may be a 2D model or a 3D model. For ease of comparison and explanation, the model shown in Fig. 1 is taken as the basic model to be rendered in the following description.
For example, referring to Fig. 3, Fig. 3 shows a schematic vertex diagram of a model in an embodiment of the present application. As shown in Fig. 3, the basic model to be rendered may be displayed as a wireframe, and any intersection point of two or more wireframe lines is a vertex.
In other words, for any one base model to be rendered, the base model to be rendered may be provided with a plurality of vertices. The vertex data is the data of each vertex.
Vertex data according to embodiments of the present application may include, but is not limited to, Vertex Color. In addition, vertex data may include, but is not limited to, vertex position and vertex normal direction (Vertex Normal WS).
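As an illustration, the per-vertex data just described could be laid out as follows. This is a hedged sketch: the field names and the use of a dataclass are assumptions for the example, since the patent does not prescribe a data structure.

```python
# Illustrative layout of the per-vertex data described above (names are
# assumptions, not part of the patent): position, normal, and RGB color.
from dataclasses import dataclass

@dataclass
class Vertex:
    position: tuple  # (x, y, z) in model space
    normal: tuple    # Vertex Normal WS, assumed to be a unit vector
    color: tuple     # Vertex Color, (r, g, b), each channel in [0, 1]

v = Vertex(position=(0.0, 1.0, 0.0),
           normal=(0.0, 1.0, 0.0),
           color=(0.8, 0.2, 0.2))  # the R channel later drives the style width
print(v.color[0])  # 0.8
```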
The embodiment of the application does not limit the acquisition mode of the basic model to be rendered and the vertex data thereof. In general, vertex data of a base model to be rendered may be generally carried in the base model to be rendered.
In an exemplary embodiment, the basic model to be rendered is a multi-dimensional model generated using a multi-dimensional model drawing tool, and the vertex data is output data of that tool. Taking the basic model to be rendered as a 3D model as an example, the multi-dimensional model drawing tool may be a digital content creation (DCC) tool, where DCC tools include, but are not limited to, drawing tools such as 3DS MAX and MAYA. When generating (or drawing) a 3D model using a DCC tool, the DCC tool may output the vertex data together with the model.
It should be noted that, in the process of generating or drawing the basic model to be rendered, the vertex data may be selected or set by the user. In particular, taking color data as an example, a user can color the basic model to be rendered with a color brush (also referred to as a paintbrush), avoiding the tedious step of selecting the color of each vertex individually.
S204, determining width rendering parameters of the contour line model at each vertex in the basic model to be rendered, wherein the width rendering parameters of the contour line model are related to the colors of the vertices.
Since the width rendering parameter is associated with the vertex color in the embodiment of the present application, the width rendering parameter at each vertex may be determined based on the vertex data (including, but not limited to, the vertex color) at that vertex position.
The implementation of this step is described in detail later.
S206, rendering according to the width rendering parameters of the contour line model to obtain a contour line model corresponding to the basic model to be rendered.
That is, rendering is performed according to the width rendering parameters of the contour model determined in S204, so as to obtain the rendered contour model.
S208, outputting a target model according to the basic model to be rendered and the corresponding contour line model.
Illustratively, Fig. 4 shows a schematic diagram of a model rendering effect according to an embodiment of the present application. As shown in Fig. 4, after processing by the model rendering method, the target model output by the embodiment of the application has contour lines of different widths, forming a model rendering effect similar to the ink-wash style.
According to the model rendering method and device and the readable storage medium, the width rendering parameter at each vertex can be determined based on the vertex color in the vertex data of the model, so that the contour line width is rendered in a personalized way and the rendered target model has a personalized contour line effect. The embodiment of the application thus realizes rendering of personalized outer contour line widths by exploiting the association between vertex color and contour line width, solving the problem that conventional model rendering methods cannot meet personalized art-style requirements.
The specific implementation manner of the model rendering method provided by the embodiment of the present application will be specifically described based on the embodiment shown in fig. 2.
For example, fig. 5 shows a schematic flow chart of another model rendering method, and as shown in fig. 5, the step S204 may include the following steps:
s2042, for any vertex in the basic model to be rendered, acquiring the style width and standard width of the contour line at the vertex, wherein the style width is related to the color of the vertex.
For example, reference may be made to the schematic diagram of the contour line determination method shown in Fig. 6. In the embodiment shown in Fig. 6, the style width is related to the vertex color and the vertex normal direction, whereas the standard width is related to the vertex normal direction.
In one aspect, as shown in Fig. 6, the style width of the contour line at a vertex may be obtained as follows: determine a first width value from a single-channel color value of the vertex color; multiply a style weight parameter (ink_factor), which adjusts the width variation of the contour line under the corresponding style, by the first width value to obtain a second width value; multiply the vertex normal direction in the vertex data by the second width value to obtain a third width value; and multiply a contour thickness parameter (outline_thickness), which determines the thickness of the contour line, by the third width value to obtain the style width.
The first width value is determined based on a single-channel color value; in practice, the single-channel color value can be used directly as the first width value for subsequent processing. In an actual scenario, the vertex color may be determined by the values of the R, G, and B channels, and in the embodiment of the present application the value of one of these channels is used as the first width value. For example, as shown in Fig. 6, the color value of the R channel may be used as the first width value. In an actual scenario, the value of the G channel or the B channel could equally be used as the first width value to determine the style width.
In this implementation, as shown in Fig. 6, the foregoing process may be implemented by Multiply nodes; Fig. 6 shows multipliers 1 to 3 for determining the style width. The style width serves as one input of the Lerp node (shown as the B input in Fig. 6). It will be appreciated that A and B in any module each represent an input, each module has two or more inputs, and the black dot on the right side of each module in Fig. 6 represents its output.
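The chain of multipliers 1 to 3 described above can be sketched in Python. The parameter names mirror those in the text, but the function itself is an illustrative assumption: the patent implements this as shader/material-graph nodes, not as a CPU function.

```python
# Sketch of the style-width chain of multipliers 1-3 described above:
#   style_width = outline_thickness * (normal * (ink_factor * R))
# The scalar steps scale the vertex normal into a per-vertex width vector.

def style_width(vertex_color, vertex_normal, ink_factor, outline_thickness):
    r = vertex_color[0]                                 # first width value: R channel
    second = ink_factor * r                             # multiplier 1
    third = tuple(n * second for n in vertex_normal)    # multiplier 2
    return tuple(outline_thickness * t for t in third)  # multiplier 3

w = style_width(vertex_color=(0.5, 0.0, 0.0),
                vertex_normal=(0.0, 1.0, 0.0),
                ink_factor=2.0,
                outline_thickness=0.8)
print(w)  # (0.0, 0.8, 0.0)
```

Painting a darker or lighter R channel onto the model thus directly widens or narrows the contour at that vertex, which is the association the patent relies on.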
In the embodiment shown in fig. 6, the style weight parameter, the contour thickness parameter, and the scale parameter mentioned later in the embodiment of the present application may be freely selected and determined by the user. In other words, the user may adjust the width of the contour line by adjusting one or more of the style weight parameter, the contour thickness parameter, and the scale parameter, which will be described in detail later.
On the other hand, in Fig. 6, the standard width of the contour line at a vertex may be obtained as follows: determine a fourth width value from the contour thickness parameter (outline_thickness), which determines the thickness of the contour line; then multiply the vertex normal direction (Vertex Normal WS) in the vertex data by the fourth width value to obtain the standard width.
In the embodiment shown in Fig. 6, the standard width of the contour line is related to the vertex normal direction and the contour thickness parameter. When the user modifies the contour thickness parameter, the standard width of the contour line is therefore directly affected.
In this implementation, as shown in Fig. 6, the standard width can be adjusted and determined through multiplication by multiplier 4 and multiplier 5. The standard width serves as the other input of the Lerp node (shown as the A input in Fig. 6). The inputs of multiplier 4 are the contour thickness parameter and a parameter influence coefficient, where the parameter influence coefficient adjusts how strongly the contour thickness parameter influences the width rendering parameter. In general, the parameter influence coefficient is a preset value, and the user has no permission to modify it. For example, it may be preset to 0.1; in other scenarios it may be preset to other values according to the actual situation, which are not exhaustively listed here.
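The standard-width path through multipliers 4 and 5 can likewise be sketched in Python. This is an illustrative assumption rather than the patent's shader implementation; the 0.1 influence coefficient is the preset value mentioned above.

```python
# Sketch of multipliers 4 and 5 described above:
#   standard_width = normal * (outline_thickness * influence_coefficient)
# The influence coefficient (preset, e.g. 0.1) damps the thickness parameter.

def standard_width(vertex_normal, outline_thickness, influence=0.1):
    fourth = outline_thickness * influence          # multiplier 4
    return tuple(n * fourth for n in vertex_normal) # multiplier 5

print(standard_width((0.0, 1.0, 0.0), outline_thickness=1.0))  # (0.0, 0.1, 0.0)
```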
S2044, determining width rendering parameters at the vertexes according to the style width and the standard width.
Based on the foregoing processing, after the style width and the standard width of the contour line are obtained, this step may determine an adjustment ratio between the style width and the standard width according to a scale parameter (thickness_vercolor_factor), and then combine the style width and the standard width using the adjustment ratio to obtain the width rendering parameter at the vertex.
In particular, the scale parameter may characterize the ratio of the style width to the standard width, the ratio of the standard width to the style width, the proportion of the style width in the width rendering parameter, or the proportion of the standard width in the width rendering parameter. The width rendering parameter is composed of the style width and the standard width.
For example, if the scale parameter represents the proportion of the style width in the width rendering parameter and its value is 0.6, the finally determined contour width is obtained by fusing the style width and the standard width in a 6:4 ratio.
As shown in Fig. 6, this step may be implemented using a Lerp (linear interpolation) node.
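The Lerp blend of the two widths can be sketched as follows. This is a hedged illustration: the patent performs the blend in a material-graph Lerp node, and the concrete numbers here are assumed.

```python
# Sketch of the Lerp node described above: A = standard width,
# B = style width, alpha = scale parameter (thickness_vercolor_factor).

def lerp(a, b, t):
    """Returns a when t == 0 and b when t == 1, blending in between."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

# Blending a standard width (A input) with a style width (B input)
# at a scale parameter of 0.6 gives a 40/60 mix of the two widths:
blended = lerp((0.1, 0.0, 0.0), (0.5, 0.0, 0.0), 0.6)
print(blended[0])  # approximately 0.34
```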
Based on the determined contour line width, when model rendering is performed (S206), the contour line may be rendered on a first layer using the width rendering parameter at each vertex, and the basic model to be rendered may be rendered on a second layer; the second layer is then superimposed on the first layer to obtain the target model, which may be as shown in Fig. 4.
The rendering of the first layer and the second layer may be performed sequentially or simultaneously, which is not particularly limited in the embodiment of the present application.
Specifically, when the contour line is rendered on the first layer, the contour line track of the contour line model corresponding to the basic model to be rendered can be obtained; then, according to the width rendering parameters of the vertices in the basic model and the positional relationship between each track point on the contour line and each vertex, the width rendering parameters along the contour line track are determined; and the contour line model corresponding to the basic model is rendered according to these width rendering parameters.
Thus, the contour line rendering of the basic model to be rendered can be completed, and the target model is obtained and output.
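The layering step above resembles a vertex-extrusion ("inverted hull") pass, in which the contour layer's vertices are pushed outward along their normals by the per-vertex width rendering parameter before the base model is drawn on top. A minimal Python sketch, with an assumed data layout, is:

```python
# Sketch of the two-layer step described above: offset each contour-layer
# vertex by its width rendering parameter (a vector along the normal),
# then the base model would be rendered over it. Data layout is assumed.

def extrude(position, width_offset):
    """Offset one contour-layer vertex by its width rendering parameter."""
    return tuple(p + o for p, o in zip(position, width_offset))

base = [((0.0, 0.0, 0.0), (0.0, 0.1, 0.0)),   # (position, width offset)
        ((1.0, 0.0, 0.0), (0.2, 0.0, 0.0))]
contour_layer = [extrude(p, o) for p, o in base]
print(contour_layer)  # [(0.0, 0.1, 0.0), (1.2, 0.0, 0.0)]
```

Because the offsets vary per vertex with the painted vertex color, the resulting hull, and hence the visible contour, varies in width around the model.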
In any of the foregoing embodiments, when contour rendering is performed, a contour line color (Outline Color) and a contour line material (Backface model) may further be determined. It should be appreciated that the contour line color indicates the color of the contour line to be rendered, and the contour line material indicates the material of the contour line to be rendered on the contour line layer. The color and the material of the contour line can also be selected and determined by the user.
In an exemplary embodiment, before S206 is executed, the width rendering parameter may be determined according to the method shown in Fig. 6, the RGB values of the contour line color may be obtained to determine the contour line color, and the material selected by the user may be obtained as the contour line material, so that the rendering of the contour line layer can be implemented according to the width rendering parameter, the contour line color, and the contour line material.
In addition, in a model drawing scenario or a direct model rendering scenario, the embodiment of the present application further provides a processing manner for adjusting and optimizing the rendering effect of the target model.
In an exemplary embodiment, referring to fig. 7, after outputting the target model, the method further includes the steps of:
S702, in response to a modification instruction for the target model, outputting parameter modification information operable by the user.
The modifiable content covered by the parameter modification information in the embodiment of the present application may include, but is not limited to: one or more of the vertex color, the style weight parameter, the contour thickness parameter, and the scale parameter. It may further include one or more of the contour line material and the contour line color.
In other words, the user can edit any of the foregoing parameter modification information and submit (or confirm) the edits when finished.
S704, in response to a submission instruction for the parameter modification information, re-rendering the base model to be rendered according to the submission instruction to obtain an adjustment model of the target model.
In this embodiment, after the user submits the modification, the width rendering parameters may be redetermined according to the same method as in the previous embodiments (in some embodiments, the contour line color and the contour line material also need to be redetermined), and the model rendering may then be re-performed to obtain the adjustment model of the target model.
S706, outputting an adjustment model of the target model.
After re-rendering, the adjustment model of the target model can be output again.
Steps S702 to S706 shown in fig. 7 may be implemented in a modifier. In the modifier, the user can call up the parameter modification information by an operation, edit it to modify and adjust the modifiable content, and submit it once the modification is completed; the computer side then re-renders the model according to the above method and outputs the rendered model. It should be noted that, in some embodiments, the submission step may be omitted; in that case, as the user inputs the modified adjustment parameters, the model is automatically re-rendered and the rendered model is output.
It is to be understood that some or all of the steps or operations in the above-described embodiments are merely examples, and that embodiments of the present application may also perform other operations or variations of the various operations. Furthermore, the various steps may be performed in a different order than presented in the above embodiments, and not all of the operations in the above embodiments are necessarily performed.
Although the terms "first", "second", etc. may be used in the present application to describe various adjustment parameters, these adjustment parameters should not be limited by these terms. These terms are only used to distinguish one adjustment parameter from another. For example, without changing the meaning of the description, the style weight parameter could be called the contour thickness parameter and, likewise, the contour thickness parameter could be called the style weight parameter, provided that all occurrences of "style weight parameter" are renamed consistently and all occurrences of "contour thickness parameter" are renamed consistently. The style weight parameter and the contour thickness parameter are both adjustment parameters, but need not be the same adjustment parameter.
The terminology used in the present application is used for the purpose of describing embodiments only and is not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this disclosure is meant to encompass any and all possible combinations of one or more of the associated listed items. Furthermore, when used in the present disclosure, the terms "comprises," "comprising," and/or variations thereof mean that the recited features, integers, steps, operations, elements, and/or components are present, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Based on the model rendering method provided by the above embodiment, the embodiment of the present application further provides an apparatus embodiment for implementing each step and method in the above method embodiment.
Referring to fig. 8, an embodiment of the present application provides a model rendering device 800, including:
an obtaining module 82, configured to obtain the base model to be rendered and vertex data of the base model to be rendered, where the vertex data includes vertex colors;
a determining module 84, configured to determine a width rendering parameter of the contour line model at each vertex in the base model to be rendered, where the width rendering parameter of the contour line model is related to the vertex color;
the rendering module 86 is configured to render, according to the width rendering parameter of the contour model, a contour model corresponding to the basic model to be rendered;
and the output module 88 is used for outputting the target model according to the basic model to be rendered and the corresponding contour line model.
In an exemplary embodiment, the determining module 84 is specifically configured to:
for any vertex in the base model to be rendered, acquiring the style width and the standard width of the contour line at the vertex, wherein the style width is related to the color of the vertex;
and determining width rendering parameters at the vertexes according to the style width and the standard width.
In another exemplary embodiment, the determining module 84 is specifically configured to:
determining a first width value according to the single channel color value in the vertex color;
obtaining a product of the style weight parameter and the first width value to obtain a second width value; the style weight parameters are used for adjusting the width change of the contour line under the corresponding style;
obtaining a product of the vertex normal direction and the second width value to obtain a third width value, wherein the vertex normal direction is derived from vertex data;
obtaining the product of the contour thickness parameter and the third width value to obtain the style width; wherein the profile thickness parameter is used to determine the thickness of the profile.
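The chain of products described above can be summarized in a short sketch, treating each width value as a scalar that is finally directed along the vertex normal vector (a non-authoritative illustration; function and variable names are chosen for readability and are not part of the patent):

```python
def style_width(vertex_color, style_weight, normal, thickness, channel=0):
    """Style width of the contour line at a vertex, following the
    product chain described in the embodiment."""
    # First width value: a single channel of the vertex color (here the R channel).
    w1 = vertex_color[channel]
    # Second width value: scaled by the style weight parameter.
    w2 = style_weight * w1
    # Third width value: directed along the vertex normal.
    w3 = tuple(n * w2 for n in normal)
    # Style width: scaled by the contour thickness parameter.
    return tuple(thickness * c for c in w3)
```

Because the first factor comes from the vertex color, painting different color values onto vertices varies the contour width along the model.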
In another exemplary embodiment, the determining module 84 is specifically configured to:
determining a fourth width value according to the profile thickness parameter; the profile thickness parameter is used for determining the thickness of the profile line;
and obtaining the product of the normal direction of the vertex and the fourth width value to obtain the standard width, wherein the normal direction of the vertex is derived from the vertex data.
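Analogously, the standard width reduces to the contour thickness parameter applied along the vertex normal; an illustrative sketch (names are not from the patent):

```python
def standard_width(normal, thickness):
    """Standard width of the contour line at a vertex."""
    # Fourth width value: taken directly from the contour thickness parameter.
    w4 = thickness
    # Standard width: directed along the vertex normal.
    return tuple(n * w4 for n in normal)
```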
In another exemplary embodiment, the determining module 84 is specifically configured to:
according to the proportion parameters, determining an adjustment proportion between the style width and the standard width;
and calculating the style width and the standard width by using the adjustment proportion to obtain the width rendering parameters at the vertex.
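The embodiment does not fix the exact calculation that combines the style width and the standard width; assuming a linear blend controlled by the adjustment ratio, it could be sketched as follows (the interpolation form and all names are assumptions):

```python
def width_render_param(style_w, standard_w, ratio):
    """Width rendering parameter at a vertex, assumed to be a linear blend:
    ratio = 0 gives the standard width, ratio = 1 gives the style width."""
    return tuple((1 - ratio) * sd + ratio * st
                 for st, sd in zip(style_w, standard_w))
```

Under this reading, the scale parameter lets the user fade between a uniform contour and the color-driven stylized contour.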
In another exemplary embodiment, the rendering module 86 is specifically configured to:
acquiring a contour line track of a contour line model corresponding to a basic model to be rendered;
determining width rendering parameters of the contour line track according to the width rendering parameters of each vertex in the basic model to be rendered and the position relation between each track point and each vertex on the contour line;
and rendering according to the width rendering parameters to obtain a contour line model corresponding to the basic model to be rendered.
In another exemplary embodiment, the output module 88 is further configured to output parameter modification information that is available for user operation in response to modification instructions for the target model;
the rendering module 86 is further configured to, in response to a commit instruction for parameter modification information, re-render the base model to be rendered according to the commit instruction, and obtain an adjustment model of the target model;
the output module 88 is further configured to output an adjustment model of the target model.
In another exemplary embodiment, the parameter modification information includes: one or more of vertex color, style weight parameters, contour thickness parameters, and scale parameters.
In another exemplary embodiment, the base model to be rendered is a multi-dimensional model, the base model to be rendered being generated using a multi-dimensional model drawing tool; the vertex data is output data of the multi-dimensional model drawing tool.
The model rendering device 800 of the embodiment shown in fig. 8 may be used to implement the technical solution of the above-described method embodiments; for the implementation principle and technical effects, reference may be made to the related description in the method embodiments. Optionally, the model rendering device 800 may be a server or a terminal.
It should be understood that the above division of the modules of the model rendering device 800 shown in fig. 8 is merely a division of logical functions; in practice, the modules may be fully or partially integrated into one physical entity or may be physically separated. All of these modules may be implemented in the form of software called by a processing element, or all in hardware; alternatively, some modules may be implemented in the form of software called by a processing element and others in hardware. For example, the determining module 84 may be a separately arranged processing element, may be integrated into a chip of the model rendering device 800 (for example, a chip of a terminal), or may be stored in a memory of the model rendering device 800 in the form of program code and called by a processing element of the model rendering device 800 to execute the functions of the module. The implementation of the other modules is similar. In addition, all or part of the modules may be integrated together or implemented independently. The processing element described herein may be an integrated circuit having signal processing capabilities. In implementation, each step of the above method, or each module above, may be implemented by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
For example, the modules above may be one or more integrated circuits configured to implement the above methods, such as one or more application-specific integrated circuits (Application Specific Integrated Circuit, ASIC), one or more digital signal processors (Digital Signal Processor, DSP), or one or more field programmable gate arrays (Field Programmable Gate Array, FPGA). For another example, when a module above is implemented in the form of a processing element scheduling program code, the processing element may be a general-purpose processor, such as a central processing unit (Central Processing Unit, CPU) or another processor that can invoke the program code. For another example, the modules may be integrated together and implemented in the form of a system-on-a-chip (SOC).
Further, referring to fig. 9, an embodiment of the present application provides a model rendering apparatus 800, including: a memory 89; a processor 820; a computer program;
wherein the computer program is stored in the memory 89 and configured to be executed by the processor 820 to implement the method as described in the above embodiments.
The number of the processors 820 in the model rendering device 800 may be one or more, and the processors 820 may also be referred to as processing units, which may implement a certain control function. The processor 820 may be a general purpose processor or a special purpose processor, etc. In an alternative design, processor 820 may also have instructions stored thereon that are executable by processor 820 to cause model rendering device 800 to perform the methods described in the method embodiments above.
In yet another possible design, model rendering device 800 may include circuitry that may implement the functions of transmitting or receiving or communicating in the foregoing method embodiments.
Alternatively, the number of the memories 89 in the model rendering device 800 may be one or more, and the memories 89 may have instructions or intermediate data stored thereon, where the instructions may be executed on the processor 820, so that the model rendering device 800 performs the method described in the above method embodiments. Optionally, other relevant data may also be stored in the memory 89. Instructions and/or data may also optionally be stored in processor 820. The processor 820 and the memory 89 may be provided separately or may be integrated.
In addition, as shown in fig. 9, a transceiver 830 is further provided in the model rendering apparatus 800, where the transceiver 830 may be referred to as a transceiver unit, a transceiver circuit, or a transceiver, etc. for performing data transmission or communication with a test device or other terminal devices, which is not described herein.
As shown in fig. 9, the memory 89, the processor 820 and the transceiver 830 are connected and communicate by a bus.
If the model rendering device 800 is used to implement a method corresponding to that of fig. 2, for example, the transceiver 830 may output the target model or an adjustment model of the target model, and the processor 820 may be used to perform a corresponding determination or control operation, and optionally, may store corresponding instructions in the memory 89. The manner in which the various components are handled may be referred to in the foregoing description of the method embodiments.
Furthermore, an embodiment of the present application provides a readable storage medium having stored thereon a computer program that is executed by a processor to implement the model rendering method according to the method embodiment.
Since each module in the present embodiment is capable of executing the model rendering method shown in the embodiment, a relevant explanation of the method embodiment is referred to for a part not described in detail in the present embodiment.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.

Claims (9)

1. A model rendering method, comprising:
obtaining a basic model to be rendered and vertex data of the basic model to be rendered, wherein the vertex data comprise vertex colors, and the vertex data are selected or set by user definition in the process of generating or drawing the basic model to be rendered;
for any vertex in the basic model to be rendered, acquiring the style width and standard width of the contour line at the vertex, wherein the style width is related to the color of the vertex;
determining a width rendering parameter of a contour line model at the vertex according to the style width and the standard width, wherein the width rendering parameter of the contour line model is related to the vertex color;
rendering according to the width rendering parameters of the contour line model to obtain a contour line model corresponding to the basic model to be rendered;
outputting a target model according to the basic model to be rendered and the corresponding contour line model;
the step of obtaining the style width of the contour line at the vertex comprises the following steps:
determining a first width value according to the single channel color value in the vertex color;
obtaining a product of the style weight parameter and the first width value to obtain a second width value; the style weight parameters are used for adjusting the width change of the contour line under the corresponding style;
obtaining a product of a vertex normal direction and the second width value to obtain a third width value, wherein the vertex normal direction is derived from the vertex data;
obtaining the product of the contour thickness parameter and the third width value to obtain the style width; the profile thickness parameter is used for determining the thickness of a profile line;
the obtaining the standard width of the contour line at the vertex comprises the following steps:
determining a fourth width value according to the profile thickness parameter; the profile thickness parameter is used for determining the thickness of a profile line;
and obtaining the product of the normal direction of the vertex and the fourth width value to obtain the standard width, wherein the normal direction of the vertex is derived from the vertex data.
2. The method of claim 1, wherein the determining the width rendering parameters at the vertices from the style widths and the standard widths comprises:
determining an adjustment ratio between the style width and the standard width according to a ratio parameter;
and calculating the style width and the standard width by using the adjustment proportion to obtain the width rendering parameter at the vertex.
3. The method according to claim 1, wherein the rendering, according to the width rendering parameter of the contour model, obtains a contour model corresponding to the base model to be rendered, includes:
acquiring a contour line track of a contour line model corresponding to the basic model to be rendered;
determining width rendering parameters of the contour line track according to the width rendering parameters of each vertex in the basic model to be rendered and the position relationship between each track point and each vertex on the contour line;
and rendering according to the width rendering parameters to obtain the contour line model corresponding to the basic model to be rendered.
4. The method according to claim 1, wherein the method further comprises:
outputting parameter modification information which can be operated by a user in response to receiving a modification instruction aiming at the target model;
responding to receiving a submitting instruction aiming at the parameter modification information, and adjusting the target model according to the submitting instruction to obtain an adjustment model of the target model;
and outputting an adjustment model of the target model.
5. The method of claim 4, wherein the parameter modification information comprises: one or more of vertex color, style weight parameters, contour thickness parameters, and scale parameters.
6. The method of claim 1, wherein the base model to be rendered is a multi-dimensional model, the base model to be rendered generated using a multi-dimensional model drawing tool; the vertex data is output data of the multi-dimensional model drawing tool.
7. A model rendering apparatus, characterized by comprising:
the system comprises an acquisition module, a rendering module and a rendering module, wherein the acquisition module is used for acquiring a basic model to be rendered and vertex data of the basic model to be rendered, wherein the vertex data comprise vertex colors, and the vertex data are selected or set by user definition in the process of generating or drawing the basic model to be rendered;
a determining module for
For any vertex in the basic model to be rendered, acquiring the style width and standard width of the contour line at the vertex, wherein the style width is related to the color of the vertex;
determining a width rendering parameter of a contour line model at the vertex according to the style width and the standard width, wherein the width rendering parameter of the contour line model is related to the vertex color;
the rendering module is used for rendering according to the width rendering parameters of the contour line model to obtain a contour line model corresponding to the basic model to be rendered;
the output module is used for outputting a target model according to the basic model to be rendered and the corresponding contour line model;
the determining module is specifically configured to:
determining a first width value according to the single channel color value in the vertex color;
obtaining a product of the style weight parameter and the first width value to obtain a second width value; the style weight parameters are used for adjusting the width change of the contour line under the corresponding style;
obtaining a product of a vertex normal direction and the second width value to obtain a third width value, wherein the vertex normal direction is derived from the vertex data;
obtaining the product of the contour thickness parameter and the third width value to obtain the style width; the profile thickness parameter is used for determining the thickness of a profile line;
determining a fourth width value according to the profile thickness parameter; the profile thickness parameter is used for determining the thickness of a profile line;
and obtaining the product of the normal direction of the vertex and the fourth width value to obtain the standard width, wherein the normal direction of the vertex is derived from the vertex data.
8. A model rendering apparatus, characterized by comprising:
a memory;
a processor; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor to implement the method of any of claims 1-6.
9. A computer-readable storage medium, having a computer program stored thereon,
the computer program being executed by a processor to implement the method of any of claims 1-6.
CN202010185503.6A 2020-03-17 2020-03-17 Model rendering method and device, and readable storage medium Active CN111402381B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010185503.6A CN111402381B (en) 2020-03-17 2020-03-17 Model rendering method and device, and readable storage medium


Publications (2)

Publication Number Publication Date
CN111402381A CN111402381A (en) 2020-07-10
CN111402381B true CN111402381B (en) 2023-11-21

Family

ID=71432548

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010185503.6A Active CN111402381B (en) 2020-03-17 2020-03-17 Model rendering method and device, and readable storage medium

Country Status (1)

Country Link
CN (1) CN111402381B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112233215B (en) * 2020-10-15 2023-08-22 网易(杭州)网络有限公司 Contour rendering method, device, equipment and storage medium
CN113538647B (en) * 2021-06-23 2023-09-15 厦门大学 Ink image rendering method
CN114119847B (en) * 2021-12-05 2023-11-07 北京字跳网络技术有限公司 Graphic processing method, device, computer equipment and storage medium
CN114627225A (en) * 2022-03-16 2022-06-14 北京字跳网络技术有限公司 Method and device for rendering graphics and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6988059B1 (en) * 1999-09-14 2006-01-17 Kabushiki Kaisha Square Enix Rendering method and device, game device, and computer-readable recording medium for storing program to render stereo model
CN102708585A (en) * 2012-05-09 2012-10-03 北京像素软件科技股份有限公司 Method for rendering contour edges of models
CN107045729A (en) * 2017-05-05 2017-08-15 腾讯科技(深圳)有限公司 A kind of image rendering method and device
CN109903366A (en) * 2019-03-13 2019-06-18 网易(杭州)网络有限公司 The rendering method and device of dummy model, storage medium and electronic equipment


Also Published As

Publication number Publication date
CN111402381A (en) 2020-07-10


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant