CN113610955A - Object rendering method and device and shader - Google Patents
- Publication number
- CN113610955A (application number CN202110918915.0A)
- Authority
- CN
- China
- Prior art keywords
- illumination
- color
- highlight
- determining
- rendering
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/506—Illumination models
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/812—Ball games, e.g. soccer or baseball
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8011—Ball
Abstract
The application relates to an object rendering method, a device and a shader. When rendering the illumination effect of an object, illumination information of a virtual light source, such as the illumination direction and color, is passed in to replace the real-time lighting mode, and the illumination is divided into a direct illumination part, a diffuse reflection part and a highlight reflection part to simulate real illumination conditions. On this basis, the illumination contribution of each part of the object is calculated, and the final illumination result of the object is obtained by combining the contributions of the parts. The application thus provides a simplified illumination model with which illumination effect rendering under non-real-time illumination is performed on the object. Compared with the prior art, in which complex light components are placed in the game scene and the game engine performs complex illumination calculation, rendering and display on the object in a real-time lighting mode, the performance consumption of real-time illumination processing by the engine can be greatly reduced, and the requirements of customized lighting schemes can be met.
Description
Technical Field
The present application belongs to the field of computer technologies, and in particular, to an object rendering method, an object rendering device, and a shader.
Background
A golf game has a series of themed golf balls, which need to be rendered and displayed during play or when game items are exhibited on the shop shelf.
In the conventional solution, a game engine such as the Unity3d game engine performs illumination calculation on the golf ball in a real-time lighting mode and renders and displays the ball based on the illumination calculation result. However, this approach uses a large number of light components and renders the illumination effect of the golf ball through a complicated calculation process, which incurs a certain performance cost and fails to meet the requirements of a customized illumination scheme.
Disclosure of Invention
In view of the above, the present application provides an object rendering method, an object rendering device and a shader, which use an object rendering scheme based on non-real-time illumination to at least avoid the performance consumption caused by an engine rendering an object in a real-time illumination manner.
The specific technical scheme is as follows:
a method of object rendering, the method comprising:
acquiring display characteristic information of an object to be rendered and illumination information of a virtual light source;
determining the direct illumination color, the diffuse reflection illumination amount and the highlight illumination amount in a highlight illumination area of the object according to the display characteristic information and the illumination information;
determining an illumination result of the object according to the direct illumination color, the diffuse reflection illumination amount and the highlight illumination amount;
and performing illumination effect rendering on the object based on the illumination result.
Optionally, the obtaining of the display characteristic information of the object to be rendered and the illumination information of the virtual light source includes:
acquiring the main texture map color, the diffuse reflection light color, the diffuse reflection light intensity, the highlight intensity, the highlight color and the main texture map transparency of the object;
and acquiring the light color and the illumination direction of the virtual light source.
Optionally, the determining, according to the display characteristic information and the illumination information, a direct illumination color, a diffuse reflection illumination amount, and a highlight illumination amount in a highlight illumination area of the object includes:
determining the direct illumination color of the object according to the color of the main texture map of the object and the light color of the virtual light source;
determining the diffuse reflection illumination amount of the front surface and the back surface of the object according to the illumination direction of the virtual light source, the diffuse reflection light color of the object and the diffuse reflection light intensity;
determining a highlight illumination area of the object, and determining a highlight illumination amount of the object in the highlight illumination area according to the highlight intensity, the highlight color and the main texture map transparency of the object;
determining an illumination result of the object according to the direct illumination color, the diffuse reflection illumination amount, and the highlight illumination amount, including:
and superposing the direct illumination color, the diffuse reflection illumination amount and the highlight illumination amount to obtain an illumination result of the object.
Optionally, the determining the direct illumination color of the object according to the color of the main texture map of the object and the light color of the virtual light source includes:
and calculating the product of the color values of different pixels on the main texture map of the object and the light color value of the virtual light source to obtain the direct illumination colors of different pixels on the main texture map of the object.
Optionally, the determining, according to the illumination direction of the virtual light source, the color of the diffuse reflection light of the object, and the intensity of the diffuse reflection light, the amount of the diffuse reflection illumination of the front and back surfaces of the object includes:
calculating the dot product of the world-space normal direction at different positions on the front surface and the back surface of the object and the illumination direction of the virtual light source;
and multiplying each dot product result by the diffuse reflection light color and the diffuse reflection light intensity of the object to obtain the diffuse reflection illumination amount of the front surface and the back surface of the object.
Optionally, the determining the highlight illumination area of the object includes:
determining a half-angle vector according to the viewing angle direction and the illumination direction of the virtual light source, and characterizing the highlight illumination area of the object by the half-angle vector;
the determining highlight illumination amount of the object in the highlight illumination area according to highlight intensity, highlight color and main texture map transparency of the object comprises:
and multiplying the highlight intensity, the highlight color, the main texture map transparency, and the dot product result of the half-angle vector and the corresponding normal direction to obtain the highlight illumination amount.
Optionally, before determining the illumination result of the object according to the direct illumination color, the diffuse reflection illumination amount, and the highlight illumination amount, the method further includes:
and determining the self-luminous illumination amounts of different pixel points on the object according to the colors of different pixels on the main texture map of the object and the self-luminous color of the object.
Determining an illumination result of the object according to the direct illumination color, the diffuse reflection illumination amount, and the highlight illumination amount, including:
and superposing the direct illumination color, the diffuse reflection illumination amount, the highlight illumination amount and the self-luminous illumination amount to obtain an illumination result of the object.
Optionally, the rendering the illumination effect on the object based on the illumination result includes:
determining a current state of the object; the current state is a semitransparent state or a non-transparent state;
rendering the object with an illumination effect by using a target rendering channel matched with the current state and based on the illumination result;
the different states of the object correspond to different rendering channels, and the different rendering channels respectively perform illumination effect rendering on the object based on different transparencies.
An object rendering apparatus, comprising:
the system comprises an acquisition module, a rendering module and a control module, wherein the acquisition module is used for acquiring display characteristic information of an object to be rendered and illumination information of a virtual light source;
the first determining module is used for determining the direct illumination color, the diffuse reflection illumination amount and the highlight illumination amount in a highlight illumination area of the object according to the display characteristic information and the illumination information;
a second determining module, configured to determine an illumination result of the object according to the direct illumination color, the diffuse reflection illumination amount, and the highlight illumination amount;
and the rendering module is used for rendering the illumination effect of the object based on the illumination result.
A shader having a set of computer instructions embodied therein, the set of computer instructions when executed by a processor implementing a method of object rendering as claimed in any one of the preceding claims.
According to the above scheme, when rendering the illumination effect of an object, the object rendering method, device and shader provided in the embodiments of the present application pass in illumination information of a virtual light source, such as the illumination direction and color, to replace the real-time lighting mode, and divide the illumination into direct illumination, diffuse reflection and highlight reflection to simulate a real illumination situation. On this basis, the illumination contribution of each part of the object is calculated, and the final illumination result is obtained by combining the contributions. The present application thus provides a simplified illumination model with which illumination effect rendering under non-real-time illumination is performed on the object. Compared with the prior art, in which complex light components are placed in the game scene and the game engine performs complex illumination calculation and rendering display on the object in a real-time lighting mode, the performance consumption of real-time illumination processing by the engine can be greatly reduced, and the requirements of a customized illumination scheme can be met.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of an object rendering method according to an embodiment of the present disclosure;
FIG. 2 is a flow chart of the process for calculating the illumination contributions of the direct illumination, diffuse reflection and highlight reflection parts of an object according to an embodiment of the present disclosure;
FIGS. 3 and 4 are diagrams illustrating the illumination rendering effect of a golf ball under a semitransparent rendering channel and a non-transparent rendering channel according to an embodiment of the present application;
FIG. 5 is another schematic flowchart of an object rendering method according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an object rendering apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is obvious that the described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
In the Unity3d game engine, the light components include directional lights, point lights, area lights, and the like. The traditional real-time illumination rendering mode is as follows: light components are placed in the game scene to provide real-time illumination of game objects such as golf balls, and the Unity engine (a real-time content development platform) performs unified illumination calculation and rendering based on the real-time illumination information of these light components. This real-time lighting mode involves comparatively complicated calculation, consumes considerable performance, and cannot satisfy the demands of a customized illumination scheme.
In order to solve the above problem, an embodiment of the present application discloses an object rendering method, an object rendering device, and a shader, which reduce the performance consumption of engine real-time illumination calculation by constructing a simplified illumination model in a non-real-time illumination form. Here, "non-real-time" distinguishes the approach from Unity's built-in real-time-light calculation; it does not mean that the illumination calculation and illumination effect rendering themselves are not performed in real time.
The processing flow of the object rendering method disclosed in the embodiment of the present application is shown in fig. 1, and at least includes:
The "object" to be rendered may refer to, but is not limited to, a virtual model of a ball such as a golf ball, a soccer ball, a table tennis ball, etc. in a scene such as an electronic game, etc., and the present application will mainly take a golf ball as an example to illustrate the scheme.
In this embodiment, the virtual light source is not a real light component and generates no real-time light for the object to be rendered in a scene such as a game; that is, illumination calculation is performed for the object by passing in the illumination information of a virtual light source rather than in a real-time-light-based manner. In addition, the virtual light source in this embodiment can intelligently and dynamically adjust its light color information according to the color of the object (such as a white golf ball or an orange ping-pong ball) to form a highlighting effect on the object.
In this step 101, the obtained display characteristic information of the object to be rendered includes, but is not limited to, the color of the object's main texture map, the diffuse reflection light color, the diffuse reflection light intensity, the highlight intensity, the highlight color, and the main texture map transparency. The illumination information of the virtual light source includes, but is not limited to, the light color and the illumination direction of the virtual light source.
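For concreteness, these gathered inputs can be pictured as two small records. The following Python sketch is illustrative only; all names (VirtualLight, DisplayInfo and their fields) are assumptions of this description, not identifiers from the patent.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class VirtualLight:
    color: np.ndarray       # light color (RGB, components in [0, 1])
    direction: np.ndarray   # unit vector of the illumination direction

@dataclass
class DisplayInfo:
    main_tex_color: np.ndarray    # sampled color of the object's main texture map (RGB)
    main_tex_alpha: float         # main texture map transparency
    diffuse_color: np.ndarray     # diffuse reflection light color (RGB)
    diffuse_intensity: float      # diffuse reflection light intensity
    highlight_color: np.ndarray   # highlight color (RGB)
    highlight_intensity: float    # highlight intensity
```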
Step 102, determining the direct illumination color, the diffuse reflection illumination amount and the highlight illumination amount in the highlight illumination area of the object according to the acquired display characteristic information of the object and the illumination information of the virtual light source.
The illumination is divided into direct illumination, diffuse reflection and highlight reflection to simulate real illumination conditions. Referring to fig. 2, the process of calculating the illumination contributions of the direct illumination, diffuse reflection and highlight reflection parts of the object may be implemented as follows:
In this embodiment, the direct illumination colors of different pixels on the main texture map of the object are obtained by calculating the product of the color values of those pixels and the light color value of the virtual light source. In implementation, the main texture map of the object may be sampled pixel by pixel and the color value of each sampled pixel multiplied by the light color value of the virtual light source; the result of the operation is the direct illumination color of the object.
Specifically, in physical terms, the color of an object perceived by the human eye depends on which wavelengths of the light striking the object are reflected into the eye. For example, a leaf looks green because it reflects the green wavelength band of sunlight or another light source and absorbs the other bands, so the light reaching the human eye is in the green band. Based on this principle, the reflection process is simulated by multiplying the color value of the main texture map by the light color value of the virtual light source; the direct illumination color obtained from this simulated reflection serves as the main illumination amount of the object and represents the object's dominant color information after the light source illuminates it (such as a sphere).
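As a minimal sketch of this per-pixel product (the function name and array shapes are assumptions, not from the patent), the direct illumination color reduces to a component-wise multiplication:

```python
import numpy as np

def direct_illumination(tex_rgb: np.ndarray, light_rgb: np.ndarray) -> np.ndarray:
    """Simulated reflection: component-wise product of the sampled main-texture
    color and the virtual light color. tex_rgb may be one RGB triple or an
    (H, W, 3) array of sampled pixels; numpy broadcasting handles both."""
    return tex_rgb * light_rgb
```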
Specifically, the dot product of the world-space normal direction at different positions on the front and back surfaces of the object and the illumination direction of the virtual light source can be calculated, and each dot product result multiplied by the diffuse reflection light color and the diffuse reflection light intensity of the object to obtain the diffuse reflection illumination amount of the front and back surfaces of the object.
The dot product of the normal direction and the illumination direction of the virtual light source essentially determines whether the angle between the two directions exceeds 90 degrees: if it does, the dot product is negative; otherwise it is positive. In this embodiment, the sign of the dot product therefore indicates whether a position on the object (such as a spherical virtual model) is backlit relative to the human eye/camera; if the illumination falls on the back of the object relative to the eye/camera, the angle between the two vectors exceeds 90 degrees. In addition, the dot product carries not only sign information (representing direction) but also magnitude information, and this magnitude is used in the calculation to measure the diffuse reflection illumination amount of the front and back surfaces.
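A sketch of this calculation for a single surface position follows (names are illustrative). Note that, consistent with the description above, the dot product is deliberately left unclamped so that its sign still distinguishes lit from backlit positions:

```python
def diffuse_illumination(normal: np.ndarray, light_dir: np.ndarray,
                         diffuse_rgb: np.ndarray, diffuse_intensity: float) -> np.ndarray:
    # World-space normal dotted with the illumination direction; a negative
    # value marks a backlit position (angle over 90 degrees).
    n_dot_l = float(np.dot(normal, light_dir))
    return n_dot_l * diffuse_rgb * diffuse_intensity
```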
Step 203, determining the highlight illumination area of the object, and determining the highlight illumination amount of the object in the highlight illumination area according to the highlight intensity, the highlight color and the main texture map transparency of the object.
This embodiment determines the half-angle vector from the viewing angle direction and the illumination direction of the virtual light source. The half-angle vector is the angular bisector of the illumination direction and the viewing angle direction, that is, the unit vector along the sum of the two vectors.
In this embodiment, the half-angle vector is used to calculate the highlight illumination area; that is, the highlight illumination area of the object is characterized by the half-angle vector.
On this basis, the highlight intensity, the highlight color, the main texture map transparency, and the dot product of the half-angle vector and the normal direction of the object are further multiplied together to obtain the highlight illumination amount of the object.
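Putting the half-angle vector and the multiplication together, a sketch for one surface position might look as follows (all names are assumptions; a production shader would often also clamp or exponentiate the dot product, which the patent does not specify):

```python
def highlight_illumination(normal: np.ndarray, view_dir: np.ndarray,
                           light_dir: np.ndarray, highlight_rgb: np.ndarray,
                           highlight_intensity: float, tex_alpha: float) -> np.ndarray:
    # Half-angle vector: unit vector along the sum of the viewing and
    # illumination directions (the angular bisector described above).
    h = view_dir + light_dir
    h = h / np.linalg.norm(h)
    n_dot_h = float(np.dot(normal, h))
    return highlight_intensity * highlight_rgb * tex_alpha * n_dot_h
```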
Step 103, determining the illumination result of the object according to the determined direct illumination color, diffuse reflection illumination amount and highlight illumination amount.
Finally, the direct illumination color, diffuse reflection illumination amount and highlight illumination amount obtained above are superposed to obtain the illumination result of the object.
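Continuing the illustrative names from the earlier sketches, the superposition is a plain sum of the three contributions:

```python
def illumination_result(info: DisplayInfo, light: VirtualLight,
                        normal: np.ndarray, view_dir: np.ndarray) -> np.ndarray:
    return (direct_illumination(info.main_tex_color, light.color)
            + diffuse_illumination(normal, light.direction,
                                   info.diffuse_color, info.diffuse_intensity)
            + highlight_illumination(normal, view_dir, light.direction,
                                     info.highlight_color, info.highlight_intensity,
                                     info.main_tex_alpha))
```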
Based on the calculations in the steps above, this embodiment implements a simplified illumination model by passing in the illumination direction and color information of the virtual light source. The model is a non-real-time illumination model, distinct from the engine's real-time lighting mode, and in implementation it can be applied in a Shader to realize a customized illumination calculation scheme.
Step 104, rendering the illumination effect of the object based on the illumination result of the object.
A ball game such as golf has a series of themed golf balls, and the applicant finds that rendering balls such as golf balls in the game has two characteristics:
1) in terms of display state, the golf ball requires a semi-transparent display in situations such as a green shot where no stroke is played or when the ball is occluded, and a non-transparent display in the UI and in the 3D scene;
2) in terms of illumination, the golf ball must show a good illumination effect and support parameter adjustment under different viewing angles (such as the various viewing angles arising in UI display and during movement and rotation), while at the same time the performance consumption of real-time illumination calculation by the engine must be avoided.
Based on this, on top of the non-real-time illumination model, which supports good illumination effects at different viewing angles and supports parameter adjustment, the embodiment of the present application provides a dual-render-channel (dual-Pass) rendering scheme for the two states of an object, semi-transparent and non-transparent: one channel performs illumination effect rendering on the object (such as a golf ball) in the semi-transparent state, and the other channel performs illumination effect rendering on the object in the non-transparent state.
Therefore, after the illumination result of the object has been calculated through the illumination model, this embodiment further determines whether the current state of the object is semi-transparent or non-transparent, and performs illumination effect rendering on the object based on the calculated illumination result using the target rendering channel matched with the current state. Different states of the object correspond to different rendering channels, and the different rendering channels perform illumination effect rendering on the object with different transparencies.
Referring to fig. 3 and 4, they show the illumination rendering effect of a golf ball under the semi-transparent rendering channel and the non-transparent rendering channel, respectively, according to an embodiment of the present application.
In implementation, the semi-transparent or non-transparent state that the object should have can be determined from the scene information of the object (e.g., whether the object is occluded, whether it is in a green-shot scene where no stroke is played, whether it is in a UI display scene) and/or from the motion state information of the object itself (e.g., whether the object is stationary, moving or rotating). When rendering the object, the rendering channel for the semi-transparent state uses a higher transparency than the rendering channel for the non-transparent state. The selection logic is sketched below.
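The following Python sketch illustrates one possible selection rule; the state test and the transparency values are assumptions of this description (the patent fixes neither), and the pass names are placeholders rather than Unity identifiers:

```python
def choose_render_channel(occluded: bool, green_shot_scene: bool) -> tuple[str, float]:
    # Scene information decides the state; each state maps to its own
    # rendering channel, and the semi-transparent channel renders with a
    # higher transparency (lower alpha) than the non-transparent one.
    if occluded or green_shot_scene:
        return ("semi_transparent_pass", 0.5)   # 0.5 is an illustrative alpha
    return ("opaque_pass", 1.0)
```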
It can be seen from the above solutions that, when rendering the illumination effect of an object, the object rendering method provided in the embodiments of the present application passes in illumination information of a virtual light source, such as the illumination direction and color, to replace the real-time lighting mode, and divides the illumination into direct illumination, diffuse reflection and highlight reflection to simulate real illumination. On this basis, the illumination contribution of each of these parts is calculated, and the final illumination result of the object is obtained by combining the contributions. The present application thus provides a simplified illumination model with which illumination effect rendering under non-real-time illumination is performed on the object. Compared with the prior art, in which complex light components are placed in the game scene and the game engine performs complex illumination calculation and rendering display on the object in a real-time lighting mode, the performance consumption of real-time illumination processing by the engine can be greatly reduced, and the requirements of customized lighting schemes can be met.
In an embodiment, referring to the processing flow of the object rendering method provided in fig. 5, the object rendering method disclosed in the present application may further include, before step 105:
In this embodiment, for some scene themes the object is configured with a self-luminous effect; for example, under a Christmas theme, a fluorescent effect or a flame effect is configured for the golf ball. In this case, a self-luminous illumination amount is introduced when the illumination effect rendering is performed on the object.
The self-luminous illumination amounts of different pixel points on the object can be calculated according to the colors of different pixels (such as the color of each sampling pixel) on the main texture map of the object and the self-luminous color of the object.
On this basis, as shown in fig. 5, step 103 can be further implemented as:
Step 103', superposing the direct illumination color, the diffuse reflection illumination amount, the highlight illumination amount and the self-luminous illumination amount of the object to obtain the illumination result of the object.
That is, the self-luminous illumination amount of the object is further introduced into the calculation of the object's illumination result, as sketched below.
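Extending the earlier sketches (names remain illustrative), the self-luminous term is another per-pixel product, and step 103' simply adds it to the sum:

```python
def self_luminous_illumination(tex_rgb: np.ndarray, emissive_rgb: np.ndarray) -> np.ndarray:
    # Per-pixel product of the main-texture color and the self-luminous color.
    return tex_rgb * emissive_rgb

def illumination_result_with_emissive(info: DisplayInfo, light: VirtualLight,
                                      normal: np.ndarray, view_dir: np.ndarray,
                                      emissive_rgb: np.ndarray) -> np.ndarray:
    return (illumination_result(info, light, normal, view_dir)
            + self_luminous_illumination(info.main_tex_color, emissive_rgb))
```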
Corresponding to the object rendering method, an embodiment of the present application further discloses an object rendering apparatus, where a composition structure of the apparatus is shown in fig. 6, and the apparatus specifically includes:
an obtaining module 601, configured to obtain display characteristic information of an object to be rendered and illumination information of a virtual light source;
a first determining module 602, configured to determine, according to the display characteristic information and the illumination information, the direct illumination color, the diffuse reflection illumination amount, and the highlight illumination amount in a highlight illumination area of the object;
a second determining module 603, configured to determine an illumination result of the object according to the direct illumination color, the diffuse reflection illumination amount, and the highlight illumination amount;
a rendering module 604, configured to perform illumination effect rendering on the object based on the illumination result.
In an embodiment, the obtaining module 601 is specifically configured to:
acquiring the main texture map color, the diffuse reflection light color, the diffuse reflection light intensity, the highlight intensity, the highlight color and the main texture map transparency of the object;
and acquiring the light color and the illumination direction of the virtual light source.
In an embodiment, the first determining module 602 is specifically configured to:
determining the direct illumination color of the object according to the color of the main texture map of the object and the light color of the virtual light source;
determining the diffuse reflection illumination amount of the front surface and the back surface of the object according to the illumination direction of the virtual light source, the diffuse reflection light color of the object and the diffuse reflection light intensity;
determining a highlight illumination area of the object, and determining a highlight illumination amount of the object in the highlight illumination area according to the highlight intensity, the highlight color and the main texture map transparency of the object;
the second determining module 603 is specifically configured to:
and superposing the direct illumination color, the diffuse reflection illumination amount and the highlight illumination amount to obtain an illumination result of the object.
In an embodiment, the first determining module 602, when determining the direct illumination color of the object according to the color of the main texture map of the object and the lighting color of the virtual light source, is specifically configured to:
and calculating the product of the color values of different pixels on the main texture map of the object and the light color value of the virtual light source to obtain the direct illumination colors of different pixels on the main texture map of the object.
In an embodiment, the first determining module 602, when determining the amount of diffuse reflection illumination on the front and back sides of the object according to the illumination direction of the virtual light source, the color of diffuse reflection light of the object, and the intensity of diffuse reflection light, is specifically configured to:
calculating the dot product of the world-space normal direction at different positions on the front surface and the back surface of the object and the illumination direction of the virtual light source;
and multiplying each dot product result by the diffuse reflection light color and the diffuse reflection light intensity of the object to obtain the diffuse reflection illumination amount of the front surface and the back surface of the object.
In an embodiment, the first determining module 602, when determining the highlight illumination area of the object, is specifically configured to: determine a half-angle vector according to the viewing angle direction and the illumination direction of the virtual light source, and characterize the highlight illumination area of the object by the half-angle vector;
the first determining module 602, when determining the highlight illumination amount of the object in the highlight illumination area according to the highlight intensity, the highlight color and the main texture map transparency of the object, is specifically configured to:
and multiplying the highlight intensity, the highlight color, the main texture map transparency, and the dot product result of the half-angle vector and the corresponding normal direction to obtain the highlight illumination amount.
In an embodiment, the first determining module 602 is further configured to:
and determining the self-luminous illumination amounts of different pixel points on the object according to the colors of different pixels on the main texture map of the object and the self-luminous color of the object.
The second determining module 603 is specifically configured to: superpose the direct illumination color, the diffuse reflection illumination amount, the highlight illumination amount and the self-luminous illumination amount to obtain an illumination result of the object.
In an embodiment, the rendering module 604 is specifically configured to:
determining a current state of the object; the current state is a semitransparent state or a non-transparent state;
rendering the object with an illumination effect by using a target rendering channel matched with the current state and based on the illumination result;
the different states of the object correspond to different rendering channels, and the different rendering channels respectively perform illumination effect rendering on the object based on different transparencies.
The object rendering device disclosed in the embodiment of the present application corresponds to the object rendering method disclosed in the method embodiments above, so its description is relatively brief; for the relevant similarities, please refer to the description of the corresponding method embodiment above, which is not repeated here.
In addition, an embodiment of the present application further discloses a shader, in which a computer instruction set is implemented, and when being executed by a processor, the computer instruction set implements the object rendering method disclosed in any of the above method embodiments.
In practical application, the shader provided by the embodiment of the application can be applied to scenes such as games in the forms of plug-ins and the like so as to render the illumination effect of objects in the scenes such as games.
It should be noted that, in the present specification, the embodiments are all described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other.
For convenience of description, the above system or apparatus is described as being divided into various modules or units by function. Of course, when implementing the present application, the functionality of the units may be realized in one or more pieces of software and/or hardware.
From the above description of the embodiments, it is clear to those skilled in the art that the present application can be implemented by software plus a necessary general hardware platform. Based on such understanding, the technical solutions of the present application may be embodied, essentially or in part, in the form of a software product, which may be stored in a storage medium such as a ROM/RAM, a magnetic disk or an optical disk, and which includes several instructions for enabling a computer device (which may be a personal computer, a server, a network device, etc.) to execute the method according to the embodiments or parts of the embodiments of the present application.
Finally, it is further noted that, herein, relational terms such as first, second, third, fourth, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing is only a preferred embodiment of the present application and it should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the present application, and these improvements and modifications should also be considered as the protection scope of the present application.
Claims (10)
1. A method of object rendering, the method comprising:
acquiring display characteristic information of an object to be rendered and illumination information of a virtual light source;
determining the direct illumination color, the diffuse reflection illumination amount and the highlight illumination amount in a highlight illumination area of the object according to the display characteristic information and the illumination information;
determining an illumination result of the object according to the direct illumination color, the diffuse reflection illumination amount and the highlight illumination amount;
and performing illumination effect rendering on the object based on the illumination result.
2. The method according to claim 1, wherein the acquiring display characteristic information of the object to be rendered and illumination information of the virtual light source comprises:
acquiring the main texture map color, the diffuse reflection light color, the diffuse reflection light intensity, the highlight intensity, the highlight color and the main texture map transparency of the object;
and acquiring the light color and the illumination direction of the virtual light source.
3. The method according to claim 2, wherein the determining a direct illumination color, a diffuse reflection illumination amount, and a highlight illumination amount in a highlight illumination area of the object according to the display feature information and the illumination information comprises:
determining the direct illumination color of the object according to the color of the main texture map of the object and the light color of the virtual light source;
determining the diffuse reflection illumination amount of the front surface and the back surface of the object according to the illumination direction of the virtual light source, the diffuse reflection light color of the object and the diffuse reflection light intensity;
determining a highlight illumination area of the object, and determining a highlight illumination amount of the object in the highlight illumination area according to the highlight intensity, the highlight color and the main texture map transparency of the object;
determining an illumination result of the object according to the direct illumination color, the diffuse reflection illumination amount, and the highlight illumination amount, including:
and superposing the direct illumination color, the diffuse reflection illumination amount and the highlight illumination amount to obtain an illumination result of the object.
4. The method of claim 3, wherein determining the direct lighting color of the object from the color of the main texture map of the object and the lighting color of the virtual light source comprises:
and calculating the product of the color values of different pixels on the main texture map of the object and the light color value of the virtual light source to obtain the direct illumination colors of different pixels on the main texture map of the object.
5. The method according to claim 3, wherein the determining the diffuse reflection illumination amount of the front and back surfaces of the object according to the illumination direction of the virtual light source, the diffuse reflection light color of the object and the diffuse reflection light intensity comprises:
calculating the dot product of the world-space normal direction at different positions on the front surface and the back surface of the object and the illumination direction of the virtual light source;
and multiplying each dot product result by the diffuse reflection light color and the diffuse reflection light intensity of the object to obtain the diffuse reflection illumination amount of the front surface and the back surface of the object.
6. The method of claim 3, wherein the determining a highlight illumination area of the object comprises:
determining a half-angle vector according to the viewing angle direction and the illumination direction of the virtual light source, and characterizing the highlight illumination area of the object by the half-angle vector;
the determining highlight illumination amount of the object in the highlight illumination area according to highlight intensity, highlight color and main texture map transparency of the object comprises:
and multiplying the highlight intensity, the highlight color, the main texture map transparency, and the dot product result of the half-angle vector and the corresponding normal direction to obtain the highlight illumination amount.
7. The method of claim 1, further comprising, before determining the illumination result of the object according to the direct illumination color, the diffuse reflection illumination amount, and the highlight illumination amount:
determining the self-luminous illumination amounts of different pixel points on the object according to the colors of different pixels on the main texture map of the object and the self-luminous color of the object;
determining an illumination result of the object according to the direct illumination color, the diffuse reflection illumination amount, and the highlight illumination amount, including:
and superposing the direct illumination color, the diffuse reflection illumination amount, the highlight illumination amount and the self-luminous illumination amount to obtain an illumination result of the object.
8. The method of claim 1, wherein the rendering the lighting effect to the object based on the lighting result comprises:
determining a current state of the object; the current state is a semitransparent state or a non-transparent state;
rendering the object with an illumination effect by using a target rendering channel matched with the current state and based on the illumination result;
the different states of the object correspond to different rendering channels, and the different rendering channels respectively perform illumination effect rendering on the object based on different transparencies.
9. An object rendering apparatus, comprising:
the system comprises an acquisition module, a rendering module and a control module, wherein the acquisition module is used for acquiring display characteristic information of an object to be rendered and illumination information of a virtual light source;
the first determining module is used for determining the direct illumination color, the diffuse reflection illumination amount and the highlight illumination amount in a highlight illumination area of the object according to the display characteristic information and the illumination information;
a second determining module, configured to determine an illumination result of the object according to the direct illumination color, the diffuse reflection illumination amount, and the highlight illumination amount;
and the rendering module is used for rendering the illumination effect of the object based on the illumination result.
10. A shader, wherein a set of computer instructions is implemented in the shader, which when executed by a processor, implement the object rendering method of any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110918915.0A CN113610955A (en) | 2021-08-11 | 2021-08-11 | Object rendering method and device and shader |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110918915.0A CN113610955A (en) | 2021-08-11 | 2021-08-11 | Object rendering method and device and shader |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113610955A (en) | 2021-11-05 |
Family
ID=78308187
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110918915.0A Pending CN113610955A (en) | 2021-08-11 | 2021-08-11 | Object rendering method and device and shader |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113610955A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113989439A (en) * | 2021-11-08 | 2022-01-28 | 成都四方伟业软件股份有限公司 | Visual domain analysis method and device based on UE4 |
CN116091684A (en) * | 2023-04-06 | 2023-05-09 | 杭州片段网络科技有限公司 | WebGL-based image rendering method, device, equipment and storage medium |
WO2024082927A1 (en) * | 2022-10-18 | 2024-04-25 | 腾讯科技(深圳)有限公司 | Hair rendering method and apparatus, device, storage medium and computer program product |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015188749A1 (en) * | 2014-06-10 | 2015-12-17 | Tencent Technology (Shenzhen) Company Limited | 3d model rendering method and apparatus and terminal device |
US20170193690A1 (en) * | 2016-01-04 | 2017-07-06 | Samsung Electronics Co., Ltd. | 3d rendering method and apparatus |
CN106815883A (en) * | 2016-12-07 | 2017-06-09 | 珠海金山网络游戏科技有限公司 | The hair treating method and system of a kind of game role |
CN108305328A (en) * | 2018-02-08 | 2018-07-20 | 网易(杭州)网络有限公司 | Dummy object rendering intent, system, medium and computing device |
CN113223131A (en) * | 2021-04-16 | 2021-08-06 | 完美世界(北京)软件科技发展有限公司 | Model rendering method and device, storage medium and computing equipment |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20211105 |