CN110853128A - Virtual object display method and device, computer equipment and storage medium - Google Patents

Virtual object display method and device, computer equipment and storage medium

Info

Publication number: CN110853128A
Application number: CN201911101501.8A
Authority: CN (China)
Prior art keywords: virtual object, parameters, parameter, roughness, illumination
Legal status: Granted; currently active (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN110853128B (en)
Inventor: 叶宬
Current Assignee: Tencent Technology Shenzhen Co Ltd
Original Assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd; priority to CN201911101501.8A
Publication of CN110853128A; application granted and published as CN110853128B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 — 3D [Three Dimensional] image rendering
    • G06T15/50 — Lighting effects
    • G06T15/506 — Illumination models
    • G06T15/005 — General purpose rendering architectures

Abstract

The embodiment of the application discloses a virtual object display method and device, computer equipment, and a storage medium, belonging to the field of computer technology. The method comprises: acquiring multiple parameters of a virtual object; acquiring roughness components of the virtual object in multiple dimensions of a virtual environment according to the roughness parameter; acquiring anisotropic illumination parameters of the virtual object according to the roughness components in the multiple dimensions and the normal parameter; and displaying the virtual object in the virtual environment according to the multiple parameters and the anisotropic illumination parameters, so that the virtual object is rendered with an anisotropic material. Because the illumination distribution is adjusted per dimension according to the roughness components in those dimensions, illumination parameters satisfying anisotropy are obtained; the displayed virtual object appears to be made of an anisotropic material, looks more realistic, and its display effect is improved.

Description

Virtual object display method and device, computer equipment and storage medium
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a virtual object display method, a virtual object display device, computer equipment and a storage medium.
Background
Under illumination from a light source, a real object exhibits anisotropy: the reflected light intensity varies non-uniformly with direction. With the rapid development of computer technology and users' increasing requirements on display quality, a virtual environment generally needs to be constructed in scenes such as electronic games or virtual reality, and virtual objects are created in that environment. To make the display of a virtual object more natural, its anisotropy must be reproduced.
In the related art, a virtual object having isotropy is created, and its color parameters and highlight parameters are adjusted to approximate the appearance of an anisotropic material. However, the illumination distribution produced on the surface of the virtual object still differs from that of a true anisotropic material, and the display effect is poor. For example, for a circular light spot, adjusting the color and highlight parameters can only change the size of the spot or turn the circle into an ellipse; it cannot turn the spot into a ring.
Disclosure of Invention
The embodiment of the application provides a virtual object display method and device, computer equipment, and a storage medium, which can solve the problem of poor virtual object display quality in the related art. The technical scheme is as follows:
in one aspect, a virtual object display method is provided, and the method includes:
acquiring multiple parameters of a virtual object, wherein the multiple parameters comprise a roughness parameter and a normal parameter;
acquiring roughness components of the virtual object in multiple dimensions in a virtual environment according to the roughness parameters;
acquiring anisotropic illumination parameters of the virtual object according to the roughness components on the multiple dimensions and the normal parameters;
and displaying the virtual object in the virtual environment according to the multiple parameters and the anisotropic illumination parameters, so that the material of the virtual object is an anisotropic material.
In another aspect, there is provided a virtual object display apparatus, the apparatus including:
the first parameter acquisition module is used for acquiring a plurality of parameters of the virtual object, wherein the plurality of parameters comprise a roughness parameter and a normal parameter;
the roughness component acquisition module is used for acquiring roughness components of the virtual object in multiple dimensions in a virtual environment according to the roughness parameters;
the first illumination parameter acquisition module is used for acquiring anisotropic illumination parameters of the virtual object according to the roughness components on the multiple dimensions and the normal parameters;
and the display module is used for displaying the virtual object in the virtual environment according to the multiple parameters and the anisotropic illumination parameters so as to enable the material of the virtual object to be anisotropic.
Optionally, the apparatus further comprises:
the second illumination parameter acquisition module is used for acquiring the isotropic illumination parameters of the virtual object according to the roughness parameters and the normal parameters when the indication parameters indicate that the material of the virtual object is isotropic;
the display module is further configured to display the virtual object in the virtual environment according to the multiple parameters and the isotropic illumination parameters, so that the virtual object is made of an isotropic material.
Optionally, when the indication parameter is a first parameter value, indicating that the material of the virtual object is the anisotropic material;
and when the indication parameter is a second parameter value, indicating that the material of the virtual object is the isotropic material.
Optionally, the roughness component obtaining module is further configured to obtain, according to the indication parameter, a rotation parameter matched with the indication parameter, and obtain, according to the roughness parameter and the rotation parameter, roughness components of the virtual object in the multiple dimensions.
Optionally, the roughness component obtaining module is further configured to:
acquiring the roughness components of the virtual object in the multiple dimensions using the following formulas:
αx = α/r;
αy = α·r;
r = (1.0 − 0.9·t)²;
where αx is the roughness component in the first dimension, αy is the roughness component in the second dimension, α is the roughness parameter, r is the rotation parameter, and t is the indication parameter.
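A minimal numeric sketch of these formulas in Python (the function name is illustrative, and the first-dimension component is taken as the complement αx = α/r of αy = α·r, an assumption since the original formula for αx is reproduced only as a figure in the text):

```python
def roughness_components(alpha: float, t: float) -> tuple[float, float]:
    """Split a scalar roughness into per-dimension components.

    alpha: roughness parameter of the virtual object
    t:     indication parameter (t = 0 yields r = 1, i.e. equal,
           isotropic components alpha_x == alpha_y == alpha)
    """
    r = (1.0 - 0.9 * t) ** 2      # rotation parameter derived from t
    alpha_x = alpha / r           # roughness component, first dimension
    alpha_y = alpha * r           # roughness component, second dimension
    return alpha_x, alpha_y
```

Note that the product αx·αy = α² is preserved for any t, so the overall roughness level stays fixed while the anisotropy ratio changes.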
Optionally, the first parameter obtaining module is further configured to obtain a data packet of the virtual object from a cache region, and analyze the data packet to obtain the multiple parameters.
Optionally, the apparatus further comprises:
a second parameter obtaining module, configured to obtain the currently set multiple parameters of the virtual object;
the data packet generating module is used for generating the data packet according to the plurality of parameters;
and the storage module is used for storing the data packet in the cache region by calling a parameter storage interface.
Optionally, the normal parameter includes a normal vector of the virtual object and a micro-surface normal vector of the virtual object, and the second illumination parameter obtaining module is further configured to:
obtaining the isotropic illumination parameters using the following formula:
D_GGX(H) = α² / (π · ((N·H)² · (α² − 1) + 1)²)
where D_GGX(H) is the isotropic illumination parameter, H is the micro-surface normal vector of the virtual object, N is the normal vector of the virtual object, and α is the roughness parameter of the virtual object.
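The formula above is the standard GGX (Trowbridge-Reitz) normal distribution term; a minimal sketch in Python (function name illustrative):

```python
import math


def d_ggx(n_dot_h: float, alpha: float) -> float:
    """GGX normal distribution term D_GGX(H).

    n_dot_h: cosine of the angle between the object normal N and the
             micro-surface normal H (i.e. the dot product N.H)
    alpha:   roughness parameter of the virtual object
    """
    a2 = alpha * alpha
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)
```

At N·H = 1 (micro-surface aligned with the normal) and α = 1 the term evaluates to 1/π; smaller α concentrates the distribution around the normal.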
In another aspect, a computer device is provided, which includes a processor and a memory, the memory having stored therein at least one program code, the at least one program code being loaded and executed by the processor to implement the operations as performed in the virtual object display method.
In another aspect, there is provided a computer-readable storage medium having at least one program code stored therein, the at least one program code being loaded and executed by a processor to implement the operations as performed in the virtual object display method.
In yet another aspect, a computer program is provided, in which at least one program code is stored, the at least one program code being loaded and executed by a processor to implement the operations as performed in the virtual object display method.
With the method, apparatus, computer equipment, and storage medium of the embodiments of the application, multiple parameters of a virtual object are acquired; roughness components of the virtual object in multiple dimensions of the virtual environment are acquired according to the roughness parameter; anisotropic illumination parameters of the virtual object are acquired according to the roughness components in the multiple dimensions and the normal parameter; and the virtual object is displayed in the virtual environment according to the multiple parameters and the anisotropic illumination parameters, so that the virtual object is rendered with an anisotropic material. Because the illumination distribution is adjusted per dimension according to the roughness components in those dimensions, illumination parameters satisfying anisotropy are obtained; the displayed virtual object appears to be made of an anisotropic material, looks more realistic, and its display effect is improved.
Moreover, the indication parameter determines whether the material of the virtual object is isotropic or anisotropic; a virtual object meeting the material requirement can be displayed simply by setting or updating the indication parameter, so the operation is flexible and simple and the applicable range is broad.
In addition, in the process of acquiring the multiple parameters of the virtual object, the currently set parameters are first acquired, a data packet is generated from them, and the packet is stored in a cache region by calling a parameter setting interface; the packet is then acquired from the cache region and parsed to recover the parameters. By storing the parameters this way, for a virtual object that has both a displayable region and a non-displayable region, only the parameters of the displayable region need to be read from the cache region when the illumination parameters are computed, which reduces the amount of computation.
In addition, with the method of the embodiment of the application, the anisotropic material of the virtual object is realized through online rendering: the rendering speed is high, the efficiency is high, the occupied storage space is small, and a better rendering effect can be achieved.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a virtual object display method according to an embodiment of the present disclosure.
FIG. 2 is a diagram illustrating a texture editor interface according to an embodiment of the present disclosure.
Fig. 3 is a flowchart of parameter storage according to an embodiment of the present application.
Fig. 4 is a flowchart of a virtual object display method according to an embodiment of the present application.
Fig. 5 is a schematic diagram illustrating a display effect of an isotropic material according to an embodiment of the present disclosure.
Fig. 6 is a schematic diagram illustrating a display effect of an anisotropic material according to an embodiment of the present disclosure.
Fig. 7 is a schematic diagram illustrating a display effect of an anisotropic material according to an embodiment of the present disclosure.
Fig. 8 is a schematic diagram illustrating a display effect of another isotropic material according to an embodiment of the present disclosure.
FIG. 9 is a schematic diagram of a display effect of another anisotropic material according to an embodiment of the present application.
Fig. 10 is a schematic diagram illustrating a display effect of an anisotropic material according to an embodiment of the present application.
Fig. 11 is a schematic structural diagram of a virtual object display apparatus according to an embodiment of the present application.
Fig. 12 is a schematic structural diagram of another virtual object display apparatus according to an embodiment of the present application.
Fig. 13 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application more clear, the embodiments of the present application will be further described in detail with reference to the accompanying drawings.
It will be understood that the terms "first," "second," and the like as used herein may be used herein to describe various concepts, which are not limited by these terms unless otherwise specified. These terms are only used to distinguish one concept from another. For example, a first dimension may be referred to as a second dimension, and a second dimension may be referred to as a first dimension, without departing from the scope of the present application.
In order to facilitate understanding of the technical processes of the embodiments of the present application, some terms referred to in the embodiments of the present application are explained below:
Illumination model: describes the color value of an object at a point under the influence of light rays, viewing angle, and so on. Examples include the Phong (specular reflection) model, the Blinn-Phong (modified specular reflection) model, and the Cook-Torrance (micro-surface) model.
BRDF (Bidirectional Reflectance Distribution Function): defines how radiance arriving from a given incident direction contributes to radiance leaving in a given outgoing direction; it describes how incident light is distributed across the outgoing directions after being reflected by a surface, and can describe ideal specular reflection, diffuse reflection, isotropic reflection, anisotropic reflection, and so on.
Deferred rendering: the rendering process for a light-receiving object comprises the following steps:
1. Calculate the shape of the light-receiving object.
2. Acquire the surface material characteristic parameters of the light-receiving object, such as the normal parameters and the BRDF.
3. Calculate the illumination parameters, including the illumination direction, illumination intensity, and so on.
4. Display the color of the light-receiving object according to the influence of the illumination parameters and the surface material characteristic parameters on its surface.
In deferred rendering, steps 1 and 2 are executed first and the acquired parameters are stored in a buffer; the parameters are then extracted from the buffer and steps 3 and 4 are executed, which reduces the illumination computation for the parts of polygons that overlap on the screen. Other rendering modes execute all four steps in a single pass.
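The two-phase split above can be sketched schematically as follows (a toy CPU-side model under assumed names; a real pipeline runs per pixel on the GPU):

```python
def geometry_pass(objects):
    """Phase 1: run steps 1-2 per object and write the results to a buffer."""
    g_buffer = {}
    for obj in objects:
        # step 1: shape; step 2: surface material parameters (normal, BRDF inputs)
        g_buffer[obj["id"]] = {"normal": obj["normal"],
                               "roughness": obj["roughness"]}
    return g_buffer


def lighting_pass(g_buffer, light_intensity):
    """Phase 2: read the buffer back and run steps 3-4 once per stored entry."""
    colors = {}
    for obj_id, params in g_buffer.items():
        # steps 3-4: illumination parameters and final color; the toy model
        # simply makes smoother surfaces reflect more of the light
        colors[obj_id] = light_intensity * (1.0 - params["roughness"])
    return colors
```

Because the lighting pass reads only what survived into the buffer, overlapping geometry does not pay for illumination twice.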
Anisotropy: the property that all or some of the physical, chemical, or other characteristics of an object vary with direction. In this application, it refers to the non-uniform variation of light intensity in different directions when a light-receiving object is illuminated.
G-Buffer (Geometry Buffer): a buffer area that stores the parameters of light-receiving objects.
The embodiment of the application can be applied to various scenes.
For example, when the method provided by the embodiment of the present application is applied to a game scene, a virtual environment is created in a game application, and a virtual object can be displayed in the virtual environment.
When applied to a virtual reality scene, the user shoots a picture of a real scene with the terminal's camera, and a virtual object is added to that picture; displaying the virtual object with the method provided by the embodiment of the application then realizes the anisotropy of the virtual object.
Fig. 1 is a flowchart of a virtual object display method according to an embodiment of the present disclosure. An execution subject of the embodiment of the application is a computer device, and referring to fig. 1, the method includes:
101. Acquire the multiple parameters set for the virtual object.
The virtual object is an object placed in a virtual environment; it can be a dynamic object such as a character, an animal, or a plant, or a static object such as a table or a chair. The multiple parameters are the parameters needed to display the virtual object in the virtual environment; the material, color, and so on of the virtual object can be determined from them. Once the parameters of the virtual object are determined, it can be displayed according to them, and the displayed object matches the parameters and meets their requirements.
The virtual object may consist of a single object model or be composed of multiple object models. For example, several cuboids of different sizes can be combined into a virtual staircase. When the virtual object includes multiple object models, the parameters set for each object model are acquired, and the virtual object is subsequently displayed according to the parameters of all the models.
In one possible implementation, the plurality of parameters includes a roughness parameter and a normal parameter.
The roughness parameter describes the micro-surface distribution attribute of the virtual object's material; roughness in different dimensions affects the illumination parameters of the virtual object differently. By setting the roughness parameter, the surface roughness of the virtual object, when it is later displayed according to that parameter, matches the parameter and meets the roughness requirement. Roughness influences the display effect: when roughness is small, the reflection of light hitting the surface is concentrated; when roughness is large, the reflected illumination spreads over a larger area and is more dispersed. With a roughness parameter in only one dimension, an isotropic illumination parameter can be obtained; with roughness parameters in two or more dimensions, anisotropic illumination parameters can be obtained.
The normal parameter is used to represent the orientation of the virtual object surface, and includes the normal vector of the virtual object and the micro-surface normal vector of the virtual object. By setting the normal parameters, when the virtual object is displayed according to the normal parameters subsequently, the orientation of the surface of the virtual object is matched with the normal parameters, and the requirements of the normal parameters are met.
The normal vector of the virtual object is used for expressing the normal direction of the virtual object, namely the direction perpendicular to the surface of the virtual object; meanwhile, the surface of the virtual object is provided with a plurality of uneven micro mirror surfaces, the micro mirror surfaces jointly form the surface of the virtual object, and the normal vector of the micro surface of the virtual object is used for expressing the normal direction of the micro mirror surfaces.
In addition, the multiple parameters may further include a rotation parameter, which enables the highlight portion of the virtual object to rotate. The illumination of an anisotropic material is distributed differently in different dimensions, and the rotation parameter gives the illumination the ability to rotate; for example, it can be set to rotate a vertical highlight into a horizontal one.
Optionally, a virtual camera is arranged in the virtual environment; the picture of the virtual scene displayed by the computer device simulates the process of the virtual camera shooting the virtual environment. The picture obtained by shooting the virtual environment with the virtual camera is the picture formed by projecting the virtual environment onto the virtual camera's display screen.
The virtual camera can rotate in the virtual environment to adjust its shooting angle; once the angle is adjusted, the displayed picture changes accordingly, as does the appearance of the displayed virtual object. Therefore, taking the influence of the virtual camera's shooting angle into account, the unit vector along the camera's shooting direction and the normal vector of the virtual object can be added to obtain the micro-surface normal vector.
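A small sketch of the vector sum described above, following the text as written (adding the unit vector along the camera's shooting direction to the object's normal vector and normalizing; note that many renderers instead build the half vector from the view and light directions — the function name here is illustrative):

```python
import math


def micro_surface_normal(view_dir, normal):
    """Sum the unit view-direction vector and the object normal, then normalize."""
    summed = [v + n for v, n in zip(view_dir, normal)]
    length = math.sqrt(sum(c * c for c in summed))
    return [c / length for c in summed]
```

When the camera looks straight along the normal, the result is the normal itself; as the viewing angle tilts, the micro-surface normal tilts halfway toward the view direction.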
In one possible implementation, the plurality of parameters further includes an indication parameter, and the indication parameter is used to indicate whether the material of the virtual object is an anisotropic material.
Setting the indication parameter determines the material of the virtual object to be displayed. When the indication parameter indicates an anisotropic material, the anisotropic illumination parameters must be acquired, so that when the virtual object is subsequently displayed according to them, the illumination distribution on its surface satisfies the illumination distribution of an anisotropic material, realizing an anisotropic material for the virtual object. When the indication parameter indicates an isotropic material, the isotropic illumination parameters must be acquired instead, so that the surface illumination distribution satisfies that of an isotropic material.
Optionally, whether the material of the virtual object is anisotropic is indicated by a preset parameter value: when the indication parameter is the first parameter value, the material is anisotropic; when it is the second parameter value, the material is isotropic. The first parameter value is any value different from the second; for example, if the second parameter value is 0, the first parameter value is any non-zero value.
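A minimal sketch of this dispatch, assuming the second parameter value is 0 as in the example above (the function and parameter names are illustrative, not from the patent):

```python
def select_illumination_mode(t: float, second_parameter_value: float = 0.0) -> str:
    """Choose the illumination path from the indication parameter t.

    t equal to the second parameter value selects the isotropic path;
    any other value (the 'first parameter value') selects the anisotropic path.
    """
    return "isotropic" if t == second_parameter_value else "anisotropic"
```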
In one possible implementation, the plurality of parameters further includes an initial color value parameter, a metal parameter, a highlight parameter, and the like. The initial color value parameter is used for representing an initial color value set for the virtual object, the metal parameter is used for representing that the material of the virtual object is a metal material, and the highlight parameter is used for representing the brightness of a highlight part displayed by the virtual object, and the parameters have influence on the display effect of the virtual object.
In a possible implementation, a technician sets the multiple parameters of a virtual object through the parameter interfaces in a material editor. Referring to fig. 2, the material editor interface provides an initial color value parameter interface, a metal parameter interface, a highlight parameter interface, a roughness parameter interface, a normal parameter interface, an indication parameter interface, a rotation parameter interface, and so on. While the virtual object is displayed, the technician can also adjust each parameter by triggering its interface.
102. Generate a data packet from the multiple parameters.
103. Store the data packet in a cache region by calling a parameter setting interface.
The parameter setting interface is used for transmitting parameters, and each parameter has a corresponding parameter setting interface. For example, there are an indication parameter interface for transmitting an indication parameter, and a rotation parameter interface for transmitting a rotation parameter.
In a possible implementation manner, the Buffer may be a G-Buffer, a custom data interface is set in the Buffer, two new parameter setting interfaces, that is, CustomData0 and CustomData1, may be obtained by modifying the type of the custom data interface in the Buffer, and the indication parameter and the rotation parameter may be cached by calling the two parameter setting interfaces.
Referring to fig. 3, when storing each parameter, an initial color value parameter, a metal parameter, an indication parameter, a rotation parameter, and the like are stored in a buffer, wherein the indication parameter and the rotation parameter are parameters added by calling a new parameter setting interface after the interface in the buffer is expanded.
For example, by calling CustomData0 and CustomData1, the indication parameter and the rotation parameter are stored in the buffer (saturate clamps each value to the range [0, 1] before it is written):
GBuffer.CustomData.r = saturate(GetMaterialCustomData0(MaterialParameters));
GBuffer.CustomData.g = saturate(GetMaterialCustomData1(MaterialParameters));
It should be noted that in the embodiment of the present application the multiple parameters are packed into a data packet and cached; in other embodiments, other caching methods may be used.
104. Acquire the data packet from the cache region, parse it to obtain the multiple parameters, and execute step 105 or step 107.
After the parameters are stored in the cache region, the data packet is acquired from the cache region and parsed to obtain the parameters.
In one possible implementation, all stored data packets are acquired from the buffer and parsed to obtain the multiple parameters; the subsequent illumination-parameter calculation is then based on all the parameters.
In another possible implementation, based on the case in step 101 where the virtual object includes multiple object models, the virtual object composed of those models contains a displayable region and a non-displayable region, because the models may overlap. Since only the illumination parameters of the displayable region need to be calculated later, only the parameters of the displayable region need to be acquired from the cache region; those of the non-displayable region are not acquired, reducing unnecessary computation.
For example, for a virtual staircase formed by combining several cuboids of different sizes, the cuboids overlap, and the overlapping part is a non-display area. When the illumination parameters of the virtual staircase are later calculated, only those of the display area are needed, so only the parameters of the display area are acquired, not those of the non-display area.
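A minimal sketch of reading back only the displayable region's parameters (the function name and data layout are assumptions for illustration):

```python
def parameters_for_displayable_region(cached_params, displayable_ids):
    """Parse back only the parameters belonging to the displayable region.

    cached_params:   mapping from object-model id to its parameter dict
    displayable_ids: ids of the model parts that are actually visible
    """
    return {obj_id: params for obj_id, params in cached_params.items()
            if obj_id in displayable_ids}
```

The non-displayable entries stay in the cache but never reach the illumination calculation.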
105. When the indication parameter indicates that the material of the virtual object is an anisotropic material, acquire the roughness components of the virtual object in multiple dimensions in the virtual environment according to the roughness parameter.
The material of the virtual object is determined according to the indication parameter. In one possible implementation, when the indication parameter is the first parameter value, the material of the virtual object is an anisotropic material; when it is the second parameter value, the material is an isotropic material.
When the indication parameter indicates that the material of the virtual object is an anisotropic material, roughness components of the virtual object in multiple dimensions in the virtual environment are acquired according to the roughness parameter, and the virtual object is subsequently displayed according to those roughness components. The roughness of the virtual object in each dimension matches the acquired component, so the roughness requirement of an anisotropic material can be met.
A coordinate system may be set for the virtual environment, and the coordinate system may include two directions of an x axis and a y axis, or include three directions of an x axis, a y axis, and a z axis, or may also be another form of coordinate system. The plurality of dimensions may include any direction in the coordinate system. For example, the plurality of dimensions include a first dimension and a second dimension, the first dimension is an x-axis, and the second dimension is a y-axis, or the plurality of dimensions include a first dimension, a second dimension, and a third dimension, the first dimension is an x-axis, the second dimension is a y-axis, and the third dimension is a z-axis.
In one possible implementation, the coordinate system provided in the virtual environment is a coordinate system of a tangential space, the coordinate system including an x-axis, a y-axis, and a z-axis. The z-axis is the normal direction of the virtual object surface, and the x-axis and the y-axis are two mutually perpendicular directions parallel to the virtual object surface. In the embodiment of the present application, the multiple dimensions are two directions perpendicular to each other on the surface of the virtual object, and then the roughness components in two directions, i.e., the x axis and the y axis, can be obtained.
In one possible implementation manner, according to the indication parameters, rotation parameters matched with the indication parameters are obtained, and according to the roughness parameters and the rotation parameters, roughness components of the virtual object in multiple dimensions are obtained.
The roughness components of the virtual object in each dimension can be obtained by adopting the following formula:
αx=α/r;
αy=α·r;
r=(1.0-0.9*t)²;
wherein αx is the roughness component in the first dimension, αy is the roughness component in the second dimension, α is the roughness parameter of the virtual object, r is the rotation parameter of the virtual object, and t is the indication parameter of the virtual object.
The above manner is described only with roughness components in two dimensions as an example. In another embodiment, roughness components in more dimensions may be acquired to make the acquired anisotropic illumination parameters more accurate; the acquisition manner is similar to the above, or other manners may be adopted.
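A minimal sketch of the formulas above. Since the first formula is an unreadable figure in this copy, the reading αx = α/r (the mirror image of the stated αy = α·r) is a reconstruction:

```python
def roughness_components(alpha, t):
    """Split the scalar roughness into two per-dimension components.

    alpha: roughness parameter of the virtual object
    t:     indication parameter, from which the rotation parameter r is derived
    alpha_x = alpha / r is a reconstruction; alpha_y = alpha * r and
    r = (1.0 - 0.9 * t) ** 2 are stated in the text.
    """
    r = (1.0 - 0.9 * t) ** 2   # rotation parameter matched to the indication parameter
    alpha_x = alpha / r
    alpha_y = alpha * r
    return alpha_x, alpha_y
```

With t = 0 the rotation parameter is 1 and both components equal the scalar roughness, i.e. the split degenerates to the isotropic case.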
106. Acquire anisotropic illumination parameters of the virtual object according to the roughness components in the multiple dimensions and the normal parameters, and execute step 108.
The anisotropic illumination parameters of the virtual object are used for representing the illumination intensity of each point in the virtual object, and viewed from the whole virtual object, the anisotropic illumination parameters of the virtual object are used for representing the illumination distribution condition of the surface of the whole virtual object.
In one possible implementation, the following formula may be used to obtain the anisotropic illumination parameters:
Daniso(H)=1/(π·αx·αy·((X·H/αx)²+(Y·H/αy)²+(N·H)²)²);
wherein Daniso(H) is the anisotropic illumination parameter of the virtual object, H is the micro-surface normal vector of the virtual object, N is the normal vector of the virtual object, X is the normal component in the first dimension, Y is the normal component in the second dimension, αx is the roughness component in the first dimension, and αy is the roughness component in the second dimension.
When this formula is used to calculate the illumination parameters, roughness components in two dimensions are adopted, so the illumination distribution can be adjusted in both dimensions and the anisotropy can be embodied. Therefore, when the illumination parameters are acquired, anisotropic illumination parameters can be obtained according to the roughness components in the multiple dimensions.
It should be noted that in another embodiment, the anisotropic illumination parameter may be obtained by other methods, for example, a WARD function (a micro-surface distribution function) or other functions may be used.
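The distribution above can be sketched as follows. This is a transcription of the reconstructed anisotropic GGX-style formula; vector names follow the text, and unit-length tangent-space vectors are assumed:

```python
import math

def d_aniso(h, n, x, y, alpha_x, alpha_y):
    """Anisotropic micro-surface distribution D_aniso(H).

    h: micro-surface normal vector, n: normal vector,
    x, y: directions of the first and second dimensions,
    alpha_x, alpha_y: roughness components in those dimensions.
    """
    dot = lambda a, b: sum(ai * bi for ai, bi in zip(a, b))
    inner = (dot(x, h) / alpha_x) ** 2 + (dot(y, h) / alpha_y) ** 2 + dot(n, h) ** 2
    return 1.0 / (math.pi * alpha_x * alpha_y * inner ** 2)

# When alpha_x == alpha_y, the expression reduces to the isotropic
# distribution of step 107, which is a useful sanity check.
```

This reduction to the isotropic case when both components coincide is why a single indication parameter can switch the material between the two models.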
107. When the indication parameter indicates that the material of the virtual object is an isotropic material, acquire isotropic illumination parameters of the virtual object according to the roughness parameter and the normal parameters, and execute step 108.
In one possible implementation, the following formula is used to obtain the isotropic illumination parameters:
DGGX(H)=α²/(π·((N·H)²·(α²-1)+1)²);
wherein DGGX(H) is the isotropic illumination parameter of the virtual object, H is the micro-surface normal vector of the virtual object, N is the normal vector of the virtual object, and α is the roughness parameter of the virtual object.
It should be noted that in another embodiment, other methods may be used to obtain the isotropic illumination parameters, for example, a WARD function or other functions may be used.
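Under the same assumptions as the previous sketch, the isotropic case is a direct transcription of the formula (the standard GGX/Trowbridge-Reitz distribution):

```python
import math

def d_ggx(h, n, alpha):
    """Isotropic micro-surface distribution D_GGX(H).

    h: micro-surface normal vector, n: normal vector,
    alpha: roughness parameter of the virtual object.
    """
    n_dot_h = sum(ni * hi for ni, hi in zip(n, h))
    denom = n_dot_h ** 2 * (alpha ** 2 - 1.0) + 1.0
    return alpha ** 2 / (math.pi * denom ** 2)

# At h == n with alpha == 1 the surface is maximally rough and the
# distribution is uniform: D = 1/pi.
```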
108. The virtual object is displayed in the virtual environment in accordance with each parameter of the virtual object.
After each parameter of the virtual object is obtained, the virtual object is displayed in the virtual environment according to the influence of each parameter on the color value of the virtual object.
The color value of the virtual object represents the color displayed by the virtual object, and the color value is influenced by a plurality of parameters such as an illumination parameter, an initial color value parameter, a metal parameter and radiance. The radiance can be calculated and obtained by adopting the BRDF, and is used for reflecting the color displayed by the virtual object.
In a possible implementation manner, after the anisotropic illumination parameters of the virtual object are obtained, the virtual object is displayed in the virtual environment according to the multiple parameters set for the virtual object and the obtained anisotropic illumination parameters, so that the material of the virtual object is an anisotropic material.
In another possible implementation manner, after the isotropic illumination parameters of the virtual object are obtained, the virtual object is displayed in the virtual environment according to the multiple parameters set for the virtual object and the obtained isotropic illumination parameters, so that the material of the virtual object is an isotropic material.
In a possible implementation mode, the method is applied to Unreal (Unreal Engine), and the buffer area is the G-Buffer in Unreal. The anisotropic material is applied to Unreal's deferred rendering process: by extending custom parameters in the G-Buffer, the G-Buffer can store the indication parameter and the rotation parameter, and an anisotropic model is added to the illumination part of deferred rendering, realizing an adjustable anisotropic material rendering technique.
For example, referring to fig. 4, the flow of the display method of the virtual object is as follows: the acquired multiple parameters are stored in the G-Buffer; the indication parameter is acquired from the G-Buffer, and the material of the virtual object is determined according to it; when the material of the virtual object is an isotropic material, an isotropic illumination parameter is acquired by an isotropic method, and the color value of the virtual object is displayed; when the material of the virtual object is an anisotropic material, an anisotropic illumination parameter is acquired by an anisotropic method, and the color value of the virtual object is displayed.
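A hedged sketch of this dispatch. The concrete encoding below is an assumption; the text only states that a first parameter value selects the anisotropic material and a second value the isotropic one:

```python
FIRST_PARAMETER_VALUE = 1   # assumed encoding: anisotropic material
SECOND_PARAMETER_VALUE = 0  # assumed encoding: isotropic material

def choose_illumination_model(indication_parameter):
    """Select the illumination calculation according to the indication parameter."""
    if indication_parameter == FIRST_PARAMETER_VALUE:
        return "anisotropic"
    if indication_parameter == SECOND_PARAMETER_VALUE:
        return "isotropic"
    raise ValueError("unknown indication parameter")
```

Setting or updating this single stored value is all it takes to switch a virtual object between the two material models.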
It should be noted that, in the embodiment of the present application, only deferred rendering is taken as an example. In another embodiment, a forward rendering manner may be adopted, in which steps 102 to 104 of the embodiment are not executed; that is, the parameters are not stored, and the acquired parameters are directly used for the illumination calculation.
The method adopted by the embodiment of the application obtains multiple parameters of the virtual object, obtains roughness components of the virtual object in multiple dimensions in the virtual environment according to the roughness parameters, obtains anisotropic illumination parameters of the virtual object according to the roughness components and the normal parameters in the multiple dimensions, and displays the virtual object in the virtual environment according to the multiple parameters and the anisotropic illumination parameters so as to enable the virtual object to be made of anisotropic materials. According to the method, the illumination distribution is adjusted on multiple dimensions according to the roughness components on the multiple dimensions, so that the illumination parameters meeting the anisotropy are obtained, the displayed virtual object is made of anisotropic materials, the reality is higher, and the display effect of the virtual object is improved.
And the indication parameters are adopted to determine whether the material of the virtual object is isotropic material or anisotropic material, and the virtual object meeting the material requirement can be displayed only by setting or updating the indication parameters, so that the operation is flexible and simple, and the application range is comprehensive.
In the process of acquiring the multiple parameters of the virtual object, the multiple parameters set for the virtual object are first acquired, a data packet is generated according to them, the data packet is stored in the cache region by calling the parameter setting interface, and the data packet is then acquired from the cache region and analyzed to obtain the multiple parameters. By storing the multiple parameters, they can be acquired from the cache region when the illumination parameters are calculated; for a virtual object with a displayable region and a non-displayable region, only the multiple parameters of the displayable region need to be acquired from the cache region, and the illumination parameters are obtained from them, reducing the amount of calculation.
By adopting the method in the embodiment of the application, the display effects obtained for the virtual object are shown in figs. 5-9. In fig. 5, the material of the virtual object is isotropic: the highlight portion has a gradual-change effect in both the vertical and horizontal directions, and a stripe-shaped highlight effect such as that of brushed stainless steel or silk cannot be formed in the vertical direction. In fig. 6, the material of the virtual object is anisotropic: the highlight portion maintains the same brightness in the vertical direction, forming a clear strip-shaped highlight band, and the highlight distributions in the horizontal and vertical directions differ. Moreover, because the application also sets the rotation parameter, the direction of the vertical strip-shaped highlight band in fig. 6 can be rotated; the effect obtained after rotation is shown in fig. 7, where the angle of the strip-shaped highlight band is changed but the band remains clear.
As shown in figs. 8 and 9, for a human eye, fig. 8 shows the effect of an isotropic material, with a more obvious highlight at the corner of the eye, while fig. 9 shows the effect of an anisotropic material. Because the anisotropic material is provided with a rotation parameter, the highlight portion is rotated based on it, so the original vertical highlight can be changed into a horizontal highlight, making the highlight transition at the corner of the eye more natural.
As shown in fig. 10, the number indicates the degree of anisotropy, and represents a process of gradually realizing the anisotropy, wherein 0.01 indicates that the degree of anisotropy is small, the highlight portion appears as a light spot on the sphere, the light spot gradually changes with the increase of the degree of anisotropy, and when the degree of anisotropy reaches 1, the highlight portion appears as a circular ring on the sphere, and the anisotropy is realized.
The virtual object display method in the embodiment of the application obtains the color value of the virtual object by online rendering. In the related art, the color value of the virtual object is obtained by offline rendering, and the anisotropic material effect is realized by 3D software or an offline renderer, for example, Maya (a 3D software) or Arnold (an offline renderer). The rendering mode of an offline renderer differs from real-time rendering: offline rendering describes the specific implementation of the BRDF, but it is combined with the offline ray-tracing flow rather than the real-time deferred rendering flow. The rendering effect is good, but the rendering speed is slow, the efficiency is low, and a large storage space is occupied.
By adopting the method in the embodiment of the application, when the anisotropic material of the virtual object is realized, an online rendering mode is adopted, the rendering speed is high, the efficiency is high, the occupied storage space is small, and a better rendering effect can be realized.
Fig. 11 is a schematic structural diagram of a virtual object display apparatus according to an embodiment of the present application. Referring to fig. 11, the apparatus includes:
a first parameter obtaining module 1101, configured to obtain multiple parameters of a virtual object, where the multiple parameters include a roughness parameter and a normal parameter;
a roughness component obtaining module 1102, configured to obtain roughness components of the virtual object in multiple dimensions in the virtual environment according to the roughness parameter and the rotation parameter;
a first illumination parameter obtaining module 1103, configured to obtain an anisotropic illumination parameter of the virtual object according to the roughness components in the multiple dimensions and the normal parameters;
the display module 1104 is configured to display the virtual object in the virtual environment according to the multiple parameters and the anisotropic illumination parameters, so that the material of the virtual object is an anisotropic material.
Optionally, the multiple parameters further include an indication parameter, where the indication parameter is used to indicate whether a material of the virtual object is an anisotropic material;
the roughness component obtaining module 1102 is further configured to:
and when the indication parameters indicate that the material of the virtual object is an anisotropic material, acquiring roughness components of the virtual object in multiple dimensions according to the roughness parameters and the rotation parameters.
Optionally, referring to fig. 12, the apparatus further comprises:
a second illumination parameter obtaining module 1105, configured to obtain, when the indication parameter indicates that the material of the virtual object is an isotropic material, an isotropic illumination parameter of the virtual object according to the roughness parameter and the normal parameter;
the display module 1104 is further configured to display the virtual object in the virtual environment according to the multiple parameters and the isotropic lighting parameter, so that the material of the virtual object is an isotropic material.
Optionally, when the indication parameter is the first parameter value, indicating that the material of the virtual object is an anisotropic material;
and when the indication parameter is the second parameter value, indicating that the material of the virtual object is an isotropic material.
Optionally, the roughness component obtaining module 1102 is further configured to obtain, according to the indication parameter, a rotation parameter matched with the indication parameter, and obtain, according to the roughness parameter and the rotation parameter, roughness components of the virtual object in multiple dimensions.
Optionally, the roughness component acquiring module 1102 is further configured to acquire roughness components of the virtual object in multiple dimensions by using the following formula:
αx=α/r;
αy=α·r;
r=(1.0-0.9*t)²;
wherein αx is the roughness component in the first dimension, αy is the roughness component in the second dimension, α is the roughness parameter, r is the rotation parameter, and t is the indication parameter.
Optionally, the first parameter obtaining module 1101 is further configured to obtain a data packet of the virtual object from the cache region, and analyze the data packet to obtain multiple parameters.
Optionally, referring to fig. 12, the apparatus further comprises:
a second parameter obtaining module 1106, configured to obtain a plurality of currently set parameters of the virtual object;
a data packet generating module 1107, configured to generate a data packet according to multiple parameters;
the storage module 1108 is configured to store the data packet in the cache region by calling the parameter storage interface.
Optionally, the normal parameter includes a normal vector of the virtual object and a micro-surface normal vector of the virtual object, and the first illumination parameter obtaining module 1103 is further configured to:
the following formula is used to obtain the anisotropic illumination parameters:
Daniso(H)=1/(π·αx·αy·((X·H/αx)²+(Y·H/αy)²+(N·H)²)²);
wherein Daniso(H) is the anisotropic illumination parameter of the virtual object, H is the micro-surface normal vector of the virtual object, N is the normal vector of the virtual object, X is the normal component in the first dimension, Y is the normal component in the second dimension, αx is the roughness component in the first dimension, and αy is the roughness component in the second dimension.
Optionally, the normal parameter includes a normal vector of the virtual object and a micro-surface normal vector of the virtual object, and the second illumination parameter obtaining module 1105 is further configured to:
the following formula is adopted to obtain isotropic illumination parameters:
DGGX(H)=α²/(π·((N·H)²·(α²-1)+1)²);
wherein DGGX(H) is the isotropic illumination parameter of the virtual object, H is the micro-surface normal vector of the virtual object, N is the normal vector of the virtual object, and α is the roughness parameter of the virtual object.
It should be noted that: in the virtual object display apparatus provided in the above embodiment, when displaying a virtual object, only the division of the function modules is exemplified, and in practical applications, the function distribution may be completed by different function modules according to needs, that is, the internal structure of the computer device is divided into different function modules to complete all or part of the functions described above. In addition, the virtual object display apparatus provided in the above embodiments and the virtual object display method embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments, and are not described herein again.
Fig. 13 is a schematic structural diagram of a terminal 1300 according to an embodiment of the present application.
In general, terminal 1300 includes: a processor 1301 and a memory 1302.
Processor 1301 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. The processor 1301 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1301 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also referred to as a CPU (Central Processing Unit); a coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1301 may be integrated with a GPU (Graphics Processing Unit) for rendering and drawing content required to be displayed on the display screen. In some embodiments, processor 1301 may further include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
Memory 1302 may include one or more computer-readable storage media, which may be non-transitory. The memory 1302 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 1302 is used to store at least one instruction for execution by processor 1301 to implement the virtual object display method provided by the method embodiments herein.
In some embodiments, terminal 1300 may further optionally include: a peripheral interface 1303 and at least one peripheral. Processor 1301, memory 1302, and peripheral interface 1303 may be connected by a bus or signal line. Each peripheral device may be connected to the peripheral device interface 1303 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1304, touch display 1305, camera 1306, audio circuitry 1307, positioning component 1308, and power supply 1309.
Peripheral interface 1303 may be used to connect at least one peripheral associated with I/O (Input/Output) to processor 1301 and memory 1302. In some embodiments, processor 1301, memory 1302, and peripheral interface 1303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1301, the memory 1302, and the peripheral device interface 1303 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 1304 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1304 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1304 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1304 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 1304 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1304 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1305 is used to display a UI (user interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1305 is a touch display screen, the display screen 1305 also has the ability to capture touch signals on or over the surface of the display screen 1305. The touch signal may be input to the processor 1301 as a control signal for processing. At this point, the display 1305 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, display 1305 may be one, providing the front panel of terminal 1300; in other embodiments, display 1305 may be at least two, either on different surfaces of terminal 1300 or in a folded design; in still other embodiments, display 1305 may be a flexible display disposed on a curved surface or on a folded surface of terminal 1300. Even further, the display 1305 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display 1305 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or the like.
The camera assembly 1306 is used to capture images or video. Optionally, camera assembly 1306 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal 1300 and the rear camera is disposed on the rear side of the terminal 1300. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1306 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1307 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1301 for processing, or inputting the electric signals to the radio frequency circuit 1304 for realizing voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of terminal 1300. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1301 or the radio frequency circuitry 1304 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuitry 1307 may also include a headphone jack.
The positioning component 1308 is used for positioning the current geographic position of the terminal 1300 to implement navigation or LBS (Location Based Service). The positioning component 1308 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
Power supply 1309 is used to provide power to various components in terminal 1300. The power source 1309 may be alternating current, direct current, disposable or rechargeable. When the power source 1309 comprises a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1300 also includes one or more sensors 1310. The one or more sensors 1310 include, but are not limited to: acceleration sensor 1311, gyro sensor 1312, pressure sensor 1313, fingerprint sensor 1314, optical sensor 1315, and proximity sensor 1316.
The acceleration sensor 1311 can detect the magnitude of acceleration on three coordinate axes of the coordinate system established with the terminal 1300. For example, the acceleration sensor 1311 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1301 may control the touch display screen 1305 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1311. The acceleration sensor 1311 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1312 may detect the body direction and the rotation angle of the terminal 1300, and the gyro sensor 1312 may cooperate with the acceleration sensor 1311 to acquire a 3D motion of the user with respect to the terminal 1300. Processor 1301, based on the data collected by gyroscope sensor 1312, may perform the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensor 1313 may be disposed on a side bezel of terminal 1300 and/or underlying touch display 1305. When the pressure sensor 1313 is disposed on the side frame of the terminal 1300, a user's holding signal to the terminal 1300 may be detected, and the processor 1301 performs left-right hand recognition or shortcut operation according to the holding signal acquired by the pressure sensor 1313. When the pressure sensor 1313 is disposed at a lower layer of the touch display screen 1305, the processor 1301 controls an operability control on the UI interface according to a pressure operation of the user on the touch display screen 1305. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1314 is used for collecting the fingerprint of the user; the processor 1301 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 1314, or the fingerprint sensor 1314 identifies the identity of the user according to the collected fingerprint. Upon identifying the user's identity as a trusted identity, processor 1301 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, paying, changing settings, and the like. The fingerprint sensor 1314 may be disposed on the front, back, or side of the terminal 1300. When a physical key or vendor Logo is provided on the terminal 1300, the fingerprint sensor 1314 may be integrated with the physical key or vendor Logo.
The optical sensor 1315 is used to collect the ambient light intensity. In one embodiment, the processor 1301 can control the display brightness of the touch display screen 1305 according to the intensity of the ambient light collected by the optical sensor 1315. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1305 is increased; when the ambient light intensity is low, the display brightness of the touch display 1305 is turned down. In another embodiment, the processor 1301 can also dynamically adjust the shooting parameters of the camera assembly 1306 according to the ambient light intensity collected by the optical sensor 1315.
Proximity sensor 1316, also known as a distance sensor, is typically disposed on a front panel of terminal 1300. Proximity sensor 1316 is used to gather the distance between the user and the front face of terminal 1300. In one embodiment, the processor 1301 controls the touch display 1305 to switch from the bright screen state to the dark screen state when the proximity sensor 1316 detects that the distance between the user and the front face of the terminal 1300 gradually decreases; the touch display 1305 is controlled by the processor 1301 to switch from the dark screen state to the bright screen state when the proximity sensor 1316 detects that the distance between the user and the front face of the terminal 1300 gradually increases.
Those skilled in the art will appreciate that the configuration shown in fig. 13 does not limit the terminal 1300; the terminal may include more or fewer components than those shown, combine some components, or employ a different arrangement of components.
The embodiment of the present application further provides a computer device for displaying a virtual object, where the computer device includes a processor and a memory, and the memory stores at least one program code, and the at least one program code is loaded and executed by the processor, so as to implement the operations executed in the virtual object display method of the foregoing embodiment.
The embodiment of the present application further provides a computer-readable storage medium, where at least one program code is stored in the computer-readable storage medium, and the at least one program code is loaded and executed by a processor to implement the operations executed in the virtual object display method of the foregoing embodiment.
The embodiment of the present application further provides a computer program, where at least one program code is stored in the computer program, and the at least one program code is loaded and executed by a processor, so as to implement the operations executed in the virtual object display method according to the foregoing embodiment.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only an alternative embodiment of the present application and is not intended to limit the present application, and any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. A method of displaying a virtual object, the method comprising:
acquiring multiple parameters of a virtual object, wherein the multiple parameters comprise a roughness parameter and a normal parameter;
acquiring roughness components of the virtual object in multiple dimensions in a virtual environment according to the roughness parameters;
acquiring anisotropic illumination parameters of the virtual object according to the roughness components on the multiple dimensions and the normal parameters;
and displaying the virtual object in the virtual environment according to the multiple parameters and the anisotropic illumination parameters, so that the material of the virtual object is an anisotropic material.
2. The method according to claim 1, wherein the plurality of parameters further includes an indication parameter for indicating whether the material of the virtual object is an anisotropic material;
the obtaining roughness components of the virtual object in multiple dimensions in a virtual environment according to the roughness parameters includes:
and when the indication parameters indicate that the material of the virtual object is the anisotropic material, acquiring roughness components of the virtual object in the multiple dimensions according to the roughness parameters.
3. The method of claim 2, wherein after obtaining the plurality of parameters of the virtual object, the method further comprises:
when the indication parameters indicate that the material of the virtual object is an isotropic material, acquiring isotropic illumination parameters of the virtual object according to the roughness parameters and the normal parameters;
and displaying the virtual object in the virtual environment according to the multiple parameters and the isotropic illumination parameters, so that the virtual object is made of isotropic materials.
4. The method according to claim 2 or 3, wherein when the indication parameter is a first parameter value, indicating that the material of the virtual object is the anisotropic material;
and when the indication parameter is a second parameter value, indicating that the material of the virtual object is the isotropic material.
5. The method of claim 2, wherein obtaining the roughness component of the virtual object in multiple dimensions in the virtual environment according to the roughness parameter comprises:
and acquiring rotation parameters matched with the indication parameters according to the indication parameters, and acquiring roughness components of the virtual object on the multiple dimensions according to the roughness parameters and the rotation parameters.
6. The method according to claim 5, wherein the obtaining, according to the indication parameter, a rotation parameter matching the indication parameter, and obtaining, according to the roughness parameter and the rotation parameter, roughness components of the virtual object in the plurality of dimensions comprises:
acquiring roughness components of the virtual object in the multiple dimensions by adopting the following formula:
αy = α · r;
r = (1.0 − 0.9 · t)²;
wherein αx is the roughness component in the first dimension, αy is the roughness component in the second dimension, α is the roughness parameter, r is the rotation parameter, and t is the indication parameter.
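The claim-6 relations can be sketched in Python. Note that the excerpt shows only the αy equation, so the αx = α/r relation below is an assumption (mirroring the common Disney-style aspect split) rather than part of the quoted claim:

```python
def roughness_components(alpha: float, t: float) -> tuple:
    """Split a scalar roughness into two per-dimension components.

    alpha: roughness parameter; t: indication parameter (0 = isotropic).
    The claim gives alpha_y = alpha * r with r = (1.0 - 0.9*t)**2;
    alpha_x = alpha / r is an assumption, not shown in the excerpt.
    """
    r = (1.0 - 0.9 * t) ** 2  # rotation parameter from claim 6
    return alpha / r, alpha * r  # (alpha_x, alpha_y)
```

With t = 0 the two components coincide and the material degenerates to the isotropic case; as t approaches 1 the components spread apart, stretching the specular highlight along one tangent direction.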
7. The method of claim 1, wherein the obtaining a plurality of parameters of the virtual object comprises:
and acquiring a data packet of the virtual object from a cache region, and analyzing the data packet to obtain the multiple parameters.
8. The method of claim 7, wherein before acquiring the data packet of the virtual object from the cache region, the method further comprises:
acquiring the currently set multiple parameters of the virtual object;
generating the data packet according to the plurality of parameters;
and storing the data packet in the cache region by calling a parameter storage interface.
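Claims 7 and 8 describe packing the parameters into a data packet and storing it in a cache region. A minimal illustrative sketch in Python, assuming a hypothetical layout of five little-endian floats (roughness, indication parameter, and a three-component normal); the actual packet format is not specified in the claims:

```python
import struct

# Hypothetical packet layout: <5f = roughness, indication parameter t,
# and the three components of the normal vector, as little-endian float32.
def pack_params(roughness, t, normal):
    """Serialize the material parameters into a byte packet."""
    return struct.pack("<5f", roughness, t, *normal)

def unpack_params(packet):
    """Parse a packet back into (roughness, t, normal)."""
    roughness, t, nx, ny, nz = struct.unpack("<5f", packet)
    return roughness, t, (nx, ny, nz)
```

A fixed binary layout like this is one way the "parameter storage interface" of claim 8 could hand a compact packet to the cache region; a real engine would more likely use its own material-buffer format.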
9. The method of claim 1, wherein the normal parameters comprise a normal vector of the virtual object and a micro-surface normal vector of the virtual object, and wherein obtaining the anisotropic illumination parameters of the virtual object from the roughness components in the plurality of dimensions and the normal parameters comprises:
obtaining the anisotropic illumination parameters using the following formula:
Daniso(H) = 1 / (π · αx · αy · ((X·H)²/αx² + (Y·H)²/αy² + (N·H)²)²)
wherein Daniso(H) is the anisotropic illumination parameter of the virtual object, H is the micro-surface normal vector of the virtual object, N is the normal vector of the virtual object, X is the normal component in the first dimension, Y is the normal component in the second dimension, αx is the roughness component in the first dimension, and αy is the roughness component in the second dimension.
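Interpreting the claim-9 variables, the formula matches the standard anisotropic GGX normal-distribution term; a minimal Python sketch, taking precomputed dot products as inputs:

```python
import math

def d_aniso(n_dot_h: float, x_dot_h: float, y_dot_h: float,
            ax: float, ay: float) -> float:
    """Anisotropic GGX normal-distribution term as read from claim 9.

    n_dot_h, x_dot_h, y_dot_h: dot products of the micro-surface normal H
    with the surface normal N and the two tangent-frame axes X and Y.
    ax, ay: roughness components in the first and second dimensions.
    """
    denom = (x_dot_h / ax) ** 2 + (y_dot_h / ay) ** 2 + n_dot_h ** 2
    return 1.0 / (math.pi * ax * ay * denom ** 2)
```

When ax equals ay, this expression reduces algebraically to the isotropic GGX distribution of claim 10, which is why the two claims can share one shading path behind the indication parameter.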
10. The method of claim 3, wherein the normal parameters comprise a normal vector of the virtual object and a micro-surface normal vector of the virtual object, and wherein obtaining the isotropic illumination parameters of the virtual object according to the roughness parameter and the normal parameter comprises:
obtaining the isotropic illumination parameters using the following formula:
DGGX(H) = α² / (π · ((N·H)² · (α² − 1) + 1)²)
wherein DGGX(H) is the isotropic illumination parameter of the virtual object, H is the micro-surface normal vector of the virtual object, N is the normal vector of the virtual object, and α is the roughness parameter of the virtual object.
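Claim 10's isotropic case corresponds to the standard GGX (Trowbridge-Reitz) distribution; a minimal Python sketch under that reading:

```python
import math

def d_ggx(n_dot_h: float, alpha: float) -> float:
    """Isotropic GGX (Trowbridge-Reitz) distribution term from claim 10.

    n_dot_h: dot product of the micro-surface normal H with the normal N.
    alpha: scalar roughness parameter of the virtual object.
    """
    denom = n_dot_h ** 2 * (alpha ** 2 - 1.0) + 1.0
    return alpha ** 2 / (math.pi * denom ** 2)
```

At alpha = 1 the distribution is constant (1/π) over the hemisphere, a fully rough surface; smaller alpha concentrates the distribution around the normal, tightening the highlight.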
11. A virtual object display apparatus, characterized in that the apparatus comprises:
the first parameter acquisition module is used for acquiring a plurality of parameters of the virtual object, wherein the plurality of parameters comprise a roughness parameter and a normal parameter;
the roughness component acquisition module is used for acquiring roughness components of the virtual object in multiple dimensions in a virtual environment according to the roughness parameters;
the first illumination parameter acquisition module is used for acquiring anisotropic illumination parameters of the virtual object according to the roughness components on the multiple dimensions and the normal parameters;
and the display module is used for displaying the virtual object in the virtual environment according to the multiple parameters and the anisotropic illumination parameters so as to enable the material of the virtual object to be anisotropic.
12. The apparatus according to claim 11, wherein the plurality of parameters further includes an indication parameter for indicating whether the material of the virtual object is an anisotropic material;
the roughness component obtaining module is further configured to:
and when the indication parameters indicate that the material of the virtual object is the anisotropic material, acquiring roughness components of the virtual object in the multiple dimensions according to the roughness parameters.
13. The apparatus of claim 11, wherein the normal parameters comprise a normal vector of the virtual object and a micro-surface normal vector of the virtual object, and wherein the first illumination parameter acquisition module is further configured to:
obtaining the anisotropic illumination parameters using the following formula:
Daniso(H) = 1 / (π · αx · αy · ((X·H)²/αx² + (Y·H)²/αy² + (N·H)²)²)
wherein Daniso(H) is the anisotropic illumination parameter of the virtual object, H is the micro-surface normal vector of the virtual object, N is the normal vector of the virtual object, X is the normal component in the first dimension, Y is the normal component in the second dimension, αx is the roughness component in the first dimension, and αy is the roughness component in the second dimension.
14. A computer device comprising a processor and a memory, the memory having stored therein at least one program code, the at least one program code being loaded into and executed by the processor to perform operations of the virtual object display method of any of claims 1 to 10.
15. A computer-readable storage medium having at least one program code stored therein, the at least one program code being loaded and executed by a processor to perform the operations performed in the virtual object display method of any one of claims 1 to 10.
CN201911101501.8A 2019-11-12 2019-11-12 Virtual object display method and device, computer equipment and storage medium Active CN110853128B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911101501.8A CN110853128B (en) 2019-11-12 2019-11-12 Virtual object display method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911101501.8A CN110853128B (en) 2019-11-12 2019-11-12 Virtual object display method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110853128A true CN110853128A (en) 2020-02-28
CN110853128B CN110853128B (en) 2023-03-21

Family

ID=69601786

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911101501.8A Active CN110853128B (en) 2019-11-12 2019-11-12 Virtual object display method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110853128B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103065350A (en) * 2011-10-18 2013-04-24 北京三星通信技术研究有限公司 Texture pipeline synthetic method and system using full color space
CN106408524A (en) * 2016-08-17 2017-02-15 南京理工大学 Two-dimensional image-assisted depth image enhancement method
WO2018076948A1 (en) * 2016-10-24 2018-05-03 京东方科技集团股份有限公司 Display panel and display device
CN109360210A (en) * 2018-10-16 2019-02-19 腾讯科技(深圳)有限公司 Image partition method, device, computer equipment and storage medium
CN110102050A (en) * 2019-04-30 2019-08-09 腾讯科技(深圳)有限公司 Virtual objects display methods, device, electronic equipment and storage medium
CN110298910A (en) * 2019-07-04 2019-10-01 珠海金山网络游戏科技有限公司 A kind of illumination calculation method, apparatus calculates equipment and storage medium

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
XU SHAO-PING et al.: "Improved Haptic Rendering Algorithm Based on Image Feature", Computer Engineering *
LIU Chenghao: "Fast sampling method for outgoing ray directions in path tracing", Journal of Image and Graphics *
WANG Fang et al.: "Real-time rendering of global illumination based on BRDF and GPU parallel computing", Journal of Graphics *
ZHAO Haiying et al.: "Performance comparison of texture roughness measurement algorithms", Computer Science *
ZHAO Lu et al.: "Research progress of haptic rendering technology", Journal of Computer-Aided Design & Computer Graphics *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111462269A (en) * 2020-03-31 2020-07-28 网易(杭州)网络有限公司 Image processing method and device, storage medium and electronic equipment
CN111462269B (en) * 2020-03-31 2024-02-02 网易(杭州)网络有限公司 Image processing method and device, storage medium and electronic equipment
CN111768473A (en) * 2020-06-28 2020-10-13 完美世界(北京)软件科技发展有限公司 Image rendering method, device and equipment
CN111768473B (en) * 2020-06-28 2024-03-22 完美世界(北京)软件科技发展有限公司 Image rendering method, device and equipment
WO2022227996A1 (en) * 2021-04-28 2022-11-03 北京字跳网络技术有限公司 Image processing method and apparatus, electronic device, and readable storage medium
CN113379884A (en) * 2021-07-05 2021-09-10 北京百度网讯科技有限公司 Map rendering method and device, electronic equipment, storage medium and vehicle
CN113379884B (en) * 2021-07-05 2023-11-17 北京百度网讯科技有限公司 Map rendering method, map rendering device, electronic device, storage medium and vehicle

Also Published As

Publication number Publication date
CN110853128B (en) 2023-03-21

Similar Documents

Publication Publication Date Title
CN109685876B (en) Hair rendering method and device, electronic equipment and storage medium
CN110488977B (en) Virtual reality display method, device and system and storage medium
CN109712224B (en) Virtual scene rendering method and device and intelligent device
CN110427110B (en) Live broadcast method and device and live broadcast server
CN110853128B (en) Virtual object display method and device, computer equipment and storage medium
CN111324250B (en) Three-dimensional image adjusting method, device and equipment and readable storage medium
CN110971930A (en) Live virtual image broadcasting method, device, terminal and storage medium
CN111464749B (en) Method, device, equipment and storage medium for image synthesis
WO2022134632A1 (en) Work processing method and apparatus
CN109886208B (en) Object detection method and device, computer equipment and storage medium
CN111982305A (en) Temperature measuring method, device and computer storage medium
CN111028144A (en) Video face changing method and device and storage medium
CN110839174A (en) Image processing method and device, computer equipment and storage medium
CN113384880A (en) Virtual scene display method and device, computer equipment and storage medium
CN112565806A (en) Virtual gift presenting method, device, computer equipment and medium
WO2022199102A1 (en) Image processing method and device
CN109783176B (en) Page switching method and device
CN110769120A (en) Method, device, equipment and storage medium for message reminding
CN112396076A (en) License plate image generation method and device and computer storage medium
CN111857793A (en) Network model training method, device, equipment and storage medium
CN111327819A (en) Method, device, electronic equipment and medium for selecting image
CN112967261B (en) Image fusion method, device, equipment and storage medium
WO2018192455A1 (en) Method and apparatus for generating subtitles
CN110728744A (en) Volume rendering method and device and intelligent equipment
CN110708582B (en) Synchronous playing method, device, electronic equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40021602
Country of ref document: HK
GR01 Patent grant