CN111915712B - Illumination rendering method and device, computer readable medium and electronic equipment


Info

Publication number
CN111915712B
Authority
CN
China
Prior art keywords
illumination
virtual light
light source
scene
rendered
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010889379.1A
Other languages
Chinese (zh)
Other versions
CN111915712A (en)
Inventor
赵海峰
王凯
黄垚
胡一博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202010889379.1A priority Critical patent/CN111915712B/en
Publication of CN111915712A publication Critical patent/CN111915712A/en
Application granted granted Critical
Publication of CN111915712B publication Critical patent/CN111915712B/en
Legal status: Active


Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T15/00 3D [Three Dimensional] image rendering
            • G06T15/04 Texture mapping
            • G06T15/50 Lighting effects
              • G06T15/506 Illumination models
          • G06T19/00 Manipulating 3D models or images for computer graphics
            • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
          • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
            • G06T2219/20 Indexing scheme for editing of 3D models
              • G06T2219/2012 Colour editing, changing, or manipulating; Use of colour codes

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Image Generation (AREA)

Abstract

The disclosure provides an illumination rendering method and device, a computer readable medium, and electronic equipment, and relates to the technical field of image processing. The illumination rendering method comprises the following steps: generating a corresponding light array for a scene to be rendered, wherein the light array comprises a plurality of uniformly distributed virtual light sources; acquiring an environment texture map, sampling the environment texture map according to the light array, and determining illumination information of each virtual light source in the light array for the scene to be rendered; and rendering the scene to be rendered by using the illumination information so as to simulate the illumination effect of the scene. The method can, to a certain extent, overcome the problem that illumination rendering of the scene is insufficiently accurate, and thereby improves the realism of illumination rendering.

Description

Illumination rendering method and device, computer readable medium and electronic equipment
Technical Field
The disclosure relates to the technical field of image processing, in particular to an illumination rendering method, an illumination rendering device, a computer readable medium and electronic equipment.
Background
In computer animation, environment-based illumination rendering is an important step in achieving realism. Illumination rendering mainly obtains illumination information about an object's surroundings from an environment texture map and uses it to light the object. Typically, when rendering in real time in a game engine, the texture map is first pre-convolved, the pre-convolved map is then sampled along the normal of the object surface to obtain the current illumination information, and a screen-space ambient occlusion technique is used to approximate the occlusion of the object by its surroundings and by itself. However, because the approximated occlusion information is not accurate enough, incorrect rendering effects such as light leakage may appear in some details, so that the rendered scene looks less realistic.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The disclosure aims to provide an illumination rendering method, an illumination rendering device, a computer readable medium and electronic equipment, which can, to a certain extent, overcome the problem of poor illumination effects in a virtual scene and thereby improve the accuracy of illumination rendering.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an illumination rendering method, comprising:
generating a corresponding light array for a scene to be rendered, wherein the light array comprises a plurality of uniformly distributed virtual light sources;
acquiring an environment texture map, sampling the environment texture map according to the light array, and determining illumination information of each virtual light source in the light array for the scene to be rendered; and
rendering the scene to be rendered by using the illumination information so as to simulate the illumination effect of the scene to be rendered.
In an exemplary embodiment of the present disclosure, the sampling the environment texture map and determining the illumination information of each of the virtual light sources in the light array for the scene to be rendered includes:
calculating the sampling range of each virtual light source in the light array;
determining, for each of the virtual light sources, a plurality of sampling directions of the virtual light source within the sampling range; and
determining illumination colors of the plurality of sampling directions according to the environment texture map to acquire illumination information of the virtual light source.
In an exemplary embodiment of the disclosure, the determining the illumination colors of the plurality of sampling directions according to the environment texture map to obtain the illumination information of the virtual light source includes:
calculating arithmetic average values of the illumination colors in the plurality of sampling directions, and determining the illumination information of the virtual light source according to the arithmetic average values.
In an exemplary embodiment of the disclosure, the determining a plurality of sampling directions of the virtual light source within the sampling range includes:
generating a plurality of random location points within the sampling range;
determining a plurality of sampling directions of the virtual light source by using the random position points and the position of the virtual light source in the light array.
In an exemplary embodiment of the present disclosure, the generating a corresponding light array for a scene to be rendered includes:
generating a spherical light array around the scene to be rendered, wherein the virtual light sources in the spherical light array are distributed according to a Fibonacci sequence.
In one exemplary embodiment of the present disclosure, the environment texture map is a high dynamic range image.
In an exemplary embodiment of the present disclosure, the obtaining the environmental texture map includes:
in response to an input operation on a user interface, updating the environment texture map in real time according to the input operation.
According to a second aspect of the present disclosure, there is provided an illumination rendering apparatus, including a virtual light generating module, an illumination information calculating module, and an illumination rendering module, wherein:
the virtual light generating module is configured to generate a corresponding light array for a scene to be rendered, wherein the light array comprises a plurality of uniformly distributed virtual light sources;
the illumination information calculation module is configured to acquire an environment texture map, sample the environment texture map according to the light array, and determine illumination information of each virtual light source in the light array for the scene to be rendered; and
the illumination rendering module is configured to render the scene to be rendered by using the illumination information so as to simulate the illumination effect of the scene to be rendered.
In an exemplary embodiment of the present disclosure, the illumination information calculation module may specifically include a sampling range calculation module, a sampling direction calculation module, and an illumination sampling module, where:
and the sampling range calculation module is used for calculating the sampling range of each virtual light source in the lamplight array.
And the sampling direction calculation module is used for determining a plurality of sampling directions of the virtual light sources in the sampling range for each virtual light source.
And the illumination sampling module is used for determining illumination colors of the plurality of sampling directions according to the environment texture mapping so as to acquire illumination information of the virtual light source.
In one exemplary embodiment of the present disclosure, the illumination sampling module may be specifically configured to calculate arithmetic average values of the illumination colors in the plurality of sampling directions, and determine the illumination information of the virtual light source according to the arithmetic average values.
In one exemplary embodiment of the present disclosure, the sampling direction calculation module may include a random location point generation module and a direction determination module, wherein:
and the random position point generation module is used for generating a plurality of random position points in the sampling range.
And the direction determining module is used for determining a plurality of sampling directions of the virtual light source by utilizing the random position points and the positions of the light sources in the lamplight array.
In an exemplary embodiment of the present disclosure, the virtual light generating module is specifically configured to generate a spherical light array around the scene to be rendered, wherein the virtual light sources in the spherical light array are distributed according to a Fibonacci sequence.
In one exemplary embodiment of the present disclosure, the environment texture map is a high dynamic range image.
In an exemplary embodiment of the present disclosure, the illumination information calculation module is specifically configured to: in response to an input operation on a user interface, update the environment texture map in real time according to the input operation.
According to a third aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the method of any of the above via execution of the executable instructions.
According to a fourth aspect of the present disclosure, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements the method of any of the above.
Exemplary embodiments of the present disclosure may have some or all of the following advantages:
In the illumination rendering method provided by an example embodiment of the present disclosure, on the one hand, ambient illumination can be simulated by uniformly distributed virtual light sources so that correct ambient occlusion information is obtained, and the color of the virtual light sources can be precisely controlled through the environment texture map, improving the realism of the virtual scene; on the other hand, more detailed illumination information can be obtained through the light array and the environment texture map, so that incorrect rendering of details can be avoided and the accuracy of rendering improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without creative effort.
FIG. 1 schematically illustrates a flowchart of an illumination rendering method according to one embodiment of the disclosure;
FIG. 2 schematically illustrates a schematic diagram of a light array according to one embodiment of the present disclosure;
FIG. 3 schematically illustrates another flowchart of an illumination rendering method according to one embodiment of the disclosure;
FIG. 4 schematically illustrates a schematic diagram of random position points according to one embodiment of the present disclosure;
FIG. 5 schematically illustrates a schematic diagram of sampling directions according to one embodiment of the present disclosure;
FIG. 6 schematically illustrates a schematic diagram of a rendering effect according to one embodiment of the present disclosure;
FIG. 7 schematically illustrates a block diagram of an illumination rendering apparatus according to one embodiment of the disclosure;
FIG. 8 schematically illustrates a system architecture diagram for implementing an illumination rendering method according to one embodiment of the disclosure;
Fig. 9 shows a schematic diagram of a computer system suitable for use in implementing embodiments of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
The following describes the technical scheme of the embodiments of the present disclosure in detail:
the present exemplary embodiment first provides an illumination rendering method. Referring to fig. 1, the illumination rendering method may include the steps of:
Step S110: generating a corresponding light array for a scene to be rendered, wherein the light array comprises a plurality of uniformly distributed virtual light sources.
Step S120: acquiring an environment texture map, sampling the environment texture map according to the light array, and determining illumination information of each virtual light source in the light array for the scene to be rendered.
Step S130: rendering the scene to be rendered by using the illumination information so as to simulate the illumination effect of the scene to be rendered.
In the illumination rendering method provided by this example embodiment of the present disclosure, on the one hand, ambient illumination can be simulated by the virtual light sources so that correct ambient occlusion information is obtained, and the color of the virtual light sources can be accurately controlled through the environment texture map, improving the realism of the virtual scene; on the other hand, more detailed illumination information can be obtained through the uniformly distributed light array and the environment texture map, so that incorrect rendering of details can be avoided and rendering accuracy improved.
Next, the above steps of the present exemplary embodiment will be described in more detail.
In step S110, a corresponding light array is generated for the scene to be rendered, where the light array includes a plurality of uniformly distributed virtual light sources.
The light array may include a plurality of virtual light sources, each of which may have a plurality of illumination parameters, e.g., light orientation, light color, light intensity, etc.; all of the virtual light sources may also share uniform illumination parameters. In an exemplary embodiment, parameter information of the light array configured by a user can be acquired through a user interface, and the light array can be generated by using the parameter information. The user interface may include a plurality of controls, and the user may control the parameter information of the light array through clicking, inputting, and similar operations, for example, the total number of virtual light sources in the array, the illumination intensity or illumination color of the virtual light sources, the shape of the array, etc. For example, the light array may be a three-dimensional sphere, or a hemisphere, distributed around the scene to be rendered. The virtual light sources can be uniformly distributed according to the longitude and latitude of the sphere, or uniformly distributed according to a Fibonacci sequence. As shown in fig. 2, A is the longitude-latitude distribution and B is the Fibonacci distribution; it can be seen that the intervals between virtual light sources in a Fibonacci-distributed light array are more uniform, so more uniform illumination can be obtained.
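For illustration only, the following is a minimal sketch, not part of the claimed method, of how such a Fibonacci-distributed light array might be generated; the function name FibonacciSphereLights and the Vec3 type are hypothetical:

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Place `count` virtual light sources on a sphere of radius `radius` using a
// Fibonacci (golden-angle) spiral, which yields the near-uniform spacing
// shown as distribution B in fig. 2.
std::vector<Vec3> FibonacciSphereLights(int count, float radius) {
    const float kGoldenAngle = 3.14159265f * (3.0f - std::sqrt(5.0f)); // ~2.39996 rad
    std::vector<Vec3> lights;
    lights.reserve(count);
    for (int i = 0; i < count; ++i) {
        float z = 1.0f - 2.0f * (i + 0.5f) / count; // even steps in height, in (-1, 1)
        float ringRadius = std::sqrt(1.0f - z * z); // radius of the latitude ring at z
        float theta = kGoldenAngle * i;             // advance by the golden angle per light
        lights.push_back({ radius * ringRadius * std::cos(theta),
                           radius * ringRadius * std::sin(theta),
                           radius * z });
    }
    return lights;
}
```

Each returned position would be interpreted relative to the center of the scene to be rendered, with each light facing inward toward that center.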
In step S120, an environmental texture map is obtained, and the environmental texture map is sampled according to the light array, so as to determine illumination information of each virtual light source in the light array on the scene to be rendered.
The environment texture map can be a cube map comprising six texture maps; sampling the environment texture map and mapping it onto the model surfaces of the scene to be rendered can simulate the environment in all directions, making the rendering more realistic. Moreover, the user can add different environment texture maps to the scene to be rendered according to actual demands, so as to render different scene effects. The illumination information may include the RGBA information of each pixel in the scene to be rendered, and may also include other information, such as transparency, which is not limited in this embodiment. Specifically, the coordinate system of the environment texture map is converted into the world coordinate system of the scene to be rendered, so that the texels of the environment texture map corresponding to each pixel point in the scene to be rendered are determined; the virtual light sources in the light array are then added, the reflected illumination received at each point in the scene from the virtual light sources can be calculated by using an illumination formula, and parameters of the illumination information, such as the color and brightness of each pixel point in the scene to be rendered, can thus be obtained.
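As an illustrative aside, one way the direction-to-texel lookup described above could be performed is sketched below; it assumes the common +X, -X, +Y, -Y, +Z, -Z face ordering and reuses the hypothetical Vec3 type from the earlier sketch, whereas a real engine would expose this lookup natively:

```cpp
#include <cmath>

// Resolve a world-space direction to a cube-map face index (0..5 in the
// assumed +X,-X,+Y,-Y,+Z,-Z order) and (u, v) texel coordinates in [0, 1].
int CubeFaceUV(const Vec3& d, float& u, float& v) {
    float ax = std::fabs(d.x), ay = std::fabs(d.y), az = std::fabs(d.z);
    int face;
    float ma, uc, vc;
    if (ax >= ay && ax >= az) {        // X-major direction
        face = d.x > 0.0f ? 0 : 1;
        ma = ax; uc = d.x > 0.0f ? -d.z : d.z; vc = -d.y;
    } else if (ay >= az) {             // Y-major direction
        face = d.y > 0.0f ? 2 : 3;
        ma = ay; uc = d.x; vc = d.y > 0.0f ? d.z : -d.z;
    } else {                           // Z-major direction
        face = d.z > 0.0f ? 4 : 5;
        ma = az; uc = d.z > 0.0f ? d.x : -d.x; vc = -d.y;
    }
    u = 0.5f * (uc / ma + 1.0f);       // remap from [-1, 1] to [0, 1]
    v = 0.5f * (vc / ma + 1.0f);
    return face;
}
```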
In addition, the environment texture map may be updated in real time. Specifically, this embodiment may provide a user interface that includes an input interface through which a user's input operation can be received; in response to the input operation, the environment texture map selected by the user is obtained and used to replace the original map for rendering. Supporting real-time updating of the environment texture map enables richer and more flexible rendering effects.
In an exemplary embodiment, step S120 may include the following steps S310 to S330, as shown in fig. 3.
In step S310, the sampling range of each virtual light source in the light array is calculated. First, the information of the light array is acquired; the distribution interval and the facing direction of each virtual light source can then be calculated from the light array, thereby obtaining the sampling range corresponding to each virtual light source. The sampling range is a cone along the light direction: the axis of the cone is the facing direction of the virtual light source, and the opening angle of the cone corresponds to the interval between adjacent lights.
In step S320, for each virtual light source, a plurality of sampling directions of the virtual light source are determined within its sampling range. For example, a random scattering algorithm may be employed to generate a number of random position points within the sampling range. The number of random position points may be set according to actual requirements, for example, 10, 20, etc.; this embodiment is not limited thereto. For example, uniformly distributed random position points may be generated by the Hammersley sampling algorithm, as shown in fig. 4. After the random position points are obtained, they can be used to determine the sampling directions; specifically, a sampling direction can be the vector from the position of the virtual light source to a random position point. For example, as shown in fig. 5, the plane spanned by Kx and Ky may be the ground, a virtual light source is located at the origin of the coordinate system, Kz is the facing direction of the virtual light source, and Kn (n = 1, 2, 3, 4, ...) are the randomly generated sampling directions.
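Continuing the illustrative sketch (the Hammersley construction below is standard, the choice of cone half-angle is an assumption explained afterwards, and all names are hypothetical), the sampling directions of step S320 might be generated as follows:

```cpp
#include <cmath>
#include <cstdint>

// Van der Corput radical inverse in base 2: the second coordinate of the
// Hammersley point set.
float RadicalInverseVdC(uint32_t bits) {
    bits = (bits << 16u) | (bits >> 16u);
    bits = ((bits & 0x55555555u) << 1u) | ((bits & 0xAAAAAAAAu) >> 1u);
    bits = ((bits & 0x33333333u) << 2u) | ((bits & 0xCCCCCCCCu) >> 2u);
    bits = ((bits & 0x0F0F0F0Fu) << 4u) | ((bits & 0xF0F0F0F0u) >> 4u);
    bits = ((bits & 0x00FF00FFu) << 8u) | ((bits & 0xFF00FF00u) >> 8u);
    return float(bits) * 2.3283064365386963e-10f; // divide by 2^32
}

// Map the i-th of n Hammersley points to a unit direction inside a cone of
// half-angle `halfAngle` around the local +z axis (the light direction Kz),
// uniform in solid angle. Vec3 is the struct from the earlier sketch.
Vec3 ConeSampleDirection(uint32_t i, uint32_t n, float halfAngle) {
    float u = float(i) / float(n);   // first Hammersley coordinate
    float v = RadicalInverseVdC(i);  // second Hammersley coordinate
    float cosTheta = 1.0f - u * (1.0f - std::cos(halfAngle));
    float sinTheta = std::sqrt(1.0f - cosTheta * cosTheta);
    float phi = 2.0f * 3.14159265f * v;
    return { sinTheta * std::cos(phi), sinTheta * std::sin(phi), cosTheta };
}
```

One plausible way to pick the cone of step S310 is to give each of the N lights an equal share of the sphere's solid angle, 2*pi*(1 - cos(alpha)) = 4*pi/N, hence halfAngle = acos(1 - 2/N); the local directions returned above would then be rotated so that +z aligns with the light's actual facing direction.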
In step S330, the illumination colors of the plurality of sampling directions are determined according to the environment texture map, so as to obtain the illumination information of the virtual light source. For example, as shown in fig. 5, with the virtual light source located at the origin of the coordinate system, an illumination color can be obtained for each sampling direction, and the arithmetic mean of the illumination colors over all sampling directions can be used as the illumination information of the virtual light source. The illumination information may also be derived from the sampling directions by other computations, for example, the mean square value over the sampling directions may be used as the illumination information; this embodiment is not limited thereto.
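A minimal sketch of the arithmetic-mean computation of step S330 follows; SampleEnvironmentMap is a hypothetical stand-in for the engine's cube-map fetch, and Vec3 is the type from the earlier sketch:

```cpp
#include <vector>

// Hypothetical engine cube-map fetch (declaration only).
Vec3 SampleEnvironmentMap(const Vec3& dir);

// Return the arithmetic mean of the illumination colors fetched along each
// sampling direction, used as the virtual light source's illumination color.
Vec3 AverageIlluminationColor(const std::vector<Vec3>& sampleDirs) {
    Vec3 sum{0.0f, 0.0f, 0.0f};
    for (const Vec3& dir : sampleDirs) {
        Vec3 c = SampleEnvironmentMap(dir);
        sum.x += c.x; sum.y += c.y; sum.z += c.z;
    }
    const float inv = 1.0f / static_cast<float>(sampleDirs.size());
    return { sum.x * inv, sum.y * inv, sum.z * inv };
}
```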
In the present exemplary embodiment, the computation of the illumination information of the virtual lights may be performed entirely on the GPU, which achieves a very high real-time frame rate, whereas rendering through a pre-convolved environment map cannot reach a real-time frame rate when the map changes; the method can therefore meet the requirements of real-time rendering.
In an exemplary embodiment, the environment texture map may be a high dynamic range image. Rendering in a game engine is typically performed with an environment texture map stored as a low dynamic range image, but a low dynamic range image can represent only a limited range of colors and cannot accurately express details. A high dynamic range image provides more color bits, so a more accurate illumination color can be obtained and the realism of the virtual scene enhanced. For example, as shown in fig. 6, 610 is the rendering effect of a low dynamic range map and 620 is the rendering effect of a high dynamic range map; it can be seen that the high dynamic range rendering has finer and richer color gradations and can express more details. If the rendering engine does not support high dynamic range maps, the underlying code of the engine may be modified. For example, if the Unreal game engine is used to render the scene, the underlying code of the engine can be modified to store color information with more color bits, so as to accommodate the high dynamic range image and thereby show a more accurate illumination effect.
In step S130, the illumination information is used to render the scene to be rendered, so as to simulate the lighting effect of the scene to be rendered.
After the illumination information of each virtual light is determined, an illumination formula, such as a diffuse reflection formula or a specular reflection formula, can be used to calculate the illumination effect of the virtual lights on the scene to be rendered. Specifically, the reflected light received at each pixel in the virtual scene is first calculated using the illumination intensity of each virtual light in the light array and its computed illumination color. This calculation is performed for every virtual light in the array to determine the reflected light each virtual light contributes at each point of the scene to be rendered, and the contributions are then superposed to obtain the final lighting effect at each point.
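To make the superposition concrete, the sketch below uses the standard Lambert diffuse term as the illumination formula; this is one possibility only, since the disclosure does not fix a particular formula, and all structure and function names are illustrative:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct VirtualLight { Vec3 position; Vec3 color; float intensity; };

// Superpose the Lambertian diffuse contribution of every virtual light in
// the array at one shading point `p` with surface normal `n` (unit length)
// and diffuse albedo `albedo`.
Vec3 ShadePoint(const Vec3& p, const Vec3& n, const Vec3& albedo,
                const std::vector<VirtualLight>& lights) {
    Vec3 result{0.0f, 0.0f, 0.0f};
    for (const VirtualLight& light : lights) {
        Vec3 l{ light.position.x - p.x, light.position.y - p.y, light.position.z - p.z };
        float len = std::sqrt(l.x * l.x + l.y * l.y + l.z * l.z);
        l = { l.x / len, l.y / len, l.z / len };                  // direction toward the light
        float ndotl = std::max(0.0f, n.x * l.x + n.y * l.y + n.z * l.z); // Lambert term
        float s = light.intensity * ndotl;
        result.x += albedo.x * light.color.x * s;                 // accumulate (superpose)
        result.y += albedo.y * light.color.y * s;
        result.z += albedo.z * light.color.z * s;
    }
    return result;
}
```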
In the present exemplary embodiment, actual light calculations are performed with the virtual light sources in the light array, so correct occlusion information can be obtained, phenomena such as light leakage can be avoided, and the accuracy of rendering improved. In addition, the surrounding environment information can be obtained from the environment texture map, so that the environment effect is rendered onto the models of the scene to be rendered.
Further, in this example embodiment, there is also provided an illumination rendering apparatus configured to execute the illumination rendering method described in the disclosure. The device can be applied to a server or terminal equipment.
Referring to fig. 7, the illumination rendering apparatus 700 may include: a virtual light generating module 710, an illumination information calculation module 720, and an illumination rendering module 730, wherein:
the virtual light generating module 710 is configured to generate a corresponding light array for a scene to be rendered, wherein the light array comprises a plurality of uniformly distributed virtual light sources;
the illumination information calculation module 720 is configured to obtain an environment texture map, sample the environment texture map according to the light array, and determine illumination information of each virtual light source in the light array for the scene to be rendered; and
the illumination rendering module 730 is configured to render the scene to be rendered by using the illumination information so as to simulate the illumination effect of the scene to be rendered.
In an exemplary embodiment of the present disclosure, the illumination information calculation module 720 may specifically include a sampling range calculation module, a sampling direction calculation module, and an illumination sampling module, where:
and the sampling range calculation module is used for calculating the sampling range of each virtual light source in the lamplight array.
And the sampling direction calculation module is used for determining a plurality of sampling directions of the virtual light sources in the sampling range for each virtual light source.
And the illumination sampling module is used for determining illumination colors of the plurality of sampling directions according to the environment texture mapping so as to acquire illumination information of the virtual light source.
In one exemplary embodiment of the present disclosure, the illumination sampling module may be specifically configured to calculate arithmetic average values of the illumination colors in the plurality of sampling directions, and determine the illumination information of the virtual light source according to the arithmetic average values.
In one exemplary embodiment of the present disclosure, the sampling direction calculation module may include a random location point generation module and a direction determination module, wherein:
and the random position point generation module is used for generating a plurality of random position points in the sampling range.
And the direction determining module is used for determining a plurality of sampling directions of the virtual light source by utilizing the random position points and the positions of the light sources in the lamplight array.
In an exemplary embodiment of the present disclosure, the virtual light generating module 710 is specifically configured to generate a spherical light array around the scene to be rendered, wherein the virtual light sources in the spherical light array are distributed according to a Fibonacci sequence.
In one exemplary embodiment of the present disclosure, the environment texture map is a high dynamic range image.
In an exemplary embodiment of the present disclosure, the illumination information calculation module 720 is specifically configured to: in response to an input operation on a user interface, update the environment texture map in real time according to the input operation.
Since each functional module of the illumination rendering apparatus of this exemplary embodiment of the present disclosure corresponds to a step of the above exemplary embodiment of the illumination rendering method, for details not disclosed in the apparatus embodiment of the present disclosure, please refer to the above embodiment of the illumination rendering method of the present disclosure.
Referring to fig. 8, fig. 8 illustrates a schematic diagram of the system architecture of an exemplary application environment to which the illumination rendering method and apparatus of embodiments of the present disclosure may be applied.
As shown in fig. 8, the system architecture 800 may include one or more of terminal devices 801, 802, 803, a network 804, and a server 805. The network 804 serves as a medium for providing communication links between the terminal devices 801, 802, 803 and the server 805. The network 804 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The terminal devices 801, 802, 803 may be various electronic devices with display screens including, but not limited to, desktop computers, portable computers, smart phones, tablet computers, and the like. It should be understood that the number of terminal devices, networks and servers in fig. 8 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, the server 805 may be a server cluster formed by a plurality of servers.
The illumination rendering method provided by the embodiment of the present disclosure may be performed by the terminal devices 801, 802, 803, and accordingly, the illumination rendering apparatus may be disposed in the terminal devices 801, 802, 803.
Fig. 9 shows a schematic diagram of a computer system suitable for use in implementing embodiments of the present disclosure.
It should be noted that the computer system 900 of the electronic device shown in fig. 9 is only an example, and should not impose any limitation on the functions or the application scope of the embodiments of the present disclosure.
As shown in fig. 9, the computer system 900 includes a central processing unit (CPU) 901, which can execute various appropriate actions and processes according to a program stored in a read-only memory (ROM) 902 or a program loaded from a storage section 908 into a random access memory (RAM) 903. In the RAM 903, various programs and data required for system operation are also stored. The CPU 901, the ROM 902, and the RAM 903 are connected to each other through a bus 904. An input/output (I/O) interface 905 is also connected to the bus 904.
The following components are connected to the I/O interface 905: an input section 906 including a keyboard, a mouse, and the like; an output portion 907 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage portion 908 including a hard disk or the like; and a communication section 909 including a network interface card such as a LAN card, a modem, or the like. The communication section 909 performs communication processing via a network such as the internet. The drive 910 is also connected to the I/O interface 905 as needed. A removable medium 911 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is installed as needed on the drive 910 so that a computer program read out therefrom is installed into the storage section 908 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the methods shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from the network via the communication portion 909 and/or installed from the removable medium 911. When executed by the central processing unit (CPU) 901, the computer program performs the various functions defined in the method and apparatus of the present application.
It should be noted that the computer readable medium shown in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware, and the described units may also be provided in a processor. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
As another aspect, the present application also provides a computer-readable medium that may be contained in the electronic device described in the above embodiments, or may exist alone without being incorporated into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the methods described in the above embodiments, for example, the steps shown in fig. 1 and fig. 3.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow the general principles of the disclosure and include such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (9)

1. A method of illumination rendering, comprising:
generating a corresponding light array for a scene to be rendered, wherein the light array comprises a plurality of uniformly distributed virtual light sources;
obtaining an environment texture map, and calculating the distribution interval and the light direction of each virtual light source in the light array to obtain the sampling range of each virtual light source; determining, for each of the virtual light sources, a plurality of sampling directions of the virtual light source within the sampling range; and determining illumination colors of the plurality of sampling directions according to the environment texture map so as to acquire illumination information of the virtual light source; and
determining the reflected light of each virtual light source on each pixel point in the scene to be rendered according to the illumination information of each virtual light source in the light array, and superposing the reflected light of each virtual light source on each pixel point so as to simulate the illumination effect of the scene to be rendered.
2. The method of claim 1, wherein determining illumination colors for the plurality of sampling directions from the ambient texture map to obtain illumination information for the virtual light source comprises:
calculating arithmetic average values of the illumination colors in the plurality of sampling directions, and determining the illumination information of the virtual light source according to the arithmetic average values.
3. The method of claim 1, wherein the determining a plurality of sampling directions of the virtual light source within the sampling range comprises:
generating a plurality of random location points within the sampling range;
determining a plurality of sampling directions of the virtual light source by using the random position points and the position of the virtual light source in the light array.
4. The method of claim 1, wherein the generating a corresponding array of lights for the scene to be rendered comprises:
generating a spherical light array around the scene to be rendered, wherein the virtual light sources in the spherical light array are distributed according to a Fibonacci sequence.
5. The method of claim 1, wherein the environment texture map is a high dynamic range image.
6. The method of claim 1, wherein the obtaining the environmental texture map comprises:
in response to an input operation on a user interface, updating the environment texture map in real time according to the input operation.
7. An illumination rendering apparatus, comprising:
a virtual light generating module, configured to generate a corresponding light array for a scene to be rendered, wherein the light array comprises a plurality of uniformly distributed virtual light sources;
an illumination information calculation module, configured to obtain an environment texture map, calculate the distribution interval and the light direction of each virtual light source in the light array to obtain the sampling range of each virtual light source, determine, for each of the virtual light sources, a plurality of sampling directions of the virtual light source within the sampling range, and determine illumination colors of the plurality of sampling directions according to the environment texture map so as to acquire illumination information of the virtual light source; and
an illumination rendering module, configured to determine the reflected light of each virtual light source on each pixel point in the scene to be rendered according to the illumination information of each virtual light source in the light array, and superpose the reflected light of each virtual light source on each pixel point so as to simulate the illumination effect of the scene to be rendered.
8. A computer readable medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the method of any of claims 1-6.
9. An electronic device, comprising:
A processor; and
A memory for storing executable instructions of the processor;
Wherein the processor is configured to perform the method of any of claims 1-6 via execution of the executable instructions.
CN202010889379.1A 2020-08-28 2020-08-28 Illumination rendering method and device, computer readable medium and electronic equipment Active CN111915712B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010889379.1A CN111915712B (en) 2020-08-28 2020-08-28 Illumination rendering method and device, computer readable medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010889379.1A CN111915712B (en) 2020-08-28 2020-08-28 Illumination rendering method and device, computer readable medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN111915712A CN111915712A (en) 2020-11-10
CN111915712B (en) 2024-05-28

Family

ID=73266438

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010889379.1A Active CN111915712B (en) 2020-08-28 2020-08-28 Illumination rendering method and device, computer readable medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111915712B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112785672B (en) * 2021-01-19 2022-07-05 浙江商汤科技开发有限公司 Image processing method and device, electronic equipment and storage medium
CN115115747A (en) * 2021-03-09 2022-09-27 网易(杭州)网络有限公司 Illumination rendering method and device, electronic equipment and storage medium
WO2024066559A1 (en) * 2022-09-28 2024-04-04 杭州群核信息技术有限公司 Rendering method, apparatus and system, electronic device, and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE69430647D1 (en) * 1993-01-28 2002-06-27 Philips Electronics Uk Ltd image display
JP2003296750A (en) * 2002-03-21 2003-10-17 Microsoft Corp Graphic image rendering using radiance autotransfer of low frequency lighting environment
CN103021020A (en) * 2012-12-05 2013-04-03 上海创图网络科技发展有限公司 Three-dimensional (3D) rendering method based on multiple light sources
CN107909638A (en) * 2017-11-15 2018-04-13 网易(杭州)网络有限公司 Rendering intent, medium, system and the electronic equipment of dummy object
CN110223363A (en) * 2019-04-25 2019-09-10 合刃科技(深圳)有限公司 Image generating method and device
CN111127624A (en) * 2019-12-27 2020-05-08 珠海金山网络游戏科技有限公司 Illumination rendering method and device based on AR scene
CN111260766A (en) * 2020-01-17 2020-06-09 网易(杭州)网络有限公司 Virtual light source processing method, device, medium and electronic equipment
CN111343444A (en) * 2020-02-10 2020-06-26 清华大学 Three-dimensional image generation method and device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6753875B2 (en) * 2001-08-03 2004-06-22 Hewlett-Packard Development Company, L.P. System and method for rendering a texture map utilizing an illumination modulation value
US8217940B2 (en) * 2007-07-23 2012-07-10 Disney Enterprises, Inc. Directable lighting method and apparatus
US8698806B2 (en) * 2009-11-09 2014-04-15 Maxon Computer Gmbh System and method for performing volume rendering using shadow calculation
US9600904B2 (en) * 2013-12-30 2017-03-21 Samsung Electronics Co., Ltd. Illuminating a virtual environment with camera light data


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Analysis of 3D animation rendering technology; Lin Zhe; Modern Film Technology; pp. 22-25 *
Research on rendering technology based on light fields; Hu Kongming; Yu Yingjie; Zhang Zhijiang; Microcomputer Applications (No. 02); pp. 22-27 *
Global illumination simulation based on direct lighting; Wu Weihe; Hao Aimin; Li Zhi; Wang Wentao; Yang Yuedong; Wang Lijun; Computer Engineering (No. 10); pp. 257-258, 261 *

Also Published As

Publication number Publication date
CN111915712A (en) 2020-11-10

Similar Documents

Publication Publication Date Title
CN111915712B (en) Illumination rendering method and device, computer readable medium and electronic equipment
CN111260766B (en) Virtual light source processing method, device, medium and electronic equipment
CN109045691B (en) Method and device for realizing special effect of special effect object
CN108882025B (en) Video frame processing method and device
CN113674389B (en) Scene rendering method and device, electronic equipment and storage medium
CN112184873B (en) Fractal graph creation method, fractal graph creation device, electronic equipment and storage medium
CN112734896B (en) Environment shielding rendering method and device, storage medium and electronic equipment
CN110084873B (en) Method and apparatus for rendering three-dimensional model
CN113012273A (en) Illumination rendering method, device, medium and equipment based on target model
CN111882631A (en) Model rendering method, device, equipment and storage medium
CN111798554A (en) Rendering parameter determination method, device, equipment and storage medium
CN113332714B (en) Light supplementing method and device for game model, storage medium and computer equipment
CN115965727A (en) Image rendering method, device, equipment and medium
CN112580213A (en) Method and apparatus for generating display image of electric field lines, and storage medium
CN110930492B (en) Model rendering method, device, computer readable medium and electronic equipment
CN116167239A (en) Infrared simulation method, system, computer and readable storage medium
CN115082628B (en) Dynamic drawing method and device based on implicit optical transfer function
CN112446944B (en) Method and system for simulating real environment light in AR scene
CN115631289A (en) Vehicle model surface generation method, system, equipment and storage medium
US11727535B2 (en) Using intrinsic functions for shadow denoising in ray tracing applications
CN112967369A (en) Light ray display method and device
CN113744379A (en) Image generation method and device and electronic equipment
US20180005432A1 (en) Shading Using Multiple Texture Maps
CN116152425A (en) Method and device for drawing image, electronic equipment and storage medium
CN117252976A (en) Model rendering order determining method and device, electronic equipment and storage medium

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant