CN111915712A - Illumination rendering method and device, computer readable medium and electronic equipment - Google Patents


Info

Publication number
CN111915712A
CN111915712A (application CN202010889379.1A)
Authority
CN
China
Prior art keywords
illumination
scene
rendered
virtual light
sampling
Prior art date
Legal status: Granted
Application number
CN202010889379.1A
Other languages
Chinese (zh)
Other versions
CN111915712B (en)
Inventor
赵海峰
王凯
黄垚
胡一博
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority claimed from CN202010889379.1A
Publication of CN111915712A
Application granted
Publication of CN111915712B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/04 Texture mapping
    • G06T 15/50 Lighting effects
    • G06T 15/506 Illumination models
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T 2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T 2219/20 Indexing scheme for editing of 3D models
    • G06T 2219/2012 Colour editing, changing, or manipulating; Use of colour codes


Abstract

The present disclosure provides an illumination rendering method, an illumination rendering apparatus, a computer-readable medium, and an electronic device, and relates to the technical field of image processing. The illumination rendering method comprises the following steps: generating a corresponding light array for a scene to be rendered, wherein the light array comprises a plurality of uniformly distributed virtual light sources; acquiring an environment texture map, sampling the environment texture map according to the light array, and determining the illumination information of each virtual light source in the light array on the scene to be rendered; and rendering the scene to be rendered using the illumination information so as to simulate the lighting of the scene. The illumination rendering method of the present disclosure can, to a certain extent, overcome the problem that in-scene illumination rendering is insufficiently accurate, thereby improving the realism of the rendered illumination.

Description

Illumination rendering method and device, computer readable medium and electronic equipment
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an illumination rendering method, an illumination rendering apparatus, a computer-readable medium, and an electronic device.
Background
In computer animation production, environment-based lighting rendering is an important step in achieving realism. Illumination rendering mainly obtains the illumination information surrounding an object from an environment texture map and uses it to illuminate the object. Generally, when rendering in real time in a game engine, the texture map is first pre-convolved; the pre-convolved texture map is then sampled along the normal of the object's surface to obtain the current illumination information, and a screen-space ambient occlusion technique is used to approximate the occlusion between the surrounding environment and the object itself. However, because this approximated occlusion information is not accurate, incorrect rendering may appear in some details, such as light leakage, so the rendered scene lacks realism.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure aims to provide an illumination rendering method, an illumination rendering apparatus, a computer-readable medium, and an electronic device, which can, to a certain extent, overcome the problem of poor illumination effects in a virtual scene and thereby improve the accuracy of illumination rendering.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a lighting rendering method, comprising:
generating a corresponding light array for a scene to be rendered, wherein the light array comprises a plurality of uniformly distributed virtual light sources;
acquiring an environment texture map, sampling the environment texture map according to the light array, and determining the illumination information of each virtual light source in the light array on the scene to be rendered;
and rendering the scene to be rendered by utilizing the illumination information so as to simulate the illumination effect of the scene to be rendered.
In an exemplary embodiment of the present disclosure, the sampling the environment texture map and determining illumination information of each virtual light source in the lighting array on the scene to be rendered includes:
calculating the sampling range of each virtual light source in the light array;
for each of the virtual light sources, determining a plurality of sampling directions for the virtual light source within the sampling range;
and determining the illumination colors of the multiple sampling directions according to the environment texture map so as to obtain the illumination information of the virtual light source.
In an exemplary embodiment of the present disclosure, the determining, according to the environment texture map, illumination colors of the plurality of sampling directions to obtain illumination information of the virtual light source includes:
and calculating the arithmetic mean value of the illumination colors of the plurality of sampling directions, and determining the illumination information of the virtual light source according to the arithmetic mean value.
In an exemplary embodiment of the present disclosure, the determining a plurality of sampling directions of the virtual light source within the sampling range includes:
generating a plurality of random location points within the sampling range;
and determining a plurality of sampling directions of the virtual light source by using the random position points and the position of the virtual light source in the light array.
In an exemplary embodiment of the present disclosure, the generating a corresponding light array for a scene to be rendered includes:
and generating a spherical lighting array around the scene to be rendered, wherein each virtual light source in the spherical lighting array is distributed in a Fibonacci number series.
In an exemplary embodiment of the present disclosure, the ambient texture map is a high dynamic range image.
In an exemplary embodiment of the present disclosure, the obtaining the environment texture map includes:
responding to an input operation on the user interface, so as to update the environment texture map in real time according to the input operation.
According to a second aspect of the present disclosure, there is provided an illumination rendering apparatus, including a virtual light generation module, an illumination information calculation module, and an illumination rendering module, wherein:
the virtual light generation module is used for generating a corresponding light array aiming at a scene to be rendered, wherein the light array comprises a plurality of uniformly distributed virtual light sources.
And the illumination information calculation module is used for acquiring the environment texture map, sampling the environment texture map according to the light array, and determining the illumination information of each virtual light source in the light array on the scene to be rendered.
And the illumination rendering module is used for rendering the scene to be rendered by utilizing the illumination information so as to simulate the illumination effect of the scene to be rendered.
In an exemplary embodiment of the present disclosure, the illumination information calculation module may specifically include a sampling range calculation module, a sampling direction calculation module, and an illumination sampling module, where:
and the sampling range calculation module is used for calculating the sampling range of each virtual light source in the light array.
A sampling direction calculation module for determining, for each of the virtual light sources, a plurality of sampling directions of the virtual light source within the sampling range.
And the illumination sampling module is used for determining the illumination colors of the plurality of sampling directions according to the environment texture map so as to acquire the illumination information of the virtual light source.
In an exemplary embodiment of the present disclosure, the illumination sampling module may be specifically configured to: and calculating the arithmetic mean value of the illumination colors of the plurality of sampling directions, and determining the illumination information of the virtual light source according to the arithmetic mean value.
In an exemplary embodiment of the present disclosure, the sampling direction calculation module may include a random location point generation module and a direction determination module, wherein:
and the random position point generating module is used for generating a plurality of random position points in the sampling range.
And the direction determining module is used for determining a plurality of sampling directions of the virtual light source by utilizing the random position points and the position of the virtual light source in the light array.
In an exemplary embodiment of the present disclosure, the virtual light generation module is specifically configured to: generate a spherical light array around the scene to be rendered, wherein the virtual light sources in the spherical light array are distributed according to a Fibonacci sequence.
In an exemplary embodiment of the present disclosure, the ambient texture map is a high dynamic range image.
In an exemplary embodiment of the present disclosure, the illumination information calculation module is specifically configured to: respond to an input operation on the user interface, so as to update the environment texture map in real time according to the input operation.
According to a third aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the method of any one of the above via execution of the executable instructions.
According to a fourth aspect of the present disclosure, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of the above.
Exemplary embodiments of the present disclosure may have some or all of the following benefits:
in the illumination rendering method provided by an example embodiment of the present disclosure, on one hand, the ambient illumination may be simulated by the uniformly distributed virtual light sources to obtain correct ambient occlusion information, and the color of the virtual light sources may be accurately controlled through the environment texture map, thereby improving the realism of the virtual scene; on the other hand, more detailed illumination information can be obtained through the light array and the environment texture map, and incorrect rendering of details can be avoided, thereby improving rendering accuracy.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 schematically shows a flow diagram of a lighting rendering method according to an embodiment of the present disclosure;
FIG. 2 schematically illustrates a light array schematic according to one embodiment of the present disclosure;
fig. 3 schematically shows a flow diagram of a lighting rendering method according to an embodiment of the present disclosure;
FIG. 4 schematically illustrates a random location point diagram according to one embodiment of the present disclosure;
FIG. 5 schematically illustrates a schematic diagram of an orientation employed according to one embodiment of the present disclosure;
FIG. 6 schematically shows a rendering effect diagram according to an embodiment of the disclosure;
fig. 7 schematically shows a block diagram of a lighting rendering apparatus according to one embodiment of the present disclosure;
fig. 8 schematically shows a system architecture diagram for implementing a lighting rendering method according to one embodiment of the present disclosure;
FIG. 9 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The technical solution of the embodiment of the present disclosure is explained in detail below:
the present exemplary embodiment first provides an illumination rendering method. Referring to fig. 1, the lighting rendering method may include the steps of:
step S110: and generating a corresponding light array aiming at a scene to be rendered, wherein the light array comprises a plurality of uniformly distributed virtual light sources.
Step S120: and acquiring an environment texture map, sampling the environment texture map according to the lighting array, and determining illumination information of each virtual light source in the lighting array on the scene to be rendered.
Step S130: and rendering the scene to be rendered by utilizing the illumination information so as to simulate the illumination effect of the scene to be rendered.
In the illumination rendering method provided by this exemplary embodiment of the present disclosure, on one hand, the virtual light sources may simulate the ambient illumination to obtain correct ambient occlusion information, and the environment texture map may accurately control the color of the virtual light sources, thereby improving the realism of the virtual scene; on the other hand, more detailed illumination information can be obtained through the uniformly distributed light array and the environment texture map, and incorrect rendering of details can be avoided, thereby improving rendering accuracy.
The above steps of the present exemplary embodiment will be described in more detail below.
In step S110, a corresponding light array is generated for a scene to be rendered, where the light array includes a plurality of uniformly distributed virtual light sources.
The light array may include a plurality of virtual light sources, each of which may have a plurality of lighting parameters, e.g., light orientation, illumination color, and illumination intensity; all of the virtual light sources may also share uniform lighting parameters. In an exemplary embodiment, parameter information of the light array configured by the user can be obtained through a user interface, and the light array can be generated using that parameter information. The user interface may include a plurality of controls, and the user may set the parameter information of the light array by clicking, typing, and the like, for example, the total number of virtual light sources in the array, the illumination intensity or illumination color of each virtual light source, and the shape of the array. Illustratively, the light array may be a three-dimensional sphere, or a hemisphere, distributed around the scene to be rendered. The virtual light sources can be distributed uniformly either along the longitude and latitude lines of a sphere or according to a Fibonacci sequence. As shown in fig. 2, A is the longitude-latitude distribution and B is the Fibonacci distribution; the intervals between virtual light sources in the Fibonacci-distributed light array are more uniform, yielding more uniform illumination.
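The Fibonacci distribution described above can be sketched with the standard golden-angle spiral construction; this is a generic sketch, not code from the patent, and the function name `fibonacci_sphere` is illustrative:

```python
import math

def fibonacci_sphere(n):
    """Distribute n points nearly uniformly on a unit sphere using
    the golden-angle (Fibonacci) spiral, one candidate layout for
    the virtual light sources in the light array."""
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))  # ~2.39996 rad
    points = []
    for i in range(n):
        # y runs from +1 to -1 in n evenly spaced steps
        y = 1.0 - 2.0 * (i + 0.5) / n
        radius = math.sqrt(1.0 - y * y)   # radius of the circle at height y
        theta = golden_angle * i          # longitude advances by the golden angle
        points.append((radius * math.cos(theta), y, radius * math.sin(theta)))
    return points

lights = fibonacci_sphere(256)
```

Scaling each point by the desired array radius and centering it on the scene gives the spherical light array; a hemisphere is obtained by keeping only points with non-negative height.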
In step S120, an environment texture map is obtained, the environment texture map is sampled according to the lighting array, and illumination information of each virtual light source in the lighting array on the scene to be rendered is determined.
The environment texture map can be a cube map comprising six texture faces; sampling the environment texture map and mapping it onto the surfaces of the models in the scene to be rendered simulates an all-around surrounding environment, making the rendering more realistic. In addition, the user can attach different environment texture maps to the scene to be rendered according to actual requirements, thereby rendering different scene effects. The illumination information may include the RGBA information of each pixel in the scene to be rendered, and may also include other information, such as transparency, which is not particularly limited in this embodiment. Specifically, the coordinate system of the environment texture map is converted into the world coordinate system of the scene to be rendered, so as to determine the texel of the environment texture map corresponding to each pixel in the scene to be rendered; then, using the virtual light sources in the light array, the reflected light each point in the scene receives from the virtual light sources can be computed with a lighting formula, yielding the parameters of the lighting information, such as the color and brightness of each pixel in the scene to be rendered.
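Looking up a cube map by a world-space direction works by picking the face of the dominant axis and projecting onto it. The sketch below uses one common face/UV convention (engines differ on the exact signs, so treat the conventions here as an assumption) and nearest-texel lookup:

```python
def sample_cubemap(faces, direction):
    """faces: dict mapping '+x','-x','+y','-y','+z','-z' to square 2D
    texel grids. Select the dominant axis of the direction, project
    onto that face, and return the nearest texel."""
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face, u, v, ma = ('+x' if x > 0 else '-x'), (-z if x > 0 else z), -y, ax
    elif ay >= az:
        face, u, v, ma = ('+y' if y > 0 else '-y'), x, (z if y > 0 else -z), ay
    else:
        face, u, v, ma = ('+z' if z > 0 else '-z'), (x if z > 0 else -x), -y, az
    # project the face-plane coordinates into [0, 1] texture space
    u = 0.5 * (u / ma + 1.0)
    v = 0.5 * (v / ma + 1.0)
    tex = faces[face]
    size = len(tex)
    i = min(int(v * size), size - 1)
    j = min(int(u * size), size - 1)
    return tex[i][j]
```

In the method above, each sampling direction of a virtual light source would be fed through a lookup like this to fetch an illumination color from the environment texture map.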
In addition, the environmental texture map may be updated in real time. Specifically, the embodiment may provide a user interface, where the user interface may include an input interface, and the input interface may receive an input operation of a user, and further obtain an environment texture map selected by the user in response to the input operation, and further update an original map with the environment texture map selected by the user for rendering. The method and the device can support real-time updating of the environment texture map, so that richer and more flexible rendering effects are realized.
In an exemplary embodiment, the method may include the following steps S310 to S330, as shown in fig. 3.
In step S310, the sampling range of each virtual light source in the light array is calculated. First, the information of the light array is obtained; the distribution interval and light direction of each virtual light source can then be calculated from it, giving the sampling range corresponding to each virtual light source. The sampling range is a cone along the light direction: the cone's axis points along the virtual light source's light direction, and the cone's apex angle corresponds to the angular interval between adjacent lights.
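The patent does not give an explicit formula for the cone's apex angle. One plausible choice, assumed here purely for illustration, is to size each cone so the N cones together account for the sphere's full solid angle, i.e. 2*pi*(1 - cos(theta)) = 4*pi/N per light:

```python
import math

def cone_half_angle(num_lights):
    """Half-angle theta of each virtual light's sampling cone, chosen
    (an assumption, not the patent's formula) so each cone subtends
    1/N of the full sphere's 4*pi steradians."""
    return math.acos(1.0 - 2.0 / num_lights)

theta_256 = cone_half_angle(256)  # narrow cone for a dense light array
```

With two lights the half-angle is 90 degrees (each cone is a hemisphere); as the array grows denser the cones narrow accordingly.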
In step S320, for each of the virtual light sources, a plurality of sampling directions of the virtual light source are determined within the sampling range. Illustratively, a random-scatter algorithm may be employed to generate a number of random position points within the sampling range. The number of random position points may be set according to actual requirements, for example, 10 or 20, and this embodiment is not limited thereto. For example, uniformly distributed random position points may be generated by the Hammersley sampling algorithm, as shown in FIG. 4. After a random position point is obtained, a sampling direction may be determined from it; specifically, the sampling direction may be the vector from the position of the virtual light source to the random position point. For example, as shown in fig. 5, the Kx-Ky plane may be the ground, a virtual light source is located at the origin of the coordinate system, Kz is the light direction of the virtual light source, and Kn (n = 1, 2, 3, 4 …) are the randomly generated sampling directions.
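The Hammersley set mentioned above pairs the index fraction i/n with the base-2 radical inverse of i. A minimal sketch, including one standard (assumed, not patent-specified) way to map the 2D points to sampling directions uniformly distributed by solid angle inside a cone around the light direction (+z here):

```python
import math

def radical_inverse_base2(i):
    """Van der Corput radical inverse in base 2 (bit reversal of i)."""
    result, f = 0.0, 0.5
    while i:
        if i & 1:
            result += f
        i >>= 1
        f *= 0.5
    return result

def hammersley(n):
    """The n-point 2D Hammersley set in [0, 1)^2."""
    return [(i / n, radical_inverse_base2(i)) for i in range(n)]

def cone_directions(n, half_angle):
    """Map the Hammersley points to n unit vectors uniformly
    distributed (by solid angle) inside a cone around +z; in use,
    +z would be rotated onto the virtual light's own direction."""
    dirs = []
    for u, v in hammersley(n):
        cos_t = 1.0 - u * (1.0 - math.cos(half_angle))  # uniform in solid angle
        sin_t = math.sqrt(max(0.0, 1.0 - cos_t * cos_t))
        phi = 2.0 * math.pi * v
        dirs.append((sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t))
    return dirs
```

Because the Hammersley set is deterministic, every virtual light source reuses the same well-spread pattern of sampling directions inside its own cone.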
In step S330, the illumination colors in the multiple sampling directions are determined according to the environment texture map to obtain the illumination information of the virtual light source. For example, as shown in fig. 5, if the virtual light source is located at the origin of the coordinate system, an illumination color may be obtained for each sampling direction; the arithmetic mean of the illumination colors over all sampling directions is then calculated, and this arithmetic mean may be used as the illumination information of the virtual light source. Alternatively, the illumination information of the sampling directions may be combined by other calculation methods, for example by calculating the root-mean-square of the illumination values in each sampling direction, and this embodiment is not limited in this regard.
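The arithmetic-mean step above amounts to a per-channel average of the sampled colors; a minimal sketch:

```python
def average_color(samples):
    """Arithmetic mean of the RGB colors sampled in each direction;
    the result serves as the virtual light source's color."""
    n = len(samples)
    return tuple(sum(s[k] for s in samples) / n for k in range(3))
```

Feeding in the colors fetched along each cone sampling direction yields one color per virtual light source.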
In this exemplary embodiment, the computation of the virtual lights' illumination information can be performed entirely on the GPU, achieving a very high real-time frame rate that rendering via a pre-convolved environment map cannot achieve, so the method can meet the requirements of real-time rendering.
In an exemplary embodiment, the environment texture map may be a high dynamic range (HDR) image. Rendering in a game engine is typically performed with a low dynamic range environment texture map, but a low dynamic range image can represent only a limited range of colors and cannot represent details accurately. A high dynamic range image provides more color precision, giving a more accurate illumination color and enhancing the realism of the virtual scene. For example, as shown in fig. 6, 610 is the rendering effect of a low dynamic range map and 620 is the rendering effect of a high dynamic range map; the high dynamic range rendering has more detailed and richer color gradations and can represent more detail. If the rendering engine does not support high dynamic range maps, the engine's underlying code may be modified. For example, if an Unreal game engine is used to render the scene, the underlying code of the game engine may be modified to store color information with more color bits, which can then accommodate a high dynamic range image and thereby achieve a more accurate illumination effect.
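The loss the paragraph describes can be demonstrated with a toy LDR encoder; the radiance values here are made up for illustration only:

```python
def ldr_encode(rgb):
    """Simulate 8-bit LDR storage: each channel is clamped to [0, 1]
    and quantized to 256 levels, so any radiance above 1.0 is lost."""
    return tuple(min(255, max(0, round(c * 255))) / 255 for c in rgb)

sun_texel = (50.0, 45.0, 40.0)   # HDR radiance far above 1.0 (illustrative values)
wall_texel = (1.0, 1.0, 1.0)     # an ordinary white surface
```

After LDR encoding the sun texel and the white wall become identical, which is exactly the detail an HDR environment map preserves for accurate light-source colors.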
In step S130, the scene to be rendered is rendered by using the illumination information to simulate a lighting effect of the scene to be rendered.
After the illumination information of each virtual light is determined, the lighting effect of the virtual lights on the scene to be rendered can be calculated with an illumination formula, such as a diffuse reflection formula or a specular reflection formula. Specifically, the illumination intensity of each virtual light in the light array and its computed illumination color are first used to calculate the reflected light received by each pixel in the scene to be rendered. This calculation is performed for every virtual light in the light array to determine each light's reflected contribution at every point of the scene, and the contributions are then superposed to obtain the final lighting effect at each point.
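The superposition step can be sketched with a Lambertian diffuse formula; this is a simplified sketch only, since the per-light visibility/occlusion test that gives the method its correct shadowing is omitted:

```python
import math

def shade_point(point, normal, lights):
    """Sum the diffuse (Lambertian) contribution of every virtual
    light at one surface point. lights: (position, rgb, intensity)
    tuples; normal is assumed to be a unit vector."""
    r = g = b = 0.0
    for lpos, (lr, lg, lb), intensity in lights:
        # direction from the shaded point toward the light
        dx, dy, dz = (lpos[k] - point[k] for k in range(3))
        dist = math.sqrt(dx * dx + dy * dy + dz * dz)
        n_dot_l = (normal[0] * dx + normal[1] * dy + normal[2] * dz) / dist
        w = max(0.0, n_dot_l) * intensity  # back-facing lights contribute 0
        r += w * lr
        g += w * lg
        b += w * lb
    return (r, g, b)
```

Each light's color here would be the averaged environment-map color computed in step S330, so the loop realizes the "calculate each virtual light, then superpose" description above.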
In this exemplary embodiment, actual lighting calculations are performed with the virtual light sources in the light array, so correct occlusion information is obtained, phenomena such as light leakage are avoided, and rendering accuracy can be improved. Moreover, the surrounding environment information can be acquired from the environment texture map, so that the environment's effect is rendered onto the models of the scene to be rendered.
Further, in this exemplary embodiment, an illumination rendering apparatus is further provided, which is configured to execute the illumination rendering method of the present disclosure. The device can be applied to a server or terminal equipment.
Referring to fig. 7, the lighting rendering apparatus 700 may include: a virtual light generation module 710, an illumination information calculation module 720, and an illumination rendering module 730, wherein:
the virtual light generating module 710 is configured to generate a corresponding light array for a scene to be rendered, where the light array includes a plurality of uniformly distributed virtual light sources.
And the illumination information calculation module 720 is configured to obtain an environment texture map, sample the environment texture map according to the lighting array, and determine illumination information of each virtual light source in the lighting array on the scene to be rendered.
And an illumination rendering module 730, configured to render the scene to be rendered by using the illumination information, so as to simulate an illumination effect of the scene to be rendered.
In an exemplary embodiment of the present disclosure, the illumination information calculation module 720 may specifically include a sampling range calculation module, a sampling direction calculation module, and an illumination sampling module, where:
and the sampling range calculation module is used for calculating the sampling range of each virtual light source in the light array.
A sampling direction calculation module for determining, for each of the virtual light sources, a plurality of sampling directions of the virtual light source within the sampling range.
And the illumination sampling module is used for determining the illumination colors of the plurality of sampling directions according to the environment texture map so as to acquire the illumination information of the virtual light source.
In an exemplary embodiment of the present disclosure, the illumination sampling module may be specifically configured to: and calculating the arithmetic mean value of the illumination colors of the plurality of sampling directions, and determining the illumination information of the virtual light source according to the arithmetic mean value.
In an exemplary embodiment of the present disclosure, the sampling direction calculation module may include a random location point generation module and a direction determination module, wherein:
and the random position point generating module is used for generating a plurality of random position points in the sampling range.
And the direction determining module is used for determining a plurality of sampling directions of the virtual light source by utilizing the random position points and the position of the virtual light source in the light array.
In an exemplary embodiment of the present disclosure, the virtual light generation module 710 is specifically configured to: generate a spherical light array around the scene to be rendered, wherein the virtual light sources in the spherical light array are distributed according to a Fibonacci sequence.
In an exemplary embodiment of the present disclosure, the ambient texture map is a high dynamic range image.
In an exemplary embodiment of the disclosure, the illumination information calculation module 720 is specifically configured to: respond to an input operation on the user interface, so as to update the environment texture map in real time according to the input operation.
Since each functional module of the illumination rendering apparatus of the exemplary embodiments of the present disclosure corresponds to a step of the illumination rendering method described above, reference may be made to the embodiments of the illumination rendering method of the present disclosure for details not disclosed in the apparatus embodiments.
Referring to fig. 8, fig. 8 is a schematic diagram illustrating a system architecture of an exemplary application environment to which the illumination rendering method and the illumination rendering apparatus according to the embodiment of the present disclosure may be applied.
As shown in fig. 8, the system architecture 800 may include one or more of terminal devices 801, 802, 803, a network 804, and a server 805. The network 804 provides a medium for communication links between the terminal devices 801, 802, 803 and the server 805. The network 804 may include various types of connections, such as wired links, wireless communication links, or fiber-optic cables.
The terminal devices 801, 802, 803 may be various electronic devices having a display screen including, but not limited to, desktop computers, portable computers, smart phones, tablet computers, and the like. It should be understood that the number of terminal devices, networks, and servers in fig. 8 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, server 805 may be a server cluster comprised of multiple servers, or the like.
The illumination rendering method provided by the embodiment of the present disclosure may be executed by the terminal devices 801, 802, and 803, and accordingly, the illumination rendering apparatus may be disposed in the terminal devices 801, 802, and 803.
FIG. 9 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present disclosure.
It should be noted that the computer system 900 of the electronic device shown in fig. 9 is only an example, and should not impose any limitation on the functions or the scope of use of the embodiments of the present disclosure.
As shown in fig. 9, the computer system 900 includes a Central Processing Unit (CPU)901 that can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)902 or a program loaded from a storage section 908 into a Random Access Memory (RAM) 903. In the RAM 903, various programs and data necessary for system operation are also stored. The CPU 901, ROM902, and RAM 903 are connected to each other via a bus 904. An input/output (I/O) interface 905 is also connected to bus 904.
The following components are connected to the I/O interface 905: an input portion 906 including a keyboard, a mouse, and the like; an output section 907 including components such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage portion 908 including a hard disk and the like; and a communication section 909 including a network interface card such as a LAN card, a modem, or the like. The communication section 909 performs communication processing via a network such as the internet. The drive 910 is also connected to the I/O interface 905 as necessary. A removable medium 911 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 910 as necessary, so that a computer program read out therefrom is mounted into the storage section 908 as necessary.
In particular, according to embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 909, and/or installed from the removable medium 911. When executed by the Central Processing Unit (CPU) 901, the computer program performs the various functions defined in the method and apparatus of the present application.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or by hardware, and the described units may also be disposed in a processor. The names of these units do not, in any case, constitute a limitation on the units themselves.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments, or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method described in the above embodiments. For example, the electronic device may implement the steps shown in fig. 1 and fig. 2, and so on.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A lighting rendering method, comprising:
generating a corresponding light array for a scene to be rendered, wherein the light array comprises a plurality of uniformly distributed virtual light sources;
acquiring an environment texture map, sampling the environment texture map according to the light array, and determining illumination information of each virtual light source in the light array on the scene to be rendered;
and rendering the scene to be rendered by utilizing the illumination information so as to simulate the illumination effect of the scene to be rendered.
2. The method of claim 1, wherein the sampling the environment texture map according to the light array and determining illumination information of each virtual light source in the light array on the scene to be rendered comprises:
calculating the sampling range of each virtual light source in the light array;
for each of the virtual light sources, determining a plurality of sampling directions for the virtual light source within the sampling range;
and determining the illumination colors of the multiple sampling directions according to the environment texture map so as to obtain the illumination information of the virtual light source.
3. The method according to claim 2, wherein the determining the illumination colors of the plurality of sampling directions according to the environment texture map to obtain the illumination information of the virtual light source comprises:
and calculating the arithmetic mean value of the illumination colors of the plurality of sampling directions, and determining the illumination information of the virtual light source according to the arithmetic mean value.
4. The method of claim 2, wherein determining the plurality of sampling directions of the virtual light source within the sampling range comprises:
generating a plurality of random location points within the sampling range;
and determining a plurality of sampling directions of the virtual light source by using the random position points and the position of the virtual light source in the light array.
5. The method of claim 1, wherein generating the corresponding light array for the scene to be rendered comprises:
and generating a spherical lighting array around the scene to be rendered, wherein the virtual light sources in the spherical lighting array are distributed according to a Fibonacci sequence.
6. The method of claim 1, wherein the ambient texture map is a high dynamic range image.
7. The method of claim 1, wherein obtaining the environmental texture map comprises:
responding to an input operation on a user interface, so as to update the environment texture map in real time according to the input operation.
8. An illumination rendering apparatus, comprising:
the virtual light generation module is used for generating a corresponding light array for a scene to be rendered, wherein the light array comprises a plurality of uniformly distributed virtual light sources;
the illumination information calculation module is used for acquiring an environment texture map, sampling the environment texture map and determining illumination information of each virtual light source in the light array on the scene to be rendered;
and the illumination rendering module is used for rendering the scene to be rendered by utilizing the illumination information so as to simulate the illumination effect of the scene to be rendered.
9. A computer-readable medium, on which a computer program is stored, which, when being executed by a processor, carries out the method of any one of claims 1 to 7.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1-7 via execution of the executable instructions.
CN202010889379.1A 2020-08-28 2020-08-28 Illumination rendering method and device, computer readable medium and electronic equipment Active CN111915712B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010889379.1A CN111915712B (en) 2020-08-28 2020-08-28 Illumination rendering method and device, computer readable medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN111915712A true CN111915712A (en) 2020-11-10
CN111915712B CN111915712B (en) 2024-05-28

Family

ID=73266438

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010889379.1A Active CN111915712B (en) 2020-08-28 2020-08-28 Illumination rendering method and device, computer readable medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111915712B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112785672A (en) * 2021-01-19 2021-05-11 浙江商汤科技开发有限公司 Image processing method and device, electronic equipment and storage medium
WO2022188460A1 (en) * 2021-03-09 2022-09-15 网易(杭州)网络有限公司 Illumination rendering method and apparatus, and electronic device and storage medium
WO2024066559A1 (en) * 2022-09-28 2024-04-04 杭州群核信息技术有限公司 Rendering method, apparatus and system, electronic device, and storage medium

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE69430647D1 (en) * 1993-01-28 2002-06-27 Philips Electronics Uk Ltd image display
US20030025706A1 (en) * 2001-08-03 2003-02-06 Ritter Bradford A. System and method for rendering a texture map utilizing an illumination modulation value
JP2003296750A (en) * 2002-03-21 2003-10-17 Microsoft Corp Graphic image rendering using radiance autotransfer of low frequency lighting environment
US20090027391A1 (en) * 2007-07-23 2009-01-29 Disney Enterprises, Inc. Directable lighting method and apparatus
US20110109631A1 (en) * 2009-11-09 2011-05-12 Kunert Thomas System and method for performing volume rendering using shadow calculation
CN103021020A (en) * 2012-12-05 2013-04-03 上海创图网络科技发展有限公司 Three-dimensional (3D) rendering method based on multiple light sources
US20150187093A1 (en) * 2013-12-30 2015-07-02 Samsung Electronics Co., Ltd. Illuminating a virtual environment with camera light data
CN107909638A (en) * 2017-11-15 2018-04-13 网易(杭州)网络有限公司 Rendering intent, medium, system and the electronic equipment of dummy object
CN110223363A (en) * 2019-04-25 2019-09-10 合刃科技(深圳)有限公司 Image generating method and device
CN111127624A (en) * 2019-12-27 2020-05-08 珠海金山网络游戏科技有限公司 Illumination rendering method and device based on AR scene
CN111260766A (en) * 2020-01-17 2020-06-09 网易(杭州)网络有限公司 Virtual light source processing method, device, medium and electronic equipment
CN111343444A (en) * 2020-02-10 2020-06-26 清华大学 Three-dimensional image generation method and device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
吴伟和; 郝爱民; 李智; 王文涛; 杨跃东; 王立君: "Global Illumination Simulation Based on Direct Illumination" (基于直接光照的全局光照模拟), Computer Engineering (计算机工程), no. 10, pp. 257-258 *
林喆: "An Analysis of 3D Animation Rendering Technology" (三维动画渲染技术解析), Modern Film Technology (现代电影技术), pp. 22-25 *
胡孔明; 于瀛洁; 张之江: "Research on Light-Field-Based Rendering Technology" (基于光场的渲染技术研究), Microcomputer Applications (微计算机应用), no. 02, pp. 22-27 *

Also Published As

Publication number Publication date
CN111915712B (en) 2024-05-28

Similar Documents

Publication Publication Date Title
CN111915712B (en) Illumination rendering method and device, computer readable medium and electronic equipment
CN107886562B (en) Water surface rendering method and device and readable storage medium
CN113674389B (en) Scene rendering method and device, electronic equipment and storage medium
CN111260766A (en) Virtual light source processing method, device, medium and electronic equipment
CN108882025B (en) Video frame processing method and device
CN110765620A (en) Aircraft visual simulation method, system, server and storage medium
CN112184873B (en) Fractal graph creation method, fractal graph creation device, electronic equipment and storage medium
CN112734896B (en) Environment shielding rendering method and device, storage medium and electronic equipment
CN110084873B (en) Method and apparatus for rendering three-dimensional model
WO2023061232A1 (en) Image rendering method and apparatus, device, and medium
CN111882631A (en) Model rendering method, device, equipment and storage medium
CN111798554A (en) Rendering parameter determination method, device, equipment and storage medium
WO2023273133A1 (en) Game model light supplementing method and apparatus, storage medium, and computer device
CN112580213A (en) Method and apparatus for generating display image of electric field lines, and storage medium
CN110930492B (en) Model rendering method, device, computer readable medium and electronic equipment
CN116167239A (en) Infrared simulation method, system, computer and readable storage medium
CN115082628B (en) Dynamic drawing method and device based on implicit optical transfer function
US11727535B2 (en) Using intrinsic functions for shadow denoising in ray tracing applications
CN112446944B (en) Method and system for simulating real environment light in AR scene
CN112967369A (en) Light ray display method and device
CN117581268A (en) Rendering visual representations of luminaires by reusing light values
CN113744379A (en) Image generation method and device and electronic equipment
CN115239869B (en) Shadow processing method, shadow rendering method and device
CN117252976A (en) Model rendering order determining method and device, electronic equipment and storage medium
CN113989439A (en) Visual domain analysis method and device based on UE4

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant