CN106251393B - A progressive photon mapping optimization method based on sample elimination - Google Patents
- Publication number
- CN106251393B CN106251393B CN201610555035.0A CN201610555035A CN106251393B CN 106251393 B CN106251393 B CN 106251393B CN 201610555035 A CN201610555035 A CN 201610555035A CN 106251393 B CN106251393 B CN 106251393B
- Authority
- CN
- China
- Prior art keywords
- photon
- photons
- elimination
- rendering
- sample
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/06—Ray-tracing
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Generation (AREA)
Abstract
The invention discloses a progressive photon mapping optimization method based on sample elimination, comprising a ray tracing stage; a photon tracking stage; a photon elimination stage; a rendering and imaging stage: the shading points are traversed, the photons in the photon map within the photon search radius of each shading point are queried, the contribution color of each shading point is calculated using its contribution parameters to screen space, and the contribution color is returned to screen space to form a rendered image; and a photon rendering iteration stage: the photon search radius of the shading points is reduced and the photon tracking through rendering stages are repeated until the set number of rendering iterations is reached, after which all rendered images are accumulated and averaged to obtain the final rendered image. The invention is easy to implement, simple in method, and has good blue-noise properties.
Description
Technical Field
The invention relates to the field of photorealistic rendering in computer graphics, and in particular to a progressive photon mapping optimization method based on sample elimination.
Background
The photon mapping algorithm is one of the global illumination algorithms and is mainly used to achieve realistic rendering effects at photo-level image quality (photorealistic rendering). Photo-level realistic rendering is currently the most central technology in the field of film and television production. Realistic rendering makes three-dimensional film and animation works more vivid, showing the same realistic effects as real life. Realistic rendering mainly comprises key technologies such as global illumination algorithms and special-effect simulation. A global illumination algorithm renders not only the direct illumination of objects by light sources but also the transfer of light energy between objects, including color bleeding, caustics, ambient occlusion, and so on.
The progressive photon mapping algorithm is an optimized photon mapping algorithm that obtains a progressive rendering effect by dividing photon rendering into multiple iterations, solving the storage and bias problems of photon rendering. The first step is a ray tracing phase: starting from the camera, rays are emitted into the scene through screen space, the intersection points of the rays with the scene are traced, and these intersection points and their contribution parameters to screen space are recorded; these intersection points are also called shading points. An iterative photon rendering phase is then performed. Each photon rendering pass is divided into a photon emission preprocessing stage and a photon collection stage.
Existing photon mapping algorithms can only guarantee rendering precision through large-scale photon emission, which requires substantial execution time and storage in both the photon tracking stage and the rendering stage, so computational efficiency is low. Once a certain number of emitted photons is chosen, the number of photons collected affects the rendering result: when few photons are collected, noise appears because of the random distribution of photons; when many photons are collected, problems such as bias appear at feature edges.
Disclosure of Invention
The invention provides a progressive photon mapping optimization method based on sample elimination, aimed at the noise caused by the random emission and bouncing of photons in existing photon mapping algorithms. The invention accelerates progressive photon mapping and optimizes the rendering effect.
In order to achieve the purpose, the invention adopts the following technical scheme:
a progressive photon mapping optimization method based on sample elimination comprises the following steps:
step one, a ray tracing stage: tracking the intersection points of rays with the scene, and recording the intersection points and their contribution parameters to screen space; wherein the intersection points are shading points;
step two, a photon tracking stage: starting from the light sources in the scene, emitting photons and performing forward photon tracking, storing the photons that land on diffuse reflecting surfaces in the scene into a photon map, and constructing a KD-tree from the photon map;
step three, a photon elimination stage: calculating the elimination weight of each photon in the KD-tree, constructing a max-heap according to the elimination weights, selecting the photon with the maximum elimination weight, eliminating it and updating the elimination weights of its neighbors, updating the max-heap, and again selecting, eliminating, and updating, until the set number of eliminated photons is reached;
step four, rendering and imaging: traversing the shading points, querying the photons in the photon map within the photon search radius of each shading point, calculating the contribution color of the shading point using its contribution parameters to screen space, and returning the contribution color to screen space to form a rendered image;
step five, photon rendering iteration: reducing the photon search radius of the shading points, and repeating steps two to four until the set number of rendering iterations is reached; all rendered images formed are accumulated and then averaged to obtain the final rendered image.
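The five steps above can be sketched as a minimal runnable loop. Every function body below is a placeholder I invented to show the control flow, not the patent's algorithm: trace photons, eliminate a fixed fraction, render, shrink the search radius, and average the accumulated results.

```python
import random

def trace_photons(n):
    # photon tracking stage stand-in: n random 2D photon positions
    return [(random.random(), random.random()) for _ in range(n)]

def eliminate(photons, ratio):
    # photon elimination stage stand-in: keep a (1 - ratio) fraction
    keep = int(len(photons) * (1.0 - ratio))
    return photons[:keep]

def render(photons, radius):
    # rendering stage stand-in: a single scalar "image"
    return len(photons) * radius

def progressive_render(iterations=4, n_photons=1000, ratio=0.3,
                       r0=1.0, alpha=0.9):
    accum, radius = 0.0, r0
    for _ in range(iterations):
        photons = eliminate(trace_photons(n_photons), ratio)
        accum += render(photons, radius)
        radius *= alpha              # r_t = alpha * r_{t-1} (formula (2))
    return accum / iterations        # average of the accumulated renders

image = progressive_render()
```

The point of the sketch is the ordering: elimination sits between photon tracing and rendering, and the radius shrinks once per iteration before the next pass.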
In the process of the ray tracing stage, rays are emitted into a scene through a screen space from a camera.
In step one, during the ray tracing phase, when a ray intersects a diffuse reflecting surface, the shading point and its contribution parameters to screen space are saved.
During the ray tracing phase, when a ray intersects a non-diffuse reflecting surface, the shading point and its contribution parameters to screen space are stored, and the ray continues to be traced until it intersects a diffuse reflecting surface.
In the second step, each light source in the scene emits a certain number of photons, the outgoing direction of the photons is determined by using a random algorithm, and photon rays are tracked in the scene.
In the second step, when a photon collides with the surface of a non-diffuse reflecting object, the photon is reflected, refracted, scattered, or absorbed according to the properties of the object, and at the same time the energy, direction, position, and surface normal of the photon are stored in the photon map.
In the second step, when a photon collides with a diffuse reflecting surface, the photon undergoes diffuse reflection or absorption, and at the same time the energy, direction, position, and surface normal of the photon are stored in the photon map.
In the third step, the elimination weight of a photon in the KD-tree is the sum of the weights of all neighbors of the current photon, where the weight of any one neighbor of the current photon is calculated as follows:
step (3.1): calculating the ratio of the distance from the current photon to that neighbor photon to twice the photon-density influence parameter;
step (3.2): subtracting the ratio obtained in step (3.1) from 1 to obtain the weight of that neighbor of the current photon.
The weight calculation formula over all neighbors of the current photon is:

w_ij = 1 − d_ij / (2·r_max,i),   w_i = Σ_{j=1}^{N_i} w_ij,   r_max,i = √(A_i / (2·√3·N_i))

wherein w_ij is the weight between the current photon i and its jth neighbor photon, j being an integer greater than or equal to 1; d_ij is the distance between the current photon i and its jth neighbor photon; r_max,i is the influence parameter of the photon density; A_i is the nearest-neighbor search radius; N_i is the number of nearest neighbors, an integer greater than or equal to 1.
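The two steps above reduce to a short computation. In this sketch the function names are mine, and the clamp to zero for distances beyond 2·r_max,i is an assumption borrowed from standard sample elimination rather than stated verbatim in the text:

```python
def neighbor_weight(d_ij, r_max_i):
    # w_ij = 1 - d_ij / (2 * r_max_i), clamped at 0 so photons farther
    # than 2*r_max_i contribute nothing (assumed clamp)
    return max(0.0, 1.0 - d_ij / (2.0 * r_max_i))

def elimination_weight(neighbor_distances, r_max_i):
    # elimination weight of a photon = sum of its neighbor weights
    return sum(neighbor_weight(d, r_max_i) for d in neighbor_distances)

w = elimination_weight([0.5, 0.5], r_max_i=0.5)  # each neighbor weighs 0.5
```

Closer neighbors produce larger weights, so densely clustered photons accumulate high elimination weights and are removed first.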
In the fourth step, the contribution parameters of the shading point to screen space include the outgoing radiance of the shading point, calculated as:

L(x, w) = (1 / (r·π·r_t²)) · Σ_{i=1}^{k} f_r(x, w, w_i)·φ_i,   r_t = α·r_{t−1}

wherein L(x, w) represents the outgoing radiance calculated at the shading point position x; w is the incident ray direction at the shading point x; k is the number of photons within the search radius, k being an integer greater than or equal to 1; w_i is the direction of the ith of the k photons; φ_i is the energy of the ith photon; f_r(x, w, w_i) is the bidirectional reflectance distribution function, i.e. the ratio between incident and outgoing radiance; r is the photon elimination ratio, i.e. the number of photons remaining after elimination divided by the number of emitted photons; r_t is the photon search radius around the shading point, t is the current rendering iteration count, t being an integer greater than or equal to 1, and α is a control parameter, a constant.
The invention has the beneficial effects that:
(1) The invention provides a progressive photon mapping optimization algorithm based on sample elimination, which applies a sample processing method to the photon mapping method. The photons remain randomly distributed but acquire a certain blue-noise property, so noise in the rendered picture is reduced. The invention adds a photon elimination step between photon emission and photon rendering: by computing weights from the relative distances between photons, it eliminates the photons closest to other photons and selects an optimal elimination ratio, so that the photon map used in each pass of the progressive photon mapping method has the blue-noise characteristic and the photon distribution used in each step is better optimized, thereby accelerating progressive photon mapping and improving the rendering effect.
(2) In the optimized photon sample processing method, the elimination weight of a photon is controlled according to the density of the photon distribution, which preserves the overall illumination brightness; at the same time, sample-based photon elimination can be applied to the calculation of all global illumination rendering effects. The elimination-based sampling method combines closely with progressive photon mapping, extending the sample-based photon mapping optimization to the progressive photon mapping method. By optimizing the photon distribution of each iteration step, convergence is accelerated, and the number of iterations can be roughly halved.
Drawings
FIG. 1 is a flowchart of a progressive photon mapping optimization method based on sample elimination according to the present invention.
Fig. 2(a) is a rendering effect diagram for rendering a Cornell Box scene 64 times using a progressive photon mapping algorithm.
Fig. 2(b) is a rendering effect diagram of 32 passes of rendering a Cornell Box scene using the present invention.
Fig. 2(c) is a rendering effect diagram of 64 passes of rendering a Cornell Box scene using the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention.
The sample elimination method can produce Poisson-disk-like sample distributions, giving the samples stronger blue-noise properties. It computes a weight between every pair of sample points according to their distance: the closer two sample points are, the higher the weight; the elimination weight of a sample point is the sum of the weights between that point and its surrounding neighbors. The method uses a greedy strategy: from a set of randomly generated samples, it repeatedly eliminates the sample point with the highest weight and updates the elimination weights of the neighbors around that point. Finally, a fixed-size set of sample points with blue-noise properties is obtained. As a sample processing method, sample elimination is easy to implement, simple, and yields good blue-noise properties. The photon set in the photon mapping method is treated as a sample point set and processed in this way; the processed photon set removes random noise in the distribution to a certain extent.
Rendering is the process of computing a picture from a modeled three-dimensional scene. Before the rendering task starts, the three-dimensional scene data are converted into a representation the rendering engine can read: a scene data package containing complete information such as cameras, geometry, light sources, materials, and texture maps. Rendering starts once the scene preparation is finished.
A Cornell Box scene (Box scene for short) is a simple example for rendering global illumination effects. In this scene, a point light source is placed at the top of the box, and two small boxes are placed in its middle. The number of rendering passes was set to 64, the elimination ratio to 30%, and the pixel sampling rate to 4.
Fig. 1 is a flowchart of a progressive photon mapping optimization method based on sample elimination according to the present invention, and as shown in the figure, the progressive photon mapping optimization method based on sample elimination specifically includes the following steps after rendering is started:
step 1, a ray tracing stage; starting from a camera, rays are emitted into a scene through a screen space, intersections of the rays and the scene are tracked, and the intersections and the contribution parameters to the screen space are recorded.
The step 1 is as follows:
the method comprises the steps of emitting light rays into a scene from a camera right in front of a box, determining the direction of the light rays through the relation between the camera and a screen space, tracking the light rays as 4 sampling points are sampled by one pixel, emitting 4 light rays through one pixel, storing the intersection point at the first intersection point of the light rays and the scene, and storing the contribution parameters of the surface to the screen space, wherein the intersection points are also called coloring points.
Step 2, a photon tracking stage; photons are traced by emitting a certain number of photons from the light source and performing forward photon tracking, and the photons are stored on the surface of the diffuse reflecting object into a photon map, constructing a KD-tree of photons.
The method comprises the following specific steps:
20,000 photons are emitted into the scene from a light source above the box; a random direction is generated for each photon, and the photon's path is traced through the scene. The energy, position, direction, and surface normal of each photon that collides with a surface are stored in the photon map. A photon colliding with an object surface is absorbed or diffusely reflected, and diffusely reflected photons continue to be traced through the scene. A maximum bounce count ends the photon tracking process and completes the generation of the photon map.
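A minimal stand-in for the photon map built in this step. The record fields mirror the energy/direction/position/normal data named above; a real implementation would index photon positions with a KD-tree (e.g. `scipy.spatial.cKDTree`), but this sketch uses a brute-force neighbor search for self-containment:

```python
import math
import random

random.seed(7)
# each stored photon carries energy, direction, position, and surface
# normal, as the patent requires; only position is used in the query
photon_map = [{"pos": (random.random(), random.random()),
               "energy": 1.0,
               "direction": (0.0, 0.0, -1.0),
               "normal": (0.0, 0.0, 1.0)} for _ in range(2000)]

def nearest_neighbors(photons, query, k):
    # return the k photons whose positions are closest to `query`
    def dist(p):
        return math.hypot(p["pos"][0] - query[0], p["pos"][1] - query[1])
    return sorted(photons, key=dist)[:k]

neighbors = nearest_neighbors(photon_map, (0.5, 0.5), 10)
```

Brute force is O(n log n) per query; the KD-tree of step two exists precisely to make these neighbor queries fast during elimination and rendering.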
Step 3, a photon elimination stage; calculating the elimination weight of each photon, constructing a maximum heap according to the elimination weight, selecting the photon with the maximum elimination weight, eliminating the photon and updating the elimination weight of the neighbor of the photon, updating the maximum heap, selecting the photon with the maximum elimination weight again for elimination and updating, and repeating the process until the set photon elimination number is reached.
The method comprises the following specific steps:
step (3.1): traversing each photon, searching for the 10 nearest neighbor photons around it, and calculating the weight between each neighbor photon and the current photon according to formula (1); the elimination weight of the current photon is the sum of the neighbor weights.
The weight calculation formula over all neighbors of the current photon is:

w_ij = 1 − d_ij / (2·r_max,i),   w_i = Σ_{j=1}^{N_i} w_ij,   r_max,i = √(A_i / (2·√3·N_i))   formula (1)

wherein w_ij is the weight between the current photon i and its jth neighbor photon, j being an integer greater than or equal to 1; d_ij is the distance between the current photon i and its jth neighbor photon; r_max,i is the influence parameter of the photon density; A_i is the nearest-neighbor search radius; N_i is the number of nearest neighbors, an integer greater than or equal to 1.
In the present invention, a low-sampling tree is used to store r_max,i; each time, the r_max,i nearest to the current photon's position is selected. r_max,i can be calculated from the last expression of formula (1), where A_i is the nearest-neighbor search radius and N_i is the number of nearest neighbors. In the first iteration, the freshly computed r'_max,i is used directly as r_max,i; in the remaining iterations, the average of the r_max,i value queried from the low-sampling tree and the currently computed r'_max,i is used as r_max,i, and the data values in the low-sampling tree are updated after the calculation;
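The running-average update of r_max,i across iterations can be sketched as follows. The dictionary stands in for the "low-sampling tree", and the cell key is a hypothetical spatial index; both names are mine:

```python
stored = {}  # stand-in for the "low-sampling tree", keyed by a cell index

def update_r_max(cell, r_new, first_iteration):
    # first iteration (or unseen cell): use the fresh value directly;
    # later iterations: average the stored value with the fresh one,
    # then write the result back into the tree
    if first_iteration or cell not in stored:
        stored[cell] = r_new
    else:
        stored[cell] = 0.5 * (stored[cell] + r_new)
    return stored[cell]

r1 = update_r_max(0, 0.2, first_iteration=True)   # 0.2
r2 = update_r_max(0, 0.4, first_iteration=False)  # ~0.3
```

Averaging with the stored value smooths the density estimate across iterations instead of letting each pass's noisy r'_max,i swing the elimination weights.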
step (3.2): establishing a maximum heap according to the elimination weights of all photons;
step (3.3): selecting the photon with the maximum elimination weight;
step (3.4): updating the elimination weights of the neighbor photons of the photon eliminated in step (3.3), by subtracting from each neighbor's elimination weight its weight with respect to the eliminated photon;
step (3.5): updating the max-heap, and repeating steps (3.3) to (3.5) until the set elimination ratio of 30% is reached, i.e. 6000 photons are eliminated.
Step 4, the rendering and imaging stage: traversing all shading points recorded in step 1; for each shading point, the photon map is searched and the illumination effect is estimated from the photons within radius r_t of the shading point, including all photons of both direct and indirect illumination, where t is the current iteration count; bias is eliminated by the control parameter shrinking the search radius in each iteration. Finally, the color is calculated according to the recorded contribution parameters of the shading point, returned to screen space, and accumulated.
The method comprises the following specific steps:
step (4.1): traversing each shading point; for a shading point, searching for photons within radius r_t of the shading point, where t is the current rendering iteration count, t being an integer greater than or equal to 1, and α is a control parameter, a constant taken as 0.9 in this example run;
r_t = α·r_{t−1}   formula (2)
step (4.2): calculating the outgoing radiance of the current shading point according to formula (3):

L(x, w) = (1 / (r·π·r_t²)) · Σ_{i=1}^{k} f_r(x, w, w_i)·φ_i   formula (3)

wherein L(x, w) represents the outgoing radiance calculated at the shading point position x; w is the incident ray direction at the shading point x; k is the number of photons within the search radius, k being an integer greater than or equal to 1; w_i is the direction of the ith of the k photons; φ_i is the energy of the ith photon; f_r(x, w, w_i) is the bidirectional reflectance distribution function, i.e. the ratio between incident and outgoing radiance; r is the photon elimination ratio, i.e. the number of photons remaining after elimination divided by the number of emitted photons.
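Under the symbol definitions above, formula (3) can be evaluated directly. In this sketch the BRDF values are passed in precomputed, and the constant Lambertian BRDF (1/π) in the example call is an illustrative assumption, not the patent's material model:

```python
import math

def outgoing_radiance(photon_energies, brdf_values, r_ratio, r_t):
    # formula (3): the 1/r factor compensates for the energy removed by
    # photon elimination; pi * r_t^2 is the density-estimation disc area
    area = math.pi * r_t * r_t
    return sum(f * phi for f, phi in zip(brdf_values, photon_energies)) \
        / (r_ratio * area)

# example: 4 photons of energy 0.25 each, Lambertian BRDF 1/pi,
# no elimination (r = 1), search radius 1
L = outgoing_radiance([0.25] * 4, [1 / math.pi] * 4, r_ratio=1.0, r_t=1.0)
```

With an elimination ratio r < 1 the same photon energies yield proportionally more radiance, which is how the estimate stays unbiased after part of the photon map is discarded.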
Step (4.3): and calculating the contribution color of the coloring point by utilizing the contribution information of the coloring point to the screen space, returning the color to the screen space, accumulating and averaging the color and the image of the previous iteration to form an image, and displaying the image in a rendering result window.
Step 5, photon rendering iteration: steps 2 to 4 are repeated until the set number of rendering iterations, 64, is reached, and rendering terminates. Fig. 2(c) is the result of 64 rendering passes using the present method.
FIG. 2(a) is a rendering effect diagram for a 64-pass rendering of a Cornell Box scene using a progressive photon mapping algorithm; FIG. 2(b) is a rendering effect diagram of a Cornell Box scene 32 times rendered using the method of the present patent. As can be seen by comparing fig. 2(a) to fig. 2 (c): the image obtained by the method is clearer and has better rendering effect.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
Although the embodiments of the present invention have been described with reference to the accompanying drawings, it is not intended to limit the scope of the present invention, and it should be understood by those skilled in the art that various modifications and variations can be made without inventive efforts by those skilled in the art based on the technical solution of the present invention.
Claims (8)
1. A progressive photon mapping optimization method based on sample elimination is characterized by comprising the following steps:
step one, a ray tracing stage: tracking the intersection points of rays with the scene, and recording the intersection points and their contribution parameters to screen space; wherein the intersection points are shading points;
step two, a photon tracking stage: starting from the light sources in the scene, emitting photons and performing forward photon tracking, storing the photons on the diffuse reflecting surfaces in the scene into a photon map, and constructing a KD-tree from the photon map;
step three, a photon elimination stage: calculating the elimination weight of each photon in the KD-tree, constructing a max-heap according to the elimination weights, selecting the photon with the maximum elimination weight, eliminating the photon with the current maximum elimination weight and updating the elimination weights of its neighbors, updating the max-heap, and again selecting, eliminating, and updating, until the set number of eliminated photons is reached;
step four, rendering and imaging: traversing the shading points, querying the photons in the photon map within the photon search radius of each shading point, calculating the contribution color of the shading point using its contribution parameters to screen space, and returning the contribution color to screen space to form a rendered image;
step five, photon rendering iteration: reducing the photon search radius of the shading points, repeating steps two to four until the set number of rendering iterations is reached, and accumulating all rendered images formed and then averaging to obtain the final rendered image;
in step four, the contribution parameters of the shading point to screen space include the outgoing radiance of the shading point, calculated as:

L(x, w) = (1 / (r·π·r_t²)) · Σ_{i=1}^{k} f_r(x, w, w_i)·φ_i,   r_t = α·r_{t−1}

wherein L(x, w) represents the outgoing radiance calculated at the shading point position x; w is the incident ray direction at the shading point x; k is the number of photons within the search radius, k being an integer greater than or equal to 1; w_i is the direction of the ith of the k photons; φ_i is the energy of the ith photon; f_r(x, w, w_i) is the bidirectional reflectance distribution function, i.e. the ratio between incident and outgoing radiance; r is the photon elimination ratio, i.e. the number of photons remaining after elimination divided by the number of emitted photons; r_t is the photon search radius around the shading point, t is the current rendering iteration count, an integer greater than or equal to 1, and α is a control parameter, a constant.
2. The progressive photon mapping optimization method based on sample elimination as claimed in claim 1, wherein, during the ray tracing stage, rays are emitted from the camera through screen space into the scene.
3. The method of claim 1, wherein, during the ray tracing phase, when a ray intersects a diffuse reflecting surface, the shading point and its contribution parameters to screen space are saved.
4. The method of claim 1, wherein, during the ray tracing phase, when a ray intersects a non-diffuse reflecting surface, the shading point and its contribution parameters to screen space are saved, and ray tracing continues until the ray intersects a diffuse reflecting surface.
5. The progressive photon mapping optimization method based on sample elimination as claimed in claim 1, wherein, in step two, each light source in the scene emits a certain number of photons, the emission direction of each photon is determined using a random algorithm, and the photon rays are traced in the scene.
6. The progressive photon mapping optimization method based on sample elimination as claimed in claim 1, wherein, in step two, when a photon collides with the surface of a non-diffuse reflecting object, the photon is reflected, refracted, scattered, or absorbed according to the properties of the object, and the energy, direction, position, and surface normal of the photon are saved in the photon map.
7. The progressive photon mapping optimization method based on sample elimination as claimed in claim 1, wherein, in step two, when a photon collides with a diffuse reflecting surface, the photon is diffusely reflected or absorbed, and at the same time the energy, direction, position, and surface normal of the photon are saved in the photon map.
8. The progressive photon mapping optimization method based on sample elimination as claimed in claim 1, wherein, in step three, the elimination weight of a photon in the KD-tree is the sum of the weights of all neighbors of the current photon; the weight of any one neighbor of the current photon is calculated as follows:
step (3.1): calculating the ratio of the distance from the current photon to that neighbor photon to twice the photon-density influence parameter;
step (3.2): subtracting the ratio obtained in step (3.1) from 1 to obtain the weight of that neighbor of the current photon.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610555035.0A CN106251393B (en) | 2016-07-14 | 2016-07-14 | A progressive photon mapping optimization method based on sample elimination
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610555035.0A CN106251393B (en) | 2016-07-14 | 2016-07-14 | A progressive photon mapping optimization method based on sample elimination
Publications (2)
Publication Number | Publication Date |
---|---|
CN106251393A CN106251393A (en) | 2016-12-21 |
CN106251393B true CN106251393B (en) | 2018-11-09 |
Family
ID=57613777
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610555035.0A Active CN106251393B (en) | 2016-07-14 | 2016-07-14 | A progressive photon mapping optimization method based on sample elimination |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106251393B (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107909647B (en) * | 2017-11-22 | 2020-09-15 | 长春理工大学 | Realistic virtual 3D scene light field projection image drawing method based on spatial multiplexing |
CN109509246B (en) * | 2018-03-25 | 2022-08-02 | 哈尔滨工程大学 | Photon map clustering method based on self-adaptive sight division |
CN108961372B (en) * | 2018-03-27 | 2022-10-14 | 北京大学 | Progressive photon mapping method based on statistical model test |
US10748248B2 (en) * | 2018-05-15 | 2020-08-18 | Adobe Inc. | Image down-scaling with pixel sets selected via blue noise sampling |
CN109509248B (en) * | 2018-09-28 | 2023-07-18 | 北京大学 | Photon mapping rendering method and system based on neural network |
CN110211197B (en) * | 2019-05-29 | 2020-10-02 | 山东大学 | Photon mapping optimization method, device and system based on polygon space division |
CN110599579B (en) * | 2019-09-20 | 2023-02-24 | 山东师范大学 | Photon resampling-based random asymptotic photon mapping image rendering method and system |
CN111445422B (en) * | 2020-04-17 | 2022-06-07 | 山东大学 | Random asymptotic photon mapping image noise reduction method and system based on neural network |
CN114549730A (en) * | 2020-11-27 | 2022-05-27 | 华为技术有限公司 | Light source sampling weight determination method for multi-light source scene rendering and related equipment |
CN114066721B (en) * | 2021-11-03 | 2024-02-02 | 抖音视界有限公司 | Display method and device and electronic equipment |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101080508B1 (en) * | 2011-09-14 | 2011-11-04 | Inha University Industry-Academic Cooperation Foundation | Multi-level progressive photon mapping method |
CN104200509A (en) * | 2014-08-19 | 2014-12-10 | 山东大学 | Photon mapping accelerating method based on point cache |
CN104700448A (en) * | 2015-03-23 | 2015-06-10 | 山东大学 | Self adaption photon mapping optimization algorithm based on gradient |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101080508B1 (en) * | 2011-09-14 | 2011-11-04 | Inha University Industry-Academic Cooperation Foundation | Multi-level progressive photon mapping method |
CN104200509A (en) * | 2014-08-19 | 2014-12-10 | 山东大学 | Photon mapping accelerating method based on point cache |
CN104700448A (en) * | 2015-03-23 | 2015-06-10 | 山东大学 | Self adaption photon mapping optimization algorithm based on gradient |
Non-Patent Citations (1)
Title |
---|
Progressive Photon Mapping with Sample Elimination; Chunmeng Kang, et al.; ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games; 2016-02-27; main text, page 201, Section 1, paragraphs 2-3, Equation 1 *
Also Published As
Publication number | Publication date |
---|---|
CN106251393A (en) | 2016-12-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106251393B (en) | A progressive photon mapping optimization method based on sample elimination | |
US9523772B2 (en) | Object removal using lidar-based classification | |
JP2669599B2 (en) | Shadow drawing method and three-dimensional graphic computer system | |
US9355491B2 (en) | Ray tracing apparatus and method | |
KR102164541B1 (en) | Apparatus and method for generating acceleration structure in a ray tracing system | |
KR102604737B1 (en) | METHOD AND APPARATUS for generating acceleration structure | |
KR20210012069A (en) | Robust merge of 3d textured meshes | |
JP5711289B2 (en) | Prediction method, generation method, and recording medium | |
US20110032256A1 (en) | Image processing apparatus and method | |
CN104700448A (en) | Self adaption photon mapping optimization algorithm based on gradient | |
CN111161384B (en) | Path guiding method of participation medium | |
Ahn et al. | Panerf: Pseudo-view augmentation for improved neural radiance fields based on few-shot inputs | |
CN112802142B (en) | Non-visual field imaging method and system | |
CN113298925A (en) | Dynamic scene rendering acceleration method based on ray path multiplexing | |
KR102537530B1 (en) | Method and apparatus for generating acceleration structure | |
CN116958367A (en) | Method for quickly combining and rendering complex nerve scene | |
CN109509246B (en) | Photon map clustering method based on self-adaptive sight division | |
CN116805348A (en) | Resampling global illumination method based on sample space filtering | |
US20140267357A1 (en) | Adaptive importance sampling for point-based global illumination | |
CN116740255A (en) | Rendering processing method, device, equipment and medium | |
CN116194960A (en) | Direct volume rendering device | |
Colom et al. | 3D shape reconstruction from non-realistic multiple-view depictions using NVDiffRec | |
CN114529660B (en) | Drawing method and system based on photon storage structure | |
CN106469462A (en) | A progressive estimation method for three-dimensional scene radiance |
CN113096248B (en) | Photon collection method and photon mapping rendering method based on shared video memory optimization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||