CN105869204B - An unbiased photon mapping rendering method in participating media - Google Patents

Publication number: CN105869204B
Application number: CN201610183981.7A
Authority: CN (China)
Prior art keywords: subpath, light source, vertex, viewpoint, photon
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other versions: CN105869204A
Inventors: 侯启明, 任重, 丁珂, 秦昊
Assignee: Zhejiang University (ZJU)
Application filed by Zhejiang University; priority to CN201610183981.7A; published as CN105869204A, granted as CN105869204B.

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/06: Ray-tracing
    • G06T 15/08: Volume rendering


Abstract

The invention discloses an unbiased photon mapping rendering method in participating media. Starting from the unbiased photon gathering method, the invention improves the subpath sampling technique, transforms the theoretical formulas, applies importance sampling to the unbiased term estimation, and treats the cases at media boundaries separately, so that an unbiased rendering result is always obtained in scenes with participating media. The invention also modifies the weights of multiple importance sampling, combining the unbiased photon mapping method in participating media with bidirectional path tracing, so that the advantages of both methods under different scenes can be exploited simultaneously.

Description

An unbiased photon mapping rendering method in participating media
Technical field
The present invention relates to the field of media rendering, and more particularly to an unbiased photon mapping rendering method in participating media.
Background technology
The core concept of Photon Mapping is that local luminous energy is estimated using the density of photon, specifically refers to JENSEN, H.W.2001.Realistic Image Synthesis Using Photon Mapping.A.K.Peters,Ltd., Natick,MA,USA.The locality of photon collection makes this method between challenging caustic and multiple reflections It is especially effective to connect illumination.In addition, Photon Mapping can reuse the light path corresponding to photon, to reduce the cost of sampling.It is close Degree estimation can be effectively reduced noise, but also final result can be given to introduce deviation, therefore photons all before simultaneously Mapping method all has inclined.Although a series of newest progressive Photon Mapping methods can be one when photon numbers be unlimited Cause converge to correctly as a result, specific method can refer to HACHISUKA, T., OGAKI, S., AND JENSEN, H.W.2008.Progressive photon mapping.ACM Trans.Graph.27,5(Dec.),130:1–130:8; KNAUS,C.,AND ZWICKER,M.2011.Progressive photon mapping:A probabilistic approach.ACM Trans.Graph.30,3(May),25:1–25:13;HACHISUKA,T.,AND JENSEN, H.W.2009.Stochastic progressive photon mapping.ACM Trans.Graph.28,5(Dec.), 141:1–141:8;KAPLANYAN,A.S.,AND DACHSBACHER,C.2013.Adaptive progressive photon mapping.ACM Trans.Graph.32,2(Apr.),16:1–16:13.But these methods without exception are arbitrarily having Interior result all has inclined in limited time, and the method invented herein is realized in medium field of drawing, and our method is from beginning To the drawing result that can obtain unbiased eventually.
The rendering equation proposed by Kajiya in 1986 is a mathematical description of the physical laws of light transport. It is a multiple integral and can be solved by Monte Carlo methods; see KAJIYA, J.T. 1986. The rendering equation. SIGGRAPH Comput. Graph. 20, 4 (Aug.), 143-150. Bidirectional path tracing samples complete light paths by tracing an eye subpath and a light subpath separately and connecting the two. It is especially effective for indoor scenes and for light sources and cameras of finite size; see LAFORTUNE, E.P., AND WILLEMS, Y.D. 1993. Bi-directional path tracing. In Proceedings of CompuGraphics, vol. 93, 145-153; VEACH, E., AND GUIBAS, L.J. 1994. Bidirectional estimators for light transport. In Proceedings of the Fifth Eurographics Workshop on Rendering, Eurographics, 147-162.
Multiple importance sampling was later proposed by Veach et al., originally to combine the different connection strategies for light paths within bidirectional path tracing; see VEACH, E. 1998. Robust Monte Carlo Methods for Light Transport Simulation. PhD thesis, Stanford, CA, USA. AAI9837162. Both photon mapping and bidirectional path tracing form complete light paths by connecting an eye subpath with a light subpath. Compared with bidirectional path tracing, photon mapping is more advantageous for sampling the complicated indirect illumination of caustics and multiple reflections, since photons are dense near the ends of the eye paths there; for direct illumination, however, bidirectional path tracing is more efficient. The two methods are therefore complementary in sampling different types of light paths; see HAŠAN, M., KŘIVÁNEK, J., WALTER, B., AND BALA, K. 2009. Virtual spherical lights for many-light rendering of glossy scenes. ACM Trans. Graph. 28, 5 (Dec.), 143:1-143:6; VORBA, J. 2011. Bidirectional photon mapping. In Proc. of the Central European Seminar on Computer Graphics (CESCG '11).
Invention content
In view of the above deficiencies of the prior art, the object of the present invention is to provide an unbiased photon mapping rendering method in participating media. The invention retains the high efficiency of the photon mapping rendering method in media rendering while eliminating the error caused by the approximate estimation in that method. The applicability of the unbiased photon mapping rendering method to media is improved, so that the method can be better applied in media rendering scenes, which has high practical value.
The object of the present invention is achieved through the following technical solution: an unbiased photon mapping rendering method in participating media, comprising the following steps:
Step 1: In a scene with participating media, generate light source subpaths from the light sources in the scene and viewpoint subpaths from the viewpoint position in the scene, as follows:
1.1. Starting from the light source or the viewpoint position, emit a photon into the scene.
1.2. The photon travels along its outgoing direction. When it hits an object in the scene, a new outgoing direction is sampled according to the material defined on the object and the incident angle of the photon, and the photon continues in that direction.
1.3. When the photon enters a medium in the scene, the scattering properties of the medium cause it to scatter after some distance; compute the distance the photon travels in the medium before the next scattering event occurs.
1.4. After the travel distance of the photon in the medium has been computed, the scattering direction must also be sampled; it is obtained from the phase function of the medium and the incident direction.
1.5. Record the whole string of sampled positions as the sample path of the photon, i.e. the light source subpath ȳ = y1y2…ys or the viewpoint subpath z̄ = z1z2…zt, i = 1, 2, …, s, j = 1, 2, …, t; where yi is the i-th subpath vertex on the light source subpath, zj is the j-th subpath vertex on the viewpoint subpath, y1 is the light source position, z1 is the viewpoint position, ys is the end vertex of the light source subpath, zt is the end vertex of the viewpoint subpath, s is the number of subpath vertices on the light source subpath, and t is the number of subpath vertices on the viewpoint subpath.
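As an illustration, the random walk of steps 1.1-1.5 can be sketched as follows. This is a minimal sketch in Python under strong simplifying assumptions (a homogeneous, isotropic, unbounded medium with no surfaces); all names are illustrative, not the patent's implementation.

```python
import math
import random

def sample_subpath(origin, direction, sigma_t, max_vertices, rng):
    """Random-walk sampling of a subpath (steps 1.1-1.5, simplified).
    Returns the list of vertices y1, y2, ..., ys."""
    vertices = [origin]
    pos, d = origin, direction
    for _ in range(max_vertices - 1):
        # Step 1.3: free-flight distance t = -ln(1 - r) / sigma_t
        t = -math.log(1.0 - rng.random()) / sigma_t
        pos = tuple(p + t * w for p, w in zip(pos, d))
        vertices.append(pos)
        # Step 1.4: isotropic phase function -> uniform direction on the sphere
        z = 1.0 - 2.0 * rng.random()
        phi = 2.0 * math.pi * rng.random()
        s = math.sqrt(max(0.0, 1.0 - z * z))
        d = (s * math.cos(phi), s * math.sin(phi), z)
    return vertices

rng = random.Random(7)
path = sample_subpath((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), sigma_t=2.0,
                      max_vertices=5, rng=rng)
```

In a full renderer, step 1.2 (surface interaction) and heterogeneous media would replace the simple exponential distance sampling above.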
Step 2: From the light source subpath ȳ = y1y2…ys and the viewpoint subpath z̄ = z1z2…zt, compute and record the measurement contribution f(ȳ) of the light source subpath, the probability density function p(ȳ) of the light source subpath, the measurement contribution f(z̄) of the viewpoint subpath, and the probability density function p(z̄) of the viewpoint subpath.
The probability density function p(ȳ) of the light source subpath is the joint distribution formed by the conditional probability densities of the string of vertices on the path. First, define the probability density p(yi) of the i-th subpath vertex yi, converting the solid-angle densities below to the area measure. Here p(y1) is the probability density of sampling the point y1 on the light source; pσ(y1→y2) is the solid-angle probability density at y1 of choosing the direction toward the next subpath vertex y2; pσ(yi-1→yi→yi+1) is the solid-angle probability density at yi of the outgoing direction yi→yi+1 given the incident direction yi-1→yi; θi→i+1 is the angle between the surface normal at subpath vertex yi and the direction vector from yi to yi+1; and the subscript σ indicates that the probability density is measured with respect to solid angle.
From the vertex probability densities p(yi), the probability density function of the light source subpath follows as the product p(ȳ) = ∏_{i=1}^{s} p(yi).
The measurement contribution f(ȳ) of the light source subpath is the product of the emitted radiance at the light source, the bidirectional reflectance distribution functions fr(yi-1→yi→yi+1) at the interior subpath vertices, and the geometry terms between consecutive vertices,

G(yi, yi+1) = V(yi, yi+1) · |cos θi→i+1| · |cos θi+1→i| / ||yi - yi+1||²,

where V(yi, yi+1) is the visibility function between light source subpath vertices yi and yi+1, taking 0 when the two vertices are occluded from each other and 1 otherwise.
Similarly, the probability density function of the viewpoint subpath is p(z̄) = ∏_{j=1}^{t} p(zj), and the measurement contribution f(z̄) of the viewpoint subpath is formed in the same way, where G(zj, zj+1) is the geometry term between viewpoint subpath vertices zj and zj+1, V(zj, zj+1) is the visibility function between them, and fs(zj-1→zj→zj+1) is the bidirectional scattering distribution function at the j-th subpath vertex.
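The joint subpath density and the geometry term of Step 2 can be sketched as follows. This is a schematic Python helper with hypothetical scalar inputs (the per-vertex densities and the cosines are assumed to be supplied by the tracer), not the patent's implementation.

```python
from functools import reduce
from operator import mul

def subpath_pdf(vertex_pdfs):
    """Joint probability density of a subpath, Step 2:
    p(ybar) = product of the per-vertex conditional densities p(y_i)."""
    return reduce(mul, vertex_pdfs, 1.0)

def geometry_term(cos_i, cos_j, dist, visible=True):
    """Geometry term between two vertices: G = V * |cos_i| * |cos_j| / d^2,
    where V is the visibility function (0 when occluded, 1 otherwise)."""
    if not visible:
        return 0.0
    return abs(cos_i) * abs(cos_j) / (dist * dist)

p_light = subpath_pdf([0.25, 0.5, 0.8])   # p(y1), p(y2 | y1), p(y3 | y2, y1)
g = geometry_term(0.5, 1.0, 2.0)
```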
Step 3: Connect every sampled light source subpath directly to the viewpoint position z1 and compute the direct illumination I_direct at z1 as the average of the contributions f(ȳk)/p(ȳk) over all light source subpaths, where Ω is the sample space of light source subpaths, p(ȳ) is the probability density function of the light source subpath, N is the number of light source subpaths, and k is the index of a light source subpath.
At the same time, build a spatial-query acceleration structure over the end vertices ys of all light source subpaths y1y2…ys using a balanced search tree.
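The patent uses a balanced search tree for the fixed-radius query over the end vertices ys. As an illustration of the same query, here is a uniform hash grid (a common alternative acceleration structure); the class and its interface are hypothetical.

```python
import math
from collections import defaultdict

class PhotonGrid:
    """Fixed-radius neighbor search over light-subpath end vertices y_s,
    using a uniform hash grid with cell size equal to the gather radius r.
    A simple stand-in for the balanced search tree of Step 3."""
    def __init__(self, points, r):
        self.r = r
        self.cells = defaultdict(list)
        for idx, p in enumerate(points):
            self.cells[self._cell(p)].append((idx, p))

    def _cell(self, p):
        return tuple(int(math.floor(c / self.r)) for c in p)

    def query(self, q):
        """Indices of all stored points within distance r of q."""
        cx, cy, cz = self._cell(q)
        hits = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    for idx, p in self.cells.get((cx + dx, cy + dy, cz + dz), ()):
                        if sum((a - b) ** 2 for a, b in zip(p, q)) <= self.r ** 2:
                            hits.append(idx)
        return sorted(hits)

pts = [(0.0, 0.0, 0.0), (0.4, 0.0, 0.0), (2.0, 0.0, 0.0)]
grid = PhotonGrid(pts, r=0.5)
near = grid.query((0.1, 0.0, 0.0))  # end vertices within r of the query point
```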
Step 4: Connect every obtained light source subpath y1y2…ys with every viewpoint subpath z1z2…zt, i.e. join the light source subpath end vertex ys to the viewpoint subpath end vertex zt, compute the radiance of the connected path y1y2…yszt…z2z1, and accumulate it onto the pixel corresponding to the viewpoint position z1:

I_VC(ȳ, z̄) = C(ȳ) · c_VC(ys, zt) · C(z̄),

where the subscript VC denotes the vertex connection operation, C(ȳ) = f(ȳ)/p(ȳ) is the unfolded contribution of the light source subpath, C(z̄) = f(z̄)/p(z̄) is the unfolded contribution of the viewpoint subpath, and

c_VC(ys, zt) = fr(ys-1→ys→zt) · G(ys, zt) · fr(zt-1→zt→ys)

is the measurement contribution at the junction of the light source subpath end vertex ys and the viewpoint subpath end vertex zt. Here fr(ys-1→ys→zt) is the bidirectional reflectance distribution function at the light source subpath end vertex ys with incident direction ys-1→ys and outgoing direction ys→zt, fr(zt-1→zt→ys) is the bidirectional reflectance distribution function at the viewpoint subpath end vertex zt with incident direction zt-1→zt and outgoing direction zt→ys, and G(ys, zt) is the geometry term between ys and zt.
Step 5: For the end vertex zt of every sampled viewpoint subpath z1z2…zt, query the balanced search tree of Step 3 for all light source subpath end vertices ys whose distance to zt is less than r; each ys identifies its light source subpath y1y2…ys. Merge the viewpoint subpath z1z2…zt with the light source subpath y1y2…ys at their end vertices, compute the radiance of the merged path y1y2…yszt-1…z2z1, and accumulate it onto the pixel corresponding to the viewpoint position z1:

I_VM(ȳ, z̄) = C(ȳ) · c_VM(ys, zt) · C(z̄),

where the subscript VM denotes the vertex merging operation, C(ȳ) = f(ȳ)/p(ȳ) and C(z̄) = f(z̄)/p(z̄) are the unfolded contributions of the light source and viewpoint subpaths, and c_VM(ys, zt) = f_VM(ys, zt)/p_VM(ys, zt) is the intermediate variable at the merge of the end vertices ys and zt. Here f_VM(ys, zt) is the measurement contribution at the merge,

f_VM(ys, zt) = fs(ys-1→ys→zt-1) · G(ys, zt-1) · fs(zt-2→zt-1→ys),

where fs(ys-1→ys→zt-1) is the bidirectional scattering distribution function at subpath vertex ys with incident direction ys-1→ys and outgoing direction ys→zt-1, fs(zt-2→zt-1→ys) is that at subpath vertex zt-1 with incident direction zt-2→zt-1 and outgoing direction zt-1→ys, and G(ys, zt-1) is the geometry term between subpath vertices ys and zt-1. The probability density p_VM(ys, zt) at the merge is

p_VM(ys, zt) = ∫_{S(ys, r)} p_x(zt-2→zt-1→z) dz,

where r is the photon gathering radius, S(ys, r) is the neighborhood of radius r centered at ys, and p_x(zt-2→zt-1→z) is the probability density of sampling position z from the two preceding subpath vertices zt-2, zt-1. Its reciprocal is estimated without bias as

1/p_VM(ys, zt) = E[N_b],

where N_b is the total number of trial rays emitted repeatedly from the current viewpoint subpath vertex zt-1 toward the light source subpath end vertex ys until a ray first hits the spherical neighborhood S(ys, r), and E[·] denotes expectation.
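The unbiased reciprocal estimate rests on a standard fact: for repeated independent trials with hit probability p, the expected number of trials up to and including the first hit is 1/p. A small seeded simulation illustrating this (the hit probability below is an arbitrary stand-in, not a value from the patent):

```python
import random

def trials_until_first_hit(p, rng):
    """One 'Bernoulli trial' experiment: count trial rays fired until the
    first one lands in the gather neighborhood (hit probability p)."""
    n = 1
    while rng.random() >= p:
        n += 1
    return n

rng = random.Random(42)
p = 0.05                       # stand-in for the integral p_VM over S(y_s, r)
runs = 100_000
mean_n = sum(trials_until_first_hit(p, rng) for _ in range(runs)) / runs
# E[N_b] should approximate 1/p = 20
```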
Step 6: From the radiances I_VC of all vertex-connected paths obtained on a pixel in Step 4 and the radiances I_VM of all vertex-merged paths obtained on that pixel in Step 5, apply the multiple importance sampling technique to obtain the weighted radiance I_VCM of the pixel, completing the unbiased photon mapping rendering:

I_VCM = I_direct + Σ_{k=1}^{n_VC} w_VC,k · I_VC,k + Σ_{k=1}^{n_VM} w_VM,k · I_VM,k,

where the subscript VC denotes the vertex connection operation, VM the vertex merging operation, and VCM the combined vertex connection and merging operation; n_VC is the number of complete paths obtained by all vertex connections on the pixel, n_VM is the number of complete paths obtained by all vertex mergings on the pixel, k is the index of a path, s is the number of light source subpath vertices contained in a path, and t is the number of viewpoint subpath vertices contained in it. I_VC,k is the radiance of the complete path obtained by the k-th vertex connection and w_VC,k is its multiple importance weight; I_VM,k is the radiance of the complete path obtained by the k-th vertex merging and w_VM,k is its multiple importance weight. I_direct is the direct illumination obtained in Step 3.
The multiple importance weights are computed with the balance heuristic over all strategies that can generate the same complete path:

w_VC = p_VC / (Σ p_VC + Σ p_VM),   w_VM = p_VM / (Σ p_VC + Σ p_VM),

where p_VC is the probability density function of the complete path obtained by connecting the light source and viewpoint subpath end vertices, p_VM is the probability density function of the complete path obtained by merging them, and the sums run over all connection and merging strategies that produce the path. The subpath probability densities p(ȳ) and p(z̄) are given by Step 2, and the probability density p_VM(ys, zt) at a vertex merge is given by Step 5.
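The balance-heuristic combination of Step 6 can be sketched as follows; the strategy densities passed in are hypothetical placeholders for the p_VC and p_VM values of one complete path.

```python
def balance_weights(pdfs):
    """Balance-heuristic multiple importance weights, Step 6: the weight of
    strategy i for a given complete path is p_i / sum_j p_j, where j runs
    over all strategies (vertex connections and vertex mergings) that could
    have produced that path."""
    total = sum(pdfs)
    return [p / total for p in pdfs]

# hypothetical densities of one full path under three strategies
w = balance_weights([0.2, 0.3, 0.5])
```

Because the weights of all strategies sum to one for every path, the combined estimator remains unbiased whenever each individual strategy is unbiased.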
The beneficial effects of the invention are: while retaining the high efficiency of photon mapping in media rendering, the invention eliminates the biased approximation term of conventional photon mapping, making the final rendering result exact. It resolves the applicability problems in different scenes inside and outside media, and, in contrast to conventional photon mapping, the method of the invention is not affected by the photon gathering radius. This is embodied in the following four aspects:
1. Distance sampling of subpaths in media scenes is performed with Woodcock tracking, and directions are sampled with importance sampling of the medium's phase function, forming a complete light path.
2. Combining the optical characteristics of the medium, the corresponding terms in the radiance formula of complete paths in unbiased photon mapping are transformed.
3. The Bernoulli trials are restricted in sampling angle and sampling distance, improving the efficiency of the Bernoulli sampling and reducing the cost of the unbiased probability computation as far as possible.
4. The different integral forms inside and outside media boundaries are treated case by case, and the multiple importance weight formula with participating media is rewritten, unifying the method so that it applies to scenes with and without media, as well as scenes in which participating media occupy only part of the space.
Description of the drawings
Fig. 1 is a schematic diagram of the restrictions on the sampling angle and the sampling advance direction of the Bernoulli trials in media scenes; the left side shows the restriction on the sampled azimuth, and the right side shows the restriction on the sampled forward distance.
Fig. 2 is a schematic diagram of the virtual boundary points on a sample path; the left side is a rendering result of a scene with participating media, and the right side is a diagram of that scene.
Fig. 3 compares the rendering method of the invention with other state-of-the-art rendering methods; (a) is the result of bidirectional path tracing, (b) is the result of photon mapping, (c) is the result of the method of the invention, and (d) is the reference image.
Fig. 4 compares the method of the invention with traditional photon mapping under different gathering radii; (a)-(d) are results of the method of the invention under different gathering radii, and (e)-(h) are results of traditional photon mapping; r = 3.0 in (a) and (e), r = 1.5 in (b) and (f), r = 0.5 in (c) and (g), and r = 0.2 in (d) and (h).
Specific embodiments
The unbiased photon mapping rendering method in participating media of the invention is implemented as follows:
Step 1: Generate light source subpaths from the light sources and viewpoint subpaths from the viewpoint position, following steps 1.1-1.5 as described above. The direction sampling of step 1.2 can follow PHARR M, HUMPHREYS G. Physically Based Rendering: From Theory to Implementation [M]. [S.l.]: Morgan Kaufmann, 2004. The distance sampling of step 1.3 uses WOODCOCK E, MURPHY T, HEMMINGS P, et al. Techniques used in the GEM code for Monte Carlo neutronics calculations in reactors and other systems of complex geometry [C] // Proc. Conf. Applications of Computing Methods to Reactor Problems: Vol 557. 1965.
Step 2: Compute and record the measurement contributions f(ȳ), f(z̄) and the probability density functions p(ȳ), p(z̄) of the light source and viewpoint subpaths as described above; the definition of the bidirectional reflectance distribution function can likewise be found in PHARR and HUMPHREYS (2004).
Step 3: Connect all light source subpaths directly to the viewpoint position z1 to compute the direct illumination, and build the balanced-search-tree acceleration structure over the light source subpath end vertices ys, as described above.
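The Woodcock tracking cited for the distance sampling of step 1.3 can be sketched as follows: tentative collisions are drawn from a homogeneous majorant extinction and accepted with probability σt(x)/σmax. This is an illustrative sketch (the extinction field and all names are hypothetical), not the patent's code.

```python
import math
import random

def woodcock_track(x0, direction, sigma_t_of, sigma_max, rng, t_max=1e9):
    """Woodcock (delta) tracking: sample a free-flight distance in a
    heterogeneous medium whose extinction sigma_t_of(x) is bounded above by
    sigma_max. Null collisions are rejected with probability
    1 - sigma_t(x) / sigma_max."""
    t = 0.0
    while True:
        t += -math.log(1.0 - rng.random()) / sigma_max
        if t >= t_max:
            return t_max                      # photon left the medium
        x = tuple(p + t * w for p, w in zip(x0, direction))
        if rng.random() < sigma_t_of(x) / sigma_max:
            return t                          # real scattering event

rng = random.Random(1)
# Homogeneous sanity check: with sigma_t == sigma_max == 2, the mean
# free path should approach 1 / sigma_t = 0.5.
dists = [woodcock_track((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
                        lambda x: 2.0, 2.0, rng) for _ in range(100_000)]
mean_free_path = sum(dists) / len(dists)
```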
Step 4 (vertex connection, VC): Connect the light source subpaths and viewpoint subpaths at their end vertices as described above, and accumulate the radiance of each connected path y1y2…yszt…z2z1 onto the pixel corresponding to the viewpoint position z1.
In scenes with participating media we additionally have the following. At a vertex xi inside a medium, the bidirectional reflectance distribution function is replaced by the phase function of the medium, with incident direction from xi-1 to xi and outgoing direction from xi to xi+1; when the scene contains only isotropic media, the phase function is the constant 1/(4π). The visibility function in the geometry-term formula of Step 2 is replaced by the transmittance exp(-τ(xi, xi+1)), where τ(xi, xi+1) is the optical thickness of the medium from the current vertex xi to the next vertex xi+1:

τ(xi, xi+1) = ∫_0^d σt(xi + t·ω) dt,

where σt is the extinction coefficient of the medium, d is the distance between vertices xi and xi+1, ω is the unit vector pointing from xi to xi+1, and t is the distance travelled from xi.
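A numerical sketch of the optical thickness and the resulting transmittance, checking the homogeneous special case τ = σt·d (the midpoint-rule helper is illustrative, not from the patent):

```python
import math

def optical_thickness(sigma_t_of, x, omega, d, n=1000):
    """Optical thickness tau(x, x + d*omega) = integral of sigma_t along the
    segment, approximated with the midpoint rule over n sub-intervals."""
    h = d / n
    tau = 0.0
    for k in range(n):
        t = (k + 0.5) * h
        p = tuple(c + t * w for c, w in zip(x, omega))
        tau += sigma_t_of(p) * h
    return tau

# Homogeneous medium: tau = sigma_t * d, transmittance exp(-tau).
tau = optical_thickness(lambda p: 2.0, (0.0, 0.0, 0.0), (0.0, 0.0, 1.0), d=1.5)
transmittance = math.exp(-tau)
```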
Step 5 (unbiased vertex merging, VM): For the end vertex zt of every sampled viewpoint subpath z1z2…zt, query the balanced search tree of Step 3 for all light source subpath end vertices ys whose distance to zt is less than r; each ys identifies its light source subpath y1y2…ys. Merge the viewpoint subpath with the light source subpath at the end vertices, compute the radiance of the merged path y1y2…yszt-1…z2z1, and accumulate it onto the pixel corresponding to z1, as described above. The intermediate variable at the merge of the end vertices ys and zt is c_VM(ys, zt) = f_VM(ys, zt)/p_VM(ys, zt), where f_VM is the measurement contribution at the merge and p_VM is the probability density at the merge. To evaluate c_VM without bias, f_VM and 1/p_VM must each be computed without bias:
5.1. Unbiased computation of the f_VM term in media
In traditional photon mapping, the f_VM term does not actually connect the merged light source subpath end vertex ys with the viewpoint subpath vertex zt-1; it is estimated approximately as

f_VM ≈ fr(ys-1→ys(zt)→zt-1) · G(zt-1, zt) · fr(zt-2→zt-1→zt),

where ys(zt) means that the vertex ys stands in for the vertex zt, and fr(ys-1→ys(zt)→zt-1) is the bidirectional reflectance distribution function at vertex ys with incident direction ys-1→ys and outgoing direction zt→zt-1; that is, the BSDF evaluation at ys substitutes the approximate direction zt→zt-1 for ys→zt-1. The subsequent geometry term and bidirectional reflectance distribution function fr(zt-2→zt-1→zt) are likewise computed with approximate arguments for convenience; the formulas are given in Step 2. Instead, we use the strictly unbiased form

f_VM = fs(ys-1→ys→zt-1) · G(ys, zt-1) · fs(zt-2→zt-1→ys),

where fs(ys-1→ys→zt-1) is the bidirectional scattering distribution function at subpath vertex ys with incident direction ys-1→ys and outgoing direction ys→zt-1, fs(zt-2→zt-1→ys) is that at subpath vertex zt-1 with incident direction zt-2→zt-1 and outgoing direction zt-1→ys, and G(ys, zt-1) is the geometry term between subpath vertices ys and zt-1, given in Step 2.
Therefore, to turn traditional photon mapping into an unbiased method, our approach first replaces the original approximate formula with the unbiased formula above.
5.2. Unbiased computation of the 1/p_VM term in media
In traditional photon mapping, the p_VM term uses the traditional photon density estimation with a box filter:

p_VM ≈ π r² · p_x(zt-2→zt-1→zt),

where p_x(zt-2→zt-1→zt) is the probability density of sampling vertex zt from the two preceding vertices zt-2, zt-1, and r is the photon gathering radius. The strict definition, however, should integrate the sampling density p_x over the whole gathering neighborhood of radius r:

p_VM = ∫_{S(ys, r)} p_x(zt-2→zt-1→z) dz,

where S(ys, r) is the neighborhood of radius r centered at ys.
We cannot solve for p_VM directly, but we can estimate the reciprocal of the probability integral without bias:

1/p_VM = E[N_b],

where N_b is the total number of trial rays emitted repeatedly from the current viewpoint subpath vertex zt-1 toward the light source subpath end vertex ys (the direction and forward distance of each trial ray are sampled as in Step 1) until a ray first hits the spherical neighborhood S(ys, r), and E[·] denotes expectation. We call this sampling experiment a Bernoulli trial.
An obvious problem with this solution is that the number of Bernoulli trials is unknown and may be very large. Since we must sample, over the entire scene space, the probability of hitting a neighborhood of radius r centered at ys, and such a probability is evidently very small, the number of Bernoulli trials, and hence the cost of the unbiased estimate, can be very large (for example, 100 samples might be required before termination, whereas the original photon mapping method needs only one approximate computation).
In scenes without media, the sampling range can be reduced by an azimuthal bound, reducing the total number of Bernoulli trials, as shown on the left of Fig. 1. In scenes with media, however, photons are sampled throughout the volume of the medium (not only on surfaces), so the azimuthal restriction alone cannot effectively reduce the total number of Bernoulli trials. We therefore propose to bound the trials jointly by an azimuthal bound and a forward-distance bound, so that the trial rays sample only the spatial region shown on the right of Fig. 1.
We are defined forward travel distance using the theory deduction in Woodcock Tracking simultaneously.For given Equally distributed random number r ∈ [0,1) according to sampled probability density functionForward travel distance, which can be obtained, is:
Wherein σtFor the extinction coefficient of medium.Later we can write out apart from boundary R and corresponding random number range rboundBetween relationship:
Therefore we generate random number r progress scope limitations using by initial distance, make The forward travel distance of test light can be allowed to be limited in range [R1,R2] in, to be effectively improved our method in medium To the unbiased esti-mator efficiency of probability integral inverse in scene.
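Assuming a homogeneous medium with extinction coefficient σ_t, the Woodcock-tracking relations above — travel distance as a function of the random number, and the random-number value corresponding to a distance bound R — can be sketched as follows; restricting the uniform random number to the corresponding sub-interval confines every sampled distance to [R_1, R_2] (function names are illustrative):

```python
import math
import random

def distance_from_random(r, sigma_t):
    """Woodcock-style free-flight distance for random number r in [0, 1):
    d(r) = -ln(1 - r) / sigma_t."""
    return -math.log(1.0 - r) / sigma_t

def random_bound(R, sigma_t):
    """Random-number value corresponding to distance bound R:
    r_bound = 1 - exp(-sigma_t * R)."""
    return 1.0 - math.exp(-sigma_t * R)

def sample_bounded_distance(R1, R2, sigma_t, rng):
    """Draw a travel distance guaranteed to lie in [R1, R2] by
    restricting the range of the underlying uniform random number."""
    r1, r2 = random_bound(R1, sigma_t), random_bound(R2, sigma_t)
    u = rng.uniform(r1, r2)
    return distance_from_random(u, sigma_t)

rng = random.Random(0)
samples = [sample_bounded_distance(0.5, 2.0, 1.5, rng) for _ in range(10_000)]
```

Because d(r) is monotone in r, mapping the distance bounds through r_bound and sampling r uniformly in that sub-interval preserves the exponential free-flight distribution within [R_1, R_2].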
In our implementation, R_1 and R_2 are obtained by subtracting and adding the collection radius r to the distance between the start vertex and the target vertex. In addition, we must distinguish several cases for the two connected vertices, depending on whether each is a surface vertex or a medium vertex:
1) when the start vertex is a surface vertex, the azimuthal constraint is derived from the bidirectional reflectance distribution function of the material; otherwise, the phase function of the medium is used;
2) when the target vertex is a surface vertex, only the lower distance bound R_1 needs to be enforced: any sample whose distance exceeds R_2 simply lands on the surface, so the upper bound R_2 is dropped, which corresponds to letting the upper bound of the random number r equal 1.
With this importance-sampling acceleration in place, the reciprocal of the probability density function is computed as follows:
Where N_b' is the total number of Bernoulli trials up to the first success after the importance-sampling acceleration, P(Ω_b') is the probability mass confined to the bounded distance range by our importance sampling, σ_t(z) is the extinction coefficient of the medium at position z, and Ω_b' is the position space within the distance bounds. Since the sampling distance has been restricted as above, P(Ω_b') is the analytic form of the probability term after distance-limited sampling; in a homogeneous medium it becomes:
Therefore, with the above azimuthal and travel-distance bounds in place, the expected number of Bernoulli trials drops sharply, greatly improving the efficiency of the unbiased estimate of the reciprocal probability integral.
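The effect of the bounds on the expected trial count can be illustrated numerically: restricting the test rays to a bounded region holding a fraction P(Ω_b) of the sampling mass raises the per-trial hit probability from p to p / P(Ω_b), so the expected number of trials shrinks by that factor, and multiplying the trial count back by P(Ω_b) keeps the reciprocal estimate unbiased. A sketch under these assumptions (illustrative numbers, not from the patent):

```python
import random

def expected_trials(p_hit, p_region, num_runs=100_000, seed=0):
    """Average number of bounded Bernoulli trials until first success,
    where restricting sampling to the bounded region boosts the
    per-trial hit probability from p_hit to p_hit / p_region."""
    rng = random.Random(seed)
    p_restricted = p_hit / p_region
    total = 0
    for _ in range(num_runs):
        n = 1
        while rng.random() >= p_restricted:
            n += 1
        total += n
    return total / num_runs

# Unrestricted, p = 0.01 would need ~100 trials on average.
# Restricted to a region holding 5% of the sampling mass: ~5 trials.
mean_restricted = expected_trials(p_hit=0.01, p_region=0.05)
# The unbiased reciprocal estimate multiplies back by p_region:
recip_estimate = mean_restricted / 0.05
```

Here `recip_estimate` still targets 1/p = 100, but each estimate now costs roughly 5 trials instead of 100, mirroring the speedup claimed for the bounded sampling.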
Step 6: once Step 4 has produced the radiance of every vertex-connection path accumulated at each pixel, and Step 5 has produced the radiance of every vertex-merging path accumulated at each pixel, we apply multiple importance sampling (MIS) to obtain the weighted radiance I_VCM at each pixel, completing the rendering pipeline of the entire unbiased photon mapping method:
Where subscript VC denotes the vertex-connection operation (Vertex Connection), subscript VM denotes the vertex-merging operation (Vertex Merging), and subscript VCM denotes the combined vertex connection and merging operation (Vertex Connection and Merging); n_VC is the number of full paths obtained by all vertex-connection operations at the pixel, and n_VM is the number of full paths obtained by all vertex-merging operations at the pixel; subscript k is the ordinal of the path, subscript s is the number of light-source-subpath vertices contained in the path, and subscript t is the number of viewpoint-subpath vertices contained in the path. The first summand is the radiance of the full path obtained by the k-th vertex-connection operation times the corresponding multiple-importance weight of that path; the second is the radiance of the full path obtained by the k-th vertex-merging operation times the corresponding multiple-importance weight of that path; I_direct is the direct illumination obtained in Step 3.
The multiple-importance weights are calculated as follows:
The probability density function of the full path obtained by connecting the end vertices of the light-source subpath and the viewpoint subpath is:
The probability density function of the full path obtained by merging the end vertices of the light-source subpath and the viewpoint subpath is:
Where the probability density function of a subpath is computed by the formula given in Step 2, and the probability density function at a vertex merge is computed by the formula given in Step 5.
Notably, the introduction of the medium requires certain adjustments to the multiple-importance-sampling weight formulas. To keep the full-path formulas consistent inside and outside the medium — so that the estimated contribution and probability density can be used directly, without distinguishing whether a ray lies inside or outside the medium — we insert virtual boundary points as markers. When computing the multiple-importance-sampling weights, attention must then be paid to the change of the vertex connection/merging weight ratio η_VCM. On surfaces, η_VCM takes the form:
Whereas in the case of participating media we use the following form:
Where n_VC denotes the total number of light paths from vertex-connection operations, n_VM denotes the total number of light paths from vertex-merging operations, and r is the merging neighborhood radius of a vertex. We can thus realize a multiple-importance-sampling scheme over both strategies, vertex connection and vertex merging, in scenes with media.
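As a rough illustration of the Step 6 weighting, the sketch below computes balance-heuristic-style multiple-importance weights for one full path sampled either by vertex connection or by vertex merging, with the merging pdf scaled by a kernel-measure factor playing the role of η_VCM. All names and numbers here are illustrative assumptions, not the patent's exact formulas:

```python
def mis_weight(pdf_self, n_self, pdf_others, counts_others):
    """Balance-heuristic multiple-importance-sampling weight:
    w = n_self * pdf_self / (n_self * pdf_self + sum_i n_i * pdf_i)."""
    numerator = n_self * pdf_self
    denominator = numerator + sum(n * p for n, p in zip(counts_others, pdf_others))
    return numerator / denominator

# Illustrative pdfs for one full path sampled by vertex connection (VC)
# and by vertex merging (VM); eta_vcm folds the kernel measure into
# the merging pdf (e.g. proportional to pi * r**2 on surfaces).
n_vc, n_vm = 1, 10_000       # paths per technique
pdf_vc = 0.3                 # pdf of producing this path by connection
eta_vcm = 1e-4               # assumed kernel-measure factor
pdf_vm = pdf_vc * eta_vcm    # merging pdf = connection pdf * kernel measure

w_vc = mis_weight(pdf_vc, n_vc, [pdf_vm], [n_vm])
w_vm = mis_weight(pdf_vm, n_vm, [pdf_vc], [n_vc])
```

The two weights for the same full path sum to one, so combining the weighted VC and VM contributions keeps the overall estimator consistent regardless of which strategy dominates.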
Embodiment
The inventors implemented the method described above on a computer equipped with a dual-core Intel i5 3.00 GHz central processing unit and 16 GB of memory, implemented the other rendering methods most closely related to ours, and generated the rendering results shown in the figures.
The experimental results (Fig. 3) show that our improved combined method clearly outperforms both traditional bidirectional path tracing and photon mapping when rendering media. Compared with bidirectional path tracing, our method converges markedly faster; compared with conventional photon mapping, our method keeps its efficiency while offering the clear advantage of unbiased results. Fig. 4 then examines the photon-collection radius: relative to traditional photon mapping, the collection radius has a significantly smaller influence on the rendering performance of our method in media.

Claims (1)

1. An unbiased photon mapping rendering method in participating media, characterized by comprising the following steps:
Step 1: in a scene containing participating media, generate light-source subpaths from the light sources in the scene and viewpoint subpaths from the viewpoint position in the scene; the specific generation procedure is as follows:
1.1, starting from the light source or the viewpoint position, emit a photon into the scene;
1.2, the photon travels along its outgoing direction; when it strikes an object in the scene, a new outgoing direction is sampled from the material defined on the object and the incidence angle of the photon, and the photon continues in that direction;
1.3, when the photon enters a medium in the scene, the scattering properties of the medium cause it to scatter after some distance; compute the distance the photon travels in the medium before the next scattering event occurs;
1.4, after the photon's travel distance in the medium has been computed, the scattering direction after travel must also be sampled; this direction is obtained from the phase-function properties of the medium and the incident direction;
1.5, record the whole sequence of sampled photon positions as the sample path of the photon, i.e. a light-source subpath y_1 y_2 … y_s or a viewpoint subpath z_1 z_2 … z_t, where y_i denotes the i-th subpath vertex on the light-source subpath, z_j denotes the j-th subpath vertex on the viewpoint subpath, y_1 is the light-source position, z_1 is the viewpoint position, y_s is the end vertex of the light-source subpath, z_t is the end vertex of the viewpoint subpath, s is the number of subpath vertices on the light-source subpath, and t is the number of subpath vertices on the viewpoint subpath;
Step 2: from the light-source subpath y_1 y_2 … y_s and the viewpoint subpath z_1 z_2 … z_t, compute and record the measurement contribution value of the light-source subpath, the probability density function of the light-source subpath, the measurement contribution value of the viewpoint subpath, and the probability density function of the viewpoint subpath;
the probability density function of the light-source subpath is the joint distribution formed by the conditional probability density functions of the string of vertices on the path; first, define the probability density function of the i-th subpath vertex y_i:
and the intermediate variable:
Where p(y_1) denotes the probability of sampling point y_1 on the light source, p_σ(y_0 → y_1) denotes the solid-angle probability density at y_1 when pointing from the light-source position y_0 toward the next subpath vertex y_1, p_σ(y_{i-1} → y_i → y_{i+1}) denotes the solid-angle probability density at y_i when the incident direction is y_{i-1} → y_i and the outgoing direction is y_i → y_{i+1}, θ_{i→i+1} is the angle between the surface normal at subpath vertex y_i and the direction vector y_i → y_{i+1}, and the subscript σ indicates that the probability density function is measured with respect to solid angle;
from the probability density functions p(y_i) of the subpath vertices y_i, the probability density function of the light-source subpath can be obtained:
the measurement contribution value of the light-source subpath:
Where G(y_i ↔ y_{i+1}) is the geometry term between light-source-subpath vertices y_i and y_{i+1}; V(y_i ↔ y_{i+1}) denotes the visibility function between light-source-subpath vertices y_i and y_{i+1}, taking 0 when occluded and 1 otherwise; f_r(y_{i-1} → y_i → y_{i+1}) denotes the bidirectional reflectance distribution function at the i-th subpath vertex;
similarly, the probability density function of the viewpoint subpath:
the measurement contribution value of the viewpoint subpath:
Where G(z_j ↔ z_{j+1}) is the geometry term between viewpoint-subpath vertices z_j and z_{j+1}; V(z_j ↔ z_{j+1}) denotes the visibility function between viewpoint-subpath vertices z_j and z_{j+1}; f_s(z_{j-1} → z_j → z_{j+1}) denotes the bidirectional reflectance distribution function at the j-th subpath vertex;
Step 3: track all sampled light-source subpaths and connect each directly to the viewpoint position z_1, computing the direct illumination at z_1:
Where Ω is the sample space of light-source subpaths, the denominator is the probability distribution function of a light-source subpath, N is the number of light-source subpaths, and k is the ordinal of a light-source subpath;
at the same time, for the end vertices y_s of all light-source subpaths y_1 y_2 … y_s, build a spatial-query acceleration structure using a balanced search tree;
Step 4: perform the connection operation on every obtained light-source subpath y_1 y_2 … y_s and viewpoint subpath z_1 z_2 … z_t, i.e. connect the light-source-subpath end vertex y_s to the viewpoint-subpath end vertex z_t, compute the radiance of the connection path y_1 y_2 … y_s z_t … z_2 z_1, and accumulate it onto the pixel corresponding to the viewpoint position z_1; the radiance of this connection path is:
Where subscript VC denotes the vertex-connection operation; the expansion term of the light-source subpath is:
the expansion term of the viewpoint subpath is:
the measurement contribution value at the junction of the light-source-subpath end vertex y_s and the viewpoint-subpath end vertex z_t is:
f_r(y_{s-1} → y_s → z_t) denotes the bidirectional reflectance distribution function at light-source-subpath vertex y_s when the incident direction is y_{s-1} → y_s and the outgoing direction is y_s → z_t; f_r(z_{t-1} → z_t → y_s) denotes the bidirectional reflectance distribution function at viewpoint-subpath vertex z_t when the incident direction is z_{t-1} → z_t and the outgoing direction is z_t → y_s; G(y_s ↔ z_t) denotes the geometry term between the light-source-subpath end vertex y_s and the viewpoint-subpath end vertex z_t;
Step 5: for the end vertex z_t of every sampled viewpoint subpath z_1 z_2 … z_t, use the balanced search tree from Step 3 to find all light-source-subpath end vertices y_s whose distance to z_t is less than r; from each y_s the corresponding light-source subpath y_1 y_2 … y_s is obtained; then merge the viewpoint subpath z_1 z_2 … z_t with the light-source subpath y_1 y_2 … y_s at their end vertices, compute the radiance of the merged path, and accumulate it onto the pixel corresponding to the viewpoint position z_1; the radiance of the merged path y_1 y_2 … y_s z_{t-1} … z_2 z_1 is:
Where subscript VM denotes the vertex-merging operation; the expansion term of the light-source subpath is:
the expansion term of the viewpoint subpath is:
the intermediate variable at the merge of the light-source-subpath end vertex y_s and the viewpoint-subpath end vertex z_t is:
Where the first factor is the measurement contribution value at the merge of the light-source-subpath end vertex y_s and the viewpoint-subpath end vertex z_t, and the second is the probability density function at the merge of y_s and z_t, calculated as follows:
Here f_s(y_{s-1} → y_s → z_{t-1}) denotes the bidirectional reflectance distribution function at subpath vertex y_s when the incident direction is y_{s-1} → y_s and the outgoing direction is y_s → z_{t-1}; similarly, f_s(z_{t-2} → z_{t-1} → y_s) denotes the bidirectional reflectance distribution function at subpath vertex z_{t-1} when the incident direction is z_{t-2} → z_{t-1} and the outgoing direction is z_{t-1} → y_s; G(y_s ↔ z_{t-1}) denotes the geometry term between subpath vertices y_s and z_{t-1};
Where r is the photon-collection radius, S(y_s, r) is the neighborhood set centered at y_s with radius r, and p_x(z_{t-2} → z_{t-1} → z) is the probability density of obtaining position z by sampling from the two preceding subpath vertices z_{t-2}, z_{t-1}:
Where N_b is the total number of test rays obtained by repeatedly emitting sample rays from the current viewpoint-subpath vertex z_{t-1} toward the light-source-subpath end vertex y_s until a ray first hits the spherical neighborhood S(y_s, r), and E[·] denotes expectation;
Step 6: from the radiance of all vertex-connection paths at each pixel obtained in Step 4 and the radiance of all vertex-merging paths at each pixel obtained in Step 5, apply the multiple-importance-sampling technique to obtain the weighted radiance I_VCM at each pixel, completing the rendering of the entire unbiased photon mapping method:
Where subscript VC denotes the vertex-connection operation, subscript VM denotes the vertex-merging operation, and subscript VCM denotes the combined vertex connection and merging operation; n_VC is the number of full paths obtained by all vertex-connection operations at the pixel, n_VM is the number of full paths obtained by all vertex-merging operations at the pixel, subscript k is the ordinal of the path, subscript s is the number of light-source-subpath vertices contained in the path, and subscript t is the number of viewpoint-subpath vertices contained in the path; the first summand is the radiance of the full path obtained by the k-th vertex-connection operation times the corresponding multiple-importance weight of that path; the second is the radiance of the full path obtained by the k-th vertex-merging operation times the corresponding multiple-importance weight of that path; I_direct is the direct illumination obtained in Step 3;
the multiple-importance weights are calculated as follows:
the probability density function of the full path obtained by connecting the end vertices of the light-source subpath and the viewpoint subpath is:
the probability density function of the full path obtained by merging the end vertices of the light-source subpath and the viewpoint subpath is:
Where the probability density function of a subpath is computed by the formula given in Step 2, and the probability density function at a vertex merge is computed by the formula given in Step 5.
CN201610183981.7A 2016-03-28 2016-03-28 A kind of unbiased Photon Mapping method for drafting in participating media Active CN105869204B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610183981.7A CN105869204B (en) 2016-03-28 2016-03-28 A kind of unbiased Photon Mapping method for drafting in participating media


Publications (2)

Publication Number Publication Date
CN105869204A CN105869204A (en) 2016-08-17
CN105869204B (en) 2018-07-17


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104200509A (en) * 2014-08-19 2014-12-10 山东大学 Photon mapping accelerating method based on point cache
CN105118083A (en) * 2015-08-11 2015-12-02 浙江大学 Unbiased photon mapping drawing method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9041721B2 (en) * 2012-02-13 2015-05-26 Nvidia Corporation System, method, and computer program product for evaluating an integral utilizing a low discrepancy sequence and a block size
KR102110819B1 (en) * 2013-05-08 2020-05-15 삼성전자주식회사 Image processing apparatus and method




Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant