EP2589025A2 - Verfahren zur schätzung der lichtstreuung - Google Patents

Verfahren zur schätzung der lichtstreuung

Info

Publication number
EP2589025A2
Authority
EP
European Patent Office
Prior art keywords
medium
light
heterogeneous
projection
coefficients
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP11729394.4A
Other languages
English (en)
French (fr)
Inventor
Cyril Delalandre
Pascal Gautron
Jean-Eudes Marvie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Thomson Licensing SAS
Original Assignee
Thomson Licensing SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thomson Licensing SAS filed Critical Thomson Licensing SAS
Priority to EP11729394.4A priority Critical patent/EP2589025A2/de
Publication of EP2589025A2 publication Critical patent/EP2589025A2/de
Withdrawn legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/50: Lighting effects
    • G06T 15/506: Illumination models

Definitions

  • the invention relates to the field of synthetic image composition and more particularly to the field of simulating the scattering of light in a heterogeneous participating medium.
  • the invention also falls within the context of special effects for live, real-time compositing.
  • participating media are media composed of suspended particles that interact with light, modifying in particular its path and intensity.
  • participating media can be divided into two categories: homogeneous media, such as water, and heterogeneous media, such as smoke or clouds.
  • for homogeneous participating media, it is possible to calculate the attenuation of the light emitted by a light source analytically. Indeed, owing to their homogeneous nature, these media have parameters such as the light absorption coefficient or the light scattering coefficient that are constant at every point of the medium.
  • in contrast, the absorption and scattering properties of light vary from one point to another in a heterogeneous participating medium. The calculations needed to simulate the scattering of light in such a heterogeneous medium are then very costly, and it is thus not possible to calculate the amount of light scattered by a heterogeneous participating medium analytically and in real time.
  • the quantity of light scattered by the medium also varies with the scattering direction of the light, that is to say, with the direction from which a person observes the medium. The calculations estimating the amount of scattered light must therefore be repeated for each direction from which the medium may be observed, in order to obtain a realistic rendering of the medium.
  • some methods perform the pre-calculation of certain parameters representative of the heterogeneous participating medium. While these methods are ideally suited for use in a post-production studio, for example, and provide good quality rendering, these methods are not suitable in the context of interactive design and real-time rendering of a heterogeneous participating medium.
  • such a method is described, for example, in patent application WO2009/003143, filed by Microsoft Corporation and published on December 31, 2008.
  • WO2009/003143 has as its object real-time software rendering of a heterogeneous medium and describes a solution using radial basis functions.
  • this solution cannot, however, be considered a true real-time rendering solution, since certain pre-processing steps must be applied offline to the participating medium in order to calculate the projection coefficients representing the medium that are then used in the real-time image synthesis calculations.
  • the invention aims to overcome at least one of these disadvantages of the prior art.
  • the invention particularly aims to optimize the computation time required to compose a realistic real-time rendering of the light scattering in a heterogeneous participating medium.
  • the invention relates to a method for estimating the amount of light scattered by a heterogeneous participating medium, the method comprising the steps of:
  • estimating projection coefficients in a function basis from density values representative of a set of elements of the heterogeneous participating medium located along at least one direction of light emission by a light source, and estimating the amount of light scattered by the heterogeneous participating medium, along at least one light scattering direction, from the estimated projection coefficients.
  • the elements of the heterogeneous participating medium are points or particles.
  • the estimation of the projection coefficients is independent of the wavelength of the light emitted by the light source.
  • the projection coefficients are estimated by taking into account a predetermined scale factor α.
  • the projection coefficients are estimated using a ray sampling method, the heterogeneous participating medium being composed of points.
  • the projection coefficients are estimated using a particle addition method, the heterogeneous participating medium being composed of particles.
  • the method comprises a step of estimating values representative of the reduction in luminous intensity from the estimated projection coefficients.
  • the estimation of the quantity of light scattered by said medium is carried out by discretizing the heterogeneous participating medium along the at least one scattering direction.
  • the estimation of the amount of light scattered by the heterogeneous participating medium is carried out using a ray sampling method.
  • the estimation of the amount of light scattered by the heterogeneous participating medium is carried out using a particle addition method.
  • the projection coefficients are stored in a projection texture.
  • FIG. 1 schematically illustrates a heterogeneous, light-scattering participating medium, according to a particular embodiment of the invention
  • FIG. 2 schematically illustrates a method for estimating the attenuation of light in a medium of FIG. 1, according to a particular embodiment of the invention
  • FIG. 3 schematically illustrates a method for estimating the quantity of light diffused by a medium of FIG. 1, according to one particular embodiment of the invention
  • FIG. 4 illustrates a device implementing a method for estimating the quantity of scattered light, according to an example of a particular implementation of the invention
  • FIG. 5 illustrates a method for estimating the quantity of scattered light, according to one particular embodiment of the invention.
  • FIG. 1 illustrates a heterogeneous participating medium 10, for example a cloud.
  • a participating medium is a medium, composed of a multitude of suspended particles, which absorbs, emits and/or scatters light.
  • a participating medium may only absorb light, for example the light received from a light source 11 such as the sun. This means that light passing through the medium 10 is attenuated, the attenuation depending on the density of the medium.
  • the medium is heterogeneous, that is to say that the physical characteristics of the medium, such as the density of the particles composing it, vary from one point to another in the medium.
  • the participating medium is composed of small particles that interact with the light.
  • the incident light, that is to say the light received from the light source 11 along a direction ω_in 110, is not only absorbed but also scattered.
  • in an isotropic scattering participating medium, the light is scattered uniformly in all directions.
  • in an anisotropic scattering participating medium, such as the cloud 10 illustrated in FIG. 1, the scattering of the light depends on the angle between the incidence direction ω_in 110 and the scattering direction ω_out 120 of the light.
  • the amount of light scattered at an element M 13 of the medium 10 (an element being equated with a point, or with a particle defined by a centre and a radius of influence, a particle advantageously grouping together a set of points having the same properties) in the scattering direction ω_out 120 is calculated by the following equation (equation 1): Q(M, ω_out) = σ_s · D(M) · p(M, ω_out, ω_in) · L_ri(M, ω_in)
  • the amount of light scattered by an element M 13 of the medium that reaches the eye of a spectator 12 located at a point C of space along the direction ω_out 120, that is to say the amount of light scattered by the element M and then attenuated by the medium 10 along the path MP, the point P being situated at the intersection of the medium 10 and the direction ω_out towards the spectator 12, is then (equation 2): L_P(M, ω_out) = Q(M, ω_out) · exp(−∫_P^M σ_t·D(s) ds), where:
  • D(M) or D(s) is the density of the medium at a given element, the density varying from one element to another since the medium is heterogeneous,
  • p(M, ω_out, ω_in) is the phase function describing how the light coming from the incidence direction ω_in is scattered in the scattering direction ω_out at the element M,
  • L_ri(M, ω_in) is the reduced light intensity at the element M coming from the incidence direction ω_in; it represents the amount of incident light arriving at the element M after attenuation due to the light's path through the medium 10 along the segment KM, K being the intersection point between the medium 10 and the incident ray ω_in 110, and is (equation 3): L_ri(M, ω_in) = L_in · exp(−∫_K^M σ_t·D(s) ds), where L_in is the light intensity emitted by the light source 11 along ω_in.
  • equation 2 makes it possible to calculate the amount of light scattered by a single element M that reaches the eye of a spectator 12 located on the direction ω_out.
  • to calculate the quantity of light received by a spectator looking in the direction ω_out, it is then necessary to sum the contributions of all the elements of the medium located along the axis ω_out, that is to say the elements situated on the segment PM_max, P and M_max being the two intersection points between the medium 10 and the direction ω_out 120.
  • the total scattered luminance arriving at P 15 from the direction ω_out 120 due to single scattering is then: L(P, ω_out) = ∫_P^{M_max} L_P(M, ω_out) dM
  • this total scattered luminance is obtained by integrating the contributions of all the elements located between P and M_max along a ray having ω_out as its direction.
  • such an integral equation cannot be solved analytically in the general case, still less for a real-time estimation of the amount of scattered light.
  • the integral is therefore evaluated numerically using the so-called ray sampling or ray-marching method, in which the integration domain is discretized into a multitude of intervals of size δ_M, yielding the following equation: L(P, ω_out) ≈ Σ_M L_P(M, ω_out) · δ_M
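The discrete ray-marching sum above can be illustrated with a short numerical sketch. The following Python code is purely illustrative (the patent targets GPU shaders; the function names and the constant-density test profile are assumptions), evaluating the Beer-Lambert attenuation exp(−∫ σ_t·D(s) ds) by midpoint sampling:

```python
import math

def attenuation_ray_march(density, sigma_t, s_max, n_steps):
    """Estimate exp(-integral of sigma_t * D(s) ds) over [0, s_max]
    by discretizing the ray into n_steps intervals (ray marching)."""
    delta_s = s_max / n_steps
    optical_depth = 0.0
    for i in range(n_steps):
        s = (i + 0.5) * delta_s          # midpoint of interval i
        optical_depth += sigma_t * density(s) * delta_s
    return math.exp(-optical_depth)

# Sanity check on a homogeneous medium (D = 1), where the analytic
# Beer-Lambert value exp(-sigma_t * s_max) is known:
att = attenuation_ray_march(lambda s: 1.0, sigma_t=0.5, s_max=2.0, n_steps=1000)
```

For a homogeneous density, the march reproduces the analytic value exp(−0.5·2.0) = exp(−1); for a heterogeneous `density`, the same loop evaluates the integral that has no closed form.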
  • the heterogeneous participating medium 10 is a three-dimensional element, shown in two dimensions in FIG. 1 for the sake of clarity.
  • the heterogeneous participating medium 10 is formed of a multitude of points, a density value being associated with each point.
  • the density values are advantageously stored in a texture called density texture.
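As a minimal stand-in for such a density texture, one can picture a 3-D grid of density values with a lookup function. This is a hypothetical illustration (names and grid are assumptions); a real implementation would use a GPU 3-D texture with hardware filtering:

```python
def sample_density(grid, x, y, z):
    """Nearest-neighbour lookup of a density value in a 3-D grid,
    with (x, y, z) normalized to [0, 1)."""
    nx, ny, nz = len(grid), len(grid[0]), len(grid[0][0])
    i = min(int(x * nx), nx - 1)
    j = min(int(y * ny), ny - 1)
    k = min(int(z * nz), nz - 1)
    return grid[i][j][k]

# A 2x2x2 toy medium, denser in its upper half:
grid = [[[0.2, 0.2], [0.2, 0.2]],
        [[0.8, 0.8], [0.8, 0.8]]]
d = sample_density(grid, 0.75, 0.5, 0.5)   # samples the denser half
```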
  • according to a variant, the heterogeneous participating medium 10 is formed of a multitude of particles, a density value being associated with each particle.
  • FIG. 2 illustrates a method for estimating the attenuation of the light from a light source 11 in the heterogeneous participating medium 10, and more particularly the application of the ray sampling method to estimate the attenuation of light in the medium 10, according to a particular embodiment of the invention.
  • the light scattered at a point M 13 of the medium 10 is a combination of the attenuation of the light received by the medium 10 from a light source 11 and the scattering of this light.
  • first, the term of equation 1 representative of the attenuation of the light received from the light source 11 in the medium 10 is estimated.
  • the term representative of the attenuation of the single scattering at a point M of the medium 10 is given by the following equation, equivalent to equation 3: Att_L(M) = exp(−∫_K^M σ_t·D(s) ds), where Att_L(M) is the attenuation of the luminous intensity at the point M 13 and represents the amount of incident light arriving at the point M after attenuation,
  • σ_t is the extinction coefficient of the medium, corresponding to the sum of the scattering coefficient σ_s of the medium and the absorption coefficient σ_a of the medium,
  • σ_t(s) = σ_t · D(s), which is to say that it is the density which varies from one point to another of the medium 10.
  • according to a variant, the density is constant from one point to another and it is the extinction coefficient that varies from one point to another or from one particle to another.
  • each function f(x) (for example the function representative of the density) of a functional space can be represented as a linear combination of basis functions: f(x) = Σ_j c_j · B_j(x), where c_j is the j-th coefficient of the basis function B_j, defined by: c_j = ∫ f(x) · B_j(x) dx
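The projection c_j = ∫ f(x)·B_j(x) dx and the reconstruction f(x) ≈ Σ_j c_j·B_j(x) can be sketched as follows. The orthonormal cosine basis used here is one possible choice (a DCT-like basis), and the density profile is an assumed example; this is not the patent's own code:

```python
import math

def cosine_basis(j, x, length):
    """Orthonormal cosine basis B_j on the interval [0, length]."""
    if j == 0:
        return 1.0 / math.sqrt(length)
    return math.sqrt(2.0 / length) * math.cos(j * math.pi * x / length)

def project(f, n_coeffs, length, n_samples=2000):
    """c_j = integral of f(x) * B_j(x) dx, evaluated by a midpoint sum."""
    dx = length / n_samples
    return [sum(f((i + 0.5) * dx) * cosine_basis(j, (i + 0.5) * dx, length) * dx
                for i in range(n_samples))
            for j in range(n_coeffs)]

def reconstruct(coeffs, x, length):
    """f(x) ~ sum_j c_j * B_j(x)."""
    return sum(c * cosine_basis(j, x, length) for j, c in enumerate(coeffs))

# Project a smooth density profile on a ray of length 4 and evaluate
# the reconstruction at one point:
density = lambda s: 1.0 + 0.5 * math.cos(math.pi * s / 4.0)
coeffs = project(density, n_coeffs=8, length=4.0)
approx = reconstruct(coeffs, 1.0, 4.0)
```

A handful of coefficients suffices here because the profile is smooth; sharper density variations need more coefficients to be represented faithfully.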
  • the integration domain, situated on the incidence direction 110 and considered between the entry point K 14 of the light ray 110 into the medium 10 and a considered point of the medium 10, is discretized into a series of intervals 201, 202, 20i, 20i+1, 20n of size δ_s.
  • the density also varies from one point to another, the density being equal to D_K at K and to D_i as a function of the position of the point M_i on the incident ray ω_in 110.
  • to calculate the j-th coefficient c_j at the point M, for example from equation 14, the sum of the contributions of the points K, M_2 and M, with which the density values D_K, D_2 and D_M are respectively associated, is calculated.
  • the variable x_i of equation 14 corresponds to the distance between K 14 and the point considered along the incident ray (for example M_2 or M when calculating c_j at the point M).
  • the set of projection coefficients of the basis functions thus calculated is stored in a projection texture ("projective texture map", or "projective texturing"); such a projection texture can be compared to a shadow map.
  • the calculated coefficients are representative of the density (or of the density variation) along the emission direction associated with each element (called texel) of the projection texture.
  • a graphical representation of the density variation along a given direction 110 is made possible by using these basis function coefficients, as shown in FIG. 2.
  • the estimation of Att_L(M) is fast because the projection coefficients c_j have been estimated beforehand (and advantageously stored in a projection texture). It is then easy to obtain L_ri(M), since L_ri(M) is equal to the product of Att_L(M) and the quantity of light emitted by the light source 11 along the light emission direction; L_ri(M) is thus equal to Att_L(M) up to a factor.
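The speed advantage can be pictured as follows: once the coefficients c_j of the density along the light ray are known, the optical depth ∫ D(s) ds, and hence Att_L, follows from closed-form integrals of the basis functions rather than from a new march. This is an illustrative Python sketch under an assumed orthonormal cosine basis (a DCT-like choice), not the patent's shader code:

```python
import math

def basis_integral(j, d, length):
    """Closed-form integral over [0, d] of the orthonormal cosine basis
    B_j defined on [0, length]."""
    if j == 0:
        return d / math.sqrt(length)
    k = j * math.pi / length
    return math.sqrt(2.0 / length) * math.sin(k * d) / k

def attenuation_from_coeffs(coeffs, d, length, sigma_t):
    """Att_L = exp(-sigma_t * integral_0^d D(s) ds), with the density
    integral evaluated from stored projection coefficients."""
    depth = sum(c * basis_integral(j, d, length) for j, c in enumerate(coeffs))
    return math.exp(-sigma_t * depth)

# Constant density D = 1 on a ray of length 4: only c_0 is non-zero,
# c_0 = integral of 1 * (1/sqrt(4)) ds = 2.
coeffs = [2.0] + [0.0] * 7
att = attenuation_from_coeffs(coeffs, d=2.0, length=4.0, sigma_t=0.5)
```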
  • since the extinction coefficient σ_t of the medium depends on the wavelength of the light emitted by the light source, it is necessary to calculate a set of basis function coefficients for each elementary component of the light, for example the components R, G and B (Red, Green, Blue), each component R, G and B having a particular wavelength, or the components R, G, B and Y (Red, Green, Blue, Yellow).
  • according to an advantageous variant of the invention, the estimation of the basis function coefficients is performed independently of the wavelength of the light emitted by the light source. To do this, the term σ_t is removed from equation 7, which becomes:
  • the coefficient σ_t, left out of equation 18, is taken into account when estimating the amount of light scattered by the point M, as will be explained with reference to FIG. 3.
  • according to another variant, a scale factor α is introduced into equation 15 or into equation 18.
  • this scale factor α advantageously makes it possible to reduce the influence of the density in equations 15 or 18, and in particular to reduce, or even eliminate, the ringing artifacts (Gibbs effects) due to the transformation of the reduced light intensity into the functional space, for example the Fourier space. According to this variant, equations 15 and 18 become:
  • the scale factor α is advantageously parameterizable and determined by the user; it is for example equal to twice the maximum density of the medium, or to more than twice the maximum density, for example three or four times the maximum density.
  • the operations described above are repeated for each illumination direction (or incidence direction, or light ray) starting from the light source 11 and passing through the medium 10.
  • the basis function coefficients representative of the density accumulated while crossing the medium are stored in the projection texture.
  • the projection texture then comprises all the projection coefficients representative of the density in the medium.
  • FIG. 3 illustrates a method for estimating the single scattering of light in the heterogeneous participating medium 10, more particularly the application of the ray sampling method for estimating this single scattering in the medium 10, and more generally a method of estimating the light scattered by the medium 10 using the basis function coefficients calculated above, according to a particular embodiment of the invention.
  • the ray sampling method is implemented according to a non-limiting embodiment of the invention.
  • the attenuation factor of the light scattered by a point M 13 of the medium 10, corresponding to the attenuation of the light along the path from M 13 to P 15, is estimated by the following equation: Att_P(M) = exp(−∫_P^M σ_t·D(s) ds), the density D(s) of an element s (that is to say the considered point M_i, the position of the point M_i ranging from P to M) of the line segment [PM] varying since the medium 10 is heterogeneous.
  • equation 10 is very expensive in computing power and cannot be evaluated analytically. To overcome this problem, a sampling of the ray PM along the direction ω_out is carried out; after discretization of the segment PM into a multitude of elements δ_s, one obtains:
  • equation 12 represents the amount of light emitted by a point M and received by a spectator, the attenuation term being calculated using the preceding equation. To obtain the total quantity of light received by a spectator situated at a point C looking in the direction ω_out 120, it suffices to sum the elementary light quantities emitted by the set of points M_i ranging from P to M_max, which gives:
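The summation over the sample points M_i can be sketched in Python. This is an illustrative 1-D setup (light entering the segment from the far end; all names and parameter values are assumptions), not the patent's GPU implementation:

```python
import math

def single_scattering(density, sigma_t, sigma_s, phase, l_src, s_max, n_steps):
    """Sum the contributions of all samples M_i along the view ray,
    each weighted by the attenuation on the light path (from s_max
    down to M_i) and on the view path (from the entry point P to M_i)."""
    ds = s_max / n_steps
    total = 0.0
    depth_view = 0.0                           # optical depth from P to M_i
    for i in range(n_steps):
        d = density((i + 0.5) * ds)
        depth_view += sigma_t * d * ds
        # optical depth from the light entry point (at s_max) to M_i:
        depth_light = sum(sigma_t * density((j + 0.5) * ds) * ds
                          for j in range(i + 1, n_steps))
        total += (phase * sigma_s * d * l_src
                  * math.exp(-depth_light) * math.exp(-depth_view) * ds)
    return total

lum = single_scattering(lambda s: 1.0, sigma_t=0.4, sigma_s=0.3,
                        phase=1.0 / (4.0 * math.pi),   # isotropic phase
                        l_src=1.0, s_max=2.0, n_steps=200)
```

The inner sum recomputes the light-path depth for every sample, which is exactly the O(n²) cost that storing the running projection coefficients is meant to avoid.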
  • the estimates described above are repeated for all directions starting from the user and passing through the medium 10.
  • the sum of the amounts of light received by the viewer in each observation direction provides the amount of light received from the medium 10 by the viewer 12.
  • according to the variant in which the coefficients c_j are calculated independently of the extinction coefficient σ_t, equation 12 becomes:
  • according to the variant in which a scale factor is introduced into the calculation of the attenuation of the light at M, equation 12 becomes:
  • FIG. 4 schematically illustrates an example of a hardware embodiment of a device 4 adapted to the estimation of the quantity of light diffused by a heterogeneous participating medium 10.
  • the device 4 corresponds, for example, to a personal computer (PC), a laptop or a games console.
  • the device 4 comprises the following elements, interconnected by an address and data bus 45 which also carries a clock signal:
  • a microprocessor 41 (or CPU);
  • a graphics card 42 comprising:
  • a random access memory of GRAM (Graphical Random Access Memory) type 421;
  • I/O devices 44 such as, for example, a keyboard, a mouse or a webcam; and
  • the device 4 also comprises a display device 43, of display screen type, connected directly to the graphics card 42 in order to display, in particular, the rendering of synthesis images calculated and composed in the graphics card, for example in real time.
  • the use of a dedicated bus to connect the display device 43 to the graphics card 42 has the advantage of much higher data transmission rates, and thus reduces the latency for displaying images composed by the graphics card.
  • the display device is external to the device 4.
  • the device 4, for example the graphics card, comprises a connector adapted to transmit a display signal to an external display means such as, for example, an LCD or plasma screen, or a video projector.
  • the word "register" used in the description of the memories 42, 46 and 47 designates, in each of the memories mentioned, both a memory area of low capacity (a few binary data) and a memory area of large capacity (enabling a whole program to be stored, or all or part of the data representative of calculated data or of data to be displayed).
  • the microprocessor 41 loads and executes the instructions of the program contained in the RAM 47.
  • the random access memory 47 comprises in particular:
  • parameters 471 representative of the heterogeneous participating medium 10 (for example density parameters, light absorption coefficients, light scattering coefficients, the scale factor α).
  • the algorithms implementing the steps of the method specific to the invention and described hereinafter are stored in the GRAM memory 421 of the graphics card 42 associated with the device 4 implementing these steps.
  • the graphics processors 420 of the graphics card 42 load these parameters into the GRAM 421 and execute the instructions of these algorithms in the form of "shader"-type microprograms, written for example in HLSL (High Level Shader Language) or GLSL (OpenGL Shading Language).
  • the memory GRAM 421 comprises in particular:
  • projection coefficients 4211 representative of the density at each point of the medium 10 or associated with each particle of the medium 10;
  • values 4213 representative of the quantity of light diffused by the medium 10 along one or more observation directions.
  • part of the RAM 47 is allocated by the
  • the power supply 48 is external to the device 4.
  • FIG. 5 illustrates a method of estimating the scattering of light in a heterogeneous participating medium, implemented in a device 4, according to a first non-limiting and particularly advantageous exemplary implementation of the invention.
  • the various parameters of the device 4 are updated.
  • the parameters representative of the heterogeneous participating medium are initialized in any manner whatsoever.
  • projection coefficients in a basis of functions are estimated, these projection coefficients being representative of the density, whose values vary within the heterogeneous participating medium 10.
  • the function σ_t(s) representative of the density variations in the medium 10 is projected into a functional space of basis functions, for example using a Fourier transform or a discrete cosine transform. From the density values associated with the elements (that is to say the points or particles) of the medium 10, a set of projection coefficients is calculated for a light emission direction 110, or more precisely for the line segment corresponding to the intersection of a light ray 110, coming from a light source 11, with the medium 10.
  • the line segment is advantageously divided spatially into a multitude of elementary pieces of the same length or of different lengths and the projection coefficients representative of the density are calculated for a point of each elementary piece of the segment.
  • the method used to discretize the line segment and to estimate the projection coefficients is the so-called ray-marching algorithm.
  • the projection coefficients associated with a considered point M are obtained by summing values that depend on the density associated with each point situated between the intersection point K 14 of the medium 10 with the incident ray 110 and the point M, and on the distance between the intersection point K 14 and each such point. A value representative of the reduction of the luminous intensity at the point M 13 is then calculated from the estimated projection coefficients. Similarly, a value representative of the reduction in light intensity is calculated for each discretized point of the medium 10 along the ray 110 from the associated projection coefficients.
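One way to picture this per-point accumulation is the following sketch, which keeps one coefficient vector per sample point along the light ray, as a projection-texture texel would. The cosine basis and all names are assumptions made for illustration:

```python
import math

def cosine_basis(j, x, length):
    """Orthonormal cosine basis B_j on [0, length]."""
    if j == 0:
        return 1.0 / math.sqrt(length)
    return math.sqrt(2.0 / length) * math.cos(j * math.pi * x / length)

def running_coefficients(densities, ds, length, n_coeffs):
    """For each sample point along the ray, accumulate the partial sums
    c_j = sum_k D_k * B_j(x_k) * ds up to that point; one coefficient
    vector is snapshotted per point."""
    per_point = []
    coeffs = [0.0] * n_coeffs
    for k, d in enumerate(densities):
        x = (k + 0.5) * ds
        for j in range(n_coeffs):
            coeffs[j] += d * cosine_basis(j, x, length) * ds
        per_point.append(list(coeffs))    # snapshot for this point
    return per_point

texel = running_coefficients([1.0, 2.0, 1.0, 0.5], ds=1.0,
                             length=4.0, n_coeffs=4)
```

Each snapshot encodes the density accumulated from the entry point K up to that sample, which is exactly what the reduced-intensity estimate at that point needs.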
  • according to a variant, projection coefficients are estimated for a given particle of the segment [KL] using a method called particle addition ("particle blending").
  • values depending on the density associated with the particles located between the intersection point K and the particle of interest, and on the distance between those particles and K, are added together.
  • the projection coefficients representative of the density are estimated for any point of the medium 10 or for any particle of the medium 10.
  • the estimated projection coefficients are recorded and stored in a projection texture 30.
  • a storage space of the projection texture is allocated for storing the estimated projection coefficients for each incident light ray coming from the light source 11.
  • the projection texture advantageously comprises all the projection coefficients of the medium 10, that is to say a set of projection coefficients for each point or each particle of the medium 10.
  • such a storage of the projection coefficients offers the advantage of accelerating the calculations estimating the quantity of light scattered by the medium 10 and perceived by a viewer, the projection coefficients representative of the density being available at any time and immediately usable in the equations estimating the light intensity reduction values.
  • the amount of light scattered by the medium 10 along an observation direction 120 is estimated using the previously estimated projection coefficients.
  • the line segment corresponding to the intersection of the observation direction 120 with the medium 10, that is to say the segment [PM_max], is discretized spatially into a multitude of points or elementary pieces representative of this segment.
  • equation 24 is applied using the previously estimated projection coefficients.
  • the ray sampling method is implemented to estimate the reduction in light intensity between a considered point of the segment and the point P located at the periphery of the medium 10 in the observation direction 120.
  • the estimation of the quantity of light scattered by said medium is carried out using a particle addition method.
  • the total quantity of light received by a spectator located at a point C looking in the direction ω_out 120 is equal to the sum of the elementary light quantities emitted by all the particles located on the path ω_out between P and M_max.
  • this variant has the advantage that the quantities of light emitted by the particles can be summed in any order, without necessarily progressing from P to M_max and summing the emitted light quantities in that order.
  • the order in which the amount of light emitted by each particle is taken into account is arbitrary and is advantageously handled directly by the rendering pipeline of the graphics card.
  • the quantity of light scattered by the medium 10 is estimated for several observation directions. By summing these light quantities estimated for a plurality of observation directions, the total amount of light scattered by the medium 10 and perceived by a spectator observing the medium 10 is obtained.
  • the steps 51 and 52 are advantageously reiterated as a spectator 12 moves around the medium 10, the image forming the rendering of the medium 10 being recomposed for each elementary movement of the spectator 12 around the medium 10.
  • the invention is not limited to the embodiments described above.
  • the invention is not limited to a method for estimating the amount of light scattered by a heterogeneous participating medium but also extends to any device implementing this method, and in particular to all devices comprising at least one GPU.
  • the implementation of the equations described with reference to FIGS. 1 to 3 for estimating the projection coefficients, the reduction of luminous intensity along the incidence and observation directions, and the quantity of scattered light is not limited to an implementation in shader-type microprograms but also extends to an implementation in any type of program, for example programs executable by a CPU-type microprocessor.
  • the basis functions used for estimating the projection coefficients are discrete cosine transform functions.
  • according to variants, the basis functions used are conventional Fourier functions, Legendre polynomials or Chebyshev polynomials.
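As an illustration of one such alternative basis, the Legendre polynomials can be generated by the Bonnet recurrence and the corresponding coefficients computed by numerical integration. All names and the test function are assumptions made for illustration:

```python
def legendre(n, x):
    """Legendre polynomial P_n(x) on [-1, 1] via the Bonnet recurrence
    (k+1) P_{k+1}(x) = (2k+1) x P_k(x) - k P_{k-1}(x)."""
    if n == 0:
        return 1.0
    p_prev, p = 1.0, x
    for k in range(1, n):
        p_prev, p = p, ((2 * k + 1) * x * p - k * p_prev) / (k + 1)
    return p

def legendre_coeff(f, n, n_samples=4000):
    """c_n = (2n+1)/2 * integral over [-1, 1] of f(x) P_n(x) dx,
    evaluated with a midpoint sum."""
    dx = 2.0 / n_samples
    s = sum(f(-1.0 + (i + 0.5) * dx) * legendre(n, -1.0 + (i + 0.5) * dx) * dx
            for i in range(n_samples))
    return (2 * n + 1) / 2.0 * s

# f(x) = x^2 decomposes exactly as (1/3) P_0 + (2/3) P_2:
c0 = legendre_coeff(lambda x: x * x, 0)
c2 = legendre_coeff(lambda x: x * x, 2)
```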
  • the scattering method, implemented in a device comprising a 3.6 GHz Xeon® microprocessor and an nVidia GeForce GTX280 graphics card, makes it possible to compose a real-time rendering at 20 frames per second for a heterogeneous participating medium of cloud type consisting of 4096 spheres.
  • the use of the invention is, however, not limited to real-time use but also extends to any other use, for example so-called post-production processing in a studio for the rendering of computer-generated images.
  • the implementation of the invention in postproduction offers the advantage of providing an excellent visual rendering in terms of realism in particular while reducing the calculation time required.
  • the invention also relates to a method for composing a two-dimensional or three-dimensional video image in which the amount of light scattered by a heterogeneous participating medium is calculated, and the information representative of the resulting luminance is used for displaying the pixels of the image, each pixel corresponding to an observation direction ω_out.
  • the luminance value calculated for display by each of the pixels of the image is recalculated to suit the viewer's different points of view.
  • the present invention can be used in video game applications for example, whether by programs executable in a PC or portable computer or in specialized gaming consoles producing and displaying images in real time.
  • the device 4 described with reference to FIG. 4 is advantageously provided with interaction means such as a keyboard and/or a joystick; other command input modes, such as for example voice recognition, are also possible.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)
  • Photometry And Measurement Of Optical Pulse Characteristics (AREA)
EP11729394.4A 2010-07-01 2011-06-21 Verfahren zur schätzung der lichtstreuung Withdrawn EP2589025A2 (de)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP11729394.4A EP2589025A2 (de) 2010-07-01 2011-06-21 Verfahren zur schätzung der lichtstreuung

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP10305720 2010-07-01
EP11729394.4A EP2589025A2 (de) 2010-07-01 2011-06-21 Verfahren zur schätzung der lichtstreuung
PCT/EP2011/060373 WO2012000847A2 (fr) 2010-07-01 2011-06-21 Procede d'estimation de diffusion de la lumiere

Publications (1)

Publication Number Publication Date
EP2589025A2 true EP2589025A2 (de) 2013-05-08

Family

ID=44627915

Family Applications (1)

Application Number Title Priority Date Filing Date
EP11729394.4A Withdrawn EP2589025A2 (de) 2010-07-01 2011-06-21 Verfahren zur schätzung der lichtstreuung

Country Status (3)

Country Link
US (1) US20130100135A1 (de)
EP (1) EP2589025A2 (de)
WO (1) WO2012000847A2 (de)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2948800A1 (fr) * 2009-11-16 2011-02-04 Thomson Licensing Procede d'estimation de diffusion de la lumiere
WO2013104493A1 (en) 2012-01-10 2013-07-18 Thomson Licensing Method and device for estimating light scattering
BE1021805B1 (nl) 2013-11-05 2016-01-19 Creachem Bvba Methode voor het isoleren van koolhydraat alkylcarbamaten

Family Cites Families (90)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3519354A (en) * 1965-06-17 1970-07-07 Sperry Rand Corp System for measuring extinction coefficients in the atmosphere utilizing backscattered signals
US4128335A (en) * 1977-02-25 1978-12-05 General Electric Company Condensation nuclei counter with automatic ranging
US4475816A (en) * 1980-02-15 1984-10-09 The United States Of America As Represented By The Secretary Of The Navy Method for determining in situ the absorption coefficient of particulate media using pulsed laser technique
US4362387A (en) * 1980-08-22 1982-12-07 Rockwell International Corporation Method and apparatus for measuring visibility from the polarization properties of the daylight sky
FR2504278B1 (fr) * 1981-04-15 1985-11-08 Commissariat Energie Atomique Detecteur de rayons x
EP0167272B1 (de) * 1984-06-30 1991-01-16 Kabushiki Kaisha Toshiba Apparat zur Messung der Abmessungen von Teilchen
JPH05507166A (ja) * 1990-05-12 1993-10-14 レディフュージョン・シミュレーション・リミテッド イメージ発生装置
JPH0757117A (ja) * 1993-07-09 1995-03-03 Silicon Graphics Inc テクスチャマップへの索引を生成する方法及びコンピュータ制御表示システム
JPH0778267A (ja) * 1993-07-09 1995-03-20 Silicon Graphics Inc 陰影を表示する方法及びコンピュータ制御表示システム
KR100402169B1 (ko) * 1995-04-27 2004-03-10 닛폰콜롬비아 가부시키가이샤 다층구조광정보매체
JP2956653B2 (ja) * 1996-12-16 1999-10-04 日本電気株式会社 パーティクルモニター装置
JPH1123458A (ja) * 1997-05-08 1999-01-29 Nittan Co Ltd 煙感知器および監視制御システム
US9007393B2 (en) * 1997-07-02 2015-04-14 Mental Images Gmbh Accurate transparency and local volume rendering
US6500008B1 (en) * 1999-03-15 2002-12-31 Information Decision Technologies, Llc Augmented reality-based firefighter training system and method
AU2001239926A1 (en) * 2000-02-25 2001-09-03 The Research Foundation Of State University Of New York Apparatus and method for volume processing and rendering
US6593923B1 (en) * 2000-05-31 2003-07-15 Nvidia Corporation System, method and article of manufacture for shadow mapping
US6690372B2 (en) * 2000-05-31 2004-02-10 Nvidia Corporation System, method and article of manufacture for shadow mapping
US7348977B2 (en) * 2000-07-19 2008-03-25 Pixar Subsurface scattering approximation methods and apparatus
US6738659B2 (en) * 2000-08-31 2004-05-18 Hsu Pei-Feng Optical imaging using the temporal direct reflective signal from a minimized pulse width laser
US7046243B1 (en) * 2000-11-21 2006-05-16 Microsoft Corporation Rendering volumetric fog and other gaseous phenomena
FR2820965B1 (fr) * 2001-02-16 2003-04-04 Commissariat Energie Atomique Procede d'estimation d'un rayonnement diffuse, notamment afin de corriger des mesures en radiographie
US6930777B1 (en) * 2001-04-03 2005-08-16 The Texas A&M University System Method for characterizing particles in suspension from frequency domain photon migration measurements
KR100612827B1 (ko) * 2001-04-19 2006-08-14 삼성전자주식회사 비 침습적인 헤모글로빈 농도와 산소 포화도 모니터링방법 및 장치
US7045169B2 (en) * 2001-09-04 2006-05-16 J.M. Huber Corporation Method of predicting optical properties and physical characteristics to formulate optimum coating system
US20030049866A1 (en) * 2001-09-05 2003-03-13 Genicon Sciences Corporation Sample device preservation
US6841778B1 (en) * 2001-11-09 2005-01-11 Environmental Systems Products Holdings Inc. Method and apparatus for measuring particulates in vehicle emissions
JP3962588B2 (ja) * 2002-01-07 2007-08-22 キヤノン株式会社 三次元画像処理方法、三次元画像処理装置、三次元画像処理システムおよび三次元画像処理プログラム
EP1493123A4 (de) * 2002-04-06 2010-01-13 Randall L Barbour Modifikation des verfahrens der normierten differenzen für die optische echtzeit-tomographie
JP2003317315A (ja) * 2002-04-19 2003-11-07 Tdk Corp 光記録媒体
EP2410315B1 (de) * 2002-06-04 2020-04-01 Visen Medical, Inc. Bildgebungsdatenträger mit willkürlichen Geometrien bei Kontakt- und kontaktloser Tomographie
WO2004049005A2 (en) * 2002-11-26 2004-06-10 The Trustees Of Columbia University In The City Of New York Systems and methods for modeling the impact of a medium on the appearances of encompassed light sources
US20060173355A1 (en) * 2003-04-17 2006-08-03 Alfano Robert R Detecting human cancer through spectral optical imaging using key water absorption wavelengths
US7443394B2 (en) * 2003-04-30 2008-10-28 Pixar Method and apparatus for rendering of complex translucent objects using multiple volumetric grids
US7859530B2 (en) * 2003-04-30 2010-12-28 Pixar Subsurface rendering methods and apparatus
US7184043B2 (en) * 2003-04-30 2007-02-27 Pixar Color compensated translucent object rendering methods and apparatus
US7019744B2 (en) * 2003-04-30 2006-03-28 Pixar Method and apparatus for rendering of translucent objects using volumetric grids
AU2003902319A0 (en) * 2003-05-14 2003-05-29 Garrett Thermal Systems Limited Laser video detector
US7091973B1 (en) * 2003-06-20 2006-08-15 Jonathan Michael Cohen Apparatus and method for estimating reflected radiance under complex distant illumination
JP2005044744A (ja) * 2003-07-25 2005-02-17 Clariant Internatl Ltd 面光源装置
JP4303138B2 (ja) * 2004-01-28 2009-07-29 富士フイルム株式会社 シート状導光体を用いた通信システム
US7696995B2 (en) * 2004-05-07 2010-04-13 Valve Corporation System and method for displaying the effects of light illumination on a surface
US7218324B2 (en) * 2004-06-18 2007-05-15 Mitsubishi Electric Research Laboratories, Inc. Scene reflectance functions under natural illumination
EP2595129B1 (de) * 2004-11-12 2020-05-20 Xtralis Technologies Ltd Teilchendetektor, System und Verfahren
US7710418B2 (en) * 2005-02-04 2010-05-04 Linden Acquisition Corporation Systems and methods for the real-time and realistic simulation of natural atmospheric lighting phenomenon
US8472020B2 (en) * 2005-02-15 2013-06-25 Cinram Group, Inc. Process for enhancing dye polymer recording yields by pre-scanning coated substrate for defects
WO2006138513A1 (en) * 2005-06-16 2006-12-28 Nomos Corporation Variance reduction simulation system, program product, and related methods
US7312797B2 (en) * 2005-06-24 2007-12-25 Microsoft Corporation Representing quasi-homogenous materials
WO2007052614A1 (ja) * 2005-10-31 2007-05-10 Matsushita Electric Industrial Co., Ltd. 光学的情報記録媒体およびその製造方法
JP4201207B2 (ja) * 2005-11-21 2008-12-24 株式会社バンダイナムコゲームス プログラム、情報記憶媒体及び画像生成システム
US20070285422A1 (en) * 2006-01-18 2007-12-13 Nayar Shree K Method for Separating Direct and Global Illumination in a Scene
US7633062B2 (en) * 2006-10-27 2009-12-15 Los Alamos National Security, Llc Radiation portal monitor system and method
JP4241834B2 (ja) * 2007-01-11 2009-03-18 株式会社デンソー 車載霧判定装置
US7838108B2 (en) * 2007-01-17 2010-11-23 Sabic Innovative Plastics Ip B.V. Nano-cellular polymer foam and methods for making them
US8267927B2 (en) * 2007-01-24 2012-09-18 Koninklijke Philips Electronics N.V. Advanced ablation planning
US8212262B2 (en) * 2007-02-09 2012-07-03 Cree, Inc. Transparent LED chip
WO2008152712A1 (ja) * 2007-06-13 2008-12-18 Shimadzu Corporation ナノ粒子計測装置
JP4873202B2 (ja) * 2007-06-21 2012-02-08 株式会社島津製作所 光学的測定の解析方法
US7990377B2 (en) * 2007-06-26 2011-08-02 Microsoft Corporation Real-time rendering of light-scattering media
US8009168B2 (en) 2007-06-26 2011-08-30 Microsoft Corporation Real-time rendering of light-scattering media
US8190403B2 (en) * 2007-06-26 2012-05-29 Microsoft Corporation Real-time rendering of light-scattering media
US7940268B2 (en) * 2007-06-29 2011-05-10 Microsoft Corporation Real-time rendering of light-scattering media
US7940269B2 (en) * 2007-06-29 2011-05-10 Microsoft Corporation Real-time rendering of light-scattering media
US20090026924A1 (en) * 2007-07-23 2009-01-29 Leung Roger Y Methods of making low-refractive index and/or low-k organosilicate coatings
US8674986B2 (en) * 2007-07-25 2014-03-18 Digital Domain Products, Inc. Method and system for scattered spherical harmonic approximation
US8164749B2 (en) * 2007-08-08 2012-04-24 Shimadzu Corporation Optical measurement apparatus and electrode pair thereof
JP5219440B2 (ja) * 2007-09-12 2013-06-26 キヤノン株式会社 測定装置
TWI654418B (zh) * 2007-11-15 2019-03-21 巴哈馬商愛克斯崔里斯科技有限公司 粒子檢測器
ATE507544T1 (de) * 2008-02-19 2011-05-15 Siemens Ag Rauchdetektion mittels zweier spektral unterschiedlicher streulichtmessungen
US8243071B2 (en) * 2008-02-29 2012-08-14 Microsoft Corporation Modeling and rendering of heterogeneous translucent materials using the diffusion equation
US8804119B2 (en) * 2008-06-10 2014-08-12 Xtralis Technologies Ltd Particle detection
US8181511B2 (en) * 2008-07-18 2012-05-22 Meier Robert R Method and system of imaging electrons in the near earth space environment
US20100033482A1 (en) * 2008-08-11 2010-02-11 Interactive Relighting of Dynamic Refractive Objects
US20100085360A1 (en) * 2008-10-04 2010-04-08 Microsoft Corporation Rendering in scattering media
GB2465792A (en) * 2008-11-28 2010-06-02 Sony Corp Illumination Direction Estimation using Reference Object
WO2010124347A1 (en) * 2009-05-01 2010-11-04 Xtralis Technologies Ltd Improvements to particle detectors
EP2502206A2 (de) * 2009-11-16 2012-09-26 Thomson Licensing Verfahren zur schätzung von lichtstreuung
FR2948800A1 (fr) * 2009-11-16 2011-02-04 Thomson Licensing Procede d'estimation de diffusion de la lumiere
FR2948801A1 (fr) * 2009-11-24 2011-02-04 Thomson Licensing Procede d'estimation de diffusion de la lumiere
JP2011214942A (ja) * 2010-03-31 2011-10-27 Fujifilm Corp 光断層計測装置
FR2966623A1 (fr) * 2010-10-21 2012-04-27 Thomson Licensing Procede d’estimation de l’occultation dans un environnement virtuel
JP5950117B2 (ja) * 2010-12-24 2016-07-13 日本電気株式会社 画像処理方法、画像処理システムおよび画像処理プログラム
US9262860B2 (en) * 2011-02-17 2016-02-16 Sony Corporation System and method for importance sampling of area lights in participating media
US8872826B2 (en) * 2011-02-17 2014-10-28 Sony Corporation System and method for decoupled ray marching for production ray tracking in inhomogeneous participating media
US8922556B2 (en) * 2011-04-18 2014-12-30 Microsoft Corporation Line space gathering for single scattering in large scenes
US8338785B2 (en) * 2011-04-29 2012-12-25 Rosemount Aerospace Inc. Apparatus and method for detecting aircraft icing conditions
US8638331B1 (en) * 2011-09-16 2014-01-28 Disney Enterprises, Inc. Image processing using iterative generation of intermediate images using photon beams of varying parameters
WO2013104493A1 (en) * 2012-01-10 2013-07-18 Thomson Licensing Method and device for estimating light scattering
FR2988502A1 (fr) * 2012-03-26 2013-09-27 Thomson Licensing Procede pour representer un milieu participant dans une scene et dispositif correspondant
TW201401225A (zh) * 2012-06-22 2014-01-01 Thomson Licensing 異質參與媒體之一點所接收光量之估計方法及所構成裝置
US9401043B2 (en) * 2013-01-18 2016-07-26 Pixar Photon beam diffusion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2012000847A2 *

Also Published As

Publication number Publication date
US20130100135A1 (en) 2013-04-25
WO2012000847A2 (fr) 2012-01-05
WO2012000847A3 (fr) 2012-03-22

Similar Documents

Publication Publication Date Title
Meng et al. Kernel foveated rendering
US9953457B2 (en) System, method, and computer program product for performing path space filtering
Yu et al. Real‐time depth of field rendering via dynamic light field generation and filtering
FR2988891A1 (fr) Procede d'estimation de niveau d'opacite dans une scene et dispositif correspondant
FR2965652A1 (fr) Procede d’estimation de la quantite de lumiere recue en un point d’un environnement virtuel
EP1292921A1 (de) Verfeinerung von dreidimensionalen polygonalen gitterdaten
WO2004012445A2 (fr) Procede et systeme permettant a un utilisateur de melanger en temps reel des images de synthese avec des images video
EP2504816B1 (de) Verfahren zur schätzung von lichtstreuung
FR2966623A1 (fr) Procede d’estimation de l’occultation dans un environnement virtuel
Simon et al. Rich‐VPLs for improving the versatility of many‐light methods
Delalandre et al. Transmittance function mapping
Bauszat et al. Sample‐based manifold filtering for interactive global illumination and depth of field
Leimkühler et al. Laplacian kernel splatting for efficient depth-of-field and motion blur synthesis or reconstruction
WO2011057997A2 (fr) Procede d'estimation de diffusion de la lumiere
FR2964775A1 (fr) Procede d'estimation de l'occultation dans un environnement virtuel
EP2589025A2 (de) Verfahren zur schätzung der lichtstreuung
FR2988502A1 (fr) Procede pour representer un milieu participant dans une scene et dispositif correspondant
Fuchs et al. Combining confocal imaging and descattering
Elek et al. Spectral ray differentials
WO2011058007A2 (fr) Procede d'estimation de diffusion de la lumiere
Lei et al. Approximate depth of field effects using few samples per pixel
Habel et al. Physically based real-time translucency for leaves
FR2974217A1 (fr) Procede d’estimation d’une information representative d’une hauteur
FR2964776A1 (fr) Procede d’estimation de diffusion de la lumiere dans un milieu homogene
Elek et al. Real-time screen-space scattering in homogeneous environments

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20121219

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20160712