EP2502207A2 - Verfahren zur schätzung von lichtstreuung - Google Patents

Verfahren zur schätzung von lichtstreuung

Info

Publication number
EP2502207A2
EP2502207A2
Authority
EP
European Patent Office
Prior art keywords
light
medium
projection coefficients
point
representative
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP10775825A
Other languages
English (en)
French (fr)
Inventor
Cyril Delalandre
Pascal Gautron
Jean-Eudes Marvie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Thomson Licensing SAS
Original Assignee
Thomson Licensing SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thomson Licensing SAS filed Critical Thomson Licensing SAS
Publication of EP2502207A2 publication Critical patent/EP2502207A2/de
Withdrawn legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/50 Lighting effects
    • G06T 15/506 Illumination models
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 15/00 Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
    • G01N 15/02 Investigating particle size or size distribution
    • G01N 15/0205 Investigating particle size or size distribution by optical means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 15/00 Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
    • G01N 15/10 Investigating individual particles
    • G01N 15/14 Optical investigation techniques, e.g. flow cytometry
    • G01N 15/1456 Optical investigation techniques, e.g. flow cytometry without spatial resolution of the texture or inner structure of the particle, e.g. processing of pulse signals
    • G01N 15/1459 Optical investigation techniques, e.g. flow cytometry without spatial resolution of the texture or inner structure of the particle, e.g. processing of pulse signals the analysis being performed on a sample stream

Definitions

  • the invention relates to the field of synthetic image composition and more particularly to the field of simulating the scattering of light in a heterogeneous participating medium.
  • The invention also applies to special effects for real-time ("live") composition.
  • Participating media are media composed of suspended particles that interact with light, modifying in particular its path and its intensity.
  • Participating media can be divided into two categories: homogeneous media, such as water, and heterogeneous media, such as smoke or clouds.
  • For homogeneous participating media, it is possible to compute analytically the attenuation of the light emitted by a light source. Owing to their homogeneous nature, these media have parameters, such as the light absorption coefficient or the light scattering coefficient, that are constant at every point of the medium.
  • In contrast, the absorption and scattering properties of light vary from one point to another in a heterogeneous participating medium. The calculations required to simulate the scattering of light in such a heterogeneous medium are very expensive, and it is therefore not possible to compute analytically and in real time the amount of light scattered by a heterogeneous participating medium.
  • The quantity of light scattered by the medium also varies with the scattering direction of the light, that is to say with the direction from which a person observes the medium. The calculations estimating the amount of scattered light must therefore be repeated for each observation direction of the medium in order to obtain a realistic rendering of the medium.
  • To reduce these costs, some methods pre-compute certain parameters representative of the heterogeneous participating medium. While such methods are well suited to use in a post-production studio, for example, and provide good rendering quality, they are not suitable for the interactive design and real-time rendering of a heterogeneous participating medium.
  • Such a method is described, for example, in patent application WO 2009/003143 filed by Microsoft Corporation and published on December 31, 2008.
  • The subject of WO 2009/003143 is software for real-time rendering of a heterogeneous medium, and it describes a solution using radial basis functions.
  • However, this solution cannot be considered a fully real-time rendering solution, since certain pre-processing steps must be applied offline to the participating medium in order to compute the projection coefficients representing the environment that are subsequently used in the real-time image-synthesis calculations.
  • the invention aims to overcome at least one of these disadvantages of the prior art.
  • The purpose of the invention is notably to optimize the computation time and/or the computing power required to compose, in real time, a realistic rendering of the light scattering in a heterogeneous participating medium.
  • The invention relates to a method for estimating the quantity of light scattered by a heterogeneous participating medium, the method comprising the steps of:
  • estimating first projection coefficients in an orthonormal basis of spherical functions, the first projection coefficients being representative of the reduction in luminous intensity at a point of the medium, the estimation of the first projection coefficients being performed for each point of a first set of points of the medium, and estimating the quantity of light scattered by the medium along at least one scattering direction by using the first projection coefficients.
  • Each of the first projection coefficients is estimated from light intensity reduction values estimated along a plurality of particular light emission directions.
  • the estimation of the light intensity reduction values is performed by sampling said particular light emission directions.
  • the method comprises a step of estimating second projection coefficients in the orthonormal basis of spherical functions, the second projection coefficients being representative of the incident luminance for a set of points of said light environment.
  • the method comprises a step of estimating third projection coefficients in the orthonormal basis of spherical functions, the third projection coefficients being representative of the phase function for a second set of points of said medium.
  • the estimation of the quantity of light diffused by the medium is carried out by discretizing the medium along the at least one diffusion direction.
  • the estimation of the quantity of light diffused by the medium is carried out by using the method of ray sampling.
  • the first projection coefficients are stored in a table of a memory associated with at least one graphics processor.
  • FIG. 1 schematically illustrates a heterogeneous, light-scattering participating medium, according to a particular embodiment of the invention
  • FIGS. 2A and 2B illustrate a light environment comprising several light sources, according to a particular embodiment of the invention
  • FIG. 3 schematically illustrates a method for estimating the quantity of light diffused by a medium of FIG. 1 illuminated by a light environment of FIGS. 2A and 2B, according to a particular embodiment of the invention
  • FIG. 4 schematically illustrates a method for estimating projection coefficients at each point of a set of points of the medium of FIG. 1, according to one particular embodiment of the invention
  • FIG. 5 illustrates a device implementing a method for estimating the quantity of scattered light, according to an example of a particular implementation of the invention
  • FIGS. 6 and 7 illustrate a method for estimating the quantity of scattered light, according to two particular embodiments of the invention.
  • FIG. 1 illustrates a heterogeneous participating medium 10, for example a cloud.
  • a participating medium is a medium, composed of a multitude of particles in suspension, which absorbs, emits and / or diffuses light.
  • In the simplest case, a participating medium only absorbs the light it receives, for example from a light source 11 such as the sun. This means that light passing through the medium 10 is attenuated, the attenuation depending on the density of the medium.
  • The medium is heterogeneous, that is to say that the physical characteristics of the medium, such as the density of the particles composing it, for example, vary from one point to another of the medium.
  • The participating medium is composed of small particles that interact with the light.
  • The incident light, that is to say the light received from the light source 11 along a direction of incidence ω_in 110, is not only absorbed but also scattered.
  • In an isotropically scattering participating medium, the light is scattered uniformly in all directions.
  • In an anisotropically scattering participating medium, such as the cloud 10 illustrated in FIG. 1, the scattering of the light depends on the angle between the direction of incidence ω_in 110 and the scattering direction ω_out 120 of the light.
  • The amount of light scattered at a point M 13 of the medium 10 along the scattering direction ω_out 120 is given by equation 1.
  • The amount of light scattered by a point M 13 of the medium that reaches the eye of a spectator 12 situated at a point C of space along the direction ω_out 120 is then given by equation 2, in which:
  • D(M) is the density of the medium at the point considered, the density varying from one point to another since the medium is heterogeneous;
  • the reduced light intensity at the point M from the direction of incidence ω_in 110 represents the amount of incident light arriving at the point M after attenuation due to the path of the light through the medium 10 along the segment KM, K being the point of intersection between the medium 10 and the incident ray ω_in 110; it is given by equation 3.
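  • As a point of reference, a standard single-scattering formulation consistent with the terms defined above (density D(M), scattering coefficient σ_s, extinction coefficient σ_t, phase function p, and a reduced light intensity written here L_ri) can be sketched as follows. This is a sketch using conventional notation, not a verbatim reproduction of equations 1 to 3 of the application; L_src denotes the source radiance and is notation introduced here.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Hedged sketch of a standard single-scattering model, not a verbatim copy
% of equations 1 to 3 of the application.
\begin{align*}
% light scattered at point M of the medium toward the direction \omega_{out}:
L(M,\omega_{out}) &= \sigma_s \, D(M)\, p\!\left(M,\omega_{out},\omega_{in}\right) L_{ri}(M,\omega_{in}) \\
% reduced light intensity: source radiance attenuated from the boundary point K to M:
L_{ri}(M,\omega_{in}) &= L_{src}\, \exp\!\left(-\int_{K}^{M} \sigma_t\, D(s)\,\mathrm{d}s\right)
\end{align*}
\end{document}
```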
  • Equation 2 calculates the amount of light scattered by a point M and reaching the eye of a viewer 12 located along the direction ω_out.
  • To calculate the quantity of light received by a spectator looking along the direction ω_out, it is then necessary to sum the contributions of all the points of the medium located on the axis ω_out, that is to say the points situated on the segment [P, Mmax], P and Mmax being the two points of intersection between the medium 10 and the direction ω_out 120.
  • The total scattered luminance arriving at P 15 along the direction ω_out 120 due to single scattering is then obtained by integrating the contributions of all the points situated between P and Mmax on a ray having ω_out as its direction.
  • Such an integral equation cannot be solved analytically in the general case, and even less so for a real-time estimation of the amount of scattered light.
  • The integral is instead evaluated numerically using the so-called ray sampling or ray-marching method: the integration domain is discretized into a multitude of intervals of size δ_M, and the integral is approximated by the sum of the contributions of these intervals.
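  • To make the ray-marching evaluation concrete, the following minimal Python sketch discretizes the viewing segment and accumulates single-scattering contributions. It assumes a callable density field, a single distant light and a constant phase-function value; all function and parameter names (transmittance, scattered_radiance, density, and so on) are illustrative and are not taken from the application.

```python
import numpy as np

def transmittance(p, direction, density, sigma_t, dist, steps=32):
    """Attenuation exp(-integral of sigma_t * density ds) over a distance
    `dist` from point p along the unit vector `direction` (ray marching)."""
    dt = dist / steps
    tau = 0.0
    for i in range(steps):
        s = p + (i + 0.5) * dt * direction   # midpoint of the i-th interval
        tau += sigma_t * density(s) * dt
    return np.exp(-tau)

def scattered_radiance(P, w_out, seg_len, density, sigma_s, sigma_t,
                       phase, light_dir, light_radiance, light_dist, steps=64):
    """Single-scattered radiance reaching P along the viewing direction w_out.
    The segment [P, Mmax] of length seg_len is discretized; each sample point M
    scatters attenuated light from the source, and that contribution is itself
    attenuated on its way back from M to P."""
    dt = seg_len / steps
    L = 0.0
    for i in range(steps):
        M = P + (i + 0.5) * dt * w_out
        # reduced light intensity at M (attenuation along the light direction)
        L_ri = light_radiance * transmittance(M, light_dir, density, sigma_t, light_dist)
        # light scattered at M (constant phase value used for simplicity)
        S = sigma_s * density(M) * phase * L_ri
        # attenuation of that scattered light between M and P
        T_view = transmittance(P, w_out, density, sigma_t, (i + 0.5) * dt)
        L += T_view * S * dt
    return L

# Example with a smooth analytic density field:
density = lambda p: float(np.exp(-np.dot(p, p)))
L = scattered_radiance(np.array([0.0, 0.0, -2.0]), np.array([0.0, 0.0, 1.0]), 4.0,
                       density, sigma_s=0.8, sigma_t=1.0, phase=1.0 / (4.0 * np.pi),
                       light_dir=np.array([0.0, 1.0, 0.0]), light_radiance=1.0,
                       light_dist=3.0)
```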
  • the heterogeneous participating medium 10 is a three-dimensional element, shown in two dimensions in FIG. 1 for the sake of clarity.
  • The medium 10 is illuminated by a plurality of light sources, for example 1,000 or 100,000 light sources.
  • FIGS. 2A and 2B illustrate a luminous environment 2 comprising several light sources 23, 24 and 25. Identical reference signs are used for identical elements in FIGS. 2A and 2B.
  • Figure 2A illustrates more particularly two points A 21 and B 22 illuminated by three light sources 23, 24 and 25.
  • The point A 21 is illuminated by the first light source 23 in a direction ω_1A 211, by the second light source 24 in a direction ω_2A 212 and by the third light source 25 in a direction ω_3A 213.
  • The point B 22 is illuminated by the first light source 23 in a direction ω_1B 221, by the second light source 24 in a direction ω_2B 222 and by the third light source 25 along a direction ω_3B 223.
  • The problem posed by such a complex light environment, that is to say one having several light sources, is that estimating the incident light in the medium is very expensive in terms of computation, since the direction of the light between a source and a point of the medium is different for each point of the medium. Indeed, the direction taken by the light emitted by the first source 23 is different for A and B, the direction taken by the light emitted by the second source 24 is different for A and B, and the direction taken by the light emitted by the third source 25 is different for A and B.
  • According to a particular embodiment of the invention, the estimation of the light coming from several distant light sources is carried out using the environment mapping method, as illustrated in FIG. 2B.
  • Rather than considering an exact direction of light between the points A and B on the one hand and the light sources 23, 24, 25 on the other hand (as illustrated in FIG. 2A), the environment mapping method considers that all the light sources 23, 24 and 25 of the environment 2 are located at optical infinity with respect to the points A and B. It is thus possible to consider that the directions taken by the light emitted by a source 23, 24 or 25 are identical whatever the point A or B of the medium considered. The parallax effect due to the distance separating the points A and B is thus neglected.
  • The direction ω_1A 211 connecting the point A to the first light source 23 is thus considered to be identical to the direction ω_1B 221 connecting the point B to the first light source 23.
  • Likewise, the direction ω_2A 212 connecting the point A to the second light source 24 is considered to be identical to the direction ω_2B 222 connecting the point B to the second light source 24, and the direction ω_3A 213 connecting the point A to the third light source 25 is considered to be identical to the direction ω_3B 223 connecting the point B to the third light source 25.
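  • As a small illustration of this simplification (with hypothetical names, not code from the application), the exact approach computes one light direction per pair of point and source, whereas the environment-map assumption computes each light direction once, from a single reference point, and reuses it for every point of the medium:

```python
import numpy as np

def directions_exact(points, light_positions):
    """One normalized direction per (point, light source) pair:
    cost grows with the number of points times the number of sources."""
    dirs = {}
    for i, p in enumerate(points):
        for j, lp in enumerate(light_positions):
            d = lp - p
            dirs[(i, j)] = d / np.linalg.norm(d)
    return dirs

def directions_environment_map(reference_point, light_positions):
    """Distant-light assumption: each source direction is computed once and
    shared by all points of the medium (parallax is neglected)."""
    return [(lp - reference_point) / np.linalg.norm(lp - reference_point)
            for lp in light_positions]
```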
  • The light environment comprises two or more light sources, for example 1,000 or 100,000 light sources.
  • FIG. 3 illustrates a method of estimating the quantity of light scattered by a medium 10, the light coming from a light environment 3 comprising several light sources 31, 32 and 33, according to a particular embodiment of the invention.
  • As described with reference to FIG. 1, the light scattered at a point M 13 of the medium 10 results from the attenuation of the light received by the medium 10 from a light source 11 (or from an environment 3), combined with the scattering, by the medium 10, of this attenuated amount of received light.
  • The term of equation 1 representative of the attenuation of the light received from the light environment 3 within the medium 10 is estimated.
  • To do so, a sphere Ω surrounding the point M 34 is sampled.
  • The attenuation of the light along a path between M and the outside of the medium is estimated using equation 6, which is equivalent to equation 3 and in which:
  • R(M, ω) is the attenuation of the luminous intensity at the point M 13 in a direction ω and represents the amount of incident light arriving at the point M after attenuation;
  • σ_t is the extinction coefficient of the medium, corresponding to the sum of the scattering coefficient σ_s of the medium and the absorption coefficient of the medium;
  • K 35 is the point of intersection between the medium 10 and the outside of the medium 10 along a direction ω starting from the point M.
  • Equation 6 provides the luminous attenuation at a point for a given direction ω.
  • To evaluate it, the integration domain located along the direction of incidence ω is discretized into a series of intervals of size δ, the density varying from one interval to another since the medium 10 is heterogeneous.
  • Applying the ray-marching method, a value of the luminous attenuation at the point M in the direction ω is obtained. This value is stored in a table in a memory associated with a graphics processor (GPU, Graphics Processing Unit).
  • This operation for estimating the light attenuation at the point M 34 is repeated for each direction ω of the sphere Ω of center M, sampled in a set comprising N directions ω starting from the point M, N being any positive natural integer.
  • The attenuation function thus sampled can be projected onto an orthonormal basis of spherical functions: each function of the functional space can be written as a linear combination of basis functions, a basis function being an element of a basis of the functional space. In this decomposition:
  • R(M) is the luminous attenuation function at the point M;
  • Cr_j(M) is the j-th projection coefficient (out of a total of Ne coefficients) associated with the basis function B_j;
  • Cr_j(M) is defined by an integral over the sphere Ω (the projection of the attenuation function onto B_j).
  • The set of Ne basis-function projection coefficients thus calculated is stored in a table in a memory of the GPU.
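  • A minimal sketch of such a projection, in Python, assuming low-order real spherical harmonics as the orthonormal basis (spherical harmonics and spherical wavelets are both mentioned below as possible bases) and a Monte Carlo estimate of the integral over the sampled directions; the names attenuation, sh_basis and project_attenuation are illustrative:

```python
import numpy as np

def sh_basis(direction):
    """First two bands (4 coefficients) of the real spherical harmonics,
    evaluated for a unit direction (x, y, z)."""
    x, y, z = direction
    return np.array([
        0.2820948,        # Y_0^0
        0.4886025 * y,    # Y_1^-1
        0.4886025 * z,    # Y_1^0
        0.4886025 * x,    # Y_1^1
    ])

def project_attenuation(attenuation, directions):
    """Estimate the projection coefficients Cr_j = integral over the sphere of
    R(omega) * B_j(omega), using N uniformly distributed sample directions:
    Cr_j ~ (4*pi / N) * sum_k R(omega_k) * B_j(omega_k)."""
    coeffs = np.zeros(4)
    for w in directions:
        coeffs += attenuation(w) * sh_basis(w)
    return coeffs * (4.0 * np.pi / len(directions))

# The attenuation in any direction can then be reconstructed as
#   R(omega) ~ sum_j Cr_j * B_j(omega),
# which is what the stored coefficient tables make available at render time.
```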
  • The operations described above are repeated for a set of points M of the medium 10. For each point of this set of points, projection coefficients representative of the light attenuation in all directions are thus calculated and recorded in tables called attenuation records.
  • The phase function of the medium 10 can likewise be represented using the orthonormal basis of spherical functions.
  • Advantageously, the phase function of the medium 10 is the same at every point of the medium 10.
  • The corresponding projection is carried out for a set of N directions ω starting from the point M on the sphere Ω.
  • The projection coefficients representative of the phase function of the medium are stored in a table of the memory associated with a GPU. According to one variant, the phase function varies from one point M to another, and the projection coefficients are then calculated for a set of points representative of the medium 10.
  • The function describing the environment map 3, representative of the light incident on the medium 10, is likewise represented using the orthonormal basis of spherical functions.
  • This incident luminance function gives the luminance arriving from a direction of incidence ω_in.
  • the light attenuation function, the phase function and the incident luminance function are represented in one and the same orthonormal basis of spherical functions.
  • Equation 1 describes the quantity of light scattered at a point M 13 of the medium 10 in the scattering direction ω_out 120 for a light environment comprising a single light source 11.
  • The functions R, L and p have each been projected onto the basis of spherical functions, yielding respectively a set of first projection coefficients, a set of second projection coefficients and a set of third projection coefficients.
  • the projection coefficients are respectively noted
  • Since the projection coefficients describing the light attenuation function have been estimated for a set of points of the medium 10 and not for all the points of the medium, the projection coefficients of R for the points at which they have not been estimated via equation 8 are calculated by interpolation, as described with reference to FIG. 4.
  • Using the first projection coefficients, representative of the attenuation in the medium of the light coming from the light environment (estimated via equation 8 described with reference to FIG. 2 and by interpolation as described with reference to FIG. 4), the second projection coefficients, representative of the incident luminance, and the third projection coefficients, representative of the phase function of the medium 10, it is possible to estimate analytically the light scattered at a point M as it is received by a viewer 36, the computing resources needed being much lower than those required to solve the equations in their integral form.
  • Equation 12 represents the amount of light emitted (scattered) by a point of the medium toward the viewer, expressed using these projection coefficients.
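  • The analytic evaluation that equations 12 and 13 allow rests on a standard property of an orthonormal basis: the integral of the product of two functions expressed in that basis reduces to a dot product of their coefficient vectors. The sketch below states this identity in generic notation (f, g and their coefficients are placeholders, not the application's symbols):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% For an orthonormal basis \{B_j\} of spherical functions, with
% f(\omega) \approx \sum_j f_j B_j(\omega) and g(\omega) \approx \sum_j g_j B_j(\omega):
\begin{equation*}
\int_{\Omega} f(\omega)\, g(\omega)\, \mathrm{d}\omega \;\approx\; \sum_{j} f_j\, g_j ,
\qquad \text{since} \qquad
\int_{\Omega} B_i(\omega)\, B_j(\omega)\, \mathrm{d}\omega = \delta_{ij}.
\end{equation*}
% Applied here, the spherical integral over the incident directions can be
% evaluated from the stored projection coefficients without numerical
% integration over directions at render time.
\end{document}
```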
  • FIG. 5 schematically illustrates an example of a hardware embodiment of a device 5 adapted to the estimation of the quantity of light diffused by a heterogeneous participant medium 10 and the creation of display signals of one or more images.
  • The device 5 corresponds, for example, to a personal computer (PC), a laptop or a games console.
  • the device 5 comprises the following elements, interconnected by an address and data bus 45 which also carries a clock signal:
  • a microprocessor 51 (or CPU);
  • a graphics card 52 comprising in particular several graphics processing units (GPUs) 520 and a graphics random access memory (GRAM) 521;
  • input/output (I/O) devices 54, such as for example a keyboard, a mouse or a webcam; and
  • The device 5 also comprises a display device 53, of the display-screen type, connected directly to the graphics card 52 in order to display, in particular, the rendering of synthesis images computed and composed in the graphics card, for example in real time.
  • The use of a dedicated bus to connect the display device 53 to the graphics card 52 offers the advantage of much higher data transmission rates, thus reducing the latency for displaying images composed by the graphics card.
  • According to one variant, the display apparatus is external to the device 5 and is connected to the device 5 by a cable transmitting the display signals.
  • In that case, the device 5, for example the graphics card 52, comprises a transmission means or connector (not shown in FIG. 5) adapted to transmit a display signal to an external display means such as, for example, an LCD or plasma screen or a video projector.
  • It is noted that the word "register" used in the description of the memories 52, 56 and 57 designates, in each of the memories mentioned, both a memory zone of low capacity (a few binary data) and a memory zone of large capacity (enabling the storage of a whole program or of all or part of the data representative of computed data or data to be displayed).
  • the microprocessor 51 loads and executes the instructions of the program contained in the RAM 57.
  • The RAM 57 comprises in particular:
  • parameters representative of the heterogeneous participating medium 10, for example density parameters, light absorption coefficients and light scattering coefficients.
  • The algorithms implementing the steps of the method of the invention described hereafter are stored in the GRAM 521 memory of the graphics card 52 associated with the device 5 implementing these steps.
  • The graphics processors 520 of the graphics card 52 load these parameters into the GRAM 521 and execute the instructions of these algorithms in the form of "shader"-type microprograms written, for example, in HLSL (High Level Shader Language) or GLSL (OpenGL Shading Language).
  • The GRAM 521 comprises in particular:
  • first projection coefficients 5211 representative of the reduced luminous intensity at each point of the medium 10; luminous intensity reduction values 5212 for each point of the medium 10;
  • second projection coefficients 5213 representative of the incident luminance and third projection coefficients 5214 representative of the phase function of the medium 10;
  • values 5215 representative of the amount of light scattered by the medium 10 in one or more observation directions.
  • According to one variant, a part of the RAM 57 is allocated by the CPU 51 to store the coefficients 5211, 5213 and 5214 and the values 5212 and 5215 if the memory space available in the GRAM 521 is insufficient.
  • This variant however results in longer latency in the composition of an image comprising a representation of the medium 10 composed from the microprograms contained in the GPUs, since the data must be transmitted from the graphics card to the random access memory 57 via the bus 55, whose transmission capacity is generally lower than that available within the graphics card for passing data between the GPUs and the GRAM and vice versa.
  • FIG. 6 illustrates a method of estimating the quantity of light scattered by a heterogeneous participating medium implemented in a device 5, according to a first particularly advantageous and non-limiting implementation example of the invention.
  • During an initialization step, the various parameters of the device 5 are updated.
  • In particular, the parameters representative of the heterogeneous participating medium are initialized in any manner.
  • Then, during a step 61, first projection coefficients in an orthonormal basis of spherical functions are estimated, these first projection coefficients being representative of the reduction in luminous intensity at a given point of the heterogeneous participating medium 10.
  • a plurality of values representative of the reduction of the light intensity are calculated for the given point M 34 of the medium 10 for a plurality of given incidence directions.
  • Each value representative of the reduction in luminous intensity corresponds to a given direction of incidence among the plurality of incidence directions.
  • Each value representative of the reduction in luminous intensity at the point M 34 is calculated using any sampling method known to those skilled in the art, advantageously using the so-called ray-marching algorithm.
  • The plurality of incidence directions ω (or ω_in) for which the values representative of the reduction in luminous intensity at the point M are calculated form a sphere Ω having M as its center.
  • The number of directions ω for which these values are calculated is chosen so as to find the best compromise between the computing power necessary to calculate them and the accuracy of the estimation of the reduction in luminous intensity at the point M desired by a user of the device 5. Choosing a number of directions ω amounts to sampling the sphere Ω having M as its center.
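  • One possible way of sampling N directions on the sphere Ω centered on M is a Fibonacci spiral, which spreads the directions approximately evenly; this is only an illustrative choice, as the application does not prescribe a particular sampling scheme:

```python
import numpy as np

def fibonacci_sphere_directions(n):
    """Return n unit vectors spread approximately uniformly over the sphere."""
    golden_angle = np.pi * (3.0 - np.sqrt(5.0))
    dirs = []
    for k in range(n):
        z = 1.0 - 2.0 * (k + 0.5) / n            # evenly spaced in [-1, 1]
        r = np.sqrt(max(0.0, 1.0 - z * z))
        phi = golden_angle * k
        dirs.append(np.array([r * np.cos(phi), r * np.sin(phi), z]))
    return dirs

# e.g. 128 directions, usable both for ray-marching the attenuation values
# and for the projection onto the spherical basis functions
directions = fibonacci_sphere_directions(128)
```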
  • step 61 is repeated for each point of a first set of points (for example 50, 100 or 1000 points) representative of the medium 10.
  • First projection coefficients representative of the reduction in luminous intensity are thus obtained at each of the points of the first set of points.
  • A set of first projection coefficients corresponds to one point of the medium, and there are as many sets of first projection coefficients calculated as there are points in the first set of points representative of the medium 10.
  • To obtain the first projection coefficients at the points of the medium that do not belong to the first set of points, an interpolation of the first calculated projection coefficients is used.
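  • The application does not detail the interpolation scheme; when the first set of points forms a regular 3D grid, a common choice is trilinear interpolation of the per-point coefficient vectors, sketched below with illustrative names (grid, grid_min, cell_size):

```python
import numpy as np

def interpolate_coefficients(grid, point, grid_min, cell_size):
    """Trilinear interpolation of projection-coefficient vectors stored on a
    regular grid. `grid` has shape (nx, ny, nz, n_coeffs), `point` is a 3D
    position and `grid_min` (a 3D array) is the position of cell (0, 0, 0)."""
    f = (np.asarray(point, dtype=float) - grid_min) / cell_size
    i0 = np.clip(np.floor(f).astype(int), 0, np.array(grid.shape[:3]) - 2)
    t = f - i0                                   # fractional position in the cell
    coeffs = np.zeros(grid.shape[3])
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((t[0] if dx else 1.0 - t[0]) *
                     (t[1] if dy else 1.0 - t[1]) *
                     (t[2] if dz else 1.0 - t[2]))
                coeffs += w * grid[i0[0] + dx, i0[1] + dy, i0[2] + dz]
    return coeffs
```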
  • Then, during a step 62, the amount of light scattered by the medium 10 along an emission direction 120 is estimated by using the first projection coefficients estimated previously.
  • To do so, the line segment corresponding to the intersection of the emission direction 120 with the medium 10, that is to say the segment [P, Mmax], is discretized spatially into a multitude of points or elementary portions representative of this segment.
  • For each of these points, equation 12 is applied using the first projection coefficients estimated previously.
  • The ray-marching method is used to estimate the reduction in luminous intensity between a point of the segment considered and the point P located at the periphery of the medium 10 in the emission direction 120.
  • The use of first projection coefficients representative of the reduction in luminous intensity at points of the medium makes it possible to simplify the calculations to be carried out while providing a realistic estimate of the reduction in luminous intensity in a heterogeneous medium. No pre-computation is needed to render the scattering of light in a heterogeneous participating medium, which allows the real-time rendering of such media in interactive applications, for example of the video-game type, in which the user moves virtually through a space comprising one or more heterogeneous participating media.
  • the quantity of light diffused by the medium 10 is estimated for several directions of emission. By summing these quantities of light estimated for a plurality of transmission directions, the total amount of light diffused by the medium 10 and perceived by a spectator observing the medium 10 is obtained.
  • Steps 61 and 62 are advantageously reiterated as a spectator 12 moves around the medium 10, the image forming the rendering of the medium 10 being recomposed for each elementary movement of the viewer 12 around the medium 10.
  • FIG. 7 illustrates a method of estimating the amount of light diffused by a heterogeneous participating medium implemented in a device 5 according to a second particularly advantageous nonlimiting implementation example of the invention.
  • the various parameters of the device 5 are updated.
  • In particular, the parameters representative of the heterogeneous participating medium are initialized in any manner.
  • During steps 71 and 72, light intensity reduction values are estimated at a plurality of points of the medium 10 and then first projection coefficients are estimated, in the same manner as described with respect to step 61 of FIG. 6. Steps 71 and 72 are therefore not detailed again here.
  • Then, during a step 73, second projection coefficients representative of the incident luminance are estimated for a set of points of the light environment 3, the points of this set being representative of the light environment.
  • To do so, the function describing the environment map 3 (called the incident luminance function), representative of the light incident on the medium 10, is represented by using the orthonormal basis of spherical functions used to represent the light attenuation function.
  • The function representative of the environment map 3 is thus projected onto a set of second projection coefficients in the basis of spherical functions.
  • these second coefficients are stored in a table of a memory associated with one or more GPUs of the graphics card of the device 5.
  • the second projection coefficients are advantageously identical at any point in the light environment. According to one variant, the second projection coefficients vary from one point to another or from one set of points to another belonging to the luminous environment.
  • Then, during a step 74, third projection coefficients representative of the phase function of the medium 10 are estimated for a second set of points of the medium 10, the points of this set being representative of the medium 10.
  • To do so, the phase function of the medium 10 is represented by using the orthonormal basis of spherical functions used to represent the light attenuation function and the incident luminance function.
  • The phase function is thus projected onto a set of third projection coefficients in the basis of spherical functions.
  • These third coefficients are stored in a table of a memory associated with one or more GPUs of the graphics card of the device 5.
  • the third projection coefficients are advantageously identical at every point in the medium. According to one variant, the third projection coefficients vary from one point to another or from one set of points to another belonging to the medium 10.
  • According to one variant, steps 73 and 74 are executed before steps 71 and 72. According to another variant, steps 72, 73 and 74 are executed simultaneously, for example in parallel on dedicated GPUs.
  • The previously estimated first projection coefficients are then recorded and stored in a data structure composed of tables stored in a memory associated with the GPUs. These records are called attenuation records.
  • The attenuation record tables advantageously comprise all the first projection coefficients of the points of the medium 10, whether these first coefficients have been calculated from equation 8 or by interpolation from the first coefficients calculated via equation 8. There exists a set of first projection coefficients per point of the medium 10, or alternatively per set of points representative of the medium 10. Such a storage of the first projection coefficients offers the advantage of accelerating the calculations, the first projection coefficients representative of the reduction in incident light intensity being available at any time and immediately for use in equations 12 and 13.
  • Likewise, the recording of the second projection coefficients and of the third projection coefficients in dedicated tables accelerates the computations, especially when the second and third coefficients vary from one point to another (of the light environment 3 and of the medium 10, respectively).
  • Finally, during a step 76, the quantity of scattered light is estimated in the same manner as described with regard to step 62 of FIG. 6.
  • the invention is not limited to the embodiments described above.
  • the invention is not limited to a method for estimating the amount of light diffused by a heterogeneous participating medium but also extends to any device implementing this method and in particular all the devices comprising at least one GPU.
  • The implementation of the equations described with reference to FIGS. 1 to 4 for the estimation of the first, second and third projection coefficients, of the reduction in luminous intensity and of the quantity of scattered light is likewise not limited to an implementation in shader-type microprograms but also extends to an implementation in any type of program, for example programs executable by a CPU-type microprocessor.
  • Advantageously, the basis functions used for estimating the projection coefficients are spherical harmonic type functions or spherical wavelet type functions.
  • The use of the invention is not limited to real-time use but also extends to any other use, for example to so-called post-production processing in a recording studio for the rendering of synthesis images.
  • The implementation of the invention in post-production offers the advantage of providing an excellent visual rendering in terms of realism, in particular, while reducing the required computation time.
  • The invention also relates to a method for composing a two-dimensional or three-dimensional video image in which the amount of light scattered by a heterogeneous participating medium is calculated, and the resulting information representative of the luminance is used for displaying the pixels of the image, each pixel corresponding to an observation direction ω_out.
  • The luminance value calculated for display by each pixel of the image is recalculated to adapt to the different viewpoints of the viewer.
  • The present invention can be used in video game applications, for example, whether via programs executable on a PC or portable computer or via specialized games consoles producing and displaying images in real time.
  • The device 5 described with reference to FIG. 5 is advantageously provided with interaction means such as a keyboard and/or a joystick; other command input modes, such as for example voice recognition, are also possible.

Landscapes

  • Chemical & Material Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Biochemistry (AREA)
  • Analytical Chemistry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Dispersion Chemistry (AREA)
  • Computer Graphics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)
EP10775825A 2009-11-16 2010-11-09 Verfahren zur schätzung von lichtstreuung Withdrawn EP2502207A2 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0958071A FR2948800A1 (fr) 2009-11-16 2009-11-16 Procede d'estimation de diffusion de la lumiere
PCT/EP2010/067108 WO2011058007A2 (fr) 2009-11-16 2010-11-09 Procede d'estimation de diffusion de la lumiere

Publications (1)

Publication Number Publication Date
EP2502207A2 true EP2502207A2 (de) 2012-09-26

Family

ID=42562454

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10775825A Withdrawn EP2502207A2 (de) 2009-11-16 2010-11-09 Verfahren zur schätzung von lichtstreuung

Country Status (4)

Country Link
US (1) US8842275B2 (de)
EP (1) EP2502207A2 (de)
FR (1) FR2948800A1 (de)
WO (1) WO2011058007A2 (de)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130100135A1 (en) * 2010-07-01 2013-04-25 Thomson Licensing Method of estimating diffusion of light
TW201401225A (zh) 2012-06-22 2014-01-01 Thomson Licensing 異質參與媒體之一點所接收光量之估計方法及所構成裝置
US10395422B2 (en) * 2017-10-13 2019-08-27 Activision Publishing, Inc. Participating media baking

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8009168B2 (en) 2007-06-26 2011-08-30 Microsoft Corporation Real-time rendering of light-scattering media
US7940268B2 (en) * 2007-06-29 2011-05-10 Microsoft Corporation Real-time rendering of light-scattering media
US7940269B2 (en) * 2007-06-29 2011-05-10 Microsoft Corporation Real-time rendering of light-scattering media
US20130100135A1 (en) * 2010-07-01 2013-04-25 Thomson Licensing Method of estimating diffusion of light
TW201401225A (zh) * 2012-06-22 2014-01-01 Thomson Licensing 異質參與媒體之一點所接收光量之估計方法及所構成裝置

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
None *
See also references of WO2011058007A2 *

Also Published As

Publication number Publication date
FR2948800A1 (fr) 2011-02-04
WO2011058007A3 (fr) 2013-04-18
WO2011058007A2 (fr) 2011-05-19
US20120218549A1 (en) 2012-08-30
US8842275B2 (en) 2014-09-23

Similar Documents

Publication Publication Date Title
EP1982310B1 (de) Verfahren zur synthetisierung eines virtuellen bildes durch strahlabgabe
FR2965652A1 (fr) Procede d’estimation de la quantite de lumiere recue en un point d’un environnement virtuel
FR2988891A1 (fr) Procede d'estimation de niveau d'opacite dans une scene et dispositif correspondant
US20140313198A1 (en) System, method, and computer program product for performing path space filtering
EP1527599A2 (de) Verfahren und system zur echt-zeit mischung von synthesebildern und videobildern
WO2019001968A1 (fr) Procede de generation numerique d'un hologramme, dispositif, equipement terminal, systeme et programme d'ordinateur associes
FR2966623A1 (fr) Procede d’estimation de l’occultation dans un environnement virtuel
EP2504816B1 (de) Verfahren zur schätzung von lichtstreuung
Delalandre et al. Transmittance function mapping
FR2964775A1 (fr) Procede d'estimation de l'occultation dans un environnement virtuel
KR20140000170A (ko) 관여 매질에 의해 수광된 광의 양을 추정하기 위한 방법 및 대응하는 장치
WO2011057997A2 (fr) Procede d'estimation de diffusion de la lumiere
FR2988502A1 (fr) Procede pour representer un milieu participant dans une scene et dispositif correspondant
EP2502207A2 (de) Verfahren zur schätzung von lichtstreuung
EP2589025A2 (de) Verfahren zur schätzung der lichtstreuung
FR2974217A1 (fr) Procede d’estimation d’une information representative d’une hauteur
FR3083950A1 (fr) Procede de visualisation d’elements graphiques issus d’un flux video composite encode
Elek et al. Spectral ray differentials
FR2964776A1 (fr) Procede d’estimation de diffusion de la lumiere dans un milieu homogene
EP2987319A1 (de) Verfahren zur erzeugung eines ausgabevideodatenstroms aus einem breitfeldvideodatenstrom
WO2010094619A1 (fr) Procede d'estimation de diffusion de la lumiere
FR2948799A1 (fr) Procede d'estimation de diffusion de la lumiere
FR2960677A1 (fr) Procede d’estimation de la quantite de lumiere recue en un point d’un environnement virtuel
Liu et al. A Simulation System for Scene Synthesis in Virtual Reality
Chariot Quelques applications de la programmation des processeurs graphiques à la simulation neuronale et à la vision par ordinateur

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20120605

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
R17D Deferred search report published (corrected)

Effective date: 20130418

17Q First examination report despatched

Effective date: 20190103

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20190425