EP2107530B1 - Rendern von Streueffekten in durchscheinenden Objekten - Google Patents

Rendern von Streueffekten in durchscheinenden Objekten

Info

Publication number
EP2107530B1
EP2107530B1 (Application EP09250808.4A)
Authority
EP
European Patent Office
Prior art keywords
points
point
light
scene
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP09250808.4A
Other languages
English (en)
French (fr)
Other versions
EP2107530A1 (de)
Inventor
Bruce Nunzio Tartaglia
Alexander P. Powell
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DreamWorks Animation LLC
Original Assignee
DreamWorks Animation LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DreamWorks Animation LLC filed Critical DreamWorks Animation LLC
Publication of EP2107530A1
Application granted
Publication of EP2107530B1
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/50 - Lighting effects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00 - Indexing scheme for image generation or computer graphics
    • G06T2210/62 - Semi-transparency

Definitions

  • This application relates to the electronic rendering of images and movies in general and more specifically to the shading of translucent objects.
  • Computer graphics are used in various fields, such as animation, computer games, etc.
  • An important goal of computer graphics is the ability to generate virtual images that are realistic and/or esthetic.
  • A major impediment to reaching this goal is the inherent computational limitations of existing computer systems.
  • Direct light is light that travels directly from a light source to the object and to the observer.
  • Indirect light is light that does not travel directly from a light source to the object.
  • Indirect light may be light that is reflected from another object before it reaches the object being observed.
  • Indirect light may also be light that passes through a translucent object before reaching the object being observed.
  • A translucent object is an object that allows light to pass through it and modifies the light as it does so.
  • Seeing a translucent object usually involves seeing at least some of the light that passes through the object.
  • Light that passes through an object is not considered direct light by standard direct light models and is thus not rendered by systems employing these models.
  • Consequently, systems utilizing standard direct light models usually render translucent objects as being completely opaque. This is naturally undesirable.
  • Some direct light systems render translucent objects by performing some type of ray tracing for these objects.
  • In other words, these direct light systems depart from the direct light model when rendering translucent objects in order to correctly convey their translucency. This is done under the assumption that translucent objects are relatively rare, so that the additional computational cost of performing ray tracing for these objects is not too high.
  • However, this method cannot be efficiently employed in scenes in which large numbers of translucent objects exist.
  • U.S. Patent Application Ser. No. 11/975,031 entitled “SHADING OF TRANSLUCENT OBJECTS” and filed on October 16, 2007 discloses methods and systems for providing translucent illumination by utilizing depth maps.
  • The '031 application allows for the use of a relatively computationally efficient direct light model to provide authentic illumination for translucent objects. While the methods and systems of that application are generally quite effective, they do not take into account one particular quality of translucent objects: subsurface scattering. Subsurface scattering refers to the fact that the color of a particular point in a translucent object is often related to the illumination received by other nearby points. This phenomenon is explained in more detail below.
  • US 2004/150642 A1 discloses a computationally efficient method for rendering skin tissue to achieve lifelike results, including application of a blurring algorithm to a two-dimensional light map.
  • The algorithm is not derived from complex mathematical models of sub-surface scattering in translucent materials.
  • The method includes receiving three-dimensional surface geometry relating to a digital object and other information for defining modeled light reflected from the surface, generating a two-dimensional matrix of light intensity values mapped to the surface geometry, blurring the matrix using a compact algorithm, and rendering the object using the blurred light intensity values.
  • DACHSBACHER C ET AL, "Translucent shadow maps", EUROGRAPHICS SYMPOSIUM ON RENDERING, JUNE 25-27, 2003, LEUVEN, BELGIUM, ACM, NEW YORK, NY, USA, 25 June 2003, discloses a method involving an extension to shadow maps, referred to as translucent shadow maps, which allow very efficient rendering of sub-surface scattering.
  • The translucent shadow maps contain depth and incident light information.
  • Sub-surface scattering is computed on the fly during rendering by filtering the shadow map neighborhood.
  • The idea of translucent shadow maps is to extend a shadow map so that all information needed to compute translucency is available.
  • A pixel in the translucent shadow map stores not only the depth, and thus the 3D position of the sample, but also the irradiance entering the object at that position and the surface normal.
  • Embodiments of the present invention are directed to modifying an existing scheme for providing translucent illumination in order to take account of subsurface scattering.
  • The color of a selected point of a translucent object is determined using existing methods.
  • The existing methods may be, for example, the methods discussed in the '031 application mentioned above.
  • The existing methods need not take subsurface scattering into account at all.
  • Alternatively, the existing methods may take subsurface scattering into account to a certain extent, but not adequately.
  • In addition, a contribution to the color at the selected point due to subsurface scattering may be calculated.
  • The contribution due to subsurface scattering may be calculated based on a photon map. Entries of the photon map representing points close to the selected point can be used to provide an estimate of the contribution of subsurface scattered light. In addition, entries of the photon map representing points close to a point or points at which light that reaches the selected point enters the translucent object can also be taken into account when calculating the contribution associated with subsurface scattering. Naturally, multiple selected points can be processed in this manner in order to render a scene.
  • Embodiments of the present invention also include the use of different types of photon maps.
  • For example, a standard photon map may be used.
  • In a standard photon map, each entry may be associated with a respective set of coordinates.
  • Alternatively, a photon map may be defined in a manner similar to a depth map.
  • In that case, the entries of the photon map may be defined in terms of an angle from a light source and a distance between an object's surface and the light source.
  • Fig. 1 is an illustration of subsurface scattering in a translucent object.
  • Fig. 2 is another illustration of subsurface scattering in a translucent object.
  • Fig. 3 is a diagram showing an exemplary method of utilizing embodiments of the present invention.
  • Fig. 4 is a diagram showing the generation of a photon map according to some embodiments of the invention.
  • Fig. 5 is a visual representation of data stored in an exemplary photon map according to some embodiments of the invention.
  • Fig. 6 is a diagram illustrating the second order color determination for a particular pixel according to some embodiments of the invention.
  • Fig. 7 is a diagram illustrating another second order color determination scheme according to some embodiments of the invention.
  • Fig. 8 is a diagram showing an alternative way of storing the photon map according to some embodiments of the invention.
  • Fig. 9 is a diagram of an exemplary distributed system according to some embodiments of the present invention.
  • A "ray of light" is an abstraction used for computational purposes. It signifies a line in space along which light travels, and need not be associated with a single photon.
  • The number of rays of light emitted by a single light source may vary and depends on the desired resolution of the scene. However, this number is usually much smaller than the number of actual photons emitted by a real source of light.
  • Embodiments of the present invention are directed to modifying an existing scheme for providing translucent illumination (such as, for example, the scheme of the '031 application) in order to take account of subsurface scattering.
  • An example of subsurface scattering is illustrated in Fig. 1.
  • Fig. 1 shows a translucent object 100 viewed by camera 110.
  • Light ray 102, generated by light source 101, may travel through the object and reach surface point 103.
  • Many existing models (such as for example the modified direct illumination model of the '031 application) would calculate the color of point 103 solely based on the illumination of ray 102 as well as any other rays that directly illuminate point 103.
  • However, rays that illuminate points in proximity to point 103 may also affect the color of point 103.
  • Consider, for example, ray 121 of light source 120 and ray 131 of light source 130. For the purposes of Fig. 1 only, it may be assumed that light sources 130 and 120 do not directly illuminate point 103. Instead, ray 131 directly illuminates point 132 while ray 121 directly illuminates point 122. Each of these rays creates a reflection from the surface of the object (rays 123 and 133, respectively).
  • Because object 100 is translucent, some of the energy of both rays can penetrate the surface of the object and propagate within it. There, the light can travel along various paths within the object (it can also split up further, etc.) depending on the optical properties of the object's inner material.
  • Rays 124 and 134 are example light ray paths that can be formed from rays 121 and 131, respectively. Inner rays such as these can travel through the object along various paths and exit the surface at point 103. Thus, they can contribute to the color at point 103. Therefore, the color of point 103 can depend upon light that initially illuminates the object at points other than point 103. This phenomenon is referred to as subsurface scattering.
  • Subsurface scattering is most notable for points that are in close proximity. Thus, rays that illuminate the object at points relatively close to point 103 are likely to make the most significant contributions to any changes in the color of point 103 due to subsurface scattering.
  • Fig. 2 shows another example of subsurface scattering.
  • In Fig. 2, light source 101 again illuminates object 100.
  • Light ray 102 again passes through object 100 and reaches point 103 at its opposite side. It is also noted that light ray 102 enters the object at point 205.
  • Two additional light rays from light source 101 (light rays 200 and 210) are also considered. These rays reach the surface of object 100 at points 201 and 211, respectively. The rays then continue through the object as rays 202 and 212, respectively. However, some of the energy of each ray may be refracted from the surface into rays 204 and 214, respectively. These rays may reach the other end of the object at point 103, the same point at which ray 102 reaches the end of the object.
  • Thus, the illumination at point 103 depends on rays that would not be considered to reach point 103 by most direct light illumination models (including the model of the '031 application). It should be noted that in this example the illumination at point 103 does not depend on light that enters the object close to it (as in Fig. 1), but on light that enters the object close to the point where ray 102 enters the object (i.e., point 205). More specifically, light rays 200 and 210, which enter the object at points 201 and 211, close to point 205, affect the illumination of point 103.
  • Embodiments of the present invention provide a relatively low cost modification of existing direct light models (such as, for example, the model of the '031 application) which allows the effects of subsurface scattering to be rendered. Additionally, embodiments of the present invention may also be used with other types of models that do not adequately take subsurface scattering into account.
  • Fig. 3 is a diagram of an exemplary method of utilizing embodiments of the present invention.
  • At step 300, the color at a particular pixel in a translucent object is calculated utilizing existing techniques. These may include various direct illumination methods, such as the direct illumination methods described in the '031 application.
  • The techniques used in step 300 may be techniques that do not take subsurface scattering into account (as this may be calculated by embodiments of the present invention). Alternatively, the techniques of step 300 may take subsurface scattering into account to some extent, but not to an extent that is considered sufficient or desirable. Step 300 may be referred to as the determination of the first order color value.
  • At step 302, the color contribution at the particular pixel of step 300 due to subsurface scattering is calculated. The details of this calculation are discussed below. This may be referred to as the subsurface scattering color determination, or the second order color determination.
  • At step 304, the colors of steps 300 and 302 are added to obtain a final color for the pixel.
  • The order of steps 300 and 302 may be reversed. In some embodiments, steps 300 and 302 may calculate illumination levels instead of final colors.
  • The value obtained at step 300 is referred to as the first order color, because it is usually intended to provide a first approximation of the correct color.
  • The subsurface scattering calculation is intended to provide a correction to that first approximation that takes subsurface scattering into account.
  • Adding colors refers to adding their component values.
  • A color can be defined by a plurality of component values (usually three), each of which defines the brightness of one of a plurality of component colors (usually red, green and blue) that combine to form the color. Adding two colors simply involves adding their respective component values, as illustrated in the sketch below.
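  • The following is a minimal, hypothetical sketch (not code from the patent) of the color arithmetic described above: colors are represented as (R, G, B) tuples, two colors are added component-wise, and a first order color is combined with a second order (subsurface scattering) contribution as in step 304. All names and values are illustrative only.

```python
# Colors are plain (R, G, B) tuples; the helper names are illustrative.

def add_colors(a, b):
    """Add two colors by adding their respective component values."""
    return tuple(x + y for x, y in zip(a, b))

def scale_color(c, s):
    """Multiply a color by a scalar (useful later for decay functions)."""
    return tuple(x * s for x in c)

# Step 304: final color = first order color + second order (subsurface) color.
first_order = (0.40, 0.25, 0.20)   # from an existing direct illumination method
second_order = (0.05, 0.02, 0.01)  # from the subsurface scattering calculation
final_color = add_colors(first_order, second_order)
```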
  • Embodiments of the invention utilize a data structure referred to as a photon map in order to calculate the subsurface scattering contribution.
  • Fig. 4 is a diagram showing the generation of a photon map according to exemplary embodiments of the invention.
  • Fig. 4 shows a scene similar to that of Fig. 1 . Again, light sources 101, 120 and 130 illuminate object 100.
  • The photon map is generated from the point of view of the light source. Accordingly, for each light source, a plurality of light rays emitted by the light source is traced. For example, light rays 400 may be traced for light source 101. Naturally, the traced light rays may encompass a three dimensional space.
  • The light rays that are traced may include rays that encompass the entire angular range of light produced by the light source. Light rays within certain predefined angular intervals of each other may be traced. The angular intervals may be based on a desired precision or granularity (e.g., smaller intervals result in more light rays and higher precision and granularity). In some embodiments, the angular intervals may depend upon the distance between the light source and an object (e.g., object 100) in order to ensure a desired spatial frequency of intersections of traced light rays with the object's surface.
  • The traced light rays may be traced to the first surface they intersect.
  • Thus, light rays 400 may be traced to object 100.
  • Traced light rays that are found not to intersect the surface of a translucent object are ignored and no further processing is performed for them (as embodiments of the present invention are centered on rendering translucent objects).
  • The light ray tracing step may be performed together with depth map generation in order to improve efficiency. Depth map generation also usually requires that the light rays of each light source be traced to the first surface they intersect.
  • For each traced light ray that intersects the surface of a translucent object, the location of the point where that light ray intersects the surface is determined. These may include locations 401, 402, 403, etc. The location of each such intersection point is saved in a memory structure referred to as a photon map.
  • In addition, the color of each intersection point is determined.
  • The color may be determined based on the intensity and color of the illumination provided by the traced light ray, the surface color of the illuminated surface, and the angle of incidence of the light ray on the surface. In other words, the color may be determined based on the known formula for determining the color of a surface point directly illuminated by a known light ray.
  • The color of each point is stored in the photon map along with its location. If a particular point is illuminated by two or more light rays, the color stored may be the color resulting from the combined illumination of all illuminating rays. A simplified sketch of this process is given below.
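  • The following self-contained sketch illustrates, under simplifying assumptions, how such a photon map could be built: a single point light traces rays at regular angular intervals toward a translucent sphere and stores, for each hit, the intersection point and a color. The choice of a sphere, the ray-sphere intersection routine, and the simplified shading (storing the raw light color rather than the full directly lit surface color) are illustrative assumptions, not the patent's method.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest intersection point of a ray with a sphere, or None.
    The direction vector is assumed to be normalized."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    if t <= 0.0:
        return None
    return tuple(origin[i] + t * direction[i] for i in range(3))

def generate_photon_map(light_pos, light_color, sphere_center, sphere_radius, steps=64):
    """Trace rays at regular angular intervals from a point light and store
    (position, color) for every hit on the translucent sphere's surface."""
    photon_map = []
    for i in range(steps):
        for j in range(steps):
            theta = math.pi * i / steps        # polar angle of the traced ray
            phi = 2.0 * math.pi * j / steps    # azimuthal angle of the traced ray
            d = (math.sin(theta) * math.cos(phi),
                 math.sin(theta) * math.sin(phi),
                 math.cos(theta))
            hit = ray_sphere_hit(light_pos, d, sphere_center, sphere_radius)
            if hit is not None:
                photon_map.append((hit, light_color))  # store location and color
    return photon_map
```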
  • Fig. 5 is a visual representation of the data stored by an exemplary photon map.
  • The photon map may define the three dimensional positions of a plurality of points 500 as well as their colors.
  • For example, the photon map of Fig. 5 may define a first group of points 500 that are part of a sphere, a second group of points 501 that are part of a cube, etc.
  • In other words, a single photon map may include points from the surfaces of several objects.
  • Fig. 6 is a diagram illustrating the process of determining the second order color for a particular pixel using the photon map.
  • Pixel 600 may be a pixel on the surface of translucent object 100 that is being rendered.
  • To determine the second order color, the photon map is examined to select a set of points within the photon map that are within a predefined threshold distance r of pixel 600.
  • Distance r may be measured along the surface of object 100. Alternatively, distance r may be measured in all three dimensions. Thus, the points of the photon map that are within a sphere 601 having radius r and pixel 600 as its center may be selected.
  • The distance d_i between each selected point and pixel 600 can be measured. While only one d_i (the one for point 602) is shown in Fig. 6 for clarity, a d_i may exist for each selected point 602-606, the subscript "i" indicating a particular selected point.
  • Then, the colors of the various points may be integrated. If each point has a color c_i, then the integration can be defined as:
    C_sec = Σ_i D(c_i, d_i)     (EQ1)
  • The result C_sec is the second order color for pixel 600. In other words, it is a color which may be used to modify an initial color for pixel 600 in order to take into account light received at pixel 600 due to subsurface scattering.
  • The summation is performed over all selected points (i.e., over all points within radius r).
  • The function D(x,y) is a decay function that decays (i.e., decreases the intensity of) color x based on number y and returns a color. Thus, the higher the value of y, the less intense the returned color.
  • The decay function may be one of many known decay functions. For example, it may be a linear, parabolic, Gaussian, etc. decay function. It may also be one of the various decay functions discussed in the '031 application.
  • A color may be multiplied by a scalar (for the purposes of calculating a decay function, for example) by multiplying each of the three component values of the color by the scalar. Adding colors may be performed by adding their component values (as discussed above).
  • Thus, the second order color may be obtained for a pixel by adding the decayed values of the colors of all points in the photon map within a predefined radius of the pixel.
  • This may be a suitable and efficient representation of the subsurface scattered light that may reach pixel 600.
  • The colors of the various selected points 602-606 may represent the colors of various rays that may be scattered below the surface of translucent object 100.
  • The decay of these colors may represent the decay these rays experience before they reach pixel 600.
  • The second order color thus obtained may be added to a first order color obtained by a known method in order to take subsurface scattering into account.
  • The second order color may be further modified (e.g., it may be scaled) before being added to the first order color. A sketch of this computation is given below.
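  • The following sketch shows one way the integration of EQ1 could be implemented, assuming the photon map is a list of (position, color) entries as in the earlier sketch and using a simple linear decay as the function D. The linear decay and all names are illustrative assumptions; any of the decay functions mentioned above could be substituted.

```python
import math

def distance(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def linear_decay(color, d, r):
    """Example decay function D: full intensity at d = 0, zero at d >= r."""
    s = max(0.0, 1.0 - d / r)
    return tuple(x * s for x in color)

def second_order_color(pixel_pos, photon_map, r, decay=linear_decay):
    """EQ1: sum the decayed colors of all photon map points within radius r of the pixel."""
    c_sec = [0.0, 0.0, 0.0]
    for point_pos, point_color in photon_map:
        d_i = distance(point_pos, pixel_pos)
        if d_i <= r:                              # select points within sphere 601
            decayed = decay(point_color, d_i, r)
            for k in range(3):
                c_sec[k] += decayed[k]
    return tuple(c_sec)
```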
  • Fig. 7 again shows light source 101, which emits ray 102 through object 100, the ray eventually being detected by camera 110. If the process discussed in connection with Fig. 6 above is performed for the pixel at point 103 (pixel 103), then all points in the photon map within area 700 will be integrated to obtain the second order color for point 103.
  • In addition, some embodiments also perform a similar integration at the other side of object 100.
  • More specifically, all points within a predefined radius of point 205 may be integrated according to equation EQ1. This may cover all points within area 701.
  • This second integration may be intended to take into account subsurface scattered light contributed by rays 200 and 210 as shown in Fig. 2.
  • The radius of area 701 may or may not be the same as that of area 700.
  • C_sec(cam) denotes the second order color obtained from the integration of area 700 at the camera side.
  • C_sec(light) denotes the second order color obtained from the integration of area 701 at the light source side.
  • A total second order color C_sec(tot) may then be obtained by the following formula:
    C_sec(tot) = C_sec(cam) + D_2(C_sec(light), w)
  • D_2 is another decay function that may be identical to or distinct from the function D of EQ1.
  • Function D_2 may be one of various decay functions, including the examples provided above in the context of function D as well as the examples provided in the '031 application.
  • Function D_2 may also be modified to take into account empty spaces within object 100, as discussed in the '031 application.
  • The parameter w is the distance between points 205 and 103. In other words, it may be the distance ray 102 travels within object 100.
  • The D_2 function is intended to decay the light side second order color value C_sec(light) in order to reflect the fact that light that has been scattered on the light source side (e.g., scattered rays 204 and 214 of Fig. 2) may need to travel a longer distance w to reach the rendered pixel 103.
  • In some embodiments, a different decay function D may be used in EQ1 when calculating C_sec(light) than the one used when calculating C_sec(cam).
  • For example, the decay function used when calculating C_sec(light) may be one that does not decay as sharply as the one used when calculating C_sec(cam).
  • If multiple light sources are present, a C_sec(light) value may be calculated for each light source. These values may then be individually decayed and added to the C_sec(cam) value as shown below:
    C_sec(tot) = C_sec(cam) + Σ_i D_2(C_sec(light, i), w_i)
  • C_sec(light, i) is the second order color calculated for the i-th light source and w_i is the corresponding w value for that light source.
  • The total resulting second order color C_sec(tot) may be added to a previously obtained first order color in order to take both types of subsurface scattered light into account. A sketch of this combination is given below.
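  • The combination of the camera-side and light-side contributions can be sketched as follows, reusing the second_order_color and distance helpers from the previous sketch. The exponential form of D_2 and the falloff parameter are assumptions for illustration; the patent only requires some decay based on the distance w traveled inside the object.

```python
import math

def d2_decay(color, w, falloff=0.5):
    """Example D_2: exponentially decay a light-side contribution over the
    distance w the scattered light must travel through the object."""
    s = math.exp(-falloff * w)
    return tuple(x * s for x in color)

def total_second_order(pixel_pos, entry_points, photon_map, r_cam, r_light):
    """Camera-side contribution plus a decayed light-side contribution for each
    light source. entry_points holds, for each light, the point (e.g. point 205)
    where that light's ray toward the pixel enters the object."""
    c_tot = list(second_order_color(pixel_pos, photon_map, r_cam))
    for entry_pos in entry_points:
        w = distance(entry_pos, pixel_pos)    # distance traveled inside the object
        c_light = second_order_color(entry_pos, photon_map, r_light)
        decayed = d2_decay(c_light, w)
        for k in range(3):
            c_tot[k] += decayed[k]
    return tuple(c_tot)
```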
  • Fig. 8 is a diagram showing an alternative method of storing the photon map.
  • Fig. 8 shows light source 101 and translucent object 100.
  • A depth map 800 is a well known concept in the art of computer graphics. It is a two dimensional array, wherein each entry of the array is associated with a ray of light from light source 101. Each entry usually stores the distance its associated ray travels before hitting the closest surface. Depth maps are often utilized in direct light models. Usually, an entry defines the ray it is associated with by a relationship between the position of the entry in the array and the angular position of the ray as it is emitted from the light source.
  • The photon map may be organized in a manner similar to that of a depth map. More specifically, the photon map may be organized as a two dimensional array 801. Each entry in the array may correspond to a point that is to be saved in the photon map (e.g., one of points 810-814). The association between an entry of the photon map and an actual point may be defined by the angular position of the ray which reaches that point. Thus, ray 820 may reach point 810. The angular position of ray 820 may define the position of an entry 830 within array 801. Thus, entry 830 of the photon map may be associated with point 810.
  • A two dimensional photon map such as the one shown in Fig. 8 may be referred to as a projected photon map.
  • Photon map 801 may store the color of each respective point. It need not store a three dimensional position for each respective point, because that position is defined by the angle of the ray defined by the position of each entry within the photon map (as discussed above) and the surface of translucent object 100. Thus, revisiting the above example, the position of entry 830 within photon map 801 may define a ray 820. The intersection of ray 820 with the surface of object 100 may define the actual three dimensional position of point 810. Thus, entry 830 may define the three dimensional position of the point it is associated with (point 810) without explicitly storing that position.
  • The embodiment of Fig. 8 may greatly speed up the process of finding a group of points in the photon map that are within a predefined radius of the pixel that is to be rendered (see the discussion of Fig. 6 for further details).
  • Otherwise, a three dimensional search of points would need to be performed.
  • In other words, a large number of points within the photon map may need to be examined in order to determine whether they are within the predefined radius of the pixel.
  • With a projected photon map, the pixel may instead be associated with a particular entry in the photon map 801 based on the angle a ray from light source 101 would need to have in order to reach the pixel.
  • The predefined radius may be converted into a number of positions in the photon map (alternatively, the predefined radius may be initially provided as a number of positions).
  • The group of points within the radius may then be selected by selecting the entry in the photon map with which the pixel was associated, as well as all neighboring entries in the photon map placed within the number of positions associated with the predefined radius.
  • However, the photon map of Fig. 8 may add some complexities.
  • For example, the photon map of Fig. 8 may need to include several distinct arrays if several different light sources are present in the scene.
  • In that case, multiple arrays may need to be checked in order to find all points within the predefined radius of a particular pixel.
  • A projected photon map may be provided in addition to the depth map, or may be stored in the same array as the depth map.
  • In addition, the depth map may be used in order to improve the selection of entries in the photon map within a predefined radius of a pixel.
  • For example, point 815 of object 840 may also be present in the scene.
  • Point 815 may be associated with entry 835 of the photon map by way of light ray 825. Since entry 835 is very close to entry 830 within the photon map (it borders it), a rendering system may erroneously assume that point 815 is very close to point 810 and include point 815 in the calculation of subsurface scattered light for point 810. However, this would not be correct, as point 815 is positioned much further back than point 810.
  • To avoid this, the depth map entries of points 815 and 810 may be compared. These reveal the difference in positions between the two points, as ray 820 must travel a much shorter distance to reach point 810 than ray 825 must travel to reach point 815.
  • Accordingly, some embodiments of the present invention may compare depth map values in order to determine which points of a photon map are within a predefined radius of a pixel that is being rendered.
  • Furthermore, depth map values may be used in combination with comparative positions within the photon map in order to determine distances between various points and the pixel for the purposes of computing the decay function of equation EQ1. A sketch of such a depth-aware neighborhood lookup is given below.
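  • The following sketch illustrates a neighborhood lookup in a projected photon map, using the matching depth map to reject entries (such as point 815) that are adjacent in the map but lie on a much more distant surface. The array layout, the conversion of the radius to a number of entries, and the depth tolerance are illustrative assumptions.

```python
def neighborhood(photon_map_2d, depth_map_2d, row, col, radius_in_entries, depth_tolerance):
    """Collect photon map entries within radius_in_entries of the entry (row, col)
    associated with the pixel, skipping entries whose stored depth differs too
    much from the pixel's depth (i.e. entries belonging to a different surface)."""
    rows, cols = len(photon_map_2d), len(photon_map_2d[0])
    pixel_depth = depth_map_2d[row][col]
    selected = []
    for i in range(max(0, row - radius_in_entries), min(rows, row + radius_in_entries + 1)):
        for j in range(max(0, col - radius_in_entries), min(cols, col + radius_in_entries + 1)):
            color = photon_map_2d[i][j]
            if color is None:                  # this ray did not hit a translucent surface
                continue
            if abs(depth_map_2d[i][j] - pixel_depth) > depth_tolerance:
                continue                       # e.g. point 815: adjacent in the map, far in the scene
            selected.append(((i, j), color))
    return selected
```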
  • Embodiments of the present invention further include functionality for determining the first order color. As noted above, this functionality may be based on any known rendering method, including various direct illumination methods. Furthermore, the systems and methods discussed in the '031 application may be used to determine the first order color.
  • The embodiments discussed herein may be implemented as software running on a computer. Furthermore, these embodiments may be implemented as hardware-only computing devices or as a combination of software and hardware.
  • The computing device which implements the embodiments may include an interface which allows an artist or a user to set up scenes by creating or importing various objects, placing them in an environment, placing various light sources and/or translucent objects in the environment, etc.
  • The computing device may also include an interface that provides a picture or a video of the rendered scene.
  • Alternatively, the scenes may be set up automatically by the computing device itself.
  • For example, an embodiment of the present invention may be used to render a virtual environment or a video game.
  • In that case, the various scenes may be automatically set up by the computing device depending on events that are occurring in the virtual environment or video game. Therefore, in some embodiments, most actions discussed herein as being performed by users, artists or operators may instead be performed by suitably programmed computers.
  • Some embodiments of the present invention may be implemented on computing devices of high computing power. For example, some embodiments may be implemented on a supercomputer or a distributed computer. Other embodiments may be implemented on personal computers, video game machines, portable computers, etc.
  • Fig. 9 shows an embodiment implemented on a distributed computer system.
  • In Fig. 9, a plurality of computers 901 execute a software application implementing the present invention in parallel.
  • Each computer 901 includes a processor 903 and memory 904.
  • The computers may render different portions of a scene, or different scenes, in parallel.
  • The computers communicate through a network 902.
  • The software application may exist in its entirety in the memory 904 of a single computer 901, or alternatively portions of it may be stored and executed at one or more other computers.
  • Managing components of the software application running on one or more of the computers may create and assign various tasks to other portions of the software application running on the same or other computers.
  • The multiple computers may be positioned in proximity to each other, or they may be located remotely and connected through a pervasive network such as the Internet.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Claims (11)

  1. A method for rendering a scene comprising a translucent object, the method comprising the following steps:
    providing a pixel to be rendered, wherein the pixel is associated with a first point (103) on a surface (100) of a translucent object, the translucent object (100) being part of the scene;
    determining a first order color of the pixel (103), wherein the determination of the first order color is an approximation using a direct illumination method that does not sufficiently take subsurface scattered light into account;
    generating a photon map, wherein the photon map comprises values of colors of a plurality of points in the scene and provides at least some information regarding the position of each of the plurality of points in the scene, wherein the values of colors comprised in the photon map are derived only by the direct illumination method; characterized in that the method further comprises:
    generating a second order color by:
    generating a first component of the second order color, wherein the first component is based on data from the photon map associated with a first plurality of selected points of the plurality of points, one of the first plurality of selected points being the first point (103), the first component representing the distribution of subsurface scattered light over a first area (700) that is illuminated by rays originating from a camera side of the translucent object (100);
    generating a second component of the second order color, wherein the second component is based on data from the photon map associated with a second plurality of selected points of the plurality of points, one of the second plurality of points being a second point (205) on the surface (100) of the translucent object opposite the first point (103), and the second plurality of points lying within a predefined radius of the second point (205), the second component representing the distribution of subsurface scattered light over a second area (701) that is illuminated by rays originating from a side opposite the camera side of the translucent object (100), wherein one of the rays (102), originating from a light source (101) on a side opposite the camera side, would enter the translucent object (100) at the second point (205), traverse the translucent object (100) and exit at the first point (103);
    attenuating the second component based on a distance between the first area (700) and the second area (701) of the translucent object (100); and
    adding the first and second components in order to take subsurface scattered light into account.
  2. The method of claim 1, wherein generating the second order color further comprises:
    selecting the first plurality of selected points from the plurality of points based on the position of each of the first plurality of selected points relative to the pixel;
    attenuating each of the colors of the plurality of selected points based on a predefined decay function and the distance between each of the plurality of selected points and the pixel; and
    integrating the attenuated colors of the plurality of selected points to generate the second order color.
  3. The method of any one of the preceding claims, wherein generating the photon map further comprises:
    tracing direct light rays (400) from one or more light sources (101, 120, 130) to one or more of the plurality of points in the scene; and
    determining colors for one or more of the plurality of points by deriving the color of a respective point of the plurality of points only from light sources whose rays were traced to that respective point.
  4. The method of any one of the preceding claims, wherein generating the photon map comprises defining respective three dimensional positions for each point of the plurality of points in the photon map.
  5. The method of any one of the preceding claims, wherein generating the photon map comprises providing at least one two dimensional array (801), wherein
    entries in the array (801) are associated with respective points of the plurality of points; and
    the position of a first entry in the array (801) provides data regarding the position in the scene of a respective point of the plurality of points, the point being associated with the first entry.
  6. The method of claim 5, wherein a depth map entry associated with the first entry, combined with the position of the first entry in the array (801), defines the position in the scene of the point associated with the first entry.
  7. A computer readable medium comprising a plurality of instructions, wherein the instructions are configured to be executed at a computer and to cause the computer to perform a method for rendering a scene according to any one of claims 1 to 6.
  8. The computer readable medium of claim 7, wherein the pixel to be rendered is part of the translucent object (100) in the scene.
  9. A system for rendering a scene, comprising a processor and a memory, the memory comprising a plurality of instructions executable by the processor, wherein the instructions are configured to cause the processor to perform a method for rendering a scene according to any one of claims 1 to 6.
  10. The system of claim 9, wherein the pixel to be rendered is part of the translucent object (100) in the scene.
  11. The system of claim 9 or 10, wherein the system is a distributed computer system.
EP09250808.4A 2008-04-02 2009-03-23 Rendern von Streueffekten in durchscheinenden Objekten Active EP2107530B1 (de)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/061,435 US7937245B2 (en) 2008-04-02 2008-04-02 Rendering of subsurface scattering effects in translucent objects

Publications (2)

Publication Number Publication Date
EP2107530A1 EP2107530A1 (de) 2009-10-07
EP2107530B1 true EP2107530B1 (de) 2013-10-23

Family

ID=40901986

Family Applications (1)

Application Number Title Priority Date Filing Date
EP09250808.4A Active EP2107530B1 (de) 2008-04-02 2009-03-23 Rendern von Streueffekten in durchscheinenden Objekten

Country Status (3)

Country Link
US (1) US7937245B2 (de)
EP (1) EP2107530B1 (de)
CA (1) CA2660194A1 (de)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101845231B1 (ko) * 2011-06-14 2018-04-04 삼성전자주식회사 영상 처리 장치 및 방법
DE112011105927T5 (de) * 2011-12-07 2014-09-11 Intel Corporation Grafik-Renderingverfahren für autostereoskopisches dreidimensionales Display
US9939377B2 (en) * 2013-01-17 2018-04-10 Disney Enterprises, Inc. Method of fabricating translucent materials with desired appearance
US9401043B2 (en) * 2013-01-18 2016-07-26 Pixar Photon beam diffusion
US10008034B2 (en) * 2013-05-03 2018-06-26 Nvidia Corporation System, method, and computer program product for computing indirect lighting in a cloud network
US10713838B2 (en) * 2013-05-03 2020-07-14 Nvidia Corporation Image illumination rendering system and method
US9905028B2 (en) * 2013-12-11 2018-02-27 Adobe Systems Incorporated Simulating sub-surface scattering of illumination for simulated three-dimensional objects
CN104484896B (zh) * 2014-10-30 2018-01-16 无锡梵天信息技术股份有限公司 一种基于环境贴图来模拟人物皮肤次表面散射的物理方法
KR102282896B1 (ko) 2014-12-23 2021-07-29 삼성전자주식회사 영상 처리 장치 및 방법
US9852537B2 (en) 2015-05-01 2017-12-26 Otoy Inc. Rendering via ray-depth field intersection
US9679398B2 (en) * 2015-10-19 2017-06-13 Chaos Software Ltd. Rendering images using color contribution values of render elements
CN106846450B (zh) * 2017-02-10 2020-03-31 腾讯科技(深圳)有限公司 实时渲染次表面散射的方法及相关装置
US11941752B2 (en) 2020-07-21 2024-03-26 Nvidia Corporation Streaming a compressed light field
US11501467B2 (en) 2020-11-03 2022-11-15 Nvidia Corporation Streaming a light field compressed utilizing lossless or lossy compression
CN113674375B (zh) * 2021-08-19 2023-10-27 北京航空航天大学 一种用于半透明材质渲染的次表面散射计算方法

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6760024B1 (en) 2000-07-19 2004-07-06 Pixar Method and apparatus for rendering shadows
WO2004047426A2 (en) 2002-11-15 2004-06-03 Esc Entertainment, A California Corporation Reality-based light environment for digital imaging in motion pictures
US7327365B2 (en) * 2004-07-23 2008-02-05 Microsoft Corporation Shell texture functions

Also Published As

Publication number Publication date
US20090254293A1 (en) 2009-10-08
EP2107530A1 (de) 2009-10-07
US7937245B2 (en) 2011-05-03
CA2660194A1 (en) 2009-10-02

Similar Documents

Publication Publication Date Title
EP2107530B1 (de) Rendern von Streueffekten in durchscheinenden Objekten
Krivanek et al. Practical global illumination with irradiance caching
CN111508052B (zh) 三维网格体的渲染方法和装置
US7362332B2 (en) System and method of simulating motion blur efficiently
Longhurst et al. A gpu based saliency map for high-fidelity selective rendering
US20060290696A1 (en) Method and apparatus for implementing level of detail with ray tracing
US8730239B2 (en) Transitioning between shading regions on an object
Yao et al. Multi‐image based photon tracing for interactive global illumination of dynamic scenes
Sabino et al. A hybrid GPU rasterized and ray traced rendering pipeline for real time rendering of per pixel effects
Novák et al. Screen-space bias compensation for interactive high-quality global illumination with virtual point lights
Woo et al. Shadow algorithms data miner
Ganestam et al. Real-time multiply recursive reflections and refractions using hybrid rendering
US8248405B1 (en) Image compositing with ray tracing
JPH06309462A (ja) 動画像処理装置
Döllner Non-photorealistic 3D geovisualization
JP3012828B2 (ja) 描画方法、装置および記録媒体
Rademacher Ray tracing: graphics for the masses
Kennie et al. Modelling for digital terrain and landscape visualisation
CN107067459A (zh) 用于对三维对象按体积地进行可视化的方法和可视化设备
Schwandt High-Quality Illumination of Virtual Objects Based on an Environment Estimation in Mixed Reality Applications
US7265753B1 (en) Particle tracing with on-demand meshing
Kan High-quality real-time global illumination in augmented reality
Shen THE PURDUE UNIVERSITY GRADUATE SCHOOL STATEMENT OF DISSERTATION APPROVAL
Peschel et al. Plausible visualization of the dynamic digital factory with massive amounts of lights
Karlsson et al. Rendering Realistic Augmented Objects Using a Image Based Lighting Approach

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA RS

17P Request for examination filed

Effective date: 20100406

AKX Designation fees paid

Designated state(s): DE FR GB

R17P Request for examination filed (corrected)

Effective date: 20100406

RBV Designated contracting states (corrected)

Designated state(s): DE FR GB

17Q First examination report despatched

Effective date: 20110119

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20130528

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602009019592

Country of ref document: DE

Effective date: 20131219

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602009019592

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20140724

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602009019592

Country of ref document: DE

Effective date: 20140724

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 8

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 9

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 10

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230519

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240327

Year of fee payment: 16

Ref country code: GB

Payment date: 20240327

Year of fee payment: 16

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20240325

Year of fee payment: 16