WO2005083639A1 - Subsurface scattering approximation methods and apparatus - Google Patents

Subsurface scattering approximation methods and apparatus

Info

Publication number
WO2005083639A1
Authority
WO
WIPO (PCT)
Prior art keywords
thickness
illumination
response
processor
surface point
Prior art date
Application number
PCT/US2004/010046
Other languages
English (en)
Inventor
Bradley S. West
Thomas Lokovic
David Batte
Original Assignee
Pixar
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pixar
Publication of WO2005083639A1

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/506 Illumination models

Definitions

  • the present invention relates to computer animation. More specifically, the present invention relates to enhanced methods and apparatus for rendering objects, especially translucent objects, while accounting for subsurface scattering effects.
  • Stop motion-based animation techniques typically required the construction of miniature sets, props, and characters. The filmmakers would construct the sets, add props, and position the miniature characters in a pose. After the animator was happy with how everything was arranged, one or more frames of film would be taken of that specific arrangement. Stop motion animation techniques were developed by movie makers such as Willis O'Brien for movies such as "King Kong" (1932). Subsequently, these techniques were refined by animators such as Ray Harryhausen for movies including "The Mighty Joe Young" (1948) and "Clash Of The Titans" (1981). With the widespread availability of computers in the latter part of the twentieth century, animators began to rely upon computers to assist in the animation process.
  • One of the pioneering companies in the computer aided animation (CAA) industry was Pixar, dba Pixar Animation Studios. Over the years, Pixar developed and offered both computing platforms specially designed for CAA, and Academy-Award® winning rendering software known as RenderMan®. In the present disclosure, rendering broadly refers to the conversion of geometric data described in scenes to visual images.
  • One specific portion of the rendering process is known as surface shading. In the surface shading process, the surface shader software determines how much light is directed towards the viewer from the surface of objects in response to the applied light sources in a scene.
  • Two specific parameters that are used for shading calculations include a surface normal and a surface illumination.
  • the surface shading process is straightforward when shading "solid" objects, such as objects made of metal, wood, dense plastic, thick materials, and the like.
  • the surface shading process is much more complex when rendering objects made of translucent or thin materials, such as glass, marble, liquids, plastics, thin materials and the like. This is because the shading process must not only consider the amount of light striking the outer surface of the object, but also any light that "shines through” the object.
  • Drawbacks to this approach include that ray-tracing and diffusion calculations are highly complex and take a long time to compute. Accordingly, user productivity drops because the user is forced to wait until the computations are finished. In some cases, the user must wait overnight.
  • Other drawbacks include that if the user is not satisfied with how the final object appears in an image (e.g. the material is too dense), the user redefines the material properties (e.g. absorption and scattering properties), but then the user must again wait until the entire simulation is complete to see the results. Accordingly, any user adjustments to the scene cannot be imaged quickly.
  • the present invention relates to computer animation. More specifically, the present invention relates to enhanced methods and apparatus for rendering objects while accounting for non-opaque objects, objects with translucent portions, or the like.
  • the methods typically include two phases of calculations: a phase where volume information is recorded, and a phase where the final scene is rendered using lighting contributions for each of the lights in the scene.
  • thickness maps are generated for each light in the scene.
  • a thickness map is generated from the perspective of the light.
  • a function is generated representing the accumulated thickness of objects that intersect a ray cast from the light at that pixel.
  • the thickness calculations rely on the geometry of the object being closed and not self-intersecting.
  • the contribution to the scene from a light is calculated using the thickness map that has been generated. For each point to be shaded, thickness information around the region of the point is read from the thickness map and is filtered to calculate an average thickness for the point. This thickness, combined with common surface information such as the normal and color of the surface, is used to approximate the contribution from that light. In some embodiments, scattering differences of different frequencies of illumination can be approximated using a function that interpolates through a list of user-supplied colors indexed by the thickness.
  • a method for determining illumination of surface points of an object in a scene from lighting sources includes determining a first thickness map for a first lighting source for the scene, wherein the first thickness map includes a first plurality of thickness values of the object with respect to distance from the first lighting source, determining a surface point on the object, and determining a first plurality of thickness values associated with the surface point on the object in response to the first thickness map.
  • Techniques may include determining a first filtered thickness value associated with the surface point on the object in response to the first plurality of thickness values, and determining an illumination contribution from the first lighting source at the surface point in response to the first filtered thickness value.
  • a computer system includes a memory and a processor.
  • the memory is configured to store a first thickness map associated with a first illumination source within a scene, wherein the first thickness map includes a first plurality of thickness functions of at least one object versus distance away from the first illumination source.
  • the processor is coupled to the memory, and configured to retrieve the first thickness map from the memory.
  • the processor is also configured to determine a surface point on the one object, to determine a neighborhood of surface points on the one object in response to the surface point on the one object, and to determine a plurality of thickness values of the at least one object in response to the surface point and the neighborhood of surface points and in response to the first thickness map.
  • the processor is also configured to determine a filtered thickness value of the one object in response to the plurality of thickness values, and to determine an illumination contribution from the first illumination source at the surface point in response to the filtered thickness value of the one object.
  • a computer program product for a computer system including a processor includes code that directs the processor to retrieve a first thickness map for a first illumination source for the scene, wherein the first thickness map includes an array of thickness functions, wherein each thickness function comprises a relationship between thickness values of the object with respect to distance from the first illumination source, code that directs the processor to determine a surface point on the object, and code that directs the processor to determine a first plurality of thickness functions associated with the surface point on the object in response to the first thickness map.
  • the computer program product also includes code that directs the processor to determine a first plurality of thickness values in response to the first plurality of thickness functions and in response to the surface point on the object, code that directs the processor to determine a first filtered thickness value associated with the surface point on the object in response to the first plurality of thickness values, and code that directs the processor to determine an illumination contribution from the first illumination source at the surface point in response to the first filtered thickness value.
  • the codes typically reside on a tangible media such as a magnetic hard disk, optical memory, semiconductor-based memory, or the like.
  • FIG. 1 is a block diagram of a typical computer rendering system according to an embodiment of the present invention.
  • FIGs. 2A-C illustrate a block diagram of a process flow according to an embodiment of the present invention.
  • FIGs. 3A-D illustrate an example of an embodiment of the present invention.
  • FIG. 4 illustrates an example according to an embodiment of the present invention.

DETAILED DESCRIPTION OF THE INVENTION
  • Fig. 1 is a block diagram of a typical computer rendering system 100 according to an embodiment of the present invention.
  • computer system 100 typically includes a monitor 110, computer 120, a keyboard 130, a user input device 140, a network interface 150, and the like.
  • user input device 140 is typically embodied as a computer mouse, a trackball, a track pad, wireless remote, and the like.
  • User input device 140 typically allows a user to select objects, icons, text and the like that appear on the monitor 110.
  • Embodiments of network interface 150 typically include an Ethernet card, a modem (telephone, satellite, cable, ISDN), (asynchronous) digital subscriber line (DSL) unit, and the like.
  • Network interface 150 is typically coupled to a computer network as shown.
  • network interface 150 may be physically integrated on the motherboard of computer 120, may be a software program, such as soft DSL, or the like.
  • Computer 120 typically includes familiar computer components such as a processor 160, and memory storage devices, such as a random access memory (RAM) 170, disk drives 180, and system bus 190 interconnecting the above components.
  • computer 120 is a PC compatible computer having multiple microprocessors such as XeonTM microprocessors from Intel Corporation. Further, in the present embodiment, computer 120 typically includes a UNIX-based operating system.
  • RAM 170 and disk drive 180 are examples of tangible media for storage of data including, audio / video files, computer programs, compilers, embodiments of the herein described invention including geometric description of objects, object relationships and algorithms of scattering versus depth, thickness maps, thickness functions, object data files, shader descriptors, a rendering engine, output image files, texture maps, displacement maps, scattering lengths, absorption data and / or transmission data of object materials, and the like.
  • Other types of tangible media include floppy disks, removable hard disks, optical storage media such as CD-ROMS and bar codes, semiconductor memories such as flash memories, read-only-memories (ROMS), battery-backed volatile memories, networked storage devices, and the like.
  • computer system 100 may also include software that enables communications over a network such as the HTTP, TCP/IP, RTP/RTSP protocols, and the like.
  • other communications software and transfer protocols may also be used, for example IPX, UDP or the like.
  • Fig. 1 is representative of computer rendering systems capable of embodying the present invention. It will be readily apparent to one of ordinary skill in the art that many other hardware and software configurations are suitable for use with the present invention. For example, the use of other microprocessors is contemplated, such as PentiumTM or ItaniumTM microprocessors; OpteronTM or AthlonXPTM microprocessors from Advanced Micro Devices, Inc.; PowerPC G3TM, G4TM microprocessors from Motorola, Inc.; and the like.
  • other types of operating systems are contemplated, such as the Windows® operating systems (WindowsXP®, WindowsNT®, or the like) from Microsoft Corporation, Solaris from Sun Microsystems, LINUX, UNIX, MAC OS from Apple Computer Corporation, and the like.
  • Figs. 2A-C illustrate a block diagram of a process flow according to an embodiment of the present invention.
  • one or more objects are geometrically specified in a scene to be shaded (or rendered), step 200. Any conventional methods for entering such geometric specifications are contemplated, for example, Maya, or the like.
  • the geometric specification may include any conventional way of representing a surface, such as triangles, quadrilaterals, NURBS, or the like.
  • one or more lighting sources are also specified for the scene, step 210.
  • the specification of the lights may include the direction of the lights, color of the lights, "barn-door” positions of the lights, harshness of the light, and the like.
  • a first lighting source is then selected from the one or more lighting sources specified for the scene, step 220.
  • a "thickness map" of objects in a scene relative to the first lighting source is then determined, step 230.
  • the thickness map represents a two-dimensional array of thickness functions at a viewing plane in front of the illumination source.
  • the thickness function in each array element represents a thickness as a function of distance from the viewing plane in a direction from the lighting source outwards.
  • the thickness map is a 500 x 500 array of thickness functions, a 1k x 1k array of thickness functions, or the like.
  • the thickness map is a function of x and y in the viewing plane, and z in the depth plane, i.e. f(x, y, z).
  • the thickness function represents an integral amount of light- absorbing material each ray passes through beginning at the viewing plane.
  • the thickness function for a given ray represents the total amount of material (from the multiple objects) the ray passes through with respect to distance from the viewing plane.
  • the thickness function is formed assuming the multiple objects have the same density or same light-scattering properties.
  • the thickness function is formed based upon material density.
  • such cases can also take into account light-absorbing properties of volumetric effects or atmospheric material, such as fog, smoke, fire, water, hair, or the like.
  • an increase in "thickness" per unit distance would be smaller for the less dense material, than for denser material (such as marble, flesh, or the like). Examples of thickness functions are illustrated below.
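  • For illustration only (this is not code from the patent), the sketch below builds one such thickness function for a single ray cast from the light's viewing plane; the interval representation, the default unit density, and the function name are assumptions made for the example.

```python
# Illustrative sketch only (not code from the patent): building the thickness
# function for one ray cast from the light's viewing plane.  Each object the
# ray passes through contributes a (z_enter, z_exit, density) interval;
# density is 1.0 when all materials are treated as equally dense.

def thickness_function(intervals):
    """Return the thickness function as (distance, accumulated_thickness) pairs."""
    events = []
    for z_enter, z_exit, density in intervals:
        events.append((z_enter, +density))  # thickness starts accumulating here
        events.append((z_exit, -density))   # ...and stops accumulating here
    events.sort()

    pairs = [(0.0, 0.0)]
    thickness = 0.0
    rate = 0.0      # current accumulation rate: sum of densities of entered objects
    prev_z = 0.0
    for z, delta in events:
        thickness += rate * (z - prev_z)
        pairs.append((z, thickness))
        rate += delta
        prev_z = z
    return pairs

# A ray entering an object at z=2 and exiting at z=5 (unit density) yields
# [(0, 0), (2, 0), (5, 3)]: thickness grows only while the ray is inside.
print(thickness_function([(2.0, 5.0, 1.0)]))
```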
  • the thickness map (the array of thickness functions) is stored, step 240.
  • the thickness map for the lighting source may be stored in RAM, hard drive, optical drive, or the like.
  • the thickness map is later used to determine the surface shading of typically more than one object in the scene. Accordingly, the thickness map is determined and saved once, and is typically retrieved and used more than once.
  • Figs. 2A-C after the thickness map for the first lighting source is saved, if there are any other lighting sources, step 250, a second lighting source is selected, and the process described above is repeated with respect to the second lighting source, etc.
  • the process in Figs. 2A-C continues to the shading (rendering) process, step 260.
  • the above described steps may be performed at a different time from the below steps.
  • the thickness maps may be determined and stored by a user in a first work group, and later, the stored thickness maps may be used by a user in a different work group. In other embodiments, a user may initiate all of the herein described steps.
  • a first surface location on an object is selected for shading, step 270.
  • the coordinates of the surface location within the various coordinate systems are known, for example with respect to the camera space, and the like. Further, the surface normal, surface characteristics, and the like are also known, and are used later for the shading calculations.
  • Many different rendering methods may be implemented that result in the selection of the surface location on the object. Further, many different techniques for determining which object is selected from the scene are contemplated. For example, in one embodiment, only a portion of an object to be shaded is retrieved at a time. One such case where this may happen is when using a "bucket" rendering method, as is currently used in Pixar's rendering software RendermanTM.
  • objects are retrieved from disk into memory and are shaded at one time for a series of related images (a shot).
  • Such embodiments may use the rendering and shading techniques disclosed in co-pending U.S. Patent Application 10/428,325, filed 4/30/03, titled "Shot Shading Method and Apparatus," (incorporated by reference for all purposes) commonly assigned to Pixar. Accordingly, how the surface location on the object is determined, and how and which portions of objects are selected, are done in a variety of ways.
  • the next steps are to determine the surface shading value for the surface location in response to the different lighting sources in the scene.
  • this is done by first selecting a first lighting source, step 280.
  • this may or may not be the same lighting source selected in step 220 above.
  • the order the lighting sources are evaluated is based upon lighting hierarchy, strength of the lighting source, or the like. In other embodiments, there is no specific order the lighting sources are selected.
  • the thickness map associated with the lighting source is retrieved from memory, step 290.
  • the thickness map determined in step 230 above may be retrieved from a hard disk memory into RAM.
  • each array location in the thickness map is associated with a thickness function that is a function of distance from a lighting viewing plane in the direction away from the lighting source towards the objects in the scene. Accordingly, the surface location is mapped onto the thickness map to determine the thickness function ("center thickness function") associated with the surface location, step 300.
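  • As an illustration (not the patent's method), the sketch below shows one way this mapping could be done, assuming a 4x4 world-to-light matrix whose viewing plane lies at z = 0 and whose x and y coordinates are normalized to [0, 1]; the helper name and conventions are hypothetical.

```python
# Illustrative sketch only: mapping a world-space surface location into a
# light's thickness map (step 300).  world_to_light is assumed to take world
# space into the light's space, with the viewing plane at z = 0 and x, y
# normalized to [0, 1].

import numpy as np

def map_to_thickness_map(surface_point, world_to_light, map_width, map_height):
    # Transform the world-space point into the light's space (homogeneous divide).
    p = world_to_light @ np.append(np.asarray(surface_point, dtype=float), 1.0)
    x, y, z = p[:3] / p[3]
    # Convert the normalized x, y into array indices of the thickness map.
    i = min(max(int(x * map_width), 0), map_width - 1)
    j = min(max(int(y * map_height), 0), map_height - 1)
    return i, j, z   # indices of the "center thickness function", plus distance
```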
  • thickness functions neighboring the "center thickness function" in the thickness map associated with the surface location are also identified.
  • the thickness is determined at the surface location, step 310.
  • these above steps may be merged together.
  • in step 300, thickness functions neighboring the "center thickness function" are also identified. Further, in step 310, in additional embodiments, the neighboring thickness functions are also evaluated given the distance to the surface location, to determine neighboring thicknesses.
  • the neighborhood may be a 3x3 grid around the "center thickness function" in the thickness map.
  • the size of the neighborhood may vary automatically or by user selection. For example, in thinner objects, the neighborhood may be a 3x3 array, whereas for thicker objects, the neighborhood may be a 2x2, 7x7, 9x9, etc. kernel array, or the like.
  • the surface locations and the neighboring surface locations are used to identify the "neighboring" thickness functions.
  • neighboring surface locations map onto unique thickness functions, however in other embodiments, neighboring surface locations may map to the same thickness function. An example of the latter case is when the resolution of the thickness map is much lower than the resolution of surface locations.
  • the thickness at the surface location and the neighboring thickness are combined to obtain a single thickness for the object at the surface location in step 310.
  • the thickness at the surface location is then determined by filtering the thicknesses.
  • the thickness is the average or weighted average of the thicknesses and the neighboring thicknesses; in other cases the thickness is any other combination (convolution) of the thicknesses such as a low-pass filter, median filter, Gaussian filter, high-pass filter, and the like.
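  • The sketch below is illustrative only (the data layout and names are assumptions): it filters the thicknesses over such a neighborhood with a plain box filter, and a weighted average, Gaussian, or median filter could be substituted at the marked line to change how strongly the scattering is softened.

```python
# Illustrative sketch only: filtering thicknesses over a neighborhood of the
# thickness map (steps 300-310).  thickness_map[j][i] is assumed to be a
# callable returning the accumulated thickness at distance z from the light's
# viewing plane; radius controls the neighborhood size (radius=1 gives 3x3).

def filtered_thickness(thickness_map, i, j, z, radius=1):
    samples = []
    for dj in range(-radius, radius + 1):
        for di in range(-radius, radius + 1):
            tj, ti = j + dj, i + di
            if 0 <= tj < len(thickness_map) and 0 <= ti < len(thickness_map[0]):
                samples.append(thickness_map[tj][ti](z))
    # Plain average (box filter); a weighted average, Gaussian, or median
    # filter could be substituted here to change how much the scattering
    # is "softened".
    return sum(samples) / len(samples)
```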
  • the thicknesses at the surface of the object tend to average out or be blurred. Because the object thicknesses are averaged out, the illumination at a particular surface location is "softened." This simulates the scattering of light rays through the object.
  • the neighborhood size and the filter applied may be varied by the user to vary the scattering effect of the object material. Accordingly, if the user wants to change the scattering characteristics of the material, the user can do so by varying these parameters and without having to recalculate the thickness maps and functions. This ability greatly increases turn-around time.
  • an illumination contribution from the light source at the surface location is determined, step 320.
  • the light absorbing properties for the material versus thickness of the material is typically pre-specified by a user. For example, a user may specify that a percentage (e.g. 50%) of incident illumination is absorbed through a specified thickness of material. Alternatively, the user may specify that a percentage (e.g. 33%) of incident illumination is absorbed through a unit thickness of material.
  • the thickness function can take into account the different absorption qualities of the different materials.
  • the thickness function may be based upon material density. For instance, atmospheric haze may provide a smaller increase in thickness in a thickness function per unit distance compared to thick smoke.
  • the materials that make up the calculated thickness may have different absorption properties for different frequencies of light. Accordingly, users can specify a material absorption versus thickness of material based upon frequency of light. For example, nitrogen gas scatters and absorbs blue wavelengths of light more than red wavelengths of light; accordingly, absorption versus material thickness for red light is less than for blue light.
  • combinations of different materials, and different absorption characteristics of the different materials with respect to wavelength can be used to determine the illumination contribution of the surface location.
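  • For illustration, assuming the user specifies the fraction of light absorbed per unit thickness separately for each color channel (the names below are hypothetical, not the patent's), the illumination contribution of a light at the filtered thickness could be computed as follows.

```python
# Illustrative sketch only (parameter names are hypothetical): computing the
# illumination contribution of one light (step 320) from the filtered
# thickness, given a user-specified fraction of light absorbed per unit
# thickness for each color channel (e.g. 33% per unit thickness).

def illumination_contribution(light_rgb, thickness, absorb_per_unit_rgb):
    remaining = []
    for incident, absorb in zip(light_rgb, absorb_per_unit_rgb):
        # Each unit of thickness removes the given fraction of what is left,
        # so the light falls off exponentially with thickness.
        remaining.append(incident * (1.0 - absorb) ** thickness)
    return tuple(remaining)

# A material that absorbs blue much faster than red: after thickness 4.0,
# red mostly survives while blue is substantially attenuated.
print(illumination_contribution((1.0, 1.0, 1.0), 4.0, (0.10, 0.30, 0.55)))
```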
  • the shading contribution of the light source in a direction of the viewer is then determined, step 330.
  • Conventional shading techniques may be used in embodiments of the present invention, taking into account the surface normal direction, a color of the illumination determined in step 320 above, a color of the surface, an image viewing plane normal, and the like. Other parameters are also contemplated and used in other embodiments of the present invention.
  • in step 340, the process above repeats for the remaining illumination sources.
  • once the shading contributions of all illumination sources at the surface location have been determined, the shading contributions are then combined, step 350.
  • the combination may simply be a sum of intensities. In other embodiments, when the shading contributions have different frequency components, the combination may be a sum of the intensities at the different frequencies.
  • Other conventional methods for combining the shading values may also be used.
  • a weighted combination of the illumination contributions can be used.
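  • A minimal sketch of such a combination is shown below, here a per-channel sum with optional user weights; the RGB representation and the weight list are assumptions made for the example.

```python
# Illustrative sketch only: combining the per-light shading contributions at a
# surface location (step 350) as a per-channel sum, with optional user weights.

def combine_contributions(contributions, weights=None):
    weights = weights or [1.0] * len(contributions)
    combined = [0.0, 0.0, 0.0]
    for (r, g, b), w in zip(contributions, weights):
        combined[0] += w * r
        combined[1] += w * g
        combined[2] += w * b
    return tuple(combined)

# Two lights: a dim reddish contribution plus a bluish one.
print(combine_contributions([(0.4, 0.1, 0.1), (0.1, 0.1, 0.5)]))
```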
  • the process above may be repeated for other surface locations on an object in the scene, step 360, to obtain combined shading values for such surface locations.
  • stochastic sampling techniques pioneered by Pixar, such as those described in U.S. Patent No. 4,897,806, may then be used to sample the combined shading values of surface locations, to form a value for a pixel in an image, step 370.
  • Such techniques are incorporated into the Renderman® rendering software.
  • the use of other rendering software and hardware are contemplated in other embodiments of the present invention.
  • the resulting image may be further processed and then recorded onto a tangible media such as a strip of film, a magnetic memory (hard disk), an optical memory (DVD, VCD), or the like. Subsequently, the image may be retrieved from the tangible media and output to users (e.g. animator, audience member, and the like.), step 390.
  • Figs. 3A-D illustrate an example of an embodiment of the present invention. More specifically, Figs. 3A-D illustrate thickness map and thickness function representations.
  • In Fig. 3A, an object 400 is shown illuminated by a lighting source 410. Also shown are a number of rays, including rays 420, 430, and 440, that are projected from lighting source 410 through a lighting viewing plane 450. As shown in Fig. 3A, ray 420 initially intersects object 400 at point z1 460, passes through object 400, and exits at point z2 470. Further, ray 430 initially intersects object 400 at point z3 480, passes through object 400, and exits at point z4 490. In this example, ray 440 does not pass through object 400.
  • Fig. 3B illustrates a thickness map 500 generated at viewing plane 450.
  • ray 420 is mapped to location 510
  • ray 430 is mapped to location 520
  • ray 440 is mapped to location 530.
  • thickness functions 540-560 are also illustrated. As can be seen in these examples, the amount of light varies with respect to distance.
  • In Fig. 3C, two objects 570 and 580 are illustrated with an atmospheric or volumetric effect 590 in between. Also illustrated is a lighting source 600 and a projected ray 610. As can be seen, ray 610 penetrates object 580, passes through volume 590, and then penetrates object 570. In the example of Fig. 3D, example thickness functions 620A-B are shown.
  • thickness function 620A is shown with material density of objects 570 and 580 being the same, and volume 590 not attenuating ray 610.
  • thickness function 620B is shown with the material density of objects 570 and 580 being different (having different slopes), and the material density of volume 590 adding to the thickness.
  • Fig. 4 illustrates an example according to an embodiment of the present invention.
  • two lights 700 and 710 are illustrated illuminating an object 720.
  • a thickness map 730 and 740 are formed with respect to object 720.
  • a ray 750 in thickness map 730 passes through object 720 and a thickness function 760 is determined.
  • a ray 770 in thickness map 740 passes through object 720 and a thickness function 780 is determined.
  • a shading value for the point z1 790 on the surface of object 720 is to be determined.
  • thickness function 760 is mapped to point z1 790, and the thickness is T1 800.
  • thickness function 780 is mapped to point z1 790, and in this case, the thickness is 0, i.e. the light is not attenuated.
  • the absorption characteristics of the material are typically specified by the user.
  • the absorption characteristics and the thickness of the material are then used in this embodiment to determine the amount of light remaining at a point, after the light passes through the specified thickness of material.
  • the absorption characteristics of the material may be linearly dependent or non-linearly dependent upon material thickness (depth).
  • the absorption characteristics of the material of object 720 are referred to.
  • the absorption of the material of light is dependent upon the frequency of light.
  • graph 820 illustrates the attenuation of red light with respect to thickness
  • graph 830 illustrates the attenuation of blue light with respect to thickness. Additional graphs may be specified for other colors, such as green, yellow, or the like.
  • at thickness T1 800, it can be seen that red wavelengths of light are not completely attenuated, whereas blue wavelengths of light are substantially attenuated.
  • the illumination contribution from light source 700 would be red light at 10% of the initial red intensity.
  • the illumination contribution from light source 710 would be red and blue light at the initial intensities.
  • the illumination contributions are then combined with the surface normals, the viewer normal, and the like to determine a shading value at point zl 790.
  • Embodiments of the present invention are very useful in cases where light is attenuated, but not necessarily fully blocked by objects in a scene.
  • objects such as fish, bones, jellyfish, octopus, anemones, glass block, and other materials that have translucent qualities.
  • embodiments may be applied to other translucent materials, such as atmospheric effects such as rain, fog, clouds, and the like, volumetric effects such as hair, feathers, fur, and the like.
  • one ray from an illumination source to the surface location may be used to determine the illumination contribution at a surface location.
  • rays from the illumination source to the surface location and neighboring surface locations are used to determine the illumination contribution at the surface location.
  • a filter such as a low-pass filter, median filter, Gaussian filter, high-pass filter, and the like is applied to the neighborhood of illumination contributions on the surface location of interest.
  • an illumination ray may strike a fish bone or other internal organ and be absorbed. As this would be repeated for other surface locations, the result would be a sharp silhouette of the fish bones or the like, on the side of the fish away from the light. By considering illumination contributions of a neighborhood of surface locations and filtering them, a result would be a softer silhouette of the fish bones on the side of the fish away from the light.
  • the size of the neighborhood may vary automatically or by user selection. For example, in thinner objects, the neighborhood may be a 3x3 array, whereas for thicker objects, the neighborhood may be a 4x4, 5x5, 11x11, etc. array, or the like.
  • the neighborhood size and the filter applied may be varied by the user to vary the diffuse effect of the object. By allowing the user to easily vary these parameters, this reduces the requirement to re-calculate thickness maps and thickness functions, every time the material scattering characteristics change.
  • the shaded value of a surface location may be determined from direct illumination of the surface location of the object and illumination passed through the object.
  • the combination of both of these types of illumination can be used to determine a final illumination value at the surface location.
  • any linear or non-linear combination of these values may be used.
  • a weighting of these values may be performed by the user. For example, the user may determine that an object "glows too much"; accordingly, the user may lessen the illumination contribution weighting due to a backlight, or the like.
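  • A minimal sketch of such a weighted combination is shown below; the linear blend and the weight name are assumptions, and the text permits any linear or non-linear combination.

```python
# Illustrative sketch only (the linear blend and weight name are assumptions):
# blending direct illumination with light transmitted through the object,
# with a user weight to tame an object that "glows too much" from its backlight.

def blend_direct_and_transmitted(direct_rgb, transmitted_rgb, backlight_weight=1.0):
    return tuple(d + backlight_weight * t
                 for d, t in zip(direct_rgb, transmitted_rgb))

# Halving the backlight's contribution reduces the glow-through.
print(blend_direct_and_transmitted((0.6, 0.5, 0.4), (0.3, 0.0, 0.0), 0.5))
```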
  • the absorption at a given light frequency may be based upon absorption of primary color components.
  • an absorption relationship for red light, green light, and blue light may be provided by the user.
  • the yellow light is represented as a red light of a particular intensity and a green light of a particular intensity. Based upon the given material thickness, the amount of red and green light absorbed (and remaining) are determined.
  • the illumination contribution is a combination of the remaining red light and green light intensities.
  • if a yellow light is directed towards the rear of an object made of green jade, because red wavelengths are absorbed by the jade, the light that shines through the front of the object will be greenish.
  • the user may specify the color of light transmitted according to thickness of material.
  • the absorption relationship for the material may specify that for a first thickness, the illumination contribution will be red, for a second thickness, the illumination contribution will be green, and for a third thickness, the illumination contribution will be blue. In such cases the absorption relationship effectively specifies a color "filter" in response to object thickness.
  • as examples, if the illumination source is white light, the illumination contribution is red if the object has a first thickness, green if the object has a second thickness, and blue if the object has a third thickness. If the illumination source is purple light (red and blue components), the illumination contribution is red if the object has a first thickness, there is no illumination contribution if the object has a second thickness (the filter passes only green, which purple light lacks), and the illumination contribution is blue if the object has a third thickness.
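  • A small sketch of such a thickness-indexed color filter is shown below; the thresholds, colors, and linear interpolation are made-up illustrative choices. The returned filter color would then be multiplied per channel with the light's color, so purple light (which has no green component) passing through the second-thickness "green" portion of the filter yields no contribution, matching the example above.

```python
# Illustrative sketch only: a thickness-indexed color "filter" that
# interpolates through a list of user-supplied colors.  The thresholds,
# colors, and linear interpolation are made-up illustrative choices.

def color_for_thickness(thickness, ramp):
    """ramp: list of (thickness, (r, g, b)) pairs sorted by thickness."""
    if thickness <= ramp[0][0]:
        return ramp[0][1]
    for (t0, c0), (t1, c1) in zip(ramp, ramp[1:]):
        if thickness <= t1:
            f = (thickness - t0) / (t1 - t0)   # interpolation factor
            return tuple(a + f * (b - a) for a, b in zip(c0, c1))
    return ramp[-1][1]

# Thin -> red, thicker -> green, thickest -> blue, as in the example above.
ramp = [(1.0, (1.0, 0.0, 0.0)), (2.0, (0.0, 1.0, 0.0)), (3.0, (0.0, 0.0, 1.0))]
print(color_for_thickness(1.5, ramp))   # halfway between red and green
```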
  • thickness functions such as illustrated in the above figures need not be actually represented by a mathematical function. Instead, in some embodiments, the thickness functions can simply be a table of entries including a distance and a thickness value. In some embodiments, the surface shading, i.e. the color of the surface of an object, is important. Thus, in such cases, only the distance from the illumination viewing plane, and the thickness at that point, are relevant for illumination contribution purposes. As an example, in Fig. 3D, for thickness function 620A, the illustrated thickness function may simply be represented as a series of number pairs: {(z1, 0), (z2, T0), (z3, T0), (z4, T1)}.
  • a simple counter could be used to indicate if a point is within an object or outside an object.
  • Typical non-opaque object material may include waxy or fleshy items such as skin, wax, fruit, bones, etc., as well as thin objects, or the like.
  • certain assumptions and techniques are applied that reduce the amount of mathematical calculations compared to the prior art.
  • the shading values produced using embodiments of the present invention will not necessarily be mathematically correct; however, they are acceptable for animation purposes. Additionally, it is believed that objects shaded using embodiments of the present invention will have characteristics uniquely identifiable from a mathematically correct simulation.
  • Embodiments of the present invention may be applied to any number of rendering platforms and for a variety of purposes.
  • embodiments may be used on engineering workstations for development purposes, on visualization systems for artists and animators, in rendering farm machines for final production, on computer graphics boards, and the like. Accordingly, the concepts disclosed above are extremely valuable in a variety of applications.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

A method is disclosed for determining illumination of surface points of an object (720) in a scene from lighting sources (700, 710). The method includes: determining a first thickness map (730) for a first lighting source (700) for the scene, wherein the first thickness map includes a first plurality of thickness values (760) of the object with respect to distance from the first lighting source; determining a surface point (790) on the object; determining a first plurality of thickness values associated with the surface point on the object in response to the first thickness map; determining a first filtered thickness value associated with the surface point on the object in response to the first plurality of thickness values; and determining an illumination contribution (820, 830) from the first lighting source at the surface point in response to the first filtered thickness value.
PCT/US2004/010046 2004-02-11 2004-03-31 Procedes et appareil d'estimation de diffusion de sous-surface WO2005083639A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US54417204P 2004-02-11 2004-02-11
US60/544,172 2004-02-11

Publications (1)

Publication Number Publication Date
WO2005083639A1 (fr)

Family

ID=34910711

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2004/010046 WO2005083639A1 (fr) 2004-02-11 2004-03-31 Procedes et appareil d'estimation de diffusion de sous-surface

Country Status (1)

Country Link
WO (1) WO2005083639A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5959731A (en) * 1996-05-03 1999-09-28 Virginia Semiconductor, Inc. Optical micrometer for measuring thickness of transparent substrates based on optical absorption
US6016150A (en) * 1995-08-04 2000-01-18 Microsoft Corporation Sprite compositor and method for performing lighting and shading operations using a compositor to combine factored image layers
US6252608B1 (en) * 1995-08-04 2001-06-26 Microsoft Corporation Method and system for improving shadowing in a graphics rendering system
US6342887B1 (en) * 1998-11-18 2002-01-29 Earl Robert Munroe Method and apparatus for reproducing lighting effects in computer animated objects

Similar Documents

Publication Publication Date Title
US7327360B2 (en) Hair rendering method and apparatus
US7348977B2 (en) Subsurface scattering approximation methods and apparatus
US7436404B2 (en) Method and apparatus for rendering of translucent objects using volumetric grids
US7327362B2 (en) Method and system for providing a volumetric representation of a three-dimensional object
US7973789B2 (en) Dynamic model generation methods and apparatus
US6028955A (en) Determining a vantage point of an image
US7184043B2 (en) Color compensated translucent object rendering methods and apparatus
US8244029B1 (en) Recursive filters on GPUs
JP4829893B2 (ja) 低計算量のフィルム粒子シミュレーション技術
US20070139409A1 (en) Global illumination filtering methods and apparatus
US7864176B2 (en) Translucent object rendering methods and apparatus
US7176918B2 (en) Three-dimensional paint projection weighting of diffuse and scattered illumination methods and apparatus
WO2009018417A1 (fr) Procédés et appareils de rendu d'apparence artistique multiple
US7443394B2 (en) Method and apparatus for rendering of complex translucent objects using multiple volumetric grids
Lacewell et al. Raytracing prefiltered occlusion for aggregate geometry
Thonat et al. Video‐Based Rendering of Dynamic Stationary Environments from Unsynchronized Inputs
WO2005083639A1 (fr) Procedes et appareil d'estimation de diffusion de sous-surface
Hendrickx et al. Adaptively layered statistical volumetric obscurance
US7345686B2 (en) Method and apparatus for visibility determination and processing
CA2533451A1 (fr) Procede de projection de peinture ameliore et dispositif correspondant
Jung Rendering of Teeth and Dental Restorations using Robust Statistical Estimation Techniques
WO2005116929A2 (fr) Ponderation de la projection de peinture tridimensionnelle de procedes d'eclairage diffus et diffuse et appareil associe
Weyrich Acquisition of human faces using a measurement-based skin reflectance model
Morvan et al. A perceptual approach to trimming unstructured lumigraphs
Doidge Utilising path-vertex data to improve Monte Carlo global illumination

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: pct application non-entry in european phase