EP3022717A1 - System and method for generating procedural textures using particles - Google Patents
System and method for generating procedural textures using particles
- Publication number
- EP3022717A1 (application EP14767078.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- data
- module
- particle
- textures
- particles
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/80—2D [Two Dimensional] animation, e.g. using sprites
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
- G06T7/49—Analysis of texture based on structural texture description, e.g. using primitives or placement rules
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/21—Collision detection, intersection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/56—Particle system, point based geometry or rendering
Definitions
- the present invention relates to a system and a method for generating textures on an object from particles projected onto that object.
- a color is applied in the form of a layer, in the manner of a layer of paint applied to a real physical medium.
- the process of creating or changing the colors of objects therefore does not take into account the parameters or characteristics of the objects to which the color is applied, nor the environment in which the objects are staged. Thus, to create realistic effects, a user must manually determine the target point(s) or areas, the parameters to be modified, and the level of modification of the selected parameters. If one or more objects of one or more scenes are to be processed, the required operations may take a considerable amount of time to carry out.
- To color an area of a wood material so as to give it a realistic woody appearance, a user must make parametric adjustments in a careful and accurate manner. Since coloring tools take into account neither the properties of materials nor the interactions between objects and the environment, a user who wishes to create a visual effect based on a reaction or behavior of a material must first design or imagine the desired effect in a realistic way, then make the color changes using the available color parameters. Thus, if a color is applied to an object, its coloring impact will be identical on all areas of that object.
- if the object has a metal portion, a wood portion and a plastic zone, the applied color has the same effect on all these areas, whereas on a real object the effects produced on each of the materials would be nuanced, or even very different depending on the case.
- FR2681967 discloses a method for modifying the colors of an image displayed on a display device based on the determination of colorimetric values.
- the method includes selecting at least one color representative of at least one pixel of an image consisting of a plurality of pixels, determining the colorimetric values of that color, selecting a second color and determining the colorimetric values of the second color, and changing the colorimetric values of a plurality of pixels of the image so that, for any given pixel of that plurality whose colorimetric values correspond to those of said at least one color, the colorimetric values of the given pixel are modified to match those of the second color.
- in that method, the applied color is identical whatever the nature of the object (plastic, wood, etc.); it takes into account only the color variations of a zone selected by the user, not the textures.
- EP0884694 discloses a method for adjusting colors in digital images, including the correction of "red eye" in photographs.
- the color data of the pixels are adjusted by identifying the pixels of a digital image having original color data corresponding to the predetermined color.
- the color is applied automatically, based solely on colorimetric data, in particular the colors of the iris.
- WO2008066880 discloses a method for obtaining an original set of two or more original colors associated with a work. To do so, an input set of one or more colors chosen by the user is received. For each original color, a mapping from the original color to derived colors is performed. The plurality of derived colors is obtained on the basis of the one or more colors chosen by the user.
- WO2012154258 discloses a three-dimensional colorimetric coloring tool. Each pixel of the image comprises a set of pixel values in a three-dimensional color space. Even though it allows a varied color palette to be used, the applied color does not vary according to the material on which it is applied.
- US7557807 discloses a computer-implemented method comprising generating an object with certain characteristics and emitting a particle. The path of the particle is checked to determine whether the particle interacts with the object. In the event of a collision between the particle and the object, the characteristics of the object are modified, in particular to simulate aging and erosion of the object.
- the described method involves mapping the object by points. A γ-ton map is then applied to each of the points.
- An object of the invention is to provide a system and a method for improving the efficiency and productivity of authoring tools. Another object is to provide a system and a graphical method for increasing the flexibility and the graphical possibilities when creating colors or renderings.
- Another object of the invention is to provide a system and a graphic process for increasing the realism of the elements shown.
- an animation simulation module designed to perform an emission and displacement simulation for each of the particles, using the particle emitter data and the emitted particle data;
- a tracer module designed to generate a parameterized trace producing one or more physical and/or chemical modifications of at least the surface of the object, so that at least one of its parameters, in particular a visible characteristic, is modified;
- a system can take into account a parametric architecture to determine the influence of particles projected on objects.
- The parametric architecture takes into account the physical and/or chemical elements inherent in the constituents and properties of particles and objects. Because parametric objects and their textures can be modified according to traces parameterized by physical and/or chemical phenomena, a scene can be set up and can evolve taking into account many more parameters than the colorimetric parameters classically taken into account, thus contributing to considerably increasing the realism of the visual effects produced.
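As a non-limiting illustration, the parametric architecture described above might be sketched as follows; all class names, fields and thresholds are hypothetical assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class ParametricParticle:
    """A particle carrying physical/chemical parameters, not just a color."""
    temperature: float = 20.0   # degrees Celsius
    hardness: float = 0.5       # 0..1
    wetness: float = 0.0        # 0..1

@dataclass
class ParametricObject:
    """A target object whose constituents govern how particles affect it."""
    material: str = "wood"
    absorbency: float = 0.8       # 0..1, how much liquid the surface soaks up
    ignition_point: float = 300.0 # degrees Celsius
    color: tuple = (0.6, 0.4, 0.2)

def influence(particle: ParametricParticle, obj: ParametricObject) -> dict:
    """Derive non-colorimetric effects of a particle on an object's surface."""
    effects = {}
    if particle.wetness > 0 and obj.absorbency > 0:
        # a liquid trace darkens an absorbent surface (cf. the wood example)
        effects["darken"] = particle.wetness * obj.absorbency
    if particle.temperature >= obj.ignition_point:
        effects["burn"] = True
    return effects
```

The point of the sketch is that the result of `influence` depends on the material parameters of both sides of the collision, not only on color data.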
- the tracer module comprises a rule selection module and an implementation module for applying the rule to generate the resulting trace data.
- the tracer module comprises a trace mixer submodule for modifying a trace on which a new active particle interacts.
- the system further comprises a temporal backup module provided for retaining the data needed to generate again a set of textures of an object for which one or more parameters are modified, or to retrieve a previously generated set of textures.
- the system further comprises a user data input module capable of acting on the data from the simulation module.
- the system also includes access to global parameter data that can act on a plurality of emitted particles and/or on at least one object present in the zone of influence of the global parameters.
- the invention also provides a method for generating procedural textures on an object from particles emitted by a particle emitter, comprising the steps in which:
- an animation simulation module receives data from at least one particle emitter, data of the particles to be emitted by the emitter, and data from at least one target object, defined by architectural parameters and procedural textures, capable of receiving impacts of said emitted particles; it determines a trajectory for each of the particles to be emitted as a function of the emitter data and the particle data;
- for each particle colliding with a target object, a tracer module generates data for at least one trace on the surface of said object based on the data of the object and the data of the particle;
- an integrating module of physical parameters executes the graphic effects according to the data of the object and the data of the traces;
- for each object having undergone at least one particle impact, the physical parameter integrating module generates a new set of textures taking into account the data of the object and the graphic effects previously obtained.
- the integrating module generates the textures of the new set by executing the graphic effects according to the data of the object and the data of the traces.
- a rule selection module selects a rule to be applied, and a rule implementation module evaluates said rule according to the parameters of the target object to generate the resulting trace data.
- a particle selection module selects the particles affected by the rule to be applied, and a rule implementation module evaluates said rule according to the particle parameters and the target object to generate the resulting trace data.
- FIG. 1 schematically represents an example of a texture generation system according to the invention
- Fig. 2 is a functional flowchart showing the main steps of the texture generation method according to the invention.
- Fig. 3 is a functional flow chart showing in detail step 150 of Fig. 2;
- Fig. 4 is a functional flow chart showing in detail a first trace generation mode
- FIG. 5 is a functional flow chart showing in detail a second mode of generating a trace.
- By "physical parameter" is meant any element, property or physical and/or chemical characteristic that can be measured, detected, observed or quantified, characterizing an object, a particle, an environment, an emitter, etc.
- By "parameter architecture" is meant the set of parameters making it possible to define the physical, chemical (constituents, properties, visual appearance of an object, texture, etc.) and behavioral characteristics of an element (particle, texture, object, etc.).
- By "particle" or "parametric particle" is meant an elementary physical or chemical unit in its state during projection (solid, liquid, gaseous, or a mixture of these phases), which, when projected onto an object, generates a parameterized trace producing one or more physical and/or chemical modifications of at least the surface of that object, in particular its textures, so that at least one of its parameters, in particular a visible characteristic, is modified.
- By "particle emitter" is meant an element, in particular a virtual element visible or not in a scene, for projecting one or more physically parameterized particles onto an object that is also physically parameterized, such as a gun, spray gun, jet, spotlight (for photons, or luminous or heating particles, etc.), and so on.
- a scene may include one or more emitters.
- the parameters of an emitter preferably include its position in the scene, its orientation, and the angle of emission or projection of the particles.
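A minimal emitter carrying the parameters just listed (position, orientation, emission cone, rate, speed) might be sketched as follows; the class and its fields are illustrative assumptions, restricted to two dimensions for brevity:

```python
import math
import random

class Emitter:
    """A 2-D particle emitter parameterized by position, direction and cone."""
    def __init__(self, position, direction, cone_deg=15.0, rate=10,
                 speed=1.0, seed=0):
        self.position = position            # (x, y) in scene coordinates
        self.direction = direction          # unit vector of emission
        self.cone = math.radians(cone_deg)  # half-angle of the emission cone
        self.rate = rate                    # particles per time step
        self.speed = speed                  # initial particle speed
        self.rng = random.Random(seed)      # deterministic jitter

    def emit(self):
        """Yield initial (position, velocity) pairs for one time step."""
        base = math.atan2(self.direction[1], self.direction[0])
        for _ in range(self.rate):
            # jitter each particle's direction within the cone
            a = base + self.rng.uniform(-self.cone, self.cone)
            vel = (self.speed * math.cos(a), self.speed * math.sin(a))
            yield (self.position, vel)
```

The animation simulation module would then advance each emitted `(position, velocity)` pair along its trajectory until extinction or collision.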
- trace or parameterized trace
- By "graphic effect" is meant a description of the physical and/or chemical process that determines how one or more traces generated on a target object affect the texture of that object. As an illustration, here are some examples of graphic effects:
- a trace of liquid on bare wood is absorbed by the wood. Alternatively, it has the effect of darkening the color of the wood;
- heat applied to a painted material causes the paint to peel and then burn, depending on the temperature set by the user, and possibly calcines the material to which the paint is applied if it is combustible;
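The two example effects above might be expressed as simple rules mapping a surface state and a trace quantity to a texture change; all thresholds and factors below are invented for illustration:

```python
def liquid_on_bare_wood(surface, amount):
    """Liquid on bare wood: darken the color in proportion to absorption."""
    r, g, b = surface["color"]
    k = 1.0 - 0.3 * min(amount, 1.0)   # assumed darkening factor
    surface["color"] = (r * k, g * k, b * k)
    return surface

def heat_on_painted_material(surface, temperature):
    """Heat: peel the paint above one threshold, burn above another,
    and calcine a combustible substrate above a third (all assumed)."""
    if temperature > 400 and surface.get("combustible"):
        surface["state"] = "calcined"
    elif temperature > 250:
        surface["state"] = "burnt"
    elif temperature > 120:
        surface["state"] = "peeling"
    return surface
```

Each rule reads the material parameters of the surface, which is what lets the same particle produce different renderings on different materials.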
- By "procedural texture" is meant a texture defined algorithmically and/or mathematically and displayed by means of a rendering engine that transforms the mathematical data into a conventional image format such as, for example, a bitmap.
- FIG. 1 illustrates an example of a system for generating procedural textures according to the invention. It comprises at least one microprocessor 2 adapted for the implementation of instructions contained in an instruction memory 3. A plurality of modules are advantageously provided by the implementation of the instructions by the microprocessor.
- An animation simulation module 4 makes it possible to establish the data related to the movements of the various elements of the scene. This animation data further includes the spatial coordinates as a function of time, events such as collisions, extinctions, etc., for each of the elements.
- a tracer module 5 makes it possible to determine the displacement data of the particles on a target object after a collision of the particle against the object.
- An integrator 6 of physical parameters makes it possible, from the physical parameters concerned, to generate a new set of textures for the object subjected to the various elements and parameters.
- An optional mixer module 7 makes it possible to take into account the data of several superimposed traces, when several traces have points or path portions in common. The mixer thus makes it possible to define, as a function of the influence of each trace, a mixed or global trace portion. The latter is used by the integrator in the area concerned.
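One possible reading of the mixer's role is an influence-weighted blend over the surface areas shared by several traces; the weighting scheme below is an assumption, not something the text specifies:

```python
def mix_traces(traces):
    """Blend overlapping traces into one global value per surface cell.

    `traces` maps a cell id to a list of (value, weight) contributions,
    where the weight stands for the influence of each trace. The mixer
    returns the influence-weighted value per cell, which the integrator
    would then use in the area concerned.
    """
    mixed = {}
    for cell, contributions in traces.items():
        total_w = sum(w for _, w in contributions)
        mixed[cell] = sum(v * w for v, w in contributions) / total_w
    return mixed
```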
- To illustrate the function of the trace mixer, here are some non-limiting examples:
- a user input 8 can receive data from an external source, including a user who would interact on the physical phenomena in progress or future.
- An optional temporal backup module 9 makes it possible to keep data related to a time scale. This module makes it possible, for example, to run an animation simulation again after changing one or more parameters, performing only the operations required by the modified data. It is thus possible to carry out several successive simulations based on a previous simulation simply and quickly, or to retrieve a simulation previously carried out.
- a bus 10 allows data transfers between the various modules and the memory elements described below.
- an emitter memory element 11 comprises data of at least one emitter or particle engine. These data include, for example, the spatial coordinates and the orientation of the emitter as a function of time, the particle emission cone, the emission rate, the speed and/or emission force, etc.
- the data of emitted particles are contained in a particle memory element 12. These data include for example the physical characteristics of the particles such as shapes, dimensions, weight, adhesion, elasticity, etc.
- a target object data element 13 stores the data of the target objects that can be impacted during an animation simulation. These data include, for example, the physical characteristics of the target objects such as shapes, dimensions, weights, and various features related to the surface and textures of the objects.
- a trace data element 14 stores the data of the traces generated by the particles on a given target object. These data may include a plurality of parameters such as width, depth and profile as a function of the position along the trace, roughness, porosity, etc. In general, any parameter that may influence the texture characteristics of the object concerned may be taken into account. Indices can be assigned to each of the parameters in order to weight their relative importance.
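The importance indices mentioned for the trace parameters could be represented as per-parameter weights; the field names and the plain weighted average below are illustrative assumptions:

```python
# A hypothetical trace record: each parameter carries a value and an
# importance index ("weight") reflecting its relative influence.
trace = {
    "width":     {"value": 2.0, "weight": 0.9},
    "depth":     {"value": 0.3, "weight": 0.5},
    "roughness": {"value": 0.7, "weight": 0.2},
}

def weighted_score(trace):
    """Collapse a trace's parameters into one influence score using the
    importance indices (a plain weighted average, chosen for illustration)."""
    total_w = sum(p["weight"] for p in trace.values())
    return sum(p["value"] * p["weight"] for p in trace.values()) / total_w
```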
- a graphic effect data element 15 stores the data of the graphic effects implemented during simulation of animations. These graphic effects may include color, intensity, gloss, grain size, etc. parameters.
- An optional global parameter element 16 comprises the parameters that may affect several elements of the scene, such as for example data of temperature, pressure, humidity, physical force (magnetic, gravitational or other), and so on.
- a target object texture data element 17 stores the data of the new textures of the target objects that can be impacted during an animation simulation. Any initial textural data may also be contained in this memory element 17.
- FIG. 2 shows a functional flowchart of the main steps of the texture generation method according to the invention. In step 110, the system and the working data are initialized. Optional step 120 can receive user data to adjust or correct data or parameters to be processed by the animation simulation module 4, according to a particular user wish or need.
- Step 130, at the level of the animation simulation module 4, provides for the reception of the data relating to the particles, the emitter or emitters, the object or objects, as well as any environmental data.
- the animation simulation module 4 thus receives all the parameters enabling it to perform an animation of the scene.
- This animation simulation includes a phase of calculating the trajectories of the elements likely to move in the scene, such as the particles and possibly the objects and / or the emitters. Steps 141 to 148 show in more detail the different steps of this trajectory calculation phase.
- phase 150 provides the integration of the physical parameters and the generation and / or adaptation of a new set of textures for the object or objects affected by the events occurring in the scene. This phase is presented in more detail in Figure 3.
- The calculation of the trajectories advantageously starts with a test, performed in step 141, checking whether the particle concerned is active or extinguished. If it is extinguished, step 145 applies in order to update the relevant particle data.
- the data relating to the particle comprises a parameter related to the extinction of said particle.
- a second test, in step 142, makes it possible to check whether the particle collides with an object. If the test gives a negative result, step 145 applies in order to update the relevant particle data.
- a trace generation phase 143 is performed.
- a possible trace modification phase 144 is then performed in the case where one or more traces are affected by a new collision or trace.
- the next step, 145, ensures that the data affected by the preceding steps or phases are updated, in particular the particle and/or trace data.
- the calculation phase ends in step 146.
- the test of step 141 is followed by a test 147 making it possible to check whether the particle being processed generates a new particle or a new emitter. If this test is positive, step 148 updates the emitter data according to this generation of particles. Otherwise, step 145 applies, in order to update the data of the particles concerned.
- the test 147 is also carried out in the case where the collision test with an object of the step 142 gives a positive result.
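The per-particle control flow of steps 141 to 148 can be summarized by a small update function; the particle fields, the callbacks, and the exact ordering of the branches are simplifying assumptions:

```python
def step_particle(p, collides, spawns_new):
    """One simulation step for one particle, mirroring tests 141/142/147.

    `p` is a dict with an "active" flag; `collides` and `spawns_new` are
    stand-ins for the collision test (142) and generation test (147).
    Returns the list of actions taken, in order.
    """
    if not p["active"]:          # test 141: extinguished particle
        return ["update"]        # step 145 only
    events = []
    if collides(p):              # test 142: collision with an object
        events.append("trace")   # phase 143 (and possibly 144)
        if spawns_new(p):        # test 147, also run after a collision
            events.append("emitter_update")   # step 148
    events.append("update")      # step 145 closes every branch
    return events
```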
- FIG. 3 shows in more detail the subsidiary steps of the phase 150 of integration of the physical parameters and generation and / or adaptation of textures resulting from the events taking place in the scene.
- the physical parameter integrator 6 receives the trace data, the object data, and the applicable graphic effect data.
- the integrator executes, in step 152, the graphic effects corresponding to the received data, according to the data of the object and the relevant trace data: for example, paint peeling, corrosion (for a metallic material), burning, a trace stopping on a porous material, or flowing on a non-absorbent one.
- Step 153 makes it possible to check whether one or more other traces are to be taken into account. If this is the case, the trace mixer module 7 performs the sharing of the trace parameters for the areas common to several traces.
- any user data are taken into account in step 155.
- the physical parameters are integrated by the integrator 6 to generate and / or modify the textures of the object.
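Phase 150 in miniature: the integrator receives trace, object and graphic-effect data, executes the effects (step 152), merges overlapping traces (steps 153/154), applies optional user data (step 155), and produces the new texture set. The function below is a hedged sketch of that pipeline; the dictionary layout and the simple averaging stand-in for the trace mixer are assumptions:

```python
def integrate(obj, traces, effects, user_data=None):
    """Produce a new texture set for one object from its traces and effects.

    `traces` is a list of scalar trace intensities; their mean stands in
    for the mixer's blended global trace. Each effect is a function
    (texture, trace) -> texture, applied in order (step 152).
    """
    texture = dict(obj["texture"])        # start from the object's textures
    merged = sum(traces) / len(traces)    # stand-in for the trace mixer
    for effect in effects:                # step 152: run each graphic effect
        texture = effect(texture, merged)
    if user_data:                         # step 155: optional user input
        texture.update(user_data)
    return texture                        # the object's new texture set
```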
- phase 143 is detailed in Figures 4 and 5, in steps 200 to 250 for the case of Figure 4, and in steps 300 to 350 for the case of Figure 5.
- the tracer module 5 selects the rules to be applied depending on the type of particle. The rules determine the type of influence or physical effect that the particle concerned will have on the generated trace, and ultimately on the textures obtained for the object interacting with this particle.
- the rule is evaluated according to the parameters of the object.
- a trace is generated or modified according to the adapted rule.
- a test in step 220 makes it possible to check whether another rule applies or not.
- the tracer module 5 performs a selection of particles affected by the rule.
- the rule is evaluated based on the particle parameters and the object.
- a trace is generated or modified according to the adapted rule.
- a test in step 320 makes it possible to check whether another particle applies or not.
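The two trace-generation modes of figures 4 and 5 differ essentially in their outer loop: the first iterates over the rules applicable to one particle, the second over the particles affected by one rule. Both sketches below use invented rule objects and field names:

```python
def generate_trace_by_rules(particle, obj, rules):
    """Figure 4 mode: select rules by particle type, evaluate each on the
    target object, and accumulate the resulting trace data."""
    trace = {}
    for rule in rules:
        if rule["applies_to"] == particle["type"]:   # rule selection
            trace.update(rule["evaluate"](obj))      # rule evaluation
    return trace

def generate_trace_by_particles(rule, obj, particles):
    """Figure 5 mode: select the particles affected by one rule and
    evaluate the rule for each of them against the target object."""
    traces = []
    for p in particles:
        if p["type"] == rule["applies_to"]:          # particle selection
            traces.append(rule["evaluate"](obj))     # rule evaluation
    return traces
```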
- the system and method according to the invention are presented in a working environment adapted for an editing tool enabling a user to create or modify the rendering of one or more objects.
- the system and the method according to the invention are used in autonomous mode, for the generation of object renderings from pre-established physical parameters or that can be calculated by the system itself, for example according to intermediate results.
- Such exemplary embodiments are advantageously used for video games or films, in particular games or films in which the renderings or textures are generated by a procedural texture generation engine.
- WO2012014057, incorporated by reference herein, describes an example of such a rendering system and method.
- the system and method according to the invention make it possible to generate and / or modify renderings of objects by taking into account the technical factors (physical, chemical, thermodynamic, etc.) inherent to the objects themselves as well as to the environment of the scene.
- an emitter can project parametric particles with parameters related to corrosion.
- these physical parameters are distinct from the color data.
- the behaviors of the objects in relation to the projected particles make it possible, for example, to define that materials such as plastic do not react to corrosion effects, steel develops corroded areas, copper oxidizes, etc.
- certain parameters may be assigned to either the parameterized particles, objects, the environment, or the graphic effects.
- the distribution or parametric architecture can also vary to produce comparable renderings.
- the particles projected against the objects comprise only non-colorimetric parameters, such as, for example, thermal energy or heat data, pressure data, etc.
- the target object comprising several different materials may have different reaction modes depending on the materials on which the traces evolve.
- the traces can be different, the graphic effects can also be different, so that the final textures take into account the parameters of the different materials of the target.
- hot particles are emitted on a multi-material body.
- the traces make it possible to create a kind of "mapping" of temperature on the surface of the object. This "mapping", to which one applies the graphic effects, makes it possible to produce the final textures taking into account the different materials.
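The temperature "mapping" idea might be sketched as follows: hot particles deposit heat per surface cell, and the graphic effects then read the per-cell material to produce different final texture states. The one-dimensional grid, the material names, and all thresholds are invented for illustration:

```python
def bake_heat_map(impacts, grid_size):
    """Accumulate particle heat per cell of a 1-D surface strip.
    `impacts` is a list of (cell_index, energy) pairs."""
    heat = [0.0] * grid_size
    for cell, energy in impacts:
        heat[cell] += energy
    return heat

def apply_heat_effects(heat, materials):
    """Turn the heat map into per-cell texture states, material by
    material, so a multi-material body reacts non-uniformly."""
    out = []
    for h, m in zip(heat, materials):
        if m == "wood":
            out.append("charred" if h > 100 else "darkened" if h > 30 else "intact")
        elif m == "metal":
            out.append("annealed" if h > 100 else "intact")
        else:  # e.g. plastic
            out.append("melted" if h > 60 else "intact")
    return out
```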
- the temporal backup advantageously makes it possible to go back over a process to retrieve one of multiple previous states. It can also make it possible to redo a process by modifying only one or a few parameters while benefiting from the other, unmodified parameters, thus avoiding having to re-parameterize all the data. This makes it possible, for example, to easily and quickly compare the results obtained by modifying only certain parameters.
- a particle characteristic, for example color, size, hardness, temperature, etc.
- the figures and their descriptions made above illustrate the invention rather than limiting it.
- the reference signs in the claims are not limiting in nature.
- the verbs "comprise" and "include" do not exclude the presence of elements other than those listed in the claims.
- the word “a” preceding an element does not exclude the presence of a plurality of such elements.
- the system and method previously described advantageously operate in multi-channel mode, that is to say by processing several textures (diffuse, normal, etc.) at each step.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Processing Or Creating Images (AREA)
- Pharmaceuticals Containing Other Organic And Inorganic Compounds (AREA)
- Organic Low-Molecular-Weight Compounds And Preparation Thereof (AREA)
- Image Generation (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1301709A FR3008814B1 (fr) | 2013-07-18 | 2013-07-18 | Systeme et procede de generation de textures procedurales a l'aide de particules |
PCT/IB2014/001327 WO2015008135A1 (fr) | 2013-07-18 | 2014-07-15 | Système et procédé de génération de textures procédurales à l'aide de particules |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3022717A1 true EP3022717A1 (fr) | 2016-05-25 |
Family
ID=49753223
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP14767078.0A Pending EP3022717A1 (fr) | 2013-07-18 | 2014-07-15 | Système et procédé de génération de textures procédurales à l'aide de particules |
Country Status (6)
Country | Link |
---|---|
US (2) | US11176714B2 (fr) |
EP (1) | EP3022717A1 (fr) |
JP (1) | JP6486348B2 (fr) |
CA (2) | CA3200133C (fr) |
FR (1) | FR3008814B1 (fr) |
WO (1) | WO2015008135A1 (fr) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR3008814B1 (fr) | 2013-07-18 | 2016-12-23 | Allegorithmic | Systeme et procede de generation de textures procedurales a l'aide de particules |
WO2018017626A2 (fr) * | 2016-07-18 | 2018-01-25 | Patrick Baudisch | Système et procédé d'édition de modèles 3d |
CN114693847A (zh) * | 2020-12-25 | 2022-07-01 | 北京字跳网络技术有限公司 | 动态流体显示方法、装置、电子设备和可读介质 |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2681967B1 (fr) | 1991-10-01 | 1994-11-25 | Electronics For Imaging Inc | Procede et appareil pour modifier les couleurs d'une image a l'aide d'un ordinateur. |
US6204858B1 (en) | 1997-05-30 | 2001-03-20 | Adobe Systems Incorporated | System and method for adjusting color data of pixels in a digital image |
US6996509B2 (en) * | 1998-10-19 | 2006-02-07 | Ford Global Technologies, Llc | Paint spray particle trajectory analysis method and system |
JP4408681B2 (ja) | 2003-10-22 | 2010-02-03 | 株式会社バンダイナムコゲームス | プログラム、情報記憶媒体、及び画像生成システム |
US7557807B2 (en) * | 2005-07-01 | 2009-07-07 | Microsoft Corporation | Visual simulation of weathering by y-ton tracing |
US8085276B2 (en) | 2006-11-30 | 2011-12-27 | Adobe Systems Incorporated | Combined color harmony generation and artwork recoloring mechanism |
CA2648441A1 (fr) * | 2007-12-31 | 2009-06-30 | Exocortex Technologies, Inc. | Caracterisation rapide de dynamique des fluides |
JP5145510B2 (ja) | 2008-06-05 | 2013-02-20 | 株式会社大都技研 | 遊技台 |
US8657605B2 (en) * | 2009-07-10 | 2014-02-25 | Lincoln Global, Inc. | Virtual testing and inspection of a virtual weldment |
US8915740B2 (en) * | 2008-08-21 | 2014-12-23 | Lincoln Global, Inc. | Virtual reality pipe welding simulator |
US8319771B2 (en) * | 2008-09-30 | 2012-11-27 | Disney Enterprises, Inc. | Computer modelled environment |
DE102009018165A1 (de) * | 2009-04-18 | 2010-10-21 | Schreiber & Friends | Verfahren zur Darstellung eines animierten Objekts |
EP2599057B1 (fr) | 2010-07-30 | 2017-05-31 | Allegorithmic | Systeme et procede d'edition, d'optimisation et de rendu de textures procedurales |
US8854370B2 (en) | 2011-02-16 | 2014-10-07 | Apple Inc. | Color waveform |
EP3951748B1 (fr) | 2011-04-07 | 2023-10-25 | Lincoln Global, Inc. | Contrôle et inspection virtuels d'un ensemble soudé virtuel |
GB2520658A (en) * | 2012-09-21 | 2015-05-27 | Masonite Corp | Surface texture for molded articles |
US9192874B2 (en) * | 2013-03-15 | 2015-11-24 | Crayola, Llc | Digital coloring tools kit with dynamic digital paint palette |
FR3008814B1 (fr) * | 2013-07-18 | 2016-12-23 | Allegorithmic | Systeme et procede de generation de textures procedurales a l'aide de particules |
-
2013
- 2013-07-18 FR FR1301709A patent/FR3008814B1/fr active Active
-
2014
- 2014-07-15 CA CA3200133A patent/CA3200133C/fr active Active
- 2014-07-15 US US14/905,545 patent/US11176714B2/en active Active
- 2014-07-15 WO PCT/IB2014/001327 patent/WO2015008135A1/fr active Application Filing
- 2014-07-15 EP EP14767078.0A patent/EP3022717A1/fr active Pending
- 2014-07-15 CA CA2917383A patent/CA2917383C/fr active Active
- 2014-07-15 JP JP2016526718A patent/JP6486348B2/ja active Active
-
2021
- 2021-10-13 US US17/500,725 patent/US11688108B2/en active Active
Non-Patent Citations (2)
Title |
---|
None * |
See also references of WO2015008135A1 * |
Also Published As
Publication number | Publication date |
---|---|
JP6486348B2 (ja) | 2019-03-20 |
CA2917383C (fr) | 2023-08-01 |
FR3008814B1 (fr) | 2016-12-23 |
US20220108498A1 (en) | 2022-04-07 |
US11688108B2 (en) | 2023-06-27 |
CA3200133C (fr) | 2024-02-27 |
CA3200133A1 (fr) | 2015-01-22 |
US20160247297A1 (en) | 2016-08-25 |
JP2016528615A (ja) | 2016-09-15 |
FR3008814A1 (fr) | 2015-01-23 |
CA2917383A1 (fr) | 2015-01-22 |
WO2015008135A1 (fr) | 2015-01-22 |
US11176714B2 (en) | 2021-11-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Logothetis et al. | Px-net: Simple and efficient pixel-wise training of photometric stereo networks | |
CN110599574B (zh) | 游戏场景的渲染方法、装置及电子设备 | |
US9754406B2 (en) | Multiple light source simulation in computer graphics | |
US20180314204A1 (en) | Recording holographic data on reflective surfaces | |
US11688108B2 (en) | Generating procedural textures with the aid of particles | |
EP3180772B1 (fr) | Système et procédé de paramétrage colorimétrique et geométrique de textures procédurales sur un objet | |
Dammertz et al. | Progressive point‐light‐based global illumination | |
CA2917381C (fr) | Systeme et procede de generation de textures procedurales sur un objet | |
US20160282811A1 (en) | Relightable holograms | |
US20180321639A1 (en) | Applying holographic effects to prints | |
CN106780702B (zh) | 一种基于物理着色的方法及系统 | |
Gigilashvili et al. | Appearance manipulation in spatial augmented reality using image differences | |
CN104574493A (zh) | 一种远景平滑淡出的方法及装置 | |
Peddie et al. | Work flow and material standards | |
Litster | Blender 2.5 materials and textures cookbook | |
Rudolfova et al. | High fidelity rendering of the interior of an egyptian temple | |
Young-Sik et al. | Application and Light Analysis for Photo-Realistic Output | |
Bell | Using Precisionism Within American Modern Art as Stylistic Inspiration for 3D Digital Works | |
Hnat et al. | Real-time wetting of porous media | |
De Zwart | Studio-Quality Rendering | |
Paquette et al. | Shaders and Texturing | |
Saam | Creative relighting in compositing based on z depth | |
Jin | A study on environment color simulation of surroundings of CG characters in live-action scene |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20160218 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20190619 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: ADOBE INC. |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230525 |