EP0978102A2 - Method and apparatus for mapping a two-dimensional texture onto a three-dimensional surface - Google Patents

Method and apparatus for mapping a two-dimensional texture onto a three-dimensional surface

Info

Publication number
EP0978102A2
Authority
EP
European Patent Office
Prior art keywords
texture
dimensional
polygonal
pattern
polygonal surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP97944280A
Other languages
German (de)
English (en)
Inventor
Steven Gentry
Jeffrey Pitts
Joyce Freedman
Maurizio Vecchione
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Styleclick Inc
Original Assignee
Modacad Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US08/842,622 (US5903270A)
Application filed by Modacad Inc filed Critical Modacad Inc
Publication of EP0978102A2

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

Definitions

  • This invention relates generally to the field of computer-aided design (CAD) systems, and particularly to a system for rendering three dimensional objects with surface textures applied thereto.
  • CAD: computer-aided design
  • CAD systems have been developed for providing point-of-sale demonstrations of product features and options.
  • One application for such a system allows prospective purchasers of furniture to view an item in any of the available finishes or fabrics prior to making a purchase decision.
  • Texture mapping is a well-known feature of computer graphics systems.
  • A three-dimensional object to be rendered is modeled with a wire frame defining a large number of polygonal surface areas.
  • A mapping function is then determined between a two-dimensional (u,v) texture space and the three-dimensional (x,y,z) object space.
  • Conventional texture mapping processes utilize various projection methods, such as cubic, cylindrical or spherical.
  • One of the primary objectives of these methods is to cover the object surface or parts thereof with seamless texture.
  • Each of these conventional methods introduces some degree of texture distortion as a necessary cost of seamless texture mapping. This is acceptable in most applications where relatively uniform surface textures are desired.
  • In some applications, however, the distortions inherent in prior art projection methods are not acceptable due to the appearance of elasticity in the surface. If the texture mapping is intended to depict a relatively inelastic surface covering on an object, such as fabric on upholstered furniture, seamless mapping is not required. Indeed, seams are a desirable artifact contributing to a more realistic appearance.
  • The present invention provides a method for mapping two-dimensional textures onto a three-dimensional object surface which is accurate in terms of pattern placement, scale, repeat and flow of various textures. This is accomplished by providing a unique user interface to control the texture mapping process.
  • The present invention utilizes a projection method whereby a faceted three-dimensional object is, in a sense, "flattened" by breaking the object along polygon edges. This creates an undistorted projection of the texture onto the surface of the object, albeit at the cost of seamless mapping. This cost is entirely acceptable and even desirable in many applications, such as correctly rendering upholstered furniture.
  • The method of the present invention can alternatively be viewed as "wrapping" a two-dimensional texture onto the three-dimensional surface. Instead of breaking the three-dimensional surface along polygon edges to permit a direct mapping from two-dimensional space to three-dimensional space, thereby losing the three-dimensional character of the object, the object is wrapped in texture while remaining in three-dimensional space.
  • The user controls how the texture should flow and where the texture should be cut and seamed to fit the geometry of the object. These operations are performed in a three-dimensional viewing window which allows the operator to manipulate a cursor directly on the three-dimensional surface.
  • The foregoing objectives are accomplished with a method comprising the steps of: displaying a graphic representation of the three-dimensional object in a first display window; displaying a surface detail pattern or other surface texture in a second display window; designating a first polygonal surface on the three-dimensional object; applying the surface detail pattern to the first polygonal surface; designating a second polygonal surface having a common boundary with the first polygonal surface; and applying the surface detail pattern to the second polygonal surface such that there is pattern continuity across the common boundary.
  • The invention also comprises a method for adding fine-scale surface detail to a three-dimensional object. This is achieved by directly "painting" details onto the three-dimensional surface to approximate local surface lighting effects that would naturally be caused by bumps, folds, waves and other disturbances on the textured surface.
  • Figure 1 illustrates the graphic user interface employed with the present invention.
  • Figure 2 illustrates a first alternative texture mapping of a simple three-dimensional object.
  • Figure 3 illustrates a second alternative texture mapping of a simple three-dimensional object.
  • Figure 4 is a functional flow diagram of the process steps of the present invention.
  • Figure 5 illustrates calculation of default u,v values for an object polygon.
  • Figures 6-8 illustrate u,v mapping to achieve texture flow across a border between adjacent object polygons.
  • Figure 1 illustrates the operating environment of the present invention.
  • The operator's computer display screen 10 is partitioned into a plurality of windows.
  • An object window 12 displays a three-dimensional object 20 to which a surface texture is to be applied.
  • A cursor 13 is provided under control of a mouse or other pointing device 18.
  • A control window 14 provides the operator with "point and click" controls for display of the three-dimensional object. These controls allow the operator to rotate the object in any direction and to zoom in or out.
  • A texture window 16 displays the two-dimensional surface texture that is to be applied to the object in window 12.
  • The operator begins by designating one of the surface polygons of object 20, for example polygon A, by positioning the cursor on the polygon. The operator must then decide how to proceed with wrapping the object with the surface texture. Since the texture is treated as a relatively inelastic material, such as a fabric, this entails a decision as to where seams will be placed on the object. As illustrated in Figure 2, the operator first clicks the cursor on polygon A and then drags the cursor to polygon B.
  • As each of the polygons is designated by the operator, it is projected onto the two-dimensional texture image.
  • Projection A' is displayed in texture window 16.
  • Projections B' and C' are displayed in texture window 16 as the operator designates polygons B and C, respectively.
  • The end result is a "flattening" of object 20 over the texture image.
  • The operator may adjust the texture mapping by manipulating the polygon projections within the texture window.
  • In Figure 3, an alternative wrapping of texture onto object 20 is illustrated.
  • The operator again initially designates polygon A and then drags the texture onto polygon B.
  • The operator then clicks on polygon A again and drags the texture onto polygon C so that there is pattern continuity across the border between polygons A and C. This results in a seam along the border between polygons B and C.
  • This is reflected in texture window 16, where projection C' is now shown adjacent to projection A' rather than B'.
  • A three-dimensional object is modeled using any of a number of conventional modeling applications.
  • The object is defined as a set of polygons and polygon groups in a three-dimensional (x,y,z) coordinate system.
  • The object data is stored in a model file such as a .OBJ file. Creation of the three-dimensional object model will normally be done "off line" from the texture mapping process of the present invention.
  • Once the desired object model is created or imported, it is displayed as a wire frame in the operator's object window. Using the operator controls, the object can be viewed from any location in three-dimensional space.
  • The desired texture image, which will typically be imported as a .TIF, .BMP or .WTX file, is displayed in the operator's texture window.
  • The texture image has a defined "center point", which can be assigned and moved by the operator.
  • The center point is a reference for texture pattern placement on the object.
  • The operator can also establish texture flow settings. These operator-defined settings, along with the texture image, repeat characteristics and physical dimensions of the texture, are all saved as a .WTX formatted file.
  • The information contained in such a file permits a single set of u,v mapping values in real-world dimensions to be used with any texture image saved as a .WTX file.
  • Thus, any other texture can be automatically applied to that object.
  • The operator designates an initial polygon on the surface of the three-dimensional object.
  • The polygon is automatically filled with a default mapping of the texture based on the defined center point of the texture.
  • The designated polygon is "flattened" and is displayed as a two-dimensional projection on the texture image in the texture window.
  • The polygon projection can be translated and/or rotated in the texture window to modify the default u,v mapping. Changes to the u,v mapping values are reflected in the three-dimensional rendering of the object in the object window. Individual vertices of the projected polygon in the texture window can be translated by the operator to introduce distortion in order to simulate the effect of fabric stretch. Again, changes made in this manner in the texture window are reflected in the three-dimensional rendering of the object.
  • The operator selects the next polygon by dragging from a previously textured adjacent polygon or by simply selecting a new polygon.
  • Clicking and dragging from a filled polygon to an adjacent empty polygon flows the texture in a continuous fashion across the border between the two polygons.
  • Otherwise, the default u,v mapping is applied to the new polygon.
  • Steps 40, 42 and 44 are repeated for each newly selected polygon until surface texture has been applied to the entire object or to the desired portions of the object. If less than the entire object is covered with a particular texture, the set of polygons thus covered can be associated to form a "part". Parts of an object thus defined have the characteristic of sharing the same texture in any future renderings of the object.
  • A polygon on the surface of the three-dimensional object is defined by the coordinates of its vertices in x,y,z space. It is presumed that all polygons defining an object are either triangles or quadrilaterals.
  • The polygon is "flattened" to u,v space by projecting it onto a two-dimensional plane which intersects all of the vertices of the polygon.
  • The flattening process involves three steps: first, one of the vertices is presumed to lie at the origin of two-dimensional x,y space and an adjacent vertex is presumed to be located on the x-axis; second, the remaining vertex or vertices are transformed about the origin to be coincident with the two-dimensional x,y plane; third, all vertices are translated using the u,v location of the defined texture center point as a delta value.
  • Vertex A is presumed to have x,y coordinates (0,0).
  • Vector A is defined as the polygon side between vertex A and vertex C.
  • Vector B is defined as the polygon side between vertex A and vertex B.
  • Length a is the length of vector A.
  • Lengths b and c correspond to the x and y coordinates of vertex B, respectively, as can be seen from Figure 5.
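As a rough sketch of this computation (assuming a triangular polygon; the function name and coordinate conventions are illustrative, not from the patent), the lengths a, b and c can be derived from the 3D vertex coordinates alone:

```python
import math

def flatten_triangle(A, B, C):
    """Flatten a 3D triangle into the 2D plane it lies in.

    Vertex A is placed at the origin and vertex C on the x-axis,
    following the three-step process described above. Returns the
    2D coordinates of vertices A, B and C.
    """
    # Vector A runs from vertex A to vertex C; vector B from vertex A to vertex B.
    vec_a = [C[i] - A[i] for i in range(3)]
    vec_b = [B[i] - A[i] for i in range(3)]

    a = math.sqrt(sum(x * x for x in vec_a))          # length a of vector A
    # b is the projection of vector B onto vector A (the x coordinate of
    # vertex B); c is the perpendicular distance (its y coordinate).
    b = sum(x * y for x, y in zip(vec_a, vec_b)) / a
    c = math.sqrt(max(sum(x * x for x in vec_b) - b * b, 0.0))

    return (0.0, 0.0), (b, c), (a, 0.0)

# A right triangle lying in the plane z = 5 keeps its shape when flattened.
print(flatten_triangle((0, 0, 5), (0, 2, 5), (3, 0, 5)))
# → ((0.0, 0.0), (0.0, 2.0), (3.0, 0.0))
```

A quadrilateral would be handled the same way, projecting its fourth vertex onto the same plane before applying the texture center-point delta.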
  • The u,v values of the polygon vertices may be translated and/or rotated in the u,v plane as previously described.
  • The u,v values for the second polygon are calculated as follows.
  • Polygon A has been rotated and translated in the u,v plane, and default u,v values for the second polygon B have been calculated in the same manner as previously described.
  • The vertices representing the common edge between the first and second polygons are then determined based on the known x,y,z coordinates of the vertices.
  • Vector A of polygon A and vector B of polygon B are determined to define the common edge between the polygons.
  • Polygon B is translated so that the origin of vector B coincides with the origin of vector A.
  • Rotational values are calculated based on the definition of θ in equation (6) above.
  • Polygon B is rotated as shown in Figure 8 so that vectors A and B are coincident.
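The translate-then-rotate alignment of Figures 7 and 8 can be sketched as follows (a hypothetical helper, not code from the patent): given the shared edge as it appears in each polygon's u,v mapping, polygon B's vertices are moved so that the two copies of the edge coincide:

```python
import math

def align_to_shared_edge(poly_b_uv, b_edge, a_edge):
    """Translate and rotate polygon B's u,v vertices so that its copy of
    the shared edge coincides with polygon A's copy, giving pattern
    continuity across the border.

    poly_b_uv : list of (u, v) vertices of polygon B
    b_edge    : (origin, tip) of the shared edge as mapped in polygon B
    a_edge    : (origin, tip) of the same edge as mapped in polygon A
    """
    (bx0, by0), (bx1, by1) = b_edge
    (ax0, ay0), (ax1, ay1) = a_edge

    # Rotation that carries B's edge direction onto A's edge direction.
    theta = math.atan2(ay1 - ay0, ax1 - ax0) - math.atan2(by1 - by0, bx1 - bx0)
    cos_t, sin_t = math.cos(theta), math.sin(theta)

    aligned = []
    for (u, v) in poly_b_uv:
        # Translate so the edge origins coincide, then rotate about that origin.
        u, v = u - bx0, v - by0
        aligned.append((ax0 + u * cos_t - v * sin_t,
                        ay0 + u * sin_t + v * cos_t))
    return aligned

# Polygon B's copy of the shared edge runs from (0,0) to (1,0); polygon A's
# copy runs from (2,2) to (2,3), so B is translated and rotated 90 degrees.
square = [(0, 0), (1, 0), (1, -1), (0, -1)]
print(align_to_shared_edge(square, ((0, 0), (1, 0)), ((2, 2), (2, 3))))
```

Because the edge length is the same in both mappings (it comes from the same 3D edge), matching the origin and direction is enough to make the vectors coincident.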
  • Interior pixel u,v values are mapped to two-dimensional screen space using any conventional interpolation scheme.
  • One particularly suitable scheme is rational linear interpolation, as described in "Interpolation for Polygon Texture Mapping and Shading" by Paul S. Heckbert and Henry P. Moreton.
  • The homogeneous texture coordinates suitable for linear interpolation in screen space can be computed simply by dividing the texture coordinates by the pixel depth w, linearly interpolating the normalized coordinates (u/w, v/w), and multiplying the interpolated normalized coordinates by w at each pixel to recover the texture coordinates. This process maps texture to the polygon interior with correct perspective.
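This interpolation step can be sketched in a few lines (illustrative names; the patent cites Heckbert and Moreton rather than giving code). Note that 1/w must itself be interpolated so the per-pixel division recovers the true texture coordinates:

```python
def perspective_correct_uv(p0, p1, t):
    """Interpolate texture coordinates with correct perspective between two
    screen-space endpoints: normalize to (u/w, v/w, 1/w), interpolate
    linearly, then divide by the interpolated 1/w to recover (u, v).

    p0, p1 : (u, v, w) at the two endpoints
    t      : interpolation parameter in [0, 1]
    """
    u0, v0, w0 = p0
    u1, v1, w1 = p1
    # Linear interpolation of the normalized homogeneous quantities.
    uw = (1 - t) * (u0 / w0) + t * (u1 / w1)
    vw = (1 - t) * (v0 / w0) + t * (v1 / w1)
    inv_w = (1 - t) * (1 / w0) + t * (1 / w1)
    return uw / inv_w, vw / inv_w

# Naive linear interpolation of u would give 0.5 at the midpoint; the
# perspective-correct result is biased toward the nearer endpoint (smaller w).
print(perspective_correct_uv((0.0, 0.0, 1.0), (1.0, 0.0, 3.0), 0.5))
```

The same three quantities, interpolated across a polygon's interior rather than along an edge, give the per-pixel mapping described above.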
  • A problem in achieving photorealism in computer graphics is the computational overhead of replicating realistic lighting and shading models, especially at very fine levels of detail.
  • The system of the present invention extends conventional texture mapping techniques to achieve high-quality detail in the rendered image without significantly increasing the computational time required for rendering. This is accomplished by painting certain surface details directly onto the three-dimensional object. Such surface detail approximates local surface lighting effects which would result from bumps, folds, waves and other disturbances in the flow of the surface texture. Such effects are independent of the mapped texture and are difficult to represent directly in the three-dimensional model of the object using conventional modeling techniques.
  • An object texture mode of operation provides the operator with a palette of tools, including variously shaped brushes, pencils, stamps and others.
  • The operator manipulates these tools on the surface of the three-dimensional object in the object window to paint, shift, scale and shear surface textures directly on the rendered three-dimensional object.
  • Use of the tools assigns a pixel-by-pixel multiplier that modulates the lighting intensity at the surface of the object, thereby simulating highlights and shadowing of surface detail features.
  • The multiplier is normalized so that the unmodulated lighting intensity can be either dampened or intensified.
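As a minimal sketch of this modulation (the function name, clamping range and sample values are assumptions, not from the patent):

```python
def shade_pixel(base_intensity, detail_multiplier):
    """Modulate the computed lighting intensity at a pixel by a painted
    surface-detail multiplier, clamping the result to a displayable range.

    A multiplier of 1.0 leaves the lighting unchanged; values below 1.0
    dampen the pixel (simulated shadowing of a fold or bump) and values
    above 1.0 intensify it (a simulated highlight).
    """
    return max(0.0, min(1.0, base_intensity * detail_multiplier))

print(shade_pixel(0.6, 1.0))   # unmodulated lighting
print(shade_pixel(0.6, 0.5))   # dampened: a painted fold's shadow
print(shade_pixel(0.6, 1.5))   # intensified: a painted highlight
```

Because the multiplier is stored per pixel and applied after the ordinary lighting computation, the painted detail costs one multiply per pixel rather than any additional geometry or shading passes.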
  • The painted-on features become a part of the object model and are included in the .OBJ file. Such features will therefore be represented on the rendered object regardless of the texture that is applied.
  • The features applied with the surface detail tools are mapped to the texture window, where they can be further edited in the two-dimensional space of the surface polygon projections.

Abstract

This method of mapping two-dimensional textures onto a three-dimensional object yields correct results in terms of pattern placement, scale, repeat and flow of various textures. The method is carried out by providing a particular user interface for controlling the mapping process. The user controls how the texture should flow and where it should be 'cut and seamed' to fit the geometry of the three-dimensional object, by clicking and dragging directly on the surface applied to the object. Two windows are presented to the user simultaneously: one showing the three-dimensional object wrapped in the texture, and the other showing the two-dimensional texture onto which the object's surface polygons are projected; the mapping process can be controlled from either window. The two-dimensional texture window allows the user to orient the texture and to introduce mapping distortion in order to simulate stretching of the texture where appropriate.
EP97944280A 1996-05-23 1997-05-22 Procede et dispositif de mappage d'une texture bidimensionnelle sur une surface tridimensionnelle Withdrawn EP0978102A2 (fr)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US1821496P 1996-05-23 1996-05-23
US842622 1997-04-15
US08/842,622 US5903270A (en) 1997-04-15 1997-04-15 Method and apparatus for mapping a two-dimensional texture onto a three-dimensional surface
PCT/US1997/009073 WO1997045782A2 (fr) 1996-05-23 1997-05-22 Procede et dispositif de mappage d'une texture bidimensionnelle sur une surface tridimensionnelle

Publications (1)

Publication Number Publication Date
EP0978102A2 true EP0978102A2 (fr) 2000-02-09

Family

ID=26690864

Family Applications (1)

Application Number Title Priority Date Filing Date
EP97944280A Withdrawn EP0978102A2 (fr) 1996-05-23 1997-05-22 Procede et dispositif de mappage d'une texture bidimensionnelle sur une surface tridimensionnelle

Country Status (2)

Country Link
EP (1) EP0978102A2 (fr)
WO (1) WO1997045782A2 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6124858A (en) * 1997-04-14 2000-09-26 Adobe Systems Incorporated Raster image mapping
US6909443B1 (en) 1999-04-06 2005-06-21 Microsoft Corporation Method and apparatus for providing a three-dimensional task gallery computer interface
US6765567B1 (en) 1999-04-06 2004-07-20 Microsoft Corporation Method and apparatus for providing and accessing hidden tool spaces
US7119819B1 (en) 1999-04-06 2006-10-10 Microsoft Corporation Method and apparatus for supporting two-dimensional windows in a three-dimensional environment
WO2000060442A1 (fr) 1999-04-06 2000-10-12 Microsoft Corporation Procede et dispositif permettant de realiser une interface informatique a galerie de taches tridimensionnelle
WO2016086226A1 (fr) * 2014-11-26 2016-06-02 Massachusetts Institute Of Technology Systèmes, dispositifs et procédés d'impression sur des objets tridimensionnels

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US5255352A (en) * 1989-08-03 1993-10-19 Computer Design, Inc. Mapping of two-dimensional surface detail on three-dimensional surfaces
US5333245A (en) * 1990-09-07 1994-07-26 Modacad, Inc. Method and apparatus for mapping surface texture

Non-Patent Citations (1)

Title
See references of WO9745782A3 *

Also Published As

Publication number Publication date
WO1997045782A2 (fr) 1997-12-04
WO1997045782A3 (fr) 1998-04-30

Similar Documents

Publication Publication Date Title
US5903270A (en) Method and apparatus for mapping a two-dimensional texture onto a three-dimensional surface
US5592597A (en) Real-time image generation system for simulating physical paint, drawing media, and feature modeling with 3-D graphics
US7652675B2 (en) Dynamically adjusted brush for direct paint systems on parameterized multi-dimensional surfaces
US6628295B2 (en) Modifying a stylistic property of a vector-based path
US8456484B2 (en) Apparatus and methods for wrapping texture onto the surface of a virtual object
US5598182A (en) Image synthesis and processing
US9305389B2 (en) Reducing seam artifacts when applying a texture to a three-dimensional (3D) model
US6417850B1 (en) Depth painting for 3-D rendering applications
US7148899B2 (en) Texture mapping 3D objects
Foskey et al. ArtNova: Touch-enabled 3D model design
US5892691A (en) Method, apparatus, and software product for generating weighted deformations for geometric models
US5673377A (en) Method and system for displaying a representation of a three-dimensional object with surface features that conform to the surface of the three-dimensional object
US8963958B2 (en) Apparatus and methods for adjusting a texture wrapped onto the surface of a virtual object
US20030179203A1 (en) System and process for digital generation, placement, animation and display of feathers and other surface-attached geometry for computer generated imagery
US8269765B2 (en) System and method for removing seam artifacts
US20050128210A1 (en) Haptic graphical user interface for adjusting mapped texture
WO1992021096A1 (fr) Synthese et traitement de l'image
US7495663B2 (en) System and method for computing a continuous local neighborhood and paramaterization
US20090033674A1 (en) Method and apparatus for graphically defining surface normal maps
EP0978102A2 (fr) Procede et dispositif de mappage d'une texture bidimensionnelle sur une surface tridimensionnelle
WO1997045782A8 (fr) Procede et dispositif de mappage d'une texture bidimensionnelle sur une surface tridimensionnelle
JP2003504697A (ja) 副標本化テクスチャ端縁部のアンチエイリアシング
US8373715B1 (en) Projection painting with arbitrary paint surfaces
JP2763481B2 (ja) 画像合成装置及び画像合成方法
Sun et al. Interactive texture mapping for polygonal models

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 19981212

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: STYLECLICK.COM

17Q First examination report despatched

Effective date: 20020617

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20021028

REG Reference to a national code

Ref country code: HK

Ref legal event code: WD

Ref document number: 1026761

Country of ref document: HK