WO2002023484A1 - Texture mapping method and image rendering device - Google Patents
Texture mapping method and image rendering device
- Publication number
- WO2002023484A1 (PCT/JP2001/007936)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- texture
- polygon
- vertices
- coordinates
- amount
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
Definitions
- the present invention relates to a texture mapping method and a rendering device, and more particularly, to a texture mapping method and a rendering device capable of making the display results of shading and texture mapping consistent.
- An object in 3D-CG is generally represented as a set of polygonal faces (polygons) defined by a number of vertices.
- the rendering is performed based on the position of each vertex of the three-dimensional model, the connection information, and data on the material.
- calculation processing for each pixel (pixel) on the two-dimensional image is performed.
- the depth order is determined from the distance of each drawn pixel to the viewpoint (hidden-surface processing), and the color and brightness of each vertex are interpolated to smoothly shade the polygon attached to those vertices (shading).
- a method of expressing detail in a pseudo manner by pasting a two-dimensional image or the like onto the surface of the three-dimensional object obtained in this way is called texture mapping.
- Shading is a method of shading the surface of a 3D object composed of polygons based on the light source and the shape of the object. Since physically accurate shading calculations are extremely difficult and the amount of computation becomes enormous, various calculation methods have been devised to reduce it. In general, the higher the image quality, the greater the amount of calculation.
- Types of shading include Flat shading, Gouraud shading, and Phong shading.
- Flat shading is a method in which a normal is assigned to each face of the polygons that make up the surface, and the color of the face is determined from the inner product of this normal and the light source vector. Since shading is done face by face, smooth curved surfaces cannot be represented, but the computational cost is small and high-speed display is possible.
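A minimal sketch of this per-face computation in plain Python (vertex, light-vector, and color arguments are illustrative; real implementations run on graphics hardware):

```python
import math

def flat_shade(v0, v1, v2, light_dir, base_color):
    """Shade one triangular face: one normal per face, one color per face."""
    # Face normal from the cross product of two edge vectors.
    e1 = [v1[i] - v0[i] for i in range(3)]
    e2 = [v2[i] - v0[i] for i in range(3)]
    n = [e1[1] * e2[2] - e1[2] * e2[1],
         e1[2] * e2[0] - e1[0] * e2[2],
         e1[0] * e2[1] - e1[1] * e2[0]]
    length = math.sqrt(sum(c * c for c in n))
    n = [c / length for c in n]
    # Lambert term: inner product of the face normal and the light vector.
    intensity = max(0.0, sum(n[i] * light_dir[i] for i in range(3)))
    return tuple(c * intensity for c in base_color)
```

Because the normal and the inner product are computed once per face rather than per pixel, the cost is low, which is exactly the trade-off described above.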
- Gouraud shading is a method in which a normal is set at each vertex of the polygons that form the surface, and the color at each vertex is determined from the inner product of this normal and the light source vector. Between the vertices, colors are filled in by linear interpolation of the vertex colors. It is relatively fast and can simulate smooth curved surfaces, but high-curvature surfaces and highlights become unnatural.
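The per-pixel part of Gouraud shading reduces to a linear (barycentric) blend of the colors already computed at the vertices; a sketch, with the weights assumed to come from the rasterizer:

```python
def gouraud_pixel(vertex_colors, weights):
    """Linear (barycentric) interpolation of the colors already computed
    at the three vertices -- all that Gouraud shading does per pixel."""
    return tuple(sum(w * c[i] for w, c in zip(weights, vertex_colors))
                 for i in range(3))
```

The lighting calculation itself happens only three times per triangle, which is why highlights that fall between vertices are missed.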
- Phong shading sets a normal at each vertex of the polygons that compose the surface, and then, for each pixel drawn between the vertices in the result image, linearly interpolates the normal from the vertex normal vectors. The color of each pixel is determined from the inner product of the light source vector and this interpolated normal. High-quality results are obtained, but the computational cost per pixel is large, so it is rarely used in current real-time 3D-CG.
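The difference from Gouraud shading can be sketched as follows: the normal, not the color, is interpolated, and the inner product is taken once per pixel (function and argument names are illustrative):

```python
import math

def phong_pixel(normals, weights, light_dir, base_color):
    """Interpolate the three vertex normals for this pixel, renormalize,
    then shade from the inner product with the light vector."""
    n = [sum(w * nv[i] for w, nv in zip(weights, normals)) for i in range(3)]
    length = math.sqrt(sum(c * c for c in n))
    n = [c / length for c in n]
    intensity = max(0.0, sum(n[i] * light_dir[i] for i in range(3)))
    return tuple(c * intensity for c in base_color)
```

The square root and dot product per pixel are the extra cost the text refers to.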
- Texture mapping is a method of improving the texture by attaching an image to the surface of a 3D object.
- a (u, v) coordinate system is used for the texture image to be pasted, and (u, v) coordinate values are assigned to each vertex to determine how the texture is mapped (the mapping method).
- real-time 3D-CG generally generates images using a combination of Gouraud shading and texture mapping.
- Texture mapping is applied to the object obtained in this way. Texture coordinates are defined for each vertex of the object, and the texture is stretched and pasted between them by linear interpolation. A texture applied to a planar face of the object is then represented correctly from any viewpoint.
- for planar surfaces, the above method poses almost no problem. However, several problems arise when trying to represent a curved surface in the same way.
- a curved surface is simulated by dividing it into a set of fine polygonal planes. Many vertices and faces are then required, and the computational cost increases significantly. The number of divisions of the surface is determined by the trade-off between image quality and computational cost. However, considering the advantage of real-time 3D-CG that images can be obtained from arbitrary viewpoints, it is a problem that the curved surface inevitably appears coarse when details are displayed.
- Gouraud shading provides a pseudo-surface representation with approximately smooth shading.
- unnatural shading artifacts such as Mach bands occur.
- an object is projected onto a two-dimensional image, and is approximated through processes such as shading by illumination and pasting of an image pattern onto the object surface (texture mapping).
- a result image is generated.
- the present invention has been made to solve this new problem, namely the discrepancy between the display results of shading and texture mapping that occurs when texture mapping onto a pseudo curved surface. It is an object of the present invention to provide a texture mapping method and a rendering device capable of bringing the two into agreement.
- a texture mapping method comprises the steps of: preparing an object composed of one or more polygons in three-dimensional coordinates;
- a step of obtaining a texture coordinate shift amount, which is the error that arises between the surface of the polygon and the curved surface when viewed from the line of sight, based on the angle θ and the lift amount d at each point on the plane;
- a rendering device includes: a three-dimensional model storage unit that stores an object constituted by one or more polygons in three-dimensional coordinates; a texture storage unit that stores textures associated with a plurality of vertices of the polygon; a perspective transformation unit that performs perspective transformation of the object by projecting the plurality of vertices of the polygon onto screen coordinates; a drawing unit that draws pixels on the screen coordinates corresponding to the plurality of vertices of the polygon based on the texture; a linear interpolation unit that obtains texture coordinates corresponding to pixels between the plurality of vertices of the polygon on the screen coordinates by linear interpolation between the vertices; a texture coordinate shift amount calculation unit that receives the angle θ between the line-of-sight direction in three-dimensional space and the surface normal of the polygon, and the lift amount d at each point on the plane from the surface constituting the polygon to a predetermined curved surface, and from these obtains a texture coordinate shift amount, which is the error that arises between the surface of the polygon and the curved surface when viewed from the line of sight; and a texture coordinate correction unit that corrects the texture coordinates based on the texture coordinate shift amount. The rendering device reads out the texture data of the texture storage unit based on the corrected texture coordinates and renders based on that data.
- FIG. 1 is a block diagram of an apparatus according to an embodiment of the present invention.
- FIG. 2 is a flowchart of the process according to the embodiment of the present invention.
- FIG. 3 is an operation explanatory diagram of the embodiment of the present invention.
- FIG. 4 is an operation explanatory diagram of the embodiment of the present invention.
- FIG. 5 is an operation explanatory diagram of the embodiment of the present invention.
- FIG. 6 is an example of a processing result according to the embodiment of the present invention.
- the embodiment of the present invention proposes an "intra-texture shift" method that applies a correction for this problem during texture mapping.
- the curved surface shape assumed for the final rendered image is calculated backward, and a distorted texture image is dynamically generated and pasted; this is a method of obtaining the same effect as texturing a true curved surface.
- Dynamic generation of texture images has been studied in the field of image-based rendering (Oliveira, Manuel M., Relief Texture Mapping, Ph.D. Dissertation, University of North Carolina, 2000), but various devices are required to reduce the amount of calculation for real-time use. In the present method, this texture image generation is realized by shifting the texture image reference position of each drawn pixel during texture mapping, instead of actually distorting the texture image.
- a computer has a central processing unit (including a CPU and memory), external storage devices (including a hard disk, a CD-ROM drive, and a floppy disk drive), input devices (including a keyboard and a pointing device), output devices (including a CRT display and a liquid crystal display), and communication devices (including a modem and a terminal adapter).
- FIG. 1 is a block diagram showing a main part of an apparatus according to an embodiment of the present invention. This figure is, for example, a functional block diagram showing a main part in a computer where a program according to this apparatus / method is installed. For convenience of explanation, only necessary parts are shown, and the apparatus / method may include other parts not shown as necessary.
- reference numeral 1 denotes a three-dimensional model storage unit that stores an object constituted by one or more polygons in three-dimensional coordinates
- reference numeral 2 denotes a plurality of vertices of the polygon projected on screen coordinates
- a perspective transformation unit that performs perspective transformation of the object
- reference numeral 3 denotes a drawing unit that renders pixels on the screen coordinates corresponding to a plurality of vertices of the polygon based on a texture
- reference numeral 4 denotes a texture storage unit that stores textures associated with a plurality of vertices of the polygon
- reference numeral 5 denotes a linear interpolation unit that obtains texture coordinates corresponding to pixels between the plurality of vertices of the polygon on the screen coordinates by performing linear interpolation between the vertices
- reference numeral 6 denotes a texture coordinate shift amount calculation unit that receives the angle θ between the line-of-sight direction in three-dimensional space and the surface normal of the polygon, and the lift amount d at each point on the plane from the surface forming the polygon to a predetermined curved surface, and based on these calculates a texture coordinate shift amount, which is the error that arises between the polygon surface and the curved surface when viewed from the line of sight
- reference numeral 7 denotes a texture coordinate correction unit that corrects the texture coordinates based on the texture coordinate shift amount.
- FIG. 2 is a flowchart of a process according to the embodiment of the present invention.
- the process of drawing each pixel of the result image after completing the per-vertex coordinate transformation calculations is called "rasterization".
- in Gouraud shading, drawing at this stage is performed by linear interpolation between the vertices, based on the positions, colors, and texture coordinates at which the vertices are projected onto the result image.
- the corresponding color is obtained from the texture image based on the texture coordinates calculated by interpolation, and the color of each pixel to be drawn is determined.
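A minimal sketch of this lookup, assuming a texture stored as a 2D list of RGB tuples and barycentric weights supplied by the rasterizer (names are illustrative):

```python
def sample_texture(texture, u, v):
    """Nearest-texel lookup; u and v are in [0, 1)."""
    h = len(texture)
    w = len(texture[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y][x]

def pixel_color(uv0, uv1, uv2, weights, texture):
    # Linearly interpolate the per-vertex texture coordinates,
    # then fetch the corresponding color from the texture image.
    u = sum(w * uv[0] for w, uv in zip(weights, (uv0, uv1, uv2)))
    v = sum(w * uv[1] for w, uv in zip(weights, (uv0, uv1, uv2)))
    return sample_texture(texture, u, v)
```

It is exactly this interpolated (u, v), computed as if the face were flat, that the texture coordinate shift described below corrects.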
- the problem of applying a texture to a pseudo curved surface arises because the texture coordinates are defined per vertex, and all points between vertices, that is, points on the surface, are computed by linear interpolation. Therefore, the apparatus according to the embodiment of the present invention includes the texture coordinate shift amount calculator 6 and the texture coordinate corrector 7, and shifts the texture coordinates according to the pseudo curved surface. This enables smooth and natural texture expression.
- the texture coordinate shift amount calculator 6 and the texture coordinate corrector 7 correct the reference position of the texture coordinates at rasterization time.
- rasterization performs the drawing calculation for each pixel, so if the curved-surface calculation can also be performed per pixel, a highly detailed curved-surface display can always be obtained. This is analogous to the difference between Gouraud and Phong shading.
- the calculation amount increases by this amount, but the calculation for obtaining the texture reference coordinates on the same plane is relatively simple.
- this method is referred to as pseudo curved-surface representation by texture coordinate shift. The following points must be kept in mind when doing this.
- e_u is the projection of the line-of-sight vector e onto this plane.
- if this plane were the intended curved surface, the point that should exist at A instead appears at point B on the plane, so there is a discrepancy between the plane and the curved surface when viewed from the e direction.
- to compensate, the texture coordinates are shifted for each pixel in the plane, and the texture coordinates of B are assigned to the location C.
- the texture coordinate value of B is required; for that, d·tanθ_u, that is, the lift amount d at point B on the plane, is needed. But to find this d, the texture coordinate of B is itself needed. Therefore, assuming that the assumed curved surface is sufficiently smooth and that the angle θ between the normal of the plane and the line-of-sight vector is sufficiently small (tanθ is small), the approximation d(B) ≈ d(C) can be used.
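Under the approximation d(B) ≈ d(C), the per-pixel correction reduces to sampling d at the pixel's own coordinates and moving the lookup by d·tanθ along e_u; a sketch with a hypothetical `d_table` callback standing in for the stored lift amounts:

```python
import math

def shifted_texcoord(u, v, d_table, theta, e_u, texel_size=1.0):
    """Shift the texture lookup for one pixel.
    d_table(u, v) returns the lift amount d at texture coordinate (u, v);
    e_u is the unit projection of the view vector onto the plane,
    expressed in texture space. Both names are illustrative."""
    d = d_table(u, v)            # approximation: d(B) ~ d(C)
    shift = d * math.tan(theta)  # magnitude of the parallax-style shift
    return (u + shift * e_u[0] / texel_size,
            v + shift * e_u[1] / texel_size)
```

When theta is zero (viewing the face head-on) the shift vanishes, matching the geometric argument above.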
- the angle θ between n and e required here can be determined once per plane rather than for each pixel.
- the angle between the line-of-sight vector and the normal vector must also be determined in the illumination calculation during shading, and this type of calculation can be performed at high speed on a graphics chip.
- the lift amount d(u, v) from the plane to the assumed curved surface is calculated. This can be approximated from the positions of the vertices or from the normal vectors. However, if this calculation is performed for each pixel during rasterization, the amount of calculation becomes extremely large. A method to eliminate this is therefore considered here.
- the lift amount to be obtained is a static value that differs at each point on the texture image but does not change from one drawing to the next. These values can therefore be calculated in advance and stored for each surface.
- a texture image is prepared in which the lift amount d(u, v) is encoded as gray levels, and it is referenced at rasterization time in the same way as the value of each pixel of an ordinary texture image.
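A sketch of baking such a table, with `d_func` standing in for the application's assumed-curved-surface lift (a hypothetical callback); the 8-bit quantization mirrors a grayscale texture:

```python
def bake_lift_table(d_func, size):
    """Precompute the lift amount d(u, v) into a size x size table,
    quantized to 8 bits like a grayscale texture image.
    d_func(u, v) -> lift amount is an illustrative stand-in."""
    d_max = max(d_func(x / size, y / size)
                for y in range(size) for x in range(size)) or 1.0
    table = [[round(255 * d_func(x / size, y / size) / d_max)
              for x in range(size)]
             for y in range(size)]
    return table, d_max
```

The table is computed once per surface; at rasterization time only a texture-style lookup remains, which is the point of the precomputation.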
- multi-texture technology blends two or more texture images onto one surface; it is a function available in APIs such as DirectX, for example.
- the second texture image is used as a lift amount table, and a process of "shifting the texture coordinates" is performed instead of blending. This provides an implementation that can run in real time even on current hardware supporting techniques such as multi-texturing and EMBM.
- the first method is to give the texture image an α value (usually representing transparency) along with the RGB color values, and to use the α value as the lift amount.
- color and transparency are normally expressed by giving a texture image a total of four channels of information: RGB plus α.
- another method is to prepare a separate texture image dedicated to the lift amount.
- with the first method, the lift amount table can be carried in the α channel, and if transparency is also required, a separate lift amount texture image may be prepared.
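The first packing option can be sketched as follows (images represented as 2D lists of tuples; names illustrative), combining an RGB texture with an 8-bit lift table into a single RGBA image:

```python
def pack_lift_into_alpha(rgb_texture, lift_table):
    """Store the 8-bit lift amount as the fourth (alpha) channel,
    so one RGBA texture carries both the color and the lift table."""
    return [[(r, g, b, a)
             for (r, g, b), a in zip(rgb_row, lift_row)]
            for rgb_row, lift_row in zip(rgb_texture, lift_table)]
```

This trades the transparency channel for the lift table, which is why a separate lift texture is needed when transparency must also be expressed.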
- the lift amount can also be used as an altitude table indicating height in a broad sense. This makes a wide variety of applications possible: not only expressing smooth curved surfaces, but also expressing arbitrary shapes such as terrain and human faces with only a few polygons.
- the reference position in the texture image is corrected at the rasterization stage.
- Texture mapping is performed based on the corrected texture coordinates.
- a known method can be used for the actual processing.
- Direct3D (trademark), the real-time 3D-CG rendering API in DirectX, the multimedia API proposed by Microsoft Corporation, can be used.
- the Direct3D (TM) 1998 Version 6 specification supports a feature called Environment-Mapped Bump-Mapping (EMBM).
- FIG. 6 shows an example of a processing result of a pseudo curved surface by texture coordinate shift according to the apparatus / method according to the embodiment of the present invention.
- FIG. 6 (c) shows the target to be processed.
- the target is a 32-sided prism in the upper half and an octagonal prism in the lower half.
- Fig. 6 (b) shows the result of applying normal texture mapping
- Fig. 6 (a) shows the result of pseudo-surface processing by texture coordinate shift.
- the distortion of the texture due to the curvature is expressed almost correctly in the part near the front. It was also confirmed that this effect holds in stereoscopic viewing.
- these displays can be processed in real time on existing graphics hardware with functions such as EMBM.
- the embodiment of the present invention makes it possible to solve the problem (a new problem discovered by the inventor) that the approximate displays of shading and texture mapping differ in pseudo curved-surface expression in real-time 3D-CG. According to the embodiment of the present invention, real-time processing is entirely feasible.
- "means" does not necessarily denote physical means; it also includes the case where the function of each means is realized by software. Further, the function of one means may be realized by two or more physical means, and the functions of two or more means may be realized by one physical means.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Generation (AREA)
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002527455A JP4714919B2 (ja) | 2000-09-13 | 2001-09-13 | レンダリング装置及び記録媒体並びにプログラム |
AU2001286216A AU2001286216A1 (en) | 2000-09-13 | 2001-09-13 | Texture mapping method and rendering device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2000-278045 | 2000-09-13 | ||
JP2000278045 | 2000-09-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2002023484A1 true WO2002023484A1 (fr) | 2002-03-21 |
Family
ID=18763269
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2001/007936 WO2002023484A1 (fr) | 2000-09-13 | 2001-09-13 | Procede de mappage de texture et dispositif de rendu d'image |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP4714919B2 (ja) |
AU (1) | AU2001286216A1 (ja) |
WO (1) | WO2002023484A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006252424A (ja) * | 2005-03-14 | 2006-09-21 | Namco Bandai Games Inc | プログラム、情報記憶媒体及び画像生成システム |
JP2007087031A (ja) * | 2005-09-21 | 2007-04-05 | Namco Bandai Games Inc | プログラム、情報記憶媒体及び画像生成システム |
WO2019049289A1 (ja) * | 2017-09-07 | 2019-03-14 | 株式会社ソニー・インタラクティブエンタテインメント | 画像生成装置および画像生成方法 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11203498A (ja) * | 1998-01-16 | 1999-07-30 | Nec Corp | 法線ベクトルを用いたテクスチャ座標生成方法および装置 |
US6078334A (en) * | 1997-04-23 | 2000-06-20 | Sharp Kabushiki Kaisha | 3-D texture mapping processor and 3-D image rendering system using the same |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0877991B1 (en) * | 1996-11-21 | 2003-01-08 | Koninklijke Philips Electronics N.V. | Method and apparatus for generating a computer graphics image |
US6980218B1 (en) * | 2000-08-23 | 2005-12-27 | Nintendo Co., Ltd. | Method and apparatus for efficient generation of texture coordinate displacements for implementing emboss-style bump mapping in a graphics rendering system |
-
2001
- 2001-09-13 WO PCT/JP2001/007936 patent/WO2002023484A1/ja active Application Filing
- 2001-09-13 JP JP2002527455A patent/JP4714919B2/ja not_active Expired - Fee Related
- 2001-09-13 AU AU2001286216A patent/AU2001286216A1/en not_active Abandoned
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6078334A (en) * | 1997-04-23 | 2000-06-20 | Sharp Kabushiki Kaisha | 3-D texture mapping processor and 3-D image rendering system using the same |
JPH11203498A (ja) * | 1998-01-16 | 1999-07-30 | Nec Corp | 法線ベクトルを用いたテクスチャ座標生成方法および装置 |
Non-Patent Citations (1)
Title |
---|
TOSHIYUKI TAKAHEI ET AL.: "Ryougan rittaishi no tame no real time, texture mapping shuhou", NIPPON VIRTUAL REALITY GAKKAI DAI 5KAI TAIKAI RONBUNSHUU, 18 September 2000 (2000-09-18), pages 189 - 192, XP002906360 * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006252424A (ja) * | 2005-03-14 | 2006-09-21 | Namco Bandai Games Inc | プログラム、情報記憶媒体及び画像生成システム |
JP4641831B2 (ja) * | 2005-03-14 | 2011-03-02 | 株式会社バンダイナムコゲームス | プログラム、情報記憶媒体及び画像生成システム |
JP2007087031A (ja) * | 2005-09-21 | 2007-04-05 | Namco Bandai Games Inc | プログラム、情報記憶媒体及び画像生成システム |
JP4662260B2 (ja) * | 2005-09-21 | 2011-03-30 | 株式会社バンダイナムコゲームス | プログラム、情報記憶媒体及び画像生成システム |
WO2019049289A1 (ja) * | 2017-09-07 | 2019-03-14 | 株式会社ソニー・インタラクティブエンタテインメント | 画像生成装置および画像生成方法 |
US11120614B2 (en) | 2017-09-07 | 2021-09-14 | Sony Interactive Entertainment Inc. | Image generation apparatus and image generation method |
Also Published As
Publication number | Publication date |
---|---|
AU2001286216A1 (en) | 2002-03-26 |
JPWO2002023484A1 (ja) | 2004-04-08 |
JP4714919B2 (ja) | 2011-07-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Welsh | Parallax mapping with offset limiting: A per-pixel approximation of uneven surfaces | |
CN107392988B (zh) | 利用投影几何失真用于以可变采样率渲染的系统、方法和计算机程序产品 | |
US7348989B2 (en) | Preparing digital images for display utilizing view-dependent texturing | |
JP3294149B2 (ja) | 立体テクスチャマッピング処理装置及びそれを用いた3次元画像生成装置 | |
US6593923B1 (en) | System, method and article of manufacture for shadow mapping | |
US8059119B2 (en) | Method for detecting border tiles or border pixels of a primitive for tile-based rendering | |
CN108230435B (zh) | 采用立方图纹理的图形处理 | |
US6731298B1 (en) | System, method and article of manufacture for z-texture mapping | |
KR20050030595A (ko) | 화상 처리 장치 및 그 방법 | |
US7508390B1 (en) | Method and system for implementing real time soft shadows using penumbra maps and occluder maps | |
US20170200302A1 (en) | Method and system for high-performance real-time adjustment of one or more elements in a playing video, interactive 360° content or image | |
US20020030681A1 (en) | Method for efficiently calculating texture coordinate gradient vectors | |
US7158133B2 (en) | System and method for shadow rendering | |
US20140327689A1 (en) | Technique for real-time rendering of temporally interpolated two-dimensional contour lines on a graphics processing unit | |
US8072464B2 (en) | 3-dimensional graphics processing method, medium and apparatus performing perspective correction | |
US5793372A (en) | Methods and apparatus for rapidly rendering photo-realistic surfaces on 3-dimensional wire frames automatically using user defined points | |
Batagelo et al. | Real-time shadow generation using bsp trees and stencil buffers | |
US6924805B2 (en) | System and method for image-based rendering with proxy surface animation | |
GB2302001A (en) | Computer graphics system having per pixel depth cueing | |
KR100559127B1 (ko) | 화상처리장치 | |
JP4714919B2 (ja) | レンダリング装置及び記録媒体並びにプログラム | |
US11989807B2 (en) | Rendering scalable raster content | |
KR100848687B1 (ko) | 3차원 그래픽 처리 장치 및 그것의 동작 방법 | |
US6188409B1 (en) | 3D graphics device | |
KR20020063941A (ko) | 컴퓨터를 이용한 그림자를 포함한 실시간 툰 렌더링 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2002527455 Country of ref document: JP |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
122 | Ep: pct application non-entry in european phase |