EP1766584A2 - Inverse texture mapping 3d graphics system - Google Patents

Inverse texture mapping 3d graphics system

Info

Publication number
EP1766584A2
Authority
EP
European Patent Office
Prior art keywords
texture
space
positions
screen space
intensities
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP05749079A
Other languages
German (de)
French (fr)
Inventor
Kornelis Meinds
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Entropic Communications LLC
Original Assignee
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics NV filed Critical Koninklijke Philips Electronics NV
Priority to EP05749079A priority Critical patent/EP1766584A2/en
Publication of EP1766584A2 publication Critical patent/EP1766584A2/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/005 General purpose rendering architectures

Definitions

  • the invention relates to an inverse texture mapping 3D graphics processor, a graphics adapter comprising the 3D graphics processor, a computer comprising the 3D graphics processor, a display apparatus comprising the 3D graphics processor, and a method of inverse texture mapping.
  • ITM: inverse texture mapping
  • a first aspect of the invention provides an inverse texture mapping 3D graphics processor as claimed in claim 1.
  • a second aspect of the invention provides a graphics adapter comprising the 3D graphics processor, as claimed in claim 10.
  • a third aspect of the invention provides a computer comprising the 3D graphics processor, as claimed in claim 11.
  • a fourth aspect of the invention provides a display apparatus comprising the 3D graphics processor, as claimed in claim 12.
  • a fifth aspect of the invention provides a method of inverse texture mapping as claimed in claim 13.
  • Advantageous embodiments are defined in the dependent claims.
  • the inverse texture mapping 3D graphics processor in accordance with the first aspect maps a 3D model onto screen space.
  • the graphics processor comprises a texture memory for storing texel intensities of texture space grid positions.
  • a plurality of screen space rasterizers determine pixel grid positions within different screen space polygons at a plurality of corresponding different display instants during a same temporal interval between sample instants of geometric data of the 3D model.
  • the pixel grid positions in screen space are considered to be positioned on a grid; the pixel intensities related to these pixel grid positions will be stored in a frame buffer memory and are used to display the image.
  • These different instants are referred to as display instants because the screen space projection of the 3D model is rendered for displaying at these instants.
  • the ratio between the number of display instants and sample instants is the frame rate up-conversion factor.
  • the screen space polygons associated with the same polygon of the 3D model have different positions in the screen space dependent on motion information of the 3D model with respect to the camera position (also called eye position).
  • a plurality of corresponding mappers maps the pixel grid positions of the screen space polygons at the different instants to texture space positions that usually do not coincide with the texel grid positions on which the texel intensities are stored.
  • a texture space resampler determines texel intensities at the mapped texture space positions from the texel grid intensities of the texture space grid positions stored in the texture memory or in the texture cache.
  • a texture cache temporarily stores, for every texture space polygon, the texel grid intensities required by the texture space resampler during the temporal interval for all the screen space polygons being associated with a same texture space polygon. If the texture within a polygon does not fit in the texture cache, the polygon can be partitioned into smaller parts (e.g. blocks).
  • a plurality of corresponding pixel shaders determine, at said different display instants, pixel intensities, optionally from the texel intensities received from the texture space resampler and optionally on the basis of a formula such as the well-known Gouraud shading.
  • the same texture samples stored in the texture cache can be used for all the associated screen space polygons during a same temporal interval wherein the plurality of rendering instants occurs.
  • the texture needs to be fetched once per temporal interval from the texture memory and not for every display instant. Consequently, the data rate between the texture cache and the texture memory does not depend on the frame rate up-conversion factor. This is an important improvement because the data rate to the texture memory is limited by the speed of the texture memory and by the data rate on the bus to the texture memory. This is especially relevant if the texture memory is present on a separate chip.
  • the motion information comprises motion data that can be used to determine the path of motion of the polygon in the screen space within the temporal interval.
  • the vertices of the polygon(s) and the mapping of pixels to texture space can be determined from this path of motion. It has to be noted that the vertices of a polygon can have different paths of motion. Thus, the path of motion of the polygon is determined by the path of motion of each of the vertices of the polygon.
  • the motion information is a displacement vector that indicates a displacement of the vertices of the polygon in the screen space between two sample instants. The displacement at a particular one of the rendering instants can be determined by (linearly) interpolating the displacement defined by the displacement vector.
  • motion data is used in the form of two model/view matrices, one for the current sample instant and one for the previous sample instant.
  • a motion vector in screen space can be determined.
  • the parameters of the mapping functions of the mappers for different rendering instants between two successive sample instants can be determined from this information.
  • This is a robust and efficient method to obtain displacement vectors in the eye or world space.
  • the vertices of the previous frame (or more generally: of the temporal interval) can be subtracted from the vertices of the current frame.
  • the 3D system calculates the coordinates of the eye space vertices for both the current frame instant (or more generally: the current sample instant) and the previous frame instant.
  • the 3D application needs to send, in addition to the normal model-view matrix, an additional model-view matrix for the previous frame instant.
  • the application may buffer the model-view matrices to efficiently resend them.
  • the geometry transformation unit of the 3D system applies both model-view matrices to transform each vertex to a "current" and a "previous" position in eye space.
  • the motion information is provided by the 3D application.
  • the ITM 3D graphics processor may determine the motion information by relating the vertices of the geometry of the current sampling instant to those of the previous sampling instant.
  • the ITM processor comprises a plurality of frame buffers for storing the intensities that are determined at the screen grid positions.
  • Each frame buffer stores a rendered image for a particular one of the display instants.
  • the frame rate up-conversion is obtained.
  • no texture maps are stored in the texture cache (TC) and the pixel shaders (PSj) are arranged to perform pixel shading on the basis of non-texture data.
  • the ITM processor controls the mappers to perform an identical mapping to the frame buffers for non-moving objects. In fact only one of the mappers needs to perform the mapping and the output obtained this way is copied to all the frame buffers.
  • Fig. 1 shows a display of a 3D object on a display screen
  • Fig. 2 shows a block diagram of a prior art inverse texture mapping 3D graphics system
  • Figs. 3A and 3B illustrate the operation of the inverse texture mapping system
  • Fig. 4 shows a block diagram of the inverse texture mapping 3D graphics system in accordance with an embodiment of the invention
  • Figs. 5A and 5B illustrate the operation of the embodiment of the inverse texture mapping system shown in Fig. 4
  • Fig. 6 shows a computer comprising the inverse texture mapping system
  • Fig. 7 shows a display apparatus comprising the inverse texture mapping system.
  • Fig. 1 elucidates the display of a 3D object WO in world space on a display screen DS.
  • the object may also be available in other 3D spaces such as model or eye space; in the following, all these spaces are referred to as world space.
  • An object WO, which may be a three-dimensional object such as the cube shown, is projected on the two-dimensional display screen DS.
  • a surface structure or texture defines the appearance of the three-dimensional object WO.
  • the polygon A has a texture TA and the polygon B has a texture TB.
  • the polygons A and B are, with a more general term, also referred to as graphics primitives.
  • the projection onto the display screen DS of the object WO is obtained by defining an eye or camera position ECP within the world space.
  • Fig. 1 shows how the polygon SGP projected on the screen DS is obtained from the corresponding polygon A.
  • the polygon SGP in the screen space SSP is defined by its vertex coordinates in the screen space SSP. It is only the projection of the geometry of the polygon A which is used to determine the geometry of the polygon SGP. Usually, it suffices to know the vertices of the polygon A and the projection to determine the vertices of the polygon SGP.
  • the texture TA of the polygon A is not directly projected from the real world onto the screen space SSP.
  • the different textures of the real world object WO are stored in a texture map memory TM (see Fig. 2) or texture space TSP defined by the coordinates u and v.
  • Fig. 1 shows that the polygon A has a texture TA which is available in the texture space TSP in the area indicated by TA, while the polygon B has another texture TB which is available in the texture space TSP in the area indicated by TB.
  • the polygon A is projected on the texture space TA to obtain a polygon TGP such that when the texture present within the polygon TGP is projected on the polygon A the texture of the real world object WO is obtained or at least resembled as much as possible.
  • a perspective transformation PPT between the texture space TSP and the screen space SSP projects the texture of the polygon TGP on the corresponding polygon SGP.
  • This process is also referred to as texture mapping.
  • the textures are not all present in a global texture space, but every texture defines its own texture space TSP.
  • the textures in the texture space TSP are stored in a texture memory TM for a discrete number of positions in the texture space TSP.
  • these discrete positions are the grid positions in the texture space TSP determined by integer values of u and v. These discrete grid positions are further referred to as grid texture positions or grid texture coordinates.
  • Positions in the texture space which are not limited to the grid positions are referred to as positions in the texture space TSP or as positions in the u,v space TSP.
  • the positions in the u,v space may be represented by floating point numbers.
  • the image to be displayed is stored in a frame buffer memory.
  • these discrete positions are the grid positions in the screen space SSP determined by integer values of x and y. These discrete grid positions are referred to as grid screen positions or grid screen coordinates.
  • Positions in the x,y space which are not limited to the grid positions are referred to as positions in the x,y space or as positions in the screen space SSP.
  • graphics primitive indicates a polygon (such as polygon A) in the world space, or the polygon SGP in the screen space SSP, or the polygon TGP in the texture space TSP. It is clear from the context which graphics primitive is meant.
  • Fig. 2 shows a block diagram of a prior art inverse texture mapping 3D graphics system.
  • the vertex transformation and lighting unit VER transforms the vertex coordinates of the polygon A; B in world space to the screen space SSP to obtain screen space coordinates xv1,yv1 to xv3,yv3 of the vertices of the screen space polygon SGP.
  • the vertex T&L unit further performs light calculations to determine an intensity (also referred to as color) per vertex. If a texture TA, TB is to be applied to a screen space polygon SGP, the vertex T&L unit receives texture space coordinates uv1,vv1 to uv3,vv3 from the application.
  • the vertex T&L unit supplies both the screen space coordinates xv,yv (xv1,yv1; xv2,yv2; xv3,yv3 in Fig. 3A) and the texture space coordinates uv,vv (uv1,vv1; uv2,vv2; uv3,vv3 in Fig. 3B) of the vertices of the screen space polygons SGP and the texture space polygons TGP, respectively, such that the position thereof in the screen space SSP and the texture space TSP, respectively, is known.
  • the screen space rasterizer SRAS determines the grid positions xg,yg of the pixels which are positioned within the screen space polygon SGP which is determined by the screen space coordinates xv,yv of its vertices. In the example shown in Fig. 3A, these screen space grid positions xg,yg within the screen space polygon SGP are indicated by crosses.
  • the rasterizer SRAS may include a so-called rasterizer setup which initializes temporal variables required by the rasterizer SRAS for efficient processing based on interpolation of the vertex attributes.
  • the mapper MAP maps the screen space grid positions xg,yg to corresponding texture space positions u,v in the texture space TSP, see Figs. 3A and 3B. Generally, these texel positions u,v will not coincide with texture space grid positions ug,vg.
  • the pixel shader PS determines the intensity PSI(xg,yg) (also referred to as color) of a pixel with the screen space coordinates xg,yg and thus the texture space coordinates u,v.
  • the pixel shader PS receives a set of attributes ATR per pixel, the grid screen coordinates xg,yg of the pixel and the corresponding texture coordinates u,v.
  • the texture coordinates u,v are used to address texture data TI(ug,vg) on grid texture positions ug,vg stored in the texture memory TM via the texture space resampler TSR.
  • the pixel shader PS may modify the texture coordinate data u,v and may apply and combine several texture maps on the same pixel. It may also perform shading without the use of texture data, on the basis of a formula such as the well-known Gouraud and Phong shading techniques.
  • the texture space resampler TSR determines the intensity PI(u,v) associated with the intensity PSI(xg,yg) of the pixel at the screen space grid position (xg,yg) mapped to the texture space coordinate (u,v) in-between texel grid positions (ug,vg).
  • the texture data corresponding to the texture space grid position ug,vg is indicated by TI(ug,vg).
  • the texel intensities TI(ug,vg) for texture space grid positions ug,vg are stored in the texture memory TM.
  • the texture space resampler TSR determines the intensity PI(u,v) by filtering and accumulating the texel intensities TI(ug,vg) of texels with texture space grid coordinates ug,vg which have to contribute to the intensity PI(u,v).
  • the texture space resampler TSR determines the intensity PI(u,v) at the texture space position u,v by filtering the texel intensities on texture space grid positions ug,vg surrounding the texture space position u,v. For example, a bilinear interpolation using the four texture space grid positions ug,vg (indicated in Fig. 3B with 1 to 4) surrounding the texture space position u,v may be used.
  • the hidden surface removal unit HSR usually includes a Z-buffer which enables determination of the visible colors on a per pixel basis.
  • the depth value z of a produced pixel value PSI(xg,yg) is tested against the depth value of the one stored in the Z-buffer at the same pixel screen coordinate xg,yg (thus on the screen grid).
  • the pixel intensity or color PIP(xg,yg) is written into the frame buffer FB and the Z-buffer is updated.
  • the image to be displayed IM is read from the frame buffer FB.
  • a texture cache is present between the texture space resampler TSR and the texture memory TM.
  • the application provides the polygons in groups to minimize texture state switches. Each one of the groups of polygons is related to a same one of the textures.
  • the texture used for a particular group of polygons is stored wholly or partially in the texture cache and the texture data can be fetched from the texture cache by subsequent polygons from the same group.
  • Figs. 3A and 3B illustrate the operation of the inverse texture mapping system.
  • Fig. 3A shows the screen space polygon SGP in the screen space SSP.
  • the vertices of the polygon SGP are indicated by the screen space positions xv1,yv1; xv2,yv2; xv3,yv3 which usually do not coincide with the screen space grid positions xg,yg.
  • the screen space grid positions xg,yg are the positions which have integer values for x and y.
  • the image to be displayed is determined by the intensities (color and brightness) PIP(xg,yg) of the pixels which are positioned on the screen space grid positions xg,yg.
  • the rasterizer SRAS determines the screen space grid positions xg,yg within the polygon SGP.
  • Fig. 3B shows the texture space polygon TGP in the texture space TSP.
  • the vertices of the texture space polygon TGP are indicated by the texture space positions uv1,vv1; uv2,vv2; uv3,vv3 which usually do not coincide with the texture space grid positions ug,vg.
  • the texture space grid positions ug,vg are the positions which have integer values for u and v.
  • the intensities of the texels TI(ug,vg) are stored in the texture memory TM for these texture space grid positions ug,vg. Several texture maps of the same texture may be stored in different resolutions. A known technique which uses these different resolution textures is called MIP-mapping.
  • the texture space grid positions ug,vg within the polygon TGP are indicated by dots in Fig. 3B.
  • the mapper MAP maps the screen space grid coordinates xg,yg to corresponding texture space positions u,v in the texture space.
  • the intensity at a texture space position u,v is determined by filtering.
  • the intensity at the texture space position u,v which is, or contributes to, the intensity of the pixel at the screen space grid position xg,yg is determined as a weighted sum of intensities at surrounding texture space grid positions ug,vg.
  • a weighted sum of the texel intensities TI(ug,vg) at the texture space grid positions ug,vg indicated by 1, 2, 3 and 4 is determined.
  • Fig. 4 shows a block diagram of the inverse texture mapping 3D graphics system in accordance with an embodiment of the invention.
  • the basic structure of the ITM shown in Fig. 4 is identical to the known ITM shown in Fig. 2. The difference is that a plurality of pipelines formed by the transform and lighting module VERj, the rasterizer SRASj, the mapper MAPj, the pixel shader PSj, the hidden surface removal unit HSRj, and the frame buffer FBj is present instead of the single pipeline of Fig. 2.
  • the items may be hardware which is present multiple times, or the same hardware may be used in a time multiplexing mode, or a combination of these two possibilities may be implemented.
  • the texture cache TC in accordance with the prior art ITM system as discussed with respect to Fig. 2 also stores a particular texture, but this texture is used for the different polygons of the groups of polygons which require the same texture and not for the same polygon at different display instants.
  • the signals (data) shown in Fig. 4 are the same as in Fig. 2; the only difference is that the index j is added to indicate that the signals depend on the rendering instant tj.
  • Figs. 5A and 5B illustrate the operation of the embodiment of the inverse texture mapping system shown in Fig. 4.
  • Fig. 5A shows the screen space polygon SGP1 at the render instant t1, and the screen space polygon SGPn at the render instant tn.
  • Fig. 5B is identical to Fig. 3B.
  • Fig. 5A shows the screen space polygons SGP1 and SGPn; both are mapped from the same source polygon from the world space WO, only a different mapping has been used according to the associated render instant t1, tn along the motion path.
  • the display instants t1 to tn are collectively also referred to as tj
  • the screen space polygons SGP1 to SGPn are also collectively referred to as screen space polygons SGPj.
  • the position of the screen space polygons SGPj depends either on the motion data provided by the application or on motion data determined from the positions of the screen space polygons SGPj at two successive sample instants ts of the geometric data.
  • the 3D application may be a 3D game, a VRML browser, a 3D user interface, a MPEG 4 visual renderer, visiophony or any other 3D application.
  • the screen space polygon SGPn is a translated version of the screen space polygon SGP1; movements other than along a straight line are also possible.
  • the vertices of the screen space polygon SGP1 are indicated by the screen space positions xv11,yv11; xv12,yv12; xv13,yv13 which usually do not coincide with the screen space grid positions xg,yg.
  • the vertices of the screen space polygon SGPn are indicated by the screen space positions xvn1,yvn1; xvn2,yvn2; xvn3,yvn3.
  • the vertices uv1,vv1 to uv3,vv3 of the texture space polygon TGP are provided by the 3D application and stay the same in time (or at least during a certain period in time). It is the same texture which has to be applied to the moving projection (defined by the different screen space polygons SGPj) in the screen space SSP.
  • the moving 3D model together with the perspective mapping to the screen space SSP determines the vertices of the screen space polygons SGPj.
  • the index j is used to refer to the items related to the plurality of the n screen space polygons SGP1 to SGPn.
  • the temporal interval Tf is the period of time between two successive sample instants of the geometric data supplied by the 3D application.
  • the frame period is equal to the sampling period of the input signal (sampling of the geometry delivered by the 3D application).
  • the frame period of the output signal is determined by the number of display instants tj occurring within the temporal interval Tf or the sample period of the input signal.
  • the geometric data comprises the vertices of the texture space polygon TGP, data defining the perspective mapping from the 3D space to the screen space SSP, and the motion data.
  • the motion data is provided which indicates the motion path of the vertices of the screen space polygons SGPj within the temporal interval Tf.
  • the motion data can be used to obtain the motion path which may be described with a displacement vector which indicates the displacement of vertices of the polygon from the previous sampling instant to the current sampling instant.
  • the displacement vectors of the vertices of a polygon may differ in direction and size. For example, a triangle polygon may rotate around one of its vertices (so the displacement size of that vertex is zero) and then the displacement vectors of the two other vertices (if they do not coincide) differ in direction (and in size if the distances of both vertices to the first differ).
  • the motion data may be a more advanced description of the motion path, such as, for example, curves described with conics, composite curves, Bezier curves, B-spline curves, or rational polynomials.
  • motion data determined from the positions of the screen space polygons SGPj at more than two successive sample instants ts of the geometric data may be used.
  • the application should supply the motion data together with the geometry data.
  • the vertex transformation and lighting unit VER is split into a plurality of units indicated by VERj.
  • Each unit VERj transforms the world space vertex coordinates to screen space coordinates of the vertices of the polygons SGPj and calculates a vertex color depending on lighting state and the vertex position at display instant tj.
  • the rasterizer SRASj determines the screen space grid positions xgj,ygj within the polygon SGPj.
  • the rasterizer SRAS1 determines the screen space grid positions xg1,yg1 within the polygon SGP1.
  • These screen space grid positions xgj,ygj inside the screen space polygons SGPj are indicated with a cross and are also referred to as the pixel positions.
  • the mappers MAPj map the screen space grid positions xgj,ygj within the screen space polygons SGPj to the texture space coordinates uj,vj which generally do not coincide with the texture space grid coordinates ug,vg for which the intensities TI(ug,vg) are stored in the texture memory TM.
  • the texture space grid positions u g ,v g are the positions which have integer values for u and v.
  • the mappers MAPj always map the screen space grid positions xgj,ygj inside the different screen space polygons SGPj to texture space coordinates uj,vj inside the same texture space polygon TGP.
  • the pixel shaders PSj each receive a set of attributes ATR per pixel, the grid screen coordinates xgj,ygj of the pixel and the corresponding texture coordinates uj,vj.
  • the texture coordinates uj,vj are used to address texture data TI(ug,vg) on grid texture positions ug,vg stored in the texture memory TM via the texture space resampler TSR and the texture cache TC.
  • the pixel shaders PSj may modify the texture coordinate data uj,vj and may apply and combine several texture maps on the same pixel. They may also perform shading without the use of texture data, on the basis of a formula such as the well-known Gouraud and Phong shading techniques.
  • the texture space resampler TSR determines the intensity PI(uj,vj) associated with the intensity PSI(xgj,ygj) of the pixel at the screen space grid position (xgj,ygj) mapped to the texture space coordinate (uj,vj) in-between texel grid positions (ug,vg).
  • the texel intensities TI(ug,vg) for the texture space grid positions ug,vg are stored in the texture memory TM.
  • the texture space resampler TSR determines each one of the intensities PI(uj,vj) by filtering and accumulating the texel intensities TI(ug,vg) of texels which have texture space grid coordinates ug,vg and which have to contribute to the intensity PI(uj,vj).
  • the texture space resampler TSR determines the intensity PI(uj,vj) at the texture space position uj,vj by filtering the texel intensities on texture space grid positions ug,vg surrounding the texture space position uj,vj.
  • a bilinear interpolation using the four texture space grid positions ug,vg (indicated in Fig. 5B with 1 to 4) surrounding the texture space position uj,vj may be used.
  • the resulting intensity PI(uj,vj) at the position uj,vj is used by the pixel shader PSj to determine the pixel intensity PSI(xgj,ygj) on the pixel grid position xgj,ygj.
  • the texture cache TC temporarily stores the texel intensities TI(ug,vg) required for the determination of all the intensities PI(uj,vj) at the texture space coordinates uj,vj mapped by the mappers MAPj.
  • the pixel shaders PSj determine the contributions of the intensities PI(uj,vj) to the pixel intensities PSI(xgj,ygj). Thus, if all these contributions to the pixel intensities PSI(xgj,ygj) are determined for the same texture space polygon TGP at all the display instants tj, successively for each screen space polygon SGP, the data traffic between the texture cache TC and the texture memory TM is not increased compared to rendering of only a single screen space polygon SGP, provided all the texel intensities TI(ug,vg) of the current polygon fit in the texture cache TC.
  • the polygon SGP can be subdivided into smaller parts (e.g. into blocks or other polygons) such that the texels of such a part fully fit into the texture cache TC; a minimal sketch of this block partitioning is given after this list. Still, in every temporal interval Tf, for every texture space polygon TGP, only one fetch of the relevant data from the texture memory TM is required, independent of the number of rendering instants tj. The data bandwidth between the texture cache TC and the texture space resampler TSR increases with a factor equal to the number of rendering instants tj and thus with the frame rate up-conversion factor.
  • the hidden surface removal units HSRj usually include a Z-buffer which enables determination of the visible colors on a per pixel basis.
  • the Z-buffer has the size of a frame or a tile. In case of tile-based rendering, the tile size is relatively small and can even be made smaller for optimal use of the cache in the present frame rate up-conversion technique.
  • the depth value z of a produced pixel value PSI(xgj,ygj) is tested against the depth value of the one stored in the Z-buffer belonging to the frame buffer FBj at the same pixel screen coordinate xgj,ygj (thus on the screen grid).
  • the pixel intensity or color PIP(xgj,ygj) is written into the frame buffer FBj and the Z-buffer belonging to FBj is updated.
  • the image to be displayed IM is read from the frame buffer FBj.
  • the intensities of the texels TI(ug,vg) are stored in the texture memory TM for the texture space grid positions ug,vg.
  • the texture space grid positions ug,vg within the texture space polygon TGP are indicated by dots in Fig. 5B.
  • the mappers MAPj map the screen space grid coordinates xgj,ygj to corresponding texture space positions uj,vj in the texture space TSP.
  • the intensity PI(uj,vj) at a texture space position uj,vj is determined by filtering.
  • the intensity PI(uj,vj) at the texture space position uj,vj which is, or contributes to, the intensity PSI(xgj,ygj) of the pixel at the screen space grid position xgj,ygj, is determined as a weighted sum of intensities at surrounding texture space grid positions ug,vg.
  • a weighted sum of the texel intensities TI(ug,vg) at the texture space grid positions ug,vg indicated by 1, 2, 3 and 4 is determined.
  • Fig. 6 shows a computer comprising the inverse texture mapping system.
  • the computer PC comprises a processor 3, a graphics adapter 2 and a memory 4.
  • the processor 3 is suitably programmed to supply input data II to the graphics adapter 2.
  • the processor 3 communicates with the memory 4 via the bus D1.
  • the graphics adapter 2 comprises the ITM system 1.
  • the graphics adapter 2 is a module which is plugged into a suitable slot (for example an AGP slot).
  • the graphics adapter comprises its own memory (for example the texture memory TM and the frame buffer FB).
  • the graphics adapter may use part of the memory 4 of the computer PC; in that case the graphics adapter needs to communicate with the memory 4 via the bus D2 or via the processor 3 and the bus D1.
  • the graphics adapter 2 supplies the output image OI via a standard interface to the display apparatus DA.
  • the display apparatus may be any suitable display, such as, for example, a cathode ray tube, a liquid crystal display, or any other matrix display.
  • the computer PC and the display DA need not be separate units which communicate via a standard interface but may be combined in a single apparatus, such as, for example, a personal digital assistant (PDA or pocket PC) or any other mobile device with a display for displaying images.
  • Fig. 7 shows a display apparatus comprising the inverse texture mapping system.
  • the display apparatus DA comprises the ITM pipeline 1 which receives the input data II (geometry and related data) and supplies the output image OI to a signal processing circuit 11.
  • the signal processing circuit 11 processes the output image OI to obtain a drive signal DS for the display 12.
  • the inverse texture mapping 3D graphics processor maps a 3D model WO onto a screen space SSP.
  • a texture memory TM stores texel intensities TI(ug,vg) of texture space grid positions ug,vg.
  • a plurality of screen space rasterizers SRASj determines pixel grid positions within different screen space polygons SGPj at a plurality of corresponding different display instants tj during a same temporal interval Tf between sample instants ts of geometric data of the 3D model WO.
  • the screen space polygons SGPj have different positions in the screen space SSP dependent on motion information of the 3D model WO with respect to the camera.
  • a plurality of corresponding mappers MAPj map the pixel grid positions of the screen space polygons SGP at the different display instants tj to texture space positions uj,vj.
  • a texture space resampler TSR determines texel intensities PI(uj,vj) at the texture space positions uj,vj from the texel grid intensities TI(ug,vg) of the texture space grid positions ug,vg stored in the texture memory TM.
  • a texture cache TC temporarily stores, for a texture space polygon TGP, the texel intensities TI(ug,vg) required by the texture space resampler TSR during the temporal interval Tf for all the screen space polygons SGP which are associated with a same texture space polygon TGP.
  • a plurality of corresponding pixel shaders PSj determine, at said different display instants tj, pixel intensities PSI(xgj,ygj) from the texel intensities PI(uj,vj).
  • any reference signs placed between parentheses shall not be construed as limiting the claim.
  • Use of the verb "comprise” and its conjugations does not exclude the presence of elements or steps other than those stated in a claim.
  • the article "a” or “an” preceding an element does not exclude the presence of a plurality of such elements.
  • the invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
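
A minimal sketch in C of the block-partitioning strategy mentioned in the list above: the texture-space bounding box of a polygon is walked in fixed-size blocks so that each block of texels fits in the texture cache and is fetched from the texture memory only once per temporal interval. The cache capacity, block size and helper names (fetch_block, render_all_instants) are illustrative assumptions, not taken from the patent.

```c
#include <math.h>

typedef struct { float u, v; } TexCoord;

#define BLOCK 64   /* assumed: 64 x 64 = 4096 texels fit in the cache */

void process_polygon_in_blocks(const TexCoord *verts, int nverts)
{
    /* texture-space bounding box of the polygon TGP */
    float umin = verts[0].u, umax = verts[0].u;
    float vmin = verts[0].v, vmax = verts[0].v;
    for (int i = 1; i < nverts; i++) {
        if (verts[i].u < umin) umin = verts[i].u;
        if (verts[i].u > umax) umax = verts[i].u;
        if (verts[i].v < vmin) vmin = verts[i].v;
        if (verts[i].v > vmax) vmax = verts[i].v;
    }
    /* walk the box in BLOCK x BLOCK tiles; each tile is loaded into the
       texture cache once and then reused for all display instants tj */
    for (int v0 = (int)floorf(vmin); v0 <= (int)ceilf(vmax); v0 += BLOCK)
        for (int u0 = (int)floorf(umin); u0 <= (int)ceilf(umax); u0 += BLOCK) {
            /* fetch_block(u0, v0, BLOCK);   one fetch from TM into TC */
            /* render_all_instants(u0, v0);  all instants tj reuse it  */
        }
}
```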

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

An inverse texture mapping 3D graphics processor maps a 3D model (WO) onto a screen space (SSP). A texture memory (TM) stores texel intensities TI(ug,vg) of texture space grid positions (ug,vg). A plurality of screen space rasterizers (SRASj) determines pixel grid positions (xgj,ygj) within different screen space polygons (SGP) at a plurality of corresponding different display instants (tj) during a same temporal interval (Tf) between sample instants of geometric data of the 3D model (WO). The screen space polygons (SGP) have different positions in the screen space (SSP) dependent on motion information of the 3D model (WO). A plurality of corresponding mappers (MAPj) map the pixel grid positions (xgj,ygj) of the screen space polygons (SGP) at the different display instants (tj) to texture space positions (uj,vj). A texture space resampler (TSR) determines texel intensities (PI(uj,vj)) at the texture space positions (uj,vj) from the texel grid intensities (TI(ug,vg)) of the texture space grid positions (ug,vg) stored in the texture memory (TM). A texture cache (TC) temporarily stores, for every texture space polygon (TGP), the texel intensities TI(ug,vg) required by the texture space resampler (TSR) during the temporal interval (Tf) for all the screen space polygons (SGP) which are associated with a same texture space polygon (TGP). A plurality of corresponding pixel shaders (PSj) determine, at said different display instants (tj), pixel intensities (PSI(xgj,ygj)) from the texel intensities (PI(uj,vj)).

Description

Inverse texture mapping 3D graphics system
Field of the invention The invention relates to an inverse texture mapping 3D graphics processor, a graphics adapter comprising the 3D graphics processor, a computer comprising the 3D graphics processor, a display apparatus comprising the 3D graphics processor, and a method of inverse texture mapping.
Background of the invention The known inverse texture mapping (further referred to as ITM) is elucidated in detail with respect to Figs. 1, 2 and 3. Such an ITM system is able to generate data for display on a display screen at a particular display frame rate depending on the available amount of resources such as memory bandwidth and computational power and the 3D scene complexity. A higher display frame rate requires a higher amount of these resources, and the complete ITM processing has to be adapted to be able to provide this higher display frame rate.
Summary of the invention It is an object of the invention to provide an ITM system that is able to provide frame rate up-conversion without increasing the data bandwidth to the texture memory. A first aspect of the invention provides an inverse texture mapping 3D graphics processor as claimed in claim 1. A second aspect of the invention provides a graphics adapter comprising the 3D graphics processor, as claimed in claim 10. A third aspect of the invention provides a computer comprising the 3D graphics processor, as claimed in claim 11. A fourth aspect of the invention provides a display apparatus comprising the 3D graphics processor, as claimed in claim 12. A fifth aspect of the invention provides a method of inverse texture mapping as claimed in claim 13. Advantageous embodiments are defined in the dependent claims. The inverse texture mapping 3D graphics processor in accordance with the first aspect maps a 3D model onto screen space. The graphics processor comprises a texture memory for storing texel intensities of texture space grid positions. A plurality of screen space rasterizers determine pixel grid positions within different screen space polygons at a plurality of corresponding different display instants during a same temporal interval between sample instants of geometric data of the 3D model. The pixel grid positions in screen space are considered to be positioned on a grid; the pixel intensities related to these pixel grid positions will be stored in a frame buffer memory and are used to display the image. These different instants are referred to as display instants because the screen space projection of the 3D model is rendered for displaying at these instants. The ratio between the number of display instants and sample instants is the frame rate up-conversion factor. At these different display instants, the screen space polygons associated with the same polygon of the 3D model have different positions in the screen space dependent on motion information of the 3D model with respect to the camera position (also called eye position). A plurality of corresponding mappers maps the pixel grid positions of the screen space polygons at the different instants to texture space positions that usually do not coincide with the texel grid positions on which the texel intensities are stored. A texture space resampler determines texel intensities at the mapped texture space positions from the texel grid intensities of the texture space grid positions stored in the texture memory or in the texture cache. A texture cache temporarily stores, for every texture space polygon, the texel grid intensities required by the texture space resampler during the temporal interval for all the screen space polygons being associated with a same texture space polygon. If the texture within a polygon does not fit in the texture cache, the polygon can be partitioned into smaller parts (e.g. blocks). A plurality of corresponding pixel shaders determine, at said different display instants, pixel intensities, optionally from the texel intensities received from the texture space resampler and optionally on the basis of a formula such as the well-known Gouraud shading. Thus, the same texture samples stored in the texture cache can be used for all the associated screen space polygons during a same temporal interval wherein the plurality of rendering instants occurs. Thus, the texture needs to be fetched once per temporal interval from the texture memory and not for every display instant.
Consequently, the data rate between the texture cache and the texture memory does not depend on the frame rate up-conversion factor. This is an important improvement because the data rate to the texture memory is limited by the speed of the texture memory and by the data rate on the bus to the texture memory. This is especially relevant if the texture memory is present on a separate chip. In an embodiment as claimed in claim 2, the motion information comprises motion data that can be used to determine the path of motion of the polygon in the screen space within the temporal interval. The vertices of the polygon(s) and the mapping of pixels to texture space can be determined from this path of motion. It has to be noted that the vertices of a polygon can have different paths of motion. Thus, the path of motion of the polygon is determined by the path of motion of each of the vertices of the polygon. In an embodiment as claimed in claim 3, the motion information is a displacement vector that indicates a displacement of the vertices of the polygon in the screen space between two sample instants. The displacement at a particular one of the rendering instants can be determined by (linearly) interpolating the displacement defined by the displacement vector. In an embodiment as claimed in claim 4, motion data is used in the form of two model/view matrices, one for the current sample instant and one for the previous sample instant. With these two matrices, a motion vector in screen space can be determined. Also, the parameters of the mapping functions of the mappers for different rendering instants between two successive sample instants can be determined from this information. This is a robust and efficient method to obtain displacement vectors in the eye or world space. To determine the eye space displacement vectors, the vertices of the previous frame (or more generally: of the temporal interval) can be subtracted from the vertices of the current frame. Thus, the 3D system calculates the coordinates of the eye space vertices for both the current frame instant (or more generally: the current sample instant) and the previous frame instant. The 3D application needs to send, in addition to the normal model-view matrix, an additional model-view matrix for the previous frame instant. The application may buffer the model-view matrices to efficiently resend them. The geometry transformation unit of the 3D system applies both model-view matrices to transform each vertex to a "current" and a "previous" position in eye space. In an embodiment as claimed in claim 5, the motion information is provided by the 3D application. Alternatively, although more complex, the ITM 3D graphics processor may determine the motion information by relating the vertices of the geometry of the current sampling instant to those of the previous sampling instant. In an embodiment as claimed in claim 6, the ITM processor comprises a plurality of frame buffers for storing the intensities that are determined at the screen grid positions. Each frame buffer stores a rendered image for a particular one of the display instants. Thus, by reading out and displaying all the frame buffers sequentially during a single temporal interval, the frame rate up-conversion is obtained. In an embodiment as claimed in claim 7, for a particular one of the 3D models (WO) no texture maps are stored in the texture cache (TC) and the pixel shaders (PSj) are arranged to perform pixel shading on the basis of non-texture data.
In an embodiment as claimed in claim 8, the ITM processor controls the mappers to perform an identical mapping to the frame buffers for non-moving objects. In fact, only one of the mappers needs to perform the mapping and the output obtained this way is copied to all the frame buffers. These and other aspects of the invention are apparent from and will be elucidated with reference to the embodiments described hereinafter.
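The two-matrix embodiment above admits a short illustration. The following C fragment, a sketch assuming column-major 4x4 model-view matrices and purely linear interpolation, applies both the previous and the current model-view matrix to a vertex and interpolates the two eye space positions at a fraction f of the temporal interval, with f = (tj - ts,prev)/Tf; all type and function names are illustrative, not the patent's interface.

```c
typedef struct { float m[16]; } Mat4;            /* column-major 4x4 */
typedef struct { float x, y, z, w; } Vec4;

/* Apply a 4x4 matrix to a homogeneous vertex. */
static Vec4 transform(const Mat4 *M, Vec4 p)
{
    Vec4 r;
    r.x = M->m[0]*p.x + M->m[4]*p.y + M->m[8]*p.z  + M->m[12]*p.w;
    r.y = M->m[1]*p.x + M->m[5]*p.y + M->m[9]*p.z  + M->m[13]*p.w;
    r.z = M->m[2]*p.x + M->m[6]*p.y + M->m[10]*p.z + M->m[14]*p.w;
    r.w = M->m[3]*p.x + M->m[7]*p.y + M->m[11]*p.z + M->m[15]*p.w;
    return r;
}

/* Eye space vertex position at display instant tj: transform the model
   vertex with both matrices; the difference of the two results is the
   displacement vector, which is interpolated linearly with f in [0,1]. */
Vec4 vertex_at_instant(const Mat4 *mv_prev, const Mat4 *mv_curr,
                       Vec4 model_vertex, float f)
{
    Vec4 prev = transform(mv_prev, model_vertex);
    Vec4 curr = transform(mv_curr, model_vertex);
    Vec4 r;
    r.x = prev.x + f * (curr.x - prev.x);
    r.y = prev.y + f * (curr.y - prev.y);
    r.z = prev.z + f * (curr.z - prev.z);
    r.w = prev.w + f * (curr.w - prev.w);
    return r;
}
```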
Brief description of the drawings In the drawings: Fig. 1 shows a display of a 3D object on a display screen, Fig. 2 shows a block diagram of a prior art inverse texture mapping 3D graphics system, Figs. 3A and 3B illustrate the operation of the inverse texture mapping system, Fig. 4 shows a block diagram of the inverse texture mapping 3D graphics system in accordance with an embodiment of the invention, Figs. 5A and 5B illustrate the operation of the embodiment of the inverse texture mapping system shown in Fig. 4, Fig. 6 shows a computer comprising the inverse texture mapping system, and Fig. 7 shows a display apparatus comprising the inverse texture mapping system.
Detailed description of the preferred embodiment Fig. 1 elucidates the display of a 3D object WO in world space on a display screen DS. Instead of the world space, the object may also be available in other 3D spaces such as model or eye space; in the following, all these spaces are referred to as world space. An object WO, which may be a three-dimensional object such as the cube shown, is projected on the two-dimensional display screen DS. A surface structure or texture defines the appearance of the three-dimensional object WO. In Fig. 1 the polygon A has a texture TA and the polygon B has a texture TB. The polygons A and B are, with a more general term, also referred to as graphics primitives. The projection onto the display screen DS of the object WO is obtained by defining an eye or camera position ECP within the world space. Fig. 1 shows how the polygon SGP projected on the screen DS is obtained from the corresponding polygon A. The polygon SGP in the screen space SSP is defined by its vertex coordinates in the screen space SSP. It is only the projection of the geometry of the polygon A which is used to determine the geometry of the polygon SGP. Usually, it suffices to know the vertices of the polygon A and the projection to determine the vertices of the polygon SGP. The texture TA of the polygon A is not directly projected from the real world onto the screen space SSP. The different textures of the real world object WO are stored in a texture map memory TM (see Fig. 2) or texture space TSP defined by the coordinates u and v. For example, Fig. 1 shows that the polygon A has a texture TA which is available in the texture space TSP in the area indicated by TA, while the polygon B has another texture TB which is available in the texture space TSP in the area indicated by TB. The polygon A is projected on the texture space TA to obtain a polygon TGP such that when the texture present within the polygon TGP is projected on the polygon A the texture of the real world object WO is obtained or at least resembled as much as possible. A perspective transformation PPT between the texture space TSP and the screen space SSP projects the texture of the polygon TGP on the corresponding polygon SGP. This process is also referred to as texture mapping. Usually, the textures are not all present in a global texture space, but every texture defines its own texture space TSP. It has to be noted that the textures in the texture space TSP are stored in a texture memory TM for a discrete number of positions in the texture space TSP. Usually these discrete positions are the grid positions in the texture space TSP determined by integer values of u and v. These discrete grid positions are further referred to as grid texture positions or grid texture coordinates. Positions in the texture space which are not limited to the grid positions are referred to as positions in the texture space TSP or as positions in the u,v space TSP. The positions in the u,v space may be represented by floating point numbers. In a same manner, the image to be displayed is stored in a frame buffer memory. Again, only a discrete number of positions in the x,y space or screen space SSP is available. Usually, these discrete positions are the grid positions in the screen space SSP determined by integer values of x and y. These discrete grid positions are referred to as grid screen positions or grid screen coordinates.
Positions in the x,y space which are not limited to the grid positions are referred to as positions in the x,y space or as positions in the screen space SSP. These positions in the x,y space may be represented by floating point numbers. In the now following, the term graphics primitive indicates a polygon (such as polygon A) in the world space, or the polygon SGP in the screen space SSP, or the polygon TGP in the texture space TSP. It is clear from the context which graphics primitive is meant. Fig. 2 shows a block diagram of a prior art inverse texture mapping 3D graphics system. The vertex transformation and lighting unit VER, further also referred to as the vertex T&L unit, transforms the vertex coordinates of the polygon A; B in world space to the screen space SSP to obtain screen space coordinates xv1,yv1 to xv3,yv3 of the vertices of the screen space polygon SGP. The vertex T&L unit further performs light calculations to determine an intensity (also referred to as color) per vertex. If a texture TA, TB is to be applied to a screen space polygon SGP, the vertex T&L unit receives texture space coordinates uv1,vv1 to uv3,vv3 from the application. The vertex T&L unit supplies both the screen space coordinates xv,yv (xv1,yv1; xv2,yv2; xv3,yv3 in Fig. 3A) and the texture space coordinates uv,vv (uv1,vv1; uv2,vv2; uv3,vv3 in Fig. 3B) of the vertices of the screen space polygons SGP and the texture space polygons TGP, respectively, such that the position thereof in the screen space SSP and the texture space TSP, respectively, is known. Usually, the positions of the vertices will not coincide with the screen space grid positions or texture space grid positions, respectively. The screen space rasterizer SRAS determines the grid positions xg,yg of the pixels which are positioned within the screen space polygon SGP which is determined by the screen space coordinates xv,yv of its vertices. In the example shown in Fig. 3A, these screen space grid positions xg,yg within the screen space polygon SGP are indicated by crosses. The rasterizer SRAS may include a so-called rasterizer setup which initializes temporal variables required by the rasterizer SRAS for efficient processing based on interpolation of the vertex attributes. The mapper MAP maps the screen space grid positions xg,yg to corresponding texture space positions u,v in the texture space TSP, see Figs. 3A and 3B. Generally, these texel positions u,v will not coincide with texture space grid positions ug,vg. The pixel shader PS determines the intensity PSI(xg,yg) (also referred to as color) of a pixel with the screen space coordinates xg,yg and thus the texture space coordinates u,v. The pixel shader PS receives a set of attributes ATR per pixel, the grid screen coordinates xg,yg of the pixel and the corresponding texture coordinates u,v. The texture coordinates u,v are used to address texture data TI(ug,vg) on grid texture positions ug,vg stored in the texture memory TM via the texture space resampler TSR. The pixel shader PS may modify the texture coordinate data u,v and may apply and combine several texture maps on the same pixel. It may also perform shading without the use of texture data, on the basis of a formula such as the well-known Gouraud and Phong shading techniques. The texture space resampler TSR determines the intensity PI(u,v) associated with the intensity PSI(xg,yg) of the pixel at the screen space grid position (xg,yg) mapped to the texture space coordinate (u,v) in-between texel grid positions (ug,vg).
The texture data corresponding to the texture space grid position ug,vg is indicated by TI(ug,vg). The texel intensities TI(ug,vg) for texture space grid positions ug,vg are stored in the texture memory TM. The texture space resampler TSR determines the intensity PI(u,v) by filtering and accumulating the texel intensities TI(ug,vg) of texels with texture space grid coordinates ug,vg which have to contribute to the intensity PI(u,v). Thus, the texture space resampler TSR determines the intensity PI(u,v) at the texture space position u,v by filtering the texel intensities on texture space grid positions ug,vg surrounding the texture space position u,v. For example, a bilinear interpolation using the four texture space grid positions ug,vg (indicated in Fig. 3B with 1 to 4) surrounding the texture space position u,v may be used. The resulting intensity PI(u,v) at the position u,v is used by the pixel shader PS to determine the pixel intensity PSI(xg,yg) on the pixel grid position xg,yg. The hidden surface removal unit HSR usually includes a Z-buffer which enables determination of the visible colors on a per pixel basis. The depth value z of a produced pixel value PSI(xg,yg) is tested against the depth value of the one stored in the Z-buffer at the same pixel screen coordinate xg,yg (thus on the screen grid). Depending on the outcome of the test, the pixel intensity or color PIP(xg,yg) is written into the frame buffer FB and the Z-buffer is updated. The image to be displayed IM is read from the frame buffer FB. It has to be noted that usually a texture cache is present between the texture space resampler TSR and the texture memory TM. Usually, the application provides the polygons in groups to minimize texture state switches. Each one of the groups of polygons is related to a same one of the textures. The texture used for a particular group of polygons is stored wholly or partially in the texture cache and the texture data can be fetched from the texture cache by subsequent polygons from the same group. With the start of a next group of polygons, another texture is wholly or partially loaded into the texture cache. During the processing of the polygons of a group, the texture memory TM fetches are minimized because all or almost all texture information is present in the texture cache. Figs. 3A and 3B illustrate the operation of the inverse texture mapping system. Fig. 3A shows the screen space polygon SGP in the screen space SSP. The vertices of the polygon SGP are indicated by the screen space positions xv1,yv1; xv2,yv2; xv3,yv3 which usually do not coincide with the screen space grid positions xg,yg. The screen space grid positions xg,yg are the positions which have integer values for x and y. The image to be displayed is determined by the intensities (color and brightness) PIP(xg,yg) of the pixels which are positioned on the screen space grid positions xg,yg. The rasterizer SRAS determines the screen space grid positions xg,yg within the polygon SGP. These screen space grid positions xg,yg are indicated with a cross and are also referred to as the pixel positions. Fig. 3B shows the texture space polygon TGP in the texture space TSP. The vertices of the texture space polygon TGP are indicated by the texture space positions uv1,vv1; uv2,vv2; uv3,vv3 which usually do not coincide with the texture space grid positions ug,vg. The texture space grid positions ug,vg are the positions which have integer values for u and v.
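For a single pixel, the mapper MAP and the texture space resampler TSR described above can be sketched as follows. The fragment assumes the screen-to-texture mapping is a projective (rational-linear) transform with coefficients a..i, and that a texel(ug,vg) accessor returning TI(ug,vg) exists; both are assumptions for illustration, as the patent text does not prescribe this interface.

```c
#include <math.h>

typedef struct { float a, b, c, d, e, f, g, h, i; } ProjMap;

float texel(int ug, int vg);   /* assumed accessor: returns TI(ug,vg) */

float resample_pixel(const ProjMap *M, int xg, int yg)
{
    /* mapper MAP: projective mapping of the screen grid position to a
       texture space position (u,v) in-between texel grid positions */
    float w = M->g * xg + M->h * yg + M->i;
    float u = (M->a * xg + M->b * yg + M->c) / w;
    float v = (M->d * xg + M->e * yg + M->f) / w;

    /* resampler TSR: bilinear interpolation of the four surrounding
       texture space grid positions (the positions 1 to 4 of Fig. 3B) */
    int   u0 = (int)floorf(u), v0 = (int)floorf(v);
    float fu = u - (float)u0, fv = v - (float)v0;
    float top = (1.0f - fu) * texel(u0, v0)     + fu * texel(u0 + 1, v0);
    float bot = (1.0f - fu) * texel(u0, v0 + 1) + fu * texel(u0 + 1, v0 + 1);
    return (1.0f - fv) * top + fv * bot;   /* PI(u,v) for PSI(xg,yg) */
}
```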
The intensities of the texels TI(ug,vg) are stored in the texture memory TM for the texture space grid positions ug,vg. Several texture maps of the same texture may be stored in different resolutions. A known technique which uses these different resolution textures is called MIP-mapping. The texture space grid positions ug,vg within the polygon TGP are indicated by dots in Fig. 3B. The mapper MAP maps the screen space grid coordinates xg,yg to corresponding texture space positions u,v in the texture space. The intensity at a texture space position u,v is determined by filtering. For example, the intensity at the texture space position u,v which is, or contributes to, the intensity of the pixel at the screen space grid position xg,yg, is determined as a weighted sum of intensities at surrounding texture space grid positions ug,vg. For example, a weighted sum of the texel intensities TI(ug,vg) at the texture space grid positions ug,vg indicated by 1, 2, 3 and 4 is determined.

Fig. 4 shows a block diagram of the inverse texture mapping 3D graphics system in accordance with an embodiment of the invention. The basic structure of the ITM shown in Fig. 4 is identical to the known ITM shown in Fig. 2. The difference is that a plurality of pipelines formed by the transform and lighting module VERj, the rasterizer
SRASj, the mapper MAPj, the pixel shader PSj, the hidden surface removal unit HSRj, and the frame buffer FBj is present instead of the single pipeline of Fig. 2. The index j indicates that the item known from Fig. 2 is the jth item with 1 ≤ j ≤ n. Thus, if n = 4, all the items with the index j are present four times and each one is indicated by one of the indices j running from 1 to 4. All the items of Fig. 4 operate in the same manner as the corresponding items of Fig. 2. In a practical embodiment, the items may be hardware which is present multiple times, or the same hardware may be used in a time-multiplexing mode, or a combination of these two possibilities may be implemented. What counts is that the processes of Fig. 2 now occur n times, at n different rendering instants tj. Thus, in accordance with the invention, the same texture space polygon TGP with the same texture is used at different display instants. Thus, if this texture is stored in the texture cache TC, the texture has to be retrieved from the texture memory TM only once for all the different display instants. It has to be noted that the use of parallel pipelines as such is known. However, these known parallel pipelines are used to obtain a higher performance
(more polygons and pixels per second are processed). These known systems do not store the same texture space polygon TGP with the same texture in the texture cache TC for use at different display instants to reduce the data traffic of the texture memory TM. The texture cache TC in the prior art ITM system as discussed with respect to Fig. 2 also stores a particular texture, but this texture is used for the different polygons of the groups of polygons which require the same texture, and not for the same polygon at different display instants. Also, the signals (data) shown in Fig. 4 are the same as in Fig. 2; the only difference is that the index j is added to indicate that the signals depend on the rendering instant tj. The operation of the ITM system shown in Fig. 4 will be elucidated with respect to Figs. 5.

Figs. 5A and 5B illustrate the operation of the embodiment of the inverse texture mapping system shown in Fig. 4. Fig. 5A shows the screen space polygon SGP1 at the render instant t1, and the screen space polygon SGPn at the render instant tn. For the sake of clarity, other polygons at other render instants tj, if present (if n > 2, 1 < j < n), are not shown. Fig. 5B is identical to Fig. 3B. Fig. 5A shows the screen space polygons SGP1 and SGPn; both are mapped from the same source polygon from the world space WO, only a different mapping has been used according to the associated render instant t1, tn along the motion path. The display instants t1 to tn are collectively also referred to as tj, and the screen space polygons SGP1 to SGPn are collectively referred to as screen space polygons SGPj. The position of the screen space polygons SGPj depends either on the motion data provided by the application or on motion data determined from the positions of the screen space polygons SGPj at two successive sample instants ts of the geometric data. The 3D application may be a 3D game, a VRML browser, a 3D user interface, an MPEG-4 visual renderer, visiophony or any other 3D application. Although in Fig. 5 the screen space polygon SGPn is a translated version of the screen space polygon SGP1, movements other than along a straight line are possible as well. The vertices of the screen space polygon SGP1 are indicated by the screen space positions xv11,yv11; xv12,yv12; xv13,yv13 which usually do not coincide with the screen space grid positions xg,yg. The vertices of the screen space polygon SGPn are indicated by the screen space positions xvn1,yvn1; xvn2,yvn2; xvn3,yvn3. The vertices uv1,vv1 to uv3,vv3 of the texture space polygon TGP are provided by the 3D application and stay the same in time (or at least during a certain period in time). It is the same texture which has to be applied to the moving projection (defined by the different screen space polygons SGPj) in the screen space SSP. The moving 3D model together with the perspective mapping to the screen space SSP determines the vertices of the screen space polygons SGPj. It has to be noted that the index j is used to refer to the items related to the plurality of the n screen space polygons SGP1 to SGPn. The image IM to be displayed is determined by the intensities PIP(xgj,ygj) of the pixels which are positioned on the screen space grid positions xgj,ygj. For example, if n = 4, the display rate (also called frame rate) of the image IM is four times higher than the sample rate of the input data supplied by the 3D application.
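As a purely illustrative, self-contained sketch of this per-instant operation, the n pipelines of Fig. 4 can be pictured as a loop over the display instants tj; the polygon data, the straight-line displacement, n = 4 and all names below are assumptions, not taken from the patent.

```cpp
#include <array>
#include <cstdio>

struct Vec2 { float x, y; };
using Polygon = std::array<Vec2, 3>;

int main() {
    const int n = 4;              // up-conversion factor: 4 display instants
    const float Tf = 0.04f;       // temporal interval between sample instants
    Polygon sgp1 = {{{10, 10}, {40, 12}, {25, 35}}};  // SGP1 at instant t1
    Vec2 displacement = {8, 0};   // vertex displacement over one interval Tf

    for (int j = 1; j <= n; ++j) {                    // pipeline VERj .. FBj
        float a = float(j - 1) / float(n);            // fraction of Tf at tj
        Polygon sgpj;                                 // SGPj at instant tj
        for (int v = 0; v < 3; ++v)
            sgpj[v] = {sgp1[v].x + a * displacement.x,
                       sgp1[v].y + a * displacement.y};
        // Here SGPj would be rasterized, mapped onto the *same* texture
        // space polygon TGP, shaded, and written into frame buffer FBj.
        std::printf("t%d = %.3f s: vertex 1 at (%.1f, %.1f)\n",
                    j, a * Tf, sgpj[0].x, sgpj[0].y);
    }
}
```

Each iteration corresponds to one pipeline VERj, SRASj, MAPj, PSj of Fig. 4; only the transform differs per instant, while the texture space polygon TGP is shared by all instants.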
Said differently, in one temporal interval Tf, four display instants tj (t1 to t4) occur, to which four positions of the screen space polygon SGPj in the screen space SSP are associated. The temporal interval Tf is the period of time between two successive sample instants of the geometric data supplied by the 3D application. In prior art systems, the temporal interval between two displayed rendered images (frames) is often referred to as the frame period and is equal to the sampling period of the input signal (the sampling of the geometry delivered by the 3D application). However, in the system in accordance with the invention, the frame period of the output signal is determined by the number of display instants tj occurring within the temporal interval Tf, i.e. the sample period of the input signal.

The geometric data comprises the vertices of the texture space polygon TGP, data defining the perspective mapping from the 3D space to the screen space SSP, and the motion data. Preferably, motion data is provided which indicates the motion path of the vertices of the screen space polygons SGPj within the temporal interval Tf. The motion data can be used to obtain the motion path, which may be described with a displacement vector indicating the displacement of the vertices of the polygon from the previous sampling instant to the current sampling instant. The displacement vectors of the vertices of a polygon may differ in direction and size. For example, a triangle polygon may rotate around one of its vertices (so the displacement size of that vertex is zero), and then the displacement vectors of the two other vertices (if they do not coincide) differ in direction (and in size if the distances of the two vertices to the first differ). However, instead of displacement vectors, the motion data may be a more advanced description of the motion path, such as, for example, curves described with conics, composite curves, Bezier curves, B-spline curves, or rational polynomials. In the case of curved motion paths, motion data determined from the positions of the screen space polygons SGPj at more than two successive sample instants ts of the geometric data may be used. Preferably, the application should supply the motion data together with the geometry data. However, in special cases, it is also possible to detect the motion data from the geometry data at different sampling instants.

The vertex transformation and lighting unit VER is split into a plurality of units indicated by VERj. Each unit VERj transforms the world space vertex coordinates to screen space coordinates of the vertices of the polygons SGPj and calculates a vertex color depending on the lighting state and the vertex position at the display instant tj. At each one of the rendering instants tj, the rasterizer SRASj determines the screen space grid positions xgj,ygj within the polygon SGPj. Thus, at the rendering instant t1, the rasterizer SRAS1 determines the screen space grid positions xg1,yg1 within the polygon SGP1. These screen space grid positions xgj,ygj inside the screen space polygons SGPj are indicated with a cross and are also referred to as the pixel positions. The mappers MAPj map the screen space grid positions xgj,ygj within the screen space polygons SGPj to the texture space coordinates uj,vj which generally do not coincide with the texture space grid coordinates ug,vg for which the intensities TI(ug,vg) are stored in the texture memory TM. The texture space grid positions ug,vg are the positions which have integer values for u and v.
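To illustrate the rotation example above, where the displacement vectors of the vertices differ in direction and size, the following hypothetical sketch evaluates the vertex positions of a triangle rotating about its first vertex at a fraction a (0 to 1) of the temporal interval Tf; the function name and the rotation parameterisation are illustrative only.

```cpp
#include <array>
#include <cmath>

struct Vec2 { float x, y; };

// Vertex positions at fraction a (0..1) of the temporal interval Tf when
// the triangle rotates by 'angle' radians around its first vertex.
std::array<Vec2, 3> rotateAboutVertex0(const std::array<Vec2, 3>& tri,
                                       float angle, float a) {
    float c = std::cos(a * angle), s = std::sin(a * angle);
    std::array<Vec2, 3> out;
    out[0] = tri[0];                       // pivot: displacement size zero
    for (int v = 1; v < 3; ++v) {
        float dx = tri[v].x - tri[0].x;    // position relative to the pivot
        float dy = tri[v].y - tri[0].y;
        out[v] = {tri[0].x + c * dx - s * dy,
                  tri[0].y + s * dx + c * dy};
    }
    return out;
}
```

The two non-pivot vertices follow circular arcs, so their per-instant displacement vectors indeed differ in direction, and in size if their distances to the pivot differ.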
Because the same texture space polygon TGP is required to provide the texture to the different screen space polygons SGPj, the mappers MAPj always map the screen space grid positions xgj,ygj inside the different screen space polygons SGPj to texture space coordinates uj,vj inside the same texture space polygon TGP. The pixel shaders PSj each receive a set of attributes ATR per pixel, the grid screen coordinates xgj,ygj of the pixel and the corresponding texture coordinates uj,vj. The texture coordinates uj,vj are used to address texture data TI(ug,vg) on grid texture positions ug,vg stored in the texture memory TM via the texture space resampler TSR and the texture cache TC. The pixel shaders PSj may modify the texture coordinate data uj,vj and may apply and combine several texture maps on the same pixel. They may also perform shading without the use of texture data but on the basis of a formula, such as the well-known Gouraud and Phong shading techniques.

The texture space resampler TSR determines the intensity PI(uj,vj) associated with the intensity PSI(xgj,ygj) of the pixel at the screen space grid position (xgj,ygj) mapped to the texture space coordinate (uj,vj) in-between texel grid positions (ug,vg). The texel intensities TI(ug,vg) for the texture space grid positions ug,vg are stored in the texture memory TM. The texture space resampler TSR determines each one of the intensities PI(uj,vj) by filtering and accumulating the texel intensities TI(ug,vg) of the texels which have texture space grid coordinates ug,vg and which have to contribute to the intensity PI(uj,vj). Thus, the texture space resampler TSR determines the intensity PI(uj,vj) at the texture space position uj,vj by filtering the texel intensities on texture space grid positions ug,vg surrounding the texture space position uj,vj. For example, a bilinear interpolation using the four texture space grid positions ug,vg (indicated in Fig. 5B with 1 to 4) surrounding the texture space position uj,vj may be used. The resulting intensity PI(uj,vj) at the position uj,vj is used by the pixel shader PSj to determine the pixel intensity PSI(xgj,ygj) on the pixel grid position xgj,ygj.

The texture cache TC temporarily stores the texel intensities TI(ug,vg) required for the determination of all the intensities PI(uj,vj) of the texture space coordinates (uj,vj) mapped by the mappers MAPj. The pixel shaders PSj determine the contributions of the intensities PI(uj,vj) to the pixel intensities PSI(xgj,ygj). Thus, if all these contributions to the pixel intensities PSI(xgj,ygj) are determined for the same texture space polygon TGP at all the display instants tj, successively for each screen space polygon SGP, the data traffic between the texture cache TC and the texture memory TM is not increased compared to the rendering of only a single screen space polygon SGP, provided that all the texel intensities TI(ug,vg) of the current polygon fit in the texture cache TC. If all the texels applied to the screen space polygon SGP do not fit within the texture cache TC, the polygon SGP can be subdivided into smaller parts (e.g. into blocks or other polygons) such that the texels of such a part do fully fit into the texture cache TC. Still, in every temporal interval Tf, for every texture space polygon TGP, only one fetch of the relevant data from the texture memory TM is required, independent of the number of rendering instants tj.
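This traffic argument can be made concrete with a small counting sketch; the cache capacity, the byte counts and the simple halving subdivision are arbitrary assumptions (a real system would subdivide the polygon geometrically), and all names are hypothetical.

```cpp
#include <cstddef>
#include <cstdio>

struct TexelRegion { std::size_t bytes; };       // texel footprint of a TGP

constexpr std::size_t kCacheBytes = 32 * 1024;   // assumed cache capacity
static int g_memoryFetches = 0;                  // traffic from texture memory TM
static int g_cacheReads    = 0;                  // passes served by texture cache TC

void renderPolygon(TexelRegion region, int n) {
    if (region.bytes > kCacheBytes) {
        // Subdivide into parts whose texels fully fit into the cache.
        renderPolygon({region.bytes / 2}, n);
        renderPolygon({region.bytes - region.bytes / 2}, n);
        return;
    }
    ++g_memoryFetches;            // one TM fetch per part per interval Tf ...
    for (int j = 1; j <= n; ++j)
        ++g_cacheReads;           // ... and n resampling passes from the cache
}

int main() {
    renderPolygon({100 * 1024}, 4);   // 100 KiB of texels, n = 4 instants
    std::printf("TM fetches: %d, TC passes: %d\n",
                g_memoryFetches, g_cacheReads);
}
```

With these parameters the sketch reports four fetches from the texture memory TM and sixteen cache passes; increasing n changes only the latter, which is exactly the bandwidth property described above.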
The data bandwidth between the texture cache TC and the texture space resampler TSR increases by a factor equal to the number of rendering instants tj, and thus by the frame rate up-conversion factor. Consequently, if the external texture memory TM is connected to the rest of the ITM system via a bus, the data rate on this bus does not depend on the frame rate up-conversion factor. This is in contrast to the prior art ITM system, which does not have the texture cache TC and which does not process the texture space polygons TGP one by one, each one for all rendering instants tj.

The hidden surface removal units HSRj usually include a Z-buffer which enables determination of the visible colors on a per-pixel basis. Usually, the Z-buffer has the size of a frame or a tile. In the case of tile-based rendering, the tile size is relatively small and can even be made smaller for optimal use of the cache in the present frame rate up-conversion technique. The depth value z of a produced pixel value PSI(xgj,ygj) is tested against the depth value stored in the Z-buffer belonging to the frame buffer FBj at the same pixel screen coordinate xgj,ygj (thus on the screen grid). Depending on the outcome of the test, the pixel intensity or color PIP(xgj,ygj) is written into the frame buffer FBj and the Z-buffer belonging to FBj is updated. The image to be displayed IM is read from the frame buffer FBj.

The intensities of the texels TI(ug,vg) are stored in the texture memory TM for the texture space grid positions ug,vg. Several texture maps of the same texture may be stored in different resolutions. A known technique that uses these different resolution textures is called MIP-mapping. The texture space grid positions ug,vg within the texture space polygon TGP are indicated by dots in Fig. 5B. The mappers MAPj map the screen space grid coordinates xgj,ygj to corresponding texture space positions uj,vj in the texture space TSP. The intensity PI(uj,vj) at a texture space position uj,vj is determined by filtering. For example, the intensity PI(uj,vj) at the texture space position uj,vj which is, or contributes to, the intensity PSI(xgj,ygj) of the pixel at the screen space grid position xgj,ygj, is determined as a weighted sum of intensities at surrounding texture space grid positions ug,vg. For example, a weighted sum of the texel intensities TI(ug,vg) at the texture space grid positions ug,vg indicated by 1, 2, 3 and 4 is determined.

Fig. 6 shows a computer comprising the inverse texture mapping system. The computer PC comprises a processor 3, a graphics adapter 2 and a memory 4. The processor 3 is suitably programmed to supply input data II to the graphics adapter 2. The processor 3 communicates with the memory 4 via the bus D1. The graphics adapter 2 comprises the ITM system 1. Usually, the graphics adapter 2 is a module which is plugged into a suitable slot (for example an AGP slot). Usually, the graphics adapter comprises its own memory (for example the texture memory TM and the frame buffer FB). However, the graphics adapter may use part of the memory 4 of the computer PC, in which case the graphics adapter needs to communicate with the memory 4 via the bus D2, or via the processor 3 and the bus D1. The graphics adapter 2 supplies the output image OI via a standard interface to the display apparatus DA. The display apparatus may be any suitable display, such as, for example, a cathode ray tube, a liquid crystal display, or any other matrix display.
The computer PC and the display DA need not be separate units which communicate via a standard interface, but may be combined in a single apparatus, such as, for example, a personal digital assistant (PDA or pocket PC) or any other mobile device with a display for displaying images. Fig. 7 shows a display apparatus comprising the inverse texture mapping system. The display apparatus DA comprises the ITM pipeline 1 which receives the input data II (geometry and related data) and supplies the output image OI to a signal processing circuit 11. The signal processing circuit 11 processes the output image OI to obtain a drive signal DS for the display 12.

To conclude, in a preferred embodiment, the inverse texture mapping 3D graphics processor maps a 3D model WO onto a screen space SSP. A texture memory TM stores texel intensities TI(ug,vg) of texture space grid positions ug,vg. A plurality of screen space rasterizers SRASj determines pixel grid positions within different screen space polygons SGPj at a plurality of corresponding different display instants tj during a same temporal interval Tf between sample instants ts of geometric data of the 3D model WO. The screen space polygons SGPj have different positions in the screen space SSP dependent on motion information of the 3D model WO with respect to the camera. A plurality of corresponding mappers MAPj maps the pixel grid positions of the screen space polygons SGPj at the different display instants tj to texture space positions uj,vj. A texture space resampler TSR determines texel intensities PI(uj,vj) at the texture space positions uj,vj from the texel grid intensities TI(ug,vg) of the texture space grid positions ug,vg stored in the texture memory TM. A texture cache TC temporarily stores, for a texture space polygon TGP, the texel intensities TI(ug,vg) required by the texture space resampler TSR during the temporal interval Tf for all the screen space polygons SGPj which are associated with a same texture space polygon TGP. A plurality of corresponding pixel shaders PSj determines, at said different display instants tj, pixel intensities PSI(xgj,ygj) from the texel intensities PI(uj,vj).

It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. Use of the verb "comprise" and its conjugations does not exclude the presence of elements or steps other than those stated in a claim. The article "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims

CLAIMS:
1. An inverse texture mapping 3D graphics processor for mapping a 3D model
(WO) onto a screen space (SSP), said graphics processor comprising: a texture memory (TM) for storing texel intensities TI(ug,vg) of texture space grid positions (ug,vg), a plurality of screen space rasterizers (SRASj) for determining pixel grid positions (xg,yg) within different screen space polygons (SGP) at a plurality of corresponding different display instants (tj) during a same temporal interval (Tf) between sample instants of geometric data of the 3D model (WO), wherein the screen space polygons (SGP) have different positions in the screen space (SSP) dependent on motion information of the 3D model (WO), a plurality of corresponding mappers (MAPj) for mapping the pixel grid positions (xg,yg) of the screen space polygons (SGP) at the different display instants (tj) to texture space positions (uj,vj), a texture space resampler (TSR) for determining texel intensities (PI(uj,vj)) at the texture space positions (uj,vj) from the texel grid intensities (TI(ug,vg)) of the texture space grid positions (ug,vg) stored in the texture memory (TM), a texture cache (TC) for temporarily storing, for every texture space polygon (TGP), the texel intensities TI(ug,vg) required by the texture space resampler (TSR) during the temporal interval (Tf) for all the screen space polygons (SGP) being associated with a same texture space polygon (TGP), and a plurality of corresponding pixel shaders (PSj) for determining, at said different display instants (tj), pixel intensities from the texel intensities (PI(uj,vj)).
2. An inverse texture mapping 3D graphics processor as claimed in claim 1, wherein the motion information comprises motion data determining a path of motion within the temporal interval (Tf).
3. An inverse texture mapping 3D graphics processor as claimed in claim 1, wherein the motion information comprises a displacement vector indicating a displacement between vertices (xv11,yv11; xv12,yv12; xv13,yv13) of the screen space polygon (SGP) at a previous sample instant and vertices (xvn1,yvn1; xvn2,yvn2; xvn3,yvn3) of the screen space polygon (SGP) at a current sample instant.
4. An inverse texture mapping 3D graphics processor as claimed in claim 1, wherein the motion data comprises two model/view matrices, one for a current sample instant and one for a previous sample instant.
5. An inverse texture mapping 3D graphics processor as claimed in claim 1, wherein the motion information is received from a 3D application.
6. An inverse texture mapping 3D graphics processor as claimed in claim 1, further comprising a plurality of frame buffers (FBj) for storing intensities (PIP(xgj,ygj)) being determined at the screen grid positions (xgj,ygj).
7. An inverse texture mapping 3D graphics processor as claimed in claim 1, wherein for a particular one of the 3D models (WO) no texture maps are stored in the texture cache (TC) and wherein the pixel shaders (PSj) are arranged for performing pixel shading on basis of non-texture data.
8. An inverse texture mapping 3D graphics processor as claimed in claim 6, wherein the mappers (MAPj) are arranged for performing an identical mapping to the frame buffers (FBj) for non-moving objects.
9. An inverse texture mapping 3D graphics processor as claimed in claim 1, further comprising means for subdividing the screen space polygon (SGP) into smaller parts if all the texels applied to the screen space polygon (SGP) do not fit within the texture cache (TC), such that the texels of each one of the smaller parts do fully fit into the texture cache (TC).
10. A graphics adapter comprising the inverse texture mapping 3D graphics processor of claim 1.
11. A computer comprising the inverse texture mapping 3D graphics processor of claim 1.
12. A display apparatus comprising the inverse texture mapping 3D graphics processor of claim 1.
13. A method of inverse texture mapping for mapping a 3D model (WO) onto a screen space (SSP), said method comprising: storing (TM) texel intensities TI(ug,vg) of texture space grid positions (ug,vg), determining (SRASj) pixel grid positions (xgj,ygj) within different screen space polygons (SGP) at a plurality of corresponding different display instants (tj) during a same temporal interval (Tf) between sample instants of geometric data of the 3D model (WO), wherein the screen space polygons (SGP) have different positions in the screen space (SSP) dependent on motion information of the 3D model (WO), mapping (MAPj) the pixel grid positions (xgj,ygj) of the screen space polygons (SGP) at the different display instants (tj) to texture space positions (uj,vj), determining (TSR) texel intensities (PI(uj,vj)) at the texture space positions (uj,vj) from the texel grid intensities (TI(ug,vg)) of the texture space grid positions (ug,vg) stored in the texture memory (TM), temporarily storing (TC), for every texture space polygon (TGP), the texel intensities TI(ug,vg) required by the texture space resampler (TSR) during the temporal interval (Tf) for all the screen space polygons (SGP) being associated with a same texture space polygon (TGP), and determining (PSj), at said different display instants (tj), pixel intensities
(PSI(xgj,ygj)) from the texel intensities (PI(uj,vj)).
EP05749079A 2004-06-16 2005-06-09 Inverse texture mapping 3d graphics system Withdrawn EP1766584A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP05749079A EP1766584A2 (en) 2004-06-16 2005-06-09 Inverse texture mapping 3d graphics system

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP04102746 2004-06-16
EP05749079A EP1766584A2 (en) 2004-06-16 2005-06-09 Inverse texture mapping 3d graphics system
PCT/IB2005/051897 WO2005124693A2 (en) 2004-06-16 2005-06-09 Inverse texture mapping 3d graphics system

Publications (1)

Publication Number Publication Date
EP1766584A2 true EP1766584A2 (en) 2007-03-28

Family

ID=35462636

Family Applications (1)

Application Number Title Priority Date Filing Date
EP05749079A Withdrawn EP1766584A2 (en) 2004-06-16 2005-06-09 Inverse texture mapping 3d graphics system

Country Status (4)

Country Link
EP (1) EP1766584A2 (en)
JP (1) JP2008502979A (en)
CN (1) CN101006471B (en)
WO (1) WO2005124693A2 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010041215A1 (en) * 2008-10-09 2010-04-15 Nxp B.V. Geometry primitive shading graphics system
US10726619B2 (en) * 2015-10-29 2020-07-28 Sony Interactive Entertainment Inc. Foveated geometry tessellation
US10726626B2 (en) * 2017-11-22 2020-07-28 Google Llc Interaction between a viewer and an object in an augmented reality environment

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1993002529A1 (en) * 1991-07-23 1993-02-04 British Telecommunications Public Limited Company Method and device for frame interpolation of a moving image
GB9115874D0 (en) * 1991-07-23 1991-09-04 British Telecomm Frame interpolation
US6331856B1 (en) * 1995-11-22 2001-12-18 Nintendo Co., Ltd. Video game system with coprocessor providing high speed efficient 3D graphics and digital audio signal processing
JP3645024B2 (en) * 1996-02-06 2005-05-11 株式会社ソニー・コンピュータエンタテインメント Drawing apparatus and drawing method
JP3481077B2 (en) * 1997-05-19 2003-12-22 松下電器産業株式会社 Graphic display method and device
CA2250021C (en) * 1997-05-19 2007-02-06 Matsushita Electric Industrial Co., Ltd. Graphic display apparatus, synchronous reproduction method, and av synchronous reproduction apparatus
JP2000025307A (en) * 1998-07-14 2000-01-25 Fuji Xerox Co Ltd Method and system for sharing parameters of image processor
JP2001236519A (en) * 2000-02-21 2001-08-31 Seiko Epson Corp Device and method for reproducing moving image and information recording medium
US7174050B2 (en) * 2002-02-12 2007-02-06 International Business Machines Corporation Space-optimized texture maps
JP3934111B2 (en) * 2004-02-04 2007-06-20 株式会社ソニー・コンピュータエンタテインメント Drawing apparatus and drawing method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2005124693A3 *

Also Published As

Publication number Publication date
CN101006471A (en) 2007-07-25
JP2008502979A (en) 2008-01-31
CN101006471B (en) 2010-09-01
WO2005124693A3 (en) 2006-03-23
WO2005124693A2 (en) 2005-12-29

Similar Documents

Publication Publication Date Title
US6229553B1 (en) Deferred shading graphics pipeline processor
EP0875860B1 (en) Precise gradient calculation system and method for a texture mapping system of a computer graphics system
US6771264B1 (en) Method and apparatus for performing tangent space lighting and bump mapping in a deferred shading graphics processor
US6532013B1 (en) System, method and article of manufacture for pixel shaders for programmable shading
US8224107B2 (en) Method and system for signal processing, for instance for mobile 3D graphic pipelines, and computer program product therefor
US9208605B1 (en) Temporal antialiasing in a multisampling graphics pipeline
US20140347359A1 (en) Cache-efficient processor and method of rendering indirect illumination using interleaving and sub-image blur
US9367946B2 (en) Computing system and method for representing volumetric data for a scene
CN107784622B (en) Graphics processing system and graphics processor
US8736627B2 (en) Systems and methods for providing a shared buffer in a multiple FIFO environment
EP1759355B1 (en) A forward texture mapping 3d graphics system
US7525553B2 (en) Computer graphics processor and method for generating a computer graphics image
Policarpo et al. Deferred shading tutorial
JP2006517705A (en) Computer graphics system and computer graphic image rendering method
EP1766584A2 (en) Inverse texture mapping 3d graphics system
US7385604B1 (en) Fragment scattering
Stewart et al. Pixelview: A view-independent graphics rendering architecture
US20240212257A1 (en) Workload packing in graphics texture pipeline
WO2010041215A1 (en) Geometry primitive shading graphics system
Muszyński et al. Wide Field of View Projection Using Rasterization
Angel et al. An interactive introduction to OpenGL programming
Zhou et al. Scene depth reconstruction on the GPU: a post processing technique for layered fog
Angel et al. An interactive introduction to OpenGL and OpenGL ES programming
Laakso Rendering with Coherent Layers
Chen et al. Blending and Texture Mapping

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20070116

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: NXP B.V.

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20080331

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: TRIDENT MICROSYSTEMS (FAR EAST) LTD.

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: ENTROPIC COMMUNICATIONS, INC.

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20140103