EP1766584A2 - Inverse texture mapping 3d graphics system - Google Patents

Inverse texture mapping 3d graphics system

Info

Publication number
EP1766584A2
Authority
EP
European Patent Office
Prior art keywords
texture
space
positions
screen space
intensities
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP05749079A
Other languages
German (de)
English (en)
French (fr)
Inventor
Kornelis Meinds
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Entropic Communications LLC
Original Assignee
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics NV filed Critical Koninklijke Philips Electronics NV
Priority to EP05749079A
Publication of EP1766584A2
Legal status: Withdrawn

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/04: Texture mapping
    • G06T 15/005: General purpose rendering architectures

Definitions

  • the invention relates to an inverse texture mapping 3D graphics processor, a graphics adapter comprising the 3D graphics processor, a computer comprising the 3D graphics processor, a display apparatus comprising the 3D graphics processor, and a method of inverse texture mapping.
  • ITM: inverse texture mapping
  • a first aspect of the invention provides an inverse texture mapping 3D graphics processor as claimed in claim 1.
  • a second aspect of the invention provides a graphics adapter comprising the 3D graphics processor, as claimed in claim 10.
  • a third aspect of the invention provides a computer comprising the 3D graphics processor, as claimed in claim 11.
  • a fourth aspect of the invention provides a display apparatus comprising the 3D graphics processor, as claimed in claim 12.
  • a fifth aspect of the invention provides a method of inverse texture mapping as claimed in claim 13.
  • Advantageous embodiments are defined in the dependent claims.
  • the inverse texture mapping 3D graphics processor in accordance with the first aspect maps a 3D model onto screen space.
  • the graphics processor comprises a texture memory for storing texel intensities of texture space grid positions.
  • a plurality of screen space rasterizers determine pixel grid positions within different screen space polygons at a plurality of corresponding different display instants during a same temporal interval between sample instants of geometric data of the 3D model.
  • the pixel grid positions in screen space are considered to be positioned on a grid; the pixel intensities related to these pixel grid positions are stored in a frame buffer memory and are used to display the image.
  • These different instants are referred to as display instants because the screen space projection of the 3D model is rendered for displaying at these instants.
  • the ratio between the number of display instants and the number of sample instants is the frame rate up-conversion factor. For example, geometry sampled at 25 Hz and rendered at four display instants per temporal interval yields an up-conversion factor of four, i.e. a 100 Hz output.
  • the screen space polygons associated with the same polygon of the 3D model have different positions in the screen space dependent on motion information of the 3D model with respect to the camera position (also called eye position).
  • a plurality of corresponding mappers maps the pixel grid positions of the screen space polygons at the different instants to texture space positions that usually do not coincide with the texel grid positions on which the texel intensities are stored.
  • a texture space resampler determines texel intensities at the mapped screen space positions from the texel grid intensities of the texture space grid positions stored in the texture memory or in texture cache.
  • a texture cache temporarily stores, for every texture space polygon, the texel grid intensities required by the texture space resampler during the temporal interval for all the screen space polygons being associated with a same texture space polygon. If the texture of a polygon does not fit in the texture cache, a strategy of partitioning the polygon into smaller parts (e.g. blocks) can be used.
  • a plurality of corresponding pixel shaders determine, at said different display instants, pixel intensities, optionally from the texel intensities received from the texture space resampler and optionally on the basis of a formula such as the well-known Gouraud shading.
  • the same texture samples stored in the texture cache can be used for all the associated screen space polygons during a same temporal interval wherein the plurality of rendering instants occurs.
  • the texture needs to be fetched once per temporal interval from the texture memory and not for every display instant. Consequently, the data rate between the texture cache and the texture memory does not depend on the frame rate up-conversion factor. This is an important improvement because the data rate to the texture memory is limited by the speed of the texture memory and by the data rate on the bus to the texture memory. This is especially relevant if the texture memory is present on a separate chip.
  • the motion information comprises motion data that can be used to determine the path of motion of the polygon in the screen space within the temporal interval.
  • the vertices of the polygon(s) and the mapping of pixels to texture space can be determined from this path of motion. It has to be noted that the vertices of a polygon can have different paths of motion. Thus, the path of motion of the polygon is determined by the path of motion of each of the vertices of the polygon.
  • the motion information is a displacement vector that indicates a displacement of the vertices of the polygon in the screen space between two sample instants. The displacement at a particular one of the rendering instants can be determined by (linearly) interpolating the displacement defined by the displacement vector.
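  • As an illustration only (not taken from the patent text), a minimal C++ sketch of this linear interpolation of a displacement vector; the type Vec2 and the function vertexAt are assumed names:

        // Position of a vertex at display instant tj, obtained by linearly
        // interpolating the displacement over the temporal interval Tf.
        struct Vec2 { float x, y; };

        // prev: vertex position at the previous sample instant
        // disp: displacement of the vertex over the whole interval Tf
        // j, n: index of the display instant and number of display instants
        Vec2 vertexAt(Vec2 prev, Vec2 disp, int j, int n) {
            float a = static_cast<float>(j) / static_cast<float>(n); // fraction of Tf
            return { prev.x + a * disp.x, prev.y + a * disp.y };
        }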
  • motion data is used in the form of two model/view matrices: one for the current sample instant and one for the previous sample instant.
  • a motion vector in screen space can be determined.
  • the parameters of the mapping functions of the mappers for different rendering instants between two successive sample instants can be determined from this information.
  • This is a robust and efficient method to obtain displacement vectors in the eye or world space.
  • the vertices of the previous frame (or, more generally, the previous sample instant of the temporal interval) can be subtracted from the vertices of the current frame.
  • the 3D system calculates the coordinates of the eye space vertices for both the current frame instant (or, more generally, the current sample instant) and the previous frame instant.
  • the 3D application needs to send, in addition to the normal model-view matrix, an additional model-view matrix for the previous frame instant.
  • the application may buffer the model-view matrices to resend them efficiently.
  • the geometry transformation unit of the 3D system applies both model-view matrices to transform each vertex to a "current" and a "previous" position in eye space.
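  • A hedged C++ sketch of this two-matrix approach; the row-major matrix layout, the use of column vectors and all names are illustrative assumptions:

        #include <array>

        using Vec4 = std::array<float, 4>;
        using Mat4 = std::array<std::array<float, 4>, 4>;

        // Row-major 4x4 matrix times column vector.
        Vec4 mul(const Mat4& m, const Vec4& v) {
            Vec4 r{};
            for (int i = 0; i < 4; ++i)
                for (int k = 0; k < 4; ++k)
                    r[i] += m[i][k] * v[k];
            return r;
        }

        // Eye space displacement of one model space vertex between the
        // previous and the current sample instant, using the two model-view
        // matrices supplied by the application.
        Vec4 eyeSpaceMotion(const Mat4& mvCur, const Mat4& mvPrev, const Vec4& v) {
            Vec4 cur = mul(mvCur, v);
            Vec4 prev = mul(mvPrev, v);
            return { cur[0] - prev[0], cur[1] - prev[1], cur[2] - prev[2], 0.0f };
        }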
  • the motion information is provided by the 3D application.
  • the ITM 3D graphics processor may determine the motion information by relating the vertices of the geometry at the current sampling instant to those at the previous sampling instant.
  • the ITM processor comprises a plurality of frame buffers for storing the intensities that are determined at the screen grid positions.
  • Each frame buffer stores a rendered image for a particular one of the display instants.
  • the frame rate up-conversion is obtained.
  • no texture maps are stored in the texture cache (TC) and the pixel shaders (PSj) are arranged to perform pixel shading on the basis of non-texture data.
  • the ITM processor controls the mappers to perform an identical mapping to the frame buffers for non-moving objects. In fact, only one of the mappers needs to perform the mapping, and the output obtained this way is copied to all the frame buffers.
  • Fig. 1 shows a display of a 3D object on a display screen
  • Fig. 2 shows a block diagram of a prior art inverse texture mapping 3D graphics system
  • Figs. 3A and 3B illustrate the operation of the inverse texture mapping system
  • Fig. 4 shows a block diagram of the inverse texture mapping 3D graphics system in accordance with an embodiment of the invention
  • Figs. 5A and 5B illustrate the operation of the embodiment of the inverse texture mapping system shown in Fig. 4
  • Fig. 6 shows a computer comprising the inverse texture mapping system
  • Fig. 7 shows a display apparatus comprising the inverse texture mapping system.
  • Fig. 1 elucidates the display of a 3D object WO in world space on a display screen DS.
  • the object may also be available in other 3D spaces such as model or eye space; in the following, all these spaces are referred to as world space.
  • An object WO, which may be a three-dimensional object such as the cube shown, is projected onto the two-dimensional display screen DS.
  • a surface structure or texture defines the appearance of the three-dimensional object WO.
  • the polygon A has a texture TA and the polygon B has a texture TB.
  • the polygons A and B are also referred to by the more general term graphics primitives.
  • the projection onto the display screen DS of the object WO is obtained by defining an eye or camera position ECP within the world space.
  • Fig. 1 shows how the polygon SGP projected on the screen DS is obtained from the corresponding polygon A.
  • the polygon SGP in the screen space SSP is defined by its vertex coordinates in the screen space SSP. Only the projection of the geometry of the polygon A is used to determine the geometry of the polygon SGP. Usually, it suffices to know the vertices of the polygon A and the projection to determine the vertices of the polygon SGP.
  • the texture TA of the polygon A is not directly projected from the real world onto the screen space SSP.
  • the different textures of the real world object WO are stored in a texture map memory TM (see Fig. 2) or texture space TSP defined by the coordinates u and v.
  • Fig. 1 shows that the polygon A has a texture TA which is available in the texture space TSP in the area indicated by TA, while the polygon B has another texture TB which is available in the texture space TSP in the area indicated by TB.
  • the polygon A is projected onto the texture area TA to obtain a polygon TGP such that, when the texture present within the polygon TGP is projected onto the polygon A, the texture of the real-world object WO is obtained or at least resembled as closely as possible.
  • a perspective transformation PPT between the texture space TSP and the screen space SSP projects the texture of the polygon TGP on the corresponding polygon SGP.
  • This process is also referred to as texture mapping.
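  • One common way to realize the mapping between screen space and texture space exploits the fact that u/w, v/w and 1/w are affine functions of the screen coordinates for a planar polygon under perspective projection. The C++ sketch below assumes the plane coefficients have been set up per polygon; all names are illustrative, not the patent's interfaces:

        struct TexCoord { float u, v; };

        // Maps a screen space position x,y to a texture space position u,v.
        // (aU,bU,cU), (aV,bV,cV) and (aW,bW,cW) are the per-polygon plane
        // coefficients of u/w, v/w and 1/w in screen space.
        TexCoord mapScreenToTexture(float x, float y,
                                    float aU, float bU, float cU,
                                    float aV, float bV, float cV,
                                    float aW, float bW, float cW) {
            float uOverW   = aU * x + bU * y + cU;
            float vOverW   = aV * x + bV * y + cV;
            float oneOverW = aW * x + bW * y + cW;
            return { uOverW / oneOverW, vOverW / oneOverW }; // perspective divide
        }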
  • the textures are not all present in a global texture space, but every texture defines its own texture space TSP.
  • the textures in the texture space TSP are stored in a texture memory TM for a discrete number of positions in the texture space TSP.
  • these discrete positions are the grid positions in the texture space TSP determined by integer values of u and v. These discrete grid positions are further referred to as grid texture positions or grid texture coordinates.
  • Positions in the texture space which are not limited to the grid positions are referred to as positions in the texture space TSP or as positions in the u,v space TSP.
  • the positions in the u,v space may be represented by floating point numbers.
  • the image to be displayed is stored in a frame buffer memory.
  • these discrete positions are the grid positions in the screen space SSP determined by integer values of x and y. These discrete grid positions are referred to as grid screen positions or grid screen coordinates.
  • Positions in the x,y space which are not limited to the grid positions are referred to as positions in the x,y space or as positions in the screen space SSP.
  • graphics primitive indicates a polygon (such as polygon A) in the world space, or the polygon SGP in the screen space SSP, or the polygon TGP in the texture space TSP. It is clear from the context which graphics primitive is meant.
  • Fig. 2 shows a block diagram of a prior art inverse texture mapping 3D graphics system.
  • the vertex transformation and lighting unit VER transforms the vertex coordinates of the polygon A; B in world space to the screen space SSP to obtain screen space coordinates xv1,yv1 to xv3,yv3 of the vertices of the screen space polygon SGP.
  • the vertex T&L unit further performs lighting calculations to determine an intensity (also referred to as color) per vertex. If a texture TA, TB is to be applied to a screen space polygon SGP, the vertex T&L unit receives texture space coordinates uv1,vv1 to uv3,vv3 from the application.
  • the vertex T&L unit supplies both the screen space coordinates xv,yv (xv1,yv1; xv2,yv2; xv3,yv3 in Fig. 3A) and the texture space coordinates uv,vv (uv1,vv1; uv2,vv2; uv3,vv3 in Fig. 3B) of the vertices of the screen space polygons SGP and the texture space polygons TGP, respectively, such that their positions in the screen space SSP and the texture space TSP, respectively, are known.
  • the screen space rasterizer SRAS determines the grid positions xg,yg of the pixels which are positioned within the screen space polygon SGP, which is determined by the screen space coordinates xv,yv of its vertices. In the example shown in Fig. 3A, these screen space grid positions xg,yg within the screen space polygon SGP are indicated by crosses.
  • the rasterizer SRAS may include a so-called rasterizer setup which initializes temporary variables required by the rasterizer SRAS for efficient processing based on interpolation of the vertex attributes.
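  • A minimal C++ sketch of such a rasterization step, collecting the grid positions inside a triangle with edge functions over its bounding box; this is an illustrative stand-in, not the patented unit:

        #include <algorithm>
        #include <cmath>
        #include <utility>
        #include <vector>

        struct P { float x, y; };

        // Signed-area style edge function: the sign tells on which side of
        // edge a->b the point x,y lies.
        float edge(P a, P b, float x, float y) {
            return (b.x - a.x) * (y - a.y) - (b.y - a.y) * (x - a.x);
        }

        std::vector<std::pair<int, int>> rasterize(P v1, P v2, P v3) {
            std::vector<std::pair<int, int>> inside;
            int x0 = (int)std::floor(std::min({v1.x, v2.x, v3.x}));
            int x1 = (int)std::ceil(std::max({v1.x, v2.x, v3.x}));
            int y0 = (int)std::floor(std::min({v1.y, v2.y, v3.y}));
            int y1 = (int)std::ceil(std::max({v1.y, v2.y, v3.y}));
            for (int y = y0; y <= y1; ++y)
                for (int x = x0; x <= x1; ++x) {
                    float e1 = edge(v1, v2, (float)x, (float)y);
                    float e2 = edge(v2, v3, (float)x, (float)y);
                    float e3 = edge(v3, v1, (float)x, (float)y);
                    // a grid position is inside when all edge functions
                    // share one sign (handles both winding orders)
                    if ((e1 >= 0 && e2 >= 0 && e3 >= 0) ||
                        (e1 <= 0 && e2 <= 0 && e3 <= 0))
                        inside.push_back({x, y});
                }
            return inside;
        }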
  • the mapper MAP maps the screen space grid positions xg,yg to corresponding texture space positions u,v in the texture space TSP; see Figs. 3A and 3B. Generally, these texture space positions u,v will not coincide with the texture space grid positions ug,vg.
  • the pixel shader PS determines the intensity PSI(xg,yg) (also referred to as color) of a pixel with the screen space coordinates xg,yg and thus the texture space coordinates u,v.
  • the pixel shader PS receives a set of attributes ATR per pixel, the grid screen coordinates xg,yg of the pixel and the corresponding texture coordinates u,v.
  • the texture coordinates u,v are used to address texture data TI(ug,vg) on texture grid positions ug,vg stored in the texture memory TM via the texture space resampler TSR.
  • the pixel shader PS may modify the texture coordinate data u,v and may apply and combine several texture maps on the same pixel. It may also perform shading without the use of texture data, on the basis of a formula such as the well-known Gouraud and Phong shading techniques.
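  • As an illustrative aside, Gouraud shading amounts to a barycentric interpolation of per-vertex colors over the triangle; the sketch below uses assumed names and is not the patent's formulation:

        struct Col { float r, g, b; };

        // w1, w2, w3 are the barycentric weights of the pixel xg,yg with
        // respect to the three vertices (w1 + w2 + w3 = 1).
        Col gouraud(Col c1, Col c2, Col c3, float w1, float w2, float w3) {
            return { w1 * c1.r + w2 * c2.r + w3 * c3.r,
                     w1 * c1.g + w2 * c2.g + w3 * c3.g,
                     w1 * c1.b + w2 * c2.b + w3 * c3.b };
        }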
  • the texture space resampler TSR determines the intensity PI(u,v) associated with the intensity PSI(xg,yg) of the pixel at the screen space grid position xg,yg mapped to the texture space coordinate u,v in-between texel grid positions ug,vg.
  • the texture data corresponding to the texture space grid position ug,vg is indicated by TI(ug,vg).
  • the texel intensities TI(ug,vg) for texture space grid positions ug,vg are stored in the texture memory TM.
  • the texture space resampler TSR determines the intensity PI(u,v) by filtering and accumulating the texel intensities TI(ug,vg) of texels which have texture space grid coordinates ug,vg and which have to contribute to the intensity PI(u,v).
  • the texture space resampler TSR determines the intensity PI(u,v) at the texture space position u,v by filtering the texel intensities on texture space grid positions ug,vg surrounding the texture space position u,v. For example, a bilinear interpolation using the four texture space grid positions ug,vg (indicated in Fig. 3B with 1 to 4) surrounding the texture space position u,v may be used.
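  • A C++ sketch of this bilinear filtering over the four surrounding texels; the accessor tex is an assumed stand-in for reading the stored intensity TI(ug,vg):

        #include <cmath>
        #include <functional>

        // u,v is the mapped (non-grid) texture space position.
        float bilinear(std::function<float(int, int)> tex, float u, float v) {
            int u0 = (int)std::floor(u), v0 = (int)std::floor(v);
            float fu = u - u0, fv = v - v0;  // fractional offsets in the cell
            return (1 - fu) * (1 - fv) * tex(u0,     v0)      // texel 1
                 +      fu  * (1 - fv) * tex(u0 + 1, v0)      // texel 2
                 + (1 - fu) *      fv  * tex(u0,     v0 + 1)  // texel 3
                 +      fu  *      fv  * tex(u0 + 1, v0 + 1); // texel 4
        }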
  • the hidden surface removal unit HSR usually includes a Z-buffer which enables determination of the visible colors on a per-pixel basis.
  • the depth value z of a produced pixel value PSI(xg,yg) is tested against the depth value stored in the Z-buffer at the same pixel screen coordinate xg,yg (thus on the screen grid).
  • if the produced pixel is nearer, the pixel intensity or color PIP(xg,yg) is written into the frame buffer FB and the Z-buffer is updated.
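  • A minimal C++ sketch of this Z-buffer test; the linear buffer layout and the convention that a smaller z is nearer are assumptions:

        // Writes the shaded color into the frame buffer only if the produced
        // pixel is nearer than what the Z-buffer holds at xg,yg.
        void depthTestAndWrite(float z, float color,
                               float* zBuffer, float* frameBuffer,
                               int xg, int yg, int width) {
            int i = yg * width + xg;      // index of the screen grid position
            if (z < zBuffer[i]) {         // nearer than the stored depth
                zBuffer[i] = z;           // update the Z-buffer
                frameBuffer[i] = color;   // write PIP(xg,yg)
            }
        }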
  • the image to be displayed IM is read from the frame buffer FB.
  • a texture cache is present between the texture space resampler TSR and the texture memory TM.
  • the application provides the polygons in groups to minimize texture state switches. Each one of the groups of polygons is related to a same one of the textures.
  • the texture used for a particular group of polygons is stored wholly or partially in the texture cache, and the texture data can be fetched from the texture cache by subsequent polygons of the same group.
  • Figs. 3 A and 3B illustrate the operation of the inverse texture mapping system.
  • Fig. 3A shows the screen space polygon SGP in the screen space SSP.
  • the vertices of the polygon SGP are indicated by the screen space positions xv1,yv1; xv2,yv2; xv3,yv3 which usually do not coincide with the screen space grid positions xg,yg.
  • the screen space grid positions xg,yg are the positions which have integer values for x and y.
  • the image to be displayed is determined by the intensities (color and brightness) PIP(xg,yg) of the pixels which are positioned on the screen space grid positions xg,yg.
  • the rasterizer SRAS determines the screen space grid positions xg,yg within the polygon SGP.
  • Fig. 3B shows the texture space polygon TGP in the texture space TSP.
  • the vertices of the texture space polygon TGP are indicated by the texture space positions uv1,vv1; uv2,vv2; uv3,vv3 which usually do not coincide with the texture space grid positions ug,vg.
  • the texture space grid positions ug,vg are the positions which have integer values for u and v.
  • the intensities of the texels TI(ug,vg) are stored in the texture memory TM for these texture space grid positions ug,vg. Several texture maps with different resolutions of the same texture may be stored. A known technique which uses these different-resolution textures is called MIP-mapping.
  • the texture space grid positions ug,vg within the polygon TGP are indicated by dots in Fig. 3B.
  • the mapper MAP maps the screen space grid coordinates xg,yg to corresponding texture space positions u,v in the texture space.
  • the intensity at a texture space position u,v is determined by filtering.
  • the intensity at the texture space position u,v which is, or contributes to, the intensity of the pixel at the screen space grid position xg,yg is determined as a weighted sum of intensities at surrounding texture space grid positions ug,vg.
  • a weighted sum of the texel intensities TI(ug,vg) at the texture space grid positions ug,vg indicated by 1, 2, 3 and 4 is determined.
  • Fig. 4 shows a block diagram of the inverse texture mapping 3D graphics system in accordance with an embodiment of the invention.
  • the basic structure of the ITM shown in Fig. 4 is identical to the known ITM shown in Fig. 2. The difference is that a plurality of pipelines is present, each formed by a transform and lighting unit VERj, a rasterizer SRASj, a mapper MAPj, a pixel shader PSj, a hidden surface removal unit HSRj and a frame buffer FBj, while the texture space resampler TSR, the texture cache TC and the texture memory TM are shared.
  • the items may be hardware which is present multiple times, or the same hardware may be used in a time multiplexing mode, or a combination of these two possibilities may be implemented.
  • the texture cache TC of the prior art ITM system discussed with respect to Fig. 2 also stores a particular texture, but that texture is used for the different polygons of the groups of polygons which require the same texture, not for the same polygon at different display instants.
  • the signals (data) shown in Fig. 4 are the same as in Fig. 2; the only difference is that the index j is added to indicate that the signals depend on the rendering instant tj.
  • Figs. 5A and 5B illustrate the operation of the embodiment of the inverse texture mapping system shown in Fig. 4.
  • Fig. 5A shows the screen space polygon SGP1 at the render instant t1, and the screen space polygon SGPn at the render instant tn.
  • Fig. 5B is identical to Fig. 3B.
  • Fig. 5A shows the screen space polygons SGP1 and SGPn; both are mapped from the same source polygon of the world object WO, only a different mapping has been used according to the associated render instant t1, tn along the motion path.
  • the display instants t1 to tn are collectively also referred to as tj.
  • the screen space polygons SGP1 to SGPn are also collectively referred to as screen space polygons SGPj.
  • the position of the screen space polygons SGPj depends either on the motion data provided by the application or on motion data determined from the positions of the screen space polygons SGPj at two successive sample instants ts of the geometric data.
  • the 3D application may be a 3D game, a VRML browser, a 3D user interface, an MPEG-4 visual renderer, visiophony (video telephony) or any other 3D application.
  • the screen space polygon SGPn shown is a translated version of the screen space polygon SGP1; movements other than along a straight line are also possible.
  • the vertices of the screen space polygon SGP1 are indicated by the screen space positions xv11,yv11; xv12,yv12; xv13,yv13 which usually do not coincide with the screen space grid positions xg,yg.
  • the vertices of the screen space polygon SGPn are indicated by the screen space positions xvn1,yvn1; xvn2,yvn2; xvn3,yvn3.
  • the vertices uv1,vv1 to uv3,vv3 of the texture space polygon TGP are provided by the 3D application and stay the same in time (or at least during a certain period in time). It is the same texture which has to be applied to the moving projection (defined by the different screen space polygons SGPj) in the screen space SSP.
  • the moving 3D model together with the perspective mapping to the screen space SSP determines the vertices of the screen space polygons SGPj.
  • the index j is used to refer to the items related to the plurality of the n screen space polygons SGP1 to SGPn.
  • the temporal interval Tf is the period of time between two successive sample instants of the geometric data supplied by the 3D application.
  • the frame period is equal to the sampling period of the input signal (sampling of the geometry delivered by the 3D application).
  • the frame period of the output signal is determined by the number of display instants tj occurring within the temporal interval Tf or the sample period of the input signal.
  • the geometric data comprises the vertices of the texture space polygon TGP, data defining the perspective mapping from the 3D space to the screen space SSP, and the motion data.
  • the motion data is provided which indicates the motion path of the vertices of the screen space polygons SGPj within the temporal interval Tf.
  • the motion data can be used to obtain the motion path, which may be described with a displacement vector indicating the displacement of the vertices of the polygon from the previous sampling instant to the current sampling instant.
  • the displacement vectors of the vertices of a polygon may differ in direction and size. For example, a triangle polygon may rotate around one of its vertices (so the displacement size of that vertex is zero), and then the displacement vectors of the two other vertices (if they do not coincide) differ in direction (and in size if the distances of the two vertices to the first differ).
  • the motion data may be a more advanced description of the motion path, such as, for example, curves described with conics, composite curves, Bezier curves, B-spline curves, or rational polynomials.
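  • As an illustration of such a richer description, the C++ sketch below evaluates a quadratic Bezier motion path for a vertex at a fraction a of the temporal interval Tf; that the control points come from the motion data is an assumption:

        struct V2 { float x, y; };

        // Quadratic Bezier: position at parameter a in [0,1], where p0 and p2
        // are the vertex positions at the two sample instants and p1 shapes
        // the curve in-between.
        V2 bezier2(V2 p0, V2 p1, V2 p2, float a) {
            float b = 1.0f - a;
            return { b * b * p0.x + 2 * b * a * p1.x + a * a * p2.x,
                     b * b * p0.y + 2 * b * a * p1.y + a * a * p2.y };
        }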
  • motion data determined from the positions of the screen space polygons SGPj at more than two successive sample instants ts of the geometric data may be used.
  • the application should supply the motion data together with the geometry data.
  • the vertex transformation and lighting unit VER is split into a plurality of units indicated by VERj.
  • Each unit VERj transforms the world space vertex coordinates to screen space coordinates of the vertices of the polygons SGPj and calculates a vertex color depending on lighting state and the vertex position at display instant tj.
  • the rasterizer SRASj determines the screen space grid positions xgj,ygj within the polygon SGPj.
  • the rasterizer SRAS1 determines the screen space grid positions xg1,yg1 within the polygon SGP1.
  • these screen space grid positions xgj,ygj inside the screen space polygons SGPj are indicated with a cross and are also referred to as the pixel positions.
  • the mappers MAPj map the screen space grid positions xgj,ygj within the screen space polygons SGPj to the texture space coordinates uj,vj which generally do not coincide with the texture space grid coordinates ug,vg for which the intensities TI(ug,vg) are stored in the texture memory TM.
  • the texture space grid positions ug,vg are the positions which have integer values for u and v.
  • the mappers MAPj always map the screen space grid positions xgj,ygj inside the different screen space polygons SGPj to texture space coordinates uj,vj inside the same texture space polygon TGP.
  • the pixel shaders PSj each receive a set of attributes ATR per pixel, the grid screen coordinates xgj,ygj of the pixel and the corresponding texture coordinates uj,vj.
  • the texture coordinates uj,vj are used to address texture data TI(ug,vg) on texture grid positions ug,vg stored in the texture memory TM via the texture space resampler TSR and the texture cache TC.
  • the pixel shaders PSj may modify the texture coordinate data uj,vj and may apply and combine several texture maps on the same pixel. They may also perform shading without the use of texture data, on the basis of a formula such as the well-known Gouraud and Phong shading techniques.
  • the texture space resampler TSR determines the intensity PI(uj,vj) associated with the intensity PSI(xgj,ygj) of the pixel at the screen space grid position xgj,ygj mapped to the texture space coordinate uj,vj in-between texel grid positions ug,vg.
  • the texel intensities TI(ug,vg) for the texture space grid positions ug,vg are stored in the texture memory TM.
  • the texture space resampler TSR determines each one of the intensities PI(uj,vj) by filtering and accumulating the texel intensities TI(ug,vg) of texels which have texture space grid coordinates ug,vg and which have to contribute to the intensity PI(uj,vj).
  • the texture space resampler TSR determines the intensity PI(uj,vj) at the texture space position uj,vj by filtering the texel intensities on texture space grid positions ug,vg surrounding the texture space position uj,vj.
  • a bilinear interpolation using the four texture space grid positions ug,vg (indicated in Fig. 5B with 1 to 4) surrounding the texture space position uj,vj may be used.
  • the resulting intensity PI(uj,vj) at the position uj,vj is used by the pixel shader PSj to determine the pixel intensity PSI(xgj,ygj) on the pixel grid position xgj,ygj.
  • the texture cache TC temporarily stores the texel intensities TI(ug,vg) required for the determination of all the intensities PI(uj,vj) at the texture space coordinates uj,vj mapped by the mappers MAPj.
  • the pixel shaders PSj determine the contributions of the intensities PI(uj,vj) to the pixel intensities PSI(xgj,ygj). Thus, if all these contributions to the pixel intensities PSI(xgj,ygj) are determined for the same texture space polygon TGP at all the display instants tj, successively for each screen space polygon SGPj, the data traffic between the texture cache TC and the texture memory TM is not increased compared to the rendering of only a single screen space polygon SGP, provided that all the texel intensities TI(ug,vg) of the current polygon fit in the texture cache TC.
  • if they do not fit, the polygon SGP can be subdivided into smaller parts (e.g. into blocks or other polygons) such that the texels of such a part fully fit into the texture cache TC. Still, in every temporal interval Tf, for every texture space polygon TGP only one fetch of the relevant data from the texture memory TM is required, independent of the number of rendering instants tj. The data bandwidth between the texture cache TC and the texture space resampler TSR increases by a factor equal to the number of rendering instants tj and thus with the frame rate up-conversion factor.
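  • The loop ordering behind this bandwidth argument can be sketched as follows; all types and functions are hypothetical stand-ins, not the patent's interfaces:

        #include <vector>

        struct Polygon { int id; };

        void fetchIntoCache(const Polygon&) { /* one texture memory fetch */ }
        void renderAtInstant(const Polygon&, int /* j */) { /* cache reads only */ }

        // For each texture space polygon: fetch its texels once per temporal
        // interval Tf, then render the polygon at all n display instants tj.
        void renderInterval(const std::vector<Polygon>& polys, int n) {
            for (const Polygon& tgp : polys) {
                fetchIntoCache(tgp);          // once per Tf, not once per tj
                for (int j = 0; j < n; ++j)
                    renderAtInstant(tgp, j);  // traffic to TM is unchanged
            }
        }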
  • the hidden surface removal units HSRj usually include a Z-buffer which enables determination of the visible colors on a per-pixel basis.
  • the Z-buffer has the size of a frame or a tile. In the case of tile-based rendering, the tile size is relatively small and can even be made smaller for optimal use of the cache in the present frame rate up-conversion technique.
  • the depth value z of a produced pixel value PSI(xgj,ygj) is tested against the depth value stored in the Z-buffer belonging to the frame buffer FBj at the same pixel screen coordinate xgj,ygj (thus on the screen grid).
  • if the produced pixel is nearer, the pixel intensity or color PIP(xgj,ygj) is written into the frame buffer FBj and the Z-buffer belonging to FBj is updated.
  • the image to be displayed IM is read from the frame buffer FBj.
  • the intensities of the texels TI(ug,vg) are stored in the texture memory TM for the texture space grid positions ug,vg.
  • the texture space grid positions ug,vg within the texture space polygon TGP are indicated by dots in Fig. 5B.
  • the mappers MAPj map the screen space grid coordinates xgj,ygj to corresponding texture space positions uj,vj in the texture space TSP.
  • the intensity PI(uj,vj) at a texture space position uj,vj is determined by filtering.
  • the intensity PI(uj,vj) at the texture space position uj,vj which is, or contributes to, the intensity PSI(xgj,ygj) of the pixel at the screen space grid position xgj,ygj, is determined as a weighted sum of intensities at surrounding texture space grid positions ug,vg.
  • a weighted sum of the texel intensities TI(ug,vg) at the texture space grid positions ug,vg indicated by 1, 2, 3 and 4 is determined.
  • Fig. 6 shows a computer comprising the inverse texture mapping system.
  • the computer PC comprises a processor 3, a graphics adapter 2 and a memory 4.
  • the processor 3 is suitably programmed to supply input data II to the graphics adapter 2.
  • the processor 3 communicates with the memory 4 via the bus Dl.
  • the graphics adapter 2 comprises the ITM system 1.
  • the graphics adapter 2 is a module which is plugged into a suitable slot (for example an AGP slot).
  • the graphics adapter comprises its own memory (for example the texture memory TM and the frame buffer FB).
  • the graphics adapter may use part of the memory 4 of the computer PC, in which case the graphics adapter needs to communicate with the memory 4 via the bus D2 or via the processor 3 and the bus D1.
  • the graphics adapter 2 supplies the output image OI via a standard interface to the display apparatus DA.
  • the display apparatus may be any suitable display, such as, for example, a cathode ray tube, a liquid crystal display, or any other matrix display.
  • the computer PC and the display DA need not be separate units which communicate via a standard interface but may be combined in a single apparatus, such as, for example, a personal digital assistant (PDA or pocket PC) or any other mobile device with a display for displaying images.
  • Fig. 7 shows a display apparatus comprising the inverse texture mapping system.
  • the display apparatus DA comprises the ITM pipeline 1 which receives the input data II (geometry and related data) and supplies the output image OI to a signal processing circuit 11.
  • the signal processing circuit 11 processes the output image OI to obtain a drive signal DS for the display 12.
  • the inverse texture mapping 3D graphics processor maps a 3D model WO onto a screen space SSP.
  • a texture memory TM stores texel intensities TI(ug,vg) of texture space grid positions ug,vg.
  • a plurality of screen space rasterizers SRASj determines pixel grid positions within different screen space polygons SGPj at a plurality of corresponding different display instants tj during a same temporal interval Tf between sample instants ts of geometric data of the 3D model WO.
  • the screen space polygons SGPj have different positions in the screen space SSP dependent on motion information of the 3D model WO with respect to the camera.
  • a plurality of corresponding mappers MAPj map the pixel grid positions of the screen space polygons SGP at the different display instants tj to texture space positions uj,vj.
  • a texture space resampler TSR determines texel intensities PI(uj,vj) at the texture space positions uj,vj from the texel grid intensities TI(ug,vg) of the texture space grid positions ug,vg stored in the texture memory TM.
  • a texture cache TC temporarily stores, for a texture space polygon TGP, the texel intensities TI(ug,vg) required by the texture space resampler TSR during the temporal interval Tf for all the screen space polygons SGP which are associated with a same texture space polygon TGP.
  • a plurality of corresponding pixel shaders PSj determine, at said different display instants tj, pixel intensities PSI(xgj,ygj) from the texel intensities PI(uj,vj).
  • any reference signs placed between parentheses shall not be construed as limiting the claim.
  • Use of the verb "comprise" and its conjugations does not exclude the presence of elements or steps other than those stated in a claim.
  • the article "a" or "an" preceding an element does not exclude the presence of a plurality of such elements.
  • the invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)
EP05749079A 2004-06-16 2005-06-09 Inverse texture mapping 3d graphics system Withdrawn EP1766584A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP05749079A EP1766584A2 (en) 2004-06-16 2005-06-09 Inverse texture mapping 3d graphics system

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP04102746 2004-06-16
PCT/IB2005/051897 WO2005124693A2 (en) 2004-06-16 2005-06-09 Inverse texture mapping 3d graphics system
EP05749079A EP1766584A2 (en) 2004-06-16 2005-06-09 Inverse texture mapping 3d graphics system

Publications (1)

Publication Number Publication Date
EP1766584A2 true EP1766584A2 (en) 2007-03-28

Family

ID=35462636

Family Applications (1)

Application Number Title Priority Date Filing Date
EP05749079A Withdrawn EP1766584A2 (en) 2004-06-16 2005-06-09 Inverse texture mapping 3d graphics system

Country Status (4)

Country Link
EP (1) EP1766584A2 (ja)
JP (1) JP2008502979A (ja)
CN (1) CN101006471B (ja)
WO (1) WO2005124693A2 (ja)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010041215A1 (en) * 2008-10-09 2010-04-15 Nxp B.V. Geometry primitive shading graphics system
US10726619B2 (en) * 2015-10-29 2020-07-28 Sony Interactive Entertainment Inc. Foveated geometry tessellation
US10726626B2 (en) * 2017-11-22 2020-07-28 Google Llc Interaction between a viewer and an object in an augmented reality environment

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5410358A (en) * 1991-07-23 1995-04-25 British Telecommunications Public Limited Company Method and device for frame interpolation of a moving image
GB9115874D0 (en) * 1991-07-23 1991-09-04 British Telecomm Frame interpolation
US6331856B1 (en) * 1995-11-22 2001-12-18 Nintendo Co., Ltd. Video game system with coprocessor providing high speed efficient 3D graphics and digital audio signal processing
JP3645024B2 (ja) * 1996-02-06 2005-05-11 株式会社ソニー・コンピュータエンタテインメント 描画装置及び描画方法
CA2250021C (en) * 1997-05-19 2007-02-06 Matsushita Electric Industrial Co., Ltd. Graphic display apparatus, synchronous reproduction method, and av synchronous reproduction apparatus
JP3481077B2 (ja) * 1997-05-19 2003-12-22 松下電器産業株式会社 グラフィック表示方法と装置
JP2000025307A (ja) * 1998-07-14 2000-01-25 Fuji Xerox Co Ltd 画像処理装置のパラメータ共有方法およびシステム
JP2001236519A (ja) * 2000-02-21 2001-08-31 Seiko Epson Corp 動画像再生装置および動画像再生方法ならびに情報記録媒体
US7174050B2 (en) * 2002-02-12 2007-02-06 International Business Machines Corporation Space-optimized texture maps
JP3934111B2 (ja) * 2004-02-04 2007-06-20 株式会社ソニー・コンピュータエンタテインメント 描画装置及び描画方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2005124693A3 *

Also Published As

Publication number Publication date
JP2008502979A (ja) 2008-01-31
CN101006471A (zh) 2007-07-25
WO2005124693A3 (en) 2006-03-23
WO2005124693A2 (en) 2005-12-29
CN101006471B (zh) 2010-09-01


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20070116

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: NXP B.V.

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20080331

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: TRIDENT MICROSYSTEMS (FAR EAST) LTD.

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: ENTROPIC COMMUNICATIONS, INC.

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20140103